Model-informed LIME Extension for Business Process Explainability

Guy Amit, Fabiana Fournier, Shlomit Gur, Lior Limonad

Research output: Contribution to journal › Conference article › peer-review

Abstract

Our focus in this work is on adapting eXplainable AI techniques to the interpretation of business process execution results. Such adaptation is required because the conventional employment of these techniques relies on a surrogate machine learning model trained on historical process execution logs. Being a data-driven surrogate, the faithfulness with which it represents the real business process model affects the adequacy of the explanations derived from it. Hence, using such techniques natively does not guarantee that the explanations adhere to the target business process being explained. We present a replicable and reproducible business-process-model-driven approach that extends LIME, a conventional machine-learning-model-agnostic eXplainable AI tool, to cope with business process constraints. Our results show that our extended LIME approach produces correct and significantly more adequate explanations than those given by LIME as-is.
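
For readers unfamiliar with the conventional, machine-learning-model-agnostic use of LIME that the paper extends, the sketch below shows LIME applied as-is to a surrogate classifier trained on tabular features derived from process execution logs. It is only an illustration of the baseline setup: the feature names, outcome labels, and data are hypothetical, and the business-process-constraint handling that constitutes the paper's extension is not reproduced here.

```python
# Illustrative sketch (not the paper's implementation): conventional LIME
# applied to a data-driven surrogate of a business process outcome.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from lime.lime_tabular import LimeTabularExplainer

rng = np.random.default_rng(0)

# Hypothetical tabular encoding of historical process executions.
feature_names = ["credit_check_duration", "loan_amount", "num_rework_loops"]
X = rng.random((500, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0.9).astype(int)  # synthetic outcome labels

# Surrogate machine learning model trained on the (synthetic) execution log.
surrogate = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Conventional, model-agnostic LIME explainer over the surrogate.
explainer = LimeTabularExplainer(
    X,
    feature_names=feature_names,
    class_names=["rejected", "approved"],
    mode="classification",
)

# Explain a single process instance. The paper's extension additionally makes
# the explanation respect constraints imposed by the business process model,
# rather than relying on unconstrained perturbations as done here.
explanation = explainer.explain_instance(X[0], surrogate.predict_proba, num_features=3)
print(explanation.as_list())
```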

Original language: English
Pages (from-to): 1-12
Number of pages: 12
Journal: CEUR Workshop Proceedings
Volume: 3310
State: Published - 1 Jan 2022
Externally published: Yes
Event: 2022 Workshop on Process Management in the AI Era, PMAI 2022 - Wien, Austria
Duration: 23 Jul 2022 → …

Keywords

  • Augmented Business Process Management System
  • Business Process
  • Machine Learning
  • Situation-Aware eXplainability
  • eXplainable Artificial Intelligence

ASJC Scopus subject areas

  • General Computer Science
