Choose Wisely: Leveraging Explainable AI to Support Reflective Decision-Making

Maximilian Förster, Philipp Schröppel, Chiara Schwenke, Lior Fink, Mathias Klier

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Explainable Artificial Intelligence (XAI) can contribute to the idea of AI as an instrument for reflection when used to augment human decision-making. In the educational domain, reflective decision-making is crucial, as decisions have a meaningful and long-term impact. Against this background, we propose an XAI-based approach that supports users in making reflective educational decisions. Our approach introduces three main ideas: concepts as a “shared language” between AI and users, concept-based explanations, and concept-based interventions. We demonstrate the practical applicability of our approach on a real-world dataset of university courses. We evaluate the efficacy of our approach in a user study with 495 participants. Results suggest that, compared to black-box recommender systems, our novel approach effectively supports users in making reflective decisions while increasing users’ exploration, self-reflection, confidence, and trust. The effectiveness of our approach is attributable to the combination of concept-based explanations and the opportunity to intervene.
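
The paper does not publish code. As an illustration only, the minimal Python sketch below shows what a concept-based explanation paired with a concept-based intervention could look like; the concepts (workload, topic relevance, math intensity) and the names ConceptExplanation and apply_intervention are hypothetical and not taken from the paper.

# Illustrative sketch only; concepts and identifiers are hypothetical, not from the paper.
from dataclasses import dataclass, field

@dataclass
class ConceptExplanation:
    """A concept-based explanation: human-interpretable concepts (the 'shared
    language') with signed contributions toward a course recommendation."""
    course: str
    concept_contributions: dict = field(default_factory=dict)

    def top_concepts(self, k: int = 3):
        # Concepts with the largest absolute contribution, for display to the user.
        return sorted(self.concept_contributions.items(),
                      key=lambda kv: abs(kv[1]), reverse=True)[:k]

def apply_intervention(explanation: ConceptExplanation, concept: str, new_weight: float):
    """Concept-based intervention: the user adjusts a concept's weight and a
    revised explanation is produced from the edited concept values."""
    updated = dict(explanation.concept_contributions)
    updated[concept] = new_weight
    return ConceptExplanation(explanation.course, updated)

if __name__ == "__main__":
    exp = ConceptExplanation(
        course="Intro to Machine Learning",
        concept_contributions={"workload": -0.2, "topic relevance": 0.6, "math intensity": -0.1},
    )
    print(exp.top_concepts())                             # concepts shown to the user
    revised = apply_intervention(exp, "workload", 0.1)    # user intervenes on a concept
    print(sum(revised.concept_contributions.values()))    # naive recommendation score
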
Original language: English
Title of host publication: ICIS 2024 Proceedings
Publisher: Association for Information Systems
Pages: 1-17
Number of pages: 17
Volume: 22
State: Published - 24 Oct 2024
