EXSS-ATEC: Explainable smart systems and algorithmic transparency in emerging technologies 2020

Alison Smith-Renner, Tsvi Kuflik, Advait Sarkar, Styliani Kleanthous, Simone Stumpf, Casey Dugan, Brian Lim, Jahna Otterbacher, Avital Shulner

Research output: Contribution to journal › Conference article › peer-review

Abstract

Smart systems that apply complex reasoning to make decisions and plan behavior, such as decision support systems and personalized recommendations, are difficult for users to understand. Algorithms can exploit rich and varied data sources to support human decision-making or to take direct action; however, because these processes are typically opaque to the user, there are growing concerns about their transparency and accountability. Transparency and accountability have attracted increasing interest as means to more effective system training, better reliability, and improved usability. This workshop will provide a venue for exploring issues that arise in designing, developing, and evaluating intelligent user interfaces that provide system transparency or explanations of their behavior. In addition, our goal is to focus on approaches to mitigating algorithmic bias that researchers can apply even without access to a given system's inner workings, such as awareness, data provenance, and validation.

Original language: English
Journal: CEUR Workshop Proceedings
Volume: 2582
State: Published - 1 Jan 2020
Externally published: Yes
Event: 2020 Workshop on Explainable Smart Systems for Algorithmic Transparency in Emerging Technologies, ExSS-ATEC 2020 - Cagliari, Italy
Duration: 17 Mar 2020 → …

Keywords

  • Accountability
  • Explanations
  • Fairness
  • Intelligent systems
  • Intelligibility
  • Machine learning
  • Transparency
  • Visualizations

ASJC Scopus subject areas

  • General Computer Science
