Judging One’s Own or Another Person’s Responsibility in Interactions With Automation

Nir Douer, Joachim Meyer

Research output: Contribution to journal › Article › peer-review



Objective: We explore users’ and observers’ subjective assessments of human and automation capabilities and human causal responsibility for outcomes. Background: In intelligent systems and advanced automation, human responsibility for outcomes becomes equivocal, as do subjective perceptions of responsibility. In particular, actors who actively work with a system may perceive responsibility differently from observers. Method: In a laboratory experiment with pairs of participants, one participant (the “actor”) performed a decision task, aided by an automated system, and the other (the “observer”) passively observed the actor. We compared the perceptions of responsibility between the two roles when interacting with two systems with different capabilities. Results: Actors’ behavior matched the theoretical predictions, and actors and observers assessed the system and human capabilities and the comparative human responsibility similarly. However, actors tended to relate adverse outcomes more to system characteristics than to their own limitations, whereas the observers insufficiently considered system capabilities when evaluating the actors’ comparative responsibility. Conclusion: When intelligent systems greatly exceed human capabilities, users may correctly feel they contribute little to system performance. They may interfere more than necessary, impairing the overall performance. Outside observers, such as managers, may overweigh users’ contribution to outcomes, holding users responsible for adverse outcomes when they rightly trusted the system. Application: Presenting users of intelligent systems and others with performance measures and the comparative human responsibility may help them calibrate subjective assessments of performance, reducing users’ and outside observers’ biases and attribution errors.

Original language: English
Pages (from-to): 359-371
Number of pages: 13
Journal: Human Factors
Issue number: 2
State: Published - 1 Mar 2022
Externally published: Yes


Keywords

  • decision making
  • human-automation interaction
  • warning compliance
  • warning systems

ASJC Scopus subject areas

  • Human Factors and Ergonomics
  • Behavioral Neuroscience
  • Applied Psychology


