(Over)Trusting AI Recommendations: How System and Person Variables Affect Dimensions of Complacency
Authors
Lydia Harbarth
Eva Gößwein
Daniel Bodemer
Dr Lenka Schnaubert (Lenka.Schnaubert@nottingham.ac.uk), Assistant Professor
Abstract
Over-trusting AI systems can lead to complacency and decision errors. However, both human and system variables may affect complacency, and understanding their interplay is important for HCI. In our experiment, 90 participants solved traffic route problems guided by AI recommendations and were assigned to either a transparent system providing reasons for its recommendations or a non-transparent system. We found that transparent systems lowered the potential to alleviate workload (albeit not the potential to neglect monitoring), but simultaneously fostered actual complacent behavior. In contrast, performance expectancy fostered the potential to alleviate workload, but not complacent behavior. Interaction analyses showed that the effects of performance expectancy depend on system transparency. This contributes to our understanding of how system- and person-related variables interact in affecting complacency, and stresses the differences between dimensions of complacency and the need to carefully consider transparency and performance expectancy in AI research and design.
Citation
Harbarth, L., Gößwein, E., Bodemer, D., & Schnaubert, L. (2025). (Over)Trusting AI Recommendations: How System and Person Variables Affect Dimensions of Complacency. International Journal of Human-Computer Interaction, 41(1), 391-410. https://doi.org/10.1080/10447318.2023.2301250
| Journal Article Type | Article |
| --- | --- |
| Acceptance Date | Dec 29, 2023 |
| Online Publication Date | Jan 22, 2024 |
| Publication Date | 2025 |
| Deposit Date | Jan 23, 2024 |
| Publicly Available Date | Jan 25, 2024 |
| Journal | International Journal of Human-Computer Interaction |
| Print ISSN | 1044-7318 |
| Electronic ISSN | 1532-7590 |
| Publisher | Taylor and Francis |
| Peer Reviewed | Peer Reviewed |
| Volume | 41 |
| Issue | 1 |
| Pages | 391-410 |
| DOI | https://doi.org/10.1080/10447318.2023.2301250 |
| Keywords | Complacency; human-AI interaction; performance expectancy; transparency; trust |
| Public URL | https://nottingham-repository.worktribe.com/output/30110889 |
| Publisher URL | https://www.tandfonline.com/doi/full/10.1080/10447318.2023.2301250 |
Files
Over Trusting AI Recommendations How System And Person Variables Affect Dimensions Of Complacency
(2.1 Mb)
PDF
Licence: https://creativecommons.org/licenses/by/4.0/
Publisher Licence URL: https://creativecommons.org/licenses/by/4.0/
Copyright Statement
© 2024 The Author(s). Published with license by Taylor & Francis Group, LLC.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The terms on which this article has been published allow the posting of the Accepted Manuscript in a repository by the author(s) or with their consent.