Identifying interaction types and functionality for automated vehicle virtual assistants: An exploratory study using speech acts cluster analysis

Clark, Jediah R.; Large, David R.; Shaw, Emily; Nichele, Elena; Galvez Trigo, Maria J.; Fischer, Joel E.; Burnett, Gary; Stanton, Neville A.

Authors

Jediah R. Clark

David R. Large (David.R.Large@nottingham.ac.uk)
Senior Research Fellow

Emily Shaw

Elena Nichele

Maria J. Galvez Trigo

Joel E. Fischer (Joel.Fischer@nottingham.ac.uk)
Professor of Human-Computer Interaction

Gary Burnett

Neville A. Stanton



Abstract

Onboard virtual assistants with the ability to converse with users are gaining favour as a means of supporting effective human-machine interaction and meeting safe standards of operation in automated vehicles (AVs). Previous studies have highlighted the need to communicate situation information to effectively support the transfer of control and responsibility for the driving task. This study explores the ‘interaction types’ used for this complex human-machine transaction by analysing how situation information is conveyed and reciprocated during a transfer-of-control scenario. Two human drivers alternated control in a bespoke, dual-controlled driving simulator, with the transfer of control relying entirely on verbal communication. Handover dialogues were coded using speech-act classifications, and a cluster analysis was conducted. Four interaction types were identified for virtual assistants (i.e., the agent handing over control): Supervisor, Information Desk, Interrogator and Converser; and four for drivers (i.e., the agent taking control): Coordinator, Perceiver, Inquirer and Silent Receiver. Each interaction type provides a framework of characteristics that can be used to define driver requirements and inform the design of future virtual assistants, supporting the driver in maintaining and rebuilding timely situation awareness whilst ensuring a positive user experience. The study also offers insight into the role of dialogue turns and takeover time, and provides recommendations for future virtual assistant designs in AVs.
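The speech-act cluster analysis named above can be pictured with a minimal, purely illustrative sketch: each speaker in a handover dialogue is represented by the relative frequency of speech-act categories they produced, and those profiles are grouped by agglomerative clustering. The category labels, counts, and choice of Ward linkage below are assumptions for illustration only, not the authors' analysis pipeline.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Rows: one speaker in one handover dialogue; columns: counts of speech-act
# categories (e.g., assertives, directives, questions, acknowledgements).
# All values are invented for illustration.
speech_act_counts = np.array([
    [12, 3, 1, 4],
    [2, 9, 6, 1],
    [5, 1, 10, 2],
    [1, 2, 0, 11],
    [11, 4, 2, 3],
    [3, 8, 7, 2],
])

# Normalise counts to proportions so dialogue length does not dominate distances.
profiles = speech_act_counts / speech_act_counts.sum(axis=1, keepdims=True)

# Agglomerative (Ward) clustering, cut into four groups to mirror the four
# interaction types reported for each role.
tree = linkage(profiles, method="ward")
labels = fcluster(tree, t=4, criterion="maxclust")
print(labels)  # cluster membership for each speaker profile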

Citation

Clark, J. R., Large, D. R., Shaw, E., Nichele, E., Galvez Trigo, M. J., Fischer, J. E., …Stanton, N. A. (2024). Identifying interaction types and functionality for automated vehicle virtual assistants: An exploratory study using speech acts cluster analysis. Applied Ergonomics, 114, Article 104152. https://doi.org/10.1016/j.apergo.2023.104152

Journal Article Type Article
Acceptance Date Oct 12, 2023
Online Publication Date Oct 17, 2023
Publication Date Jan 2024
Deposit Date Oct 18, 2023
Publicly Available Date Oct 19, 2023
Journal Applied Ergonomics
Print ISSN 0003-6870
Electronic ISSN 1872-9126
Publisher Elsevier
Peer Reviewed Peer Reviewed
Volume 114
Article Number 104152
DOI https://doi.org/10.1016/j.apergo.2023.104152
Keywords Interface design; Automated vehicles; Autonomous vehicles; Communication; Natural language interfaces; Virtual assistant
Public URL https://nottingham-repository.worktribe.com/output/26221078
Publisher URL https://www.sciencedirect.com/science/article/pii/S0003687023001904?via%3Dihub
Additional Information This article is maintained by: Elsevier; Article Title: Identifying interaction types and functionality for automated vehicle virtual assistants: An exploratory study using speech acts cluster analysis; Journal Title: Applied Ergonomics; CrossRef DOI link to publisher maintained version: https://doi.org/10.1016/j.apergo.2023.104152; Content Type: article; Copyright: © 2023 The Authors. Published by Elsevier Ltd.
