Abstract
Vibro-tactile interfaces can support users across a wide range of tasks and contexts. Despite their inherent advantages, they are limited in the type and amount of information they can convey. This study is part of a series of experiments that aim to develop and evaluate a "tactile taxonomy" for dismounted operational environments. The current experiment simulated an operational mission with a remote Unmanned Ground Vehicle (UGV). During the mission, 20 participants were required to interpret notifications that they received in one or more of the following modalities: auditory, visual, and tactile. Three notification types were chosen on the basis of previous studies so as to provide an intuitive connection between each notification and its semantic meaning. We collected response times to the notifications, participants' ability to distinguish between the information types the notifications conveyed, and operational mission performance metrics. The results indicate that a limited "tactile taxonomy" can be used in a visually loaded and acoustically noisy scene while performing a demanding operational task. Presenting the tactile modality together with other sensory modalities enhances participants' ability to perceive and identify the notifications.
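To make the trial structure concrete, the sketch below shows one hypothetical way a multimodal notification trial could be driven in a simulation like the one described: each notification type is presented over a subset of the three modalities, and response time is measured from cue onset. All names here (`Modality`, `NotificationType`, the placeholder notification meanings, `present_notification`, `run_trial`, `wait_for_response`) are illustrative assumptions, not the authors' actual experimental software.

```python
import time
from enum import Enum, auto
from itertools import combinations

# Hypothetical sketch of a multimodal notification trial, loosely based on
# the design described in the abstract (not the authors' actual code).

class Modality(Enum):
    AUDITORY = auto()
    VISUAL = auto()
    TACTILE = auto()

class NotificationType(Enum):
    # Three semantic notification types, as in the study; the specific
    # meanings used here are placeholders.
    OBSTACLE = auto()
    TARGET = auto()
    SYSTEM_FAULT = auto()

def modality_conditions():
    """All non-empty subsets of the three channels: single and combined cues."""
    mods = list(Modality)
    return [list(c) for r in range(1, len(mods) + 1)
            for c in combinations(mods, r)]

def present_notification(ntype, modalities):
    """Trigger the cue on each requested channel (stubbed as a print here)."""
    for m in modalities:
        print(f"presenting {ntype.name} via {m.name}")

def run_trial(ntype, modalities, wait_for_response):
    """Present one notification and measure response time from cue onset."""
    onset = time.monotonic()
    present_notification(ntype, modalities)
    reported = wait_for_response()  # blocks until the participant answers
    return {
        "type": ntype.name,
        "modalities": [m.name for m in modalities],
        "rt_seconds": time.monotonic() - onset,
        "correct": reported is ntype,
    }
```

Crossing the three notification types with the modality subsets from `modality_conditions()` would yield the kind of single-modality versus combined-modality comparison the abstract reports on.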
| Original language | English |
|---|---|
| Pages (from-to) | 817-821 |
| Number of pages | 5 |
| Journal | Proceedings of the Human Factors and Ergonomics Society |
| Volume | 63 |
| Issue number | 1 |
| DOIs | |
| State | Published - 1 Jan 2019 |
| Event | 63rd International Annual Meeting of the Human Factors and Ergonomics Society, HFES 2019, Seattle, United States. Duration: 28 Oct 2019 → 1 Nov 2019 |
ASJC Scopus subject areas
- Human Factors and Ergonomics