Can you feel it? What does it mean? Notifications for Operators of Unmanned Ground Vehicles (UGVs) During Operational Missions

Nuphar Katzman, Tal Oron-Gilad

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Vibro-tactile interfaces can support users in a variety of tasks and contexts. Despite their inherent advantages, they are limited in the type and capacity of information they can convey. This study is part of a series of experiments that aim to develop and evaluate a “tactile taxonomy” for dismounted operational environments. The current experiment simulated an operational mission with a remote Unmanned Ground Vehicle (UGV). During the mission, 20 participants were required to interpret notifications that they received in one (or more) of the following modalities: auditory, visual, and/or tactile. Three notification types were chosen based on previous studies to provide an intuitive connection between each notification and its semantic meaning. Response times to notifications, the ability to distinguish between the information types they conveyed, and operational mission performance metrics were collected. Results indicate that it is possible to use a limited “tactile taxonomy” in a visually loaded and auditorily noisy scene while performing a demanding operational task. Combining the tactile modality with other sensory modalities improves participants’ ability to perceive and identify the notifications.
Original language: English (GB)
Title of host publication: Proceedings of the Human Factors and Ergonomics Society Annual Meeting
Pages: 817-821
Number of pages: 5
Volume: 63
Edition: 1
DOIs
State: Published - 2019
