Remotely Operated Vehicles (ROVs) from the Bottom-Up Operational Perspective

Tal Oron-Gilad, Yaniv Minkov

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

10 Scopus citations

Abstract

This chapter describes augmented visual displays and audio and tactile cues used in human-robotic interface (HRI) displays. According to a recent North Atlantic Treaty Organization (NATO) report, human interface issues associated with individual uninhabited military vehicle (UMV) control station design include providing appropriate situational awareness and effective information presentation and control strategies. Early robotic control systems, including teleoperation systems in which the user controlled robots from a distance, relied mainly on visual feedback from a camera mounted on the robot. The chapter presents some of the extensive research performed in the area of ground control station displays for HRI. Many researchers have explored visual, audio, and tactile modalities, as well as combinations of the three, and much of this research involves teleoperation. Spatial audio displays have been shown to increase situation awareness in the operation of unmanned robots from ground-based stations. Spatial audio and tactile displays have also been used to mitigate pilot spatial disorientation (SD) in aircraft.
Original language: English
Title of host publication: Human-Robot Interactions in Future Military Operations
Editors: Florian Jentsch, Michael Barnes
Publisher: CRC Press
Pages: 211-227
Number of pages: 17
ISBN (Electronic): 9781315587622
ISBN (Print): 9780754675396, 9781138071704
DOIs
State: Published - 1 Dec 2010

ASJC Scopus subject areas

  • General Computer Science
  • General Engineering
