Three-dimensional kinematics-based real-time localization method using two robots

Guy Elmakis, Matan Coronel, David Zarrouk

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents a precise two-robot collaboration method for three-dimensional (3D) self-localization relying on a single rotating camera and onboard accelerometers used to measure the tilt of the robots. This method allows for localization in global positioning system-denied environments, in the presence of magnetic interference, or in poorly lit (or totally dark), unstructured, and unmarked locations. At each step, one robot moves forward while the other remains stationary. The tilt angles of the robots obtained from the accelerometers, together with the rotational angle of the turret and the associated video analysis, make it possible to continuously calculate the location of each robot. We describe the hardware setup used for the experiments and provide a detailed description of the algorithm, which fuses the data obtained by the accelerometers and cameras and runs in real time on onboard microcomputers. Finally, we present 2D and 3D experimental results showing that the system achieves 2% accuracy over the total traveled distance (see Supporting Information S1: video).
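
To illustrate the kind of step update described in the abstract, the minimal Python sketch below converts the stationary robot's turret rotation angle, an elevation angle, and an inter-robot range into a 3D position estimate for the moving robot. It is not the paper's exact algorithm: the function name, the use of the turret azimuth as the bearing, the elevation derived from the accelerometer tilt readings, and the range taken from the moving robot's step length are all assumptions made for this example.

```python
import numpy as np

def locate_moving_robot(p_stationary, turret_azimuth, elevation, range_m):
    """Illustrative single-step localization (assumed model, not the paper's exact method).

    p_stationary  : known 3D position of the stationary robot, in meters
    turret_azimuth: rotation angle of the camera turret toward the moving robot, in radians
    elevation     : vertical angle to the moving robot (e.g., inferred from the
                    accelerometer tilt readings and the image), in radians
    range_m       : distance between the robots (e.g., from the moving robot's
                    accumulated step length), in meters
    Returns the estimated 3D position of the moving robot.
    """
    direction = np.array([
        np.cos(elevation) * np.cos(turret_azimuth),  # x component of the line of sight
        np.cos(elevation) * np.sin(turret_azimuth),  # y component of the line of sight
        np.sin(elevation),                           # z component (height change)
    ])
    return np.asarray(p_stationary, dtype=float) + range_m * direction

# Example: stationary robot at the origin, turret rotated 30 deg, 5 deg incline, robots 1.2 m apart
p_next = locate_moving_robot([0.0, 0.0, 0.0], np.deg2rad(30.0), np.deg2rad(5.0), 1.2)
print(p_next)  # approx. [1.035, 0.598, 0.105]
```

In a leapfrog scheme of this kind, the robots would then swap roles and repeat the update, so the localization error grows only with the total traveled distance.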

Original language: English
Journal: Journal of Field Robotics
DOIs
State: Accepted/In press - 1 Jan 2024

Keywords

  • field robotics
  • kinematics
  • localization
  • tracked robot

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science Applications
