Global monocular indoor positioning of a robotic vehicle with a floorplan

John Noonan, Hector Rotstein, Amir Geva, Ehud Rivlin

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents a global monocular indoor positioning system for a robotic vehicle starting from a known pose. The proposed system does not depend on a dense 3D map, does not require prior environment exploration or infrastructure installation, and does not rely on the scene remaining photometrically or geometrically unchanged. Instead, it provides global positioning from sparse knowledge of the building floorplan, resolving the scale ambiguity inherent to monocular vision through wall-plane association. The Wall Plane Fusion algorithm presented here finds correspondences between walls of the floorplan and planar structures in the 3D point cloud. To extract planes from point clouds whose scale is ambiguous, the Scale Invariant Planar RANSAC (SIPR) algorithm was developed. The best wall-plane correspondence serves as an external constraint in a custom Bundle Adjustment optimization that refines the motion estimate and enforces a globally consistent scale. Only one wall needs to be in view for the method to operate. The feasibility of the algorithms is tested with synthetic and real-world data, with extensive testing performed in an indoor simulation environment built on the Unreal Engine and Microsoft AirSim. The system performs consistently across all three types of data; in the tests presented in this paper, the standard deviation of the positioning error did not exceed 6 cm.
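
The abstract gives only the outline of SIPR, but the core requirement is clear: plane extraction must work on a point cloud whose global scale is unknown. Below is a minimal sketch of that idea, assuming (as an illustration, not the paper's implementation) that scale invariance comes from making the RANSAC inlier test dimensionless, dividing each point-to-plane distance by the point's range from the camera so that a uniform rescaling of the cloud cancels out. All function names and thresholds are hypothetical.

```python
import numpy as np

def fit_plane(p0, p1, p2):
    """Plane (n, d) with n.p + d = 0 through three points; n is unit length."""
    n = np.cross(p1 - p0, p2 - p0)
    norm = np.linalg.norm(n)
    if norm < 1e-12:                      # degenerate (near-collinear) sample
        return None
    n /= norm
    return n, -float(np.dot(n, p0))

def scale_invariant_plane_ransac(points, iters=500, rel_tol=0.01, seed=0):
    """RANSAC plane fit with a unitless inlier test: scaling the cloud by s
    scales both the point-to-plane distance and the point's range by s, so
    the same rel_tol applies whatever the unknown monocular scale is."""
    rng = np.random.default_rng(seed)
    ranges = np.maximum(np.linalg.norm(points, axis=1), 1e-12)
    best_plane, best_inliers = None, np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        sample = points[rng.choice(len(points), size=3, replace=False)]
        plane = fit_plane(*sample)
        if plane is None:
            continue
        n, d = plane
        residual = np.abs(points @ n + d) / ranges   # dimensionless residual
        inliers = residual < rel_tol
        if inliers.sum() > best_inliers.sum():
            best_plane, best_inliers = plane, inliers
    return best_plane, best_inliers
```

Once such a plane is matched to a floorplan wall, whose metric position is known, comparing the estimated plane offset against the wall's true offset yields the missing global scale; this is plausibly the quantity the Wall Plane Fusion step feeds into the constrained Bundle Adjustment, though the abstract does not spell out the exact mechanism.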

Original language: English
Article number: 634
Journal: Sensors
Volume: 19
Issue number: 3
DOIs
State: Published - 1 Feb 2019
Externally published: Yes

Keywords

  • Floorplan
  • Indoor positioning
  • Robotic vehicle
  • Vision-based navigation

