Abstract
This paper presents a novel two-robot collaboration method for precise 2D self-localization using relatively simple sensors. The main advantage of the method is its ability to measure the robots' orientations precisely, thereby reducing cumulative errors. Each robot carries a rotating turret fitted with a camera, used to track the moving robot and compute the relative distance and position, and an encoder that reports the turret's orientation. At each step, one robot advances while the other remains stationary and measures the moving robot's position (continuously or at the end of the step) from the turret's angular orientation and the camera-measured distance. The moving robot's orientation is obtained by turning its own turret toward the static robot and reading that turret's orientation. Fusing the data from the two robots yields the precise location and orientation of the moving robot. We also present an analytical model of the robots' positions as a function of the sensor data, followed by a statistical estimate of the robots' locations, obtained with Monte Carlo simulations, under the assumption that the sensor data contains random errors. Lab experiments are also presented and compared with the simulation results.
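The per-step geometry the abstract describes can be sketched in a few lines: the static robot's turret angle and the camera-measured distance fix the mover's position, and the mover's own turret angle back toward the static robot fixes its heading. The sketch below is a minimal illustration under assumed conventions (world-frame poses as `(x, y, theta)`, turret angles measured from each robot's heading); the function names and the noise magnitudes in the Monte Carlo helper are not from the paper.

```python
import math
import random

def localize(static_pose, alpha, d, beta):
    """Estimate the moving robot's 2D pose from one measurement step.

    static_pose: (x, y, theta) of the stationary robot (known).
    alpha: static robot's turret angle toward the mover (encoder, rad).
    d: camera-measured distance between the two robots.
    beta: moving robot's turret angle toward the static robot (rad).
    Returns (x, y, theta) of the moving robot.
    """
    xs, ys, ths = static_pose
    bearing = ths + alpha                 # world-frame bearing to the mover
    xm = xs + d * math.cos(bearing)
    ym = ys + d * math.sin(bearing)
    # The mover's turret points back at the static robot, so
    # theta_m + beta equals the world-frame bearing from mover to static.
    back = math.atan2(ys - ym, xs - xm)
    thm = math.atan2(math.sin(back - beta), math.cos(back - beta))  # wrap to (-pi, pi]
    return xm, ym, thm

def monte_carlo(static_pose, alpha, d, beta,
                sigma_ang=0.002, sigma_d=0.01, n=5000):
    """Sample pose estimates under Gaussian sensor noise (assumed sigmas)."""
    return [localize(static_pose,
                     alpha + random.gauss(0.0, sigma_ang),
                     d + random.gauss(0.0, sigma_d),
                     beta + random.gauss(0.0, sigma_ang))
            for _ in range(n)]
```

For example, with the static robot at the origin facing along x, a turret angle of 45°, a distance of √2, and a back-pointing turret angle of 135°, `localize` places the mover at (1, 1) with heading 90°; feeding the same inputs to `monte_carlo` gives the scatter of pose estimates induced by the assumed sensor noise.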
Field | Value
---|---
Original language | English
Article number | 8851128
Pages (from-to) | 154044-154055
Number of pages | 12
Journal | IEEE Access
Volume | 7
DOIs |
State | Published - 1 Jan 2019
Keywords
- Localization
- multi-robot systems
ASJC Scopus subject areas
- General Computer Science
- General Materials Science
- General Engineering