Estimating Rigid Transformations of Noisy Point Clouds Using the Universal Manifold Embedding

Amit Efraim, Joseph M. Francos

Research output: Contribution to journal › Article › peer-review

Abstract

We present a closed-form solution to the problem of registering fully overlapping 3D point clouds undergoing unknown rigid transformations, as well as to the detection and registration of sub-parts undergoing unknown rigid transformations. The solution is obtained by adapting the general framework of the universal manifold embedding (UME) to the case where the transformations the object may undergo are rigid. The UME nonlinearly maps functions related by certain types of geometric transformations of coordinates to the same linear subspace of some Euclidean space, while retaining the information required to recover the transformation. Registration, matching, and classification can therefore be solved as linear problems in a low-dimensional linear space. In this paper, we extend the UME framework to the special case where it is a priori known that the geometric transformations are rigid. While a variety of methods exist for point cloud registration, the method proposed here is notably different in that registration is achieved by a closed-form solution employing the UME low-dimensional representation of the shapes to be registered.
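The abstract does not reproduce the UME-based estimator itself. As a non-authoritative point of reference only, the sketch below shows the standard closed-form Kabsch/SVD solution for rigid alignment of point sets with known correspondences; this is a simpler baseline, not the paper's correspondence-free UME method, and the function name and test data are purely illustrative.

```python
import numpy as np

def kabsch_rigid_align(P, Q):
    """Closed-form rigid alignment (Kabsch/SVD), an illustrative baseline.

    Finds the rotation R and translation t minimizing ||R @ p_i + t - q_i||
    over corresponding rows of P and Q (each of shape (N, 3)).
    NOTE: this is NOT the UME estimator described in the paper.
    """
    p_mean = P.mean(axis=0)
    q_mean = Q.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (P - p_mean).T @ (Q - q_mean)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so the estimate is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

# Usage: recover a known rigid transformation from noisy correspondences.
rng = np.random.default_rng(0)
P = rng.normal(size=(100, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -1.0, 2.0])
Q = P @ R_true.T + t_true + 0.01 * rng.normal(size=P.shape)
R_est, t_est = kabsch_rigid_align(P, Q)
```

Unlike this baseline, which requires point-to-point correspondences, the abstract states that the proposed approach obtains the rigid transformation in closed form from the UME low-dimensional representations of the shapes themselves.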

Original language: English
Pages (from-to): 343-363
Number of pages: 21
Journal: Journal of Mathematical Imaging and Vision
Volume: 64
Issue number: 4
DOIs
State: Published - 1 May 2022

Keywords

  • Deformable templates
  • Parameter estimation
  • Point clouds
  • Registration
  • Rigid transformations

ASJC Scopus subject areas

  • Statistics and Probability
  • Modeling and Simulation
  • Condensed Matter Physics
  • Computer Vision and Pattern Recognition
  • Geometry and Topology
  • Applied Mathematics
