ICP
ICP (Iterative Closest Point) registration algorithm
The ICP algorithm aims at finding the transformation between a point cloud and some reference surface (or another point cloud) by minimizing the squared error between the corresponding entities.
That is, ICP solves a surface matching problem that can be expressed as follows: given a reference surface S_ref and a set of points {p_i}, find the roto-translation q = (t, theta) that minimizes the distance of the points p_i, roto-translated by q, from their projections onto the surface S_ref.
⊕ denotes the roto-translation operator, and Π denotes the Euclidean projection onto S_ref.
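In this notation, the objective referenced as (1) below can be written as follows (this exact form is a reconstruction, not copied from the notes):

```latex
\min_{q}\; \sum_{i} \bigl\lVert \Pi\!\bigl(S_{\mathrm{ref}},\, p_i \oplus q\bigr) \;-\; p_i \oplus q \bigr\rVert^{2} \tag{1}
```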
ICP minimizes (1) iteratively, starting from a first guess q_0. The point projections onto S_ref are computed according to the old guess q_k, and a solution is found for the new guess q_{k+1}.
The point-to-point metric used by vanilla ICP converges slowly, so a point-to-line (or point-to-plane) metric is often used instead.
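A minimal sketch of running point-to-plane ICP, assuming Open3D's Python API; the file names, thresholds, and identity initial guess are placeholders, not from these notes:

```python
import numpy as np
import open3d as o3d

reg = o3d.pipelines.registration

source = o3d.io.read_point_cloud("source.pcd")   # placeholder inputs
target = o3d.io.read_point_cloud("target.pcd")

# The point-to-plane metric needs normals on the target cloud.
target.estimate_normals(
    o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30))

result = reg.registration_icp(
    source, target,
    max_correspondence_distance=0.05,   # also acts as a distance-based rejector
    init=np.eye(4),                     # first guess q_0
    estimation_method=reg.TransformationEstimationPointToPlane())
print(result.transformation)
```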
Overview of the process of registering a pair of point clouds: after the two views are in memory, a pre-processing sequence is applied in order to improve the signal-to-noise ratio or to recover information about the surface (normals) that is lost because of the sampling involved in the scanning procedure.
There are two available paths: the first consists of computing keypoints and their descriptors and then estimating a relative transformation between the two clouds; the second iteratively refines the alignment directly on the points.
- Path 1: Feature-based registration algorithms for computing initial alignments
    - Geometric feature descriptors are computed and matched in some high-dimensional space.
    - The more descriptive, unique, and persistent these descriptors are, the higher is the chance that all found matches are pairs of points which truly correspond to one another.
- Path 2: Iterative registration algorithms (see the sketch after this list)
    - Search for closest points (matching step).
    - Align the found point pairs (alignment step).
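A minimal sketch of this matching/alignment loop with the point-to-point metric; the function name, the NumPy/SciPy choice, the iteration count, and the rejection threshold are my own, not from the notes:

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_point_to_point(src, tgt, iters=50, max_dist=0.5):
    """src, tgt: (N, 3) and (M, 3) arrays. Returns a 4x4 rigid transform."""
    T = np.eye(4)
    cur = src.copy()
    tree = cKDTree(tgt)                         # target is fixed, build once
    for _ in range(iters):
        # Matching step: closest target point for every source point.
        dist, idx = tree.query(cur)
        keep = dist < max_dist                  # crude distance-based rejection
        p, q = cur[keep], tgt[idx[keep]]
        # Alignment step: closed-form point-to-point solution via SVD (Kabsch).
        p0, q0 = p - p.mean(0), q - q.mean(0)
        U, _, Vt = np.linalg.svd(p0.T @ q0)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                # avoid reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = q.mean(0) - R @ p.mean(0)
        cur = cur @ R.T + t
        step = np.eye(4); step[:3, :3] = R; step[:3, 3] = t
        T = step @ T
    return T
```

Library implementations wrap essentially this loop, adding the selection and rejection stages described below.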
Steps of ICP
Selection (Sampling Representative Subsets)
- Point clouds can become quite large; consequently, registering large point clouds is considerably more computationally expensive than registering clouds of smaller cardinality.
- In principle, two methods of data reduction can be distinguished (see the sketch after this list):
    - automatically extracting a small set of unique and repeatable keypoints
    - sampling the original data with respect to a desired target distribution
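A minimal sketch of both reduction strategies, assuming Open3D (its ISS keypoint detector and downsampling helpers); the input file and the 2 cm voxel size are placeholders:

```python
import open3d as o3d

pcd = o3d.io.read_point_cloud("scan.pcd")          # placeholder input

# Keypoint-based reduction: a small set of repeatable interest points.
keypoints = o3d.geometry.keypoint.compute_iss_keypoints(pcd)

# Distribution-based reduction: one point per occupied 2 cm voxel,
# which approximates a spatially uniform target distribution.
down = pcd.voxel_down_sample(voxel_size=0.02)

# Or simply keep every 10th point of the original ordering.
sparse = pcd.uniform_down_sample(every_k_points=10)
```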
Matching (Closest Points Correspondence Estimation)
- The process of pairing points from the source point cloud P to their closest neighbors in the target cloud Q.
- Data structures for rapid searches have been proposed, such as octrees and kd-trees.
- FLANN, an open-source library for fast approximate nearest neighbor searches, has been empirically shown to be one of the best-performing solutions (see the sketch after this list).
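A minimal sketch of the matching step with Open3D's FLANN-backed kd-tree; the class and method names are assumptions from Open3D's API, since the notes only mention kd-trees and FLANN generically:

```python
import numpy as np
import open3d as o3d

source = o3d.io.read_point_cloud("source.pcd")   # placeholder inputs
target = o3d.io.read_point_cloud("target.pcd")

# Build a kd-tree over the target once, then query it per source point.
tree = o3d.geometry.KDTreeFlann(target)
correspondences = []
for i, p in enumerate(np.asarray(source.points)):
    k, idx, dist2 = tree.search_knn_vector_3d(p, 1)   # 1-nearest-neighbor search
    correspondences.append((i, idx[0], np.sqrt(dist2[0])))
```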
Rejecting and Filtering Correspondences
- rejection based on distance (using a threshold)
- rejection based on normal compatibility
- rejection of pairs with duplicate target matches
    - It might happen that a point in the target cloud is assigned multiple corresponding source points. This rejector only keeps a single pair.
- rejection of pairs that contain boundary points
- RANSAC to estimate a transformation for random subsets of correspondences, keeping the largest consensus set (see the sketch after this list)
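A minimal NumPy sketch of two of these rejectors, distance and normal compatibility; the function, its thresholds, and the assumption of unit-length normals are my own:

```python
import numpy as np

def reject_pairs(src_pts, src_nrm, tgt_pts, tgt_nrm, pairs,
                 max_dist=0.05, max_angle_deg=45.0):
    """pairs: (K, 2) integer array of (source index, target index).
    Normals are assumed to be unit length."""
    p, q = src_pts[pairs[:, 0]], tgt_pts[pairs[:, 1]]
    ns, nt = src_nrm[pairs[:, 0]], tgt_nrm[pairs[:, 1]]

    # Distance-based rejection: drop pairs farther apart than a threshold.
    close = np.linalg.norm(p - q, axis=1) < max_dist

    # Normal-compatibility rejection: drop pairs whose normals disagree.
    cos_angle = np.einsum("ij,ij->i", ns, nt)
    compatible = cos_angle > np.cos(np.radians(max_angle_deg))

    return pairs[close & compatible]
```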
Alignment (Error Metrics and Transformation Estimation)
- Estimate the transformation which minimizes the error of the matched point pairs.
- Two main error metrics (see the sketch after this list):
    - point-to-point
    - point-to-plane
- (p_k, q_k) is the k-th of the N pair correspondences, with p_k from the source cloud and q_k its match in the target cloud.
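A sketch of the two metrics, writing the rigid transform as rotation R and translation t, with p_k from the source, q_k its match in the target, and n_k the surface normal at q_k (symbols beyond those in the notes are my own):

```latex
E_{\text{point-to-point}}(R, t) = \sum_{k=1}^{N} \bigl\lVert R\,p_k + t - q_k \bigr\rVert^{2}

E_{\text{point-to-plane}}(R, t) = \sum_{k=1}^{N} \Bigl( n_k^{\top} \bigl( R\,p_k + t - q_k \bigr) \Bigr)^{2}
```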
Coarse Alignment via Descriptor Matching
- Why is it needed?
    - ICP-based algorithms, due to their greedy nature, require a reliable initial alignment to avoid converging to bad local minima.
- Determine point-to-point correspondences between 3D keypoints extracted from both clouds via matching of the associated keypoint descriptors.
- This increases the chance of a successful alignment and makes ICP converge faster.
- Steps (see the sketch after this list)
    - Estimating keypoints
    - Describing keypoints (Feature Descriptors)
        - For each detected keypoint, a descriptor is computed so that correspondences between the two point clouds can be determined.
    - Correspondence estimation and filtering
    - Transformation estimation
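A sketch of this pipeline using FPFH descriptors and RANSAC-based correspondence filtering in Open3D; the helper name, voxel size, and thresholds are my own, and the exact positional signature of registration_ransac_based_on_feature_matching varies between Open3D releases:

```python
import open3d as o3d

def coarse_align(source, target, voxel=0.05):
    reg = o3d.pipelines.registration
    clouds, feats = [], []
    for pcd in (source, target):
        down = pcd.voxel_down_sample(voxel)
        down.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel, max_nn=30))
        # FPFH descriptors stand in for the keypoint-description step.
        feats.append(reg.compute_fpfh_feature(
            down,
            o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=100)))
        clouds.append(down)
    return reg.registration_ransac_based_on_feature_matching(
        clouds[0], clouds[1], feats[0], feats[1],
        True,                       # mutual_filter: keep only reciprocal matches
        1.5 * voxel,                # max correspondence distance
        reg.TransformationEstimationPointToPoint(False),
        3,                          # points per RANSAC sample
        [reg.CorrespondenceCheckerBasedOnEdgeLength(0.9),
         reg.CorrespondenceCheckerBasedOnDistance(1.5 * voxel)],
        reg.RANSACConvergenceCriteria(100000, 0.999))
```

The returned transformation would then serve as the initial guess for ICP refinement.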
RGBD Image Integration
To support large scenes, a hierarchical hashing structure is used, as introduced by the Integrater in ElasticReconstruction.
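A minimal sketch of the integration step, assuming Open3D's ScalableTSDFVolume (which implements this hierarchical hashing scheme); the file paths, voxel size, and camera poses are placeholders, with the poses assumed to come from the registration stage above:

```python
import numpy as np
import open3d as o3d

volume = o3d.pipelines.integration.ScalableTSDFVolume(
    voxel_length=4.0 / 512.0,   # roughly 8 mm voxels
    sdf_trunc=0.04,             # truncation distance of the signed distance field
    color_type=o3d.pipelines.integration.TSDFVolumeColorType.RGB8)

intrinsic = o3d.camera.PinholeCameraIntrinsic(
    o3d.camera.PinholeCameraIntrinsicParameters.PrimeSenseDefault)

# Placeholder: one identity pose; real camera-to-world poses come from registration.
camera_poses = [np.eye(4)]

for i, pose in enumerate(camera_poses):
    color = o3d.io.read_image(f"color/{i:05d}.jpg")   # placeholder paths
    depth = o3d.io.read_image(f"depth/{i:05d}.png")
    rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
        color, depth, depth_trunc=4.0, convert_rgb_to_intensity=False)
    # integrate() expects the extrinsic (world-to-camera) matrix.
    volume.integrate(rgbd, intrinsic, np.linalg.inv(pose))

mesh = volume.extract_triangle_mesh()
mesh.compute_vertex_normals()
```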