Coherent Point Drift Networks: Unsupervised Learning of Non-Rigid Point Set Registration

Lingjing Wang1,2,3
Xiang Li1,3
Jianchun Chen1,3
Yi Fang1,2,3

1NYU Multimedia and Visual Computing Lab
2New York University Abu Dhabi
3New York University Tandon School of Engineering


Given a new pair of source and target point sets, standard point set registration methods typically conduct an independent iterative search for the geometric transformation that aligns the source point set with the target one. This limits their use in applications that require real-time registration over large-volume datasets. This paper presents a novel method, named Coherent Point Drift Networks (CPD-Net), for the unsupervised learning of geometric transformations towards real-time non-rigid point set registration. In contrast to previous efforts (e.g. coherent point drift), CPD-Net learns a displacement field function that estimates the geometric transformation from a training dataset, and can consequently predict the desired geometric transformation to align previously unseen pairs without any additional iterative optimization. Furthermore, CPD-Net leverages the power of deep neural networks to fit arbitrary functions, adaptively accommodating different levels of complexity in the desired geometric transformation. In particular, CPD-Net comes with a theoretical guarantee that it learns a continuous displacement vector function, which avoids imposing the additional parametric smoothness constraints required in previous works. Our experiments verify the strong performance of CPD-Net for non-rigid point set registration on various 2D/3D datasets, even in the presence of significant displacement noise, outliers, and missing points. Our code will be available at this https URL.
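To make the idea concrete, below is a minimal PyTorch sketch of a CPD-Net-style displacement field predictor. The max-pooled PointNet-style encoder, the layer sizes, and the Chamfer-distance loss are illustrative assumptions for this sketch, not the exact configuration from the paper: both point sets are encoded into global shape descriptors, and a coordinate-conditioned MLP maps each source point to a displacement vector, trained without correspondence supervision.

    # Minimal, illustrative sketch of a CPD-Net-style displacement field
    # predictor. Layer sizes, the max-pooled encoder, and the Chamfer loss
    # are assumptions for illustration, not the authors' exact design.
    import torch
    import torch.nn as nn

    class DisplacementFieldNet(nn.Module):
        def __init__(self, dim=2, feat_dim=256):
            super().__init__()
            # Shared per-point encoder, max-pooled into a global descriptor.
            self.encoder = nn.Sequential(
                nn.Linear(dim, 64), nn.ReLU(),
                nn.Linear(64, 128), nn.ReLU(),
                nn.Linear(128, feat_dim),
            )
            # Decoder maps (source coordinate, source feature, target feature)
            # to a displacement vector; as a continuous function of the input
            # coordinates, it produces a coherent displacement field.
            self.decoder = nn.Sequential(
                nn.Linear(dim + 2 * feat_dim, 256), nn.ReLU(),
                nn.Linear(256, 128), nn.ReLU(),
                nn.Linear(128, dim),
            )

        def forward(self, source, target):
            # source, target: (B, N, dim) point sets.
            f_src = self.encoder(source).max(dim=1).values  # (B, feat_dim)
            f_tgt = self.encoder(target).max(dim=1).values  # (B, feat_dim)
            n = source.shape[1]
            cond = torch.cat([f_src, f_tgt], dim=-1)        # (B, 2*feat_dim)
            cond = cond.unsqueeze(1).expand(-1, n, -1)      # (B, N, 2*feat_dim)
            drift = self.decoder(torch.cat([source, cond], dim=-1))
            return source + drift                           # warped source

    def chamfer_distance(a, b):
        # Symmetric Chamfer distance: an unsupervised alignment loss,
        # so no ground-truth correspondences or transformations are needed.
        d = torch.cdist(a, b)                               # (B, N, M)
        return d.min(dim=2).values.mean() + d.min(dim=1).values.mean()

    if __name__ == "__main__":
        net = DisplacementFieldNet(dim=2)
        src = torch.rand(8, 100, 2)
        tgt = torch.rand(8, 100, 2)
        loss = chamfer_distance(net(src, tgt), tgt)
        loss.backward()
        print(loss.item())

Once trained on a dataset of shape pairs, a single forward pass predicts the warp for an unseen pair, which is what removes the per-pair iterative optimization of classical methods such as coherent point drift.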



News



Paper

Lingjing Wang, Xiang Li, Jianchun Chen, Yi Fang

Coherent Point Drift Networks: Unsupervised Learning of Non-Rigid Point Set Registration

Preprint, 2019

[Paper]
[Bibtex]


Model



Results







This webpage template was borrowed from MEPS.