%0 Journal Article
%A Ranade, S.
%A Yu, X.
%A Kakkar, S.
%A Miraldo, P.
%A Ramalingam, S.
%E Ishikawa, H.
%E Liu, C.-L.
%E Pajdla, T.
%E Shi, J.
%T Mapping of Sparse 3D Data Using Alternating Projection
%J Lecture Notes in Computer Science
%V 12622
%D 2021
%M OSTI ID: 10296127
%X We propose a novel technique to register sparse 3D scans in the absence of texture. While existing methods such as KinectFusion or Iterative Closest Point (ICP) rely heavily on dense point clouds, this task is particularly challenging under sparse conditions without RGB data. Sparse, texture-less data does not provide high-quality boundary signals, which prohibits the use of correspondences from corners, junctions, or boundary lines. Moreover, with sparse data it is incorrect to assume that the same point will be captured in two consecutive scans. We take a different approach and first re-parameterize the point cloud using a large number of line segments. In this re-parameterized data, there exist a large number of line-intersection (rather than correspondence) constraints that allow us to solve the registration task. We propose a two-step alternating projection algorithm, formulating registration as the simultaneous satisfaction of intersection and rigidity constraints. The proposed approach outperforms other top-scoring algorithms on both Kinect and LiDAR datasets. On Kinect data, we can use 100X-downsampled sparse data and still outperform competing methods operating on full-resolution data.
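The abstract describes a two-step alternating projection between line-intersection and rigidity constraints. As a rough illustration only (not the authors' implementation), the sketch below alternates between (1) moving each transformed scan line the minimum amount needed to touch its matched map line and (2) projecting the resulting free update back onto a rigid transform. The line matching, the closest-point formula for 3D lines, and the Kabsch/SVD projection onto SE(3) are assumptions made for this sketch.

```python
# Illustrative sketch of an alternating-projection registration loop,
# assuming each scan line segment (treated as an infinite line) has already
# been matched to a map line. Not the authors' implementation.
import numpy as np

def closest_points_between_lines(p1, d1, p2, d2, eps=1e-9):
    """Closest point pair between infinite 3D lines p1 + s*d1 and p2 + u*d2."""
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < eps:                      # nearly parallel lines
        s, u = 0.0, e / c if c > eps else 0.0
    else:
        s = (b * e - c * d) / denom
        u = (a * e - b * d) / denom
    return p1 + s * d1, p2 + u * d2

def rigid_fit(src, dst):
    """Kabsch: rigid (R, t) minimizing ||R @ src_i + t - dst_i|| (projection onto SE(3))."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c

def alternating_projection(scan_lines, map_lines, iters=50):
    """scan_lines / map_lines: lists of (point, unit_direction) for matched line pairs."""
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iters):
        src_pts, tgt_pts = [], []
        for (ps, ds), (pm, dm) in zip(scan_lines, map_lines):
            # Step 1 (intersection projection): find where the transformed scan
            # line and its matched map line come closest, i.e. the minimal
            # displacement that would make the two lines intersect.
            qs, qm = closest_points_between_lines(R @ ps + t, R @ ds, pm, dm)
            src_pts.append(R.T @ (qs - t))    # same point expressed in the scan frame
            tgt_pts.append(qm)                # target point on the map line
        # Step 2 (rigidity projection): snap the free update back onto a rigid transform.
        R, t = rigid_fit(np.array(src_pts), np.array(tgt_pts))
    return R, t
```

With a handful of matched, non-parallel line pairs, repeated iterations drive the transformed scan lines onto their map counterparts while each update remains a rigid motion, which is the general shape of the intersection-plus-rigidity formulation the abstract describes.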