PeRL studies autonomous navigation & mapping for mobile robots in a priori unknown environments.
Visually bootstrapped generalized ICP
Summary
Gaurav Pandey, James R. McBride, Silvio Savarese, and Ryan M. Eustice, Visually bootstrapped generalized ICP. In Proceedings of the IEEE International Conference on Robotics and Automation, pages 2660-2667, Shanghai, China, May 2011.
Abstract
This paper reports a novel algorithm for bootstrapping the automatic registration of unstructured 3D point clouds collected using co-registered 3D lidar and omnidirectional camera imagery. Here, we exploit the co-registration of the 3D point cloud with the available camera imagery to associate high-dimensional feature descriptors, such as scale-invariant feature transform (SIFT) or speeded-up robust features (SURF), to the 3D points. We first establish putative point correspondences in the high-dimensional feature space and then use these correspondences in a random sample consensus (RANSAC) framework to obtain an initial rigid-body transformation that aligns the two scans. This initial transformation is then refined in a generalized iterative closest point (ICP) framework. The proposed method is completely data-driven and does not require any initial guess of the transformation. We present results from a real-world dataset collected by a vehicle equipped with a 3D laser scanner and an omnidirectional camera.
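The bootstrapping stage described in the abstract (descriptor-space matching followed by RANSAC estimation of a rigid-body transform) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each 3D point already carries a descriptor vector (e.g. SIFT projected from the co-registered imagery), uses brute-force nearest-neighbour matching, and estimates each candidate transform from a 3-point sample via the SVD-based Kabsch method. The generalized-ICP refinement step that follows in the paper is omitted here.

```python
import numpy as np

def rigid_transform(A, B):
    """Least-squares rotation R and translation t mapping points A onto B (Kabsch)."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)               # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cB - R @ cA
    return R, t

def ransac_align(P, Q, desc_P, desc_Q, iters=500, thresh=0.1, rng=None):
    """Estimate a rigid transform aligning scan P to scan Q.

    P, Q           : (N,3), (M,3) 3D points from the two scans
    desc_P, desc_Q : per-point feature descriptors (hypothetical inputs;
                     in the paper these come from SIFT/SURF on camera imagery)
    """
    rng = np.random.default_rng(0) if rng is None else rng
    # Putative correspondences: nearest neighbour in descriptor space.
    dists = np.linalg.norm(desc_P[:, None, :] - desc_Q[None, :, :], axis=2)
    nn = dists.argmin(axis=1)
    src, dst = P, Q[nn]
    best_R, best_t, best_inliers = np.eye(3), np.zeros(3), 0
    for _ in range(iters):
        idx = rng.choice(len(src), size=3, replace=False)  # minimal 3-point sample
        R, t = rigid_transform(src[idx], dst[idx])
        err = np.linalg.norm(src @ R.T + t - dst, axis=1)  # residual per correspondence
        n = int((err < thresh).sum())
        if n > best_inliers:
            best_inliers, best_R, best_t = n, R, t
    return best_R, best_t
```

In practice the returned `(best_R, best_t)` would seed the generalized-ICP refinement; the paper's point is that this seed is obtained purely from the data, with no external initial guess.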
Bibtex entry
@INPROCEEDINGS { gpandey-2011b,
AUTHOR = { Gaurav Pandey and James R. McBride and Silvio Savarese and Ryan M. Eustice },
TITLE = { Visually bootstrapped generalized {ICP} },
BOOKTITLE = { Proceedings of the IEEE International Conference on Robotics and Automation },
YEAR = { 2011 },
MONTH = { May },
ADDRESS = { Shanghai, China },
PAGES = { 2660--2667 },
}