PeRL studies autonomous navigation and mapping for mobile robots in a priori unknown environments.
Ford campus vision and lidar data set
Summary
Gaurav Pandey, James R. McBride and Ryan M. Eustice, Ford campus vision and lidar data set. International Journal of Robotics Research, 30(13):1543-1552, 2011.
Abstract
In this paper we describe a data set collected by an autonomous ground vehicle testbed based upon a modified Ford F-250 pickup truck. The vehicle is outfitted with professional (Applanix POS-LV) and consumer (Xsens MTi-G) inertial measurement units, a Velodyne three-dimensional lidar scanner, two push-broom forward-looking Riegl lidars, and a Point Grey Ladybug3 omnidirectional camera system. Here we present the time-registered data from these sensors mounted on the vehicle, collected while driving around the Ford Research Campus and downtown Dearborn, MI, during November--December 2009. The vehicle path trajectory in these data sets contains several large- and small-scale loop closures, which should be useful for testing various state-of-the-art computer vision and simultaneous localization and mapping algorithms.
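Time registration of this kind typically means associating each measurement from one asynchronous sensor stream with the closest-in-time measurement from another. The sketch below illustrates the general idea with nearest-timestamp matching on synthetic timestamps; it is a generic illustration under assumed example rates, not the dataset's actual tooling or file format.

```python
from bisect import bisect_left

def nearest_timestamp_match(ref_times, query_time):
    """Return the index into sorted ref_times whose timestamp is closest to query_time."""
    i = bisect_left(ref_times, query_time)
    if i == 0:
        return 0
    if i == len(ref_times):
        return len(ref_times) - 1
    # Choose whichever neighbor (before or after) is closer in time.
    return i if ref_times[i] - query_time < query_time - ref_times[i - 1] else i - 1

# Hypothetical example: align camera frames to the nearest lidar scans.
lidar_times = [0.00, 0.10, 0.20, 0.30]   # assumed 10 Hz lidar scan times (s)
camera_times = [0.04, 0.16, 0.29]        # assumed asynchronous camera frame times (s)
matches = [nearest_timestamp_match(lidar_times, t) for t in camera_times]
print(matches)  # -> [0, 2, 3]
```

In practice one would also reject matches whose time gap exceeds a threshold (e.g., half the scan period) rather than always accepting the nearest neighbor.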
Bibtex entry
@ARTICLE{gpandey-2011a,
  AUTHOR  = {Gaurav Pandey and James R. McBride and Ryan M. Eustice},
  TITLE   = {Ford campus vision and lidar data set},
  JOURNAL = {International Journal of Robotics Research},
  YEAR    = {2011},
  VOLUME  = {30},
  NUMBER  = {13},
  PAGES   = {1543--1552},
}