PeRL studies autonomous navigation and mapping for mobile robots in a priori unknown environments.

At a Glance

Synopsis

Below are the software packages and datasets that we have released to the public.

Dig Deeper

University of Michigan North Campus Long-Term Vision and LIDAR Dataset


The Segway robot used to collect this dataset on the University of Michigan's North Campus.

This is a large-scale, long-term autonomy dataset for robotics research collected on the University of Michigan's North Campus. It consists of omnidirectional imagery, 3D lidar, planar lidar, and proprioceptive sensors for odometry, all collected using a Segway robot. The dataset was designed to facilitate research on long-term autonomous operation in changing environments. It comprises 27 sessions spaced approximately biweekly over the course of 15 months. The sessions repeatedly explore the campus, both indoors and outdoors, on varying trajectories and at different times of day across all four seasons. As a result, the dataset captures many challenging elements, including moving obstacles (e.g., pedestrians, bicyclists, and cars), changing lighting, varying viewpoints, seasonal and weather changes (e.g., falling leaves and snow), and long-term structural changes caused by construction. To further facilitate research, we also provide ground-truth pose for all sessions in a single frame of reference.
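Because the ground-truth poses and the sensor streams are sampled at different rates, a common first step when working with a dataset like this is to interpolate the ground-truth pose at each sensor timestamp. The sketch below shows one minimal way to do this in Python for a planar (x, y, yaw) pose; the sample tuples are hypothetical and do not reflect the dataset's actual file format or field order.

```python
import bisect
import math

def interp_pose(poses, t):
    """Linearly interpolate an (x, y, yaw) pose at time t.

    poses: list of (time, x, y, yaw) tuples sorted by time.
    Positions are interpolated linearly; yaw is interpolated along the
    shortest angular difference to handle wrap-around at +/- pi.
    """
    times = [p[0] for p in poses]
    i = bisect.bisect_left(times, t)
    if i == 0:                      # before the first sample: clamp
        return poses[0][1:]
    if i == len(poses):             # after the last sample: clamp
        return poses[-1][1:]
    t0, x0, y0, yaw0 = poses[i - 1]
    t1, x1, y1, yaw1 = poses[i]
    a = (t - t0) / (t1 - t0)
    # shortest signed angular difference between the two yaw samples
    dyaw = math.atan2(math.sin(yaw1 - yaw0), math.cos(yaw1 - yaw0))
    return (x0 + a * (x1 - x0), y0 + a * (y1 - y0), yaw0 + a * dyaw)

# Hypothetical ground-truth samples: (time [s], x [m], y [m], yaw [rad])
gt = [(0.0, 0.0, 0.0, 0.0), (1.0, 1.0, 0.0, math.pi / 2)]
x, y, yaw = interp_pose(gt, 0.5)
```

For the full 6-DOF poses provided with the dataset, the same idea applies, but roll, pitch, and yaw are better interpolated via quaternions (e.g., slerp) rather than independently per angle.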


Left: the path driven by the Segway during a typical session. Right: the accumulated SLAM graph.

Download

The University of Michigan North Campus Long-Term Vision and LIDAR Dataset is available for download here.