

Real-time visual SLAM for autonomous underwater hull inspection using visual saliency

Summary


Ayoung Kim and Ryan M. Eustice, Real-time visual SLAM for autonomous underwater hull inspection using visual saliency. IEEE Transactions on Robotics, 29(3):719-733, 2013.

Abstract

This paper reports on a real-time monocular visual simultaneous localization and mapping (SLAM) algorithm and results for its application in the area of autonomous underwater ship hull inspection. The proposed algorithm overcomes some of the specific challenges associated with underwater visual SLAM, namely limited field-of-view imagery and feature-poor regions. It does so by exploiting our SLAM navigation prior within the image registration pipeline and by being selective about which imagery is considered informative in terms of our visual SLAM map. Novel online bag-of-words measures for intra- and inter-image saliency are introduced and are shown to be useful for image key-frame selection, information-gain-based link hypothesis, and novelty detection. Results from three real-world hull inspection experiments evaluate the overall approach, including one survey comprising a 3.4 hour / 2.7 km long trajectory.
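To make the saliency idea concrete, the sketch below shows one way an entropy-style intra-image bag-of-words saliency score could gate key-frame selection. This is a minimal illustration under assumptions made here, not the paper's implementation: the function names (bow_histogram, intra_image_saliency, is_keyframe_candidate), the vocabulary size, and the threshold are all placeholders, and the paper's actual saliency measures and registration pipeline are more involved.

    import numpy as np

    def bow_histogram(word_ids, vocab_size):
        # Normalized bag-of-words histogram for one image; word_ids are the
        # vocabulary indices of the image's local feature descriptors
        # (e.g., features quantized against a pre-trained visual vocabulary).
        hist = np.bincount(word_ids, minlength=vocab_size).astype(float)
        total = hist.sum()
        return hist / total if total > 0 else hist

    def intra_image_saliency(hist):
        # Entropy of the word histogram, normalized to [0, 1]: texture-rich
        # imagery spreads mass over many words (high score), while
        # feature-poor hull regions concentrate on a few words (low score).
        p = hist[hist > 0]
        if p.size == 0:
            return 0.0
        return float(-np.sum(p * np.log(p)) / np.log(hist.size))

    def is_keyframe_candidate(word_ids, vocab_size=500, threshold=0.6):
        # Illustrative key-frame gate: only sufficiently salient images are
        # proposed for addition to the visual SLAM map (the threshold value
        # is a placeholder, not taken from the paper).
        return intra_image_saliency(bow_histogram(word_ids, vocab_size)) >= threshold

In this toy gate, a feature-poor image whose descriptors fall into only a handful of visual words scores low and would be skipped, which mirrors the paper's motivation for being selective about which imagery enters the visual SLAM map.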

BibTeX entry

@ARTICLE { akim-2013a,
    AUTHOR = { Ayoung Kim and Ryan M. Eustice },
    TITLE = { Real-time visual {SLAM} for autonomous underwater hull inspection using visual saliency },
    JOURNAL = { IEEE Transactions on Robotics },
    YEAR = { 2013 },
    VOLUME = { 29 },
    NUMBER = { 3 },
    PAGES = { 719--733 },
}