
Autonomous Ship Hull Inspection

Current State of the Art


Present-day methods for ship-hull inspection are time-consuming and imprecise. PeRL is working on automating this task with AUVs.

Present-day means for ship hull and port facility inspection require either putting divers in the water or piloting a remotely operated vehicle (ROV) over the area of interest; both are manpower intensive and generally cannot guarantee 100% survey coverage. The Navy would benefit from automating this task, allowing autonomous robotic inspection of its ships and port facilities for foreign objects, such as limpet mines or improvised explosive devices (IEDs), on a routine, round-the-clock basis. Automating this task is challenging, however, and is compounded by the fact that the areas around ships in berth are severely confined, cluttered, and complex sensing environments (acoustically, optically, and magnetically). Current tethered robotic inspection systems present issues of snagging, maneuver degradation, and tether management, all of which make maneuvering around a ship at pier difficult. Moreover, current robotic inspection methods require human-in-the-loop intervention for both sensory interpretation and control (piloting). Navigation feedback in these scenarios is typically performed using acoustic transponder-based time-of-flight ranging, which necessitates setup and calibration of the associated acoustic-beacon navigation infrastructure and therefore vitiates our ability to rapidly and repeatably inspect multiple underwater structures.
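For reference, the transponder-based ranging that this infrastructure supports works by converting acoustic travel times to pre-surveyed beacons into ranges and then solving for a position fix. The sketch below illustrates that baseline with a simple least-squares solve; the one-way travel-time model, fixed sound speed, and function name are assumptions made for illustration, not a description of any fielded system.

```python
import numpy as np
from scipy.optimize import least_squares

def tof_ranges_to_position(beacons, travel_times, sound_speed=1500.0, x0=None):
    """Solve for a vehicle fix from acoustic travel times to fixed, surveyed beacons.

    beacons: (N, 3) beacon positions in meters; travel_times: (N,) one-way times in seconds.
    A minimal illustration of transponder-based time-of-flight ranging; real long-baseline
    systems must also handle two-way travel, sound-speed profiles, and outlier rejection.
    """
    beacons = np.asarray(beacons, dtype=float)
    ranges = sound_speed * np.asarray(travel_times, dtype=float)
    if x0 is None:
        x0 = beacons.mean(axis=0)  # crude initial guess at the centroid of the beacons

    def residual(p):
        return np.linalg.norm(beacons - p, axis=1) - ranges

    return least_squares(residual, x0).x
```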

Technical Approach


Imagery of the hull of the USS Saratoga collected during AUVFest08 with the HAUV1B and PeRL's camera system.

The technical objective of this work is to develop an optical/acoustic real-time Feature-Based Navigation (FBN) capability for explosive ordnance disposal (EOD) autonomous ship-hull inspection. FBN is a vital requirement for autonomous robotic ship-hull inspection. Current robotic inspection methods require human-in-the-loop intervention for both sensory interpretation and control (piloting). Navigation feedback in these scenarios is typically performed using acoustic transponder-based time-of-flight ranging, which necessitates setup, calibration, and infrastructure, and thereby vitiates the Navy's ability to rapidly and repeatably inspect multiple underwater structures.

We are currently in Year 1 of a three-year project to develop a real-time feature-based navigation system. Years 1 and 2 are focused on developing the overall mapping framework, termed visually augmented navigation (VAN), using vision as the main perceptual sensor. Year 3 of the project will investigate transitioning the VAN framework to sonar-based perception, which will yield a larger standoff-range sensing capability in turbid water.
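At its core, this kind of framework fuses dead-reckoned vehicle odometry with camera-derived relative-pose constraints in a pose graph. The snippet below is a minimal sketch of that idea using GTSAM's Python bindings; the library choice, integer keys, and noise values are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np
import gtsam

# Toy pose graph with three vehicle poses along the hull.
graph = gtsam.NonlinearFactorGraph()

# Noise sigmas are ordered (rotation about x, y, z; then translation x, y, z); values are illustrative.
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-3))
odo_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.02, 0.02, 0.02, 0.10, 0.10, 0.05]))
cam_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.01, 0.01, 0.01, 0.05, 0.05, 0.02]))

# Anchor the first pose at the hull-relative origin.
graph.add(gtsam.PriorFactorPose3(0, gtsam.Pose3(), prior_noise))

# DVL/attitude odometry: nominal 1 m hull-parallel steps between consecutive poses.
step = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.0, 0.0, 0.0))
graph.add(gtsam.BetweenFactorPose3(0, 1, step, odo_noise))
graph.add(gtsam.BetweenFactorPose3(1, 2, step, odo_noise))

# Camera-derived constraint from registering overlapping hull imagery (a loop closure).
cam_meas = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(2.0, 0.0, 0.0))
graph.add(gtsam.BetweenFactorPose3(0, 2, cam_meas, cam_noise))

# Initial guess with accumulated dead-reckoning drift.
initial = gtsam.Values()
initial.insert(0, gtsam.Pose3())
initial.insert(1, gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.10, 0.05, 0.0)))
initial.insert(2, gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(2.30, 0.15, 0.0)))

# Optimization pulls the drifted trajectory back onto the constraints.
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(2).translation())
```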


HAUV outfitted with PeRL's 12-bit GigE camera and light.

Experimental Validation and Testing: We are currently collaborating with MIT and Bluefin Robotics to prototype and test our VAN algorithms on real-world ship hull inspection data using the Hovering-AUV (HAUV) [1]. The vehicle is designed around a Doppler-based, hull-relative navigation strategy that uses a 1200 kHz DVL mounted on a tilt actuator to measure vehicle velocities with respect to the ship hull for positioning. Open-water navigation to the hull is achieved by tilting the DVL toward the seafloor in bottom-lock mode, aided by GPS surface fixes. Hull sensing is achieved using a Dual frequency IDentification SONar (DIDSON); this sonar modality was chosen for its ability to see through turbid water with high resolution. As an operational requirement, the DIDSON needs a grazing angle of 15°–20°; therefore, it too is mounted on a tilt actuator so that this grazing angle can be maintained.
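In bottom-lock or hull-lock, positioning amounts to dead reckoning: body-frame DVL velocities are rotated by the vehicle's attitude and integrated over time. Below is a minimal planar sketch of that integration; the heading-only rotation, fixed time step, and function names are simplifying assumptions for illustration.

```python
import numpy as np

def rot_z(yaw):
    """Heading-only rotation from the body frame to a hull-aligned frame (planar simplification)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def dead_reckon(v_body, yaw, dt=0.2, x0=np.zeros(3)):
    """Integrate DVL hull-relative velocities (N x 3, body frame) into a position track.

    This is the open-loop estimate that camera/sonar registrations later correct;
    without them, small velocity and heading errors accumulate as drift.
    """
    x = np.array(x0, dtype=float)
    track = [x.copy()]
    for v, psi in zip(np.atleast_2d(v_body), np.atleast_1d(yaw)):
        x = x + rot_z(psi) @ np.asarray(v, dtype=float) * dt
        track.append(x.copy())
    return np.array(track)

# Example: 10 s of 0.25 m/s hull-parallel motion at constant heading.
v = np.tile([0.25, 0.0, 0.0], (50, 1))
print(dead_reckon(v, np.zeros(50))[-1])   # roughly [2.5, 0, 0]
```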


Two camera configurations on the HAUV

To participate in hull-search experiments with the HAUV testbed, PeRL developed a strap-on camera bottle that can be easily mounted to the vehicle. The VAN hardware consists of a 12-bit Prosilica GigE camera and a remote light. The system can be run topside from a laptop computer over the HAUV's fiber-optic tether. Two different camera configurations are available so that the best imagery can be obtained for a given mission.

Results

USS Saratoga (2013.08): The video below shows a multi-session SLAM demonstration for in-water hull inspection performed in August 2013 on the aircraft carrier USS Saratoga. A large-scale SLAM prior map was generated by merging six different dives from May 2013. In the August demo, the robot successfully localized itself in real time within the prior map by visually recognizing superstructure above the water surface and registering itself to the prior graph from May 2013.
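Localizing into a prior map requires first recognizing a previously mapped view before a geometric registration can anchor the live graph to it. The sketch below shows one simple way to do that candidate search, matching ORB descriptors of the current frame against stored prior-session keyframes with OpenCV; the descriptor choice, ratio-test threshold, and keyframe database are illustrative assumptions rather than the system's actual place-recognition front end (and in practice the keyframe descriptors would be precomputed and indexed).

```python
import cv2

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)

def best_prior_keyframe(current_img, prior_keyframes, ratio=0.8):
    """Find the prior-map keyframe most similar to the current camera frame.

    prior_keyframes: list of grayscale images saved with the prior-session graph.
    Returns (index, number of good matches); a strong candidate would then be
    geometrically registered to add a constraint into the prior pose graph.
    """
    _, desc_cur = orb.detectAndCompute(current_img, None)
    best_idx, best_score = -1, 0
    for i, keyframe in enumerate(prior_keyframes):
        _, desc_kf = orb.detectAndCompute(keyframe, None)
        if desc_cur is None or desc_kf is None:
            continue
        # Lowe-style ratio test on 2-nearest-neighbor Hamming matches.
        pairs = matcher.knnMatch(desc_cur, desc_kf, k=2)
        good = [p[0] for p in pairs if len(p) == 2 and p[0].distance < ratio * p[1].distance]
        if len(good) > best_score:
            best_idx, best_score = i, len(good)
    return best_idx, best_score
```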

SS Curtiss (2012.02): Below is the visual SLAM result on a larger vessel, the SS Curtiss, surveyed in February 2012. The SS Curtiss is a 183 m long, single-screw roll-on/roll-off container ship currently stationed at the U.S. Naval Station in San Diego, California. This SLAM result uses the camera only, without sonar measurements. Over this 2.7 km, 3.4 hour mission, camera registrations reduced the navigation drift, successfully obtaining loop closures across time separations of more than 3 hours.
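Each such loop closure comes from registering two overlapping hull images: matched features yield an essential matrix, which is decomposed into a relative rotation and a direction of translation that can enter the pose graph as a constraint. Below is a minimal sketch of that two-view registration with OpenCV; the intrinsics, feature detector, and RANSAC settings are placeholder assumptions.

```python
import cv2
import numpy as np

def register_pair(img_a, img_b, K):
    """Estimate the relative camera motion between two overlapping hull images.

    K: 3x3 camera intrinsic matrix. Returns (R, t_unit, num_inliers); the
    translation is recovered only up to scale, so DVL/depth sensors must
    supply metric scale in a full system.
    """
    orb = cv2.ORB_create(nfeatures=2000)
    kps_a, desc_a = orb.detectAndCompute(img_a, None)
    kps_b, desc_b = orb.detectAndCompute(img_b, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(desc_a, desc_b)

    pts_a = np.float32([kps_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kps_b[m.trainIdx].pt for m in matches])

    # Robustly fit the essential matrix, then decompose it into R and a unit-length t.
    E, inlier_mask = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, pose_mask = cv2.recoverPose(E, pts_a, pts_b, K, mask=inlier_mask)
    return R, t.ravel(), int((pose_mask > 0).sum())
```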


SLAM result of SS Curtiss

Below is a video clip of a mission on SS Curtiss with wider track-line spacing (for 100% sonar coverage).


Feb. 2011 real-time SLAM results on the SS Curtiss in San Diego.

Picture of the hull inspection tests on the USCG Cutter Seneca.

USCG Seneca (2011.04): Depicted below is a video summarizing results from an April 2011 test of the real-time visual SLAM system on the hull of the US Coast Guard Cutter Seneca in Boston Harbor. This test was conducted using the HULS3 vehicle from Bluefin Robotics and is joint work with MIT. The video shows the combined camera/sonar SLAM results from UMich and MIT, and also shows the SLAM estimate being used to control the HULS3 vehicle back to an operator-selected waypoint on the ship's hull.


April 2011 real-time SLAM results on the USCG Cutter Seneca in Boston Harbor.