Joshua Mangelson

PhD Candidate in Robotics at the University of Michigan
Graduate Student Researcher at the Perceptual Robotics Laboratory and the ROAHM Lab

My research focuses on perception, planning, and state-estimation for mobile field robotic systems. Specifically, I am interested in developing algorithms that apply theory from sensor fusion, machine learning, and convex optimization in ways that provide mathematical guarantees, so that autonomous systems can be trusted in real-world environments. I am especially interested in developing algorithms for autonomous underwater vehicles and autonomous heterogeneous multi-agent systems. [Google Scholar] [CV]


Publications

Communication Constrained Trajectory Alignment for Multi-Agent Inspection via Linear Programming
Joshua G. Mangelson, Ram Vasudevan, and Ryan M. Eustice
In Proceedings of the IEEE/MTS OCEANS Conference and Exhibition (OCEANS 2018, Charleston, South Carolina) [abstract] [BibTeX]

Pairwise Consistent Measurement Set Maximization for Robust Multi-robot Map Merging
Joshua G. Mangelson, Derrick Dominic, Ryan M. Eustice, and Ram Vasudevan
In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2018, Brisbane, Australia) [pdf] [BibTeX] [code]
Won the ICRA 2018 Best Paper Award on Multi-Robot Systems!

Legged Robot State-Estimation Through Combined Forward Kinematic and Preintegrated Contact Factors
Ross Hartley, Joshua G. Mangelson, Lu Gan, Maani Ghaffari Jadidi, Jeffrey M. Walls, Ryan M. Eustice, and Jessy W. Grizzle
In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2018, Brisbane, Australia) [pdf] [BibTeX]

Robust Visual Fiducials For Skin-to-Skin Relative Ship Pose Estimation
Joshua G. Mangelson, Ryan W. Wolcott, Paul Ozog, and Ryan M. Eustice
In Proceedings of the IEEE/MTS OCEANS Conference and Exhibition (OCEANS 2016, Monterey, California) [pdf] [BibTeX] [poster]


Current Projects

Autonomous Ship Hull Inspection

I am currently working on autonomous ship hull inspection as part of a project funded by the Office of Naval Research. Using the Hovering Autonomous Underwater Vehicle (HAUV) designed by Bluefin Robotics, I am developing perception, estimation, and planning algorithms that enable multiple HAUVs to cooperatively map the bottom of a ship hull in real time.

Perception For Bipedal Locomotion

I am also currently working on a project in collaboration with Jessy Grizzle and his bipedal locomotion lab. They develop state-of-the-art control algorithms that enable robotic vehicles to robustly walk and run. We are adding perceptual sensors to the Agility Robotics bipedal robot, Cassie, enabling it to run over rough terrain and through complex outdoor environments. This project is funded by the Toyota Research Institute.

Choosing Consistent Measurements for Robust Multi-Robot Map Merging

The merging of robotic maps when performing Multi-Robot Simultaneous Localization and Mapping (SLAM) is complicated by outlier measurements that suggest an incorrect alignment between robot trajectories, and by the absence of a prior on relative pose. We have developed a method that uses efficient maximum-clique algorithms to select a set of pairwise consistent measurements that can then be used to accurately merge maps. Because our method searches for a set of mutually consistent measurements, it does not require a prior, outperforms existing "Robust SLAM" methods, and can handle situations with large percentages of outliers. [more]
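As a toy illustration of the idea (hypothetical 1-D "measurements" and threshold; the actual method checks full relative-pose consistency and uses an efficient maximum-clique solver), pairwise consistency can be encoded as a graph whose maximum clique is the largest mutually consistent measurement set:

```python
from itertools import combinations

# Hypothetical toy example: each inter-robot loop closure is reduced to a
# 1-D relative offset; two measurements are "pairwise consistent" when
# their offsets agree within a threshold. The three offsets near 1.0 are
# mutually consistent; the two near 5.0 disagree with them.
measurements = [1.0, 1.1, 0.9, 5.0, 5.2]
THRESH = 0.5

def consistent(a, b):
    return abs(a - b) <= THRESH

# Build the consistency graph: one edge per pairwise-consistent pair.
n = len(measurements)
adj = {i: set() for i in range(n)}
for i, j in combinations(range(n), 2):
    if consistent(measurements[i], measurements[j]):
        adj[i].add(j)
        adj[j].add(i)

# Brute-force maximum clique (fine at toy scale; real problems need an
# efficient exact or heuristic max-clique solver).
def max_clique(adj):
    nodes = list(adj)
    for r in range(len(nodes), 0, -1):
        for subset in combinations(nodes, r):
            if all(v in adj[u] for u, v in combinations(subset, 2)):
                return list(subset)
    return []

inliers = max_clique(adj)  # indices of the accepted measurements
```

The maximum clique here is the three measurements near 1.0; everything outside the clique is rejected as an outlier without ever needing a prior on the relative trajectory alignment.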

Contact/Forward Kinematic Legged Robot Odometry

Accurately estimating the motion of a robotic agent is an essential prerequisite for autonomous interaction with real-world environments. By exploiting the fact that legged robots come into rigid contact with the environment, we have developed a method that fuses forward kinematic (FK) information with inertial measurement unit (IMU) data via sparse pose graph optimization. [more]
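As a rough sketch of the fusion idea (a toy 1-D example with made-up numbers, not the actual factor formulation), both IMU preintegration and FK contact measurements can be expressed as relative-motion factors between poses and solved jointly by weighted least squares:

```python
import numpy as np

# Toy 1-D pose graph (hypothetical values): states x0..x2 are positions;
# each factor says x_j - x_i should equal a measured delta z with noise
# sigma. IMU preintegration and forward-kinematic (FK) contact factors
# both reduce to constraints of this form, each with its own uncertainty.
factors = [
    (0, 1, 1.00, 0.10),  # (i, j, z, sigma) -- IMU factor
    (0, 1, 1.05, 0.05),  # FK factor over the same interval
    (1, 2, 0.95, 0.10),  # IMU
    (1, 2, 1.00, 0.05),  # FK
]
n = 3
A = np.zeros((len(factors) + 1, n))
b = np.zeros(len(factors) + 1)
for row, (i, j, z, sigma) in enumerate(factors):
    w = 1.0 / sigma            # whiten each residual by its noise
    A[row, i], A[row, j], b[row] = -w, w, w * z
A[-1, 0], b[-1] = 1.0, 0.0     # anchor x0 = 0 to fix the gauge freedom

x, *_ = np.linalg.lstsq(A, b, rcond=None)  # fused trajectory estimate
```

The lower-noise FK factors pull each interval's estimate toward their measurements, which is exactly the behavior a weighted pose-graph solver exhibits at full scale.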

Past Projects

Robust Fiducials for Estimating Pose in Outdoor Lighting

Visual fiducials are often used in robotics and augmented reality to estimate the position and orientation of objects with respect to a camera. By detecting known points on the tag in the image frame, we can estimate the most likely pose of the camera given the detected points. However, many of these tags were designed for indoor use and fail in outdoor lighting and at large distances, because lighting conditions bias the estimated locations of the tag points. We developed an extension, applicable to a variety of visual fiducials, that overcomes some of these biasing problems. [more]
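For a planar tag, the pose-from-detected-points step can be sketched as follows (a simplified, hypothetical example in normalized image coordinates with an identity camera matrix; real systems use calibrated intrinsics and a robust PnP solve):

```python
import numpy as np

# Known tag corner positions on the tag plane, and synthetic "detections"
# of those corners for a tag sitting 5 units straight ahead of the camera.
tag_pts = np.array([(1.0, 1.0), (1.0, -1.0), (-1.0, 1.0), (-1.0, -1.0)])
img_pts = tag_pts / 5.0

# Direct Linear Transform: two equations per correspondence give the
# homography mapping tag-plane points to image points.
rows = []
for (X, Y), (x, y) in zip(tag_pts, img_pts):
    rows.append([X, Y, 1, 0, 0, 0, -x * X, -x * Y, -x])
    rows.append([0, 0, 0, X, Y, 1, -y * X, -y * Y, -y])
A = np.array(rows)
H = np.linalg.svd(A)[2][-1].reshape(3, 3)  # null vector = homography
if H[2, 2] < 0:          # fix overall sign so the tag lies in front
    H = -H

# Decompose: the homography's columns are [r1, r2, t] up to one scale.
scale = np.linalg.norm(H[:, 0])
r1, r2, t = H[:, 0] / scale, H[:, 1] / scale, H[:, 2] / scale
R = np.column_stack([r1, r2, np.cross(r1, r2)])  # rotation; t = translation
```

Because the pose is fit to the detected corner locations, any lighting-induced bias in those detections propagates directly into the recovered rotation and translation, which is why robustness to outdoor lighting matters.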


Digital Systems Teaching Lab Manager

In the Electrical and Computer Engineering Department at Brigham Young University, I managed the digital teaching lab responsible for the laboratory portion of two courses: Fundamentals of Digital Systems (EcEn 220) and Digital System Design (EcEn 320). Course material covered digital logic design, basic processor architecture, VHDL/Verilog coding, HDL simulation and testing, synchronous and asynchronous circuits, and computer arithmetic.

My responsibilities included teaching weekly laboratory lectures on VHDL/Verilog, managing and training a team of teaching assistants, developing laboratory assignments, answering student questions, grading all coursework (except exams), and substituting in lecture.

Mobile Robotics Course Assistant and Grader

I have also had the opportunity to serve as a grader and course assistant for NA568/EECS568 Mobile Robotics: Methods & Algorithms, taught by my advisor, Ryan Eustice, at the University of Michigan. Topics covered in this course include Bayesian filtering; stochastic representations of the environment; motion and sensor models for mobile robots; algorithms for mapping, localization, and state-estimation in the presence of uncertainty; and applications to marine, ground, and air vehicles.




129 NAME Bldg
2600 Draper Drive
Ann Arbor, MI 48109-2145