Mobile robots must move through complex environments in many applications. A highly successful approach, exemplified by self-driving cars, is to create geometric maps of the environment and then plan and follow collision-free, safe trajectories around large, sparse obstacles. However, many crucial applications require robots to traverse complex 3-D terrain cluttered with obstacles as large as themselves, such as search and rescue in rubble, environmental monitoring in forests and mountains, and sample collection among extraterrestrial rocks.
Although many animals traverse such complex 3-D terrain with ease, bio-inspired robots still struggle to do so. This is largely due to a lack of fundamental understanding of how to use and control physical interaction with such terrain to move through it. Here, I review my lab’s progress toward filling this gap by integrating animal experiments, robot experiments, and physics modeling.
In the first half of the talk, I will discuss work on locomotor transitions of insects and multi-legged robots. Previous studies of multi-legged locomotion largely focused on how to generate near-steady-state walking and running on relatively flat surfaces and how to stabilize these gaits when perturbed. By contrast, multi-legged locomotion in complex 3-D terrain occurs by stochastically transitioning across stereotyped locomotor modes. A potential energy landscape approach revealed surprisingly general principles for a diversity of locomotor challenges encountered in such terrain: why these modes occur, and how to transition across them.
In the second half, I will discuss work on the limbless locomotion of snakes and snake robots. Previous studies of limbless locomotion largely focused on how to move on relatively flat surfaces with sparse vertical structures that serve as anchor points for stability or push points for propulsion. By contrast, limbless locomotion in 3-D terrain benefits from coordinated lateral and vertical body bending, which not only helps the body conform to the terrain for stability but also provides access to many more push points for propulsion.

For both directions, we are currently working on how to sense physical interaction during locomotion and use feedback control to enable autonomous locomotion across complex 3-D terrain.
Chen Li is an Assistant Professor in the Department of Mechanical Engineering and a faculty member of the Laboratory for Computational Sensing and Robotics at Johns Hopkins University. He earned B.S. and Ph.D. degrees in physics from Peking University and Georgia Tech, respectively, and performed postdoctoral research in Integrative Biology and Robotics at UC Berkeley as a Miller Fellow.
Dr. Li’s research aims to create a new field of terradynamics, analogous to aero- and hydrodynamics, at the interface of biology, robotics, and physics, and to use terradynamics to understand animal locomotion and advance robot locomotion in the real world.