
Equipping robot teams to move smoothly around people

Approaching someone in a hallway, do you both bear right to avoid a collision? Veer left?

It turns out that this awkward moment of deadlock is not just a human experience. Robots can encounter the same issue when working in teams or moving around humans in a warehouse or at a disaster recovery scene.

Research in the lab of Naomi Ehrich Leonard aims to help robots move safely and gracefully around humans while still achieving their goals. Using mathematical models — some rooted in evolutionary biology or the social sciences — her team explores how groups of robots can autonomously coordinate their activities.

“A big challenge in my group is developing mathematical models that reveal underlying mechanisms of collective sensing, learning, and decision making, and providing principled and systematic means to design the actions that robots in a group should take when they’re working together or working with people in complex settings,” said Leonard, the Edwin S. Wilsey Professor of Mechanical and Aerospace Engineering.

Her group has collaborated with scholars from many disciplines to study flocks of starlings, schools of fish, colonies of ants, and troupes of dancers. With new insights into group dynamics and decisions, they’ve created systems that have guided robot teams to explore the ocean depths and to search out anomalies in nuclear facilities.

Charlotte Cathcart, a Ph.D. student in Leonard’s group, is working with other current and former group members on a project to apply their new models of opinion dynamics to robot navigation — to spare robots and the humans around them from collisions, as well as irritating and unproductive deadlocks. Opinion dynamics models are commonly used in the social sciences to examine the spread of views on politics or public health, but are also useful for programming robots to choose safe, predictable pathways as they navigate around people, obstacles, and other robots in dynamic environments.

“How do I tell a robot to be more assertive or more cooperative, or to pay attention to how a human is walking toward it and play a submissive role? Or to take time to make sure that the person is very comfortable as the robot passes, as opposed to moving full speed ahead toward a goal?” asked Cathcart. “These are all things that opinion dynamics can help us figure out.”
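To make the idea concrete, here is a minimal sketch of how an opinion dynamics update can break a hallway-passing deadlock. It illustrates the general technique rather than the group's actual implementation: each agent holds a scalar opinion for "pass on my own right" versus "pass on my own left" that decays toward neutral, is reinforced through saturating feedback on its own opinion and the oncoming agent's inferred opinion, and is nudged by a small bias. The model form, parameter names, and values below are illustrative assumptions.

```python
import math

# Illustrative sketch: a scalar opinion z per agent, where z > 0 means
# "pass on my own right" and z < 0 means "pass on my own left".
# Deadlock corresponds to z near 0. Each Euler step applies
#
#     dz/dt = -d*z + u*tanh(alpha*z + gamma*z_other) + b
#
# d     : resistance pulling the opinion back toward neutral
# u     : attention gain (how strongly the agent commits once opinions form)
# alpha : self-reinforcement of the agent's own opinion
# gamma : coupling to the oncoming agent's inferred opinion
# b     : small bias, e.g. a keep-right convention or extra assertiveness
# All names and values here are illustrative, not taken from published papers.

def opinion_step(z, z_other, dt=0.01, d=1.0, u=2.0,
                 alpha=1.2, gamma=0.8, b=0.05):
    """Advance one agent's opinion by a single Euler step."""
    dz = -d * z + u * math.tanh(alpha * z + gamma * z_other) + b
    return z + dt * dz

# Two agents approach head-on with no initial preference (z = 0 for both).
# The saturating feedback plus the tiny bias breaks the symmetry, so both
# commit to a decisive choice instead of hovering at deadlock.
z_robot, z_human = 0.0, 0.0
for _ in range(2000):
    z_robot, z_human = (opinion_step(z_robot, z_human),
                        opinion_step(z_human, z_robot))

side = "right" if z_robot > 0 else "left"
print(f"robot opinion {z_robot:+.2f} -> pass on its {side}; "
      f"human opinion {z_human:+.2f}")
```

In a sketch like this, raising the attention gain makes the robot commit sooner (more assertive), while shrinking its bias and letting the coupling to the other agent dominate makes it follow the person's lead (more cooperative), which is the kind of tuning Cathcart describes.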
