How a Horse Whisperer Can Help Engineers Build Better Robots


Humans and horses have enjoyed a strong working relationship for nearly 10,000 years — a partnership that transformed how food was produced, how people were transported, and even how wars were fought and won. Today, we look to horses for companionship and recreation, and as teammates in competitive activities like racing, dressage and showing.

Can these age-old interactions between people and their horses teach us something about building robots designed to improve our lives? Researchers at the University of Florida say yes.

“There are no fundamental guiding principles for how to build an effective working relationship between robots and humans,” said Eakta Jain, Ph.D., an associate professor in the Department of Computer & Information Science & Engineering (CISE). “As we work to improve how humans interact with autonomous vehicles and other forms of AI, it occurred to me that we’ve done this before with horses. This relationship has existed for millennia but was never leveraged to provide insights for human-robot interaction.”

Dr. Jain, who did her doctoral work at the Robotics Institute at Carnegie Mellon University, conducted a year of fieldwork observing the special interactions between horses and humans at the UF Horse Teaching Unit in Gainesville, Fla. She recently presented her findings at the Association for Computing Machinery Conference on Human Factors in Computing Systems in Hamburg, Germany.

As horses did thousands of years before, robots are entering our lives and workplaces as companions and teammates. They vacuum our floors and help educate and entertain our children, and studies show that social robots can be effective therapy tools for improving mental and physical health. Increasingly, robots are found in factories and warehouses, working collaboratively with human workers, where they are sometimes even called co-bots.

As a member of the UF Transportation Institute, Dr. Jain was leading the human factors subgroup that examines how humans should interact with autonomous vehicles, or AVs.

“For the first time, cars and trucks can observe nearby vehicles and keep an appropriate distance from them, as well as monitor the driver for signs of fatigue and attentiveness,” Dr. Jain said. “However, the horse has had these capabilities for a long time. I thought, why not learn from our partnership with horses for transportation to help solve the problem of natural interaction between humans and AVs?”

Looking at our history with animals to help shape our future with robots is not a new concept, though most studies have been inspired by the relationship humans have with dogs. Dr. Jain and her colleagues in the Herbert Wertheim College of Engineering and UF Equine Sciences are the first to bring together engineering and robotics researchers with horse experts and trainers to conduct on-the-ground field studies with the animals.

The multidisciplinary collaboration drew on expertise in engineering, animal sciences and qualitative research methodologies, Dr. Jain explained. She first reached out to Joel McQuagge of UF’s equine behavior and management program, who oversees the UF Horse Teaching Unit. McQuagge hadn’t thought about a connection between horses and robots, but he gave Dr. Jain full access, and she spent months observing classes. She also interviewed and observed horse experts, including thoroughbred trainers and devoted horse owners. Christina Gardner-McCune, Ph.D., a CISE associate professor, provided expertise in qualitative data analysis.

A thematic analysis of Dr. Jain’s notes resulted in findings and design guidelines that can be applied by human-robot interaction researchers and designers.

“Some of the findings are concrete and easy to visualize, while others are more abstract,” she says. “For example, we learned that a horse speaks with its body. You can see its ears pointing to where something caught its attention. We could build similar types of nonverbal expressions into our robots, like ears that point when there is a knock on the door or something visual in the car when there’s a pedestrian on that side of the street.”

A more abstract and groundbreaking finding is the notion of respect. When trainers first work with a horse, they look for signs that the horse respects its human partner.

“We don’t typically think about respect in the context of human-robot interactions,” Dr. Jain said. “What ways can a robot show you that it respects you? Can we design behaviors similar to what the horse uses? Will that make the human more willing to work with the robot?”

Dr. Jain, originally from New Delhi, says she grew up with robots the way people grow up with animals. Her father is an engineer who made educational and industrial robots, and her mother was a computer science teacher who ran her school’s robotics club.

“Robots were the subject of many dinner table conversations,” she says, “so I was exposed to human-robot interactions early.”

However, during her yearlong study of the human-horse relationship, she learned how to ride a horse and says she hopes to one day own a horse.

“At first, I thought I could learn by observing and talking to people,” she says. “There is no substitute for doing, though. I had to feel for myself how the horse-human partnership works. From the first time I got on a horse, I fell in love with them.”


Story originally published on UF News.


By Karen Dooley
UF News