Brown Researchers Create Tool to Simplify Human-Robot Communication

The system was tested in simulations of 21 cities and in real life using Boston Dynamics’ robot dog Spot, reporting an 80% accuracy rate

Scarlett Evans, Assistant Editor, IoT World Today

November 22, 2023

The system was tested using Boston Dynamics’ robot dog Spot (Image: Brown University, Juan Siliezar)

Brown University researchers have unveiled new software they say translates everyday language instructions into robotic behavior without needing to first be trained on “thousands of hours” of data. 

Currently, the software used to give robots navigation capabilities can’t accurately translate everyday language into the mathematical language robots use to understand and perform requested tasks.

The development of large language models (LLMs) is helping to meet this challenge, giving robots a greater understanding of human language and clearer interpretations of instructions.

Researchers at Brown University’s Humans to Robots Laboratory have built on advancements in AI-enabled LLMs to create a system they say enables more “seamless communications” between humans and robots.

“We were particularly thinking about mobile robots moving around an environment,” said Stefanie Tellex, senior author of the team’s study. “We wanted a way to connect complex, specific and abstract English instructions that people might say to a robot — like go down Thayer Street in Providence and meet me at the coffee shop, but avoid the CVS and first stop at the bank — to a robot’s behavior.”

According to the team, its new system compartmentalizes and breaks down instructions to eliminate the need for training data.


It also provides navigation robots with a “grounding tool” that translates natural language commands into behaviors, including the individual steps necessary to complete the required task.  
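To make the idea concrete, here is a minimal Python sketch of what such a grounding step could produce. It is purely illustrative: the article does not describe the team’s actual interfaces, and the Step class, ground function and keyword rules below are hypothetical stand-ins for the LLM-driven parsing a real system would perform.

from dataclasses import dataclass

# Hypothetical types for illustration only; not the team's actual API.
@dataclass
class Step:
    action: str    # e.g. "go_to" or "avoid"
    landmark: str  # a place name resolved against the robot's map

def ground(instruction: str, known_landmarks: set[str]) -> list[Step]:
    """Toy grounding pass: break one command into ordered sub-goals.

    A real system would use an LLM to parse the sentence and resolve
    referring expressions ("the coffee shop") against map landmarks;
    the keyword rules here just show the shape of the output.
    """
    lowered = instruction.lower()
    steps = []
    if "avoid the cvs" in lowered and "CVS" in known_landmarks:
        steps.append(Step("avoid", "CVS"))          # constraint: stay away
    if "first stop at the bank" in lowered and "Bank" in known_landmarks:
        steps.append(Step("go_to", "Bank"))         # ordered waypoint
    if "coffee shop" in lowered and "Coffee Shop" in known_landmarks:
        steps.append(Step("go_to", "Coffee Shop"))  # final destination
    return steps

plan = ground(
    "Go down Thayer Street and meet me at the coffee shop, "
    "but avoid the CVS and first stop at the bank.",
    {"CVS", "Bank", "Coffee Shop", "Thayer Street"},
)
for step in plan:
    print(step)  # avoid CVS, then go_to Bank, then go_to Coffee Shop

Each Step corresponds to one of the “individual steps” the article mentions; a planner would then turn the sequence into motion commands.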

The plug-and-play system can operate in any environment without the usual lengthy training process. All it needs, the team said, is a map of its environment and it’s ready to go.
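As an illustration of that map-only setup, the Python snippet below pulls a street network from OpenStreetMap using the open source osmnx library. The library is our choice for demonstration purposes; the article does not say what tooling the team’s system uses to load its maps.

import osmnx as ox  # assumed here; not named in the article

# Download the walkable street network around the robot's operating area.
graph = ox.graph_from_place("Providence, Rhode Island, USA", network_type="walk")

# The graph's nodes and edges are the only "environment knowledge" a
# plug-and-play navigator of this kind would need: no task-specific
# training data, just places and the paths between them.
print(graph.number_of_nodes(), graph.number_of_edges())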

“In the future, this has applications for mobile robots moving through our cities, whether a drone, a self-driving car or a ground vehicle delivering packages,” said Tellex. “Anytime you need to talk to a robot and tell it to do stuff, you would be able to do that and give it very rich, detailed, precise instructions.”

In tests, the team ran the software through simulations in 21 cities using OpenStreetMap, with the system showing an 80% accuracy rate.

The system was also deployed on Brown’s campus using a Boston Dynamics Spot robot dog. 

Next, the team said it’s planning to release an OpenStreetMap simulation that users can test out for themselves, allowing them to type in natural language commands to control a drone in the simulation.

Soon after, the team said it hopes to add object manipulation capabilities to the software.

About the Author(s)

Scarlett Evans

Assistant Editor, IoT World Today

Scarlett Evans is the assistant editor for IoT World Today, with a particular focus on robotics and smart city technologies. Scarlett has previous experience in minerals and resources with Mine Australia, Mine Technology and Power Technology. She joined Informa in April 2022.
