Robot Dog Handles Objects with Hand-Like Grippers

The LocoMan quadruped robot can pick up and manipulate objects with the same legs it uses to walk

Scarlett Evans, Assistant Editor, IoT World Today

April 25, 2024

Image: A robot dog opens a drawer with a gripper-like hand. Credit: Carnegie Mellon

A team of researchers from Google DeepMind, Carnegie Mellon University and the University of Washington has created a new four-legged robot that can pick up and handle objects with its legs.

The design features hand-like grippers attached to the quadruped robot’s legs, bridging the gap between the mobile manipulation of bipedal robots and the navigation capabilities of quadrupedal robots. The grippers can be directed by predefined plans, human motion input or joystick control.

In a series of tests, the system, known as LocoMan, successfully completed a range of object manipulation tasks, including opening doors, plugging into sockets and picking up objects.

To test LocoMan, the team incorporated the system into a Unitree Go1 robot, attaching the grippers to the robot’s front calves.

“Quadrupedal robots have emerged as versatile agents capable of locomoting and manipulating in complex environments,” the team said. “Traditional designs typically rely on the robot's inherent body parts or incorporate top-mounted arms for manipulation tasks. However, these configurations may limit the robot's operational dexterity, efficiency and adaptability, particularly in cluttered or constrained spaces.

“In this work, we present LocoMan, a dexterous quadrupedal robot with a novel morphology to perform versatile manipulation in diverse constrained environments.”


The grippers are designed not to hinder the robot’s movement, allowing it to walk through an environment as usual while switching between using its front appendages as arms or legs.

The system is also designed to be low-cost and can be integrated into most existing quadruped robots.

The team said it will continue developing the system to incorporate vision-language machine learning models, giving the robot greater environmental awareness.

About the Author

Scarlett Evans

Assistant Editor, IoT World Today

Scarlett Evans is the assistant editor for IoT World Today, with a particular focus on robotics and smart city technologies. Scarlett has previous experience in minerals and resources with Mine Australia, Mine Technology and Power Technology. She joined Informa in April 2022.
