Robots as a Data Source: Obtaining Insights From Robotic Autonomy

Oxford Robotics Institute director discusses how autonomous robots can be valuable data sources for industrial firms at AI Summit London

Ben Wodecki, Junior Editor - AI Business

June 14, 2024

Nick Hawes, director of the Oxford Robotics Institute (Image credit: Ben Wodecki)

Robots operating autonomously could soon become valuable data sources for industrial companies, providing insights and repeated measurements that humans struggle to match, according to the director of the Oxford Robotics Institute.

Speaking at the AI Summit London, Nick Hawes explained how his team has deployed autonomous robots like Boston Dynamics' robot dog Spot at sites like a former nuclear fusion reactor, gathering radiation data over 35 days with minimal human involvement.

The Oxford Robotics Institute consists of seven research groups working across the robotics stack, on tasks ranging from dexterous control of manipulators to fully AI-driven robots in industrial settings.

He said 80% of the Institute's work relates to AI, both on fundamental research questions and in client-facing settings.

The Institute’s early work in robotics led to the research that put the first autonomous cars on the road through Oxbotica, now known as Oxa.

Now the Institute is focused on autonomy: specifically, deploying robots in industrial settings and getting them into places humans can't reach.

Hawes said industrial companies are considering autonomy, but they largely prefer a human-in-the-loop approach, where operators collaborate with robots and can take control when necessary.


The move toward full autonomy won't occur overnight, Hawes explained, as systems need to get better at handling uncertainty.

“If the world was fixed and 100% predictable then we’d write a Python script and be done,” Hawes said. “The reason you need AI on a robot is because it needs to respond to changes.”

He outlined several autonomous robotic deployments his team has been involved in, including using quadrupedal robots from Boston Dynamics to patrol industrial plants.

For example, his team fitted Spot robots with lidar and advanced 3D mapping technology so they could reliably navigate an industrial site autonomously, along with a mission-specific payload such as a hardware device for monitoring radiation levels.


In another deployment, Spot operated autonomously for 35 days, walking around the U.K.'s former fusion reactor site. It was tasked with gathering data on alpha radiation emissions and, Hawes said, required minimal human involvement, as it was programmed to return to its charging unit when its battery ran low.

“The robot had a little script that said, here are the six places I want you to look at and this is what your battery looks like,” Hawes said. “The robot was able to then plan and optimize for that information to get up every day, walk around the site, gather the information and go back home.”
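The routine Hawes describes, a list of inspection points plus a battery budget, can be sketched in highly simplified form. Everything below (the coordinates, the `plan_patrol` function, the greedy nearest-point strategy) is illustrative, not the Oxford Robotics Institute's actual mission code:

```python
# Hypothetical sketch of a battery-aware patrol: visit inspection
# points while always keeping enough charge to return to the dock.
import math

CHARGER = (0.0, 0.0)  # home/charging dock position (illustrative)

def dist(a, b):
    """Straight-line distance between two 2D points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def plan_patrol(waypoints, battery_range):
    """Greedy nearest-point tour that skips any point the robot
    could not visit and still walk home from."""
    route, pos, remaining = [], CHARGER, battery_range
    todo = list(waypoints)
    while todo:
        nxt = min(todo, key=lambda w: dist(pos, w))
        cost = dist(pos, nxt)
        # Stop if visiting the point would strand the robot.
        if remaining < cost + dist(nxt, CHARGER):
            break
        route.append(nxt)
        remaining -= cost
        pos = nxt
        todo.remove(nxt)
    route.append(CHARGER)  # always end the day back at the dock
    return route

# Six hypothetical inspection points, echoing the "six places" Hawes mentions.
points = [(2, 0), (2, 2), (0, 2), (5, 5), (1, 1), (3, 1)]
print(plan_patrol(points, battery_range=12.0))
```

With a 12-unit budget the planner visits the nearby points and skips the distant one at (5, 5), since reaching it would leave too little charge to get home. A real deployment would plan over a lidar-built map rather than straight-line distances.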


Robots operating over long periods will build multiple maps of the same environment, something Hawes said could prove helpful for operators, who can compare the findings over time.

The 35-day deployment produced repeatable data that can be fed into other solutions, such as a digital twin that creates a virtual representation of an industrial environment.

“A robot is something that can deliver actionable data over long periods of time,” Hawes said. “Humans struggle because they get bored and point the camera at the wrong place or forget to take an image whereas a robot gives you a bit more reliability.”

Hawes showed how the data from the fusion reactor site could be used to create a virtual environment that emergency services could use to understand the site before entering the building in the event of a disaster.

Describing the concept of artificial general intelligence (AGI) as "nonsense," the professor said robotics developers like his team were focused on autonomy that flexibly changes what a robot does.

Instead of AGI, he described autonomy as a robot's ability to repeatedly perform tasks or take measurements the same way, every time.


About the Author(s)

Ben Wodecki

Junior Editor - AI Business

Ben Wodecki is the junior editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to junior editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others.
