The Independent UK
Technology
Anthony Cuthbertson

Robot dog can now read and reason after AI upgrade

Boston Dynamics has integrated Google's Gemini Robotics to bring AI capabilities to its robot dog Spot (Boston Dynamics)

Leading robotics maker Boston Dynamics has unveiled an AI upgrade for its robot dog that allows it to perform household chores independently.

The firm’s four-legged Spot robot, which was first unveiled in 2016, is now fitted with Google DeepMind’s latest robotic AI model to give it “embodied reasoning” abilities.

Demonstrations of the quadruped using the Gemini Robotics-ER 1.6 artificial intelligence model show it reading tasks from a board before carrying them out.

They include tidying up, recycling cans and checking mouse traps for any dead rodents.

Boston Dynamics said the AI-driven behaviour was “identical” to that of an operator manually controlling its Spot robot.

A promo video also shows the robot dog walking a real dog on a leash, and throwing balls for it to chase.

One demo shows Boston Dynamics’ Spot robot walking a dog (Boston Dynamics)

The AI upgrade allows people to engage with the robot using natural language, rather than inputting lines of code.

“Robots like Spot are already extremely capable of navigating complex and changeable environments, collecting data and sensor readings, and manipulating objects,” said Spot engineer Isaac Ross.

“This demo points to a future where users can rely more on natural language to guide Spot’s actions, rather than complex code. The engineer’s role shifts toward setting goals and objectives.”

Google DeepMind first unveiled its Gemini Robotics AI models last year, with the hope that robotics firms would use them to build robots that can perform tasks without training.

Boston Dynamics is one of only a few “trusted testers” of the models, though it is not clear when the AI-enabled robots will be commercially available. The Independent has reached out for further information.

Earlier versions of Spot are already being used in settings ranging from car manufacturing plants to rocket launch sites.

The addition of in-built reasoning, together with the ability to communicate in natural language, could see the robots used within domestic environments.

“Capabilities like instrument reading and more reliable task reasoning will enable Spot to see, understand, and react to real-world challenges completely autonomously,” said Marco da Silva, the general manager of Spot at Boston Dynamics.

DeepMind said embodied reasoning would allow robots to “bridge the gap between digital intelligence and physical action”, allowing them to carry out tasks autonomously in a range of environments.

“For robots to be truly helpful in our daily lives and industries, they must do more than follow instructions, they must reason about the physical world,” DeepMind’s Laura Graesser and Peng Xu wrote in a blog post.

“By enhancing spatial reasoning and multi-view understanding, we are bringing a new level of autonomy to the next generation of physical agents.”
