AI helps robots understand humans better: a new generation of robot dogs that understand human language is being trained

Publisher: nu23 | Last updated: 2023-11-07 | Source: OFweek机器人网 | Author: Lemontree

How can code-driven robots interact with humans more naturally? Researchers at the Human-Robot Laboratory at Brown University recently tested a new AI-powered system whose goal is to let robots understand instructions given in everyday human language and carry out tasks accurately.

The key contribution of this research is a system that lets robots perform complex tasks without thousands of hours of training data. In traditional machine training, teaching a robot to navigate different places requires a large number of examples showing it how to understand and execute instructions; the new system lets a robot operate in a new environment given only a detailed map of the area.

The researchers credit the large language model embedded in the system: by breaking instructions down, it lets the robot understand and perform tasks without large amounts of training data. The system not only accepts natural language instructions but also works out the logical steps the robot may need given the context of the environment, turning the instruction into something simpler and clearer: what the robot should do, what it must avoid, and in what order to act.
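To make that decomposition step concrete, here is a minimal sketch of how a language model might be prompted to break a spoken instruction into "visit" and "avoid" lists. The prompt, the `ask_llm` wrapper, and the JSON schema are illustrative assumptions, not the published system's code.

```python
import json

# Prompt asking the model to decompose an instruction into landmarks to visit (in order)
# and landmarks to avoid. The exact wording and schema are assumptions for illustration.
PROMPT_TEMPLATE = """Decompose the navigation instruction into JSON with two keys:
"visit": an ordered list of landmarks to reach, and "avoid": landmarks to stay away from.

Instruction: {instruction}
JSON:"""


def ask_llm(prompt: str) -> str:
    """Hypothetical stand-in for whatever chat-completion API the system actually calls."""
    raise NotImplementedError


def decompose(instruction: str) -> dict:
    """Ask the language model to break the instruction into goals, constraints, and order."""
    raw = ask_llm(PROMPT_TEMPLATE.format(instruction=instruction))
    return json.loads(raw)
    # e.g. {"visit": ["bank", "coffee shop"], "avoid": ["CVS"]}
```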

“When choosing the subjects, we specifically considered mobile robots that move around in an environment,” said Stefanie Tellex, a professor of computer science at Brown University and one of the lead researchers on the project. “We wanted to find a way for the robot to understand complex, spoken instructions that humans could give it, such as ‘go down Thayer Street in Providence to meet me at a coffee shop, but avoid CVS and stop at a bank first,’ and follow the instructions exactly as they were given.”

If the research succeeds, it could be applied to many kinds of mobile robots in cities, including drones, self-driving cars, and unmanned transport vehicles. Users would only need to interact with a robot the way they normally talk to people, and it would understand their instructions accurately, making it practical to use mobile robots in complex environments.

To test the system, the researchers ran simulations in 21 cities using OpenStreetMap. The system performed its tasks accurately 80% of the time, far better than comparable systems, which typically reach only about 20% accuracy and cannot handle complex instructions and tasks.

The team also ran indoor tests on the Brown University campus using Boston Dynamics' Spot, widely regarded as one of the world's leading general-purpose quadruped robots. Successful verification on Spot should help extend the system's applicability to robots from other manufacturers.

Jason Xinyu, a computer science PhD and a key member of the research team, used an example to explain how the system works.

Suppose the user tells a drone to go to the "store" on "Main Street", but to go to the "bank" first. After the command is entered, the software first identifies the two locations. The language model then matches these abstract locations to the concrete places around the robot, drawing on each location's metadata, such as its address or type, to help the system decide. In this case there are several stores nearby, but only one is on Main Street, so the system knows which one is meant. Next, the language model translates the command into linear temporal logic, a mathematical notation for expressing commands. Finally, the system substitutes the mapped locations into this formula and tells the robot to go to point A, but only after point B.
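As a rough illustration of the grounding and temporal-logic steps just described, here is a minimal Python sketch. The `Place` record, the metadata-matching rule, and the formula syntax are assumptions made for readability, not the team's actual implementation.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Place:
    name: str      # concrete mapped location, e.g. "Main Street Market"
    kind: str      # metadata: type of place, e.g. "store" or "bank"
    street: str    # metadata: address / street name


def ground(kind: str, street: Optional[str], places: list) -> Place:
    """Match an abstract landmark from the instruction to one mapped place via its metadata."""
    matches = [p for p in places
               if p.kind == kind and (street is None or p.street == street)]
    if len(matches) != 1:
        raise ValueError(f"ambiguous or unknown landmark: {kind} on {street}")
    return matches[0]


places = [
    Place("Main Street Market", "store", "Main Street"),
    Place("Elm Street Deli", "store", "Elm Street"),
    Place("First Bank", "bank", "Main Street"),
]

store = ground("store", "Main Street", places)  # several stores exist, only one on Main Street
bank = ground("bank", None, places)

# "Go to the store, but go to the bank first" in linear temporal logic:
# eventually (F) reach the bank, and only after that eventually reach the store.
ltl_formula = f"F(at_{bank.name.replace(' ', '_')} & F(at_{store.name.replace(' ', '_')}))"
print(ltl_formula)  # F(at_First_Bank & F(at_Main_Street_Market))
```

An "avoid" constraint such as "avoid CVS" would typically add a term like G(!at_CVS), meaning the robot must never pass through that location along the way.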

Jason said that a simulation based on OpenStreetMap would be released online in November so that users can test the system themselves: by typing natural language commands into the web page, they can direct the simulated drone through navigation tasks and help the researchers fine-tune the software.

In other words, an "AI + robot" project trained with help from the public is on its way.
