How can code-driven robots interact more naturally with humans? The Human-Robot Laboratory at Brown University recently tested a new AI-powered system designed to let robots understand instructions given in everyday human language and carry out the corresponding tasks accurately.
The key contribution of this research is a system that lets robots perform complex tasks without thousands of hours of training data. Traditional machine-learning approaches require large numbers of examples to teach a robot how to interpret and execute instructions in each new place it navigates; the new system allows a robot to operate in a different environment given only a detailed map of the area.
The researchers described the role of the large language model embedded in the system: by breaking instructions down, it enables robots to understand and perform tasks without large amounts of training data. The system not only accepts natural-language instructions, it also works out the logical steps the robot may need based on the context of the environment, turning an instruction into something simpler and clearer: what the robot should do, what it must avoid, and in what order to act.
“When choosing the subjects, we specifically considered mobile robots that move around in an environment,” said Stefanie Tellex, a professor of computer science at Brown University and one of the lead researchers on the project. “We wanted to find a way for the robot to understand complex, spoken instructions that humans could give it, such as ‘go down Thayer Street in Providence to meet me at a coffee shop, but avoid CVS and stop at a bank first,’ and follow the instructions exactly as they were given.”
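To make the idea of "breaking down" an instruction concrete, here is a minimal sketch of the kind of structured decomposition such a system might produce for the instruction Tellex quotes. The schema, field names, and the decomposition itself are illustrative assumptions, not the team's actual interface.

```python
# Hypothetical sketch: one way a language model's decomposition of a spoken
# instruction could be represented. Field names and structure are assumptions
# for illustration, not the Brown system's actual output format.

instruction = ("go down Thayer Street in Providence to meet me at a coffee shop, "
               "but avoid CVS and stop at a bank first")

decomposition = {
    # places the robot should eventually reach
    "goals": ["coffee shop on Thayer Street"],
    # places the robot must never visit
    "avoid": ["CVS"],
    # ordering constraints: visit the first place before the second
    "before": [("bank", "coffee shop on Thayer Street")],
}

print(decomposition)
```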
If the research succeeds, it could eventually be applied to many kinds of mobile robots in cities, including drones, self-driving cars, and unmanned delivery vehicles. You would only need to interact with a robot the way you normally talk to another person, and it would understand your instructions accurately, making it practical to use mobile robots in complex environments.
To test the system, the researchers ran simulations using OpenStreetMap data from 21 cities. The system performed tasks accurately about 80% of the time, far better than comparable systems, which typically achieve accuracies of only around 20% and cannot handle complex instructions and tasks.
The team also conducted indoor tests on the Brown University campus using Boston Dynamics' Spot robot, widely regarded as one of the world's leading general-purpose quadruped robots. Successful validation on Spot should help extend the system's applicability to robots from other manufacturers.
Jason Xinyu, a computer science PhD and a key member of the research team, used an example to explain how the system works.
Suppose a user tells a drone to go to the "store" on "Main Street", but to go to the "bank" first. Once the command is entered, the software first identifies the two locations. The language model then matches these abstract references to concrete places around the robot, drawing on each location's metadata, such as its address and type, to help the system decide. In this case several stores are nearby, but only one is on Main Street, so the system knows which one is meant. Next, the language model translates the command into linear temporal logic, a mathematical notation of codes and symbols for expressing commands. Finally, the system substitutes the mapped locations into this formula and tells the robot to go to point A, but only after point B.
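As a rough illustration of how grounding and the translation into linear temporal logic might look, the sketch below matches the two place references against hypothetical map metadata and emits an LTL formula. The function names, the map data, and the specific LTL encoding are assumptions made for this example, not the actual code of the Brown system.

```python
# Minimal, illustrative pipeline: ground place references using map metadata,
# then express "go to the store on Main Street, but go to the bank first"
# as a linear temporal logic (LTL) formula. All names and data are hypothetical.

from dataclasses import dataclass

@dataclass
class Landmark:
    name: str       # place type, e.g. "bank" or "store"
    address: str    # metadata used for grounding, e.g. "34 Main Street"
    lat: float
    lon: float

def ground(phrase: str, street_hint: str, candidates: list[Landmark]) -> Landmark:
    """Pick the unique landmark whose type and address match the instruction."""
    matches = [c for c in candidates
               if phrase in c.name.lower() and street_hint.lower() in c.address.lower()]
    if len(matches) != 1:
        raise ValueError(f"ambiguous or missing landmark for '{phrase}'")
    return matches[0]

def visit_in_order(first: str, then: str) -> str:
    """One common LTL encoding of 'reach `first`, and only afterwards reach `then`':
    F(first & F(then)), where F means 'eventually'."""
    return f"F({first} & F({then}))"

# Hypothetical map extract (e.g. from OpenStreetMap) around the robot.
nearby = [
    Landmark("bank",  "12 Main Street", 41.824, -71.402),
    Landmark("store", "34 Main Street", 41.825, -71.401),
    Landmark("store", "9 Hope Street",  41.830, -71.395),
]

bank  = ground("bank",  "Main Street", nearby)    # unique match on Main Street
store = ground("store", "Main Street", nearby)    # disambiguated by the street hint
print(visit_in_order("bank", "store"))            # -> F(bank & F(store))
print((bank.lat, bank.lon), (store.lat, store.lon))
```

A planner could then search the map for a route whose sequence of visited landmarks satisfies this formula, which is effectively what "go to point A, but after point B" means.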
Jason said that a simulation based on OpenStreetMap would be released online in November so that users can test the system themselves. By entering natural-language commands on the web page, users can guide a simulated drone through navigation tasks and help the researchers fine-tune the software.
In other words, an "AI + robot" project trained with the help of the public is on its way.