Enthusiast Network report (Text/Li Wanwan) The perception system is a key component that enables humanoid robots to interact with their surroundings and act autonomously. The system usually combines a variety of sensors and algorithms to collect, process, and analyze information from the external environment. Different companies, such as UBTECH, Xiaomi, and Unitree (Yushu Technology), use different perception systems.
What sensors and algorithms are included in the perception system of humanoid robots?
The perception system of a humanoid robot includes various sensors. Visual sensors: following the principle of human binocular vision, binocular cameras, depth cameras, lidar, and similar devices are used to obtain real spatial information about the surrounding environment. For example, some humanoid robots adopt a pure-vision solution, using three cameras to achieve low-cost environmental perception.
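As a minimal sketch of the binocular principle mentioned above: a stereo pair converts per-pixel disparity into metric depth with depth = f·B/d. The focal length, baseline, and disparity values below are illustrative numbers, not parameters of any particular robot.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_length_px, baseline_m):
    """Convert a stereo disparity map (pixels) to metric depth.

    depth = f * B / d, where f is the focal length in pixels,
    B is the baseline between the two cameras in meters, and
    d is the per-pixel disparity in pixels.
    """
    disparity_px = np.asarray(disparity_px, dtype=np.float64)
    depth_m = np.full_like(disparity_px, np.inf)
    valid = disparity_px > 0                      # zero disparity => point at infinity
    depth_m[valid] = focal_length_px * baseline_m / disparity_px[valid]
    return depth_m

# Illustrative numbers only: 700 px focal length, 6 cm baseline, 2x2 disparity patch.
depth = disparity_to_depth([[35.0, 14.0], [7.0, 0.0]],
                           focal_length_px=700.0, baseline_m=0.06)
print(depth)   # ~1.2 m, 3.0 m, 6.0 m, and inf for the zero-disparity pixel
```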
Position sensors: such as the IMU (inertial measurement unit), which measures the robot's motion and attitude. Joint force-control sensors: including one-dimensional force sensors and one-dimensional torque sensors, which measure the force and torque at each joint. Tesla places these sensors on every linear and rotational joint to achieve motion control.
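To make the IMU's role concrete, here is a minimal, hedged sketch of one common way to estimate attitude from IMU data: a complementary filter that blends integrated gyroscope rates with the gravity direction measured by the accelerometer. The sample rate, blend factor, and sensor readings are assumptions for illustration, not values from any specific robot.

```python
import math

def complementary_pitch(prev_pitch_rad, gyro_y_rad_s, accel_x, accel_z, dt, alpha=0.98):
    """One step of a complementary filter for pitch.

    The gyroscope is integrated for short-term accuracy, and the
    accelerometer (gravity direction) corrects long-term drift.
    Axis conventions here are one common choice, not universal.
    """
    # Pitch predicted by integrating the gyro rate over the time step.
    gyro_pitch = prev_pitch_rad + gyro_y_rad_s * dt
    # Pitch implied by the gravity vector measured by the accelerometer.
    accel_pitch = math.atan2(-accel_x, accel_z)
    # Blend: mostly gyro, gently pulled toward the accelerometer estimate.
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# Illustrative loop with made-up readings at 100 Hz.
pitch = 0.0
for gyro_y, ax, az in [(0.05, -0.10, 0.99), (0.04, -0.12, 0.99), (0.03, -0.11, 0.99)]:
    pitch = complementary_pitch(pitch, gyro_y, ax, az, dt=0.01)
print(f"estimated pitch: {math.degrees(pitch):.2f} deg")
```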
Wrist and ankle sensors: six-dimensional force/torque sensors provide force and torque information at the wrists and ankles. These sensors are essential for improving the dexterity and stability of the robot.
Tactile sensors: adding tactile sensors to the hands improves the dexterity of the robot's hands. Tesla uses 10 tactile sensors in its hands, and other companies are exploring vision-based tactile sensors.
In terms of algorithms, visual perception algorithms, including binocular 3D environmental perception, 3D point cloud registration, and pose estimation, are used to faithfully capture the external environment and reconstruct digital models of it.
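As one concrete piece of point-cloud registration, the sketch below shows the SVD-based (Kabsch) step that estimates the rigid rotation and translation aligning two sets of corresponding 3D points; this is the inner step of algorithms such as ICP. The point clouds and motion below are synthetic, purely for illustration.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Rigid transform (R, t) minimizing ||R @ s_i + t - d_i|| over
    corresponding 3D point sets, via the SVD-based Kabsch method."""
    src_c = src - src.mean(axis=0)         # center both clouds
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                    # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Illustrative check: rotate and shift a small cloud, then recover the motion.
rng = np.random.default_rng(0)
src = rng.random((50, 3))
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
dst = src @ R_true.T + np.array([0.5, -0.2, 0.1])
R_est, t_est = best_fit_transform(src, dst)
print(np.allclose(R_est, R_true), np.allclose(t_est, [0.5, -0.2, 0.1]))
```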
Data fusion algorithms: fusing data from different sensors improves the accuracy and robustness of the perception system.
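A minimal example of the idea: fusing two independent, noisy measurements of the same quantity (say, a range from a depth camera and a range from a lidar) by inverse-variance weighting, which yields an estimate more precise than either sensor alone. The numbers are illustrative assumptions.

```python
def fuse_measurements(z1, var1, z2, var2):
    """Inverse-variance fusion of two independent measurements of the
    same quantity. The fused estimate has lower variance than either input."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Illustrative: a noisier depth-camera range fused with a lidar range.
fused, fused_var = fuse_measurements(z1=2.10, var1=0.04, z2=2.02, var2=0.01)
print(f"fused range: {fused:.3f} m, variance: {fused_var:.4f}")
```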
Generally speaking, the perception system of a humanoid robot adopts multimodal perception: by integrating multiple sensors, it combines vision, position, touch, and other modalities to obtain more comprehensive environmental information. High precision and real-time performance are also required. High-precision sensors and algorithms ensure that the system accurately perceives changes in the external environment, and high real-time performance ensures that the robot can respond to those changes in a timely manner.
What kind of perception technology do humanoid robots on the market use?
There are many humanoid robots on the market, such as UBTECH's Walker X. Its perception system is mainly composed of all-round perception technologies including high-performance servo joints, multi-dimensional force perception, multi-eye stereo vision, omnidirectional hearing, and inertial and ranging sensors.
Specifically, Walker X has 41 high-performance servo joints that combine high torque (4.5 Nm to 200 Nm) and high speed (30 rpm to 90 rpm) and achieve fast, stable movement through precise control algorithms. These servo joints are an important part of Walker X's perception system, providing flexible and powerful support for the robot's limbs.
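The article does not specify which control algorithm Walker X uses. Purely as an illustration of a joint-level control loop, below is a minimal PID position controller for a single servo joint; the gains and loop rate are made up, and only the 200 Nm clamp echoes the torque range quoted above.

```python
class JointPID:
    """Minimal PID position controller for one servo joint.
    Gains and limits are illustrative, not Walker X parameters."""

    def __init__(self, kp, ki, kd, torque_limit_nm):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.torque_limit_nm = torque_limit_nm
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, target_rad, measured_rad, dt):
        error = target_rad - measured_rad
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        torque = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp to the actuator's torque limit.
        return max(-self.torque_limit_nm, min(self.torque_limit_nm, torque))

# Illustrative single step of a 1 kHz loop driving a joint toward 0.5 rad.
controller = JointPID(kp=80.0, ki=5.0, kd=2.0, torque_limit_nm=200.0)
command = controller.step(target_rad=0.5, measured_rad=0.0, dt=0.001)
print(f"torque command: {command:.1f} Nm")   # clamped to the 200 Nm limit
```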
Walker X has multi-dimensional force perception, which senses the magnitude and direction of the contact force between the robot's limbs and the external environment in real time. This capability enables Walker X to perform smooth physical interactions in complex environments and ensures the safety of human-machine interaction.
Walker X uses multi-eye stereo vision to capture three-dimensional information about the surrounding environment through multiple cameras. This enables Walker X to perceive the position, shape, and size of objects in real time, providing important support for navigation, obstacle avoidance, and object recognition.
In addition, Walker X's omnidirectional hearing system can perceive sound information from all directions in real time. Through advanced processing algorithms, Walker X can accurately recognize voice commands, environmental noise and other information, and interact with users in natural language.
Walker X is also equipped with inertial and ranging sensors for real-time perception of its own posture, speed and position. These sensors can ensure the stability and accuracy of Walker X in tasks such as walking and navigation.
The perception system of Xiaomi's humanoid robot CyberOne consists of multiple sensors and advanced algorithms, enabling the robot to perceive its surroundings in real time, move autonomously, and make decisions. The system integrates a variety of perception technologies, including vision, hearing, and force perception, so that the robot can respond accurately in various complex environments.
In terms of visual perception, CyberOne is equipped with the self-developed Mi-Sense depth vision module, which, combined with perception algorithms, gives it complete three-dimensional spatial perception. The vision system can reconstruct the real world in three dimensions, providing important support for navigation, obstacle avoidance, and object recognition. CyberOne can also perform person identification, gesture recognition, and expression recognition, allowing it to interact with humans more naturally.
In terms of auditory perception, CyberOne has an omnidirectional auditory system that can perceive sound information from all directions in real time. Combined with the self-developed MiAI environmental semantic recognition engine and MiAI voice emotion recognition engine, CyberOne can recognize 85 kinds of environmental sounds and 45 kinds of human emotions in 6 categories to achieve a smarter interactive experience.
In terms of force perception, CyberOne has multi-dimensional force perception capabilities, which can perceive the size and direction of the contact force between the robot's limbs and the external environment in real time to ensure the safety of human-computer interaction. CyberOne is also equipped with inertial and ranging sensors for real-time perception of its own posture, speed, position and other information to ensure the stability and accuracy of the robot in tasks such as walking and navigation.
The perception system of Unitree G1, a humanoid robot from Yushu Technology, is the core of its intelligent interaction and efficient task execution. In terms of visual perception, Unitree G1 is equipped with a RealSense D435 camera and a LIVOX-MID360 3D LiDAR. Together, these sensors give G1 all-round detection and perception: it can achieve 360° detection and perception, greatly enhancing its understanding of and adaptability to the surrounding environment.
In terms of sensor configuration, Unitree G1 is equipped with the Intel RealSense D435, a high-performance RGB-D camera that provides depth information and color images, helping G1 identify objects and people and build a three-dimensional model of the surrounding environment. Unitree G1 is also equipped with a LIVOX-MID360 3D LiDAR, which scans the surrounding environment and provides accurate distance measurements and obstacle detection to ensure safe navigation in complex environments.
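For readers who want to see how such an RGB-D camera is typically accessed, here is a minimal sketch using Intel's pyrealsense2 SDK to read one depth sample and a color frame from a D435. This is generic RealSense example code under typical stream settings, not software from Unitree G1.

```python
# Minimal sketch using Intel's pyrealsense2 SDK with a RealSense D435.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)   # depth stream
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)  # color stream

pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()        # blocks until a frameset arrives
    depth_frame = frames.get_depth_frame()
    color_frame = frames.get_color_frame()
    if depth_frame and color_frame:
        # Distance (in meters) at the center pixel of the depth image.
        center_distance_m = depth_frame.get_distance(320, 240)
        print(f"distance at image center: {center_distance_m:.2f} m")
finally:
    pipeline.stop()
```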
Final thoughts
As technology develops, the perception systems of humanoid robots will become more intelligent and better able to understand and respond to human needs. Future perception systems will also be more modular, allowing flexible configuration and expansion for different application scenarios and needs. At the same time, as sensors and algorithms continue to mature, the cost of humanoid robot perception systems will gradually decrease, making humanoid robots more widespread and practical.