A mobile grasping robot is a complex system composed of a mobile chassis, a robotic arm, and a number of supporting components. Because of this complexity, it is easiest to break the system down into its subsystems and discuss them one by one.
As shown in the figure above, the mobile chassis is the "legs" of the system, responsible for moving around, while the mechanical arm is its "hand": a subsystem that integrates the arm itself, the end effector (usually an electric gripper), and a vision camera. The subsystems call one another and work in concert, and this cooperation is what ultimately realizes mobile grasping.
Within the overall system, the mobile chassis sits at the drive layer. It can move independently, controlling its own forward/backward motion and steering. Beyond these basic motion functions, once the relevant sensors are connected it can support 2D-SLAM, 3D-SLAM, and visual-SLAM navigation and planning. The following takes 3D-SLAM as an example to briefly show how the chassis realizes path planning, obstacle avoidance, and related functions.
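As a rough illustration of the basic motion control described above, the sketch below publishes velocity commands to a ROS-driven chassis. The /cmd_vel topic name, the speeds, and the duration are assumptions made for the example, not details taken from the article.

```python
# Minimal sketch (assumption: a ROS 1 chassis driver subscribing to the
# conventional /cmd_vel topic; topic name and speed limits depend on the base).
import rospy
from geometry_msgs.msg import Twist

def drive_forward_and_turn():
    rospy.init_node("chassis_demo")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rate = rospy.Rate(10)  # send commands at 10 Hz

    cmd = Twist()
    cmd.linear.x = 0.2    # forward speed in m/s
    cmd.angular.z = 0.3   # yaw rate in rad/s (steering)

    t_end = rospy.Time.now() + rospy.Duration(3.0)
    while not rospy.is_shutdown() and rospy.Time.now() < t_end:
        pub.publish(cmd)
        rate.sleep()

    pub.publish(Twist())  # publish zero velocity to stop the chassis

if __name__ == "__main__":
    drive_forward_and_turn()
```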
▍Build
The main components are listed below (schematic):
Refer to the figure below for the assembled hardware. (Note: the robotic arm is installed at a later stage, and in practice the industrial PC and the RTK unit need to be laid out sensibly.)
▍Navigation process
Navigation first requires modeling the environment to generate a map, which is then used for localization and path planning.
The modeling process relies on point cloud matching, fusing 3D lidar, IMU, and GPS data to build a 3D point cloud map of the environment in real time. A sketch of the matching step follows below.
Once the map is built, a target point can be specified on it and the mobile chassis plans a path to it autonomously, as sketched after this section.
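The article does not specify the matching algorithm, and real 3D-SLAM pipelines also fuse IMU and GPS factors, but as a hedged sketch of scan-to-map alignment, ICP with the open3d library might look like this:

```python
# Illustrative sketch of aligning a new lidar scan against the accumulated map
# with point-to-point ICP. IMU/GPS fusion, used in full SLAM stacks, is omitted.
import numpy as np
import open3d as o3d

def align_scan_to_map(scan_points, map_points, init_guess=np.eye(4)):
    """Estimate the pose of a new lidar scan relative to the accumulated map."""
    scan = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(scan_points))
    env_map = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(map_points))

    # Downsample to keep ICP fast and less sensitive to noise.
    scan = scan.voxel_down_sample(voxel_size=0.2)
    env_map = env_map.voxel_down_sample(voxel_size=0.2)

    result = o3d.pipelines.registration.registration_icp(
        scan, env_map,
        max_correspondence_distance=1.0,
        init=init_guess,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
    )
    return result.transformation  # 4x4 pose of the scan in the map frame
```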
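If the chassis runs the standard ROS navigation stack (an assumption; the article does not name the software), specifying a target point can look like the sketch below, which sends a goal pose to a move_base action server. The "map" frame and the goal coordinates are illustrative.

```python
# Minimal sketch of sending a navigation goal via the move_base action interface.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def send_goal(x, y):
    rospy.init_node("send_nav_goal")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # identity orientation

    client.send_goal(goal)       # planner computes and follows the path
    client.wait_for_result()
    return client.get_state()    # 3 == SUCCEEDED

if __name__ == "__main__":
    send_goal(2.0, 1.5)
```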
▍Robot arm "system"
As mentioned above, the arm that performs grasping is itself a small system within the mobile grasping robot. The following introduces how this subsystem goes from target recognition to grasping.
Visual recognition can be implemented as follows. A large number of images containing the target objects are collected from likely scenarios and annotated in advance, and distractor items are added to form a complete training set. The training samples are scaled and segmented according to a chosen strategy, and iterative convolutional training produces feature maps. These feature maps are then matched against anchors in the live image, and a bounding-box regression step refines the detected edges to obtain the candidate target regions. An illustrative sketch follows below.
Once the camera has recognized the target, its coordinates are converted and sent to the robotic arm, which then grasps the object accurately (see the sketch after this paragraph).
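The article does not name the detector, but the anchor matching and bounding-box regression it describes correspond to two-stage detectors in the Faster R-CNN family. As an illustrative sketch only, inference with a pretrained torchvision model (standing in for a model fine-tuned on the grasp targets) might look like this:

```python
# Illustrative sketch: pretrained COCO weights stand in for a detector trained
# on the actual grasp targets. Requires torch, torchvision, and Pillow.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect(image_path, score_threshold=0.7):
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]
    # Keep only confident candidate regions; boxes are [x1, y1, x2, y2] in pixels.
    keep = output["scores"] > score_threshold
    return output["boxes"][keep], output["labels"][keep], output["scores"][keep]
```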
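A minimal sketch of this hand-off, assuming a pinhole camera with known intrinsics, a depth reading at the detection centre, and a previously calibrated camera-to-arm-base transform (all assumptions for illustration, not details from the article):

```python
# Back-project a detected pixel into the arm base frame so the arm can plan a grasp.
import numpy as np

def pixel_to_base(u, v, depth, K, T_base_cam):
    """Convert a pixel (u, v) with depth in metres into arm-base coordinates."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]

    # Point in the camera frame under the pinhole model (homogeneous coordinates).
    p_cam = np.array([(u - cx) * depth / fx,
                      (v - cy) * depth / fy,
                      depth,
                      1.0])

    # Apply the hand-eye calibration transform from camera to arm base.
    return (T_base_cam @ p_cam)[:3]

# Example with illustrative numbers and an identity hand-eye transform.
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
target_xyz = pixel_to_base(350, 220, 0.8, K, np.eye(4))
```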
From the analysis above, it is clear that a mobile grasping robot is not simply a pile of off-the-shelf parts, but a highly complex system whose core lies in the chassis's intelligent path planning, visual recognition, the coordination between the chassis and the grasping system, and multi-sensor fusion.
In a mobile grasping robot, the mobile chassis opens up almost unlimited applications for robotic arms that could traditionally only work at fixed stations. Accordingly, both in traditional industries and in the science and education market, plenty of professional manufacturers are working in this direction. In industry, mobile grasping robots diversify the application scenarios of robotic arms, freeing them from fixed workstations and from traditional tasks such as palletizing and loading/unloading.
This is even more true in science and education. As a complex robot system, a mobile grasping platform touches many areas of robotics, such as motion control, environmental perception, navigation and planning, learning, and arm motion planning, so a single robot can serve researchers studying different topics, truly achieving many uses from one machine.
However, building a mobile grasping robot is costly: it takes hardware selection, system integration, and SLAM algorithm optimization before mobile grasping actually works.
"It is very expensive to develop a complete set of mobile grasping robots from scratch, and it also deviates from the research and learning purposes of many scientific research teams and student teams. Based on the ROS robot platform independently developed by our company, we have developed multiple sets of mobile grasping robots, covering a variety of mainstream robotic arms, lidars, depth cameras, etc. at home and abroad. Users can choose as needed. Individuals, research institutes, and university teams engaged in ROS research, motion control research, agricultural picking research, inspection research, and vision research can adopt ours, which can save a lot of time and cost, so that they can invest in research and development more conveniently, efficiently, and attentively." said Dr. Song Zhangjun, general manager of the Robot++ Shenzhen team.
It is reported that the Robot++ Shenzhen team (Shenzhen Shihe Robot) is a professional mobile robot manufacturer. It has been deeply involved in the field of science and education and has a keen sense of smell for the science and education market. As early as its inception, it began to invest in research and development to develop mobile grasping robots. Now it has established in-depth cooperative relationships with robot arm and sensor manufacturers such as KUKA, Kinova, Ruike Zhilian, and Suteng, and has launched a number of mobile grasping robots with multiple functions such as multi-machine collaboration, autonomous path planning, and visual recognition.
Reviewing Editor: Huang Fei