This case demonstrates how a vision system can be used to improve the grasping accuracy of the robot's arms, successfully achieving dual-arm humanoid grasping rather than being limited to a single arm.
Introduction
There are many different types of humanoid robots on the market today, covering fields such as the service industry and the medical industry. With their digital and automated capabilities, these robots are gradually being integrated into our daily life and work. However, although existing humanoid robots have shown great potential in specific application scenarios, further improving their operational accuracy, adaptability and versatility remains a key challenge in the development of robotics. In response to this demand, this case explores how to improve the autonomous operation ability of humanoid robots in complex environments by integrating a vision system with robotic arm technology, especially for precise grasping and manipulation tasks.
Target:
By combining an OpenCV and STag marker vision system with the Mercury X1 wheeled humanoid robot, objects of various shapes and sizes can be grasped accurately, sorting efficiency and accuracy can be improved, and two-handed cooperation can be achieved, giving full play to the strengths of a humanoid robot.
Mercury X1
Mercury X1 is an advanced humanoid robot developed by Elephant Robotics, designed to handle a variety of automation tasks. It has 19 degrees of freedom (7 per arm), which makes it highly flexible and adaptable when performing tasks. Mercury X1 is equipped with a wheeled mobile base driven by high-performance direct-drive motors, capable of stable movement in complex environments, and offers a battery life of up to 8 hours, making it suitable for both personal and commercial applications.
The robot uses a high-performance main controller system equipped with an NVIDIA Jetson Xavier, providing powerful computing support for complex algorithms such as visual ranging, sensor fusion, localization and mapping, obstacle avoidance and path planning. In addition, the mobile base of the Mercury X1 carries LiDAR, ultrasonic sensors and a 2D vision system, enabling highly perceptive interaction with its environment.
myCobot Pro Adaptive Gripper
The myCobot Pro Adaptive Gripper can pick up objects of almost any shape and hold them securely. Use it to complete a full range of applications and move into production quickly, with no robotics expertise required. It is the key to a highly flexible and reliable robotic cell.
myCobot Pro Camera Flange
The myCobot Pro Camera Flange acquires images from its camera module over a USB-B data cable.
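Because the flange camera shows up as a standard camera device, a few lines of OpenCV are enough to confirm that images are coming through. The following is a minimal sketch: the camera index 0 is a placeholder and should be replaced with the actual index of your flange camera (the example later in this article uses IDs 3 and 6).

import cv2

# Placeholder camera index; replace with the index of your flange camera
CAMERA_ID = 0

cap = cv2.VideoCapture(CAMERA_ID)
if not cap.isOpened():
    raise RuntimeError("Could not open the camera flange")

ret, frame = cap.read()              # grab a single frame
if ret:
    cv2.imwrite("frame.jpg", frame)  # save it for inspection
cap.release()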
Next, we will explore how these technologies are integrated into robots in real-world applications and demonstrate their performance in specific tasks.
Technical Overview
OpenCV
OpenCV is an open-source library for image processing, and it plays a vital role in this project; without it, the project could not be completed. The robot's camera analyzes the collected visual data through OpenCV to identify and locate objects. OpenCV's algorithms enable the robot to determine the shape, size and precise coordinates of objects, which are essential for accurate grasping and manipulation.
By providing the robot with the coordinates of the object, precise grasping can be achieved.
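As a rough illustration of how OpenCV can supply such coordinates, the sketch below thresholds an image and reports the pixel centroid of each sufficiently large contour. The image path, threshold value and area filter are assumptions for illustration only; in this project the actual localization is done through STag markers, described next.

import cv2

# Minimal illustration of locating an object's pixel coordinates with OpenCV.
image = cv2.imread("scene.jpg")                      # placeholder image path
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
_, binary = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)

contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for cnt in contours:
    if cv2.contourArea(cnt) < 500:   # ignore small blobs / noise (assumed threshold)
        continue
    M = cv2.moments(cnt)
    cx = int(M["m10"] / M["m00"])    # centroid x in pixels
    cy = int(M["m01"] / M["m00"])    # centroid y in pixels
    print("object centre:", cx, cy)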
S-Tag Marker Technology
S-Tag markers are a highly reliable identification system designed to provide accurate tag recognition in visually challenging environments. These tags are used to identify objects and locations in the operating environment of the Mercury X1 robot. Even in low light or obstructed vision, S-Tags ensure that the robot accurately identifies the target object through its camera system.
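To make this concrete, the following sketch detects STag markers and estimates their position in the camera frame. It is an assumed example: it relies on the stag-python bindings (import stag), an assumed HD21 tag family, a 32 mm marker size, and the same camera_params.npz intrinsics file used later in this article; the corner ordering passed to cv2.solvePnP may need adjusting for your setup.

import cv2
import numpy as np
import stag  # assumed: stag-python bindings

params = np.load("camera_params.npz")       # same intrinsics file as later in this article
mtx, dist = params["mtx"], params["dist"]
MARK_SIZE = 32.0                            # assumed STag side length in mm

gray = cv2.cvtColor(cv2.imread("scene.jpg"), cv2.COLOR_BGR2GRAY)
corners, ids, _ = stag.detectMarkers(gray, 21)   # 21 = assumed HD21 tag family

# 3D model of one marker's corners, centred on the tag origin (z = 0 plane)
half = MARK_SIZE / 2.0
obj_pts = np.array([[-half,  half, 0], [ half,  half, 0],
                    [ half, -half, 0], [-half, -half, 0]], dtype=np.float32)

for tag_corners, tag_id in zip(corners, ids):
    img_pts = tag_corners.reshape(4, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, mtx, dist)
    if ok:
        print(f"tag {int(tag_id)}: position in the camera frame (mm) = {tvec.ravel()}")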
https://youtu.be/vnHI3GzLVrY
Through the application of these technologies, the Elephant Robotics Mercury X1 humanoid robot is able to perform complex tasks such as autonomous navigation, object recognition and precise manipulation, capabilities that are indispensable for modern automated and intelligent systems.
pymycobot
pymycobot is a library for controlling Mercury X1 robot arms and end effectors such as grippers. It allows precise control of the arm's angles, coordinates, and motion modes, including differential and refresh modes. This library provides a high degree of flexibility and customizability for the robot, enabling it to perform complex grasping and manipulation tasks and adapt to a variety of operational requirements.
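A minimal sketch of what this looks like in practice, assuming the serial port names used later in this article; the target pose and speed are illustrative values only.

from pymycobot import Mercury

ml = Mercury("/dev/left_arm")    # left-arm serial port (as used later in this article)
mr = Mercury("/dev/right_arm")   # right-arm serial port

print(ml.get_angles())           # current joint angles of the left arm
print(mr.get_coords())           # current Cartesian pose of the right arm

ml.over_limit_return_zero()      # send the left arm back to its zero position

# Move the right arm to a Cartesian pose [x, y, z, rx, ry, rz] (mm / degrees).
# The target pose and the speed of 50 are assumed values for illustration.
mr.send_coords([250, -50, 300, 90, 0, -90], 50)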
Project implementation
Preparation before use
First, make sure the zero position of the robot arm is correct. You can calibrate the zero position by the following method:
1) Use the release command to release the joint motors (Note: after releasing, you must hold the arm to prevent it from falling and being damaged!)
ml.release_all_servos()
mr.release_all_servos()
2) Drag the robot arm back to zero position
The location of the zero point can be determined from the following details.
3) Send a command to lock the motor
ml.focus_all_servos()
mr.focus_all_servos()
4) Check whether the zero point is correct
Enter the command that reads the joint angles to query the current position:
ml.get_angles()
mr.get_angles()
If the returned joint angles are close to [0, 0, 0, 0, 90, 0], the zero point is considered correct (a simple tolerance check is sketched after this list).
5) Zero point calibration
If the joint angle read at the zero position is significantly different from [0, 0, 0, 0, 90, 0], the joint zero position needs to be calibrated.
for i in range(1, 7):
    ml.set_servo_calibration(i)
    mr.set_servo_calibration(i)
After calibration, read the joint information. If the value returned is [0, 0, 0, 0, 90, 0], it means the calibration is successful.
ml.get_angles()
mr.get_angles()
With that, the preparation is complete and we can move on to implementing the functional part.
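To make the "close to [0, 0, 0, 0, 90, 0]" check in step 4 concrete, here is a small helper that reuses the ml and mr objects from above; the 3-degree tolerance is an assumed value, not a factory specification.

import numpy as np

ZERO_POSE = [0, 0, 0, 0, 90, 0]   # expected reading at the mechanical zero

def is_at_zero(angles, tol=3.0):
    # True if every joint is within `tol` degrees of the zero pose (tolerance is assumed)
    return bool(np.all(np.abs(np.array(angles) - np.array(ZERO_POSE)) < tol))

print(is_at_zero(ml.get_angles()), is_at_zero(mr.get_angles()))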
Camera and gripper installation
The camera and gripper installation method corresponds to the hand-eye matrix used by the visual recognition. A hand-eye calibration for the Mercury X1's camera and gripper has been prepared in advance; if the installation is changed, the hand-eye calibration must be performed again.
First move the robot back to the zero point:
ml.over_limit_return_zero()
mr.over_limit_return_zero()
Installation: install the camera first, then the gripper, paying attention to the orientation of each.
Visual algorithm module
Next we will use the camera_detect package, which encapsulates the methods for using the camera. Below are its methods for recognizing STag codes. It is very convenient to use and does not require you to perform the hand-eye calibration calculations again.
obj = camera_detect(camera_id, mark_size, mtx, dist, direction)
    camera_id: camera ID
    mark_size: STag code side length (mm)
    direction: 0 = left arm, 1 = right arm
    Function: initialize the camera and robotic arm

obj.camera_open_loop()
    Function: display the camera image

obj.stag_identify_loop()
    Function: display the camera image, identify the STag code and print its camera coordinates

coords = obj.stag_robot_identify(direction)
    Parameter: direction: 0 = left arm, 1 = right arm
    Return: robot arm coordinates of the STag code
    Function: calculate the robot arm coordinates of the STag code

obj.vision_trace(catch_mode, direction)
    Parameters: catch_mode: 0 = horizontal grab, 1 = vertical grab; direction: 0 = left arm, 1 = right arm
    Function: move to the observation position, identify the STag code and execute the grab
The following is the code for calling the camera_detect package, which is very clean and concise.
import numpy as np
from pymycobot import Mercury
# Import the STag visual recognition package
from camera_detect import camera_detect

if __name__ == "__main__":
    camera_params = np.load("camera_params.npz")  # Camera configuration file
    mtx, dist = camera_params["mtx"], camera_params["dist"]
    ml = Mercury("/dev/left_arm")   # Set the left arm port
    mr = Mercury("/dev/right_arm")  # Set the right arm port
    arm = 0         # 0: left arm, 1: right arm
    catch_mode = 0  # 0: horizontal grab, 1: vertical grab
    # The camera IDs need to be changed according to the actual situation
    left_camera_id = 3
    right_camera_id = 6
    stag_size = 32
    # Create the left and right arm visual recognition objects
    ml_obj = camera_detect(left_camera_id, stag_size, mtx, dist, arm)
    mr_obj = camera_detect(right_camera_id, stag_size, mtx, dist, arm)
    # Move the left and right arms to the horizontal observation position and grab
    mr_obj.vision_trace(catch_mode, mr)
    ml_obj.vision_trace(catch_mode, ml)
The steps of normal hand-eye calibration are as follows:
1. Data collection: collect several sets of hand-eye data, including the pose (position and orientation) of the robot end effector at different positions and the position and orientation of the feature points seen by the camera.
2. Establish the motion equation: this determines the transformation relationship between the camera coordinate system and the robot end-effector coordinate system.
3. Solve the transformation matrix: the result describes the spatial transformation between the camera coordinate system and the robot end-effector (hand) coordinate system.
The following code is an example of solving a transformation matrix.
import numpy as np

# Convert a rotation matrix to Euler angles
def CvtRotationMatrixToEulerAngle(pdtRotationMatrix):
    pdtEulerAngle = np.zeros(3)
    pdtEulerAngle[2] = np.arctan2(pdtRotationMatrix[1, 0], pdtRotationMatrix[0, 0])
    fCosRoll = np.cos(pdtEulerAngle[2])
    fSinRoll = np.sin(pdtEulerAngle[2])
    pdtEulerAngle[1] = np.arctan2(-pdtRotationMatrix[2, 0],
                                  (fCosRoll * pdtRotationMatrix[0, 0]) + (fSinRoll * pdtRotationMatrix[1, 0]))
    pdtEulerAngle[0] = np.arctan2((fSinRoll * pdtRotationMatrix[0, 2]) - (fCosRoll * pdtRotationMatrix[1, 2]),
                                  (-fSinRoll * pdtRotationMatrix[0, 1]) + (fCosRoll * pdtRotationMatrix[1, 1]))
    return pdtEulerAngle

# Convert Euler angles to a rotation matrix
def CvtEulerAngleToRotationMatrix(ptrEulerAngle):
    ptrSinAngle = np.sin(ptrEulerAngle)
    ptrCosAngle = np.cos(ptrEulerAngle)
    ptrRotationMatrix = np.zeros((3, 3))
    ptrRotationMatrix[0, 0] = ptrCosAngle[2] * ptrCosAngle[1]
    ptrRotationMatrix[0, 1] = ptrCosAngle[2] * ptrSinAngle[1] * ptrSinAngle[0] - ptrSinAngle[2] * ptrCosAngle[0]
    ptrRotationMatrix[0, 2] = ptrCosAngle[2] * ptrSinAngle[1] * ptrCosAngle[0] + ptrSinAngle[2] * ptrSinAngle[0]
    ptrRotationMatrix[1, 0] = ptrSinAngle[2] * ptrCosAngle[1]
    ptrRotationMatrix[1, 1] = ptrSinAngle[2] * ptrSinAngle[1] * ptrSinAngle[0] + ptrCosAngle[2] * ptrCosAngle[0]
    ptrRotationMatrix[1, 2] = ptrSinAngle[2] * ptrSinAngle[1] * ptrCosAngle[0] - ptrCosAngle[2] * ptrSinAngle[0]
    ptrRotationMatrix[2, 0] = -ptrSinAngle[1]
    ptrRotationMatrix[2, 1] = ptrCosAngle[1] * ptrSinAngle[0]
    ptrRotationMatrix[2, 2] = ptrCosAngle[1] * ptrCosAngle[0]
    return ptrRotationMatrix

# Convert a coordinate (x, y, z, rx, ry, rz), rotations in rad, to a homogeneous transformation matrix
def Transformation_matrix(coord):
    position_robot = coord[:3]
    pose_robot = coord[3:]
    # Convert the Euler angles to a rotation matrix
    RBT = CvtEulerAngleToRotationMatrix(pose_robot)
    PBT = np.array([[position_robot[0]],
                    [position_robot[1]],
                    [position_robot[2]]])
    temp = np.concatenate((RBT, PBT), axis=1)
    array_1x4 = np.array([[0, 0, 0, 1]])
    # Concatenate the two arrays row by row
    matrix = np.concatenate((temp, array_1x4), axis=0)
    return matrix
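To show how these helpers fit together, the sketch below chains the homogeneous transforms to map a point seen by the camera into the robot base frame. The end-effector reading, the hand-eye matrix T_end_camera and the camera-frame point are all placeholder values; the real hand-eye matrix comes from the calibration described above.

import numpy as np

# End-effector pose read from the arm: [x, y, z, rx, ry, rz] (mm and degrees; placeholder values)
end_coords = [250.0, -50.0, 300.0, 90.0, 0.0, -90.0]
end_coords_rad = end_coords[:3] + list(np.radians(end_coords[3:]))
T_base_end = Transformation_matrix(end_coords_rad)      # base <- end effector

# Hand-eye result: the camera frame expressed in the end-effector frame.
# The translation below is a placeholder; the real matrix comes from the calibration step.
T_end_camera = np.eye(4)
T_end_camera[:3, 3] = [0.0, 60.0, 90.0]

# A point seen by the camera (e.g. the centre of an STag code), in mm, homogeneous form
p_camera = np.array([10.0, -5.0, 200.0, 1.0])

# Chain the transforms: base <- end <- camera
p_base = T_base_end @ T_end_camera @ p_camera
print("target in the robot base frame (mm):", p_base[:3])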