Introduction
Robot technology has developed rapidly in recent years. Mobile robots are expected not only to complete tasks quickly and accurately in known, structured environments, but also to carry out assigned tasks in unknown, unstructured environments [1]. This demands good maneuverability and ground passability from mobile robots.
This design uses a six-legged bionic robot, with flexible movement and strong environmental adaptability, as the motion platform; VR glasses give the operator a panoramic view of the working environment, and a remote-control handle enables teleoperation. The robot is suited to narrow, hazardous workplaces that are difficult for personnel to enter: operators can control the robot to carry out rescue, detection and other tasks more effectively and safely while their personal safety is ensured.
1 Overall design of the immersive bionic robot
The immersive bionic robot is mainly composed of VR video imaging equipment, a six-legged bionic robot and a wireless remote control. The operator wears VR glasses and uses the wireless remote control to control the six-legged robot. The overall design block diagram is shown in Figure 1.
2 Structural design and physical fabrication of the bionic robot based on SolidWorks
SolidWorks is a 3D design package with part modeling, assembly, engineering drawing, and simulation functions. SolidWorks is used to model the hexapod robot's body, waist joints, upper and lower limbs, feet, camera bracket, control-board bracket and other parts; the assembly function is used to assemble these part models into the complete hexapod robot; and the simulation function is used to produce an animation of the hexapod robot's motion. Double-click the desktop icon, choose the part template in the "New SolidWorks Document" dialog box, and click "OK" to enter the part-modeling interface. Commands such as "Sketch", "Extruded Boss/Base", "Extruded Cut" and "Fillet" are used to draw the individual parts of the hexapod robot. The models of the main parts are shown in Figure 2.
Create a new hexapod-robot assembly: choose the assembly template in the new-document dialog and click "OK". Under "Begin Assembly", select "Browse" and insert the body first; then use the Mate command on the assembly toolbar, applying geometric relations (coaxial, coincident, parallel, perpendicular, tangent, etc.) to insert and constrain body 2, waist joint 3, upper limb 4, lower limb 5, battery cover 6, camera bracket 7, control-board bracket 8 and the remaining parts in sequence. The assembled model and the exploded view of the hexapod robot are shown in Figures 3 and 4 respectively.
Save the parts modeled in SolidWorks as "*.stl" files, import them into a 3D printer to fabricate the physical parts, and assemble the printed parts with the servos, binocular camera, Arduino control board, Raspberry Pi, etc. into the six-legged bionic robot. The finished robot is shown in Figure 5.
3 Bionic gait design
Each leg of the robot is controlled by three servos. Servo No. 1 connects the robot body to the waist joint shown in Figure 2(c) and is used to control the leg swinging action of the robot during movement. Servo No. 2 connects the waist joint to the upper limb shown in Figure 2(d), similar to the hip joint, and is used to control the lifting and lowering of the robot's upper limbs. Servo No. 3 is used to connect the upper and lower limbs, similar to the knee joint, and is used to control the movement of the robot's lower limbs, imitating the kicking action of insects during movement.
The servo numbers are shown in Figure 6.
When working, the robot imitates the triangular gait of a six-legged insect. Legs 1, 4, and 5 move simultaneously as a group; legs 2, 3, and 6 move simultaneously as a group. The six legs are numbered as shown in Figure 7.
During movement, the robot always has at least three legs to support its body, giving it good stability.
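The alternating tripod groups described above can be sketched as a simple scheduler (a hypothetical helper for illustration, not the robot's actual firmware): while legs 1, 4 and 5 swing forward, legs 2, 3 and 6 support the body, and vice versa, so at least three legs always touch the ground.

```python
# Tripod-gait sketch for the hexapod described above (illustrative only).
GROUP_A = {1, 4, 5}  # first tripod group
GROUP_B = {2, 3, 6}  # second tripod group

def tripod_gait(steps):
    """Return, for each half-cycle, (swinging legs, supporting legs).

    The two tripod groups alternate: while one group swings forward,
    the other remains planted, so the body is always supported by
    at least three legs.
    """
    sequence = []
    for step in range(steps):
        swing = GROUP_A if step % 2 == 0 else GROUP_B
        stance = (GROUP_A | GROUP_B) - swing
        sequence.append((swing, stance))
    return sequence

if __name__ == "__main__":
    for swing, stance in tripod_gait(4):
        print("swinging:", sorted(swing), "supporting:", sorted(stance))
```

Driving the three servos of each swinging leg through their lift-swing-lower cycle while the stance legs push backward then produces forward motion.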
4 Immersive experience design
4.1 Binocular camera imaging principle
Binocular stereo vision uses two cameras to capture two images of the measured object from different angles, then applies the parallax principle: the position offset between corresponding points in the two images is used to recover the object's three-dimensional geometry, as shown in Figure 8. This design is based on the open-source OpenCV vision library: the binocular camera captures image pairs synchronously, and image preprocessing, 3D reprojection and related steps produce the stereoscopic visual effect [3].
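For rectified cameras, the parallax principle above reduces to the standard relation Z = f·B/d, where f is the focal length in pixels, B the baseline between the two camera centres, and d the disparity of a corresponding point pair. A minimal sketch (the function name and the example numbers are illustrative, not taken from the original design):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a point from stereo parallax: Z = f * B / d.

    focal_px     -- focal length in pixels (identical, rectified cameras assumed)
    baseline_m   -- distance between the two camera centres, in metres
    disparity_px -- horizontal pixel offset of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: f = 700 px, baseline = 6 cm, disparity = 35 px -> depth 1.2 m
print(depth_from_disparity(700, 0.06, 35))
```

Larger disparities correspond to nearer points, which is why nearby obstacles appear to "pop out" in the VR view.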
4.2 Real-time VR imaging implementation method
Install the JuiceSSH app on the Android platform. Run sudo ifconfig in the Raspberry Pi terminal to find its IP address, then open "Connections" in JuiceSSH and enter that IP address to control the Raspberry Pi remotely. Once the connection succeeds, the Raspberry Pi terminal is displayed on the Android device. In this terminal, run sudo raspi-config to open the configuration interface and start the VNC service; then run sudo apt-get install guvcview to install the video capture software, and sudo guvcview to launch it for configuration, as shown in Figure 9. After configuration is complete, the camera image appears, as shown in Figure 10.
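The setup steps above amount to the following short command list on the Raspberry Pi side (package and tool names as given in the text; menu layout of raspi-config may differ between OS releases):

```shell
# Run over a JuiceSSH session to the Raspberry Pi.
sudo ifconfig                   # note the Pi's IP address for the SSH client
sudo raspi-config               # configuration interface: enable the VNC service
sudo apt-get install guvcview   # install the video capture tool
sudo guvcview                   # launch it to configure the camera
```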
The binocular camera captures real-time images of the environment, which are processed into a stereoscopic view and transmitted by the Raspberry Pi to the Android platform. The working environment seen through the VR glasses is a system simulation fusing multi-source information, interactive 3D dynamic views and physical behavior, giving the user an immersive experience.
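As a rough sketch of the final display step (assuming NumPy image arrays; the actual pipeline described above performs OpenCV preprocessing and 3D reprojection first), the two processed camera frames can be composed side by side, the layout a mobile VR box splits back into one view per eye:

```python
import numpy as np

def side_by_side_frame(left, right):
    """Compose left/right camera frames into one side-by-side VR frame.

    A mobile VR box shows the left half to the left eye and the right
    half to the right eye. Illustrative sketch only.
    """
    if left.shape != right.shape:
        raise ValueError("left/right frames must have the same shape")
    return np.hstack([left, right])

# Two dummy 480x640 RGB frames -> one 480x1280 side-by-side frame.
left = np.zeros((480, 640, 3), dtype=np.uint8)
right = np.ones((480, 640, 3), dtype=np.uint8)
frame = side_by_side_frame(left, right)
```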
5 Conclusion
The bionic robot designed in this paper imitates the locomotion of hexapod insects and combines virtual reality technology with the robotic arm, giving it good terrain-passing ability, a good operating experience and diversified functions.
An embedded development board processes the image information collected by the binocular camera in real time and transmits it to a mobile phone over Wi-Fi; a mobile VR box then provides users with a good virtual-reality experience, letting them operate the robot while fully immersed in its environment. Through virtual reality technology, users can better perceive changes in the environment, and the robotic arm, as an extension of the operator's body, can replace the operator in certain dangerous tasks.