Design and implementation of immersive bionic robot

Publisher: lqs1975 | Last updated: 2022-06-20 | Source: 21ic | Keywords: immersive

Introduction

Robot technology has developed rapidly in recent years. People hope that future mobile robots will be able to complete tasks quickly and accurately not only in known, structured environments, but also in unknown, unstructured environments [1]. This requires mobile robots to have good maneuverability and ground passability.


This design uses a six-legged (hexapod) bionic robot, with flexible movement and strong environmental adaptability, as the motion carrier; VR glasses give the operator a panoramic view of the working environment, and a remote-control handle provides teleoperation. The robot is suited to narrow, dangerous workplaces that are difficult for people to enter. While staying safe, the operator can control the robot to carry out rescue, detection and other tasks more effectively.


1 Overall design of immersive bionic robot

The immersive bionic robot is mainly composed of VR video imaging equipment, a six-legged bionic robot and a wireless remote control. The operator wears VR glasses and uses the wireless remote control to control the six-legged robot. The overall design block diagram is shown in Figure 1.

2 Structural design and physical fabrication of the bionic robot based on SolidWorks

SolidWorks is a 3D design package with part modeling, assembly, engineering drawing and simulation functions. SolidWorks is used to model the hexapod robot's upper and lower body parts, joints, feet, camera bracket, control board bracket and other components; the assembly function is used to assemble these part models into the hexapod robot; and the simulation function is used to produce a simulation animation of the robot. Double-click the desktop icon, select "Part" in the "New SolidWorks Document" dialog box, and click "OK" to enter the part modeling interface. Use commands such as "Sketch", "Extruded Boss/Base", "Extruded Cut" and "Fillet" to draw the various parts of the hexapod robot. The models of the main parts are shown in Figure 2.

To create the hexapod robot assembly, select "Assembly" in the "New SolidWorks Document" dialog box and click "OK". Under "Begin Assembly", select "Browse" and insert the body first; then click "Insert Components" on the assembly toolbar and use mate relationships (coaxial, coincident, parallel, perpendicular, tangent, etc.) to insert and constrain body 2, waist joint 3, upper limb 4, lower limb 5, battery cover 6, camera bracket 7, control board bracket 8 and the remaining parts in sequence. The assembly and exploded views of the hexapod robot are shown in Figures 3 and 4, respectively.

Save the parts drawn in SolidWorks as "*.stl" files and send them to a 3D printer to produce the physical parts. The printed parts are then assembled with the servos, binocular camera, Arduino control board, Raspberry Pi, etc. into the six-legged bionic robot, shown in Figure 5.


3 Bionic gait design

Each leg of the robot is controlled by three servos. Servo No. 1 connects the robot body to the waist joint shown in Figure 2(c) and is used to control the leg swinging action of the robot during movement. Servo No. 2 connects the waist joint to the upper limb shown in Figure 2(d), similar to the hip joint, and is used to control the lifting and lowering of the robot's upper limbs. Servo No. 3 is used to connect the upper and lower limbs, similar to the knee joint, and is used to control the movement of the robot's lower limbs, imitating the kicking action of insects during movement.

The servo numbers are shown in Figure 6.

When walking, the robot imitates the alternating tripod (triangular) gait of a six-legged insect: legs 1, 4 and 5 move together as one group, and legs 2, 3 and 6 move together as the other. The leg numbering is shown in Figure 7.


During movement, the robot always has at least three legs to support its body, giving it good stability.
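As an illustration of this gait sequencing, the following minimal Python sketch drives the two tripod groups alternately. The servo-channel mapping, the set_servo placeholder and the angle values are all assumptions made for illustration; on the actual robot the servos are driven by the Arduino control board according to the numbering in Figures 6 and 7.

import time

# Hypothetical servo map: leg number -> (swing, hip, knee) servo channels.
# Channel numbers are placeholders; the real wiring follows Figure 6.
LEGS = {1: (0, 1, 2), 2: (3, 4, 5), 3: (6, 7, 8),
        4: (9, 10, 11), 5: (12, 13, 14), 6: (15, 16, 17)}

# The two tripod groups from Figure 7: each group moves as a unit.
GROUP_A = (1, 4, 5)
GROUP_B = (2, 3, 6)

def set_servo(channel, angle):
    """Placeholder for the real servo driver on the control board."""
    print(f"servo {channel:2d} -> {angle:3d} deg")

def move_group(legs, swing, hip, knee):
    """Drive the three servos of every leg in one tripod group."""
    for leg in legs:
        ch_swing, ch_hip, ch_knee = LEGS[leg]
        set_servo(ch_swing, swing)  # servo 1: leg swing (body/waist joint)
        set_servo(ch_hip, hip)      # servo 2: lift/lower (hip joint)
        set_servo(ch_knee, knee)    # servo 3: lower limb (knee joint)

def step_forward(pause=0.3):
    """One full cycle of the alternating tripod gait (illustrative angles)."""
    # Group A lifts and swings forward while group B stays planted as support.
    move_group(GROUP_A, swing=120, hip=60, knee=90)
    move_group(GROUP_B, swing=60, hip=90, knee=90)
    time.sleep(pause)
    # Roles exchange: group B lifts and swings, group A supports.
    move_group(GROUP_B, swing=120, hip=60, knee=90)
    move_group(GROUP_A, swing=60, hip=90, knee=90)
    time.sleep(pause)

if __name__ == "__main__":
    for _ in range(3):
        step_forward()

Because the two groups exchange roles every half cycle, at least three legs are on the ground at all times, which is what gives the gait its static stability.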


4 Immersive experience design

4.1 Binocular Camera Imaging Principle

Binocular stereo vision uses two cameras to capture images of the target object from different viewpoints and applies the parallax principle: the position offset (disparity) between corresponding points in the two images is computed to recover the object's three-dimensional geometry, as shown in Figure 8. This design is based on the open-source OpenCV computer vision library: the binocular camera captures image pairs synchronously, and image preprocessing, 3D reprojection and related steps produce the stereoscopic visual effect [3].
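A minimal OpenCV sketch of this processing chain is given below. It assumes the two cameras of the binocular module appear as video devices 0 and 1, uses illustrative block-matching parameters, and substitutes an identity matrix for the reprojection matrix Q that a real system would obtain from stereo calibration (cv2.stereoRectify).

import cv2
import numpy as np

# Open the left and right cameras of the binocular module.
# Device indices 0 and 1 are assumptions; adjust for the actual hardware.
cap_left = cv2.VideoCapture(0)
cap_right = cv2.VideoCapture(1)

# Block-matching stereo correspondence; parameters are illustrative only.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)

ok_l, frame_l = cap_left.read()
ok_r, frame_r = cap_right.read()
if ok_l and ok_r:
    gray_l = cv2.cvtColor(frame_l, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(frame_r, cv2.COLOR_BGR2GRAY)

    # Disparity map: larger disparity means a closer object (parallax principle).
    # StereoBM returns fixed-point values scaled by 16, hence the division.
    disparity = stereo.compute(gray_l, gray_r).astype(np.float32) / 16.0

    # Q is the 4x4 reprojection matrix from stereo calibration;
    # an identity placeholder is used here for illustration only.
    Q = np.eye(4, dtype=np.float32)
    points_3d = cv2.reprojectImageTo3D(disparity, Q)
    print("3D point cloud shape:", points_3d.shape)

cap_left.release()
cap_right.release()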


4.2 Real-time VR imaging implementation method

Install the JuiceSSH app on the Android platform. Enter sudo ifconfig in the Raspberry Pi terminal to check its IP address, tap "Connections" in JuiceSSH and enter the Raspberry Pi's IP address to control it remotely. After the connection succeeds, the Raspberry Pi terminal is displayed on the phone. In this terminal, enter sudo raspi-config to open the configuration interface and enable the VNC service, enter sudo apt-get install guvcview to install the video capture software, then enter sudo guvcview to start and configure it, as shown in Figure 9. Once configuration is complete, the camera view appears, as shown in Figure 10.

The binocular camera captures real-time images of the environment, which are processed into a stereoscopic view and transmitted by the Raspberry Pi to the Android platform. What the user sees through the VR glasses is a system simulation that fuses multi-source information, interactive 3D dynamic views and physical behavior, giving the user an immersive experience.
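The article does not specify the transport between the Raspberry Pi and the Android phone, so the following Python sketch shows only one possible approach for illustration: the left and right views are concatenated side by side (the layout a mobile VR box expects) and served over Wi-Fi as an MJPEG stream. The device indices and port 8080 are assumptions.

import cv2
from http.server import BaseHTTPRequestHandler, HTTPServer

LEFT_CAM, RIGHT_CAM, PORT = 0, 1, 8080  # assumed device indices and port

class MJPEGHandler(BaseHTTPRequestHandler):
    """Serve the binocular view as a motion-JPEG stream a phone player can open."""

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type",
                         "multipart/x-mixed-replace; boundary=frame")
        self.end_headers()
        cap_l, cap_r = cv2.VideoCapture(LEFT_CAM), cv2.VideoCapture(RIGHT_CAM)
        try:
            while True:
                ok_l, frame_l = cap_l.read()
                ok_r, frame_r = cap_r.read()
                if not (ok_l and ok_r):
                    break
                # Side-by-side layout: left image for the left eye,
                # right image for the right eye in the VR box.
                sbs = cv2.hconcat([frame_l, frame_r])
                ok, jpeg = cv2.imencode(".jpg", sbs)
                if not ok:
                    continue
                self.wfile.write(b"--frame\r\n")
                self.wfile.write(b"Content-Type: image/jpeg\r\n\r\n")
                self.wfile.write(jpeg.tobytes())
                self.wfile.write(b"\r\n")
        finally:
            cap_l.release()
            cap_r.release()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", PORT), MJPEGHandler).serve_forever()

The phone can then open http://<raspberry-pi-ip>:8080/ in any MJPEG-capable viewer, and the VR box's lenses split the side-by-side image between the two eyes.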


5 Conclusion

The bionic robot designed in this paper imitates the locomotion of hexapod insects and combines virtual reality technology with a robotic arm, giving it good terrain-passing ability, a good operating experience and diversified functions.


The embedded development board processes the image information collected by the binocular camera in real time and transmits it to the mobile phone wirelessly over Wi-Fi. Used with a mobile VR box, this gives the user a good virtual reality experience and lets the user operate the robot while fully immersed in the environment. Through virtual reality technology the user can better perceive changes in the environment, and the robotic arm, as an extension of the operator's body, can take the operator's place in certain dangerous tasks.

