# Preface
This is the third electronic design contest I have taken part in: the national contest in 2019 and the provincial contest in 2020. Although we are already seniors, the three of us still have a passion for the contest and decided to see it through one more time. After four days and three nights of hard work, we completed all of the contest requirements, plus the extra features we designed ourselves, and finally won a national first prize, a perfect end to university life. Using key technologies such as binocular-vision position hold, lidar obstacle avoidance, and OpenMV recognition, the drone can cruise autonomously once it reaches take-off altitude, complete the "seeding" task, and return home smoothly.
# Team introduction
The team consists of three senior students, two from the telecommunications major and one from the automation major.
# Project Analysis
## Task requirements
![image.png]
![image.png]
## Introduction to the work
With the rapid development of artificial intelligence, intelligent aircraft are being applied across many industries in modern production and life. One of the most successful applications is the "plant protection aircraft": given its instructions, the aircraft can autonomously complete complex tasks such as spraying pesticides and sowing seeds. On one hand this frees up human labour; on the other, fully automated operation can greatly improve production efficiency. Our team designed and built a plant-protection quadcopter using TI's TIVA microcontroller as the flight controller and NVIDIA's Jetson NX as the vision processor. The aircraft uses a binocular camera as a visual odometer for position hold and cruising, carries a lidar and an optical flow module to assist positioning, and uses a K210 and OpenMV for target detection, with a laser pointer simulating "seeding". It can perform automatic cruising, fixed-point "seeding", target detection, obstacle avoidance, barcode recognition, and more.
## System Solution
The system mainly consists of a microcontroller control module, an attitude calculation module, an altitude-hold module, a position-hold module, a vision module, and a ranging module. The selection of each module is discussed below.
### Microcontroller module
Option 1: use the STM32F103RBT6 as the main control chip. This is one of STMicroelectronics' most common chips: documentation is plentiful, it is easy to work with, and it is widely used in embedded development. However, it is not the best choice for a system such as a drone that demands high-performance data processing and control.
Option 2: use the TM4C123GH6PM as the main control chip. Its main frequency reaches 80 MHz, it has ample PWM, UART, and other peripherals, and its performance fully meets the control needs of the aircraft.
Weighing the two, we chose Option 2.
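To show what the microcontroller's PWM outputs ultimately compute, here is a minimal motor-mixing sketch in Python; the actual firmware runs in C on the TM4C123, and the function name and sign conventions are illustrative (they depend on the motor layout and propeller directions):

```python
def mix_quad_x(throttle, roll, pitch, yaw):
    """Map throttle plus roll/pitch/yaw corrections to four motor
    outputs for an X-configuration quadcopter (all values 0..1).

    Sign conventions are illustrative; a real mixer matches the
    frame's motor order and spin directions.
    """
    m1 = throttle - roll + pitch + yaw  # front-left
    m2 = throttle + roll + pitch - yaw  # front-right
    m3 = throttle + roll - pitch + yaw  # rear-right
    m4 = throttle - roll - pitch - yaw  # rear-left
    # Clamp to the valid duty-cycle range before writing the PWM.
    return [min(max(m, 0.0), 1.0) for m in (m1, m2, m3, m4)]
```

With zero corrections every motor simply receives the throttle value; a positive pitch correction raises the two front motors and lowers the two rear ones.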
### Attitude calculation module
Option 1: use the MPU6050 for attitude calculation, communicating over I2C at up to 400 kHz, from which the UAV's current six-axis data can be obtained.
Option 2: use the ICM20602 for attitude calculation, communicating over SPI at up to 10 MHz. Its sensor noise is also much lower, so the drone's current motion state can be obtained more efficiently.
Weighing the two, and to obtain better control performance, we chose Option 2.
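The six-axis data feeds the attitude calculation. As an illustration only (not the firmware's actual filter), a single-axis complementary filter that fuses the gyro rate with the accelerometer's gravity angle can be sketched in Python:

```python
import math

def complementary_filter(angle, gyro_rate, ax, az, dt, alpha=0.98):
    """One pitch update (degrees) of a complementary filter.

    Gyro integration tracks fast motion; the accelerometer's tilt
    angle corrects the slow gyro drift. alpha (illustrative) weights
    the gyro term; ax/az are accelerometer components in g.
    """
    accel_angle = math.degrees(math.atan2(ax, az))  # tilt from gravity
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

When the drone is level and still, the estimate decays toward the accelerometer's zero-degree reading, which is exactly the drift correction the ICM20602's low-noise data makes effective.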
### Altitude-hold module
Option 1: ultrasonic ranging. The module's range is 0~150 cm with 3 mm resolution and 0.3% accuracy, which meets the measurement requirements, but it is affected by the environment, varies a lot, and is not very stable in flight.
Option 2: the SPL06 barometer. Its data fluctuates a lot and its accuracy is low, so indoor altitude hold is poor.
Option 3: laser ranging. The range is 0.1~12 m with 1% accuracy; it resists ambient light well and is safer and more stable. However, the laser reading is susceptible to sudden height changes, so it is fused with accelerometer data to obtain the actual height.
Weighing the three, we combined Options 2 and 3: laser ranging for altitude hold, corrected with accelerometer and barometer data.
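The fusion above can be sketched as follows; this is an illustrative Python model rather than the flight controller's code, and the gain and jump threshold are assumed values:

```python
class AltitudeEstimator:
    """Illustrative altitude fusion: integrate vertical acceleration
    for a fast prediction, then pull the estimate toward the laser
    reading, falling back to the barometer when the laser jumps
    (for example when the drone flies over an obstacle)."""

    def __init__(self, k=0.2, jump_limit=0.5):
        self.alt = 0.0   # fused altitude, m
        self.vel = 0.0   # vertical speed, m/s
        self.k = k       # correction gain (assumed)
        self.jump_limit = jump_limit  # max plausible laser step, m

    def update(self, acc_z, laser, baro, dt):
        # Fast prediction from the accelerometer (gravity removed).
        self.vel += acc_z * dt
        self.alt += self.vel * dt
        # Reject a sudden laser step in favour of the barometer.
        ref = laser if abs(laser - self.alt) < self.jump_limit else baro
        # Slow correction toward the absolute reference; a full
        # implementation would also correct the velocity estimate.
        self.alt += self.k * (ref - self.alt)
        return self.alt
```

The accelerometer term gives the estimate a fast response, while the laser/barometer term keeps it anchored to an absolute height.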
### Position-hold module
Option 1: optical flow. Since there is no GPS signal indoors, optical flow is the obvious first choice, and it can be fused with IMU data for a better hold. However, optical flow suffers from problems such as ground reflections, and the contest demands high position accuracy, so optical-flow velocity measurement alone cannot meet the requirements.
Option 2: 2D lidar. Use a two-dimensional lidar to measure the drone's distance to its surroundings and fuse the lidar and IMU data for positioning. Tests showed it works well indoors, but the netting around the venue is sparse, so the radar scan quality is poor and it is hard to use for position hold.
Option 3: visual odometry. A binocular camera on the Jetson NX, fused with IMU data, serves as a visual odometer; the position data is sent to the flight controller and fused with the optical flow data. The tested error reaches the centimetre level. A coordinate frame with the take-off point as its origin is also established to assist cruising.
Weighing the three, we chose Option 3.
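Built on that take-off-origin coordinate frame, cruising reduces to steering toward successive waypoints. A minimal sketch in Python, with assumed gains (the real controller fuses VIO, optical flow, and IMU data):

```python
def cruise_step(pos, target, k_p=0.8, v_max=0.5):
    """From the VIO position estimate, compute a velocity setpoint
    toward the current waypoint.

    pos, target: (x, y) in metres, in the take-off-centred frame.
    Returns (vx, vy) clamped to v_max. Gains are illustrative.
    """
    ex, ey = target[0] - pos[0], target[1] - pos[1]
    vx, vy = k_p * ex, k_p * ey
    # Limit cruise speed so the indoor hold stays stable.
    speed = (vx * vx + vy * vy) ** 0.5
    if speed > v_max:
        vx, vy = vx * v_max / speed, vy * v_max / speed
    return vx, vy
```

Far from the waypoint the drone moves at the speed cap; close in, the proportional term shrinks the setpoint so it settles onto the point.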
### Vision module
Option 1: the OpenMV4 camera module. The OpenMV4 pairs a high-performance ARM Cortex-M7 microcontroller (STM32H7) with an OV7725 sensor and supports output in multiple formats. Its built-in algorithms for recognizing colour blobs, shapes, and so on can greatly shorten the development cycle, but it handles complex shapes poorly.
Option 2: the K210 module. The K210 is a 64-bit CPU designed by Canaan with a built-in neural-network accelerator (KPU). You can collect an image data set, train a neural network on a PC, and deploy the model to the K210. After training, target detection runs at over 30 fps, but recognizing colour blobs and shapes runs at only about 5 fps.
Weighing the two, we use the OpenMV to recognize the colour-block "seeding areas" and the K210 to recognize the take-off/landing points and point "A".
# Schematic circuit analysis
## Lidar data processing board schematic diagram
This board offloads work from the drone's main controller: it receives and parses the large volume of data from the lidar, extracts useful information such as the distances to obstacles around the drone, and feeds the result back to the main controller to provide an additional obstacle-avoidance function. An STM32F103C8T6 running at 72 MHz is used as its controller. Pay attention to the clock circuit design, and to the serial-port R->T and T->R crossover, because with the SH1.0 connector it is troublesome to swap the pin order later.
![Picture 1.png]
## K210 expansion board schematic diagram
This board stabilizes both the communication link and the mechanical connection between the K210 and the UAV, while also expanding the peripherals so it can relay data from other modules, such as the QR-code recognition module. The main control board does not have enough serial ports, so that data is relayed through the K210. Each K210 pin can be remapped to a special function; pay attention to the power interface.
![Picture 2.png]
## OPENMV adapter board schematic diagram
This board stabilizes the communication and mechanical connection between the OpenMV and the drone while expanding the peripheral interfaces to connect other sensors. Through the LCD display and buttons, the colour thresholds can be adjusted and modified on site; distance is read from the ranging sensor and the data is exchanged with the drone.
![Picture 3.png]
## Transistor drive circuit schematic diagram of each peripheral module
Since the contest task requires driving power devices such as electromagnets, high-brightness LEDs, buzzers, and laser pointers, we drew up a transistor drive circuit during preparation and reserved different interfaces for the different devices.
![Picture 4.png]
## Drone power distribution board schematic diagram
This board is designed to be compatible with the drone's mounting structure and distributes power to the ESCs and the main control board.
![Picture 5.png]
# PCB design analysis
## Lidar data processing board
Pay attention to the component layout; the interfaces and switches should be placed around the edge of the board.
![Picture 6.png]
## K210 expansion board
Pay attention to the component layout; the interfaces and switches should be placed around the edge of the board.
![Picture 7.png]
## OPENMV adapter board
Pay attention to the component layout of the board and the correspondence of its connections.
![Picture 8.png]

## Transistor drive circuit of each peripheral module

Pay attention to the component layout.

![Picture 9.png]

## Drone power distribution board

Since the distribution board feeds many modules, pay attention to the supply current it must carry.

![Picture 10.png]

# Physical display

![98C69DAB6B72B6FAE76FE341F242B3B5.jpg]

Demonstration video link: [Plant Protection Aircraft](https://www.bilibili.com/video/BV16r4y1D7oF/)

# Structure of the work

The work is assembled on a 450-class drone frame with 10-inch propellers and 30 A ESCs. The main structure is a carbon fiber body, with acrylic as the auxiliary material. From top to bottom are the lidar, flight controller, NX processor, and K210 processor with camera; the barcode recognition module and the binocular camera are mounted on the two sides. The overall height is about 50 cm.

# Programming

## System block diagram

![Picture 11.png]

## Overall flow of UAV system control

![Picture 12.png]

## Flight control program flow

![Picture 13.png]

The UAV system is complex, so task scheduling is simulated with the tick timer to execute each task at its precise time. Testing showed this is easy to debug and works very well.

## Design algorithms

### Quad-rotor attitude calculation

![image.png]

### Low-pass filtering algorithm

![image.png]

### Visual odometry (VIO)

The core of the VIO algorithm is visual-inertial fusion: the pose sequence estimated by the IMU is aligned with the pose sequence estimated by the camera to recover the true scale of the camera trajectory. Moreover, the IMU can predict the pose of the next image frame and where the feature points of the previous frame will appear in it, which speeds up the feature-tracking algorithm and makes it more robust to rapid rotation. Finally, the gravity vector provided by the IMU's accelerometer is used to transform the estimated positions into the world coordinate system required for actual navigation.
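The tick-timer task scheduling described in the flight control flow can be sketched as follows. This is an illustrative Python model (the real firmware is C, and the periods are assumptions): each registered task runs whenever its period divides the tick count.

```python
class TickScheduler:
    """Minimal cooperative scheduler driven by a periodic tick."""

    def __init__(self):
        self.ticks = 0
        self.tasks = []  # list of (period_in_ticks, function)

    def register(self, period, func):
        self.tasks.append((period, func))

    def tick(self):
        # In real firmware this would be called from the tick-timer
        # interrupt; tasks then run at fixed multiples of the tick.
        self.ticks += 1
        for period, func in self.tasks:
            if self.ticks % period == 0:
                func()
```

With a 1 ms tick, registering the attitude loop with period 2 and the position loop with period 10 would run them at 500 Hz and 100 Hz respectively.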
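The low-pass filtering step mentioned above can likewise be sketched in one line; a first-order filter (coefficient value assumed) that smooths noisy sensor samples:

```python
def lowpass(prev, x, alpha=0.2):
    """First-order low-pass update: move a fraction alpha of the way
    from the previous output toward the new raw sample x."""
    return prev + alpha * (x - prev)
```

Smaller alpha filters more heavily at the cost of more lag, so the coefficient is tuned per sensor.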
# Implementation

## Model training and testing

Plan: build a YOLOv2 network on a PC, train it on the data set, and deploy it to the K210 to detect point "A" and the take-off/landing points.

Test results: the model's training loss curve is shown in the figure:

![Picture 14.png]

Analysis: the loss on both the training set and the test set falls below 0.05, and the recognition accuracy is high.

## Path planning test

Plan: the drone obtains its position coordinates at every moment from the visual odometry and cruises based on prior knowledge of the map; the K210 finds the take-off/landing points and point "A"; the OpenMV distinguishes the "seeding area" from the "non-seeding area"; the lidar looks for the pole; the LED flashes after the barcode is scanned; finally the drone returns home based on VIO. The cruise plan is sketched below; the solid line represents the laser pointer on, and the dotted line represents the laser pointer off.

![Picture 15.png]

Test results: the positioning error of the VIO-based UAV is less than 5 cm, and target detection accuracy is above 0.7. Barcodes can be scanned reliably from 30 cm away from the pole. The return-to-home error is within 10 cm.

Analysis: with multi-sensor data fusion, the solution design is reasonable and meets the task requirements.

# Summary

The whole competition was not smooth sailing: we tried many solutions to find the best one, and the entire process was one of continuous learning. The clever use of visual odometry, lidar-assisted obstacle avoidance, the combination of deep learning with traditional vision methods, and the flexible application of innovative ideas finally fulfilled all the task requirements and added extra features where appropriate. I believe the solutions and experience gained in this process will also be of great help to future drone research and study.
Applying theory to practice, and then carrying it forward into development and application, is how we truly put what we have learned to use.