More advanced fully automatic parking systems are combined with millimeter-wave radar, which offers stronger ranging and anti-interference capability. For example, to support powerful automatic parking, some vehicles provide up to 12 parking radars, while a 360° surround-view camera lets the driver see the situation around the vehicle from inside the car and personally intervene in the parking maneuver when necessary. Combined with sensing technology from autonomous driving, including millimeter-wave radar and lidar, detection range and detection accuracy can be improved considerably.
Autonomous driving: AVM surround-view system algorithm framework for automatic parking
AVM is a relatively mature technology deployed in mid-to-high-end vehicle models, but few technical blogs describe the AVM system algorithm in detail. I have built an AVM algorithm framework and produced some good demos; my main goal here is to explain each operator in the AVM algorithm framework clearly. This article combines theory with practice and contains some code, so it is suitable for readers with basic knowledge of computer vision.
AVM System Overview
As shown in the figure, the AVM surround-view system consists of four fisheye cameras mounted on the front bumper, the trunk, and the left and right rearview mirrors. The operators in the pipeline, in order, are: dedistortion, joint calibration of the four fisheye cameras, projective transformation, bird's-eye-view fine-tuning, stitching and fusion, 3D-model texture mapping, etc. The images captured by the four fisheye cameras pass through these operators to produce the 2D and 3D panoramic views. The AVM algorithm is divided into an offline stage and an online stage; the online stage is a simplification of the offline stage and is better suited for engineering deployment.
Schematic diagram of the AVM system
Offline phase algorithm pipeline
Let's take a quick look at the operators included in the AVM algorithm Pipeline:
2D AVM
2D AVM Pipeline
3D AVM
3D AVM Pipeline
Fisheye camera dedistortion based on the distortion table
1.1 Fisheye camera distortion model

Ordinary and wide-angle cameras generally use perspective projection: by the principle of similar triangles, objects in the three-dimensional world (in camera coordinates) are projected onto a plane. This is the ideal, distortion-free perspective projection model. In practice, however, the captured image deviates from the ideal model through radial distortion (barrel, pincushion) and tangential distortion, which is why distortion is corrected during camera calibration. For fisheye cameras there are many assumed projection models, such as equidistant projection, equisolid-angle projection, orthographic projection, and stereographic projection. Real fisheye lenses do not follow any of these model assumptions exactly, so Kannala and Brandt proposed a general polynomial form applicable to different types of fisheye cameras:

θ_d = θ · (1 + k1·θ² + k2·θ⁴ + k3·θ⁶ + k4·θ⁸)

which is also the fisheye distortion model used in OpenCV. In the figure below, θ is the incident angle of the ray, and r is the distance from the ray's image to the center on the camera's normalized plane or on its imaging plane (in OpenCV, θ_d represents the imaging position of the ray on the camera's normalized plane).
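The polynomial model above can be sketched in a few lines of Python. This is a minimal illustration of the OpenCV-style fisheye model, not production code; the coefficient values are hypothetical:

```python
import numpy as np

def kannala_brandt_theta_d(theta, k1, k2, k3, k4):
    """OpenCV-style fisheye model: the distorted angle theta_d is an odd
    polynomial of the incident angle theta (in radians)."""
    t2 = theta * theta
    return theta * (1.0 + k1 * t2 + k2 * t2**2 + k3 * t2**3 + k4 * t2**4)

# With all coefficients zero the model reduces to equidistant projection,
# i.e. r = theta on the normalized plane.
theta = np.deg2rad(30.0)
r = kannala_brandt_theta_d(theta, 0.0, 0.0, 0.0, 0.0)
```

With nonzero coefficients, θ_d deviates from θ, which is exactly the radial distortion the later dedistortion step removes.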
Fisheye camera model

Camera dedistortion usually uses the checkerboard calibration method: a fairly good initial solution is first obtained through matrix derivation, and nonlinear optimization then yields the optimal solution, including the camera's intrinsic parameters, extrinsic parameters, and distortion coefficients, after which the fisheye image can be dedistorted. The intrinsic parameters form the camera intrinsic matrix

K = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]

However, this calibration method is not suitable for our scenario.

1.2 Fisheye image dedistortion based on the manufacturer's distortion table

Because checkerboard calibration fits a global optimum over the whole image, the checkerboard captures must cover the entire image area. Our assumed scenario is that the vehicle is calibrated on the assembly line, i.e. the cameras are already mounted on the vehicle. Due to occlusion by the vehicle body, full coverage is obviously hard to guarantee, and checkerboard calibration is also unsuitable for mass production. We therefore use the distortion table provided by the camera manufacturer to dedistort the fisheye images. Camera manufacturers employ professional optical engineers, and distortion tables from large manufacturers are usually quite accurate. There are also optimization methods built on top of the distortion table, such as estimating the optimal principal point position by minimizing reprojection error and then applying the table; in other scenarios, some people first calibrate the camera intrinsics and then use them in conjunction with the table. The following is a dedistortion method based on the distortion table:
Distortion table provided by the manufacturer

The table above shows part of a camera distortion table. For a lens of focal length f, the manufacturer gives the real distance, in mm, from each imaging point to the center of the imaging plane. To use the dedistortion API provided by OpenCV, these values must first be converted to the camera's normalized plane using the manufacturer-provided focal length f (i.e. divided by f). The distortion coefficients are then obtained by polynomial fitting, for example with the curve_fit function from Python's scipy.optimize. Finally the OpenCV API is called, with m_distortion_coeffs set to the fitted polynomial coefficients:
```cpp
fisheye::initUndistortRectifyMap(m_intrinsic_matrix, m_distortion_coeffs, R,
                                 NewCoeff, image_size * 2, CV_32FC1,
                                 mapx, mapy);
cv::remap(disImg, undisImg, mapx, mapy, INTER_LINEAR);
```
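The polynomial-fitting step can be sketched as follows. Since the unknown coefficients enter linearly, plain linear least squares works just as well as curve_fit; the focal length, table values, and function name here are hypothetical, and the fit is checked against a synthetic table generated from known coefficients:

```python
import numpy as np

def fit_fisheye_coeffs(theta, r_mm, f_mm):
    """Fit the four fisheye coefficients k1..k4 of the model
    theta_d = theta*(1 + k1*theta^2 + ... + k4*theta^8)
    to a manufacturer distortion table by linear least squares."""
    theta = np.asarray(theta, dtype=float)
    theta_d = np.asarray(r_mm, dtype=float) / f_mm  # convert to normalized plane
    # The unknowns are linear: theta_d - theta = k1*theta^3 + ... + k4*theta^9
    A = np.stack([theta**3, theta**5, theta**7, theta**9], axis=1)
    k, *_ = np.linalg.lstsq(A, theta_d - theta, rcond=None)
    return k  # array([k1, k2, k3, k4])

# Sanity check on a synthetic distortion table built from known coefficients
f = 1.3                                    # hypothetical focal length in mm
true_k = np.array([-0.05, 0.01, -0.002, 0.0004])
th = np.deg2rad(np.arange(1, 100, dtype=float))
r_table = f * (th + true_k[0]*th**3 + true_k[1]*th**5
               + true_k[2]*th**7 + true_k[3]*th**9)
k_fit = fit_fisheye_coeffs(th, r_table, f)
```

The fitted coefficients are what would be passed to OpenCV as m_distortion_coeffs.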
In layman's terms, fisheye dedistortion traverses the pixel coordinates of the desired distortion-free image and, through the two lookup tables mapx and mapy, finds each coordinate's pixel position on the distorted image. That position is usually floating-point, so bilinear interpolation is required; otherwise texture edges show jagged artifacts. I verified this by re-implementing OpenCV's remap function; if you are interested, you can implement the mapping process (lookup + interpolation) yourself. Let's look at the result:
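The lookup-plus-interpolation process described above can be re-implemented in a few lines of NumPy. This is a minimal sketch of what cv::remap does with INTER_LINEAR, without the border-handling options of the real API:

```python
import numpy as np

def remap_bilinear(src, mapx, mapy):
    """Minimal re-implementation of cv::remap with bilinear interpolation.
    For each destination pixel (u, v), (mapx[v, u], mapy[v, u]) is the
    floating-point source position to sample."""
    h, w = src.shape[:2]
    # Integer corner of the 2x2 neighborhood, clamped inside the image
    x0 = np.clip(np.floor(mapx).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(mapy).astype(int), 0, h - 2)
    # Fractional offsets used as interpolation weights
    fx = np.clip(mapx - x0, 0.0, 1.0)
    fy = np.clip(mapy - y0, 0.0, 1.0)
    if src.ndim == 3:                       # broadcast weights over channels
        fx, fy = fx[..., None], fy[..., None]
    top = src[y0, x0] * (1 - fx) + src[y0, x0 + 1] * fx
    bot = src[y0 + 1, x0] * (1 - fx) + src[y0 + 1, x0 + 1] * fx
    return top * (1 - fy) + bot * fy

# An identity map reproduces the source image exactly
src = np.arange(16, dtype=float).reshape(4, 4)
mapx, mapy = np.meshgrid(np.arange(4, dtype=float), np.arange(4, dtype=float))
out = remap_bilinear(src, mapx, mapy)
```

Shifting mapx by 0.5 returns the average of horizontal neighbors, which is exactly the smoothing that removes the jagged edges mentioned above.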
Fisheye image dedistortion

The right figure shows the result of dedistortion based on the distortion table. The effect generally meets the requirements: the edges of the pillars, the edges of the calibration cloth, and the lane lines come out straight. In some areas, however, the result is still poor and lines are not straight enough. This problem is more prominent in the bird's-eye view and is an important cause of misalignment in the overlapping regions of the stitch. Possible reasons: (1) the principal point, where the optical axis intersects the imaging plane, does not coincide with the image center, i.e. (cx, cy) in the intrinsic matrix differs from (W/2, H/2); (2) the focal length f given by the manufacturer is inaccurate; (3) the distortion table itself has errors. In theory, camera calibration computes a global optimum. It can be understood as follows: the intrinsics need not be perfectly accurate, and neither must the distortion table, but as long as the reprojection error being optimized is small, i.e. the distortion is removed cleanly, the joint solution is acceptable. Therefore one can estimate the intrinsics by minimizing reprojection error and then apply the distortion table; in some scenarios, people instead calibrate the intrinsics with a checkerboard and use them together with the table. These refinements will be addressed later.
Four-way fisheye joint calibration

The purpose of fisheye camera joint calibration is to obtain the pose relationship among the four fisheye cameras, so that the captured images can be placed into one common coordinate system to form the panoramic view.
The schematic diagram of joint calibration is shown in the figure. The field of view of the panoramic bird's-eye view is a human-specified parameter and can be adjusted to user preference. The size of the checkerboard on the calibration cloth, the size of its black squares, and the distance between the car and the calibration cloth are all known prior information. The scale between these real-world priors and the panoramic view is 1:1, i.e. one pixel represents 1 cm (this scale is also adjustable; one pixel may represent n centimeters). The point of joint calibration is that we know both the coordinates of the checkerboard corners in each dedistorted fisheye image (front, rear, left, right) and the corresponding corner coordinates in each bird's-eye view, so each whole image can be warped onto its bird's-eye view by a projective transformation. Moreover, since the four bird's-eye views are designed in joint calibration to tile together exactly, projecting all four images this way should, in the ideal error-free case, yield views that stitch seamlessly. That is the idea behind joint calibration.
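The projective transformation from corner correspondences can be sketched with the classic Direct Linear Transform. This is a bare-bones stand-in for cv2.findHomography with no outlier handling; the corner pixel coordinates and metric positions below are hypothetical examples, chosen to match the 1 px = 1 cm convention above:

```python
import numpy as np

def find_homography_dlt(src_pts, dst_pts):
    """Direct Linear Transform: estimate the 3x3 homography H, dst ~ H @ src,
    from four or more point correspondences via the SVD null space."""
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)       # right-singular vector of smallest value
    return H / H[2, 2]             # normalize so H[2,2] == 1

# Hypothetical checkerboard corners detected in a dedistorted fisheye image...
src = [(100.0, 200.0), (400.0, 210.0), (120.0, 500.0), (420.0, 520.0)]
# ...and their known metric positions on the bird's-eye view (1 px = 1 cm)
dst = [(0.0, 0.0), (120.0, 0.0), (0.0, 120.0), (120.0, 120.0)]
H = find_homography_dlt(src, dst)
```

Once H is known for each camera, warping every pixel of the dedistorted image through H (in practice with cv2.warpPerspective) produces that camera's bird's-eye view.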