Autonomous driving comprises perception, decision-making, and execution, and perception is the source of the whole process and a core module of the autonomous driving system. While the vehicle is driving, the perception system collects information about the surrounding environment in real time through its sensors. It acts as the "eyes" of the autonomous car, giving it observation abilities comparable to a human driver's. The perception system consists mainly of cameras, ultrasonic radars, millimeter-wave radars, and (optionally) lidars. As the primary environmental perception sensor, the camera plays a very important role: it can provide 360° all-round visual perception, compensates for radar's weakness in object recognition, and is the sensor closest to human vision. As autonomous driving technology develops, vehicles carry more and more on-board cameras, with ever higher resolution and stronger stability requirements.
At present, the cameras installed in L2+ and L3 vehicles fall into five categories by mounting position: front view, surround view, rear view, side view, and in-cabin. While driving, data from the front-view, side-view, and rear-view cameras is fused with millimeter-wave radar and lidar data, and information such as the drivable area and target obstacles is passed to the algorithm modules to implement functions such as ACC/ICA/NCA, AEB, LKA, and TSR. Meanwhile, the in-cabin camera monitors the driver's state to implement fatigue monitoring. When parking, the surround-view cameras and ultrasonic radars jointly perceive the parking-space environment to implement the APA function. The on-board camera plays an important role in the car's advanced driver assistance system (ADAS), providing strong support for driving safety.
So how is camera simulation implemented in ADAS HiL testing? Beihui Information offers the following two solutions.
Video dark box
The video dark box solution feeds the virtual simulation scene's video signal to a display inside a light-sealed box, films the display with a real camera, and transmits the captured video signal to the autonomous driving controller over a coaxial cable. The controller then behaves as if it were in a real vehicle environment, which makes it possible to test the ADAS controller.
Figure 2 Schematic diagram of the video dark box solution
The dark box equipment consists mainly of the box itself, a slide rail, a display, a lens, a camera, and the related brackets and bases. The video dark box does not require the OEM or Tier 1 supplier to provide the communication protocol between the image acquisition module and the image processing module, because it uses a real camera. The method is easy to implement and low in cost, but the camera's position and angle must be set precisely according to the size of the display, the setup is easily affected by ambient light and by the display itself, and the display's refresh rate may introduce latency into image recognition. The solution is suitable for monocular cameras whose field of view is below 120° (surround-view cameras cannot use it). The dark box equipment is bulky, one box supports only a single camera, and the accuracy is low.
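The placement constraint above can be made concrete with a simple pinhole-camera calculation. The sketch below (illustrative only, with made-up numbers, not a description of Beihui's actual fixture) computes the distance at which a camera with a given horizontal field of view exactly frames a display of a given width:

```python
import math

def darkbox_camera_distance(display_width_mm: float, hfov_deg: float) -> float:
    """Distance (mm) at which a camera with the given horizontal FOV
    exactly frames a display of the given width (ideal pinhole model)."""
    if not 0 < hfov_deg < 180:
        raise ValueError("horizontal FOV must be in (0, 180) degrees")
    half_fov = math.radians(hfov_deg / 2)
    # The display half-width and the distance form a right triangle
    # whose angle at the camera is half the FOV.
    return (display_width_mm / 2) / math.tan(half_fov)

# Example: a 600 mm wide display filmed by a 100-degree-FOV camera.
d = darkbox_camera_distance(600, 100)  # roughly 252 mm
```

As the FOV grows toward 120° and beyond, the required distance shrinks so much that a real lens can no longer be positioned and focused practically, which is consistent with the article's restriction to cameras under 120°.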
Figure 3 Overall structure of the video dark box
Camera calibration is divided into two parts: first, hardware position calibration, which keeps the centers of the camera, lens, and display on one horizontal line; second, calibration of the captured lane lines against the simulation scene.
Video injection
The video injection system injects the camera's raw data stream, using VX1161 video re-injection hardware to replace the on-board camera sensors of the ADAS system. The camera simulation device receives the video signals for the different camera perspectives of the virtual simulation scene through an HDMI/DVI interface and, after internal image processing, injects video signals in the required specific protocol into the ADAS controller.
Figure 4 Schematic diagram of the video injection solution
Video injection is unaffected by ambient light, offers high simulation accuracy, and supports online adjustment of the camera signal's color space (RGB, YUV, RAW, etc.). A single VX1161 video re-injection unit supports multi-channel camera simulation, and the device is compact. When simulating multiple camera channels, the video signals of each channel can be synchronized via the serializer's trigger signal, making the approach suitable for multi-camera, multi-channel injection. However, video injection requires specific video protocol information: the OEM or Tier 1 supplier must provide the communication protocol between the image acquisition module and the image processing module. Development involves technical difficulties such as distortion calibration and color-difference adjustment, and the cost is high. The video injection system supports configuring multiple camera mounting positions and characteristics (including resolution, frame rate, optics, and sensor features) and suits a wide range of camera-based applications. Using DYNA4 as the scene and vehicle dynamics simulation software, video injection plus a camera model can also reproduce lens effects in the simulation environment, such as screen flicker, lens distortion, fisheye projection, and motion blur, and can simulate imaging failures caused by sudden changes in ambient light, such as short-term over- or underexposure of the camera, gain-adjustment errors on some or all channels, imaging noise or image distortion, and lens obstruction by rain, fog, or mud.
For the video injection solution, the camera simulation model is generated from real distortion data, FOV, pixel size, resolution, and other parameters, but the model still shows slight distortion differences from the real vehicle camera, so calibration is required. There are two calibration methods. The first is to take the picture produced by the camera model, calculate its distortion parameters, and modify the camera distortion parameters configured in the ADAS controller. The second is to compare a black-and-white checkerboard image generated by the model with the image from the real camera and fine-tune the simulation model's parameters until the distortion matches.
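The second method, matching the model's distortion to the real camera's, can be sketched with the common Brown-Conrady radial distortion model. This is a minimal illustration with invented coefficient values, not the actual Beihui/Vector calibration procedure: it distorts a grid of normalized image points under two parameter sets and reports the worst-case displacement, a crude metric one could drive toward zero while fine-tuning the model.

```python
def radial_distort(x, y, k1, k2):
    """Apply the Brown-Conrady radial distortion model (two radial
    coefficients) to a normalized image point (x, y)."""
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def max_grid_deviation(params_a, params_b, grid=10, extent=0.5):
    """Largest displacement between two distortion parameter sets over
    a grid of normalized points -- a crude similarity metric."""
    worst = 0.0
    for i in range(grid + 1):
        for j in range(grid + 1):
            x = -extent + 2 * extent * i / grid
            y = -extent + 2 * extent * j / grid
            xa, ya = radial_distort(x, y, *params_a)
            xb, yb = radial_distort(x, y, *params_b)
            worst = max(worst, ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5)
    return worst

# Hypothetical coefficients: "real" camera vs. simulation model.
real_cam = (-0.30, 0.10)
sim_model = (-0.28, 0.09)
dev = max_grid_deviation(real_cam, sim_model)
```

In practice the coefficients for the real camera would come from checkerboard images, and the simulation model's parameters would be adjusted until the deviation is negligible.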
In ADAS HiL, the simulated camera video stream is transmitted to the controller together with the vehicle dynamics model data and the other sensor data, and the experiment is managed in CANoe software to form a closed-loop link. The main applications are as follows:
1. The simulated camera can reproduce a wide range of real-world scenarios and situations, including different road conditions, weather, and traffic. By simulating these scenarios, the controller's performance and robustness can be evaluated across all of them.
2. The ADAS controller receives raw video stream data, lidar point clouds, and millimeter-wave and ultrasonic radar target lists, which allows its ability to fuse and process data from different sensors to be evaluated.
3. Camera simulation can also be used to test and verify control algorithms and functions. By simulating various scenarios and situations, functions such as object detection, object tracking, lane keeping, and automatic emergency braking can be verified for accuracy and reliability.
4. Faults such as all-black frames, all-white frames, superimposed noise, motion blur, frame loss, and delay can be injected into the video stream to verify the controller's functional safety mechanisms.
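The fault patterns in the last point can be sketched as simple frame transformations. This is an illustrative toy (8-bit grayscale frames as nested lists, invented function names), not the actual VX1161 fault-injection tooling:

```python
import random

def black_frame(w, h):
    """All-black fault: every pixel forced to 0."""
    return [[0] * w for _ in range(h)]

def white_frame(w, h):
    """All-white fault: every pixel saturated at 255."""
    return [[255] * w for _ in range(h)]

def add_noise(frame, sigma=20, seed=0):
    """Noise-superposition fault: Gaussian noise clamped to 8-bit range."""
    rng = random.Random(seed)
    return [[min(255, max(0, int(p + rng.gauss(0, sigma)))) for p in row]
            for row in frame]

def drop_frames(stream, period=5):
    """Frame-loss fault: every `period`-th frame is replaced by a
    repeat of the previous frame."""
    out = []
    for i, f in enumerate(stream):
        out.append(out[-1] if out and i % period == 0 else f)
    return out

noisy = add_noise(white_frame(4, 3))
dropped = drop_frames(list(range(10)), period=5)
```

Injecting such corrupted streams lets the test bench check that the controller detects the degradation and enters its defined safe state rather than acting on bad data.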
Summary
This article first introduced the role of vehicle cameras in ADAS systems, then focused on the differences between the two camera simulation solutions in ADAS HiL, video dark box and video injection, and finally briefly described the applications of vehicle camera simulation in ADAS HiL.
As Vector's technical partner, Northlink Information covers intelligent driving system MiL/HiL/ViL testing, vehicle networking testing, sensor perception testing, and more, providing customers with high-quality intelligent driving test solutions, integrated test systems, and services that accelerate the verification and testing of intelligent driving simulation systems.
Latest update time: 2024-11-16 09:43