Simulation Method of Vehicle Camera in ADAS HiL

Publisher: upsilon30 | Last updated: 2024-05-27 | Source: elecfans

Vehicle autonomous driving comprises perception, decision-making and execution, and perception is the source of the whole chain and a key module of the autonomous driving system. While the vehicle is driving, the perception system collects information about the surrounding environment in real time through its sensors; it acts as the "eyes" of the autonomous vehicle and gives the car an observation capability comparable to that of a human driver. The perception system mainly consists of sensors such as cameras, ultrasonic radars, millimeter-wave radars and (optionally) lidars. As the main environmental perception sensor, the camera plays a very important role: it can provide 360° all-round visual perception, compensates for radar's weaknesses in object recognition, and is the sensor closest to human vision. As autonomous driving technology develops, vehicles require more and more on-board cameras, with ever higher resolution and stronger stability.


At present, the cameras installed in L2+ and L3 vehicles fall into five categories according to their mounting position: front-view, surround-view, rear-view, side-view and in-cabin cameras. While driving, the front-view, side-view and rear-view cameras are fused with the millimeter-wave radar and lidar, and information such as the drivable area and target obstacles is provided to the algorithm module to realize functions such as ACC/ICA/NCA, AEB, LKA and TSR. At the same time, the in-cabin camera monitors the driver's state to implement fatigue monitoring; when parking, the surround-view cameras and ultrasonic radar jointly perceive the parking-space environment to realize the APA function. The on-board camera therefore plays an important role in the vehicle's advanced driver assistance system (ADAS) and provides strong support for driving safety.


So how is camera simulation implemented in ADAS HiL testing? Beihui Information offers the following two implementation approaches.
Video dark box

The video dark box routes the video signal of the virtual simulation scene to a display inside the dark box, films the display with a real camera, and transmits the captured video signal to the autonomous driving controller over a coaxial cable, so that the controller believes it is operating in a real vehicle environment; this achieves the purpose of testing the ADAS controller.

Figure 2 Schematic diagram of the video dark box solution

The dark box equipment mainly consists of a box, a slide rail, a display, a lens, a camera and the related brackets and bases. Because a real camera is used, the video dark box does not require the OEM or Tier 1 supplier to provide the communication protocol between the image acquisition module and the image processing module. The method is easy to implement and low in cost, but the camera position and angle must be set precisely according to the size of the display, the setup is easily affected by ambient light and by the display itself, and the display refresh rate may introduce latency into image recognition. The solution is suitable for monocular cameras with a field of view below 120° (surround-view cameras therefore cannot use it). The dark box equipment is bulky, one dark box supports only one camera, and the simulation accuracy is relatively low.

Figure 3 Overall structure of the video dark box
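To make the placement constraint concrete, here is a minimal sketch (the function name and the example display width and FOV values are assumptions, not figures from the article) that estimates, under a simple pinhole model, the camera-to-display distance at which a flat display exactly fills the camera's horizontal field of view. It also shows why very wide fields of view are impractical in a dark box: as the FOV approaches 180° the required distance shrinks toward zero, which is consistent with the 120° limit and the exclusion of surround-view cameras.

```python
import math

def required_camera_distance(display_width_m: float, horizontal_fov_deg: float) -> float:
    """Distance at which a display of the given width exactly fills the camera's
    horizontal field of view (pinhole approximation, lens distortion ignored)."""
    half_fov_rad = math.radians(horizontal_fov_deg / 2.0)
    return (display_width_m / 2.0) / math.tan(half_fov_rad)

if __name__ == "__main__":
    # Hypothetical example: a 0.60 m wide display, three candidate camera FOVs
    for fov in (60.0, 100.0, 120.0):
        d = required_camera_distance(0.60, fov)
        print(f"FOV {fov:5.1f} deg -> camera about {d:.3f} m from the display")
```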

Camera calibration for the dark box is divided into two parts. The first is the hardware position calibration, which keeps the centers of the camera, lens and display on one horizontal line; the second is calibrating the lane lines captured from the simulation scene.

Video injection

The video injection approach injects the camera's raw data stream, using VX1161 video re-injection hardware to replace the ADAS system's on-board camera sensors. The camera simulation device receives the video signals of the different camera perspectives of the virtual simulation scene through HDMI/DVI interfaces and, after internal image processing, injects video signals in the specified protocol into the ADAS controller.


Figure 4 Schematic diagram of the video injection solution

Video injection technology is unaffected by ambient light, offers high simulation accuracy, and supports online adjustment of the camera signal's color space (RGB, YUV, RAW, etc.). A single VX1161 video re-injection unit supports several camera channels simultaneously and the device is compact; when multiple camera channels are simulated, the video signals of all channels can be synchronized via the serializer trigger signal, making the approach suitable for multi-camera, multi-channel injection. However, video injection requires specific video protocol information, so the OEM or Tier 1 supplier must provide the communication protocol between the image acquisition module and the image processing module; development involves technical difficulties such as distortion calibration and color-difference adjustment, and the cost is high. The video injection system supports configuring multiple camera mounting positions and characteristics (including resolution, frame rate, optics and sensor features) and suits a wide range of camera-based applications. With DYNA4 as the scene and vehicle dynamics simulation software, video injection combined with a camera model can also reproduce lens and imaging effects in the simulation environment, such as screen flicker, lens distortion, fisheye and motion blur, and can simulate imaging failures caused by sudden changes in ambient light, such as short-term over- or under-exposure of the camera, gain errors on some or all channels, imaging noise or image distortion, and lens obstruction by rain, fog or mud.
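As an illustration of a few of the sensor-level effects listed above (over-/under-exposure, per-channel gain errors and imaging noise), here is a minimal NumPy sketch; the function name and the example parameter values are hypothetical and are not part of DYNA4 or the VX1161 tool chain.

```python
import numpy as np

def apply_camera_effects(frame: np.ndarray,
                         exposure_gain: float = 1.0,
                         channel_gains: tuple = (1.0, 1.0, 1.0),
                         noise_sigma: float = 0.0) -> np.ndarray:
    """Degrade an ideal rendered RGB frame (uint8, HxWx3) with simple sensor effects:
    global over-/under-exposure, a per-channel gain error and additive Gaussian noise."""
    img = frame.astype(np.float32)
    img *= exposure_gain                                      # over-/under-exposure
    img *= np.asarray(channel_gains, dtype=np.float32)        # wrong gain on some channels
    if noise_sigma > 0.0:
        img += np.random.normal(0.0, noise_sigma, img.shape)  # imaging noise
    return np.clip(img, 0, 255).astype(np.uint8)

# Hypothetical usage: an over-exposed frame with a green-channel gain error and noise
rendered = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
faulty = apply_camera_effects(rendered, exposure_gain=2.5,
                              channel_gains=(1.0, 1.6, 1.0), noise_sigma=8.0)
```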

For the video injection solution, the camera simulation model must be generated from real distortion data, FOV, pixel size, resolution and other parameters, but the simulated model still shows slight distortion differences from the real vehicle camera, so calibration is required. There are two calibration methods. The first is to capture images from the camera model, calculate their distortion parameters, and modify the camera distortion parameters configured in the ADAS controller. The second is to compare black-and-white checkerboard images generated by the model with images from the real camera and fine-tune the simulation model's parameters until the distortion parameters match.
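As a sketch of this calibration workflow, the snippet below uses the standard OpenCV checkerboard calibration to estimate the distortion coefficients of frames rendered by the camera model; the image folder, checkerboard dimensions and square size are assumptions for illustration. The resulting coefficients can then be compared with the real camera's, either to update the distortion parameters configured in the ADAS controller (first method) or to guide fine-tuning of the simulation model (second method).

```python
import glob

import cv2
import numpy as np

# Assumed checkerboard: 9 x 6 inner corners, 25 mm squares
PATTERN = (9, 6)
SQUARE_M = 0.025

# 3D corner coordinates of the ideal (flat) checkerboard
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_M

obj_points, img_points, image_size = [], [], None
for path in glob.glob("model_checkerboard/*.png"):   # frames rendered by the camera model
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]                 # (width, height)

_, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)

print("Model intrinsics:\n", camera_matrix)
print("Model distortion (k1, k2, p1, p2, k3):", dist_coeffs.ravel())
# Compare dist_coeffs with the real camera's coefficients, then either update the
# controller configuration or adjust the simulation model until they match.
```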



In ADAS HiL, the simulated camera video stream is transmitted to the controller together with the vehicle dynamics model data and the other sensor data, and the test is managed in the CANoe software to form a closed loop. Typical applications of camera simulation in ADAS HiL include the following:

1. The simulated camera can reproduce a wide range of real-world scenarios and conditions, including different road, weather and traffic situations. By simulating these scenarios, the performance and robustness of the controller can be evaluated across many situations.

2. The ADAS controller receives the raw video stream together with lidar point cloud data and the target lists from the millimeter-wave and ultrasonic radars, allowing the controller's ability to fuse and process data from different sensors to be evaluated.

3. Camera simulation can also be used to test and verify control algorithms and functions. By simulating various scenarios and situations, the accuracy and reliability of controller functions such as object detection, object tracking, lane keeping and automatic emergency braking can be verified.

4. Faults such as all-black or all-white frames, superimposed noise, motion blur, frame loss and latency can be injected into the video stream and fed to the controller to verify its functional safety mechanisms (a minimal fault-injection sketch follows this list).
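A minimal sketch of point 4, assuming the frames arrive as NumPy arrays from the scene renderer; the function and parameter names are hypothetical, and noise or motion blur could be injected at the same point, for example with the sensor-effect sketch shown earlier.

```python
import random
import time

import numpy as np

def inject_faults(frames, mode: str = "none", drop_prob: float = 0.0, delay_s: float = 0.0):
    """Wrap a stream of RGB frames (uint8, HxWx3) and inject simple video faults
    (all-black / all-white frames, random frame loss, added latency) before the
    frames would be handed to the re-injection hardware."""
    for frame in frames:
        if delay_s > 0.0:
            time.sleep(delay_s)                      # latency fault
        if drop_prob > 0.0 and random.random() < drop_prob:
            continue                                 # frame loss
        if mode == "black":
            yield np.zeros_like(frame)               # all-black frames
        elif mode == "white":
            yield np.full_like(frame, 255)           # all-white frames
        else:
            yield frame

# Hypothetical usage: ten synthetic frames with 40 ms extra latency and 20 % frame loss
source = (np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8) for _ in range(10))
for f in inject_faults(source, mode="black", drop_prob=0.2, delay_s=0.04):
    pass  # on a real bench this is where the frame would go to the injection hardware
```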



Summary

This article first introduced the role of vehicle cameras in ADAS systems, then focused on the differences between the two camera simulation solutions used in ADAS HiL, the video dark box and video injection, and finally briefly outlined the applications of vehicle camera simulation in ADAS HiL.
As Vector's technical partner, Northlink Information covers intelligent driving system MiL/HiL/ViL testing, vehicle networking testing, sensor perception testing and more, providing customers with high-quality intelligent driving test solutions, integrated test systems and services, and supporting the rapid verification and testing of intelligent driving simulation test systems.



