Validation of smart car radar and camera models

Publisher: 素心轻语 | Last updated: 2020-08-13 | Source: eefocus

 

2. Simulation verification of sensor functional model

In this paper, three sensor models are built with reference to the parameters of the Delphi ESR millimeter-wave radar, the Ibeo four-line lidar, and the Delphi IFV300 monocular camera. The models are simulated and verified as follows.

 

2.1 Millimeter-wave radar model

2.1.1 Functional Verification

The ESR millimeter-wave radar model senses moving and stationary obstacles of various kinds and represents each sensing result as a point target. The functional model simulates the radar's perception of a target's radial relative distance, radial relative speed, and azimuth angle.
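The three perceived quantities above can be sketched in Python. This is a minimal illustration, not the paper's implementation; the function name and the 2-D position/velocity representation are assumptions.

```python
import math

def radar_observation(ego_pos, ego_vel, tgt_pos, tgt_vel):
    """Ideal (noise-free) radar measurement of a point target:
    radial relative distance, radial relative speed, and azimuth angle."""
    dx = tgt_pos[0] - ego_pos[0]
    dy = tgt_pos[1] - ego_pos[1]
    r = math.hypot(dx, dy)                    # radial relative distance, m
    # Radial speed: projection of the relative velocity onto the line of sight
    rvx = tgt_vel[0] - ego_vel[0]
    rvy = tgt_vel[1] - ego_vel[1]
    v_r = (rvx * dx + rvy * dy) / r           # m/s, positive = receding
    theta = math.degrees(math.atan2(dy, dx))  # azimuth angle, degrees
    return r, v_r, theta
```

For a target 100 m directly ahead pulling away at 10 m/s, this returns a distance of 100 m, a radial speed of 10 m/s, and an angle of 0°.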

 

Figure 5 shows the perception results of the millimeter-wave radar model in a complex traffic environment. The model detects both moving vehicles and stationary obstacles and identifies the vehicle directly ahead as the ACC tracking target. The figure is an example taken after Gaussian white noise has been added; the remaining unmarked points are noise points at that moment.

 

Figure 5(a) is a screenshot of the virtual scene, which contains vehicles and roadside lampposts; Figure 5(b) shows the short-range and long-range millimeter-wave radar perception results. As the figure shows, the radar detects the vehicles and lampposts ahead, marks the four vehicles ahead in the ego lane as ACC tracking targets, and characterizes the occlusion relationships between vehicles. Here the main vehicle speed is 30 km/h and the target vehicle speed is 50 km/h.
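One simple way a sensor model can characterize occlusion between vehicles is by comparing the azimuth intervals that two targets subtend at the sensor. The sketch below illustrates the idea only; it is not the paper's method, and the helper names are hypothetical.

```python
import math

def angular_interval(ego, corners):
    """Azimuth interval (min, max) in radians subtended at the sensor
    by a target's corner points."""
    angs = [math.atan2(y - ego[1], x - ego[0]) for x, y in corners]
    return min(angs), max(angs)

def is_occluded(ego, far_corners, near_corners):
    """A far target is fully occluded when its azimuth interval lies
    entirely inside the interval subtended by a nearer target."""
    f0, f1 = angular_interval(ego, far_corners)
    n0, n1 = angular_interval(ego, near_corners)
    return n0 <= f0 and f1 <= n1
```

For example, a narrow vehicle at 60 m directly behind a wider vehicle at 20 m is reported occluded, while a vehicle offset far enough to the side is not.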

 

 

Figure 6 shows the millimeter-wave radar model's detection process on a curved road. Figures 6(a) and 6(b) show the results before the oncoming vehicle enters the detection range of the main vehicle's radar, and Figures 6(c) and 6(d) show the results after it enters. The display of lamppost detections is turned off in this figure. Here the main vehicle speed is 0 and the target vehicle speed is 50 km/h.

 

 

Figure 7 shows the millimeter-wave radar detection results while the vehicle ahead changes lanes. Figures 7(a) and 7(b) show the results during the lane change: vehicle 2 is changing lanes and occludes vehicles 3 and 4.

 

 

Figures 7(c) and 7(d) show the results after the lane change is complete: vehicle 2 is now in the same lane as vehicles 3 and 4. The unmarked points are lampposts. Here the main vehicle speed is 0 and the target vehicle speed is 50 km/h.

 

2.1.2 Performance Verification

To verify the perception accuracy of the millimeter-wave radar, a stationary vehicle 4.2 m long and 1.8 m wide was placed 14.9, 34.9, 54.9, 84.9, 124.9, and 164.9 m in front of the vehicle, at an angle of -0.2757°. The detection results of the radar model are shown in Figure 8.

 

 

The ideal distance and angle standard deviations are both set to σ = 0.5/3 = 0.1667. The measured averages are 0.1663 for the distance standard deviation and 0.1582 for the angle standard deviation, showing that the millimeter-wave radar model established in this paper simulates obstacle detection accurately.
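The σ = 0.5/3 figure follows the 3-sigma rule: if the stated detection accuracy is 0.5 m, setting σ to one third of it keeps about 99.7% of errors within the accuracy bound. A minimal sketch of this noise model and its statistical check (illustrative only; names are assumptions):

```python
import random
import statistics

def noisy_range(true_range, accuracy=0.5, rng=random):
    """Add zero-mean Gaussian noise with sigma = accuracy / 3
    (3-sigma rule: ~99.7% of errors fall within the stated accuracy)."""
    sigma = accuracy / 3.0
    return true_range + rng.gauss(0.0, sigma)

# Reproduce the verification idea: sample many detections of a fixed
# target and compare the sample standard deviation to the ideal sigma.
rng = random.Random(0)
samples = [noisy_range(54.9, rng=rng) for _ in range(100_000)]
print(round(statistics.stdev(samples), 3))  # close to 0.5/3 ≈ 0.167
```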

 

2.2 LiDAR Model

The lidar model established in this paper senses moving and stationary obstacles of various kinds, computes their geometric contours, estimates their classification, and simulates the lidar's perception of a target's relative position, speed, and geometric contour.

 

2.2.1 Functional Verification

Figure 9 shows the lidar model's perception of surrounding vehicles in a complex traffic environment. The model simulates occlusion and therefore perceives only nearby vehicles. The perceived geometric contours also differ: vehicles that are close and to either side of the main vehicle return a length close to the true value, whereas vehicles that are far away and directly ahead return only the width, with a very small length value.
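The length/width asymmetry described above can be captured with a simple face-visibility heuristic: a lidar sees a vehicle's rear face (giving the width) whenever the vehicle is ahead, but sees a side face (giving the length) only when the sensor is offset laterally. The sketch below is a rough illustration of that geometry, not the paper's contour algorithm; all names are hypothetical.

```python
def visible_extent(ego, tgt_center, length, width):
    """Rough visibility of an axis-aligned vehicle box from a lidar at
    `ego`: the front/rear face yields a width reading, and a side face
    (visible only with sufficient lateral offset) yields a length reading."""
    dx = tgt_center[0] - ego[0]
    dy = tgt_center[1] - ego[1]
    seen_width = width if abs(dx) > length / 2 else 0.0   # rear/front face visible
    seen_length = length if abs(dy) > width / 2 else 0.0  # side face visible
    return seen_length, seen_width
```

A 4.2 m x 1.8 m vehicle 60 m directly ahead returns only its width, while the same vehicle in an adjacent lane 10 m ahead returns both length and width.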

 

 

Figure 10 shows the lidar detection results when the vehicle ahead is occluded. Figures 10(a) and 10(b) show that car 1 occludes car 2, so the extent of car 2 visible to the lidar is smaller than the set threshold and car 2 is not displayed.

 

Figures 10(c) and 10(d) show car 1 continuing at a higher speed than car 2, so the portion of car 2 occluded by car 1 shrinks. The portion visible to the lidar now exceeds the set threshold, but the side face of car 2 is still not detected, so only its body width is displayed, not its length.

 

 

2.2.2 Performance Verification

To verify the lidar's perception accuracy, the main vehicle is placed at the origin; the target vehicle is at X = 80 m, Y = 0, Z = 0 with dimensions 4.2 m × 1.8 m, and the lidar is mounted at X = 0, Y = 0, Z = 0. The detection data of the lidar model are listed in Table 1. As Table 1 shows, the simulated lidar's distance and width readings agree closely with the ideal values.

 

 

2.3 Camera Model

The camera model developed in this paper simulates the camera's perception of the vehicle's relative position, longitudinal speed, width, type, and lane markings.

 

2.3.1 Functional Verification

The camera model established in this paper perceives the lane markings on the road (four in this paper: two on each side of the main vehicle) and handles road conditions such as curves and ramps. The overall perception result is shown in Figure 11.

 

 

2.3.2 Performance Verification

With noise enabled, the target vehicle is 4.2 m × 1.8 m and the camera is mounted at X = +3 m, Y = 0, Z = 0. The longitudinal distance detection accuracy is 2 m (σx,exp = 2/3 = 0.6667), the lateral distance detection accuracy is 0.5 m (σy,exp = 0.5/3 = 0.1667), and the target width detection accuracy is 0.5 m. The camera model detection data are shown in Table 2.
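Like the radar model, the camera's error model can be read as independent Gaussian noise per channel, with each sigma set to one third of the stated accuracy. A minimal sketch (not the paper's code; names and the independence assumption are mine):

```python
import random

def camera_detection(true_x, true_y, true_w, rng,
                     acc_x=2.0, acc_y=0.5, acc_w=0.5):
    """Simulated camera measurement of (longitudinal distance, lateral
    distance, target width) with independent Gaussian errors; each sigma
    is one third of the stated detection accuracy (3-sigma rule)."""
    return (true_x + rng.gauss(0.0, acc_x / 3),   # sigma_x = 2/3 m
            true_y + rng.gauss(0.0, acc_y / 3),   # sigma_y = 0.5/3 m
            true_w + rng.gauss(0.0, acc_w / 3))   # sigma_w = 0.5/3 m
```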

 

 

Table 2 shows that when the camera model established in this paper simulates obstacle detection, the measured distance and width values are essentially consistent with the ideal values, and the measured distance and width variances deviate only slightly from the ideal values.

 

3. Model calculation efficiency verification and conclusion

The sensor models established in this paper improve computational efficiency while meeting real-time requirements, so they can serve multiple smart cars in a simulation scenario.

 

To test the three sensor models, ten instances of each sensor model with identical parameters were run simultaneously on an ordinary PC, and the average time for each instance over 10,000 runs was recorded. The results are shown in Table 3.
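The timing procedure described above can be sketched as follows. The `run_model` body is a placeholder standing in for one sensor-model update; everything else mirrors the described measurement (10 instances, 10,000 runs each, report the per-run average).

```python
import time

def run_model():
    # Placeholder for one sensor-model update step (not the paper's model)
    s = 0.0
    for i in range(100):
        s += i * 0.5
    return s

N_INSTANCES, N_RUNS = 10, 10_000
totals = []
for _ in range(N_INSTANCES):
    t0 = time.perf_counter()
    for _ in range(N_RUNS):
        run_model()
    totals.append(time.perf_counter() - t0)

# Average wall-clock time per model run, in milliseconds
avg_ms = sum(totals) / (N_INSTANCES * N_RUNS) * 1000.0
print(f"average per run: {avg_ms:.4f} ms")
```

`time.perf_counter()` is used rather than `time.time()` because it is a monotonic high-resolution clock suited to short-interval benchmarking.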

 

 

The processor is an Intel(R) Core(TM) i7-4790 with a base frequency of 3.6 GHz. The simulation results show that the sensor functional models established in this paper reflect the necessary physical characteristics of the sensors while achieving very high computational efficiency, each run taking less than 0.2 ms, and support concurrent operation, enabling concurrent simulation of multiple intelligent vehicles on a virtual test platform.

 

Conclusion

Based on the above modeling framework and key methods, and referring to the parameters of vehicle-mounted sensor products, the millimeter-wave radar, lidar, and camera models established in this paper can simulate physical characteristics such as occlusion between objects and sensor perception errors. Dozens of sensor models can be run concurrently on a computer, and the calculation cycle of each model is at the sub-millisecond level, which can support real-time simulation of complex test scenarios involving multiple intelligent vehicles at the same time.

