PlusAI has delivered another answer: it has launched a stereoscopic vision perception technology with an effective range of 1,600 meters to accelerate the deployment of driverless trucks.
Recently, PlusAI released its latest multi-view stereoscopic perception solution: a stereoscopic vision technology with an effective range of about 1 mile (1,600 meters) that can accurately determine the position and speed of objects. Combined with a multi-sensor fusion solution, it will greatly improve system safety.
So what does an effective range of 1,600 meters mean in the field of autonomous driving?
In terms of the numbers, the perception range of the three main sensors used for perception, lidar, cameras and millimeter-wave radar, is within 300 meters; even Waymo, the technological leader in the field, achieves a perception range of only about 300 meters.
According to Xinzhijia, perception is a key link in autonomous driving technology: like the eyes, ears and nose of the human body, it gathers information from the outside world that determines the decisions and behavior of the person, or of the autonomous vehicle.
The stronger the perception capability, the more accurate the resulting decisions and behavior will be; for self-driving vehicles running at high speed, that translates directly into greater safety and efficiency.
So the question is, how did PlusAI achieve an effective sensing distance of 1,600 meters?
Multi-view stereo vision and algorithm breakthroughs
PlusAI said that the stereoscopic vision perception system announced this time, with an effective range of 1,600 meters, uses a multi-camera vision module and deep-learning-based algorithms to build a high-precision object detection and recognition model.
The system breaks through the accuracy limits of stereo vision perception at long range and achieves recognition and continuous tracking of objects and lanes 1,600 meters away.
In fact, perception in autonomous driving is not just about "seeing": the system must also measure the distance and speed of objects in the driving environment, classify them, and track them.
PlusAI told Xinzhijia that the 1,600-meter perception range is mainly the result of the past year of R&D, during which its team applied cutting-edge technologies such as edge deep learning, deep stereo vision, and dynamic intelligent object tracking.
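As a rough illustration of why 1,600 meters is demanding for stereo vision (this is textbook stereo geometry, not PlusAI's disclosed design, and the focal length, baseline and noise figures below are assumptions), depth is recovered from the disparity between two camera views, and at long range that disparity shrinks so much that range error grows roughly with the square of the distance:

    # Illustrative stereo ranging geometry; all numbers are assumed, not PlusAI's.
    # Depth from a rectified stereo pair: Z = f * B / d, where
    #   f = focal length (pixels), B = baseline (m), d = disparity (pixels).
    # A disparity error dd produces a range error of roughly Z^2 * dd / (f * B).

    def stereo_depth(f_px: float, baseline_m: float, disparity_px: float) -> float:
        """Depth in meters of a point observed with the given disparity."""
        return f_px * baseline_m / disparity_px

    def depth_error(f_px: float, baseline_m: float, depth_m: float, disp_err_px: float) -> float:
        """Approximate range error in meters caused by a disparity error."""
        return depth_m ** 2 * disp_err_px / (f_px * baseline_m)

    f_px, baseline_m = 4000.0, 2.0        # assumed long-focal camera, 2 m baseline
    for depth_m in (300.0, 1600.0):
        disparity = f_px * baseline_m / depth_m
        err = depth_error(f_px, baseline_m, depth_m, 0.2)   # assume 0.2 px matching noise
        print(f"{depth_m:6.0f} m -> disparity {disparity:5.2f} px, ~{err:.0f} m range error")

With these assumed numbers, a target at 300 meters produces a disparity of almost 27 pixels, while one at 1,600 meters produces only 5 pixels, so the same sub-pixel matching noise becomes tens of meters of range error. Compensating for that quadratic loss of accuracy is exactly where deep stereo matching and object tracking have to do the work.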
How to accurately determine position and speed?
Anyone working in autonomous logistics and freight knows the difference between a perception range of 200 meters and one of 1,600 meters.
In other words, a perception range of 200 meters leaves at most 200 meters in which to brake, before the load of the driverless truck is even taken into account.
The longer the perception range, the more distance the system has available for braking.
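A back-of-the-envelope calculation (the highway speed appears later in this article; the reaction time and deceleration values are assumptions, not PlusAI figures) shows how little margin 200 meters leaves a loaded heavy truck at highway speed:

    # Rough stopping-distance estimate for a heavy truck; illustrative assumptions only.
    # total stopping distance = reaction distance + braking distance
    #                         = v * t_react + v^2 / (2 * a)

    def stopping_distance(speed_mps: float, reaction_s: float, decel_mps2: float) -> float:
        """Distance in meters needed to react and brake to a full stop."""
        return speed_mps * reaction_s + speed_mps ** 2 / (2.0 * decel_mps2)

    speed = 30.0   # m/s, roughly US highway truck speed
    print(stopping_distance(speed, reaction_s=1.5, decel_mps2=3.0))   # ~195 m, lightly loaded
    print(stopping_distance(speed, reaction_s=1.5, decel_mps2=2.0))   # ~270 m, heavily loaded

Under these assumptions a heavily loaded truck may need more road to stop than a 200-meter perception range even covers, which is why the extra range matters.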
Today, in pushing the effective range from 300 meters to 1,600 meters, PlusAI has set itself two technical benchmarks: "seeing clearly" and "understanding". That confidence comes from its core technology, the multi-camera stereo vision system.
Compared with the perception a monocular camera can provide, PlusAI's multi-camera stereo vision system measures distance in the three-dimensional world and performs strongly at determining the vehicle's own position, the type, position and speed of surrounding vehicles, lane lines, and other factors.
To put it more vividly, through this system PlusAI's on-board sensors can simultaneously identify the state of a vehicle 1,600 meters away: whether it is a sedan or a heavy truck, whether it is moving or parked on the roadside, whether it is in the current lane or an adjacent lane, and so on.
Tim Daly, chief architect of PlusAI, said: "1,600 meters is not just a breakthrough in numbers; it also has to meet two standards:
One is 'seeing clearly', which requires the physical perception capability brought by higher-resolution cameras and more powerful system algorithms;
The other is 'understanding', which relies on cutting-edge, deep-learning-based artificial intelligence to accurately interpret the environment 1,600 meters away, helping the system draw conclusions and providing an important basis for safe driving."
The breakthrough in the effective range of stereoscopic vision will have a direct impact on the deployment of autonomous driving scenarios, in terms of both safety and reductions in cost and energy consumption.
Surveys show that 20% of driver-caused traffic accidents are due to insufficient stereoscopic (depth) perception on the driver's part.
Pushing the effective range of an autonomous driving system's stereo vision from 300 meters to 1,600 meters will undoubtedly raise the system's overall perception capability significantly, and in turn ensure greater safety during driving.
At the same time, the longer effective range provides ample prediction time, allowing the autonomous driving system to plan its maneuvers sensibly, avoid fuel-hungry behavior such as emergency braking, and reduce operating costs.
Is the race for long-range perception necessary?
According to Xinzhijia, trucks on American highways travel at roughly 30 meters per second, and a lane change takes about 10 seconds, so a 300-meter range covers only a single lane change, while a human driver's advance observation spans more than those 10 seconds. In other words, a short perception range is simply not enough.
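To spell out that arithmetic (the 30 m/s speed and 10-second lane change come from the article; the rest is simple division):

    # Planning-horizon arithmetic for a highway truck, using the article's figures.
    speed_mps = 30.0        # typical US highway truck speed
    lane_change_s = 10.0    # time needed for one lane change

    for perception_m in (300.0, 1600.0):
        horizon_s = perception_m / speed_mps       # seconds of look-ahead
        maneuvers = horizon_s / lane_change_s      # lane changes that fit in that horizon
        print(f"{perception_m:6.0f} m -> {horizon_s:4.1f} s look-ahead, ~{maneuvers:.1f} lane changes")

A 300-meter range gives the system roughly 10 seconds, enough for one maneuver, while 1,600 meters gives it about 53 seconds, or roughly five maneuvers' worth of planning horizon.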
However, extending the perception range requires more R&D investment and higher hardware costs, so is it really necessary to push it all the way to 1,600 meters?
First, multi-sensor fusion will greatly improve system safety.
Specifically, the autonomous driving industry is currently working to land vertical application scenarios, highway freight chief among them, and the characteristics of heavy trucks impose particularly stringent safety requirements on the system's perception, planning and control capabilities.
In actual driving, all kinds of sudden or low-probability events affect prediction for a heavy truck traveling at high speed. At the same time, the fuel savings that come from good driving habits also depend on better visibility. Pushing the limit of effective perception range has therefore gradually become a consensus among autonomous driving companies targeting highway scenarios.
It is worth mentioning that PlusAI emphasizes that the 1,600-meter effective sensing range is an important part of its multi-sensor fusion solution.
In addition, this technological breakthrough, combined with PlusAI's existing short-range and medium-range cameras as well as lidar, millimeter-wave radar and other sensors, will greatly enhance the redundancy of environmental perception information and thereby improve safety.
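As a minimal sketch of what that redundancy can buy (a generic inverse-variance fusion of two range readings, not PlusAI's disclosed fusion algorithm; the sensor noise figures are assumptions), combining stereo vision with millimeter-wave radar both tightens the estimate and keeps the system functional if one sensor degrades:

    # Minimal inverse-variance fusion of redundant range measurements; illustrative only.
    # Each sensor reports (range_m, std_dev_m); readings are weighted by 1 / variance.

    def fuse(measurements: list[tuple[float, float]]) -> tuple[float, float]:
        """Return (fused range, fused std dev) from (range, std dev) pairs."""
        weights = [1.0 / (sigma ** 2) for _, sigma in measurements]
        total = sum(weights)
        fused_range = sum(w * r for w, (r, _) in zip(weights, measurements)) / total
        return fused_range, (1.0 / total) ** 0.5

    stereo = (252.0, 4.0)   # assumed stereo-vision range estimate and uncertainty
    radar = (249.0, 2.0)    # assumed millimeter-wave radar estimate, tighter in range
    print(fuse([stereo, radar]))   # fused estimate is tighter than either sensor alone
    print(fuse([stereo]))          # graceful degradation if the radar reading drops out

The fused standard deviation is smaller than either sensor's on its own, and losing one input still leaves a usable, if coarser, estimate; that is the sense in which redundant perception information improves safety.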
Zheng Hao, founder and CTO of Plus.ai, said: "Perception and positioning are an extremely complex set of technologies. The effective perception distance of 1,600 meters is an important parameter for the autonomous driving system to make predictions and plans, but it is still not enough. The safety of autonomous driving should be the first priority. This number can help the system obtain more comprehensive redundant data, but the ultimate pursuit of environmental perception for safety purposes is endless. We believe that multi-sensor fusion based on lidar, millimeter-wave radar and vision is necessary."
Last year, PlusAI, FAW and Manbang formed a commercialization "iron triangle" of software technology, vehicle platform and operations management for autonomous trucks, a strategic alliance that aims to put driverless logistics trucks on the road in 2021. The 1,600-meter multi-camera stereo perception technology PlusAI has now released shows that it is moving steadily toward its goal of commercial autonomous driving operations.