Building a multi-camera visual perception system for ADAS domain controllers with integrated processors

Publisher: 温暖的拥抱 | Latest update: 2023-11-21 | Source: Texas Instruments (TI)

As we drive through neighborhoods and towns and see children walking and biking, the importance of road safety is clear. A 2021 study by the National Highway Traffic Safety Administration (NHTSA) found that an average of 20 pedestrians are killed in traffic crashes every day in the United States, one pedestrian every 71 minutes. A 2022 World Health Organization study found that 1.3 million people die each year in road traffic accidents, more than half of them pedestrians, cyclists and motorcyclists. Unfortunately, driver distraction is one of the leading causes of these accidents, and the trend appears to be worsening every year.


Advanced driver assistance systems (ADAS) help mitigate the effects of distracted driving and protect drivers, pedestrians and other vulnerable road users. To achieve a five-star safety rating and meet regulatory requirements, vehicles will need to add a backup camera, a front-facing camera and a driver monitoring system. Many manufacturers are therefore evolving their vehicle architectures to integrate various active safety functions into ADAS domain controllers.


Domain controllers typically require:


The ability to interface with a variety of sensors, which differ in number, modality and resolution.


Vision, artificial intelligence (AI) and general-purpose processing for perception, driving and parking applications.


Connectivity to low-bandwidth and high-speed in-vehicle networks.


Functional safety and security features that prevent critical components from being compromised.


ADAS Domain Controller Processing and System Requirements


Growing demands on system memory, computing performance and input/output (I/O) bandwidth make system designs more complex and increase system costs. Today's high-end ADAS systems use multiple cameras with different resolutions, along with various radar sensors around the car, to provide a complete view of the driving environment. For each set of images collected from the sensors, AI- and computer-vision-based detection and classification algorithms need to run at high frame rates to interpret the scene accurately. This creates several challenges for system and software designers, including connecting these sensors to the processing system, transferring their contents to memory, and synchronizing the data for real-time processing by the classification algorithms.
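The synchronization step above often reduces to matching frames across camera streams by capture timestamp. A minimal sketch in plain Python, assuming each stream delivers (timestamp, frame_id) pairs sorted by time and a configurable tolerance window (function names and the tolerance value are illustrative, not from any TI API):

```python
from bisect import bisect_left

def sync_frames(streams, tolerance_us=5000):
    """Group frames from multiple camera streams whose capture timestamps
    fall within `tolerance_us` of a reference stream.

    `streams` is a list of per-camera lists of (timestamp_us, frame_id),
    each sorted by timestamp. The first stream is the reference. Returns
    one tuple per reference frame, containing one frame per camera, or
    None for a camera with no frame close enough in time.
    """
    reference, *others = streams
    groups = []
    for ts, fid in reference:
        group = [(ts, fid)]
        for cam in others:
            stamps = [t for t, _ in cam]
            i = bisect_left(stamps, ts)
            # candidates: nearest frame at or after ts, and the one before it
            best = None
            for j in (i - 1, i):
                if 0 <= j < len(cam) and abs(cam[j][0] - ts) <= tolerance_us:
                    if best is None or abs(cam[j][0] - ts) < abs(best[0] - ts):
                        best = cam[j]
            group.append(best)
        groups.append(tuple(group))
    return groups
```

In a real pipeline the tolerance would be a fraction of the frame period (for 30 fps cameras, well under 33 ms), and hardware timestamping keeps the clocks comparable across sensors.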


Texas Instruments' TDA4VH-Q1 system-on-chip (SoC), shown in Figure 1, integrates functions such as vision pre-processing, depth and motion acceleration, AI network processing, automotive network interfaces and safety microcontrollers (MCUs). The TPS6594-Q1 power management integrated circuit is optimized to power the TDA4VH-Q1 in applications requiring Automotive Safety Integrity Level (ASIL) D, and includes functional safety features such as voltage monitoring, hardware error detection for the TDA4VH-Q1 SoC, and a question-and-answer (Q&A) watchdog that monitors the SoC's MCU for software bugs that cause lock-ups.

Figure 1: Simplified diagram of TDA4VH-Q1 SoC
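The Q&A watchdog mentioned above goes beyond a simple "kick the timer" scheme: the monitor poses a question, and the supervised MCU must compute the matching answer inside an open time window, proving it is still executing correctly. A toy model in Python, with a hypothetical answer transform (the real TPS6594-Q1 derives answers from an internal polynomial and multi-byte responses):

```python
class QAWatchdog:
    """Toy model of a question-and-answer watchdog.

    The monitor issues a question; the supervised MCU must return the
    matching answer while the window is open, otherwise a fault is
    recorded. When the fault count reaches the limit, the watchdog
    requests a reset.
    """

    def __init__(self, fault_limit=3):
        self.question = 0x5   # arbitrary starting question
        self.faults = 0
        self.fault_limit = fault_limit

    @staticmethod
    def expected_answer(question):
        # hypothetical stand-in for the device's answer polynomial
        return (question ^ 0xA) & 0xF

    def check(self, answer, in_window=True):
        """Evaluate one answer; return True if it was accepted."""
        ok = in_window and answer == self.expected_answer(self.question)
        if ok:
            # rotate to a fresh question for the next window
            self.question = (self.question * 7 + 3) & 0xF
        else:
            self.faults += 1
        return ok

    @property
    def reset_requested(self):
        return self.faults >= self.fault_limit
```

A hung MCU fails in two ways this model captures: it stops answering (window expires), or it replays a stale answer that no longer matches the rotated question.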


Supporting multi-camera visual perception


One example of an ADAS application that requires increased processor performance is multi-camera visual perception. Installing cameras around the car provides a 360-degree view, helping to prevent frontal collisions and keeping drivers aware of traffic and pedestrian activity in blind spots and adjacent lanes.


Phantom AI uses the open-source software development kit (SDK) for TI's J784S4 processor to develop a multi-camera visual perception system on the TDA4VH-Q1. Phantom AI's PhantomVision™ system provides a complete set of ADAS functions, ranging from compliance with EU general safety regulations to the Society of Automotive Engineers (SAE) Level 2 and Level 2+ standards. In addition to basic functions such as vehicle, vulnerable road user, free space, traffic sign and traffic light detection, PhantomVision™ also includes additional functions such as construction zone, turn signal and tail light detection, as well as AI-based ego-path prediction. Its multi-camera perception system combines front, side and rear-view cameras to cover the vehicle's 360-degree field of view, helping to eliminate blind spots (Figure 2).

Figure 2: Camera positions for 360-degree field of view used by Phantom AI


Phantom AI enables real-time operation by leveraging the TDA4VH-Q1's combination of high-performance computing, deep learning engines, and dedicated accelerators for signal and image pre-processing. The dedicated vision pre-processing accelerators handle the camera pipeline, including image capture, color space conversion and multi-scale image pyramid construction. Combined with Texas Instruments' deep learning libraries, the TDA4VH-Q1's multi-core digital signal processors and matrix multiplication accelerator run neural networks efficiently, with fast algorithms and minimal I/O scheduling overhead, enabling high accuracy and low latency. A video from Phantom AI demonstrates the ADAS capabilities of the PhantomVision™ system running on the TDA4VH-Q1 processor.
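The multi-scale image pyramid mentioned above gives detection networks the same scene at several resolutions, so objects are found whether they are near or far. A minimal sketch in plain Python, using 2x2 averaging per octave (the hardware multi-scaler supports arbitrary ratios and builds these levels without CPU involvement; images are represented here as nested lists of grayscale values for simplicity):

```python
def downsample_2x2(img):
    """Halve each dimension by averaging non-overlapping 2x2 pixel blocks.

    `img` is a list of rows of grayscale pixel values with even
    dimensions.
    """
    h, w = len(img), len(img[0])
    return [
        [
            (img[2 * y][2 * x] + img[2 * y][2 * x + 1]
             + img[2 * y + 1][2 * x] + img[2 * y + 1][2 * x + 1]) / 4.0
            for x in range(w // 2)
        ]
        for y in range(h // 2)
    ]

def build_pyramid(img, levels):
    """Return [full, half, quarter, ...] with `levels` entries."""
    pyramid = [img]
    for _ in range(levels - 1):
        pyramid.append(downsample_2x2(pyramid[-1]))
    return pyramid
```

Production pipelines typically apply a low-pass filter before decimation to avoid aliasing; the averaging step here is the simplest form of that filter.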


Conclusion


Building complex multi-sensor ADAS systems for SAE Level 2 and Level 2+ driving does not require water-cooled supercomputers. With well-designed SoCs like TI's TDA4VH-Q1 and software developed by experienced automotive engineering teams such as Phantom AI, cost-effective systems that meet functional safety requirements can be brought to market. While we are enthusiastic about the future of autonomous driving, the real purpose of designing such systems is to make our world a safer place: bringing ADAS technology to more segments of the automotive market, so that more cars ship with more ADAS features, delivers a better and safer experience for drivers and pedestrians alike.

