Development Trend of Intelligent Cockpit Human-Computer Interaction Technology

Publisher: 恬淡如云 | Last updated: 2023-05-18

Today, cars have changed not only in their power sources, driving methods and driving experience; the cockpit has also bid farewell to the traditional, monotonous mechanical and electronic space. With a much higher level of intelligence, today's smart cockpit has significantly stronger capabilities in environmental perception, information collection and processing, and has become an "intelligent assistant" for the human driver.


A clear sign that the smart cockpit has moved beyond simple electronics and entered the intelligent-assistant stage is that interaction between humans and the cockpit is changing from passive to active, where "passive" and "active" are defined from the cockpit's point of view. In the past, information exchange was mainly initiated by people; now it can be initiated by both people and the machine. The level of human-computer interaction has become an important benchmark for grading smart cockpit products.


In order to achieve "interactive" human-computer interaction, machines need to better understand human behavior. The processors, controllers, sensors , etc. in each device are the key to achieving this. Currently, Mouser Electronics has a wealth of products and development kits for automotive smart cockpit applications, helping engineers to create industry-competitive smart cockpit products more quickly and efficiently.


From simple response to proactive, multi-modal interaction


As more and more electronic components enter the automotive cockpit, smart cockpits are reshaping the landscape of the automotive industry. In the past, when consumers purchased a car, most of the cost went to the powertrain and transmission. In the future, consumers may spend nearly half of their budget on making the car more comfortable, convenient and intelligent.


Driven by this demand, the instrument cluster, central control screen, visual perception system and voice interaction system will be deeply integrated and endowed with the user's personality and emotional characteristics. Future smart cockpit development will therefore center on user needs, experience and emotions, and be driven by specific scenarios to build a human-centered intelligent system.


Of course, as the amount of information and the number of functions in the car grow, interaction through physical knobs and buttons has become clearly outdated and makes cockpit functions harder to use. Digital interfaces and digital functions have become worthy successors, creating a huge new market in the automotive sector.


According to statistics and forecasts from ICVTank, the global smart cockpit market reached US$36.4 billion in 2019 and was expected to exceed US$46 billion in 2022. The industry is set to grow explosively over the next few years: by 2025 it is expected to reach US$103 billion, a compound annual growth rate (CAGR) of 30.83% from 2022 to 2025.
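As a quick consistency check on these figures, growing from US$46 billion in 2022 to US$103 billion in 2025 implies (103 / 46)^(1/3) − 1 ≈ 30.8% per year, which matches the quoted CAGR.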


An important reason consumers are willing to pay for intelligent functions added to car travel is interaction and experience. As mentioned above, future smart cockpits will develop around people and evolve into a "third space" in people's lives.


This kind of human-computer interaction is by no means a simple call-and-response; it is a multi-channel, multi-level and multi-modal communication experience. From the perspective of drivers and passengers, human-computer interaction in future smart cockpits will use intelligent voice as the main communication method, with touch, gestures, movements, facial expressions and so on as auxiliary methods, freeing the driver's hands and eyes and reducing driving fatigue and operational risk. At the same time, the in-vehicle infotainment system will bring more fun to non-drivers according to the driving scenario, without affecting the driver and other passengers.


In terms of emotional interaction, always-on, personalized intelligent voice will be combined with recognition functions such as face recognition and gesture recognition to determine a person's mood, fatigue level and concentration, and will play a prominent role in scenarios such as emotional interaction, fatigue-driving warnings and attention monitoring.


With the help of these highly innovative human-computer interaction functions, smart cockpits will allow cars to break through the definition of travel tools and become people's travel companions.


Sensor modules that can be used for cockpit detection


The smart cockpit must be people-oriented, so it needs to pay attention to the status of the driver and passengers and respond accordingly. Available from Mouser Electronics, the ADPD144RI PPG optical sensor module from ADI can be used for health monitoring of drivers and passengers, mainly optical heart-rate monitoring and reflective SpO2 (blood oxygen saturation) measurement.


This sensor, with supplier part number ADPD144RI-ACEZ-RL7, is a highly integrated optical front end. As the system block diagram below shows, the module integrates three types of optical devices: a 660 nm red LED, an 880 nm infrared LED and photodiodes. By detecting the red and infrared wavelengths simultaneously, it can perform photoplethysmography (PPG) measurements for blood oxygen saturation (SpO2).


Figure 2: ADPD144RI system block diagram (Source: ADI)
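To give a rough idea of how the red and infrared PPG channels are combined, the sketch below computes the classic "ratio of ratios" used in reflective SpO2 estimation. This is a minimal illustration only; the sample waveforms and the linear calibration coefficients are placeholder assumptions, not ADI's calibration values or the module's actual algorithm.

```python
import numpy as np

def spo2_ratio_of_ratios(red, ir, a=110.0, b=25.0):
    """Estimate SpO2 from red and infrared PPG waveforms.

    red, ir : 1-D arrays of raw PPG samples (one per wavelength).
    a, b    : placeholder linear-calibration coefficients; real devices
              use empirically calibrated curves, not these values.
    """
    red = np.asarray(red, dtype=float)
    ir = np.asarray(ir, dtype=float)

    # AC component ~ pulsatile amplitude, DC component ~ mean reflection level
    ac_red, dc_red = red.max() - red.min(), red.mean()
    ac_ir, dc_ir = ir.max() - ir.min(), ir.mean()

    # Classic "ratio of ratios" for two-wavelength pulse oximetry
    r = (ac_red / dc_red) / (ac_ir / dc_ir)

    # Simple linear mapping from R to SpO2 (%), clipped to a plausible range
    return float(np.clip(a - b * r, 0.0, 100.0))

# Example with synthetic waveforms standing in for real sensor data
t = np.linspace(0, 2 * np.pi, 200)
red_wave = 1000 + 10 * np.sin(t)   # small pulsatile swing on the red channel
ir_wave = 1200 + 20 * np.sin(t)    # larger swing on the infrared channel
print(f"Estimated SpO2: {spo2_ratio_of_ratios(red_wave, ir_wave):.1f} %")
```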


The ADPD144RI module uses synchronous detection of light pulses to improve ambient-light rejection and is designed for applications with very low direct optical reflection. This approach also consumes far less power than an asynchronous architecture, giving the whole module a low-power profile.
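The ambient-light rejection described above can be illustrated with a simple model: the photodiode is sampled once while the LED is pulsed on and once while it is off, and subtracting the two readings removes the slowly varying ambient contribution. The sketch below is only a conceptual stand-in for synchronous detection, not the module's actual signal chain.

```python
import random

def synchronous_sample(signal_level, ambient_level, noise=0.5):
    """Model one synchronous-detection cycle.

    signal_level  : reflection produced by the LED pulse (what we want)
    ambient_level : background light reaching the photodiode (what we reject)
    """
    # Reading taken while the LED is pulsed on: signal + ambient + noise
    led_on = signal_level + ambient_level + random.gauss(0, noise)
    # Reading taken immediately after, with the LED off: ambient + noise only
    led_off = ambient_level + random.gauss(0, noise)
    # Subtracting the two cancels the ambient term
    return led_on - led_off

# Even with strong ambient light, the recovered signal stays near its true value
samples = [synchronous_sample(signal_level=5.0, ambient_level=200.0) for _ in range(1000)]
print(f"Recovered signal: {sum(samples) / len(samples):.2f} (true value 5.0)")
```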


As seen in Figure 3, the ADPD144RI module combines a high-efficiency light-emitting diode (LED) emitter, sensitive 4-channel deep-diffusion photodiodes (PD1 to PD4) and a custom application-specific integrated circuit (ASIC) in a compact package, and provides optical isolation between the integrated LED emitter and the detection photodiodes to improve the signal-to-noise ratio (SNR) of measurements through tissue.


Figure 3: Internal schematic diagram of ADPD144RI (Source: ADI)


In addition, the ADPD144RI module provides a rich set of resources, including a 4-channel analog front end (AFE), a 14-bit analog-to-digital converter (ADC), and an I2C data and control interface. What is especially appealing to engineers is that, despite all this, the module measures only 2.8 mm × 5.0 mm.
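Since the module is configured and read over I2C, a host processor can talk to it with ordinary SMBus transactions. The sketch below reads two 16-bit registers from a Linux host using the smbus2 library; the device address and register offsets are assumptions for illustration only, so consult the ADPD144RI datasheet for the real register map before using this on hardware.

```python
from smbus2 import SMBus

# Placeholder values for illustration: check the ADPD144RI datasheet for the
# actual I2C address and register map before running this against hardware.
ADPD_I2C_ADDR = 0x64     # assumed 7-bit device address
REG_DEVID     = 0x08     # assumed device-ID register
REG_DATA_A1   = 0x64     # assumed first time-slot data register

def read_reg16(bus, reg):
    """Read a 16-bit register; SMBus returns little-endian, device is big-endian."""
    raw = bus.read_word_data(ADPD_I2C_ADDR, reg)
    return ((raw & 0xFF) << 8) | (raw >> 8)

with SMBus(1) as bus:                       # I2C bus 1 on a typical Linux host
    devid = read_reg16(bus, REG_DEVID)
    sample = read_reg16(bus, REG_DATA_A1)   # one 14-bit ADC result, right-justified
    print(f"DEVID=0x{devid:04X}, PPG sample={sample & 0x3FFF}")
```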


It is worth pointing out that, to make applications such as user interaction and cockpit monitoring easier to design, the ADPD144RI module uses a custom optical package suitable for mounting behind glass, which greatly broadens the product's range of applications. By installing the ADPD144RI module and other functional units in car windows and windshields, the physical status of drivers and passengers can be monitored through key indicators such as pulse and blood oxygen. This is the basis for machine-initiated interaction in the smart cockpit, helping the system decide on its next response or instruction.
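To make the idea of machine-initiated interaction concrete, the sketch below turns pulse and SpO2 readings into a prompt from the cockpit. The thresholds, messages and data structure are illustrative assumptions only, not medical guidance and not part of any vendor solution.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OccupantVitals:
    heart_rate_bpm: float   # from the PPG heart-rate channel
    spo2_percent: float     # from the red/IR SpO2 estimate

def cockpit_prompt(vitals: OccupantVitals) -> Optional[str]:
    """Return a machine-initiated prompt, or None if no action is needed.

    The thresholds below are illustrative placeholders only.
    """
    if vitals.spo2_percent < 90.0:
        return "Blood oxygen appears low. Would you like me to open a window?"
    if vitals.heart_rate_bpm < 50.0:
        return "You may be getting drowsy. Shall I suggest a rest stop?"
    if vitals.heart_rate_bpm > 120.0:
        return "Your heart rate is elevated. Do you want to pause navigation guidance?"
    return None

print(cockpit_prompt(OccupantVitals(heart_rate_bpm=46.0, spo2_percent=97.0)))
```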


To help engineers get familiar with the ADPD144RI module quickly, ADI has also developed the EVAL-ADPD144RIZ-SF evaluation board for evaluating this photometric front end. The board's manufacturer part number at Mouser Electronics is EVAL-ADPD144RIZ-SF.


In addition to providing full configuration of the ADPD144RI module, the evaluation board includes the WaveTool graphical user interface (GUI), which offers low-level and high-level configurability along with real-time frequency- and time-domain analysis. To simplify system development for applications such as smart cockpits, the evaluation board can be connected to a development system easily via its User Datagram Protocol (UDP) transmission capability.
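As a sketch of how a development system might pick up such a UDP stream, the snippet below listens on a socket and unpacks fixed-size sample records. The port number and packet layout are assumptions made purely for illustration; the actual framing is defined by the WaveTool documentation.

```python
import socket
import struct

UDP_PORT = 8080            # assumed port; configure to match the actual WaveTool setup
SAMPLE_FORMAT = "<I4H"     # assumed layout: 32-bit timestamp + four 16-bit channels

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", UDP_PORT))
print(f"Listening for evaluation-board data on UDP port {UDP_PORT}...")

record_size = struct.calcsize(SAMPLE_FORMAT)
while True:
    packet, addr = sock.recvfrom(2048)
    # Unpack as many complete records as fit in this datagram
    for offset in range(0, len(packet) - record_size + 1, record_size):
        timestamp, ch1, ch2, ch3, ch4 = struct.unpack_from(SAMPLE_FORMAT, packet, offset)
        print(f"{addr[0]} t={timestamp} PD1..PD4=({ch1}, {ch2}, {ch3}, {ch4})")
```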


i.MX 8/8X baseboard supporting multi-sensor fusion


As just mentioned, once sensor information reaches the smart cockpit system, the system needs to analyze the data and make a judgment. Similar sensor data includes gesture recognition information, voice information, the driver's gaze direction, driving-time information and more. The device introduced next is the key to the smart cockpit's understanding of this information: the i.MX 8X series application processor from NXP, with manufacturer part number MIMX8DX2AVOFZAC.


The i.MX 8X series processors share the same subsystems and architecture as the high-end i.MX 8 series, with up to four Arm Cortex-A35 cores, an Arm Cortex-M4F core for real-time processing, and an integrated Cadence Tensilica HiFi 4 DSP for audio and speech processing. Combined with advanced graphics capabilities and efficient system performance, they support graphics, video, image processing and voice functions, and meet requirements for secure authentication and high energy efficiency.


As noted earlier, voice will be an important channel of human-computer interaction in future smart cockpits. The i.MX 8X series processors can perform multi-domain speech recognition, and the integrated Tensilica HiFi 4 DSP provides functions such as audio pre- and post-processing, keyword detection, and speech recognition for hands-free interaction.
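The keyword-detection pipeline mentioned here runs on the HiFi 4 DSP with NXP's own software. Purely as a conceptual stand-in, the sketch below shows the kind of lightweight energy-based gating that commonly sits in front of a keyword detector, so that heavier recognition only runs when speech is present; the frame length and threshold are illustrative assumptions.

```python
import numpy as np

FRAME_LEN = 320            # 20 ms frames at a 16 kHz microphone sample rate
ENERGY_THRESHOLD = 0.01    # assumed tuning value; set from the measured noise floor

def speech_frames(audio: np.ndarray):
    """Yield only the frames whose short-term energy suggests speech activity."""
    for start in range(0, len(audio) - FRAME_LEN + 1, FRAME_LEN):
        frame = audio[start:start + FRAME_LEN]
        energy = float(np.mean(frame ** 2))
        if energy > ENERGY_THRESHOLD:
            yield start / 16000.0, frame   # timestamp (s) and frame for the detector

# Synthetic example: 1 s of near-silence followed by 1 s of a louder "utterance"
audio = np.concatenate([np.random.randn(16000) * 0.01,
                        0.5 * np.sin(2 * np.pi * 440 * np.arange(16000) / 16000)])
active = list(speech_frames(audio))
print(f"{len(active)} of {len(audio) // FRAME_LEN} frames passed to the keyword detector")
```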


As Figure 6 shows, the MPU is the core of NXP's electronic cockpit solution. Thanks to its rich peripheral interfaces, audio signals, networking signals, sensor signals, NFC signals, display signals and more all ultimately converge in the NXP MPU. The i.MX 8X series processors are the high-end series among NXP MPUs, and their superior information processing and feedback bring a more human-centered experience to the cockpit system.

