Smart car HMI develops towards natural interaction

Publisher: 电子艺术大师 | Latest update time: 2024-10-14 | Source: 盖世汽车

As automobile intelligence advances, human-computer interaction has expanded from simple operation and feedback between driver and vehicle to a multimodal interactive experience that includes touch, voice, vision, and even emotional communication. The application of technologies such as artificial intelligence, augmented reality (AR), and virtual reality (VR) is continually expanding the boundaries of human-computer interaction.


The core points of this report are as follows:


The application of technologies such as large-size touch screens, voice control, gesture recognition, and face recognition is redefining the way people interact with cars. In the future, as new technologies such as 5G, artificial intelligence, and augmented reality (AR) converge and mature, the intelligent, emotional, and personalized character of automotive HMI will become even more prominent.


HMI architecture and new technologies


HMI architecture: Built on the cockpit SoC and domain controller, with the operating system, middleware, and application software as the software stack, it realizes touch, voice, gesture, and other forms of human-machine interaction through hardware such as LCD screens, HUDs, speakers, and microphones.


HMI interfaces: They have developed rapidly in the instrument cluster, center console, HUD, and rearview mirror. At the same time, technological breakthroughs in smart surfaces, ambient lighting, seats, and related areas are opening new directions and room for the development of in-vehicle HMI interfaces.


Multimodal large models: They can integrate diverse data from the smart cockpit, such as voice, gestures, and facial expressions, pushing smart-cockpit interaction in a more natural and intelligent direction.


Smart surface: It is an automotive interior component that integrates sensors, actuators, and communication modules. It can be activated by touch sensing, gestures, or voice commands, and can realize a variety of intelligent interaction methods.
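
As a rough illustration of the kind of component described above, the following Python sketch models a hypothetical smart-surface controller that reacts to touch, gesture, or voice activation events and triggers actuator responses such as haptics or backlighting. All class, event, and handler names are illustrative assumptions, not any supplier's actual API.

```python
# Minimal sketch of a smart-surface controller (illustrative only; names are hypothetical).
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class ActivationEvent:
    """An activation signal from the surface's embedded sensors."""
    modality: str   # "touch", "gesture", or "voice"
    payload: str    # e.g. a gesture label or recognized command


class SmartSurface:
    """Routes sensor activations to actuator actions (haptics, backlight, HVAC, ...)."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[ActivationEvent], None]]] = {}

    def on(self, modality: str, handler: Callable[[ActivationEvent], None]) -> None:
        self._handlers.setdefault(modality, []).append(handler)

    def activate(self, event: ActivationEvent) -> None:
        for handler in self._handlers.get(event.modality, []):
            handler(event)


if __name__ == "__main__":
    surface = SmartSurface()
    surface.on("touch", lambda e: print(f"haptic pulse + backlight on ({e.payload})"))
    surface.on("voice", lambda e: print(f"forward command to HVAC: {e.payload}"))

    surface.activate(ActivationEvent("touch", "slider: temperature"))
    surface.activate(ActivationEvent("voice", "set temperature to 22"))
```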


Market Application


At present, in-vehicle HMI interfaces are developing rapidly in areas such as the instrument cluster, center console, rear entertainment screens, HUD, voice interaction, and cockpit monitoring. Technological breakthroughs in smart surfaces, ambient lighting, seats, and facial recognition algorithms, together with the application of in-vehicle acoustics and AI technology, are bringing new directions and room for HMI development.


In-car displays: Cockpit display screens are trending toward larger sizes and higher definition. In terms of display technology, Mini LED and OLED have reached mass production, and Micro LED is expected to enter mass production within the next two years.


HUD: PGU technology will remain dominated by TFT-LCD and DLP in the short term, while LCoS has strong development potential. In optical display, domestic dual-focal AR-HUDs are ready for mass production, and many manufacturers are accelerating work on multi-focal and 3D AR-HUD technology.


Electronic rearview mirrors: The new national standard for electronic rearview mirrors has taken effect, with clear requirements for components and installation. Many models offer them on top trims or as optional equipment in mass production, but high cost remains the main obstacle to large-scale adoption.


In-vehicle acoustics: In addition to installing more speakers, some car companies are developing their own in-vehicle acoustic systems using white-label speakers to improve cost-effectiveness.


Voice interaction: AI large models are empowering voice interaction. Current in-vehicle AI voice assistants have achieved breakthroughs in continuous dialogue, semantic understanding, content generation, self-learning, and related functions, and many car companies have already delivered continuous dialogue in their voice assistants.
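
At its core, keeping a conversation going means carrying dialogue history from turn to turn. The sketch below illustrates only that idea; `query_llm` is a stand-in placeholder, not any car maker's or model vendor's real API.

```python
# Minimal sketch of multi-turn ("continuous") dialogue state handling.
# `query_llm` is a placeholder assumption; a production assistant would call
# an on-board or cloud large-model endpoint here instead.
from typing import Dict, List


def query_llm(messages: List[Dict[str, str]]) -> str:
    """Placeholder: return a canned reply instead of calling a real model."""
    last = messages[-1]["content"]
    return f"(assistant reply to: {last})"


class VoiceAssistant:
    def __init__(self, max_turns: int = 10) -> None:
        self.history: List[Dict[str, str]] = []
        self.max_turns = max_turns

    def say(self, utterance: str) -> str:
        # Keep prior turns so follow-ups like "make it cooler" resolve correctly.
        self.history.append({"role": "user", "content": utterance})
        reply = query_llm(self.history)
        self.history.append({"role": "assistant", "content": reply})
        # Trim the window so the context stays bounded on embedded hardware.
        self.history = self.history[-2 * self.max_turns:]
        return reply


if __name__ == "__main__":
    assistant = VoiceAssistant()
    print(assistant.say("Navigate to the nearest charging station"))
    print(assistant.say("Actually, pick one with a coffee shop nearby"))
```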


Cockpit monitoring systems: Functions such as fatigue-driving alerts, active DMS fatigue detection, driver vital-sign monitoring, facial recognition, emotion recognition, voiceprint recognition, and gesture recognition are being adopted at an accelerating pace, although emotion recognition and voiceprint recognition remain relatively uncommon.
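
One way DMS-style fatigue detection is commonly described is PERCLOS: the fraction of time the eyes are closed over a sliding window. The sketch below illustrates that metric on a stream of per-frame eye-openness values; the thresholds and the assumption of an upstream cabin-camera signal are illustrative, not any specific supplier's algorithm.

```python
# Illustrative PERCLOS-style fatigue check over a sliding window of frames.
# Thresholds are arbitrary example values, not calibrated figures.
from collections import deque


class FatigueMonitor:
    def __init__(self, window_frames: int = 300, closed_thresh: float = 0.2,
                 perclos_alarm: float = 0.4) -> None:
        self.window = deque(maxlen=window_frames)   # ~10 s of frames at 30 fps
        self.closed_thresh = closed_thresh          # openness below this counts as closed
        self.perclos_alarm = perclos_alarm          # alert if closed > 40% of the window

    def update(self, eye_openness: float) -> bool:
        """Feed one frame's eye-openness in [0, 1]; return True if a fatigue alert fires."""
        self.window.append(eye_openness < self.closed_thresh)
        if len(self.window) < self.window.maxlen:
            return False  # not enough history yet
        perclos = sum(self.window) / len(self.window)
        return perclos > self.perclos_alarm


if __name__ == "__main__":
    monitor = FatigueMonitor(window_frames=30)  # short window just for the demo
    alerts = [monitor.update(0.05) for _ in range(30)]  # simulate mostly-closed eyes
    print("alert raised:", alerts[-1])
```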


Ambient lighting: Ambient lighting is trending toward the integration of point, line, surface, and three-dimensional forms, helping human-computer interaction develop toward multimodal perception and intelligence.


3D HMI: 3D navigation, 3D vehicle-model desktops, 3D air-conditioning controls, 3D seat controls, and 3D virtual assistants have become common 3D applications in smart cockpits.


Development Trends


Emerging HMI interfaces: Touch-controlled smart surfaces are expected to be the next breakthrough for smart surfaces; multimodal fusion of ambient lighting facilitates multimodal perception and intelligent interaction; and smart seats will expand human-computer interaction through health monitoring, system integration, and more.


New trends in HMI: Human-machine interaction is extending from inside the car to outside it, mainly through headlights, laser projectors, and similar devices; headlight-based interaction will become a new arena of differentiated competition for car companies. 3D HMI applications will expand further, providing richer scene-based interaction and more natural emotional interaction. Multimodal large models can integrate diverse data from the smart cockpit, and this multi-dimensional information fusion drives HMI toward more natural and intelligent forms.



Smart-car HMI is built on the cockpit SoC and domain controller and supported by the operating system, middleware, and application software. It realizes touch, voice, gesture, and other forms of human-computer interaction through hardware such as LCD screens, HUDs, speakers, and microphones. AI and cloud computing are the key to HMI differentiation.
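
To make the layering concrete, here is a schematic sketch of how input events from the hardware layer might be routed through a middleware layer to application-level handlers. The layer names and routing scheme are simplified assumptions for illustration; real cockpit stacks (e.g. on QNX, Android Automotive, or Linux) are far more involved.

```python
# Schematic sketch of the HMI stack: hardware inputs -> middleware routing -> applications.
# All names are illustrative; this is not any specific cockpit platform's API.
from typing import Callable, Dict


class HmiMiddleware:
    """Middleware layer: dispatches raw input events to registered applications."""

    def __init__(self) -> None:
        self._routes: Dict[str, Callable[[str], None]] = {}

    def register(self, event_type: str, app_handler: Callable[[str], None]) -> None:
        self._routes[event_type] = app_handler

    def dispatch(self, event_type: str, data: str) -> None:
        handler = self._routes.get(event_type)
        if handler is None:
            print(f"[middleware] no app registered for '{event_type}'")
            return
        handler(data)


# Application layer: simple handlers standing in for HMI apps.
def navigation_app(data: str) -> None:
    print(f"[navigation] voice request: {data}")


def climate_app(data: str) -> None:
    print(f"[climate] touch input: {data}")


if __name__ == "__main__":
    middleware = HmiMiddleware()
    middleware.register("voice", navigation_app)
    middleware.register("touch", climate_app)

    # The hardware layer would feed events like these from the microphone and touchscreen.
    middleware.dispatch("voice", "take me home")
    middleware.dispatch("touch", "fan speed +1")
    middleware.dispatch("gesture", "swipe left")
```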


In-vehicle HMI interfaces have developed rapidly in areas such as the instrument cluster, center console, HUD, and rearview mirrors. At the same time, technological breakthroughs in smart surfaces, ambient lighting, seats, and related areas are bringing new directions and room for the development of in-vehicle HMI interfaces.



The main in-vehicle display products are front and rear screens and HUDs. In recent years, the number of models equipped with large, high-definition integrated screens and large HUDs has gradually increased, and cockpit displays overall are trending toward larger sizes and higher definition. Rear entertainment screens now support multi-screen linkage with the center console, instrument cluster, and co-pilot screen, enabling wireless screen projection, content sharing, vehicle control, and other functions. Some car companies and suppliers have launched detachable (snap-on) rear tablet solutions as well as slidable and rollable rear entertainment screens to save rear-seat space.
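
Multi-screen linkage and content sharing rest on some form of in-vehicle message passing between displays. The sketch below illustrates the idea with a toy publish/subscribe bus; the bus, topic names, and screens are all hypothetical, standing in for whatever IPC mechanism a given cockpit platform actually uses.

```python
# Toy publish/subscribe bus illustrating multi-screen content sharing.
# The bus and topic names are assumptions for illustration only.
from collections import defaultdict
from typing import Callable, DefaultDict, List


class CockpitBus:
    def __init__(self) -> None:
        self._subs: DefaultDict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[dict], None]) -> None:
        self._subs[topic].append(callback)

    def publish(self, topic: str, message: dict) -> None:
        for callback in self._subs[topic]:
            callback(message)


if __name__ == "__main__":
    bus = CockpitBus()

    # Rear entertainment screen and center console both listen for shared media.
    bus.subscribe("media.share", lambda m: print(f"[rear screen] now playing {m['title']}"))
    bus.subscribe("media.share", lambda m: print(f"[center console] mirroring {m['title']}"))

    # The co-pilot screen shares content with the other displays.
    bus.publish("media.share", {"title": "road-trip playlist", "source": "co-pilot screen"})
```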



With advances in facial recognition algorithms, functions such as fatigue-driving alerts, active DMS fatigue detection, driver vital-sign monitoring, facial recognition, emotion recognition, voiceprint recognition, and gesture recognition have begun to appear in production vehicles. At present, gesture recognition is mainly used for taking photos, music control, answering and placing calls, navigation, and window control; the camera is generally mounted above the rearview mirror.
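
The gesture functions listed above amount to mapping a recognized gesture label to a cockpit action. A minimal sketch of such a dispatch table follows; the gesture labels and actions are illustrative, and the recognition step itself (camera plus a vision model) is assumed to happen upstream.

```python
# Minimal gesture-to-action dispatch table (labels and actions are illustrative).
from typing import Callable, Dict

GESTURE_ACTIONS: Dict[str, Callable[[], str]] = {
    "victory_sign": lambda: "taking a cabin photo",
    "swipe_left": lambda: "previous track",
    "swipe_right": lambda: "next track",
    "palm_up": lambda: "answering incoming call",
    "fist": lambda: "rejecting incoming call",
    "point_down": lambda: "lowering driver window",
}


def handle_gesture(label: str) -> str:
    """Map a recognized gesture label (from the upstream vision model) to an action."""
    action = GESTURE_ACTIONS.get(label)
    return action() if action else f"unrecognized gesture: {label}"


if __name__ == "__main__":
    for gesture in ["swipe_right", "victory_sign", "thumbs_up"]:
        print(gesture, "->", handle_gesture(gesture))
```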



Multimodal large models can integrate diverse data from the smart cockpit, such as voice, gestures, and facial expressions, driving the evolution of cockpit interaction from traditional physical controls toward more natural and intelligent forms such as voice, gesture, and facial recognition.
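
At a very high level, multimodal fusion means combining cues from several channels before committing to an intent. The sketch below shows a late-fusion toy example that merges per-modality intent scores with fixed weights; the modalities, weights, and intents are all illustrative assumptions, not how any particular cockpit model is built.

```python
# Toy late-fusion of per-modality intent scores (weights and intents are illustrative).
from typing import Dict


def fuse_intents(modality_scores: Dict[str, Dict[str, float]],
                 weights: Dict[str, float]) -> str:
    """Weighted sum of intent scores across modalities; return the top-scoring intent."""
    fused: Dict[str, float] = {}
    for modality, scores in modality_scores.items():
        w = weights.get(modality, 0.0)
        for intent, score in scores.items():
            fused[intent] = fused.get(intent, 0.0) + w * score
    return max(fused, key=fused.get)


if __name__ == "__main__":
    scores = {
        "voice":   {"open_window": 0.7, "play_music": 0.3},
        "gesture": {"open_window": 0.2, "play_music": 0.8},
        "face":    {"open_window": 0.1, "play_music": 0.4},
    }
    weights = {"voice": 0.6, "gesture": 0.3, "face": 0.1}
    print("fused intent:", fuse_intents(scores, weights))
```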

