1. Introduction
Cars today have changed not only in power source, driving mode, and driving experience; the cockpit has also moved beyond the dull mechanical-and-electronic space of the past. Its level of intelligence has risen sharply, and it is becoming the "third space" of people's lives after the home and the office. Through technologies such as face and fingerprint recognition, voice and gesture interaction, and multi-screen linkage, today's smart cockpit has significantly stronger capabilities in environmental perception and in collecting and processing information, making it a "smart assistant" for the driver. One clear sign that the smart cockpit has left simple electronics behind and entered the smart-assistant stage is that interaction between people and the cockpit has changed from passive to active. Here "passive" and "active" are defined from the cockpit's point of view: in the past, information exchange was mainly initiated by the person, whereas now either the person or the machine can initiate it. The level of human-machine interaction has become an important yardstick for grading smart cockpit products.
2. Background of Human-Computer Interaction Development
The history of computers and mobile phones reflects how human-machine interaction has developed: from complex to simple and concise, from abstract actions to natural interaction. The most important trend for the future is to move the machine from passive response to active interaction. Following this trend to its end, the ultimate goal of human-machine interaction is to anthropomorphize the machine, so that interacting with a machine feels as natural and smooth as communicating with another person. In other words, the history of human-machine interaction is one of moving from people adapting to machines toward machines adapting to people. The smart cockpit has gone through a similar process. With advances in electronics and rising expectations from car owners, there are more and more electronic signals and functions inside and outside the car, and the interface must help drivers avoid wasting attention resources and thus reduce the driving burden. As a result, in-car interaction has evolved step by step: physical knobs and buttons, then digital touch screens, then voice control, and finally natural interaction.
2.1 Natural interaction is the ideal model for the next generation of human-computer interaction
2.1.1 What is natural interaction?
In short, natural interaction is communication through actions, eye movement, speech, and similar channels. The mode of perception involved is close to human "perception" in the broad sense: a mixture of multiple channels corresponding to the five human senses of vision, hearing, touch, smell, and taste. The corresponding information media are the various sensors, covering sound, video, text, infrared, pressure, radar, and so on. A smart car is essentially a passenger-carrying robot. Its two most critical capabilities are controlling itself and interacting with people; lacking either one, it cannot serve people efficiently. An intelligent human-computer interaction system is therefore essential.
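As a rough illustration of what "a mixture of various perceptions" might look like in software, the minimal Python sketch below models a mixed in-cabin sensor stream grouped by perception channel. All type names and sensor identifiers are hypothetical, chosen only for illustration.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Modality(Enum):
    """Perception channels, loosely mirroring the human senses."""
    VISION = auto()   # cameras, infrared imagers
    HEARING = auto()  # microphone arrays
    TOUCH = auto()    # pressure / capacitive sensors
    RANGING = auto()  # radar, ultrasonic (no direct human analogue)

@dataclass
class PerceptionEvent:
    """A single timestamped observation from one in-cabin sensor."""
    modality: Modality
    sensor_id: str     # e.g. "cabin_cam_front" (hypothetical name)
    timestamp_ms: int
    payload: bytes     # raw frame, audio chunk, pressure reading...

def group_by_modality(
    events: list[PerceptionEvent],
) -> dict[Modality, list[PerceptionEvent]]:
    """Bucket a mixed event stream so a downstream fusion stage
    can consume all channels of one modality together."""
    buckets: dict[Modality, list[PerceptionEvent]] = {}
    for ev in events:
        buckets.setdefault(ev.modality, []).append(ev)
    return buckets
```

The point of the sketch is only that natural interaction treats heterogeneous sensors as one mixed stream rather than as isolated inputs.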
2.1.2 Ways to implement natural interaction
More and more sensors are being integrated into the cockpit, and they have improved in form diversity, data richness, and accuracy. On one hand this makes the cockpit's computing-power demand leap upward; on the other, it provides much better perception support. The trend makes richer cockpit scenario innovation and a better interactive experience possible.

Among these capabilities, visual processing is the key to cockpit human-computer interaction technology, and fusion across modalities is what really matters. Take speech recognition under noisy conditions: microphones alone are not enough. A person can selectively listen to someone speaking, using not only the ears but also the eyes. Likewise, by visually locating the sound source and reading lips, a system achieves better results than speech recognition alone. If the sensors are a person's five senses, then the computing power is the brain behind the interaction. AI algorithms combine vision and voice and, through multiple recognition methods, can identify signals such as faces, actions, postures, and speech. The result is more intelligent human-machine interaction, including eye tracking, speech recognition, voice-and-lip-reading linkage, and driver fatigue detection.

Cockpit interaction usually needs to be built on edge computing rather than cloud computing, for three reasons: reliability, real-time performance, and privacy. Cloud computing depends on the network, and for a smart car a wireless connection cannot be guaranteed to stay up; meanwhile, transmission delays are uncontrollable, so smooth interaction cannot be guaranteed either. To preserve a complete user experience in the safety-critical driving domain, the answer is edge computing. Privacy is the remaining concern, and the private space of the cabin deserves particular protection. Today's personalized speech recognition is mainly implemented in the cloud, and private biometric information such as voiceprints readily exposes personal identity. With edge AI on the vehicle side, private biometric data such as images and voice can be converted into semantic information inside the car and only then uploaded to the cloud, effectively protecting the personal information of the car's occupants.
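The two ideas above, audio-visual fusion and keeping raw biometrics on the vehicle, can be sketched together. The following is a minimal, illustrative Python sketch, not a real system: the weighting, thresholds, seat names, and the intent label are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class SeatObservation:
    """Per-seat scores from the two on-device perception channels."""
    seat: str               # e.g. "driver", "front_passenger" (assumed labels)
    audio_doa_score: float  # mic-array direction-of-arrival match, 0..1
    lip_motion_score: float # camera-based lip-activity score, 0..1

def active_speaker(obs: list[SeatObservation],
                   audio_weight: float = 0.6) -> str | None:
    """Late fusion: a weighted sum of audio and visual evidence.
    Either channel alone is ambiguous in a noisy cabin; combined,
    they disambiguate who is actually speaking."""
    best, best_score = None, 0.5  # reject if no seat clears 0.5
    for o in obs:
        score = (audio_weight * o.audio_doa_score
                 + (1 - audio_weight) * o.lip_motion_score)
        if score > best_score:
            best, best_score = o.seat, score
    return best

def to_cloud_payload(speaker: str | None, intent: str) -> dict:
    """Only semantic results leave the car: no raw audio frames,
    images, or voiceprints are ever uploaded."""
    return {"speaker_role": speaker, "intent": intent}

# Example: the driver speaks while music plays; vision breaks the tie.
obs = [SeatObservation("driver", 0.7, 0.9),
       SeatObservation("front_passenger", 0.6, 0.1)]
print(to_cloud_payload(active_speaker(obs), "open_sunroof"))
```

Note how the cloud payload carries only a role and an intent string: the privacy benefit comes from discarding the raw biometric signals at the edge, before anything is transmitted.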
2.1.3 In the era of autonomous driving, interaction intelligence must match driving intelligence
For the foreseeable future, human-machine co-driving will remain the norm, and cockpit human-machine interaction will be the first interface through which people come to grips with autonomous driving capabilities. At present, intelligent driving suffers from uneven evolution: the level of human-computer interaction lags behind the level of autonomous driving, which leads to frequent autonomous-driving incidents and holds back the field's development. The defining characteristic of human-machine co-driving is that the human stays in the operation loop, so human-computer interaction capability must keep pace with autonomous driving capability. Otherwise, serious risks arise in the safety of the intended functionality, and most fatal autonomous-driving incidents are related to exactly this. Once the interaction interface can present the driving system's own perception and cognition results, the driver can better understand the capability boundary of the autonomous driving system, which will greatly help the acceptance of higher-level autonomous driving functions.

Today's smart-cockpit interaction is mainly an extension of the mobile-phone Android ecosystem, centered on the host screen. Displays keep getting bigger, which in practice means low-priority functions crowd out high-priority ones, adding signal interference and degrading operational safety. Physical displays will still exist in the future, but I believe they will be superseded by natural human-computer interaction plus AR-HUD. If intelligent driving reaches L4 or above, people will be freed from tedious, tiring driving, and the car will truly become "people's third living space". The positions of the entertainment zone and the safety-function zone (human-computer interaction and autonomous driving) in the cabin will then change, with the safety zone becoming the main control area.

First, autonomous driving is the car's interaction with the environment, while human-machine interaction is the person's interaction with the car; the two integrate to coordinate person, car, and environment into a complete closed driving loop. Second, a dialogue interface built on natural conversation plus AR-HUD is safer: communicating by speech or gesture avoids diverting the driver's gaze and thereby improves driving safety. A large in-cabin screen cannot achieve this, but an AR-HUD avoids the problem while also displaying the autonomous driving system's perception results. Third, natural conversation is implicit, concise, and emotional: it occupies little of the car's precious physical space, yet can accompany the occupant anytime, anywhere. The integration of smart driving and the smart cockpit is therefore the safer development path, ultimately evolving into the car's central system.
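The idea that the interface should expose the driving system's capability boundary can be made concrete with a small sketch. The states, confidence threshold, and HUD messages below are assumptions for illustration only, not taken from any real system.

```python
from enum import Enum

class SystemState(Enum):
    NOMINAL = "nominal"        # within the operational design domain (ODD)
    DEGRADED = "degraded"      # e.g. heavy rain reducing perception confidence
    OUT_OF_ODD = "out_of_odd"  # e.g. unmapped road, system cannot continue

def hud_message(state: SystemState, confidence: float) -> tuple[str, bool]:
    """Map the driving system's self-assessment to an AR-HUD prompt
    and a flag for whether a takeover request must be issued.
    The 0.7 threshold is an illustrative assumption."""
    if state is SystemState.OUT_OF_ODD:
        return ("Take over now: leaving supported road section", True)
    if state is SystemState.DEGRADED or confidence < 0.7:
        return ("Hands on wheel: perception confidence reduced", True)
    return ("Autopilot active: lane and vehicles tracked", False)

# Example: perception confidence drops in heavy rain.
msg, takeover = hud_message(SystemState.DEGRADED, 0.55)
print(msg, takeover)
```

The design point is that the HUD message is derived from the system's self-assessment, so the driver sees the capability boundary rather than having to infer it.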
2.2 Practical principles of human-computer interaction
2.2.1 Touch interaction
In the early days, the center-console screen displayed only radio information, and most of the console area was occupied by physical buttons; interaction with the car was basically tactile. As interaction grew more intelligent, large central control screens appeared and physical buttons began to dwindle. The central screens keep getting larger and occupy ever more prominent positions, while the physical buttons on the console have been reduced to almost nothing. At this stage, occupants can no longer interact with the cockpit by touch and instead shift to visual interaction, operating mainly by vision. But relying on vision alone to communicate with the smart cockpit is decidedly inconvenient. During driving in particular, roughly 90% of the driver's visual attention must be devoted to observing road conditions, so the driver cannot keep their eyes on a screen for long in order to interact with the cockpit.