At CES Asia 2019, technology and product displays from the automotive field occupied half of the exhibition, and the automotive halls were expanded to two. Some people even joked that CES Asia had become an auto show.
In fact, the technological transformation of the automotive field has been under way for a long time: the traditional means of transportation is gradually turning into a mobile smart terminal.
At CES Asia 2019, Nissan showcased its Invisible-to-Visible technology that combines reality and virtuality, its Brain-to-Vehicle technology that interprets brain signals, and its new all-electric all-wheel drive model, the Nissan IMs concept car, bringing us one step closer to future car driving scenarios.
Invisible-to-Visible technology: combining the virtual and the real
First, let's look at Invisible-to-Visible (I2V) technology. This future technology aims to let you see through the buildings along the road and follow a guided route while driving. As the name suggests, it turns the invisible into the visible: by overlaying virtual displays on real objects in the driver's forward field of view, it helps the driver see things that would otherwise be blocked from sight.
Invisible-to-Visible technology demonstration
As part of Nissan Intelligent Mobility, the technology uses sensors inside and outside the vehicle, together with data stored in the cloud, to create a 360-degree virtual space around the car, which Nissan calls the Metaverse. It provides information on roads, intersections, and nearby pedestrians, so the system not only perceives the vehicle's current surroundings but can also "see" around corners and behind buildings in advance.
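The fusion described above can be illustrated with a minimal sketch. Note that this is purely hypothetical pseudologic, not Nissan's implementation: the object format, IDs, and the `occluded` flag are all assumptions made for illustration.

```python
# Hypothetical sketch: merging live sensor detections with cloud map data
# so that objects the onboard sensors cannot see are flagged as occluded
# and can be highlighted in the driver's virtual view.

def build_virtual_view(sensor_objects, cloud_objects):
    """Combine objects detected by onboard sensors with objects known only
    from cloud data; anything absent from the sensor view is marked as
    occluded (e.g. hidden behind a building or around a corner)."""
    seen_ids = {obj["id"] for obj in sensor_objects}
    view = [dict(obj, occluded=False) for obj in sensor_objects]
    for obj in cloud_objects:
        if obj["id"] not in seen_ids:  # known to the cloud, invisible on-site
            view.append(dict(obj, occluded=True))
    return view

# The sensors see one vehicle; the cloud also knows about a pedestrian
# hidden around the corner.
sensors = [{"id": "car-1", "type": "vehicle"}]
cloud = [{"id": "car-1", "type": "vehicle"},
         {"id": "ped-7", "type": "pedestrian"}]
view = build_virtual_view(sensors, cloud)
```

In this toy version, the occluded pedestrian ends up in the merged view with `occluded=True`, which is the piece of information a display layer would use to draw a warning overlay.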
When the vehicle is in autonomous mode, the technology can make driving easier and more pleasant. For example, when driving in the rain, I2V can project a sunny scene inside the cabin; when visiting an unfamiliar place, the system can search the virtual world for a knowledgeable local guide who can talk with the driver and passengers.
When the vehicle is in manual driving mode, I2V uses data collected by the omnidirectional perception system to give the driver a full-coverage view. This information helps the driver assess and anticipate situations such as low-visibility corners, irregular roads, or oncoming vehicles. The driver can also book a professional driver in the virtual world for real-time personal guidance; the professional appears as a projected avatar or a virtual lead car in the driver's field of view to demonstrate the best driving line.
In everyday driving, if a person suddenly runs into the road, the virtual scene the driver sees will mark the hazard as a reminder, allowing the driver to avoid the danger in time.
In the parking-garage demo, information about the lot appears in the virtual view on the right, such as a notice that the first floor is full. In the mountain-road demo, the vehicle can simulate a racing driver sitting in the passenger seat to coach the driver.
Because the technology is still some way from production, visitors who wanted to experience it at the Nissan booth had to wear a pair of augmented-reality (AR) glasses.
Brain-controlled car technology: Driving with the brain is safer
Imagine a scenario: in the future, your car can predict your intentions and act on your behalf, for example turning the steering wheel or slowing the car 0.2 to 0.5 seconds earlier than you would, with the driver barely aware of the process.
This scenario can be realized through Nissan's Brain-to-Vehicle (B2V) technology. At this CES Asia, Nissan demonstrated this technology to us.
Brain-controlled car technology
Brain-controlled car technology conveys the driver's reactions to the vehicle more quickly, allowing the vehicle to continuously adjust to changing driving conditions. Nissan's approach uses brain-signal decoding to predict driver behavior and to detect driver discomfort while driving.
By predicting the driver's reactions, brain-controlled car technology can enhance the vehicle's responsiveness, and by learning the driver's habits it can make timely adjustments that reduce discomfort. In Nissan's view, brain-controlled cars will not eliminate the pleasure of driving; rather, the technology will enhance the driving experience, particularly for novice drivers.
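The timing benefit claimed above, acting 0.2 to 0.5 seconds before the driver physically reacts, can be sketched with a toy calculation. The function, the reaction time of 0.8 s, and the lead of 0.3 s are illustrative assumptions, not figures from Nissan.

```python
# Hypothetical sketch: how a brain-signal "lead" shortens the time before
# the vehicle begins a steering correction. All numbers are illustrative.

def steering_start_time(event_time, driver_reaction, b2v_lead=None):
    """Time at which steering begins, in seconds.
    Without B2V, the car waits for the driver's physical reaction;
    with B2V, it acts as soon as the predicted intent is decoded,
    i.e. b2v_lead seconds before the driver actually moves."""
    manual_start = event_time + driver_reaction
    if b2v_lead is None:
        return manual_start
    return manual_start - b2v_lead

# A hazard appears at t = 0 s; assume the driver needs 0.8 s to react.
manual = steering_start_time(0.0, 0.8)                 # steering at 0.8 s
assisted = steering_start_time(0.0, 0.8, b2v_lead=0.3)  # steering at 0.5 s
```

At highway speed, even a few tenths of a second translates into several meters of travel, which is why this lead time matters for safety.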
Beyond controlling the vehicle itself, the B2V system can adapt to more scenarios: volume, air-conditioning temperature, seat angle, and other in-car settings could all be adjusted through the brain-control system, so that even in a self-driving car the driver retains a strong sense of participation and control.
Today, autonomous driving still requires the driver to set instructions in advance, and the system cannot always be completely trusted. At high speeds, the monitoring system may fail to identify some obstacles quickly enough, and manual intervention is still required.
This is when the driver's ability to respond is tested, because in autonomous mode the driver is often not concentrating on the road. Nissan's Brain-to-Vehicle (B2V) technology will help improve the safety of future autonomous driving systems.
In the future, brain-controlled car technology will be combined with autonomous vehicles. That way, the autonomous system will not feel rigidly programmed, and the driver can intervene and take control with ease.
Nissan IMs concept car
Finally, Nissan also displayed its new all-electric all-wheel-drive model, the Nissan IMs concept car, which debuted at CES earlier this year. It is the first concept car equipped with Invisible-to-Visible technology.
Summary
Compared with today's touch and voice control systems, driving in the future may require only your brain. Brain-controlled car technology and Invisible-to-Visible technology may seem far off, but with the commercialization of 5G and advances in VR and brain-wave technology, they may arrive sooner than we think.