Today we will extend our focus to XR.
The emergence of XR has added rich new dimensions to this round of human-computer interaction. An immersive experience that blends the virtual and the real has long been the goal the industry has worked toward. So what does "immersive" actually mean? How far are we from truly immersive XR? This issue answers those questions.
Since the advent of computing, people and hardware have been building an interactive relationship. Viewed through the lens of human-computer interaction, over the past 50 years hardware has evolved roughly from game consoles to personal computers and then to smartphones, corresponding to a path from vertical (special-purpose) computing hardware to general-purpose computing hardware and on to miniaturized hardware. XR opens the next chapter of this evolution.
The virtuous cycle of content, hardware, and ecosystem created by the continuous iteration of XR technology and devices has brought us to the doorstep of the metaverse. For the XR industry, three core underlying technologies support its development: visual and intelligent computing, large-screen display, and high-speed connectivity.
UNISOC has long invested in core technologies such as IP integration and audio/video multimedia IP, and has continuously advanced both 5G/Wi-Fi communication infrastructure and the high dynamic range (HDR), wide color gamut (WCG), and high frame rate (HFR) technologies required for immersive large-screen display. Beyond product R&D, UNISOC is deeply involved in XR industry standards and actively works with ecosystem partners to advance XR technology research and the industry as a whole.
As integration and innovation across the industrial ecosystem accelerate, XR and its key technical capabilities are gradually being woven into every aspect of consumers' daily lives. In a booming market, the ability to perceive and transmit information anytime, anywhere, and to experience a diverse world between the virtual and the real, is no longer far off.
What is XR?
Extended Reality (XR) encompasses Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), and related technologies. It integrates computing and AI, audio and video processing, simulation, and communication technologies to build virtual environments or create new virtual spaces that simulate the real world, delivering three-dimensional immersion and multi-sensory virtual interaction.
XR brings together different digital technologies to create a multi-dimensional immersive experience built on visual immersion, physical immersion, and cognitive immersion. Visual immersion relies on supporting technologies such as ultra-high-definition display, very high software and hardware computing power, and high-bandwidth data transmission. Physical immersion aims to break the boundary between physical and digital space and rests on core technologies such as multimodal interaction, spatial computing, three-dimensional reconstruction, and localization and mapping. Cognitive immersion further extends XR's semantic and geometric understanding of real scenes and points to technologies with broad future prospects, including brain-computer interfaces, glasses-free 3D, and light fields.
What are the key technologies of XR?
Visual quality is key to XR's ultimate experience: roughly 80% of the information humans receive comes from vision. A visual experience close to that of the naked eye requires ultra-high pixel density and a suitable field of view (FOV). Pixel density depends on the image resolution and the display resolution, which together measure how fine the image is; higher resolution mitigates the "screen door effect"¹ and brings clarity as close to the real world as possible. The size of the field of view affects both immersion and clarity: generally, the larger the field of view, the more image content must be rendered, the stronger the visual immersion, and the more computing power is required.
Figure: Definition of the human eye's field of view
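To make the relationship between resolution, field of view, and pixel density concrete, here is a rough back-of-the-envelope sketch with assumed numbers (a 2160-pixel-wide per-eye panel and a 100° horizontal FOV, not any particular headset), compared against the roughly 60 pixels per degree often cited as the limit of normal visual acuity:

```python
# Back-of-the-envelope pixels-per-degree (PPD) calculation with assumed
# numbers: a 2160-pixel-wide per-eye display spread across a 100-degree
# horizontal field of view, compared with the ~60 PPD often cited as the
# limit of normal human visual acuity (about 1 arcminute per pixel).

def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Average angular pixel density across the field of view."""
    return horizontal_pixels / horizontal_fov_deg

ppd = pixels_per_degree(2160, 100)   # assumed per-eye panel width and FOV
target_ppd = 60                      # ~1 arcminute per pixel

print(f"display: {ppd:.1f} PPD, target for ~20/20 acuity: {target_ppd} PPD")
print(f"horizontal pixels needed at 100 degrees: {target_ppd * 100}")
```

This simple ratio ignores lens distortion and the uneven pixel density across the view, but it shows why a wider field of view demands far more pixels, and therefore more rendering and computing power.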
Multimodal perception and interaction integrates technologies such as facial tracking, eye tracking, voice recognition, gesture interaction, and haptic simulation, letting people access content and interact in whatever way best suits their current scenario and task.
Take voice recognition as an example: different scenarios place different demands on a device's voice recognition and processing capabilities. In home scenarios, multiple noise types, complex sound-source locations, long pickup distances, and poor signal-to-noise ratios are the main obstacles to a better user experience; in gaming scenarios, audio perception must be enhanced, combining AI noise reduction, AI enhancement, and other techniques to heighten the sense of presence...
Starting from the immersive experience, UNISOC has launched audio and video system solutions spanning clear voice, efficient video encoding and decoding, and ultra-high-definition smart display, covering the full range of smart terminals, further driving the adoption of XR products and applications and enabling the upgrade from mobile experiences to immersive ones.
For example, the M6780, the first smart display chip platform equipped with an NPU, integrates a complete on-device intelligent voice solution. With UNISOC's self-developed multi-microphone-array audio capture algorithm, intelligent voice wake-up, on-device command word recognition, and other technologies, the chip can hear clearly, understand, and execute.
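To illustrate how such an on-device voice pipeline generally fits together, here is a minimal sketch assuming a generic flow of multi-microphone capture, wake-word detection, and command-word recognition; the class and function names are hypothetical stand-ins and are not the M6780's actual software interface:

```python
# Hypothetical sketch of an on-device voice pipeline: multi-mic capture ->
# wake-word detection -> command-word recognition. All names are
# illustrative, not a real SDK. Assumes 16 kHz audio in 20 ms frames.

import numpy as np

FRAME_SAMPLES = 320  # 20 ms at 16 kHz

class WakeWordDetector:
    """Placeholder for a small always-on keyword-spotting model."""
    def detect(self, frame: np.ndarray) -> bool:
        # A real detector would run a tiny neural network (e.g. on an NPU).
        return frame.max() > 0.5  # stand-in threshold, illustration only

class CommandRecognizer:
    """Placeholder for an on-device command-word recognizer."""
    def recognize(self, frames: list) -> str:
        # A real recognizer maps audio to a fixed command vocabulary.
        return "volume_up"  # stand-in result, illustration only

def beamform(mic_frames: np.ndarray) -> np.ndarray:
    """Crude stand-in for array processing: average the mic channels."""
    return mic_frames.mean(axis=0)

def run_pipeline(mic_stream):
    """mic_stream yields arrays of shape (num_mics, FRAME_SAMPLES)."""
    wake, cmd = WakeWordDetector(), CommandRecognizer()
    buffered, awake = [], False
    for mic_frames in mic_stream:
        frame = beamform(mic_frames)
        if not awake:
            awake = wake.detect(frame)        # low-power listening state
        else:
            buffered.append(frame)            # collect audio after wake-up
            if len(buffered) * FRAME_SAMPLES >= 16000:  # ~1 s of audio
                print("execute:", cmd.recognize(buffered))
                buffered, awake = [], False   # back to listening

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo_stream = (rng.uniform(-1, 1, (4, FRAME_SAMPLES)) for _ in range(120))
    run_pipeline(demo_stream)
```

The point of the structure is that only the lightweight wake-word stage runs continuously; the heavier recognition stage is triggered on demand, which is what makes always-on voice practical within a terminal's power budget.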
5G enables richer XR experiences
The traffic characteristics of XR services, such as non-integer frame periodicity, jitter, high data rates, and strict packet delay budgets, pose major challenges to the power saving and capacity of communication systems. Constrained by the battery capacity of handheld and wearable devices, terminal energy consumption must also be optimized. Achieving power-saving and capacity gains while guaranteeing XR service quality will help extended reality applications reach the market faster.
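As a rough illustration of the "non-integer periodicity" problem, assume a 60 fps video stream and a 16 ms connected-mode DRX power-saving cycle (both assumed numbers): the frame arrival time drifts a little further from the DRX on-duration every cycle.

```python
# Rough illustration of XR's "non-integer periodicity" problem (assumed
# numbers): a 60 fps stream produces a frame every 1000/60 ≈ 16.67 ms,
# while power-saving DRX cycles are configured in integer milliseconds
# (16 ms here), so the offset between frame arrival and the DRX
# on-duration keeps growing.

FPS = 60
FRAME_PERIOD_MS = 1000 / FPS      # ≈ 16.67 ms
DRX_CYCLE_MS = 16                 # assumed integer-ms DRX cycle

for n in range(0, 61, 15):        # check the drift every 15 frames
    frame_time = n * FRAME_PERIOD_MS
    drx_time = n * DRX_CYCLE_MS
    print(f"frame {n:2d}: arrives at {frame_time:7.2f} ms, "
          f"DRX window at {drx_time:4d} ms, drift {frame_time - drx_time:5.2f} ms")
```

Because the 16.67 ms frame period never lines up with an integer-millisecond cycle, the drift accumulates (about 40 ms after one second in this sketch), which is one reason XR-aware scheduling and power-saving enhancements are needed on top of basic 5G mechanisms.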
5G offers high data rates, low latency, and high reliability. Tailored to the characteristics of XR traffic, it also introduces power-saving and capacity optimization mechanisms that not only meet the communication needs of XR services but also expand the scenarios in which users can enjoy them, giving users more flexible ways to engage. Combined with AI, cloud computing, and other technologies, the efficiency and quality of immersive content delivery improve greatly: users are no longer passive recipients of information but can actively choose and create new experiences through real-time interaction.
Since 2018, 3GPP has treated XR as an important part of the 5G standard, covering a comprehensive review of XR concepts, key technologies, device types, and performance indicators. For 5G-Advanced and 6G, beyond meeting basic XR service scenarios, the focus will shift to the needs of broader metaverse scenarios; under the general trend of integrating sensing, computing, and networking, the ultimate vision of "intelligence everywhere" can be realized.
Figure: Different types of extended reality technologies and their applications (3GPP TR 26.928)
Today we can already enjoy intelligent, immersive, interactive experiences in settings such as entertainment and exhibitions, industrial manufacturing, social chat, and medical surgery. From the perspective of long-term industry development, XR still has many challenges to overcome in interaction fluency, response speed, cognitive immersion, and content production. It is undeniable, however, that as technologies converge and adoption spreads, the exploration of XR technology will go hand in hand with an expanding content ecosystem, penetrating more diverse scenarios and continually opening up new horizons and commercial market space for the industry.
Note:
¹ The "screen door effect" refers to the fact that, because of limited display resolution, the human eye can directly see the display's individual pixels and the gaps between them, as if looking at the scene through a screen door.