Vision is arguably the most important human sense; by some estimates, nearly half of the brain's memory is devoted to processing visual images. Because visual experience matters so much, technology giants have invested heavily in augmented reality and virtual reality (AR/VR), and with their participation the concept of the "metaverse" has ignited the entire market. They treat enhanced vision as a starting point and regard it as a key technology for the human-machine interface (HMI) of the next generation of hardware devices. As the technology matures, a "screenless era" seems bound to arrive: AR/VR technology could replace computer screens, and AR/VR glasses could replace smartphones. However, both the electronic components (the brain) and the optical components (the eyes) still face many technical obstacles. This article examines AR/VR's cutting-edge technology from the perspective of optics and intelligent imaging systems.
What is an imaging system?
What role do they play in AR/VR?
In applications ranging from space exploration, Industry 5.0 machine vision, infotainment, and driver assistance to computers and smartphones, imaging systems are used mainly for observation, image capture, or image display. An imaging system generally consists of a camera with a lens and a display, and may be a stand-alone unit or a modular component.
The concept of digitalization dates to the mid-20th century. The earliest human-machine interfaces (HMIs) were mainframe computers that only experts could operate. In the late 1970s the first personal computers entered homes, and the later arrival of smartphones made human-machine interaction a daily routine. Forecasts suggest that head-up displays and hands-free AR/VR systems will be the next wave of HMIs. No one can yet predict the exact form future interfaces will take, and achieving seamless edge computing (operation without a cloud connection) remains a major challenge.
Immersive virtual-reality living environments are still only an appealing vision. Processing the enormous amount of information about the surrounding environment is a huge challenge for augmented-reality systems, and there are further limitations in price, overall performance (image quality, power consumption), and ergonomics. One day, however, AR/VR systems may be as light, seamless, efficient, and comfortable to wear as contact lenses. Rather than distracting the wearer, they could improve attention by providing customized information about targets in the field of view.
Just as the brain and the eyes are two major organs of the human body, computing chips and imaging systems are the core of the future AR/VR human-machine interface. Driven by Moore's Law, the semiconductor ecosystem has spent decades miniaturizing computing chips; as a result, today's smartphones offer computing power comparable to the supercomputers of the 1970s. The optoelectronics community is now pursuing the same kind of innovation for AR/VR: the miniaturization and digitization of smart imaging systems.
Miniaturization of imaging systems
Today, for $1, you can buy a set of eight free-form plastic lenses that together measure less than one thousandth of an inch. Such a complex lens design is the result of optimizing more than a thousand parameters. CMOS image sensors and computer-vision chips have been miniaturized and improved alongside the optics, and today's smartphones typically carry four cameras that take better images than the bulky 1-megapixel digital cameras of 20 years ago.
Likewise, displays are getting smaller and smaller, and the next generation of AR devices may not be glasses that we wear but devices placed in our eyes. Some companies are developing tiny systems mounted on contact lenses, in effect invisible AR glasses. The display of such a bionic eye is smaller than a grain of sand and projects light onto the most sensitive part of the eye, the fovea, where photoreceptors are most numerous and densely packed. Next-generation versions are expected to include a processor, eye-tracking sensors, and communication chips, with a small battery sufficient to power the entire device.
Digital imaging system
Like electronics a few decades ago, optics today is still largely an analog technology. Optical systems are built from bulky curved surfaces, multiple layers of thin films, or continuously shaped elements. Alternatives to this traditional approach have emerged only recently. With the advent of planar optics, devices built from nanostructures smaller than the wavelength of light can reproducibly perform optical functions such as focusing, color filtering, and reflection. The main advantages of this technology are its compactness, versatility, and compatibility with semiconductor processes, which opens new possibilities for integrating optics with electronics. Many hurdles remain before the technology matures, and leaders in materials and semiconductor processing are applying their expertise to bring planar optics to markets of all sizes.
Eventually, every optical element will move from analog to digital technology. Fully digital imaging systems will capture or display images in a more flexible, task-specific way. Foveated cameras and displays illustrate how a pixelated imaging system can be used more efficiently and in a more personalized manner: detailed visual information is delivered precisely where the processing unit needs it, while the resolution of other regions is reduced. In a sense, digital imaging systems imitate the function of the human eye.
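As a toy illustration of this foveated idea (a minimal sketch in plain NumPy; the function name and parameters are hypothetical, not from any real product), the code below keeps full resolution inside a circular "fovea" around a simulated gaze point and replaces the periphery with a subsampled low-resolution copy:

```python
import numpy as np

def foveate(image, center, radius, factor=4):
    """Keep full resolution inside a circular fovea around `center`
    (the simulated gaze point); outside `radius`, replace pixels with a
    coarse copy subsampled by `factor`, mimicking a foveated camera that
    spends its pixel budget where the viewer is actually looking."""
    h, w = image.shape[:2]
    # Coarse periphery: subsample by `factor`, then repeat each coarse
    # pixel back up to full size (nearest-neighbor upsampling).
    low = image[::factor, ::factor]
    low = np.repeat(np.repeat(low, factor, axis=0), factor, axis=1)[:h, :w]
    # Circular mask selecting the high-resolution fovea.
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
    out = low.copy()
    out[mask] = image[mask]
    return out

img = np.arange(64 * 64, dtype=float).reshape(64, 64)
fov = foveate(img, center=(32, 32), radius=8, factor=4)
```

With a 4x subsampling factor, the periphery carries roughly 1/16 of the original pixel information, while the region around the gaze point stays pixel-exact, which is the bandwidth trade-off foveated systems exploit.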
Designing for the future: Intelligent imaging systems
Almost half of the human brain’s memory is devoted to visual image processing. Likewise, imaging systems must become intelligent, which means that optics and electronics must be co-designed, co-optimized, and perhaps even co-manufactured.
An imaging system consists of sensors that collect light, converters that transmit the raw sensor signals to processing units, and computer-vision units that post-process the digital signals to make decisions. Today, each component is designed separately by different teams and manufactured with different technologies.
Near-sensor and in-sensor computing will become mainstream approaches, enabling more innovative imaging systems with lower energy consumption, lower latency, and better security. For example, a CMOS image sensor can be built as a planar system-on-chip that integrates a photodiode array, pulse-modulation circuits, and a simple ADC, bringing front-end processing close to, or into, the sensor layer. Another approach is to co-optimize the optics, the sensor, and the algorithms for a specific imaging task: recognizing objects and making decisions is not the same as taking pictures, and a video-surveillance camera should differ from an eye-tracking camera. Once the display engine is fully optimized in this way, we may not even need to build intermediate images to present truly immersive content.
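The bandwidth-saving intuition behind near-sensor computing can be sketched in a few lines (a simplified software model, not a description of any real sensor; the function name and threshold are assumptions): instead of streaming every full frame off the chip, the sensor layer compares consecutive frames and emits only the pixels that changed, the way event-based vision sensors do.

```python
import numpy as np

def near_sensor_events(prev_frame, frame, threshold=10):
    """Toy model of near-sensor processing: rather than transmitting the
    whole frame, emit only (row, col, polarity) events for pixels whose
    brightness changed by more than `threshold`, reducing the data sent
    to the downstream processing unit (and with it, energy and latency)."""
    diff = frame.astype(int) - prev_frame.astype(int)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    polarity = np.sign(diff[rows, cols])  # +1 brighter, -1 darker
    return list(zip(rows.tolist(), cols.tolist(), polarity.tolist()))

prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 50  # a single pixel brightened
events = near_sensor_events(prev, curr, threshold=10)
```

Here a static 4x4 scene with one changed pixel produces a single event instead of 16 pixel values, illustrating why moving even simple processing next to the photodiode array can cut the data volume dramatically.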
Latest update time: 2024-11-16 13:22