Machine vision is traditionally defined as electronic imaging for inspection, process control, and automatic navigation. In machine vision applications, computers rather than humans capture images as input and extract and transmit information as output. According to MEMS Consulting, beyond traditional industrial applications, machine vision capabilities for advanced driver assistance systems (ADAS), augmented and virtual reality (AR/VR) technologies, and intelligent safety systems all require advanced digital imaging technology. This technology lets machine vision "see" more clearly and farther in low-light or no-light conditions, and because no visible light source is required, it does not interfere with normal human activities.
Historically, machine vision has relied on a variety of light sources to capture images, including fluorescent, quartz halogen, LED, metal halide (mercury), and xenon lamps. Used alone, these sources consume a great deal of energy yet yield poor image quality, so they cannot meet the needs of applications beyond traditional industrial uses.
AR/VR, safety systems, and ADAS driver monitoring rely on eye tracking, facial recognition, and gesture control, and ADAS surround-view cameras are increasingly integrated with night vision capabilities. All of these applications require illumination outside the visible spectrum to achieve the desired effect. In the past few years, advances in digital near-infrared (NIR) imaging technology have revolutionized machine vision and night vision capabilities.
Why is NIR a necessity for current machine vision applications?
NIR illuminates objects or scenes outside the visible spectrum, enabling cameras to "see" in low-light or no-light situations beyond the capability of human vision. Although NIR imaging still requires augmentation from low-power LED emitters in some applications, it consumes very little power and is almost unnoticeable to the user. In applications such as AR/VR and driver monitoring systems, these characteristics are essential for accurate eye tracking and gesture control; in security cameras, NIR can monitor intruders without their knowledge.
In addition, more NIR photons than visible-light photons are available under night-time conditions, making NIR an ideal choice for night vision applications. As an example, consider the advantages and disadvantages of two approaches to night vision in ADAS systems.
One automaker uses a passive far-infrared (FIR) system that forms images from the heat an object emits and displays them as a bright negative. Although it can detect objects up to 980 feet away, the images it produces are not sharp, because the system relies solely on emitted heat to record them.
Another automaker uses NIR technology that produces crisp, clear images in the dark, as if the scene were lit by the car's high beams, and it captures images regardless of an object's temperature. However, the maximum effective detection range of this NIR system is 600 feet.
Limitations of NIR
In most cases, NIR is a significant improvement over the alternatives, but using it is not without challenges. The effective range of an NIR imaging system is directly related to its sensitivity. Under the best conditions, current NIR sensor structures offer useful sensitivity only at wavelengths up to about 800 nm; if that sensitivity can be extended to 850 nm and beyond, the effective range can be extended further.
The effective range of NIR optical imaging is determined by two key measurements: quantum efficiency (QE) and modulation transfer function (MTF). The QE of an imager is the ratio of the photons it converts into electrons to the photons it captures.
The higher the QE, the farther the NIR illumination can reach and the brighter the image. A QE of 100% would mean that every captured photon is converted into an electron, yielding the brightest possible image; today, even the best NIR sensor technology achieves a QE of only 58%.
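To make these figures concrete, here is a minimal Python sketch of how QE translates into signal electrons; the per-pixel photon count is an assumed illustrative value, not a figure from the article.

```python
# Minimal sketch: how quantum efficiency (QE) turns incident photons
# into signal electrons. The per-pixel photon count is assumed.

def signal_electrons(incident_photons: float, qe: float) -> float:
    """QE is the fraction of captured photons converted into electrons."""
    return incident_photons * qe

photons = 1000  # photons reaching one pixel during an exposure (assumed)

for qe in (0.58, 1.00):  # today's best NIR sensors vs. the theoretical ideal
    print(f"QE {qe:.0%}: {signal_electrons(photons, qe):.0f} signal electrons")
```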
MTF measures an image sensor's ability to transfer contrast from the object to the image at a given resolution; the higher the MTF, the sharper the image. MTF is degraded by noise from electrons that jump out of their pixels, so to maintain a stable MTF and produce a sharp image, electrons must remain within the pixel where they were generated.
Figure 1. This simulated image shows the clear difference between low and high MTF.
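The contrast transfer that the figure illustrates can be expressed numerically. Below is a minimal sketch of an MTF calculation at a single spatial frequency, using assumed intensity values for the test pattern and its image.

```python
# Minimal sketch: MTF at one spatial frequency as the ratio of image
# contrast to object contrast. Intensity values are assumed.

def modulation(i_max: float, i_min: float) -> float:
    """Contrast (modulation) of a sinusoidal pattern."""
    return (i_max - i_min) / (i_max + i_min)

object_contrast = modulation(i_max=1.0, i_min=0.0)  # ideal test target: 100%
image_contrast = modulation(i_max=0.8, i_min=0.2)   # blurred by the sensor

print(f"MTF at this spatial frequency: {image_contrast / object_contrast:.2f}")
# -> 0.60; a higher value means more contrast survives, i.e. a sharper image
```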
Challenges with existing solutions
As illumination moves beyond the visible spectrum into longer NIR wavelengths, silicon's QE drops, because photons are converted into electrons less efficiently within the crystal. To absorb the same number of photons, thicker silicon is needed, so the traditional way to increase QE has been to use thick silicon: compared with thin silicon, it increases the chance that a photon is absorbed, providing higher QE and a stronger signal.
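A back-of-the-envelope Beer-Lambert estimate shows why thickness matters. The absorption coefficient below (roughly 535 per cm for silicon at 850 nm) is an approximate literature value assumed for illustration, not a figure from the article.

```python
import math

# Approximate absorption coefficient of silicon at 850 nm (~535 per cm,
# i.e. an absorption depth of roughly 19 um) -- an assumed literature value.
ALPHA_850NM_PER_UM = 535 / 1e4  # converted from cm^-1 to um^-1

def absorbed_fraction(thickness_um: float) -> float:
    """Beer-Lambert fraction of photons absorbed in one straight pass."""
    return 1.0 - math.exp(-ALPHA_850NM_PER_UM * thickness_um)

for t in (3, 6, 12, 50, 100):  # silicon thickness in micrometers
    print(f"{t:>4} um silicon: {absorbed_fraction(t):.0%} absorbed")
# Absorption climbs from ~15% at 3 um to ~99% at 100 um, which is why
# thick silicon raises QE -- at the cost of crosstalk in small pixels.
```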
For a single-pixel detector, thick silicon can raise NIR QE above 90%. But when an application demands smaller pixels, pushing silicon thickness up toward 100 μm allows photons to stray into adjacent pixels, creating crosstalk that reduces MTF. The result is an image sensor that is more sensitive to NIR illumination but has lower resolution, producing images that are bright but blurry.
One way to address this problem is to use deep trench isolation (DTI) technology to create a barrier between pixels. While standard DTI has been shown to improve MTF, it can also introduce defects that ruin dark areas of the image, which poses a problem for companies working to improve NIR imaging for machine vision applications.
Technological breakthrough
Recent technological breakthroughs have eliminated the need to rely on thick silicon alone to increase photon absorption. First, an upgraded DTI method uses an advanced 300 mm manufacturing process to form a silicon-oxide barrier between adjacent pixels; the refractive-index step between the oxide and the silicon creates optical confinement within each pixel. Unlike traditional DTI, the upgraded trench is made deeper rather than wider, and keeping the trench narrow helps contain the photons.
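A rough calculation illustrates the confinement effect of that refractive-index step: light striking the silicon/oxide trench wall beyond the critical angle is totally internally reflected back into its own pixel. The index values here are approximate NIR figures assumed for illustration.

```python
import math

# Approximate NIR refractive indices (assumed illustrative values)
N_SILICON = 3.6  # silicon near 850 nm
N_OXIDE = 1.45   # silicon dioxide trench fill

# Beyond this critical angle, light hitting the Si/SiO2 trench wall is
# totally internally reflected and stays inside its own pixel.
critical_angle = math.degrees(math.asin(N_OXIDE / N_SILICON))
print(f"Critical angle at the trench wall: {critical_angle:.1f} deg")  # ~23.7
```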
Second, an absorption structure similar to the pyramid structures used in solar-cell processing is formed on the wafer surface to create a light-scattering layer. Careful implementation of this layer prevents defects from appearing in dark areas of the image and further increases the photons' path length through the silicon. The shape of the structure breaks up and scatters the incoming light so it travels obliquely through the silicon rather than straight up or down; light reflected off the absorption structure then bounces back and forth like a ping-pong ball, increasing its probability of absorption.
Getting the angle of the absorption structure right is crucial to the effectiveness of the light-scattering layer. If the angle is wrong, it can cause photons to reflect to the next pixel instead of returning to the original pixel.
Figure 2. The shape of the absorption structure gives light a longer path inside the silicon, rather than straight up or down.
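To illustrate the path-length effect numerically, the sketch below reuses the approximate 850 nm absorption coefficient from the earlier estimate and applies assumed scattering angles; the angles are hypothetical, chosen only to show the trend.

```python
import math

ALPHA_850NM_PER_UM = 535 / 1e4  # same assumed coefficient as above (850 nm)

def absorbed_fraction(thickness_um: float, theta_deg: float) -> float:
    """Single-pass absorption when light travels at theta from vertical."""
    path_um = thickness_um / math.cos(math.radians(theta_deg))
    return 1.0 - math.exp(-ALPHA_850NM_PER_UM * path_um)

for theta in (0, 30, 55):  # straight down vs. two assumed scattering angles
    print(f"theta = {theta:>2} deg: "
          f"{absorbed_fraction(6, theta):.0%} absorbed in 6 um of silicon")
```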
Conclusion
Working closely with its foundry partners, OmniVision developed Nyxel NIR technology, which addresses the performance problems that have long plagued NIR development. By combining thick silicon with upgraded DTI and managing surface texture with a light-scattering layer, sensors built on Nyxel technology achieve a threefold QE improvement over OmniVision's previous-generation sensors, extending NIR sensitivity to 850 nm without degrading other image-quality metrics.
The results are compelling. Sensors equipped with this technology can detect images at greater distances in extremely low light, deliver higher image quality, and operate with less light input and lower power consumption, meeting the new requirements of advanced machine-vision applications such as AR/VR, ADAS, and night vision.