iPhone X 3D Camera (TrueDepth) Teardown Analysis
Source: MEMS.
While imaging industry experts know that Apple has designed a complex 3D camera for its iPhone X, called "TrueDepth," most of the details of the 3D system inside the module, including chips, components and even the substrate, remain a deep, dark secret.
EE Times spoke with Yole, which this week completed a teardown of the TrueDepth module in the Apple iPhone X with its partner System Plus Consulting. They inferred that silicon-on-insulator (SOI) wafers are being used for near-infrared (NIR) image sensors. They noted that SOI played a key role in improving the sensitivity of NIR image sensors developed by STMicroelectronics to meet Apple’s stringent requirements.
Pierre Cambou, head of Yole’s imaging and sensors business, said the SOI-based near-infrared image sensor is “a very interesting milestone for SOI.”
Many companies in the so-called “Imaging Valley” near Grenoble, France, use SOI wafers developed by Soitec, initially for back-side-illuminated (BSI) image sensors. Meanwhile, research on SOI for near-infrared image sensors dates back to 2005, according to Cambou.
But Cambou pointed out that Apple's adoption of STMicroelectronics' near-infrared image sensors marks the beginning of volume production of image sensors on SOI. "Image sensors are characterized by large die areas, dictated by the physics of light. Therefore, it is a pretty good market for substrate suppliers like Soitec."
Meanwhile, Jean-Christophe Eloy, chairman and CEO of Yole, told EE Times that in designing TrueDepth, “Apple took the best of both worlds from STMicroelectronics and ams.” Apple used STMicroelectronics’ leading near-infrared image sensors along with light sources from ams. Eloy noted that ams “excels in complex optical modules.” Earlier this year, ams acquired Heptagon, known for its optical packaging based on time-of-flight (ToF) technology.
Cost Analysis of Apple iPhone X 3D Camera (TrueDepth)
Apple has integrated a 3D camera, TrueDepth, on the front of the iPhone X to recognize the user’s face and unlock the phone. As Yole previously explained, to achieve this, Apple combines a ToF ranging sensor with an infrared “structured light” camera that can use uniform “flood” or “dot pattern” lighting.
The working principle of the 3D camera is very different from that of a normal CMOS image sensor taking photos. First, the iPhone X pairs an infrared camera with a flood illuminator that projects uniform infrared light in front of the phone. An image is then captured, and the facial-recognition algorithm is triggered accordingly.
However, this facial-recognition feature is not always running. The ToF ranging sensor connected to the infrared camera signals the camera to take a photo when a face comes into range. The iPhone X then activates its dot projector and captures a dot-pattern image. The flood image and the dot-pattern image are then sent to the application processing unit (APU), whose trained neural network recognizes the phone's user and unlocks the phone.
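The staged capture flow described above can be sketched in pseudocode. This is only an illustration of the sequencing, assuming hypothetical device objects (`proximity_sensor`, `flood_ir`, and so on); none of these names are Apple APIs, and the real pipeline is proprietary.

```python
def face_unlock_sequence(proximity_sensor, flood_ir, dot_projector,
                         ir_camera, neural_net):
    """Staged capture flow: ToF wake -> flood frame -> dot frame -> match.

    All arguments are hypothetical device/driver objects used only to
    illustrate the ordering described in the article.
    """
    # 1. The ToF ranging sensor wakes the pipeline only when something
    #    is close enough to the phone, so the cameras are not always on.
    if not proximity_sensor.object_in_range():
        return False

    # 2. Capture a flood-illuminated IR frame for 2D face detection.
    flood_ir.on()
    flood_frame = ir_camera.capture()
    flood_ir.off()
    if not neural_net.detect_face(flood_frame):
        return False

    # 3. Only then project the dot pattern; this frame carries the
    #    3D (depth) information.
    dot_projector.on()
    dot_frame = ir_camera.capture()
    dot_projector.off()

    # 4. The APU's neural network matches both frames against the
    #    enrolled user.
    return neural_net.match_user(flood_frame, dot_frame)
```

The key point the sketch captures is the power-saving cascade: the cheap ToF proximity check gates the flood capture, which in turn gates the more expensive dot-pattern capture and matching.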
Yole's Cambou noted that no computation is currently performed to generate 3D images; the 3D information is contained in the dot-pattern image. "To run 3D applications, the same APU can use another algorithm that calculates the depth map of the image." He added: "Structured-light methods are known to be computationally intensive, and the iPhone X takes full advantage of the processing power of the A11 chip. The use of neural networks is the key technology that makes this possible."
The teardown analysis by Yole and System Plus Consulting found a "complex combination of five submodules" in Apple's 3D camera (TrueDepth). They are: near-infrared camera, ToF ranging sensor + infrared flood illuminator, RGB camera, dot matrix projector and color/ambient light sensor.
As shown in the figure below, the infrared camera, RGB camera, flood illuminator, and dot projector are all aligned.
(Figure: Apple iPhone X 3D camera (TrueDepth) teardown)
At the heart of the 3D camera (TrueDepth) in Apple’s iPhone X is an STMicroelectronics near-infrared image sensor. Yole and System Plus found “the use of silicon-on-insulator (SOI) on top of deep trench isolation (DTI)” inside STMicroelectronics’ near-infrared image sensor.
The concept behind DTI is well known. As cameras push toward higher sensor resolution, more pixels are packed into the same space, and charge leaking between adjacent pixels causes interference (noise), color errors, or pixelation in the captured image. DTI prevents this leakage between photodiodes: trenches are reportedly etched between each pixel and then filled with insulating material to block the flow of electricity.
So, in addition to using DTI, why does Apple use SOI wafers for near-infrared image sensors?
From an optical point of view, Cambou explained that SOI wafers are advantageous because the insulating layer functions like a mirror. “Infrared light penetrates deeper and is reflected back to the active layer,” he noted.
Cambou pointed out that from an electrical perspective, SOI improves sensitivity in the near infrared, mainly because it is very good at reducing leakage within the pixel. The improved sensitivity provides good image contrast.
Contrast is important, Cambou explained, because "structured-light operation is easily disturbed by sunlight."
Of course, a regular CMOS image sensor or near-infrared sensor "is happy to have extra light if the goal is to have a better image," Cambou said. But light is a problem when a user tries to unlock an iPhone X in bright sunlight.
“The problem is the contrast of the near-infrared projection point versus the ambient light from the sun or any other source,” Cambou said. “But the sun is usually the biggest problem.” That is why Apple's move to improve near-infrared contrast by adopting SOI wafers is so crucial.
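The sunlight problem Cambou describes is easy to see with a back-of-envelope contrast calculation. A region hit by a projected dot collects dot plus ambient photons, while the surrounding background collects only ambient photons; Michelson contrast then measures how well the dot stands out. The photon counts below are purely illustrative, not measured values.

```python
def michelson_contrast(dot_signal, ambient):
    """Michelson contrast of a projected NIR dot against its background.

    The bright (dot) region sees dot_signal + ambient, the dark region
    only ambient, so:
        C = (I_max - I_min) / (I_max + I_min)
          = dot_signal / (dot_signal + 2 * ambient)
    """
    bright = dot_signal + ambient
    dark = ambient
    return (bright - dark) / (bright + dark)

# Illustrative numbers: indoors there is little ambient NIR, so the
# dots stand out; in direct sunlight the same dot signal nearly
# drowns in background light.
indoor = michelson_contrast(100, 10)      # ~0.83
outdoor = michelson_contrast(100, 2000)   # ~0.02
```

Improving sensor sensitivity in the near infrared (as SOI does) raises the usable dot signal relative to the noise floor, which is what keeps the dot pattern detectable even when ambient sunlight crushes the raw contrast.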
When asked whether STMicroelectronics’ near-infrared image sensors use FD-SOI or conventional SOI wafers, Cambou said the research firm cannot tell at this time.
Near-infrared image sensor in the Apple iPhone X 3D camera (TrueDepth)
As for the NIR image sensor, do we know whether Apple uses 850nm or 940nm wavelength NIR?
Cambou noted that “we cannot say for sure which one.” However, he speculated that “Apple is most likely using 850nm like others (e.g. Intel’s RealSense, Facebook, HTC, etc.), but STMicroelectronics is known for developing 940nm SPAD photon detectors, so it is possible that they intend to move to this wavelength in the future.”
When asked about the unexpected findings of the teardown, Cambou cited the size of STMicroelectronics' near-infrared image sensor. It measures 25 mm² and has only 1.4 million pixels, owing to the large 2.8 µm pixel pitch. Cambou pointed out: "Nevertheless, in this category, this pixel is considered 'small' compared to competitors, who usually use 3.0 µm to 5.0 µm."
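The reported figures are easy to sanity-check: 1.4 million pixels at a 2.8 µm pitch account for roughly 11 mm² of active array. The remaining portion of the 25 mm² die would be readout and peripheral circuitry; that split is our inference, not something the teardown states.

```python
# Back-of-envelope check of the reported sensor figures.
pixels = 1.4e6          # 1.4 Mpixel array
pitch_mm = 2.8e-3       # 2.8 um pixel pitch, in mm

array_area_mm2 = pixels * pitch_mm ** 2  # ~10.98 mm^2

# The die measures ~25 mm^2, so the pixel array is well under half
# of it; the rest is presumably readout and peripheral circuitry.
die_area_mm2 = 25.0
array_fraction = array_area_mm2 / die_area_mm2  # ~0.44
```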
Yole positioned the iPhone X as the beginning of a new era of 3D imaging.
Cambou also believes that Apple is building a future for near-infrared image sensors. Pointing to the acquisition of InVisage Technologies announced last week, he said: "I think Apple wants InVisage to provide near-infrared image sensor capabilities, although there are several ways to interpret this acquisition."
Cambou doesn’t believe InVisage will be able to match STMicroelectronics’ products in terms of performance, but it could offer a solution for miniaturization. “Thus, facial recognition technology could be shrunk down to other products, such as augmented reality (AR) headsets,” he said.
Apple's iPhone X is creating huge business opportunities for SOI wafer manufacturers such as Soitec. Equally important, it has triggered a meaningful comeback for STMicroelectronics, which Cambou believes will become a player in the emerging ToF camera market.
Of course, the semiconductor business is often subject to brief boom-and-bust cycles. But STMicroelectronics, which saw its business shrink dramatically after losing Nokia to rivals in the mobile phone market, “has made a very graceful transition,” Cambou observed.
STMicroelectronics is building out different image-sensor applications, moving from CMOS image sensors to future near-infrared image sensors and SPAD sensors, while leveraging its assets and internally developed foundational technologies.
This is the 1465th issue of content shared by "Semiconductor Industry Observer".