Apple has been working on ways to let "Apple Glass" style headsets perform chroma keying, that is, replacing solid-color backgrounds with Apple AR imagery. An Apple U.S. patent application titled "Low Latency Chroma Keying Embedded Head-Mounted Display for Mixed Reality" was filed in February 2020 but was only revealed this week. We are all familiar by now with replacing the backgrounds of offices and homes with other images, such as logos or beach scenes. We are also familiar with the fact that these replacements rarely work well, especially when the speaker moves.
That's partly because this kind of background replacement takes a lot of processing power, even when the results aren't especially good. Before the July 2020 update, Microsoft Skype for iOS couldn't even blur its background the way the Mac and PC versions could. Blurring is a complicated process, and part of it involves determining where a person's head ends and the background begins. Chroma keying makes this very simple because the person is placed in front of a single, solid-color background. It's the same green-screen or blue-screen process used in movies.
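To make that concrete, here is a minimal sketch of the color-range matte that chroma keying relies on, written in Python with NumPy. The function name, key color, and tolerance are illustrative assumptions, not details from Apple's filing.

```python
import numpy as np

def chroma_key(frame, background, key_rgb=(0, 177, 64), tol=60.0):
    """Composite `background` wherever `frame` is close to the key color.

    frame and background are HxWx3 uint8 arrays of the same shape.
    key_rgb (a typical green-screen green) and tol are illustrative
    values, not numbers taken from Apple's patent application.
    """
    diff = frame.astype(np.float32) - np.array(key_rgb, dtype=np.float32)
    dist = np.sqrt((diff ** 2).sum(axis=-1))   # per-pixel distance from the key color
    mask = dist < tol                          # True where the solid backdrop shows through
    out = frame.copy()
    out[mask] = background[mask]               # drop the replacement image into those pixels
    return out
```

Because the backdrop is a single known color, segmentation reduces to that per-pixel distance test, which is why chroma keying is so much cheaper than estimating where a head ends and a room begins.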
Systems like Zoom or Skype have an easier time replacing backgrounds because it's obvious which part of a video call is the person speaking and which part is their background. Those are 2D systems, though, and the goal with Apple Glass is to do it in 3D, so the background can be replaced as the wearer moves around rather than just looks at a flat screen. As a result, Apple's patent application is both about producing 3D stereoscopic background replacements for Apple AR and about making them work fast enough. Part of that has to do with headset processing power, but it's also about how quickly the processed imagery can be returned to the wearer's field of view. Apple says this requires a wearable immersive head-mounted display (HMD) that captures the environment with embedded stereo RGB cameras, performs real-time color keying, and includes a display system that shows real-world images augmented with virtual content. The system runs at a high frame rate (typically 75 frames per second or more), and by capturing images and formatting them for display within the headset itself, it achieves latency of less than one frame. At 75 fps, a frame lasts roughly 13 milliseconds, so the entire capture-to-display path has to fit inside that window.
So what Apple's proposal describes is a headset with a camera sensor that performs chroma keying in a mixed reality environment, and that does so with minimal latency for the wearer. The low latency comes from embedding the processing in the headset itself: formatting the camera image, detecting the selected color range, and compositing the result with the virtual content. The patent application does not address where the replacement image comes from; in a Zoom or Skype meeting, that would be chosen by the presenter themselves.
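As a rough illustration of that sub-frame budget, the sketch below strings those three in-headset steps into a per-eye loop and checks the result against a 75 fps frame time. It reuses the chroma_key() sketch above; the function names, the pass-through formatting step, and the timing check are assumptions for illustration, since the filing describes a sequence of operations rather than an API.

```python
import time

FRAME_RATE = 75                    # the filing cites 75 frames per second or more
FRAME_BUDGET_S = 1.0 / FRAME_RATE  # ~13.3 ms: "less than one frame" of latency

def format_for_display(raw):
    # Stand-in for the sensor-to-display formatting done inside the headset.
    return raw

def process_frame(raw_left, raw_right, virtual_left, virtual_right):
    """One headset frame: format, key, and composite each eye's camera image.

    All names here are hypothetical; chroma_key() is the sketch shown earlier.
    """
    start = time.perf_counter()
    eyes = []
    for raw, virtual in ((raw_left, virtual_left), (raw_right, virtual_right)):
        formatted = format_for_display(raw)
        eyes.append(chroma_key(formatted, virtual))   # virtual content fills the keyed pixels
    elapsed = time.perf_counter() - start
    if elapsed > FRAME_BUDGET_S:                      # everything must fit inside one frame
        print(f"missed the frame budget by {elapsed - FRAME_BUDGET_S:.4f} s")
    return eyes
```

Keeping these steps on the headset, rather than round-tripping frames to a separate device, is what the application credits for keeping the whole loop inside that window.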