What role does eye tracking technology play in VR?
[Editor's Note] The author of this article is Huang Tongbing, CEO of Qixin Yiwei.
Although virtual reality has only emerged as an industry in the past two years, it has become increasingly popular among entrepreneurs. Last year alone, hundreds of companies in China announced their entry into the VR field, including Internet giants such as Tencent, LeTV, and iQiyi. Most practitioners believe that 2016 will be the "year one" of VR, marking the start of a period of rapid development.
VR has attracted so much attention mainly because it can simulate real scenes and deliver sensory satisfaction in a virtual world. As people's expectations for the VR experience rise, this sensory experience will come ever closer to real perception. In this process, eye tracking will become an indispensable application module. Pioneers have already begun using eye tracking technology to tackle the VR field's current problems with clarity, immersion, and natural interaction.
A German eye-tracking company demonstrated its gaze-contingent rendering technology at this year's CES, and a domestic eye-tracking company, Qixin Yiwei, recently showed eye tracking running on a VR device in a short video clip. Local (foveated) rendering will bring a qualitative leap to VR.

The problem VR hardware manufacturers currently face is that users' computer hardware cannot keep up with the high-definition rendering their displays demand. Take the Oculus Rift as an example: users need a computer costing more than $1,000 to run it properly. An Nvidia GeForce GTX 970 or AMD Radeon R9 290 graphics card alone costs as much as $300, and that is only for rendering at 1K resolution. For the rendered resolution to match the real world, each eye would have to be rendered at 8K. The hardware requirements alone are enough to give manufacturers a headache.
To solve this problem, the most widely recognized approach at present is to combine local (foveated) rendering with eye tracking.
In human vision, only the foveal region produces a sharp image: it covers just 1°–2° of the visual field yet has the highest visual acuity, while the image in the peripheral field is blurred.
As shown in the figure, when the eye looks at screen H, the whole screen is visible, but only the foveal field of view in area B is sharp; the image in areas A and C is blurred. During image rendering, therefore, only the small foveal region needs to be rendered in full detail, while the peripheral field can be rendered coarsely. As the eye moves, the high-definition rendering region follows the gaze point, which provides a high-definition visual experience while reducing GPU load and thus greatly lowering the hardware requirements of VR devices.
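To make the idea concrete, here is a minimal C++ sketch of how a renderer might pick a resolution scale for each screen region based on its angular distance from the tracked gaze point. The 2-degree foveal and 10-degree near-periphery thresholds, the Gaze struct, and renderScale() are illustrative assumptions, not values or APIs from any particular headset or SDK.

```cpp
// Minimal sketch of gaze-contingent (foveated) rendering region selection.
// Thresholds and structures here are illustrative assumptions only.
#include <cmath>
#include <cstdio>

struct Gaze { double x_deg; double y_deg; };  // gaze direction relative to the view centre

// Return the fraction of full resolution to render at, based on the angular
// distance (eccentricity) between a screen region and the current gaze point.
double renderScale(const Gaze& gaze, double region_x_deg, double region_y_deg) {
    double dx = region_x_deg - gaze.x_deg;
    double dy = region_y_deg - gaze.y_deg;
    double eccentricity = std::sqrt(dx * dx + dy * dy);  // angular offset in degrees

    if (eccentricity <= 2.0)  return 1.0;   // foveal region: full resolution
    if (eccentricity <= 10.0) return 0.5;   // near periphery: half resolution
    return 0.25;                            // far periphery: quarter resolution
}

int main() {
    Gaze gaze{5.0, -3.0};  // example gaze point reported by the eye tracker
    // Sample a few screen regions and print the resolution scale each would get.
    for (double x = -40.0; x <= 40.0; x += 20.0) {
        std::printf("region at (%5.1f deg, 0 deg): render at %.0f%% resolution\n",
                    x, renderScale(gaze, x, 0.0) * 100.0);
    }
    return 0;
}
```

In a real engine the returned scale would drive the resolution of per-region render targets or the shading rate, but the point is the same: only the small region around the gaze point needs full-resolution rendering.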
In addition, local rendering happens to match the imaging characteristics of the human eye: the eye does not need to actively adapt to the screen, which also helps avoid the eye fatigue caused by overuse.
Of course, beyond image rendering, eye tracking can also greatly improve the interactive experience of VR devices. Users can interact with the VR user interface simply by moving their eyes, directly controlling menus and triggering operations, which frees them from unnatural head-based controls.
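As an illustration, the following minimal C++ sketch shows one common way such gaze interaction can be implemented: dwell-time selection, where a menu item triggers once the gaze has rested on it long enough. The 800 ms threshold, the GazeSample and MenuItem structures, and the simulated gaze stream are hypothetical, not taken from any specific eye-tracking SDK.

```cpp
// Minimal sketch of dwell-based gaze selection for a VR menu.
// All names, coordinates, and timings below are illustrative assumptions.
#include <cstdio>
#include <string>
#include <vector>

struct GazeSample { double x, y; double dt_ms; };  // gaze point + time since last sample

struct MenuItem {
    std::string label;
    double x0, y0, x1, y1;     // bounding box in normalized screen coordinates
    double dwell_ms = 0.0;     // accumulated gaze time on this item

    bool contains(double x, double y) const {
        return x >= x0 && x <= x1 && y >= y0 && y <= y1;
    }
};

int main() {
    const double DWELL_THRESHOLD_MS = 800.0;   // gaze must rest this long to trigger
    std::vector<MenuItem> menu = {
        {"Play",     0.1, 0.4, 0.3, 0.6},
        {"Settings", 0.4, 0.4, 0.6, 0.6},
    };
    // Simulated gaze stream: the user looks at "Settings" for roughly one second.
    std::vector<GazeSample> stream(12, {0.5, 0.5, 90.0});

    for (const auto& s : stream) {
        for (auto& item : menu) {
            if (item.contains(s.x, s.y)) {
                item.dwell_ms += s.dt_ms;              // keep accumulating while fixated
                if (item.dwell_ms >= DWELL_THRESHOLD_MS) {
                    std::printf("Triggered: %s\n", item.label.c_str());
                    item.dwell_ms = 0.0;               // reset after activation
                }
            } else {
                item.dwell_ms = 0.0;                   // gaze left the item: reset
            }
        }
    }
    return 0;
}
```

The reset when the gaze leaves an item is what prevents accidental activations while the user merely glances around the interface.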
The importance of eye tracking in the VR field is already obvious. Palmer Luckey, the founder of Oculus, has also said that eye tracking will become an "important part" of VR's future: not only can it enable foveated rendering, it can also provide a kind of depth sensing that helps create better user interfaces.
It is well known that light refracts as it passes through a lens, so the edges of the field of view on current VR displays suffer from distortion and chromatic aberration. Oculus tries to mitigate this through its optical design, but optics alone cannot solve the problem completely, so anti-distortion and chromatic-aberration correction are also needed in software. Some products now use a correction scheme centered on the lens. It works reasonably well, but when the eye is offset from the lens position, the anti-distortion effect weakens. If the anti-distortion processing is combined with eye tracking, so that the correction is centered on the user's gaze rather than on the lens, the correction effect improves considerably.
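The following minimal C++ sketch illustrates the idea with a simple polynomial radial pre-distortion whose center can be either the fixed lens center or the tracked gaze point. The coefficients k1 and k2, the predistort() function, and the sample coordinates are illustrative placeholders, not calibrated values or code from Oculus or any real headset.

```cpp
// Minimal sketch of radial pre-distortion whose centre can follow the gaze
// point instead of the lens centre. Coefficients are placeholder assumptions.
#include <cmath>
#include <cstdio>

struct Point { double x, y; };

// Pre-distort a point so that, after the lens's barrel distortion, it lands
// where intended. 'centre' is either the lens centre or the tracked gaze point.
Point predistort(Point p, Point centre, double k1, double k2) {
    double dx = p.x - centre.x;
    double dy = p.y - centre.y;
    double r2 = dx * dx + dy * dy;
    double scale = 1.0 + k1 * r2 + k2 * r2 * r2;  // simple polynomial radial model
    return { centre.x + dx * scale, centre.y + dy * scale };
}

int main() {
    const double k1 = 0.22, k2 = 0.24;        // placeholder distortion coefficients
    Point lens_centre{0.0, 0.0};
    Point gaze_centre{0.15, -0.10};           // gaze point reported by the eye tracker
    Point sample{0.40, 0.30};                 // a point near the edge of the view

    Point fixed_centre = predistort(sample, lens_centre, k1, k2);
    Point gaze_based   = predistort(sample, gaze_centre, k1, k2);

    std::printf("lens-centred correction: (%.3f, %.3f)\n", fixed_centre.x, fixed_centre.y);
    std::printf("gaze-centred correction: (%.3f, %.3f)\n", gaze_based.x, gaze_based.y);
    return 0;
}
```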
Eye tracking is to VR what the mouse is to Windows: it makes the experience more complete, easier to use, and more readily accepted by users. Although there are still few cases of eye tracking being successfully integrated into VR devices, judging by how rapidly today's VR display solutions are iterating, eye tracking is bound to become an indispensable technical module for VR devices.