How Apple Vision Pro’s Infrared Eye-Tracking Technology Works

The Apple Vision Pro is many things, but above all it is an incredible piece of technology. Whether people love it, hate it, or, most commonly, simply don’t want to spend $3,500 on one, nearly everyone agrees that Apple pulled off some remarkable technical achievements with the headset.

One of the most impressive parts of the Vision Pro is its eye-tracking system. It pairs tiny LED illuminators inside each eyepiece, which project invisible patterns of infrared light onto the wearer’s eyes, with sophisticated infrared cameras that rapidly and accurately track where the user is looking. That gaze data, combined with hand gestures captured by external cameras on the bottom of the headset, lets the wearer control the operating system and the entire user interface.
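Apple has not published its algorithm, but the classic technique for this kind of tracking is pupil center corneal reflection (PCCR): the fixed IR illuminators create bright reflections (“glints”) on the cornea, and the offset between the pupil center and those glints shifts as the eye rotates. Below is a minimal, hypothetical Swift sketch of that idea; every type, name, and number is an illustrative assumption, not Apple’s implementation.

```swift
import Foundation

/// A 2D point in the eye camera’s image plane.
struct Point2D {
    var x: Double
    var y: Double
}

/// Features extracted from one infrared eye image: the detected pupil
/// center and the corneal reflections (“glints”) of the IR illuminators.
struct EyeFrame {
    let pupilCenter: Point2D
    let glints: [Point2D]  // one reflection per visible illuminator
}

/// Estimates a gaze offset from the pupil center relative to the glint
/// centroid. Because the illuminators are fixed to the headset, the glint
/// centroid is a stable reference that cancels small headset shifts.
func estimateGaze(from frame: EyeFrame) -> (dx: Double, dy: Double)? {
    guard !frame.glints.isEmpty else { return nil }
    let cx = frame.glints.map(\.x).reduce(0, +) / Double(frame.glints.count)
    let cy = frame.glints.map(\.y).reduce(0, +) / Double(frame.glints.count)
    // A real system maps this offset to a gaze direction using a
    // per-user calibration; that step is omitted here.
    return (frame.pupilCenter.x - cx, frame.pupilCenter.y - cy)
}

// Example: eight glints (one per assumed illuminator) ringed around a pupil.
let glints = (0..<8).map { i -> Point2D in
    let angle = Double(i) * .pi / 4
    return Point2D(x: 100 + 20 * cos(angle), y: 100 + 20 * sin(angle))
}
let frame = EyeFrame(pupilCenter: Point2D(x: 104, y: 98), glints: glints)
if let gaze = estimateGaze(from: frame) {
    print(String(format: "Gaze offset: (%.1f, %.1f)", gaze.dx, gaze.dy))
    // Prints: Gaze offset: (4.0, -2.0)
}
```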

The fine folks at The Slow Mo Guys have taken a peek at the Vision Pro’s infrared cameras and lights using a high-speed camera without the infrared-blocking filter that most cameras (and human eyes) have. Not all interchangeable-lens digital cameras block infrared light, after all.

The Apple Vision Pro also has infrared illuminators on the front of the headset, which provide vital data about the user’s surroundings. On the inside, there appear to be at least eight infrared illuminators per eye, or at least 16 in total. Adding the three on the front brings the headset’s total to at least 19 infrared lights.
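Spelled out, the arithmetic behind that count looks like this; the per-eye figure is The Slow Mo Guys’ observed minimum rather than an official Apple specification, so these are lower bounds.

```swift
let illuminatorsPerEye = 8   // observed minimum, per The Slow Mo Guys
let eyes = 2
let frontIlluminators = 3
let total = illuminatorsPerEye * eyes + frontIlluminators
print(total)  // 19, i.e. at least 19 IR lights in all
```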

How the Apple Vision Pro Uses Infrared Light

Apple famously plays its cards close to its chest. However, it does disclose that the Vision Pro has two high-resolution main cameras, six world-facing tracking cameras on the outside of the headset, and four eye-tracking cameras on the inside. Those four internal cameras capture and track the infrared light projected by the 16 or more internal IR illuminators.
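Those disclosed counts can be summarized in a small data model. The structure below is purely illustrative (Apple exposes no such API); only the counts and roles come from Apple’s own description.

```swift
enum CameraRole: String {
    case mainColor = "high-resolution main camera"
    case worldTracking = "world-facing tracking camera"
    case eyeTracking = "infrared eye-tracking camera"
}

struct CameraGroup {
    let role: CameraRole
    let count: Int
    let facesOutward: Bool
}

// Counts as disclosed by Apple; the model itself is an assumption.
let visionProCameras: [CameraGroup] = [
    CameraGroup(role: .mainColor, count: 2, facesOutward: true),
    CameraGroup(role: .worldTracking, count: 6, facesOutward: true),
    CameraGroup(role: .eyeTracking, count: 4, facesOutward: false),
]

for group in visionProCameras {
    let direction = group.facesOutward ? "outward-facing" : "inward-facing"
    print("\(group.count)x \(group.role.rawValue) (\(direction))")
}
```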

Apple is no stranger to infrared light and sensors. The company’s Face ID technology, which unlocks many iPhone and iPad devices, relies on an infrared sensor. Apple devices also use an infrared sensor to measure ambient light and adjust screen brightness. Further, the LiDAR scanner used for low-light autofocus and depth measurement operates at infrared wavelengths.

Image credits: Apple
