Apple announces Apple Vision Pro, a spatial computing device

At today's WWDC 23 keynote, Apple introduced the Apple Vision Pro, its first new product category in several years, which the company bills as a revolutionary spatial computing device (in essence, a VR headset). It is equipped with two micro-OLED displays offering wide color gamut and HDR support. On the optics side, Apple says the Vision Pro uses custom catadioptric lenses, which combine refraction and reflection, and users who need vision correction can add optional ZEISS optical inserts.

The Apple Vision Pro uses a unique dual-chip design: the M2 chip handles general computing performance, while the new R1 chip processes input from 12 cameras, 5 sensors, and 6 microphones. Apple says the R1 can stream new images to the displays within 12 milliseconds. As for battery life, the Vision Pro supports all-day use when plugged into external power, and up to two hours of use on the external battery pack.

The Spatial Audio experience praised on AirPods also comes to the Apple Vision Pro: the audio pods on either side of the headset each house dual drivers to deliver personalized spatial audio.

The Apple Vision Pro also places great emphasis on interaction between the wearer and people nearby. The EyeSight feature helps users stay connected with those around them: when someone approaches, the device's front display shows the wearer's eyes, while the wearer in turn sees the approaching person through the headset's view. This is made possible by a high-performance eye-tracking system, in which a ring of LEDs projects invisible light patterns onto the user's eyes and high-speed cameras read them, enabling responsive, intuitive interaction.

The Apple Vision Pro starts at $3,499 and will go on sale in the United States early next year, with availability in additional countries and markets following later that year.