Apple unveiled Vision Pro at WWDC 2023 on June 5th. The device ships in early 2024 at $3,499. The developer implications are more immediate than the consumer ones.
The Vision Pro hardware
Vision Pro uses dual micro-OLED displays with 23 million pixels total, eye tracking for UI interaction, hand gestures for input, and a Spatial Audio system. The M2 chip handles computing, and a co-processor called R1 handles sensor input to keep the AR latency below 12 milliseconds, the threshold below which motion sickness becomes a significant risk. The hardware engineering is genuinely impressive and represents years of work Apple could not compress further.
The visionOS SDK
visionOS is a new operating system derived from iOS and macOS. The SDK uses SwiftUI and RealityKit. Existing iPad apps run in a flat floating window by default without modification. Apps built specifically for visionOS can place 3D content anywhere in the user's space, interact with the physical environment's geometry, and anchor virtual objects to real-world surfaces. The developer investment to get from 'iPad app that runs on Vision Pro' to 'visionOS-native experience' is significant.
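As a rough sketch of what that investment looks like, here is a minimal visionOS-native entry point using the SwiftUI and RealityKit APIs from the announced SDK. This is an illustrative sketch, not Apple sample code; the asset name "Globe" is a hypothetical placeholder.

```swift
import SwiftUI
import RealityKit

@main
struct SpatialApp: App {
    var body: some Scene {
        // A volumetric window renders bounded 3D content in the user's
        // space, rather than the flat floating window an unmodified
        // iPad app gets by default.
        WindowGroup {
            // "Globe" is a placeholder model asset for illustration.
            Model3D(named: "Globe")
        }
        .windowStyle(.volumetric)
    }
}
```

Going further, to the anchored, room-aware experiences described above, means adopting RealityKit scenes and immersive spaces, which is where the bulk of the 'visionOS-native' effort sits.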
The developer conference reaction
Reaction at WWDC was divided between awe at the hardware and skepticism about the developer economics. Building a quality visionOS application requires a $3,499 Vision Pro and a substantial time investment, for a device whose market size at launch will be measured in hundreds of thousands of units. The business case requires either an enterprise use case with a clear ROI or a bet that consumer adoption will accelerate.
The enterprise bet
The most immediate business case for Vision Pro is enterprise: design review, surgical planning, training simulations, field service support with spatial overlays. A $3,499 device is inexpensive for a single-user industrial application. The iPad followed a similar path: early enterprise adoption of a device that eventually became a consumer mainstream product.