Apple Vision Pro Hardware and Developer Tools - Unity vs. Native!
In today's post, I take a look at Apple's exciting leap into Extended Reality (XR) with the unveiling of the Apple Vision Pro. My goal is to explore its specifications, share my perspective, and provide an in-depth analysis of the developer tools, native as well as Unity, announced at WWDC23. Furthermore, I recommend checking out my latest YouTube video on the Apple Vision Pro, where I go into many of the details discussed here, supplemented with additional video references and resources.
Here are some quick facts about the Apple Vision Pro:
It is scheduled for consumer release in early 2024.
The starting price stands at 3,499 USD.
The visionOS Software Development Kit (SDK) for the Apple Vision Pro is expected to be released by the end of June 2023.
Apple Vision Pro developer kits will be available starting in July 2023; however, Apple hasn't been explicit about the eligibility criteria.
Initial Impressions of the Apple Vision Pro
The Apple Vision Pro's aesthetics are stunning, as is the case with every other Apple product; it takes me back to the days of the original iPhone. It boasts a visually appealing laminated glass exterior, complemented by an aluminum alloy frame, similar to the first iPhone model. The device incorporates a Digital Crown that transitions between Virtual Reality (VR) and Mixed Reality (MR), or what Apple dubs fully immersive (VR) and immersive (MR) experiences.
The inner displays deliver more than 4K resolution per eye, while the outer display shows the wearer's eyes using imagery from the inward-facing cameras. The device is powered by an external battery pack that provides up to 2 hours of use. It runs on an M2 + R1 processor pair: the M2 handles visionOS and computer vision workloads, while the R1 manages input from the cameras, sensors, and microphones and streams images to the displays within 12 ms, which is virtually imperceptible latency.
As for input methods, it offers hand tracking, eye tracking, and voice commands.
Exploring the Apple Vision Developer Tools
Among the developer tools available are SwiftUI, RealityKit, ARKit, and Unity PolySpatial. Apps on the Apple Vision Pro can run with passthrough in a shared space, alongside other apps, or take over an unbounded space for fully immersive experiences.
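To make the bounded-versus-unbounded distinction concrete on the Unity side, here is a minimal sketch of configuring a volume camera. The Unity.PolySpatial namespace, the VolumeCamera component, and its Mode and Dimensions properties reflect my reading of the early PolySpatial beta and should be treated as assumptions; the exact names may change between releases.

```csharp
using Unity.PolySpatial; // PolySpatial beta package (assumed namespace)
using UnityEngine;

// A minimal sketch of volume setup. VolumeCamera, Mode, and Dimensions are
// assumed from the early PolySpatial beta and may be renamed in later versions.
public class VolumeSetup : MonoBehaviour
{
    void Start()
    {
        var volume = gameObject.AddComponent<VolumeCamera>();

        // Bounded: the app renders inside a volume in the shared space,
        // side by side with other running apps.
        volume.Mode = VolumeCamera.PolySpatialVolumeCameraMode.Bounded;
        volume.Dimensions = Vector3.one; // roughly a 1 m cube of content

        // Unbounded would instead request a fully immersive space where
        // the app has exclusive control of rendering and passthrough.
    }
}
```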
In terms of Unity components, all content in the shared space is rendered using RealityKit. Unity materials and shaders therefore need to be translated, a task undertaken by PolySpatial. PolySpatial translates physically based materials directly, and custom materials when they are authored in Unity Shader Graph.
However, PolySpatial does not support everything. For instance, you can use a render texture as an input to Shader Graph shaders, but post-processing isn't supported. Still, PolySpatial supports unlit materials and occlusion effects (for showing passthrough content through objects), along with MeshRenderer and SkinnedMeshRenderer, and it is compatible with both the Universal Render Pipeline (URP) and the Built-in (Standard) pipeline.
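Because every material has to survive this translation, it can pay to audit a project up front. The editor utility below is a minimal sketch of my own (the menu path and the "Shader Graphs/" name-prefix heuristic are illustration choices, not PolySpatial APIs) that flags materials whose shaders are not Shader Graph assets and therefore likely need conversion.

```csharp
using UnityEditor;
using UnityEngine;

// A minimal audit sketch: flags materials that do not use a Shader Graph
// shader. Shader Graph shaders conventionally have names starting with
// "Shader Graphs/"; that prefix check is a heuristic, not a PolySpatial API.
public static class ShaderAudit
{
    [MenuItem("Tools/List Non-ShaderGraph Materials")] // hypothetical menu path
    static void ListNonShaderGraphMaterials()
    {
        foreach (var guid in AssetDatabase.FindAssets("t:Material"))
        {
            var path = AssetDatabase.GUIDToAssetPath(guid);
            var material = AssetDatabase.LoadAssetAtPath<Material>(path);
            if (material != null && material.shader != null &&
                !material.shader.name.StartsWith("Shader Graphs/"))
            {
                Debug.Log($"Needs review: {path} uses {material.shader.name}");
            }
        }
    }
}
```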
The Apple Vision Pro also provides powerful WorldAnchors, which are truly insane. All the complexity we've seen on other platforms, where anchor persistence requires extensive coding, is completely eliminated: Apple automatically persists your anchors, and a real-time map is constructed for you. The system recognizes your location and determines whether any WorldAnchors need to be loaded there. So, what happens if you move away from a known location? Its map and anchors are unloaded, and a map for the new location is constructed, for example, when you transition from your home to the office. As you commute back and forth, one location is unloaded and the other is loaded.
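In Unity terms, this means the anchor workflow shrinks to simply placing anchors and letting the platform persist them. Here is a minimal sketch using AR Foundation's ARAnchor component; the assumption (mine, not confirmed by Unity) is that visionOS WorldAnchors will be surfaced through AR Foundation's anchor API, the way ARKit anchors are on iOS.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// A minimal sketch: pin content to the real world by adding an ARAnchor.
// The assumption is that on visionOS these map to Apple's WorldAnchors,
// which the system persists and reloads per location automatically.
public class AnchorPlacer : MonoBehaviour
{
    [SerializeField] GameObject contentPrefab;

    public void PlaceAt(Pose pose)
    {
        var go = Instantiate(contentPrefab, pose.position, pose.rotation);
        // Adding the component registers the GameObject with the
        // ARAnchorManager; no manual save/load code is required.
        go.AddComponent<ARAnchor>();
    }
}
```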
Simulation Features and Device Interactions
Simulation features like physics, animation and timeline, pathfinding, and MonoBehaviours work as usual. Interactions function well both in Unity Play Mode and on the actual device. Device input covers eyes, hands (via the XR Hands package), and head pose, all surfaced through the new Input System. ARKit features such as plane detection, world meshing, and image markers are only available in unbounded volumes, and only with user permission. Keyboards, controllers, and other supported devices can be accessed through Unity's Input System.
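For hand input specifically, here is a minimal sketch built on the XR Hands package mentioned above; the pinch gesture, the 0.02 m threshold, and the logging are my own illustration choices rather than anything platform-specific.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// A minimal sketch: poll the XR Hands subsystem each frame and detect a
// rough pinch on the right hand. The 0.02 m threshold is an arbitrary
// illustration value, not a platform constant.
public class PinchDetector : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0) m_Subsystem = subsystems[0];
            else return;
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked) return;

        var thumb = hand.GetJoint(XRHandJointID.ThumbTip);
        var index = hand.GetJoint(XRHandJointID.IndexTip);
        if (thumb.TryGetPose(out Pose thumbPose) &&
            index.TryGetPose(out Pose indexPose) &&
            Vector3.Distance(thumbPose.position, indexPose.position) < 0.02f)
        {
            Debug.Log("Pinch detected");
        }
    }
}
```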
Preparing for Apple Vision Pro with Unity
To prepare for the Apple Vision Pro with Unity, consider these recommendations:
Upgrade to Unity 2022 or later
Convert shaders to Shader Graph
Adopt the Universal Render Pipeline (URP)
Move to the Input System package (a minimal migration sketch follows this list)
Rethink your app in terms of volumes
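On the Input System point, migration mostly means replacing polls of the legacy Input class with input actions. Here is a minimal before-and-after sketch; the space-bar "jump" binding is just an arbitrary example.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// A minimal migration sketch from the legacy Input manager to the
// Input System package. The space-bar binding is an arbitrary example.
public class JumpInput : MonoBehaviour
{
    [SerializeField]
    InputAction jumpAction = new InputAction(binding: "<Keyboard>/space");

    void OnEnable()  => jumpAction.Enable();
    void OnDisable() => jumpAction.Disable();

    void Update()
    {
        // Legacy equivalent: if (Input.GetKeyDown(KeyCode.Space)) { ... }
        if (jumpAction.WasPressedThisFrame())
        {
            Debug.Log("Jump!");
        }
    }
}
```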
For more information about Unity's support, visit unity.com/spatial, where you can sign up for PolySpatial beta access.
In summary, the Apple Vision Pro stands as a testament to Apple's commitment to driving technological advancement, showcasing a beautifully designed XR device that integrates the familiar with the revolutionary. With a suite of powerful developer tools, this device is poised to unlock a myriad of immersive experiences, reshaping the landscape of Extended Reality. It's truly an exciting time to witness the convergence of art, technology, and reality!