Getting Started with Apple’s visionOS Development: RealityView Attachments, Systems, and Components!

Today, I am very excited to bring you video #2 and a post as part of my visionOS development series, where I walk you through the process of adding RealityView attachments, systems, custom components, and more. The primary focus for today is gaining a deep understanding of attachments, which play a crucial role when adding UI interactions that affect 3D models. This concept applies to both volumetric windows and immersive spaces, whether they are fully immersive or mixed.

To provide more context, let me list all the topics covered today before we delve into the structure of a RealityView, including all the available functions.

📚 visionOS SDK topics covered:

  • The structure of RealityViews, including the `update`, `placeholder`, and `attachments` closures.

  • Adding Swift extension functions to provide image-based lights and image-based light receivers to loaded entities.

  • Introducing an immersive space of the mixed type, in addition to the existing volumetric window and full immersive space created in the previous video.

  • Implementing an Orbit System and Orbit Component for achieving circular movement applied to the SpaceX capsule.

  • Utilizing the Preview tag for expediting development during System and Component testing.

RealityView Structure

Per Apple’s definition, a RealityView is “A SwiftUI view for displaying RealityKit content on visionOS.” This means we can have 2D or 3D content that floats in space, whether in a volumetric window, a full immersive space (VR), or a mixed immersive space (Mixed Reality). RealityViews also let us add interactions (gestures) as well as UI that we can provide to the user to execute different actions. Let me give you an example using the SpaceX Falcon 9 from the project I demonstrated on YouTube as part of my visionOS video series.
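Before looking at the full example, here is a stripped-down sketch of a RealityView's overall shape, just to show where each closure fits. The view and attachment names here are placeholders of my own, not part of the demo project:

```swift
import SwiftUI
import RealityKit

struct MinimalRealityArea: View {
    var body: some View {
        RealityView { content, attachments in
            // make: runs once, asynchronously; load entities here
            // and add them to the scene with content.add(_:).
        } update: { content, attachments in
            // update: runs whenever SwiftUI state this view reads changes.
        } placeholder: {
            // placeholder: shown while the make closure is still loading.
            ProgressView()
        } attachments: {
            // attachments: declares SwiftUI views that can be pulled
            // into the 3D scene via attachments.entity(for:).
            Attachment(id: "label") {
                Text("Hello, RealityKit")
            }
        }
    }
}
```

The key idea is that attachments are declared as ordinary SwiftUI views here, then fetched by ID inside the make or update closures and positioned like any other entity.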

In the code block shown next, we have a `CapsuleRealityArea.swift` view, and in this view, we implement a `RealityView` inside the view's `body`. Here's a breakdown of the steps involved:

  1. Pull an entity called "Scene" from the `realityKitContentBundle`, which is a resource added to this project's resources.

  2. If the resource is found, simply call `content.add` to add the entity.

  3. Set lighting settings.

  4. Set an `OrbitComponent`, which moves the capsule in circular increments.

  5. An important part to note is that `attachments.entity(for:)` pulls an attachment by its ID, `attachmentID`, which is defined at the top of the view.

  6. If the attachment is found, position it to the left of the capsule and rotate it around the x-axis by -0.5 radians.

  7. Note that nowhere in the `RealityView` initialization do we have a view for the attachment. Instead, this happens in the `attachments` function call, which you can find near the end of the code.

  8. In this section, we create an instance of `Attachment` by passing the ID defined at the top.

  9. Additionally, `CapsuleDetails` is instantiated, which is essentially another SwiftUI View.

  10. In `CapsuleDetails`, options are added for the user to turn the light on and off, toggle the orbit, and display information about the SpaceX Falcon 9.

import SwiftUI
import RealityKit
import RealityKitContent

struct CapsuleRealityArea: View {
    
    @Environment(ViewModel.self) private var model
    @State private var capsule: Entity?
    let attachmentID = "attachmentID"
    
    var body: some View {
        RealityView { content, attachments in
            guard let entity = try? await Entity(named: "Scene", in: realityKitContentBundle) else {
                fatalError("Unable to load scene model")
            }
            content.add(entity)
            
            self.capsule = entity
            self.capsule?.setSunlight(intensity: 13)
            self.capsule?.components.set(OrbitComponent(radius: 0.02, speed: 0, addOrientationRotation: true))
            
            if let sceneAttachment = attachments.entity(for: attachmentID) {
                sceneAttachment.position = SIMD3<Float>(-0.2, -0.1, 0.1)
                sceneAttachment.transform.rotation = simd_quatf(angle: -0.5, axis: SIMD3<Float>(1,0,0))
                content.add(sceneAttachment)
            }
        } update: { content, attachments in
            print("RealityView changes detected...")
            
        } placeholder: {
            ProgressView()
                .progressViewStyle(.circular)
                .controlSize(.large)
        } attachments: {
            Attachment(id: attachmentID){
                CapsuleDetails {
                    self.capsule?.setSunlight(intensity: 13)
                } turnOffLight: {
                    self.capsule?.setSunlight(intensity: 6)
                } turnOnOrbit: {
                    self.capsule?.components[OrbitComponent.self]?.speed = 1
                } turnOffOrbit: {
                    self.capsule?.components[OrbitComponent.self]?.speed = 0
                }
            }
        }
        .onDisappear {
            model.isShowingRocketCapsule = false
        }
    }
}

#Preview {
    CapsuleRealityArea()
        .environment(ViewModel())
}

Swift Extensions

In the example project I've created, I followed Apple's recommendation to add an extension on Entity. This means that even types we didn't implement ourselves can be extended. If you know me by now, you know how much I love C#; in C#, we can achieve something similar with extension methods. The beauty of this approach is that we keep the code clean by attaching specific functionality to classes we didn't implement, in contrast to creating a static method available in the global scope to everyone, which is not an elegant solution. Extensions, on the other hand, are scoped to the extended type. Let's examine the example I've prepared for today and take a look at the comments.

import RealityKit

// When defining an extension, simply use the "extension" keyword followed by the
// type you'd like to extend. For instance: extension [Type]
// To use it, simply access the func defined as setSunlight from
// any instance of type Entity.
extension Entity {
    func setSunlight(intensity: Float?) {
        if let intensity {
            Task {
                guard let resource = try? await EnvironmentResource(named: "Sunlight") else { return }
                var iblComponent = ImageBasedLightComponent(
                    source: .single(resource),
                    intensityExponent: intensity)
                
                components.set(iblComponent)
                components.set(ImageBasedLightReceiverComponent(imageBasedLight: self))
            }
        } else {
            components.remove(ImageBasedLightComponent.self)
            components.remove(ImageBasedLightReceiverComponent.self)
        }
    }
}

Immersive Spaces And Volumetric Windows

So, a common question I often receive is, "Dilmer, how can we create a Mixed Reality experience for the Apple Vision Pro? Or, how do we go about crafting a Virtual Reality experience for the Apple Vision Pro? And, what exactly is a volumetric window?" I understand that these terms can be confusing, especially because we often use phrases like "Virtual Reality" versus "Augmented Reality" versus "Mixed Reality," and the terminology used by Apple doesn't align precisely with the XR industry's terms.

Well, I'm here to help you gain a better understanding of these concepts. Allow me to illustrate with a straightforward example. Please be sure to refer to the comments and images below for clarification.

// This creates a volumetric window
// which means we're creating a window that can render 2D/3D content.
WindowGroup(id: model.capsuleRealityAreaId){
    CapsuleRealityArea()
        .environment(model)
}
.windowStyle(.volumetric)
.defaultSize(width: 0.65, height: 0.3, depth: 0.3, in: .meters)

// This creates a full immersive space
// which means we're creating a full virtual reality experience.
ImmersiveSpace(id: model.fullRocketRealityAreaId){
    FullRocketRealityArea()
        .environment(model)
}
.immersionStyle(selection: .constant(.full), in: .full)

// This creates a mixed immersive space
// which means we're creating a mixed reality experience.
ImmersiveSpace(id: model.mixedRocketRealityAreaId){
    FullRocketRealityArea()
        .environment(model)
}
.immersionStyle(selection: .constant(.mixed), in: .mixed)
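Defining these scenes only declares them; to actually enter or leave an immersive space at runtime, you use the environment actions SwiftUI provides. Here is a minimal sketch; the literal id string is a placeholder standing in for `model.fullRocketRealityAreaId` from the example above:

```swift
import SwiftUI

struct SpaceControlsView: View {
    // Environment actions for opening and dismissing immersive spaces.
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace

    var body: some View {
        VStack(spacing: 12) {
            Button("Enter Full Space") {
                Task {
                    // The id must match the one passed to ImmersiveSpace(id:).
                    // The call returns a result we can inspect for failures.
                    switch await openImmersiveSpace(id: "fullRocketRealityArea") {
                    case .opened: break
                    default: print("Unable to open the immersive space")
                    }
                }
            }
            Button("Exit Space") {
                Task { await dismissImmersiveSpace() }
            }
        }
    }
}
```

Note that only one immersive space can be open at a time, which is why the demo dismisses the current space before opening another.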

RealityKit Systems, Components, and Entities

Before delving into these topics, let's first take a look at how Apple defines these core concepts:

“A System contains code that RealityKit calls on every frame to implement a specific type of entity behavior or to update a particular type of entity state. Systems use components to store their entity-specific state and query for entities to act on by looking for ones with a specific component or combination of components.”

“Components are modular building blocks that you add to an entity; they identify which entities a system will act on, and maintain the per-entity state that systems rely on. Components can contain logic, but limit component logic to code that validates its property values or sets its initial state.”

“Entities are the core actors of RealityKit. Any object that you can put into a scene, whether visible or not, is an entity and must be a descendant of Entity. Entities can be 3D models, shape primitives, lights, or even invisible items like sound emitters or trigger volumes.”

In my video series, I use the Capsule (SpaceX Falcon 9) as one entity and the Rocket as another. Each entity can contain one or many different items, such as 3D models, shapes, lights, etc., and in our code we can pull practically anything from a Reality Composer Pro scene into an entity, depending on your needs. Let's look at an example of each of these concepts based on my latest demo.

import SwiftUI
import RealityKit
  
/// Component: to store Orbit information for our entities.
struct OrbitComponent: Component {
    var radius: Float
    var speed: Float
    var angle: Float
    var addOrientationRotation : Bool
    init(radius: Float = 2.0, speed: Float = 1.0, angle: Float = 0, addOrientationRotation: Bool = false) {
        self.radius = radius
        self.speed = speed
        self.angle = angle
        self.addOrientationRotation = addOrientationRotation
    }
}

/// A system that orbits and rotates entities that have an OrbitComponent.
struct OrbitSystem: System {
    static let query = EntityQuery(where: .has(OrbitComponent.self))
    public init(scene: RealityKit.Scene) {}
    func setOrientation(context: SceneUpdateContext, entity: Entity, component: OrbitComponent){
        entity.setOrientation(.init(angle: component.speed * Float(context.deltaTime), axis: [0, 1, 0]), relativeTo: entity)
    }
    
    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            if var component: OrbitComponent = entity.components[OrbitComponent.self] {
                if component.radius == 0 {
                    setOrientation(context: context, entity: entity, component: component)
                } else {
                    if component.addOrientationRotation {
                        setOrientation(context: context, entity: entity, component: component)
                    }
                    
                    // Calculate the position on the circle
                    let x = component.radius * cos(component.angle)
                    let z = component.radius * sin(component.angle)
                    
                    // Update the entity's position
                    entity.transform.translation = SIMD3(x, 0, z)
                    
                    // Increment the angle based on time and speed
                    component.angle += component.speed * Float(context.deltaTime)
                    
                    // Write the updated component back to the entity
                    entity.components.set(component)
                }
            }
            else { continue }
        }
    }
}
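One step worth calling out, since it isn't shown above: custom components and systems must be registered with RealityKit before the scene will pick them up, via `registerComponent()` and `registerSystem()`. A minimal sketch of doing this once at launch; the app name here is hypothetical, not from the demo project:

```swift
import SwiftUI
import RealityKit

@main
struct RocketDemoApp: App { // hypothetical app name
    init() {
        // Register the custom component and system once at launch so
        // RealityKit can query for entities with an OrbitComponent
        // and call OrbitSystem.update(context:) every frame.
        OrbitComponent.registerComponent()
        OrbitSystem.registerSystem()
    }

    var body: some Scene {
        WindowGroup {
            CapsuleRealityArea() // from the earlier example
                .environment(ViewModel())
        }
        .windowStyle(.volumetric)
    }
}
```

Without registration, entities can still store the component, but the system's update loop never runs.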
Next, the `FullRocketRealityArea` view loads the immersive scene, plays ambient audio, and applies the `OrbitComponent` to the rocket entity:

import SwiftUI
import RealityKit
import RealityKitContent

struct FullRocketRealityArea: View {
    
    @State private var audioController: AudioPlaybackController?
    
    var body: some View {
        RealityView { content in
            // An Immersive entity which is part of my RealityKit content bundle
            guard let entity = try? await Entity(named: "Immersive", in: realityKitContentBundle) else {
                fatalError("Unable to load immersive model")
            }
            
            let ambientAudioEntity = entity.findEntity(named: "AmbientAudio")
            
            guard let resource = try? await AudioFileResource(named: "/Root/Space_wav", from: "Immersive.usda", in: realityKitContentBundle) else {
                fatalError("Unable to load space.wav audio resource")
            }
            
            ambientAudioEntity?.ambientAudio?.gain = -30
            audioController = ambientAudioEntity?.prepareAudio(resource)
            audioController?.play()
            
            
            // find rocket and apply orbit component
            if let rocket = entity.findEntity(named: "Rocket"){
                rocket.components.set(OrbitComponent(radius: 0.02, speed: 0.5, addOrientationRotation: true))
            }
            
            content.add(entity)
        }
        .onDisappear(perform: {
            audioController?.stop()
        })
    }
}

#Preview {
    FullRocketRealityArea()
}

📲 visionOS tutorial project requirements 👇

  • Xcode Version 15 beta 8 or greater

  • visionOS Version 1 beta 3 Simulator Runtime or greater

  • Reality Composer Pro Version 1.0 (393.3) or greater (which is bundled with Xcode)

  • Source code shown today can also be found on this GitHub Repo

Well, that wraps up what I've recently learned with the visionOS SDK. If you have any questions about any of these concepts, please feel free to leave a comment below, and I'll be more than happy to assist you.

Thanks

Dilmer
