Mixed Reality Utility Kit: Build Spatial-Aware Experiences In Unity

Mixed Reality Utility Kit Unity Tutorial & MR Demos

Hello everyone! Today, I'd like to share my experience testing and using Meta’s Mixed Reality Utility Kit (MRUK) features over the last few days. First of all, MRUK makes scene understanding incredibly easy to implement; the previous workflow was much more complex and, in my opinion, very confusing for developers. MRUK provides an easy-to-use workflow that can be fully tested without ever pushing your mixed reality games or applications to the headset. I still recommend testing on the headset as well, but it's great to know that MRUK can run by simply hitting Play in the Unity Editor. Ok, let’s go over a few definitions before we go deeper into its features.

What Is Scene Understanding?

In the Meta SDK world, scene understanding refers to our apps' or games' ability to know about the physical world. For instance, if we have a floor, walls, tables, chairs, and other objects, then our experiences can know where those objects are in relation to our virtual space. These physical objects have real-world dimensions, which translate directly to dimensions we can represent in virtual space. This is helpful because we can now integrate real-world objects into our experiences, allowing us to create all kinds of fun mechanics where digital objects interact with physical ones.

How Does Meta Understand the Scene (The Physical World)?

Scene Setup & MRUK demo with scene debugger

Meta did something unique here. Normally, you would imagine this being done through real-time meshing, similar to how Magic Leap 2 handles it using its powerful sensors: Magic Leap 2 generates a mesh of the physical world and stores it in persistent storage so that any experience with the necessary permissions can access that information. But how does Meta handle scene understanding? Well, it's very similar, but with a cool twist at the end.

First, you go through the Scene Setup process, where Meta provides tools to scan your environment. As you walk around the area, a mesh is generated in real-time based on your surroundings. Finally, Meta generates outlines of the entire scene captured by the sensors, allowing you to adjust the outlines as precisely as possible. You also have the option to tag or name your environment (e.g., living room, office, etc.).

Additionally, it's worth mentioning that you can create multiple scene models, which your mixed reality application in Unity can use to interact with the real world.

Why Do We Need MRUK (Mixed Reality Utility Kit)?

MRUK provides a variety of features that can help you, as an XR developer or XR designer, create applications that understand the physical world in a very straightforward manner. While scene understanding is a complex topic, MRUK abstracts much of this complexity by offering a seamless workflow for integrating scene understanding into your applications. I know this doesn’t fully answer the question yet, but let me explain how this workflow works.

MRUK Tools can be found under Packages > Meta MR Utility Kit > Core > Tools.

  1. The Scene Model: this happens at the OS level, as we covered above, but you need a scene model if you intend to test on a physical device. If not, you can skip this step and use the MRUK prefab (described below) in Unity's play mode or with Meta’s XR Simulator.

  2. MRUK Tools: there are two ways to get the Meta MR Utility Kit today: through the Meta XR All-In-One SDK or through the standalone Meta MR Utility Kit package. Which route you take depends on what you’re trying to build, but at a minimum, you need the standalone package.

  3. MRUK Prefab: this is essentially the core of scene understanding in Unity. Simply adding this prefab to the hierarchy allows the system to automatically query scene information. It creates all the game objects with their appropriate dimensions and anchors, all retrieved from the detected scene model. This object also provides several events that are critical for determining the following (a short code sketch after this list shows how to subscribe to them):

    • Scene Loaded Event: this executes when your application successfully loads the scene model. This event is helpful for determining when certain game objects in your scene become active. For instance, if your game relies on scene understanding, this will be key to deciding when to transition from one game mode to another.

    • Room Created Event: this executes when the room is created. It's very important because we need to know if, for instance, the ground is available before placing the player. Otherwise, the player may fall, resulting in a poor experience for users.

    • Room Updated Event: this occurs when the room is updated from the scene capture. If your scene model changes, this event will be triggered.

    • Room Removed Event: called right before the room is removed.

  4. EffectMesh Prefab: this component is required if you want to control some of the rendering for the scene information retrieved. For instance, if you want to assign specific materials or colliders to your walls, tables, etc., then this component is necessary.

  5. FindSpawnPositions Prefab: allows you to find valid random positions (inside the room, outside furniture bounds) for content spawning; the sketch after this list also shows how to do this from script.

  6. RoomGuardian Prefab: allows you to create a protective guardian mesh. Personally, I haven’t used this component before, but I believe it lets you assign a guardian-like material that shows up when you get too close to the guardian area.

  7. SceneDebugger Prefab: a pretty cool scene debugger tool that visualizes different aspects of your scene, such as the closest surfaces, largest tables, scene anchors, and more.

Scene Debugger in action and running with Meta Quest 3 from my latest MRUK YouTube video.
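To make these events more concrete, here's a minimal C# sketch showing how you could subscribe to them from your own script, along with the FindSpawnPositions idea in code form. Fair warning: I'm assuming the MRUK singleton API from recent package versions (MRUK.Instance, RegisterSceneLoadedCallback, the room UnityEvents, GetCurrentRoom, GenerateRandomPositionInRoom, and FloorAnchor), and GameReadyManager is just an illustrative name, so double-check these against the MRUK version you install.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Illustrative sketch: waits for the scene model to load before enabling
// gameplay, and reacts to the room lifecycle events described above.
public class GameReadyManager : MonoBehaviour
{
    [SerializeField] private GameObject gameplayRoot; // kept inactive until the scene loads

    private void Start()
    {
        // Fires once the scene model has been fully loaded.
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);

        // Room lifecycle events (UnityEvent<MRUKRoom> in recent MRUK versions).
        MRUK.Instance.RoomCreatedEvent.AddListener(OnRoomCreated);
        MRUK.Instance.RoomUpdatedEvent.AddListener(room => Debug.Log($"Room updated: {room.name}"));
        MRUK.Instance.RoomRemovedEvent.AddListener(room => Debug.Log($"Room removed: {room.name}"));
    }

    private void OnSceneLoaded()
    {
        // Safe point to transition into game modes that rely on scene understanding.
        gameplayRoot.SetActive(true);

        // Same idea as the FindSpawnPositions prefab, in script form (assumed API):
        // a random position inside the room, away from surfaces and furniture volumes.
        var room = MRUK.Instance.GetCurrentRoom();
        Vector3? spawnPosition = room.GenerateRandomPositionInRoom(0.2f, true);
        if (spawnPosition.HasValue)
        {
            Debug.Log($"Valid spawn position: {spawnPosition.Value}");
        }
    }

    private void OnRoomCreated(MRUKRoom room)
    {
        // Check that a floor exists before placing the player so they don't fall.
        // (FloorAnchor is assumed here; some versions expose labeled anchors instead.)
        if (room.FloorAnchor != null)
        {
            Debug.Log("Floor detected - safe to place the player.");
        }
    }
}
```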

Mixed Reality Utility Kit (MRUK) Unity Tutorial

Well, now that we’ve got some of the concepts down, let me show you how you can start using MRUK in your own Unity project. This will be super straightforward, as Meta provides all the tools we need to get started. Also, if you’re more of a visual person, take a look at my YouTube video where I go through this process as well as a few interesting demos.

MRUK - Running in play mode.

  • Create a new Unity project with the Built-in (Standard) Render Pipeline. I recommend using Unity 2022 LTS (in my case, I used version 2022.3).

    • If you prefer to use URP, that’s completely fine. Just be aware that some of the materials in the Meta MR Utility Kit are not currently compatible with URP, so you may need to tweak a few things to make it work.

  • Install the Meta XR All-In-One SDK or the standalone Meta MR Utility Kit package. You can do this by going to Window > Package Manager > My Assets and searching for them. Make sure you’ve added these assets to your Unity account before trying to search for them.

  • Open the Project Setup Tool by going to Project Settings > Meta XR and apply all the fixes. Also, install XR Plug-in Management and enable Oculus for both the Standalone and Android platforms.

    • Double-check the Project Setup Tool after enabling the Oculus plugins to ensure you don’t have any pending fixes.

  • Go to Meta > Tools > Building Blocks and add the following building blocks:

    • Passthrough

    • Controller Tracking

    • Scene Debugger (Optional)

  • In the Project tab, go to Packages > Meta MR Utility Kit > Core > Tools, then drag and drop the MRUK and EffectMesh prefabs into your hierarchy.

  • Hit Play in Unity, and you should see a test room generated, which is very helpful during development (the sketch below shows a quick way to verify what was loaded).
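If you want to confirm in code what MRUK actually loaded (the test room in play mode, or your real scene model on device), a small logging script like this one helps. Again, this is a sketch under the same assumptions as before: GetCurrentRoom() and the Anchors list come from recent MRUK versions, and RoomLogger is just an illustrative name.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Illustrative helper: logs every anchor MRUK created for the current room.
public class RoomLogger : MonoBehaviour
{
    private void Start()
    {
        MRUK.Instance.RegisterSceneLoadedCallback(() =>
        {
            var room = MRUK.Instance.GetCurrentRoom();
            Debug.Log($"Loaded room '{room.name}' with {room.Anchors.Count} anchors.");

            foreach (var anchor in room.Anchors)
            {
                // Each MRUKAnchor lives on a GameObject placed at the detected
                // surface or volume, so its transform reflects real-world placement.
                Debug.Log($"Anchor: {anchor.name} at {anchor.transform.position}");
            }
        });
    }
}
```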

Well, that’s it for today, everyone. Let me know if you have any questions about MRUK, and I’ll be more than happy to help out. Also, be sure to take a look at the resources below, as those should help as you integrate these tools into your own mixed reality application.

Thank you!

Dilmer
