OpenXR With Magic Leap 2 - Unity Setup, Plane Detection, And Gaze Features!
In today's post, I'd like to talk a bit about the OpenXR tools now available for Magic Leap 2. And YES, you heard that right, Magic Leap is now migrating from their custom MLSDK solution to the amazing cross-platform and standardized open framework called OpenXR. BUT what can you expect from such a move if you're already using their existing SDK tools? Do you need to move right away? Well, let's cover that in the next section.
From MLSDK To Unity OpenXR
I didn't see an exact deadline on their website for migrating to OpenXR, but the fact that they describe MLSDK as being "phased out" means you should probably start planning your migration strategy. One thing I liked is that they provide "MLSDK Interoperability," meaning MLSDK should keep working alongside OpenXR, except in the following two areas:
Graphics-related features such as Global/Segmented Dimming and Headlocked mode.
Subsystem-related logic, including Input and Meshing.
So in my mind, you could begin by migrating the two areas mentioned above but leave everything else running on MLSDK, then plan each remaining feature's move to OpenXR as your team's schedule allows.
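As a rough sketch of that hybrid approach, you could route a call through OpenXR when the corresponding feature is present and enabled, and fall back to the legacy MLSDK API otherwise. This is just an illustration using the global dimmer; the `UnityEngine.XR.OpenXR.Features.MagicLeapSupport` namespace is my assumption of where the Magic Leap feature classes live, so check it against your installed package:

```csharp
using UnityEngine;
using UnityEngine.XR.MagicLeap;                          // legacy MLSDK namespace
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features.MagicLeapSupport;   // assumed namespace for ML2 OpenXR features

public class DimmerBridge : MonoBehaviour
{
    // Routes global-dimmer calls through OpenXR when the rendering
    // feature is available; otherwise falls back to legacy MLSDK.
    public void SetGlobalDimmer(float value)
    {
        var renderFeature =
            OpenXRSettings.Instance.GetFeature<MagicLeapRenderingExtensionsFeature>();

        if (renderFeature != null && renderFeature.enabled)
        {
            renderFeature.globalDimmerValue = value;     // OpenXR path
        }
        else
        {
            MLGlobalDimmer.SetValue(value);              // MLSDK fallback
        }
    }
}
```

A bridge like this lets you ship while only part of the codebase has moved over, then delete the fallback branch once the migration is complete.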
MLSDK To OpenXR API Examples
Most of the implementation is very similar between MLSDK and OpenXR. You may need to refactor your code, add the new OpenXR plugin, and plan for a lot of testing, but at a glance the API differences are not far off. Here are a couple of examples:
```csharp
// FOR THE GLOBAL DIMMER

// Legacy MLSDK API
MLGlobalDimmer.SetValue(0.6f);

// New OpenXR API
var renderFeature = OpenXRSettings.Instance.GetFeature<MagicLeapRenderingExtensionsFeature>();
renderFeature.globalDimmerValue = 0.6f;
```
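For context, here's how that OpenXR dimmer call might sit inside a component that eases the dimmer toward a target value each frame. This is a sketch assuming the rendering feature is enabled in your OpenXR project settings and that the feature classes live under `UnityEngine.XR.OpenXR.Features.MagicLeapSupport`:

```csharp
using UnityEngine;
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features.MagicLeapSupport;   // assumed namespace

public class GlobalDimmerFader : MonoBehaviour
{
    [Range(0f, 1f)] public float targetDim = 0.6f;
    public float fadeSpeed = 0.5f;                       // dimmer units per second

    private MagicLeapRenderingExtensionsFeature renderFeature;

    void Start()
    {
        renderFeature = OpenXRSettings.Instance
            .GetFeature<MagicLeapRenderingExtensionsFeature>();
    }

    void Update()
    {
        if (renderFeature == null) return;

        // Ease the global dimmer toward the target each frame.
        renderFeature.globalDimmerValue = Mathf.MoveTowards(
            renderFeature.globalDimmerValue, targetDim,
            fadeSpeed * Time.deltaTime);
    }
}
```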
```csharp
// FOR USER CALIBRATION

// Legacy MLSDK API (note: each call has its own State type,
// so use distinct out variables)
MLHeadsetFit.GetState(out MLHeadsetFit.State headsetFitState);
MLEyeCalibration.GetState(out MLEyeCalibration.State eyeCalibrationState);

// New OpenXR API
userCalibrationFeature = OpenXRSettings.Instance.GetFeature<MagicLeapUserCalibrationFeature>();
userCalibrationFeature.GetLastHeadsetFit(out var headsetFitData);
userCalibrationFeature.GetLastEyeCalibration(out var eyeCalibrationData);
```
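And as a quick usage sketch, you could poll those OpenXR calibration calls once per frame and log the results. I'm only using the calls shown above; the exact fields inside the returned data structs depend on your package version, so I just log the structs directly, and the namespace is again my assumption:

```csharp
using UnityEngine;
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features.MagicLeapSupport;   // assumed namespace

public class CalibrationLogger : MonoBehaviour
{
    private MagicLeapUserCalibrationFeature userCalibrationFeature;

    void Start()
    {
        userCalibrationFeature = OpenXRSettings.Instance
            .GetFeature<MagicLeapUserCalibrationFeature>();
    }

    void Update()
    {
        if (userCalibrationFeature == null || !userCalibrationFeature.enabled)
            return;

        // Fetch the latest headset fit and eye calibration data.
        userCalibrationFeature.GetLastHeadsetFit(out var headsetFitData);
        userCalibrationFeature.GetLastEyeCalibration(out var eyeCalibrationData);

        Debug.Log($"Headset fit: {headsetFitData}, eye calibration: {eyeCalibrationData}");
    }
}
```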
So for the most part, it looks like OpenXR is just more descriptive. Also, if you are interested in learning more about MLSDK to OpenXR API changes, take a look at this document from Magic Leap.
Magic Leap 2 OpenXR Tutorials With Unity
I've been working on a new video series covering all the features available with Magic Leap 2 and Unity OpenXR. So far, I've detailed the following areas, which I recommend watching as you work on new applications or migrate existing ML2 applications to OpenXR:
Video Link: OpenXR With Magic Leap 2 NOW Available - Unity Setup & Plane Detection!
GitHub Demo Project: https://github.com/dilmerv/MagicLeapPlaneDetection
Video Link: Diving Into Unity OpenXR ML2 Gaze Features - Eye Tracking!
GitHub Demo Project: https://github.com/dilmerv/MagicLeapEyeTracking
In addition, I've created a GitHub repository for each video listed above, and every project is ready for you to simply hit Build And Run in Unity to deploy it straight to your device without any changes. These also serve as good references during your development process.
Additional ML2 OpenXR Resources
I found these resources to be of great help when learning why ML2 was moving to OpenXR. Honestly, I had a feeling why, but I really recommend reading through these documents to understand why this is a great move and not just take my word for it.
ML2 OpenXR Overview Docs
ML2 OpenXR With Unity Docs
OpenXR Specification Docs (this one is huge, and it may seem scary at first, but honestly, take some time to go through some of its sections. For instance, when I was learning about Gaze/Eye Tracking, it was invaluable to explore this OpenXR section, which taught me a lot about why Unity made certain coding architectural decisions)
Well, to wrap up this post, I had a ton of fun testing OpenXR with ML2. Personally, I've used Unity OpenXR before with AR Foundation and the XR Interaction Toolkit, so I pretty much felt at home. I think that's the goal: to use a set of tools that work well not just with one headset but with a variety of them, which is why moving to OpenXR makes a lot of sense to me.
I hope you find the ML2 resources mentioned helpful. I will be more than happy to answer any questions in the comments below as you work with these tools.
Thanks, everyone, and happy XR coding!
Dilmer