XR Interaction Toolkit 2.3 New Features

Hello everyone!

There's some great news from Unity about the XR Interaction Toolkit, and I'd love for you to learn all about it.

The XR Interaction Toolkit (XRI) 2.3 pre-release for Unity was recently published 🎉, and Unity shipped some of its most requested features. I had the opportunity to try them all out, and I'd love for you to check out the breakdown of every feature that I posted to YouTube.

💡 If VR and AR development sounds interesting for 2023, keep reading 👇

What’s the XR Interaction Toolkit ❓

Unity defines it as a high-level, cross-platform, component-based interaction system for creating VR and AR experiences. It provides the tools to make 3D objects and UI components interactable by simply dragging and dropping components.
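To make the "dragging and dropping components" idea concrete, here's a minimal sketch of making an object grabbable from code; the same setup is normally done in the Inspector, and the script name here is hypothetical:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal sketch: make this GameObject grabbable at runtime.
// In a real project you'd usually add these components in the Inspector.
public class MakeGrabbable : MonoBehaviour
{
    void Start()
    {
        // Interactors need a collider to hit.
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();

        // XRGrabInteractable provides hover, select, and grab behavior.
        // It requires a Rigidbody, so Unity adds one automatically if missing.
        gameObject.AddComponent<XRGrabInteractable>();
    }
}
```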

What features are available in XRI 2.3 ❓

  • Poke Interactor

  • Gaze Interactors and Snap Volumes

  • Interaction Groups

  • Device Simulator Usability Improvements

  • Interaction Affordance System

What is a Poke Interactor ❓

It's a component that enables basic poking functionality in VR, driven by either hands or controllers.
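For example, here's a minimal sketch of reacting to a poke: the XR Poke Interactor lives on the hand or controller, and the object being poked carries an interactable plus a collider. The script and field names are just examples:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: a "button" that reacts when an XRPokeInteractor selects it.
public class PokeButton : MonoBehaviour
{
    [SerializeField] XRSimpleInteractable interactable;

    void OnEnable() => interactable.selectEntered.AddListener(OnPoked);
    void OnDisable() => interactable.selectEntered.RemoveListener(OnPoked);

    void OnPoked(SelectEnterEventArgs args)
    {
        Debug.Log($"Poked by {args.interactorObject.transform.name}");
    }
}
```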

What is a Gaze Interactor ❓

It lets you interact with 3D objects or UI components in VR using eye or head gaze, with the help of the XR Gaze Interactor and XR Interactable Snap Volume components.
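The snap volume itself is configured through the XR Interactable Snap Volume component in the Inspector, but as a small sketch you can detect gaze hover in code by checking which interactor triggered the event (script name is hypothetical):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: log when this interactable is hovered by the gaze interactor
// specifically, as opposed to a ray or direct (hand) interactor.
public class GazeHighlight : MonoBehaviour
{
    [SerializeField] XRBaseInteractable interactable;

    void OnEnable() => interactable.hoverEntered.AddListener(OnHoverEntered);
    void OnDisable() => interactable.hoverEntered.RemoveListener(OnHoverEntered);

    void OnHoverEntered(HoverEnterEventArgs args)
    {
        if (args.interactorObject is XRGazeInteractor)
            Debug.Log($"{name} is being gazed at");
    }
}
```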

What are Interaction Groups ❓

Interaction Groups let you combine multiple interactors so that only one member can interact at a time: the highest-priority interactor with a valid target wins, and the others are suppressed.
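Groups are normally set up in the Inspector with the XR Interaction Group component; here's a rough runtime sketch. `AddGroupMember` and `activeInteractor` are assumed from the XRI 2.3 scripting API, so double-check them against your installed version:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: only one member of the group may interact at a time, and members
// are prioritized in the order they are added (direct grab beats ray here).
public class HandInteractionGroup : MonoBehaviour
{
    [SerializeField] XRInteractionGroup group;
    [SerializeField] XRDirectInteractor directInteractor; // higher priority
    [SerializeField] XRRayInteractor rayInteractor;       // fallback

    void Start()
    {
        group.AddGroupMember(directInteractor);
        group.AddGroupMember(rayInteractor);
    }

    void Update()
    {
        // Assumed API: the group exposes whichever member currently "won".
        if (group.activeInteractor != null)
            Debug.Log($"Active: {group.activeInteractor.transform.name}");
    }
}
```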

What is the Device Simulator ❓

Forget about deploying to the device or connecting through Oculus Link: the simulator mocks a real XR device inside the Unity Editor, with the headset and controllers driven by simulated keyboard and mouse input.
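The simulator ships as a prefab in the package samples. A common pattern, sketched below with a hypothetical prefab reference, is to spawn it only when running in the Editor so device builds are unaffected:

```csharp
using UnityEngine;

// Sketch: instantiate the XR Device Simulator prefab in the Editor only.
public class SimulatorLoader : MonoBehaviour
{
    [SerializeField] GameObject xrDeviceSimulatorPrefab; // from the XRI samples

    void Awake()
    {
#if UNITY_EDITOR
        if (xrDeviceSimulatorPrefab != null)
            Instantiate(xrDeviceSimulatorPrefab);
#endif
    }
}
```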

What is the Interaction Affordance System ❓

This is huge for feedback. You can use this system to react to the state of an interaction; for instance, when hovering over an object with a ray, you could change its color or size, play audio, and so on.
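The affordance components themselves (a state provider plus receivers for color, scale, audio, and so on) are wired up in the Inspector rather than in code. As a rough hand-rolled analogue of the hover feedback the system gives you declaratively, here's a sketch built on the standard interactable events:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: tint an object while it's hovered, roughly what an affordance
// color receiver does for you (minus tweening and theming).
public class HoverColorFeedback : MonoBehaviour
{
    [SerializeField] XRBaseInteractable interactable;
    [SerializeField] Renderer targetRenderer;
    [SerializeField] Color hoverColor = Color.cyan;

    Color originalColor;

    void Awake() => originalColor = targetRenderer.material.color;

    void OnEnable()
    {
        interactable.hoverEntered.AddListener(OnHoverEntered);
        interactable.hoverExited.AddListener(OnHoverExited);
    }

    void OnDisable()
    {
        interactable.hoverEntered.RemoveListener(OnHoverEntered);
        interactable.hoverExited.RemoveListener(OnHoverExited);
    }

    void OnHoverEntered(HoverEnterEventArgs args) => targetRenderer.material.color = hoverColor;
    void OnHoverExited(HoverExitEventArgs args) => targetRenderer.material.color = originalColor;
}
```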

Screenshots from all new XRI 2.3 features are shown above…

What are the major components in XRI ❓

  • Cross-platform XR input: Meta Quest (Oculus), OpenXR, Windows Mixed Reality, & more.

  • Basic object hover, select and grab

  • Haptic feedback through XR controllers (see the sketch after this list)

  • Visual feedback

  • Basic canvas UI interaction

  • Utility for interacting with XR Origin, a VR camera rig for handling stationary and room-scale VR experiences
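Here's the haptics sketch mentioned in the list above: buzzing the controller when its interactor grabs something. `SendHapticImpulse` is part of the XRI controller API; the script and field names are examples:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: send a short vibration to the controller on grab.
public class GrabHaptics : MonoBehaviour
{
    [SerializeField] XRBaseControllerInteractor interactor;
    [SerializeField, Range(0f, 1f)] float amplitude = 0.5f;
    [SerializeField] float duration = 0.1f; // seconds

    void OnEnable() => interactor.selectEntered.AddListener(OnSelectEntered);
    void OnDisable() => interactor.selectEntered.RemoveListener(OnSelectEntered);

    void OnSelectEntered(SelectEnterEventArgs args)
    {
        interactor.xrController?.SendHapticImpulse(amplitude, duration);
    }
}
```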

What about AR with XRI ❓

XRI's AR support requires the AR Foundation package (https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.0/manual/index.html). With it installed, you can use the following features provided by XRI:

  • AR gesture system to map screen touches to gesture events

  • AR interactables that can place virtual objects in the real world (see the sketch after this list)

  • AR gesture interactor and interactables to translate gestures such as place, select, translate, rotate, and scale into object manipulation

  • AR annotations to inform users about AR objects placed in the real world
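As a sketch of the placement flow, you can listen for the moment a virtual object lands in the world. The member names (`objectPlaced`, `placementObject`) are assumed from the XRI AR scripting API, so verify them against your installed version:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.AR;

// Sketch: log whenever the AR placement interactable spawns its prefab
// in response to a tap gesture. Requires AR Foundation to be installed.
public class PlacementLogger : MonoBehaviour
{
    [SerializeField] ARPlacementInteractable placementInteractable;

    void OnEnable() => placementInteractable.objectPlaced.AddListener(OnObjectPlaced);
    void OnDisable() => placementInteractable.objectPlaced.RemoveListener(OnObjectPlaced);

    void OnObjectPlaced(ARObjectPlacementEventArgs args)
    {
        Debug.Log($"Placed {args.placementObject.name} in the real world");
    }
}
```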


📢 To learn more about XRI and XR development, consider subscribing to my XR YouTube channel.

Thanks everyone and let me know if posts like this are helpful.
