XR Hands Custom Gestures Now Available!

Hello Everyone And Welcome!

Yesterday, I published a new video about the Unity XR Hands package 1.4.0. This version introduces a new set of components that lets you easily create custom hand gestures. The package also works with the OpenXR Plugin, which is what I ended up using for the video, and I used a Meta Quest 3 to test all of these new features.

BUT how does this process work? Is there a lot of coding involved? Well, let me walk you through some of the main components and workflow to author your own hand gestures.

Custom Gestures “Hand Shape” Scriptable

Fig 1.0 - Differences between Full Curl, Base Curl, and Tip Curl

This will be the main asset you use to set up your custom gesture by providing finger conditions. Each finger condition requires a finger shape, which describes the current state of that finger. Unity provides the following finger shapes:

  • Full Curl: The overall curve of a finger. (A combination of base and tip curl.)

  • Base Curl: The angle between the hand and the base of the finger.

  • Tip Curl: The curve of the outer portions of the finger.

  • Pinch: Whether the finger is in a pinching posture based on how close the tip of the finger is to the tip of the thumb.

  • Spread: The spread between this finger and the next (moving from thumb to little finger).

Credits to Unity for providing the definitions above.

Each of the finger shapes above takes a normalized value from 0 to 1. For instance, let’s say you wanted to know the state of your index finger: if your hand is wide open, your index finger’s Full Curl value will be 0; if you make a fist, the Full Curl value will be 1. Fig 1.0, provided by Unity, gives a great illustration of how these values differ as we look at different gestures.
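To make the idea concrete, here is a minimal sketch in plain C# of how a recognizer might compare a measured finger-shape value against a target with a tolerance. This is not the XR Hands API; the names `desiredValue` and `tolerance` are my own and only mirror the kind of fields you fill in on the Hand Shape asset.

```csharp
using System;

// Hypothetical sketch, NOT the XR Hands API. It only illustrates how a
// normalized finger-shape value (0 = fully open, 1 = fully curled) could
// be matched against a desired value within some tolerance.
public static class FingerShapeMatch
{
    // Returns true when the measured value is close enough to the target.
    public static bool Matches(float measured, float desiredValue, float tolerance)
    {
        return Math.Abs(measured - desiredValue) <= tolerance;
    }
}
```

For example, a fist-like condition on the index finger might require Full Curl near 1: `Matches(0.95f, 1f, 0.15f)` passes, while an open hand (`Matches(0.05f, 1f, 0.15f)`) does not.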

Custom Gestures “Hand Pose” Scriptable

In contrast to the “Hand Shape,” where you tell the gesture recognizer the shape of each finger, the “Hand Pose” specifies the orientation your hand needs to be in. Beyond orientation, you can also tell the system to check different reference directions. For instance, we could require our palm to face toward or away from our head, or use our nose or chin direction as a reference. For more information about this, I recommend reviewing the Unity docs here.

Static Hand Gesture Component

Fig 1.1 - Static Hand Gesture Component

This component is one of the most useful in terms of deciding what to do when a gesture is detected and what to do when it ends. It allows you to bind to the Gesture Performed and Gesture Ended events, which are pretty self-explanatory by their names. This component also allows you to specify a “Minimum Hold Time” as well as a “Gesture Detection Interval.” (Fig 1.1 shows how this looks in the Inspector window.)
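If you prefer to react to those events from your own script rather than only wiring scene objects in the Inspector, a minimal sketch could look like the following. The class and method names here are my own, not part of the package; you would drag these public methods into the component’s Gesture Performed and Gesture Ended event slots.

```csharp
using UnityEngine;

// Minimal sketch: hook these public methods up in the Inspector to the
// Static Hand Gesture component's Gesture Performed / Gesture Ended events.
public class GestureLogger : MonoBehaviour
{
    [SerializeField] private string gestureName = "Vulcan Salute"; // label used for logging

    // Assign to the Gesture Performed event in the Inspector.
    public void OnGesturePerformed()
    {
        Debug.Log($"{gestureName} performed");
    }

    // Assign to the Gesture Ended event in the Inspector.
    public void OnGestureEnded()
    {
        Debug.Log($"{gestureName} ended");
    }
}
```

From here you can replace the log calls with whatever your app needs, such as spawning objects or triggering UI.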

What Is The Workflow In Unity?

I normally include a step-by-step tutorial here, but instead I recommend watching my latest YouTube video, "Unity XR Hands Custom Gestures Tools Are Here," because that video goes in-depth into what you need to create custom gestures and how to quickly test them using the package's powerful UI debug tools.

My Takeaways

Fig 1.2 - Showing All Custom Gestures Available With XR Hands 1.4 or above.

I don’t remember ever using a hand system this simple. Honestly, you can do everything through the Unity editor, and there is hardly any coding involved. I love to code, don’t get me wrong, but having tools that let you quickly develop custom gestures is incredibly helpful. The thing I enjoyed most was the UI custom gesture debug tools, which let you visualize exactly what is happening for each finger. All of the options mentioned previously are displayed in real time, and you can also set up custom gestures that are highlighted as each hand gesture is detected. Enough said, I loved this tool and highly recommend it. Be sure to check it out and start your own project today!

Also, make sure to watch the short video shown in Fig 1.2, which I recorded while testing all the Unity built-in custom gestures, as well as the “Vulcan Salute” and “Gun Pointing” gestures I created. It was so much fun to build and test.

Well, that’s my summary, everyone, and thank you for reading.

Dilmer
