Meta Haptics Studio and Haptics SDK: Full Walkthrough

Today, I would like to share my experience using some of the new haptic tools released by Meta as part of their 60.0.0 release. These new tools allow you to easily design, test, and integrate highly detailed haptic vibration patterns into apps or games created with Unity or Unreal. This all sounds great, but how do you go about integrating haptics? What is needed to implement them into existing or new experiences? Well, let me first introduce you to some of the new tools, and then I'll explain how straightforward it is to make this work.

Also, consider watching my full YouTube video about using Meta Haptics; trust me, it will be worth your time ;)

Meta Haptics Studio

Think of Meta Haptics Studio as the central design tool you will use to configure haptics. Here, you import all your audio files (.wav, .ogg, .mp3), and the tool analyzes and generates a graph resulting in vibration patterns based on each imported sound. These vibration patterns, when tested, closely replicate what you would expect to feel in a real-life experience, but in this case from the Meta controllers.

Before these new tools, your options were limited to generating a constant vibration. You could specify how long the controller would vibrate and at what intensity, but that's where it ended. With these new tools, you now have access to a multitude of parameters within the Quest 2, Quest 3, or Quest Pro controller motors, letting you shape vibrations with Amplitude, Frequency, and Emphasis.
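For context, here is roughly what that old, constant-vibration approach looked like. This is only a minimal sketch using OVRInput.SetControllerVibration from the Meta XR Core SDK; the class name, values, and timing are illustrative, not taken from any shipping project:

using System.Collections;
using UnityEngine;

// Minimal sketch of the pre-Haptics-SDK approach: a constant vibration via OVRInput.
// The duration, frequency, and amplitude values below are illustrative.
public class LegacyRumbleExample : MonoBehaviour
{
    public IEnumerator RumbleFor(float seconds)
    {
        // One frequency and one amplitude, nothing else to shape.
        OVRInput.SetControllerVibration(0.5f, 0.8f, OVRInput.Controller.RTouch);
        yield return new WaitForSeconds(seconds);

        // Stop the vibration by zeroing both values.
        OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
    }
}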

A few helpful definitions of each of these parameters (also called envelopes), as per Meta’s documentation:

  • Amplitude: It allows you to modulate the amount of vibration or force that the motor is creating as it moves.

  • Frequency: It allows you to control the speed at which the haptic motor vibrates.

  • Emphasis: An emphasis point is a short, momentary haptic sensation.

Download the Meta Haptics Studio for Windows or Mac from here.

Meta Haptics Companion App

This is an application that runs locally on your Meta device and communicates directly with Meta Haptics Studio over your computer's IP address. It allows you to easily test any of the audio files you've imported into Meta Haptics Studio to ensure they produce the right level of vibration for your experience. This is where these tools shine. For example, if you prefer a higher amplitude on a specific sound or want to add emphasis points at specific timeframes, you can use Meta Haptics Studio to apply those changes and the companion app to test them. Finally, when you're satisfied with the results, you can export your haptics and import them into your preferred game engine. I will provide more details about the exporting process in the “Meta Haptics SDK Unity Tutorial” section below.

Download the Meta Haptics Companion App from here.

Click on the image above to take a peek at the Meta Haptics Companion App running during my last video (highlighted in red)

Meta Haptics SDK

So far, we've talked about the tools for designing and testing your generated haptics, but we haven't discussed how to actually implement them with Unity or Unreal. In my case, I will explain how it works in Unity since that's where my experience lies. However, feel free to refer to the documentation if you'd like more information about the Meta Haptics SDK for Unreal. In short, the Haptics SDK is what enables you to play HapticClips from within Unity using C#.
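At its core, the API is small: a .haptic file exported from Meta Haptics Studio becomes a HapticClip asset in Unity, you wrap it in a HapticClipPlayer, and you call Play. Here is a minimal sketch, assuming a clip has been assigned in the Inspector (the class and field names are just examples):

using Oculus.Haptics;
using UnityEngine;

// Minimal sketch: play an assigned HapticClip once on both controllers.
public class HapticClipExample : MonoBehaviour
{
    [SerializeField] private HapticClip clip; // a .haptic asset exported from Meta Haptics Studio

    private HapticClipPlayer player;

    private void Start()
    {
        player = new HapticClipPlayer(clip);
        player.Play(Controller.Both);
    }

    private void OnDestroy()
    {
        // HapticClipPlayer wraps native resources, so release it when done.
        player?.Dispose();
    }
}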

To test the Haptics SDK, I cloned Meta's The World Beyond mixed reality experience and saw it as a great opportunity to take these tools from start to finish. At the time of writing, that repository has not integrated the Haptics SDK, so I decided to integrate it myself. Interestingly enough, it took me more time to upgrade the project to the latest Meta integration tools than to integrate the Haptics SDK. It's funny, but it speaks to how easy this SDK is to implement, which is great news.

To expand on this, I gathered all the audio files under the MultiToy audio folder, created a new Meta Haptics Studio project called "TheWorldBeyond," adjusted some of the generated haptic analysis for each audio file, and then exported everything as .haptic files into the Unity Resources folder. The next step was simply to play the haptic files, so I created a HapticsManager script (included at the end of this post) that lets me interact with .haptic files through C# in Unity.

Meta Haptics SDK Unity Tutorial

I highly recommend watching my YouTube video titled "Meta Haptics Studio and Haptics SDK: Full Walkthrough NOW Available." In this video, I provide a comprehensive demonstration of the entire process and discuss all these tools in detail. However, if you prefer a step-by-step written tutorial, please continue reading.

Okay, to begin integrating the Haptics SDK with a minimal setup, follow the steps below:

Package Manager - Meta XR Haptics SDK

  • Your Meta Device OS must have firmware version v47 or greater

  • You will need Unity 2021.3 or greater

    • Make sure the modules “Android Build Support” with “OpenJDK” and “Android SDK & NDK Tools” are also added during the installation.

  • Create a new Unity project and import this com.meta.xr.sdk.haptics tarball by going to Unity > Window > Package Manager, then click the + button and select “Add package from tarball”.

  • In the Package Manager, click on “Meta XR Haptics SDK”, go to the Samples tab, and click “Import” to import the “Meta Haptics Minimal Sample” (see Fig 1.0)

  • Now open the “HapticsSampleScene” scene, which provides a demo showing how to play .haptic clips generated with Meta Haptics Studio

  • Add this scene to your File > Build Settings and deploy it to your Quest 2, Quest 3, or Quest Pro device. You can also run it with Oculus Link if you like, and everything should work just as it would in a deployed build. Once that works, see the sketch after this list for wiring a clip to controller input.
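As a next step beyond the sample scene, here is a minimal sketch that plays a HapticClip whenever the right index trigger is pressed. It assumes the Meta XR Core SDK (which provides OVRInput) is also in the project, and the class and field names are illustrative:

using Oculus.Haptics;
using UnityEngine;

// Minimal sketch: play a haptic clip each time the right index trigger is pressed.
// Assumes OVRInput (Meta XR Core SDK) is available in the project.
public class TriggerHapticsExample : MonoBehaviour
{
    [SerializeField] private HapticClip triggerClip; // assign a .haptic asset in the Inspector

    private HapticClipPlayer player;

    private void Start()
    {
        player = new HapticClipPlayer(triggerClip);
    }

    private void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
        {
            player.Play(Controller.Right);
        }
    }

    private void OnDestroy()
    {
        player?.Dispose();
    }
}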

Wrapping this up…

Well, that's a summary of the new Haptics tools recently released by Meta. In short, this was so easy to use that I felt guilty for not adding more complexity here, though I suppose that's a good thing. Many people might consider Haptics something to address towards the end of their project, but I recommend prioritizing Haptics and including it in the initial conversations when designing your experience. Haptics, especially with the Haptics SDK, adds a significant level of immersion. The vibrations generated with these tools added a considerable amount of polish when I integrated them into The World Beyond.

If you have questions, feel free to drop them below. I'm curious to know if you've gone through this process or if you're considering adding the Haptics SDK.

Meta Haptics SDK - Sample Script

using Oculus.Haptics;
using UnityEngine;

public class HapticsManager : MonoBehaviour
{
    public static HapticsManager Instance;

    [SerializeField] private HapticClip clip1;
    [SerializeField] private HapticClip clip2;
    [SerializeField] private HapticClip clip3;

    private HapticClipPlayer player;

    private void Awake()
    {
        // Basic singleton: keep a single instance alive across scene loads
        // and discard any duplicates.
        if (Instance != null && Instance != this)
        {
            Destroy(gameObject);
            return;
        }

        Instance = this;
        DontDestroyOnLoad(gameObject);
        player = new HapticClipPlayer(clip1);
    }

    public void PlayClip1()
    {
        player.clip = clip1;
        player.Play(Controller.Both);
        Debug.Log($"Haptics Played: {clip1.name}");
    }

    public void PlayClip2()
    {
        player.clip = clip2;
        player.Play(Controller.Right);
        Debug.Log($"Haptics Played: {clip2.name}");
    }

    public void PlayClip3()
    {
        player.clip = clip3;
        player.Play(Controller.Right);
        Debug.Log($"Haptics Played: {clip3.name}");
    }

    // This method is for sample purposes only
    // I recommend caching the .haptic files on start and not
    // doing a Resources.Load every time you play a HapticClip
    public void PlayWithClipName(string clipName,
        Controller controller = Controller.Right)
    {
        string hapticFile = $"Haptics/{clipName}";
        var hapticsClip = Resources.Load<HapticClip>(hapticFile);
        if (hapticsClip)
        {
            player.clip = hapticsClip;
            player.Play(controller);
            Debug.Log($"Haptics Played: {hapticsClip.name}");
        }
    }

    private void OnDestroy()
    {
        // The player may be null if this was a destroyed duplicate instance.
        player?.Dispose();
    }

    private void OnApplicationQuit()
    {
        // Release the Haptics SDK's native runtime when the app shuts down.
        Haptics.Instance.Dispose();
    }
}
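For completeness, here is a hypothetical example of how another script could trigger haptics through this manager, for instance when the MultiToy fires. The method and clip names below are illustrative and not taken from The World Beyond's code:

using Oculus.Haptics;
using UnityEngine;

// Hypothetical caller: routes a gameplay event through the HapticsManager singleton.
public class MultiToyHapticsExample : MonoBehaviour
{
    public void OnToyFired()
    {
        // Play a clip that was assigned in the Inspector...
        HapticsManager.Instance.PlayClip1();

        // ...or load one by name from Resources/Haptics (clip name is illustrative).
        HapticsManager.Instance.PlayWithClipName("MultiToy_Shoot", Controller.Left);
    }
}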