AR Whack a Mole: Part 1

First part of a multipart series implementing a demo game for the Meta 2 platform. 

Whack-A-Mole Demo

In this series of posts, we are going to explore creating a Unity scene that leverages the Meta 2 SDK to build a "Whack-A-Mole" style game. The objective is a game in which we use our hands to whack moles and score points. The game will exercise the hand tracking, surface reconstruction, and augmented reality features of the Meta 2. In this first post, we will look at hand tracking. 

We're going to assume that you have some basic skills working with the Unity environment. If you are a true beginner, you may want to check out some of Unity's tutorial videos (specifically, the Roll-a-Ball tutorial) before attempting this walkthrough. Beyond that, this post leverages a lot of content from the "HandCubeInteraction" demo in the Meta SDK's examples. This project starts with Meta SDK v2.4, though it is possible that Meta will release a new version in the meantime. 

Starting a new Project

This post is split into a video showing the necessary steps in Unity and the listings below, where we will talk through the source code. 

Prior to starting through the steps in this video, you will want to verify that your Meta 2 HMD works by using the Meta Workspace and trying some of the Unity examples that come with the Meta 2 SDK. The "HandCubeInteraction" demo is particularly pertinent. You will also need a development environment, either Visual Studio or MonoDevelop. 

PS: Sorry for the poor quality - Next video will be better.

MetaHands and HandsProvider

The MetaHands prefab has a script attached to it called HandsProvider. This class provides a mechanism to translate the interop HandData structure into Hand objects. The HandsProvider object has an internally defined class called HandsProvider.Events, which defines the event callbacks that provide information about the hands present in the user's spatial environment. The events we are primarily going to use for this demo are "OnHandEnter" and "OnHandExit". The idea is that the user's hands (or possibly other people's hands) enter and leave the scene on demand; they only show up when the Meta's cameras can see them.  
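As a minimal sketch of wiring up those callbacks (assuming the Meta SDK v2.4 API used throughout this post), subscribing from your own script could look something like this:

```csharp
using UnityEngine;
using Meta;
using Meta.HandInput;

// Minimal sketch: subscribe to HandsProvider's enter/exit events and
// log which hand appeared or disappeared. Assumes a MetaHands prefab
// (and thus a HandsProvider) exists somewhere in the scene.
public class HandLogger : MonoBehaviour {

	void Start () {
		HandsProvider provider = GameObject.FindObjectOfType<HandsProvider>();
		provider.events.OnHandEnter.AddListener(hand => Debug.Log("Hand entered: " + hand.HandType));
		provider.events.OnHandExit.AddListener(hand => Debug.Log("Hand exited: " + hand.HandType));
	}
}
```

The HammersManager listing later in this post uses the same pattern, just with named methods instead of lambdas.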

Note that this code really is just a wrapper around the interop with lower level unmanaged code. It seems that the algorithms that are doing the actual hand detection and tracking are implemented in C++ for performance reasons. 

Models and Scale

In the video, we first try to manifest hammers on the user's hands, but they are way too big. Similarly, when debugging this demo, I first placed the moles way too far away without knowing it. The spatial environment units in Unity seem to be in meters. So, if you want the user's hands to interact with objects while sitting down, you have to consider their nominal reach. I find that less than 0.5 meters is a good starting space, but human arm span and reach are generally going to depend on height. This may need to be a configurable setting in any future application.
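If you want to experiment with that, a configurable reach setting might look something like the sketch below. The MoleReach class and the 0.5 meter default are assumptions for illustration, not part of the Meta SDK:

```csharp
using UnityEngine;

// Hypothetical sketch: clamp spawn positions to a configurable reach
// radius around this object. The 0.5 m default is just a starting
// point for a seated user, not a measured value.
public class MoleReach : MonoBehaviour {

	public float MaxReachMeters = 0.5f;

	// Returns a position no farther from this object than MaxReachMeters.
	public Vector3 ClampToReach(Vector3 desiredPosition)
	{
		Vector3 offset = desiredPosition - this.transform.position;
		return this.transform.position + Vector3.ClampMagnitude(offset, this.MaxReachMeters);
	}
}
```

Exposing MaxReachMeters as a public field lets you tune it per user in the Unity inspector.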

Source Code

Below are listings for the two scripts that were added to the project in the video. 

HammersManager

HammersManager.cs

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Meta;
using Meta.HandInput;


public class HammersManager : MonoBehaviour {

	public Transform HammerPrefab;

	private Dictionary<HandType, Transform> _hammers = new Dictionary<HandType, Transform>();
	private Dictionary<Hand, Transform> _activeHands = new Dictionary<Hand, Transform>();

	void Start () {
		if ( this.HammerPrefab == null)
		{
			throw new System.Exception("Invalid Prefab");
		}

		HandsProvider provider = GameObject.FindObjectOfType<HandsProvider>();
		provider.events.OnHandEnter.AddListener(this.OnHandEnter);
		provider.events.OnHandExit.AddListener(this.OnHandExit);
	}

	void OnHandEnter(Hand hand)
	{
		Transform hammer;
		if (!this._hammers.TryGetValue(hand.HandType, out hammer))
		{
			// Lazily create a hammer for this hand type and cache it.
			hammer = Instantiate(this.HammerPrefab, this.gameObject.transform);
			this._hammers[hand.HandType] = hammer;
		}

		hammer.transform.position = hand.Palm.Position;
		hammer.gameObject.SetActive(true);
		this._activeHands[hand] = hammer; 
	}

	void OnHandExit(Hand hand)
	{
		Transform hammer;
		if (this._activeHands.TryGetValue(hand, out hammer))
		{
			this._activeHands.Remove(hand);
			hammer.gameObject.SetActive(false);
		}
		else
		{
			Debug.Log("Did Not Find Hand - Sync Issue?");
		}
	}


	// Update the hammer positions once per frame, after hand tracking runs.
	void LateUpdate () {
		foreach( var entry in this._activeHands )
		{
			entry.Value.transform.position = entry.Key.Palm.Position;
		}
	}
}

So the hammers manager object has the following duties: 

  • Keep a list of active Hand objects.
  • Keep a cache of hammer objects.
  • Update the position of the hammer objects on every frame based on the position of the active hands in the scene.

The "HammerPrefab" data member allows the user in Unity to set what object gets placed on the user's hands. Get creative - put your own model in your demo.

To keep an active list of hand objects, we use the "OnHandEnter" and "OnHandExit" methods as previously discussed. We also keep a cache of hammer objects keyed by the HandType of the hand, which is basically an enumerated type for the right and left hands. The code as currently written is limited to the single-user case, but it reduces the thrashing of object creation by lazily creating the hammer objects only when they are needed. 

The "LateUpdate" method contains the main active component of this script which is to use the Hand.Palm.Position to set the position of each of the hammer objects.

HammerHit

HammerHit.cs


using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class HammerHit : MonoBehaviour {

	private void OnTriggerEnter(Collider other)
	{
		// CompareTag avoids allocating a string for the tag comparison.
		if ( other.CompareTag("Mole") ) {
			Destroy(other.gameObject);
		}
	}

}

The HammerHit script implements the actual "whacking" of moles. We use the collider on the hammer's head component to detect when we hit objects, and we filter which objects we interact with by checking the gameObject's 'tag' property. In the video, we added the "Mole" tag to the mole game objects so that we delete only the moles. 

Hand Tracking Issues

When testing the appropriately sized hammers in the video, you will notice that hand tracking has some difficulties when the left and right hands overlap. There do seem to be some hand tracking issues with the current release. Additionally, there is some latency in the hand tracking; it is not large, but it is noticeable when moving your hands around. I plan to do some characterization of this latency in the future. Some kind of lag compensation, similar to what is used in networked video games, could potentially address this. Alternatively, we could hope for a better implementation from Meta in the future.
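As a rough sketch of what that compensation could look like, a simple velocity-based extrapolation might help. Everything here, the PalmPredictor class and the latency value, is a hypothetical illustration rather than measured data or part of the Meta SDK:

```csharp
using UnityEngine;

// Hypothetical lag compensation sketch: extrapolate the palm position
// forward by an estimated tracking latency using a one-frame velocity
// estimate. The latency you pass in is a guess until it is actually
// characterized.
public class PalmPredictor {

	private Vector3 _lastPosition;
	private bool _hasLast;

	public Vector3 Predict(Vector3 currentPosition, float deltaTime, float latencySeconds)
	{
		if (!_hasLast || deltaTime <= 0f)
		{
			// First sample: nothing to extrapolate from yet.
			_lastPosition = currentPosition;
			_hasLast = true;
			return currentPosition;
		}

		Vector3 velocity = (currentPosition - _lastPosition) / deltaTime;
		_lastPosition = currentPosition;
		return currentPosition + velocity * latencySeconds;
	}
}
```

A raw one-frame velocity estimate will be noisy, so in practice you would probably want to smooth it (e.g., with a low-pass filter) before extrapolating.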

Another issue I noticed is that the "Hand.Palm.Position" value changes depending on whether the user's hand is open or making a fist. This suggests to me that the lower-level hand tracking algorithm is primarily using the centroid of the detected hand to find the palm position, rather than estimating the actual pose of the hand. It is not a huge issue for this demo, but it is definitely something to watch out for, and this behavior may change in the future.

Wrapping Up 

This is a work in progress. Keep an eye out for the next installment - Part 2: Using Reconstruction for finding the Game Surface.

Package of Unity Assets

Comments on Reddit