Posted 1 Apr 2017
Licensed under CPOL

2. HoloToolkit Unity - Gaze and Airtap

Augmented Reality and the HoloLens are finally a reality, and demand for skilled developers will soon emerge.

Introduction

This article will show you how to use functionality from the HoloToolkit to trigger your own actions, either when you start or stop gazing at a hologram, or when you gaze at it while performing the airtap gesture.

Background

The Holographic Academy is a great learning resource. Unfortunately, the videos there are based on the old (pre-2017) version of the HoloToolkit. The newer version has moved towards exposing functionality through interfaces rather than sending messages to active objects.

Creating apps for the HoloLens should be fun and simple. Simplicity is very often the best approach to solving a problem. I will therefore try to illustrate how to utilize functionality from the 2017 HoloToolkit as simply as possible.

Gaze - To Have Or Not To Have Focus, That's The Question...

I will continue from where I left off in my former article about setting all this up and having it play... So open the Test_HoloToolkit project, open Scene_1 and save it as Scene_2 in your own _CP\Scenes folder. While you are at it, you might want to create a Scripts folder in your _CP folder (just to keep things organized :-)).

In your Scripts folder, create a new C# script and name it CubeGazeActions.cs. Drag the script onto your Cube, then double-click it to edit it in Visual Studio.

If you are like me, code syntax is not always that easy to remember correctly, so the thing called IntelliSense is a great feature. For the Toolkit to be able to provide it for you, the first thing to do is to add a new using statement to your code:

using HoloToolkit.Unity.InputModule;

As I mentioned, the Toolkit functionality is quite interface based, so if you put a comma after the MonoBehaviour you inherit from in your class definition and press I, IntelliSense should come into play and show you the IFocusable interface defined in the Toolkit.

public class CubeGazeActions : MonoBehaviour, IFocusable

You will still have to deal with a red squiggly line beneath IFocusable, so hover over it with your cursor, click on your small yellow lightbulb friend when it pops up and choose Implement interface. Visual Studio works its magic and provides you with two brand new methods to play with, namely OnFocusEnter and OnFocusExit.

public void OnFocusEnter()
{
    throw new NotImplementedException();
}

public void OnFocusExit()
{
    throw new NotImplementedException();
}

Another great feature in Visual Studio is the ability to divide code into regions. Please feel free to surround these methods with a region. You could name the region IFocusable, and you will see from the code example beneath how it is done. This can greatly improve the readability of your bigger scripts.

Now you might want to implement some functionality like this:

using HoloToolkit.Unity.InputModule;
using UnityEngine;

public class CubeGazeActions : MonoBehaviour, IFocusable
{
    private MeshRenderer meshRenderer;
    private Color defaultColor;

    void Start()
    {
        // Cache the renderer once instead of calling GetComponent on every focus change.
        meshRenderer = gameObject.GetComponent<MeshRenderer>();
        defaultColor = meshRenderer.material.color;
    }

    #region IFocusable
    public void OnFocusEnter()
    {
        meshRenderer.material.color = Color.green;
    }

    public void OnFocusExit()
    {
        meshRenderer.material.color = defaultColor;
    }
    #endregion IFocusable
}

This simply makes your Cube go green when you look at it (place the Cursor on it). When you gaze away, the Cube becomes its own boring mouse-grey self again. This might seem like wizardry on a very simple level, but think of all the things you could accomplish, evil as well as good, with your own imagination...
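Changing the colour is only one possibility; any property of the GameObject can react to focus in the same way. As a small sketch of the same pattern, here is a hypothetical CubeGazeScale script (the class name is my own, not something from the Toolkit) that grows the Cube by 20% while it has focus:

using HoloToolkit.Unity.InputModule;
using UnityEngine;

// Illustrative variation: scale the object up while it is gazed at.
public class CubeGazeScale : MonoBehaviour, IFocusable
{
    private Vector3 defaultScale;

    void Start()
    {
        // Remember the original size so we can restore it on focus exit.
        defaultScale = transform.localScale;
    }

    #region IFocusable
    public void OnFocusEnter()
    {
        transform.localScale = defaultScale * 1.2f;
    }

    public void OnFocusExit()
    {
        transform.localScale = defaultScale;
    }
    #endregion IFocusable
}

Attach it to the Cube just like CubeGazeActions; the two scripts can live side by side, since the input module notifies every IFocusable component on the focused object.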

AirTap - Your mouseclick in Mixed Reality

Implementing this functionality is very similar to the former one, so I will simply show you a small script I put together to handle the OnInputClicked event presented by the IInputClickHandler interface.

using HoloToolkit.Unity.InputModule;
using UnityEngine;

public class CubeAirtapActions : MonoBehaviour, IInputClickHandler
{
    private MeshRenderer meshRenderer;

    void Start()
    {
        meshRenderer = gameObject.GetComponent<MeshRenderer>();
    }

    #region IInputClickHandler
    public void OnInputClicked(InputClickedEventData eventData)
    {
        if (meshRenderer.material.color != Color.red)
        {
            meshRenderer.material.color = Color.red;
        }
        else
        {
            // Unity Color components are floats in the 0-1 range,
            // so use Color.blue rather than new Color(0, 0, 255, 255).
            meshRenderer.material.color = Color.blue;
        }
    }
    #endregion IInputClickHandler
}

With this script also attached as a component of your Cube, the Cube will toggle between red and blue when it is gazed upon while the airtap is performed. (For some unknown reason, the very first click when running in the editor does not seem to work, but this is really too small an issue to even be mentioned, so consequently, I just had to mention it.)

In the HoloToolkit\Input\Tests\Scenes directory, there is a scene called InputTapTest that is very similar to what we have been through here. I strongly recommend that you open it and play with it. I did...
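Since both gaze and airtap are exposed as interfaces, nothing stops you from implementing them on a single component instead of two. A minimal sketch combining the two scripts above (CubeInteractions is my own name, not something from the Toolkit or the test scene):

using HoloToolkit.Unity.InputModule;
using UnityEngine;

// Sketch: one component handling both focus and airtap.
public class CubeInteractions : MonoBehaviour, IFocusable, IInputClickHandler
{
    private MeshRenderer meshRenderer;
    private Color defaultColor;

    void Start()
    {
        meshRenderer = gameObject.GetComponent<MeshRenderer>();
        defaultColor = meshRenderer.material.color;
    }

    #region IFocusable
    public void OnFocusEnter()
    {
        meshRenderer.material.color = Color.green;
    }

    public void OnFocusExit()
    {
        meshRenderer.material.color = defaultColor;
    }
    #endregion IFocusable

    #region IInputClickHandler
    public void OnInputClicked(InputClickedEventData eventData)
    {
        // Toggle between red and blue on each airtap.
        meshRenderer.material.color =
            meshRenderer.material.color == Color.red ? Color.blue : Color.red;
    }
    #endregion IInputClickHandler
}

Whether you keep the behaviours in one script or several is a matter of taste; separate scripts are easier to reuse on other objects, while one combined script keeps related state (like the cached renderer) in a single place.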

What's Next?

Now that we are all familiar with those strange people on trains and buses, seemingly talking to themselves while using their smart phones, talking to Siri, Cortana, your dog, your HoloLens or whatever is no longer a strange phenomenon, and will under most circumstances not get you committed.

Next time, we will try to teach the HoloLens to really listen to you. (We can't have it flying around performing on its own, now can we?)

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

About the Author

Harald Heide Gundersen
Software Developer (Senior) EVRY
Norway
Knowledgeable of:

Visual Studio 2017
C#, VB, Java
MS SQL Server
Unity3D
Blender 3D
Adobe Photoshop, Illustrator, Gimp etc.
Virtual- / Augmented- / Mixed- Reality
Developing for HTC Vive, Samsung Gear VR, Google Cardboard, Hololens

Article Copyright 2017 by Harald Heide Gundersen