Building VTK User Interfaces: Interlude – Using a Wrap920 VR Headset

This ‘interlude’ post looks at adding head tracking with a pair of Wrap 920 VR glasses so that you can fly around the scene. I wasn’t quite happy with the interactivity of the 1st-person user camera in the previous post – it was clunky and difficult to move around – so there is definitely room for improvement, and a proof of concept is demonstrated here. This builds on the work done with ARX; however, there are some technical challenges in using the VR glasses in Linux.

Quick Note: I’ve called this an interlude post because it’s not officially part of the ‘Building VTK User Interfaces’ thread. The code isn’t required for the next few posts – it’s just a branch to demonstrate the power of using VTK in this way.

I’ve compiled this into a small informal movie to show how it works.

The code for this post (i.e. the CodeBlocks Vuzix driver and the updated Python project) can be downloaded from here.

User Requirements

First, the user requirements:

  1. I want my bot pilot or planner to be able to fly around the scene with full immersion:
    1. Monitors are a terrible way to interact with a 3D environment, so let’s use VR glasses to aid in the immersion
    2. Head tracking should allow a user to ‘look’ around the scene
  2. I don’t want my user to be tethered to a keyboard and a mouse:
    1. Even if you can fly around, a user who has to press ‘W’ to move forward will spend most of their time with the glasses propped on their forehead, trying to find the ‘W’ key on the keyboard
    2. The user should be able to move around intuitively with either a 3D mouse, Kinect hand tracking, or a Peregrine power glove

Unfortunately, my Oculus Rift glasses are still in ‘pre-order’ purgatory, so I rummaged through my storage and found an old pair of Vuzix Wrap920 glasses with head-tracking.

I also located a Peregrine glove to use – a Kinect for hand tracking would be more intuitive, but that’s more than a weekend’s work, so it will be addressed in the November working session. In the interim, a Peregrine glove can be used to move around with simple motions, which is still far more intuitive than a keyboard.

The remainder of this post looks at how the system is constructed:

  1. Vuzix Head-Tracking Integration
  2. Peregrine Glove Integration
  3. Adding in a Head-Tracking Camera

Vuzix Wrap 920 Integration

There doesn’t seem to be a Linux driver for the Vuzix headset, which means something needs to be written to make the scenario work. Luckily, two developers from Germany (Justin Philipp Heinermann and Jendrik Poloczek) provided the start of a driver on GitHub, which can be found at the Wrap920 GitHub Project. It’s great code, so all it took was a bit of adapting to get it to compile on Ubuntu 14.04. A small calibration stage was integrated to minimize the sensor biases, and an LCM publisher was added to push the data out over an LCM channel.
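The calibration itself is straightforward in spirit: hold the headset still for a moment, average the raw gyro readings, and subtract that bias from later samples. Here’s a rough Python sketch of the idea (the driver itself is C++, and ‘read_sample’ is a hypothetical stand-in for a raw read from the device):

```python
# Rough sketch of the bias-calibration idea (not the driver's actual code).
# read_sample is a hypothetical callable returning one raw (x, y, z) gyro
# sample from the headset.
def estimate_bias(read_sample, n_samples=500):
    sums = [0.0, 0.0, 0.0]
    for _ in range(n_samples):
        sample = read_sample()            # headset held still on a desk
        for axis in range(3):
            sums[axis] += sample[axis]
    return [s / n_samples for s in sums]  # per-axis bias to subtract later
```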

This is slightly out of the scope of this topic, so it’s provided as-is in the download above. I’m trying to avoid diving into the code, as Wrap920s are older technology and this is just for demonstration purposes, but feel free to email me if you want to talk through the code.

Here’s a quick high-level overview of the architecture:

[Figure: Design_Overview – high-level architecture]

The glasses provide a stream of raw measurements on a Linux device (‘/dev/hidraw0’), which the driver processes into a head orientation (pitch, roll, and yaw) published as an LCM frame:

  • Once you connect the glasses:
    • Start the driver by running the executable ‘VuzixTest’ as a super-user (so ‘sudo VuzixTest’) in ‘Wrap920 Driver\VuzixTest\bin\Debug’
    • If it can connect to the device, it should show a calibration step
    • Once you have calibrated it, it will print a continuous feed of outputs
  • As the driver prints each message, it also publishes it to a channel called ‘Pilot_J.Sparrow’, so in theory you could subscribe to all of the ‘Pilot_*’ channels to pick up every headset (if more were attached)
  • From that point we can work with the data as though it’s any head-tracking measurement – a minimal subscriber sketch follows this list 🙂
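As a sketch of what a subscriber might look like in Python (the message type name ‘head_orientation_t’ and its module are hypothetical – the real definition ships with the download):

```python
import lcm

# Hypothetical generated LCM type with pitch/roll/yaw fields; the real
# message definition is included in the driver download.
from vuzix_lcm import head_orientation_t

def on_orientation(channel, data):
    msg = head_orientation_t.decode(data)
    print(channel, msg.pitch, msg.roll, msg.yaw)

lc = lcm.LCM()
# LCM channel subscriptions are regular expressions, so 'Pilot_.*'
# picks up every connected headset.
lc.subscribe("Pilot_.*", on_orientation)
while True:
    lc.handle()
```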

As above, if you’re trying this, or want to adapt it to a newer pair of glasses, feel free to message me with any questions about the code.

Peregrine Glove Integration

The Peregrine glove presents itself as a keyboard, so it doesn’t require any custom drivers. However, the Peregrine Toolbox doesn’t run in Linux, so you will need to configure the glove on a Windows system. I remapped my Peregrine to respond with the following keys:

  • ‘h’ is mapped to touching your thumb to index finger tip – this will move the camera forward
  • ‘g’ is mapped to touching your thumb to middle finger tip – this will move the camera backward
  • ‘t’ is mapped to touching your thumb to the inner part of the index finger – this will move the camera up
  • ‘i’ is mapped to touching your thumb to the inner part of the middle finger – this will move the camera down

[Figure: Peregrine glove]

These key events are then handled inside the Python style interactor to move the camera while you look around – a minimal sketch is shown below.
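For illustration, a style interactor handling those four keys could look like this (the class name and step size are made up for the sketch; the project’s real implementation is in ‘Interactor1stPersonVuzix.py’):

```python
import vtk

def _forward(camera):
    # Unit vector from the camera position toward its focal point.
    pos, fp = camera.GetPosition(), camera.GetFocalPoint()
    d = [fp[i] - pos[i] for i in range(3)]
    norm = sum(c * c for c in d) ** 0.5
    return [c / norm for c in d]

def _translate(camera, direction, amount):
    # Move position and focal point together so the view direction holds.
    pos, fp = camera.GetPosition(), camera.GetFocalPoint()
    camera.SetPosition(*[pos[i] + amount * direction[i] for i in range(3)])
    camera.SetFocalPoint(*[fp[i] + amount * direction[i] for i in range(3)])

class GloveInteractorStyle(vtk.vtkInteractorStyleUser):
    # Illustrative only; the project's real class is Interactor1stPersonVuzix.
    def __init__(self, renderer, step=0.5):
        self.renderer = renderer
        self.step = step
        self.AddObserver("KeyPressEvent", self.on_key_press)

    def on_key_press(self, obj, event):
        key = self.GetInteractor().GetKeySym()
        camera = self.renderer.GetActiveCamera()
        if key == "h":      # thumb to index fingertip: forward
            _translate(camera, _forward(camera), +self.step)
        elif key == "g":    # thumb to middle fingertip: backward
            _translate(camera, _forward(camera), -self.step)
        elif key == "t":    # thumb to inner index finger: up
            _translate(camera, camera.GetViewUp(), +self.step)
        elif key == "i":    # thumb to inner middle finger: down
            _translate(camera, camera.GetViewUp(), -self.step)
        self.GetInteractor().GetRenderWindow().Render()
```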

Adding in a Head-Tracking Camera

Now that the data is available over LCM, the last step is to actually build the style interactor/camera. The code for this can be found in ‘Interactor1stPersonVuzix.py’ in the ‘scene’ folder of ‘bot_vis_platform’. I’m not going to reproduce the code and discuss it in detail here, as this is just a demonstration, but again, feel free to ask if you want any details. The style interactor is very similar to the normal 1st-person camera, except that it gets the current head-tracking orientation from the LCM channel and sets the camera to look in that direction.
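The core of the idea is just spherical-to-Cartesian conversion: take the latest yaw and pitch from LCM and aim the camera’s focal point along that direction from its current position. A minimal sketch (assuming z-up and angles in radians; roll, which the driver also reports, could additionally be applied to the view-up vector):

```python
import math

def apply_head_orientation(camera, yaw, pitch, distance=1.0):
    # Aim the camera along the tracked head direction from its current
    # position. Assumes z-up and angles in radians; illustrative only.
    px, py, pz = camera.GetPosition()
    dx = math.cos(pitch) * math.cos(yaw)   # yaw rotates about the z axis
    dy = math.cos(pitch) * math.sin(yaw)
    dz = math.sin(pitch)                   # pitch tilts the view up/down
    camera.SetFocalPoint(px + distance * dx,
                         py + distance * dy,
                         pz + distance * dz)
```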

The data is provided by a small LCM controller, which can be found in ‘lcm_controller’ in the ‘lcm_channel’ folder of ‘bot_vis_platform’. LCM is discussed in Post 5 of this thread, which should be up in two weeks or so, so I’ll save the in-depth discussion until then.

Summary

This is a sidetrack from the normal VTK thread that demonstrates how to integrate a head-tracking VR headset into the bot environment. With this working, I hope my Oculus Rift escapes from pre-order purgatory soon – I’ll then reproduce this with two users (one using a Wrap920 and one the Oculus), which should be really neat!

– Sam
