sloth.gif

VR Interactions


Design & Development
Jeff Chang

My original intent in learning 3D animation was to create 360° videos. This series of animals driving automobiles was meant to be viewed through Google Cardboard.

Halfway through, however, I realized I didn’t want this to be a purely passive experience. I wanted my viewers to interact with my art. This led me to experiment with different interactions in VR.


Oculus DK2 and Leap Motion

For my early prototypes, I purchased an Oculus DK2 and started learning how to code in Unity.

At the time, the Oculus Touch controllers had not yet been announced, so I paired the headset with the Leap Motion hand tracker. I wanted to leverage basic human hand interactions and make them feel natural in an artificial space.

This prototype involved using hand gestures to trigger different animation states.
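
To give a sense of how this kind of setup works, here is a minimal sketch, assuming the Leap Motion SDK v2 C# bindings inside Unity and an Animator on the animal character. It polls the tracked hands each frame and fires Animator triggers when a grab or pinch is detected; the trigger names and thresholds are illustrative, not the exact values from the prototype.

```csharp
using UnityEngine;
using Leap;

// Sketch: gesture-driven animation states with the Leap Motion SDK v2 bindings.
// "React" and "Idle" are hypothetical Animator trigger names.
public class GestureAnimationTrigger : MonoBehaviour
{
    public Animator characterAnimator;     // the animal character's Animator

    private Controller leap;

    void Start()
    {
        leap = new Controller();           // connects to the Leap Motion service
    }

    void Update()
    {
        Frame frame = leap.Frame();        // most recent tracking frame
        foreach (Hand hand in frame.Hands)
        {
            // GrabStrength and PinchStrength both range from 0 (open) to 1 (closed).
            if (hand.GrabStrength > 0.9f)
            {
                characterAnimator.SetTrigger("React");   // e.g. closed fist -> reaction clip
            }
            else if (hand.PinchStrength > 0.9f)
            {
                characterAnimator.SetTrigger("Idle");    // e.g. pinch -> back to idle
            }
        }
    }
}
```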

 
comparison2.gif
bear_test.gif
 

Photogrammetry and Object Transforms

For a while, I played around with how these hand gestures would work with volumetric captures from Depthkit and photogrammetry scans.

The hand gestures became more detailed and granular: “pinching” enabled transform controls such as scaling and rotating the captured models.
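
As a rough illustration of how a pinch can drive transforms, the sketch below (again assuming the Leap SDK v2 C# bindings) maps the distance between two pinching hands to scale and the angle between them to rotation. The object name and thresholds are placeholders, not the original implementation.

```csharp
using UnityEngine;
using Leap;

// Sketch: while both hands are pinching, spreading them apart scales the
// target model and twisting them rotates it around the vertical axis.
public class PinchTransformControls : MonoBehaviour
{
    public Transform targetModel;          // the photogrammetry / Depthkit capture

    private Controller leap = new Controller();
    private float lastDistance;
    private float lastAngle;
    private bool wasPinching;

    void Update()
    {
        Frame frame = leap.Frame();
        if (frame.Hands.Count < 2) { wasPinching = false; return; }

        Hand a = frame.Hands[0];
        Hand b = frame.Hands[1];
        bool pinching = a.PinchStrength > 0.8f && b.PinchStrength > 0.8f;

        // Leap positions are in millimetres; convert to metres for Unity.
        Vector3 pa = new Vector3(a.PalmPosition.x, a.PalmPosition.y, a.PalmPosition.z) * 0.001f;
        Vector3 pb = new Vector3(b.PalmPosition.x, b.PalmPosition.y, b.PalmPosition.z) * 0.001f;

        float distance = Vector3.Distance(pa, pb);
        float angle = Mathf.Atan2(pb.z - pa.z, pb.x - pa.x) * Mathf.Rad2Deg;

        if (pinching && wasPinching && lastDistance > 0.001f)
        {
            targetModel.localScale *= distance / lastDistance;          // spread hands apart to scale up
            targetModel.Rotate(0f, lastAngle - angle, 0f, Space.World); // twist both hands to rotate
        }

        lastDistance = distance;
        lastAngle = angle;
        wasPinching = pinching;
    }
}
```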


Conversational UI

Besides hand interactions, I was interested in conversational UI. For this test, I used IBM’s speech-to-text recognition. I quickly realized it was important to keep voice commands simple and easy for users to remember.

In this experiment, the user can point to any world-space coordinate with their index finger. Pointing activates the speech-to-text component, so when the user says “go,” the little character walks toward those coordinates.
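
A simplified version of that interaction might look like the sketch below: the extended index finger is raycast into the scene to choose a target point, and a stubbed speech callback (standing in for the IBM service, which isn’t reproduced here) moves the character when it hears “go.” The method name, fields, and speed are hypothetical.

```csharp
using UnityEngine;
using Leap;

// Sketch of the point-and-"go" interaction, assuming the Leap SDK v2 C# bindings.
// OnWordRecognized is a hypothetical hook the speech-to-text component would
// call with each transcribed word.
public class PointAndGo : MonoBehaviour
{
    public Transform character;            // the "little character" that walks to the target
    public float moveSpeed = 1.5f;         // metres per second

    private Controller leap = new Controller();
    private Vector3 pointedTarget;
    private bool hasTarget;
    private Vector3? destination;

    void Update()
    {
        Frame frame = leap.Frame();
        foreach (Hand hand in frame.Hands)
        {
            Finger index = hand.Fingers[1]; // fingers are ordered thumb -> pinky in SDK v2
            if (!index.IsExtended) continue;

            // Leap positions are in millimetres; convert to metres for Unity,
            // relative to this object (assumed to sit at the tracker's origin).
            Vector3 tip = new Vector3(index.TipPosition.x, index.TipPosition.y, index.TipPosition.z) * 0.001f;
            Vector3 dir = new Vector3(index.Direction.x, index.Direction.y, index.Direction.z);

            RaycastHit hit;
            if (Physics.Raycast(transform.TransformPoint(tip), transform.TransformDirection(dir), out hit))
            {
                pointedTarget = hit.point;
                hasTarget = true;          // pointing "arms" the voice command
            }
        }

        // Walk toward the last confirmed destination.
        if (destination.HasValue)
        {
            character.position = Vector3.MoveTowards(
                character.position, destination.Value, moveSpeed * Time.deltaTime);
        }
    }

    // Called by the speech-to-text component with each recognised word.
    public void OnWordRecognized(string word)
    {
        if (hasTarget && word.ToLower() == "go")
        {
            destination = pointedTarget;
        }
    }
}
```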


All of these quick experiments culminated in my VR exhibit at VRLA 2016.

The user is placed in an environment surrounded by a variety of animals. They can perform different types of hand gestures to elicit reactions from these characters.


Environmental Onboarding

Rather than have the environment act as simple wallpaper, I wanted it to play a central role in guiding the user.

I used the concept of vignetting to “frame” images of the hand gestures around each animal, making them more discoverable to users.

 
signs.gif
 

Takeaways

Overall, a lot of people responded positively. The aesthetic lent itself to being approachable and inviting, especially for people who had never tried VR! One person came up to me and said they never thought VR could have an art style like this.

Despite some technological constraints, the Leap Motion helped make the experience feel so natural that one user even tried talking to the animals, saying “heeeey!” Speech-to-text never made it into the final build because of latency issues, but it did make me want to experiment with it again in the future.

 
leapmotion_gallery.gif