Itadakimasu VR Experience

“Itadakimasu” (Japanese for ‘bon appétit’) is a therapeutic VR experience that allows users to interact with animals through different hand gestures. The piece is grounded in research findings that animal-assisted therapy can decrease anxiety and reduce blood pressure in patients. 

Although the experience is simple in content, my intent is for it to act as a short-term option for people in places where owning a pet is logistically difficult. 

Ideation and Planning

The goal is to create an emotional response through interactions between the user and animals. 

To me, it was most important to get the interactions right. Once that was achieved, I could then start playing around with the different types of animals and animations. 

Interacting with the Leap Motion

I chose to work with optical hand tracking using the Leap Motion so that users could perform gestures that feel culturally natural. Although an analog controller would have been great for haptic feedback, the Leap Motion provided an experience that was more personal and intimate. 

Using Leap Motion’s detector scripts, I can easily detect what the user’s hands are doing, from the direction of the palm to whether the fingers are curled or extended. 
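As a rough illustration of what those checks involve, here is a minimal C# sketch that polls the Leap frame directly and looks for an open, upward-facing palm. The class name and threshold are my own placeholders, not the project’s code; in a real Unity scene, Leap Motion’s Detection Utilities (components like PalmDirectionDetector and ExtendedFingerDetector) express the same conditions declaratively in the inspector.

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Rough sketch, not the project's actual code: poll the tracking frame each
// Update and check for an open, upward-facing palm. Leap's Detection
// Utilities components report the same kind of conditions through their
// OnActivate/OnDeactivate events without any polling code.
public class OpenPalmSketch : MonoBehaviour
{
    private Controller controller;

    void Start()
    {
        // Connects to the Leap Motion service.
        controller = new Controller();
    }

    void Update()
    {
        Frame frame = controller.Frame(); // most recent tracking frame
        foreach (Hand hand in frame.Hands)
        {
            // PalmNormal points out of the palm. Comparing it against world
            // up assumes a desk-mounted device whose up axis matches Unity's.
            bool palmUp = Vector3.Dot(hand.PalmNormal.ToVector3(), Vector3.up) > 0.8f;

            // Count how many of the five fingers are extended.
            int extendedCount = 0;
            foreach (Finger finger in hand.Fingers)
                if (finger.IsExtended) extendedCount++;

            if (palmUp && extendedCount == 5)
                Debug.Log("Open, upward-facing palm detected");
        }
    }
}
```

In the experience, checks like these are what decide when an animal’s reaction animation should play.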

Cute Animals

One of my biggest inspirations was PARO, the therapeutic seal robot. PARO is an advanced interactive robot used in hospitals and extended care facilities in Japan and Europe. You can read more about it here. 

I wanted to emulate that same sensation of joy through animation. 

In addition to reusing the sloth from my previous work, I added a red panda, an otter, and a hedgehog. Rather than going for realistic animal behaviors, I wanted to place these characters in funny or unusual situations. For example, the red panda spends his time eating ramen, while the otter gets ready to jam with his clam guitar. 

To give the user clear feedback, the animals are highlighted whenever the user gazes at them. This lets the user know that they can start to perform an action. 
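The post doesn’t say how the highlight is implemented, but a common Unity approach is a gaze raycast from the headset camera. Here is a minimal sketch under that assumption; the class and field names are illustrative, not from the project.

```csharp
using UnityEngine;

// Illustrative sketch, not the project's actual code: cast a ray from the
// center of the user's view each frame and tint whichever animal it hits,
// restoring the previous animal's original color when the gaze moves on.
public class GazeHighlighter : MonoBehaviour
{
    public Camera headCamera;       // the VR camera
    public LayerMask animalLayer;   // layer the animals live on
    public Color highlightColor = new Color(1f, 1f, 0.6f);

    private Renderer lastHit;
    private Color originalColor;

    void Update()
    {
        // Ray straight out of the center of the headset's view.
        var gaze = new Ray(headCamera.transform.position, headCamera.transform.forward);

        Renderer hitRenderer = null;
        if (Physics.Raycast(gaze, out RaycastHit hit, 10f, animalLayer))
            hitRenderer = hit.collider.GetComponentInChildren<Renderer>();

        if (hitRenderer == lastHit)
            return; // gaze target unchanged this frame

        // Restore the previously highlighted animal, then tint the new one.
        if (lastHit != null)
            lastHit.material.color = originalColor;

        if (hitRenderer != null)
        {
            originalColor = hitRenderer.material.color;
            hitRenderer.material.color = highlightColor;
        }
        lastHit = hitRenderer;
    }
}
```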

Environment as Onboarding

Rather than have the environment act as simple wallpaper, I wanted it to be central in guiding the user, so that a first-time user would see hints of what to do embedded in the background.

To do that, I made sure that instructions would ‘frame’ certain animals. For example, the three hand gestures are integrated as flyers that rest above the red panda and the otter. This ensured that the discoverability of these actions was high.

Taking cues from the interior design concept of ‘vignettes,’ I grouped environmental objects around each animal like a picture frame. Not only was each grouping pleasing to look at, it also conveyed the necessary information. 

For the sloth, I took a slightly different approach. The sushi conveyor belt sits in front of the sloth, and every so often the user sees the same three hand gestures pass by as signs mixed in with the sushi. 

The conveyor belt is also meant to guide the user’s eyes, so they can follow the sushi and take in the rest of the scene. 
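As a sketch of how such a belt can work mechanically (all names and values here are my own assumptions, not from the project), the moving items can simply be translated along one axis and wrapped back to the start:

```csharp
using UnityEngine;

// Hypothetical sketch, not the project's actual code: each sushi plate or
// gesture sign drifts along the belt at a constant speed and wraps back to
// the start once it passes the end, so the signs recirculate indefinitely.
public class ConveyorLoop : MonoBehaviour
{
    public Transform[] items;      // sushi plates and gesture signs
    public float speed = 0.3f;     // meters per second
    public float beltLength = 6f;  // distance an item travels before wrapping

    void Update()
    {
        foreach (Transform item in items)
        {
            item.localPosition += Vector3.right * speed * Time.deltaTime;

            // Recycle the item back to the start of the belt.
            if (item.localPosition.x > beltLength)
                item.localPosition -= Vector3.right * beltLength;
        }
    }
}
```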

VRLA Reactions

Overall, there were strong positive reactions to my piece. The Leap Motion worked well, and everyone performed the gestures naturally. I feel this would have been a different experience if I had gone with a clunky controller. 

I did notice that some participants assumed that gestures beyond the prescribed three would also get a reaction. Some of them waved, and some even tried voice commands, saying “hiiiiiii.”

Another observation is that several people were not immediately aware they could turn around to see more animals to interact with. This is because the user can only see one animal at a time, with the others out of their peripheral vision. I feel that if the four animals were spaced evenly around the user, they would be easier to notice, prompting the user to turn around. This also presents an opportunity to experiment next time with light, shadows, and sound to give cues for users to turn around. 

In the end, it was truly heartwarming to see most of the participants leave this experience with a laugh or smile on their face. 

If you’re curious about trying this experience, you can download it now from Leap Motion’s website here (Requires an Oculus Rift or HTC Vive with a Leap Motion).