Locomotion in Bathysphere is handled with head tracking. When the user interacts with an object that is too far away to reach, the player character pathfinds to it over a nav mesh; the user can also gaze at a point on the ground and select it to walk there. A sketch of that gaze-to-walk flow is below.
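
A minimal Unity C# sketch of the gaze-to-walk interaction, assuming a NavMeshAgent on the player rig and a ground layer mask; the component name, fields, and input mapping are illustrative, not the shipped code.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Hypothetical sketch: cast a ray from the head-tracked camera and, on the
// Cardboard button press, walk the player's NavMeshAgent to the gazed point.
public class GazeWalkController : MonoBehaviour
{
    [SerializeField] private NavMeshAgent agent;      // agent on the player rig (assumed setup)
    [SerializeField] private LayerMask groundMask;    // layers treated as walkable ground
    [SerializeField] private float maxGazeDistance = 30f;

    void Update()
    {
        // Cardboard exposes a single button/tap; "Fire1" stands in for it here.
        if (!Input.GetButtonDown("Fire1")) return;

        Ray gaze = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        if (Physics.Raycast(gaze, out RaycastHit hit, maxGazeDistance, groundMask))
        {
            // Snap the hit point onto the nav mesh before issuing the move.
            if (NavMesh.SamplePosition(hit.point, out NavMeshHit navHit, 1f, NavMesh.AllAreas))
            {
                agent.SetDestination(navHit.position);
            }
        }
    }
}
```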

Larger objects in the gallery space pop out informational text, so the user can learn about the various sea life present in the gallery (a sketch of one way to drive this follows).
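
The exact trigger for the pop-out text isn't documented here; a gaze-driven toggle is a natural fit for Cardboard, so this sketch assumes it. The panel object and distances are placeholders.

```csharp
using UnityEngine;

// Hypothetical sketch: show a world-space info label while the user's gaze
// ray rests on this object (which needs a collider), hide it when gaze leaves.
public class GazeInfoPanel : MonoBehaviour
{
    [SerializeField] private GameObject infoPanel;    // world-space canvas with the species text
    [SerializeField] private float maxGazeDistance = 20f;

    void Update()
    {
        Ray gaze = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        bool gazed = Physics.Raycast(gaze, out RaycastHit hit, maxGazeDistance)
                     && hit.collider.gameObject == gameObject;

        if (infoPanel.activeSelf != gazed)
        {
            infoPanel.SetActive(gazed);
        }
    }
}
```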

Object inspection lets the user pick up and "inspect" fish and other small ocean artifacts. While an object is held, its rotation is tied to head tracking, so turning your head turns the object (sketched below).
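
A sketch of one way to tie a held object's rotation to head tracking: apply the frame-to-frame change in head yaw and pitch to the object. The axis mapping and sensitivity are assumptions, not the shipped tuning.

```csharp
using UnityEngine;

// Hypothetical sketch: while an object is being inspected, mirror the change
// in head orientation onto the held object so head motion spins it in place.
public class InspectRotator : MonoBehaviour
{
    [SerializeField] private Transform head;          // the head-tracked camera transform
    [SerializeField] private float sensitivity = 2f;  // object degrees per degree of head turn

    private Vector3 lastHeadEuler;

    void OnEnable()
    {
        lastHeadEuler = head.eulerAngles;
    }

    void Update()
    {
        Vector3 delta = head.eulerAngles - lastHeadEuler;
        lastHeadEuler = head.eulerAngles;

        // Wrap deltas into -180..180 so crossing the 0/360 boundary doesn't jump.
        float yaw = Mathf.DeltaAngle(0f, delta.y);
        float pitch = Mathf.DeltaAngle(0f, delta.x);

        // Head yaw spins the object around world up; head pitch tips it around world right.
        transform.Rotate(Vector3.up, -yaw * sensitivity, Space.World);
        transform.Rotate(Vector3.right, -pitch * sensitivity, Space.World);
    }
}
```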

By interacting with video canvases placed around the gallery, users can watch various VR videos. The video quality looks poor because of the source footage, not the renderer; our content partner hadn't built their 360° stereo rig yet while I was running the team.
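
A rough sketch of a video canvas: selecting it toggles playback of a clip on the quad it sits on. The real Cardboard-era project may well have used a third-party playback plugin; Unity's built-in VideoPlayer is assumed here purely for illustration.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical sketch: on the Cardboard button press, if the gaze ray hits
// this canvas (which needs a collider), toggle play/pause on its VideoPlayer.
[RequireComponent(typeof(VideoPlayer))]
public class VideoCanvas : MonoBehaviour
{
    [SerializeField] private float maxGazeDistance = 15f;
    private VideoPlayer player;

    void Awake()
    {
        player = GetComponent<VideoPlayer>();
        player.playOnAwake = false;
        player.isLooping = true;
    }

    void Update()
    {
        if (!Input.GetButtonDown("Fire1")) return;

        Ray gaze = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        if (Physics.Raycast(gaze, out RaycastHit hit, maxGazeDistance)
            && hit.collider.gameObject == gameObject)
        {
            if (player.isPlaying) player.Pause();
            else player.Play();
        }
    }
}
```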

Bathysphere is an educational application for Google Cardboard that I worked on in partnership with the Fish Eye Project while attending the University of Victoria. It was publicly displayed and demoed at the Royal BC Museum as part of World Oceans Day. I conceived the project, pitched it, and brought it all the way through to a beta product, and I personally programmed the VR video playback and object inspection systems.

Download the Bathysphere prototype APK here.
Code available upon request.