Check out the DEMO VIDEO page now!

Adding interactivity to videos is a proven way to increase viewer engagement. Now imagine adding standard web interactivity to a medium that already demands attention in a way normal, linear, 2D videos can’t. A 360 video requires that a viewer keep interacting with it or risk missing critical moments while looking away from the action. It encourages exploration and evokes a constant curiosity as the viewer wonders what is going on behind her. Interactive 360 videos are the next logical step, but adding interactive elements to a 360 video is no simple task, and accurately positioning and smoothly tracking those elements is trickier still. Without that precision, the experience falls apart.

The problem stems from the fact that normal web elements (DOM elements) and 360 videos use two entirely different technologies.

DOM elements are ubiquitous. They make up the sentence you are reading right now and everything else you see on the web. DOM elements are widely understood, easy to build, and easy to interact with: they can be clicked, highlighted, typed into, copied, linked, submitted, and so on.

360 videos, or “videospheres,” are actual 3D environments. They use a technology called WebGL to create a spherical “mesh” and then “texture” that mesh with a video (essentially wrapping a 2D image around a 3D object). A virtual “camera” is then placed at the videosphere’s center, allowing a viewer to look around in any direction. The entire WebGL instance is rendered in your browser via a “canvas” DOM element, but within that canvas, normal web rules don’t apply.
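
For the curious, here is a minimal sketch of that setup using the popular three.js WebGL library. It is illustrative rather than production code: the video element id, sphere radius, and full-window sizing are placeholder assumptions.

```typescript
// Minimal videosphere sketch using the three.js WebGL library.
// The <video id="video360"> element and the sizes below are placeholder assumptions.
import * as THREE from "three";

const video = document.getElementById("video360") as HTMLVideoElement;
video.play(); // in practice this must follow a user gesture (browser autoplay policies)

// A sphere "mesh" textured with the video, flipped inside-out so the texture
// faces the camera sitting at its center.
const geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.scale(-1, 1, 1);
const texture = new THREE.VideoTexture(video);
const videosphere = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture }));

const scene = new THREE.Scene();
scene.add(videosphere);

// The virtual "camera" is placed at the videosphere's center; rotating it looks around.
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 1, 1100);

// The entire WebGL instance renders into a single <canvas> DOM element.
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

function render() {
  requestAnimationFrame(render);
  renderer.render(scene, camera);
}
render();
```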


So, since the 360 video is already implemented in WebGL, why not add a few more WebGL objects, apply some textures, and write a little code to make them interactive? The obvious answers are time, effort, expense, and expertise. Creating something as simple as a web form in WebGL would require custom-building each interaction, using a “raycaster” to map 2D clicks into the 3D world space, and a whole lot of specialized technical knowledge. The objects in a WebGL scene can’t be interacted with in the same way DOM elements can, hence the natural desire for content makers to place DOM elements in the 3D space. Furthermore, by making it possible to add DOM elements to the videosphere, a creator can recycle existing HTML elements along with any accompanying CSS styling or JS behavior.
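
To make that cost concrete, here is a rough sketch, not production code, of what a single click interaction looks like in WebGL using a three.js raycaster. Every hover, focus, and form behavior the DOM gives you for free would need similar hand-written plumbing; the window-sized viewport and mesh names are assumptions for illustration.

```typescript
// Sketch of WebGL hit-testing: a "raycaster" maps a 2D click to the 3D world space.
// Pass in the scene, camera, and renderer (e.g. the ones from the videosphere sketch).
import * as THREE from "three";

function enableClickPicking(
  scene: THREE.Scene,
  camera: THREE.PerspectiveCamera,
  renderer: THREE.WebGLRenderer,
): void {
  const raycaster = new THREE.Raycaster();
  const pointer = new THREE.Vector2();

  renderer.domElement.addEventListener("click", (event: MouseEvent) => {
    // Convert the pixel click position into normalized device coordinates (-1 to +1).
    pointer.x = (event.clientX / window.innerWidth) * 2 - 1;
    pointer.y = -(event.clientY / window.innerHeight) * 2 + 1;

    // Cast a ray from the camera through the click and collect every mesh it hits.
    raycaster.setFromCamera(pointer, camera);
    const hits = raycaster.intersectObjects(scene.children);
    if (hits.length > 0) {
      // Every interaction (hover, focus, text input, ...) must be rebuilt by hand like this.
      console.log("Clicked mesh:", hits[0].object.name);
    }
  });
}
```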

Unfortunately, WebGL doesn’t support the inclusion of DOM elements, so we can’t simply wrap an element as a WebGL object and place it in our scene. We have to find a way to bridge these two separate technologies.


The first approach we took involved gathering information from the WebGL scene and using it to compute each element’s 2D “x” and “y” position on the screen. This was moderately successful, but it broke down toward the edges of the view, where the video warps while the DOM elements stay flat in their 2D plane. We wanted to place elements accurately enough that they could conceivably be seen as part of the original video, so this wouldn’t do.
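
In simplified form, that first approach looked something like the sketch below: project a 3D point through the WebGL camera to get 2D screen coordinates, then move an absolutely positioned DOM element there each frame. The three.js calls are illustrative, and the “#hotspot” element and world position are placeholder assumptions.

```typescript
// Sketch of the first approach: project a 3D point through the WebGL camera to get
// 2D screen coordinates, then pin a flat, absolutely-positioned DOM element there.
// The "#hotspot" element and the world position are placeholder assumptions.
import * as THREE from "three";

const label = document.getElementById("hotspot") as HTMLElement;
label.style.position = "absolute";
const worldPosition = new THREE.Vector3(0, 0, -400); // a point on the videosphere

function updateLabel(camera: THREE.PerspectiveCamera): void {
  // Project the 3D point into normalized device coordinates (-1 to +1).
  const projected = worldPosition.clone().project(camera);

  // Convert to pixel coordinates and move the flat DOM element there.
  const x = ((projected.x + 1) / 2) * window.innerWidth;
  const y = ((-projected.y + 1) / 2) * window.innerHeight;
  label.style.transform = `translate(${x}px, ${y}px)`;

  // Hide the element once the point rotates behind the camera. Near the edges of the
  // view the video warps but this flat placement does not, which is why the approach
  // breaks down there.
  label.style.display = projected.z < 1 ? "block" : "none";
}
```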

It became clear that we would have to recreate the 3D space using only DOM elements and keep our 3D DOM scene in sync with the WebGL scene. The solution was to build a 3D space using CSS transforms and place individual DOM elements at the videosphere’s radius. This 3D “CSS Sphere” is then rotated in tandem with the videosphere. We constructed a small API on top of the excellent aframe.io library (chosen for its simple videosphere setup and the easy-to-use rotation values that can be read from its WebGL camera instance). It allows a user to place any DOM element in a 360 video or photosphere and gives them access to familiar 3D editing tools to position, scale, and rotate their elements.
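
The sketch below shows the idea in simplified form rather than our exact implementation: DOM elements are pushed out to the videosphere’s radius with CSS 3D transforms, and each frame the “CSS Sphere” is counter-rotated using the rotation values read from the A-Frame camera entity. The element ids, radius, and rotation sign conventions are placeholder assumptions that depend on how the scene is set up.

```typescript
// Simplified "CSS Sphere" sketch: DOM elements are pushed out to the videosphere's
// radius with CSS 3D transforms, then the whole group is counter-rotated each frame
// to match the A-Frame camera. Ids, radius, and sign conventions are assumptions.

const RADIUS = 400; // should match the perceived videosphere radius, in CSS pixels

const stage = document.getElementById("css-sphere") as HTMLElement; // overlays the <a-scene>
(stage.parentElement as HTMLElement).style.perspective = `${RADIUS}px`;
stage.style.transformStyle = "preserve-3d";

// Place one DOM element on the sphere at a given yaw/pitch (in degrees).
function placeElement(el: HTMLElement, yaw: number, pitch: number): void {
  el.style.position = "absolute";
  el.style.transform = `rotateY(${yaw}deg) rotateX(${pitch}deg) translateZ(${-RADIUS}px)`;
  stage.appendChild(el);
}
placeElement(document.getElementById("info-card") as HTMLElement, 45, -10);

// Each frame, read the A-Frame camera's rotation (in degrees) and counter-rotate the
// CSS sphere so the DOM elements stay locked to the rotating video.
const cameraEl = document.querySelector("[camera]") as any; // the A-Frame camera entity
function sync(): void {
  const rotation = cameraEl.getAttribute("rotation"); // { x: pitch, y: yaw, z: roll }
  stage.style.transform = `rotateX(${-rotation.x}deg) rotateY(${-rotation.y}deg)`;
  requestAnimationFrame(sync);
}
sync();
```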

Editing your Interactive 360 Video

Try the EDITING DEMO now!

Using our API, a user can create a scene just by specifying an MP4 video, then place any DOM element within that scene at a specific position and time. Simple animations between two points are currently possible. Anyone who has worked with 3D software will be instantly familiar with the 3D control widget, which lets an editor change an element’s position, rotation, and scale values. The element’s global position around the camera can be changed by toggling the “X Lock” and “Y Lock” buttons and then dragging on the videosphere to rotate it.
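
The API itself isn’t shown here, so the snippet below is purely illustrative; every name and field is hypothetical. It is meant only to capture the shape of what an editor produces: a video source plus a list of DOM elements, each with a time window and a start and end placement for a simple two-point animation.

```typescript
// Hypothetical illustration only: the names and fields below are invented to show
// the described workflow of placing a DOM element at a specific position and time,
// with a simple animation between two points. This is not the actual API.
interface Placement {
  yaw: number;   // rotation around the vertical axis, in degrees
  pitch: number; // rotation around the horizontal axis, in degrees
  scale: number;
}

interface TimedElement {
  selector: string; // which existing DOM element to place in the scene
  start: number;    // seconds into the video when the element appears
  end: number;      // seconds into the video when it disappears
  from: Placement;  // placement at `start`
  to?: Placement;   // optional placement at `end`; interpolating between the two
                    // placements gives a simple animation between two points
}

const scene360: { video: string; elements: TimedElement[] } = {
  video: "tour.mp4", // the MP4 source (file name is an assumption)
  elements: [
    {
      selector: "#info-card",
      start: 12,
      end: 20,
      from: { yaw: 45, pitch: -10, scale: 1 },
      to: { yaw: 60, pitch: 0, scale: 1.2 },
    },
  ],
};
```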


If you are interested in hearing more about interactive 360 video, please contact us at support@hapyak.com.
