Stereoscopic Panoramics

There are three frequently used techniques for rapidly displaying photographic or computer-generated surround environments: cylindrical panoramas, spherical maps, and cubic maps.
In the mid 1990s cylindrical panoramas and spherical maps were popularised by Apple with their QuickTime VR software; around 2000 that technology was extended to handle cubic maps. In all three cases images are mapped onto some geometry (a cylinder, sphere, or cube) with a virtual camera located at the center; depending on the performance of the host hardware and software, the user can interactively look in any direction. This can lead to a strong sense of immersion, especially if the environment is projected onto a wide display that fills a significant part of the viewer's field of view. One might ask how a greater sense of immersion can be achieved, and in particular whether stereoscopic projection is possible.
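To make the three mappings concrete, here is a minimal sketch (Python, not taken from any of the tools above) that converts a 3D view direction into image coordinates for each scheme. The vertical coverage of the cylindrical case is an assumed ±45°, and all function names are mine:

```python
import math

def cylindrical_uv(d, half_vfov=math.pi / 4):
    """Map direction d=(x, y, z), y up, onto a cylindrical panorama.
    The cylinder is assumed to cover +/- half_vfov vertically; d must
    not point straight along the y axis."""
    x, y, z = d
    u = (math.atan2(x, z) / (2 * math.pi)) % 1.0        # angle around the y axis
    v = 0.5 + (y / math.hypot(x, z)) / (2 * math.tan(half_vfov))
    return u, v

def equirect_uv(d):
    """Map a direction onto an equirectangular (spherical) image:
    u is longitude, v is latitude; v = 0 is the bottom row."""
    x, y, z = d
    r = math.sqrt(x * x + y * y + z * z)
    u = 0.5 + math.atan2(x, z) / (2 * math.pi)
    v = 0.5 + math.asin(y / r) / math.pi
    return u, v

def cube_face(d):
    """Pick the cube-map face a direction falls on: the axis with the
    largest absolute component."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+x" if x > 0 else "-x"
    if ay >= az:
        return "+y" if y > 0 else "-y"
    return "+z" if z > 0 else "-z"
```

For example, looking straight ahead along +z lands in the middle of the equirectangular image: `equirect_uv((0, 0, 1))` gives (0.5, 0.5).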
The degree of horizontal shift is straightforward to calculate from the viewing geometry: essentially the eye separation and the distance to the panorama surface.

Pre-rendered spherical stereoscopic panoramas

In a previous article I described how to display full-spherical stereoscopic images and videos in Unity.
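Returning briefly to the shift calculation mentioned above: for two eyes a distance e apart looking at a point at distance r, the angular disparity is 2*atan((e/2)/r). A minimal sketch (generic stereo geometry, not code from either article; the function and parameter names are mine):

```python
import math

def angular_disparity(eye_separation, distance):
    """Angular disparity (radians) between the left- and right-eye views
    of a point at `distance`, for eyes `eye_separation` apart."""
    return 2.0 * math.atan2(eye_separation / 2.0, distance)

def pixel_shift(eye_separation, distance, pano_width_px):
    """Horizontal shift in pixels in an equirectangular panorama whose
    full width spans 2*pi of longitude."""
    return angular_disparity(eye_separation, distance) / (2 * math.pi) * pano_width_px
```

At a typical 65 mm eye separation, a point 2 m away has a disparity of roughly 1.9 degrees, which is about 21 pixels in a 4096-pixel-wide panorama.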
In this article I'm going to start looking at how to create that content. One approach (described in detail on the excellent eleVR site) uses a rig consisting of multiple cameras to capture stereoscopic image pairs, which are then stitched together in software to create a pair of spherical images, one for the left eye and one for the right. The results can be quite good, but there are two fundamental problems with this approach: one technical, one practical. The technical problem has to do with the need for the cameras to have overlapping fields of view in order for the stitching software to work.

Full 360 stereoscopic video playback in Unity
For a recent project, I needed to implement playback of stereoscopic spherical videos in Unity.
There are certainly people doing this already (e.g. Whirligig), but I needed to integrate the code with an existing application, since we want to have 3D objects in the scene in addition to the stereoscopic video. Here's how I went about it.

Starting simple

We're going to start off with something much simpler: a standard photosphere. Photospheres, like many spherical videos, are stored using an equirectangular mapping.

Setting up the scene

We need to create a sphere surrounding the virtual camera in our scene, and put the image on the inside of that sphere. Now, some people find shaders intimidating. The shader we'll be using is the simplest one available, the "Unlit/Texture" shader.
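Conceptually, once the sphere is UV-mapped, the unlit shader is just performing an equirectangular texture lookup for each view direction. A toy sketch of that lookup on a plain 2D array (illustrative only; `equirect_lookup` and the toy image are mine, not Unity API):

```python
import math

def equirect_lookup(image, d):
    """Fetch the texel an equirectangular image stores for view
    direction d=(x, y, z), y up. `image` is rows x cols, with row 0
    at the bottom (v = 0)."""
    x, y, z = d
    r = math.sqrt(x * x + y * y + z * z)
    u = (0.5 + math.atan2(x, z) / (2 * math.pi)) % 1.0  # longitude -> [0, 1)
    v = 0.5 + math.asin(y / r) / math.pi                # latitude  -> [0, 1]
    rows, cols = len(image), len(image[0])
    col = min(int(u * cols), cols - 1)
    row = min(int(v * rows), rows - 1)
    return image[row][col]

# Toy 4x8 "panorama": each texel just records its own (row, col).
pano = [[(r, c) for c in range(8)] for r in range(4)]
print(equirect_lookup(pano, (0, 0, 1)))   # straight ahead -> (2, 4)
```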
Once we have our shader, the next step is to create a sphere. Now let's add support for the Rift. Hit Play and you should be able to look all around at your beautiful photosphere, reliving your vacation memories.

Implementing 360 Video in Unity for Gear VR and Cardboard – Immersive Blog

This entry will describe our efforts to get 360 video working in Unity for apps running on Gear VR and Cardboard (both iOS and Android).
This is more of a work in progress than a full guide, but I hope it helps. Feel free to leave questions or suggestions! The overall process is:

1. Start with the Google Cardboard camera or Oculus camera demo scene.
2. Add a sphere with an equirectangular UV mapping and inward-facing normals around the camera.
3. Purchase a plugin to play a movie on that sphere's texture. (Note: if you just want to run on Gear VR, you can adapt their movie example to play on the inside of a sphere.)

The most success so far has been achieved using the Easy Movie Texture plugin for Unity, currently $45. The best part of this plugin is that it comes with a demo scene including a sphere to play back equirectangular videos. I also tried prime31's iOS video plugin, LiveTexture, which was $75.
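The "sphere with an equirectangular UV mapping and inward facing normals" step can be sketched engine-agnostically. Here is a hedged Python version; in Unity you would assign these arrays to a Mesh, and all names here are mine:

```python
import math

def inward_sphere(stacks=16, slices=32, radius=10.0):
    """Build a lat/long sphere: vertex positions, equirectangular UVs,
    and triangle indices wound so the faces are visible from a camera
    at the center (exact winding convention depends on the engine)."""
    verts, uvs, tris = [], [], []
    for i in range(stacks + 1):
        v = i / stacks                      # 0 at the south pole, 1 at the north
        lat = (v - 0.5) * math.pi
        for j in range(slices + 1):
            u = j / slices                  # seam vertex duplicated at u=0 and u=1
            lon = (u - 0.5) * 2 * math.pi
            verts.append((radius * math.cos(lat) * math.sin(lon),
                          radius * math.sin(lat),
                          radius * math.cos(lat) * math.cos(lon)))
            uvs.append((u, v))
    for i in range(stacks):
        for j in range(slices):
            a = i * (slices + 1) + j
            b = a + slices + 1
            # Flip two indices relative to the usual outward-facing order
            # so that the inside of the sphere is the rendered side.
            tris += [a, a + 1, b, a + 1, b + 1, b]
    return verts, uvs, tris
```

With the defaults, this produces (16+1) * (32+1) = 561 vertices and 16 * 32 * 2 = 1024 triangles (3072 indices); the UVs line up with the equirectangular convention, so the texture needs no further transformation.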