Omnidirectional stereo video

Jump uses a special projection model called omnidirectional stereo (ODS), which produces seamless stereo in all directions (except directly above and below the camera). The result looks just like standard 360° video, so it can be easily edited, streamed, and displayed.

What’s so hard about stereo 360° video?

Our brains achieve stereo vision by fusing the two slightly different images seen by our eyes. However, achieving stereo in 360° video is not as simple as placing two 360° cameras side by side, an eye-width apart. An obvious problem is that the cameras would see each other. The deeper problem is that your eyes move through space as you turn your head, so every head orientation needs its own pair of views. This is exactly how VR works for real-time CG environments: a fresh pair of views is rendered for whatever direction you are looking.

Now imagine you could capture a stereo video for every head orientation, say one pair for every degree. That would be a lot of videos! But if you’re willing to tolerate a small amount of distortion, you can actually create a pair of 2D omnidirectional images which achieve nearly the same effect.

ODS projection

Instead of capturing a full image of rays at each camera position, imagine capturing only the central ray of each camera and borrowing the remaining ray directions from the other cameras. The ray for a given pixel in the left eye's image is actually the central ray captured by a camera counter-clockwise along the circle. In this case, the ray geometry for the left and right eyes looks like this:

...which, when extended to be omnidirectional, looks like this:
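
To make this geometry concrete, here is a minimal sketch of the ODS ray model. The coordinate conventions and the viewing-circle radius (taken here to be half of a typical interpupillary distance) are assumptions of the sketch, not Jump specifications: every ray originates on a small circle and points tangentially to it, tilted up or down by the elevation angle.

```python
import numpy as np

def ods_ray(theta, phi, eye, radius=0.032):
    """Viewing ray for panorama azimuth theta and elevation phi (radians).

    Returns (origin, direction). `radius` is the viewing-circle radius,
    assumed here to be half a typical interpupillary distance (~6.4 cm).
    Conventions: z is up, the viewing circle lies in the x-y plane, and
    azimuth theta = 0 looks along +x.
    """
    # All rays that share an azimuth also share an origin: a point on the
    # viewing circle offset sideways from the viewing direction (to the
    # left for the left eye, to the right for the right eye).
    side = 1.0 if eye == "left" else -1.0
    origin = radius * np.array([-side * np.sin(theta),
                                 side * np.cos(theta),
                                 0.0])
    # The direction is tangent to the circle at that origin, tilted up or
    # down by the elevation angle.
    direction = np.array([np.cos(phi) * np.cos(theta),
                          np.cos(phi) * np.sin(theta),
                          np.sin(phi)])
    return origin, direction

# Example: the two rays for the pixel looking along +x at the horizon.
left_origin, left_dir = ods_ray(0.0, 0.0, "left")
right_origin, right_dir = ods_ray(0.0, 0.0, "right")
```

For any single viewing direction, the two ray origins are one eye-width apart and perpendicular to that direction, just like a real pair of eyes; the distortions described below come from the fact that these origins slide around the circle as the viewing direction changes.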

If you compare the ODS projection to a Jump camera, you can see that some of these viewing rays are captured exactly by the physical cameras in the rig. The rest of the rays fall in between two physical cameras. The Jump Assembler synthesizes these missing rays using a computer vision technique called view interpolation.
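
The Assembler's view interpolation itself is beyond the scope of this overview, but the first step of any such scheme is easy to show: for each missing ODS ray, find the two physical cameras whose viewpoints bracket it, along with a weight saying how far between them the ray falls. The camera count and the evenly spaced layout below are assumptions made for illustration, not properties of any particular rig.

```python
import math

def bracketing_cameras(crossing_azimuth, num_cameras=16):
    """Pick the two rig cameras that bracket a missing ODS ray.

    `crossing_azimuth` is the azimuth (radians) at which the desired ray
    passes through the ring of physical cameras; `num_cameras` is an
    assumed, evenly spaced camera count, with camera i sitting at azimuth
    i * 2*pi / num_cameras. Returns (index_a, index_b, weight), where
    weight = 0 means the ray lines up with camera index_a and weight = 1
    means it lines up with camera index_b. A real view-interpolation
    scheme warps the two neighbouring images toward the missing viewpoint
    before combining them, rather than simply cross-fading pixels.
    """
    spacing = 2.0 * math.pi / num_cameras
    frac = (crossing_azimuth % (2.0 * math.pi)) / spacing
    index_a = int(frac) % num_cameras      # camera just below the crossing azimuth
    index_b = (index_a + 1) % num_cameras  # its neighbour on the other side
    weight = frac - int(frac)              # fractional position between the two
    return index_a, index_b, weight
```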

Limits and distortions

Of course, ODS does not exactly reproduce the image seen by the eye. This is because instead of all rays originating at a single point, each ray originates from a slightly different point on the capture circle. These distortions are typically negligible, but for very close objects, the following may be observed:

  • Line bending--straight lines in the real world may appear slightly curved in the image (see the sketch after this list).
  • Vertical disparity--objects may project to slightly different y coordinates in the left and right eyes. If this effect is severe, our visual system will not fuse the two images.
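
To get a feel for the line-bending effect, the toy calculation below projects points along a nearby straight edge into an ODS panorama and into an ordinary single-viewpoint (mono) panorama, then prints the two elevations. The projection formulas follow from the tangent-ray geometry above; the viewing-circle radius, edge position, and edge height are all made-up numbers.

```python
import math

r = 0.032       # assumed ODS viewing-circle radius (half an IPD), meters
edge_x = 0.15   # straight horizontal edge 15 cm in front of the rig (made up)
edge_z = 0.10   # ...running 10 cm above the horizon (made up)

for y in (0.0, 0.1, 0.2, 0.4):
    d = math.hypot(edge_x, y)        # horizontal distance to this point on the edge
    alpha = math.atan2(y, edge_x)    # azimuth of the point itself
    # Mono panorama: every ray starts at the rig center.
    elev_mono = math.degrees(math.atan2(edge_z, d))
    # ODS (left eye): the ray starts on the viewing circle and is tangent to it,
    # so the horizontal distance along the ray is sqrt(d^2 - r^2) and the
    # panorama azimuth is shifted by asin(r/d).
    azim_ods = math.degrees(alpha - math.asin(r / d))
    elev_ods = math.degrees(math.atan2(edge_z, math.sqrt(d * d - r * r)))
    print(f"y={y:.1f} m  mono elev {elev_mono:5.2f} deg, "
          f"ODS elev {elev_ods:5.2f} deg, ODS azim {azim_ods:7.2f} deg")
```

A mono panorama re-renders to correct perspective views (straight lines stay straight), so the offset between the two elevations is the part that shows up as bending. With these made-up numbers the offset is about 0.6 degrees where the edge passes closest to the rig and only a few hundredths of a degree at the far end, so the edge picks up a slight extra curve; at a meter or more it is negligible.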

Another issue arises when you consider off-horizon rays. For these rays, the difference between the ODS ray origin and the physical camera origin can create a vertical stretching artifact for very close objects as shown here:

The object fills 30 degrees of the physical camera’s field of view. The same object would have to be taller to fill the same field of view in the ODS projection.
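
The size of this effect falls off quickly with distance. The toy numbers below (an assumed rig radius, viewing-circle radius, and object size, none of them Jump specifications) compare the vertical angle an object subtends from the physical camera with the angle it subtends from the ODS ray origin, for an object directly in front of one camera:

```python
import math

rig_radius = 0.14     # assumed radius of the ring of physical cameras, meters
ods_radius = 0.032    # assumed ODS viewing-circle radius (half an IPD), meters
object_height = 0.4   # assumed object height, centered at camera height, meters

for dist in (0.5, 1.0, 3.0):   # object distance from the rig center, meters
    # The physical camera sits on the rig circle, between the center and the object.
    d_cam = dist - rig_radius
    # The ODS ray origin is a tangent point on the (much smaller) viewing circle,
    # so its horizontal distance to the object is sqrt(dist^2 - r^2).
    d_ods = math.sqrt(dist**2 - ods_radius**2)
    fov_cam = 2.0 * math.degrees(math.atan(object_height / 2.0 / d_cam))
    fov_ods = 2.0 * math.degrees(math.atan(object_height / 2.0 / d_ods))
    print(f"{dist:.1f} m: {fov_cam:5.1f} deg from the camera, "
          f"{fov_ods:5.1f} deg along the ODS ray")
```

With these numbers the two angles differ substantially at half a meter (roughly 58 versus 44 degrees) but are less than half a degree apart at 3 meters, consistent with the rule of thumb below.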

Furthermore, very close objects may cause our view interpolation algorithms to fail, resulting in ghosting artifacts.

The severity of these artifacts depends on a number of factors, including the size of the Jump camera and the complexity of the scene. As a rule of thumb, we recommend keeping objects of interest at least 1 meter from the camera for best results.