CES: CubicSpace Demos Solution to a Persistent VR Problem

There’s a knotty problem in every VR device on the market, and it gives most people a headache or eyestrain after enough time in the headset: the distance between your eyes and the displays stays the same no matter how far away an object appears to be. At CES 2025 in Las Vegas this week, Canadian spatial media company CubicSpace demonstrated a software mitigation for this issue, showing us images on a standard 3D display and a stock Meta Quest 3, with a before-and-after comparison of the native rendering pipeline and output processed through its software.

Think of it like this: You can make yourself squint by staring at your outstretched finger and bringing it to your nose, because your eyes pivot inward (“vergence”) so you’re always looking directly at something no matter how far away it is. At the same time, muscles in your eyes deform your lenses to keep the object in focus (“accommodation”).
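The geometry behind vergence is simple to sketch: the angle between your eyes’ lines of sight grows sharply as an object approaches your nose. A minimal illustration (the 63 mm interpupillary distance is an assumed average, not a figure from the article):

```python
import math

def vergence_angle_deg(distance_m: float, ipd_m: float = 0.063) -> float:
    """Angle (degrees) between the two eyes' lines of sight when both
    fixate a point distance_m straight ahead. ipd_m is the spacing
    between the pupils (63 mm is an assumed typical value)."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# A finger near your nose demands far more vergence than a distant wall.
print(round(vergence_angle_deg(0.10), 1))  # finger ~10 cm away
print(round(vergence_angle_deg(1.80), 1))  # object at 1.8 m
print(round(vergence_angle_deg(10.0), 1))  # object at 10 m
```

The angle falls off roughly as the inverse of distance, which is why near objects stress the vergence system so much more than far ones.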

However, a VR headset’s displays never move: your eyes converge on an object at its apparent virtual distance, while your lenses must stay focused on the fixed-distance display. This mismatch is disorienting and can make our eyes hurt as our poor mammal brains try in vain to resolve this vergence-accommodation conflict (“VAC”). This is what CubicSpace aims to address.

Every VR device manufacturer using current in-market technology must choose how “far away” the displays will appear to the user. For example, the Apple Vision Pro has a fixed focal distance of 1.8 m (about 6 feet). Objects that purport to be at this distance are comfortable to view, which is why Apple put native visionOS UI elements there, and why you should position your virtual displays or movies there, too, when in VR.
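The conflict can be quantified in diopters (inverse meters): the difference between the focus demand at the virtual object’s distance and at the headset’s fixed focal plane. A rough sketch, using the 1.8 m figure cited above (the helper and its threshold framing are illustrative, not anything from CubicSpace):

```python
def vac_diopters(virtual_distance_m: float, focal_distance_m: float = 1.8) -> float:
    """Vergence-accommodation mismatch in diopters (1/m): difference
    between the focus demand of the virtual object distance and the
    headset's fixed focal plane (1.8 m, the figure cited for Vision Pro)."""
    return abs(1.0 / virtual_distance_m - 1.0 / focal_distance_m)

# An object rendered at the focal plane produces zero conflict...
print(vac_diopters(1.8))
# ...while one rendered 30 cm from your face produces a large one.
print(round(vac_diopters(0.3), 2))
```

Because diopters are inverse meters, content placed far beyond the focal plane adds less than 0.56 D of conflict at worst, while content pulled close to the face racks up conflict quickly, which matches the intuition that near virtual objects are the uncomfortable ones.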

CubicSpace’s founder confirmed VAC as one target of its technology. Images viewed using CubicSpace’s technology appear to be processed to reduce the distance range within the image, and to reposition the subject of an image in relation to the fixed focal distance of a given device.

This is done by analyzing the combination of capture and presentation hardware, as well as the image itself. The effect necessarily flattens an image somewhat, but makes a scene immediately more pleasant to view, while giving sufficient depth cues for the brain to reconstruct a depth map based on what is known about how far objects ‘should’ be from each other.

In other words, one quickly stops noticing the compression of distance range.

Until VAC is solved by technology like waveguides in VR hardware, one can imagine a solution such as CubicSpace’s becoming a switchable feature in VR operating systems. Any depth information lost may be a penalty worth paying for this level of increased comfort.
