With the introduction this year of the Oculus Rift, the Leap Motion and Google Glass, we seem to have entered yet another period of accelerated experimentation with user interfaces that have the potential to fundamentally change how people interact with digital devices.
While new devices launch all the time, they seldom lead to wholesale revolution — over the past 30 years, we’ve seen just two: from the command line/keyboard/typing paradigm to the GUI/mouse/click-and-drag paradigm, and more recently to the touchscreen/finger/swipe-and-pinch paradigm ubiquitous on mobile devices. Along the way there have been innovations that found niches or improved on the dominant method — scroll wheels on mice, click wheels on iPods, the 3D mouse, the laptop touchpad, the Wii Remote and the ThinkPad’s red nub, among others. So where will this latest crop of devices end up — among the game changers, or with the niche players?
At Söderhavet, understanding user interfaces and their contribution to the user experience is our bread and butter, so we are naturally excited to explore what the near future holds. Anything that promises to make the navigation of data more efficient and seamless is worth trying, so we’ve been playing intensively with the Oculus Rift and Leap Motion to see what potential they hold. Here on the blog I’ll take a look at each of these devices in turn, starting with the Oculus Rift in this post. (Other than the chance to put Google Glass on my face for a minute, I’ve not had the good fortune of having one at my disposal. In any case, a version ready for prime time may be years off.)
The Oculus Rift is a 3D head-mounted display (HMD) with head tracking, the first to be aimed at the mass market, developed with help from a breakout Kickstarter campaign. The prototype is now available to developers, who have been busy retrofitting existing games or developing entirely new apps for it. Having tried out a good many of these efforts, it’s time to draw a few conclusions, and to point out the best practices that make some of the apps stand out.
The main challenge to going mainstream: Motion sickness
Having watched 100-odd people put the Rift prototype on their faces for the first time over the past few months, I can report that the first two reactions are pretty much universal: gasps of amazement at the immersive quality of the experience, followed soon after by an onset of queasiness that compels them to rip the Rift off. Persistent use does not, as a rule, make the effect wear off, so this prototype, at least, has no chance of gaining mainstream adoption.
The most important question facing the Rift, then, is whether this motion sickness arises from limitations of the prototype, from the very nature of head-mounted displays, or from how the software constructs the user experience within it. The answer seems to be: All three. Let’s take each in turn.
Limitations of the prototype:
The display in the current prototype has a resolution of 1280 x 800 pixels, or 640 x 800 pixels per eye, and because these relatively few pixels have to fill the entire field of vision, they are large and conspicuous, breaking the immersive spell. The consumer version will have a resolution of 1920 x 1080 pixels, which should go some way to alleviating this issue, and there is no reason to believe display technology won’t improve until this problem goes away. These devices will one day be capable of providing persistent immersion.
The nature of HMDs:
Head tracking in the Rift is surprisingly sensitive, and as the best apps prove, the system can work with negligible lag, so that the response to turning one’s head can feel instantaneous. What HMDs cannot do, however, is turn in-Rift perceptions of acceleration into real-world effects. Experiences in-Rift that involve high and variable acceleration — dodging bullets in a first-person shooter or riding a roller coaster — fool the brain into expecting these same forces on the physical body, except that they never appear.
The converse of this effect is finding yourself inside a sailboat cabin on choppy seas: your eyes tell you your surroundings are stationary, but your body is getting a very different message; cue the vomit bucket.
This limitation suggests that HMDs will never do well in simulated high and variable acceleration environments. But there is no reason to suppose the opposite can’t be true: While this Oculus Rift does not record the user’s real-world positional data, future models or hacks could. Imagine wearing a wireless HMD while walking around in an empty room, and having the in-Rift experience mirror your movements in a simulated dungeon. Here lie some real opportunities for jaw-dropping uses, especially if the Rift gets to be aware of other nearby users.
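To make the idea concrete, here is a minimal sketch of how such a positional hack might map tracked movement onto a virtual scene. This is purely illustrative: the devkit Rift reports only orientation, so the position here would have to come from a hypothetical external tracker, and all names are made up.

```python
# Hypothetical sketch: mapping a head position tracked in a real room
# (metres, room frame) onto a virtual scene. A 1:1 scale keeps visual and
# vestibular motion in agreement, which is exactly what avoids the
# motion-sickness mismatch described above.

def room_to_world(tracker_pos, room_origin, world_origin, scale=1.0):
    """Translate the tracked position into the room frame, then scale and
    offset it into virtual-world coordinates."""
    x, y, z = (p - o for p, o in zip(tracker_pos, room_origin))
    wx, wy, wz = world_origin
    return (wx + x * scale, wy + y * scale, wz + z * scale)

# Walking 2 m forward in the room moves the viewpoint 2 m forward in the
# simulated dungeon, wherever that dungeon's origin happens to sit.
pose = room_to_world((2.0, 1.0, 0.0), (0.0, 0.0, 0.0), (10.0, 0.0, 0.0))
```

With multiple tracked users sharing one coordinate transform, each could appear at the correct spot in the other’s view — the “aware of other nearby users” scenario above.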
One caveat: When demoing the Rift at Stockholm Makerspace a few weeks ago, a dad held on to his 4-year-old as he squealed with delight through a roller coaster simulation for over 15 minutes, to no apparent ill effect. That kid had absolutely no qualms about the high Gs. I suspect his immunity has something to do with how the brain develops, and if the future sees children immersed digitally at an early age, then the motion sickness issue may be moot.
How the software constructs the user experience:
Software is the most malleable of all the inputs that shape the Rift experience, so it is here that the most can be learned today, including from the failures. The sheer variety of what developers have come up with makes some clear best practices evident, including some unexpected ones:
1. Cockpits, not FPSs: The killer app for HMDs may not be first-person shooters (FPSs) like Team Fortress 2 or Half-Life 2, but cockpit experiences. A stationary frame of reference around the field of vision increases the sensation that the user is stationary while the world moves around her, mitigating the dissonance between the virtual and the real. In conventional 2D gameplay, the screen itself provides this stationary frame of reference, albeit at the cost of the immersive experience.
A similar issue exists when navigating Earth in Google Earth. When Earth is far away, it has the semblance of an object being manipulated by a stationary user. But if you zoom in and start flying through the Grand Canyon, at some point the sensation switches to that of a stationary planet and a moving viewpoint. From within a Rift, Earth as an object would not generate simulated Gs, but Earth as an environment suddenly would. It is all in the mind, ultimately.
2. Low-G experiences, please: Even without a cockpit as a stationary frame of reference, the Rift generates less motion sickness if the experience avoids simulated acceleration effects. Some of the gentler apps out there today have you simply floating — or falling — at a constant velocity.
3. Don’t mix inputs: In Rift-enabled FPSs like Half-Life 2 and Team Fortress 2, you can look around either by turning your head or by moving your mouse. Inevitably, you use both at once, and this is a clear recipe for motion sickness. Some games built natively for the Rift have come up with better solutions: head movements only change your direction of view, while another input (an arrow key, the mouse, or another device) determines your movement or where you aim your gun or flashlight.
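The decoupling in point 3 can be sketched in a few lines. This is an illustrative reconstruction, not any particular game’s code; the function and parameter names are assumptions.

```python
import math

# Sketch of decoupled inputs: head orientation drives the view only, while a
# separate input (keys or gamepad) drives movement relative to the body.
# Looking around with your head never changes where you walk, so the two
# control channels cannot fight each other.

def step(body_pos, body_yaw, head_yaw, move_input, speed=0.1):
    """body_pos: (x, y, z); yaws in radians; move_input: (forward, strafe).
    Returns the new position and the yaw to render the view with."""
    fwd, strafe = move_input
    dx = (fwd * math.sin(body_yaw) + strafe * math.cos(body_yaw)) * speed
    dz = (fwd * math.cos(body_yaw) - strafe * math.sin(body_yaw)) * speed
    new_pos = (body_pos[0] + dx, body_pos[1], body_pos[2] + dz)
    view_yaw = body_yaw + head_yaw  # view = body + head; movement = body only
    return new_pos, view_yaw
```

Here, pressing “forward” while looking 90 degrees to the side still moves you along the body’s heading, which is the behaviour the natively built games get right.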
4. Avatars are a good thing: This one surprised me — immersive experiences where you look down to find yourself disembodied are deeply unsettling. This no doubt relates to the deep-seated phantom limb effect; and just as in real life, it can be treated with a mirror box — or, in the Rift’s case, an avatar. No matter how crude, its presence seems to make all the difference.
5. Head-tracked menu navigation: Almost all apps present in-world menu options that appear to float in front of the user, often on semi-translucent surfaces. This seems like a natural fit, and works well. Several apps let you navigate these menus through head tracking, moving a cursor at the center of your field of view to highlight a menu item. But what happens next matters: selecting an option by pressing a key works far better than time-based selection, where staring at an option for a few seconds selects it. Dwell-based selection is inefficient both ways: if you read quickly you waste time waiting, and if you read slowly you must avoid resting your gaze on an item while still reading it.
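A minimal sketch of the key-confirmed gaze menu from point 5, under assumed names and a simplified one-axis layout (items occupying yaw ranges in front of the user):

```python
# Sketch of key-confirmed gaze selection: the item under the view-centre
# cursor is merely highlighted; nothing is selected until a button is
# pressed. No dwell timer, so reading speed never triggers a selection.

def gaze_menu(head_yaw_deg, items, confirm_pressed):
    """items: list of (label, yaw_min_deg, yaw_max_deg) tuples describing
    where each floating menu item sits. Returns (highlighted, selected),
    where selected is None unless the confirm button is down."""
    highlighted = None
    for label, lo, hi in items:
        if lo <= head_yaw_deg <= hi:
            highlighted = label
            break
    selected = highlighted if confirm_pressed else None
    return highlighted, selected

menu = [("Start", -10.0, 10.0), ("Quit", 20.0, 40.0)]
```

Looking at “Quit” only highlights it; pressing the confirm key while it is highlighted selects it. A dwell-timer variant would instead select after a fixed stare duration, with the drawbacks described above.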
Favorite Rift-enabled apps:
Our favorites fall into three categories: Roller coaster simulations, space simulations, and “other”, for want of a better term.
– Roller coasters: These four roller coaster simulations are clearly technical demonstrations — you can’t do anything but look around — but for introducing newbies to Oculus Rift, you can’t go wrong with any of these.
Surprisingly, it’s difficult to pick a clear winner. The Windows-only Rift Coaster remains a popular favorite because of its engaging medieval surroundings, even though the user perspective does not tilt into the curves, as you would expect it to. The Tornado roller coaster does do this, and has a great cockpit to boot, but its detailed graphics lead to a low frame rate and some lag. The ParrotCoaster strikes a better balance between detail and smoothness, while the Cobra roller coaster has fantastic surroundings and a good ghost body to boot.
– Space simulations: The most impressive app of all the ones we’ve tried, in any genre, is Titans of Space, a 15-minute guided tour of the solar system showing the relative sizes of the planets and stars. Titans of Space manages to do something special: You gain an intuitive understanding of spatial information that is unique to the medium and impossible to grasp any other way. It’s a revelation, and an indication that HMDs have a bright future as an immersive educational tool.
Blue Marble, meanwhile, aims more for a realistic portrayal of Earth from space, and the end result is a calm and beautiful if passive demo. First Law is a proper space cockpit game; head-tracking dictates your view while other controls steer the spaceship. The game manages to minimize motion sickness both through the deft use of the cockpit as a stationary frame of reference and by avoiding heavy Gs.
In Spacewalk, you’re an astronaut on an EVA, where the helmet becomes your viewport and your space suit your avatar. Spacewalk strives for and delivers realism, with the controls delivering small incremental changes to direction and speed. This game above all is one I can’t wait to see in higher resolution. It is also remarkably nausea-free.
– Other category: Several other genres are promising. Falling games like SkyDIEving focus on high-speed low-G experiences, while perhaps the easiest app of all for a newbie to try is Eden River, a relaxing skim across a pond. For a proper Matrix-like experience, try Ciess, a game where you try to hack a network through both strategy and old-school arcade gameplay.
And finally, there’s Alone in the Rift, a short horror romp in a forest at night where scarce visual cues are coupled to a rich and haunting soundscape for maximal effect.
The technology will get better, and software developers will learn what works best natively, but will that be enough to lift future versions of the Oculus Rift from niche product to ubiquitous immersive device? I don’t think the Rift will soon become the gear of choice for traditional first-person shooters, simply because it makes them harder to play, but it will likely become a gateway to entirely new genres of natively produced games, to immersive educational experiences, and no doubt to more risqué adult entertainment as well.
If you’re curious about the Oculus Rift and find yourself in Stockholm, feel free to drop by Söderhavet to give it a try. You can find us here.