Integrating The OmniWear Cap With An Oculus
One of the central challenges for VR/AR companies is making a usable, compelling product once the initial novelty of the technology has worn off. We’ve been playing quite a bit with the Oculus Dev Kit 2 specifically, working on integrating it with our haptic feedback device to provide as unified an experience as possible. I made some notes as we got started, to make sure I captured my first impressions of the Oculus as a serious product (rather than something worn in a rather contrived demo). Here are the results…
The main observation is one that I bet most consumers haven’t grokked yet: just how powerful a machine you need to operate the device. Oculus recommends an Nvidia GTX 970, an Intel i5-4590 processor, and at least 8GB of RAM. That is a very powerful set-up. Even a moderately enthusiastic gamer like me, it turns out, lacks the horsepower to adequately run the DK2 – I thought I had a reasonable machine with a 750 Ti and tons of RAM…but I’m afraid not. And by “adequately” I mean at the recommended 90 frames per second, which the 750 Ti typically won’t reach. Even though I was seeing framerates of around 75, I still felt queasy after wearing the device for 30 minutes or more. It’s different from the olden days, when you could just live with sub-30 framerates if your machine wasn’t powerful enough – I think you’d actually have difficulty using the device at all without the full 90 FPS…it might be just too uncomfortable. Turning my head quickly back and forth created the sensation of…ummmmm…being rather intoxicated – I was seeing double images of everything!
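To put that 90 FPS target in perspective, it helps to look at the per-frame time budget. Any frame that takes longer than the budget gets dropped or reshown, which in an HMD reads as the double-image judder I described, not the mild stutter you’d shrug off on a monitor. A quick back-of-the-envelope sketch (plain Python, nothing Oculus-specific):

```python
# Per-frame rendering budget at various refresh targets. In an HMD,
# blowing the budget means a dropped or repeated frame, which the eye
# perceives as judder/double images when the head is moving.
def frame_budget_ms(fps):
    """Milliseconds available to render one frame at the given rate."""
    return 1000.0 / fps

for fps in (30, 60, 75, 90):
    print(f"{fps:3d} FPS -> {frame_budget_ms(fps):5.2f} ms per frame")
```

At 90 FPS the whole pipeline gets barely 11 ms per frame, versus 33 ms at the old 30 FPS comfort floor – which is a big part of why the recommended GPU is so beefy.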
I decided to bite the bullet and buy a 980 and an upgraded processor, but together these run around $700. So my prediction is that there will be many, many enthusiasts out there who will think they can get away with an Oculus and less-than-stellar hardware – I just don’t think this will be the case. It will be interesting to see how this plays out over time, though happily processing power has a wonderful tendency to get cheaper! Maybe it will be like Tesla’s first car – it cost over $100,000 and only appealed to the wealthy or true believers, but it was a completely viable business strategy for them to get a product quickly into the hands of enthusiastic consumers, and then to move downmarket.
Beyond the rather stringent hardware requirements, in the midst of the extensive coverage and excitement about VR, it’s easy to forget that every consumer-grade VR product out there is still seriously young. There were a number of idiosyncrasies we worked through to get the device up and running. For example, which runtime version is installed makes a massive difference, and many versions seemed to break at least some compatibility with previous ones. I expect this will be solved with time – it’s just so very early in the life of this technology.
As for getting the right version of the runtime: I downloaded 10 example projects from share.oculusvr.com, and of these, only two worked properly right out of the box. The others didn’t launch at all, simply responding with an error message. Again: immature product. I have every confidence this will be solved with time.
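The broken demos are really just a version-compatibility problem, and the fix demos could ship with is a simple range check before launching. Here’s a sketch of that idea – the version numbers below are purely hypothetical, not actual Oculus runtime releases:

```python
# Sketch of a runtime-compatibility guard a demo could run at startup.
# Versions are compared as tuples of integers so that "0.10.0" correctly
# sorts after "0.9.0" (naive string comparison would get this wrong).
def parse_version(s):
    """'0.4.2' -> (0, 4, 2), so versions compare numerically."""
    return tuple(int(part) for part in s.split("."))

def runtime_compatible(installed, minimum, maximum):
    """True if the installed runtime falls within the supported range."""
    return parse_version(minimum) <= parse_version(installed) <= parse_version(maximum)

# Hypothetical version numbers, for illustration only:
print(runtime_compatible("0.4.2", "0.4.0", "0.4.4"))  # within range
print(runtime_compatible("0.5.0", "0.4.0", "0.4.4"))  # too new
```

A check like this would at least let a demo fail with “this needs runtime X–Y” instead of a cryptic error message.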
As for the experience itself, for me it’s a bit like sitting waaaaay too close to a TV. You can definitely see each pixel, which initially demands your focus, but as you get used to it and the overall image resolves, your eyes are eventually tricked into focusing on the larger picture. The trick certainly isn’t perfect, and it made me feel off from time to time. Not motion sickness so much as just kind of…yucky. I think this may largely be solved by a better GPU…so we’ll see.
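That sitting-too-close-to-a-TV feeling comes down to angular pixel density. The DK2’s 1920×1080 panel is split between two eyes, so each eye gets roughly 960 horizontal pixels; assuming the commonly quoted ~100° field of view (an approximation – lens distortion means pixels aren’t actually spread evenly), the arithmetic looks like this:

```python
# Rough angular pixel density for an HMD. Assumes pixels are spread
# evenly across the field of view, which ignores lens distortion but
# is close enough to explain why the pixel grid is visible.
def pixels_per_degree(horizontal_pixels, fov_degrees):
    return horizontal_pixels / fov_degrees

# DK2: 1920x1080 panel shared across two eyes -> ~960 px per eye,
# ~100 degrees FOV (assumed, commonly quoted figure).
dk2 = pixels_per_degree(960, 100)
print(f"DK2: ~{dk2:.1f} px/deg, versus ~60 px/deg often cited for 20/20 acuity")
```

Under 10 pixels per degree against the ~60 often cited for normal acuity goes a long way toward explaining the visible pixels.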
That said, there were a few demos that worked right off the bat, and they were great fun that really showcased the tech. The coolest one I played was a racing game, placing me squarely in the driver’s seat of a high-performance race car. A quick glance up revealed my rearview mirror, nicely simulating its real-world function without taking away from what’s going on in front of the windshield.
My time with the Oculus was fun, but you’ve got to reset your expectations (at least for now) for visual fidelity. Now that we’re used to seeing absolutely gorgeous high-def visuals, it’s kind of a bummer to see the pixelated graphics. It’s a bit like playing Gran Turismo 2. I’m sure this will be fixed as display technology improves, if it’s not already fixed for the consumer edition.
After trying out these projects, it’s clear that there’s a steep learning curve and an art to creating good VR experiences. There are so many considerations with the new tech – how to prevent motion sickness, how to guide a user to look where they’re supposed to, and volumes of aspects that haven’t occurred to us yet.
And one more thing – wearing an HMD is a deeply isolating experience. You lose all sense of what’s going on around you, which is a bit spooky. There were window washers at our place yesterday, and they could have easily come and gone, looking at the uncoordinated lunatic wearing the HMD and headphones…and I never would have known! Perhaps I’ll just double-bolt the doors and try not to think too much about it…
I think of it this way: we’ve been able to believably replace audio for a while now. A good pair of headphones can instantly transport you somewhere else and replace sounds in a truly authentic way. We’ve still got a ways to go, but soon, HMDs may be able to do the same for the eyes, showing us something artificial, but pretty damn believable just the same.
So as far as we’re concerned, that just leaves touch to virtually re-create. Well, there’s still taste and smell, but I’ll let some other pioneer handle those. I don’t think we’ll ever be able to create a believable touch experience for people, outside of a few highly constrained examples, but there’s a huge amount of empty space to play with along that road. I’ll continue to post on our experiments as we go. My next big task is to create a unified OmniWear-Oculus application in Unity…