HTC Demonstrates Virtual Reality at the University of Illinois

Mar 02, 2016, by Ben Fineman
Tags: Metaverse Working Group

Last week, I was fortunate to visit the National Center for Supercomputing Applications (NCSA) at the University of Illinois to participate in the traveling national demo of the HTC Vive. This device is a stereoscopic head-mounted display – a “virtual reality headset” – and a competitor of the Oculus Rift. What differentiates this platform from the Rift is its focus on room-scale experiences. Sensors in the corners of the room enable users to walk around and explore environments mapped into the actual physical space, with virtual walls that appear when you get too close to the real ones (to prevent bonked heads and stubbed toes).
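The "virtual walls" idea is simple to reason about: the system knows where the headset is and where the physical walls are, so it fades a warning grid in as you approach a boundary. Here is a minimal sketch of that idea (not the Vive's actual implementation – the function and parameter names are hypothetical, and a real system tracks an irregular boundary polygon rather than a rectangle):

```python
def boundary_alpha(pos, room_w, room_d, warn_dist=0.5):
    """Return opacity (0..1) for a virtual wall grid, based on how close
    the tracked headset position is to the nearest physical wall.

    pos: (x, z) headset position in meters, with one room corner at (0, 0).
    room_w, room_d: room width and depth in meters.
    warn_dist: distance at which the grid starts fading in.
    """
    x, z = pos
    # Distance to the nearest of the four walls of the rectangular room.
    nearest = min(x, room_w - x, z, room_d - z)
    if nearest >= warn_dist:
        return 0.0  # far from every wall: grid stays invisible
    # Fade the grid in linearly as the user approaches a wall.
    return 1.0 - max(nearest, 0.0) / warn_dist

# Standing in the middle of a 4 m x 3 m room: no warning.
print(boundary_alpha((2.0, 1.5), 4.0, 3.0))   # 0.0
# 0.25 m from a wall: grid at half opacity.
print(boundary_alpha((0.25, 1.5), 4.0, 3.0))  # 0.5
```

Rendering the grid only near boundaries preserves immersion in the middle of the space while still preventing the bonked heads and stubbed toes mentioned above.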

Many have posted detailed comments on the demo itself, so I want to share my personal experience of the demonstration. The Vive’s visual experience seemed similar to the Rift Developer Kit 2. I did feel a bit cross-eyed wearing the device, since I wasn't able to adjust it much. Both the Vive and the Rift have similarly reasonable fields of view, sharpness, and pixel density. If anything, the visibility of pixels (the “screen door effect”) on the Vive seemed improved versus the DK2 version of the Rift. But while the devices were similar in some ways, they also differed. The experiential factors of the Vive were better, in my opinion, based on two main criteria: motion tracking controllers and room-scale tracking.

Motion tracking controllers

Of course, the Rift will eventually have controllers too, but since I haven’t been able to try them yet, I can’t compare. Suffice it to say that motion tracking controllers are critical for moving toward immersion. I’ve led several hundred Rift demos, and in the vast majority of them, the user will try to reach out and touch something in the demo. Of course, they can’t, and the immersion is broken in a fundamental way. Compare this with a controller that maps your hand into virtual space – even with the abstraction of using a controller rather than your bare hand to interact, the difference is enormous. Being able to reach out and pick up a virtual cup of coffee in the office demo (and one or two virtual doughnuts) helped me feel more immersed in the Vive than in any Rift demo to date.

Room scale tracking

Room-scale tracking is the ability to walk around in a virtual space, instead of being confined to sitting in a chair or standing in place. The Vive enables this through the use of multiple tracking cameras in the corners of the room, as opposed to the Rift’s single tracking camera. This small implementation difference yields a vast experiential improvement. For the first time, I was able to wander around in the environment without needing to use a controller to artificially move myself (which makes me motion sick every time I try it). Additionally, the tracking on the Vive was rock solid – with the Rift, there is constant concern about blocking the view of the single camera and breaking the motion tracking – the Vive and its multiple cameras make this issue disappear. Certainly challenges remain, especially in terms of how to traverse distances greater than your actual physical space, but people are working on it, and this approach is absolutely a step in the right direction. The big remaining challenge is that it needs to be wireless: getting tangled in cords is no fun. VR will go wireless in time; it just isn’t there yet.
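On traversing virtual spaces larger than your physical room: one common approach in room-scale titles is teleport locomotion, where the tracked position within the room is combined with a movable virtual origin, and "teleporting" simply relocates that origin. A minimal sketch of the idea (function names are hypothetical, and positions are simplified to 2D (x, z) tuples):

```python
def virtual_position(physical_pos, virtual_origin):
    """Combine room-scale tracking with teleport locomotion.

    physical_pos: (x, z) tracked position within the physical room.
    virtual_origin: (x, z) where the physical room's corner currently
    sits in the larger virtual world; teleporting moves this origin.
    """
    return (physical_pos[0] + virtual_origin[0],
            physical_pos[1] + virtual_origin[1])

def teleport(target, physical_pos):
    """Return a new origin so the user's current physical spot lands on
    `target` in the virtual world."""
    return (target[0] - physical_pos[0], target[1] - physical_pos[1])

origin = (0.0, 0.0)
pos = (1.0, 2.0)                        # user walks within the room
print(virtual_position(pos, origin))    # (1.0, 2.0)
origin = teleport((50.0, 50.0), pos)    # point-and-click jump
print(virtual_position(pos, origin))    # (50.0, 50.0)
```

The appeal of this design is that all local movement is still driven by real walking (which avoids the motion sickness of artificial joystick movement), while the teleport jumps cover the distances the physical room can't.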

Ben Fineman demoing a virtual reality system

I left this demonstration feeling excited and more optimistic than ever about the potential for virtual reality to revolutionize the way we communicate and collaborate. Video conferencing and telepresence have long endeavored to make us feel like we are together in the same room ("immersion"), but taking an experience to three dimensions enables both levels of immersion and modes of interaction that just aren’t possible with traditional telepresence. It’s not much of a leap to take computer-controlled avatars in a shared virtual space and hand control of them to remote collaborators – and universities are already experimenting with this to deliver distance learning, among many other educational applications. NCSA itself is well positioned to be a leading content provider for VR: it has already shown us how to create amazing content designed for 180-degree dome viewing in planetariums, and making the leap to full 360-degree VR is a logical next step.

The Vive will be continuing its higher ed tour over the coming weeks, so please contact me and share your impressions if you take part in a demo. Join our Metaverse Working Group to get updates on university stops and participate in the conversation around virtual reality in research and education.