The reality of virtual reality is a lot closer to the fictional world of “Ready Player One” than most people realize — and much of VR’s future is being imagined and created at the University of Utah.
It’s not just about games, like the immersive environment imagined in Steven Spielberg’s new movie (opening in theaters nationwide Thursday) or in the 2011 Ernest Cline novel on which it’s based. The programmers, developers and artists in the U.’s Entertainment Arts and Engineering (EAE) program are also using the lessons learned from games to create apps with real-world effects.
In a basement at the Spencer Eccles Health Library, part of the U.’s medical school, sits the EAE’s Therapeutic Games and Apps Laboratory — known as The GApp Lab — where student game developers work with medical students to create applications for patients and professionals in training.
In a long room filled with desks and computer monitors, Alex Pedersen dons a set of Oculus Rift goggles to demonstrate the Virtual Home Simulator, or VHS. Pedersen, the producer on a team that includes programmers, engineers and an artist, shows how the simulator can give would-be social workers a taste of navigating a home visit.
“Nothing can really prepare people studying to become social workers for what they’ll encounter in the field,” said Jesse Ferraro, a project manager at The GApp Lab. Rookies entering a home for the first time, he added, “often have a hard time distinguishing the good things from the bad — they only see the bad.”
In the simulation, the trainee enters a home, each room a staged setting photographed with a 360-degree VR camera rig. Looking around, the trainee can highlight different areas in each room to red-flag problems — a cracked TV in the living room, for example, or a broken liquor bottle in the bathroom sink. The trainee can also highlight positive signs, such as an abundance of children’s toys.
A trainee can click photos of problem areas, and record verbal notes for later reference. The program records what positive and negative things the trainee has spotted, and after the simulation lets the trainee know what he or she missed.
The team is working on other simulations — a divorced dad’s bachelor pad, or a courthouse — to give future social workers a fuller range of experiences.
At the other end of the room, producer Alanna Carroll is developing another app. Her Virtual Medical Records app gives patients at risk of diabetes a visual metaphor for blood-sugar levels, allowing them to stand on the rooftops of buildings that represent healthy and unhealthy levels.
“It gives them an educational and an empathy experience,” Carroll said.
A third project being developed at The GApp Lab, said the lab’s director, Roger Altizer, is designed to help children on the autism spectrum. It’s a musical VR game, developed in conjunction with an autism researcher and a dancer, that will help kids on the spectrum self-soothe.
The history of virtual reality has its roots at the University of Utah, Altizer said. Ivan Sutherland, one of the original professors in the U.’s landmark computer graphics program, in 1968 demonstrated a head-mounted display — called “The Sword of Damocles” — that projected a 3-D computer image over the user’s view of the real world. It was “augmented reality” before the term was invented for it.
Sutherland and his mentor, David Evans, left the U. in 1968 to form Evans & Sutherland, one of the pioneering companies in computer graphics. Their students at the U. included Jim Clark (who founded Silicon Graphics and Netscape), John Warnock (who co-founded Adobe) and Ed Catmull (who co-founded Pixar).
Altizer, who is also associate director of the U.’s EAE program, said current VR technology is closing in on that immersive experience depicted in “Ready Player One.”
“We have a lot of momentum with displays,” Altizer said. “We’re not quite there, but we’re figuring out the video part really well. And we’re figuring out the audio part really well. … The hard part is the haptics. How do you make it feel like something’s in your hand? Or a suit that makes it feel like it’s on your body?”
Altizer cited a Utah company, Tactical Haptics, founded by a former U. of U. researcher, that is developing wearable technology that would tug slightly on the skin to create the illusion of holding something.
“If you don’t see it happening in your hand … the mind will assume that it’s weight,” Altizer said. “If it stretches the skin the right way, the mind is fooled.”
But technology alone won’t make an experience fully immersive, Altizer said.
“Good books really draw people in,” Altizer said. “You hear people talk about, ‘I felt like I was there when I was reading this book.’ And the graphics on books suck. It hasn’t changed in hundreds of years. It’s just print on a page. … The immersive part of it is in the storytelling, and the language that is used.”
In virtual reality, Altizer said, “nobody knows how to do that just yet. We’re porting those three things — storytelling from books, cinematography from movies, and interaction from video games — into VR. But we haven’t quite figured out what the content in VR is.
“We don’t have our ‘Halo’ for VR just yet, or we don’t have our ‘Citizen Kane’ for VR, or we don’t have our ‘Great Gatsby’ for VR,” Altizer said. “Once you get those great pieces of content, then the technology will actually rally around it, and we’ll say, ‘How do we make that content better?’”