This MIT PhD has his fingers on the pulse of virtual reality

Alum’s hand-tracking system plays key role in new Oculus Rift platform

The buzz around virtual reality (VR) has never been bigger. Last month VCs invested $800 million in Magic Leap, a secretive venture, and just this week the first major platforms finally hit the market: HTC’s Vive and the Rift from Facebook’s Oculus VR.

Oculus’ highly anticipated system, the Oculus Rift, features some intricate hand-tracking software courtesy of Robert Wang PhD ’11, whose start-up Nimble VR was bought by Oculus last year after raising more than $2 million in funding.

Where most other hand-tracking systems require special sensors or markers, Nimble VR’s technology is completely glove-free, relying on infrared depth sensing, a 110-degree field of view, and highly accurate skeletal tracking of the hands.

While Wang can’t discuss the particulars of his work at Oculus, the original Nimble VR system paired a specialized camera with a laser. Every 20 milliseconds, the camera sends out a signal that bounces off surrounding objects. The resulting data lets the system not only track the user’s hands but also detect objects’ positions, shapes, and movements, building a point cloud of the world around the user.
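The depth-to-point-cloud step described above can be sketched in a few lines. This is a generic illustration using the standard pinhole camera model, not Nimble VR’s actual pipeline; the intrinsics (focal lengths, principal point) are hypothetical placeholder values.

```python
import numpy as np

# Hypothetical camera intrinsics; the real sensor's values are not public.
FX, FY = 525.0, 525.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point (image center)

def depth_to_point_cloud(depth):
    """Back-project a depth image (meters) into an (N, 3) point cloud
    using the standard pinhole camera model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    # Stack into (N, 3) and drop pixels with no depth reading.
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]

# Tiny synthetic 2x2 depth frame: 1 meter everywhere except one missing pixel.
frame = np.array([[1.0, 1.0],
                  [1.0, 0.0]])
cloud = depth_to_point_cloud(frame)
print(cloud.shape)  # (3, 3): three valid pixels, each an (x, y, z) point
```

A real system would run this on every frame (every 20 milliseconds, per the description above) and feed the resulting cloud to the hand-tracking stage.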

Nimble VR was directly born out of Wang’s work at CSAIL. His early hand-tracking projects involved developing Lycra gloves that were colored to help his algorithms identify specific parts of a person’s hands. But he soon realized that the gloves posed too big of a barrier for everyday computer users.
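The colored-glove idea works because a distinctly colored region makes labeling a pixel as simple as a nearest-color lookup. The toy sketch below illustrates that principle only; the palette and part names are hypothetical, not those used in Wang’s research.

```python
# Toy stand-in for the colored-glove approach: each glove region gets a
# distinct color, so classifying a pixel is a nearest-color lookup.
# This palette is invented for illustration.
PALETTE = {
    "thumb": (255, 0, 0),
    "index": (0, 255, 0),
    "palm":  (0, 0, 255),
}

def label_pixel(rgb):
    """Return the hand part whose palette color is closest (squared
    Euclidean distance in RGB) to the given pixel."""
    dists = {part: sum((a - b) ** 2 for a, b in zip(rgb, color))
             for part, color in PALETTE.items()}
    return min(dists, key=dists.get)

print(label_pixel((250, 10, 5)))   # thumb
print(label_pixel((10, 240, 20)))  # index
```

The appeal is that pose estimation reduces to looking up which colored patches are visible where; the drawback, as Wang found, is that users must wear the glove at all.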

“If someone feels like it’s inconvenient to put on a pair of gloves, they’re just going to resort to their keyboard and mouse,” he says. “That’s when I realized that we needed to figure out a way to make the user experience as seamless as possible.”

After he graduated, Wang developed a gloveless system that could pick up gestures like pointing, grabbing and pinching. Diving more deeply into hand-tracking helped him recognize the intricate dexterity involved in human actions like opening a lid or playing an instrument.

“We humans have an intuitive sense of where our body parts are, like being able to touch your nose even when your eyes are closed,” he says. “I was motivated to see if this ability could translate to interaction in virtual environments as well.”

Wang founded the company in 2012 with Kenrick Kin and Chris Twigg, two computer scientists from UC Berkeley and Carnegie Mellon. According to Wang, joining Oculus has relieved him of a lot of the pressures of being part of a startup, like having to move fast and raise funds.

“At a company like Oculus that has more support and resources, I don’t feel the same worry about running out of money on a month-to-month basis,” he says. “It’s great to be able to just focus on developing the best possible technologies.”

There have been claims about virtual reality becoming an actual reality for decades, but many computer scientists believe the time is finally here, thanks to higher-resolution displays, improved motion-detection sensors, and the continuing drop in hardware costs.

The potential applications are vast and ever-changing, from gaming and fitness to health care and exposure therapy. Wang says that nobody will truly know where systems like the Oculus Rift make the biggest impact until the wider public starts using them.

While he is excited to see the first high-quality systems launch, he says that some of the most interesting problems are still ahead of us.

“This generation of VR has reached a high-quality bar, but even now it’s very much in the beginning stages,” Wang says. “The more we improve, the more we realize that there’s still much more to learn.”