With countless new virtual worlds just around the corner, the future is already taking shape in hardware labs and hidden codebases around the globe. The only question is how we can make the future more human as we take our first steps into places beyond reality.
Last week at Vision Summit, Leap Motion CTO David Holz talked about the rapidly emerging future of VR technologies and what it means for human experience. Some of the predictions from his talk last year on the road to 2022 have already come true, while other trends are shifting the landscape in unexpected ways.
Near the end, he showcased a couple of our latest internal projects, including an arm-level interface, an application launcher, and a social place “with furniture and cats.” We’ve attached the slides from the talk at the bottom of this post. There’s also a solid discussion around the video here on r/oculus.
It took you (and CEO Michael Buckwald) around 3 years between conception and your seed round…. Any words of wisdom for aspiring entrepreneurs in the hardware startup space?
Michael and I have been friends since the 4th grade. I visited his house in DC in 2010 and talked to him about making a company and pitching investors. He just happened to be in the process of selling his second company, and we decided to join forces and make Leap Motion. We moved out to the Bay Area and raised a seed round a few months later.
Working with major hardware companies is a HUGE challenge for an early startup trying to make a physical device. Luckily, once we released the video and had some demos, we got a flood of support from the industry. Make sure that if you’re thinking of starting a company, it’s to solve a problem you really REALLY care about. It’s a tough quest to be on, but sometimes it’s the only way to solve that kind of problem.
The technology looks amazing and the demos are super cool. But are there any large consumer or commercial areas where you’ve experienced or envision significant appeal?
It’s easy to envision a future where you can wear a pair of glasses that can project onto the world anything you can imagine. In even the most conservative case, if we could just project a screen in front of your eyes that looked the same as a smartphone or TV, we simply wouldn’t need those devices anymore. We might not need anything with a screen anymore, which is essentially every technological device that exists today.
So we see there’s a paradigm shift on the horizon, where the digital and physical worlds merge and everything becomes one space, and you’re just one human creature in all of it. In this case, it’s hard to imagine us using anything but our hands, the universal human interface, to interact with this merged digital-physical reality.
Was super easy to drop Orion into AltspaceVR, and we are loving it…. What were some of the technical breakthroughs that were necessary to get optical tracking to work so well?
Excited to see Orion support in AltspaceVR! I spent 4 hours in there a few weeks ago testing our early builds of Orion, and I had a dance-off with a dude from France (who also had a Leap Motion Controller). It was the first and probably only time in my life I’ll ever win a dance competition.
The challenge has really been around how to handle situations where we can’t see the parts of the hand we want to see. This can mean occlusion, or it can mean your hand is right up against a white surface that looks the same color as skin. Solving this has been a HUGE challenge, and we weren’t sure it was even possible with the existing Leap Motion device. Everyone is super thrilled that things have worked out so well!
How does it feel to allow the creation of the best thing ever?
Our engineers at the office were pretty stoked about this. There were even some discussions about paw tracking!
I noticed that my Image Hands weren’t quite the same as my “real” ones in the VR Visualizer. Why is that?
Orion thus far has been focused on general robustness, and right now it sometimes makes small sacrifices in fine precision. Hands come in a HUGE variety of shapes, and there isn’t a lot of good data on them right now.
To be safe, at the moment we change the scale of your hand as a whole, but we don’t change the scale of individual fingers. This will sometimes lead to small misalignments, but the overall actions of what you’re doing with your hands should be pretty solid. That said, we’re working on doing even better in the future!
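The whole-hand scaling described above can be pictured as one uniform transform applied to a reference hand model. Here is a minimal sketch of that idea in Python; the class, field names, and reference palm width are illustrative assumptions, not the actual Leap Motion SDK:

```python
# Hypothetical sketch: fit a tracked hand model to the user by scaling
# the whole hand uniformly, without rescaling individual fingers.
from dataclasses import dataclass
from typing import List

@dataclass
class Hand:
    palm_width: float           # palm width in mm
    finger_lengths: List[float] # per-finger lengths in mm

DEFAULT_PALM_WIDTH = 85.0       # assumed reference model size, in mm

def fit_hand(measured_palm_width: float, model: Hand) -> Hand:
    """Scale the whole hand by one factor.

    Fingers keep their default relative proportions, so a user with an
    unusually long or short finger will see a small misalignment there,
    while overall hand motion stays solid.
    """
    scale = measured_palm_width / DEFAULT_PALM_WIDTH
    return Hand(
        palm_width=model.palm_width * scale,
        finger_lengths=[length * scale for length in model.finger_lengths],
    )

model = Hand(DEFAULT_PALM_WIDTH, [70.0, 75.0, 72.0, 65.0, 55.0])
fitted = fit_hand(76.5, model)
print(fitted.palm_width)        # → 76.5 (uniform scale factor of 0.9)
```

The design trade-off is the one Holz describes: one robust global scale rather than five noisy per-finger estimates.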
How well does this tech work outdoors if at all?
Orion is way improved over V2 for outdoor use. Our guys are starting to get sunburned testing it nowadays. The only thing that’s tough for the peripheral is if the hand has sunlight falling directly on it from the perspective of the device and it’s also really, really far away (at some point the sun overpowers the LEDs on the peripheral device). However, future embedded modules don’t have this issue (we added another LED).
What is the maximum tracking distance that the Leap Motion can detect with the switch to Orion?
We artificially cut off hand tracking beyond 80 cm (2.6 feet) right now, though the hardware can track a bit farther.
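That cutoff amounts to a simple distance filter somewhere in the tracking pipeline. A minimal sketch of the idea, with hypothetical function and data shapes (not the actual SDK):

```python
# Hypothetical sketch: discard hand detections beyond a fixed range,
# even though the sensor can sometimes see a bit farther.
MAX_TRACKING_DISTANCE_CM = 80.0

def filter_hands(detections):
    """Keep only hands inside the supported tracking volume.

    `detections` is a list of (hand_id, distance_cm) pairs; anything
    past 80 cm is dropped so users only see hands in the range where
    tracking quality is well tested.
    """
    return [(hid, d) for hid, d in detections
            if d <= MAX_TRACKING_DISTANCE_CM]

print(filter_hands([(1, 35.0), (2, 95.0)]))  # → [(1, 35.0)]
```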
Despite the obvious usage in VR/AR how would you personally like to see the Leap Motion technology being used outside of the enthusiast scene?
I think education is really interesting! Our kids will grow up playing not just with soccer balls, but with galaxies and quantum particles and the fundamental mathematical laws that underlie not just this universe but all possible universes. This will seem as natural to them tomorrow as a basketball seems to us today.
You can read the full AMA here on Reddit. Now for the slides!
The post Soccer Balls and Galaxies: David Holz on the Future of VR, Inputs, and Sensors appeared first on Leap Motion Blog.