Meta’s new hand tracking feature almost feels like touching the future

Meta is testing what could become a foundational upgrade to its Quest VR headsets: a way to tap and scroll on virtual elements with only your hands, no controllers required. The idea is that you’ll be able to perform actions you already know from your smartphone, like swiping up and down a page, pressing a button to activate it, or typing on an onscreen keyboard, using just your fingers in the air.

The new experimental feature is called “Direct Touch,” and it’s included with the Quest v50 software update that’s rolling out now. After weeks of waiting, the update finally arrived for me, so, of course, I immediately flipped it on. 

When hand tracking is on, the Quest 2 uses its outward-facing cameras to follow your hands, and inside the headset, you’ll see them in VR as dark hand-like shadows. (CEO Mark Zuckerberg’s video of Direct Touch, which appears to have been captured on a Quest Pro, shows more hand / arm detail.) You can use those shadows to judge when your hand will “touch” a menu or window in front of you. With Direct Touch, when you make “contact,” things start to scroll or light up. Scrolling is jerky, but it’s usually more responsive than I thought it would be.

Typing with Direct Touch, however, sucks. When you tap on a part of the UI where you can input text, the Quest’s onscreen keyboard pops up under the window, and you can “press” individual keys to spell things out. But since there’s no physical place to rest your hands or fingers, it’s hard to have any idea of where — or what — you’re actually typing. (Imagine the lack of feedback you get with the iPad’s onscreen keyboard, and then imagine there’s no glass.) Even when I resort to VR hunt-and-peck just to get a single word out, the UI sometimes registers a different key than the one I intended. Fortunately, the keyboard does suggest words as you’re typing, which can help in a pinch.

The bad typing and decent scrolling mean that the Quest web browser is perhaps the best showcase of the Direct Touch controls. If I fudge up the spelling of a web search, the search engine will probably fix it. Scrolling up and down works well enough, as does tapping on links. Weirdly, The Verge’s homepage won’t scroll past our Top Stories list in the Quest’s browser, but tapping any one of the six stories I can actually see works better than I expected.

If you’d like to see me actually trying to use the browser, I filmed it for you.

Most other built-in Quest apps that I tried were at least usable with Direct Touch, but many apps from the Quest Store, including Meta’s own Horizon Worlds VR social network, haven’t been updated to work with just your hands. They wouldn’t even open unless I had a controller. I certainly wasn’t expecting apps like Beat Saber to be better when I was controller-free, but I was hoping I’d at least have the option to mess around with them.

Right now, it’s clear why Direct Touch is labeled an experiment. With every mid-air poke, I can’t quite trust that my hand is actually going to “touch” a virtual piece of the Quest’s UI, so using it for longer than a few minutes at a time quickly gets frustrating. Holding out my arms in space just to move around the UI gets tiring after a while, too. Meta’s other controller-free hand gestures, which involve pinching, are generally more reliable, though I find them less intuitive.

That all being said, I still think the idea of Direct Touch is extremely cool. Scrolling and tapping on virtual surfaces in my VR headset makes me feel like I’m living out some kind of sci-fi dream, even if my words per minute plummet by 99 percent and I can’t trust that any given tap will land the way I expect. When Direct Touch works as intended, using my hands is also way more convenient than using the Quest’s controllers. I know that’s a major asterisk, but just popping on the headset and scrolling through something with my hands removes a lot of the friction I normally associate with putting on the Quest. (That said, because Direct Touch is so finicky, I have to make sure the controllers are nearby anyway.)

It’s also easy to see where this technology could go, especially if Meta’s still-years-away AR glasses actually come to fruition. While wearing those glasses out in the world, you probably won’t want to carry a controller or two when you could just use your hands. And Meta’s devices may not be the only ones we control with our hands in the air: Apple’s long-rumored mixed reality headset may let users type on onscreen keyboards, which suggests Apple is exploring these sorts of interactions, too.

For now, I’m largely going to stick with using the Quest’s controllers. But if I just need to check something quickly on my headset, I may leave the controllers on the table and try to accomplish it with my hands instead. It might take three times as long, but it’s a heck of a lot cooler.


