Sunday Runday
In this weekly column, Android Central Wearables Editor Michael Hicks talks about the world of wearables, apps, and fitness tech related to running and health, in his quest to get faster and more fit.
Everyone who watched Meta Connect was understandably excited by the Orion AR glasses prototype. Still, its most consumer-ready feature had nothing to do with the glasses themselves; it's the electromyography (EMG) wristband controller for reading neural signals that I expect to see soon on store shelves or in Quest boxes.
Meta’s EMG tech is too good to be a mere accessory, though; it should be the centerpiece for a Meta watch or tracker. And we already have evidence Meta is working on a watch.
Anyone who heard Mark Zuckerberg say on stage that Meta’s EMG band allows you to “send a signal from your brain to the device” via a “neural interface” may have gotten visions of some creepy mind-reading tech, but it’s fairly straightforward.
Right now, a smartwatch like the Galaxy Watch 7 can detect hand gestures like double taps or fists using its heart rate monitor and accelerometer/gyroscope. A VR headset (like the new Quest 3S) or AR glasses use computer vision to interpret pinch gestures for hand-tracking controls. Either way, you get missed inputs or false positives.
An EMG band cuts out the visual or tactile middleman. When you make a pinching gesture, your brain sends a neural signal down your arm to trigger the motion. The band picks up that signal at the wrist, so it knows you're pinching your fingers even if they barely move; tracking errors are unlikely, and you don't have to exaggerate the motion.
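To make the idea concrete, here's a minimal sketch of how threshold-based EMG gesture detection works in principle. This is a hypothetical illustration, not Meta's actual pipeline: it rectifies a stream of raw signal samples, smooths them into an envelope with a moving average, and flags a "pinch" when the envelope crosses a threshold. The function name, window size, and threshold are all assumptions for the example.

```python
def detect_pinch(samples, window=8, threshold=0.5):
    """Return True if the smoothed EMG envelope crosses the threshold.

    Hypothetical sketch: real EMG pipelines use filtering and trained
    classifiers, but the core idea is amplitude-based activation detection.
    """
    # EMG is analyzed by amplitude, not sign, so rectify first
    rectified = [abs(s) for s in samples]
    # Slide a moving-average window to get a smoothed envelope
    for i in range(len(rectified) - window + 1):
        envelope = sum(rectified[i:i + window]) / window
        if envelope >= threshold:
            return True  # muscle activation detected, even with barely any motion
    return False

# A faint-but-real activation still registers; resting noise does not.
print(detect_pinch([0.05, -0.04, 0.6, -0.7, 0.65, -0.6, 0.7, -0.68, 0.1]))  # True
print(detect_pinch([0.05, -0.04, 0.03, -0.02, 0.04, -0.03, 0.02, -0.01]))   # False
```

The key point the sketch illustrates: because the band reads the muscle activation itself rather than watching the fingers, even a barely visible pinch produces a clean signal.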
…