The way we interact with computers might become a lot more effortless in the future. 

Facebook Reality Labs (FRL), a research lab within Facebook focused on augmented reality technologies, is working on wristbands that can intercept the signals your brain sends to your hand muscles and use them to let you easily interact with an AR system. 

The wristband is based on electromyography (EMG), which uses sensors to translate the electrical motor nerve signals that travel from the wrist to the hand into digital commands. So, for example, when you think about moving your hand, typing on a keyboard, or pushing a button, the wristband can tell exactly what you want to do, and then use that data to let you interact with a virtual keyboard or button. 
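To make that idea a little more concrete, here's a purely hypothetical sketch of how raw EMG readings from a few wrist sensors might be boiled down to a simple "click" command. This is not Facebook's actual pipeline (FRL's decoder is a machine-learned system), and the channel count, window size, and threshold below are invented for illustration:

```python
# Hypothetical sketch: turning raw wrist EMG samples into a "click" command.
# Illustration only; FRL's real decoder is machine-learned, and the sensor
# layout, window size, and threshold here are made-up assumptions.

import math

NUM_CHANNELS = 8          # assumed number of EMG sensors around the wrist
WINDOW_SIZE = 64          # samples per analysis window
CLICK_THRESHOLD = 0.35    # made-up activation level that counts as a "click"

def rms(samples):
    """Root-mean-square amplitude of one channel's window of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def decode_window(window):
    """
    window: list of NUM_CHANNELS lists, each holding WINDOW_SIZE samples.
    Returns "click" if overall muscle activation crosses the threshold,
    otherwise "idle".
    """
    activation = sum(rms(channel) for channel in window) / NUM_CHANNELS
    return "click" if activation > CLICK_THRESHOLD else "idle"

# Example: a quiet window (sensor noise) vs. a window with a burst of activity.
quiet = [[0.01] * WINDOW_SIZE for _ in range(NUM_CHANNELS)]
burst = [[0.5] * WINDOW_SIZE for _ in range(NUM_CHANNELS)]
print(decode_window(quiet))  # -> idle
print(decode_window(burst))  # -> click
```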

Facebook says this is not mind-reading; the machine doesn’t know what you’re thinking. Rather, once you decide to take an action, the wristband can decode what that action is going to be.  

Image: Facebook is working on wristbands that let you control a computer with your mind. (Screenshot: Facebook)

“What we’re trying to do with neural interfaces is to let you control the machine directly, using the output of the peripheral nervous system — specifically the nerves outside the brain that animate your hand and finger muscles,” FRL director of neuromotor interfaces Thomas Reardon said in a statement. 

I tried something similar back in 2017, when the Italian Institute of Technology showed me a prototype system that let me control a robotic arm by contracting and relaxing the muscles in my hand. While impressive at the time, that technology now seems pretty crude compared to what Facebook is cooking up with its neural wristbands. 

FRL’s blog post is accompanied by a video that shows how this technology might look in real life. Some of it looks like it comes straight out of a science-fiction movie; for example, the bit where a woman uses the wristbands to fire a virtual arrow from a virtual bow at a virtual target. 

FRL says, though, that these wristbands will initially offer only simple controls, like clicking a virtual button. And the company hasn’t even launched its AR glasses yet, although we do know they’re in the works.

Eventually, you’ll be able to control virtual objects and interact with virtual user interfaces. And with additional technology such as contextually aware AI and haptic feedback, the stuff we’ve seen in basically every sci-fi movie in the past ten years — you know, the bit where the protagonist easily shuffles some virtual objects around with her hands — might become reality.