Elon Musk’s brain-machine interface company inserts electrodes directly into the brain to connect your mind and your technology. But there are far less invasive ways to control technology with the power of your mind, and none of them involve Star Wars Jedi tricks.
Or brain surgery.
My biggest question: what does it feel like to control tech with your mind? And is it something you’ll actually want to do tomorrow?
“The biggest thing I walk away with from this is that my brain really, really loves a workout, really enjoys it,” futurist and tech evangelist Cathy Hackl told me recently on the TechFirst podcast. “When I think about using these technologies, it’s like this endorphin, like I get this like thing where I’m like, man, I really want to do this.”
Personally, I’m not sure I want to virtually go to the gym every time I need to turn the lights on or close the garage door with the power of my mind.
Hackl says it’s actually not that hard, though.
While Elon Musk’s Neuralink is inserting electrodes through your skull, most brain-machine interface technology that’s currently available (or soon will be) lives outside of your skin and uses sensors to detect impulses that the device can translate into actions. For example, NextMind, which says it is “working on creating a telepathic link between humans and technology,” fits on the back of your skull and allows you to control computer interfaces — or VR games — with your mind in real time.
Other technology like Muse detects brain states not to control the outside world, but to give you greater control over your own brain. I have one, and enjoy it, when I remember to use it.
Hackl says she’s used brain-machine interface technologies to change channels on the TV, dim the lights, get out of a virtual escape room by inputting numbers and codes using just her thoughts, and more.
It raises an interesting question: where is the technology for controlling our environment headed? If we gauge it by the physical effort required, we might come up with some kind of tech mastery scale:
- Level 1: Physical direct action. Get up, walk to the light switch, and physically turn it on.
- Level 2: Physical indirect action. Signal a light to turn on or off with a dedicated, specific physical action (remember The Clapper?).
- Level 3: Physical presence. Ambient technology detects a human presence and turns the light on as you walk into a room.
- Level 4: Digital device mediation. Open an app on a phone or other digital device and turn a light on or off digitally.
- Level 5: Spoken commands. Tell Alexa to turn the lights on. (Note: you could argue that this is functionally equivalent to level two. The key difference is the wide range of actions you can convey quickly via language versus a limited set of physical actions or signals, and the additional sophistication of machine-based natural language processing.)
- Level 6: Brain-machine interface. Will the lights to turn on simply by thinking it.
But level six is only easier if it doesn’t feel like that virtual workout in the gym: straining and tensing with all the power of your mind to make some simple physical thing happen.
So what’s it like?
“You concentrate, you have to concentrate … you have to focus,” Hackl says. “It’s training … depending on the headset, like I’ve used, some of them need a little longer to calibrate. Some of them calibrate faster. Some do require training, like this one, like the Neurosity.”
It’s not focusing so hard that you start shaking or sweating, in other words. And it’s certainly easier than early versions, which required a full cap and a liquid interface (“goo”) between the sensors and your skull. But it’s also not as simple as just thinking: lights off.
The question quickly becomes, however: do you really want to wear a device on your head to scroll a web page, turn on a light, or change the channel?
Ultimately, says Hackl, it’s about where the technology goes over the next few years. By then, the devices might shrink enough to fit in a baseball cap, a pair of glasses, or some other kind of decorative object.
In other words, less like the Borg, the machine-biological cyborgs in Star Trek.
And more like fashion.
“If you look at the current companies and what they’ve been developing over the last, I would say, two years, they’ve evolved,” Hackl says. “The first time I did this, I had to wear a cap and they had to put all this goo on my head for it to be able to read my brainwaves. Now I can just kind of put on a little device.”
And others, of course — biohackers — will simply insert devices beneath the skin. People are already chipping themselves to gain entry to buildings and pay for products in stores.
That, however, is an entirely different level of commitment. And companies will have to show significant advantages — and safety guarantees — before biohacking goes mainstream.