NextMind’s brain-computer interface is ready for developers

NextMind is the latest in a long line of companies trying to harness the brain as a means of controlling our digital world. At first, its take on things may seem familiar: Don a headset that places a sensor on the back of your head, and it'll detect your brainwaves, which can then be translated into digital actions. One area where NextMind differs is that the sensor seems more practical than many we've seen and won't leave you looking like a shower cap-wearing lab rat. In fact, the wearable can just as easily clip onto the rear of a snapback.

Beyond size and aesthetics, NextMind's technology also seems fairly mature. I tried a demo (via the developer kit, which goes on sale today for $399) and was surprised by how polished the whole experience was. Setup involved just one basic "training" exercise, and then I was up and running, controlling things with my mind. The variety of demos made it clear that NextMind is thinking way beyond simple mental button pushes.

There's still a slight learning curve to get the "knack," and it won't replace your mouse or keyboard just yet. That's mostly because we'll need to wait for a library of apps to be built for it first, but it's also still a new technology, and it takes some practice to become "fluent" with it, as my terrible performance on a mind-controlled game of Breakout can attest. Still, the diverse and creative demo applications I experienced hold a lot of promise.

[Image: NextMind brain-computer interface. Credit: James Trew / Engadget]

Right now, the applications are pretty simple: mostly controlling media and games. But NextMind's founder and CEO, Sid Kouider, is confident the technology will evolve to the point where you can, for example, simply think of an image to search for it. There are also complementary technologies, like AR, where this sort of control not only seems apt, but almost essential. Imagine donning some augmented reality glasses and being able to choose from menu items or move virtual furniture around your room just with a glance.

The technology driving things is familiar enough: The sensor is an EEG that gently rests against the back of your head. This position is key, according to Kouider, as that's where your visual cortex's signals can most easily (and comfortably) be reached. It's these signals that NextMind uses, interpreting whatever you are looking at as the item to be acted upon. In its simplest form, this would be a button or trigger, but the demos also show how it can be used to DJ, to copy and paste, and even to augment (rather than simply replace) other inputs, such as the mouse or game controller you are already using.
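To make the interaction model concrete: the article describes a system where the on-screen element you visually focus on becomes the trigger for an action. This is a minimal sketch of that idea as an event dispatcher; it is purely illustrative (NextMind's actual SDK is not shown in the article), and all names here, like `FocusEvent` and `FocusDispatcher`, are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional

@dataclass
class FocusEvent:
    """A decoded visual-focus event from the headset (hypothetical)."""
    element_id: str    # which on-screen element the user is looking at
    confidence: float  # decoder confidence, from 0.0 to 1.0

class FocusDispatcher:
    """Routes focus events to registered actions, like a mental 'click'."""

    def __init__(self, threshold: float = 0.8):
        self.threshold = threshold
        self.actions: Dict[str, Callable[[], str]] = {}

    def register(self, element_id: str, action: Callable[[], str]) -> None:
        self.actions[element_id] = action

    def handle(self, event: FocusEvent) -> Optional[str]:
        # Ignore low-confidence decodes to avoid accidental triggers,
        # and ignore elements with no registered action.
        if event.confidence < self.threshold:
            return None
        action = self.actions.get(event.element_id)
        return action() if action else None

# Usage: register two media controls, then simulate two decoded events.
dispatcher = FocusDispatcher()
dispatcher.register("play_button", lambda: "media playing")
dispatcher.register("pause_button", lambda: "media paused")

print(dispatcher.handle(FocusEvent("play_button", 0.93)))   # media playing
print(dispatcher.handle(FocusEvent("pause_button", 0.41)))  # None (too uncertain)
```

The confidence threshold reflects the "learning curve" the article mentions: a decoder that fires on every weak signal would feel twitchy, so real systems gate actions on how sure they are about what you're focusing on.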

Source: NextMind’s brain-computer interface is ready for developers | Engadget
