In Depth We could soon be controlling our tech with thoughts alone
You may already be having basic conversations with your smartphone, desktop PC, games console, TV and, soon, your car, but such voice recognition is – in the scientific community, at least – firmly in a folder marked ‘dumb’ technology.
New ways of controlling consumer electronics goods with both basic voice and gestures are suddenly common, but we could soon be operating computers not by barking out instructions or waving, but purely by thinking.
Research into the long-studied brain-computer interface (BCI) – also known as the ‘mind-machine’ interface – is becoming so advanced that it’s set to create a whole new symbiotic relationship between man and machine.
It could even lead to a situation where speech is rendered useless, and people wirelessly communicate through universal translator chips. No more complaining about loud music in nightclubs, then.
The BCI goes way further than simple speech-to-text technology like Nuance’s Dragon Dictation
Forget about the wireless revolution – this revolutionary tech demands cables. “A brain-computer interface encompasses any form of controlling a computer via a direct electrical connection to the human body,” says Peter Cochrane, ex-CTO of BT and now an independent analyst.
That connection can be any form of nerve signal or impulse accessed from the surface of the human body, including head and limbs, or muscle impulses picked up by electrodes on the arm, hand, face or forehead generated by physical movement.
Where no physical movement is involved, a BCI can instead establish the link between person and computer using either an MRI scanner or a direct electrical connection to the human brain.
The movie Avatar popularised the idea of a human controller ‘binding’ with an external body
If you’re already thinking about mind control, you’re not far wrong. Even the scenario of the movie Avatar, in which humans remotely piloted a genetically engineered alien being, is closer to reality than you might think.
Attempting to fill the gap between automatic vacuum cleaners and true sentient machines, the burgeoning robotics industry has come up with a product that acts like a puppet; the prototype TELESAR V allows a human operator to ‘bind’ with it, see what it sees, and replicate the exact movements of a human hand inside a sensor-filled glove.
Described as a ‘surrogate anthropomorphic robot’ and hailing from Japan (the Japan Science and Technology Agency, Keio University and Tokyo University, to be precise), TELESAR V also gives its human user feedback on what the robot hand is experiencing, in terms of both touch and temperature.
Ideal for remotely handling toxic substances or explosives, or for investigating nuclear accidents such as Fukushima, this kind of technology has seemingly endless uses.
Perhaps we’ll see robots like TELESAR V perform complex surgery where a rock-steady hand is required, work on the Moon, or in search and rescue operations.
The brain-computer interface also offers a great deal to paralysed patients, such as locked-in syndrome sufferers and right-to-die campaigner Tony Nicklinson, who passed away in August. Nicklinson operated a computer using eye movements to communicate, but a BCI needs no such voluntary muscle movements – it works using thoughts alone.
Clare Carmichael, a research analyst at e-accessibility charity AbilityNet, is working on a BCI prototype – called BrainAble – which has been developed to assist people with extreme disabilities and locked-in syndrome.
“A BCI is a system that enables interaction with a computer based on changing electrical signals that occur in the brain,” Carmichael tells us. “The signals can be taken invasively or non-invasively either from inside the brain or from the scalp. Non-invasive BCI takes signals that are present at micro-volt levels on the scalp and then amplifies them using an EEG. These signals are then digitised so that they can be used by the computer.”
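The non-invasive pipeline Carmichael describes – scalp signals at micro-volt levels, amplified and then digitised for the computer – can be sketched in a few lines of code. The gain, resolution and voltage figures below are illustrative assumptions, not BrainAble’s actual parameters.

```python
GAIN = 50_000   # amplifier gain: a ~10 microvolt scalp signal becomes ~0.5 V
ADC_BITS = 12   # resolution of a hypothetical analogue-to-digital converter
V_REF = 1.0     # ADC reference voltage in volts


def amplify(scalp_volts):
    """Boost raw micro-volt scalp readings to a level an ADC can handle."""
    return [v * GAIN for v in scalp_volts]


def digitise(amplified_volts):
    """Quantise amplified voltages into integer ADC codes for the computer."""
    max_code = (1 << ADC_BITS) - 1
    codes = []
    for v in amplified_volts:
        clamped = min(max(v, 0.0), V_REF)  # keep within the ADC's input range
        codes.append(round(clamped / V_REF * max_code))
    return codes


# Four scalp samples rising to an 8 microvolt peak and back down
samples_uv = [0e-6, 4e-6, 8e-6, 4e-6]
codes = digitise(amplify(samples_uv))
print(codes)
```

Only after this amplify-and-digitise step does the signal exist as numbers the BCI software can analyse.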
During testing in Barcelona and Liverpool, disabled participants use ‘thinking strategies’ to produce specific electrical activity, from which data is extracted for use by the BCI.
“So far people have been able to communicate through a speller, perform binary tasks such as turning on and off a light and a TV, changing the channel and adjusting the volume,” says Carmichael. “Participants have been able to navigate a robot and control a camera and to enter a virtual reality that enables them to meet and talk to other people using BrainAble.”
“It can potentially enable people with locked in syndrome to communicate and to continue to be creative.”
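Turning brain activity into a reliable binary task like switching a light on and off is, at its simplest, a thresholding problem. The sketch below is illustrative only (not BrainAble’s code): it assumes the BCI has already extracted a single normalised feature from the EEG, and uses two thresholds (hysteresis) so that noisy readings don’t flicker the light.

```python
ON_THRESHOLD = 0.7   # hypothetical normalised feature level that switches on
OFF_THRESHOLD = 0.3  # lower level required before switching off again


class BinarySwitch:
    """Maps a stream of EEG feature values to a stable on/off state."""

    def __init__(self):
        self.state = False

    def update(self, feature):
        if not self.state and feature >= ON_THRESHOLD:
            self.state = True          # strong signal: switch on
        elif self.state and feature <= OFF_THRESHOLD:
            self.state = False         # signal clearly dropped: switch off
        return self.state


light = BinarySwitch()
readings = [0.1, 0.5, 0.8, 0.6, 0.4, 0.2]
states = [light.update(r) for r in readings]
print(states)  # the light stays on through the noisy 0.6/0.4 readings
```

The gap between the two thresholds is a design choice: without it, a reading hovering around a single cutoff would toggle the light on and off many times a second.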
The BCI goes way beyond voice and gesture control, as seen on Xbox 360’s Kinect
Such tech is, for now, concentrating on those it can help most, but the research will eventually trickle into everyday use. “The BCI has tremendous potential as a technology and is already used by gamers and in extreme incident management,” says Carmichael. “Ultimately it’s possible to think of a world where it offers people additional bandwidth. I like the idea of an as-yet-unrealised future world where I can wirelessly communicate through my universal translator chip…”
If you like the sound of that, it does come with a word of warning. “BCIs will fit into the Internet of Things by including chips and implants in people and animals – everything will be connected by default,” says Cochrane, who thinks the BCI and the Internet of Things go hand-in-hand.
“If your brain and nervous system get connected onto the net then they are automatically a part of it – in effect, you become your own cloud.” So next time you think you’re spending too much time online, just remember – this is just the beginning.