Merging brains and machines

The brain inspires some of the most exciting computer technologies of our time, but the relationship isn’t just one-way. Computers—including the deep-learning algorithms used in today’s AI—can also help us understand how the brain works.

The brain–machine relationship goes much deeper than research tools and inspiration. In this chapter we’ll delve into some of the incredible medical treatments and other developments emerging in the field of brain–machine technologies, which aims to create a seamless human–machine interface by combining engineering and neuroscience.

Mind-controlled machines

British science fiction writer and futurist Arthur C. Clarke once said, “Any sufficiently advanced technology is indistinguishable from magic.” His words seem prophetic given recent advances in brain–computer interfaces, which can appear to border on the magical.

In 2012, for example, a woman paralysed from the neck down used nothing but her thoughts (and a robotic arm) to bring a bottle filled with coffee to her mouth.

At the start of the 2014 FIFA World Cup in Brazil, a young paraplegic man used his thoughts and a robotic exoskeleton to kick a soccer ball. And in 2017, ‘locked-in’ patients, unable to move a muscle or even blink, were able to communicate with doctors via their brain activity.

What connects these extraordinary examples is the brain–computer interface. This works by relaying brain activity to computers, which translate it into actions. In the case of the paralysed woman, 96 electrodes recorded her brain activity over many hours as she trained herself to think “move the arm in direction X”.

Eventually, researchers could decipher the patterns of brain activity that produced different movement commands. These patterns could then be translated to a digital signal sent from a computer to the robotic arm.
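The decoding step described above can be sketched in code. This is a minimal illustration with made-up data, not the researchers’ actual method: it assumes each imagined direction produces a characteristic pattern of firing rates across 96 electrodes, and uses a simple nearest-centroid rule (real systems use far more sophisticated decoders) to turn a new pattern into a movement command.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: firing rates from 96 electrodes, recorded while the
# user imagines moving in one of four directions (0=up, 1=down, 2=left, 3=right).
n_electrodes, n_trials = 96, 200
directions = rng.integers(0, 4, n_trials)

# Each direction evokes a slightly different activity pattern (simulated data).
templates = rng.normal(0, 1, (4, n_electrodes))
firing_rates = templates[directions] + rng.normal(0, 0.5, (n_trials, n_electrodes))

# "Training": average the recorded pattern for each imagined direction.
centroids = np.array([firing_rates[directions == d].mean(axis=0) for d in range(4)])

def decode(activity):
    """Return the direction whose average pattern is closest to this activity."""
    return int(np.argmin(np.linalg.norm(centroids - activity, axis=1)))

# Decode a new trial: the predicted direction is what drives the robotic arm.
new_trial = templates[2] + rng.normal(0, 0.5, n_electrodes)
print(decode(new_trial))  # almost certainly 2 (left), given the low noise
```

The key idea the sketch captures is that no single electrode carries the command; it is the pattern across all channels, learned over many hours of training, that the computer matches against.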

Machines that bypass the brain

Today’s invasive brain-computer interfaces are highly experimental, and because they need to be surgically implanted, their use is limited to people with severe medical conditions.

Spinal cord injury has, so far, been the most common target, the idea being to restore movement to legs or arms by bypassing spinal cord damage: instead of a brain command travelling through the spinal cord to muscles in the limbs, it is decoded by a computer, which then sends a command to a limb.

The limb may be a prosthetic device or the patient’s own arm or leg. While still in its early days, the technology has shown real success.

In 2017 a man with quadriplegia successfully manipulated his own arm to feed himself using a brain-controlled system that electrically stimulated his arm muscles, in an arrangement known as functional electrical stimulation. 

Machines that read minds

If we can control machines with our brains, does that mean machines can read our minds? Not really. Using brain signals to control a device is very different to a computer knowing what we’re thinking.

Nevertheless, some experiments have shown that our brain activity can be used to digitally recreate images we have seen. Although impressive, these machines can mostly only interpret what we’re seeing—a long way from knowing our thoughts and intentions. 

This ‘mind-reading’ technology works by first showing people hundreds of images while their brains are scanned in a functional MRI (fMRI) machine. A computer program, often based on a deep neural network, learns that particular brain activity patterns occur when certain image features are displayed.

Then, when a person in the MRI machine is shown a new image, the computer program can decipher, at some level, what the person is seeing.
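The learning step can be illustrated with a toy model. This sketch is purely hypothetical: it assumes brain activity is, roughly, a linear mixture of an image’s visual features, and learns the reverse mapping with ridge regression (published studies often use deep neural networks instead, as noted above).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: 300 viewed images, each described by 10 visual features
# (edges, colours, etc.), plus the fMRI voxel pattern evoked by each image.
n_images, n_features, n_voxels = 300, 10, 50
features = rng.normal(0, 1, (n_images, n_features))

# Assume activity is a noisy linear mixture of the image features.
mixing = rng.normal(0, 1, (n_features, n_voxels))
activity = features @ mixing + rng.normal(0, 0.1, (n_images, n_voxels))

# "Training": learn to map brain activity back to image features
# (ridge regression with a small penalty, lam).
lam = 1.0
W = np.linalg.solve(activity.T @ activity + lam * np.eye(n_voxels),
                    activity.T @ features)

# Show a brand-new image and decode its features from brain activity alone.
new_features = rng.normal(0, 1, n_features)
new_activity = new_features @ mixing + rng.normal(0, 0.1, n_voxels)
decoded = new_activity @ W
print(np.corrcoef(decoded, new_features)[0, 1])  # close to 1: features recovered
```

Note what is and isn’t recovered: the decoder reconstructs low-level visual features of the image, not the person’s thoughts about it, which is exactly the limitation the text describes.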

In 2013, Japanese researchers used a similar approach to deduce some basic content from people’s dreams. Our dreams are highly visual, making them suitable for the type of decoding described here. Still, like other ‘mind-reading’ feats, the level of detail and accuracy is low and restricted to visual features.

There’s no evidence we are close to true mind-reading, such as gauging what a person is thinking, intending or desiring.

Wearable brain-computer interfaces

A less invasive way to eavesdrop on neurons is by using something called EEG (electroencephalography) to measure brain activity through the scalp. Typically, this is done by placing a cap with electrodes on the head, hooking up the wires to a computer and recording the electrical signals of the brain.

Whereas implanted devices allow researchers to measure the activity of individual cells, EEG signals represent the combined activity of thousands of neurons, so their level of detail is much lower. It’s the difference between listening to individual conversations in a crowd watching a game of tennis and standing outside the stadium, able to hear only the collective oohs, ahhs and murmuring of the crowd.
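Even at this coarse, crowd-level resolution, EEG carries usable information in its frequency bands. The sketch below uses a synthetic signal (a made-up 10 Hz alpha rhythm plus noise, standing in for real scalp recordings) to show the standard first step in EEG analysis: measuring how much power the signal has in each frequency band.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 2-second recording from one scalp electrode, sampled at 250 Hz.
fs, duration = 250, 2.0
t = np.arange(0, duration, 1 / fs)

# Summed activity of many neurons: a strong 10 Hz alpha rhythm plus noise.
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * rng.normal(size=t.size)

# Spectrum of the signal: each frequency and its power.
power = np.abs(np.fft.rfft(eeg)) ** 2
freqs = np.fft.rfftfreq(eeg.size, 1 / fs)

def band_power(lo, hi):
    """Total power between lo and hi Hz."""
    return power[(freqs >= lo) & (freqs < hi)].sum()

# The alpha band (8-12 Hz) should dominate the beta band (13-30 Hz) here.
print(band_power(8, 12) > band_power(13, 30))
```

Band powers like these, rather than single-neuron spikes, are the raw material for the EEG-based therapies, attention monitoring and games discussed below.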

Still, this general information about brain activity can be useful for medical therapies, education and even gaming.


Stroke survivors with severe movement problems may be unable to physically practise rehabilitation tasks, but they may be able to imagine making those movements. This process, called ‘motor imagery’, activates some of the same brain networks used in actual movement.

Then, using neurofeedback, an EEG program can provide real-time feedback on motor imagery. For example, a program that detects the brain signals of a movement such as closing a fist could send a signal to stimulate forearm muscles and cause the fist to close.

This feedback helps patients link intended movements to successful ones, promoting new connections in the brain to compensate for damaged areas.
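The fist-closing feedback loop described above can be sketched as a simple control loop. Everything here is illustrative: the imagery score, the threshold value and the stimulation function are all made up, standing in for a real EEG classifier and stimulation hardware.

```python
# Each time step, assume the EEG program outputs a score (0-1) for how
# strongly the "close fist" motor-imagery pattern is present.
THRESHOLD = 0.7  # made-up cut-off for triggering stimulation

def stimulate_forearm():
    """Stand-in for hardware that electrically stimulates the forearm muscles."""
    return "fist closes"

def feedback_step(imagery_score):
    """One cycle of the neurofeedback loop."""
    if imagery_score >= THRESHOLD:
        return stimulate_forearm()   # imagined movement becomes a real one
    return "no stimulation"          # patient keeps practising the imagery

# Scores from a hypothetical session in which the patient improves over time.
session = [0.2, 0.4, 0.65, 0.8, 0.9]
print([feedback_step(s) for s in session])
```

The therapeutic point is the pairing itself: each time an imagined movement triggers a real one, the brain gets confirmation that links intention to action, which is thought to encourage the rewiring described above.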


Another area of interest is using brain-computer interfaces to improve learning, in particular, how attention can be measured from brain activity, and how that might be used to keep students’ minds focused.

Using an EEG, leading QBI cognitive neuroscientist Professor Jason Mattingley and colleagues from The University of Queensland have converted attention signals into an easy-to-understand visualisation on a computer monitor, a form of neurofeedback. In one study, subjects used this visualisation to regulate their own attention levels.

QBI psychology researcher, Dr David Painter, who developed the technology, says it's advanced enough to be tested in applied settings, such as classrooms or workplaces.

Video games

Video games are a good platform to test the control of actions with thoughts, as many require only a few commands or buttons. This matters because a major limitation of brain–computer interfaces is the number of distinct signals that can be reliably decoded. QBI’s Dr Painter says this is just one of several hurdles.

“The decoding, or so-called mind-reading technology, is already here,” he says. “The challenge for developers is to invent natural and useful means for interaction—the interface itself.” Also important, he says, is what advances this technology offers: the unique experiences it could create. “Developers worldwide are solving this problem as we speak.” This means mind-controlled video games could be the next big advance in gaming.

Memory manipulation

A possible use for brain-computer interfaces is to enhance brain function. Studies have shown that electrically stimulating the brain can boost memory. Most of these studies take a general approach, trying to enhance all memories with the same stimulation pattern.

But one 2018 study used stimulation to improve a particular memory. Scientists recorded the brain activity of epilepsy patients with implanted electrodes as they played a memory game. Later, when their brains were stimulated with the recorded pattern, they scored about 35% better on the memory test.


Neurotechnology
Engineered hardware that connects with the nervous system. Neurotechnologies can be input devices that alter brain activity (e.g. deep brain stimulation electrodes) or output devices that record brain activity (e.g. EEG devices). Prosthetics such as the cochlear implant and robotic arms are also neurotechnologies.

Neurofeedback
Feedback given to a person about their own brain activity, generally to help them self-regulate or train aspects of their own brain function. For example, brain signatures of attention levels can be detected and converted into a visual scale, which the user can learn to modulate themselves.