Mind reading is an exciting idea. Most of us have, at one time or another, wished we could read the minds of the people around us. What would it feel like? Would it disturb us or delight us? We can't say, because none of us can actually read another person's mind. But researchers at the University of Washington have run an experiment that could pave the way. Using brain implants and sophisticated software, they can now predict what their subjects are seeing with high speed and accuracy.
Viewing a two-dimensional image on paper and transforming it into something the mind readily recognizes involves a neurological process. To understand that process, a research team led by University of Washington neuroscientist Rajesh Rao and neurosurgeon Jeff Ojemann demonstrated that it is possible to decode human brain signals at nearly the speed of perception.
To achieve this, an experiment was conducted with seven patients undergoing treatment for epilepsy. Because medication had failed to control their seizures, these patients had been given temporary brain implants, with electrodes used to pinpoint the focal points of their seizures. Since Rao and his team also needed electrodes for their experiment, this group was a natural fit: the electrodes were already implanted in their brains, so it was simply a matter of giving the patients an additional task.
The patients were shown a random sequence of pictures—images of human faces, houses, and blank gray screens—on computer monitors in brief 400-millisecond intervals. Their specific task was to watch for an image of an upside-down house. At the same time, the electrodes in their brains were connected to software that extracted two distinct brain signal properties: "event-related potentials" (when massive batches of neurons light up simultaneously in response to an image) and "broadband spectral" changes (signals that linger after viewing an image).
A computer was used to sample and digitize the incoming brain signal at a rate of 1,000 times per second as the images flickered one after another on the screen. This resolution allowed the software to determine which combination of electrode locations and signals correlated best with what the patients were seeing. The team observed that the responses differed: some electrodes were sensitive to faces, others to houses.
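The two signal properties can be illustrated in code. The following is a minimal sketch, not the team's actual pipeline: the analysis window and frequency band are hypothetical choices for illustration, since the article does not specify them. It computes an ERP-style feature as the mean amplitude in a post-stimulus window, and a broadband feature as the log power in a high-frequency band, for one 400-millisecond trial digitized at 1 kHz:

```python
import numpy as np

FS = 1000  # samples per second, matching the 1 kHz digitization described above

def erp_feature(trial, window=(50, 350)):
    """ERP-style feature: mean amplitude in a post-stimulus
    window (hypothetical 50-350 ms window)."""
    start, stop = window
    return trial[start:stop].mean()

def broadband_feature(trial, band=(70, 150)):
    """Broadband spectral feature: log of the mean power in a
    high-frequency band (hypothetical 70-150 Hz range)."""
    power = np.abs(np.fft.rfft(trial)) ** 2
    freqs = np.fft.rfftfreq(len(trial), d=1 / FS)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.log(power[mask].mean())

# One simulated 400 ms trial: a 100 Hz oscillation plus noise.
rng = np.random.default_rng(0)
t = np.arange(0, 0.4, 1 / FS)
trial = np.sin(2 * np.pi * 100 * t) + 0.1 * rng.standard_normal(t.size)
features = (erp_feature(trial), broadband_feature(trial))
```

Each trial is thus reduced to a small feature vector, one pair of numbers per electrode, which is what a downstream decoder can learn from.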
Based on these responses, the software was trained; the researchers then showed the patients an entirely new set of pictures. Without previous exposure to these images, the computer predicted with 96 percent accuracy whether a test subject was seeing a house, a face, or a gray screen. And it did so at nearly the speed of perception. Notably, this accuracy was achieved only when the computer considered both event-related potentials and broadband changes: to understand how a person perceives a complex visual object, a holistic picture of the neural activity needs to be considered.
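The train-then-predict step can be sketched as follows. This uses a simple nearest-centroid classifier on toy two-dimensional feature vectors (one ERP value and one broadband value per trial) as a stand-in; the article does not say which decoding algorithm the team actually used, and the class means and noise level below are invented for illustration:

```python
import numpy as np

def train_centroids(features, labels):
    """Training: store the mean feature vector for each stimulus
    class seen in the training set."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def predict(centroids, x):
    """Prediction: assign a new trial to the class whose centroid
    is nearest in feature space."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Toy training data for three hypothetical stimulus classes.
rng = np.random.default_rng(1)
classes = np.array(["face", "house", "gray"])
means = {"face": (1.0, 0.0), "house": (0.0, 1.0), "gray": (-1.0, -1.0)}
labels = np.repeat(classes, 30)                       # 30 trials per class
X = np.vstack([rng.normal(means[c], 0.2, size=2) for c in labels])

model = train_centroids(X, labels)
guess = predict(model, np.array([1.1, -0.1]))          # lands near "face"
```

The key point mirrored here is that the classifier only works well when the feature vector combines both signal types; dropping either dimension collapses classes that the remaining feature cannot separate.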
No sweeping claims can be made yet, because the sample size of the study was small. But if this line of work succeeds, brain decoding of this kind could be used in brain mapping to identify, in real time, the locations in the brain responsible for certain types of information. It could also be used to build communication mechanisms for "locked-in" patients who are paralyzed or have suffered a stroke.
Author: Technology Blog

