Top row: segments of a movie trailer that a subject viewed while in the MRI scanner. Bottom row: the same segments reconstructed from the subject's measured brain activity.
Imagine watching your nightly dreams the morning after. On YouTube.
Whether you find the notion intriguing or cringe-inducing, scientists at the University of California, Berkeley, say such a thing will one day be possible through a blend of brain imaging and computational modeling, a combination that could also serve a variety of beneficial applications beyond dream reconstruction.
Through the use of functional magnetic resonance imaging and computational models, the university researchers have been able to decode and reconstruct people's visual experiences, said Jack Gallant, a professor of neuroscience at the school. He coauthored a paper about his group's work, which was published in the Sept. 22, 2011, online version of the journal Current Biology.
So far, the system reconstructs only movie clips that people have already viewed, Gallant said. But the work paves the way for reproducing the movies inside our heads, such as dreams and memories, he added.
The technology could give doctors and scientists a better understanding of what goes on in the minds of stroke victims, coma patients, and others who can't speak. It may also lay the groundwork for a brain-machine interface so that people with cerebral palsy or paralysis, for example, can guide computers with their minds.
"Our natural visual experience is like watching a movie," said Shinji Nishimoto, a postdoctoral researcher in Gallant's lab. "In order for this technology to have wide applicability, we must understand how the brain processes these dynamic visual experiences."
Nishimoto and two other research team members served as subjects for the experiment, because the procedure requires volunteers to remain still inside the MRI scanner for hours at a time.
They watched two separate sets of Hollywood movie trailers while the scanner measured blood flow through the visual cortex, the part of the brain that processes visual information. The brain activity recorded while subjects viewed the first set of clips was fed into a computer program that learned, second by second, to associate visual patterns in the movie with the corresponding brain activity, Nishimoto said.
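The learning step described above, associating visual features of each moment of the movie with the corresponding brain activity, can be sketched as a linear encoding model fit by ridge regression. The toy Python sketch below uses simulated data; the feature counts, noise level, voxel count, and the ridge solver are illustrative assumptions, not the authors' actual motion-energy pipeline.

```python
import numpy as np

# Toy sketch of the encoding-model idea: learn, for each voxel, a
# linear mapping from movie features to measured brain activity, then
# check how well the fitted model predicts held-out responses.
rng = np.random.default_rng(0)

n_train, n_test = 200, 50   # seconds of movie (one sample per second)
n_features = 30             # stand-in for visual features of each frame
n_voxels = 10               # stand-in for visual-cortex voxels

# Simulated "movie features" and a hidden true voxel-wise mapping.
X_train = rng.standard_normal((n_train, n_features))
X_test = rng.standard_normal((n_test, n_features))
W_true = rng.standard_normal((n_features, n_voxels))
noise = 0.1
Y_train = X_train @ W_true + noise * rng.standard_normal((n_train, n_voxels))
Y_test = X_test @ W_true + noise * rng.standard_normal((n_test, n_voxels))

# Ridge regression in closed form: W = (X'X + alpha*I)^-1 X'Y.
alpha = 1.0
W_hat = np.linalg.solve(
    X_train.T @ X_train + alpha * np.eye(n_features),
    X_train.T @ Y_train,
)

# Predict held-out brain activity and score each voxel by correlation
# between predicted and actual responses.
Y_pred = X_test @ W_hat
r = [np.corrcoef(Y_pred[:, v], Y_test[:, v])[0, 1] for v in range(n_voxels)]
print(f"mean held-out correlation: {np.mean(r):.2f}")
```

Reconstruction then runs this model in reverse: given new brain activity, find the movie frames whose predicted responses best match what was measured.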
Ultimately, Nishimoto said, scientists need to understand how the brain processes dynamic visual events that we experience in everyday life.
"We need to know how the brain works in naturalistic conditions," he said. "But for that, we need to first understand how the brain works while we are watching movies."