How does the human brain, a collection of nearly 100 billion densely packed nerve cells, allow us to think, feel, act and perceive the world?
Scientists at Yale University’s Brain Function Lab are trying to better understand what happens in our brain during a conversation, using skullcaps connected to 64 cables.
At the tips of half of those fiber optic cables, weak laser beams slip through the wearers’ skulls and penetrate about 2.5 cm into their brains. There, the light scatters off brain tissue and is reflected back to detectors at the tips of the other half of the cables.
Scientists obtain detailed brain images with bright blotches of colour indicating where the action is taking place. It’s not a direct picture, like an X-ray, but rather a reconstruction.
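This kind of optical measurement is typically interpreted with the modified Beer-Lambert law: a change in the intensity of the reflected light is converted into a change in blood oxygenation, from which the activity map is reconstructed. A minimal sketch of that conversion, with every number and parameter value hypothetical and chosen only for illustration:

```python
import math

# Illustrative sketch of the modified Beer-Lambert law commonly used in
# functional near-infrared spectroscopy (fNIRS). All values are hypothetical.

def absorbance_change(baseline_intensity, measured_intensity):
    """Change in optical density between a baseline and a later reading."""
    return math.log10(baseline_intensity / measured_intensity)

def concentration_change(delta_od, extinction_coeff, source_detector_cm, dpf):
    """Estimate the change in chromophore (e.g. oxy-haemoglobin) concentration.

    delta_od           : change in optical density (unitless)
    extinction_coeff   : molar extinction coefficient, 1/(mM*cm) (hypothetical)
    source_detector_cm : source-detector separation on the scalp, in cm
    dpf                : differential pathlength factor (unitless, hypothetical)
    """
    return delta_od / (extinction_coeff * source_detector_cm * dpf)

# Hypothetical reading: reflected light dims slightly as local blood flow rises.
d_od = absorbance_change(baseline_intensity=1.00, measured_intensity=0.97)
d_c = concentration_change(d_od, extinction_coeff=1.4,
                           source_detector_cm=3.0, dpf=6.0)
print(d_od, d_c)
```

Repeating this calculation for every source-detector pair on the cap gives the blotchy reconstructed map described above, rather than a direct photograph of the brain.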
“We want to understand the neural circuitry that is associated with interaction between individuals,” explains neuroscientist Joy Hirsch, director of the Yale Brain Function Lab. “It’s probably one of the most fundamental functions of the human species and yet we know very little about it. The new information here is that visual reports of, say, facial information are an intimate part of the language system as it is being used in an interactive situation like a dialogue.”
Two researchers at the lab are offering a glimpse into their brains as they chat: two brains in conversation, carrying out an intricate dance of internal activity. As they talk, the scientists also look for recognisable patterns in the activity.
Altered patterns have revealed conditions like panic disorder and depression. It’s hoped the research will also help scientists understand how autism affects individuals.
“In the case of autism, the primary hallmark of the disorder, the one that is usually noticed first when parents and health professionals suspect that a child might have autism, is the fact that the child fails to engage with other individuals. And yet we know very little about the neural circuitry that involves engagement with others,” says Joy Hirsch.
At Princeton University, too, neuroscientist Uri Hasson is looking at how brains interact during conversation.
Functional MRI scans detect brain activity by monitoring blood flow. When a brain region is active, it needs more blood to provide oxygen and nutrients. As a result, the active regions light up on a computer screen.
The study also shows that the more listeners understand a story, the more their brain activity pattern follows the speaker’s.
“We developed a new method in which I can really scan people while they are telling real-life stories, in the scanner, and then I can play the stories to a group of listeners,” explains Uri Hasson, associate professor of neuroscience and psychology at Princeton University. “And what we are asking is whether the listener’s brain becomes similar to the speaker’s brain (while) doing natural, real-life communication. So the (more) similar the brain patterns, the better the communication and the understanding. So if you really get me now, your brain pattern becomes similar and coupled to my brain pattern.”
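Hasson’s notion of “coupling” can be illustrated as a simple correlation between a speaker’s and a listener’s activity over time: the more the listener’s time series tracks the speaker’s, the stronger the coupling. A minimal sketch with invented time series (real analyses correlate fMRI signals region by region, not these toy numbers):

```python
import math
import statistics

# Illustrative sketch of speaker-listener coupling measured as a
# Pearson correlation. All time series below are invented.

def pearson_r(x, y):
    """Pearson correlation between two equal-length time series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(var_x * var_y)

# Hypothetical activity in one brain region, sampled over time.
speaker             = [0.1, 0.5, 0.9, 0.4, 0.2, 0.7]
attentive_listener  = [0.2, 0.6, 0.8, 0.5, 0.1, 0.6]  # tracks the speaker
distracted_listener = [0.9, 0.1, 0.3, 0.8, 0.6, 0.2]  # does not

# The attentive listener's pattern correlates more strongly with the speaker's.
print(pearson_r(speaker, attentive_listener) >
      pearson_r(speaker, distracted_listener))
```

On this toy data, the listener who “really gets” the speaker shows the higher correlation, which is the intuition behind the finding that better understanding goes with more similar brain activity.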
The next goal is to move the technology out of the lab and into more natural settings.
Making portable devices to replace huge MRI machines could open new possibilities for brain decoding, say the researchers, who imagine a world where composers could write music just by imagining it, or artists could simply think about a painting and let a computer do the rest.