EIT News Story
Auditory Cortex Study Reveals Cells' "Individuality"
New research shows our brains are a lot more chaotic than previously thought, and that this might be a good thing.
Researchers at the University of Maryland have uncovered details of how the brain processes sound that challenge the previous understanding of the auditory cortex as organized into precise neuronal maps. In the first study of the auditory cortex conducted using advanced imaging techniques, Patrick Kanold, assistant professor of Biology and the Institute for Systems Research (ISR); Shihab Shamma, professor of Electrical and Computer Engineering and ISR; and Sharba Bandyopadhyay, an assistant research scientist in ISR, describe a much more complex picture of neuronal activity. Their findings are published in the January 31 online edition of Nature Neuroscience (article available online).
Most of our knowledge of how the brain works has come from sampling a small number of available neurons and inferring how the rest respond. In contrast, Shamma, Kanold, and Bandyopadhyay were able to observe the activity of all the neurons in a large region of the auditory cortex simultaneously. To obtain the highest-resolution picture to date of how auditory cortex neurons are organized, the researchers filled neurons in living mice with a dye that glows brightly when calcium levels rise, a key signal that neurons are firing. They then selectively illuminated specific regions of the cortex with a laser and measured the responses of hundreds of neurons to simple tones of different frequencies.
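In calcium imaging of this kind, a neuron's activity is conventionally quantified as its relative fluorescence change, ΔF/F, and its preferred tone is the frequency that evokes the largest response. The sketch below illustrates the idea in Python with invented numbers; it is not the authors' analysis code, and the frequency and fluorescence values are purely hypothetical.

```python
import numpy as np

def delta_f_over_f(trace, baseline_frames=50):
    """Relative fluorescence change (F - F0) / F0, where F0 is the mean
    fluorescence over an initial pre-stimulus baseline window."""
    f0 = trace[:baseline_frames].mean()
    return (trace - f0) / f0

def best_frequency(mean_responses, frequencies):
    """Frequency of the test tone that evoked the largest mean dF/F."""
    return frequencies[int(np.argmax(mean_responses))]

# Synthetic trace: 50 baseline frames at F = 1.0, then a response at F = 1.5.
trace = np.array([1.0] * 50 + [1.5] * 10)
print(delta_f_over_f(trace)[-1])               # -> 0.5

# Hypothetical mean dF/F responses of one neuron to five test tones (kHz).
frequencies = np.array([4, 8, 16, 32, 64])
responses = np.array([0.02, 0.10, 0.35, 0.12, 0.01])
print(best_frequency(responses, frequencies))  # -> 16
```

Repeating this response measurement for every neuron in the imaged region is what lets a study like this one compare tuning across hundreds of cells at once.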
This "in vivo 2-photon calcium imaging" technique was developed by German researchers and advanced by Harvard scientists, who used it to study the visual cortex in the mid-2000s. The University of Maryland study is the first to apply this technique to the auditory cortex and provides an unprecedented amount of detail about how hearing happens.
Their findings suggest that the brain is far more adaptable than previously thought.
"These results may rewrite our classical views of how cortical circuits are organized and what functions they serve," suggests Prof. Shamma, whose previous research has involved mapping responses in the auditory cortex using traditional microelectrodes.
Using different dyes, the researchers measured both how the neurons receive sound information (the inputs) and how they process that sound (the outputs). It had previously been assumed that neighboring neurons receiving the same inputs would also produce the same outputs, but Kanold's team found something very different.
"Neighboring neurons do their own thing by creating different outputs," Kanold explains. "You can imagine that you and your neighbor both receive water to your houses from the same pipe, but you do different things with it - you might cook with it while your neighbor waters the lawn. You can't assume that they are doing the same thing just because they are neighbors."
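One simple way to put a number on this "individuality" is to correlate the frequency-tuning curves of two neighboring neurons: shared input pipes, in the water analogy, need not produce correlated outputs. A minimal illustration with invented tuning curves follows (Python; these are not data from the study).

```python
import numpy as np

def tuning_correlation(curve_a, curve_b):
    """Pearson correlation between two neurons' frequency tuning curves
    (mean response per test frequency)."""
    return float(np.corrcoef(curve_a, curve_b)[0, 1])

# Hypothetical tuning curves for two neighboring neurons that receive
# similar inputs yet respond to very different frequencies.
neuron_a = np.array([0.05, 0.30, 0.60, 0.20, 0.05])
neuron_b = np.array([0.40, 0.10, 0.05, 0.15, 0.55])
print(f"{tuning_correlation(neuron_a, neuron_b):.2f}")  # -> -0.83
```

A correlation near 1 would mean the neighbors were doing the same thing; the strongly negative value in this toy example illustrates the opposite pattern the study reports.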
This is the first time that this level of individuality has been observed in neighboring neurons. Kanold, who studies the brain's ability to reorganize neural pathways, believes that there is a tremendous advantage in this apparent disorder. "Each individual neuron is getting inputs from a wide range of frequencies, and by selecting which frequencies they are strongly responding to, they might be very easily able to shift their function," he says. For example, it is well known that we can quickly listen in on a variety of conversations around us, the so-called "cocktail party effect." It may be that neurons having access to a large range of inputs might be able to quickly change which inputs they are responding to.
This suggests that there is very little redundancy in the function of cells in the auditory cortex, a notable difference from the visual cortex, where neighboring neurons perform similar functions. This could be because our acoustic environment, such as the speech we hear, changes much faster than our visual environment, so the auditory system must constantly adapt to new situations.
By better understanding how the brain processes sound, researchers may someday be able to give computers the same capability.
For more information, visit the Nature Neuroscience website.
This news item was adapted from a press release written by Kelly Blake in Chemical and Life Sciences.
February 1, 2010