The Balance of Sound

The Burger lab has recently identified several properties of auditory neurons that appear to “tune” them to their own frequencies along the brain’s tonotopic map.

Sound is everywhere, and hearing it is critical as we navigate our surroundings. To make sense of sound, the ear must separate its frequencies, a process that allows you to appreciate the complexity of music and language. It achieves this by processing each frequency in a separate “channel”; that is, separate neurons in the ear respond independently to low or high frequencies. In the coiled tube of the inner ear, or cochlea, vibrations produced by sounds are converted to neural activity before being sent to the brain. Because of the mechanical properties of this tube, low frequencies resonate at one end and high frequencies at the other, much like a musical instrument. This mapping of ‘frequency to place’ is called ‘tonotopy,’ and the same tonotopic organization is repeated everywhere in the brain where sounds are processed. This organization of hearing is of particular interest to neuroscientist R. Michael Burger.

Burger’s lab recently identified several properties of auditory neurons that appear to “tune” them to their own frequencies along this tonotopic map. The key question is: how does this precise tuning arise during development? Burger thinks it may be explained by one of two theories. One holds that tonotopic properties first arise in the ear, and that during development the ear then drives the tuning of neurons in the brain. Alternatively, the brain’s organization may develop independently of the ear, relying instead on cues present in the developing brain itself to establish tonotopic patterns. A collaborator of Burger’s at the National Institutes of Health, Dr. Matthew Kelley, discovered that a bone morphogenetic protein (Bmp-7) is expressed in the developing cochlea and signals tonotopic specificity in the ear. Bmp-7 and another protein, chordin, balance one another to control cell fate: the relative dose of each protein tells a cell whether to become tuned to low or high frequencies.

Burger, associate professor of biological sciences, and graduate student Lashaka Jones are taking advantage of this finding to create chicken embryos that develop with one normal ear and one non-tonotopic ear, using a technique called in ovo (in the egg) plasmid electroporation to manipulate expression of the Bmp-7 gene.

“We can cut a window in the egg at an early stage, when the animal looks more like a worm,” he says. “With a microscope we can see the primordial ear tissue, and we drive the gene for Bmp into this ear. We can then make the entire cochlea behave like a low-frequency ear. The animal develops with one normal ear and one tonotopically disrupted ear, which will allow us to examine whether the disrupted side of the brain takes on all low-frequency characteristics. With tools we’ve developed in the lab during our recent studies, we can quickly assess the brain of an animal with the disrupted ear and determine whether the normal tonotopic gradient exists in the brain.”

Using these genetically manipulated animals, Burger hopes to resolve the degree to which the highly specialized properties of these neurons depend on normal input from the ear. These are the first studies to investigate whether brain neurons depend on their inputs from the ear during development to establish their frequency-specific properties. The work will give scientists a greater understanding of the principles the brain uses to create its exquisite and complex organization.