Brain networks activated by listening to narrowband noise (all images are in radiologic orientation). A, Primary auditory areas (anterior superior temporal gyri) activated by controls and children with unilateral sensorineural hearing loss. B, Auditory association areas (parietal lobules and supramarginal gyrus) activated by controls only. C, Attention networks (right [C1] and left [C2] prefrontal cortices) activated by controls only.
Brain networks activated by listening to speech-in-noise stimuli. A, Secondary auditory processing areas (right [A1] and left [A2] superior temporal gyri) activated bilaterally in controls and on the left (A2) by children with unilateral sensorineural hearing loss (USNHL). B, Attention areas (right prefrontal cortex [B1] and medial frontal cortex [B2]) activated by controls and children with left-sided USNHL. C, Visual association areas (middle occipital gyri [C1] and precuneus and cuneus [C2]) activated bilaterally by children with left-sided USNHL.
Propst EJ, Greinwald JH, Schmithorst V. Neuroanatomic Differences in Children With Unilateral Sensorineural Hearing Loss Detected Using Functional Magnetic Resonance Imaging. Arch Otolaryngol Head Neck Surg. 2010;136(1):22–26. doi:10.1001/archoto.2009.208
Functional magnetic resonance imaging (fMRI) provides information about neuronal excitation by measuring changes in cerebral hemodynamics. This study used fMRI to compare neuroanatomic activation patterns in children with unilateral sensorineural hearing loss (USNHL) with the neuroanatomic activation patterns in normally hearing individuals.
Patients were presented with narrowband noise and speech-in-noise tasks while undergoing fMRI of the brain. In the narrowband noise task, 5 chirps at center frequencies of 250 Hz, 500 Hz, 1 kHz, 2 kHz, and 4 kHz were presented monaurally for 1 second in a randomized order to children in both groups. In the speech-in-noise task, Bamford-Kowal-Bench (BKB) sentences were presented over 4-talker babble to both ears, and scans were acquired after each stimulus. We compared fMRI data across groups using independent component analysis and Bayesian (hierarchical) linear models.
Tertiary referral center.
Twelve children with USNHL and 23 normally hearing controls.
fMRI was performed while subjects listened to narrowband noise and speech-in-noise tasks.
Main Outcome Measures
Neuroanatomic differences in fMRI.
In the narrowband noise task, children with USNHL had less activation of auditory areas and failed to activate auditory association areas and attention networks compared with normally hearing controls. In the speech-in-noise task, children with USNHL activated only secondary auditory processing areas in the left hemisphere, while controls activated these areas bilaterally. Children with right-sided USNHL failed to activate attention areas that were activated in controls and in children with left-sided USNHL. Only children with left-sided USNHL activated bilateral visual association areas.
Results show significant differences in the cortical processing of sound between children with severe to profound USNHL and normally hearing children. These differences may account for the functional auditory problems that children with USNHL experience.
Approximately 2.6% of school-aged children in the United States have hearing loss that substantially interferes with their education.1,2 Almost three-quarters of these children have unilateral, as opposed to bilateral, hearing loss.2 Unilateral hearing loss is generally defined as a permanent hearing loss of any degree (mild to profound) in 1 ear.
Patients with unilateral sensorineural hearing loss (USNHL) are unable to take advantage of binaural hearing.3 This loss results in auditory perceptual problems such as difficulty understanding speech in noise, reduced localization ability, poor understanding of the person talking on the side of the deaf ear, loss of binaural summation, and reduced spatial balance. In children, these difficulties have been shown to manifest as deficits in speech recognition, sound localization, academic performance, and behavior; low energy; stress; and low self-esteem.4-8
Functional magnetic resonance imaging (fMRI) has previously been used to study brain activation in response to auditory and language stimulation in children.9 Cortical activation leads to localized increases in blood flow that can be detected as signal intensity changes. This method of measuring changes in blood flow is called blood oxygenation level–dependent (BOLD) contrast MRI. Brain activation maps can be constructed and compared for activated and nonactivated states.
Previous studies investigating USNHL in adults using BOLD fMRI have demonstrated broad differences in cortical responses, such as a more balanced bilateral response in USNHL compared with the strongly lateralized response seen in normally hearing controls, which suggests adaptation following USNHL.10-12 We previously reported that when children with USNHL were presented with pure tones, they demonstrated disparate neural responses on fMRI evaluation, depending on the ear stimulated.13 The purpose of the present study was to further elucidate the differences in neural responses between children with USNHL and normally hearing children in situations approximating those found in everyday life. Specifically, we used fMRI to measure the effects of background noise and attention on cortical activation.
Subjects were recruited from the Center for Hearing and Deafness Research at Cincinnati Children's Hospital Medical Center. Institutional review board approval was obtained for the study, and informed consent was obtained from the subjects' parents, with assent obtained from the subjects themselves when appropriate. Subjects were 12 children with USNHL (6 right-sided and 6 left-sided) and 23 children with normal hearing bilaterally.
Subjects were diagnosed as having USNHL when the worse ear had a pure tone average (PTA) air conduction threshold greater than 65 dB hearing level (HL), with no single frequency better than 45 dB HL at 500 to 4000 Hz, and the better ear had a PTA of 15 dB HL or less, with no single frequency worse than 25 dB HL. All patients had idiopathic hearing loss, normal imaging findings (computed tomographic or magnetic resonance), no history of meningitis, cytomegalovirus infection, or trauma, and no family history of hearing loss.
Stimuli were presented during completely silent scanner intervals at 80 to 85 dB sound pressure level (calibrated with a B & K audiometer; Buffalo, New York) via MRI-compatible Etymotic ER-30 headphones (Elk Grove Village, Illinois). The narrowband noise task consisted of 5 narrowband noise chirps at center frequencies of 250 Hz, 500 Hz, 1 kHz, 2 kHz, and 4 kHz, each lasting 1 second, for a total of 5 seconds. Narrowband noise chirps were presented in randomized order. The control task was silence. Stimuli were presented monaurally for children with USNHL and for normally hearing controls.
In the speech-in-noise task, Bamford-Kowal-Bench (BKB) sentences were presented by a male talker over 4-talker female babble. Stimuli were presented at 21-dB signal-to-noise ratio. Stimuli were presented binaurally to detect possible advantages in normally hearing controls. The subjects' understanding of the sentence was monitored by asking them to press a button. The control task consisted of the narrowband noise chirps at the 250 Hz and 500 Hz center frequencies. Subjects also pressed the button after hearing the narrowband noise chirps.
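The narrowband stimuli described above can be approximated in Python as band-limited white noise. This is an illustrative sketch only: the 1/3-octave bandwidth, 44.1-kHz sampling rate, and brick-wall FFT filtering are assumptions, since the article does not specify how the chirps were synthesized.

```python
import numpy as np

def narrowband_noise(center_hz, fs=44100, dur_s=1.0, bw_octaves=1/3):
    """One 1-s burst of noise band-limited around center_hz.

    The bandwidth and filtering method are illustrative assumptions;
    the study does not report how its narrowband chirps were built.
    """
    n = int(fs * dur_s)
    rng = np.random.default_rng(int(center_hz))  # reproducible per band
    spec = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n, 1 / fs)
    lo = center_hz * 2 ** (-bw_octaves / 2)
    hi = center_hz * 2 ** (bw_octaves / 2)
    spec[(freqs < lo) | (freqs > hi)] = 0    # brick-wall band-pass
    band = np.fft.irfft(spec, n)
    return band / np.max(np.abs(band))       # normalize peak amplitude

# Five 1-s bursts at the task's center frequencies, in randomized order,
# concatenated into the 5-s stimulus described in the Methods.
centers = [250, 500, 1000, 2000, 4000]
order = np.random.default_rng(1).permutation(centers)
stimulus = np.concatenate([narrowband_noise(f) for f in order])
```

In practice such waveforms would also be calibrated to the reported 80 to 85 dB sound pressure level before presentation.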
The fMRI scans were collected on a Siemens (Munich, Germany) 3T Trio or Philips (Vienna, Austria) 3T Achieva system. Imaging parameters used for the echo-planar imaging fMRI scans were as follows: repetition time, 3000 milliseconds; echo time, 38 milliseconds; field of view, 25.6 × 25.6 cm; imaging matrix, 64 × 64; SENSE factor, 2. Twenty-four slices of 5-mm thickness were imaged, covering almost the whole brain. After each stimulus presentation, 3 fMRI scans were acquired at 2-second intervals to capture the peak of the hemodynamic response.
The fMRI data were postprocessed using routines written in the IDL programming language (Research Systems Inc, Boulder, Colorado). Geometric distortion in the echo-planar images was corrected for using a multiecho reference scan.14 Motion correction was performed using a pyramid iterative algorithm, and the motion-corrected data were then transformed into stereotaxic (Talairach) space using landmarks obtained from the T1-weighted anatomic images.15 Data were processed using independent component analysis (ICA). Regions of interest were defined from ICA maps, and fMRI time courses were extracted. For the speech-in-noise task, the contrast used was speech at 0-dB signal-to-noise ratio (the most difficult) vs the control task. A Bayesian (hierarchical) linear model was used to find networks with significant differences between children with USNHL and controls (P < .05).
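The ICA decomposition step can be illustrated on toy data. The sketch below uses scikit-learn's FastICA rather than the authors' IDL routines, and the two "source" time courses (a boxcar task response and a slow scanner drift) are hypothetical stand-ins for real voxel data.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two hypothetical source time courses: a boxcar task response and a
# slow scanner drift, mixed across 50 simulated voxels over 60 scans.
rng = np.random.default_rng(0)
t = np.arange(60)
task = (t % 10 < 5).astype(float)
drift = t / t.max()
mixing = rng.standard_normal((50, 2))
voxels = mixing @ np.vstack([task, drift]) + 0.01 * rng.standard_normal((50, 60))

# Decompose the voxel time series into independent component time
# courses; in a real analysis, regions of interest and group
# contrasts would be built from spatial maps of components like these.
ica = FastICA(n_components=2, random_state=0, max_iter=1000)
components = ica.fit_transform(voxels.T)  # shape: (timepoints, components)
```

The component time courses recovered this way can then be compared across groups, analogous to the Bayesian group comparison the authors report.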
The mean (SD) age of children with USNHL was 9.04 (1.62) years (range, 7.23-11.75 years). The mean age of normally hearing controls was 9.75 (1.43) years (range, 7.32-11.61 years). There was no significant difference in age between children with and without USNHL (P = .19 by the t test) and between children with right and left USNHL (P = .50 by the t test).
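The age comparison reported here is a standard two-sample t test, which can be sketched in a few lines of NumPy. The ages below are hypothetical illustrative values, not the study's raw data.

```python
import numpy as np

def two_sample_t(a, b):
    """Pooled-variance two-sample t statistic (equal variances assumed)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    t = (a.mean() - b.mean()) / np.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2  # t statistic and degrees of freedom

# Hypothetical ages for illustration only (not the study's data).
usnhl_ages = [7.5, 8.2, 9.0, 9.6, 10.1, 11.3]
control_ages = [8.0, 9.1, 9.8, 10.2, 10.9, 11.5]
t_stat, df = two_sample_t(usnhl_ages, control_ages)
```

The resulting t statistic would be compared against the t distribution with the returned degrees of freedom to obtain a P value like those quoted above.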
In the narrowband noise task (Figure 1), although both normally hearing controls and children with USNHL activated primary auditory areas such as the anterior superior temporal gyri bilaterally (P < .001 for both areas), children with USNHL had less activation of these areas (P = .003). In addition, only controls activated the auditory association areas such as the parietal lobules and supramarginal gyrus (P < .001) and attention networks such as the right prefrontal cortex and left prefrontal cortex (P < .001), while children with USNHL failed to do so (P = .59 for the association areas; for the attention networks, P = .62 for the right prefrontal cortex, and P = .73 for the left prefrontal cortex). Figure 1 displays right and left prefrontal cortices as separate images because the ICA algorithm analyzes each side independently.
In the speech-in-noise task (Figure 2), normally hearing controls activated secondary auditory processing areas bilaterally such as the superior and middle temporal gyri (right hemisphere, P = .04; left hemisphere, P = .02), whereas children with USNHL activated only left-sided auditory processing areas (right, P = .32; left, P = .02). Figure 2 displays right and left auditory processing areas as separate images because the ICA algorithm analyzes each side independently. In addition, normally hearing controls and children with left-sided USNHL activated attention networks (right prefrontal cortex, P = .03 and P = .03, respectively; medial frontal cortex, P = .01 and P = .04, respectively), whereas children with right-sided USNHL failed to activate these networks (P = .85 and P = .99, respectively). Finally, only children with left-sided USNHL activated bilateral visual association areas in the occipital lobe (middle occipital gyri, P = .01; precuneus and cuneus, P = .005).
Normally hearing controls and children with USNHL activated primary auditory areas bilaterally (anterior superior temporal gyri). Bilateral activation of primary auditory areas in children with USNHL is not surprising, given that cochlear nerve fibers synapse in the dorsal and ventral cochlear nuclei and second order neurons project bilaterally thereafter.16 However, children with USNHL had less activation of auditory areas than normally hearing controls, likely because they have less auditory input overall than children with binaural hearing. In monaural deaf adults, a more balanced auditory response has been reported following monaural stimulation as opposed to the strong contralateral response typically seen following monaural stimulation in normally hearing adults.10 Decreased activation of auditory areas in adults with binaural sensorineural hearing loss compared with normally hearing individuals has been demonstrated previously.17
Controls activated auditory association areas (parietal lobules and supramarginal gyrus). Auditory association areas are involved in a higher level of sound integration than primary auditory areas. Traditionally, only speech and phonetic analysis were believed to activate auditory association areas; however, more recent studies have demonstrated strong activation of these areas in response to pure tones as well.18 Children with USNHL failed to activate auditory association areas. Previous studies have found decreased activation of auditory association areas in cochlear implant users with a longer, vs shorter, duration of deafness.19 Results from the present study suggest that unilateral auditory stimulation using a simple stimulus such as narrowband noise may not provide enough stimulation for activation of auditory association areas.
Finally, controls activated attention networks (bilateral prefrontal cortices), whereas children with USNHL failed to activate these networks. Results suggest that auditory attention networks are not as robust in children with USNHL as they are in normally hearing controls, which may account for the poor academic performance, low energy, and low self-esteem seen in children with USNHL.4-8
Normally hearing controls activated secondary auditory processing areas bilaterally (superior and middle temporal gyri), whereas children with USNHL activated these areas only on the left side. Children with USNHL activated auditory processing areas in the speech-in-noise task but failed to do so in the narrowband noise task, likely because speech-in-noise stimulus is more complex than narrowband noise. Left-hemisphere dominance within the primary auditory cortex has been shown using fMRI and monaural pure tone stimulation in normally hearing individuals as old as 30 years, regardless of which ear was stimulated.20 Results from the present study suggest that secondary auditory processing areas may have a similar left-hemisphere dominance in children with USNHL.
Normally hearing controls and children with left-sided USNHL activated attention networks (right prefrontal cortex and medial frontal cortex), whereas children with right-sided USNHL failed to activate these networks. Previous studies using positron-emission tomography have demonstrated that auditory attention engages a network of right-hemisphere cortical regions.21,22 Results from the present study suggest that right-sided auditory input and subsequent stimulation of the left auditory cortex, which is known to be dominant,20 result in greater activation of attention areas despite the fact that the attention areas are located in the contralateral brain. Clinically, children with USNHL have demonstrated deficits in speech recognition, sound localization, academic performance, behavior, energy, and self-esteem,4-8 all of which may be owing to a combination of decreased auditory input and less robust attention networks. Results from the present study suggest that these deficits may be more pronounced in children with right-sided USNHL.
Finally, only children with left-sided USNHL activated bilateral visual association areas (middle occipital gyri, precuneus, and cuneus) in the occipital lobe. Bilateral occipital lobe activation in normally hearing subjects exposed to an auditory stimulus with their eyes shut has been demonstrated previously using fMRI.17 This suggests that even in individuals with normal hearing and vision, there may exist a connection between auditory and visual cortices. Greater activation of the visual cortex in adults with bilateral hearing loss than in normally hearing adults has also been described previously.17 Conversely, previous studies have also observed activation of the visual cortex during auditory stimulation in blind subjects and auditory cortex activation during visual stimulation in deaf subjects, further confirming evidence for cross-modal plasticity (utilization of brain areas that typically interpret input from other sensory modalities) between auditory and visual cortical regions.23,24 Results from the present study provide evidence for cross-modal plasticity in children with left-sided USNHL and suggest that auditory-visual connections may be more robust in the left side of the brain than in the right. This may result from left auditory cortex dominance.20
Limitations of the present study are the small sample size and that results may not be generalizable to children of all ages. In addition, the present study excluded children with known causes of USNHL such as temporal bone anomalies, meningitis, cytomegalovirus, or trauma, and therefore results may not be applicable to these children.
In conclusion, our results show significant differences in the cortical processing of sound between children with severe to profound USNHL and normally hearing children. In addition, children with USNHL demonstrated decreased activation of attention networks in response to auditory stimuli. This may contribute to the auditory and behavioral problems seen in children with USNHL.
Correspondence: Evan J. Propst, MD, MSc, FRCSC, Division of Pediatric Otolaryngology–Head and Neck Surgery, Cincinnati Children's Hospital Medical Center, MLC 2018, 3333 Burnet Ave, Cincinnati, OH 45229-3039 (firstname.lastname@example.org).
Submitted for Publication: April 2, 2009; final revision received June 8, 2009; accepted August 3, 2009.
Author Contributions: Dr Schmithorst had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. Study concept and design: Greinwald and Schmithorst. Acquisition of data: Schmithorst. Analysis and interpretation of data: Propst and Schmithorst. Drafting of the manuscript: Propst, Greinwald, and Schmithorst. Critical revision of the manuscript for important intellectual content: Propst, Greinwald, and Schmithorst. Statistical analysis: Propst and Schmithorst. Study supervision: Greinwald.
Financial Disclosure: None reported.
Previous Presentation: This article was presented at the American Society for Pediatric Otolaryngology Annual Meeting; May 23, 2009; Seattle, Washington.