ISU-Meridian Professor Studies Brain Waves to Help Children with Cochlear Implants Learn to Speak
Fall 2010 Issue | By Chris Gabettas
Dressed in jeans, cowboy boots and a dinosaur shirt, 5-year-old Samuel Fish sits on a chair in the Speech and Language Laboratory at the Idaho State University-Meridian Health Science Center.
Samuel Fish squirms as he is fitted with the electrode-sensor net that measures brain activity.
Crinkling his nose and squinting his eyes, he does his best to hold still as Dr. Jeanne Johnson and graduate assistant Elizabeth Subasic stretch a cap, called an electrode sensor net, over the top of his head. The cap is a maze of wires connecting 128 electrodes that correspond to various regions of the brain.
"Good job Sam. We're ready to go," says Johnson, Ph.D., an associate professor of speech-language pathology and associate chair of ISU's Department of Communication Sciences and Disorders.
Johnson, who joined the ISU-Meridian faculty in 2009 after spending 21 years at Washington State University, is building on research she first conducted in 2007 while on sabbatical at the University of Louisville Developmental Neuroscience Laboratory. She's studying brain waves to determine spoken language development patterns in children who have cochlear implants.
Cochlear implants are complex devices that, when surgically implanted in the inner ear, can help the severely or profoundly deaf distinguish sounds. Through extensive speech therapy and auditory training, implant recipients often learn to speak. According to the National Institute on Deafness and Other Communication Disorders, toddlers and young children are some of the best candidates for cochlear implants because youngsters tend to be more flexible in learning spoken language than adults.
Samuel, who is not deaf, is playing a key role in Johnson's research by helping her build a database of brain waves from children who hear normally. Johnson will then compare their waves to the brain waves of children with cochlear implants to see if the patterns are similar.
"Ultimately, I'm trying to determine why some children with cochlear implants do well (with understanding and using spoken language) and others do not," says Johnson.
With the electrode net connected to a computer, Samuel watches a silent "Sylvester and Tweety Bird" cartoon to hold his attention as Johnson instructs him to listen to three syllables: Ba, Da and Ga.
When Johnson administered the same Ba/Da/Ga exercise to the 30 children in her 2007 research at the University of Louisville (15 with cochlear implants and 15 without), she found both groups were processing sound similarly. The brain waves differed, though, when she presented sentences that ended in unexpected words, such as "dogs like to dark" instead of the expected "bark." The brain waves of children with cochlear implants indicated they did not notice the unexpected word.
"Why were some of the children with cochlear implants not catching this?" Johnson asks.
She hopes her research at ISU will help her answer that question and help identify deaf children who are likely to have more difficulty learning spoken language than their peers.
"With that knowledge, we can then tailor the way we teach spoken language to fit a child's individual needs," Johnson says. "For instance, some children may need more visual cues to accompany the spoken language they are hearing while others may do well with listening alone."