Study: Predicting literacy skills in children years before they read
A new study says non-reading children as young as 3 carry objective neurophysiological markers that signal whether they will struggle to read. Children that young had never been tested this way before, but the research team found a way to measure their brain responses to sound, a key part of pre-reading development.
The groundbreaking study reports promising results in predicting reading ability before reading instruction begins. More testing is needed, but if the approach holds up, scientists may be able to predict whether a toddler is at risk. That could lead to early intervention strategies that dramatically improve a child’s reading skills, said senior researcher and neurobiologist Nina Kraus, director of Northwestern’s Auditory Neuroscience Laboratory in Evanston, Ill.
“If you know you have a 3-year-old at risk, you can, as soon as possible, begin to enrich their life in sound so that you don’t lose those crucial early developmental years,” Kraus told The Huffington Post.
The study, published in the July issue of PLOS Biology, is one of the first to find that the brain’s ability to process the sounds of consonants in noise is critical for language and reading development. In other words, reading begins with the ears, not the eyes: our brains index meaningful sounds and attempt to block out noise, all within microseconds.
“This is arguably some of the most complex computation that we ask our brain to do,” Kraus told National Public Radio.
Noisy environments tax all of us when we’re trying to listen for meaningful sound, according to Martha Burns, Ph.D., Joint Appointment Professor at Northwestern University. But for children with auditory processing disorders (APD), meaningful sounds sound, well, simply muddled. And classrooms can be very noisy places, where children with APD may find it difficult to filter out irrelevant noise.
“The child’s natural instinct, just like yours, is to stop listening. As a result, children with APD often achieve way under their potential despite being very bright,” Burns wrote.
Researchers have already found ways to help children with auditory processing disorders. Burns notes, for example, that programs such as Fast ForWord Language v2 can change the brainstem response to speech and improve auditory processing skills, helping children improve their ability to listen amid competing words and to decipher words that are unclear.
But noise, as distinct from sound, particularly affects the brain’s ability to hear consonants. Consonants are spoken quickly and are neither as loud nor as long as vowels, which, in contrast, are acoustically simple.
The methodology: how did the study work?
Kraus and her team used a combination of consonants and vowels, specifically the sound “da,” to see how well kids’ brains could filter out background noise. The results showed tremendous potential for identifying children with potential reading problems later in life. “Our results suggest that the precision and stability of coding consonants in noise parallels emergent literacy skills across a broad spectrum of competencies – all before explicit reading instruction begins,” the study says.
Here’s how Kraus’s team discovered the neurological markers in youngsters too young to read. In a series of experiments, they asked 112 kids between the ages of 3 and 14 to choose a favorite movie and sit in a comfortable chair. Then researchers attached electroencephalography (EEG) electrodes to each child’s scalp to monitor brain waves while the child listened to the movie soundtrack in one ear and to test sounds in the other. The target sound, the syllable “da,” was presented over the background chatter of about half a dozen people.
The EEG output, fed to a computer, allowed Kraus’s team to see the kids’ brain waves and gauge how well each child could separate the sound “da” from the noisy chatter. The brain should respond the same way each time it hears “da,” Kraus said. But if the brain doesn’t respond the same way over and over again, something may be wrong with the child’s auditory processing. “If the brain responds differently to that same sound, (even though) the sound hasn’t changed, how is a child to learn?” Kraus said.
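The stability Kraus describes, a brain that answers repeated “da” sounds with the same waveform, can be illustrated with a simple split-half consistency measure. This is only a sketch of the general idea, not the team’s actual analysis; the function name, the simulated signals, and the noise levels below are all illustrative assumptions.

```python
import numpy as np

def response_consistency(trials, n_splits=100, seed=0):
    """Estimate trial-to-trial stability of an evoked response.

    trials: array of shape (n_trials, n_samples), one row per
    presentation of the same sound (e.g., the syllable "da").
    Repeatedly splits the trials into two random halves, averages
    each half, and correlates the two averages. A stable response
    yields correlations near 1; an unstable one, near 0.
    """
    rng = np.random.default_rng(seed)
    n = trials.shape[0]
    rs = []
    for _ in range(n_splits):
        idx = rng.permutation(n)
        a = trials[idx[: n // 2]].mean(axis=0)
        b = trials[idx[n // 2:]].mean(axis=0)
        rs.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(rs))

# Simulated demo: the same underlying waveform plus per-trial noise.
t = np.linspace(0, 0.17, 500)            # ~170 ms response window
waveform = np.sin(2 * np.pi * 100 * t)   # stand-in evoked response
rng = np.random.default_rng(1)
stable = waveform + 0.5 * rng.standard_normal((200, t.size))
unstable = waveform + 5.0 * rng.standard_normal((200, t.size))

print(response_consistency(stable) > response_consistency(unstable))
```

On this toy data, the low-noise “brain” scores much higher than the noisy one, which is the intuition behind using response stability as a marker.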
The team tested the 3-year-olds, then re-tested them the following year. In the follow-up, researchers found they could predict which of the now-4-year-olds would struggle with reading. They also tested children as old as 14 and found that they could predict reading skills and identify learning disabilities.
What are the implications?
Based on the results, researchers developed a model to predict reading performance. The test is a “biological looking glass into a child’s literacy potential,” Kraus told NPR.
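The article doesn’t detail that model, but the general approach, mapping neural measures onto a literacy score, can be sketched as an ordinary least-squares fit. Everything below is hypothetical: the feature names, the simulated scores, and the coefficients are illustrative, not the study’s published model.

```python
import numpy as np

# Hypothetical per-child features: stability and timing precision of
# the neural response to "da" in noise (illustrative values only).
rng = np.random.default_rng(42)
n_children = 112
consistency = rng.uniform(0.2, 0.95, n_children)
precision = rng.uniform(0.1, 0.9, n_children)

# Simulated literacy scores loosely tied to both features plus noise.
literacy = 40 + 30 * consistency + 20 * precision + rng.normal(0, 5, n_children)

# Fit literacy ~ b0 + b1*consistency + b2*precision by least squares.
X = np.column_stack([np.ones(n_children), consistency, precision])
coef, *_ = np.linalg.lstsq(X, literacy, rcond=None)
predicted = X @ coef

# R^2: the share of variance in literacy the neural measures explain.
ss_res = np.sum((literacy - predicted) ** 2)
ss_tot = np.sum((literacy - literacy.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(round(r2, 2))
```

A fit like this is the simplest version of a “biological looking glass”: once the coefficients are estimated, a new child’s neural measures can be plugged in to produce a predicted literacy score.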
“If the brain’s response to sound isn’t optimal, it can’t keep up with the fast, difficult computations required to process in noise,” she said. “Sound is a powerful, invisible force that is central to human communication. Everyday listening experiences bootstrap language development by cluing children in on which sounds are meaningful. If a child can’t make meaning of these sounds through the background noise, he won’t develop the linguistic resources needed when reading instruction begins.”
The findings have far-reaching consequences for parents and educators because they show that reading is rooted in perceiving sound. To reiterate: reading begins with the ears, not the eyes.
And, when it comes to reading intervention, the earlier, the better.
“My vision for this is to have every child tested at birth,” Kraus said.