Professional Development: Blog

The Science of Learning

August 9, 2017
Why Silent Reading Doesn’t Work for Struggling Readers

Reading fluency means understanding what you read, reading at a natural pace, and reading with expression. How will your students, or your child, become more fluent this year? It’s tempting to think that reading more is the key, but it turns out that silent reading does not build reading fluency in struggling readers. According to the National Reading Panel, “there is insufficient support from empirical research to suggest that independent, silent reading can be used to help students improve their fluency.” Knowing this, if a student isn’t yet a proficient reader, should they continue “reading” a not-right selection in the hope of becoming a better reader?

I liken this to my own experience learning how to paint. I have recently started going to local painting outings, and with some coaching and instruction, I am starting to understand the basic techniques. The instructor breaks each step down, section by section, goes around to everyone to make sure they are doing what is asked, and then checks back in with everyone as we progress. Shouldn’t we do the same with our students, making sure they are reading proficiently every step of the way, just as my painting instructor checks each student’s technique to make sure it will produce the result we want? If I stayed quiet and fumbled through my painting, would my technique and form improve? That is what we ask of a student who stays quiet while struggling with the text at hand.

That said, silent reading does have an important place in students’ lives. Once a student is a fluent reader, they should continue reading anything and everything available to them at their just-right level. Here are the best practices for building reading fluency for those who struggle with reading: […]

June 28, 2017
Implicit vs. Explicit Instruction: Which is Better for Word Learning?

Does traditional or exploratory learning work better? As educators, we are constantly faced with the question of how best to present material so that it is optimally “learnable” for the different students we are trying to reach. There is considerable evidence both for and against self-directed and exploratory learning, so there is a great opportunity for neuroscience to examine the ground-level differences between these and more traditional methods of instruction, and how the brain reacts to each. One of those differences is the subject of current investigation: the divide between explicit and implicit instruction.

By explicit instruction, we mean teaching in which the instructor clearly outlines the learning goals for the student and offers clear, unambiguous explanations of the skills and information structures being presented. By implicit instruction, we mean teaching in which the instructor does not overtly outline such goals or offer such explanations, but simply presents the information or problem to the student and allows them to draw their own conclusions, build their own conceptual structures, and assimilate the information in whatever way makes the most sense to them.

Which is more effective? One study out of Vanderbilt University recently looked at this question as it applies to word learning. In this study, principal investigator Laurie Cutting and her team examined 34 adult readers, 21 to 36 years of age. The subjects were taught pseudowords: words that resemble real words but have no meaning, such as “skoat” or “chote.” Then, through both explicit and implicit instruction, subjects were taught meanings for these words. (In the study, both of these pseudowords were associated with the picture of a dog.) The goal was to gain a clearer understanding of how people with different skills and capabilities processed short-term instruction, how […]

April 20, 2011
Students Exceed State Average on TAKS after Fast ForWord, Maintain Gains

Since the 2004-2005 school year, the Dallas Independent School District has used the Fast ForWord products in many of its high schools. This multi-year study followed more than 500 high school students from 20 schools over the years of their Fast ForWord participation. The study shows impressive longitudinal results on the Texas Assessment of Knowledge and Skills (TAKS), which is administered annually throughout Texas and is closely aligned with the state curricular standards. (A longitudinal study is one that follows the same subjects over time.)

Students started with the Fast ForWord Middle & High School product, now known as the Fast ForWord Literacy product. Many went on to use the Fast ForWord Language to Reading and Fast ForWord to Reading products. On average, students spent 60 days using the products during a 5 ½ month period.

The scores of Fast ForWord participants moved in step with the state average until the students began using the products. During the year of Fast ForWord product use, participants experienced accelerated learning that separated their performance from that of their peers, and even up to two years after they finished using the products, the Fast ForWord participants maintained their improvements. The TAKS gains made during the study were statistically significantly larger for the Dallas Fast ForWord participants than the gains made by their statewide peers.

March 3, 2011
Truth in Numbers: School Achieves Statistically Significant Improvements on TAKS

In the 2008-2009 school year, selected students at Sam Houston Elementary School in the Grand Prairie Independent School District, TX, worked with the Reading Assistant software. To evaluate the impact of this intervention, the school conducted an observational study using scores from the Texas Assessment of Knowledge and Skills (TAKS), the annual state assessment. Administered in the spring of each year to students throughout Texas, the TAKS measures progress against the state’s curricular standards. On average, the study students worked with the Reading Assistant software for a total of two and a half hours over a 27-day period.

The outcome measure for the study was the reading portion of the TAKS. Assessment results were reported as Lexile scores, which provide a continuous scale for tracking students’ reading achievement over time. Before-and-after scores were available for 18 fifth graders who had worked with the software. Prior to using Reading Assistant, many of these students were struggling readers: only 56% of study participants met the state standard for reading proficiency in 2008, and the group’s average reading level was more than a year below where it should have been for their grade. After using Reading Assistant, the percentage of students who met the Texas state standard for reading proficiency increased from 56% to 78%, and the group’s average Lexile score rose from 541 to 753.

The study group showed statistically significant gains in both reading score and passing rate, suggesting that guided oral reading practice with Reading Assistant had a dramatic impact on reading achievement. Reading Assistant software combines advanced speech recognition technology with research-based interventions to function as a personal tutor for guided oral reading practice.
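For readers who want to see what the reported rates mean in actual student counts, here is a minimal arithmetic sketch in Python. It assumes the 18 students with before-and-after scores are the group behind the reported passing rates, which the figures imply (the study itself does not spell out the raw counts):

```python
# Reported figures from the Sam Houston Elementary study (2008-2009).
N_STUDENTS = 18            # fifth graders with before-and-after scores
PASS_RATE_BEFORE = 0.56    # met state reading standard in 2008
PASS_RATE_AFTER = 0.78     # met standard after using Reading Assistant
LEXILE_BEFORE = 541        # group's average Lexile score before
LEXILE_AFTER = 753         # group's average Lexile score after

# Convert the pass rates back to (rounded) student counts.
passed_before = round(N_STUDENTS * PASS_RATE_BEFORE)   # 10 of 18 students
passed_after = round(N_STUDENTS * PASS_RATE_AFTER)     # 14 of 18 students

# Gains in passing rate (percentage points) and average Lexile score.
pass_gain_pp = (PASS_RATE_AFTER - PASS_RATE_BEFORE) * 100
lexile_gain = LEXILE_AFTER - LEXILE_BEFORE

print(f"Passed before: {passed_before} of {N_STUDENTS}")
print(f"Passed after:  {passed_after} of {N_STUDENTS}")
print(f"Passing-rate gain: {pass_gain_pp:.0f} percentage points")
print(f"Average Lexile gain: {lexile_gain} points")
```

In other words, the 22-percentage-point improvement corresponds to roughly four additional students meeting the state standard, alongside a 212-point gain in the group’s average Lexile score.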

Copyright © 2021 Scientific Learning Corporation. All rights reserved.