End Near for Reading Recovery in MMSD?

The reduction of over $680,000 of ESEA Title 1 entitlement grant dollars challenges the district to change the way students and teachers are supported under Title 1. The current direct service model of student support cannot be supported in the long run with current funding. The administration will use the first semester of next year to develop a new model. (Page 252, Department & Division Detailed Budgets)

The MMSD uses Title 1 money to fund Reading Recovery. Does the statement above mean the end of Reading Recovery in the district?

29 thoughts on “End Near for Reading Recovery in MMSD?”

  1. I’m sure that between the band, choir, library, G/T, strings, sports budgets and soda pop vending machine revenues, enough money can be diverted to keep it afloat.
    Sacred cows are tough critters.
    Reed Schneider

  2. I don’t know what grants might be available from the federal government at this point in time.
    As you might know, Jane, Superintendent Rainwater returned more than $2 million in 2004 from a Reading First grant that he didn’t like. See http://www.schoolinfosystem.org/archives/2004/10/madison_superin.php.
    According to the Reading First Web site (http://www.ed.gov/programs/readingfirst/index.html),
    new grants are not currently available, so the MMSD can’t reapply.

  3. The district can apply for $20,000 in state funds (REACh Initiative-see DPI website) to continue work schools have begun in early literacy (Thoreau, Hawthorne and Leopold). It’s not a lot of money, but its purpose is to provide release time for teachers for professional development and discussions about how to improve literacy (and they can choose math as an area, as well as behavior). The idea is to provide evidence-based reading/math/social-behavioral prevention and intervention programs to assist all students in achieving standards. The money can also be supplemented by 15% of district flow-through dollars that can now be used for early intervening services in the district. I’m sure most parents don’t even know about the ongoing initiative, even though they are supposed to be an integral part of the process!

  4. The whole Reading First program is under investigation by the federal government’s Inspector General because of complaints that those charged to oversee the implementation were steering districts to assessments and commercial reading programs which they had a financial connection to. Further, there have been charges that the program’s implementation has not adhered to the scientific principles outlined in the actual law.
    The Madison district’s experience is similar to many of the districts who filed the original complaints. It is well to remember that the grant money was, ultimately, tied to purchasing certain commercial programs and using these programs exclusively in the five schools which were part of the grant. The data on the district’s approach (Balanced Literacy) showed better results than the data from any of the designated commercial programs.
    Of all the first grade children who received Reading Recovery in 2002 and were still in the district 2 years later to take the 3rd grade reading test, 89% tested at grade level (66% scored Proficient or Advanced). These are much more appropriate figures for judging the effectiveness of the program than the one Isthmus prefers to emphasize, the 53% who “successfully complete” the program.
    The decision to reject the Reading First grant was made based on the unanimous recommendation of the teachers and other staff who tried to work with the federal contact person.

  5. Carol,
    What does the budget’s cryptic message on reading mean?
    You’ve cited an 89% success rate for Reading Recovery before, and I cannot find it in the district’s damning analysis of the program. Can you please tell me where you get the figure?
    And thanks for posting.

  6. Carol,
    If you go to RR’s website you will see a letter they sent to the federal government complaining about the treatment they have been getting. They admit there are tens of thousands fewer kids in RR now than there were in 2000. It is always blamed on less funding. This argument, less or no funding, is without merit for one important reason. RR advocates, when selling the program to a district, promise that while the program is expensive it will actually save the district money in the long run due to lower remedial costs later on. If that is to be believed, then no district would ever get rid of the program….even if all state and federal funding were pulled. Of course, this RR promise is never ever realized.
    Funding removal, then, is merely a convenient excuse for districts to jettison the program after they figure out it’s not working as advertised….and to save face.
    I also respectfully question the 89% success rate for RR grads. First, RR screeners will typically evaluate the lowest performing 20% of 1st graders. Those they believe will not do well in RR often never make it into the program. Second, how many in the 89% figure received remediation after they went through RR? I bet quite a few did. Those should not be counted in the 89% either. Third, it’s true some kids leave the district, but you can still find out how they are doing by checking their progress at their new school. When you take these three factors into account, a number closer to 50% is more accurate.
    As to balanced literacy being the best of the commercial packages available, please go to http://www.air.org. They reviewed 22 schoolwide reform models for effectiveness via a grant from the U.S. Department of Education. The Literacy Collaborative from Ohio State was one of them. The LC is the “branded” Cadillac version of the generic term “balanced literacy”. It received a “limited effectiveness” rating. An excerpt from the report is below:
    “Of the 22 reform models examined, Direct Instruction (Full Immersion Model), based in Eugene, Ore., and Success for All, located in Baltimore, Md., received a “moderately strong” rating in “Category 1: Evidence of Positive Effects on Student Achievement.”
    Five models met the standards for the “moderate” rating in Category 1: Accelerated Schools PLUS, in Storrs, Conn.; America’s Choice School Design, based in Washington, D.C.; Core Knowledge, located in Charlottesville, Va.; School Renaissance in Madison, Wis.; and the School Development Project, based in New Haven, Conn. Models receiving a “moderate” rating may still show notable evidence of positive outcomes, but this evidence is not as strong as those models receiving a “moderately strong” or “very strong” rating.
    Eight models earned a “limited” rating in Category 1: ATLAS Communities and Co-nect, both in Cambridge, Mass.; Different Ways of Knowing, located in Santa Monica, Calif.; Integrated Thematic Instruction, based in Covington, Wash.; Literacy Collaborative, from Columbus, Ohio; National Writing Project, in Berkeley, Calif.; Modern Red Schoolhouse, based in Nashville, Tenn.; and Ventures Initiative Focus System, located in New York, N.Y.”
    Clearly there are better programs than what is described as “balanced literacy”.
    Reed Schneider

  7. How do the district’s other reading strategies compare – cost, results? What grades do they serve?
    Isn’t the more appropriate question for the School Board to look at all the district’s reading strategies and results – and compare all of this with other places?

  8. Two observations after reading Reed’s last comment.
    1. My reading of the AIR report is that the evaluation of the Literacy Collaborative was not a matter of proven ineffectiveness, but of not enough studies that met AIR’s criteria. “The studies tracked trends on reading performance of second-grade students for up to 6 academic years that Literacy Collaborative was in place. Each study showed increases in reading achievement over time but did not indicate whether the increases were statistically significant. Upon request, the studies’ authors provided data to the CSRQ Center that allowed the Center to test the findings reported in the studies. The results in all of the studies showed statistically significant pre-post differences over time, and the majority showed statistically significant, distinct upward trends in reading achievement.” Given the cost of the program, strong evidence of effectiveness is desirable. However, I don’t see anything in the AIR report that indicates that balanced literacy is ineffective. Perhaps I missed something…
    2. Declining participation is not evidence of ineffectiveness. The Reading First initiative has crowded out several programs, including Success For All. Success for All, as Reed noted, has one of the strongest ratings in the AIR report, but has seen (I have heard) participation decrease since the inception of Reading First.

  9. I’m still trying to decipher the secret cryptic message that Art is sending me in the reading portion of the budget. I really wish he’d just call and explain it. My decoder ring is old, my software is out of date, and my brain implant doesn’t seem to transmit to the Doyle building now that they’ve reinforced Art’s inner sanctum with enriched uranium from Iran;)

  10. Your post made me chuckle, David.
    The real concern is that the simple statement in the budget could lead to the top-down implementation of a dumbed-down curriculum like the middle school model and English 10 at West.
    I hope that the board convinces the administration to involve parents, teachers, and reading experts from the community to help “develop a new model.”

  11. Tim,
    Thanks for your comments.
    1) The LC is “balanced literacy.” It has gone by other names as well, such as the “Early Literacy Learning Initiative” (ELLI). These frameworks have been in existence for 10-15 years now. If their makers, or independent researchers, can’t show “statistically significant” increases by now, they never will.
    I will also have to read the AIR report again since I don’t understand your sentence that begins “the results in all the studies showed.” Help me out if you like.
    2) I guess I have to disagree with your statement that decreased participation is not an indication of ineffectiveness. I believe it most certainly is. As I said earlier, if RR truly saved remediation expenses later on, no school would ever get rid of it. For example, if you knew spending $5 would save you $7, you would do it forever…..even if the amount was raised to paying $500,000 to save $700,000. If the $5 grant you were being given to save the $7 dried up, you would be forced to find the $5 on your own…unless you determined that you weren’t really saving the $7 after all…..in which case you would stop.
    RR was just fine when the feds paid the bill. When that dries up, districts are forced to behave like rational consumers and make decisions accordingly. Reading First, therefore, has crowded out nothing. It has instead “persuaded” districts to spend their general budget funds more wisely. Personally, I think that is a good thing.
    reed schneider

  12. The economic savings are attractive and certainly important. I would like to see the School Board’s Performance and Achievement Committee have a serious discussion of reading curriculum, with input from numerous perspectives on the educational and financial issues, so that the School Board leads these important discussions in a public forum.
    Was the draft evaluation ever finalized?

  13. Carol,
    If Reading Recovery had an 89% success rate for students on the third grade reading test, I’d be its strongest advocate. But I don’t believe the number is anywhere close to accurate. I hope you can post your citation for 89%.

  14. I’ve said it before, but I will say it again: despite the spin the Isthmus and others put on the “Potter Report” http://www.nrrf.org/RdgRcvryEval.pdf
    I don’t think it was all that “damning.” For discontinued students RR has done quite well in Madison. The only practice that seems to be extremely questionable is enrolling students in RR late in the school year. This does not seem to be very cost effective. Still, I don’t know what should be done for the students who are identified as needing remedial services late.
    Ed is correct that Carol’s 89% number does not seem to be in the report itself. The category she identifies, the 2002 1st grade cohort of discontinued students who were also tested in 2004, is not disaggregated anywhere that I can find. Maybe I didn’t look hard enough. Still, it is a believable number if you look at the overall picture for discontinued students (graphically displayed on page 21 of the report).
    I’m intrigued by something else in her post and that is the relationship between “Grade Level” and the “Basic, Proficient and Advanced” groupings. I’m sure I could look it up and should know it, but is “Basic” the same as “Grade Level” or does it include some who are below “Grade Level”? Anyone want to help me out?

  15. TJM,
    Funny you should bring up “discontinued” students, because I was thinking of an analogy while biking home tonight. (I believe the analogy is also an example of begging the question in the technical sense.)
    Anyway, a surgeon who claims to be highly successful says, “I have had no fatalities among my patients who survived the surgery.”
    Reading Recovery’s “discontinued” definition is exactly the same. “The successful students (those discontinued) succeeded when we tested them after they were discontinued.”
    You cannot look only at discontinued students any more than the surgeon can look only at the survivors, because Reading Recovery selects students the teachers expect to succeed, drops those who don’t succeed after a few lessons, and keeps only those who succeed. To measure the success rate accurately, the calculation of success must include all of the students initially enrolled in Reading Recovery, at a minimum.
    The superintendent said that he wasn’t surprised, let alone disappointed, in Reading Recovery’s 50% success rate. He didn’t bother to quibble about the findings, as most Reading Recovery defenders do. He simply defended it as worthwhile. However, I’d define success as 89% or better for most reading programs, as does Carol apparently. Unfortunately, I’m still waiting for the documentation of her claim.
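    The survivorship point above can be made concrete with a short sketch. Every number below is hypothetical, chosen only to show how the two rates diverge; none of it is MMSD or Reading Recovery data:

    ```python
    # Hypothetical cohort -- all numbers are made up for illustration,
    # not taken from MMSD or Reading Recovery reports.
    initially_enrolled = 100    # all first graders who start the program
    dropped_early = 20          # removed after a few lessons
    discontinued = 60           # "successfully discontinued" students
    later_at_grade_level = 53   # discontinued students later at grade level
    # (the remaining 20 left the district or were still receiving services)

    # Rate computed only over discontinued students (the surgeon's survivors):
    survivor_only_rate = later_at_grade_level / discontinued        # ~88%

    # Rate computed over everyone who initially enrolled:
    overall_rate = later_at_grade_level / initially_enrolled        # 53%

    print(f"survivor-only rate: {survivor_only_rate:.0%}")
    print(f"overall rate:       {overall_rate:.0%}")
    ```

    Both rates describe the same cohort; only the denominator changes, which is why the calculation must include all students initially enrolled.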

  16. This will be brief – and not answer all questions – but I am getting ready to leave town for a few days. To clarify – the data I quoted was from the district’s research guy, Kurt Kiefer. I have asked the administration to do a report on several years of Reading Recovery students in terms of their performance on the third grade reading test. Ed – the data is on ALL students who took part in Reading Recovery – not just those who successfully completed the program. And the district has been looking at how to make sure that all the students have enough time to complete the program.

  17. There’s lots of discussion around reading recovery, but what reading interventions are available, for what grades across the district? How do these compare – children reached, results over time, costs, teacher support?
    It will be helpful to have some clarification about reading recovery, but information and discussion of reading curriculum on the Performance and Achievement Committee might be worthwhile and provide information to a wider audience. I don’t think this is simply about Reading Recovery = right or wrong. Somehow focusing on this seems too narrow.

  18. You’re right, Jane.
    This discussion is narrow, but that’s imposed on us because the MMSD uses Reading Recovery as the only intervention for elementary students (but only first graders) who struggle with learning to read. Otherwise, the student gets a balanced literacy curriculum in the classroom or Title 1, if the student’s school even has Title 1. In the upper grades, a few schools use Read 180 to help nonreaders, but the district doesn’t have a particularly strong commitment to Read 180.

  19. Carol,
    Thanks for your post. I hope you enjoy your break away from this madness.
    Actually, Carol, Tim Potter already prepared the detailed report you say you requested. It’s posted at http://www.thedailypage.com/features/docfeed/docs/2945c.pdf.
    In the summary on page 1, the report says:
    “When combining both successful RR students and the unsuccessful RR students over an entire school year of service delivery the overall program impact does not yield statistically significant achievement gains when comparing performance of participants to similar but non-participating students after controlling for intervening affects (e.g., poverty, special education status, parent education, etc.).”

  20. Apologies for any confusion caused by my misunderstanding of the data Carol cited.

  21. TJM,
    You wondered whether “basic” equated to grade level. It doesn’t.
    Here are the definitions from DPI’s Web site:
    Advanced: Demonstrates in-depth understanding of academic knowledge and skills tested on WKCE.
    Proficient: Demonstrates competency in the academic knowledge and skills tested on WKCE.
    Basic: Demonstrates some academic knowledge and skills tested on WKCE.
    Minimal Performance: Demonstrates very limited academic knowledge and skills tested on WKCE.

  22. I am a teacher in a school district far from yours. Take my advice and jettison Reading Recovery. It is an inferior program…and an expensive one at that.

  23. In other words, “Basic” should include only students who are not at grade level. “Grade level” implies appropriate “competence” at the expected level. Of course, there is also the question of whether or not the “academic knowledge and skills tested on WKCE” are even grade level. I know several students who tested ‘proficient’ on the fourth grade WKCE in reading, for example, whose actual applied reading or literacy levels are not sufficient to support grade level engagement.
    How do you define “proficiency”? Is a fifth grader (A) reading at “grade level” if he or she can stumble slowly through a reading passage in fifteen minutes that a stronger grade-level peer (B) can read through in five, and fifth grader “A” gets about 70% of (even surface-level) comprehension questions correct, while fifth grader “B” gets 90-100% correct in one-third of the reading time? How are you going to read instructional texts and get much out of them if it takes you three times as long and you get 20-30% less out of it in terms of accurate understanding? If you are struggling even to “catch up” on (the physical act of) a reading assignment, you are typically not going to be reading carefully enough to understand it at a deep enough level to answer thought-provoking questions (and possibly not even just fact-regurgitating questions asked immediately after reading the text in question).
    And I had to smile at your comment on how tough “sacred cows” are, Reed. I also had to think of the seven skinny cows who swallowed the seven fat cows and looked just as malnourished as they did before.

  24. Carol Carstensen provided the third grade reading scores for students who had been in Reading Recovery in first grade. According to the 2003-04 WRCT results of former Reading Recovery students, only 57% scored at or above grade level, not 89% as Carol suggested in a comment above.
    I suggested to Carol that it might be wise for the district to pilot a curriculum stressing systematic, direct, and explicit instruction, since Reading Recovery offers little of that. A pilot would tell the board whether another curriculum would help students even more.
    She’ll probably say no way.

  25. The string of comments ends on May 26th, and one comment early in the discussion indicated that MMSD will evaluate whether to continue with Reading Recovery this fall. Was RR kept intact in the 06-07 budget?
    As to the research data and inconsistencies that have been discussed between “successful completion” and students who started but did not finish with RR, I would suggest that the focus return to how students are selected in the first place to participate in RR. If a student is so weak that the staff does not believe RR will work, what happens to them? If a student makes it past first grade before falling behind in reading (as many do in every district), what interventions are available prior to middle and high school (and Read 180)? Would MMSD care to investigate a series of brain research software products (Fast ForWord) that have been shown to work with the lowest level students as well as those who are only “marginally behind” (Basic)? Take a look at data from schools in WI that have used other resources that end up substantially lower than RR in bottom-line costs and in time away from other instruction. DPI has tracked other programs via grant-funded READS and READING FIRST grants over the past five years, and can now support using several other programs that have shown equal or better results for lower costs–but the Sacred Cow may need to be sacrificed in order for MMSD to investigate further…

  26. Good questions and suggestions, Ed, but the MMSD will not even entertain such considerations. The superintendent staunchly defends Reading Recovery’s 40% failure rate; Carol Carstensen defends Reading Recovery at every turn; other board members say it’s not their role to make curriculum decisions (though they made several in the recent budget). It’s hopeless.
    With the MMSD’s commitment to balanced literacy for everything but Reading Recovery, a first grader who struggles to read in a balanced literacy classroom may participate in Reading Recovery, but if the child fails in Reading Recovery, he or she returns to the balanced literacy classroom and/or balanced literacy in Title 1. Beyond first grade, the student who struggles with balanced literacy in the classroom may be referred to get balanced literacy in Title 1. Beyond the first few grades, the MMSD offers only Read 180 in a few schools IF the principal chooses to offer Read 180.
