U.W. psychologist Mark Seidenberg wrote an editorial in Sunday's (12/12/04) edition of the Wisconsin State Journal critical of the way the district is presenting its reading data. He also points out that although Superintendent Rainwater would like the public to believe that accepting the Reading First funds would have required him to "eliminate" the district's current reading curriculum (the one used throughout the district), "[t]he acceptance of Reading First funding has no bearing on the curriculum used in other schools."
Madison schools distort reading data
12/12/04
Mark S. Seidenberg
As a taxpayer who believes in the importance of reading, I'm having trouble understanding why Madison schools Superintendent Art Rainwater turned down $2 million that was supposed to be used to help educationally disadvantaged children in five Madison schools.
The superintendent and Assistant Superintendent Jane Belmore have offered explanations that don't wash. The district accepted funds for the first year of a five-year award under the federal government's Reading First program. After the first year, the program was assessed by an educational consultant hired to evaluate how the funds were being used. The evaluator found that reading programs in the target schools were not adequately documented. She asked for information about "scope and sequence" (educationese for "what will be taught when") and daily instructional activities. The school district in its wisdom decided that rather than comply with these conditions it would give back the money. Why?
Rainwater's explanation of this precipitous decision - echoed in published comments by Belmore and school board member Carol Carstensen - is that accepting the Reading First funds would have required him to "eliminate" the district's current reading curriculum - the one used throughout the district.
These assertions are unequivocally false. The acceptance of Reading First funding has no bearing on the curriculum used in other schools. The evaluator clearly requested changes in the Reading First program at the five schools, not the district as a whole. If the school district administrators were confused about this, they could have requested clarification. If they felt the conditions were unreasonable, they could have appealed.
Rainwater's explanation also emphasized the fact that 80 percent of Madison children score at or above grade level. But the funds were targeted for students who do not score at these levels. Current practices are clearly not working for these children, and the Reading First funds would have supported activities designed to help them.
Madison's reading curriculum undoubtedly works well in many settings. For whatever reasons, many children at the five targeted schools had fallen seriously behind. It is not an indictment of the district to acknowledge that these children might have benefited from additional resources and intervention strategies.
In her column, Belmore also emphasized the 80 percent of the children who are doing well, but she provided additional statistics indicating that test scores are improving at the five target schools. Thus she argued that the best thing is to stick with the current program rather than use the Reading First money.
Belmore has provided a lesson in the selective use of statistics. It's true that third grade reading scores improved at the schools between 1998 and 2004. However, at Hawthorne, scores have been flat (not improving) since 2000; at Glendale, flat since 2001; at Midvale/Lincoln, flat since 2002; and at Orchard Ridge they have improved since 2002 - bringing them back to slightly higher than where they were in 2001.
In short, these schools are not making steady upward progress, at least as measured by this test.
Belmore's attitude is that the current program is working at these schools and that the percentage of advanced/proficient readers will eventually reach the districtwide success level. But what happens to the children who have reading problems now? The school district seems to be writing them off.
So why did the school district give the money back? Belmore provided a clue when she said that continuing to take part in the program would mean incrementally ceding control over how reading is taught in Madison's schools (Capital Times, Oct 16). In other words, Reading First is a push down the slippery slope toward federal control over public education.
Parents and educators are right to be concerned about federal incursion into local school districts via legislation such as "No Child Left Behind." However, the place to take a stand was not in refusing monies that could have been used in many ways to help children in need. Our school administrators placed their politics above their responsibility to educate all of our children.
Seidenberg is a UW-Madison psychology professor.
This is my first post to this blog, so I’ll start by introducing myself. My name is Bill Herman. I have two kids at Crestwood ES, and a third will start in the fall. Also, I work in K-12 education; I’m the technology director for Monona Grove Schools.
I read “Paper #1,” criticizing MMSD for declining $2 million of federal money for reading, with interest and some dismay. With interest because it does seem odd that the district would reject such a sum even if some strings are attached. With dismay because neither side in the debate had a good way to weigh the district’s key claim—that the existing program has improved student reading.
Both sides used WKCE scores to support their claims. Unfortunately, the WKCE is not a useful tool to assess the effectiveness of programs at MMSD or anywhere else, because it isn’t designed to measure student progress over time, or to compare scores from one year with scores from another year. This means that we have a bigger problem than not knowing if elementary reading instruction is effective in MMSD. We are not able to decisively assess the effectiveness of any instructional program in the Madison schools.
It may be hard to believe that state-administered tests in reading, math, and language arts can’t show whether students are doing better or worse over time, but DPI has warned that this is the case:
“It is very difficult to accurately compare [02-03 WKCE] scores with past years for two basic reasons. First, the tests are different. New test questions were added at the fourth- and eighth-grade levels and the tests were entirely customized at the tenth-grade level. Second, the cut scores for each proficiency category are different based on the bookmarking process conducted in February [2003] by 240 educators.” (http://www.dpi.state.wi.us/oea/pdf/profnewq&a.pdf)
That is, there is no way of knowing whether previous WKCE tests were easier or harder than today's; moreover, DPI has changed the curve. For both reasons, we can't use WKCE to gauge student progress (or lack of it) over time.
Pause and think about this. DPI says WKCE cannot tell us whether the academic skills of Wisconsin students are improving, staying the same, or getting worse over time.
This means that WKCE cannot answer the questions that are required to judge program effectiveness: Are student skills in the program area greater now than they were when the program was introduced? Can students read better? Can they do math better? Is this true of kids across the spectrum? High achievers? Low achievers? All grade levels? Different socioeconomic levels? Different ethnic groups? If we can’t answer these questions, we have no way of knowing where we are doing well and where we need to improve. We are in the dark.
There are standardized tests that can pinpoint gains or losses in student learning over time. An excellent one is NWEA’s MAP test (www.nwea.org), used by 1,200 school districts nationally. Like WKCE, MAP is a set of standardized tests that students take at least once a year. But unlike WKCE, MAP reveals where curriculum is effective and exposes where it is not, by measuring students’ growth over time—and comparing it against local and national norms—in subject areas, grade levels, quartiles, and ethnic groups. MAP, unlike WKCE, can be used to gauge program effectiveness, because it measures growth over time, and not merely current proficiency.
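To make that distinction concrete, here is a minimal sketch in Python, using entirely invented scores and cut scores (none of these numbers come from WKCE or MAP). It shows how a proficiency rate can look flat across two test administrations whenever the cut score moves, even while every individual student improves:

# Hypothetical illustration: proficiency snapshots vs. per-student growth.
# All numbers are invented for this example; they are not WKCE or MAP data.

fall_scores   = {"Ana": 195, "Ben": 204, "Cal": 188, "Dee": 210}
spring_scores = {"Ana": 203, "Ben": 209, "Cal": 199, "Dee": 214}

def proficiency_rate(scores, cut_score):
    """Share of students at or above the cut score (a snapshot)."""
    return sum(score >= cut_score for score in scores.values()) / len(scores)

# If the cut score is re-set between administrations (as DPI's 2003
# bookmarking process did), the two snapshots are not comparable:
print(proficiency_rate(fall_scores, cut_score=200))    # 0.5
print(proficiency_rate(spring_scores, cut_score=205))  # 0.5 -- looks flat

# Per-student growth ignores the cut score entirely and shows that
# every student actually gained ground:
growth = {name: spring_scores[name] - fall_scores[name] for name in fall_scores}
print(growth)  # {'Ana': 8, 'Ben': 5, 'Cal': 11, 'Dee': 4}

The sketch assumes what a vertically scaled test like MAP provides: that both administrations report scores on one continuous scale, so subtracting them is meaningful. That is exactly the assumption that redesigned WKCE tests and re-bookmarked cut scores break.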
Currently 70 districts in Wisconsin use MAP, including Monona Grove and several other Madison suburban districts. I anticipate that MMSD would be reluctant to adopt MAP, in part because it would mean even more testing. However, DPI could alleviate this by allowing districts to give MAP instead of WKCE.
DPI recently signed a 10-year contract with McGraw-Hill to provide the “state assessment” (the test Wisconsin schools use to show they are meeting the requirements of No Child Left Behind). This means that for the next 10 years, WKCE is the official Wisconsin No Child Left Behind test.
However, No Child Left Behind does not require states to choose a single test, only to develop an “assessment plan” that ensures that all students’ proficiency can be accurately determined. And Wisconsin statute authorizes the state superintendent to determine which test or tests can be used to determine proficiency. It is within DPI’s power to include alternatives to WKCE in its No Child Left Behind assessment plan. That is, DPI can approve alternative assessments, and school districts could then give MAP instead of WKCE.
DPI does not want to do this, because it would mean extra work and would create risk by adding one more thing to Wisconsin’s assessment plan that might get rejected by the feds. DPI’s official position is that MAP is great for measuring growth but no good for determining proficiency as defined by USDOE, and they have discouraged us at Monona Grove from trying to get MAP approved as an alternative state assessment.
So I think the battle, if anyone is interested in joining it, is on two fronts—one, convincing officials at MMSD of the need to give a test that measures academic growth over time, and two, pushing DPI to approve MAP as an alternative state assessment. This would make it possible for MMSD and all Wisconsin schools to give a single test to satisfy the feds while measuring student growth.
Until we start giving tests that accurately show where students are gaining academically and where they are stalling, MMSD officials will be able to believe whatever they want to about the effectiveness of their programs, because no one will have a way to substantiate or refute their claims.