School Information System

December 13, 2004

Are MMSD Programs Effective? Who Knows?

This is my first post to this blog, so I'll start by introducing myself. My name is Bill Herman. I have two kids at Crestwood ES, and a third will start in the fall. Also, I work in K-12 education; I'm the technology director for Monona Grove Schools.

I read "Paper #1," criticizing MMSD for declining $2 million of federal money for reading, with interest and some dismay. With interest because it does seem odd that the district would reject such a sum even if some strings are attached. With dismay because neither side in the debate had a good way to weigh the district's key claim: that the existing program has improved student reading.

Both sides used WKCE scores to support their claims. Unfortunately, the WKCE is not a useful tool to assess the effectiveness of programs at MMSD or anywhere else, because it isn't designed to measure student progress over time, or to compare scores from one year with scores from another year. This means that we have a bigger problem than not knowing if elementary reading instruction is effective in MMSD. We are not able to decisively assess the effectiveness of any instructional program in the Madison schools.

It may be hard to believe that state-administered tests in reading, math, and language arts can't show whether students are doing better or worse over time, but DPI has warned that this is the case:

"It is very difficult to accurately compare [02-03 WKCE] scores with past years for two basic reasons. First, the tests are different. New test questions were added at the fourth- and eighth-grade levels and the tests were entirely customized at the tenth-grade level. Second, the cut scores for each proficiency category are different based on the bookmarking process conducted in February [2003] by 240 educators."

That is, there is no way of knowing whether previous WKCE tests were easier or harder than today's, and also, DPI has changed the curve. For both reasons, we can't use WKCE to gauge student progress (or lack of it) over time.

Pause and think about this. DPI says WKCE cannot tell us whether the academic skills of Wisconsin students are improving, staying the same, or getting worse over time.

This means that WKCE cannot answer the questions that are required to judge program effectiveness: Are student skills in the program area greater now than they were when the program was introduced? Can students read better? Can they do math better? Is this true of kids across the spectrum? High achievers? Low achievers? All grade levels? Different socioeconomic levels? Different ethnic groups? If we can't answer these questions, we have no way of knowing where we are doing well and where we need to improve. We are in the dark.

There are standardized tests that can pinpoint gains or losses in student learning over time. An excellent one is NWEA's MAP test, used by 1,200 school districts nationally. Like WKCE, MAP is a set of standardized tests that students take at least once a year. But unlike WKCE, MAP reveals where curriculum is effective and exposes where it is not, by measuring students' growth over time and comparing it against local and national norms, broken down by subject area, grade level, quartile, and ethnic group. MAP, unlike WKCE, can be used to gauge program effectiveness, because it measures growth over time, not merely current proficiency.

Currently 70 districts in Wisconsin use MAP, including Monona Grove and several other Madison suburban districts. I anticipate that MMSD would be reluctant to adopt MAP, in part because it would mean even more testing. However, DPI could alleviate this by allowing districts to give MAP instead of WKCE.

DPI recently signed a 10-year contract with McGraw-Hill to provide the "state assessment" (the test Wisconsin schools use to show they are meeting the requirements of No Child Left Behind). This means that for the next 10 years, WKCE is the official Wisconsin No Child Left Behind test.

However, No Child Left Behind does not require states to choose a single test; it only requires them to develop an "assessment plan" that ensures that all students' proficiency can be accurately determined. And Wisconsin statute authorizes the state superintendent to determine which test or tests can be used to determine proficiency. It is within the power of DPI to include alternatives to WKCE in its No Child Left Behind assessment plan. That is, DPI can approve alternative assessments, and then school districts could give MAP instead of WKCE.

DPI does not want to do this, because it would mean extra work for them and it would create risk for them by adding one more thing to Wisconsin's assessment plan that might get rejected by the feds. DPI's official position is that MAP is great for measuring growth but no good for determining proficiency as defined by USDOE, and they have discouraged us at Monona Grove from trying to get MAP approved as an alternative state assessment.

So I think the battle, if anyone is interested in joining it, is on two fronts: first, convincing officials at MMSD of the need to give a test that measures academic growth over time; and second, pushing DPI to approve MAP as an alternative state assessment. This would make it possible for MMSD and all Wisconsin schools to give a single test that satisfies the feds while measuring student growth.

Until we start giving tests that accurately show where students are gaining academically and where they are stalling, MMSD officials will be able to believe whatever they want to about the effectiveness of their programs, because no one will have a way to substantiate or refute their claims.

Posted by Bill Herman at December 13, 2004 10:32 AM