Madison and Wisconsin Math Data, 8th Grade

At a meeting on February 22 (audio / video), representatives of the Madison Metropolitan School District presented some data [820K pdf | html (click the slide to advance to the next screen)] which they claimed showed that their middle school math series, Connected Mathematics Project, was responsible for some dramatic gains in student learning. There was data on the percent of students passing algebra by the end of ninth grade and data from the state eighth grade math test for eight years. Let us look at the test data in a bit more detail.

Only data from MMSD was presented, and it showed a very sharp rise in the percent of students scoring at the advanced or proficient level in the last three years. To see whether something other than an actual rise in achievement was responsible for this, consider not only the Madison data but also the corresponding data for the State of Wisconsin.

The numbers below are the percent of students who scored advanced or proficient by the criteria used that year. The Madison numbers are slightly different from those presented, since MMSD computed its percentages using only the students who took the test as the denominator, while what is given here is the percent of all 8th grade students who reached these two levels. Since this is a comparative study, either denominator could have been used. I think it is unlikely that the students who were not tested would have had the same overall results as those who were tested, which is why I did not recompute the State results with this modification. When we get to the scores by racial group, the data presented by MMSD did not use the correction they applied to the all-student figures (all 8th grade students in both cases).
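
To make the difference in denominators concrete, here is a minimal sketch; the counts below are hypothetical, invented only to illustrate the two calculations:

```python
# Hypothetical counts, invented for illustration only.
enrolled = 2000      # all 8th grade students in the district
tested = 1900        # students who actually took the test
adv_or_prof = 900    # students scoring advanced or proficient

# MMSD's presentation: percent of the students who took the test.
pct_of_tested = 100 * adv_or_prof / tested    # about 47.4%

# The figures in this article: percent of all 8th grade students.
pct_of_all = 100 * adv_or_prof / enrolled     # 45.0%

print(f"Percent of tested students: {pct_of_tested:.1f}%")
print(f"Percent of all 8th graders: {pct_of_all:.1f}%")
```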

Test date   MMSD   Wisconsin
Oct 1997     40       30
Feb 1999     45       42
Feb 2000     47       42
Feb 2001     44       39
Feb 2002     48       44
Nov 2002     72       73
Nov 2003     60       65
Nov 2004     71       72

This is not the picture of a remarkably successful program. Madison went from a district that was above the State average to one with scores at best at the State average. The State test was changed from a nationally normed test to one written just for Wisconsin, with the performance levels set without a national norm. That change is what caused the dramatic rise from February 2002 to November 2002; it was not that all of the middle schools were now using Connected Mathematics Project, which was the reason given at the meeting for these increases.
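
One way to see this is to compute the Madison-minus-state gap for each administration. A minimal sketch in Python, using only the numbers from the table above:

```python
# Percent advanced or proficient (MMSD, Wisconsin), from the table above.
scores = {
    "Oct 1997": (40, 30), "Feb 1999": (45, 42), "Feb 2000": (47, 42),
    "Feb 2001": (44, 39), "Feb 2002": (48, 44),
    # The new Wisconsin-written test with new cut scores starts here.
    "Nov 2002": (72, 73), "Nov 2003": (60, 65), "Nov 2004": (71, 72),
}

for date, (mmsd, state) in scores.items():
    print(f"{date}: MMSD {mmsd}%, Wisconsin {state}%, gap {mmsd - state:+d}")

# Under the old test the gap runs from +3 to +10 points in Madison's favor;
# under the new test Madison trails the state by 1 to 5 points.
```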

It is worth looking at a breakdown by racial group to see if there is something going on there. The format is the same as above.

Hispanics
Test date   MMSD   Wisconsin
Oct 1997     19       11
Feb 1999     25       17
Feb 2000     29       18
Feb 2001     21       15
Feb 2002     25       17
Nov 2002     48       46
Nov 2003     37       38
Nov 2004     50       49

Black (Not of Hispanic Origin)
Test date   MMSD   Wisconsin
Oct 1997      8        5
Feb 1999     10        7
Feb 2000     11        7
Feb 2001      8        6
Feb 2002     13        7
Nov 2002     44       30
Nov 2003     29       24
Nov 2004     39       29

Asian
Test date   MMSD   Wisconsin
Oct 1997     25       22
Feb 1999     36       31
Feb 2000     35       33
Feb 2001     36       29
Feb 2002     41       31
Nov 2002     65       68
Nov 2003     55       53
Nov 2004     73       77

White
Test date   MMSD   Wisconsin
Oct 1997     54       35
Feb 1999     59       48
Feb 2000     60       47
Feb 2001     58       48
Feb 2002     62       51
Nov 2002     86       81
Nov 2003     78       73
Nov 2004     88       81

I see nothing in the breakdown by race which supports the claim that Connected Mathematics Project has been responsible for remarkable gains. I do see a lack of knowledge of how to read, understand, and present data, which should concern everyone in Madison who cares about public education. The School Board is owed an explanation for this misleading presentation. I also wonder about the presentations given to the School Board itself: have they been as misleading as those given at this public meeting?

Richard Askey

7 thoughts on “Madison and Wisconsin Math Data, 8th Grade”

  1. Thank you, thank you, Dick, for taking the time to put this together, and for the very clear presentation. Speaking as a social scientist who had a first career as a researcher and who used to teach statistics (undergraduate and graduate level), I am regularly appalled by the low level of quantitative understanding demonstrated by the people who run our school district. It is shocking; it is frightening.
    I’m going to send this link to everyone I can think of, especially the Administration, the BOE, and Linda McQuillen.

  2. Thanks for posting the data. I’d encourage those interested to go read the details from the presentation. I missed it, but I’m impressed. We live in a great community, and deep concern about math education is another sign that we have a well-informed and engaged community.
    I, too, would like to see a stronger showing of math skills. However, more importantly in my opinion, I’d like to see kids encouraged to find those methods _that work best for them_ in context of solving broader social problems. Sure, they’ll need lots of practice, but they should also care to some degree about what they are doing. Developing mathematical reasoning skills for a wider set of learning approaches in a social context is a main goal within connected math. I’ll defer to those who look very carefully at pedagogy (people like Dr. Terry Millar) to explain more.
    If you’re for reform of the school leadership because of their support of connected math, you should consider the kinds of social problems that seem to reinforce the need for mathematical understanding in a broader social context. We are generally numb to policies formed around us without much mathematical reasoning. You don’t have to look far outside the schools for plenty of examples of poor mathematical reasoning with wide support (private debt and government support of predatory lending practices, tax cuts applied to the most resourceful population during times of record deficits, broad-based emissions trading of pollutants with heavily regionalized impacts, prisons without drug counseling, the rise of gambling supported as a governmental revenue source, and draconian funding limit measures for education).
    In these unstable times, effective mathematical reasoning will be a major leg up for the individual – but it cannot compensate for the ignorance of the masses. An individual who understands just how fast a sinking ship is filling with water won’t necessarily be able to stop the leak – she’ll just know the best time to jump. We need creative solutions by at least a few and an operative understanding by many.
    Back to the data (sorry for my preaching)… Study design is critical to effective interpretation of the results. Within the presentation there is a control case (CMP compared against school A and school B without CMP). The performance of connected math in this study looks very good. I’d encourage a closer look at the Madison data with some controls and data normalization in mind. One thing to note is that the ESL population and other variables changed in very significant ways in the 1990s. I think there are some very real connections between other forms of literacy and mathematical ability as measured by standardized tests. So the relative changes in test performance between Madison and other areas of Wisconsin should also be checked relative to other measures of literacy. Also, in looking for solutions, we also need to support goals for improvements in other forms of literacy.
    Let’s stick with this. Thanks again to those who take the time to study and share the performance data in meaningful and interesting ways.

  3. Thanks, Dick, for this informative analysis regarding the 8th-grade test results. The interpretation of the data presented at the meeting showing % students successfully passing 9th-grade algebra by the end of 9th grade is similarly flawed. In this case, the dramatic jump observed in the % passing the course by the end of 9th grade is probably largely due to the MMSD having recently mandated the elimination of a remedial pre-algebra course for 9th graders who flunked math in 8th grade, requiring all students to now attempt 9th-grade math in 9th grade, instead. Previously, these 9th graders in remedial math could not possibly have passed 9th-grade math by the end of 9th grade because they had yet to attempt the course. When queried after the meeting, Ms. McQuillen admitted that the percent of the students who fail to pass 9th-grade math, ignoring the year in which they attempt it, has not changed significantly. Thus, the introduction of CMP has failed to decrease the % of MMSD students who can’t graduate from high school because they fail to pass algebra I and geometry.

  4. I guess I miss Janet Mertz’s point here, or its implications.
    Assuming that the % of 9th graders passing algebra by the end of 9th grade includes students who did not take algebra, the dramatic jump in % passing when all kids are required to take algebra does not seem to lend itself to arguing “artificial” inflation.
    If kids who would normally have taken pre-algebra now take algebra and pass, then that is a positive outcome. If these kids now include those who flunked 8th grade math, then this is all the more positive.
    Of course, if algebra has been watered down generally, or its level lowered even for those kids with appropriate preparation, then this % increase would not be a positive sign.

  5. This has been bothering me since I first read it. It is not uncommon for a particular year’s standardized test to be normed wrong and to produce outlier data, and I thought that this might have been the case for 2002 and 2004. With the increased emphasis on these tests, there are also reports that the companies creating the tests are yielding to pressure to set the scales to produce pleasing statistics (see: http://www.dailyhowler.com/dh070705.shtml, scroll down). This may also be the case here.
    One thing that I thought might be worth looking at is how the 2002 8th Grade cohort did on the 2004 test. When I went to the DPI site to look for this I found the following sentence: “Cutscores for proficiency levels changed effective November, 2002.”
    So yes, the tests were renormed in 2002, and any comparisons with previous years are essentially useless (unless adjustments are made). DPI recognizes this and does not include pre-2002 statistics on its site.
    Is MMSD making these adjustments? Are the numbers they provide good numbers, either adjusted or from a test that wasn’t renormed, or are they the numbers DPI has rejected? If it is the latter, it is either gross incompetence or a deliberate attempt to mislead the public. I said if (I don’t know), but if the numbers are what they seem to be, heads should roll.
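    As a minimal sketch of the only comparison that stays honest without adjustments, one can split the series at the November 2002 cut-score change and look at the trend within each regime separately (the MMSD numbers are taken from the post above):

    ```python
    # MMSD percent advanced or proficient, from the post above.
    old_norms = {"Oct 1997": 40, "Feb 1999": 45, "Feb 2000": 47,
                 "Feb 2001": 44, "Feb 2002": 48}
    new_norms = {"Nov 2002": 72, "Nov 2003": 60, "Nov 2004": 71}  # cut scores reset

    def change_within(series):
        # Change from first to last administration under one set of cut scores.
        values = list(series.values())
        return values[-1] - values[0]

    print("Change under old cut scores:", change_within(old_norms))  # +8
    print("Change under new cut scores:", change_within(new_norms))  # -1
    # Comparing Feb 2002 (48) directly with Nov 2002 (72) mixes two different
    # cut-score regimes and says nothing about real gains.
    ```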
    TJM

  6. The changing cut scores present a problem. Unfortunately, due to the introduction of the new state assessments in the grades required by NCLB, DPI has given notice that the cut scores will again have to be reset for this year (Fall 2005).
    Two approaches that I like to use in the face of the cut score discontinuity are looking at the scale score, which is supposed to be stable, and benchmarking against a set of comparable schools. These, too, have their limitations, but they provide useful information.
    I’m skeptical about the Daily Howler piece, but there’s a solution. Like Bill Breisch and others have been advocating, let’s consider looking at achievement growth in addition to achievement status. That is a way to evaluate whether all our students are learning.
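    A toy sketch of the status-versus-growth distinction; the scale scores and the proficiency cut below are invented purely for illustration:

    ```python
    # Hypothetical scale scores for one cohort tracked across grades;
    # the scores and the proficiency cut are invented for illustration only.
    cohort = {"grade 6": 610, "grade 7": 638, "grade 8": 661}
    proficiency_cut = 665

    status = cohort["grade 8"] >= proficiency_cut   # False: below the cut
    growth = cohort["grade 8"] - cohort["grade 6"]  # +51 scale points

    print(f"Status: proficient in grade 8? {status}")
    print(f"Growth: {growth} scale points over two years")
    # A status-only report counts this cohort as not proficient even though
    # the growth figure shows substantial learning; that is the distinction.
    ```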

  7. There are several ways to look at the numbers. One is to ask whether they show growth or improvement over previous years. This perspective would be affected by the issue of renormed scores.
    Another way to look at the numbers is to ask how different groups compare to each other in a given year. When one looks at the scores for 2004, for example, it does appear that students of color, particularly African American students, remain substantially below both the aggregate average and the scores for white students (in November 2004, 39% of Black students in MMSD scored advanced or proficient, versus 71% of all students and 88% of white students).
    If I’m wrong in this understanding, please feel free to say so. If I’m right in this understanding, we should be talking about how to change those disparities.
