WKCE Scores Document Decline in the Percentage of Madison’s Advanced Students

For many years now, parents and community members, including members of Madison United for Academic Excellence, have expressed concerns about the decline in rigor and the lack of adequate challenge in our district's curriculum. The release this week of WKCE scores from the November 2008 testing led me to wonder about the performance of our district's strongest students. Most analyses of WKCE scores focus on the combined percentage of students scoring at the Advanced and Proficient levels, but that combined number tells us nothing about changes in the percent of students at each particular level of performance. We can post large gains in the percent of students scoring Proficient or Advanced by improving the performance of students who were previously at the Basic level, yet have no effect at all on the performance of our district's strongest students. In other words, we may be raising the performance of our lower-achieving students while failing to move our already successful ones. Examining the students who score at the Advanced level alone gives us some insight into the academic progress of our most successful students.
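
To make that distinction concrete, here is a small illustration with invented numbers (not MMSD data): moving students from Basic up to Proficient raises the combined Proficient-plus-Advanced figure while leaving the Advanced figure untouched.

    # Invented example: 100 students before and after an improvement that
    # moves ten students from Basic up to Proficient but leaves the top alone.
    before = {"Minimal": 20, "Basic": 40, "Proficient": 30, "Advanced": 10}
    after = {"Minimal": 20, "Basic": 30, "Proficient": 40, "Advanced": 10}

    for label, dist in (("before", before), ("after", after)):
        total = sum(dist.values())
        prof_plus = 100 * (dist["Proficient"] + dist["Advanced"]) / total
        adv = 100 * dist["Advanced"] / total
        print(f"{label}: Proficient+Advanced = {prof_plus:.0f}%, Advanced = {adv:.0f}%")
    # before: Proficient+Advanced = 40%, Advanced = 10%
    # after:  Proficient+Advanced = 50%, Advanced = 10%
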
I decided to examine WKCE math scores for students across the district. While it is not possible to track the performance of individual students, it is possible to follow the performance of a cohort as it advances through the system. Thus, students who are now in 10th grade took the 8th grade WKCE in 2006 and the 4th grade test in 2002. Because there have been significant changes in the demographics of the district's students, I split the data by socio-economic status to rule out the possibility that declines in WKCE performance are simply the result of increased numbers of low-income students. Although the WKCE has been criticized for not being a rigorous enough assessment tool, the data on our students' math performance are not encouraging. The figures below indicate that the percent of students scoring at the Advanced level on the WKCE decreases as students progress through the system, and this decline is seen both in our low-income students and in our Not Economically Disadvantaged students. The figures suggest that while there is some growth in the percent of Advanced-performing students in elementary school, there is a significant decline in performance once students begin taking math in our middle schools, and this decline continues through high school. I confess that I take no pleasure in sharing these data; in fact, it makes me sick.
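
For readers who want to check or extend this, the cohort bookkeeping reduces to one relationship: a student tested in grade g in the fall of test year y belongs to the graduating class of y + 13 - g (grade 10 in fall 2002 is the class of 2005; grade 4 in fall 2002 is the class of 2011). Here is a minimal sketch of that fold; the file name and column layout are my own invention, not the actual DPI export format.

    # Sketch: fold WKCE rows into graduating-class cohorts.
    # Assumes a hypothetical CSV "wkce_math.csv" with columns
    # test_year, grade, pct_advanced (illustrative layout only).
    import csv
    from collections import defaultdict

    def graduation_year(test_year, grade):
        # Grade g tested in the fall of test_year graduates in test_year + 13 - g.
        return test_year + 13 - grade

    cohorts = defaultdict(dict)  # graduating class -> {grade: pct advanced}
    with open("wkce_math.csv", newline="") as f:
        for row in csv.DictReader(f):
            year, grade = int(row["test_year"]), int(row["grade"])
            cohorts[graduation_year(year, grade)][grade] = float(row["pct_advanced"])

    # Reading across a cohort's grades follows the same group of students over time.
    for cohort in sorted(cohorts):
        print(cohort, [cohorts[cohort].get(g, "-") for g in (3, 4, 5, 6, 7, 8, 10)])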

Because it might be more useful to examine actual numbers, I have provided tables showing the data used in the figures above. Reading across a row shows the percent of students in a class cohort scoring at the Advanced level as they took the WKCE while progressing from grades 3 through 10. A dash marks a grade/year combination with no score in the 2002-2008 window; before fall 2005, the WKCE was given only in grades 4, 8, and 10.

Percent of Economically Disadvantaged Students Scoring at the Advanced Level on the WKCE Math Test Between 2002 and 2008

Graduation Year    3rd     4th     5th     6th     7th     8th     10th
2005                 -       -       -       -       -       -      8
2006                 -       -       -       -       -       -      8.8
2007                 -       -       -       -       -      11      7.7
2008                 -       -       -       -       -       5.6    8.7
2009                 -       -       -       -       -       8.5    6.7
2010                 -       -       -       -       -       9.2    8.4
2011                 -      12       -       -      12.5    11.1    8
2012                 -       9.7     -      10.4     9.5     8.2     -
2013                 -      15.3    14.7    15.1    11.7    10.8     -
2014                 -      12      13.6    16.1    13.2     -       -
2015               20.1     15      18      11.7     -       -       -
2016               15.4     17.1    18.4     -       -       -       -
2017               12.9     17       -       -       -       -       -
2018               13.8      -       -       -       -       -       -

Percent of Not Economically Disadvantaged Students Scoring at the Advanced Level on the WKCE Math Test Between 2002 and 2008

Graduation Year    3rd     4th     5th     6th     7th     8th     10th
2005                 -       -       -       -       -       -     47
2006                 -       -       -       -       -       -     41.6
2007                 -       -       -       -       -      49     42.2
2008                 -       -       -       -       -      33.8   51.5
2009                 -       -       -       -       -      42     45.2
2010                 -       -       -       -       -      47.7   45.1
2011                 -      50       -       -      45.3    45     38.4
2012                 -      43.4     -      50.7    53      45.7     -
2013                 -      50.3    54.8    54.1    54.7    48.2     -
2014                 -      49.6    56.7    60.9    53.5     -       -
2015               60       57.8    60.7    54.2     -       -       -
2016               55.6     56.3    62       -       -       -       -
2017               57.4     61.4     -       -       -       -       -
2018               55.6      -       -       -       -       -       -

While it could be argued that the declining percentage of low-income students scoring in the Advanced range on the WKCE is simply the result of a relatively stable number of Advanced students becoming a smaller and smaller share as the overall number of economically disadvantaged students increases, an examination of the actual numbers reveals an absolute decline in the number of low-income students scoring at the Advanced level on the math portion of the WKCE.
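
The arithmetic behind that alternative is easy to check with invented numbers: a constant count of Advanced scorers in a growing group yields a falling percentage even when nothing real has changed.

    # Invented numbers, not district data: a fixed 50 Advanced scorers in a
    # growing group; the percentage falls with no change in the count itself.
    advanced = 50
    for group_size in (500, 650, 800):
        print(group_size, f"{100 * advanced / group_size:.1f}% Advanced")
    # 500 10.0% Advanced
    # 650 7.7% Advanced
    # 800 6.2% Advanced

The table below rules this explanation out: within each cohort, the counts themselves fall.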

Numbers of Economically Disadvantaged Students Scoring Advanced on the Math WKCE Between 2002 and 2008

Graduation Year    3rd     4th     5th     6th     7th     8th     10th
2005                 -       -       -       -       -       -     42
2006                 -       -       -       -       -       -     29
2007                 -       -       -       -       -      57     32
2008                 -       -       -       -       -      26     51
2009                 -       -       -       -       -      43     40
2010                 -       -       -       -       -      52     48
2011                 -      64       -       -      73      64     52
2012                 -      45       -      64      59      49       -
2013                 -      74      87      89      71      69       -
2014                 -      75      85      71      87       -       -
2015               126      96     113      87       -       -       -
2016               112     123     131       -       -       -       -
2017                86     121       -       -       -       -       -
2018               102       -       -       -       -       -       -

In the interest of thoroughness, I am providing enrollment numbers for the Not Economically Disadvantaged students in the MMSD over this period of time. Readers will see that the absolute numbers of Not Disadvantaged students have declined over the past seven years; this simply confirms what we already know. (The increase in numbers from 8th to 10th grade reflects the influx of 9th graders who attended private schools for their K-8 education, e.g., Blessed Sacrament and Queen of Peace in the West attendance area.)

Numbers of Not Economically Disadvantaged Students Enrolled Across Different Grade Levels in the Madison Schools and Taking the WKCE between 2002 and 2008

Graduation Year    3rd     4th     5th     6th     7th     8th     10th
2005                 -       -       -       -       -       -     1486
2006                 -       -       -       -       -       -     1628
2007                 -       -       -       -       -     1197    1451
2008                 -       -       -       -       -     1259    1292
2009                 -       -       -       -       -     1145    1218
2010                 -       -       -       -       -      992    1188
2011                 -     1026      -       -     1019    1054    1106
2012                 -     1039      -      847     913     916       -
2013                 -     1064     954     949     976     952       -
2014                 -      936     974     939     883      -        -
2015               953      973     960     890      -       -        -
2016               894      881     847      -       -       -        -
2017               950      884      -       -       -       -        -
2018               913       -       -       -       -       -        -

Because the percent of students in this group scoring at the Advanced level has declined as well, there are two possible explanations for what has been happening. One is that the district has seen a relatively larger decline in enrollment of high-ability students among these Not Disadvantaged students, what is often referred to as "Bright Flight." A more probable explanation is that the math curriculum, particularly in our middle schools and in 9th grade, does not adequately challenge our students and foster their intellectual growth, regardless of their socio-economic background. Of course, it is possible that both of these factors are contributing to what we see here.

I should note that I have only examined the math data, and I don't know if the WKCE data for the other subject areas are as dismal. This would seem like an analysis that the District should be doing on a regular basis, but I encourage anyone who is interested to explore the performance of our students in reading or language arts. I also do not know the extent to which the Madison data merely reflect a similar decline in performance across the state. The members of the UW math faculty I have talked with in the past have expressed their concerns about the overall level of preparation of Wisconsin students, and our district's data may simply confirm the failure of currently popular constructivist approaches to adequately teach mathematical concepts. The statewide data are certainly worth exploring as well, and again I invite interested parties to visit the Department of Public Instruction WINNS website and download their own copy of the data.
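
For anyone who does download the statewide numbers, the same cohort bookkeeping applies. A sketch, again assuming a hypothetical CSV layout (test_year, grade, district, pct_advanced) rather than the actual WINNS export format:

    # Sketch: within-cohort change from 4th to 8th grade, by district.
    import csv
    from collections import defaultdict

    scores = defaultdict(dict)  # (district, graduating class) -> {grade: pct}
    with open("winns_math.csv", newline="") as f:
        for row in csv.DictReader(f):
            year, grade = int(row["test_year"]), int(row["grade"])
            scores[(row["district"], year + 13 - grade)][grade] = float(row["pct_advanced"])

    for (district, cohort), by_grade in sorted(scores.items()):
        if 4 in by_grade and 8 in by_grade:
            print(f"{district} class of {cohort}: "
                  f"4th-to-8th change {by_grade[8] - by_grade[4]:+.1f} points")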

I will say again that I find these data to be incredibly demoralizing, but perhaps we can take hope that our new superintendent and our School Board will use these data as a rallying point as they finalize a strategic plan and consider the recommendations of the Math Task Force. We have to find ways to raise the performance of all our district’s students, and right now it appears we aren’t meeting anyone’s academic needs.

8 thoughts on “WKCE Scores Document Decline in the Percentage of Madison’s Advanced Students”

  1. Jeff,
    Thanks for putting these data together. I have never been a fan of the WKCE, but I have a couple of points that might be an added twist. First, most schools in Dane County show a decline in 10th grade scores. I am a parent of one of those 10th graders, and they have a very interesting viewpoint on the WKCE. The week of testing I had a kitchen table full of boys, and their viewpoint was: "It is long, boring, and what difference does it make for me?" Younger kids care and want to please their teachers. 10th graders have self-absorbed interests, and it is not as if this test determines their grade, college admission, or much of anything. Therefore their effort is limited. If it translated into scholarships, like the PSAT, maybe the results for this subset would be different. It would be interesting to compare the WKCE results to the PSAT for this group.

  2. To address Mary’s point, wouldn’t you want to look at ACT scores? This would control for the student motivation factor as well as measure math knowledge at a later point in a high school student’s career than the WKCE, taken in the fall of 10th grade.
    Here is how district students have done on average on the math portion of the ACT over the last several years:
    1996-97 24.9
    1997-98 25.3
    1998-99 25.1
    1999-00 25.1
    2000-01 24.7
    2001-02 25.4
    2002-03 24.6
    2003-04 24.6
    2004-05 24.7
    2005-06 24.5
    2006-07 25.0
    2007-08 24.9
    Unfortunately, the only way in which the reported ACT information is disaggregated is by race. The average ACT math score for 2007-08 for district white students was the highest for that group for any of the 12 years reported, and the average ACT math score for district African-American students last year was the second highest of the 12 years.
    I don’t see much of a pattern of decline here.
    To get at information on just high-performing students, I suppose you would want to look at the number of National Merit semi-finalists in the district, or perhaps the number of students taking the calculus A.P. test, as well as an indication of how well they have done. The DPI website reports the number of A.P. tests taken overall, but does not break them down by subject. Overall, district students have taken a steadily increasing number of A.P. exams over the last ten years, with the overall "pass" rate (score of 3 or above) bouncing around in the 80-90% range.
    I’m all for appropriately challenging and rigorous math instruction and I appreciate Jeff’s efforts to try to extract more useful information out of WKCE test results, but I think the extent to which you can draw longitudinal conclusions from the WKCE is pretty limited, at least without quite a bit of analysis.

  3. Friends, I’m not sure you’re understanding these data correctly. Jeff has taken entire cohorts of students — for example, the District’s current 10th grade class, the class of 2011 — and looked at their performance OVER TIME. Ditto other entire cohorts. The critical comparisons, therefore, are what we call WITHIN GROUP comparisons, not across group ones (that is, not across different groups of 10th graders, which is what Ed is looking at).
    If you look at the Classes of 2011, 2012, 2013, for example, you’ll see a fairly steady decline in the percentage of students in each age-cohort scoring “Advanced,” and for both Economically Disadvantaged and Non-Economically Disadvantaged students. For those who think visually, the shape of the performance distributions is changing over time in the same way for each cohort: the area under the high (righthand side) end is getting smaller.
    The data suggest either a failure of our middle school math curriculum to stretch our students or an increasing disengagement (even departure) of our math-strong students, or both.

  4. This analysis is very helpful; thank you, Jeff. If a better longitudinal analysis is possible, I would hope that the district is engaged in such an endeavor, and that it shares its work with the community. I am not sure that a non-longitudinal trend line of ACT score averages (one that doesn't factor in participation rates or the effect of counting only a repeat tester's highest score) speaks to what appears to be shown by the data presented above.
    As an MMSD parent and an outside observer of various district initiatives and the current strategic planning process, I continue to be concerned with the basics: What information do we have (or do we need) to tell us what our problems are? What information do we have (or do we need) to tell us how we should try to address those problems? What information do we have (or do we need) to tell us whether what we’re trying to do to fix those problems is working? And what’s our commitment to gathering, analyzing and acting on that information?

  5. Mary raises a good point. As students get older, especially when they take the 10th grade WKCE, some will ask how the test matters to them as individuals. The WKCE is a high-stakes test for schools, but a low-stakes test for students, especially in high school. Lower percentages of students scoring Proficient and Advanced as 10th graders can be seen in other districts and in the statewide results; it's not a phenomenon specific to MMSD. Nor is it restricted to math: you see erosion in the other content-area WKCE assessments as well.
    This is one reason why many districts, including several in Dane County, are utilizing the Explore and PLAN assessments from ACT as universal assessments at the secondary level. ACT’s EPAS group of assessments is aligned with post-secondary objectives, the results are available more rapidly than the WKCE, and the reports contain information that is useful to students, their families, and the school district. Anecdotally, I’d argue that more students find EPAS assessments meaningful because of the post-secondary linkage.
    Many of these districts also use NWEA's MAP assessment as a universal assessment at the elementary and middle levels to show student growth over time. The MAP is an adaptive assessment that adjusts its difficulty to the performance level of the student. MAP results are available immediately.
    A standardized assessment, whether it is the WKCE, EPAS or MAP, should only be one element of a comprehensive assessment framework to measure student progress and school district performance. Multiple measures with multiple assessment tools provide the best information.
    That having been said, Jeff's analysis raises some important questions. It is important to remember that looking at WKCE proficiency levels can mask problems: by looking only at levels, you can miss movement such as mean or median scale scores drifting downward within a level. Also, the WKCE cut scores for Proficient and Advanced have been criticized by some observers as relatively low. Downward movements in WKCE results merit scrutiny.
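    To illustrate that masking with invented numbers (not WKCE data or cut scores): suppose Proficient starts at a scale score of 400 and Advanced at 500. Every student's score can drift down substantially without anyone crossing a cut line.

        # Invented scale scores and cut scores, for illustration only.
        PROFICIENT, ADVANCED = 400, 500
        year1 = [380, 420, 470, 520, 560]
        year2 = [365, 405, 445, 505, 535]  # everyone drifted down ...

        def level(score):
            return ("Advanced" if score >= ADVANCED
                    else "Proficient" if score >= PROFICIENT
                    else "Below")

        for label, scores in (("year 1", year1), ("year 2", year2)):
            print(label, "mean =", sum(scores) / len(scores),
                  [level(s) for s in scores])
        # The mean falls from 470 to 451, but the reported level
        # counts are identical in both years.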

  6. In 2000, I looked at similar cohort data over time (roughly the '90s) for students in the Rapid City Area Schools (SD) and found similar patterns. The data were numerical scores rather than categories, but the same pattern emerged.
    The number of high-scoring students seen in 4th and 8th grades had dropped by about half by 10th or 11th grade. We didn't see the decline occurring in middle school, but it was very obvious in the freshman and sophomore years of high school. The decline did not occur among the very elite students (top 5%), who continued to perform well. But the bright students just below that top tier (down to about the top 10-12%) showed a sharp drop in scores, a much sharper decline than was seen in lower-scoring cohorts.
    My conclusion was that the top 5% of students were offered the opportunity to challenge themselves academically with honors or AP courses, while the tier of students between 5% and 12% was not challenged enough. Often these bright students languish in courses that include students who are average or below average.
    High schools ought to expand honors and AP course offerings to provide enough room for the top 12% of students to benefit from academic challenges.

  7. Jeff's analysis, given the data available, is an order of magnitude better than anything I've seen from MMSD. Most who've commented on these issues seem to at least agree that MMSD has never been data-driven.
    Chan raises critical questions that neither the State nor MMSD has ever answered, and I don't see them ever doing so. "Analyses" are never more than simple one-variable averages disaggregated by race and income, and maybe by time (grade or year).
    I have never seen any analysis by MMSD or the State that tries to answer "why?" Into this void, "whys" are asked and answered, but without any support.
    As an example, both Mary and Tim agree that one reason WKCE scores are lower in the 10th grade is that, for the student, the score is not important. This is an interesting and often-mentioned hypothesis, but it's only a hypothesis, not a fact, and it is not supported by evidence. Laurie suggests additional hypotheses, failure of the math curriculum or disengagement; again, interesting hypotheses, not facts, and no evidence to support them.
    How many other hypotheses do we have? Bad teachers, uncertified teachers, no administrative support, weak principals, lack of behavioral discipline, bad curriculum, one-size-fits-all, and so on. I'm sure you can add to this list.
    No surprise to me, but none of these reasons has ever been confirmed with data. Further, some reasons may be more relevant to certain groups of students than others. Looking only at averages, and not at detailed disaggregations, hides these differences.
    The WKCE presents problems for me generally. First, students take the tests, which produce a full spread of scores, all that wonderful variance. But we see only the categories of Advanced, Proficient, Basic, and Minimal, into which the WKCE's cut scores force all those scores, eliminating the variance.
    Second, I don't know what the words "Advanced," "Proficient," "Basic," and "Minimal" mean. I suspect they don't mean the same thing as in the normal vernacular.
    Then there is the question of how well the WKCE measures its subjects: which standards were ignored, and which were emphasized? The emphases of the WKCE change at each grade level, which makes sense, but it means the tests measure quite different material from grade to grade.

  8. I find myself thinking, in response to Donald’s comment, that probably many of the students in the top 5% come from cultures — including individual family cultures — that hold intellectual achievement and scholarship in very high esteem. At least in the MMSD. I imagine this value and respect for learning provides a kind of immunity to the anti-intellectualism of the peer and school culture.
    As for Mary’s comment — and I say this as another parent of a 10th grade boy — I’m not convinced that the attitude she describes has a significant effect on actual behavior. As with so many things, people/kids/teenage boys say one thing in the group setting, but do another when alone.
    But as Larry states, these are all good hypotheses, nothing more.
