Black students boost ACT scores

Madison students continue to top state average
By TCT staff, news services
Madison high school students bested the state ACT test score average for the 12th straight year, with scores of African-American students rising at a greater pace than those of all other students.
ACT test score comparisons were released today.
According to the Madison Metropolitan School District, the composite score for Madison students was 24.2, or two points higher than the statewide average of 22.2 and more than three points higher than the national average of 21.1. A perfect score is 36.
African-American students in Madison scored an average composite of 18.8, a 6 percent increase from the 17.7 average in 2005, while Asian students had a 23.0 composite this year (22.1 in 2005) and Hispanic students a 21.8 composite (21.5 in 2005).
The big increase in black students’ scores narrowed the gap with white students’ scores locally: the difference between the white students’ composite of 24.8 and the black students’ composite amounts to about 24 percent of the white score, down from a 30 percent gap last year.

The overall Madison composite of 24.2 is a tenth of a point lower than last year, but test scores for Madison students have stayed in the lower 24s for the past 10 years, keeping the two-point advantage over the rest of Wisconsin.
The average ACT scores of Wisconsin’s high school seniors remained second-best in the nation this year, one-tenth of a point behind Minnesota and more than a full point above the national average.
State students graduating in 2006 had an average composite score of 22.2, the same score that Wisconsin students achieved for the last six years. It was the 10th best score in the nation, but second among the 25 states in which at least 50 percent of students took the test.
Wisconsin also ranked second to Minnesota in 2005 after holding the top spot for 10 straight years.
The test scale ranges from 1 to 36 and covers math, science, English and reading.
The national average composite score was 21.1, up from 20.9 last year, according to results prepared for release today. Minnesota students led the nation with an average composite score of 22.3.
Wisconsin Superintendent of Public Instruction Elizabeth Burmaster attributed Wisconsin’s high performance to the number of students taking a college-preparatory curriculum.
“We, parents and educators, must continue to encourage young people to engage in a rigorous high school curriculum,” she said.
Asian students in Wisconsin scored a composite average of 20.4, American Indians scored 20.1, Hispanics scored 19.6 and blacks scored 17.0. White students in Wisconsin scored an average of 22.6.
The scores by ethnic group in the state have remained consistent over the last five years.
The number of Wisconsin students who took the test was 44,275, about a 3 percent drop from last year. Over the same period, the number of students across the country who took the test increased by 1.7 percent.
More Wisconsin students than the national average exceeded several benchmarks considered indicators that students can handle college-level work. About 77 percent of Wisconsin students met the mark in English, 52 percent in math, 61 percent in reading and 35 percent in science.
Wisconsin’s average scores for the individual subjects each topped the national average. Students’ scores in the state were unchanged from 2005 in English, math and reading – 21.5, 22.0 and 22.4 respectively. The students scored 22.2 in science, down one-tenth of a point.
Published: August 16, 2006

28 thoughts on “Black students boost ACT scores”

  1. Everyone just loves averages — an out of context statistic that is, therefore, meaningless. Averages, rank scores, percent increases/decreases are used by both apologists and detractors whose goals are simply to win an argument rather than understand issues, decide on directions and next steps, or allocate resources appropriately.
    Of course, the PR gamesmanship says the higher MMSD averages are because MMSD schools are that much better than the other Wisconsin schools. Maybe that’s true, but comparing averages doesn’t say that.
    How do averages get raised? Let’s make a list.
    1) MMSD is educating all students (who took the ACT) better than other WI districts, and the distribution of students who take the ACT is the same across all districts.
    2) MMSD is educating its students no better than other districts, but Madison contains other resources that students use to attain higher achievement (a more highly educated population, additional wealth, access to and use of voluntary and commercial tutoring programs).
    3) MMSD has a demographically larger number of students who would be expected to score well on the ACT, and those students take the ACT test.
    4) MMSD students at the lower end of the scale are less likely to take the ACT than kids in other WI districts.
    Of course, all the above can be true, with each contributing a little bit to the resulting average MMSD ACT score. MMSD could be doing a better job of educating the lower-achieving kids, thereby contributing, say, 0.5 points to the average, but doing no better for the middle and upper level kids. For the upper 70% of the kids, there is sufficient family wealth/education to allow tutoring, contributing, say, another 0.7 points to the ACT average score. The existence of a highly educated populace (University, professional) would likely contribute, say, 0.4 points to the average (genes account for something). Finally, the cost of living and housing in Madison causes extremes of poverty that contribute to a loss of hope among those at the lower SES, causing them not to take the ACT and adding another 0.4 points to the average.
    The above is a quick list and a discussion of hypotheticals. The list of possible contributing factors and their weights is not known, and there is no question in my mind that the powers on either side of educational issues do not want to know.
    To care about the truth of these matters is not within the character of the powers that be nor likely in the population as a whole. Cherry-picking useless facts and statistics to win political advantage would obviously be harmed if anyone actually gave a damn.
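A minimal sketch of the hypothetical decomposition in the comment above, using the commenter’s own illustrative point values (none of these are measured quantities):

```python
# Purely illustrative: these point values are the hypothetical
# contributions named in the comment, not measured quantities.
contributions = {
    "better education of lower-achieving students": 0.5,
    "family wealth/education enabling tutoring": 0.7,
    "highly educated local populace": 0.4,
    "lower-SES students opting out of the ACT": 0.4,
}

statewide_average = 22.2  # 2006 Wisconsin composite
madison_average = statewide_average + sum(contributions.values())
print(round(madison_average, 1))  # 24.2, the reported MMSD composite
```

The illustrative contributions happen to sum to the district’s two-point advantage over the state, which is the commenter’s point: the same average is consistent with many different mixes of causes.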

  2. Very well said, Larry. Very well said. You are so right when you say “to care about the truth of these matters is not within the character of the powers that be nor likely in the population as a whole.” What’s equally disturbing is that so many of the powers that be probably cannot even understand the points you make here. How can our decision-makers make sound, empirically based decisions when they do not understand the basics of statistics and the basics of research design and analysis? (Can we insist that they bring someone in to teach them?)
    For my part, I’ve been sitting here with this week’s ACT report unable to look away from the whole forest. Here are a few paragraphs from the AP story. Note well the last sentence, about course rigor.
    “ACT officials said the numbers are encouraging but still show too few students are prepared for college-level work. Only 21 percent of test-takers scored the benchmark indicating they are likely to succeed in college on each of the four exams – math, English, reading and science. (ONLY 21% fully prepared for college???!!! I wonder what these numbers are for MMSD students? — LAF) More than two-thirds hit the benchmark score in English, but barely one-quarter did in science.
    “‘This doesn’t mean they won’t be successful and graduate from college, but it does increase the likelihood they will struggle or need remediation along the way,” Ferguson said.
    “Students persuaded to take a full core curriculum – including four years of English and three years each of math, science and social studies – do better on the ACT and are more likely to succeed in college. But the percentage who reported taking the core – which is more than many states require to graduate – actually fell from 56 percent to 54 percent this year.
    “‘The message still isn’t getting across to far too many students,” Ferguson said.
    “Average scores for black students rose 0.1 points to 17.1, while Hispanics’ scores were steady at 18.6. Significant racial gaps persist: Whites scored 22.0 on average and Asian-Americans 22.3. Even black students who took the core were outscored by white students who had not – which Ferguson attributed to a range of factors, including insufficient rigor in the core courses offered to minority students.”
    The other thing I’ve been thinking about is the number of middle school students I know — 7th and 8th graders — who have taken the ACT and earned composite scores in the high 20’s and low 30’s. What about them? Who cares about them? What can they expect from our schools?

  3. Carol,
    Can you provide us with information about the number and percentage of MMSD students (seniors only) who took the ACT this past school year?
    Also, the number and percentage of MMSD test takers (again, seniors only) who met the bench mark in each of the four content areas, as well as the number and percentage who met the bench mark in all four areas? It seems to me those are the students we have prepared well for college — a useful thing for all of us to know, along with the comparative stuff.
    Please ask Kurt to provide the data broken down by race/ethnicity.
    Many thanks for doing that.

  4. Through the Midwest Academic Talent Search (MATS) program run by the Center for Talent Development (CTD) at Northwestern:
    CTD is one of perhaps half-a-dozen talent development centers across the country devoted to the identification, education and support of gifted kids. It serves eight midwestern states, including Wisconsin. Through the MATS, students in third through ninth grades can take certain out-of-grade-level standardized tests, like the SAT, ACT and Explore (essentially an ACT-type test developed for eighth graders). For students who always hit the ceiling of their district and state standardized tests, taking an out-of-grade-level assessment is the only way to really know what they know.
    I notice on the website that registration for this year’s talent search begins in September. The deadline is usually at the end of October or very beginning of November. Watch your school newsletter for an announcement from the TAG staff.
    FYI, the kids test in late January or early February. Extremely high scoring kids for the eight-state area are invited to a special awards ceremony at Northwestern at the beginning of June. All kids who test get put on mailing lists and such for a variety of special schools, academic programs and academic summer camps. Sometimes a kid’s test score is used to help with placement in the appropriate MMSD class. (More on that in my next post on English 10.)
    Hope that helps.

  5. Context is important in understanding statistics – a point that applies equally to this post and to numerous other posts.
    Here is some of the context. Over the last 10 years the district’s student body has become increasingly diverse and has more poverty. Thus:
    Students of color: 1995-96 – 29%; 2005-06 – 44%
    Low-income students: 1995-96 – 24%; 2005-06 – 38%
    Percent of students taking test: 1996-97 – 63%; 2005-06 – 70%.

  6. Laurie and Carol:
    I am confused about why the breakdown is of seniors-only scores. Does Madison actually break down these scores between seniors, juniors, and others? (I am assuming they don’t use middle school scores at all, because they state they don’t keep them.) If a child gets a perfect score as a junior (and then doesn’t take it again), it doesn’t sound like their scores are included in these numbers.
    Carol, you talk about the district population changing, is this the change at the high school level (who are the kids taking the tests)? I think if someone is comparing the % of students taking the test (high schoolers), they would also need the % of high school only in the other areas also.

  7. David,
    Just an FYI:
    * ACT is available for those 6th-9th grade
    * SAT is available for those 6th-8th grade
    * Explore (8th grade test) is available for 3rd-6th grade

  8. Just a few comments on ACT data.
    1. ACT issues a profile report for institutions annually on their graduating class. Last week, the profile reports for the graduating class of 2006 became available. The scores reported are for graduates who took the ACT as sophomores, juniors, or seniors. For students taking the test multiple times, the score used is the last sitting. It is possible to receive data files on your students on a periodic basis, but the headline reports are based on the profile of graduating seniors.
    2. Disaggregation of ACT data by demographics is not as straightforward as for state assessments. For the ACT, demographic data like gender and race/ethnicity are self-declared and some students do not provide responses to those items. Also, the unique identifier for students in ACT data sets is their SSN. District student information systems do not always hold student SSN’s so accurately matching the ACT data to the more complete data in the local SIS may require additional processing. I’m not sure whether MMSD keeps student SSN’s or not.

  9. Judy,
    Thanks for the added MATS information for David (and anyone else interested in the Midwest Academic Talent Search program).
    As for why seniors only, I simply wanted to minimize the variability in the sample with regard to number of years of schooling; that is, to keep the sample as purely as possible one of students who were in their last year of high school. From what Tim says, that may or may not be possible, depending on exactly how the MMSD stores its data.
    Of course, I bet it would be possible to get the additional information I requested — with a little effort — if someone with authority asked for it.

  10. Carol,
    Thanks for replying, and for providing the beginning of an empirical context within which to make sense of the ACT numbers. You’re right, it is good to have information that helps us reflect on what the numbers in this report really mean.
    That’s why I think the ACT benchmark scores are important. As defined on the DPI website, “a benchmark score is the minimum score needed on an ACT subject-area test to indicate a 50 percent or better chance of earning a ‘B’ or higher grade or about a 75 percent chance of earning a ‘C’ or better in the corresponding credit-bearing college courses.” It is expected that students who do not earn at least the benchmark score in a content area will require remedial work in college.
    Here are the benchmark scores for each of the four ACT subject-area tests, along with both state and national data regarding the percentage of test-takers (note: that’s different from the percentage of all students) who obtained the benchmark score (or higher):
    English: The benchmark score for the ACT English subtest is 18. 77% of Wisconsin test-takers and 69% of test-takers nationally earned a score of 18 (or higher) on the ACT English subtest.
    Math: The benchmark score for the ACT Math subtest is 22. It is thought to reflect a readiness for college-level algebra. 52% of Wisconsin students and 42% of students nationally earned a score of 22 (or higher) on the ACT Math subtest.
    Reading: The benchmark score for the ACT Reading subtest is 21. The Reading test score is seen as an indicator of readiness for college-level social studies course work. 61% of Wisconsin test-takers and 53% of test-takers nationally earned a score of 21 (or higher) on the ACT Reading subtest.
    Science: The benchmark score for the science subtest is 24. It is seen as an indicator of readiness for college level biology. 35% of Wisconsin test-takers and 27% of test-takers nationally earned a score of 24 (or higher) on the ACT Science subtest.
    What about the percentage of students who earned the benchmark score (or higher) on all four of the ACT subtests? Nationally, that number is 21%. For Wisconsin test-takers, that number is 28%. That is, based on their ACT test scores, only 28% of the Wisconsin students who took the ACT this year are ready for college level work. I don’t know about you or other SIS readers, but I find those numbers appallingly and tragically low.
    I think it would be useful and enlightening for us to know what these numbers are for MMSD students who took the ACT this year. That’s why I have asked you to please ask Kurt for the number and percentage of 2006 MMSD ACT test-takers in the various demographic groups (whether defined by race, SES or gender — I think gender is important, too) who earned the benchmark score (or higher) in each of the four content areas and on all four content areas.
    Would you please?
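The benchmark logic described in the comment above can be sketched in a few lines of Python. The student’s scores below are hypothetical; only the benchmark values (18, 22, 21, 24) come from the thread:

```python
# ACT college-readiness benchmark scores as quoted in the thread.
BENCHMARKS = {"English": 18, "Math": 22, "Reading": 21, "Science": 24}

def met_benchmarks(scores):
    """Return the set of subject areas in which the benchmark was met."""
    return {subj for subj, score in scores.items() if score >= BENCHMARKS[subj]}

def ready_in_all_four(scores):
    """True only if the benchmark was met in all four subject areas."""
    return met_benchmarks(scores) == set(BENCHMARKS)

# Hypothetical test-taker: meets English, Math and Reading, misses Science.
student = {"English": 21, "Math": 23, "Reading": 22, "Science": 22}
print(sorted(met_benchmarks(student)))  # ['English', 'Math', 'Reading']
print(ready_in_all_four(student))       # False
```

Counting the share of test-takers for whom `ready_in_all_four` is true is exactly the “all four benchmarks” figure discussed above (21% nationally, 28% in Wisconsin).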

  11. I met with Art Rainwater this afternoon to discuss this issue and to make similar requests for analysis. It was an interesting conversation ranging over several topics, but the result was I did not get a time-definite commitment to perform analysis of the ACT data.
    Time and commitment of staff was the reason given. I could be happy initially with analysis of the ACT data itself without the effort to match to the MMSD student data. Later, as time permits, MMSD data matching could be done.
    One approach to getting needed analysis done is to consider partnering with UW to perform the statistics. It would give grad students something relevant to do.
    Whoever might be able to perform these stats, it’s imperative that it be done. Further, these policy-level statistics are something that should be automatically generated and, except for initial setup, should be available routinely. The public needs to be kept fully informed of student, school, and district academic progress using hard data.

  12. Yes, it is truly astonishing that these “policy level” analyses, as you so rightly call them, aren’t done routinely. If the goal is to prepare the majority of our students for college, why not make good use of an essentially nationally standardized indicator that is so readily available to us?
    Larry, do you think it would help if Art received requests/encouragement to do these analyses from more of us? Or if BOE members made a strong request?
    Thanks for taking the time to meet with Art about this.

  13. Larry, thanks for taking the time to do this.
    We’re fortunate to have people willing to spend time on K-12 issues. The absence of serious inquiries (it’s easy to print press releases) on the part of the 4th estate is ….. telling.
    Our community spends a substantial amount of money annually on K-12 education ($332M+ in 2006/2007), in addition to the extensive amount of volunteer hours contributed by parents, friends, staff, students and teachers.
    Laurie and Larry raise some useful points. I hope we see this information published on the MMSD’s website.

    Meanwhile, I got curious about some other data mentioned in this thread. Earlier this week, I asked the staff at WCATY (Wisconsin Center for Academically Talented Youth) about the number of Madison students who participated in the 2006 Midwest Academic Talent Search (MATS). I asked them to give me the number of students who participated, their grade level, and information about which test they took and how they did. (Note: Because WCATY does not have complete information on the schools MATS participants attend, public and private school students are mixed together in the data they gave me.) I thought SIS readers might be interested to see the information.
    251 students with Madison addresses participated in the 2006 MATS (as well as 33 students with Fitchburg addresses, some of whom are probably MMSD students — or private school students who would be MMSD students, if they attended public school). Here are the data for the number of students who took each MATS test, broken down by grade level:
    Explore (essentially, an 8th grade ACT)
    Third graders: 12
    Fourth graders: 26
    Fifth graders: 50
    Sixth graders: 20
    ACT
    Sixth graders: 9
    Seventh graders: 31
    Eighth graders: 37
    Ninth graders: 13
    SAT
    Sixth graders: 3
    Seventh graders: 19
    Eighth graders: 31
    To give you a sense of how these young test-takers did, here are some descriptive statistics for the ACT Composite Score and the SAT Reading-plus-Math Score. (I am still waiting for composite score data for the Explore test.) I have provided the mean (M), standard deviation (SD) and highest score (max) for each grade level.
    ACT (Composite Score)
    Sixth graders: M = 15.89 (SD = 5.56); max = 28
    Seventh graders: M = 20.16 (SD = 5.05); max = 32
    Eighth graders: M = 22.32 (SD = 4.63); max = 34
    Ninth graders: M = 25.08 (SD = 3.17); max = 31
    (For comparison, the average 2006 ACT Composite Score was 24.2 for MMSD students, 22.2 for all Wisconsin students, and 21.1 for all students across the country. I can’t help but wonder how many of these young students are ready for college level work, based on their ACT performance.)
    SAT (Reading-plus-Math)
    Sixth graders: M = 780 (SD = 79); max = 870
    Seventh graders: M = 1001 (SD = 133); max = 1340
    Eighth graders: M = 1145 (SD = 192); max = 1510
    I trust SIS readers will agree that we have some very academically talented youngsters in our town! And these are only the students whose parents knew about the MATS, thought it was a good idea for their child to participate, and could afford the registration fee. I imagine there are many more students in Madison who would do well on these tests … and benefit from the experience of taking them.
    Some obvious questions: What is the District’s responsibility to provide these students with an education that is appropriate FOR THEM? How good a job is the District doing at meeting its responsibility to these students? What is keeping the District from meeting its proper responsibility to these students? What is the price paid when our schools fail to meet the educational needs of students like this? And so forth.
    For more information about the Wisconsin Center for Academically Talented Youth and the Midwest Academic Talent Search, go to:
    To read an article entitled “Talent Search: Purposes, Rationale, and Role in Gifted Education,” go to:
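The descriptive statistics quoted in the comment above (M, SD, max) can be reproduced as follows. The score list here is made up for illustration, since the raw MATS scores are not published in the thread:

```python
from statistics import mean, pstdev

# Hypothetical composite scores for illustration only; the actual
# MATS score lists are not included in the comment above.
scores = [14, 26, 22, 30, 18, 34, 25, 23]

print(f"M = {mean(scores):.2f}")     # M = 24.00
print(f"SD = {pstdev(scores):.2f}")  # SD = 5.94
print(f"max = {max(scores)}")        # max = 34
```

Note that the comment does not say whether WCATY’s SDs are population or sample statistics; `pstdev` and `stdev` would give different values for samples this small.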

  15. You ain’t gonna’ get the data to analyze, Larry.
    Here’s a response from the superintendent to my e-mail encouraging him to accept your offer to help:
    “As I told Larry yesterday we will be glad to do a more detailed analysis of the ACT data when we have staff time. Right now our research staff is involved in implementing the new student information system. Our data is not generally available to the public for analysis unless it is part of an approved research project for a recognized research institution and approved by our internal research committee. Of course the aggregate reports for some groups and for the sub tests that the district receives from the ACT are available to the public if approved as a part of a public records request.”
    This administration wants no outside analysis, input, review, or help, thank you very much. Control is more important than any analysis or insights from outside the Doyle building.

  16. Larry (and everyone else)
    Rereading all this, I see a specific request for some information from Laurie Frost, but I’m not clear about what you asked of MMSD. Before judging the district’s response (or lack thereof) and the district’s priorities, I’d like to know what was asked.
    If you are asking for a rigorous analysis of all the factors you mentioned, I hope you also brought a grant check to your meeting with the Superintendent. That’s a huge job, well beyond what any district can afford.
    As to whether “The Powers That Be” in MMSD or education in general care about the relationship between demographics and achievement, I don’t see your assertion as plausible. Decades of research and the way districts (and the ACT) report data show that they are aware of these relationships, and that the public (or at least the interested public) is also aware and cares.
    I don’t think it is a secret that children from educated (or wealthy, or caring, or involved) homes have advantages, and I don’t see a need for MMSD (or anyone else) to spend time trying to further quantify those advantages. I really don’t see the need for MMSD to release a press statement saying that many of our best students come from highly educated (or wealthy) homes. Caring and involvement are different because they offer a way to minimize the relationships among demographics and achievement. So I’d welcome a whole slew of press releases (and the data collection and analyses to support them) that shouted PARENTS WHO READ WITH THEIR CHILDREN PRODUCE HIGH ACHIEVERS or DATA SHOWS HIGH ACHIEVING STUDENTS HAVE PARENTS WHO CHECK HOMEWORK or COMMUNITY TUTORS HELP RAISE ACHIEVEMENT (I think I read that one this spring).
    I’m much more concerned with severing the tie between demographics and achievement than I am with documenting it or discussing it. I think most of the people involved with MMSD are too.
    On the ACT press release, that’s just pro forma stuff. MMSD would be remiss (and I would question their competence) if they had done anything different.
    Laurie Frost’s request does seem simple on the surface and could be useful, but the way the ACT collects and releases data (as Tim Schell points out) may make this difficult and even not worth the trouble. An answer to this is warranted.
    One issue lurking off-stage in all this is the question of whether we already have good data on achievement disaggregated “by race, SES or gender.” If we do, then the ACT data is icing on the cake and shouldn’t be a priority. So I guess my other question is: Is there reason to believe that the ACT data is significantly better, or of more use? From what I wrote above, you can tell I don’t think so. I’m intrigued by the college-preparedness metric, but those issues aren’t unique to Madison. I’m not sure that knowing the demographics of the “cut levels” in Madison will tell us more than the data already reported to DPI do.
    Throwing aside the fear of contradicting myself, I would like to see historical data on preparedness and demographics (along the lines Laurie and Larry identified), but that’s just because I’m curious about whether there was a golden age and, if there was, how white and wealthy (and male) the “college-prepared” population was. If I were running a cash-strapped district, I certainly wouldn’t spend staff time looking for the answer. However, if Larry can get the grant….

  17. With all due respect, TJ, I think you’re mistaken when you wrote, “I’m much more concerned with severing the tie between demographics and achievement than I am with documenting it or discussing it. I think most of the people involved with MMSD are too.”
    Rather than severing the tie, most people in the administration and on the school board use demographics as the excuse for low scores in the MMSD on everything from 3rd grading reading to the ACT.
    For instance, here’s what Carol wrote in this thread:
    “Context is important in understanding statistics – a point that applies equally to this post and to numerous other posts.
    Here is some of the context. Over the last 10 years the district’s student body has become increasingly diverse and has more poverty. Thus:
    Students of color: 1995-96 – 29%; 2005-06 – 44%
    Low-income students: 1995-96 – 24%; 2005-06 – 38%
    Percent of students taking test: 1996-97 – 63%; 2005-06 – 70%.”
    I took her post to mean that low scores are inevitable because the poor kids and kids of color pull down the scores. Talk about tying achievement and demographics!
    To me, effective education would take kids in poverty and kids of color and raise their performance to the same level as white kids who aren’t poor.
    Call me a dreamer, but I’m not the only one.

    So to tie together some past and current postings: MMSD is NOT the reason for any of the higher achievement scores, because of student background. Yet MMSD is to take full responsibility for the failure of every student, because of background?!?! Student mobility makes it increasingly difficult, if not impossible, to follow academic performance.
    I took Carol’s response to mean there are many factors related to the percentages posted. Interpretation (or in this case innuendo) is in the eye of the beholder. We have no idea how Carol interprets this data.

  19. Ed
    Things can get pretty subtle here. I can’t speak for Carol Carstensen, but I interpreted her post as (in part) a rejoinder to the assertion that “the powers that be” don’t recognize the historic and existing relationships among demographics and achievement, and a reminder that changes in the student population have produced challenges for MMSD, not a prelude to accepting these relationships as permanent. Recognition of these relationships is a necessary step towards understanding and eliminating them. Title 1 of the Elementary and Secondary Education Act of 1965 (an important source for the development of Direct Instruction), Head Start, the SAGE program and even parts of NCLB all begin with this recognition and then have sought to understand and minimize these relationships. If there is any doubt about MMSD’s awareness and commitment here, I would point the doubters to the Strategic Plan and to participation in the Minority Student Achievement Network. I have high hopes that the Equity Task Force will increase this commitment and help the district focus its work.
    The subtleties of my position come in at least two ways. The first is in my answer to how pressing the need is to continue to document and analyze these relationships. Although these relationships are well documented (if only partially understood), there is a continued need to collect better data, perform better analyses and find better ways to present this data to policy makers and the public, particularly longitudinal information, and (in figuring out what is working) I’m intrigued by value-added data analysis. We know the general picture, but we do have to keep track of how it is changing, and we can always know more.
    The second part is identifying the roles for school districts in this quest for knowledge. School districts have an obligation to keep informed of trends among their students, an obligation to track what is working and what isn’t working in their schools and an obligation to keep their stakeholders informed in meaningful ways (they also have expanding obligations to report to state and federal authorities), but they don’t have an obligation to devote resources to the kind of research and analysis that Larry seems to want. This is especially true when you are looking at immutable characteristics like race and class as factors in achievement. It is interesting to document the advantages of the privileged, and it may be of help in the struggle to gain resources for the less privileged, but in the short term it isn’t going to help anyone figure out how to help anyone learn. If foundation and academic partners propose cooperation in research that has a strong potential to improve understanding and (especially) teaching and learning at no or minimal cost, I’d urge cooperation, but that is very different than initiating that research or tackling it alone. Administrators, policy-makers and classroom teachers should and do make use of what is available already; that’s how it should be.
    Lastly, I’m confused by your positions. On one hand you seconded Larry’s call for the district to document, analyze and publicize data on achievement and demographics and trash Art Rainwater for not acceding to this call; on the other hand you chide Carol Carstensen for calling attention to demographics as a context for understanding achievement in the district. Do you want MMSD to do more work on this topic, less work, give or call more or less attention to it…? The major difference I see is that Larry is looking for documentation analysis primarily about how privilege (an in the last) interacts with schooling in ways that make what schools do less relevant to achievement (nearly the position you accuse Carol Carstensen of adopting) and Carol is indirectly reminding us that lack of privilege makes effective schooling more difficult and more important. Both Larry and Carol seek a wider acknowledgment of demographics as a factor in achievement. Larry and Carol are both exactly right that demographics should be part of the equation when judging the achievements of educational efforts. I disagree with Larry’s proposal for the district to devote staff time to (what I understand of) his proposed analysis and disagree with his characterization of the willful ignorance or duplicitous behavior of “the powers that be,” but have no problem with him calling attention to the issues. Why did you find Carol’s reminder worthy of condemnation and Larry’s reminder, speculation and call for analysis worthy of support?

  20. If anyone cares, the sentence in the last paragraph of my previous comment should have read:
    The major difference I see is that Larry is looking for documentation and analysis primarily about how privilege (and in the last, a lack of privilege) interacts with schooling in ways that make what schools do less relevant to achievement (nearly the position you accuse Carol Carstensen of adopting) and Carol is indirectly reminding us that lack of privilege makes effective schooling more difficult and more important.
    I’ve got to remember to read before I post…

  21. TJM writes:
    “The major difference I see is that Larry is looking for documentation and analysis primarily about how privilege (and in the last, a lack of privilege) interacts with schooling in ways that make what schools do less relevant to achievement (nearly the position you accuse Carol Carstensen of adopting) and Carol is indirectly reminding us that lack of privilege makes effective schooling more difficult and more important”
    This really resonated. The partisanship of our current culture seems to lead us to seek information only when we think it will confirm our own opinions and hypotheses, and to devalue any information that might challenge them. If schools are “irrelevant to the achievement of privileged students”, then most of the people who involve themselves in debates about education would have little to worry about. If privilege is irrelevant to the achievement of students, then the things that parents with means provide for their students are a waste of time and money.
    Bottom line: it is not necessary to trash what exists in order to acknowledge a need to continue improving. Would it be so terrible to be glad that, at least on this one indicator, Wisconsin schools are outperforming schools from other states as well as getting more of their students to take the recommended core curriculum? It’s not like those of us who are out there teaching are going to pull out our “to do” lists and cross off college prep as “done”. Of course, it’s hard to stay motivated to work for this worthy goal when even the good news is disparaged and minimized.

  22. TJ, Marisue, and everyone else who posts,
    Can we deal with some fundamental issues?
    Who should the MMSD (and all schools) educate? As I said, “To me, effective education would take kids in poverty and kids of color and raise their performance to the same level as white kids who aren’t poor.” And, I would add, challenge every student (the lowest and highest academically) to raise their academic achievement.
    As an alternative, should schools simply acknowledge that students will come to school at different academic levels (often due to race and poverty) and will leave in pretty much the same relative position where they entered? The academically low will finish low; the high will finish high.
    As another alternative, should schools target the lowest to bring them higher academically and let the highest fend for themselves? Or, vice versa.
    I have the feeling that Carol’s post provides a “context” on race and poverty to support the notion that we have to accept the hard fact that we cannot expect outstanding achievement scores when the MMSD has a growing number of kids in poverty and kids of color.
    Further, I take the talk about high performing MMSD students as sweeping under the rug the low scores for children of poverty and color.
    Aren’t we really discussing our expectations of academic success for various racial and economic groups, as well as students with varying academic abilities?
    (Yes, Marisue, this is a highly political question.)
    Laurie Frost and I tried to raise this question of expectations in earlier posts. We didn’t get many responses. Maybe a separate post from one of you, TJ or Marisue, on this question would generate more discussion than Laurie and I got.
    And maybe if we clarify our expectations on fundamental issues, we’ll be able to discuss details more productively.

  23. I’m not sure if the question of expectations (that Ed talked about raising) refers to the post Laurie first included under Curious Social Development and then moved to another strand? I know I responded to the original post, but didn’t repost under the new strand. When there wasn’t any response to my response :), I let it go.
    Personally, I think that one of the problems with the types of discussions we are having is that we are veering back and forth among topics, and a response to one topic is then applied to another. Perhaps it has to do with a natural tendency to make assumptions about where someone is coming from, based on posts in previous strands.
    In Ed’s post, he writes “I take the talk about high performing MMSD students as sweeping under the rug the low scores for children of poverty and color”. I, on the other hand, took it as a response to the criticism that frequently shows up on this forum regarding MMSD service to high achieving students. As a teacher in the district, I absolutely don’t feel that low scores for children of poverty and color are being swept under the rug. Data is disaggregated for us at inservices continually to make sure that we are cognizant of the existing gaps and motivated to see and deal with this as the emergency that it is. However, the competing truth is that “Madison” is routinely compared to school districts with very, very different demographics. Disaggregating the data and looking at the achievement of like students from district to district is important as well. If a district is performing well above MMSD, I want to visit it and find out what they are doing–but only if that performance difference exists for comparable parts of our student body. Otherwise, it is just a waste of my time.
    My original interpretation of this strand was formed by previous discussion about whether the district was meeting the needs of high achieving (TAG) students. Going back over what has been posted, I see that that may have been an error on my part.
    Who should the public schools educate? I believe we have an obligation to every student who walks through the door. I believe each student deserves instruction at a level that challenges but does not frustrate. Every student will not leave school in the same place, but the “starting points” need not determine the endpoints, and ultimately, we want a vast majority of students to leave the public education system in the proficient/advanced range.
    I don’t think it is true that we can’t “expect outstanding achievement scores when the MMSD has a growing number of kids in poverty and kids of color”. I think it is true that we are CURRENTLY still struggling with how to make that happen on a large scale.

  24. I understood Carol to be saying — in offering the contextual information that she provided — that “even as the racial and socioeconomic diversity of our student population has increased, we have nevertheless seen an increase in the number of our students taking the ACT” (along with the modest increase in average group performance described in her original post). I felt Carol’s implied message to be “and that’s good news, something the District should feel proud of.” I felt the insinuated message (in many of the above comments, not just Carol’s) to be “and those of you who want to look at other data are just being negative — as always.”
    So let me be clear: I wholeheartedly agree with Carol and the rest of you that an increase in the MMSD ACT participation rate is good news — at any point in time, but perhaps especially at a time when the diversity of our student population is increasing. I also agree that increases in performance — however modest — are also good news, certainly better than decreases in performance.
    Where it becomes harder for me to join in the celebration is when I consider the larger picture: in Wisconsin, 68% of graduating high school students took the ACT in 2006. Of that group, 28% are college-ready, as defined by the ACT benchmark criteria. That means that when you consider ALL Wisconsin 2006 high school graduates, less than 20% are college-ready (by those same criteria). Of course, the number becomes even smaller when you factor in the number of high school dropouts.
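    The arithmetic behind “less than 20%” can be checked directly; a minimal sketch, using only the 68% participation and 28% benchmark figures quoted above:

```python
# Rough check of the college-readiness arithmetic in the comment above:
# 68% of Wisconsin 2006 graduates took the ACT, and 28% of those
# test-takers met the ACT college-readiness benchmarks.
participation_rate = 0.68  # fraction of graduates who took the ACT
benchmark_rate = 0.28      # fraction of test-takers who are college-ready

# Share of ALL graduates who are college-ready (counting only graduates;
# factoring in dropouts would shrink this number further, as noted above).
college_ready_overall = participation_rate * benchmark_rate
print(f"{college_ready_overall:.1%}")  # 19.0% -- i.e., less than 20%
```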
    O.K., so we’re better than we used to be and we’re better than most other states. And within Wisconsin, the MMSD is at the top of the distribution. That’s all wonderful. But friends, the vast majority of our children are still failing — many because we are failing them. And that’s tragic. Those numbers are appalling! No wonder the United States is behind so many other countries on international measures of academic performance. Mea culpa if I can’t join you in an extended celebration of what a great job our District is doing.
    (It’s kind of like saying (forgive me, I’ve been reading a lot about obesity and children’s nutrition recently) “our students lost ten pounds, while others lost only five pounds — yay, us!” when the fact is, most of the students still weigh over three hundred pounds! I mean, yes, ten is better than five is better than zero is better than a weight gain; but let’s not get too carried away with ourselves when everyone is still weighing way too much!)
    What’s frustrating for many of us is the District’s exclusive focus on the small good news; the silence around the larger not-so-good news (which often becomes a full-blown denial of the not-so-good news); and the readiness (in so many of you) to characterize those of us who want to talk about the not-so-good news as pessimistic wet blankets, forever raining on your parades (and mixing our metaphors).
    I keep returning to this: If there is a widely accepted national index of college-readiness — and more and more of our students are taking the test from which the index is calculated (an increase we all celebrate) — then we should be interested in knowing how our students are doing on said index. Period. No excuses. It should be a routine part of our performance and achievement analysis.

  25. Is it just me, or does anyone else question the benchmarks? Is it true that only 21% of Wisconsin graduates (who go on to college) are getting C’s in their subjects? I did NOT take the recommended core classes in math and science (stopped at trig, never took chem or physics) and still graduated from the UW with Honors–twice.
    I also noticed, when I was in the dorms, that those of us who graduated from MMSD were significantly less stressed out than our house mates, most of whom came from smaller districts. This was quite a while ago and Madison has changed a lot since then, so I don’t know if that is still the case.
    Regardless, nowhere have I read that DPI’s or MMSD’s reactions to the scores amount to an “extended celebration”. NO ONE is saying we don’t have to do anything else just because we are doing well relative to other states. Of course it’s not enough. But as a teacher who is working really hard, alongside a lot of other teachers who are working really hard, yes, I would appreciate having someone acknowledge (for a change) that however much work we still have to do, we are not failing everyone miserably.
    Good critiques are balanced with recognition of positives. In fact, the first rule of behavior modification is to make sure that students hear five positives for every criticism. Sometimes we have to work awfully hard to find those five positives, but if I want the student to hear anything I’m saying, I can’t create an excuse for them to tune me out and devalue my feedback. I have yet to see a student alter their thinking when faced with negative after negative after negative. Students who have those relationships with their teachers dismiss everything they hear by saying, “oh, that teacher just hates me”. Conversely, the students who feel liked and appreciated are genuinely impacted by the critiques they receive. They are significantly more likely to hear what we say, because they care what we say.
    I don’t think that it is so different with adults. Faced only with criticism–especially while working on long-term, slow-fix problems–we either crumble or stop giving it weight. I work in a school with one of the highest poverty, ELL and mobility rates in the district. I also work with students who have skills significantly above grade level. I love what I do, and the progress students make across the year tells me that I am doing my job well. However, it is hard, exhausting, stressful work. I want to listen to diverse viewpoints and discuss the problems the schools are facing, but I can’t do my job if all I feel is beat up.
