A Quick Look At “Best High School” Rankings

Matthew DiCarlo:

Every year, a few major media outlets publish high school rankings. Most recently, Newsweek (in partnership with The Daily Beast) issued its annual list of the “nation’s best high schools.” Their general approach to this task seems quite defensible: to find the high schools that “best prepare students for college.”

The rankings are calculated using six measures: graduation rate (25 percent); college acceptance rate (25); AP/IB/AICE tests taken per student (25); average SAT/ACT score (10); average AP/IB/AICE score (10); and the percentage of students enrolled in at least one AP/IB/AICE course (5).
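To make the weighting concrete, here is a minimal sketch (in Python) of how such a composite might be computed. The weights come from the list above; the variable names, the example values, and the assumption that each measure has already been normalized to a common 0–1 scale are mine, since the excerpt does not say how Newsweek standardizes the raw figures.

```python
# A minimal sketch of combining Newsweek's six measures into one score.
# The weights are from the article; the normalization to [0, 1] for
# each measure is an assumption, not Newsweek's published method.

WEIGHTS = {
    "graduation_rate": 0.25,
    "college_acceptance_rate": 0.25,
    "ap_ib_aice_tests_per_student": 0.25,
    "avg_sat_act_score": 0.10,
    "avg_ap_ib_aice_score": 0.10,
    "pct_enrolled_in_ap_ib_aice": 0.05,
}

def composite_score(normalized: dict[str, float]) -> float:
    """Weighted sum of the six measures, each pre-scaled to [0, 1]."""
    return sum(WEIGHTS[m] * normalized[m] for m in WEIGHTS)

# Hypothetical school, with each measure already scaled to [0, 1]:
school = {
    "graduation_rate": 0.94,
    "college_acceptance_rate": 0.88,
    "ap_ib_aice_tests_per_student": 0.60,
    "avg_sat_act_score": 0.72,
    "avg_ap_ib_aice_score": 0.65,
    "pct_enrolled_in_ap_ib_aice": 0.40,
}
print(f"{composite_score(school):.3f}")  # 0.762
```

Note how the structure of the weights matters: the three 25-percent measures dominate, so two schools with very different test scores can land close together if their graduation and acceptance rates are similar.
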
Needless to say, even the most rigorous, sophisticated measures of school performance will be imperfect at best, and the methods behind these lists have been subject to endless scrutiny. However, let’s take a quick look at three potentially problematic issues with the Newsweek rankings, how the results might be interpreted, and how the system compares with that published by U.S. News & World Report.

Self-reported data. The data for Newsweek’s rankings come from a survey, in which high schools report their results on the six measures above (as well as, presumably, some other basic information, such as enrollment). Self-reported data almost always entail comparability and consistency issues. The methodology document notes that the submissions were “screened to ensure that the data met several parameters of logic and consistency,” and that anomalies were identified and the schools contacted for verification. So, this is probably not a big deal, but it’s worth mentioning briefly.
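For illustration, the screening step might look something like the sketch below. The actual “parameters of logic and consistency” Newsweek applied are not published in the excerpt, so both the field names and the specific rules here are hypothetical.

```python
# Hypothetical screens of the sort the methodology describes; the
# actual checks Newsweek ran are not published in the excerpt, so
# these rules are illustrative only.

def screen_submission(s: dict) -> list[str]:
    """Return a list of anomalies to verify with the school."""
    issues = []
    if not 0 <= s["graduation_rate"] <= 100:
        issues.append("graduation rate outside 0-100 percent")
    if not 0 <= s["college_acceptance_rate"] <= 100:
        issues.append("college acceptance rate outside 0-100 percent")
    # Cross-field check: tests taken imply some students are enrolled.
    if s["ap_ib_aice_tests_per_student"] > 0 and s["pct_enrolled_in_ap_ib_aice"] == 0:
        issues.append("AP/IB/AICE tests reported but zero course enrollment")
    return issues

submission = {
    "graduation_rate": 103,           # anomalous: over 100 percent
    "college_acceptance_rate": 88,
    "ap_ib_aice_tests_per_student": 1.2,
    "pct_enrolled_in_ap_ib_aice": 0,  # inconsistent with tests taken
}
print(screen_submission(submission))
# ['graduation rate outside 0-100 percent',
#  'AP/IB/AICE tests reported but zero course enrollment']
```

Checks like these catch impossible values and internal contradictions, but they cannot catch a school that consistently inflates a plausible-looking number, which is the deeper limitation of self-reported data.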