Richard Innes, via a kind email:
One of the more notable problems with much that is written about the National Assessment of Educational Progress (NAEP) regarding relative state performances is that far too often, only overall average scores are compared. Whether we are talking about college professors, state education agencies, local educators, or members of the press, important parts of the real story are routinely missed when the comparison stops at overall averages.
This isn’t a new problem. The National Center for Education Statistics (NCES) has cautioned for many years against overly simplistic analyses that look only at overall average scores. NCES even included special comments on the topic in the 2009 NAEP Science Report Card (http://nces.ed.gov/nationsreportcard/pdf/main2009/2011451.pdf).
Below is a partial extract from page 32 of that report card that highlights some examples of how the picture can be VERY different once a more thorough analysis of the NAEP data is conducted.
The first example used by NCES is Kentucky’s performance in the 2009 Grade 8 NAEP Science Assessment. When you only look at overall average scores, Kentucky scores statistically significantly higher than the national public school average. However, when you only consider scores for White students in each state, Kentucky’s score is statistically significantly lower than the national average. Once you learn that Kentucky’s NAEP student sample for this assessment was 85% White, the importance of this additional information becomes far more apparent.
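
To make the arithmetic behind this concrete, here is a small sketch using made-up numbers (not actual NAEP scores) showing how a state whose White students score below the national White average can still post an overall average above the national overall average, simply because its sample contains a much larger share of the higher-scoring subgroup:

```python
# Hypothetical illustration of a composition (Simpson's paradox) effect.
# All scores and percentages below are invented for illustration only;
# they are NOT actual NAEP results.

def weighted_average(groups):
    """Overall average from (share, score) pairs; shares must sum to 1."""
    return sum(share * score for share, score in groups)

# Hypothetical nation: 60% White scoring 162, 40% other groups averaging 137.
nation = [(0.60, 162), (0.40, 137)]

# Hypothetical state: 85% White scoring 158 (below the national White score
# of 162), 15% other groups averaging 136 (also below the national figure).
state = [(0.85, 158), (0.15, 136)]

nation_avg = weighted_average(nation)  # 0.60*162 + 0.40*137 = 152.0
state_avg = weighted_average(state)    # 0.85*158 + 0.15*136 = 154.7

print(f"National overall average: {nation_avg:.1f}")
print(f"State overall average:    {state_avg:.1f}")
# Both of the state's subgroups score below their national counterparts,
# yet the state's overall average is higher because 85% of its sample
# comes from the higher-scoring subgroup.
```

Run the sketch and the state's overall average (154.7) beats the national one (152.0) even though every one of its subgroups trails the nation, which is exactly why NCES warns against stopping at the overall number.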