Of course, given the commercial success of U.S. News & World Report's (USN&WR's) annual college rankings edition, competitors were bound to spring up. Ten years ago, Newsweek magazine jumped into the fray by introducing the first national rankings of high schools. This week, the Washington Post initiated its ratings of Washington-area elementary schools.
While it may be comforting to think that it's possible to objectively compare educational institutions, the reality is that the major rankings are subjective evaluations built on largely arbitrary classification schemes. The problem is that people often ignore the evaluators' suspect methodologies and accept the ratings at face value.
Newsweek ranks high schools according to an index developed by education reporter Jay Mathews. Schools are ranked based on how many college-level tests they give. There is no attempt to evaluate how students perform on those tests, and many of the schools on Mathews' 2008 list received poor ratings from their own state boards of education; many failed to make Adequate Yearly Progress under the federal No Child Left Behind mandates.
Mathews nevertheless defends his rankings because they encourage schools to challenge students (often beyond their capabilities) and because his rating formula is easily understood. Without a clear performance measure, however, ranking a school by how many tests it gives is comparable to judging school quality by the number of colors of paint on its buildings or the varieties of fruit offered in its cafeteria.
The Washington Post's ranking of D.C.-area elementary schools has a different sort of problem. The Post "calculated the share of elementary school students who reached the highest, or advanced levels of performance on reading and math tests." The rankings rest on those test results and nothing else.
The difficulty with the Post rankings is that what is really being measured is the socioeconomic status of the families at each school, since high SES is strongly associated with test performance. Because the rankings don't account for family income or parents' education level, they provide no basis for judging how well a school is actually educating children.
On their face, the USN&WR high school rankings appear to be the best of the lot. They purport to identify the schools that best serve all their students, and they were formulated by Standard & Poor's School Evaluation Services. The rankings are derived in three steps. First, schools are evaluated on whether their students perform better than statistically expected, given the school's economic mix, relative to other schools in the state.
Second, schools are evaluated on how their non-Asian minority populations perform relative to state averages. Finally, schools are judged on how well they prepare students for college, using results from the International Baccalaureate and Advanced Placement exams.
The top-rated school nationally, according to the USN&WR rankings, is Thomas Jefferson High School for Science and Technology in Fairfax County, Virginia. TJ High is a magnet school, as are many of USN&WR's top high schools, and it accepts only a small proportion of applicants. So TJ High accepts an economic and racial cross-section of students, but only those students who are academically the best and brightest. In fact, the mean SAT scores for TJ graduating seniors are higher than the mean scores for Harvard freshmen.
Does that mean, however, that TJ High is a great school, or that others on the USN&WR list are great? Not at all. It is possible that TJ High has added a great deal to its students' incoming talents, but USN&WR does not attempt to measure students' progress. To paraphrase the famous quip about the first President Bush (that he was born on third base and thinks he hit a triple), the magnet performers on the USN&WR list begin miles ahead of the game. Without a value-added measurement, the USN&WR ranking is no better than the Newsweek or Post rankings.
School rankings may sell magazines and newspapers, but no one should be fooled into thinking that they are good measures of school quality.