Why Schools Moved Higher or Lower in the Best High Schools Rankings

One question frequently asked about the U.S. News Best High Schools rankings is why schools moved up or down when compared with their placement in previous years. There are several possible reasons public schools' ranks changed in the 2015 edition.

1. Changes in relative performance on state tests: Some schools that were ranked in the 2014 Best High Schools rankings fell out of the 2015 rankings completely because they were no longer among the best-performing schools on their statewide tests -- meaning that their overall student performance on state tests during the 2012-2013 academic year did not exceed statistical expectations (Step 1 of the rankings methodology) or the performance of their least advantaged students was not as good as the state average (Step 2 of the methodology).

Schools that did not pass both of these steps were not eligible for the national competition for a gold, silver or bronze medal and do not appear in the rankings at all.
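The two-step eligibility filter described above can be sketched in code. This is a minimal illustration, not U.S. News' actual implementation; the function names, data fields and score values are assumptions made for the example.

```python
# Illustrative sketch of the two eligibility steps. All names and
# numbers are hypothetical; the real methodology is more detailed.

def passes_step_1(performance_index, expected_index):
    """Step 1: overall performance on state tests must exceed what is
    statistically expected for the school."""
    return performance_index > expected_index

def passes_step_2(disadvantaged_score, state_avg_disadvantaged):
    """Step 2: the school's least advantaged students must do at least
    as well as the state average for such students."""
    return disadvantaged_score >= state_avg_disadvantaged

def eligible_for_medal(school):
    """Only schools passing BOTH steps compete for a gold, silver or
    bronze medal; all others do not appear in the rankings."""
    return (passes_step_1(school["performance_index"], school["expected_index"])
            and passes_step_2(school["disadvantaged_score"], school["state_avg"]))

# A hypothetical school that clears both hurdles:
example = {"performance_index": 0.62, "expected_index": 0.55,
           "disadvantaged_score": 0.48, "state_avg": 0.45}
print(eligible_for_medal(example))  # True
```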

In total, 3,375, or 72 percent, of the high schools that were awarded a gold, silver or bronze medal in the 2014 Best High Schools rankings returned to the 2015 rankings as gold, silver or bronze medal winners. That means that 28 percent of the high schools that were ranked in 2014 were not ranked in 2015.

Of the schools that were gold medal winners in the 2014 rankings, 92 percent returned to the 2015 rankings as gold, silver or bronze medal winners. A majority of the 2014 gold medal winners -- 74 percent -- returned as gold in 2015.

Of the schools that were silver medal winners in the 2014 rankings, 79 percent returned to the 2015 rankings as gold, silver or bronze medal winners. More than half of the 2014 silver medal winners -- 66 percent -- returned as silver in 2015.

And of the schools that were bronze medal winners in the 2014 rankings, 64 percent returned to the 2015 rankings as gold, silver or bronze medal winners. More than half of the 2014 bronze medal winners -- 59 percent -- returned as bronze in 2015.

These results show that the bronze high schools were much less consistent in their year-to-year performance on statewide tests, especially when compared with the relatively high year-to-year consistency among the gold medal schools and, to a somewhat lesser degree, the silver medal winners.

2. Changes in relative or absolute performance on college-level course work: Some schools may have moved either up or down in the 2015 rankings compared with last year because of how their 12th-grade class in 2012-2013 compared with the 2011-2012 class, in terms of participation in and performance on Advanced Placement or International Baccalaureate exams.

U.S. News determines the college readiness of each school by analyzing these data for the graduating class cohort in the most recent academic year available -- in this case, the 2012-2013 school year. This means we looked at whether these students took and passed any AP or IB exams during their years at the school, up to and including their senior year.

Many schools experienced a change in their status, ranging from moving a few places in the gold medal rankings to changing medal status -- from gold to silver, silver to bronze, bronze to gold or bronze to silver -- because of changes in their College Readiness Index scores.
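A College Readiness Index of the kind described above can be sketched as a weighted blend of AP/IB participation and passing rates for the graduating class. The 25/75 weighting below is an assumption for illustration only; consult the Best High Schools methodology for the actual formula.

```python
# Hedged sketch of a College Readiness Index. The weighting scheme is
# an assumption, not the published U.S. News formula.

def college_readiness_index(seniors, exam_takers, exam_passers,
                            participation_weight=0.25, quality_weight=0.75):
    """Combine the share of seniors who took any AP/IB exam during high
    school with the share who passed at least one, weighting passing
    more heavily, and scale to 0-100."""
    participation_rate = exam_takers / seniors
    quality_rate = exam_passers / seniors
    return 100 * (participation_weight * participation_rate
                  + quality_weight * quality_rate)

# A hypothetical class of 200 seniors: 120 took an AP or IB exam at
# some point in high school, and 80 passed at least one.
print(round(college_readiness_index(200, 120, 80), 2))  # 45.0
```

Under a scheme like this, a school's index shifts year to year as each new senior cohort's participation and passing rates change, which is how a school can move a few places or change medal tiers.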

3. New medal winners: Some schools were new to the 2015 rankings because they passed both Step 1 and Step 2 of this year's methodology but didn't pass both of those steps in 2014 and therefore weren't eligible for a gold, silver or bronze medal.

Other high schools became eligible to be ranked for the first time in 2015 because they are relatively new schools. They may have had their first 12th-grade class graduate in 2012-2013, or the size of their graduating class may have grown enough to be included in the rankings.

In total, 3,142, or 48 percent, of the high schools that were awarded a gold, silver or bronze medal in the 2015 rankings were not ranked in 2014. Specifically, 101 of this year's gold medal winners, 837 of the silver medal winners and 2,204 of the bronze medal winners were not ranked in 2014.
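The counts reported in this article can be cross-checked with a little arithmetic; every figure below is taken directly from the text.

```python
# Cross-checking the article's reported counts.

returned_from_2014 = 3375           # 2014 medalists who returned in 2015
new_in_2015 = 101 + 837 + 2204      # new gold, silver and bronze winners

# The new winners by medal tier should sum to the stated 3,142 ...
assert new_in_2015 == 3142

# ... and 3,142 should be about 48 percent of all 2015 medalists.
total_2015 = returned_from_2014 + new_in_2015
print(total_2015)                             # 6517
print(round(100 * new_in_2015 / total_2015))  # 48
```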

One factor that helped schools not ranked in 2014 earn a place in 2015 is an important methodology change U.S. News made: we lowered the performance threshold necessary for a school to pass Step 1. To qualify for the rankings published from 2012 to 2014, schools had to meet a performance threshold of one-half of one standard deviation above the average. For this year's rankings, schools only had to reach one-third of one standard deviation above the average.

This slightly lower threshold was applied to a school's performance compared with what would be statistically expected for that school, based on its percentage of economically disadvantaged students.
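The effect of lowering the threshold from one-half to one-third of a standard deviation can be illustrated with a small sketch. The expected-performance figure here is a placeholder; in the actual methodology it is derived statistically from each school's share of economically disadvantaged students.

```python
# Illustrative sketch of the Step 1 threshold change. All numeric
# inputs are hypothetical.

def passes_step_1(performance, expected, std_dev, threshold_sds):
    """A school passes Step 1 if its performance index exceeds its
    statistically expected value by at least the given number of
    standard deviations."""
    return performance > expected + threshold_sds * std_dev

# A school performing 0.4 standard deviations above expectation:
performance, expected, sd = 0.64, 0.60, 0.10

print(passes_step_1(performance, expected, sd, 1/2))  # False: fails the 2012-2014 rule
print(passes_step_1(performance, expected, sd, 1/3))  # True: passes the 2015 rule
```

A school in this band (between one-third and one-half of a standard deviation above expectation) would have missed the 2012-2014 rankings but qualifies under the 2015 rule, which is why hundreds more schools won medals this year.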

The change meant that a larger percentage of high schools passed Step 1 in 2015 than in last year's rankings, and it resulted in hundreds more high schools winning medals. This change also meant that it was slightly easier for a ranked school in 2014 to return to the rankings in 2015, assuming that it had similar relative performance on state tests.

Overall, in the 2015 rankings, approximately one-third of eligible high schools earned a medal, compared with around one-fourth of those eligible in 2014.

4. Suppression of state test results, incomplete state test data or lack of AP test results: Some medal-winning schools that were top performers in terms of college readiness in 2014 weren't eligible to be ranked in 2015 because their state blocked certain portions of their math and English state test results from being released publicly.

There were also schools that weren't ranked in 2014 that may have been eligible for medals this year, but certain portions of their state test data were suppressed or missing.

Data could have been suppressed by states for various reasons, including protecting the identities of certain students or of students in particular subgroups, such as those who had very high or very low scores on their state tests.

For a relatively large proportion of schools in Delaware, Maine, Utah and Wyoming, not enough data were available to calculate a performance index -- the key component needed to pass Step 1 of the rankings methodology. Schools without a performance index could not pass Step 1 and therefore were not eligible for medal consideration.

Oklahoma was also missing state test data for a significant proportion of schools, which made it impossible to complete the Step 1 calculation for many of its schools. Alabama, Kentucky, Illinois, Mississippi and Virginia also had data suppression issues that affected the analysis of the high schools in their states. For a detailed explanation of how all of these issues were handled in the analysis, see the Best High Schools technical appendix.

In addition, some schools weren't eligible for gold or silver medals in 2015 because U.S. News could not use their AP data to determine their level of college readiness (Step 3 of the methodology).

Alabama, Minnesota and South Dakota were the only states that did not give U.S. News permission to use their schools' AP data in Step 3 of the rankings. Because of the lack of AP data, schools in those states could be rated only on IB data in Step 3, if those data were available. Both Alabama and Minnesota had schools with IB data.

The lack of AP data prevented some schools that were gold or silver medal winners in 2014 from being ranked at that level in 2015. However, these schools could still earn bronze medals if they passed Step 1 and Step 2 of the methodology.