The college-rankings game has been silly enough for years, but now it’s simply absurd. U.S. News has just revealed that it changed its methodology again for this year’s list, suggesting that it’s now “woke” because it’s giving more weight to “social mobility” aspects of each institution. Decipher this statement, if you can, as reported in Inside Higher Ed:
New this year in the outcomes section are two social mobility factors that together make up 5 percent of the total ranking. One looks at the graduation rates of Pell Grant recipients, and the other compares Pell-recipient graduation rates to those of all students. Both of those figures are then adjusted for the share of all students who are Pell recipients. So if two colleges have the same Pell graduation rates, but one has a larger share of Pell recipients, the second college would earn more points in the formula. U.S. News counts the graduation rate formula as also indicating social mobility and so says that 13 percent of its formula is now based on social mobility.
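The adjustment described above can be sketched as a toy calculation. The weights and the form of the adjustment here are hypothetical — U.S. News does not publish its exact formula — but the sketch shows why, of two colleges with identical Pell graduation rates, the one with the larger Pell share comes out ahead:

```python
def pell_mobility_score(pell_grad_rate, overall_grad_rate, pell_share):
    """Toy version of the two social-mobility factors described above.

    pell_grad_rate:    graduation rate of Pell Grant recipients (0-1)
    overall_grad_rate: graduation rate of all students (0-1)
    pell_share:        fraction of all students who are Pell recipients (0-1)

    The equal weighting and the multiplicative adjustment are assumptions
    for illustration only, not the actual U.S. News formula.
    """
    performance = pell_grad_rate                 # factor 1: Pell graduation rate
    parity = pell_grad_rate / overall_grad_rate  # factor 2: Pell rate vs. all students
    # Both factors are adjusted upward for a larger share of Pell recipients.
    adjustment = 1 + pell_share
    return (performance + parity) * adjustment

# Two colleges with the same Pell graduation rate (70%) and the same
# overall graduation rate (80%), but different Pell shares:
college_a = pell_mobility_score(0.70, 0.80, pell_share=0.15)
college_b = pell_mobility_score(0.70, 0.80, pell_share=0.40)
# The college with the larger Pell share earns more points.
assert college_b > college_a
```

Under any adjustment of this shape, enrolling more Pell students lifts the score even when outcomes are identical — which is exactly the behavior the quoted passage describes.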
Focusing only on this change (there are others), several schools vaulted upward significantly, such as Howard University, which rose from 110 to 89. Valerie Strauss puts it succinctly in the Washington Post:
If you put junk in, you get junk out. And that’s pretty much what you get with most rankings of schools. The folks doing the ranking decide what is important to them or their audience, and, for some reason, consumers and schools themselves put a great deal of stock in the outcome of ever-changing, questionable methodology.
At the same time, the top-ranked schools continue to be the usual suspects, without having changed much at all. They’re still the richest, the most privileged, the oldest (more or less), and in general the most established institutions in the United States. No matter how you slice it, this hierarchy will never change unless rankers decide to invert all their current measures.
U.S. News announces changes to its methodology every few years, so each new list is a “re-centering” along the lines of the College Board’s re-centering of the SAT a number of years ago. A 730 today isn’t the 730 of twenty years ago, so making comparisons over time requires conversion charts and fierce resolve. Have the institutions changed? Over time, all institutions do; the ranking methodology says more about the rankers and their audiences than it does about the institutions.
One very interesting difference is that this year U.S. News eliminated “acceptance rate” from the equation. Although it accounted for only 1.25% of each school’s ranking, it probably accounted for closer to 35% of consumers’ thinking. Perhaps that goes with a greater emphasis on social mobility: it’s better to focus on inclusion than exclusivity, though at such a low weight it hardly makes a difference. Unfortunately, the change also eliminates one of the useful features of the ranking, which enabled college counselors to see where their seniors might have a good chance of being accepted. And it’s always helpful to remind readers that most colleges and universities in the United States accept more than 50% of their applicants. In fact, the National Association for College Admission Counseling’s (NACAC) State of College Admission 2017 report indicates that, despite a 7% increase in applications from first-time freshmen, 66.1% of applicants were offered admission, the “average selectivity rate.”
Interestingly, Stanford got out ahead of U.S. News by announcing it would no longer publicize its admit rate. Coincidence? I doubt it; Stanford saw what was coming and figured it was good PR to get out in front of the change, despite the move’s having no real meaning and despite the school’s already having “won” the race to be among the most exclusive in the country.
Perhaps, as Sandy Hingston puts it in Philadelphia magazine, we can now all agree that college rankings are “bunk.” They’re really reflections of their creators, not unbiased scientific observations, no matter how many times words like “methodology,” “formula,” and “analysis” are used. Measuring one’s own priorities and concerns against such non-objective systems is folly. In the search for the “best” college for your children, it would be better to decide what your own “methodology” is and work from there.