Good schools, bad schools and ugly rankings

There is a certain tragic irony in the fact that a crackdown on U.S. colleges with low graduation rates was announced during the very week when the Times Higher Education Supplement published its annual list of the world’s best universities. The coincidence provokes the thought that there is too much of this going on.

By “this” I mean the constant effort in higher education to sort the best from the not-best, and the best among the best from the best, and the best among the best among the best from the best among the best. (Try saying that twice quickly.) There are so many rankings now, using so many different methodologies.

The Times Higher Education Supplement list — the THE list, as it is awkwardly known — emphasizes research done at the school. The U.S. News & World Report rankings, which also came out this month, weigh 15 factors, including alumni giving, graduation rate and admission selectivity. There are lists of the best party schools, the schools with the best teachers, the schools with the best dormitories.

Others have said it, and so will I. Such rankings are basically silly. They yield little useful information to would-be consumers of education. At the very top of the THE list, Oxford University has supplanted Caltech, which had a five-year run in first place. Does this mean that you should redirect your budding physicist to England?

The many lists, crude and opaque as they might be, have largely swamped the traditional idea of searching in a personalized way not for the best school but for the school that best fits each applicant. Moreover, the existence of the lists — and the fact that students and their parents pay so much attention to them — encourages gaming. Two years ago, the president of Northeastern University admitted making a “systematic effort” to improve his school’s U.S. News ranking. He broke no rules. He simply chose to emphasize things that would move the school higher.

Other schools have been accused of outright lying — excuse me, fudging the numbers — to improve their rankings.

Which brings us to the potential crackdown on schools at what we might call the other end of the list. According to the Wall Street Journal, the member organizations that make up the Council of Regional Accrediting Commissions have decided to take a closer look at schools that graduate under 25 percent of their students in four years. I understand the impulse, but the tool of measurement is crude.

Experts on higher education have been predicting for years that the traditional four-year model is going to die, except at a handful of highly selective institutions. Maybe so. But it’s not obvious that we can tell from the graduation rate which schools to euthanize, or even which ones deserve re-examination. A college might well have a low graduation rate because it serves a marginal population — young people for whom, to be blunt, going to college is a risk. If we punish schools with low graduation rates, we may send the message that these kids shouldn’t be in higher education. (And if you’re saying, “Well, that’s true,” you might be right — but it would be better if we could debate the point openly.)

More important, graduation rate is too crude a measure, and too easily gamed. If the accreditors set the threshold for closer investigation at a given graduation rate, they will simply give colleges, even poor ones, an incentive to graduate more students. They might wind up passing almost everybody. As my Bloomberg View colleague Matt Levine recently noted, you get what you measure. He had banking in mind. But the notion holds just as true for colleges and universities struggling to move up the rankings. We shouldn't be surprised when the same idea occurs to the schools at the bottom.

Stephen L. Carter is a Bloomberg View columnist. He is a professor of law at Yale University and was a clerk to U.S. Supreme Court Justice Thurgood Marshall.
