College Rankings Are Misleading: Should We Stop Giving Them So Much Credit?

It is that time of year when high school students nervously check their inboxes, hoping they have been accepted to their reach schools. At the same time, the college rankings that many of them relied on to build their application lists have come under fire once more.

Recently, Michael Thaddeus, a mathematics professor at Columbia University, openly questioned the accuracy of the data the college submitted to U.S. News. The accusation has drawn special attention because the school was ranked the second-best university in the U.S., tied with Harvard University and the Massachusetts Institute of Technology (MIT). Meanwhile, the University of Southern California admitted it had been omitting key data from its U.S. News reports to inflate the ranking of its graduate school of education; the university pulled the program out of the rankings altogether this year.

The news came on the heels of a federal court sentencing a former dean of Temple University’s business school to 14 months in prison for submitting false data to U.S. News. Feeding false data into the rankings appears to be a widespread practice: Claremont McKenna College, George Washington University, and the University of Oklahoma, to name a few among a slew of others, have also falsified their data in the past.

Fixing the problem might be simpler if the issue were solely the schools that cheat the system. Unfortunately, the problem goes deeper: the rankings rest on questionable criteria and offer students and families a misleading guide to where they should apply and enroll. Families can end up paying more when an education of similar quality is available at a better price. Schools ranked around 70th, for instance, will probably differ little from their counterparts ranked around 90th in quality of education or career prospects. Because the rankings obscure this, students may pay full tuition at a slightly higher-ranked university while forgoing scholarship offers from a lower-ranked one.

How Are Rankings Constructed?


U.S. News & World Report College Ranking Metrics. Source: U.S. News

It is therefore worth taking a closer look at what goes into the rankings that so many rely on for guidance. Many prospective students are probably unaware that the single most heavily weighted factor in U.S. News’ ranking is an institution’s reputation, which accounts for 20 percent of the final score. The reputation criterion, based solely on a “peer assessment survey,” carries as much weight as the entire faculty-resources category, which includes the class size index, the student-faculty ratio, and several other criteria highly relevant to an institution’s quality of education.

Coming in a close second in influence is the six-year graduation rate. This, too, can be highly misleading, as it reflects a cohort’s financial means to fund their studies more than their academic ability. Because students from low-income families are far more likely to interrupt their studies for financial reasons, many of them struggle to complete an undergraduate degree within six years. Giving the six-year graduation rate a 17.6 percent weight may consequently motivate colleges to admit fewer students from low-income backgrounds in order to protect their graduation rates and rankings.

Meanwhile, graduate indebtedness, the most important factor for student satisfaction according to an extensive Gallup survey from 2017, accounts for a mere 5 percent of the U.S. News metrics. Since prospective students also consult rankings to gauge their likely quality of life during and after college, it is hard to see why graduate indebtedness should carry so little weight.
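To see how such a weighted formula plays out in practice, here is a minimal sketch of a composite score of this kind. The weights for reputation, the six-year graduation rate, and graduate indebtedness are the figures cited above; the remaining bucket and all indicator values are hypothetical placeholders, not U.S. News’ actual methodology.

```python
# Illustrative sketch of a weighted composite ranking score.
# Weights for peer reputation (20%), six-year graduation rate (17.6%),
# and graduate indebtedness (5%) are the figures cited in this article;
# the "other_categories" bucket and all indicator values are hypothetical.

WEIGHTS = {
    "peer_reputation": 0.20,
    "six_year_grad_rate": 0.176,
    "graduate_indebtedness": 0.05,
    "other_categories": 0.574,  # hypothetical stand-in for the remaining criteria
}

def composite_score(indicators: dict[str, float]) -> float:
    """Combine indicators (each pre-normalized to a 0-100 scale) by weight."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[name] * indicators[name] for name in WEIGHTS)

# A hypothetical school that scores well on reputation but leaves its
# graduates heavily indebted still earns a high composite score,
# because indebtedness carries so little weight.
example = {
    "peer_reputation": 95.0,
    "six_year_grad_rate": 90.0,
    "graduate_indebtedness": 40.0,  # low score = heavy graduate debt
    "other_categories": 85.0,
}
print(f"Composite score: {composite_score(example):.1f}")  # -> 85.6
```

The point the arithmetic makes is structural: whichever indicator receives the largest weight dominates the final score, so a school can rank highly while performing poorly on the factors students care about most.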

On a positive note, after years of heavy criticism, U.S. News removed the acceptance rate from its ranking formula. The metric had encouraged schools to adopt aggressive marketing strategies that lured students, even those with practically no chance of admission, into applying. In its place, the ranking now considers the proportion of Pell Grant recipients who complete their degrees within six years, a measure that offers insight into how much support financially disadvantaged students receive at each college.

How to Fix the Rankings

Nevertheless, the metrics still include factors with little bearing on students’ satisfaction with their college experience. Alumni donations, for example, account for 3 percent of the formula, while the aforementioned graduation rate of Pell Grant students accounts for 5 percent. This is simply not the information prospective students need. What they need to know is whether a college offers a stimulating suite of courses with quality teaching, how it prepares students for a future career, how indebted its students become after graduation, and whether the campus is diverse enough for every student to find a community where they fit.

University of La Verne’s alumni reported a high level of satisfaction despite the school’s relatively low placement in the rankings. Source: University of La Verne

Likewise, student and alumni satisfaction with the overall college experience should carry far more weight in the rankings. Jonathan Rothwell, Gallup’s principal economist, used the results of the 2017 Gallup survey to devise a university ranking centered on graduates’ satisfaction. Although his ranking showed a general positive correlation between the traditional rankings and graduates’ satisfaction, there were outliers: alumni of Azusa Pacific University and the University of La Verne, for instance, reported high satisfaction despite their schools’ modest placements. Rothwell’s study also found that tuition does not correlate with graduates’ satisfaction.

College administrators lament the misrepresentative rankings just as much as critics do. Nonetheless, they continue to participate in them and benefit from the scheme. What they could do instead is collaborate with ranking publishers to build a universal ranking system that better reflects alumni satisfaction. The new system could also capture other information students actually want: Which institution offers a wide variety of extracurricular opportunities and cultural diversity? Which university provides extensive career support? Which school offers specialized experiential learning?

Choosing a college remains a daunting decision for students, and the rankings published by U.S. News and others are not what they appear to be. Nor are the publishers likely to overhaul the cornerstone of their business on their own. It is time for colleges to challenge the status quo and make the case for a new ranking system, one that can serve as a genuinely useful guide to one of the most momentous decisions of a student’s life.


Read More: Fake College Ranking Data
