Since the Wharton School announced last spring that we would no longer provide the e-mail addresses of our students or alumni to publications conducting rankings of business schools, there has been a lot of talk about why we chose this path. Editorials in some publications posited that we might be trying to censor student opinion or that we were retaliating for having dropped numerically in some rankings or never having achieved the top spot in others.

I appreciate that the editors of the Financial Times have given me this opportunity to set the record straight. Quite simply, we have taken the position that rankings are a bad tool for prospective students who are trying to select the right school for them, the right fit for their educational needs and interests, and the best preparation for their futures.

We are not trying to hide from journalistic scrutiny.

We are, instead, looking for ways to focus our resources and co-operate with the media to develop more constructive and meaningful ways to help students and employers evaluate business schools.

Some have raised the concern that we are trying to suppress student and alumni opinion.

Anyone who has been on a university campus, and a business school in particular, knows that there is nothing that could hold back students from letting their views be known!

The fact is that we encourage constructive and open debate, because that’s where innovation is born - between the cracks of one idea and another. We deeply value student opinion. Our goal is to constantly improve our programming, and our students work with us to do that every year.

Likewise, recruiters’ interest in the content of our curriculum and the preparedness of our graduates is critical. We actively seek their thoughts and have instituted significant changes in our programmes as a result.

For more than a decade, we have conducted annual surveys of our students to elicit constructive feedback about their satisfaction with the educational experience. The feedback is by no means all positive. In fact, negative opinions are the most helpful; they tell us where we need to focus attention and resources to improve. We drill down into the data we collect to analyse the exact nature of the problems, and we react quickly with targeted strategies to fix what’s not working.

By contrast, rankings give a one-size-fits-all numerical assessment of a school’s performance, with unclear methodologies and subjective editorialised commentary on what schools are doing right or wrong.

This fuels the growing phenomenon of “playing the rankings game” - instituting policies and programmes that will move a school up in the rankings without actually improving the educational product.

Some rankings publications have dubbed themselves “educational reformers,” but what counts as a reform can be just a gimmick with no substantive impact on the quality of education. One particularly disturbing anecdote involves a school that was dissatisfied with the ranking it had received from recruiters. To address its recruiter ranking, this school offered valet parking and a concierge service to recruiters visiting its campus. The following year, the school’s recruiter ranking improved, and the publication credited the concierge and valet parking as an effective reform.

But wouldn’t prospective students read a recruiter ranking as a sign that firms like hiring from that school, and of the career opportunities its degree offers, rather than as a rating of parking amenities? The numerical ranking made no distinction.

We believe that prospective students are hurt, rather than helped, by rankings. We know from the sales of magazines’ ranking issues and college guides, as well as comments we hear from students, that rankings play a big role in deciding where they apply. But reliance on rankings is a poor substitute for the proper due diligence that helps students make the right choice for them.

We don’t claim to be the right school for everyone, and we definitely don’t want to be a school that students select solely or predominantly because of rankings. Students must select a school based on their career goals, the disciplines they want to study, their learning styles and the type of community they wish to be involved with for the rest of their lives.

This requires more research than rankings can provide.

We spend hundreds of staff hours responding to more than a dozen annual surveys for rankings in which we have historically participated, and we get more requests each year from publications hoping to tap into the popularity of rankings. While our decision last April affected only the provision of e-mail addresses, Wharton’s ultimate goal is to provide prospective and current students, alumni, recruiters and the media with more consistent and complete information, in a truly comparative format.

That’s why we support the Graduate Management Admission Council (GMAC) and its effort to develop a database that will include standardised, audited data. The MBA Career Services Council’s Standards Committee has also worked with representatives from several rankings publications, including the Financial Times, to discuss data points the publications would like to track.

By co-operating with other schools and media outlets, agreeing on standards, and subjecting our data to independent auditing, we hope to offer more information, not less, to potential students and recruiters around the world.

Patrick Harker is dean of the Wharton School at the University of Pennsylvania
