How to read the tables

This is the second year the Financial Times has produced a ranking of European business schools.

This year’s table is slightly different from last year’s, in that it is compiled with data from four Financial Times business education rankings – the full-time MBA 2005 rankings (published in January), the ranking of non-degree executive education programmes (May), the European Masters in Management 2005 rankings (September) and the EMBA 2005 rankings (October).

Last year’s European business schools ranking did not incorporate the European Masters in Management rankings, because this survey was introduced in 2005.

Although this table is compiled from four rankings, it includes data from five tables, because the Financial Times survey of non-degree executive education programmes (published in May) produces two tables: one ranking open programmes and the other customised programmes.

Each of the Financial Times’ business education rankings has different criteria.

For example, the MBA ranking measures the research output of faculty, whereas the European Masters in Management ranking does not. Similarly, the EMBA ranking includes data on salaries earned by alumni, but the Executive Education rankings do not include salary information.

To view all the criteria in the various rankings, please go to http://news.ft.com/businesslife/mba

To compile the European Business Schools table, all the criteria from each of the rankings were used: 20 from the MBA 2005 rankings, 16 from the open programmes ranking (Executive Education 2005), 17 from the custom programmes ranking (Executive Education 2005), 16 from the European Masters in Management 2005 rankings and 16 from the EMBA 2005 rankings.

The first stage in compiling the European Business Schools table was to reproduce each of the separate ranking tables with only the European business schools.

European schools that achieved the required response rate but did not make it into the final published table of a ranking (the top 100 in the MBA, the top 75 in the EMBA, etc) were reinstated before each ranking table was rerun.

In each table, we were then left with only the European business schools and their z-scores.

Z-scores enable us to position each school, within any criterion, relative to all the other schools.
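
By way of illustration, a z-score is calculated by subtracting the mean of a criterion across all schools from a school's value and dividing by the standard deviation. The short Python sketch below shows the idea; the function name and figures are illustrative, not the FT's actual code.

```python
from statistics import mean, stdev

def z_scores(values):
    """Standardise a criterion across schools: subtract the mean and
    divide by the standard deviation, so each score shows how far a
    school sits above or below the average."""
    mu = mean(values)
    sigma = stdev(values)
    return [(v - mu) / sigma for v in values]

# Illustrative salaries for five schools on one criterion.
salaries = [95_000, 110_000, 87_000, 120_000, 102_000]
print([round(z, 2) for z in z_scores(salaries)])
```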

Each school’s z-scores from the rankings were added together and then averaged according to the number of rankings in which the school took part.

The z-scores of joint-programme schools were weighted (according to the number of partner schools) before their scores were added together.
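
A minimal sketch of these two steps, assuming an equal one-over-n split for joint-programme schools (the article does not state the exact weighting):

```python
def average_z(z_by_ranking, n_partners=1):
    """Average a school's z-scores over the rankings it took part in.
    For joint programmes, each z-score is divided equally among the
    partner schools (an assumption; the FT says only that the scores
    are weighted by the number of partners)."""
    weighted = [z / n_partners for z in z_by_ranking]
    return sum(weighted) / len(weighted)

print(average_z([1.2, 0.8, 1.0]))                # 1.0, three rankings
print(average_z([1.2, 0.8, 1.0], n_partners=2))  # 0.5, joint programme
```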

The final stage was to deflate the averaged z-scores by weights that depend on the number of surveys in which each school participated.

For schools participating in three out of four surveys, the final score is deflated by 10 per cent. For those participating in two out of four surveys, the final z-score is deflated by 20 per cent – and so on.
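
Read literally, this deflates the averaged z-score by 10 per cent for each of the four surveys a school missed. A small sketch of that interpretation:

```python
def deflate(avg_z, surveys_taken, total_surveys=4):
    """Deflate an averaged z-score by 10 per cent for each survey
    (out of four) in which the school did not participate."""
    deflation = 0.10 * (total_surveys - surveys_taken)
    return avg_z * (1 - deflation)

print(deflate(1.5, 4))  # 1.5, no deflation
print(deflate(1.5, 3))  # 1.35, deflated by 10 per cent
```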

Even though rerunning the individual ranking tables with only the European schools would change the final position of each school in its respective table, for publication the schools have been ranked in the same order as they appeared in each of their respective rankings.

Because of limited space in the newspaper, only a few criteria could be displayed in the final combined table. Therefore, only the main drivers from each of the separate rankings are shown: salary data (in euros for the European Masters salaries, and in US dollars for the MBA and EMBA) and the research rating.

However, from the Executive Education 2005 rankings (both the custom and open tables), only the final ranks are displayed.

The research rank in this table has been recalculated to include schools from the most recent EMBA rankings and the MBA 2005 rankings (for schools that were not in the EMBA 2005).

The research rating is based on the volume of contributions by each school’s faculty to 40 international refereed research journals, which were chosen by the business schools.

To calculate the research criterion, a point is awarded to the school where the faculty member is currently employed (a fraction of a point where an article has more than one author).

The final rank is a weighted sum of the absolute number of publications and the publications adjusted for faculty size.
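
One way to read those two steps in code; the per-article point split follows the description above, while the 50/50 weighting of the two components is an assumption (the article says only that the rank is a weighted sum), and in practice the two components would need to be standardised before being combined:

```python
def research_score(author_counts, faculty_size,
                   w_absolute=0.5, w_adjusted=0.5):
    """Score a school's research output.
    author_counts: one entry per article credited to the school,
    giving the number of authors; each article is worth 1/n points.
    The score combines the absolute point total with the total
    adjusted for faculty size (the 50/50 weights are an assumption)."""
    points = sum(1 / n for n in author_counts)
    adjusted = points / faculty_size
    return w_absolute * points + w_adjusted * adjusted

# Three articles: solo-authored, two authors, three authors.
print(round(research_score([1, 2, 3], faculty_size=50), 3))
```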

For schools that took part in both the EMBA 2005 and the MBA 2005, research data from the EMBA 2005, the more recent of the two rankings, were used.

Also, for joint programme schools, the research output was calculated based on the faculty of the school being measured.

For example, only WHU’s faculty (and not Kellogg’s) was used to calculate WHU’s research rating and, similarly, only Essec’s faculty (and not Mannheim’s) was used to calculate Essec’s research output.

As the European Business Schools ranking focuses on schools rather than programmes, the Cems postgraduate programme, which was measured and performed very well in the Masters in Management 2005 rankings, is not included: it is a programme, not a school.

Also, although Chicago Business School’s EMBA programme is taught in London (as one of its three programme locations) and is competing with the likes of London Business School, it is not ranked in the table because it is a US school.

Additional research by Wai Kwen Chan. Database consultant Judith Pizer of Jeff Head Associates, Amersham, UK
