This year, 155 business schools from 28 countries took part. Each school must meet strict entry criteria: for example, schools must be internationally accredited and the MBA must have run for at least four consecutive years.
Data for the rankings are collected using two online surveys, one for the schools and one for alumni who completed full-time MBAs in 2009.
A 20 per cent alumni response rate is required, with at least 20 fully completed responses. A total of 10,706 alumni surveys were completed (a response rate of 48 per cent).
Alumni responses inform eight ranking criteria, accounting for 59 per cent of the ranking’s weight. The first two ranking criteria, the most heavily weighted, examine alumni salaries.
Salaries of non-profit and public sector workers, as well as full-time students, are removed. Remaining salaries are converted to US dollars using purchasing power parity rates supplied by the International Monetary Fund. PPP conversion – based on the premise that identical goods should cost the same in different countries – accounts for differences in the relative strength of currencies. Following the subsequent removal of the highest and lowest salaries, the average “current salary” is calculated for each school and weighted to reflect variations between industry sectors. The resulting figure, “weighted salary”, carries 20 per cent of the ranking’s weight.
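As a minimal sketch of the salary pipeline described above, assuming salaries arrive in local currency and the PPP rate is expressed as local units per US dollar (the function and its inputs are hypothetical; the sector weighting step is omitted):

```python
def average_salary_usd(salaries_local, ppp_rate):
    """Convert salaries to US dollars at a PPP rate, drop the single
    highest and lowest values, and average the remainder.

    A sketch of the steps described in the text; the FT's sector
    weighting and data cleaning are not reproduced here."""
    usd = sorted(s / ppp_rate for s in salaries_local)
    trimmed = usd[1:-1]  # remove the highest and lowest salaries
    return sum(trimmed) / len(trimmed)
```

For example, local salaries of 100, 200, 300 and 400 at a PPP rate of 2 convert to 50–200 US dollars, trim to 100 and 150, and average to 125.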
“Salary increase” is determined for each school according to the difference between average alumni salary before the MBA and three years after graduation. Half of this figure is calculated according to the absolute increase and half according to the percentage increase.
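The half-and-half split can be sketched as follows (a hypothetical helper; how the two halves are scaled against each other across schools is not published, so they are returned separately):

```python
def salary_increase_components(pre_mba_avg, post_mba_avg):
    """Return the two halves of the "salary increase" figure: one
    based on the absolute rise, one on the percentage rise.

    A sketch only; the FT's normalisation of the two components
    against each other is not reproduced."""
    absolute = post_mba_avg - pre_mba_avg
    percentage = 100 * absolute / pre_mba_avg
    return 0.5 * absolute, 0.5 * percentage
```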
Where available, data collated over the past three years are used for all alumni criteria, except “value for money”, which is based on 2013 figures. Responses from the 2013 survey carry 50 per cent of the total weight, and those from 2012 and 2011 each account for 25 per cent. Excluding salary-related criteria, if only two years of data are available, the weighting is split 60:40 if data are from 2013 and 2012, or 70:30 if from 2013 and 2011. For salary figures, the weighting is 50:50 for two years’ data, to negate inflation-related distortions.
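The year-weighting rules above can be captured in a short sketch (the function is illustrative; `values_by_year` maps a survey year to a school's average for one criterion):

```python
def blend_years(values_by_year, salary_criterion=False):
    """Blend up to three survey years using the stated weights:
    50:25:25 for three years; for two years, 50:50 for salary
    criteria, 60:40 for 2013 and 2012, 70:30 for 2013 and 2011."""
    years = sorted(values_by_year, reverse=True)  # newest first
    if len(years) == 3:
        weights = [0.50, 0.25, 0.25]
    elif len(years) == 2:
        if salary_criterion:
            weights = [0.50, 0.50]
        elif years == [2013, 2012]:
            weights = [0.60, 0.40]
        else:  # 2013 and 2011
            weights = [0.70, 0.30]
    else:
        weights = [1.0]
    return sum(w * values_by_year[y] for w, y in zip(weights, years))
```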
Eleven criteria calculated from school data account for 31 per cent of the ranking. These measure the diversity of teaching staff, board members and MBA students, according to gender and nationality, and the international reach of the programme. For gender criteria, schools with a 50:50 (male/female) composition receive the highest score. To ensure the integrity of school data, KPMG audits a number of schools every year.
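One plausible shape for the gender criterion is a linear fall-off either side of parity (the linear form is an assumption for illustration; the FT states only that a 50:50 composition scores highest):

```python
def gender_diversity_score(percent_female):
    """Peak score of 100 at a 50:50 male/female split, falling
    linearly to 0 at either extreme. The linear shape is assumed;
    the FT publishes only that parity scores highest."""
    return 100 - 2 * abs(percent_female - 50)
```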
There have been minor changes to the calculation of international diversity for 2013. In addition to schools’ percentage of international students and faculty – the figures published – the composition of these groups by individual citizenship now informs a diversity score that feeds into the calculation.
Additionally, the weight of the “international course experience” criterion has increased from 2 to 3 per cent, while the “languages” criterion has been halved to 1 per cent.
The FT research rank, which accounts for 10 per cent of the ranking, is calculated according to the number of articles published by full-time faculty in 45 internationally recognised academic and practitioner journals. The rank combines the absolute number of publications between January 2010 and October 2012 with the number of publications weighted relative to faculty size.
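The research rank's two components can be combined as in this sketch (the function and the equal `alpha` mix are assumptions; the FT does not publish the exact blend):

```python
def research_score(publications, faculty_size, alpha=0.5):
    """Mix the absolute publication count with publications per
    faculty member. The 50:50 mix set by `alpha` is an assumption,
    not the FT's published formula."""
    per_faculty = publications / faculty_size
    return alpha * publications + (1 - alpha) * per_faculty
```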
Finally, an FT score is calculated for each school. First, Z-scores – standardised scores that reflect where each school falls in the range between the top and bottom school – are calculated for each ranking criterion. These scores are then weighted according to the weights outlined in the key to the 2013 ranking and summed to give a final score by which the schools are ranked.
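The final aggregation described above can be sketched with standard Z-scores (a minimal illustration; the criterion names and weights passed in are placeholders, not the FT's data):

```python
from statistics import mean, pstdev

def ft_scores(criteria, weights):
    """Standardise each criterion to a Z-score across schools, then
    sum the weighted Z-scores per school.

    `criteria` maps a criterion name to a {school: value} dict;
    `weights` maps the same names to their published weights."""
    totals = {}
    for name, values in criteria.items():
        mu = mean(values.values())
        sigma = pstdev(values.values())
        for school, value in values.items():
            z = (value - mu) / sigma if sigma else 0.0
            totals[school] = totals.get(school, 0.0) + weights[name] * z
    return totals
```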
Judith Pizer of Jeff Head Associates was the FT’s database consultant. FT research rank was calculated using Scopus, an abstract and citation database of research literature.