How can we measure the impact of executive education?
A plunge in the price of crude oil at the dawn of the financial crisis was one factor behind Neste Oil’s market capitalisation halving between 2007 and 2008. So when the Finnish refining and marketing company created leadership courses a few years later in partnership with a business school, quantifying their impact as well as their monetary and other costs was paramount. “It was a substantial investment. There’s also the time people took off work,” says Hannele Jakosuo-Jansson, senior vice-president for human resources and safety.
Some 270 Neste managers enrolled on two nine-month programmes in 2011 and 2017, created with the Swiss business school IMD. The idea was to create a culture that would enable a radical pivot to producing renewable fuels.
Jeremy Baines, vice-president of sales for the Americas, who took both courses, says unsuccessful expansion in the 1980s into new sectors, such as petrochemicals production, had created a fear of failure. Guest speakers on the courses who were innovating in their fields, including one from a non-governmental organisation promoting sustainable fishing, encouraged participants to explore new ideas.
Baines, for instance, improved the profit margin in North America by selling renewable diesel directly to consumers, such as truck fleet operators, rather than through oil majors that blended the fuel. He says consumers are willing to pay higher prices now that the diesel is marketed as a renewable fuel. “We know we won’t be penalised for thinking differently now,” Baines says of the change since the IMD courses. “There is more risk but more reward.”
To measure the impact of the 2011 programme, in 2013 and 2016 Denison Consulting surveyed all Neste employees about the company’s culture. Denison gave the company percentile scores indicating how well it ranked compared with other organisations in areas such as creating change and strategic direction. Over the three-year period, Neste had improved in 10 out of 12 indices.
Jakosuo-Jansson says the change in culture has been reflected in improved financial performance. Neste’s renewable products business went from a €163m loss in 2011 to a €56m loss in 2012, before shooting up to a €273m profit in 2013. By 2017 profits were €561m and today renewables account for more than 70 per cent of group profits. “There are many factors behind the turnaround,” she says. “But executive education was a key element.”
A growing number of companies are demanding executive education that has a proven positive impact. Surveys by the former FT/IE Corporate Learning Alliance, relaunched in April as Headspring, show that in 2018, 41 per cent of executives were choosing a provider based partly on this criterion, up from 25 per cent in 2016.
Mohamad Razaghi, senior project lead at IMD, which in 2018 established an impact measurement department, says the increase reflects cautious corporate spending since the financial crisis. Another factor, he says, is intensifying competition in executive education — consulting firms, for which impact measurement is routine, have entered the market. “Capturing value is critical to winning clients,” he says. “It is the holy grail of executive education.”
The assessments are moving beyond “happy sheets” that measure how much (or how little) participants liked course content and delivery, to calculating impact on hard business results, says Andrew Crisp, co-founder of the education marketing company CarringtonCrisp, which has researched the impact of executive education with the UK’s Chartered Association of Business Schools. Crisp says schools are using data analytics techniques from online education programmes to measure the impact of courses on clients’ productivity, profitability and sales.
Yet almost all the 49 schools he studied said they polled participants immediately after they had finished a course. This is not enough time for students to apply learning to work, says Andrew White, associate dean for executive education at the University of Oxford’s Saïd Business School. Only 40 per cent of the schools also surveyed students six months later. “Companies should choose a timeframe that fits their goals — some can take years to achieve, like cultural change or career progression,” says White.
Sir Frank McLoughlin is associate director for leadership at the Education and Training Foundation, which develops England’s further education (FE) workforce. Between 2017 and 2019, it created three programmes with Oxford Saïd, including one to develop the financial and management skills of FE college deputy principals. “The sector has been under a real financial strain and many of those in leadership roles have no experience,” says McLoughlin.
As the foundation is funded partly by government grants, he wanted an independent impact assessment from a research company. This showed an increase in participants’ competence and confidence in their leadership abilities. It also found that 15 per cent of the deputy principals who are on or have graduated from the programme have been promoted to principal to date. “Given that there are only 170-odd further education colleges in England, the programme is having a clear impact,” says McLoughlin.
However, delaying surveys can muddy the waters, since other factors can influence performance, says Albrecht Enders, dean of programmes and innovation at IMD. He says it may be hard for participants to recall what they learnt, and a fragmented system of measurement makes it difficult to compare results across programmes and schools.
Teresa Martín-Retortillo, head of open-enrolment executive programmes at IE Business School in Spain, uses the “net promoter score” (NPS) to assess how the school is judged by participants. Clients answer, on a scale of 0-10, how likely they are to recommend IE courses to others. Those who score 9-10 are grouped as “promoters” and those who score 0-6 are called “detractors”. Subtracting the percentage of detractors from promoters yields the NPS, which ranges from -100 to 100. The score for IE ranges from 50 to 90 depending on the programme. The system reduces admin and makes data comparable, says Martín-Retortillo. “Extensive questioning can put participants off. A lower return rate reduces a survey’s credibility.”
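The NPS arithmetic described above is simple enough to sketch in a few lines of Python (the function name and sample scores are illustrative, not IE data):

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 ratings.

    Promoters score 9-10, detractors 0-6; passives (7-8) are ignored.
    NPS = percentage of promoters minus percentage of detractors,
    so the result ranges from -100 to 100.
    """
    if not ratings:
        raise ValueError("at least one rating is required")
    n = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / n

# Hypothetical survey: 6 promoters, 2 passives, 2 detractors out of 10
scores = [10, 9, 9, 10, 9, 9, 8, 7, 6, 3]
print(net_promoter_score(scores))  # 40.0
```

Note that the passives drop out of the numerator but still count in the denominator, which is why a course with many lukewarm 7-8 responses can post a low score even with few outright detractors.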
Another problem is that companies often lack baseline data, says David Roberts, president of UNC Executive Development, part of Kenan-Flagler Business School in the US. Companies’ annual employee engagement surveys, for example, may be too old to be an accurate starting point. “The risk is that companies are potentially making decisions to invest in training on the back of inaccurate or misleading data,” says Roberts. Rigorous measurement may mean creating a bespoke method for each client, he adds. “There’s no silver bullet solution.”