These are anxious times in the world of work, and not just for individuals fretting over where future jobs will come from. Organisations too are waking up to awkward questions posed by advancing technology. Where will top managers come from now that we have broken the career ladder by outsourcing or eliminating middle management? What jobs are needed to lure the brightest young talent away from the start-up and freelance “gig” economy? What shouldn’t we automate? What are these currents doing to notions of development, loyalty — and the whole human resources function?
Some of this was explored in a recent Future of Work seminar led by Lynda Gratton, London Business School’s professor of management practice. The goalposts are moving, she says, but it is clear from mismatches all along the business value chain that employment policies of the past are out of date.
The old assumptions are bypassed by onrushing technology and changing attitudes. Big-company jobs were once a sellers’ market: corporations picked people they wanted for jobs they defined, in return for predictable careers. Preferred candidates were “the smartest guys [mostly] in the room”, often with specialist skills. The “war for talent” was with other HR teams honing established processes for recruiting, developing and keeping the best people.
But “all that is wobbling”, says Prof Gratton. From raising capital to final marketing, every part of the business value chain can now be outsourced to technology-enabled platforms (Kickstarter, social media, 3D printing…), multiplying the forms of jobs and options for a generation already favouring start-ups, or working for themselves, over established employers.
Meanwhile, as machine intelligence, improving almost exponentially, eats into higher-level specialist jobs, intelligence by itself no longer trumps human qualities such as empathy, critical thinking and creativity, which computers cannot (yet) emulate. Relying on old employment practices is a recipe for irrelevance. The boundaries of the company have become permeable; the relationship is no longer one of adult to child but one of nurturing transient alliances and softer loyalties for reciprocal benefit.
Alongside the need to bridge missing steps in the career ladder where middle management used to be (one idea: loan out future stars to other employers, as football clubs do), at least two other yawning disjunctures lie ahead, one posing issues for society as a whole, one internal.
First, the education system, which assumes learning is an intensive, costly, one-off investment in preparation for work, at which point it stops, is out of sync with the logic of change. University-type learning is still essential — Silicon Valley needs arts graduates to supply the creative spark machines cannot — but it is not enough now that, as Prof Gratton says, “lifetime learning is beginning to be a reality”. The idea of competency-based education (building the ability to analyse, communicate and think critically, for example) is gaining ground as a foundation for further specialisation. Early experience of massive open online courses (Moocs) has been disappointing, but there is a vacuum to fill and there will almost certainly be a second coming. Educational disruption has not been eliminated, just postponed.
The other mismatch is opening up in companies, and may be as tricky to fix. Philosopher Michael Polanyi’s paradox (“we can know more than we can tell”, or our tacit knowledge of the world exceeds our ability to spell it out) posits, for now, some limits to automation — notably the qualities humans took millions of years to evolve, such as curiosity, creativity, imagination, empathy and morality. But to yield dividends, these qualities need time, patience and deep attention, all of which have been expunged from high-pressure, performance-managed offices.
“We’ve designed work that takes away the only opportunity humans have to be different from machines,” says Prof Gratton. “The very technology that makes creativity important is limiting it because of the way we’re choosing to make jobs work.”
In 1936, Charlie Chaplin set his satire on humans and technology, Modern Times, in a factory. Now the locus is the office, and as computers move up the employment value chain, the stakes get higher. “Humans are underrated,” argued a recent article in Fortune. But others think differently. The C-suite will not be a hiding place for much longer, warned a blog from a software company on Harvard Business Review: “The same cost/benefit analyses performed by shareholders against line workers and office managers will soon be applied to executives and their generous salaries.” The name of the company, in case you wondered: iCEO.