The main engine of growth for the digital economy has been fed by a seemingly bottomless well. But one day the fuel supply will run dry.

This week brought a reminder of that, with Intel’s warning that the arrival of its next generation of semiconductors will be six months late.

Rather than doubling the number of transistors on an integrated circuit every two years — the rate of progress that has held for much of the time since Intel co-founder Gordon Moore made his famous prediction 50 years ago — the period has stretched out to two and a half years.

This slowing of Moore’s Law may not sound like much. But when you are dealing with exponentials, small differences matter. It implies that the chip industry’s gift to the world over the next decade, in terms of extra computing power, will be only half what it otherwise would have been.
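The arithmetic is easy to check: a decade allows five doublings at the old cadence but only four at the new one, and losing a single doubling halves the cumulative gain. The short Python sketch below (an illustration of the article's sums, not anything from Intel) makes the point.

```python
# Back-of-the-envelope check: cumulative transistor growth over a
# decade under the classic two-year doubling versus the stretched
# two-and-a-half-year cadence described in the article.

def growth_over(years: float, doubling_period: float) -> float:
    """Factor by which transistor counts multiply over `years`."""
    return 2 ** (years / doubling_period)

decade = 10
old_pace = growth_over(decade, 2.0)   # Moore's classic cadence
new_pace = growth_over(decade, 2.5)   # the stretched cadence

print(f"2.0-year doubling: {old_pace:.0f}x over a decade")  # 32x
print(f"2.5-year doubling: {new_pace:.0f}x over a decade")  # 16x
print(f"Ratio: {new_pace / old_pace:.2f}")                  # 0.50, i.e. half
```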

The sky is not falling. The technology sector has already been working on ways of mitigating a slowdown in Moore’s Law that has been apparent for a number of years. But Intel’s latest warning highlights a set of forces that will have a deeper effect on the chip industry and the wider technology sector — and, ultimately, the rest of the world.

Rather than solving computing problems with brute force — a strategy that made sense when the next generation of more powerful chips was always just round the corner — the slowdown has been forcing a rethink of how the force is applied.

Chip designs optimised for specific computing tasks have been one response. Graphics processing units, which break data-intensive tasks into separate strands that can be processed in parallel, and field programmable gate arrays, chips that can be reprogrammed for specific purposes, are among the main growth markets.
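The data-parallel idea GPUs exploit can be sketched in a few lines. The toy Python example below (a CPU-side illustration only; real GPU programming uses frameworks such as CUDA or OpenCL, and the function names here are invented for the sketch) splits one data-intensive job into independent strands and works on them at once.

```python
# Toy illustration of data parallelism: split a large job into
# independent strands and process them concurrently, then combine.
from multiprocessing import Pool

def process_strand(chunk: list[int]) -> int:
    # Stand-in for per-element number crunching.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Break the data set into four independent strands.
    strands = [data[i::4] for i in range(4)]
    with Pool(4) as pool:
        partials = pool.map(process_strand, strands)  # run in parallel
    print(sum(partials))  # same answer as the serial computation
```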

This has also been one of the biggest drivers behind the chip industry’s M&A boom, including Intel’s $16.7bn offer for FPGA maker Altera.

A second result for the chip sector has been a narrowing of research and manufacturing to a smaller group of players able to keep pace with the science and the spiralling capital costs of equipping each new generation of chip “fab”, or factory. Only Intel, IBM (whose work feeds an alliance with Samsung and GlobalFoundries) and TSMC are still in the research game.

Ironically, the harder it becomes to keep pace with Moore’s Law, the more Intel’s strategy pays off. It has always counted on being first to the next generation of chips. It only needs to keep its nose in front, even as the overall race slows down; others will struggle to keep up.

Meanwhile, the rest of the technology world has been adapting. A shortage of software engineers capable of getting the most out of the new chip designs has been one bottleneck. But as Patrick Moorhead, a chip analyst, says: “Adding more people to the programming armies is easier than changing the laws of physics.”

Companies at the leading edge of computing, such as Google, have been working out how to optimise their data centres to eke out improvements in digital productivity.

At some point, however, the underlying slowdown will be felt. Mr Moore’s latest guess is that in a decade or so, the change in pace will become noticeable.

Scientific breakthroughs may offer some hope. IBM has trumpeted research into materials such as graphene, as well as the theoretical promise of almost limitless computing power from quantum chips. There is also potential in completely new designs such as IBM’s SyNAPSE chip, a processor that emulates the human brain. But the practical value of experiments like these is impossible to predict.

The exponential advances in processing power have been behind the technology world’s periodic discontinuities — like the arrival of smartphones, made possible by a massive leap in computing power. If such upheavals take longer to arrive, it will be consumers who bear the cost.

Predictions based on this exponential growth, courtesy of Moore’s Law, also lie behind some of the more expansive visions of how technology will change the world. For instance, Ray Kurzweil, a futurist who works at Google, has predicted a moment when the intelligence of computers outpaces that of humans, leading to a merging, some time around 2045, of man and machine — a moment he calls the Singularity.

If Moore’s Law really is running out of gas, visions like this will have to be put on hold. But at least it will give humanity a while longer to enjoy its own exceptionalism.

richard.waters@ft.com
