Modern smartphones, wrote the chief justice of the US Supreme Court in a recent ruling, “are now such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude they were an important feature of human anatomy”. Without the chipmaker Intel, it is possible that such earthling technologies would not even exist.
In The Intel Trinity, Michael Malone tells the story of the innovators whose tiny slivers of silicon freed computing from the shackles of the mainframe era – a feat that the technology journalist and business school professor believes justifies the accolade of “the world’s most important company”.
Through access to corporate archives and decades of his own reporting, he has put together a detail-rich account of the California-based chipmaker’s successes and stumbles since its founding in 1968.
Malone’s biography of the company and those who led it verges at times on hagiography. Co-founder Robert Noyce, an MIT-trained physicist, was “the very embodiment of all that was good about American business and entrepreneurship”. Of one of Intel’s early chips, the author writes that “with [its] introduction . . . it can truly be said that mankind changed”.
Yet for all the hyperbole, Malone has a point. Intel’s other co-founder was the brilliant scientist Gordon Moore, whose famous observation – known as Moore’s law – correctly predicted that the number of transistors on a chip, and with it the processing power of computers, would roughly double every two years. He also pushed the company to maintain the spending that kept its products improving at such a rate. As Malone writes, chips’ increasing speed has been the “metronome of modern life”.
Why Intel, and not dozens of other semiconductor companies in Silicon Valley in the latter part of the 20th century? Malone argues that Intel outdid its rivals on three fronts: better engineering, better marketing and a better ability to learn from its mistakes.
Moore and Noyce were among the smartest scientists of their generation. Moore had his law, while Noyce was a co-inventor of the integrated circuit itself, and their company employed a remarkable team of engineers.
The key problem with The Intel Trinity, however, is that Malone’s storytelling will do little to disabuse his more sceptical readers of the assumption that electrical engineering is fairly dry stuff. The executives involved are not iconoclastic characters and the minutiae of their office politics do not make for a compelling narrative, apart perhaps from Noyce’s affair with a 28-year-old employee while he was chief executive.
Malone also skimps on the industry context and colour that would enable those new to the subject to appreciate the significance of some of Intel’s decisions and discoveries. As for the great claim made in his subtitle, that will seem arguable to many – particularly at a time when Intel has seen its lead slip with the spread of smartphones and tablets largely powered by its rivals’ chips.
Nonetheless, Malone’s look inside Intel explains clearly and comprehensively what motivated the men who founded the modern technology industry and serves as a useful reminder that, even in today’s era of apps, social networks and drones, Silicon Valley still runs on silicon.
Sarah Mishkin is an FT San Francisco correspondent