Cerebras computer chip
The Cerebras processor is a single square chip cut from a 300mm silicon wafer

The race among semiconductor makers to gain an edge in the booming market for specialised AI processors has just given rise to the world’s biggest computer chip.

While chip circuitry continues to get smaller, the slab of silicon, developed by Californian start-up Cerebras, has a surface area slightly larger than a standard iPad and is 56 times bigger than its closest competitor’s. It also eats up as much electricity as all the servers contained in one and a half racks — the towers of computers in data centres that stand more than six feet tall.

The mammoth chip, due to be unveiled on Monday after nearly four years of development, is the starkest sign yet of how traditional thinking is being turned on its head as the chip industry struggles with the demands of artificial intelligence.

It also highlights giant leaps in the amount of computing power that are being thrown at the most complex AI problems — something that prompted US research group OpenAI to raise $1bn from Microsoft last month, hoping to ride the exponential hardware curve to reach human-level AI.

Most chipmakers have been looking to create smaller, modular elements, known as “chiplets”, out of which today’s most advanced chips are assembled, according to Patrick Moorhead, a US chip analyst. Cerebras, by contrast, has jettisoned that conventional approach and instead come up with what is in effect an entire computing cluster on a single chip, he said.

The race to build a new generation of specialised AI chips, under way for several years, is finally reaching a critical point, with several companies — including Intel, Habana Labs and UK start-up Graphcore — either just starting or promising to deliver their first chips to customers before the end of this year. Cerebras said a number of customers were already receiving its chips, although it did not name them; the processors are likely to be best suited to the massive computing tasks undertaken by the biggest internet companies.

More than 50 companies have been trying to develop specialised chips for AI. Most of these chips are used for inference, the task of applying a trained AI system to real-world examples, rather than the far more data-intensive job of training the deep learning models in the first place. That challenge has been taken on by a handful of start-ups like Cerebras, Graphcore and Wave Computing, as well as Chinese challenger Cambricon.

The length of time it has taken for companies like these to start shipping products shows that the technical challenges were much greater than most had expected, said Linley Gwennap, principal analyst at the Linley Group, a US chip research firm. That has not prevented some of the product-less start-ups attracting high valuations. Cerebras has raised more than $200m in venture capital, with its latest round, late last year, valuing it at around $1.6bn, said Andrew Feldman, chief executive.


The need for a new type of processor for AI stems from the massive amounts of data needed to train neural networks, which are used in so-called deep learning systems to handle tasks like image recognition and language understanding.

The networks operate as giant feedback loops, recycling information as they learn to find patterns in the formless data. The computing “cores”, or brains, in the chips needed for this work are relatively simple compared with the cores in general-purpose CPUs, which have to handle many different computing tasks. But the chips’ makers must find ways to harness vast numbers of these cores so that they cut the time it takes to train a large deep learning model, while saving on electricity, and hence cost.
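To make the idea concrete, here is a toy sketch in Python of that kind of feedback loop, with the work of each training step split across many simple workers standing in for cores, each handling a slice of the data and contributing a partial result. The model, names and numbers are invented for illustration; this is not Cerebras’ software or any chipmaker’s API.

```python
# Toy sketch only: a training loop whose work is split across many simple
# "cores". Each core computes a partial gradient on its slice of the batch;
# the partial results are then combined into one update. All names and
# shapes here are invented for illustration.
import numpy as np

def core_gradient(weights, x_slice, y_slice):
    """Work done by one simple core: a forward pass and the gradient on its slice."""
    predictions = x_slice @ weights              # a linear model stands in for a network
    error = predictions - y_slice
    return x_slice.T @ error / len(x_slice)     # gradient of the mean squared error

def train_step(weights, x_batch, y_batch, n_cores=8, learning_rate=0.01):
    """Split the batch across cores, combine their gradients, update the weights."""
    x_slices = np.array_split(x_batch, n_cores)
    y_slices = np.array_split(y_batch, n_cores)
    grads = [core_gradient(weights, xs, ys) for xs, ys in zip(x_slices, y_slices)]
    return weights - learning_rate * np.mean(grads, axis=0)

# The feedback loop: repeat the step many times as the model homes in on the pattern.
rng = np.random.default_rng(0)
true_weights = rng.normal(size=4)
x = rng.normal(size=(1024, 4))
y = x @ true_weights
weights = np.zeros(4)
for _ in range(200):
    weights = train_step(weights, x, y)
```

The sketch also hints at the cost the chipmakers are fighting: on real hardware, moving each slice of data to a core and each partial result back again takes time and power, which is why so much of the design effort goes into keeping data close to the cores.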

Taking the challenge to its logical extreme, Cerebras has carved a single, square chip out of a 300mm diameter circular wafer, the largest silicon disc that can be produced in today’s chip “fabs”, or factories. Wafers are normally sliced up into dozens of individual chips, because the technology does not exist to etch circuitry into anything much bigger than a large postage stamp. Cerebras has sought to overcome this limitation by connecting the many separate sections of the wafer, known as dies, enabling them to communicate directly with each other, in effect turning the entire silicon plate into one massive processor.


Most of the companies building specialised deep learning chips have adopted designs that push data into computing memory that sits alongside the chip’s many processing cores, so that they can handle tasks with the minimum possible delay and use only a tiny amount of energy to shuttle information back and forth.

The next task is to link cores in a matrix pattern so that they can communicate with each other, like the synapses in the brain. By connecting 400,000 cores, Cerebras claimed to have taken this approach to its ultimate level, while keeping the efficiency of handling all the processing on a single chip.
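As a rough mental model of that matrix idea, and not of Cerebras’ actual interconnect, the toy Python below lays cores out as cells in a small grid, each holding a value in its own local memory and exchanging it only with its immediate neighbours. The grid size and update rule are invented for illustration.

```python
# Toy sketch only: cores arranged in a 2D mesh, each with its own local value,
# communicating only with immediate neighbours. Edges wrap around here purely
# to keep the example short; the numbers are invented and far smaller than the
# 400,000 cores on the real chip.
import numpy as np

GRID = 16                                      # a 16 x 16 toy grid of cores
local_memory = np.random.rand(GRID, GRID)      # one value held next to each core

def neighbour_exchange(values):
    """Each core averages its value with its north, south, east and west neighbours."""
    total = values.copy()
    count = np.ones_like(values)
    for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
        total += np.roll(values, shift, axis=axis)   # value arriving from one neighbour
        count += 1
    return total / count

# A few rounds of purely local exchanges spread information across the whole
# grid without any data ever leaving the "chip".
for _ in range(10):
    local_memory = neighbour_exchange(local_memory)
```

The point of the toy is the communication pattern: every step involves only short hops between adjacent cores, which is the property that makes keeping everything on one piece of silicon attractive.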


Intel, which bought AI chip start-up Nervana three years ago as the AI race was getting started, has been trying to achieve the same effect by networking together many individual chips in a vast array. That work has brought a breakthrough in the past six months and will bring much greater efficiency to systems due out later this year, said Amir Khosrowshahi, the chief technology officer of Intel’s AI efforts.

However, even if Intel succeeds in linking thousands of cores into a huge matrix-like system, it is unlikely to be as efficient as the Cerebras processor, given the inbuilt advantage Cerebras gains from placing everything on a single chip, said Mr Moorhead.

This article has been amended to reflect the fact that Cerebras’ chip is 56 times bigger than its closest competitor’s, not 80

Copyright The Financial Times Limited 2019. All rights reserved.