The future of computing is smaller than you imagine

Moore’s Law, the principle established by Intel co-founder Gordon Moore that says computing capacity doubles every two years, has served the IT industry in good stead. But Moore’s Law is coming under increasing pressure as developments in chip design start to collide with basic laws of physics.
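Moore’s Law is, at heart, simple compounding. A minimal sketch (illustrative only, not from the article) of the doubling principle:

```python
# Moore's Law as compounding: capacity doubles every two years.
def transistors(base_count, years, doubling_period=2):
    """Project transistor count after `years`, doubling every `doubling_period` years."""
    return base_count * 2 ** (years / doubling_period)

# A chip with 1bn transistors today, projected over the 15-year horizon
# the article mentions, would need to hold roughly 181bn.
print(round(transistors(1e9, 15) / 1e9, 1))  # → 181.0
```

Sustaining that growth is precisely what demands ever fewer electrons per stored bit.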

“One way to look at Moore’s Law is, we are using fewer and fewer electrons to store ones and zeros,” says Dick Lampman, director of HP Labs. There comes a point, he says, when miniaturisation of circuits is no longer possible. “You see quantum mechanical effects: leakage, or tunnelling, as electrons flow through the insulator.”

These effects make it much harder to build the transistors that make up the core of a microprocessor.

Scientists are divided on exactly when Moore’s Law will falter but most, including Mr Moore himself, do not expect it to hold for much more than 15 years. If the industry is to develop computing power beyond that, it will have to embrace nanotechnology.

The end of Moore’s Law may have less to do with the fundamentals of physical theory and more to do with the practicalities of engineering and manufacturing devices that are tiny. The way future microprocessors are designed might not change much – they will still be based on large numbers of tiny switches. But they will be built differently.

Today’s microprocessors are made using a photo-lithographic process, but that process is unlikely to be able to handle much smaller wires. Manufacturers have already announced chips based on 65 nanometre circuits, down from 90 just last year. But to go below 30nm is a far greater challenge. As Philip Kuekes, a researcher in Hewlett-Packard’s Palo Alto labs, points out, “at 15 to 10 nanometre channel length, we won’t be able to turn off the transistors.”

Solving this requires new materials, new manufacturing processes and new thinking on design.

Researchers working in nano-electronics are looking at two ways to make chips with very small wires. The first is the bottom-up approach: using chemicals to “grow” the chip’s structure.

The other, and the approach in use by HP researchers, is top-down. This means creating a master and then imprinting the circuit design in a soft polymer, in the way vinyl records are made. HP is drawing on its background in chemistry and printing to fine tune a process it hopes is cost-effective enough to put into production. Its researchers believe that nano-imprint lithography could even be cheaper than conventional manufacturing processes.

There is little point in developing nano circuits, however, if the transistors themselves are too small to work. Transistors have worked well in micro-electronics for the last 40 years but HP researchers believe that an alternative is needed to be effective at the nano level. The technology they have developed is the crossbar latch or array.

Here, two sets of parallel nanowires join at right angles and sandwich electrically switchable material. The junction of crossbars and switches stores a bit of information. HP has developed prototypes that allow signal restoration and inversion, or the ability to store the zeros and ones that form the basis of all computing processes.
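The geometry can be pictured as a grid: every crossing of a row wire and a column wire addresses one switch. A toy model (illustrative only, not HP’s design) of that addressing scheme:

```python
# Toy model of a crossbar array: two sets of parallel nanowires cross at
# right angles, and each junction stores one bit in the electrically
# switchable material sandwiched between them.
class CrossbarArray:
    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        # Every junction starts in the "off" (0) state.
        self.junctions = [[0] * cols for _ in range(rows)]

    def write(self, row, col, bit):
        """Set the switch at the crossing of row wire `row` and column wire `col`."""
        self.junctions[row][col] = 1 if bit else 0

    def read(self, row, col):
        return self.junctions[row][col]

grid = CrossbarArray(4, 4)
grid.write(1, 2, 1)
print(grid.read(1, 2), grid.read(0, 0))  # → 1 0
```

The appeal of the layout is density: an n-wire by m-wire crossbar yields n × m storage junctions from just n + m wires.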

The potential benefits for the IT industry are enormous. Nano chips could be thousands of times more powerful than today’s CMOS chips. HP is already working on crossbar arrays in its labs, based on 15nm wires. The technology also allows circuits to be stacked three-dimensionally. This offers the potential for memory chips that could store a petabyte of data (1m gigabytes) per square centimetre.

There are still, though, a number of serious technical hurdles. The greatest is that, at this level, it is impossible to make a “perfect” chip.

Current microprocessor fabrication techniques are based on making as many perfect chips as possible, and on the assumption that a chip’s performance will remain consistent throughout its life.

In the nano world, these assumptions no longer hold true. “There are significant architectural differences because you need radically different ways to build things at this scale,” says Stan Williams, HP Senior Fellow and director of quantum science research.

“The major problem is defects in nano-scale wires, and fluctuations. It takes just one or two atoms to drastically change the resistance of a wire. That could cause it not to work. These fluctuations are unavoidable, because of the second law of thermodynamics.”

The solution is to borrow from some of the techniques used to build the internet. The internet is based on the idea of redundancy: if one part fails, data packets take an alternative route. “We are making a defect-tolerant architecture, making a circuit that operates even though it is imperfect.”

This will be done primarily through software programming techniques. “If you design for defects, the system can effectively ignore many things that happen,” says Mr Williams. “You can operate the system in a way that degrades gracefully, so you don’t have a catastrophic failure.”
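One way such software techniques could work is sparing: defects found in testing are remapped to reserve cells, so the device keeps working and loses capacity only gradually. A hypothetical sketch (not HP’s actual software):

```python
# Illustrative defect-tolerant addressing: defective cells found during
# post-manufacture testing are remapped to spare cells, so the part still
# operates despite imperfections and degrades gracefully as spares run out.
class DefectTolerantMemory:
    def __init__(self, size, spare_count, defects):
        self.data = [0] * (size + spare_count)
        self.remap = {}               # defective address -> spare address
        spares = iter(range(size, size + spare_count))
        for addr in defects:
            try:
                self.remap[addr] = next(spares)
            except StopIteration:
                break                 # out of spares: capacity shrinks, no crash

    def _resolve(self, addr):
        return self.remap.get(addr, addr)

    def write(self, addr, value):
        self.data[self._resolve(addr)] = value

    def read(self, addr):
        return self.data[self._resolve(addr)]

mem = DefectTolerantMemory(size=8, spare_count=2, defects=[3, 5])
mem.write(3, 42)                      # transparently routed to a spare cell
print(mem.read(3))                    # → 42
```

As with the internet’s packet routing, the failure of any one component is absorbed by the architecture rather than propagated as a catastrophic fault.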

If HP’s researchers are on the right lines, nano chips offer the possibility of vastly more powerful computers, mobile phones and entertainment devices. But Mr Williams expects the first products built around nano chips to be much simpler.

“The first commercial devices are likely to be sensors,” he says. “Frankly, worries about terrorism and pollution are creating an environment where people are willing to pay a premium for a device that has enhanced attributes. The first nano devices will be expensive but there are organisations willing to pay that price.”

Copyright The Financial Times Limited 2017. All rights reserved. You may share using our article tools. Please don't copy articles from FT.com and redistribute by email or post to the web.