A stream of binary code displayed on a laptop screen in London, December 2015 © Chris Ratcliffe/Bloomberg

Is the computer the least efficient machine humans have ever built? Technology journalists often unthinkingly pick up a narrative of progress in which each generation of technology is an improvement on the last, from abacus to iPhone. We marvel that we carry more computing power in our pockets than was used to put a man on the moon in 1969.

What we have at our fingertips is smaller, faster and more complicated than before. But is it necessarily better?

In his new book The Bleeding Edge, Bob Hughes, an activist and former academic, takes a refreshingly critical look at assumptions about technology — the subtitle is “Why technology turns toxic in an unequal world”.

It has already become fashionable to question the accepted narrative of capitalism — thoughts perhaps best crystallised by Thomas Piketty’s Capital in the Twenty-First Century. We are living at a time when inequality in incomes and living standards is rising. But at least we have iPhones, the thinking goes. There may be collateral damage from technological progress, but the end product elevates us all.

We can read stories about how small farmers use WhatsApp to find buyers for their crops or how South American fishermen use their mobile phones to check spot prices, and we assume that technology overall is moving humanity forward.

Mr Hughes unpicks some of this thinking. From a historical perspective, technological progress has not always resulted in the betterment of humanity. Take the spread of watermills for grinding corn, which began in Europe around AD1000. Watermills have long been presented as an example of enlightened development, enabling people to grind much larger quantities of grain at once.

Yet milling by hand preserves more nutrients in the grain than mechanised milling, and a move to watermills — generally owned by feudal lords — was imposed by force on a reluctant peasant population. Around this time the average height of European peasants began to decrease, indicating a worsening diet.

In the computer age, we are similarly spun into cycles of obsolescence and upgrades that benefit us little but which are difficult to opt out of. Anyone still mourning the loss of their BlackBerry to an iPhone may feel a stab of sympathy when they read Mr Hughes.

The economics of microchip production — where factories must operate at enormous scale and only the very latest products make a profit — dictates a relentless pace of device upgrades, regardless of what consumers really need.

Understanding this helps to explain the mysterious “productivity paradox” — the fact that all the new computer and mobile technology of the past 20 years has not led to an increase in productivity. As technology changes, employees must constantly learn new ways to perform the same tasks, which does not necessarily increase the speed at which jobs are done.

Moreover, modern computers and mobile phones — for all their functionality — are hampered by a design flaw that dates back to the 1940s: a clock that dictates that only one tiny process can happen at a time. Clocks have sped up since British codebreakers at Bletchley Park built the Colossus machine during the second world war, but the principle remains the same: only a small amount of frenetic activity happens at a time, while most of the device remains idle.

There are other routes that we could have taken with technology. Until the 1960s, around half the world’s computers were still analogue; in fact, it was analogue computers that enabled that first moon landing.

Analogue computers had many advantages. They could be more intuitive to use and, even in the 1980s, were significantly faster and cheaper than their digital rivals. They could also be made from a variety of materials. The Monetary National Income Analogue Computer (Moniac), built in 1949, used water to model aspects of the UK economy. Another system, developed at the University of the West of England’s International Centre of Unconventional Computing, was based on slime mould.

Mr Hughes, who co-founded a group campaigning for migrant rights in 2003, is more of an activist than a technologist. He does not offer a clear way out of the technological dead end he describes — apart from a move away from the current capitalist system.

There are many small acts of technological subversion Mr Hughes might have mentioned: villages that set up their own broadband internet connections when commercial companies deem it uneconomic, individuals running peer-to-peer alternatives to the conventional internet, or companies such as Fairphone that build mobile phones designed to last longer because their components are easy to replace.

Such projects are small, obscure and often poorly funded, but they exist, and an assessment of whether they might offer viable alternatives would have been interesting.

Still, the author provides a useful counterpoint to the breathless extolling of the latest gadget, and the book is worth a read for that reason alone.

The Bleeding Edge, by Bob Hughes, New Internationalist Publications, RRP £10.99, 336 pages
