Illustration © Shonagh Rae

A few months ago a senior British broker, whom I shall call “Dennis”, staged a tiny personal rebellion. Dennis sits on the board of a company that insures some of the biggest accounts at Lloyd’s of London. And in this field, as Dennis tells me, “accurate statistics are vital for the underwriters to assess risk”.

Unfortunately, the computing systems that were supposed to collect this all-important data were plagued by problems. In an effort to fix these woes, Dennis’s company duly hired an expert to devise a new, improved (and expensive) system – and called a meeting of the board to approve it.

So far, so unremarkable. But then the tale took an unusual turn. When the computing expert presented his plans, everyone on the board waved them through – except for Dennis, who declared that he would not approve them, since he had “not understood a word the computer expert had said … [The project] was delivered using the baffling gobbledegook that many computer geeks use,” he explains.

A stand-off ensued until his fellow board members eventually admitted that they had not really understood the project either and demanded the computing experts translate their plans into plain English. But Dennis still wasn’t happy. Although he is no IT expert, he insisted on sitting with the experts to watch them work, in a desperate effort to educate himself about what was really going on with all those bytes and pixels.

“My colleagues amazingly had voted on something they didn’t understand,” Dennis fumes. “[But] the important point is this: I had to be with the installers to explain exactly what I wanted out of the system and to see it through with them.”

In some senses this is merely a humdrum tale of corporate life (and Dennis is such a creature of the City establishment that he did not want his real name to be used). But in another sense, Dennis’s story should challenge us all at the start of a new year. For when we look back at 2013, one of the big themes was the regularity with which computing systems produced hugely costly glitches.

Sometimes this has been headline-grabbing. Just think, for example, of the disastrous mishaps that have plagued the healthcare exchanges launched by Barack Obama and the flight delays in December at Heathrow airport when the air traffic controllers’ computers got days mixed up with nights. Or the continued problems besetting the computers used for welfare payments in the UK – not to mention the glitches that have beset many western banks, such as Royal Bank of Scotland.

But while those dramatic stories grab public attention because they cost millions (if not billions) of pounds and dollars and cause political embarrassment, computing failures in our offices, homes or on our tablets and phones are also pervasive – and pernicious. We rely on computers more than ever, which makes us increasingly vulnerable when they let us down.

On one level, this situation is not our fault. Another great theme of 2013 was the rising tide of cyber crime and cyber warfare, and generally it is difficult for ordinary people to combat those malevolent threats. But on another level, there is something we can do about computing risks: ask hard questions about how those IT systems actually work.

This sounds obvious. But much of the time, computing experts live in a technical silo of their own, detached from the consumers who use their products, the corporate executives who buy these systems and the politicians who develop policies that rely on IT. And most of the time the non-experts complacently ignore what the geeks do, since it seems excessively dull and technical.

In this sense, the way modern society treats IT looks uncannily similar to the picture with finance before 2007: a small group of technical experts is doing something that almost nobody understands but that has the potential to affect us all. Back then, this pattern played out with subprime lending and mortgage bonds; today, it plays out with Obama’s healthcare exchanges or UK welfare, say. Either way, nobody notices the problem until it is too late.

Of course there are ways to mitigate these risks – or at least there are if you listen to an entire army of IT consultants trying to sell their services. But the first step is the simplest and the most important: like Dennis, we need to ask challenging questions, admit that we do not understand “gobbledegook” and demand answers. And that applies whether you are a humble journalist, a consumer or a CEO – or even the US president.

gillian.tett@ft.com

Copyright The Financial Times Limited 2024. All rights reserved.