
This article was originally published on 21 August 2019.

Late last year, while researching material for a TEDx talk I was giving on biometric data, I came across an example that chilled me. In summer 2015, billboards in Hong Kong were plastered with life-sized posters of specific people’s faces: not photographs, but close likenesses generated by a computer.

The software had reconstructed these people’s faces from their DNA, extracted from bits of litter such as chewing gum and cigarette butts. The samples yielded enough data to work out characteristics such as ethnicity, face shape, eye, hair and skin colour, and freckles.

The project was an ad campaign to raise awareness of the city’s littering problem, and had been devised by ad agency Ogilvy & Mather. The ad’s relatively trivial goal cloaked a darker lesson — your biometric data, which is the most personal data you own, is increasingly being used by private companies for commercial purposes.

The catch-all term biometrics refers to your individual biological features, including your face, fingerprints, iris, voice, gait, facial expressions and DNA. Biometrics is used to identify individuals, often in the context of personal safety and security.

In the UK, retailers, hospitals, airports, museums and casinos are using facial recognition for security and access, while banks such as HSBC have unveiled voice recognition to replace traditional passwords.

But despite biometrics being hailed as a smarter, more secure alternative to passwords, the risks of misuse and hacking are enormous. This type of data is near-impossible to change because it is encoded in your biology.

Once collected, it points to you and you alone; once lost or stolen, it is open to permanent abuse. You can always rethink a password but you can’t rewrite your DNA.

Security specialists have long pointed to the fallibility of biometric systems, showing that they can be fooled, and have also outlined the risks of biometric data hacks and leaks. But last week those risks became all too real.

Facial recognition data and more than a million fingerprints were discovered in a publicly accessible database belonging to Suprema, a security company whose technology is used by banks, governments and the UK’s Metropolitan Police.

Suprema provides its biometric platform to an access control business called Nedap, which serves 5,700 organisations in 83 countries, according to a report in The Guardian. “Once stolen, fingerprint and facial recognition information cannot be retrieved. An individual will potentially be affected for the rest of their lives,” said vpnMentor, the research company that found the flaw in the database.


The risks of further large-scale biometric leaks are steadily increasing as we flood companies, large and small, with our biometric data. And just as our online browsing behaviour has become the primary currency of the internet, biometric data is increasingly being monetised. Facebook uses facial recognition on our photos to identify people in the background of images, whether they are Facebook users or not.

Smart speakers such as Amazon Echo (aka Alexa) and Google Home are moving towards individual voice recognition. One online dating start-up, Pheramor, matches up potential couples using their DNA. In July, the FT reported that Pampers and Verily, Google’s life sciences business, were designing smart nappies to collect data from infants while they sleep, wee and poo.

The harvesting of biometric data from sometimes vulnerable populations has raised concerns about the potential for mass surveillance. Privacy activists have criticised the UN’s Refugee Agency for fingerprinting refugees who enter the Democratic Republic of Congo — a practice that, they say, increases the risk of surveillance, discrimination and exploitation.

Part of the solution is ensuring that companies have stringent cyber security procedures, such as fingerprint and face hashing, where the data is encoded in a way that can’t readily be reversed.
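To make that idea concrete, here is a minimal sketch in Python of the one-way encoding principle: the system stores only a salted hash of a biometric template, so a database leak does not expose the raw data. All names here are hypothetical, and this is illustrative only; production systems use purpose-built schemes such as cancelable biometrics or fuzzy extractors, because real biometric readings are noisy and a plain cryptographic hash only matches exact inputs.

```python
import hashlib
import os

def hash_biometric_template(template: bytes, salt: bytes) -> str:
    """One-way encode a biometric template with a per-user salt.

    Illustrative sketch only: real deployments use dedicated schemes
    (cancelable biometrics, fuzzy extractors), since raw hashing fails
    when readings vary slightly between scans.
    """
    return hashlib.sha256(salt + template).hexdigest()

# Enrolment: store only the salt and the hash, never the raw template.
salt = os.urandom(16)
enrolled = hash_biometric_template(b"quantised-fingerprint-features", salt)

# Verification: a fresh reading must quantise to the same bytes to match.
candidate = hash_biometric_template(b"quantised-fingerprint-features", salt)
print(candidate == enrolled)  # True
```

Even under these simplifying assumptions, the point stands: an attacker who steals the stored hashes cannot readily reverse them into fingerprints or faces, whereas a database of raw templates, like Suprema’s, is compromised forever.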

That may entail legislation mandating that biometric data be collected and stored in ways that protect people’s privacy. But rather than relying on companies to safeguard our data, or on governments to regulate its use, it is incumbent on each of us to question the steady trickle of our biological information into the hands of profit-led corporations.

Madhumita Murgia is the FT’s European technology correspondent

Copyright The Financial Times Limited 2024. All rights reserved.