Detail of the “Mona Lisa”, a painting by Leonardo da Vinci © Alamy
During the Renaissance, artists turned from the profile to the three-quarters view of the face — a position that allows the viewer a more intimate and personal engagement with the sitter. The change was prompted by an increased focus on the individual, and the emergence of that sense of personhood and agency was embodied in the rise of the portrait.

We might see the “Mona Lisa” as the apotheosis of that moment. Go to the Louvre and you will observe a sea of elevated phones attempting to capture the image of the world’s most reproduced portrait. Just as those visitors are attempting to own a piece of the “Mona Lisa” on their phones, our faces are being captured and co-opted.

Now that our online data has been comprehensively mined, our behavioural surplus extracted, Big Tech is coming for our faces. The explosion in facial recognition has outpaced efforts to legislate around it and, in societies such as the UK where it is not compulsory to carry identification, we are being identified whether we like it or not.

Facial recognition algorithms are working in real time to identify people in public places, from shopping malls and football matches to protests and simply walking down the street. But because legislation has trailed the technology, our basic rights are being contested: there are no controls on whose faces are retained in databases.

Whereas there has long been some obligation on UK and US police forces to destroy DNA and fingerprint samples from those not charged with any crime, no such rules apply to our faces. The American Civil Liberties Union is suing the FBI, the Department of Justice and the Drug Enforcement Administration for those US agencies’ records to determine if there are undeclared surveillance programmes. The Metropolitan Police, meanwhile, has confirmed that live facial recognition will be introduced in London.

Facial recognition equipment installed at an intersection to shame jaywalkers in Xiangyang, Hubei Province, China © Getty

It represents a reimagining of the relationship between state and individual and of our notions that we are free to go about our everyday lives with an expectation of privacy. Perhaps because we have willingly surrendered so much of our information online and through smartphones that track and identify us, we have become blind to an arguably even bigger shift in freedom. We can choose not to use a smartphone but we cannot choose to be invisible.

Or at least that is the orthodoxy. In fact artists, designers and activists have been developing tools for stymieing facial recognition, from masks and make-up to jewellery and hats. Some are akin to the dazzle patterns employed by Vorticist artists in the first world war to camouflage ships at sea, while others are designed to confuse facial recognition programmes.

Artist Adam Harvey’s box of visual tricks for dazzle make-up will do the job, while Dutch designer Jip van Leeuwenstein developed a lens-type mask that distorts the face yet still lets onlookers see the wearer’s expressions. Less extreme is Isao Echizen’s “Privacy Visor” — LED goggles that preclude digital recognition. Other artists have developed more extreme masks, perhaps more for provocation than practicality, but the recent experiences of masked Hong Kong protesters attempting to guard against facial recognition systems suggest this might be a necessary path to protest.

Distorted: Jip van Leeuwenstein’s anti-facial-recognition mask © Emiliano Di Mauro/Jip van Leeuwenstein

The most sinister use of facial recognition is in China’s Xinjiang region, where the world’s most extensive system of CCTV cameras is designed to flag ethnic Uighurs differently from Han citizens. The idea, surely, is to make people aware that they are personally under supervision in their every public moment, and so to make them alter their behaviour. Nor is this confined to the state. Facebook has its mission inscribed in its name, while the Russian-owned FaceApp, which predicted how users might appear in older age, caused controversy by maintaining their faces on a database.

Our faces are being turned into bar codes by machines that read them as a series of relational proportions. Resistance can appear futile — last year a man in London who covered his face when police were testing facial recognition software was forced to show ID and fined, albeit for disorder when he became abusive.

There might be hope. There is always Jonathan Hirshon, the man with no face. Despite working in PR, speaking in public and even having worked for Apple, he has kept his face off the internet. He posts no images on social media and once asked friends and colleagues to tag images of unrelated things with his name to confuse search engines. There is no “canonical” image, no cross-checked reference version, so he remains invisible.

The Renaissance portrait turned the face into an enigmatic treasure, but facial recognition has converted it into a commodity. Our identities are captured, digitised, packaged and sold. Our faces, these most human of identifiers, are being taken from us both with and without our permission.

Copyright The Financial Times Limited 2020. All rights reserved.