© The Financial Times Ltd 2016
FT and 'Financial Times' are trademarks of The Financial Times Ltd.
The Financial Times and its journalism are subject to a self-regulation regime under the FT Editorial Code of Practice.
“Keep Calm and Hit Her”. At first glance the T-shirt design seemed to be yet another example of the way violent misogyny has become a joke for some – a literal punchline. No wonder that earlier this month the internet was up in arms at the discovery of such a T-shirt, not from a street trader or a Soho sex shop, but on Amazon.co.uk. And “Keep Calm and Hit Her” wasn’t the only example. There was also “Keep Calm and Rape A Lot”.
When we see such T-shirts for sale through the website of one of the UK’s largest retailers, we naturally imagine that somebody designed the shirt and some buyer at Amazon said, “We’ll take it.” The fact that similar jokes are a staple of the stand-up comedy circuit makes it all the easier to believe that.
But “rape a lot” is not only outrageous, it’s an odd phrase. And what about “Keep Calm and Skim Me”? This was also among the hundreds of thousands of T-shirts available from Solid Gold Bomb, the company which sold them via the Amazon website. (The T-shirts are no longer for sale.) Solid Gold Bomb issued a profuse apology and delivered an intriguing explanation: it wasn’t that somebody had decided these T-shirts would be good business – it was that the T-shirt slogans were being suggested by a computer run amok.
The T-shirts didn’t exist in physical form: a computer algorithm suggested a range of different slogans, and if any of them were purchased the T-shirt would then be printed to order. Now you might well ask what kind of computer algorithm would include the word “rape”; or why Solid Gold Bomb didn’t check the output of the computer (or did check, but didn’t pull the offensive slogans). Well, indeed: “my algorithm made me do it” is not much of an excuse here.
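Solid Gold Bomb never published its code, but the pattern the episode implies, a fixed template filled from word lists, with each combination listed for sale and printed only on purchase, can be sketched in a few lines. The word lists here are hypothetical stand-ins:

```python
import itertools

# Hypothetical word lists; the company's actual inputs were never disclosed.
verbs = ["Dance", "Laugh", "Code", "Skim"]
modifiers = ["On", "A Lot", "Me"]

# Fill the "Keep Calm and ..." template with every verb/modifier pair.
# Each slogan becomes a product listing; nothing is printed until bought.
slogans = [f"Keep Calm and {v} {m}"
           for v, m in itertools.product(verbs, modifiers)]
```

Four verbs and three modifiers give only twelve listings, but the count is the product of the list sizes: feed in a dictionary's worth of verbs and you reach the hundreds of thousands of listings Solid Gold Bomb had, with no human reading any of them.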
The idea that men might crack jokes about rape is all too familiar. The idea that computers are algorithmically generating entire product lines is a new one on me. And while the algorithm excuse is unpersuasive in this case, the algorithms are coming. Search Amazon.co.uk for “Philip M. Parker”, a business school professor who has patented software to write dictionaries, benchmarking studies and other narrow genre books: when I tried it, I saw 113,000 results.
What next: algorithmic blasphemy? Algorithmic child pornography? Hard to say: by their nature, algorithmic processes tend to produce surprises. They can also be a force for good. Using the blind power of variation and selection, computers can design a better nozzle, a better surfboard, even a better campaign to persuade people to quit smoking. The key insight is that a very high error rate is acceptable because the successes are retained and the errors quickly discarded. But what if one of the errors goes viral instead?
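The variation-and-selection process described above, keep the successes, discard the many errors, can be sketched as a toy loop. This is an illustration of the general idea, not the method behind any of the nozzle or surfboard designs mentioned:

```python
import random

random.seed(0)  # reproducible toy run

def evolve(fitness, mutate, seed, generations=200, offspring=20):
    """Minimal variation-and-selection loop: each generation produces many
    mutated candidates, keeps the fittest, and throws the rest away.
    A very high error rate among candidates is fine, because errors
    are discarded immediately and only improvements survive."""
    best = seed
    for _ in range(generations):
        candidates = [mutate(best) for _ in range(offspring)]
        best = max(candidates + [best], key=fitness)
    return best

# Toy problem: evolve a bit string towards all ones, by random bit flips.
seed = [0] * 30
flip = lambda bits: [b ^ 1 if random.random() < 0.1 else b for b in bits]
result = evolve(fitness=sum, mutate=flip, seed=seed)
```

Almost every individual mutation makes things worse; the loop works anyway, because selection quietly bins the failures. The column's question is what happens when one of those failures is a product listing that escapes the bin and goes viral.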
One solution is to rapidly remove rogue results. But that introduces a different problem: when the environmentalist author Mark Lynas published The God Species, the book was briefly removed from the Amazon website during the launch campaign. Amazon’s site reported that it had received complaints that the product was “not as described”. Lynas suspected sabotage from his political opponents, and remains unsure as to what really happened. Complaints-based takedown procedures can themselves be abused in a world full of algorithms. (Disclosure: Lynas is a friend.)
Another option is more diligent pre-screening of results. That would have worked for the rape T-shirts. But it, too, has its own costs: millions of customers looking for legitimate niche goods may find that online retailers cannot be bothered to check the long tail of products, and turn back to the mass market.
As I watch my daughters grow up, I suspect they will have even more serious feminist concerns than offensive T-shirts designed by badly supervised computers. But something tells me we haven’t heard the last of algorithmic products.
Tim Harford is the presenter of Radio 4’s ‘More or Less’.