© The Financial Times Ltd 2015 FT and 'Financial Times' are trademarks of The Financial Times Ltd.
August 24, 2014 5:01 pm
Even hardened journalists warned people not to watch the video, not even to share stills from it. One simply wrote, “oh, dear God, no”. But minutes after it was posted, images of James Foley’s decapitation had spread. People searching for information about the American journalist’s death were confronted with an appalling scene of carnage. Eventually, YouTube took the video down. Twitter, acting on a request from Foley’s family, also purged the images, but only those that users had flagged.
During the past year the Islamic State of Iraq and the Levant, also known as Isis, has posted videos of the stoning, beheading and torture of thousands of Iraqis and Syrians, whose families cannot supply Twitter with the evidence it requests, or repeatedly flag videos on YouTube.
Of far more import than the sensitivities of bereaved families, however, is the fact that these postings will lead to further killing of innocents. Isis depends on sophisticated use of social media to terrorise the masses, and to lure recruits. It floods social media with especially graphic material, combined with threats, as a prelude to major offensives, hoping to disperse resistance through fear. Its Twitter app, “The Dawn of Glad Tidings”, games the site’s algorithms by making automatic posts from supporters’ accounts, inflating its online footprint.
The crisp London accent of Foley’s beheader marked him as one of thousands the US state department reckons Isis has lured from 50 countries with the aid of social media. Some of the group’s videos resemble popular video games, such as Call of Duty or Halo, in which the player aims to shoot pixelated adversaries dead.
When the journalist Daniel Pearl was beheaded by jihadis in 2002, the video caused outrage, but it was not as widely viewed. Footage of the beheading of American businessman Nicholas Berg in 2004 was released on a Malaysian website, which was quickly shut down. But in that era social media barely existed. Two billion fewer people were online. Now extremist groups are their own media outlets.
Most Isis social media accounts remain undisturbed. They attract media attention, and censoring them is costly and perhaps offends against the ideal of free expression, however vile. When posts are taken down, it is done on an ad hoc basis. This is no longer enough. Major platforms should not allow themselves to become vehicles for the easy dissemination of propaganda that depicts staged murder. Shutting down Isis accounts will not remove this material from the internet. But it will make it harder to find and harder to distribute.
This does not mean censoring all graphic content. The most disturbing footage of Eric Garner being killed last month by a New York police officer’s chokehold while he repeatedly pleaded “I can’t breathe” did not make it on to the television news. But websites need not impose blanket bans on content of public interest, however upsetting.
In the US, legal restrictions on speech generally apply only to incitement of imminent violence or copyright violations. Had Isis used unauthorised copies of Beyoncé tracks, the videos would quickly have been deleted.
In the UK, home of many Isis recruits, Scotland Yard warned that “viewing, downloading or disseminating that video risks prosecution for a terrorist offence”. But this is an overreaction that criminalises the acts of ordinary citizens. Even if social media platforms, because of their power and reach, should be treated as crucial public spaces when it comes to questions of censorship, they can still make a principled distinction between evidence exposing wrongdoing and snuff films used as propaganda, and carve out a narrow, explicit and enforced ban on the latter.
Given the complexity of this, and the importance of maintaining free speech, platform companies should establish ethics boards with authority over questions of censorship and archiving.
Today’s social media is mostly a boon to free speech, and it should remain so. This does not require that we give free rein to groups that commit crimes against humanity while breathing the oxygen of global publicity delivered at the click of a button.
The writer is an assistant professor at the University of North Carolina, Chapel Hill