© The Financial Times Ltd 2015 FT and 'Financial Times' are trademarks of The Financial Times Ltd.
June 18, 2013 12:00 am
Internet companies will be put under “unrelenting pressure” to do more to block online child abuse images, the culture secretary is set to say, as the government presses the industry to come up with a plan to stamp out illegal content.
Maria Miller has summoned 13 companies, including Google, BT and Microsoft, to a summit in Whitehall on Tuesday. They will be told that “widespread public concern has made it clear that the industry must take action”. Ms Miller will ask the companies to draw up specific plans by the autumn.
However, the search companies and internet service providers attending the 90-minute meeting say they feel the criticism is unfair because they have been blocking illegal content for years.
One executive said the summit was political “grandstanding”. None of the companies was willing to question the event openly, because of the risk of being seen as downplaying child abuse.
Industry sources said on Monday they would sign a “pledge” reiterating the industry’s “zero tolerance” of child abuse online and promising to work with the Internet Watch Foundation and other organisations to stamp out such images. “It’s window dressing,” said one industry figure. “We are pledging to do what we already do.”
Industry participants are also concerned that Ms Miller will use the summit to put pressure on them to be more active in filtering legal content on the web, such as adult pornography. In recent weeks, Ms Miller has called repeatedly for internet service providers to stop children accessing “depraved images”.
Even David Cameron has thrown his weight behind the campaign. The prime minister said at the weekend he worried when any of his three children “grab hold of the iPad” because of what they might access.
Claire Perry, Mr Cameron’s special adviser on preventing the sexualisation of childhood, has said the industry needs to do more to help parents.
Ms Perry said the four biggest internet companies had agreed to a one-click filter system that would ensure even “passive parents” could block their children’s access to pornographic images.
But while some want internet groups to impose parental filters for adult content as a default setting, the industry is determined to continue to provide filters on an “opt-in” rather than an “opt-out” basis.
Industry participants are also concerned that politicians have sown confusion by mixing the subject of illegal child pornography with the separate issue of access to legal adult pornography.
“The recent debate about child abuse images has been confused and unhelpful,” said Jim Gamble, former head of the Child Exploitation and Online Protection Centre, the national agency to tackle child abuse.
Mr Gamble, now head of security consultancy Ineqe Safe & Secure, said he had long been particularly impressed by Google’s “understanding of the issues and the amount of work they do in the background to make the internet safer”.
Over the weekend Google announced it would give a further $5m to child protection groups. Of the total, $1m will go to the Internet Watch Foundation to help it double the number of people actively searching the web for images of abuse.
The IWF has for years been central to the fight against child abuse. It receives and assesses public complaints about illegal images and then notifies its members, such as BT, Virgin, TalkTalk and Sky, which then take action to block the sites.
BT, the UK’s biggest internet provider, has been blocking child pornography since 2004 through its Cleanfeed web-filtering system. In recent years the filter has been extended to sites that facilitate music and film piracy.
Google and Microsoft, as providers of the two biggest search engines, have both developed technologies to create a “hash”, or unique fingerprint, to tag known child abuse images. This allows them to identify and block duplicate images wherever they appear, reducing the need for human involvement.
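The matching step can be illustrated in outline. The sketch below is a hypothetical simplification: production systems such as Microsoft’s PhotoDNA use perceptual hashes designed to survive resizing and re-encoding, whereas this example uses an exact cryptographic hash (SHA-256) purely to show how a fingerprint database removes the need to inspect duplicates by hand; the function names and sample data are illustrative, not from any real system.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a unique fingerprint ("hash") for a file's raw bytes.

    Note: a real deployment would use a perceptual hash so that
    trivially altered copies still match; SHA-256 only matches
    byte-for-byte duplicates.
    """
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes, known_hashes: set) -> bool:
    """Check an upload against a shared database of known fingerprints."""
    return fingerprint(data) in known_hashes

# A shared database of fingerprints of previously identified files,
# of the kind a cross-industry effort could distribute to members.
known = {fingerprint(b"previously identified file")}

print(is_known(b"previously identified file", known))  # duplicate: block automatically
print(is_known(b"a new, unseen file", known))          # unknown: route for human review
```

Because only the fingerprints are shared, companies can compare uploads against the database without ever exchanging the underlying images themselves.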
Google is developing a cross-industry database that will enable companies, law enforcement agencies and charities to collaborate better on detecting and removing child abuse images.
Microsoft said it was in “active discussions” with its peers and government on what more could be done to fight illegal content. “Tuesday’s meeting represents the next step for all of us committed to putting an end to this type of material online,” the company said in a statement.
However, the focus so far in the fight against child abuse images and videos has been on the open web. Child protection officers fear that illegal content is shifting towards private networks and hard-to-reach parts of the internet.
Through services such as Tor, which bounces internet traffic through a global network of relays, users are able to conceal their location and hide what they are downloading. The more successful that agencies become in the fight against child abuse on the web, the harder it is likely to become to find those who view or upload such material.
Additional reporting by Daniel Thomas