The Incorporated Society of British Advertisers warned that consumers are becoming sceptical of digital advertising © PA

A group of the UK’s biggest advertisers has called on Facebook and Google to establish an independent body to regulate and monitor content on both of their platforms.

Phil Smith, the director-general of the Incorporated Society of British Advertisers, said the two technology companies should adopt common policies over the detection, monitoring and removal of inappropriate content. The group’s members include Lloyds Banking Group, Unilever and Procter & Gamble.

Google and Facebook should “thrash out some common principles” over content moderation and removal that could be adopted and enforced by an independent body, which they would fund, he said.

Other social networks and tech platforms, such as Twitter and Snapchat, would also be invited to join the regulatory framework.

“At a minimum, what we’re looking for is independent oversight and reporting,” Mr Smith told the Financial Times. “This would build confidence in the platforms themselves and would be good for their reputations.”

Funding an independent body would also strengthen consumer and advertiser confidence and ward off the threat of government regulation, he said.

Pressure for regulatory action against technology platforms is increasing across Europe. Last year Germany introduced fines of up to €50m for companies that do not remove hate speech or fake news within 24 hours of receiving a complaint, and Theresa May, the UK prime minister, has called on technology groups to remove terror-related content within two hours of it appearing online.

Google and Facebook declined to comment on the ISBA proposal.

Google said recently that it would increase the number of people reviewing content on its YouTube video channel to more than 10,000 — although it declined to say how many it already employs.

Facebook recently promised to double the size of its safety and community teams to 20,000 by the end of 2018 — a move that was partly in response to the furore in the US over content associated with Russian entities that aimed to disrupt the outcome of the 2016 presidential election.

The technology companies have intensified their policing efforts amid growing public concern.

This year’s Edelman Trust Barometer found that public trust in big technology groups is declining. The fall over the past 12 months was particularly pronounced in the UK, where concerns have grown over the past year about the availability of extremist and other inappropriate content on YouTube.

Facebook said in a statement that it was taking an “aggressive approach towards illegal and inappropriate content” on its platform.

Mr Smith, a former marketing director of Kraft, said advertisers expect the big technology companies to take action because consumers are becoming sceptical of digital advertising.

“Our consumer research tells us that digital advertising is intrusive and not being trusted,” he said. Consumers “know that television advertising is regulated in some way — both the advertising and the content — but they don’t believe that to be the case in any respect when it comes to digital”.

Many big UK advertisers left YouTube in March last year when it emerged that ads from well-known brands had run alongside jihadi videos and other extremist content. Then in November, Diageo, Mars and Hewlett-Packard pulled advertising from the video channel after their campaigns appeared alongside videos featuring children and sexualised comments.

“We are applying the lessons we’ve learned from our work fighting violent extremism content over the last year in order to tackle other problematic content,” Susan Wojcicki, YouTube’s chief executive, wrote in a blog post last month.

Copyright The Financial Times Limited 2024. All rights reserved.