
Industry Figures on Potential Knock-Ons of Facebook’s Leaked Moderation Guidelines

London, UK
LBB’s Addison Capper catches up with iris, Noble People, Laughlin Constable and SS+K

It was revealed this week that while it’s OK to say “to snap a bitch's neck make sure to apply all your pressure to the middle of her throat” on Facebook, it is not OK to wish that “someone shoot Trump”. These and many other eye-opening moderation guidelines from the social network were brought to public attention after The Guardian published leaked documents on how Facebook decides which content should or should not be removed from its site. The task of policing a two billion-strong community is, understandably, nigh-on impossible, but a lot of what was published makes for uncomfortable reading.

What’s more, following this spring’s YouTube scandal over brands’ ads appearing next to extremist and hate-filled content, marketers are already more wary about the online context in which their ads are placed.

What effect could this have, then, on Facebook’s ever-soaring ad revenue, as brands potentially distance themselves from such controversy? LBB’s Addison Capper caught up with industry folk from iris, Noble People, Laughlin Constable and SS+K to find out. 

Matt Borchard, Media Director, Noble People

Here’s the thing: every coveted demographic and psychographic is on Facebook, and at scale. Marketers can target them based on what they are actually interested in. There’s virtually no barrier for advertisers to run there, and they have some control over where ads run - in the news feed, side panel, suggested video, mobile, desktop, etc.

When you have a platform with hundreds of millions of people in every demographic, combined with the options for targeting and continuously improving metrics, advertisers won’t leave. Year over year, Q1 revenue is up nearly 50%; growth will slow, but it’s not going to reverse completely.

The only thing that will slow revenue growth will be inventory, particularly in the US, where new user growth has slowed significantly compared to the rest of the world. Even though they are up to nearly two billion users worldwide, there are only so many ads that can be served. They need to continually crank out new products - like Instagram Stories and mid-roll ads on Facebook video - to continue growing.

They are fighting an impossible battle when it comes to graphic, violent, racist, terroristic content. Imagine a government trying to control two billion people of many different languages, cultures, and ideologies. Policing that perfectly is impossible. 

Digby Lewis, Head of Platforms and Distribution, iris

If Facebook were a country, it would be the most populated one in the world. How do you moderate humanity?

It sounds like the worst job in the world. People are the problem, not the platform - you can’t blame the ocean for sharks.

Different governments, executives and legislatures adopt different policies towards censorship, morality and acceptable versus unacceptable behaviour.

But Facebook also crosses cultural and ethnic boundaries. It’s often very difficult to define context online, especially in the feed.

Language is also evolving; we are in a post-literate age where images and iconography take on secondary meanings. The task Facebook faces is impossible. I happen to think they get it about right, or as well as they can be expected to. Human moderators should be required to adjudicate in the grey areas, perhaps on a local, culturally relevant level. It’s awful to read about the things those moderators have to sift through, which are so utterly and universally vile. Live video presents real challenges. Hopefully AI will improve to make the process more efficient, because with great power comes great responsibility.

Maggie Avram, Lead Social Strategist, Laughlin Constable

I don’t anticipate this leak having any effect on ad revenue, but brands will still be affected. Many brands have developed their own community guidelines aimed at being extraordinarily transparent about what content will be removed from their page, commonly relating to hate speech, profanity and violent content. These brand-developed moderation guidelines often work in tandem with Facebook’s own policies, relying on Facebook to remove user content that could be abusive toward their communities. This leak will likely expose greater disparity between what a brand defines as inappropriate for their communities and what Facebook includes in their own guidelines. Ultimately, I anticipate Facebook facing considerable heat from brands concerned about maintaining the strong communities they’ve worked so hard to build.

Claudia Cukrov, Senior Digital Strategist, SS+K

Facebook is the core platform for the global audience and, as such, continues to be a priority for brands and services across the world. The only real monetary risk is a consumer outcry leading to a major audience departure from the platform - and that feels unlikely, as most consumers won’t engage with this type of industry-specific story. Aside from some of the moderation guidelines around disability abuse and bullying, there’s not much I find all that surprising.