A moderator featured in the film The Cleaners: ‘sobering viewing’.

Facebook's burnt-out moderators are proof that it is broken

John Naughton

Despite employing a small army of contractors to monitor posts, it’s clear the company is no longer fit for purpose

Way back in the 1950s, a pioneering British cybernetician, W Ross Ashby, proposed a fundamental law of dynamic systems. In his book An Introduction to Cybernetics, he formulated his law of requisite variety, which defines “the minimum number of states necessary for a controller to control a system of a given number of states”. In plain English, it boils down to this: for a system to be viable, it has to be able to absorb or cope with the complexity of its environment. And there are basically only two ways of achieving viability in those terms: either the system manages to control (or reduce) the variety of its environment, or it has to increase its internal capacity (its “variety”) to match what is being thrown at it from the environment.
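
Stated a little more formally (a minimal sketch of the law in its common logarithmic form; the symbols below are shorthand of my own rather than Ashby’s original notation):

    % Ashby's law of requisite variety, in its usual shorthand form.
    %   V_D : variety of disturbances thrown up by the environment
    %   V_R : variety (internal capacity) of the regulator, or controller
    %   V_O : variety of outcomes the regulated system still exhibits
    \[
      V_O \;\geq\; \frac{V_D}{V_R}
      \qquad\text{equivalently}\qquad
      \log V_O \;\geq\; \log V_D - \log V_R
    \]

Read that way, the only routes to an acceptable outcome (a small V_O) are the two just described: shrink V_D, or grow V_R.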

Sounds abstruse, I know, but it has a contemporary resonance. Specifically, it provides a way of understanding some of the current internal turmoil in Facebook as it grapples with the problem of keeping unacceptable, hateful or psychotic content off its platform. Two weeks ago, the New York Times obtained a leak of 1,400 pages from the rulebooks that the company’s moderators are trying to follow as they police the stuff that flows through its servers. According to the paper, the leak came from an employee who said he “feared that the company was exercising too much power, with too little oversight – and making too many mistakes”.

An examination of the leaked files, says the NYT, “revealed numerous gaps, biases and outright errors. As Facebook employees grope for the right answers, they have allowed extremist language to flourish in some countries while censoring mainstream speech in others.” Moderators were instructed, for example, to remove fundraising appeals for volcano victims in Indonesia because a co-sponsor of the drive was on Facebook’s internal list of banned groups; a paperwork error allowed a prominent extremist group in Myanmar, accused of fomenting genocide, to stay on the platform for months. And there was lots more in this vein.

Some numbers might help to put this in context. Facebook currently has 2.27bn monthly active users worldwide. Every 60 seconds, 510,000 comments are posted, 293,000 statuses are updated and 136,000 photos are uploaded to the platform. Instagram, which allows users to edit and share photos as well as videos and is owned by Facebook, has more than 1bn monthly active users. WhatsApp, the encrypted messaging service that is also owned by Facebook, now has 1.5bn monthly active users, more than half of whom use it several times a day.

These figures give one a feel for the complexity and variety of the environment that Facebook is trying to deal with. In cybernetic terms, its approach to date has been to boost its internal capacity to handle the variety – the torrent of filth, hatred, violence, racism and terrorist content – that comes from its users and is funnelled through its servers. In the beginning, the CEO, Mark Zuckerberg, went for the standard Silicon Valley line that there is a tech solution for every problem – artificial intelligence (AI) would do the trick – although he had to concede that the technology was not sophisticated enough to do the job just yet.

As criticism mounted (and the German Bundestag began to legislate), the company went on a massive drive to recruit human moderators to police its pages. Facebook now employs 15,000 of these wretches, the cost of whom is beginning to eat into profit margins.

Many if not most of these moderators are poorly paid workers employed by external contractors in low-wage countries such as the Philippines. They have to implement – in split seconds – the confusing guidelines that were leaked to the NYT. One of the most useful aspects of the documents is the way they illustrate the impossibility of the task. The guidelines, says the paper, “do not look like a handbook for regulating global politics. They consist of dozens of unorganised PowerPoint presentations and Excel spreadsheets with bureaucratic titles like ‘Western Balkans Hate Orgs and Figures’ and ‘Credible Violence: Implementation standards’.”

A trailer for Facebook moderation documentary The Cleaners

If you want to see what this kind of work involves, then a recent documentary, The Cleaners, filmed with the cooperation of Facebook moderators in Manila, makes sobering viewing. It shows that they have an impossible job and have to work under fierce time pressure to meet their employer’s performance targets. Five seconds to make a judgment, thousands of times a day. And at the end of the shift, they go home, morally and physically exhausted.

These are the people who process Facebook’s waste so that nothing unclean appears in the news feeds of more affluent users in other parts of the world. To anyone with a moral compass, the fact that humans should have to do this kind of work so that a small elite in Silicon Valley can become insanely rich is an outrage. To a cybernetician, though, it is merely confirmation that Facebook is no longer a viable system.

What I’m reading

Don’t bother me now
Are digital distractions the reason for the productivity gap? This is the subject of a thoughtful post by Dan Nixon on the Bank of England’s Bank Underground blog.

As ye sow, so shall we weep…
Take a look at Cloudscene’s interesting guide to server farms (otherwise known as data centres) in Europe. There are more than you’d think…

How did we get there from here?
Answers can be found in Morgan Housel’s bracing (and breakneck) potted history of the United States in the postwar era – in one longish blog post.

