
Facebook’s Leaked Content Moderation Documents Reveal Serious Problems

Above Photo: From Gadgets.ndtv.com

This article points to new problems with how posts to Facebook are handled, moderated, and censored. We have reported on other problems, e.g. Facebook Deleting Accounts At Direction Of US & Israel, Facebook Ranking News Sites By Trust And Combating “Propaganda” During Elections, Facebook’s Files on People, Facebook’s Threats to Our Privacy, advertising policies that stifle immigration activists, how Facebook Censored An Anti-Fascist Rally In Washington, Facebook’s censorship of news content, and how Facebook is censoring left-wing news content as well as those who stir up political debate. We have also reported on how Facebook has teamed up with US empire propagandist partners: the National Endowment for Democracy, the Atlantic Council, and the US- and NATO-funded German Marshall Fund.

We’ve reported on how Facebook censors specific news content, e.g. In Latest Fit Of Censorship, Facebook Deletes Video Detailing Brutal Legacy Of Christopher Columbus and Facebook ‘Blocks Accounts’ Of Palestinian Journalists. As this story broke, so did a story about Facebook censoring a video about violence against Palestinians.

This is a long list, but it is not everything we have reported. All of this indicates the need for a social media bill of rights that ends censorship and protects freedom of speech, freedom of the press, our privacy, and the right to appeal censorship. - KZ

A significant part of Facebook’s moderation is done from places like Morocco and the Philippines

  • Facebook’s rules are reportedly filled with gaps, biases, and errors
  • Moderators get 8 to 10 seconds to review Facebook posts
  • Moderation rules are set by Facebook employees over breakfast meetings

Facebook’s thousands of content moderators worldwide rely on a collection of unorganised PowerPoint presentations and Excel spreadsheets to decide what content to allow on the social network, a report has revealed. These guidelines, which are used to monitor billions of posts every day, are apparently filled with numerous gaps, biases, and outright errors. The unnamed Facebook employee who leaked the documents reportedly feared that the social network was exercising too much power with too little oversight, and making too many mistakes.

The New York Times reports that an examination of 1,400 of Facebook’s documents revealed serious problems not just with the guidelines, but also with how the actual moderation is done. Facebook confirmed the authenticity of the documents, though it added that some of them have since been updated.

Here are the key takeaways from the story.

Who sets the rules?
According to the NYT report, although Facebook does consult outside groups when deciding its moderation guidelines, the rules are mainly set by a group of its employees over breakfast meetings every other Tuesday. This group consists largely of young engineers and lawyers who have little to no experience in the regions for which they are writing guidelines. The rules also appear to be written for English-speaking moderators, who reportedly use Google Translate to read non-English content. Machine translation can strip out context and nuance, pointing to a clear lack of local moderators, who would be better able to understand their own language and local context.

Biases, gaps, and errors
The moderation documents accessed by the publication are often outdated, lack critical nuance, and are sometimes plainly inaccurate. For example, Facebook moderators in India were apparently told to remove any comments critical of a religion by flagging them as illegal, even though such comments are not actually illegal under Indian law. In another case, a paperwork error allowed a known extremist group in Myanmar to remain on Facebook for months.


Moderators often find themselves frustrated by the rules, saying that they don’t always make sense and sometimes force them to leave up posts that may end up leading to violence.

“You feel like you killed someone by not acting,” one unnamed moderator told NYT.

“We have billions of posts every day, we’re identifying more and more potential violations using our technical systems,” Monika Bickert, Facebook’s head of global policy management, said. “At that scale, even if you’re 99 percent accurate, you’re going to have a lot of mistakes.”

The moderators who actually review the content said they have no mechanism for alerting Facebook to holes in the rules, flaws in the process, or other threats.

Seconds to decide
While the real-world implications of hateful content on Facebook may be massive, moderators spend barely seconds deciding whether a particular post can stay up or must be taken down. The company is said to employ over 7,500 moderators globally, many of whom are hired through third-party agencies. These moderators are largely unskilled workers, working in dull offices in places like Morocco and the Philippines, in sharp contrast to the social network’s fancy offices.

As per the NYT piece, content moderators face pressure to review about a thousand posts per day, which leaves them only 8 to 10 seconds for each post; video reviews may take longer. For many, their salary is tied to meeting these quotas. Under so much pressure, moderators feel overwhelmed, and many burn out in a matter of months.

Political matters
Facebook’s secret rules are extensive, and they make the company a far more powerful judge of global speech than is commonly understood or believed. No other platform in the world has so much reach or is so deeply entangled with people’s lives, including important political matters.

The NYT report notes that Facebook is becoming more decisive in barring groups, people, or posts that it feels may lead to violence, but in countries where extremism and the mainstream are drawing dangerously close together, the social network’s decisions end up regulating what many see as political speech.


The website reportedly asked moderators in June to allow posts praising the Taliban if they included details about its ceasefire with the Afghan government. Similarly, the company directed moderators to actively remove any posts wrongly accusing an Israeli soldier of killing a Palestinian medic.

Around the Pakistan elections, the company asked moderators to apply extra scrutiny to Jamiat Ulema-e-Islam while treating Jamaat-e-Islami as benign, even though both are religious parties.

All of these examples show the power Facebook possesses to drive the conversation, and because everything happens in the background, users are not even aware of these moves.

Little oversight and growth concerns
With moderation largely taking place in third-party offices, Facebook has little visibility into actual day-to-day operations, which can sometimes lead to corner-cutting and other issues.

One moderator divulged an office-wide rule to approve any post if no one on hand was available to read the language it was written in. Facebook says this is against its rules and blamed the outside companies. The company also says that moderators are given enough time to review content and have no quotas; however, it has no real way to enforce these practices. Since the third-party companies are left to police themselves, Facebook has at times struggled to control them.


The other major problem Facebook faces in controlling hateful and inflammatory speech on its platform is the company itself. Facebook’s own algorithms highlight the most provocative content, which can sometimes overlap with exactly the kind of content it is trying to avoid promoting. The company’s growth ambitions also push it to avoid unpopular decisions or anything that might draw it into legal disputes.
