The Electronic Frontier Foundation (EFF) and more than 70 human and digital rights groups called on Mark Zuckerberg to add genuine transparency and accountability to Facebook’s content removal process.
The groups are demanding that Facebook clearly explain how much content it removes and give all users a fair and timely appeals process to restore wrongly removed content. The groups argue:
“Without transparency, fairness, and processes to identify and correct mistakes, Facebook’s content takedown policies too often backfire and silence the very people that should have their voices heard on the platform.”
Politicians, museums, celebrities, and other high-profile groups and individuals whose improperly removed content can garner media attention seem to have little trouble reaching Facebook to have content restored—they sometimes even receive an apology. But the average user? Not so much. Facebook allows people to appeal content decisions only in a limited set of circumstances, and in many cases, users have no option to appeal at all.
Onlinecensorship.org, an EFF project that lets users report content takedowns, has collected hundreds of reports of unjustified takedowns where appeals were simply unavailable. For most users, EFF found, content removed from Facebook was rarely restored, and some users were banned from the platform “for no good reason.”
EFF, Article 19, the Center for Democracy and Technology, and Ranking Digital Rights wrote directly to Mark Zuckerberg, demanding that Facebook implement “common sense standards” so that average users can easily appeal content moderation decisions, receive prompt replies, and have their cases reviewed in a timely manner by a human, not an algorithm. They also want users to have the opportunity to present evidence during the review process.
The letter was co-signed by more than 70 human rights, digital rights, and civil liberties organizations from South America, Europe, the Middle East, Asia, Africa, and the U.S.
“You shouldn’t have to be famous or make headlines to get Facebook to respond to bad content moderation decisions, but that’s exactly what’s happening,” said EFF Director for International Freedom of Expression Jillian York. “Mark Zuckerberg created a company that’s the world’s premier communications platform. He has a responsibility to all users, not just those who can make the most noise and potentially make the company look bad.”
In addition to implementing a meaningful appeals process, EFF and partners called on Mr. Zuckerberg to issue transparency reports on community standards enforcement that include a breakdown of the type of content that has been restricted, data on how the content moderation actions were initiated, and the number of decisions that were appealed and found to have been made in error.
“Facebook is way behind other platforms when it comes to transparency and accountability in content censorship decisions,” said EFF Senior Information Security Counsel Nate Cardozo. “We’re asking Mr. Zuckerberg to implement the Santa Clara Principles, and release actual numbers detailing how often Facebook removes content—and how often it does so incorrectly.”
“We know that content moderation policies are being unevenly applied, and an enormous amount of content is being removed improperly each week. But we don’t have numbers or data that can tell us how big the problem is, what content is affected the most, and how appeals were dealt with,” said Cardozo. “Mr. Zuckerberg should make transparency about these decisions, which affect millions of people around the world, a priority at Facebook.”
For more information about private censorship: