A group with powers to rule on which content should stay up on Facebook started to take cases Thursday, but is not likely to make binding decisions on the world’s largest social network until after the upcoming U.S. election.
Facebook’s Oversight Board — formed earlier this year and made up of 20 independent members, including law professors, former politicians and human rights experts — will have the power to determine whether content that Facebook removed from its platform should be reinstated, or whether material that has been left up breaks the company’s standards.
Its action comes as the company is facing greater political scrutiny around the globe about how it polices users’ posts and videos, often around elections and the ongoing COVID-19 pandemic.
In recent weeks, Facebook has been at the center of a political storm in the United States after removing a post from U.S. President Donald Trump that the company said spread falsehoods about the coronavirus. U.S. lawmakers also voted Thursday to subpoena the chief executives of both Facebook and Twitter over alleged censorship of conservative voices on their platforms.
In France, Germany, the United Kingdom and the EU, policymakers are all pursuing new legislation to hold the U.S. tech giant more responsible for what its users post online.
Still, Brent Harris, Facebook’s director of governance and global affairs, told reporters Thursday that the company did not plan to submit cases to the independent group before the U.S. election. Instead, the Oversight Board will spend the next few weeks deciding which cases to investigate, prioritizing decisions on issues such as misinformation, online nudity and fact-checking that would set precedents applicable across Facebook’s global platform.
“It’s no understatement that disinformation on Facebook is of great concern to the board,” said Helle Thorning-Schmidt, the group’s co-chair and former Danish prime minister, who will also oversee which cases will be taken up for review. “I expect this to be an issue that will feature in cases that come before the board.”
As part of that process, Facebook users will be able to appeal content moderation decisions to the independent group when they believe their posts have been unfairly treated by Facebook, mostly when the company has removed the material for allegedly breaking its community guidelines. They will have 15 days after the company makes an initial ruling to submit an appeal. Facebook can also ask the outside organization to rule on issues that it has flagged, and decisions are expected to be finalized within 90 days of a complaint’s initial submission.
Each decision will be made by a panel of five of the group’s members, and Facebook must comply with the panel’s ruling on how the specific content should be treated. The goal is to create a body of quasi-case law that can then be applied to similar future cases.
“We will be developing over time as we see what the volume of cases that are appealed might be,” said Jamal Greene, a law professor at Columbia University. “We’ll start off with an initial round of cases.”
On a call with reporters Thursday, members of the independent group and Facebook’s representative declined to comment on whether cases involving Trump’s recent social media activity would be sent for review. They stressed that this was a global roll-out and would apply to tough content moderation decisions affecting the company’s 2.4 billion global users. Initially, only some of Facebook’s users will be able to make appeals, with the option extending to everyone in the coming weeks, they added.
Despite starting to review appeals, the group may not take up one of the company’s most scrutinized recent moves in the United States: limiting the distribution of a disputed New York Post article alleging direct ties between Democratic presidential candidate Joe Biden and his son’s business interests in Ukraine, pending a possible third-party fact check.
The board’s spokesperson John Taylor told POLITICO on Tuesday that under the group’s current bylaws, users would not be able to appeal that decision to the board because of the nature of the company’s action in that case, which involved limiting how the New York Post article could be shared online. Taylor said the board “can review only whether [the article] should have been excluded from factchecking because it was opinion or political speech.” He added that Facebook had not referred the matter to the Oversight Board.
Thorning-Schmidt said that the group’s work did not stop Facebook from making hard decisions about what material should be removed from its platform, and that it would take time for its members to properly adjudicate cases likely to have a long-term effect on the social network.
“The Oversight Board does not exist to prevent Facebook from rapidly responding to content issues in real time,” she said. “The ability of the board to hear expedited cases doesn’t excuse Facebook from its responsibility to act.”
Cristiano Lima contributed reporting from Washington, DC.