Facebook's newly launched 'supreme court' issued its first rulings on Thursday, overturning four of five decisions to remove controversial posts from the platform.
The initial batch of rulings did not include Donald Trump's indefinite suspension from Facebook and Instagram after the storming of the US Capitol, but the board said last week it agreed to consider that case.
The four overturned decisions included a post that asserted France lacked a health care strategy and claimed that a cure for Covid-19 exists.
This post was initially removed on grounds that it contributed to "risk of imminent... physical harm." But the review board said Facebook's rule on misinformation and imminent harm was "inappropriately vague."
Another case involved nudity. An Instagram user in Brazil had posted pictures of women's nipples as part of a breast cancer awareness message.
It was removed, but the board said the photos should be allowed in light of Facebook's own policy exception for breast cancer awareness.
Also overturned was the removal of a post dramatically condemning treatment of Uyghur Muslims in China, according to the board.
"None of these cases had easy answers and deliberations revealed the enormous complexity of the issues involved," the board said in a post on its rulings.
"In one case, board members looked at whether, in the context of an armed conflict, Facebook was right to remove an otherwise-permissible post because it contained a hateful slur."
In several of the cases, members of the board questioned whether Facebook rules were clear enough for users to understand, according to the post.
The board said that more than 150,000 appeals have been submitted to the panel since it began accepting cases in October of last year.
"As we cannot hear every appeal, we are prioritizing cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies," the board said.
Facebook's oversight board is tasked with making final decisions on appeals regarding what is removed or allowed to remain on the world's biggest social network.
It is considering cases involving Nazi propaganda, hate speech, nudity, pandemic misinformation and dangerous individuals or organizations.
Launch of the panel came late last year amid rising concerns about misinformation and manipulation around the US presidential election.
The board was created at the urging of Facebook founder Mark Zuckerberg with the authority to overrule him and other top executives.
Facebook has agreed to be bound by decisions on appeals, but rulings will only apply to cases at issue and will not set precedents.
Rulings can, however, come with recommendations about changing Facebook policies.
"We believe that the board included some important suggestions that we will take to heart," Facebook content policy vice president Monika Bickert said, noting it could take more than a month to analyze the recommendations.
An activist group that mockingly named itself The Real Facebook Oversight Board challenged the legitimacy of what has been referred to as the social network's "supreme court."
Members of the Facebook appeals panel come from various countries and include jurists, human rights activists, journalists, a Nobel peace laureate and a former Danish prime minister.
"A handful of hand-picked experts, paid six figures each, ruled on a limited set of harms in a non-transparent manner with no solutions for the core threats to democracy caused by Facebook's business," the activist group said in a release.
"While we respect all the individuals on the Oversight Board, the rulings by the board as a body do not meaningfully address the many, ongoing harms facilitated by Facebook."
No board can replace independent oversight anchored in the rule of law, Stanford University Cyber Policy Center international policy director Marietje Schaake said in a tweet.
However, the fact that the oversight panel reversed four of the five content removal decisions is a sign that it does "not intend to extend Facebook much deference" and will try to "force Facebook to clean up its content moderation act," Harvard Law School lecturer Evelyn Douek reasoned in an online post.
Washington, United States | AFP