It seemed like a good idea at the time, a group of outside experts who could rule on what people could and could not post on Facebook. But now, two months after its initiation, the cracks are starting to appear.
The independent body of human rights experts, free speech advocates and legal scholars has reversed Facebook’s decisions in four out of five cases. The biggest test, however, is yet to come: Facebook and the board are deliberating whether to reinstate Donald Trump to the platform, with a decision due by mid-April.
Its current in-tray reads like a catalogue of thorny issues, from Covid-19 misinformation in France and religious tensions in India to blackface in the Netherlands. Whatever the board decides, controversy is likely to follow.
Critics also claim that the board is acting too harshly in some instances while giving a free pass in others. This tension is a natural result of the Oversight Board adopting a one-size-fits-all approach to content moderation, applying a global free speech norm to Facebook when its posts are rooted in vastly different cultures.
“The Board is applying international human rights law to Facebook as if it was a country. That’s impossible. It’s the first body that’s using international human rights law to make content decisions. Now that we’re getting down to brass tacks, it’s difficult,” says Evelyn Douek, a free speech expert at Harvard University.
Under Facebook’s rules, the Board’s decisions on specific posts under review are binding for the social network, though its suggestions on how the company’s wider community standards should be tweaked are only advisory.
So far, Facebook has followed most of the group’s advice on how to update its content rules. But the Board can only review cases involving material that has already been taken down, though it is expected to gain greater powers this year.
While acknowledging the board’s work, many policymakers in the EU and the US are proposing legislation to force social media companies to take greater responsibility for what is posted on their sites. This includes the European Commission’s Digital Services Act, which would impose hefty fines on platforms that fail to remove illegal content quickly, though these regulations will not come into force until 2023.
Roger McNamee, an early investor in Facebook, argues that the Board is not independent from the tech giant and lacks the resources and power to tackle the hate speech and other harmful content posted on the site every day. Because the board cannot force changes to Facebook’s community standards, he believes, it will not be able to bring about effective change.
Then of course there is the wider global situation. As the board has found, regulating content through a universalist approach is not easy. Last year, Facebook removed a short video featuring Black Pete, a character from Dutch local tradition portrayed in blackface. The company said the post broke its hate speech guidelines because it included blackface; the poster appealed, arguing the video should be reinstated because it was dedicated to their child, who wanted it back on Facebook.
In the Netherlands, however, Black Pete is a divisive subject, with some viewing the character as inherently racist and others defending it as a Dutch tradition. The Board must now pick a side and risk alienating one group of users.
This echoes a similar dilemma the Board faced when it ordered Facebook to reinstate posts that criticised Muslims or quoted Joseph Goebbels, on the grounds that the content did not break the network’s rules, despite the outrage those posts caused among users.
Such controversies show that there is no global consensus on freedom of expression, however much global companies might wish otherwise. Should these policies be specific to the countries in which posts are made? Only time will tell.