Let’s talk about the highest-profile conflict to date between Meta and its Oversight Board, the independent body the company established to help it navigate the most difficult questions related to policy and content moderation.
Even before the Board was established, it faced criticism that it primarily served a public-relations function for the company formerly known as Facebook.
The Board relies on funding from Meta, it has a contractual relationship with the company governing its use of user data, and its founding members were hand-picked by the company.
Adding to the perception that it’s mostly a PR project is the fact that, to date, Meta and the Board have rarely clashed. In the first quarter of its existence, for example, of 18 recommendations the Board made to Meta, the company implemented 14. And even though the Board often rules against Facebook’s content moderators, ordering removed posts to be restored, none of those reversals has generated significant controversy.
After Russia invaded Ukraine in February, Meta asked the Board to issue an advisory opinion on how it should moderate content during wartime. The war has raised a series of difficult questions, including whether users can post images of dead bodies, or videos of prisoners of war criticizing the conflict.
And in the most prominent content moderation question of the invasion so far, Meta decided to temporarily allow calls for violence against Russian soldiers, Vladimir Putin, and others.
All of this raised questions about the balance between free expression and user safety. But after asking the Board to weigh in, Meta changed its mind — and asked board members to say nothing.
In a statement, Meta said it had withdrawn a policy advisory opinion (PAO) request related to Russia’s invasion of Ukraine that it had previously referred to the Oversight Board. The decision was not made lightly, the company said; the PAO was withdrawn due to ongoing safety and security concerns.

While the PAO has been withdrawn, the company said, it stands by its actions related to Russia’s invasion of Ukraine and believes it is taking the right steps to protect speech while balancing the ongoing security concerns on the ground.
The Board said that while it understands these concerns, it believes the request raises important issues, and that it is disappointed by the company’s decision to withdraw it. The Board also said the withdrawal does not diminish Meta’s responsibility to carefully consider the ongoing content moderation issues that have arisen from this war, which the Board continues to follow. Indeed, it said, the importance of the company defending freedom of expression and human rights has only increased.
One of the most worrisome trends of the past year has been the way authoritarian regimes in general, and Russia in particular, have used the intimidation of employees on the ground to force platforms to do their bidding.
For example, last fall, Apple and Google both removed from their respective stores an app that allowed anti-Putin forces to organize before an election. In the aftermath, we learned that Russian agents had threatened their employees, in person, with jail time or worse.
Life for those employees — and their families — has only become more precarious since Putin’s invasion. First, the country enacted draconian laws outlawing honest discussion of the war. Then, the combination of those laws and sanctions from the United States and Europe pushed many platforms to withdraw their services from Russia altogether.
In the wake of Meta’s decision to allow calls for violence against the invaders, Russia said Meta had engaged in “extremist” activities. That potentially puts hundreds of Meta employees at risk of being detained.
And while the company has now successfully extracted its employees from the country, the “extremist” designation could mean that they will never be permitted to return so long as they work at Meta. It could also mean that employees’ family members still in Russia are subject to persecution.
Meta had asked for a fairly broad opinion about its approach to moderation and Russia. The Board has already demonstrated a willingness to make expansive policy recommendations, even on narrower cases submitted by users.
But after asking for the opinion, the company’s legal and security teams became concerned that anything the Board said might somehow be used against employees or their families in Russia, either now or in the future.
Technically, the Oversight Board is a separate entity from Meta. But few outside observers recognize that distinction, and company lawyers worried that Russia wouldn’t, either.
Compounding the problem is the fact that tech platforms have gotten little to no support to date, from either the US or the EU, in their struggles to keep critical communications services running in Russia and Ukraine. It’s not clear what western democracies could do to reduce platforms’ fears about how Russia might treat employees and their families. But conversations with executives at several big tech companies over the past year have made it clear that they all feel like they’re out on a limb.
Still, the news deals a significant blow to the Oversight Board’s already fragile credibility, and arguably diminishes its value to Facebook. The company spent several years and $130 million to develop an independent body to advise it on policy matters. To ask that body for its advice — advice that would not even be binding on the company — and then to decide belatedly that such advice might be dangerous calls into question the point of the whole enterprise. If the Oversight Board’s only role is to handle the easy questions, why bother having one?
Facebook and the Board declined to comment to me beyond their statements. However, it’s worth noting that despite the reversal here, the company has stood up to Russia in some significant ways — including standing by its decision to let Ukrainians call for Putin’s death. Meta could have caved to Russia on that one, and chose not to.
At the same time, it’s fair to say that Facebook executives failed to properly understand risk and public perception at a crucial moment. Russia has been intimidating platform employees since at least last September. Whatever danger there was to employees and their families existed well before the moment Facebook sought an opinion from its Board. To figure that out only weeks later… well, talk about oversight.
I’m on record saying that the Oversight Board has changed Facebook for the better. And when it comes to authoritarians threatening platform employees, tech companies have distressingly few options. As in so many other cases, the Russia question was genuinely a no-win situation.
But that doesn’t mean it won’t have collateral damage for both Meta and its Board. Critics always said that if the stakes ever got high enough, Facebook would blink and make all the relevant decisions itself. And then Vladimir Putin went and invaded his neighbor, and the critics were proven right.