Scholars overseeing Facebook: Supreme Court or fairness washing?

Written by: John Morgan
Published on: 23 Mar 2021

A carnival float in Germany in 2020 depicting a judge holding the tail of an alien emblazoned with the words 'radicalisation, hate, rush, Facebook'.

Source: Getty

The biggest issues in Facebook’s future include how it resolves free-expression controversies and its governance and regulation

Facebook’s 20-person oversight board, which rules on the platform’s most controversial moderation decisions, includes a former prime minister of Denmark, a Nobel Peace Prize laureate, a former editor of The Guardian and nine academics from universities around the world. Shortly before the board handed down its first decisions in January, it was announced that the board would soon decide whether or not to reinstate Donald Trump’s Facebook account, suspended since his followers stormed Congress that month.

Given the potential ramifications of that decision, one response must now recur when board members think about their roles: gulp.

In the eyes of both supporters and critics, the issues surrounding the board are among the biggest in Facebook’s future: how it resolves free-expression controversies, including those surrounding its most powerful users, and how the company is governed and regulated.

So why has Facebook chosen so many academics to sit on the board, and what might these scholars hope to achieve?

Often described – not very accurately – as Facebook’s “Supreme Court”, the board was set up as a separate entity by the company last year to help it “answer some of the most difficult questions around freedom of expression online: what to take down, what to leave up and why”, according to the board’s website. The board’s decisions are binding on Facebook, the company says. But that applies only to the small number of user appeals the board will examine against Facebook and Instagram decisions to remove content; any recommendations it makes about wider policies are advisory only.

Nicolas Suzor, a professor of law at Queensland University of Technology, author of Lawless: The Secret Rules that Govern Our Digital Lives, is one board member.

He described his research focus as being on “how we can ensure the rules that govern the online spaces we have come to depend on are legitimate…not arbitrary”, not made “behind closed doors” and not “subject to the whim of the executives and the individual moderators at the platforms”; but instead “reflect a vision of the common good – human rights, for example”.

For Professor Suzor, agreeing to join the board “was about seizing an opportunity to put my research into practice, after a decade of yelling at tech companies to do better”, he told Times Higher Education. “This is a cool experiment – there’s no guarantee that it will work. But I really wanted to do what I could to see how much I could push Facebook to change its governance practices in line with human rights agendas.”

As an example of the grave consequences of failure, he cited Facebook in Myanmar, where The New York Times has reported that military personnel used the platform to direct a torrent of propaganda against the country’s mostly Muslim Rohingya minority over several years, resulting in what it termed “a genocide incited on Facebook”. Professor Suzor said: “Facebook entered that jurisdiction and had no idea about the local context it was working in; it had no idea how to moderate speech.”

The academics on the board are nearly all lawyers, from Stanford University, Columbia University, the University of Oklahoma, the University of the Andes, Colombia, Rio de Janeiro State University, National Chengchi University in Taiwan, the National Law School of India University and the Central European University.

“There are gaps in our demographics and life experience, subject matter expertise,” Professor Suzor acknowledged. “We’re in a process now of starting to hire another 20 people for the board.”

In terms of mechanics, after a subcommittee of the board “oversees the process of identifying the most important cases”, these are then “allocated randomly to a five-member panel of the board”, he explained. Deliberations are “quite consultative – we put it up for public comments”. Professor Suzor emphasised “how much I’d really appreciate academics getting involved in the submission process” in terms of drawing attention to key research or to work “with communities of all different types facing content and access issues”.

Is a “Supreme Court” the right model for thinking of the board’s role? “I don’t think that we’re using that language any more,” replied Professor Suzor. “We are an adjudicatory body that is the avenue of last appeal for Facebook users.”

The court metaphor “takes attention away from the fact that this is a private company”, he continued. “I don’t want to suggest we’re displacing the need for real public oversight, legislation and content standards.”

Which leads on to the concerns of critics. Julie Cohen, the Mark Claster Mamolen professor of law and technology at Georgetown University, called it “an oversight board that a) doesn’t oversee the really problematic parts of the phenomenon, and b) doesn’t have any actual authority to dictate anything”, making it “accountability theatre”.

“There is an apparatus causing content to be more or less visible” on Facebook, the result of its “behavioural profiling” and algorithms designed to “keep people on the platform” by giving prominence to content that “will get the most clicks, likes and shares”, which is often “the most outrageous” content, she observed – and the board cannot make binding rulings on any of that.

Why are there so many university scholars on the board? “It is objectivity washing, or fairness washing, or credential washing – pick your favourite,” Professor Cohen replied.

Laurence Tribe, the Carl M. Loeb university professor and professor of constitutional law at Harvard Law School, is a member of the “Real Facebook Oversight Board”, set up by prominent Facebook critics in protest against what they see as the platform’s failure to reckon with its destabilising impact on democracy and elections around the world.

Facebook is using its oversight board “as a way of generating a patina of legitimacy that cannot in fact be justified by the board’s composition or function”, Professor Tribe said.

He added: “The inclusion of university scholars…seems to me to be a transparent device for making the board seem to be objective, disinterested and deeply informed, when it is in fact none of those things.”

For such critics, the context for the creation of the board is what they see as Facebook’s wish to show a limited level of self-regulation in order to stave off regulation by governments. From Facebook’s perspective, its spending on the board is “not very much money, but the return [in avoiding regulation] is potentially quite large”, said Professor Cohen.

In response to that kind of argument, Professor Suzor said the board “isn’t the whole solution”. But he also said: “Theoretically, I see our role as really doing the bits of that [regulation] that can’t be done by governments. Governments just can’t legislate everything, and they can’t adjudicate.”

Does he know if he will be involved in the Trump case? “Sorry, I can’t comment,” Professor Suzor replied.

But on the broad issues, he said: “This is one of the reasons I joined the board and one of the things I hoped we would be able to address: the challenge of dealing with potentially dangerous speech and, particularly, potentially dangerous speech of world leaders and politicians…This is an issue where platforms have really been struggling to balance the competing interests.”

For some academics in the field, Facebook’s design – the algorithms tailored to push forward emotional content, the creation of an all-powerful advertising platform – makes it essentially unreformable. “The problem with Facebook is Facebook”, in that argument.

“I get that,” Professor Suzor said of such critiques. “I’m more pragmatic, personally,” he continued. “I think we can make improvements that will have a dramatic effect on the rights and safety of individuals around the world.”

Whether the board’s slant towards scholars brings decision-making that is truly independent of Facebook’s corporate interests, or whether its work is the equivalent of giving an ethics lecture to a great white shark, the magnitude of the Trump case guarantees that the world will be watching.