BOSTON — Facebook said Tuesday that it expects to name the first members of a new quasi-independent oversight board by year-end.
The oversight panel is intended to rule on thorny content issues, such as when Facebook or Instagram posts constitute hate speech. It will be empowered to make binding rulings on whether posts or ads violate the company’s standards. Any other findings it makes will be considered “guidance” by Facebook.
CEO Mark Zuckerberg announced plans to establish the board last November after Facebook came under intense scrutiny for failures to protect user privacy and for its inability to quickly and effectively remove disinformation, hate speech and malign influence campaigns on its platform.
“Facebook should not make so many important decisions about free expression and safety on our own,” he wrote at the time.
Critics call the oversight board a bid by Facebook to forestall regulation or even an eventual breakup. The company faces antitrust investigations by the Federal Trade Commission, Congress and a group of state attorneys general.
“Facebook is attempting to normalize an approach to containing hate speech internally,” said Dipayan Ghosh, a former Facebook policy adviser and a fellow at Harvard’s Kennedy School. “If it can illustrate that this approach can work, it can pacify the public itch to regulate the business model behind Facebook.”
Luigi Zingales, a University of Chicago professor of finance, called the board’s creation “a clever move” that’s more about appearance than substance.
“It’s hard to imagine that this board will not be completely captured by Facebook,” said Zingales, who co-chaired a committee of more than two dozen prominent academics that published a report Tuesday on how to rein in digital platforms. To avoid that, at least some of its members would need to be chosen by outsiders, he said.
The multinational board will eventually comprise 40 members, who will collectively decide a few dozen cases a year, company executives told reporters in a conference call. It will at first hear only cases initiated by Facebook but will begin hearing appeals initiated by users in the first half of 2020, the company said. It will get to work as soon as 11 members are named.
Priority cases will involve content that “threatens someone else’s voice, safety, privacy, dignity or equality” and affects a large number of people, Facebook said in a blog post Tuesday.
Experts say the panel's decision-making power will be limited, however. Local laws or directives from repressive governments might clash with its rulings, and Facebook might heed them for business reasons.
“How to deal with authoritarian regimes is a deep issue for the platform, and for the world really,” said Harvard law student Evelyn Douek, an Australian expert on content moderation.
Douek says the group’s charter, also released Tuesday, should insulate board members from public pressure and Facebook’s commercial imperatives. But she believes the conditions under which members could be removed are still too vague.
The first few board members will be directly chosen by Facebook; they will then choose additional members. Facebook will also name the administrators of the trust that manages the Oversight Board and pays its members’ salaries.
Brent Harris, Facebook’s director of governance, told reporters the company had not yet decided how much board members would be paid. He did not say how many hours a week the part-time job would require. Facebook expects panelists will include former judges, editors, publishers and journalists, he said.
The board members’ access to Facebook data will also be limited.