Wednesday, February 17, 2021

Opinion | Facebook’s ‘Supreme Court’ Faces Its First Major Test - The New York Times

Jameel Jaffer and Katy Glenn Bass

Mr. Jaffer is the executive director and Ms. Glenn Bass is the research director of the Knight First Amendment Institute at Columbia. The institute’s mission is to defend freedom of speech and the press in the digital age through litigation, research and education.



Facebook’s new Oversight Board is considering whether the company was justified in indefinitely suspending Donald Trump from its platform. The question is important, but it would be a mistake for the board to answer it right now, or on Facebook’s terms. To do so would effectively absolve the company of responsibility for its part in creating the circumstances that made Mr. Trump’s speech — both online and offline — so dangerous.

Facebook announced plans for its board in 2018 in response to concerns from civil society organizations and regulators about the company’s influence over public discourse online. Sometimes described as Facebook’s Supreme Court, the board comprises an impressive group of civic leaders, free speech experts and scholars from around the world. But Facebook narrowly limited the board’s jurisdiction, confining its focus almost exclusively to questions about the removal of specific pieces of content.

Content-moderation decisions can be consequential, of course. But Facebook shapes public discourse more profoundly through its decisions about the design of its platform. Its ranking algorithms determine which content appears at the top of users’ news feeds. Its decisions about what types of content can be shared, and how, help determine which ideas gain traction. Its policies and tools relating to political advertising determine which kinds of users see which political ads, and whether those ads can be countered by ads offering different viewpoints and correcting misinformation.
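To make the point concrete, here is a toy sketch of engagement-based feed ranking, written in Python purely for illustration. The Post fields, the rank_feed function and the scoring weights are all hypothetical assumptions invented for this sketch; nothing here describes Facebook's actual system. It simply shows how a single design choice, weighting predicted engagement, determines what rises to the top of a feed.

# Toy illustration of engagement-weighted feed ranking.
# All names and weights are hypothetical; this is not Facebook's system.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float    # model's estimate of click-through
    predicted_shares: float    # model's estimate of reshares
    predicted_comments: float  # model's estimate of comments

def engagement_score(post: Post) -> float:
    # One design choice: rank purely by predicted engagement.
    # Sensational content tends to score high on exactly these signals.
    return (1.0 * post.predicted_clicks
            + 5.0 * post.predicted_shares
            + 3.0 * post.predicted_comments)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-scoring posts appear at the top of the user's feed.
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("Measured policy analysis", 0.02, 0.001, 0.01),
        Post("Sensational rumor", 0.09, 0.04, 0.06),
    ]
    for post in rank_feed(feed):
        print(f"{engagement_score(post):.3f}  {post.text}")

The point of the sketch is that the coefficients, not any per-post moderation call, decide which speech rises to the top; change the weights and an entirely different feed emerges.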

This creates a problem for the board. It’s not just that the board’s jurisdiction is too narrow. Nor is it merely that the elaborate quasi-judicial structure that Facebook has established for review of its content-moderation decisions draws public attention away from the design decisions that matter more — though this is certainly the case.

The fundamental problem is that many of the content-moderation decisions the board has been charged with reviewing can’t actually be separated from the design decisions that Facebook has placed off limits. Content-moderation decisions are momentous, but they are momentous largely because of Facebook’s engineering decisions and other choices that determine which speech proliferates on the platform, how quickly it spreads, which users see it, and in what context they see it. The board has effectively been directed to take the architecture of Facebook’s platform as a given. It shouldn’t accept that framing, and neither should anyone else.

The Trump case starkly highlights the problem with the board’s jurisdiction. Mr. Trump’s statements on and off social media in the days leading up to the Capitol siege on Jan. 6 were certainly inflammatory and dangerous, but part of what made them so dangerous is that, for months before that day, many Facebook users had been exposed to staggering amounts of sensational misinformation about the election, shunted into echo chambers by Facebook’s algorithms, and insulated from counterarguments by Facebook’s architecture.

This is why it would be a mistake for the board to address the question that Facebook has asked it to answer, at least right now. Doing so would draw public attention away from the platform design decisions that most warrant scrutiny, and from the regulatory interventions that are needed to better align Facebook’s practices with the public interest. It would also let Facebook off the hook for business practices that cause significant harm to democracy.
