The Facebook Oversight Board’s First Decisions: Ambitious, and Perhaps Impractical

[Image: A phone displays the Facebook login screen. (Kanhaiya Raut, https://pixahive.com/photo/using-social-media/; CC0, https://creativecommons.org/share-your-work/public-domain/cc0/)]

After seemingly endless announcements heralding the arrival of the Facebook Oversight Board (FOB) over the past 18 months, the board has, at last, released its first batch of decisions. There are five cases, each running a little over 10 pages, and each telling Facebook what to do with a single piece of content. All but one are unanimous. Four overturn Facebook’s original decisions to remove posts, and only one agrees with Facebook.

In the time it took you to read that sentence, Facebook probably made thousands of content moderation decisions to take down or leave up pieces of content. So it would be natural to question what possible difference the FOB’s five decisions could make to the ocean of content moderation. But reading the decisions, the FOB’s greater ambitions are obvious. These decisions strike at matters fundamental to the way Facebook designs its content moderation system, and they clearly signal that the FOB does not intend to serve as a mere occasional pit stop on Facebook’s journey to connect the world. The question now, as it has always been with the FOB experiment, is whether Facebook will seriously engage with the FOB’s recommendations.

Facebook’s referral of its decision to suspend President Trump put the FOB in the spotlight recently. But these five more quotidian cases could, in the long term, have a far greater impact than the Trump case on Facebook’s rule writing and enforcement, and on people’s freedom of expression around the world. Jacob Schulz has a good summary of the docket here, and Lawfare will have summaries of outcomes in coming days. Here, though, I offer a few overarching observations from the first set of decisions, and what to watch next.

The FOB’s Ambitions

The eighty percent reversal rate is not the only sign that the FOB does not intend to extend Facebook much deference. These decisions show the FOB’s intention to interpret its remit expansively in an attempt to force Facebook to clean up its content moderation act.

In its decision about COVID-19 misinformation, for example, the FOB complains about how difficult it is to track Facebook’s policy changes over the course of the pandemic. Many updates to policies have been announced through the company’s Newsroom (essentially its corporate blog) without being reflected in the Community Standards (the platform’s actual rule book), and, as the FOB notes, the announcements sometimes even seem to contradict the standards. Scholars of content moderation are used to having to scour and synthesize Facebook Newsroom announcements, blog posts from Mark Zuckerberg and even tweets from Facebook executives in an attempt to discern what the platform’s policies are at any given time. The FOB’s recommendation that Facebook consolidate and clarify its rules is welcome.

But the FOB goes further still. Drawing on a public comment submitted by the digital rights nonprofit Access Now, the FOB also recommends that Facebook publish a specific transparency report on its enforcement of its Community Standards during the coronavirus pandemic. This is, again, ambitious and important. The pandemic provided a natural experiment in content moderation, as Facebook rolled out more expansive misinformation policies and relied more heavily on artificial intelligence tools while content moderators were sent home.
As yet, however, there has been no meaningful or detailed public accounting of how this experiment played out in practice. The FOB’s call for such a report is an aggressive demand that, if Facebook complies, could provide useful insight into the company’s systems.

The FOB’s decision to accept a case about female nipples has prompted the predictable chuckling from the peanut gallery. But the fact that the FOB selected the case should not be surprising: Facebook’s Adult Nudity policy has long been one of its most controversial. The case is unusual for another reason, though. After the FOB accepted the case, in which Facebook removed an Instagram post showing symptoms of breast cancer during breast cancer awareness month in Brazil, Facebook admitted its decision to remove the post was a mistake. Its artificial intelligence tools had accidentally flagged the post, Facebook said, but given that the Adult Nudity policy has a clear exception for breast cancer awareness, the company restored the picture.

Case closed, right? Wrong, said the FOB. In a strong assertion of its power to decide its own jurisdiction, the FOB said Facebook could not moot a case after the FOB had accepted it simply by deciding to reverse a decision. The FOB’s Charter states that the FOB can review cases where users disagree with Facebook’s decision and have exhausted internal appeals, and the FOB argued here that the requirement of disagreement is assessed at the moment the user exhausts Facebook’s internal processes, not afterward. If it were otherwise, the FOB says, Facebook could “exclude cases from review” (the FOB’s own emphasis) simply by mooting any case it didn’t want the FOB to pronounce on.

The FOB went on to confirm that Facebook’s original decision was wrong, but along the way the FOB made two important observations that have little to do with nipples. First, the FOB noted that the relationship between Facebook’s Community Standards and Instagram’s much shorter Community Guidelines is unclear. The latter has an unexplained hyperlink to the former, but that’s it. The FOB recommended that Facebook make clear that, in the case of any inconsistency between the two sets of rules, Facebook’s Community Standards take precedence. If accepted, this is pretty significant: Facebook’s and Instagram’s rules would, for all intents and purposes, be explicitly harmonized.

The second recommendation from the FOB is even more far-reaching. Noting that the mistake in this case stemmed from over-reliance on automated moderation without a human in the loop to correct the error, the FOB calls for some fundamental changes in how Facebook uses such tools. Facebook had urged the FOB to avoid this in its submissions: “Facebook would like the Board to focus on the outcome of enforcement, not the method,” the FOB’s decision notes. But the FOB refused to narrow its scope in that way. Instead, the FOB accepts that while automated technology may be essential to detecting potentially violating content, Facebook needs to inform users when automation has been used and ensure they can appeal such decisions to human review. It also suggested Facebook implement an internal audit procedure to analyze the accuracy of its automated systems. These are all sweeping, systemic recommendations that potentially set the stage for the FOB to request updates on their implementation in future cases.
Facebook has already replied that this recommendation suggests “major operational and product changes” and that it may “take longer than 30 days to fully analyze and scope” what it requires. The decision is a shot across Facebook’s bow: the board is establishing that it will not limit its view to the outcomes in the cases before it, but will interrogate the systems that produced them.

Due Process for Users

A constant theme in these cases is the lack of adequate notice and reasoning for users who have been found to violate Facebook’s rules. The FOB is concerned that such users often cannot know what they did wrong, whether because Facebook’s policies are unclear, lack detail or are scattered across different websites, or because users are not given an adequate explanation of which rule has been applied in their specific case. Many of the FOB’s recommendations call for more transparency and due process for users, to help them understand the platform’s rules.

The Importance of Context

The board also focused on the need to take context into account when applying rules to specific facts. Whether it be the significance of breast cancer awareness month in Brazil, ongoing armed conflict in Azerbaijan, the rise of what the FOB describes as “neo-Nazi” ideology around the world, or the pervasive hate speech against Muslims in Myanmar (and Facebook’s historical role in helping turbocharge it), the FOB repeatedly acknowledges that specific context matters. This is not particularly surprising: It’s impossible to understand speech without taking it in context. But it does present a challenge. Platforms generally have one set of global rules, not least because that’s easier to enforce. It was always naive to hope connecting the world