Wednesday, February 3, 2021

The Wall Street Journal Misreads Section 230 and the First Amendment

The U.S. Supreme Court building. (Mark Thomas, https://tinyurl.com/16z3n4rq; Pixabay, https://pixabay.com/service/license/)

When private tech companies moderate speech online, is the government ultimately responsible for their choices? This appears to be the latest argument advanced by those criticizing Section 230 of the Telecommunications Act of 1996—sometimes known as Section 230 of the Communications Decency Act. But upon closer scrutiny, this argument breaks down completely.

In a new Wall Street Journal op-ed, Philip Hamburger argues that “the government, in working through private companies, is abridging the freedom of speech.” We’ve long respected Hamburger, a professor at Columbia Law School, as the staunchest critic of overreach by administrative agencies. Just last year, his organization (the New Civil Liberties Alliance) and ours (TechFreedom) filed a joint amicus brief to challenge such abuse. But the path proposed in Hamburger’s op-ed would lead to a regime for coercing private companies to carry speech that is hateful or even downright dangerous.

The storming of the U.S. Capitol should make clear once and for all why all major tech services ban hate speech, misinformation and talk of violence: Words can have serious consequences—in this case, five deaths, in addition to two subsequent suicides by Capitol police officers.

Hamburger claims that there is “little if any federal appellate precedent upholding censorship by the big tech companies.” But multiple courts have applied the First Amendment and Section 230 to protect content moderation, including against claims of unfairness or political bias.

Hamburger’s fundamental error is claiming that Section 230 gives websites a “license to censor with impunity.” Contrary to this popular misunderstanding, it is the First Amendment—not Section 230—that enables content moderation. Since 1998, the Supreme Court has repeatedly held that digital media enjoy the same First Amendment rights as newspapers.
When a state tried to impose “fairness” mandates on newspapers in 1974, forcing them to carry third-party speech, no degree of alleged consolidation of “the power to inform the American people and shape public opinion” in the newspaper business could persuade the Supreme Court to uphold such mandates. The court has upheld “fairness” mandates only for one medium—broadcasting, in 1969—and only because the government licenses use of publicly owned airwaves, a form of “state action.”

Websites have the same constitutional right as newspapers to choose whether or not to carry, publish or withdraw the expression of others. Section 230 did not create or modify that right. The law merely ensures that courts will quickly dismiss lawsuits that would have been dismissed anyway on First Amendment grounds—but with far less hassle, stress and expense. At the scale of the billions of pieces of content posted by users every day, that liability shield is essential to ensure that website owners aren’t forced to abandon their right to moderate content by a tsunami of meritless but costly litigation.

Hamburger focuses on Section 230(c)(2)(A), which states: “No provider or user of an interactive computer service shall be held liable on account of ... any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” But nearly all lawsuits based on content moderation are resolved under Section 230(c)(1), which protects websites and users from being held liable as the “publisher” of information provided by others. In the 1997 Zeran decision, the U.S. Court of Appeals for the Fourth Circuit concluded that this provision barred “lawsuits seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content” (emphasis added).

The Trump administration argued that these courts all misread the statute because their interpretation of 230(c)(1) has rendered 230(c)(2)(A) superfluous. But the courts have explained exactly how these two provisions operate differently and complement each other: 230(c)(1) protects websites only if they are not responsible, even “in part,” for the “development” of the content at issue. If, for example, they edit that content in ways that contribute to its illegality (say, deleting “not” in “John is not a murderer”), they lose their 230(c)(1) protection from suit. Because Congress aimed to remove all potential disincentives to moderate content, it included 230(c)(2)(A) as a belt-and-suspenders protection that would apply even in this situation. Hamburger neglects all of this and never grapples with what it means for 230(c)(1) to protect websites from being “treated as the publisher” of information created by others.

Hamburger makes another crucial error: He claims Section 230 “has privatized censorship” because 230(c)(2)(A) “makes explicit that it is immunizing companies from liability for speech restrictions that would be unconstitutional if lawmakers themselves imposed them.” But in February 2020, the U.S. Court of Appeals for the Ninth Circuit ruled that YouTube was not a state actor and therefore could not possibly have violated the First Amendment rights of the conservative YouTube channel Prager University by flagging some of its videos for “restricted mode,” which parents, schools and libraries can turn on to limit children’s access to sensitive topics.

Hamburger insists otherwise, alluding to the Supreme Court’s 1946 decision in Marsh v. Alabama: “The First Amendment protects Americans even in privately owned public forums, such as company towns.” But in 2019, Justice Brett Kavanaugh, writing for all five conservative justices, noted that in order to be transformed into a state actor, a private entity must be performing a function that is traditionally and exclusively performed by the government: “[M]erely hosting speech by others is not a traditional, exclusive public function and does not alone transform private entities into state actors subject to First Amendment constraints.” In fact, Marsh has been read very narrowly by the Supreme Court, which has declined to extend its holding on multiple occasions and certainly has never applied it to any media company.

Hamburger also claims that Big Tech companies are “akin to common carriers.” He’s right that “the law ordinarily obliges common carriers to serve all customers on terms that are fair, reasonable and nondiscriminatory.” But simply being wildly popular does not transform something into a common carrier service. Common carriage regulation protects consumers by ensuring that services that hold themselves out as serving all comers equally don’t turn around and charge higher prices to certain users. Conservatives may claim that’s akin to social media services saying they’re politically neutral when pressed by lawmakers at hearings, but the analogy doesn’t work. Every social media service makes clear up front that access to the service is contingent on complying with community standards, and the website reserves the discretion to decide how to enforce those standards—as the U.S. Court of Appeals for the Eleventh Circuit noted recently in upholding the dismissal of a lawsuit by far-right personality Laura Loomer over her Twitter ban. In other words, social media are inherently edited services.
Consider the Federal Communications Commission’s 2015 Open Internet Order, which classified broadband service as a common carrier service insofar as an internet service provider (ISP) promised connectivity to “substantially all Internet endpoints.” Kavanaugh, then an appellate judge, objected that this infringed the First Amendment rights of ISPs. Upholding the FCC’s net neutrality rules, the U.S. Court of Appeals for the D.C. Circuit explained that the FCC’s rules would not apply to “an ISP holding itself out as providing something other than a neutral, indiscriminate pathway—i.e., an ISP making sufficiently clear to potential customers that it provides a filtered service involving the ISP’s exercise of ‘editorial intervention.’” Social media services make that abundantly clear. And while consumers reasonably expect that their broadband service will connect them to all lawful content, they also know that social media sites won’t let them post everything they want.

Hamburger is on surer footing when commenting on federalism and constitutional originalism: “[W]hen a statute regulating speech rests on the power to regulate commerce, there are constitutional dangers, and ambiguities in the statute should be read narrowly.” But by now, his mistake should be obvious: Section 230 doesn’t “regulat[e] speech.” In fact, it does the opposite.
