Wikipedia:Wikipedia Signpost/2023-02-04/Section 230

Section 230

Twenty-six words that created the internet, and the future of an encyclopedia

JPxG is a welder, forklift driver, software engineer, message board administrator, and Wikipedia editor who has written a number of articles for the Signpost, and a number for Wikipedia, including "Extremely Online", which he has been since sometime around 1999.

In two major English-speaking countries, two separate legal mechanisms are working their way through two separate processes. The first is a United States Supreme Court case regarding §230 of the Communications Decency Act, and the second is a proposed Act of the Parliament of the United Kingdom "intended to improve internet safety". Both have wide-ranging implications for posters, lurkers, and everyone in between, and both have been the subject of fierce debate. Both are also the subject of special reports in this issue of the Signpost – the other is at Special report.

The Spirit of '96

See prior Signpost coverage and this issue's In the media.

Section 230 of the United States Communications Decency Act[1] is a federal statute made effective in February 1996. While a detailed explanation of all that it meant then, now, in between, and to the major political players of the last few years would make for quite thick reading, the section itself is quite short:
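
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

— 47 U.S.C. § 230(c)(1)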

Of course, the First Amendment of the United States Constitution (and a little over two hundred years of subsequent jurisprudence) says in plain terms that "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances".[2] However, the Internet occupies a unique place in law, as a decentralized structure in which messages are conveyed between users by intermediaries; Section 230 ensures that the organizations that provide the infrastructure for posting need not individually consider the content of each message being conveyed.

Prior to this, it was an open question whether websites themselves could be held liable for their users having made defamatory, tortious, or outright illegal posts (in addition to the users themselves). In fact, the case that prompted its creation was Stratton Oakmont, Inc. v. Prodigy Services Co. (yes, that Stratton Oakmont), which held that a hosting provider was legally liable for an anonymous user's defamation of a businessman. In this case, the fact that Prodigy had exerted any editorial control over the message board (including deleting posts for being spam, off-topic or just plain dumb) meant that they assumed the role of a publisher and were therefore responsible for whatever posts they didn't delete.

By permitting websites to serve content without their operators being exposed to lawsuits every time someone posted bad on them, Section 230 opened the gates to the modern web: it has been referred to as the "twenty-six words that created the Internet". But lately, things have been popping entirely off.

Popping off

In the last few years, Section 230 has become the subject of much political controversy; numerous challenges to websites' immunity under the section have come from many directions. For example, a 2021 bill seeking to strip protections from sites whose recommendation algorithms served objectionable content was sponsored by Democratic congressman Frank Pallone, who alleged that current interpretations of the law allowed social media companies to profit from "elevating disinformation and extremism". And in 2020, a Republican bill sought to enforce websites' compliance with government-created standards of "objectively reasonable" content removal, with senator Marsha Blackburn calling such changes necessary to "[bring] liability protections into the modern era".

Presently, two cases stand before US courts, both seeking to change the current interpretation of the law: NetChoice v. Paxton and Gonzalez v. Google. These cases were filed by different parties, in different jurisdictions, and concern different elements of the interpretation of the law; what they have in common is that they have implications for the future of the web, and of the Wikimedia projects that roam it.

NetChoice v. Paxton

This lawsuit concerns Texas House Bill 20, a 2021 piece of legislation that applies restrictions to the editorial policies of "large social media platforms", i.e. those with more than 50 million monthly active users in the US. Guess who had 44,955,915 users in the last year?

It enjoins these sites from "censoring on the basis of user viewpoint, user expression, or the ability of a user to receive the expression of others", and allows removal only under a few limited circumstances, such as the post itself being unlawful or "directly inciting" criminal activity. Some have noted that this does not exactly make sense when applied to a site like Wikipedia, where "moderation" is carried out by the same volunteers who do the normal editing: is replacing the text of a Wikipedia article with "peepee poopoo" censorship, or is reverting that edit censorship? Are they both censorship?

Some have noted that the bill seems to fling similarly offensive materials all over Section 230 – most notably the plaintiffs in this case, NetChoice and the Computer & Communications Industry Association. They argued that the Texas bill was preempted by § 230, and in December 2021 a federal judge blocked its enforcement on First Amendment grounds. The State of Texas appealed immediately, with the Fifth Circuit Court of Appeals staying the injunction and allowing the law to take effect in May 2022; that stay was itself vacated later in the month by the United States Supreme Court, which is now deciding whether to take up the case. On January 23, it requested the views of the Solicitor General regarding the case (as well as NetChoice v. Moody, an analogous case regarding a similar law in Florida), with SCOTUSblog saying that "the justices will not decide whether to take up the Florida and Texas cases until after they issue their decisions in two other cases that could transform how social-media companies operate" (here referring to Gonzalez v. Google and Twitter v. Taamneh, both related to the liability of websites for terrorist content posted by users).

Gonzalez v. Google

While the Islamic State of Iraq and Syria is not in the news very often these days, it was near its zenith in 2015, when it claimed responsibility for a string of attacks in France that included bombings, shootings, and standoffs with hostages. Over a hundred people were murdered; one of them was an American citizen, Nohemi Gonzalez, whose family subsequently filed a lawsuit against Google. They alleged that videos on the Google-owned website YouTube "were the central manner in which ISIS enlisted support and recruits from areas outside the portions of Syria and Iraq which it controlled". Because YouTube's software had "affirmatively recommended ISIS videos to users", the plaintiffs argued that Google had "provided material assistance to [and] aided and abetted" the terror group, in violation of 18 U.S.C. § 2333. In their November 2022 brief, the plaintiffs don't seem to mention individual instances where the perpetrators of those specific terror attacks were convinced to perform the acts by YouTube videos, or go any further than to say that YouTube "played a uniquely essential role" in the group's rise to prominence.

A dizzying panoply of organizations have filed briefs in the case, ranging across the political spectrum, and including some familiar advocacy groups. The brief from the National Police Association cites "social-media amplification of anti-LEO messages" as an obstacle to recruitment of police officers, and recommends that immunity be stripped in order to "damp anti-LEO attitudes" as evidenced by a "new paradigm of violence against police under the pretext of Black Lives Matter", citing hashtags associated with the 2020 George Floyd protests like #FUCK12. Meanwhile, the Anti-Defamation League's brief claims that immunity should be stripped in order to curb "hateful and extreme content": "After the 2020 murder of George Floyd, ADL reported that anti-Black posts on Facebook had quadrupled, and the number of white supremacist propaganda incidents has nearly doubled".

Conversely, many briefs urged the court to uphold immunity for websites, including those from the American Civil Liberties Union and Electronic Frontier Foundation. Perhaps one of the most notable filings was from Reddit, Inc. and Reddit Moderators, in which two amici are pseudonymous volunteer moderators (u/AkaashMaharaj and u/Halaku).

Most relevant here is the brief filed by the Wikimedia Foundation, which makes many arguments that a web without § 230 immunity could not accommodate works such as Wikipedia. A quote:

Petitioners’ flawed theory of Section 230 has it backwards: rather than locking in advantage for major technology players, Section 230 ensures that websites with small budgets but large impacts can exist and compete against the big players. Petitioners’ interpretation would hollow out Section 230 and call into question its protections for platforms that need it the most. The Court should decline that invitation, particularly given that Petitioners’ theory lacks any textual basis.

[...]

Even with Section 230, litigation based on user speech can cost tens if not hundreds of thousands of dollars at the motion-to-dismiss stage ... These costs alone are significant to smaller and lesser-funded websites. But without Section 230 granting start-ups the ability to dismiss cases against them, their legal expenses would pile up even higher, ranging anywhere from $100,000 to $500,000 or more for each case that reaches the discovery stage.


— Wikimedia Foundation

The WMF brief goes on to cite the hundreds of content-related legal complaints received yearly in the United States alone, and the ubiquitous nature of content recommendation even in the design of a website as simple as Wikipedia – most visibly the Main Page sections for Today's featured article, On this day, Did you know, and In the news, but also features as basic as hyperlinks to other articles in body text.

What does it mean?

Well, who knows? It may sound like a mere reconfiguration of liability law – certainly, much of the political discourse surrounding Section 230 focuses on "holding tech companies accountable" – but there are far-reaching implications to a potential state of affairs where posting (or hosting posts) is a privilege of the few. And certainly, there are some who welcome such a change; the worldwide reach of Wikipedia and its pseudonymous ilk has proven quite inconvenient for a number of powerful entities over the years. However, there are obvious benefits to a free web, and the extent to which people are willing to throw these away is often overstated. Of course, it is easy to imagine doom and gloom, and that may even be a plausible outcome. But even in a scenario where immunities were stripped (which would likely be catastrophic for posting writ large), it is also easy to imagine existing carveouts being broadened to include things like Wikimedia projects.

The fate of Section 230 lies in the hands not only of the Supreme Court, but of the whole rest of the United States government apparatus, which is able to challenge decisions, as well as modify and create new frameworks and processes for going about things. At the end of the day, it lies in the hands of voters, citizens, and posters, from whom the government draws its legitimacy, and to whom it is ultimately accountable.

The Signpost looks forward to keeping you updated on these developments for as long as we are able to do so.

Notes

  1. ^ Actually, it is Section 9 of the CDA, and Section 509 of the Telecommunications Act of 1996, but it is typically called "Section 230" because that's where it sits in Title 47 of the United States Code.
  2. ^ NB: After publication, a friend pointed out to me that, while it was kinda cheating by cutting off the last part, "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press" is also 26 words.