Supreme Court Asked For An Emergency Review Of Texas’ Dangerous Social Media Law


from the and-we’re-off… dept

As you’ll recall, last Wednesday, the 5th Circuit surprised lots of people by immediately reinstating Texas’s ridiculous content moderation law that basically creates an open season to sue large social media sites for any moderation choices those sites make. The surprise wasn’t necessarily the decision itself, which had been telegraphed two days earlier via the judges’ (plural) extremely confused questions about the law (including a claim that Twitter was not a website, which it is). The bigger surprise was that they reinstated the law just two days later, without any written opinion, and without giving the plaintiffs (trade groups that represent many large internet companies) a chance to appeal. That’s just weird.

Late on Friday, the trade associations, NetChoice and CCIA, petitioned Justice Alito with an emergency application to stop the law from going into effect. Technically, it’s an “emergency application for immediate administrative relief and to vacate stay of preliminary injunction.” Just to break that apart: the law was passed, and the district court granted a preliminary injunction, blocking the law from going into effect (while noting the law was pretty clearly unconstitutional). The 5th Circuit’s reversal was putting a “stay” on the preliminary injunction, meaning that the law could go into effect. So, to block the law again, they need the Supreme Court to vacate the stay on the preliminary injunction blocking the law. Simple. Got it? Got it.

Also, the reason they petitioned Alito is that each Circuit court gets one of the Justices as that Circuit’s Justice, and Alito covers the 5th. So these kinds of emergency applications, which are part of the now infamous “shadow docket” of the court, have to go up to the Justice for that Circuit. If that Justice refuses, then the petitioners can try other Justices. In this case, on Saturday, Alito gave Texas until Wednesday to file a response.

The petition itself is worth reading. It’s 55 incredibly thorough pages. We’ll get to the content in a moment, but it’s worth noting that there is some serious legal firepower here, with a heavy focus on both knowing the law in Texas and knowing the conservative Justices. The eye-catching name is Paul Clement, former Solicitor General of the US under George W. Bush, who is extremely well known in legal circles and has been involved in tons of high-profile cases. And then it also includes two recent Texas Solicitors General, Kyle Hawkins and Scott Keller, both of whom were appointed by current governor Greg Abbott, who pushed for this law. I mean, Hawkins only stepped down from that role last year. Notably, Hawkins also clerked for Alito at the Supreme Court (and for one of the judges on the 5th Circuit panel). Another lawyer on this filing is Katherine Yarger, who clerked for Neil Gorsuch when he was on the 10th Circuit and for Clarence Thomas at the Supreme Court. These are not coincidences.

As for the content of the request, it comes in with a strong opening:


Texas House Bill 20 (“HB20”) is an unprecedented assault on the editorial discretion of private websites (like Facebook.com, Instagram.com, Pinterest.com, Twitter.com, Vimeo.com, and YouTube.com) that would fundamentally transform their business models and services. HB20 prohibits covered social media platforms (many of which are members of Applicants NetChoice and CCIA) from engaging in any viewpoint-based editorial discretion. Thus, HB20 would compel platforms to disseminate all sorts of objectionable viewpoints—such as Russia’s propaganda claiming that its invasion of Ukraine is justified, ISIS propaganda claiming that extremism is warranted, neo-Nazi or KKK screeds denying or supporting the Holocaust, and encouraging children to engage in risky or unhealthy behavior like eating disorders. HB20 also imposes related burdensome operational and disclosure requirements designed to chill the millions of expressive editorial choices that platforms make each day.

The first point they make is that the 5th Circuit’s stay, issued without any opinion, is problematic in itself, before even getting to the underlying law:


Yet, on Wednesday night, a divided Fifth Circuit panel issued a one-sentence order granting a stay motion filed by the Texas Attorney General five months earlier, allowing him to immediately enforce HB20. This unexplained order deprives Applicants of the “careful review and a meaningful decision” to which they are “entitle[d].” Nken v. Holder, 556 U.S. 418, 427 (2009). The Fifth Circuit has yet to offer any explanation why the District Court’s thorough opinion was wrong. This Court should allow the District Court’s careful reasoning to remain in effect while an orderly appellate process plays out.

They also point out that this rush to reinstate the law could interfere with the 11th Circuit, which heard Florida’s appeal regarding its similar law a few weeks before the 5th Circuit heard its appeal. The 11th Circuit is still waiting to rule (and the expectation is they may take a while). As the briefing here notes, immediately reinstating the Texas law upsets the status quo in a scenario where it’s likely that no matter what happens with both laws, the Supreme Court will have to hear a more fully briefed case about them at the relevant point in the future. But rather than letting any of that play out, the 5th Circuit was just like “yup, turn on the law.” Which is generally not how these things work.

Vacating the stay in this case will maintain the status quo while the Eleventh Circuit also considers a parallel appeal concerning a preliminary injunction against Florida’s similar law. NetChoice, LLC v. Moody, 546 F. Supp. 3d 1082, 1086 (N.D. Fla. 2021), appeal docketed, 11th Cir. No. 21-12355 (11th Cir. July 13, 2021). Until the Fifth Circuit issued this stay, the status quo had been maintained pending a decision from at least one federal court of appeals weighing in on the constitutionality of unprecedented state laws regulating the worldwide speech of only some government-disfavored social media platforms. And even then, that decision would not have gone into effect until the appellate court’s mandate had issued or the parties sought further review in this Court. By issuing a stay and allowing the Texas Attorney General to enforce HB20 while appeals are still pending, the Fifth Circuit short-circuited the normal review process, authorizing Texas to inflict a massive change to leading global websites and undoubtedly also interfering with the Eleventh Circuit’s consideration of Applicants’ challenge to the similar Florida law.

It also points out how damaging it is to put the law into effect immediately:


Furthermore, the covered platforms face immediate irreparable injury many times over. Unrebutted record evidence demonstrates that it will be impossible for these websites to comply with HB20’s key provisions without irreversibly transforming their worldwide online platforms to disseminate harmful, offensive, extremist, and disturbing content—all of which would tarnish their reputations for offering appropriate content and cause users and advertisers to leave. As one of Applicants’ declarants stated, HB20 “would force us to change all of our systems to try to come into compliance.” App.350a. And because there is no “off-switch” to platforms’ current operations, the cost of revamping the websites’ operations would undo years of work and billions of dollars spent on developing some platforms’ current systems. Id. Even if platforms could revamp their entire communities, they would lose substantial revenue from boycotts by advertisers who do not want their ads to appear next to vile, objectionable expression. In the past, YouTube and Facebook “lost millions of dollars in advertising revenue” from advertisers who did not want their advertisements next to “extremist content and hate speech.”

And then we get to the basics of the 1st Amendment issues inherent here, starting with a citation of the (very useful) Justice Kavanaugh-authored ruling three years ago in Halleck. We’ve pointed to that case regularly, as it says quite clearly that private platforms have their own rights to moderate as they see fit and the government should not interfere. It’s no surprise that this filing kicks off with a strong reminder of that ruling, followed by a long list of other famous cases regarding the constitutional problems with compelled speech and association, and closing it out with a cite to Justice Thomas’ concurrence in Denver Area v. FCC, which was basically a precursor case to Halleck.


More fundamentally, the Fifth Circuit’s order contradicts bedrock First Amendment principles established by this Court. When “a private entity provides a forum for speech,” it may “exercise editorial discretion over the speech and speakers in the forum.” Manhattan Cmty. Access Corp. v. Halleck, 139 S. Ct. 1921, 1930 (2019). This Court thus has repeatedly recognized that private entities have the right under the First Amendment to determine whether and how to disseminate speech. E.g., Hurley v. Irish-Am. Gay, Lesbian & Bisexual Group of Bos., 515 U.S. 557, 581 (1995); PG&E v. PUC of Cal., 475 U.S. 1, 12 (1986) (plurality op.); Miami Herald Publ’g Co. v. Tornillo, 418 U.S. 241, 258 (1974); see also Sorrell v. IMS Health Inc., 564 U.S. 552, 570 (2011); Arkansas Educ. TV Comm’n v. Forbes, 523 U.S. 666, 674 (1998); Denver

The simple reality is that until maybe a year or two ago, questions about the government compelling websites to carry speech easily would have been a slam dunk as unconstitutional under the 1st Amendment, with the most conservative members of the Court being the most vocal. It’s only in the last two years or so that a concerted effort has been made to flip conservatives completely into arguing that you can force private property owners to host speech. And both Thomas and Alito have publicly suggested they’re on-board with this position. This brief works hard to remind them, and their colleagues, of their principles, from back when it was believed they had them.

For what it’s worth, this is also likely why, later in the filing, the petitioners want to remind the Justices of the Masterpiece Cakeshop ruling, again specifically citing Thomas’ concurrence in that case.


Fourth, private entities cannot be compelled to disseminate speech even if they could “dissociate” themselves from the compelled publication by “simply post[ing] a disclaimer,” as that would “justify any law compelling speech.” Masterpiece Cakeshop, Ltd. v. Colorado C.R. Comm’n, 138 S. Ct. 1719, 1745 (2018) (Thomas, J., concurring). A publisher’s ability to disclaim compelled speech was present in Tornillo, PG&E, Hurley, and Wooley v. Maynard, 430 U.S. 705, 717 (1977). And the Court consistently held that government could not compel speech. (In any event, HB20 prohibits platforms from disclaiming compelled speech, because they are not permitted to “discriminate” among speech on their platform.)

The petition does a pretty nice job of laying out how content moderation is a form of editorial discretion, and that lots of websites wish to cultivate their own kinds of communities, and the government can’t just come in and interfere with that:


In short, platforms “publish,” Reno, 521 U.S. at 853, and “disseminate” speech authored by others, Sorrell, 564 U.S. at 570. But just as a newspaper does not publish every opinion piece it receives, these platforms do not disseminate all speech users submit—or treat all user-submitted speech equally. Instead, each platform has its own rules about what speech is acceptable for its particular service and community. Platforms all have hate-speech policies, for example. App.21a, 389a-445a. Platforms also differ in important ways that accord with the websites’ designs and different editorial policies and emphases. YouTube, for example, supports a “community that fosters self-expression on an array of topics as diverse as its user base,” while prohibiting “harmful, offensive, and unlawful material” like “pornography, terrorist incitement, [and] false propaganda spread by hostile foreign governments.” App.146a, 149a. Twitter allows a wider range of expression such as adult content. Other social media platforms—including Texas-favored websites excluded from HB20’s coverage that tout less-moderated communities—still have similar policies. App.115a, 134a.

For all platforms, the expressive act of policy enforcement is critical to the distinctive experiences that platforms provide their users—and to ensuring that the services remain hospitable and useful services. Without these policies, platforms would offer fundamentally worse (and perhaps even useless) experiences to their users, potentially overrun with spam, vitriol, and graphic content. App.20a-21a. The record confirms that when platforms have failed to remove harmful content, their users and advertisers have sought to hold platforms accountable—including through boycotts. App.126a, 135a-38a, 168a-69a, 187a. And when platforms have chosen to remove, or reduce the distribution of, objectionable content, they have faced criticism from users as well as elected officials. App.73a.

From the moment users access a social media platform, everything they see is subject to editorial discretion by the platform in accordance with the platforms’ unique policies. Platforms dynamically create curated combinations of user-submitted expression, the platforms’ own expression, and advertisements. This editorial process involves prioritizing, arranging, and recommending content according to what users would like to see, how users would like to see it, and what content reflects (what the platform believes to be) accurate or interesting information. App.21a; see App.312a (YouTube: “I believe in 2018 that data was about 70 percent of views are driven by recommendations.”).

Those decisions begin with the very basic design and functions of the site. YouTube and Vimeo, for instance, disseminate both videos and users’ comments on those videos. Facebook and LinkedIn have a broader range of videos and text. Instagram focuses on images and video, though it too has options for comments. Twitter is largely limited to 280-character text “tweets,” with options to post videos and images. TikTok has short videos. And Pinterest has images on digital “pin boards.” Across all these websites, platforms make decisions about the user interface and appearance of the platform. Some provide filters or parental controls to offer users even more curated experiences. And all this content appears next to the platforms’ distinctive branding.

Given their size and dynamic nature, platforms must constantly make editorial choices on what speech to disseminate and how to present it. At a minimum, this involves the platforms’ determination of what should show up at the top of users’ “feeds” and search results—which are functions the platforms engage in for each user and countless times a day. App.163a. Platforms also recommend or prioritize content they consider relevant or most useful. App.150a. Consequently, much like a newspaper must decide what stories deserve the front page, how long stories should be, what stories should be next to other stories, and what advertisements should be next to what stories, social media platforms engage in the same kinds of editorial and curatorial judgments both for individual users and the platforms as a whole.

The petition also digs deep into the ridiculousness of the no-explanation stay, leading to the law immediately going into effect:


The cursory manner in which the Fifth Circuit panel majority allowed HB20 to take effect alone justifies the granting of this Application. See Nken, 556 U.S. at 427.

Last year, both Texas and Florida embarked on an unprecedented effort to override the editorial discretion of social media platforms and to compel them to disseminate a plethora of speech the platforms deem objectionable and antithetical to the speech they want to present to users (and advertisers). App.6a-7a; NetChoice, 546 F. Supp. 3d at 1085. Both laws are an undisguised effort to level the speech playing field and control “Big Tech.” To that end, both laws override editorial discretion and compel speech—imposing their burdens only on selected speakers and carving out favored content. App.28a-29a; NetChoice, 546 F. Supp. 3d at 1093-94. In short, the laws defy established First Amendment doctrine by taking virtually every action forbidden to state actors by the First Amendment.

Both states recognized that their laws would transform the Internet and fundamentally change the way platforms exercise editorial discretion and disseminate speech, so they delayed their effective dates to allow regulated platforms to try to come into compliance. App.9a; NetChoice, 546 F. Supp. 3d at 1085. Applicants took advantage of that interval to seek preliminary injunctive relief that would prevent the laws from taking immediate transformative effect, while allowing the parties to debate the legal issues and giving jurists time to consider all the issues as part of an orderly review process. The results were two well-reasoned district court opinions carefully explaining the provisions of the respective laws and each preliminarily enjoining those laws as rather obvious affronts to the First Amendment.

Those two decisions paved the way for an orderly appellate process in the courts of appeals. Florida did not even seek a stay of that preliminary injunction, but pursued a modestly expedited appeal that is fully briefed and was argued late last month. See Docket, 11th Cir. No. 21-12355. While Texas sought a stay, a Fifth Circuit motions panel referred that stay to the merits panel, which considered the important issues pursuant to an orderly appellate process that included full briefing and an oral argument. App.4a. But on Wednesday, a divided panel threw both the Internet and the orderly appellate process into chaos by issuing a one-sentence order purporting to allow the Texas Attorney General to enforce HB20 immediately. App.2a.

As this Court explained in Nken, appellate courts may not enter stays pending appeal “reflexively,” but only after the movant has satisfied its “heavy burden,” and only after the panel has conducted “careful review” and issued a “meaningful decision.” 556 U.S. at 427; id. at 439 (Kennedy, J., concurring). Yet this one-sentence order explains nothing—in stark contrast to the extensively reasoned district court opinions that explained the various provisions of the laws, suggested some possible limiting constructions, and identified the precise constitutional defects. The Fifth Circuit’s order creates immediate obligations, compels all sorts of speech, and essentially forces Applicants to try to conform their global operations to Texas’s vision of how they should operate—and they must do so essentially overnight. Equally important, the order undermines the orderly appellate process in this Court (and the Eleventh Circuit), which necessitates this emergency application.

It did not have to be this way. Even if a majority of the Fifth Circuit panel disagrees with the well-reasoned opinion of the district court, it could have explained its reasoning in an opinion subject to the normal rules for issuing appellate mandates, which would then have permitted Applicants to seek rehearing and petition for certiorari. That course would have allowed an appellate process that gave this Court the same opportunity for the calm and orderly consideration that every other court has enjoyed in considering these momentous legal issues that go to the heart of the First Amendment.

There are many more arguments made in the filing, but I did want to call out two quick points that push back on specious arguments made by many (including people in our comments) claiming that governments can force social media websites to host content. First, the two popular cases people like to bring up are PruneYard and Rumsfeld v. FAIR. Those don’t apply (and I’ll note in passing that Clement argued the FAIR case on behalf of the US government, so he should know).


Neither Rumsfeld v. FAIR, 547 U.S. 47 (2006), nor PruneYard Shopping Center v. Robins, 447 U.S. 74 (1980), justify HB20 or Defendant’s “hosting” theory. Neither case involved private editorial choices about what speech to disseminate. See FAIR, 547 U.S. at 64 (“A law school’s recruiting services lack the expressive quality of a parade, a newsletter, or the editorial page of a newspaper.”); PruneYard, 447 U.S. at 88 (no “intrusion into the function of editors”). In PruneYard, the shopping mall “owner did not even allege that he objected to the content of the [speech]; nor was the access right content based.” PG&E, 475 U.S. at 12 (discussing PruneYard). And FAIR distinguished the “conduct” of a law school’s employment recruitment assistance from a “number of instances” where the Court “limited the government’s ability to force one speaker to host or accommodate another speaker’s message”—citing Hurley, PG&E, and Tornillo. FAIR, 547 U.S. at 63.

And then there’s the whole “common carrier” bit, which they note is completely nonsensical in this context.


Seventh, social media platforms are not common carriers, and the First Amendment analysis would not change if they were. “A common carrier does not make individualized decisions, in particular cases, whether and on what terms to deal.” FCC v. Midwest Video Corp., 440 U.S. 689, 701 (1979). Far from “hold[ing] themselves out as affording neutral, indiscriminate access to their platform without any editorial filtering,” unrebutted evidence establishes that platforms constantly engage in editorial filtering, providing unique experiences to each user and limiting both who may access their platforms and how they may use the platforms, as discussed above (at pp.5-9). USTA, 855 F.3d at 392 (Srinivasan & Tatel, JJ., concurring in the denial of reh’g en banc) (emphasis added). Consequently, “web platforms such as Facebook, Google, Twitter, and YouTube . . . are not considered common carriers.” Id.; see also Cablevision Sys. Corp. v. FCC, 597 F.3d 1306, 1321-22 (D.C. Cir. 2010) (Kavanaugh, J., dissenting) (“A video programming distributor . . . is constitutionally entitled to exercise ‘editorial discretion over which stations or programs to include in its repertoire.’ As a result, the Government cannot compel video programming distributors to operate like ‘dumb pipes’ or ‘common carriers’ that exercise no editorial control.”) (citations omitted).

This Court’s precedents likewise recognize that government cannot convert private entities that exercise editorial judgments into common carriers. See FCC v. League of Women Voters of Cal., 468 U.S. 364, 379 (1984) (compelled publication unlawful because it would “transform broadcasters into common carriers and would intrude unnecessarily upon the editorial discretion of broadcasters”). This Court recognized that even television broadcasters have protected editorial discretion, id., though broadcasters receive less First Amendment protection than Internet websites. See Reno, 521 U.S. at 870.

In all events, even common carriers retain the “right to be free from state regulation that burdens” speech. PG&E, 475 U.S. at 17-18 & n.14. So HB20’s label as “a common carrier scheme has no real First Amendment consequences,” because “impos[ing] a form of common carrier obligation” cannot justify a law that “burdens the constitutionally protected speech rights” of platforms “to expand the speaking opportunities” of others. Denver, 518 U.S. at 824-26 (Thomas, J., concurring in the judgment in part and dissenting in part). Similarly, government cannot declare private entities’ dissemination of speech as a “public accommodation.” Hurley, 515 U.S. at 573.

Anyway, there’s a lot more in there, but it’s a strong filing. Hopefully Alito recognizes that…

Filed Under: 1st amendment, 5th circuit, content moderation, hb20, samuel alito, section 230, social media, supreme court, texas

Companies: ccia, netchoice