The Law and Business of Social Media
December 15, 2021 - Social Media Policy

District Court Enjoins Controversial Texas Social Media “Censorship” Law


Over the past several years, Section 230 of the Communications Decency Act, the federal law that provides social media platforms with immunity from liability for user content and was once hailed as “the law that gave us the modern Internet,” has gone from relative obscurity (at least outside of tech circles) to being a household name and politicians’ favorite punching bag. Interestingly, objections to Section 230 come from both sides of the aisle. Advocates on the left and the right alike see the law as permitting platforms to maintain content moderation policies that result in significant social ills; they just tend to disagree about what those social ills are, and thus what those content moderation policies ought to be.

Generally speaking, advocates on the left see platforms as being too permissive in allowing misinformation to run rampant and blame that spread of misinformation for everything from Trump’s presidential win to, earlier in President Biden’s term, low COVID vaccination rates. Many on the left also argue that Twitter and Facebook should have banned then-President Trump from their platforms long before they did so in the wake of the January 6 insurrection. By contrast, those on the right view platforms as too restrictive when it comes to content moderation. As they see it, “Big Tech” is a puppet of Democrats and acts on their orders, systematically “censoring” conservative speakers and content as part of a larger liberal conspiracy. These claims from Republicans only intensified once Trump was finally banned.

In short, no one has been happy with how social media platforms self-regulate, and both sides believe that, if only Section 230 were amended or even eliminated entirely, their preferred policies would be put in place and the content they (dis)favor would finally be dealt with correctly.

Yet while dozens of pieces of legislation altering Section 230 have been introduced in both houses of Congress over the past few years, only one has become law: FOSTA-SESTA. FOSTA was a direct response to online personal ads appearing on Backpage.com and the belief that Section 230 insulated the site from liability for facilitating sex trafficking. That belief turned out to be false; Backpage was brought down without FOSTA and with Section 230 alive and well, but that’s a different story for a different day. The point for present purposes is that while there has been much hand-wringing about Section 230 at the federal level, there has been much less action.

There has been much more legislative action at the state level, however, particularly in conservative states. The first state to enact an anti-“Big Tech,” anti-Section 230 law was Florida, with Senate Bill 7072. A federal district court enjoined most of that law back in June, a day before it was set to go into effect, holding that it was preempted in large part by Section 230 and imposed unconstitutional restrictions on platforms’ speech. Florida appealed the court’s ruling, and we will be watching closely to see how the Eleventh Circuit rules in the appeal (Plaintiffs’ response brief was submitted on November 8).

Our focus in this post is on what came next: Texas House Bill 20 (HB 20), which was enacted in September and which Texas Governor Abbott described as targeting “a dangerous movement by social media companies to silence conservative viewpoints and ideas.” The law was set to go into effect on December 2, but on November 29, a federal district court in Austin heard oral arguments on its (un)constitutionality. On December 1, the court enjoined every part of it that plaintiffs NetChoice and the Computer & Communications Industry Association challenged.

HB 20

The Short Version

HB 20 (1) prevents social media companies with more than 50 million monthly active users from “censoring” users on the basis of viewpoint; (2) prohibits email service providers from “impeding the transmission” of emails based on content (including at least some spam); (3) requires covered platforms to have a complaint procedure for users who disagree with the platforms’ removal of users’ content based on a determination that the content violated the platforms’ acceptable use policies; and (4) requires covered platforms to make a number of disclosures.

The Long Version

  • (1) Must Carry

One of the most controversial provisions of HB 20 prohibits covered platforms (i.e., those with over 50 million monthly active users) from “censoring” on the basis of user viewpoint, user expression, or the ability of a user to receive the expression of others. (It also bans “censorship” on the basis of a user’s geographic location in Texas.) “Censor” is defined as to “block, ban, remove, deplatform, demonetize, de-boost, restrict, deny equal access or visibility to, or otherwise discriminate against expression.” In essence, under HB 20, content moderation is censorship. There are, though, four exceptions. Platforms may “censor” content that:

  • (a) the platform is “specifically authorized to censor by federal law”;
  • (b) an organization with the purpose of preventing the sexual exploitation of children or protecting survivors of sexual abuse from ongoing harassment requests be removed (note that platforms cannot remove the content under this exception if the request comes from the actual subject of the harassment rather than from such an organization);
  • (c) directly incites criminal activity or consists of specific threats of violence targeted against a person or group on the basis of race, color, disability, religion, national origin or ancestry, age, sex, or status as a peace officer or judge; or
  • (d) is unlawful.

Given this, HB 20 would prohibit a social media platform from demoting the content of white supremacists unless it was effectively illegal as incitement or a true threat. Hate speech, which is generally protected by the First Amendment, could not be removed.

  • (2) Bans email service providers from “impeding the transmission” of another party’s emails and creates a private right of action for those affected

For reasons that are frankly hard to understand, HB 20 prohibits email service providers from “impeding the transmission” of another person’s emails based on the content of the messages unless the provider has a good faith, reasonable belief that the message contains malicious computer code, obscene material, material depicting sexual conduct, or material that violates other laws, or unless the provider is authorized to block the transmission under other applicable state or federal law. The law discourages email service providers from blocking spam that falls outside these exceptions, though, by providing a private right of action to those “injured by a violation of this [provision] arising from the transmission of an unsolicited or commercial electronic mail message” and entitling them to statutory damages of up to $25,000 for each day their message is unlawfully impeded. (A sender whose messages were unlawfully blocked for 30 days, for instance, could seek up to $750,000.)

HB 20 defines neither “impeding” nor “transmission,” but it is hard to imagine an interpretation that doesn’t conflict with the protections granted to these service providers by Sections 230(c)(2)(A) and 230(c)(2)(B). Section 230(c)(2) prevents the imposition of civil liability on internet service providers on account of (A) “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected” or (B) “any action taken to enable or make available to [users] the technical means to restrict access to” those same materials. None of us wants an inbox filled with spam and, mercifully, Section 230(c)(2) encourages email service providers to keep that from happening. Unfortunately, Plaintiffs did not challenge this provision, so the court did not address whether it is preempted by Section 230.

  • (3) and (4) Disclosures and a User Complaint Procedure

HB 20 imposes three transparency obligations on covered platforms. They must: (a) disclose a laundry list of information about their business practices, including specific and detailed information about their content moderation policies; (b) have and make accessible an acceptable use policy that includes a number of elements, including a complaint intake mechanism (an earlier draft required platforms to provide a toll-free phone number users could call to complain); and (c) release biannual transparency reports.

As for the complaint intake mechanism, covered platforms are required to abide by a potentially burdensome process for removing user content that violates their acceptable use policies. For starters, when a platform removes content (so long as the user can be located and the content is not part of an ongoing legal investigation), the company must: (a) notify the user and explain the reason for removal; (b) allow the user to appeal the decision; and (c) provide written notice to the user regarding the determination of the user’s appeal. If a user appeals the platform’s determination, the platform has 14 days (excluding weekends) to review the content again and notify the user of its determination. It is unclear how specific the platform’s explanation needs to be, but in any event this process seems likely to impose significant administrative burdens on platforms. That, in turn, could significantly discourage platforms from removing user content. While the conservative champions of the law may see that as precisely the goal, it would likely cause platforms to become less pleasant and welcoming to users. And, of course, Section 230 was enacted at least partly to encourage “computer service providers to establish standards of decency without risking liability for doing so,” as the Second Circuit noted in Domen v. Vimeo. Put another way, HB 20 appears to have been enacted to discourage exactly what Section 230 sought to encourage.

The Court’s Order

Somewhat surprisingly, the district court did not reach the question of whether HB 20 is preempted, in whole or in part, by Section 230. Unlike in the Florida case mentioned above, here the court enjoined the law entirely on the basis of the First Amendment.

Platforms Exercise Constitutionally Protected Editorial Discretion

Step One: If an entity exercises editorial discretion in choosing what to publish, the government cannot compel it to publish other content

The district court cited the Supreme Court’s three big compelled speech cases—Tornillo, Hurley, and PG&E—finding them to jointly “stand for the general proposition that private companies that use editorial judgment to choose whether to publish content—and, if they do publish content, use editorial judgment to choose what they want to publish—cannot be compelled by the government to publish other content.”

(As a quick reminder: in Miami Herald Pub. Co. v. Tornillo, the Court struck down a “right of reply” law that required newspapers to publish a candidate’s reply if the paper criticized her. In Hurley v. Irish-Am. Gay, Lesbian & Bisexual Group of Boston, the Court held that a private organization had the right to exclude a gay rights group from its parade even in the face of a state antidiscrimination law that would have required the organization to permit the gay rights group’s participation. In Pacific Gas & Electric v. Public Utilities Commission of California, the Court found unconstitutional California’s requirement that a public utility company include a third-party newsletter, in addition to its own newsletter, in the envelopes in which it sent bills to its customers.)

Step Two: The covered platforms exercise editorial discretion

The court then moved on to conclude that the covered platforms engage in protected editorial discretion.

The parties disputed whether the platforms were more like newspapers, and therefore entitled to a higher level of protection for their speech (Plaintiffs’ position), or common carriers, which, in their capacity as mere conduits for the speech of others, have been required to provide access on nondiscriminatory terms without raising First Amendment concerns (Defendant’s position). To support its position, Texas pointed to HB 20’s own pronouncement that the covered platforms are common carriers. The district court, though, was unmoved, and made clear that wishing does not make it so: “This Court starts from the premise that social media platforms are not common carriers.” At oral argument, Texas took the position that “the common carriage doctrine is essential to the First Amendment challenge . . . It dictates the rest of this suit in terms of the First Amendment inquiry.” Given the state’s position, the court acknowledged that it could have stopped there but decided not to, “in order to make a determination about whether social media platforms exercise editorial discretion or occupy a purgatory between common carrier and editor.”

The court found it straightforward that platforms exercise editorial discretion: they “curate both users and content to convey a message about the type of community the platform seeks to foster and, as such, exercise editorial discretion over their platform’s content.” The court pointed out that HB 20 itself is premised on the position that the covered platforms exercise just such discretion when they “silence conservative viewpoints and ideas.” In reaching its determination, the court also dealt with a claim made by both the state and a number of commentators: namely, that the platforms cannot be speaking when they curate their content because they use algorithms to do so. The court (rightly) called this argument a “distraction.”

HB 20 Violates Platforms’ First Amendment Rights

With the above analysis in place, the court went on to systematically address each of HB 20’s challenged provisions (i.e., every provision but the one prohibiting email service providers from “impeding the transmission” of third-party content).

The Must Carry Provision

  • The court returned to Tornillo, Hurley, and PG&E to determine that HB 20 impermissibly compels platforms’ speech. The statute targets the platforms’ editorial judgments, compelling them to “balance” their own judgments by including speech they otherwise would not, which was “precisely” what Tornillo, Hurley, and PG&E found unconstitutional.
  • The court also held that HB 20 requires platforms to change their speech. HB 20 prohibits “censorship” and thereby prohibits platforms from removing or including the content and users that they want. Citing Hurley, the court found this to “require[] platforms to ‘alter the expressive content of their [message].’” In a footnote, the court distinguished the social media platforms’ situation from those in two cases where the Supreme Court rejected litigants’ attempts to invoke Tornillo: PruneYard Shopping Center v. Robins (in which the Court upheld against a First Amendment challenge a California constitutional provision that compelled a private shopping center to allow a group onto its property to gather signatures for a petition, even though the shopping center’s policy forbade such activity) and Rumsfeld v. Forum for Academic & Institutional Rights (in which the Court held that a law requiring law schools to give military recruiters the same access as nonmilitary recruiters or lose certain federal funds did not compel the schools to speak, even though the schools disagreed with the military’s “Don’t Ask, Don’t Tell” policy).
  • The statute also impermissibly burdens the platforms’ speech by specifying how they may arrange user content, and it potentially prohibits them from appending disclaimers to posts.
  • The threat of lawsuits for violating the must carry provision also chills platforms’ speech rights.

Disclosure and User Complaint Procedure Provisions

  • The court found both the disclosure and user-complaint procedure requirements “inordinately burdensome given the unfathomably large number of posts on these sites and apps.” The court cited some staggering numbers: in three months in 2021, Facebook removed 8.8 million pieces of “bullying and harassment content,” 9.8 million pieces of organized “hate content,” and 25.2 million pieces of “hate speech content,” and in three months in 2021, YouTube removed 1.16 billion comments. The court recognized that it would simply be impossible for these companies to operate if every removal were appealable.
  • In addition, the disclosure requirements compel speech.
  • The requirements also chill protected speech, as do the consequences of noncompliance.

HB 20 Discriminates on the Basis of Content and Speaker

The court was not yet finished with HB 20, finding also that the law unconstitutionally discriminates on the basis of content and speaker.

  • HB 20 permits platforms to enforce their own content moderation policies for content that “directly incites criminal activity or consists of specific threats of violence targeted against a person or group because of their race, color, disability, religion, national origin or ancestry, age, sex, or status as a peace officer or judge.” Citing R.A.V. v. City of St. Paul, Minnesota, the court agreed with Plaintiffs that the State has “‘no legitimate reason to allow the platforms to enforce their policies over threats based only on . . . favored criteria but not’ other criteria like sexual orientation, military service, or union membership.”
  • Only platforms with over 50 million monthly active users are covered under HB 20. The court made it clear that it thought that number was no accident when it pointed out that one state senator unsuccessfully proposed lowering the threshold to 25 million monthly users in an effort to include Parler and Gab, which are both popular with conservatives. “The record in this case confirms that the Legislature intended to target large social media platforms perceived as being biased against conservative views and the State’s disagreement with the social media platforms’ editorial discretion over their platforms. The evidence thus suggests that the State discriminated between social media platforms (or speakers) for reasons that do not stand up to scrutiny.”

HB 20 Does Not Even Survive Intermediate Scrutiny (Let Alone the Strict Scrutiny It Triggers)

  • The court held that the state failed to articulate a legitimate interest served by HB 20. The State offered two interests served by the law: (1) the “Free and unobstructed use of public forums and the information conduits provided by common carriers” and (2) “providing individual citizens effective protection against discriminatory practices, including discriminatory practices by common carriers.” The court thought (1) failed for several reasons, the most obvious being that platforms are not common carriers and “[e]ven if they were, the State provides no convincing support for recognizing a governmental interest in the free and unobstructed use of common carriers’ information conduits.” Additionally, the court pointed out that the Supreme Court rejected that same government interest in Tornillo. As for (2), the court cited Hurley once again. “Even given a state’s general interest in anti-discrimination laws, ‘forbidding acts of discrimination’ is ‘a decidedly fatal objective’ for the First Amendment’s ‘free speech commands.’” (Obviously, the state can and does prohibit certain acts of discrimination—even when those acts are engaged in by private associations, as the Court recognized in Roberts v. United States Jaycees—so as written, this is not quite true.)
  • The court also held that HB 20 was not narrowly tailored. Here, the court referenced the Northern District of Florida’s language when enjoining a similar law: “[T]he [Florida district court] colorfully described [the statute] as ‘an instance of burning the house to roast a pig.’ This Court could not do better in describing HB 20.” The court found that, rather than creating its own unmoderated platform (something Texas of course could do), the state targeted the large platforms because its true intention was to go after those platforms it believed to be “West Coast oligarchs” who were “silenc[ing] conservative viewpoints and ideas.”

As in Florida, the court found that no part of the challenged law could be severed and survive.

Concluding Thoughts

The court’s opinion is a complete win for platforms (and, frankly, users). As the preliminary injunctions in this case and in the Florida case make clear, the First Amendment and Section 230 will continue to pose obstacles to the viability of state laws like HB 20.