Section 230 Safe Harbor

A recent decision from a federal court in New York highlights the limits of the protection that social media users enjoy under Section 230 of the Communications Decency Act (CDA). The case involves Joy Reid, the popular host of MSNBC’s AM Joy, who has more than two million Twitter and Instagram followers, and an interaction between a young Hispanic boy and a woman named Roslyn La Liberte, who was wearing a “Make America Great Again” (MAGA) hat, at a Simi Valley, California, City Council meeting.

The case centers on a single re-tweet by Reid and two of her Instagram posts.

Here is Reid’s re-tweet.

It says: “You are going to be the first deported” “dirty Mexican” Were some of the things they yelled at this 14 year old boy. He was defending immigrants at a rally and was shouted down.   

Spread this far and wide this woman needs to be put on blast.

Here is Reid’s first Instagram post from the same day.

It says: joyannreid He showed up to a rally to defend immigrants. … She showed up too, in her MAGA hat, and screamed, “You are going to be the first deported” … “dirty Mexican!” He is 14 years old. She is an adult. Make the picture black and white and it could be the 1950s and the desegregation of a school. Hate is real, y’all. It hasn’t even really gone away.

A recent Second Circuit decision makes clear that the safe harbor that social media and other Internet companies enjoy under Section 230 of the Communications Decency Act broadly applies to a wide variety of claims.

When you think about the Section 230 safe harbor, don’t just think defamation or other similar state law claims. Consider whether the claim—be it federal, state, local, or foreign—seeks to hold a party that publishes third-party content on the Internet responsible for publishing the content. If, after stripping it all down, this is the crux of the cause of action, the safe harbor should apply (absent a few statutory exclusions discussed below). The safe harbor should apply even if the party uses its discretion as a publisher in deciding how best to target its audience or to display the information provided by third parties.

In 2016, Facebook was sued by the estates of four U.S. citizens who died in terrorist attacks in Israel and one who narrowly survived but was grievously injured. The plaintiffs claimed that Facebook should be held liable under the federal Anti-Terrorism Act and the Justice Against Sponsors of Terrorism Act, which provide a private right of action against those who aid and abet acts of international terrorism, conspire in furtherance of acts of terrorism, or provide material support to terrorist groups. The plaintiffs also asserted claims arising under Israeli law.

As we noted in our recent post on the Second Circuit case Herrick v. Grindr, LLC, Section 230 of the Communications Decency Act (CDA) continues to provide immunity to online intermediaries from liability for user content, despite pressure from courts and legislatures seeking to chip away at this safe harbor. The D.C. Circuit case Marshall’s Locksmith Service Inc. v. Google, LLC serves as another example of Section 230’s resiliency.

In Marshall’s Locksmith, the D.C. Circuit affirmed the dismissal of claims brought by 14 locksmith companies against search engine operators Google, Microsoft, and Yahoo! for allegedly conspiring to allow “scam locksmiths” to inundate the online search results page in order to extract additional advertising revenue.

The scam locksmiths at issue published websites targeting heavily populated locations around the country to trick potential customers into believing that they were local companies. These websites provided either a fictitious address or no address at all, and falsely claimed that they were local businesses. The plaintiffs asserted various federal and state law claims against the search engine operators relating to false advertising, conspiracy and fraud based on their activities in connection with the scam locksmiths’ websites.

A California Superior Court’s recent ruling in Murphy v. Twitter held that Section 230 of the Communications Decency Act shielded Twitter from liability for suspending and banning a user’s account for violating the platform’s policies. As we have previously noted, Section 230 has come under pressure in recent years from both courts and legislatures. But we have also examined other cases demonstrating Section 230’s staying power. The ruling in Murphy again shows that, despite the challenges facing Section 230, the statute continues to serve its broader purpose of protecting social media platforms from the actions of their users while allowing those platforms to monitor and moderate their services.

From January to mid-October 2018, Meghan Murphy posted a number of tweets that misgendered and criticized transgender Twitter users. After first temporarily suspending her account, Twitter ultimately banned her from the platform for violating its Hateful Conduct Policy. Twitter had amended this policy in late October 2018 to specifically include targeted abuse and misgendering of transgender people.

As we have frequently noted on Socially Aware, Section 230 of the Communications Decency Act protects social media sites and other online platforms from liability for user-generated content. Sometimes referred to as “the law that gave us the modern Internet,” Section 230 has provided robust immunity for website operators since it was enacted in 1996. As we have also written previously, however, the historically broad Section 230 immunity has come under pressure in recent years, with both courts and legislatures chipping away at this important safe harbor.

Now, some lawmakers are proposing legislation to narrow the protections that Section 230 affords to website owners. They assert that changes to the section are necessary to protect Internet users from dangers such as sex trafficking and the doctored videos known as “deep fakes.”

The House Intelligence Committee Hearing

Recently, a low-tech fraudulent video that made House Speaker Nancy Pelosi’s speech appear slurred was widely shared on social media, inspiring Hany Farid, a computer-science professor and digital-forensics expert at the University of California, Berkeley, to tell The Washington Post, “[T]his type of low-tech fake shows that there is a larger threat of misinformation campaigns—too many of us are willing to believe the worst in people that we disagree with.”

Often hailed as the law that gave us the modern Internet, Section 230 of the Communications Decency Act generally protects online platforms from liability for content posted by third parties. Many commentators, including us here at Socially Aware, have noted that Section 230 has faced significant challenges in recent years. But Section 230 has proven resilient (as we previously noted here and here), and that resiliency was again demonstrated by the Second Circuit’s recent opinion in Herrick v. Grindr, LLC.

As we noted in our prior post following the district court’s order dismissing plaintiff Herrick’s claims on Section 230 grounds, the case arose from fake Grindr profiles allegedly set up by Herrick’s ex-boyfriend. According to Herrick, these fake profiles resulted in Herrick facing harassment from over 1,000 strangers who showed up at his door over the course of several months seeking violent sexual encounters.

As we have noted previously, the California Court of Appeal’s Hassell v. Bird decision in 2016 upholding an injunction requiring Yelp to remove certain user reviews was discouraging to social media companies and other online intermediaries, as well as to fans of Section 230 of the Communications Decency Act and proponents of Internet free speech generally. The recent California Supreme Court decision reversing the Court of Appeal was, therefore, met with considerable relief by many in the Internet community.

But while the California Supreme Court’s decision is undoubtedly a significant development, it would be premature for Section 230 fans to break out the champagne; the “most important law protecting Internet speech” remains under attack from many directions, and this recent decision is far from definitive. But before getting into the details of the Hassell v. Bird opinion, let’s step back and consider the context in which the case arose.

Before Section 230: A Wild, Wild Web

A fundamental issue for social media platforms and other online intermediaries, including review sites like Yelp, is whether a company may be held liable when its customers engage in bad behavior, such as posting defamatory content or content that infringes the IP rights of third parties. Imagine if Facebook, Twitter, YouTube, and Yelp were potentially liable for defamation every time one of their users said something nasty (and untrue) about another user on their platforms. It would be hard to imagine the Internet as we currently know it existing if that were the case.

Does a search engine operator have to delist websites hosting, without authorization, your trade secret materials or other intellectual property? The answer may depend on where you sue—just ask Google. The U.S. District Court for the Northern District of California recently handed the company a victory over plaintiff Equustek Solutions Inc. in what has turned into an international battle where physical borders can have very real consequences on the Internet.

The dispute began when a rival company, Datalink, allegedly misappropriated Equustek’s trade secrets in developing competing products. Equustek also alleged that Datalink misled customers who thought they were buying Equustek products. In 2012, Equustek obtained numerous court orders in Canada against Datalink. Datalink refused to comply, and a Canadian court issued an arrest warrant for the primary defendant, who has yet to be apprehended.

Section 230 of the Communications Decency Act continues to act as one of the strongest legal protections that social media companies have to avoid being saddled with crippling damage awards based on the misdeeds of their users.

The strong protections afforded by Section 230(c) were recently reaffirmed by Judge Caproni of the Southern District of New York in Herrick v. Grindr. The case involved a dispute between the social networking platform Grindr and an individual who was maliciously targeted through the platform by his former lover. For the unfamiliar, Grindr is a mobile app directed to gay and bisexual men that, using geolocation technology, helps them to connect with other users who are located nearby.

Plaintiff Herrick alleged that his ex-boyfriend set up several fake profiles on Grindr that claimed to be him. Over a thousand users responded to the impersonating profiles. Herrick’s ex-boyfriend, pretending to be Herrick, would then direct the men to Herrick’s workplace and home. The ex-boyfriend, still posing as Herrick, would also tell these would-be suitors that Herrick had certain rape fantasies, that he would initially resist their overtures, and that they should attempt to overcome Herrick’s initial refusals. The impersonating profiles were reported to Grindr (the app’s operator), but Herrick claimed that Grindr did not respond, other than to send an automated message.

Happy 2018 to our readers! It has become a Socially Aware tradition to start the New Year with some predictions from our editors and contributors. With smart contracts on the horizon, the Internet of Things and cryptocurrencies in the spotlight, and a number of closely watched lawsuits moving toward resolution, 2018 promises to be an exciting year in the world of emerging technology and Internet law.

Here are some of our predictions regarding tech-related legal developments over the next twelve months. As always, the views expressed are not to be attributed to Morrison & Foerster or its clients.

From John Delaney, Co-Founder and Co-Editor, Socially Aware, and Partner at Morrison & Foerster:
Regarding Web Scraping

Web scraping is an increasingly common activity among businesses (by one estimate, web-scraping bots account for as much as 46% of Internet traffic), and it is helping to fuel the “Big Data” revolution. Despite the growing popularity of web scraping, courts have generally been unsympathetic to web scrapers. Last August, however, web scrapers finally received a huge victory when the U.S. District Court for the Northern District of California enjoined LinkedIn from blocking hiQ Labs’ scraping of publicly available user profiles from the LinkedIn website in the hiQ Labs, Inc. v. LinkedIn Corp. litigation. The case is now on appeal to the Ninth Circuit. My sense is that the Ninth Circuit will reject the broad scope and rationale of the lower court’s ruling; but if the court nevertheless ultimately sides with hiQ Labs, the web scraper, the decision could be a game changer, bringing online scraping out of the shadows and perhaps spurring more aggressive uses of scraping tools and scraped data. On the other hand, if the Ninth Circuit reverses, we may see companies reexamine and perhaps curtail their scraping initiatives. Either way, 2018 promises to bring greater clarity to this murky area of the law.

Regarding the Growing Challenges for Social Media Platforms

2017 was a tough year for social media platforms. After years of positive press, immense consumer goodwill and a generally “hands off” attitude from regulators, last year saw a growing backlash against social media for a number of reasons: the continued rise of trolling, creating an ever-more toxic online environment; criticism of social media’s role in the dissemination of fake news; the growing concern over social media “filter bubbles” and “echo chambers”; and worries about the potential societal impact of social media’s algorithm-driven effectiveness in attracting and keeping a grip on our attention. Expect to see in 2018 further efforts by social media companies to get out ahead of most, if not all, of these issues, in the hopes of winning over critics and discouraging greater governmental regulation.

Regarding the DMCA Safe Harbor for Hosting of User-Generated Content

The backlash against social media noted in my prior item may also be reflected to some extent in several 2017 court decisions regarding the DMCA safe harbor shielding website operators and other online service providers from copyright damages in connection with user-generated content (and perhaps in the CDA Section 230 case law discussed by Aaron Rubin below). After nearly two decades of court decisions generally taking an ever more expansive approach to this particular DMCA safe harbor, the pendulum began to swing in the other direction in 2016, and this trend picked up steam in 2017, culminating in the Ninth Circuit’s Mavrix decision, which held that a social media platform provider’s use of volunteer curators to review user posts deprived the provider of DMCA safe harbor protection. Expect to see the pendulum continue to swing in favor of copyright owners in DMCA safe harbor decisions over the coming year.

Regarding Smart Contracts

Expect to see broader, mainstream adoption of “smart contracts,” especially in the B2B context—and perhaps litigation over smart contracts in 2019 . . . .

From Aaron Rubin, Co-Editor, Socially Aware, and Partner at Morrison & Foerster:
Regarding the CDA Section 230 Safe Harbor

We noted previously that 2016 was a particularly rough year for Section 230 of the Communications Decency Act and the immunity that the statute provides website operators against liability arising from third-party or user-generated content. Now that 2017 is in the rear view mirror, Section 230 is still standing but its future remains imperiled. We have seen evidence of Section 230’s resiliency in recent cases where courts rejected plaintiffs’ creative attempts to find chinks in the immunity’s armor by arguing, for example, that websites lose immunity when they use data analytics to direct users to content, or when they fail to warn users of potential dangers, or when they share ad revenue with content developers. Nonetheless, it is clear that the knives are still out for Section 230, including in Congress, where a number of bills are under consideration that would significantly limit the safe harbor in the name of combatting sex trafficking. I predict that 2018 will only see these efforts to rein in Section 230 increase.