Expressing concern about the spread of disinformation related to COVID-19, Federal Trade Commissioner Rohit Chopra said Congress may need “to reassess the special privileges afforded to tech platforms, especially given their vast power to curate and present content in ways that may manipulate users.” His words implicate one of our favorite topics here at Socially Aware: Section 230 of the Communications Decency Act.…
In a purported attempt to safeguard free speech, President Trump has issued an executive order, “Preventing Online Censorship,” that would eliminate the protections afforded by one of our favorite topics here at Socially Aware, Section 230 of the Communications Decency Act, which generally protects online platforms from liability for content posted by third parties. President…
Often lauded as the most important law for online speech, Section 230 of the Communications Decency Act (CDA) does not just protect popular websites like Facebook, YouTube and Google from defamation and other claims based on third-party content. It is also critically important to spyware and malware protection services that offer online filtration tools.
Section 230(c)(2) grants broad immunity to any interactive computer service that blocks content it considers to be “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.” Under a plain reading of the statute, Section 230(c)(2) clearly offers broad protection. With respect to what the phrase “otherwise objectionable” was intended to capture, however, the protections are less clear.…
Continue Reading Computer Service Providers Face Implied Limits on CDA Immunity
A federal district court in Illinois allowed claims for vicarious and direct copyright infringement to proceed against an employee of the Chicago Cubs Baseball Club for retweeting a third-party tweet containing the plaintiff’s copyrighted material. Read the opinion.
In a move that might be part of a settlement that YouTube has entered into with the Federal Trade Commission, the video-sharing site said it will ban “targeted” advertisements on videos likely to be watched by children. Because targeted ads rely on information collected about the platform’s users, displaying such ads to children…
A recent decision from a federal court in New York highlights the limits of the protections social media users enjoy under Section 230 of the Communications Decency Act (CDA). The case involves Joy Reid, the popular host of MSNBC’s AM Joy, who has more than two million Twitter and Instagram followers, and the interaction between a young Hispanic boy and a “Make America Great Again” (MAGA) hat–wearing woman named Roslyn La Liberte at a Simi Valley, California, City Council meeting.
The case centers on a single re-tweet by Reid and two of her Instagram posts.
Here is Reid’s re-tweet.
It says: “You are going to be the first deported” “dirty Mexican” Were some of the things they yelled at this 14 year old boy. He was defending immigrants at a rally and was shouted down.
Spread this far and wide this woman needs to be put on blast.
Here is Reid’s first Instagram post from the same day.
It says: joyannreid He showed up to a rally to defend immigrants. … She showed up too, in her MAGA hat, and screamed, “You are going to be the first deported” … “dirty Mexican!” He is 14 years old. She is an adult. Make the picture black and white and it could be the 1950s and the desegregation of a school. Hate is real, y’all. It hasn’t even really gone away.…
Continue Reading The Joys and Dangers of Tweeting: A CDA Immunity Update
A recent Second Circuit decision makes clear that the safe harbor that social media and other Internet companies enjoy under Section 230 of the Communications Decency Act applies to a wide variety of claims.
When you think about the Section 230 safe harbor, don’t just think of defamation or similar state law claims. Consider whether the claim—be it federal, state, local, or foreign—seeks to hold a party that publishes third-party content on the Internet responsible for publishing that content. If, after stripping the claim down, this is its crux, the safe harbor should apply (absent a few statutory exclusions discussed below). The safe harbor should apply even if the party uses its discretion as a publisher in deciding how best to target its audience or to display the information provided by third parties.
In 2016, Facebook was sued by the estates of four U.S. citizens who died in terrorist attacks in Israel and one who narrowly survived but was grievously injured. The plaintiffs claimed that Facebook should be held liable under the federal Anti-Terrorism Act and the Justice Against Sponsors of Terrorism Act, which provide a private right of action against those who aid and abet acts of international terrorism, conspire in furtherance of acts of terrorism, or provide material support to terrorist groups. The plaintiffs also asserted claims arising under Israeli law.…
Continue Reading CDA Section 230 Immunizes Platform From Liability for Friend and Content Suggestion Algorithms
As we noted in our recent post on the Second Circuit case Herrick v. Grindr, LLC, Section 230 of the Communications Decency Act (CDA) continues to provide immunity to online intermediaries from liability for user content, despite pressure from courts and legislatures seeking to chip away at this safe harbor. The D.C. Circuit case Marshall’s Locksmith Service Inc. v. Google, LLC serves as another example of Section 230’s resiliency.
In Marshall’s Locksmith, the D.C. Circuit affirmed the dismissal of claims brought by 14 locksmith companies against search engine operators Google, Microsoft and Yahoo! for allegedly conspiring to allow “scam locksmiths” to inundate the online search results page in order to extract additional advertising revenue.
The scam locksmiths at issue published websites targeting heavily populated areas around the country, listing either a fictitious address or no address at all and falsely claiming to be local businesses in order to trick potential customers. The plaintiffs asserted various federal and state law claims against the search engine operators for false advertising, conspiracy and fraud based on the operators’ activities in connection with the scam locksmiths’ websites.…
Continue Reading D.C. Circuit Holds that Section 230 Locks Out Locksmiths
A California Superior Court’s recent ruling in Murphy v. Twitter held that Section 230 of the Communications Decency Act shielded Twitter from liability for suspending and banning a user’s account for violating the platform’s policies. As we have previously noted, Section 230 has come under pressure in recent years from both courts and legislatures. But we have also examined other cases demonstrating Section 230’s staying power. The ruling in Murphy again shows that, despite the challenges facing Section 230, the statute continues to serve its broader purpose of protecting social media platforms from liability for the actions of their users while allowing those platforms to monitor and moderate their services.
From January to mid-October 2018, Meghan Murphy posted a number of tweets that misgendered and criticized transgender Twitter users. After first temporarily suspending her account, Twitter ultimately banned her from the platform for violating its Hateful Conduct Policy. Twitter had amended this policy in late October 2018 to specifically include targeted abuse and misgendering of transgender people.…
Continue Reading California Court Finds Section 230 Protects Decision to Suspend and Ban Twitter Account
As we have frequently noted on Socially Aware, Section 230 of the Communications Decency Act protects social media sites and other online platforms from liability for user-generated content. Sometimes referred to as “the law that gave us the modern Internet,” Section 230 has provided robust immunity for website operators since it was enacted in 1996. As we have also written previously, however, the historically broad Section 230 immunity has come under pressure in recent years, with both courts and legislatures chipping away at this important safe harbor.
Now, some lawmakers are proposing legislation to narrow the protections that Section 230 affords to website owners. They assert that changes to the section are necessary to protect Internet users from dangers such as sex-trafficking and the doctored videos known as “deep fakes.”
The House Intelligence Committee Hearing
Recently, a low-tech fraudulent video that made House Speaker Nancy Pelosi’s speech appear slurred was widely shared on social media, inspiring Hany Farid, a computer-science professor and digital-forensics expert at the University of California, Berkeley, to tell The Washington Post, “this type of low-tech fake shows that there is a larger threat of misinformation campaigns—too many of us are willing to believe the worst in people that we disagree with.”…
Continue Reading Legislators Propose Narrowing § 230’s Protections