
Happy 2018 to our readers! It has become a Socially Aware tradition to start the New Year with some predictions from our editors and contributors. With smart contracts on the horizon, the Internet of Things and cryptocurrencies in the spotlight, and a number of closely watched lawsuits moving toward resolution, 2018 promises to be an exciting year in the world of emerging technology and Internet law.

Here are some of our predictions regarding tech-related legal developments over the next twelve months. As always, the views expressed are not to be attributed to Morrison & Foerster or its clients.

From John Delaney, Co-Founder and Co-Editor, Socially Aware, and Partner at Morrison & Foerster:
Regarding Web Scraping

Web scraping is an increasingly common activity among businesses (by one estimate, web-scraping bots account for as much as 46% of Internet traffic), and is helping to fuel the “Big Data” revolution. Despite the growing popularity of web scraping, courts have been generally unsympathetic to web scrapers. Last August, however, web scrapers finally received a huge victory, as the U.S. District Court for the Northern District of California enjoined LinkedIn from blocking hiQ Labs’ scraping of publicly available user profiles from the LinkedIn website in the hiQ Labs, Inc. v. LinkedIn Corp. litigation. The case is now on appeal to the Ninth Circuit. My sense is that the Ninth Circuit will reject the broad scope and rationale of the lower court’s ruling; but if it nevertheless ultimately sides with hiQ Labs, the web scraper, the decision could be a game changer, bringing online scraping out of the shadows and perhaps spurring more aggressive uses of scraping tools and scraped data. On the other hand, if the Ninth Circuit reverses, we may see companies reexamining and perhaps curtailing their scraping initiatives. Either way, 2018 promises to bring greater clarity to this murky area of the law.

Regarding the Growing Challenges for Social Media Platforms

2017 was a tough year for social media platforms. After years of positive press, consumer goodwill and a generally “hands off” attitude from regulators, last year saw a growing backlash against social media platforms for a number of reasons: the continued rise of trolling, creating an ever-more toxic online environment; criticism of social media’s role in the dissemination of fake news; growing concern over social media “filter bubbles” and “echo chambers”; the increasingly sophisticated tracking of online behavior; and worries about the potential societal impact of social media’s algorithm-driven effectiveness in attracting and keeping a grip on our attention. Expect to see in 2018 further efforts by social media companies to get out ahead of most if not all of these issues, in the hopes of discouraging or at least delaying greater governmental regulation.

Regarding the DMCA Safe Harbor for Hosting of User-Generated Content

The backlash against social media noted in my prior item may also be reflected to some extent in several 2017 court decisions regarding the DMCA safe harbor shielding website operators and other online service providers from copyright damages in connection with user-generated content (and perhaps in the CDA Section 230 case law discussed by Aaron Rubin below). After nearly two decades of court decisions generally taking an ever more expansive approach to this particular DMCA safe harbor, the pendulum began to swing in the other direction in 2016, and this trend picked up steam in 2017, culminating in the Ninth Circuit’s Mavrix decision, which found that a social media platform provider’s use of volunteer curators to review user posts deprived the provider of DMCA safe harbor protection. Expect to see the pendulum continue to swing in favor of copyright owners in DMCA safe harbor decisions over the coming year.

Regarding Smart Contracts

Expect to see broader, mainstream adoption of “smart contracts,” especially in the B2B context—and perhaps litigation over smart contracts in 2019 . . . .

From Aaron Rubin, Co-Editor, Socially Aware, and Partner at Morrison & Foerster:
Regarding the CDA Section 230 Safe Harbor

We noted previously that 2016 was a particularly rough year for Section 230 of the Communications Decency Act and the immunity that the statute provides website operators against liability arising from third-party or user-generated content. Now that 2017 is in the rear view mirror, Section 230 is still standing but its future remains imperiled. We have seen evidence of Section 230’s resiliency in recent cases where courts rejected plaintiffs’ creative attempts to find chinks in the immunity’s armor by arguing, for example, that websites lose immunity when they use data analytics to direct users to content, or when they fail to warn users of potential dangers, or when they share ad revenue with content developers. Nonetheless, it is clear that the knives are still out for Section 230, including in Congress, where a number of bills are under consideration that would significantly limit the safe harbor in the name of combatting sex trafficking. I predict that 2018 will only see these efforts to rein in Section 230 increase.

In an effort to deter hate groups from tweeting sanitized versions of their messages, Twitter has begun considering account holders’ off-platform behavior when the platform evaluates whether potentially harmful tweets should be removed and account holders should be suspended or permanently banned.

In connection with Congressional efforts to deter online sex trafficking by narrowing the Communications Decency Act’s Section 230 safe harbor protection for website operators from claims arising from third-party ads and other content, a revised House bill would require proof of intent to facilitate prostitution, helping to address Internet industry concerns regarding the legislative initiative.

YouTube is making a concerted effort to remove disturbing videos featuring children in distress.

Concerned about the effect fake news could have on the democratic process, lawmakers in Ireland proposed a law that would make disseminating fake news on social media a crime.

A proposed cybersecurity law in Vietnam would require foreign tech companies like Google to establish offices and store data in that country. According to this op-ed, such a relatively late attempt to rein in Vietnam’s social media use would “most certainly trigger a popular backlash” and “seem like a retrograde move.”

A new report from clinical experts in the UK recommends that children younger than five years old should never be permitted to use digital technology without supervision.

Snapchat is rolling out a redesign that places all the messages and Stories from a user’s friends to the left side of the camera, and stories from professional social media stars and media outlets that the user follows to the right of it. But will people over the age of 30 still have no idea how to use the platform?

Instagram is testing a direct messaging app that would replace its current inbox. Called Direct, the app stands independent of the Instagram platform and, like Snapchat, opens to the user’s camera.

Artificial intelligence is allowing people to actually enjoy the moments they photograph by significantly cutting down the time it takes to share and catalog pictures.

There’s a browser extension that will hide all the potentially upsetting stories in your social media newsfeeds, but it’s not perfect. And maybe that’s a good thing.

Hmmm—in a tumultuous year, the ten most-liked posts on Instagram of 2017 all belong to Beyoncé, Cristiano Ronaldo or Selena Gomez.

In contrast, the most popular tweets of this year concern politics, tragedy and, well, chicken nuggets.

The government in Indonesia has warned the world’s biggest social media providers that they risk being banned in that country if they don’t block pornography and other content deemed obscene.

A member of the House of Lords has proposed an amendment to the U.K.’s data protection bill that would subject technology companies to “minimum standards of age-appropriate design” such as not revealing the GPS locations of users younger than 16.

A bill in Wisconsin would make impersonating someone on social media a misdemeanor.

Google’s general counsel wrote a blog post arguing that two new right-to-be-forgotten cases pending before the European Union’s top court put the search-engine company at risk of “restricting access to lawful and valuable information.”

Trucking is a $700 billion industry that stands to save billions from automation, and it will likely get self-driving vehicles on the road sooner than most people expect.

Social media platforms are often used to prey on potential sex trafficking victims, according to one FBI special agent.

A recent study shows that searching for information from unofficial sources on social media during a crisis is likely to result in the spread of misinformation and anxiety. Researchers recommend that, to quash rumors, emergency management officials should stay in regular contact with people even if they don’t have any new information.

This piece in Slate invites readers to imagine what the Internet would look like today if not for the passage of Section 230 of the Communications Decency Act, a statute that “says that in general, websites are not responsible for the things their users do or post.”

An op-ed in USA Today compares the swift spread of infectious diseases that resulted from the concentration of populations in urban areas to the swift spread of ideas that accompanied the invention of the Internet, and concludes that traditional training in critical thinking is as necessary to survive the latter as nutrition was to survive the former.

By allowing companies to provide consumers with verifiable information about things like their diversity-driven hiring practices and their products’ supply chains, blockchain is going to change the marketing industry significantly, the American Marketing Association reports.

A high school senior who was bullied in middle school created Sit With Us, the phone-based anti-bullying app that helps kids find a welcoming place to eat in their school cafeteria.

A federal appeals court in Miami held that a judge needn’t necessarily recuse herself from a case being argued by a lawyer with whom the judge is merely Facebook “friends.”

Bills in both houses of Congress propose amending Section 230 of the Communications Decency Act to clarify that it doesn’t insulate website operators from liability for violating civil or criminal child-sex-trafficking laws.

The Commonwealth Court of Pennsylvania held that an unemployment-benefits board acted appropriately when it relied, in part, on an applicant’s Facebook post to determine that the applicant was not entitled to benefits.

A Texas law makes cyberbullying punishable by as much as a year in jail and/or a fine of up to $4,000.

Google is trying to make it more difficult to find and profit from YouTube videos that contain extremist content by placing warnings on those videos and disabling the advertising on them.

A company backed by Mark Cuban is planning to create a social media platform that will anonymize its users’ identities using blockchain technology and attempt to cut down on trolls by charging people with bad reputations on the platform more for premium services.

The online publishing platform Medium is giving some of its content writers the option to put their work behind Medium’s subscription pay wall and get paid based on the number of “claps” that work gets.

Evolutionary psychologists aren’t at all surprised by the popularity of snooping on social media.

Tips for law firm marketers on how to best leverage Instagram.

Advice on how to pen the best automated out-of-office reply.

When you visit someone’s home these days, do you use the doorbell or text instead?

The latest issue of our Socially Aware newsletter is now available here.

In this edition, we explore the threat to U.S. jobs posed by rapid advances in emerging technologies; we examine a Federal Trade Commission report on how companies engaging in cross-device tracking can stay on the right side of the law; we take a look at a Second Circuit opinion that fleshes out the “repeat infringer” requirement online service providers must fulfill to qualify for the Digital Millennium Copyright Act’s safe harbors; we discuss a state court decision holding that Section 230 of the Communications Decency Act immunizes Snapchat from liability for a car wreck that was allegedly caused by the app’s “speed filter” feature; we describe a recent decision by the District Court of the Hague confirming that an app provider could be subject to the privacy laws of a country in the European Union merely by making its app available on mobile phones in that country; and we review a federal district court order requiring Google to comply with search warrants for foreign stored user data.

All this—plus an infographic illustrating how emerging technology will threaten U.S. jobs.

Read our newsletter.

The latest issue of our Socially Aware newsletter is now available here.

In this edition, we examine a spate of court decisions that appear to rein in the historically broad scope of the Communications Decency Act’s Section 230 safe harbor for website operators; we outline ten steps companies can take to be better prepared for a security breach incident; we describe the implications of the Second Circuit’s recent opinion in Microsoft v. United States regarding the U.S. government’s efforts to require Microsoft to produce email messages stored outside the country; we explore the EU’s draft regulation prohibiting geo-blocking; and we take a look at U.K. consumer protection regulators’ efforts to combat undisclosed endorsements on social media.

All this—plus an infographic highlighting the most popular social-media-post topics in 2016.

Read our newsletter.

We have been monitoring a trend of cases narrowing the immunity provided to website operators under Section 230 of the Communications Decency Act (CDA). A recent decision by a state court in Georgia, however, demonstrates that Section 230 continues to be applied expansively in at least some cases.

The case, Maynard v. McGee, arose from an automobile collision in Clayton County, Georgia.  Christal McGee, the defendant, had allegedly been using Snapchat’s “speed filter” feature, which tracks a car’s speed in real-time and superimposes the speed on a mobile phone’s camera view. According to the plaintiffs, one of whom had been injured in the collision, McGee was using the speed filter when the accident occurred, with the intention of posting a video on Snapchat showing how fast she was driving.  The plaintiffs sued McGee and Snapchat for negligence, and Snapchat moved to dismiss based on the immunity provided by Section 230.

The plaintiffs alleged that Snapchat was negligent because it knew its users would use the speed filter “in a manner that might distract them from obeying traffic or safety laws” and that “users might put themselves or others in harm’s way in order to capture a Snap.” To demonstrate that Snapchat had knowledge, the plaintiffs pointed to a previous automobile collision that also involved the use of Snapchat’s speed filter.  The plaintiffs claimed that “[d]espite Snapchat’s actual knowledge of the danger from using its product’s speed filter while driving at excessive speeds, Snapchat did not remove or restrict access to the speed filter.”


2016 has been a tough year for a lot of reasons, most of which are outside the scope of this blog (though if you’d like to hear our thoughts about Bowie, Prince or Leonard Cohen, feel free to drop us a line). But one possible victim of this annus horribilis is well within the ambit of Socially Aware: Section 230 of the Communications Decency Act (CDA).

Often hailed as the law that gave us the modern Internet, CDA Section 230 provides immunity against liability for website operators for certain claims arising from third-party or user-generated content. The Electronic Frontier Foundation has called Section 230 “the most important law protecting Internet speech,” and companies including Google, Yelp and Facebook have benefited from the protections offered by the law, which was enacted 20 years ago.

But it’s not all sunshine and roses for Internet publishers and Section 230, particularly over the past 18 months. Plaintiffs are constantly looking for chinks in Section 230’s armor and, in an unusually large number of recent cases, courts have held that Section 230 did not apply, raising the question of whether the historical trend towards broadening the scope of Section 230 immunity may now be reversing. This article provides an overview of recent cases that seem to narrow the scope of Section 230.

As we noted in our recent post on the Ninth Circuit case Kimzey v. Yelp! Inc., in the right circumstances, Section 230 of the Communications Decency Act (CDA) still provides robust protection against liability for website operators despite the unusually large number of decisions this year seemingly narrowing the scope of the statute. Defendants notched another Section 230 win recently in Manchanda v. Google, a case in the Southern District of New York. The case began in May 2016 when Rahul Manchanda, an attorney, filed a complaint alleging that Google, Yahoo and Microsoft harmed his reputation by indexing certain websites that described him in negative terms.

Manchanda asserted various claims against the three defendants, including defamation, libel, slander, tortious interference with contract, breach of fiduciary duty, breach of the duty of loyalty, unfair trade practices, false advertising, unlawful trespass, civil RICO, unjust enrichment, intentional infliction of emotional distress, negligent infliction of emotional distress and trademark infringement. Manchanda sought injunctive relief requiring the defendants to “de-index or remove the offending websites from their search engines” in addition to damages.

The court made quick work of dismissing most of Manchanda’s claims on Section 230 grounds, emphasizing that the CDA “immunizes search engines from civil liability for reputational damage resulting from third-party content that they aggregate and republish.” The court went on to note that “[t]his immunity attaches regardless of the specific claim asserted against the search engine, so long as the claim arises from the publication or distribution of content produced by a third party and the alleged injury involves damage to a plaintiff’s reputation based on that content.”


2016 has been a challenging year for Section 230 of the Communications Decency Act (CDA) and the website operators who depend on it for protection against liability stemming from user-generated content. An unusually large number of cases this year have resulted in decisions holding that the defendant website operators were not entitled to immunity under Section 230. For example, as we’ve discussed recently, in Hassell v. Bird, the California Court of Appeal held that Section 230 did not prevent the court from ordering Yelp to remove from its website allegedly defamatory reviews posted by users, even though Yelp was not a party in the underlying defamation suit.

We are working on an article surveying some of the recent cases holding that Section 230 did not apply. But in the meantime, it is important to remember that Section 230 remains a powerful shield against liability and that defendants continue to wield it successfully in many cases. The Ninth Circuit’s recent decision in Kimzey v. Yelp is one such case.

Kimzey arose from two negative Yelp reviews that user “Sarah K” posted in September 2011 about Douglas Kimzey’s locksmith business in the Seattle area. Sarah K’s reviews were extremely negative and rated Kimzey one out of five stars in Yelp’s multiple-choice star rating system. In all caps, she warned Yelpers that “THIS WAS BY FAR THE WORST EXPERIENCE I HAVE EVER ENCOUNTERED WITH A LOCKSMITH. DO NOT GO THROUGH THIS COMPANY . . . CALL THIS BUSINESS AT YOUR OWN RISK.”