Section 230 Safe Harbor

A state appeals court in Miami held that a judge need not recuse herself from a case being argued by a lawyer with whom the judge is merely Facebook “friends.”

Bills in both houses of Congress propose amending Section 230 of the Communications Decency Act to clarify that it doesn’t insulate website operators from liability

The latest issue of our Socially Aware newsletter is now available here.

In this edition, we explore the threat to U.S. jobs posed by rapid advances in emerging technologies; we examine a Federal Trade Commission report on how companies engaging in cross-device tracking can stay on the right side of the law; we take

The latest issue of our Socially Aware newsletter is now available here.

In this edition, we examine a spate of court decisions that appear to rein in the historically broad scope of the Communications Decency Act’s Section 230 safe harbor for website operators; we outline ten steps companies can take to be better prepared for

We have been monitoring a trend of cases narrowing the immunity provided to website operators under Section 230 of the Communications Decency Act (CDA). A recent decision by a state court in Georgia, however, demonstrates that Section 230 continues to be applied expansively in at least some cases.

The case, Maynard v. McGee, arose from an automobile collision in Clayton County, Georgia. Christal McGee, the defendant, had allegedly been using Snapchat’s “speed filter” feature, which tracks a car’s speed in real time and superimposes the speed on a mobile phone’s camera view. According to the plaintiffs, one of whom had been injured in the collision, McGee was using the speed filter when the accident occurred, with the intention of posting a video on Snapchat showing how fast she was driving. The plaintiffs sued McGee and Snapchat for negligence, and Snapchat moved to dismiss based on the immunity provided by Section 230.

The plaintiffs alleged that Snapchat was negligent because it knew its users would use the speed filter “in a manner that might distract them from obeying traffic or safety laws” and that “users might put themselves or others in harm’s way in order to capture a Snap.” To demonstrate that Snapchat had knowledge, the plaintiffs pointed to a previous automobile collision that also involved the use of Snapchat’s speed filter.  The plaintiffs claimed that “[d]espite Snapchat’s actual knowledge of the danger from using its product’s speed filter while driving at excessive speeds, Snapchat did not remove or restrict access to the speed filter.”


2016 has been a tough year for a lot of reasons, most of which are outside the scope of this blog (though if you’d like to hear our thoughts about Bowie, Prince or Leonard Cohen, feel free to drop us a line). But one possible victim of this annus horribilis is well within the ambit of Socially Aware: Section 230 of the Communications Decency Act (CDA).

Often hailed as the law that gave us the modern Internet, CDA Section 230 provides immunity against liability for website operators for certain claims arising from third-party or user-generated content. The Electronic Frontier Foundation has called Section 230 “the most important law protecting Internet speech,” and companies including Google, Yelp and Facebook have benefited from the protections offered by the law, which was enacted 20 years ago.

But it’s not all sunshine and roses for Internet publishers and Section 230, particularly over the past 18 months. Plaintiffs are constantly looking for chinks in Section 230’s armor and, in an unusually large number of recent cases, courts have held that Section 230 did not apply, raising the question of whether the historical trend towards broadening the scope of Section 230 immunity may now be reversing. This article provides an overview of recent cases that seem to narrow the scope of Section 230.

As we noted in our recent post on the Ninth Circuit case Kimzey v. Yelp! Inc., in the right circumstances, Section 230 of the Communications Decency Act (CDA) still provides robust protection against liability for website operators despite the unusually large number of decisions this year seemingly narrowing the scope of the statute. Defendants notched another Section 230 win recently in Manchanda v. Google, a case in the Southern District of New York. The case began in May 2016 when Rahul Manchanda, an attorney, filed a complaint alleging that Google, Yahoo and Microsoft harmed his reputation by indexing certain websites that described him in negative terms.

Manchanda asserted various claims against the three defendants, including defamation, libel, slander, tortious interference with contract, breach of fiduciary duty, breach of the duty of loyalty, unfair trade practices, false advertising, unlawful trespass, civil RICO, unjust enrichment, intentional infliction of emotional distress, negligent infliction of emotional distress and trademark infringement. Manchanda sought injunctive relief requiring the defendants to “de-index or remove the offending websites from their search engines” in addition to damages.

The court made quick work of dismissing most of Manchanda’s claims on Section 230 grounds, emphasizing that the CDA “immunizes search engines from civil liability for reputational damage resulting from third-party content that they aggregate and republish.” The court went on to note that “[t]his immunity attaches regardless of the specific claim asserted against the search engine, so long as the claim arises from the publication or distribution of content produced by a third party and the alleged injury involves damage to a plaintiff’s reputation based on that content.”

"Unlike" on a screen. More>>

2016 has been a challenging year for Section 230 of the Communications Decency Act (CDA) and the website operators who depend on it for protection against liability stemming from user-generated content. An unusually large number of cases this year have resulted in decisions holding that the defendant website operators were not entitled to immunity under Section 230. For example, as we’ve discussed recently, in Hassell v. Bird, the California Court of Appeal held that Section 230 did not prevent the court from ordering Yelp to remove from its website allegedly defamatory reviews posted by users, even though Yelp was not a party in the underlying defamation suit.

We are working on an article surveying some of the recent cases holding that Section 230 did not apply. But in the meantime, it is important to remember that Section 230 remains a powerful shield against liability and that defendants continue to wield it successfully in many cases. The Ninth Circuit’s recent decision in Kimzey v. Yelp is one such case.

Kimzey arose from two negative Yelp reviews that user “Sarah K” posted in September 2011 about Douglas Kimzey’s locksmith business in the Seattle area. Sarah K’s reviews were extremely negative and rated Kimzey one out of five stars in Yelp’s multiple-choice star rating system. In all caps, she warned Yelpers that “THIS WAS BY FAR THE WORST EXPERIENCE I HAVE EVER ENCOUNTERED WITH A LOCKSMITH. DO NOT GO THROUGH THIS COMPANY . . . CALL THIS BUSINESS AT YOUR OWN RISK.”

"Unlike" on a screen. More>>

A recent California court decision involving Section 230 of the Communications Decency Act (CDA) is creating considerable concern among social media companies and other website operators.

As we’ve discussed in past blog posts, CDA Section 230 has played an essential role in the growth of the Internet by shielding website operators from defamation and

Here’s how Twitter is loosening up its 140-character limit.

The federal government will now check the social media history of prospective employees before granting them security clearance.

One expert says C-level executives shouldn’t entrust millennials with their companies’ social media feeds.

Federal court refuses to dismiss a lawsuit against Google for allegedly removing sites

Today’s companies compete not only for dollars but also for likes, followers, views, tweets, comments and shares. “Social currency,” as some researchers call it, is becoming increasingly important, and companies are investing heavily in building their social media fan bases. In some cases, this commitment of time, money and resources has resulted in staggering success. Coca-Cola, for example, has amassed over 96 million likes on its Facebook page, and LEGO’s YouTube videos have been played over 2 billion times.

With such impressive statistics, there is no question that a company’s social media presence and the associated pages and profiles can be highly valuable business assets, providing an important means for disseminating content and connecting with customers. But how much control does a company really have over these social media assets? What recourse would be available if a social media platform decided to delete a company’s page or migrate its fans to another page?

The answer may be not very much. Over the past few years, courts have repeatedly found in favor of social media platforms in a number of cases challenging the platforms’ ability to delete or suspend accounts and to remove or relocate user content.

Legal Show-Downs on Social Media Take-Downs

In a recent California case, Lewis v. YouTube, LLC, YouTube removed plaintiff Jan Lewis’s account based on allegations that she had artificially inflated view counts in violation of YouTube’s Terms of Service. YouTube eventually restored Lewis’s account and videos, but not the view counts or comments that her videos had generated prior to the account’s suspension.

Lewis sued YouTube for breach of contract, alleging that YouTube had deprived her of her reasonable expectations under the Terms of Service that her channel would be maintained and would continue to reflect the same number of views and comments. She sought damages as well as specific performance to compel YouTube to restore her account to its original condition.

The court first held that Lewis could not show damages because the YouTube Terms of Service contained a limitation of liability provision that disclaimed liability for any omissions relating to content. The court also held that Lewis was not entitled to specific performance because nothing in the Terms of Service required YouTube to maintain particular content or to display view counts or comments. Accordingly, the court affirmed dismissal of Lewis’s complaint.

In a similar case, Darnaa LLC v. Google, Inc., Darnaa, a singer, posted a music video on YouTube. Again, due to allegations of view count inflation, YouTube removed the video and relocated it to a different URL, disclosing on the original page that the video had been removed for violating its Terms of Service. Darnaa sued for breach of the covenant of good faith and fair dealing, interference with prospective economic advantage and defamation. In an email submitted with the complaint, Darnaa’s agent explained that she had launched several large campaigns (each costing $250,000 to $300,000) to promote the video and that the original link was already embedded in thousands of websites and blogs. Darnaa sought damages as well as an injunction to prevent YouTube from removing the video or changing its URL.

The court dismissed all of Darnaa’s claims because YouTube’s Terms of Service require lawsuits to be filed within one year and Darnaa had filed her case too late. In its discussion, however, the court made several interesting points. In considering whether YouTube’s Terms of Service were unconscionable, the court held that, although the terms are by nature a “contract of adhesion,” the level of procedural unconscionability was slight, since the plaintiff could have publicized her videos on a different website. Further, in ruling that the terms were not substantively unconscionable, the court pointed out that “[b]ecause YouTube offers its hosting services free of charge, it is reasonable for YouTube to retain broad discretion over [its] services.”

Although the court ultimately dismissed Darnaa’s claims based on the failure to timely file the suit, the decision was not a complete victory for YouTube. The court granted leave to amend to give Darnaa the opportunity to plead facts showing that she was entitled to equitable tolling of the contractual limitations period. Therefore, the court went on to consider whether Darnaa’s allegations were sufficient to state a claim. Among other things, the court held that YouTube’s Terms of Service were ambiguous regarding the platform’s rights to remove and relocate user videos in its sole discretion. Thus, the court further held that if Darnaa were able to amend the complaint to avoid the consequences of the failure to timely file, then the complaint would be sufficient to state a claim for breach of the contractual covenant of good faith and fair dealing.
