A federal district court in Illinois allowed claims for vicarious and direct copyright infringement to proceed against an employee of the Chicago Cubs Baseball Club for retweeting a third-party tweet containing the plaintiff’s copyrighted material. Read the opinion.
Here at Socially Aware we covered a wide range of issues in 2019. We reviewed an opinion reminding us that user-generated content posted on social media platforms is not necessarily freely available for use in other contexts, and a rare instance of a federal district court holding that a browsewrap agreement was enforceable.…
A recent Second Circuit decision makes clear that the safe harbor that social media and other Internet companies enjoy under Section 230 of the Communications Decency Act broadly applies to a wide variety of claims.
When you think about the Section 230 safe harbor, don’t just think defamation or other similar state law claims. Consider whether the claim—be it federal, state, local, or foreign—seeks to hold a party that publishes third-party content on the Internet responsible for publishing the content. If, after stripping it all down, this is the crux of the cause of action, the safe harbor should apply (absent a few statutory exclusions discussed below). The safe harbor should apply even if the party uses its discretion as a publisher in deciding how best to target its audience or to display the information provided by third parties.
In 2016, Facebook was sued by the estates of four U.S. citizens who died in terrorist attacks in Israel and by one victim who narrowly survived but was grievously injured. The plaintiffs claimed that Facebook should be held liable under the federal Anti-Terrorism Act and the Justice Against Sponsors of Terrorism Act, which provide a private right of action against those who aid and abet acts of international terrorism, conspire in furtherance of acts of terrorism, or provide material support to terrorist groups. The plaintiffs also asserted claims arising under Israeli law.…
A California Superior Court’s recent ruling in Murphy v. Twitter held that Section 230 of the Communications Decency Act shielded Twitter from liability for suspending and banning a user’s account for violating the platform’s policies. As we have previously noted, Section 230 has come under pressure in recent years from both courts and legislatures. But we have also examined other cases demonstrating Section 230’s staying power. The ruling in Murphy again shows that, despite the challenges facing Section 230, the statute continues to serve its broader purpose of protecting social media platforms from the actions of their users while allowing those platforms to monitor and moderate their services.
From January to mid-October 2018, Meghan Murphy posted a number of tweets that misgendered and criticized transgender Twitter users. After first temporarily suspending her account, Twitter ultimately banned her from the platform for violating its Hateful Conduct Policy. Twitter had amended this policy in late October 2018 to specifically include targeted abuse and misgendering of transgender people.…
Often hailed as the law that gave us the modern Internet, Section 230 of the Communications Decency Act generally protects online platforms from liability for content posted by third parties. Many commentators, including us here at Socially Aware, have noted that Section 230 has faced significant challenges in recent years. But Section 230 has proven resilient (as we previously noted here and here), and that resiliency was again demonstrated by the Second Circuit’s recent opinion in Herrick v. Grindr, LLC.
As we noted in our prior post following the district court’s order dismissing plaintiff Herrick’s claims on Section 230 grounds, the case arose from fake Grindr profiles allegedly set up by Herrick’s ex-boyfriend. According to Herrick, these fake profiles resulted in Herrick facing harassment from over 1,000 strangers who showed up at his door over the course of several months seeking violent sexual encounters.…
In 2019, the Court of Justice of the European Union (CJEU) is expected to clarify one of the key open issues in EU copyright law: the extent to which online platforms such as YouTube can be held liable for copyright infringement caused by user-generated content, i.e., content uploaded to the Internet by users, such as music, videos, literature, photos, or streams of live events such as concerts. The CJEU’s decisions are eagerly awaited both by media and copyright owners and by online platform operators, and will mark yet another stage in the ongoing battle of the creative industries against copyright infringement in the online world.
In September 2018, the German Federal Court of Justice (Bundesgerichtshof, BGH) suspended proceedings in a widely-publicized case concerning YouTube’s liability for copyright infringing user-uploaded content and referred a series of questions regarding the interpretation of several EU copyright provisions to the CJEU for a preliminary ruling. A few days later, the BGH also suspended proceedings in five other high-profile cases concerning the liability of the file hosting service uploaded.net for user files containing copyright infringing content and submitted the same questions again to the CJEU.
Previous CJEU rulings have addressed both the application of the safe harbor principle set out in EU E-Commerce Directive 2000/31/EC, which shields hosting providers from liability for hosted unlawful third-party content of which they have no actual knowledge (see, for example, eBay/L’Oreal; Netlog/SABAM; and Scarlet/SABAM), and, separately, the extent to which hosting of, or linking to, copyright infringing third-party content constitutes copyright infringement under the EU Copyright Directive (see GS Media/Sanoma; Filmspeler; and The Pirate Bay). But it remains unclear under which conditions the providers of online platforms that store and make available user-generated content can rely on the safe harbor privilege for hosting providers to avoid liability, or whether they must not only take down infringing content once they obtain knowledge of it but also compensate the rights holders of such content for damages for copyright infringement.
The questions that the BGH submitted to the CJEU aim to clarify these uncertainties by bringing together the different requirements established by previous CJEU rulings for (i) affirming a direct copyright infringement by online platform providers under the EU Copyright Directive and (ii) denying the application of the safe harbor privilege, along with the legal consequences of such a denial (such as the extent of liability for damages). The CJEU will have to consider the differences between the YouTube and uploaded.net business models, and will hopefully provide much clearer guidelines on key issues such as:
- to what extent providers of online services may engage with the user content they host;
- which activities will trigger liability for copyright infringement irrespective of actual knowledge of a specific infringement; and
- whether providers must actively monitor content uploaded by users for copyright infringements (e.g., by using state-of-the-art filter technologies) to avoid damage claims by rights holders.
In addition, we expect these cases to affect the interpretation of the new Art. 13 of the revised EU Copyright Directive, which will likely be adopted by the EU legislative institutions in the second quarter of 2019. The current trilogue negotiations among the EU institutions indicate that, under the new Art. 13, providers of online content sharing services will be directly liable for copyright infringements in content uploaded to their platforms by users and will not be granted safe harbor under the EU E-Commerce Directive. Providers would then have to ensure that content for which they have not obtained a license from the respective rights holders cannot be displayed on their platforms. This means that providers would have to monitor all content files at the time of upload, making filter technology mandatory for most platforms (see our previous Client Alert on the draft amendment to the EU Copyright Directive).
As Socially Aware readers know, social media is transforming the way companies interact with consumers. Learn how to make the most of these online opportunities while minimizing your company’s legal risks at Practising Law Institute’s (PLI) 2018 Social Media conference, to be held in San Francisco on Thursday, February 1st, and in New…
If your company operates a website or blog that hosts user-generated content, and has yet to register an agent for receipt of copyright infringement notices under the U.S. Copyright Office’s new agent designation system, it’s time to light a fire. Failure to do so could significantly increase your company’s copyright liability exposure in connection with such hosted content.
Here’s what you need to know:
Under the Digital Millennium Copyright Act’s (DMCA) Section 512(c) safe harbor, website operators and other online service providers that comply with the eligibility requirements are shielded from copyright damages in connection with their hosting of infringing content uploaded by service users.
This powerful safe harbor has played a major role in the success of Facebook, Instagram, YouTube and other U.S. social media and Internet sites. But it also protects brands that host on their websites text, photos and videos uploaded by their customers.…
We discussed last year the trend toward companies seeking to monetize user-generated content. A recent Central District of California decision in Greg Young Publishing, Inc. v. Zazzle, Inc. serves as an important reminder of the serious risks that can arise from seeking to commercially exploit such content.
Under the Digital Millennium Copyright Act’s (DMCA) Section 512(c) safe harbor, online service providers that comply with the eligibility requirements are shielded from copyright damages in connection with their hosting of infringing content uploaded by service users. This powerful safe harbor has played a major role in the success of Facebook, Instagram, YouTube and other U.S. social media and Internet sites.
With over one billion websites on the Internet, and 211 million items of online content created every minute, it should come as no surprise that content curation is one of the hottest trends in the Internet industry. We are overwhelmed with online content, and we increasingly rely on others to separate the good from the bad so that we can make more efficient use of our time spent surfing the web.
Consistent with this trend, many websites that host user-generated content are now focused on filtering out content that is awful, duplicative, off-topic, or otherwise of little interest to site visitors. And these sites often find that humans—typically passionate volunteers from the sites’ user communities—are better than algorithms at sorting the wheat from the chaff.
Of course, any website that deals with user-generated content needs to consider potential copyright liability arising from such content. We’ve discussed in past Socially Aware blog posts the critical importance of Section 512(c) of the Digital Millennium Copyright Act (the DMCA) to the success of YouTube, Facebook and other online platforms that host user-generated content. By providing online service providers with immunity from monetary damages in connection with the hosting of content at the direction of users, Section 512(c) has fueled the growth of the U.S. Internet industry.…