In U.S. copyright law circles, one of the hottest topics of debate is the degree to which the fair use doctrine—which allows for certain unauthorized uses of copyrighted works—should protect companies building commercial products and services based on content created by others, especially where those products or services make transformative uses of that content.

This debate is likely to become even more heated in the wake of the Second Circuit Court of Appeals’ issuance last week of its long-awaited decision in the copyright dispute between Fox News and TVEyes, in which the court sided with the copyright owner over the creator of a digital “search engine” for identifying and viewing television content. But regardless of which side of the debate you are on (or if you are just standing on the sidelines), the court’s decision provides important guidance on the scope of the fair use doctrine as applied to commercial products and services.

The Dispute

Using the closed-captioning data that accompanies most television programming, TVEyes provides a searchable database of video clips. TVEyes’ subscribers—who pay $500 a month—can search the database by keyword to identify and view video clips from the service; such clips may be up to ten minutes long.
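
For readers curious about the mechanics, the following is a minimal, purely illustrative Python sketch of the kind of keyword-over-captions lookup described above. It is not drawn from the court record and is not TVEyes’ actual implementation; the data structures, sample captions, and function names are all hypothetical.

```python
# Illustrative sketch only: how a service might index closed-caption text so
# that subscribers can locate clips by keyword. All names and data below are
# hypothetical and are not taken from the TVEyes litigation.

from dataclasses import dataclass


@dataclass
class CaptionSegment:
    program: str
    start_seconds: int   # offset of the caption line within the broadcast
    text: str            # closed-caption text at that moment


# Hypothetical closed-caption data accompanying two broadcasts.
segments = [
    CaptionSegment("Evening News", 120, "Markets rallied after the announcement."),
    CaptionSegment("Evening News", 480, "The appeals court issued its ruling today."),
    CaptionSegment("Morning Show", 60, "A new ruling on fair use was handed down."),
]


def search(keyword: str, clip_length: int = 600) -> list[dict]:
    """Return clip descriptors (up to clip_length seconds, e.g., ten minutes)
    whose captions contain the keyword."""
    keyword = keyword.lower()
    return [
        {
            "program": seg.program,
            "clip_start": seg.start_seconds,
            "clip_end": seg.start_seconds + clip_length,
            "matched_caption": seg.text,
        }
        for seg in segments
        if keyword in seg.text.lower()
    ]


if __name__ == "__main__":
    for hit in search("ruling"):
        print(hit)
```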

In July 2013, Fox sued TVEyes for copyright infringement and, in August 2015, Judge Hellerstein of the U.S. District Court for the Southern District of New York held that the key features of the TVEyes service are protected under the fair use doctrine.

Following a recent decision from the Sixth Circuit, anonymous bloggers and other Internet users who post third-party copyrighted material without authorization have cause for concern: they may be unable to preserve their anonymity.

In Signature Management Team, LLC v. John Doe, the majority of a panel of the U.S. Court of Appeals for the Sixth Circuit established a new “presumption in favor of unmasking anonymous defendants when judgment has been entered for a plaintiff” in a copyright infringement case. This unmasking presumption is intended to protect the openness of judicial proceedings. Whether to unmask the defendant in such circumstances requires an examination of factors such as the plaintiff’s and public’s interest in knowing the defendant’s identity.

Happy 2018 to our readers! It has become a Socially Aware tradition to start the New Year with some predictions from our editors and contributors. With smart contracts on the horizon, the Internet of Things and cryptocurrencies in the spotlight, and a number of closely watched lawsuits moving toward resolution, 2018 promises to be an exciting year in the world of emerging technology and Internet law.

Here are some of our predictions regarding tech-related legal developments over the next twelve months. As always, the views expressed are not to be attributed to Morrison & Foerster or its clients.

From John Delaney, Co-Founder and Co-Editor, Socially Aware, and Partner at Morrison & Foerster:
Regarding Web Scraping

Web scraping is an increasingly common activity among businesses (by one estimate, web-scraping bots account for as much as 46% of Internet traffic), and it is helping to fuel the “Big Data” revolution. Despite the growing popularity of web scraping, courts have generally been unsympathetic to web scrapers. Last August, however, web scrapers finally received a huge victory when, in the hiQ Labs, Inc. v. LinkedIn Corp. litigation, the U.S. District Court for the Northern District of California enjoined LinkedIn from blocking hiQ Labs’ scraping of publicly available user profiles from the LinkedIn website. The case is now on appeal to the Ninth Circuit. My sense is that the Ninth Circuit will reject the broad scope and rationale of the lower court’s ruling; but if the court nevertheless ultimately sides with hiQ Labs, the web scraper, the decision could be a game changer, bringing web scraping out of the shadows and perhaps spurring more aggressive uses of scraping tools and scraped data. On the other hand, if the Ninth Circuit reverses, we may see companies reexamining and perhaps curtailing their scraping initiatives. Either way, 2018 promises to bring greater clarity to this murky area of the law.

Regarding the Growing Challenges for Social Media Platforms

2017 was a tough year for social media platforms. After years of positive press, immense consumer goodwill and a generally “hands off” attitude from regulators, last year saw a growing backlash against social media for a number of reasons: the continued rise of trolling, which is creating an ever-more toxic online environment; criticism of social media’s role in the dissemination of fake news; growing concern over social media “filter bubbles” and “echo chambers”; and worries about the potential societal impact of social media’s algorithm-driven effectiveness in attracting and keeping a grip on our attention. Expect to see further efforts in 2018 by social media companies to get out ahead of most, if not all, of these issues, in the hopes of winning over critics and discouraging greater governmental regulation.

Regarding the DMCA Safe Harbor for Hosting of User-Generated Content

The backlash against social media noted in my prior item may also be reflected, to some extent, in several 2017 court decisions regarding the DMCA safe harbor that shields website operators and other online service providers from copyright damages in connection with user-generated content (and perhaps in the CDA Section 230 case law discussed by Aaron Rubin below). After nearly two decades of court decisions generally taking an ever more expansive approach to this particular DMCA safe harbor, the pendulum began to swing in the other direction in 2016, and this trend picked up steam in 2017, culminating in the Ninth Circuit’s Mavrix decision, which found that a social media platform provider’s use of volunteer curators to review user posts deprived the provider of DMCA safe harbor protection. Expect to see the pendulum continue to swing in favor of copyright owners in DMCA safe harbor decisions over the coming year.

Regarding Smart Contracts

Expect to see broader, mainstream adoption of “smart contracts,” especially in the B2B context—and perhaps litigation over smart contracts in 2019 . . . .

From Aaron Rubin, Co-Editor, Socially Aware, and Partner at Morrison & Foerster:
Regarding the CDA Section 230 Safe Harbor

We noted previously that 2016 was a particularly rough year for Section 230 of the Communications Decency Act and the immunity that the statute provides website operators against liability arising from third-party or user-generated content. Now that 2017 is in the rearview mirror, Section 230 is still standing, but its future remains imperiled. We have seen evidence of Section 230’s resiliency in recent cases where courts rejected plaintiffs’ creative attempts to find chinks in the immunity’s armor by arguing, for example, that websites lose immunity when they use data analytics to direct users to content, or when they fail to warn users of potential dangers, or when they share ad revenue with content developers. Nonetheless, it is clear that the knives are still out for Section 230, including in Congress, where a number of bills are under consideration that would significantly limit the safe harbor in the name of combatting sex trafficking. I predict that 2018 will only see these efforts to rein in Section 230 increase.

In the classic rock song “Light My Fire,” ‘60s icon and the Doors’ lead singer Jim Morrison sang, “The time to hesitate is through.”

If your company operates a website or blog that hosts user-generated content, and has yet to register an agent for receipt of copyright infringement notices under the U.S. Copyright Office’s new agent designation system, it’s time to light a fire. Failure to do so could significantly increase your company’s copyright liability exposure in connection with such hosted content.

Here’s what you need to know:

Under the Digital Millennium Copyright Act’s (DMCA) Section 512(c) safe harbor, website operators and other online service providers that comply with the eligibility requirements are shielded from copyright damages in connection with their hosting of infringing content uploaded by service users.

This powerful safe harbor has played a major role in the success of Facebook, Instagram, YouTube and other U.S. social media and Internet sites. But it also protects brands that host on their websites text, photos and videos uploaded by their customers.

We discussed last year the trend toward companies seeking to monetize user-generated content. A recent Central District of California decision in Greg Young Publishing, Inc. v. Zazzle, Inc. serves as an important reminder of the serious risks that can arise from seeking to commercially exploit such content.

Under the Digital Millennium Copyright Act’s (DMCA) Section 512(c) safe harbor, online service providers that comply with the eligibility requirements are shielded from copyright damages in connection with their hosting of infringing content uploaded by service users. This powerful safe harbor has played a major role in the success of Facebook, Instagram, YouTube and other U.S. social media and Internet sites.



With over one billion websites on the Internet, and 211 million items of online content created every minute, it should come as no surprise that content curation is one of the hottest trends in the Internet industry. We are overwhelmed with online content, and we increasingly rely on others to separate the good from the bad so that we can make more efficient use of our time spent surfing the web.

Consistent with this trend, many websites that host user-generated content are now focused on filtering out content that is awful, duplicative, off-topic, or otherwise of little interest to site visitors. And these sites often find that humans—typically passionate volunteers from the sites’ user communities—are better than algorithms at sorting the wheat from the chaff.

Of course, any website that deals with user-generated content needs to consider potential copyright liability arising from such content. We’ve discussed in past Socially Aware blog posts the critical importance of Section 512(c) of the Digital Millennium Copyright Act (the DMCA) to the success of YouTube, Facebook and other online platforms that host user-generated content. By providing online service providers with immunity from monetary damages in connection with the hosting of content at the direction of users, Section 512(c) has fueled the growth of the U.S. Internet industry.

The U.S. Supreme Court unanimously held that a North Carolina law that the state has used to prosecute more than 1,000 sex offenders for posting on social media is unconstitutional because it violates the First Amendment.

The U.S. Supreme Court denied certiorari in what has become known as the “dancing baby” case—a lawsuit

One year since agreeing with the European Commission to remove hate speech within 24 hours of receiving a complaint about it, Facebook, Microsoft, Twitter and YouTube are removing flagged content an average of 59% of the time, the EC reports.

The U.S. Court of Appeals for the Second Circuit held that a catering company
