The music industry came out on top in one of its first attempts to hold an internet service provider liable for its subscribers’ unauthorized peer-to-peer file sharing.

The decision, handed down by the Fourth Circuit Court of Appeals in a dispute between BMG Rights Management and Cox Communications, outlines the obligations an ISP must fulfill to receive safe harbor protection under the Digital Millennium Copyright Act for a subscriber’s infringement. It also explains when an ISP can be held contributorily liable for its subscribers’ actions.

Read my full analysis here.

Happy 2018 to our readers! It has become a Socially Aware tradition to start the New Year with some predictions from our editors and contributors. With smart contracts on the horizon, the Internet of Things and cryptocurrencies in the spotlight, and a number of closely watched lawsuits moving toward resolution, 2018 promises to be an exciting year in the world of emerging technology and Internet law.

Here are some of our predictions regarding tech-related legal developments over the next twelve months. As always, the views expressed are not to be attributed to Morrison & Foerster or its clients.

From John Delaney, Co-Founder and Co-Editor, Socially Aware, and Partner at Morrison & Foerster:
Regarding Web Scraping

Web scraping is an increasingly common activity among businesses (by one estimate, web-scraping bots account for as much as 46% of Internet traffic), and is helping to fuel the “Big Data” revolution. Despite the growing popularity of web scraping, courts have generally been unsympathetic to web scrapers. Last August, however, web scrapers finally won a major victory when the U.S. District Court for the Northern District of California, in hiQ Labs, Inc. v. LinkedIn Corp., enjoined LinkedIn from blocking hiQ Labs’ scraping of publicly available user profiles from the LinkedIn website. The case is now on appeal to the Ninth Circuit. My sense is that the Ninth Circuit will reject the broad scope and rationale of the lower court’s ruling; if the Ninth Circuit nevertheless ultimately sides with hiQ Labs, the web scraper, the decision could be a game changer, bringing online scraping out of the shadows and perhaps spurring more aggressive uses of scraping tools and scraped data. On the other hand, if the Ninth Circuit reverses, we may see companies reexamining and perhaps curtailing their scraping initiatives. Either way, 2018 promises to bring greater clarity to this murky area of the law.

Regarding the Growing Challenges for Social Media Platforms

2017 was a tough year for social media platforms. After years of positive press, immense consumer goodwill and a generally “hands off” attitude from regulators, last year saw a growing backlash against social media for a number of reasons: the continued rise of trolling, which is creating an ever-more toxic online environment; criticism of social media’s role in the dissemination of fake news; growing concern over social media “filter bubbles” and “echo chambers”; and worries about the potential societal impact of social media’s algorithm-driven effectiveness in attracting and keeping a grip on our attention. Expect social media companies to make further efforts in 2018 to get out ahead of most if not all of these issues, in the hopes of winning over critics and discouraging greater governmental regulation.

Regarding the DMCA Safe Harbor for Hosting of User-Generated Content

The backlash against social media noted in my prior item may also be reflected to some extent in several 2017 court decisions regarding the DMCA safe harbor shielding website operators and other online service providers from copyright damages in connection with user-generated content (and perhaps in the CDA Section 230 case law discussed by Aaron Rubin below). After nearly two decades of court decisions generally taking an ever more expansive approach to this particular DMCA safe harbor, the pendulum began to swing in the other direction in 2016, and this trend picked up steam in 2017, culminating in the Ninth Circuit’s Mavrix decision, which held that a social media platform provider’s use of volunteer curators to review user posts deprived the provider of DMCA safe harbor protection. Expect the pendulum to continue to swing in favor of copyright owners in DMCA safe harbor decisions over the coming year.

Regarding Smart Contracts

Expect to see broader, mainstream adoption of “smart contracts,” especially in the B2B context—and perhaps litigation over smart contracts in 2019 . . . .

From Aaron Rubin, Co-Editor, Socially Aware, and Partner at Morrison & Foerster:
Regarding the CDA Section 230 Safe Harbor

We noted previously that 2016 was a particularly rough year for Section 230 of the Communications Decency Act and the immunity that the statute provides website operators against liability arising from third-party or user-generated content. Now that 2017 is in the rearview mirror, Section 230 is still standing, but its future remains imperiled. We have seen evidence of Section 230’s resiliency in recent cases in which courts rejected plaintiffs’ creative attempts to find chinks in the immunity’s armor by arguing, for example, that websites lose immunity when they use data analytics to direct users to content, when they fail to warn users of potential dangers, or when they share ad revenue with content developers. Nonetheless, it is clear that the knives are still out for Section 230, including in Congress, where a number of bills are under consideration that would significantly limit the safe harbor in the name of combatting sex trafficking. I predict that these efforts to rein in Section 230 will only increase in 2018.

As part of a new tracking system, the Department of Homeland Security will be keeping records of immigrants’ social media handles and search results.

Russia to Facebook: Turn over user information or risk being blocked.

Google is ending a policy that required news sites to allow users at least one free article-click.

A new social media platform called Steemit will pay users in cryptocurrency for posting, commenting, or liking content—and its market capitalization is around $294 million.

Not everyone is a fan of Twitter’s new 280-character limit.

A type of biometric payment system that identifies a checking or credit account owner based on the unique vein-pattern in his or her fingertip would allow consumers to shop without cash, cards or devices.

Initial coin offerings (ICOs) are allowing startups that develop applications for blockchain technology to raise money without giving up the equity or decision-making power they would have to surrender to venture capitalists.

In this Wired op-ed, a former prisoner argues that allowing inmates controlled social media use might reduce recidivism and help address the cell phone contraband problem.

Young kids are the new social media celebrities—and the law isn’t clear on whether they’re owed any of the money that their parents collect as a result of the viral videos.

When a social media celebrity famous for posting photos of herself posing in fitness gear changed the direction of her Instagram account to one that promotes body acceptance, she initially lost 70,000 followers, but she ultimately wound up with more fans than ever.

Kudos to Netflix’s in-house counsel for crafting a cease-and-desist letter for brand marketing in the modern age.

With over one billion websites on the Internet, and 211 million items of online content created every minute, it should come as no surprise that content curation is one of the hottest trends in the Internet industry. We are overwhelmed with online content, and we increasingly rely on others to separate the good from the bad so that we can make more efficient use of our time spent surfing the web.

Consistent with this trend, many websites that host user-generated content are now focused on filtering out content that is awful, duplicative, off-topic, or otherwise of little interest to site visitors. And these sites often find that humans—typically passionate volunteers from the sites’ user communities—are better than algorithms at sorting the wheat from the chaff.

Of course, any website that deals with user-generated content needs to consider potential copyright liability arising from such content. We’ve discussed in past Socially Aware blog posts the critical importance of Section 512(c) of the Digital Millennium Copyright Act (the DMCA) to the success of YouTube, Facebook and other online platforms that host user-generated content. By providing online service providers with immunity from monetary damages in connection with the hosting of content at the direction of users, Section 512(c) has fueled the growth of the U.S. Internet industry.

One of the most significant legal concerns for Internet service providers is the risk of exposure to liability for the copyright infringements of their users. The concern is not unreasonable. Because Internet service providers can be held secondarily liable for the infringements of their users, and because this liability can come with statutory damages attached, the service provider’s potential economic exposure can be significant, especially for Internet service providers engaged in the transmission or hosting of user-generated content.

Moreover, the principle of joint and several liability may further increase this potential economic exposure for Internet service providers.

Under Section 504(c) of the Copyright Act, which permits a range of statutory damages for each infringed work, the principle of joint and several liability can make a defendant liable for multiple statutory damage awards for infringing a single work. The Ninth Circuit’s decision in Columbia Pictures Television v. Krypton Broadcasting of Birmingham, Inc. two decades ago illustrates the operation of this principle.

The defendants in Columbia Pictures were three television stations that had directly infringed the plaintiff’s copyrights independently of one another; the company that owned the three stations was, in turn, secondarily liable for their infringement. Relying in part on legislative history, the court held that the plaintiff was entitled to separately calculated statutory awards against each of the three stations, as they were separate infringers, and that, with respect to these awards, each of the three stations was jointly and severally liable with their common owner.

On March 22, 2017, the Supreme Court held in Star Athletica, LLC v. Varsity Brands that design elements of cheerleading uniforms may be protected under the Copyright Act. The 6-2 decision, written by Justice Thomas, clarified the scope of protection afforded to clothing designs and, more broadly, designs on useful articles.

Varsity Brands, Inc.—the country’s largest cheerleading supplier—owns more than 200 copyright registrations for two-dimensional designs consisting of combinations of chevrons, stripes, and other colorful shapes for its cheerleading uniforms. At issue in this case were the five pictured designs.

Varsity Brands sued Star Athletica, LLC, an upstart competitor, for copyright infringement. The District Court for the Western District of Tennessee granted Star Athletica’s motion for summary judgment, holding that the designs could not be conceptually or physically separated from the uniforms, and they were therefore ineligible for copyright protection. The Copyright Act makes “pictorial, graphic, or sculptural features” of the “design of a useful article” eligible for copyright protection as artistic works only if those features “can be identified separately from, and are capable of existing independently of, the utilitarian aspects of the article.” The Sixth Circuit reversed, concluding that the graphics were “separately identifiable” and “capable of existing independently” of the uniforms.

In affirming, the Supreme Court laid out a two-part test for when a feature incorporated into the design of a useful article is eligible for copyright protection: When the feature (1) can be perceived as a two- or three-dimensional work of art separate from the useful article; and (2) would qualify as a protectable pictorial, graphic, or sculptural work—either on its own or fixed in some other tangible medium of expression—if it were imagined separately from the useful article into which it is incorporated. “To be clear, the only feature of the cheerleading uniform eligible for a copyright in this case is the two-dimensional work of art,” the Court explained. “Respondents have no right to prohibit any person from manufacturing a cheerleading uniform of identical shape, cut, and dimensions to the ones on which the decorations in this case appear.”

We’re in the midst of a seismic shift in how companies interact with user-generated content (UGC).

For years, companies were happy simply to host UGC on their websites, blogs and social media pages and reap the resulting boost to their traffic numbers. And U.S. law—in the form of Section 512(c) of the Digital Millennium Copyright Act (DMCA)—accommodated this passive use of UGC by creating a safe harbor from copyright damages for websites, blogs and social media platform operators that hosted UGC posted without the authorization of the owners of the copyrights in such UGC, so long as such operators complied with the requirements of the safe harbor.

Increasingly, companies are no longer satisfied with passively hosting UGC. Rather, they now want to find creative ways to commercialize such content—by incorporating it into ads (including print, TV and other offline ads), creating new works based on such content and even selling such content. Yet, in moving beyond mere hosting to proactive exploitation of UGC, companies risk losing the benefit of the DMCA Section 512(c) safe harbor, which could result in potentially significant copyright liability exposure.

For example, if a company finds that users are posting potentially valuable UGC to the company’s Facebook page, or on Twitter in connection with one of the company’s hashtags, that company may want to make such UGC available on its own website. The DMCA Section 512(c) safe harbor, however, is unlikely to protect the company in copying such UGC from the Facebook or Twitter platform to its own website.

The reality is that any company seeking to monetize or otherwise exploit UGC needs to proceed with extreme caution. This is true for several reasons:

  • UGC can implicate a wide range of rights . . . As with any content, UGC is almost certainly subject to copyright protection, although certain Tweets and other short, text-only posts could potentially be exempt from copyright protection if they qualify as “short phrases” under the Copyright Act. If any individuals are identifiable in UGC, then rights of publicity and rights of privacy may also be relevant. In addition, UGC may contain visible third-party trademarks or comments that defame or invade the privacy of third parties.
  • . . . and a wide range of rightsholders. Notably, many of the rights necessary to exploit UGC are likely to be held by individuals and corporations other than the posting user. For example, unless a photo is a “selfie,” the photographer and the subject of the photo will be different individuals, with each holding different rights—copyright, for the photographer, and the rights of publicity and privacy, for the subject—that could be relevant to the exploitation of the photo. Moreover, any trademarks, logos and other images contained in a photo could potentially implicate third-party rightsholders, including third-party corporations. Videos also raise the possibility of unauthorized clips or embedded music.
  • If the UGC is hosted by a third-party social network, it may have Terms of Service that help—or hurt—efforts to exploit the UGC. Most social media networks collect broad rights to UGC from their users, although they differ substantially when it comes to passing those rights along to third parties interested in exploiting the content. For example, if a company uses Twitter’s Application Programming Interface (API) to identify and access Tweets that it would like to republish, then Twitter grants to that company a license to “copy a reasonable amount of and display” the Tweets on the company’s own services, subject to certain limitations. (For example, Twitter currently prohibits any display of Tweets that could imply an endorsement of a product or service, absent separate permission from the user.) Instagram also has an API that provides access to UGC, but, in contrast to Twitter, Instagram’s API terms do not appear to grant any license to the UGC and affirmatively require companies to “comply with any requirements or restrictions” imposed by Instagram users on their UGC.

With these risks in mind, we note several emerging best practices for a company to consider if it has decided to exploit UGC in ways that may fall outside the scope of DMCA Section 512(c) and other online safe harbors. Although legal risk can never be eliminated in dealing with UGC, these strategies may help to reduce such risk:

  • Carefully review the Social Media Platform Terms. If the item of UGC at issue has been posted to a social media platform, determine whether the Terms of Service for that platform grant any rights to use such posted UGC off the platform or impose any restrictions on such content. Note, however, that any license to UGC granted by a social media platform almost certainly will not include any representations, warranties or indemnities, and so it may not offer any protection against third-party claims arising from the UGC at issue.
  • Seek Permission. If the social media platform’s governing terms don’t provide you with all of the rights needed to exploit the UGC item at issue (or even if they do), seek permission directly from the user who posted the item. Sophisticated brands will often approach a user via the commenting or private messaging features of the applicable social media platform, and will present him or her with a link to a short, user-friendly license agreement. Often, the user will be delighted by the brand’s interest in using his or her content. Of course, be aware that the party posting the content may not be the party that can authorize use of that content, as Agence France Presse learned the hard way in using photos taken from Twitter.
  • Make Available Terms and Conditions for “Promotional” Hashtags. If a company promotes a particular hashtag to its customers, and would like to use content that is posted in conjunction with the hashtag, the company could consider making available a short set of terms alongside its promotion of that hashtag. For example, in any communications promoting the existence of the hashtag and associated marketing campaign, the company could inform customers that their use of the hashtag will constitute permission for the company to use any content posted together with the hashtag. Such an approach could face significant enforceability issues—after all, it is essentially a form of “browsewrap” agreement—but it could provide the company with a potential defense in the event of a subsequent dispute.
  • Adopt a Curation Process. Adopt an internal curation process to identify items of UGC that are especially high risk, which could include videos, photos of celebrities, photos of children, professional-quality content, any content containing copyright notices, watermarks and so forth, and any content containing potentially defamatory, fraudulent or otherwise illegal content. Ensure that the curators are trained and equipped with checklists and other materials approved by the company’s legal department or outside counsel. Ideally, any high-risk content should be subject to the company’s most stringent approach to obtaining permission and clearing rights—or perhaps avoided altogether.
  • Adjust the Approach for High-Risk Uses. Consider the way in which the UGC at issue is expected to be used, and whether the company’s risk tolerance should be adjusted accordingly. For example, if an item of UGC will be used in a high-profile advertisement, the company may want to undertake independent diligence on any questionable aspects of the UGC, even after obtaining the posting user’s permission—or perhaps avoid any questionable UGC altogether.

In a social media age that values authenticity, more and more companies—even big, risk-averse Fortune 100 companies—are interested in finding ways to leverage UGC relevant to their business, products or services. Yet the shift from merely hosting UGC to actively exploiting it raises very real legal hurdles for companies. The tips above are not a substitute for working closely with experienced social media counsel, but they collectively provide a framework for addressing legal risks in connection with a company’s efforts to commercialize UGC.

*          *          *

For more on the issues related to user-generated content, see New Court Decision Highlights Potential Headache for Companies Hosting User-Generated Content; Court Holds That DMCA Safe Harbor Does Not Extend to Infringement Prior to Designation of Agent; and Thinking About Using Pictures Pulled From Twitter? Think Again, New York Court Warns.

The UK wants to use the blockchain to track the spending of welfare recipients.

Some believe that a recent Ninth Circuit holding could turn sharing passwords into a federal crime under the Computer Fraud and Abuse Act.

And another Ninth Circuit opinion sided with Facebook in a closely watched case interpreting the same federal law, this time involving unauthorized access to Facebook’s website.

The fashion world is embroiled in a rocky romance with social media.

Snapchat filed a patent application for image-recognition technology that may help the platform’s ad sales.

Scientists think they’ve found a way to tackle virtual reality sickness.

What’s going on at Vine? First a bunch of influencers cut ties with the platform. Now a group of its top executives have jumped ship.

Livestreaming services are giving cable TV networks a run for their money.

You didn’t think we’d ignore the Pokémon Go craze, did you? Here’s advice on how to protect your privacy when you’re using the app. We’re also preparing an article describing the game and the business and legal issues that are arising from it. Stay tuned.

The latest issue of our Socially Aware newsletter is now available here.

In this issue of Socially Aware, our Burton Award-winning guide to the law and business of social media, we discuss what a company can do to help protect the likes, followers, views, tweets and shares that constitute its social media “currency”; we review a federal district court opinion refusing to enforce an arbitration clause included in online terms and conditions referenced in a “wet signature” contract; we highlight the potential legal risks associated with terminating an employee for complaining about her salary on social media; we explore the need for standardization and interoperability in the Internet of Things world; we examine the proposed EU-U.S. Privacy Shield’s attempt to satisfy consumers’ privacy concerns, the European Court of Justice’s legal requirements and companies’ practical considerations; and we take a look at the European Commission’s efforts to harmonize the digital sale of goods and content throughout Europe.

All this—plus an infographic illustrating the growing popularity and implications of ad blocking software.

Read our newsletter.