With over one billion websites on the Internet, and 211 million items of online content created every minute, it should come as no surprise that content curation is one of the hottest trends in the Internet industry. We are overwhelmed with online content, and we increasingly rely on others to separate the good from the bad so that we can make more efficient use of our time spent surfing the web.

Consistent with this trend, many websites that host user-generated content are now focused on filtering out content that is awful, duplicative, off-topic, or otherwise of little interest to site visitors. And these sites often find that humans—typically passionate volunteers from the sites’ user communities—are better than algorithms at sorting the wheat from the chaff.

Of course, any website that deals with user-generated content needs to consider potential copyright liability arising from such content. We’ve discussed in past Socially Aware blog posts the critical importance of Section 512(c) of the Digital Millennium Copyright Act (the DMCA) to the success of YouTube, Facebook and other online platforms that host user-generated content. By providing online service providers with immunity from monetary damages in connection with the hosting of content at the direction of users, Section 512(c) has fueled the growth of the U.S. Internet industry. Continue Reading Could the Use of Online Volunteers and Moderators Increase Your Company’s Copyright Liability Exposure?

The Fifth Circuit Court of Appeals recently considered in BWP Media USA, Inc. v. T&S Software Associates, Inc. whether volitional conduct is required to establish a claim for direct copyright infringement against an Internet service provider (“ISP”). The defendant ISP, T&S Software Associates (“T&S”), hosted a website that included a public forum called “HairTalk” where users could post content about hair, beauty, and celebrities.

HairTalk users posted photographs of Ke$ha, Julianne Hough, and Ashlee Simpson that were owned by the plaintiffs, BWP Media USA and National Photo Group (“BWP”), without BWP’s authorization. The plaintiffs sued T&S for direct and secondary copyright infringement based on the users’ posts. The district court granted summary judgment in favor of T&S as to both direct and secondary infringement and BWP appealed the judgment as to the direct infringement claim. Continue Reading 5th Circuit: ISP Not Liable for Infringement Due to Lack of Volitional Conduct, Despite Ineligibility for DMCA Safe Harbor

The latest issue of our Socially Aware newsletter is now available here.

In this edition, we explore the threat to U.S. jobs posed by rapid advances in emerging technologies; we examine a Federal Trade Commission report on how companies engaging in cross-device tracking can stay on the right side of the law; we take a look at a Second Circuit opinion that fleshes out the “repeat infringer” requirement online service providers must fulfill to qualify for the Digital Millennium Copyright Act’s safe harbors; we discuss a state court decision holding that Section 230 of the Communications Decency Act immunizes Snapchat from liability for a car wreck that was allegedly caused by the app’s “speed filter” feature; we describe a recent decision by the District Court of the Hague confirming that an app provider could be subject to the privacy laws of a country in the European Union merely by making its app available on mobile phones in that country; and we review a federal district court order requiring Google to comply with search warrants for foreign stored user data.

All this—plus an infographic illustrating how emerging technology will threaten U.S. jobs.

Read our newsletter.

Congress enacted the Digital Millennium Copyright Act (“DMCA”) nearly two decades ago seeking to balance the needs of two factions: content creators, who were struggling to protect their intellectual property in the digital age, and fledgling Internet companies, who feared being held liable for the misdeeds of their customers.

For the Internet companies, Congress offered relief by creating a number of “safe harbors” shielding such companies from copyright-related damages arising from their customers’ infringing activities.

In particular, the DMCA established four distinct safe harbors for online service providers, each safe harbor aimed at a different type of online activity (i.e., transitory digital network communications; system caching; online hosting; and provision of information location tools) and each with its own set of eligibility requirements.

To qualify for any of these DMCA safe harbors, however, the DMCA requires that service providers “reasonably implement” a policy that provides for the termination of “repeat infringers” in “appropriate circumstances.”

Despite the threshold importance of repeat infringer policies, the DMCA left many questions unanswered. Who exactly counts as an “infringer”? Does it include every user accused of infringement or only those found culpable in court? If it’s somewhere in between, what level of proof is required before a service provider is required to take action? Can the repeat infringer policy differentiate between those who upload infringing content for others to copy and share and those who only download such content for their own personal viewing? And how many acts of infringement does it take to become a “repeat infringer” anyway? Continue Reading Second Circuit Clarifies “Repeat Infringer” Policy Requirement for DMCA Copyright Safe Harbors

If your company operates a website or blog that hosts user-generated content, you’ll want to read this post carefully.

We’re ringing the alarm bell on an important new U.S. copyright law development that, if ignored, could significantly increase your company’s potential liability exposure in connection with user-generated content.

If your company hosts user-generated content, such hosted content may include materials that were posted without the permission of the owners of the copyrights in such materials—potentially subjecting your company to copyright infringement liability.

For nearly two decades, however, Section 512(c) of the U.S. Copyright Act, enacted in 1998 as part of the Digital Millennium Copyright Act (DMCA), has provided a safe harbor insulating online service providers from monetary damages for hosting copyright-infringing materials posted by their users. To receive protection under the Section 512(c) safe harbor, service providers must, among other things, designate an agent to receive notifications of claimed infringement with the Copyright Office. Continue Reading New Copyright Office Rule Creates Potential “Gotcha” for Blogs and Websites Hosting User-Generated Content

We’re in the midst of a seismic shift in how companies interact with user-generated content (UGC).

For years, companies were happy simply to host UGC on their websites, blogs and social media pages and reap the resulting boost to their traffic numbers. And U.S. law—in the form of Section 512(c) of the Digital Millennium Copyright Act (DMCA)—accommodated this passive use of UGC by creating a safe harbor from copyright damages for websites, blogs and social media platform operators that hosted UGC posted without the authorization of the owners of the copyrights in such UGC, so long as such operators complied with the requirements of the safe harbor.

Increasingly, companies are no longer satisfied with passively hosting UGC. Rather, they now want to find creative ways to commercialize such content—by incorporating it into ads (including print, TV and other offline ads), creating new works based on such content and even selling such content. Yet, in moving beyond mere hosting to proactive exploitation of UGC, companies risk losing the benefit of the DMCA Section 512(c) safe harbor, which could result in potentially significant copyright liability exposure.

For example, if a company finds that users are posting potentially valuable UGC to the company’s Facebook page, or on Twitter in connection with one of the company’s hashtags, that company may want to make such UGC available on its own website. The DMCA Section 512(c) safe harbor, however, is unlikely to protect the company in copying such UGC from the Facebook or Twitter platform to its own website.

The reality is that any company seeking to monetize or otherwise exploit UGC needs to proceed with extreme caution. This is true for several reasons:

  • UGC can implicate a wide range of rights . . . As with any content, UGC is almost certainly subject to copyright protection, although certain Tweets and other short, text-only posts could potentially be exempt from copyright protection if they qualify as “short phrases” under the Copyright Act. If any individuals are identifiable in UGC, then rights of publicity and rights of privacy may also be relevant. In addition, UGC may contain visible third-party trademarks or comments that defame or invade the privacy of third parties.
  • . . . and a wide range of rightsholders. Notably, many of the rights necessary to exploit UGC are likely to be held by individuals and corporations other than the posting user. For example, unless a photo is a “selfie,” the photographer and the subject of the photo will be different individuals, with each holding different rights—copyright, for the photographer, and the rights of publicity and privacy, for the subject—that could be relevant to the exploitation of the photo. Moreover, any trademarks, logos and other images contained in a photo could potentially implicate third-party rightsholders, including third-party corporations. Videos also raise the possibility of unauthorized clips or embedded music.
  • If the UGC is hosted by a third-party social network, it may have Terms of Service that help—or hurt—efforts to exploit the UGC. Most social media networks collect broad rights to UGC from their users, although they differ substantially when it comes to passing those rights along to third parties interested in exploiting the content. For example, if a company uses Twitter’s Application Programming Interface (API) to identify and access Tweets that it would like to republish, then Twitter grants to that company a license to “copy a reasonable amount of and display” the Tweets on the company’s own services, subject to certain limitations. (For example, Twitter currently prohibits any display of Tweets that could imply an endorsement of a product or service, absent separate permission from the user.) Instagram also has an API that provides access to UGC, but, in contrast to Twitter, Instagram’s API terms do not appear to grant any license to the UGC and affirmatively require companies to “comply with any requirements or restrictions” imposed by Instagram users on their UGC.

With these risks in mind, we note several emerging best practices for a company to consider if it has decided to exploit UGC in ways that may fall outside the scope of DMCA Section 512(c) and other online safe harbors. Although legal risk can never be eliminated in dealing with UGC, these strategies may help to reduce such risk:

  • Carefully review the Social Media Platform Terms. If the item of UGC at issue has been posted to a social media platform, determine whether the Terms of Service for such platform grants any rights to use such posted UGC off of the platform or imposes any restrictions on such content. Note, however, that any license to UGC granted by a social media platform almost certainly will not include any representations, warranties or indemnities, and so it may not offer any protection against third-party claims arising from the UGC at issue.
  • Seek Permission. If the social media platform’s governing terms don’t provide you with all of the rights needed to exploit the UGC item at issue (or even if they do), seek permission directly from the user who posted the item. Sophisticated brands will often approach a user via the commenting or private messaging features of the applicable social media platform, and will present him or her with a link to a short, user-friendly license agreement. Often, the user will be delighted by the brand’s interest in using his or her content. Of course, be aware that the party posting the content may not be the party that can authorize use of that content, as Agence France Presse learned the hard way in using photos taken from Twitter.
  • Make Available Terms and Conditions for “Promotional” Hashtags. If a company promotes a particular hashtag to its customers, and would like to use content that is posted in conjunction with the hashtag, the company could consider making available a short set of terms alongside its promotion of that hashtag. For example, in any communications promoting the existence of the hashtag and associated marketing campaign, the company could inform customers that their use of the hashtag will constitute permission for the company to use any content posted together with the hashtag. Such an approach could face significant enforceability issues—after all, it is essentially a form of “browsewrap” agreement—but it could provide the company with a potential defense in the event of a subsequent dispute.
  • Adopt a Curation Process. Adopt an internal curation process to identify items of UGC that are especially high risk, which could include videos, photos of celebrities, photos of children, professional-quality content, any content containing copyright notices, watermarks and so forth, and any content containing potentially defamatory, fraudulent or otherwise illegal content. Ensure that the curators are trained and equipped with checklists and other materials approved by the company’s legal department or outside counsel. Ideally, any high-risk content should be subject to the company’s most stringent approach to obtaining permission and clearing rights—or perhaps avoided altogether.
  • Adjust the Approach for High-Risk Uses. Consider the way in which the UGC at issue is expected to be used, and whether the company’s risk tolerance should be adjusted accordingly. For example, if an item of UGC will be used in a high-profile advertisement, the company may want to undertake independent diligence on any questionable aspects of the UGC, even after obtaining the posting user’s permission—or perhaps avoid any questionable UGC altogether.

In a social media age that values authenticity, more and more companies—even big, risk-averse Fortune 100 companies—are interested in finding ways to leverage UGC relevant to their business, products or services. Yet the shift from merely hosting UGC to actively exploiting it raises very real legal hurdles for companies. The tips above are not a substitute for working closely with experienced social media counsel, but they collectively provide a framework for addressing legal risks in connection with a company’s efforts to commercialize UGC.

*          *        *

For more on the issues related to user-generated content, see New Court Decision Highlights Potential Headache for Companies Hosting User-Generated Content; Court Holds That DMCA Safe Harbor Does Not Extend to Infringement Prior to Designation of Agent; and Thinking About Using Pictures Pulled From Twitter? Think Again, New York Court Warns.

Today’s companies compete not only for dollars but also for likes, followers, views, tweets, comments and shares. “Social currency,” as some researchers call it, is becoming increasingly important and companies are investing heavily in building their social media fan bases. In some cases, this commitment of time, money and resources has resulted in staggering success. Coca-Cola, for example, has amassed over 96 million likes on its Facebook page and LEGO’s YouTube videos have been played over 2 billion times.

With such impressive statistics, there is no question that a company’s social media presence and the associated pages and profiles can be highly valuable business assets, providing an important means for disseminating content and connecting with customers. But how much control does a company really have over these social media assets? What recourse would be available if a social media platform decided to delete a company’s page or migrate its fans to another page?

The answer may be: not very much. Over the past few years, courts have repeatedly found in favor of social media platforms in a number of cases challenging the platforms’ ability to delete or suspend accounts and to remove or relocate user content.

Legal Show-Downs on Social Media Take-Downs

In a recent California case, Lewis v. YouTube, LLC, the plaintiff Jan Lewis’s account was removed by YouTube due to allegations that she artificially inflated view counts in violation of YouTube’s Terms of Service. YouTube eventually restored Lewis’s account and videos but not the view counts or comments that her videos had generated prior to the account’s suspension.

Lewis sued YouTube for breach of contract, alleging that YouTube had deprived her of her reasonable expectations under the Terms of Service that her channel would be maintained and would continue to reflect the same number of views and comments. She sought damages as well as specific performance to compel YouTube to restore her account to its original condition.

The court first held that Lewis could not show damages because the YouTube Terms of Service contained a limitation of liability provision that disclaimed liability for any omissions relating to content. The court also held that Lewis was not entitled to specific performance because there was nothing in the Terms of Service that required YouTube to maintain particular content or to display view counts or comments. Accordingly, the court affirmed dismissal of Lewis’s complaint.

In a similar case, Darnaa LLC v. Google, Inc., Darnaa, a singer, posted a music video on YouTube. Again, due to allegations of view count inflation, YouTube removed and relocated the video to a different URL, disclosing on the original page that the video had been removed for violating its Terms of Service. Darnaa sued for breach of the covenant of good faith and fair dealing, interference with prospective economic advantage and defamation. In an email submitted with the complaint, Darnaa’s agent explained that she had launched several large campaigns (each costing $250,000 to $300,000) to promote the video and that the original link was already embedded in thousands of websites and blogs. Darnaa sought damages as well as an injunction to prevent YouTube from removing the video or changing its URL.

The court dismissed all of Darnaa’s claims because YouTube’s Terms of Service require lawsuits to be filed within one year and Darnaa had filed her case too late. In its discussion, however, the court made several interesting points. In considering whether YouTube’s Terms of Service were unconscionable, the court held that, although the terms are by nature a “contract of adhesion,” the level of procedural unconscionability was slight, since the plaintiff could have publicized her videos on a different website. Further, in ruling that the terms were not substantively unconscionable, the court pointed out that “[b]ecause YouTube offers its hosting services free of charge, it is reasonable for YouTube to retain broad discretion over [its] services.”

Although the court ultimately dismissed Darnaa’s claims based on the failure to timely file the suit, the decision was not a complete victory for YouTube. The court granted leave to amend to give Darnaa the opportunity to plead facts showing that she was entitled to equitable tolling of the contractual limitations period. Therefore, the court went on to consider whether Darnaa’s allegations were sufficient to state a claim. Among other things, the court held that YouTube’s Terms of Service were ambiguous regarding the platform’s rights to remove and relocate user videos in its sole discretion. Thus, the court further held that if Darnaa were able to amend the complaint to avoid the consequences of the failure to timely file, then the complaint would be sufficient to state a claim for breach of the contractual covenant of good faith and fair dealing.

Continue Reading How to Protect Your Company’s Social Media Currency

In this election season, we hear a lot of complaints about laws stifling business innovation. And there is no doubt that some laws have this effect.

But what about laws that spur innovation, that result in the creation of revolutionary new business models?

Section 512(c) of the Digital Millennium Copyright Act (the DMCA) is one such law. Passed by Congress and signed by President Bill Clinton in 1998, Section 512(c) has played an enormous role in the success of YouTube, Facebook and other social media platforms that host user-generated content, by shielding such platforms from monetary damages from copyright infringement claims in connection with such content.

Absent this safe harbor, it is difficult to imagine a company like YouTube thriving as a business. For example, in 2014 alone, YouTube removed over 180 million videos from its platform due to “policy violations,” the vast majority of which likely stemmed from alleged copyright infringement; yet, absent the Section 512(c) safe harbor, YouTube could have been exposed to staggering monetary damages in connection with those videos.

The DMCA’s protection from liability is expansive, but it is not automatic. To qualify, online service providers must affirmatively comply with a number of requirements imposed by the law. While most of those requirements may seem straightforward, a recent case in the Southern District of New York illustrates how even seemingly routine paperwork can pose problems for websites that host user-generated content.

For companies seeking protection under the DMCA, the typical starting point is designating an agent to receive “takedown” notices from copyright owners. If a company is sued for copyright infringement relating to its website, that company will want to show that it has designated a DMCA agent. But what if the designation paperwork was handled by another entity within the defendant’s organizational structure, such as a corporate parent? That was the situation faced by one of the defendants in BWP Media USA Inc., et al. v. Hollywood Fan Sites LLC, et al. (S.D.N.Y. 2015)—and the court held that the defendant was out of luck.

Although the defendant’s corporate parent had filed a registration form with the U.S. Copyright Office under the parent’s name, nothing on the form mentioned the defendant or made any general reference to affiliates. Under those circumstances, the court concluded that the defendant was ineligible for the safe harbor because it had “no presence at all” in the Copyright Office’s directory of DMCA agents. The court reasoned that those searching the Copyright Office directory should not be “expected to have independent knowledge of the corporate structure of a particular service provider.”

Despite lacking a Copyright Office registration, the defendant argued that it did actually post the agent’s information on its own website, and that one of the plaintiffs had successfully used such information to send a takedown notice resulting in removal of the allegedly infringing material. The court found those assertions “irrelevant,” because they did nothing to address the Copyright Office registration requirement. As the court noted, the DMCA requires each service provider to post the agent’s name and contact information on the provider’s website, and submit such information to the Copyright Office.

Would the defendant’s DMCA eligibility have turned out differently if the parent had included the affiliate’s name on the form, or at least made a general reference to the existence of affiliates? The court’s opinion leaves those questions unaddressed, but the preamble to the Copyright Office regulations—cited in passing by the court—appears to reject such an approach. According to the preamble, each designation “may be filed only on behalf of a single service provider[, and] related companies (e.g., parents and subsidiaries) are considered separate service providers who would file separate [designations].”

Following the Hollywood Fan Sites decision, we expect that many companies that host user-generated content will be checking to make sure that all of their legal names are indeed listed in the Copyright Office directory—and, in light of the Copyright Office’s position on this subject, many such companies may also decide to file separate designations for each legal entity within a corporate family. While this process may be cumbersome, it seems a small price to pay for the generous safe harbor benefits offered by the DMCA, especially for companies with business models that depend on user-generated content.

The latest issue of our Socially Aware newsletter is now available here.

In this issue of Socially Aware, our Burton Award-winning guide to the law and business of social media, we discuss a recent decision in Virginia protecting the anonymity of Yelp users; we examine the FTC’s much anticipated report, “Internet of Things: Privacy & Security in a Connected World;” we explore the major social media platforms’ approaches to handling deceased users’ accounts; we highlight a recent CJEU case holding that extracting large amounts of data from public websites—commonly known as “web scraping”—may violate a website’s terms of use; we cover the first-ever award of “any damages” for fraudulent DMCA takedowns; we drill down on important precedents that are defining the multi-channel programming distribution industry; and we take a look at cross-device tracking in interest-based advertising.

All this—plus an infographic featuring some intriguing online dating statistics.

Read our newsletter.