In 2019, the European Court of Justice (CJEU) is expected to clarify one of the key open issues in EU copyright law: the extent to which online platforms such as YouTube can be liable for copyright infringement caused by user-generated content—content such as music, videos, literature, photos, or streams of live events such as concerts, uploaded to the Internet by users. The CJEU decisions are eagerly awaited both by media companies and copyright owners and by online platform operators—and will mark yet another stage in the ongoing battle of the creative industries against copyright infringement in the online world.

SUMMARY

In September 2018, the German Federal Court of Justice (Bundesgerichtshof, BGH) suspended proceedings in a widely publicized case concerning YouTube’s liability for copyright-infringing user-uploaded content and referred a series of questions regarding the interpretation of several EU copyright provisions to the CJEU for a preliminary ruling. A few days later, the BGH also suspended proceedings in five other high-profile cases concerning the liability of the file hosting service uploaded.net for user files containing copyright-infringing content and submitted the same questions again to the CJEU.

Previous rulings by the CJEU have addressed both the application of the safe harbor principle set out in EU E-Commerce Directive 2000/31/EC, which shields hosting providers from liability for hosted unlawful third-party content of which they have no actual knowledge (see, for example, eBay/L’Oreal; Netlog/SABAM; and Scarlet/SABAM), and, separately, the extent to which hosting, or linking to, copyright-infringing third-party content infringes copyright under the EU Copyright Directive (see GS Media/Sanoma; Filmspeler; and The Pirate Bay). But it is still unclear under which conditions the providers of the various online platforms that store and make available user-generated content can rely on the safe harbor privilege applying to hosting providers to avoid liability, or whether they must not only take down infringing content when they obtain knowledge of it but also compensate the rights holders of such content for damages for copyright infringement.

The questions that the BGH submitted to the CJEU aim to clarify these uncertainties by bringing together the different requirements established by the previous CJEU rulings for (i) affirming a direct copyright infringement by the online platform providers under the EU Copyright Directive and (ii) denying the application of the safe harbor privilege, as well as the legal consequences of such a denial (such as the extent of liability for damages). In doing so, the CJEU will have to consider the differences between the YouTube and uploaded.net business models, and will hopefully provide much clearer guidance on key issues such as:

  • to what extent providers of online services may engage with the user content they host;
  • which activities will trigger liability for copyright infringement irrespective of actual knowledge of a specific infringement; and
  • whether providers must actively monitor the content uploaded by users for copyright infringements (e.g., by using state-of-the-art, efficient filter technologies) to avoid damage claims by rights holders.

In addition, we expect these cases to affect the interpretation of the new Art. 13 of the revision of the EU Copyright Directive that will likely be adopted by the EU legislative institutions in the second quarter of 2019. The current trilogue negotiations among the EU institutions indicate that, under the new Art. 13, providers of online content sharing services will be directly liable for copyright infringements by content uploaded to the platform by their users and will not be granted safe harbor under the EU E-Commerce Directive. The providers would then have to ensure that content for which they have not obtained a license from the respective rights holders cannot be displayed on their platform. This means that the providers would have to monitor all content files when uploaded to their platform, making filter technology mandatory for the majority of the platforms (see our previous Client Alert on the draft amendment to the EU Copyright Directive).
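To illustrate what upload filtering means in its most basic form, the sketch below checks a cryptographic fingerprint (hash) of an uploaded file against a blocklist of known works. This is a deliberately simplified, hypothetical example: real-world systems such as YouTube’s Content ID rely on perceptual fingerprints that survive re-encoding and editing, not exact hashes, and the blocklist entries here are invented for illustration.

```python
import hashlib

# Hypothetical blocklist of SHA-256 fingerprints of known copyrighted files.
# A production filter would use perceptual audio/video fingerprints instead,
# since an exact hash is defeated by the slightest re-encoding of the file.
BLOCKED_FINGERPRINTS = {
    hashlib.sha256(b"licensed-song-master-recording").hexdigest(),
}

def is_upload_blocked(file_bytes: bytes) -> bool:
    """Return True if the uploaded file exactly matches a blocked work."""
    return hashlib.sha256(file_bytes).hexdigest() in BLOCKED_FINGERPRINTS
```

Even this toy version shows why mandatory filtering is controversial: the platform must inspect every upload, and the accuracy of the outcome depends entirely on the quality of the fingerprint database.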

Continue Reading Time to Hit Pause: Copyright Infringement on User Generated Platforms – When Is the Platform Provider Liable for Damages?

In what is being described as “the first settlement to deem such sales illegally deceptive,” New York Attorney General Letitia James has entered into a settlement with a company that had been selling fake followers, likes and views on several social media platforms. Read how much revenue the sales were generating for the defendant companies.

Twitter is requiring parties interested in posting ads related to the European Parliament elections to verify their identities and confirm that they are based in the EU.

The online pinboard Pinterest has confidentially filed for an initial public offering, the Wall Street Journal reports. Find out the valuation the social media company is reportedly seeking, and how the company monetizes its platform.

A state appeals court in New York widened the scope of the discovery allowable in a personal injury case to include these types of posts on social media platforms.

Washington state senators have proposed a bill that would make “social media extortion”—defined as attempting to acquire property from someone in return for removing negative social media communications—a class C felony, punishable by up to five years in prison or a fine of up to $10,000. Read about the restaurant owner’s experience that inspired the bill.

The app maker Niantic Inc. may have reached a settlement with homeowners who brought a nationwide class action suit based on trespass and nuisance allegedly caused by the Pokémon Go craze that Socially Aware reported on in the summer of 2016. Find out what the settlement would require Niantic to do—and possibly pay.

California has made it easier for wineries to promote events over social media without running afoul of state law.

The up-and-coming generation of “unicorn” start-ups—new companies on track to quickly reach $1 billion in value—is looking very different from the first generation, which included now-household names like Uber and Airbnb. Find out how in this New York Times article.

A new app called “Tudder” is basically just like the dating app Tinder, but for cows. We’re serious.

The California Attorney General continued its series of public forums regarding the California Consumer Privacy Act (CCPA), with forums last week in Riverside (January 24, 2019) and Los Angeles (January 25, 2019). As in the previous forums, there were a significant number of attendees, but few elected to speak publicly regarding their views on the Act. You can read our reports on the public forums held earlier this month in San Francisco and San Diego.

Lisa Kim, Deputy Attorney General for the AG’s Privacy Unit, provided opening remarks at both forums and identified the areas of the AG’s rulemaking on which speakers should focus their comments, specifically those areas of the Act that call for specific AG rules.  Ms. Kim encouraged interested parties to provide written comments and proposed regulatory language during this pre-rulemaking phase. Consistent with the prior forums, she noted that the AG’s office would be listening, and not responding, to comments made in Riverside and Los Angeles.

Of note, the presentation slides made available at the forum (and available here) state that the AG anticipates publishing proposed rules in Fall 2019, and that a period for public comment and additional public hearings will follow.

Continue Reading California AG Hosts Two More Public Forums on CCPA in Riverside and Los Angeles

In anticipation of preparing rules to implement the California Consumer Privacy Act, the California Attorney General recently announced six public forums that he will host in January and February 2019 across California.  On January 8, 2019, the AG hosted the first of these forums in San Francisco.  The following provides an overview of the forum and the comments made at the forum.

Overview of the January 8, 2019, San Francisco Forum 

Stacey Schesser, the Supervising Deputy Attorney General for the AG’s Privacy Unit, provided opening remarks.  Ms. Schesser confirmed that the AG’s office is at the very beginning of its rulemaking process.  Although the AG’s office will solicit formal comments after it prepares proposed rules, the AG is interested in receiving detailed written comments from the public with proposed language during this informal period.

These forums appear to be designed to inform the AG’s rulemaking and potentially streamline the process, by allowing public input before rules are drafted.  In this regard, Ms. Schesser clarified that she and other AG representatives in attendance at the San Francisco forum were there only to listen to the public comments and would not respond to questions or engage with speakers.  As a result, if the remaining forums follow a similar approach, it is unlikely that the forums will elicit meaningful intelligence regarding the AG’s anticipated approach to, or the substance of, the anticipated rulemaking.

Continue Reading California Attorney General Holds First California Consumer Privacy Act Public Forum

As we have noted previously, the California Court of Appeal’s Hassell v. Bird decision in 2016 upholding an injunction requiring Yelp to remove certain user reviews was discouraging to social media companies and other online intermediaries, as well as to fans of Section 230 of the Communications Decency Act and proponents of Internet free speech generally. The recent California Supreme Court decision reversing the Court of Appeal was, therefore, met with considerable relief by many in the Internet community.

But while the California Supreme Court’s decision is undoubtedly a significant development, it would be premature for Section 230 fans to break out the champagne; the “most important law protecting Internet speech” remains under attack from many directions, and this recent decision is far from definitive. But before getting into the details of the Hassell v. Bird opinion, let’s step back and consider the context in which the case arose.

Before Section 230: A Wild, Wild Web

A fundamental issue for social media platforms and other online intermediaries, including review sites like Yelp, is whether a company may be held liable when its customers engage in bad behavior, such as posting defamatory content or content that infringes the IP rights of third parties. Imagine if Facebook, Twitter, YouTube, and Yelp were potentially liable for defamation every time one of their users said something nasty (and untrue) about another user on their platforms. It would be hard to imagine the Internet as we currently know it existing if that were the case. Continue Reading Section 230 Survives to Fight Another Day Following California Supreme Court Decision

Just over a month after the EU General Data Protection Regulation (GDPR) took effect, California passed its own sweeping privacy legislation, the California Consumer Privacy Act of 2018.

The Act stands to affect countless global companies doing business in California, many of which recently devoted extensive time and resources to GDPR compliance. These companies must now determine what additional steps are necessary to comply with the Act by the time it takes effect on January 1, 2020.

Join Socially Aware contributors Christine Lyon and Julie O’Neill on Thursday, September 20, 2018, for a deep dive into the key similarities and differences between the GDPR and the Act, as well as practical steps companies can take to assess gaps and chart a path to compliance. The areas they expect to cover include:

  • Notice requirements
  • Access and portability
  • Deletion
  • Opt-outs
  • Discrimination

If you are interested in attending this free webinar, please register here.

On July 19, 2018, in May, et al. v. Expedia Inc., U.S. Magistrate Judge Mark Lane issued a Report and Recommendation recommending that U.S. District Judge Robert Pitman for the Western District of Texas grant a motion to compel arbitration and dismiss a putative class action on the grounds that the plaintiff agreed to the defendants’ website’s Terms and Conditions, which contained a mandatory arbitration clause.

HomeAway User Files Putative Class Action 

HomeAway is an online marketplace for vacation rental properties where property owners can list their properties for rent and travelers can book rental properties. HomeAway’s original business model was to charge owners a fee to list their properties (either on a one-year subscription or pay-per-booking basis) and to allow travelers to search and book rentals for free. HomeAway was acquired by Expedia in 2015 and changed its business model to charge travelers a fee to book rentals in mid-2016. Plaintiff James May had been a property owner who used HomeAway since 2013. Continue Reading Sneaky Website User Bound by Online Terms of Use’s Arbitration Provision Despite Renewing Subscription in Spouse’s Name

An advertising executive who lost his job after being named on an anonymous Instagram account is suing the now-defunct account for defamation. The suit names as defendants not only the account—Diet Madison Avenue, which was intended to root out harassment and discrimination at ad agencies—but also (as “Jane Doe 1,” “Jane Doe 2,” et cetera) several of the anonymous people who ran it. Whether Instagram will ultimately have to turn over the identities of the users behind the account will turn on a couple of key legal issues.

A bill recently passed by the New York State Senate makes it a crime for “a caretaker to post a vulnerable elderly person on social media without their consent.” At least one tech columnist thinks the legislation is so broadly worded that it violates the U.S. Constitution. That might be so, but—in light of several news reports about this unfortunate form of elder abuse over the last few years—that same columnist may not be correct about the bill likely having been passed in response to a one-time incident.

A new law in Egypt that categorizes social media accounts and blogs with more than 5,000 followers as media outlets allows the government in that country to block those accounts and blogs for publishing fake news. Some critics aren’t buying the government’s explanation for the law’s implementation, however, and are suggesting it was inspired by a very different motivation.

Critics of the most recent version of the European Copyright Directive’s Article 13, which the European Parliament rejected in early July, brought home their message by arguing that it would have prevented social media users from uploading and sharing their favorite memes.

In a criminal trial, social media posts may be used by both the prosecution and the defense to impeach a witness but—as with all impeachment evidence—the use and scope of such posts are entirely within the discretion of the trial court. The New York Law Journal’s cybercrime columnist explains.

To thwart rampant cheating by high school children, one country shut down the Internet nationwide during certain hours and had social media platforms go dark for the whole exam period.

Snapchat now allows users to unsend messages. Here’s how.

Employees of Burger King’s Russian division recently had to eat crow for a tasteless social media campaign that offered women a lifetime supply of Whoppers as well as three million Russian rubles ($47,000) in exchange for accomplishing a really crass feat.

We’ve all heard of drivers experiencing road rage, but how about members of the public experiencing robot rage? According to a company that supplies cooler-sized food-delivery robots, it’s a thing.


If a web server located outside the United States hosts video content that can be viewed by Internet users located in the United States, does a public performance result under U.S. copyright law?

This has been a topic of hot debate for a surprisingly long time, with little or no direct guidance from the courts—until now. A recent decision from the D.C. Circuit, Spanski Enterprises v. Telewizja Polska, addresses this issue head-on, with the court finding the uploading of video content in which a party held exclusive U.S. public performance rights, and the subsequent directing of that content to U.S. viewers upon their request, to be an infringing “performance” under the U.S. Copyright Act.

Telewizja Polska (“Polska”) is Poland’s national TV broadcaster, which owns, operates, and creates content for several Polish TV channels. Polska and Spanski Enterprises (“Spanski”), a Canadian corporation, entered into a licensing agreement granting Spanski exclusive broadcasting rights in North and South America to TVP Polonia, one of Polska’s TV channels. Polska provides online access to its programming through a video-on-demand feature on its Poland-based website and, to protect Spanski’s rights, used geoblocking technology to block North and South American IP addresses from accessing the copyrighted content. The territorial restrictions were either incorporated into the digital video formats of the episodes themselves or assigned through a content management system. Continue Reading Copyright’s Long Arm: Foreign Website Found to Infringe U.S. Copyright Law by Providing U.S. Viewers Access to Site Content
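For readers curious about the mechanics, IP-based geoblocking of the kind Polska used boils down to checking a visitor’s IP address against address ranges mapped to restricted territories. The sketch below is a minimal, hypothetical illustration using Python’s standard `ipaddress` module; the CIDR ranges are placeholder documentation ranges, whereas a real system would consult a full, regularly updated GeoIP database.

```python
import ipaddress

# Hypothetical CIDR ranges standing in for blocked territories.
# These are reserved documentation ranges (RFC 5737), used here only
# as placeholders; a real deployment would use a GeoIP database.
BLOCKED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_geoblocked(client_ip: str) -> bool:
    """Return True if the client's IP falls within a blocked territory."""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in network for network in BLOCKED_RANGES)
```

As the Spanski litigation shows, the weak point of this approach is operational rather than technical: if the restrictions are stripped from the content or misconfigured in the content management system, the geographic fence simply disappears.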

As close observers of the implications of privacy law on companies’ data collection, usage and disclosure practices, we at Socially Aware were among the many tech-law enthusiasts anticipating the U.S. Supreme Court’s recent decision in Carpenter v. United States, in which the Court held that the government must obtain a warrant to acquire customer location information maintained by cellular service providers, at least where that information covers a period of a week or more.

Authored by Chief Justice John Roberts, the 5-4 opinion immediately enshrines greater protections for certain forms of location data assembled by third parties. It also represents the Court’s growing discomfort with the so-called “third-party doctrine”—a line of cases holding that a person does not have a reasonable expectation of privacy in records that he or she voluntarily discloses to a third party. In the longer run, there will likely be further litigation over whether the same logic should extend Fourth Amendment protections to other types of sensitive information in the hands of third parties as courts grapple with applying these principles in the digital age.

Background

Anytime a cell phone uses its network, it must connect to the network through a “cell site.” Whenever cell sites make a connection, they create and record Cell Site Location Information (CSLI). Cell phones may create hundreds of data points in a normal day, and providers collect and store CSLI to spot weak coverage areas and perform other business functions. Continue Reading Location Information Is Protected by the 4th Amendment, SCOTUS Rules