Last week, German regulators decided to no longer accept the widely used “JusProg” software as a sufficient means for online service providers to comply with statutory youth protection requirements. The decision is effective immediately, although it will most likely be challenged in court. If it prevails, it puts video-sharing platforms, distributors of gaming content, and online media services at risk of being held accountable for not properly protecting minors from potentially harmful content. For the affected providers, this is particularly challenging because it will be hard, if not impossible, to implement alternative youth protection tools that will meet the redefined regulatory standards.

This alert provides the legal background for the decision and discusses its implications in more detail. It is relevant to all online service providers that target German users.

The Directive on Copyright in the Digital Single Market (Directive) was finally approved by all EU legislative bodies on April 15, 2019. Introducing "modernizing EU copyright rules for European culture to flourish and circulate" was a key initiative of the European Commission's Digital Single Market (DSM) strategy, which, according to Commission President Jean-Claude Juncker, the Directive has now completed as "the missing piece of the puzzle." The Directive was approved just in time for the elections to the EU Parliament taking place in May 2019. Member States are now required to implement the Directive's provisions into national law within 24 months.

Various Member States have issued, along with their approval of the Directive, statements regarding their interpretation of the Directive, voicing quite different views about the upcoming implementation process. While Germany strongly opposes the notion of upload filters, France appears to favor a copyright protection mechanism that includes upload filters. At the same time, it remains a pressing question whether currently available algorithm-based filters would even be able to sufficiently differentiate between infringing and non-infringing content.
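To see why that question is hard, consider the crudest possible filter (purely an illustrative sketch, not a description of any deployed system; all names and data below are hypothetical): matching exact fingerprints of known infringing files fails as soon as a file is trivially altered, while looser matching risks flagging lawful uses such as quotation or parody.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint of an uploaded file."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints of known infringing files.
blocklist = {fingerprint(b"unlicensed copy of a protected film")}

def upload_allowed(upload: bytes) -> bool:
    """Block an upload only if it exactly matches a known infringing file."""
    return fingerprint(upload) not in blocklist

# An exact copy is caught, but appending a single byte changes the
# fingerprint entirely, so the altered copy slips through unnoticed.
exact = b"unlicensed copy of a protected film"
altered = exact + b"!"
```

Real-world filters use perceptual fingerprints rather than exact hashes to resist such alterations, but even those cannot tell a licensed quotation or parody from an infringing copy, which is precisely the differentiation the debate turns on.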

A new law in Australia makes a social media company’s failure to remove “abhorrent violent material” from its platform punishable by significant fines. The law also states that the executives at social media companies who fail to remove the content could be sentenced to jail time.

The European Parliament voted to approve the Copyright Directive,

One of the next big items in Europe will be the expansion of "ePrivacy" rules, which, among other things, regulate the use of cookies on websites. While the ePrivacy reform is still being worked on by EU lawmakers, one of the items the ePrivacy Regulation is expected to address is the use of "cookie walls." Recently, the Austrian and UK data protection authorities (DPAs) issued enforcement actions involving the use of cookie walls, albeit with different findings and conclusions.

Cookie Walls

A cookie wall blocks individuals from accessing a website unless they first accept the use of cookies and similar technologies. The practice of using cookie walls is not prohibited under the current ePrivacy Directive.
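In code terms, the mechanism is nothing more than a conditional on a consent flag. The sketch below is purely illustrative; the request shape, cookie name, and response strings are hypothetical:

```python
def cookie_wall(request_cookies: dict) -> str:
    """Serve the page only if the visitor has accepted cookies.

    This is the "wall": no consent, no access.
    """
    if request_cookies.get("consent") == "accepted":
        return "<full page content>"
    return "403: accept cookies and similar technologies to continue"
```

A site without a cookie wall would instead serve the page either way and merely refrain from setting non-essential cookies until consent is given.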

However, the European Data Protection Board (EDPB), the successor to the Article 29 Working Party, has issued a non-binding opinion that the use of cookie walls should be prohibited under new EU ePrivacy rules. The EDPB argues that cookie walls run contrary to the General Data Protection Regulation (GDPR): “In order for consent to be freely given as required by the GDPR, access to services and functionalities must not be made conditional on the consent of a user to the processing of personal data or the processing of information related to or processed by the terminal equipment of end-users, meaning that cookie walls should be explicitly prohibited.”



In 2019, the European Court of Justice (CJEU) is expected to clarify one of the key open issues in EU copyright law: the extent to which online platforms such as YouTube can be liable for copyright infringement caused by user-generated content, i.e., content uploaded to the Internet by users, such as music, videos, literature, photos, or streams of live events such as concerts. The CJEU decisions are eagerly awaited both by media and copyright owners and by online platform operators, and will mark yet another stage in the ongoing battle of the creative industries against copyright infringements in the online world.

SUMMARY

In September 2018, the German Federal Court of Justice (Bundesgerichtshof, BGH) suspended proceedings in a widely publicized case concerning YouTube's liability for copyright infringing user-uploaded content and referred a series of questions regarding the interpretation of several EU copyright provisions to the CJEU for a preliminary ruling. A few days later, the BGH also suspended proceedings in five other high-profile cases concerning the liability of the file hosting service uploaded.net for user files containing copyright infringing content and submitted the same questions again to the CJEU.

Previous rulings by the CJEU have addressed both the application of the safe harbor principle set out in the EU E-Commerce Directive 2000/31/EC, which shields hosting providers from liability for hosted unlawful third-party content of which they have no actual knowledge (see, for example, eBay/L'Oreal; Netlog/SABAM; and Scarlet/SABAM), and, separately, the extent of infringement of copyright by hosting of, or linking to, copyright infringing third-party content under the EU Copyright Directive (see GS Media/Sanoma; Filmspeler; and The Pirate Bay). But it is still unclear under which conditions the providers of the various online platforms that store and make available user-generated content can rely on the safe harbor privilege applying to hosting providers to avoid liability, or whether they must not only take down infringing content when they obtain knowledge of it but also compensate the rights holders of such content for damages for copyright infringement.

The questions that the BGH submitted to the CJEU aim to clarify these uncertainties by bringing together the different requirements established by the previous CJEU rulings for (i) affirming a direct copyright infringement by the online platform providers under the EU Copyright Directive and (ii) denying the application of the safe harbor privilege as well as the legal consequences of such a denial (such as the extent of liability for damages). The CJEU will have to consider the differences between the YouTube and uploaded.net business models. The CJEU will hopefully provide much clearer guidelines on key issues such as:

  • to what extent providers of online services may engage with the user content hosted by them;
  • which activities will trigger liability for copyright infringement irrespective of actual knowledge of a specific infringement; and
  • whether providers must actively monitor the content uploaded by users for copyright infringements (e.g., by using state-of-the-art, efficient filter technologies) to avoid damage claims by rights holders.

In addition, we expect these cases to have an effect on the interpretation of the new Art. 13 of the revised EU Copyright Directive that will likely be adopted by the EU legislative institutions in the second quarter of 2019. The current trilogue negotiations among the EU institutions indicate that, under such new Art. 13, providers of online content sharing services will be directly liable for copyright infringements by content uploaded to the platform by their users and will not be granted safe harbor under the EU E-Commerce Directive. The providers would then have to ensure that content for which they have not obtained a license from the respective rights holders cannot be displayed on their platforms. This means that the providers would have to monitor all content files when uploaded to their platform, making filter technology mandatory for the majority of platforms (see our previous Client Alert on the draft amendment to the EU Copyright Directive).



Geo-blocking is the practice of preventing Internet users in one jurisdiction from accessing services elsewhere based on the user’s geographic location. The European Commission wants to eliminate geo-blocking within the EU—and has taken a significant step forward in its plans to do so by clearing key votes in the EU legislative process.
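Mechanically, geo-blocking usually amounts to resolving the visitor's IP address to a country (e.g., via a GeoIP database) and then applying a simple allow/deny check. The sketch below is illustrative only and assumes the IP has already been resolved to an ISO country code; the country list is hypothetical:

```python
# Hypothetical set of countries this retailer refuses to serve.
BLOCKED_COUNTRIES = {"DE", "FR"}

def is_geo_blocked(country_code: str) -> bool:
    """Return True if a visitor from this country is refused service --
    the practice the EU Geo-Blocking Regulation aims to phase out."""
    return country_code.upper() in BLOCKED_COUNTRIES
```

Phasing out geo-blocking is, in this sense, technically trivial for retailers: the check is simply removed. The compliance burden lies in the commercial and legal changes behind it, not in the code.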

By the end of 2018, we expect that online retailers will need to ensure that they phase out the use of geo-blocking across the EU except in limited circumstances.

These changes are part of a wider program of reform affecting all businesses operating in the Technology, Media, and Telecoms sectors in Europe.

Background

The European Commission launched its Digital Single Market (“DSM”) strategy in May 2015. We have written a number of articles following the DSM’s progress: at its inception, one year in, and in 2017 following a mid-term review.



The European Union (EU) has made reform of the e-commerce rules in Europe one of its main priorities for 2018.

The European Commission has already published two proposed Directives relating to cross-border e-commerce but legislative progress has been slow—a situation that the Commission plans to correct in 2018.

The Commission’s stated aim is to establish

Happy 2018 to our readers! It has become a Socially Aware tradition to start the New Year with some predictions from our editors and contributors. With smart contracts on the horizon, the Internet of Things and cryptocurrencies in the spotlight, and a number of closely watched lawsuits moving toward resolution, 2018 promises to be an exciting year in the world of emerging technology and Internet law.

Here are some of our predictions regarding tech-related legal developments over the next twelve months. As always, the views expressed are not to be attributed to Morrison & Foerster or its clients.

From John Delaney, Co-Founder and Co-Editor, Socially Aware, and Partner at Morrison & Foerster:
Regarding Web Scraping

Web scraping is an increasingly common activity among businesses (by one estimate, web-scraping bots account for as much as 46% of Internet traffic), and is helping to fuel the "Big Data" revolution. Despite the growing popularity of web scraping, courts have been generally unsympathetic to web scrapers. Last August, however, web scrapers finally received a huge victory when the U.S. District Court for the Northern District of California enjoined LinkedIn from blocking hiQ Labs' scraping of publicly available user profiles from the LinkedIn website in the hiQ Labs, Inc. v. LinkedIn Corp. litigation. The case is now on appeal to the Ninth Circuit. My sense is that the Ninth Circuit will reject the broad scope and rationale of the lower court's ruling; if it nevertheless ultimately sides with hiQ Labs, the web scraper, the decision could be a game changer, bringing online scraping out of the shadows and perhaps spurring more aggressive uses of scraping tools and scraped data. On the other hand, if the Ninth Circuit reverses, we may see companies reexamining and perhaps curtailing their scraping initiatives. Either way, 2018 promises to bring greater clarity to this murky area of the law.
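For readers unfamiliar with the mechanics, a scraper simply fetches pages and extracts structured fields from the HTML. The minimal sketch below parses a hypothetical profile markup with Python's standard library; it does not fetch from, or interact with, any real site:

```python
from html.parser import HTMLParser

class ProfileScraper(HTMLParser):
    """Collect the text of elements marked with a (hypothetical)
    class="profile-name" attribute."""

    def __init__(self):
        super().__init__()
        self._capture = False
        self.names = []

    def handle_starttag(self, tag, attrs):
        if ("class", "profile-name") in attrs:
            self._capture = True

    def handle_data(self, data):
        if self._capture:
            self.names.append(data.strip())
            self._capture = False

# Hypothetical markup standing in for a fetched public profile page.
html_fragment = (
    '<div class="profile-name">Ada Lovelace</div>'
    '<div class="profile-name">Alan Turing</div>'
)
scraper = ProfileScraper()
scraper.feed(html_fragment)
```

The legal controversy is not about this parsing step, which is trivial, but about whether accessing and harvesting publicly available pages at scale, against a site's wishes, is lawful.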

Regarding the Growing Challenges for Social Media Platforms

2017 was a tough year for social media platforms. After years of positive press, immense consumer goodwill and a generally "hands off" attitude from regulators, last year saw a growing backlash against social media for a number of reasons: the continued rise of trolling creating an ever-more toxic online environment; criticism of social media's role in the dissemination of fake news; the growing concern over social media "filter bubbles" and "echo chambers"; and worries about the potential societal impact of social media's algorithm-driven effectiveness in attracting and keeping a grip on our attention. Expect to see in 2018 further efforts by social media companies to get out ahead of most, if not all, of these issues, in the hopes of winning over critics and discouraging greater governmental regulation.

Regarding the DMCA Safe Harbor for Hosting of User-Generated Content

The backlash against social media noted in my prior item may also be reflected to some extent in several 2017 court decisions regarding the DMCA safe harbor shielding website operators and other online service providers from copyright damages in connection with user-generated content (and perhaps in the CDA Section 230 case law discussed by Aaron Rubin below). After nearly two decades of court decisions generally taking an ever more expansive approach to this particular DMCA safe harbor, the pendulum began to swing in the other direction in 2016, and this trend picked up steam in 2017, culminating in the Ninth Circuit's Mavrix decision, which found that a social media platform provider's use of volunteer curators to review user posts deprived the provider of DMCA safe harbor protection. Expect to see the pendulum continue to swing in favor of copyright owners in DMCA safe harbor decisions over the coming year.

Regarding Smart Contracts

Expect to see broader, mainstream adoption of “smart contracts,” especially in the B2B context—and perhaps litigation over smart contracts in 2019 . . . .

From Aaron Rubin, Co-Editor, Socially Aware, and Partner at Morrison & Foerster:
Regarding the CDA Section 230 Safe Harbor

We noted previously that 2016 was a particularly rough year for Section 230 of the Communications Decency Act and the immunity that the statute provides website operators against liability arising from third-party or user-generated content. Now that 2017 is in the rearview mirror, Section 230 is still standing, but its future remains imperiled. We have seen evidence of Section 230's resiliency in recent cases where courts rejected plaintiffs' creative attempts to find chinks in the immunity's armor by arguing, for example, that websites lose immunity when they use data analytics to direct users to content, or when they fail to warn users of potential dangers, or when they share ad revenue with content developers. Nonetheless, it is clear that the knives are still out for Section 230, including in Congress, where a number of bills are under consideration that would significantly limit the safe harbor in the name of combatting sex trafficking. I predict that these efforts to rein in Section 230 will only increase in 2018.

In 2016, brands spent $570 million on social influencer endorsements on Instagram alone. This Recode article takes a look at how much influencers with certain followings can command, and whether they're worth the investment.

And don’t overlook the legal issues associated with the use of social media influencers; the FTC just settled its first