Social Links is our ongoing series here at Socially Aware that rounds up current developments at the intersection of social media, policy, research, and the law.

Embedding social media posts can be considered copyright infringement…but is it?

A Manhattan federal judge ruled in August 2021 that the practice of embedding social media posts on third-party websites, without permission from the content owner, could violate the owner’s copyright.

The case centered on a 2017 video of a starving polar bear that nature photographer Paul Nicklen took and posted to his Instagram and Facebook accounts to highlight the effects of global warming. When the footage went viral, Sinclair Broadcast Group published an article about it and embedded Nicklen’s post without obtaining his permission.

In reaching his decision, U.S. District Judge Jed Rakoff rejected the “server test” from the 9th Circuit Court of Appeals, which generally holds that embedding content from a third party’s social media account only violates the content owner’s copyright if a copy is stored on the defendant’s servers.

A recent decision on this very topic, however, reveals a different perspective on embedding practices and the “server test.” Reuters reports that a San Francisco federal court rejected a group of photographers’ claims that Instagram’s embedding tool infringed their copyrights. Judge Charles Breyer of the U.S. District Court for the Northern District of California ruled that “the Instagram feature doesn’t violate the photographers’ exclusive right to display the pictures publicly because third-party websites that embed the images don’t store their own copies of them.” In contrast to the Nicklen ruling, this decision embraces the very “server test” that Judge Rakoff rejected.

Photographers Alexis Hunley and Matthew Brauer filed the class action complaint in May, and Instagram moved to dismiss in July. According to Reuters, Judge Breyer noted that “[b]ecause they do not store the images and videos, they do not ‘fix’ the copyrighted work in any ‘tangible medium of expression.’ … Therefore, when they embed the images and videos, they do not display ‘copies’ of the copyrighted work.”

Both decisions will most likely be appealed. We will continue to monitor developments as they unfold.

Harnessing the wisdom – and skills – of the crowd to combat social media’s trust issues

Elections, COVID-19 vaccinations, the disappearance of Gabrielle Petito and Brian Laundrie, conspiracy theories about other missing persons: each has placed social media platforms under intense scrutiny over their handling of the information – or misinformation – published across the social media landscape.

While social media platforms employ fact checkers to verify the accuracy of posts, the staggering daily volume of posts makes it impossible to employ enough people to confirm or reject every entry on every channel globally, 24/7. Scale, not to mention cost, remains the central issue.

A recent Wired article suggests that relying on the “wisdom of the crowd” – groups of lay people – often matches or surpasses the accuracy of professional fact checkers. In addition to cost savings, this model offers something that professional fact-checking programs do not: scalability.

These issues around trust and misinformation on social media platforms will continue to dominate headlines across technological, legal, and public opinion forums.

Social media algorithms result in surprising and unexpected moments for those grieving lost loved ones and friends

Many of us have experienced moments when social media serves up posts, memories, or birthday reminders of those who have passed. A recent Wired article explores the emotional and psychological effects on those on the receiving end of such notices. Particularly in the COVID-19 era, when many have experienced the sudden loss of loved ones, friends, and family members, social media platforms have provided much-needed forums for sharing news, information, tributes, memorials, and memories of those who have passed. Many of the major social media outlets have mechanisms and settings to help family members and friends manage deceased loved ones’ accounts in respectful ways: Facebook offers both memorialized accounts and legacy contact settings; Instagram provides similar memorialization settings; and Twitter has processes in place to work with an authorized estate representative to deactivate an account. Google provides a similarly robust program, allowing individuals to designate “digital beneficiaries” who will manage their accounts on their behalf once they’ve passed.

While none of us wants to think about end-of-life directives, taking into consideration the digital footprint that we leave behind on social media as part of our legacy is another factor in the technological landscape where we live today.

Young people, social media, and emerging from COVID-19

Much has been written in the last 18 months about the effects of COVID-19 on all of us, and on youth in the United States in particular. Common Sense, Hopelab, and the California Healthcare Foundation published Coping with COVID-19: How Young People Use Digital Media to Manage Their Mental Health earlier this year. It examined how young people used social media and other online tools to cope with separation and isolation from friends and the other social structures vital to their intellectual, social, and emotional development.

While depression is on the rise among young people (as this infographic explains), there’s good news on the horizon. Recent analyses have noted positive trends, with decreased levels of depression among young people, and cited two major factors during the pandemic: teens are getting more sleep, and they’re spending more time with family.

In addition, GLAAD (formerly known as the Gay and Lesbian Alliance Against Defamation) recently published its first-ever Social Media Safety Index, a report examining LGBTQ+ youths’ involvement with and usage of social media platforms. While the report cited an increase in hate speech and other hostile content on social media, other analysts offered a more nuanced interpretation, noting that social media platforms provide a much-needed lifeline for LGBTQ+ youth seeking information and support.

As part of the new Fair Consumer Contracts Act (Gesetz für faire Verbraucherverträge; published in the Federal Gazette (Part I) no. 53/2021, p. 3433 et seq., full text publicly available in German), Germany will soon require specific cancellation/termination mechanisms for consumer subscriptions. These mechanisms come on top of the updated EU-wide consumer contract rules under the EU Directives on Contracts for Digital Services and Content and on Contracts for the Consumer Sale of Goods and will take effect on July 1, 2022. Significant implementation effort is expected for affected providers.

Businesses offering subscriptions to German consumers will have to provide a specifically labeled button on their online retail channels leading to a contact form through which consumers can cancel existing subscriptions with the click of only one further button. If the consumer enters sufficient data to identify the subscription to be cancelled, submission of the form is itself a valid cancellation, and its effect cannot be made subject to further steps such as logins or second-factor (e.g., email, app) confirmations.
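Purely as an illustration, the core rule – a complete form submission is itself an effective cancellation, with no login or confirmation step permitted afterwards – might be sketched as follows. The field names are hypothetical; the Act prescribes no particular fields, only that the data entered suffice to identify the subscription.

```python
from dataclasses import dataclass

# Hypothetical field names; the Act only requires that the data entered
# suffice to identify the subscription to be cancelled.
REQUIRED_FIELDS = {"customer_name", "email", "subscription_id"}


@dataclass
class CancellationResult:
    effective: bool
    detail: str


def process_cancellation(form: dict) -> CancellationResult:
    """Treat a complete submission itself as a valid cancellation.

    Once identifying data are supplied, the cancellation may not be
    conditioned on further steps such as a login or a second-factor
    confirmation.
    """
    missing = REQUIRED_FIELDS - form.keys()
    if missing:
        # Incomplete data: the provider may ask for the missing details,
        # but may not add hurdles beyond identification.
        return CancellationResult(False, f"missing fields: {sorted(missing)}")
    return CancellationResult(True, "cancellation effective upon submission")
```

The key design point is that the complete submission terminates the contract immediately; any validation happens only on whether the subscription can be identified at all.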

Failing to implement this “cancellation button” or “termination button” correctly will invalidate any minimum subscription terms or termination notice periods in contracts for which the button should have been provided, and will constitute a breach of consumer protection law that exposes providers to cease-and-desist claims.

Read the full client alert.

Clubhouse, the former invitation-only social media darling that captured the attention of investors, social media early adopters, and competitors since its introduction in April 2020, now faces significant challenges as it strives to remain relevant and attract new and engaged users.

Since our previous report on Clubhouse in March 2021, the social media app has released some significant changes and upgrades on its platform.

Based on trends from other social media apps and platforms, these changes would seem to predict a significant uptick in downloads, new users, and activity.

While Clubhouse added more than 1 million Android users within weeks of its launch in May 2021, more recent numbers indicate that the app’s popularity may be declining. Wired, citing a report from analytics group SensorTower, reported that Clubhouse had only 484,000 new installs globally during the five-day period of July 21–25, 2021, compared to the 10 million iOS downloads it received in February 2021. (To date, Clubhouse has 30.2 million installations, 18.7 million of them on iOS, also according to SensorTower as reported by TechCrunch.)

Yet, this slowdown in installs and usage may simply indicate the rapid maturation and leveling off of an app that is competing for attention against a plethora of similar offerings, such as Spotify. ScreenRant, the online review publication, speculated that the significant decrease in downloads could be attributed to “the early novelty of the app wearing off, competitors offering their own takes on the curated audio rooms concept, and maybe even people leaving their homes a little more as COVID-19 restrictions are lifted.”

Even with those concerns, Clubhouse continues to capture the interest of investors and users. It enjoys some key advantages over larger competitors: early mover status, a smaller size, and a more nimble model that will enable it to introduce additional features and functions, such as its recent introduction of payments (for iOS users) and Backchannel, its direct messaging feature.

These upgrades will remain crucial to its continued growth and success. Even with pressure from tech giant competitors that are considering functionality similar to Clubhouse’s, many predict that Clubhouse will remain a strong competitor.

Some analysts, such as Bloomberg’s Alex Webb, raised a critical question about Clubhouse and its content monetization strategy.

Webb described one model akin to the subscription-based SiriusXM digital radio service, where users would pay for content either on a broad-based plan or for individual channels. Clubhouse recently rolled out a payments feature that lets users send money directly to other club members (primarily tips for conversation and room hosts), but only for iOS users. Clubhouse will not take any percentage of those payments, raising additional questions about its future monetization strategy. The move does, however, give popular and influential content hosts an incentive to join Clubhouse and contribute to the app’s rising popularity.

In addition, brands are quickly catching on to the potential for marketing products on the app. The targeted focus that Clubhouse’s moderated rooms provide could help it capitalize on brands’ desire to reach early-adopter influencers.

Noted author and technology analyst Nir Eyal unpacked Clubhouse’s appeal and how it follows his hook model, described as “a way of describing a user’s interactions with a product as they pass through four phases: a trigger to begin using the product, an action to satisfy the trigger, a variable reward for the action, and some type of investment that, ultimately, makes the product more valuable to the user. As [they go] through these phases, [they build] habits in the process.” Eyal explains how Clubhouse follows this four-step cycle, citing key aspects of the model that Clubhouse has clearly mastered in its early days, including internal triggers, variable rewards, scarcity, and rewards of the tribe.

Clubhouse will continue to be a social media phenomenon, and one to watch as it moves to its next level of adoption, innovation, and investment. We’ll keep a close eye on the app’s evolution, how it continues to push boundaries, and where it may be headed in light of its competitors’ developments.

With a judgment dated April 27 and published on June 4, 2021, the German Federal Court (Bundesgerichtshof – the “Court”) declared unfair and therefore illegal and unenforceable a common way to make changes to terms and conditions (“T&Cs”) used vis-à-vis consumers in Germany.

For more information, read the full client alert.

While Section 230 of the Communications Decency Act continues to face significant calls for reform or even elimination, the recent Coffee v. Google case illustrates that Section 230 continues to provide broad protection to online service providers.

In Coffee, the Northern District of California invoked Section 230 to dismiss a putative class action against Google alleging various claims premised on the theory that video games in the Google Play store with a gaming feature called “loot boxes” constituted illegal “slot machines or devices” under California state law.

To obtain these loot boxes, players must purchase virtual in-game currency through Google Play’s payment system. Players can then exchange the virtual currency for loot boxes, which give them a chance to obtain rare virtual items. Google charges a 30% commission on purchases of such virtual currency.

The plaintiffs asserted that these loot boxes “entice[d] consumers, including children, to engage in gambling and similar addictive conduct.” Because Google profited from the loot boxes through commission it charged on sales of virtual currency, the plaintiffs argued that Google should be held liable under a variety of state law claims.

In response, Google moved to dismiss the plaintiffs’ claims, arguing that it was immune under Section 230, which provides a safe harbor from claims that treat an online intermediary as the publisher or speaker of any information provided by another party.

The court evaluated Google’s Section 230 defense using the standard three-prong test as enunciated by the Ninth Circuit in Barnes v. Yahoo!, Inc.: Immunity exists for “(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.”

On the first prong—whether Google is a provider of an interactive computer service—the court determined that Google was a provider of such a service because it maintains a virtual online store where consumers can download various software applications that are generally created by other developers.

On the second prong—whether the plaintiffs seek to treat Google as a publisher or speaker under a state law cause of action—the court cited Fair Hous. Council of San Fernando Valley v. Roommates.Com for the proposition that publication includes “any activity that can be boiled down to deciding whether to exclude material that third parties seek to post online.” Because the plaintiffs apparently sought an order requiring Google to screen apps offered through its Google Play store for those containing the loot boxes, the court reasoned that the plaintiffs’ claims did treat Google as the publisher of the video game apps at issue.

The plaintiffs argued that Section 230 only protects publishers of “speech” rather than publishers of other content such as software. But the court rejected that argument, citing Evans v. Hewlett-Packard Co., which held that the defendant enjoyed immunity under Section 230 in connection with the operation of a web-based store that distributed an app developed by a third party.

The plaintiffs also argued that Section 230 did not apply because their claims did not treat Google as a publisher of another’s content but rather sought to “hold Google accountable for permitting and facilitating illegal gambling.” The plaintiffs cited Barnes for the proposition that Section 230 does not insulate interactive computer service providers from liability for their own wrongful conduct that goes beyond merely publishing another’s content.

Unconvinced, the court noted that Barnes denied Section 230 immunity to Yahoo! with respect to a promise that Yahoo! had made to the plaintiff to remove certain third-party content subsequent to, and separately from, the initial publication of the content. Google had made no such promise to the plaintiff in the instant case.

Finally, on the third prong—whether the information was provided by another information content provider—the plaintiffs noted the holding in Roommates that “[a] website operator is immune [under Section 230] only with respect to content created entirely by third parties.” Specifically, a provider that materially contributes to the illegality of the content at issue is not entitled to immunity under Section 230. However, the court held that the plaintiffs failed to allege any conduct by Google that would constitute a material contribution to the video games on its app store.

Accordingly, the court held that Google met all three prongs of the test and was entitled to immunity under Section 230 with respect to the apps in the Google Play store, and it dismissed the case with leave to amend.

Companies contracting with consumers have to take care to ensure their agreement terms are enforceable. In one of the first post-Brexit decisions on issues in an online consumer contract, a UK court recently showed that principles of fairness and transparency remain vital in the terms and conditions of consumer digital contracts.

In Europe, drafting digital consumer contracts requires extra care and thought regarding incorporation, meanings, and additional regulations in comparison to B2B contracts. This is as true in a post-Brexit world as it was back in 2012, when we reported on something similar. Plus ça change, plus c’est la même chose (“the more things change, the more they stay the same”) – as no Brexiteer would ever say.

Any consumer-facing company doing business online must step back and ask the question: Would this hold up in court? We’ve outlined some key takeaways from this case for organizations to consider when drafting digital consumer contracts that they apply to UK-based customers.

For more information, read the full client alert.

Partner Christiane Stuetzle, senior associate Patricia Ernst, and research assistant Susan Bischoff authored an article for Law360 covering how online content service providers must act to mitigate risks and avoid liability under the European Union’s Copyright Directive, created in an effort to strengthen the rights of copyright holders by making certain platforms that host user-uploaded content (UUC) liable for copyright infringements.

This article was first published on Law360 on May 14, 2021. It is also available as a download on our website. (Please note that Law360 may require a subscription to access this article.)


Until now, throughout the European Union, platforms hosting user-uploaded content have benefited from the safe harbor privilege under the EU E-Commerce Directive, which has shielded platforms from liability for copyright-infringing user uploads for more than 20 years.[1]

This safe harbor privilege implies that online content-sharing service providers (OCSSPs) only need to remove copyright-infringing content upon notice to avoid liability for copyright infringement.[2]

In an effort to strengthen the rights of copyright holders, the EU legislator recently decided, however, that certain platform providers will be on the hook for copyright infringements pertaining to user-uploaded content.[3] The directive’s rationale is to close the so-called value gap, a term used to describe rights holders’ missing remuneration if their works are uploaded and shared online by users.

While rights holders have a remuneration claim against such users — though it is difficult to enforce and rarely valuable commercially — they did not have a claim against OCSSPs until now. The directive’s new liability overhauls this substantially: OCSSPs will be considered to commit copyright infringement by making illegal user-uploaded content available.

Liability means, among other things, that OCSSPs can be subject to substantial remuneration and damage claims. Companies need to be aware that this applies even if the OCSSP has properly instructed its users, in its standard terms and conditions, that only non-infringing content may be uploaded.

Not all EU member states are supportive of the overhaul. In fact, the Polish government has even challenged the directive before the European Court of Justice.[4] While there is a lot of debate on the details of Article 17, the new liability regime is just around the corner.

The directive must be implemented into national law by June 7. National implementations vary both in timing – some member states will not meet the deadline – and in the liability exemptions they provide. If an EU member state fails to transpose the directive within the deadline, the directive will apply directly.

Therefore, it is high time companies had a plan of action to mitigate risks and avoid liability.

Who Is Affected and What Is the Exposure?

The new liability scheme is relevant to OCSSPs only, i.e., service providers whose main purpose, or at least one of whose main purposes, is to store and give the public access to a large amount of copyright-protected works or other subject matter uploaded by their users, which the provider organizes and promotes for profit-making purposes.

For example, categorizing the content and using targeted promotion within it would qualify as such organizing. This definition captures certain social media platforms.

For international companies, it is important to keep in mind that these new obligations will apply to any OCSSP hosting copyright-infringing user-uploaded content targeted at the European market regardless of whether its seat is situated within or outside the EU. An OCSSP headquartered in the U.S. but also doing business in Europe will therefore be subject to the new liability scheme.

Sanctions for copyright infringement committed by an OCSSP will take the form of legal remedies and claims determined by the respective EU member state. In Germany, for instance, such sanctions include not only cease-and-desist orders and a penalty in case of recurring infringement, but also claims for information and for damages in the amount of a hypothetical license fee for the infringing use.

How to Avoid Liability

Under the new liability regime, OCSSPs will have to observe proactive obligations in order to avoid a direct liability for copyright-infringing user-uploaded content.

First and foremost, an OCSSP can avoid direct liability by obtaining a license for the third-party content uploaded by its users.

If no license has been secured, the OCSSP must be able to prove that it complied with the following three obligations:

  • The OCSSP must make “best efforts” to obtain a license. The directive’s wording is less precise than some of the national draft implementations. Still, on this basis alone, it is likely that an OCSSP will be required to actively seek licenses from collective rights societies and large rights holders. License offers do not have to be accepted at any price, but a rejection means that the OCSSP remains liable for infringement of the relevant works in user-uploaded content — subject to the conditions described below.
  • The OCSSP must make best efforts to block copyright-infringing content for which the rights holder has provided the relevant information. Rights holders can provide the OCSSP with information on works that they wish not to be included in user-uploaded content. In practice, OCSSPs will likely have to implement some sort of reference database fed with that information. Content to be uploaded by the OCSSP’s user will then have to be checked against the database. It is to be expected that OCSSPs will only be able to comply with this obligation by employing advanced filter technologies.
  • Finally, the OCSSP must act expeditiously to disable access to or remove content upon receiving a corresponding request from the rights holder. Subsequently, the OCSSP must block future attempts to upload the removed content (the “stay-down” obligation) by suitable technical means. While the directive does not call such means an “automated upload filter,” most stakeholders consider this obligation to be precisely that. This underlines the economic dimension of Article 17: there is no “one size fits all” filter technology, and the directive leaves open how much companies have to invest in filter technologies.
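The blocking obligation described above can be pictured, in greatly simplified form, as a reference database of rights-holder notices against which each upload is checked. This sketch uses exact hashes purely for illustration – real-world systems rely on perceptual fingerprinting of audio and video – and the class and method names are invented:

```python
import hashlib


def _fingerprint(data: bytes) -> str:
    # Stand-in for a real perceptual fingerprint; exact hashing would
    # miss re-encoded or trimmed copies of the same work.
    return hashlib.sha256(data).hexdigest()


class ReferenceDatabase:
    """Stores works that rights holders have asked the OCSSP to block."""

    def __init__(self) -> None:
        self._blocked: set[str] = set()

    def register_notice(self, work: bytes) -> None:
        # Rights holder provides the information needed to identify the work.
        self._blocked.add(_fingerprint(work))

    def upload_allowed(self, upload: bytes) -> bool:
        # Each user upload is checked against the notified works.
        return _fingerprint(upload) not in self._blocked
```

In practice the matching step, not the database, is the hard (and expensive) part, which is why the obligation is widely read as requiring advanced filter technology.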

Adherence to these obligations by the OCSSP shall be determined in accordance with high industry standards of professional diligence and the principle of proportionality. This broad approach leaves EU member states with significant leeway when implementing the directive.

27 EU Member States, 27 Different Sets of Rules

As some of the member states’ implementations contain detailed specifications of the obligations imposed on OCSSPs, providers must consider the national approaches.

For companies operating internationally, it goes without saying that adhering to 27 different national compliance concepts will not be feasible. Instead, they will likely require a compliance concept that aligns with the strictest national set of rules or with their main market.

Germany is one of the most economically relevant markets within the EU, and its implementation draft of the directive, the German Draft Act,[5] is therefore one of the implementations that OCSSPs should monitor closely when developing their risk strategy.

The German Draft Act at a Glance

With regard to licensing best efforts, the German Draft Act adds the additional requirement that OCSSPs must accept licenses available through a collecting management organization or a dependent collecting body established in Germany, as well as individually offered licenses by rights holders, provided that the licenses:

  • Concern content that the OCSSP typically makes available in more than minor quantities, e.g., audio-visual content, music;
  • Cover a considerable repertoire of works and rights holders as well as the German territory; and
  • Allow for use under reasonable terms and conditions, including a reasonable remuneration.

In addition, the OCSSP has to proactively seek licenses from rights holders known to the OCSSP from prior business relationships or other circumstances. The German Draft Act does not stop here though. It also includes a direct compensation claim of authors and performers against the OCSSP to be asserted by collecting societies only.

This even applies where the OCSSP obtains the license from a rights holder such as a record label or a publisher. Even though the OCSSP has not entered into a contract with the author in that case, the author can claim appropriate remuneration from the OCSSP via its collecting society. Business insiders expect that the validity of such double payments will be among the first questions to be presented to the courts.

As regards the obligation to make best efforts to block pre-notified content, the German Draft Act provides for a nuanced procedure.

The user must be given the opportunity to flag the content as statutorily (parody, quote, etc.) or contractually permitted use. In addition, the German Draft Act introduces a new statutory copyright exemption for minor uses — up to 15 seconds of video, 160 characters of text or 125 kilobytes of graphic — against a statutory license fee that the OCSSP has to pay.
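Purely as an illustration of the draft’s numeric thresholds (and not of their legal interpretation – e.g., whether the limits apply per work or cumulatively), a minor-use check could look like this; the function and key names are invented:

```python
# Thresholds stated in the German Draft Act's minor use exemption.
MINOR_USE_LIMITS = {
    "video_seconds": 15,   # up to 15 seconds of video
    "text_chars": 160,     # up to 160 characters of text
    "graphic_kb": 125,     # up to 125 kilobytes of graphics
}


def is_minor_use(medium: str, amount: float) -> bool:
    """Return True if the used amount falls within the statutory limit
    for the given medium; media not listed are outside the exemption."""
    limit = MINOR_USE_LIMITS.get(medium)
    return limit is not None and amount <= limit
```

Note that even a qualifying minor use is not free: under the draft, the OCSSP owes a statutory license fee for it.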

This approach is remarkable for a number of reasons. At present, fundamental copyright exemptions such as the right of parody, under German law, do not require extra payment. That principle stays valid — except for uses of parody on OCSSPs. Further, the minor use exemption is a provision by the German legislator without basis in the Copyright Directive and the Information Society Directive.

The European Court of Justice had only recently determined in its “Metal on Metal” decision that copyright exemptions were to be conclusively determined by the Information Society Directive.[6] It remains to be seen if the German legislator’s current inconsistent and much criticized approach will make it into the final implementation.

Another German specificity is the “red buzzer” — a term that until recently was more likely to be associated with gaming shows than with the law. If content is flagged by the user or qualifies as minor use, the OCSSP must upload the content and inform the rights holder.

The rights holder may file a complaint with the OCSSP, starting a maximum one-week-long decision process. In such a case, trustworthy rights holders can make use of the red buzzer procedure, with the German Draft Act lacking a definition for “trustworthy.”

Once the red buzzer is pushed, the OCSSP is then required to immediately block the content until the conclusion of the complaints procedure. In theory, this may sound appealing to rights holders — especially for live broadcasts or content premieres since a few days of illegal exploitation can have a huge commercial impact.

In practice, the hurdle of the red buzzer is that the decision to use it needs to be made by a natural person, not an algorithm. That means rights holders need to staff up in order to benefit from the red buzzer.

If no red buzzer procedure applies, content will stay online until the complaint procedure’s conclusion.

Where content neither is flagged nor qualifies as minor use, the OCSSP must block the content and inform the user, which in turn may file a complaint with the OCSSP.

If works in user-uploaded content do not match any blocking requests, the content will be uploaded.
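The upload-handling procedure sketched over the last few paragraphs – flagging, minor use, the red buzzer, and the blocking request – can be condensed into a simple decision function. This is a rough, non-authoritative summary of the described draft rules, with invented names:

```python
def handle_upload(flagged_as_permitted: bool, minor_use: bool,
                  matches_blocking_request: bool,
                  red_buzzer_pressed: bool = False) -> str:
    """Condensed decision flow of the German Draft Act as described above."""
    if flagged_as_permitted or minor_use:
        if red_buzzer_pressed:
            # A trustworthy rights holder invoked the red buzzer:
            # immediate block until the complaint procedure concludes.
            return "blocked pending complaint procedure"
        # Content goes (or stays) online; the rights holder is informed
        # and may file a complaint, decided within one week at most.
        return "online until complaint procedure concludes"
    if matches_blocking_request:
        # Neither flagged nor minor use: block and inform the user,
        # who may in turn file a complaint.
        return "blocked; user informed"
    # No matching blocking request: the upload simply proceeds.
    return "uploaded"
```

The asymmetry is deliberate: without the red buzzer, disputed content stays up during the complaint procedure; with it, the content comes down first.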

Practical Considerations

The EU Commission announced that it would publish guidelines on the interpretation and implementation of Article 17.[7] Yet, with less than a month to go before the deadline, these guidelines are still in the making.

Since the directive will become directly applicable upon expiry of the implementation deadline, OCSSPs are well advised to have a strategy in place.

Regardless of national nuances, it seems very likely that the use of advanced filter technologies to meet the stay-down requirements will be a common requirement across the EU. Depending on the OCSSP’s business model, in an ideal scenario such filter technology already exists and can be licensed from a vendor; some companies already offer filter solutions for music, for instance. In other cases, tailor-made filter technology may need to be developed.

With regard to licensing best efforts, it makes sense for OCSSPs to prioritize the most relevant and most frequently used content categories, or even rights catalogs, and actively approach those rights holders first so as to secure licenses or, at least, evidence best efforts.


[1] Article 14, Directive 2000/31/EC of the European Parliament and of the Council of June 8, 2000, on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market.

[2] Under the nuanced and well-established case-law of the European Court of Justice (“ECJ”), the Safe Harbor privilege of the E-Commerce-Directive (see footnote 1) implies a neutral position of the host provider. This requires that the platform confines itself to providing its service neutrally and passively by a merely technical and automatic processing of the data provided by its customers (Case C-324/09, Judgment July 12, 2011, point 113; Joint Cases C-236/08 to C-238/08, Judgment March 23, 2010, point 114). Where a service provider plays an active role, e.g., by optimizing or promoting user content (Case C-324/09, point 116), it has presumed knowledge of or control over unlawful content stored and does thus not profit from the Safe Harbor protection.

[3] Article 17 of the EU Copyright Directive, DIRECTIVE (EU) 2019/790 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of April 17, 2019, on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC.

[4] Action brought on May 24, 2019, Case C-401/19. The opinion of the Advocate General that precedes each decision of the ECJ is scheduled for July 15, 2021.

[5] An English language version of the latest draft “Act on the Copyright Liability of Online Sharing Content Service Providers” can be found here. The draft is currently being discussed in the relevant committees of the Parliament.

[6] Case C-476/17, Judgment July 29, 2019.

[7] The directive requires the European Commission to issue guidance on the understanding of the directive’s provisions, including best practices for its implementation by the member states. The guidance is to be based on a stakeholder dialogue process. Accordingly, the Commission held several stakeholder dialogue meetings between October 2019 and February 2020 (further information on the meetings can be found here) and called on stakeholders to submit written statements.

After the presentation of a general “European Approach to Artificial Intelligence” by the EU Commission in March 2021, a detailed draft regulation aimed at safeguarding fundamental EU rights and user safety was published today (“Draft Regulation”). The Draft Regulation’s main provisions are the following:

  • A binding regulation for AI Systems (defined below) that directly applies to Providers and Users (both defined below), importers, and distributors of AI Systems in the EU, regardless of their seat.
  • A blacklist of certain AI practices.
  • Fines of up to EUR 30 million or up to 6% of annual turnover, whichever is higher.
  • Transparency obligations and, for High-Risk AI Systems (defined below), registration and extensive compliance obligations.

For more information, read the full client alert.

A recent ruling by the Ninth Circuit Court of Appeals in Lemmon v. Snap provides a reminder that while Section 230 of the Communications Decency Act provides broad immunity to the owners and operators of websites and mobile apps, that immunity is not without limits.

As a refresher, Section 230 provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Because of the broad immunity that website owners have enjoyed in the 25 years since its enactment, Section 230 has been hailed as the “26 words that created the Internet.”

The District Court Action

The case centers on the “Speed Filter” that Snap offered on its popular Snapchat messaging app. Users of the app could send a message to their friends that displayed the speed at which the user was traveling at the moment the message was sent.

In May 2017, Jason Davis (age 17), Hunter Morby (age 17), and Landen Brown (age 20) were driving down Cranberry Road in Walworth County, Wisconsin. The boys were traveling at speeds up to 123 MPH. They sped along at these high speeds for several minutes before they eventually ran off the road and crashed into a tree. Their car burst into flames. All three boys died. Shortly before the crash, one of the boys opened Snapchat to document how fast they were traveling.

In 2019, the parents of two of the boys commenced an action against Snap, the owner and operator of Snapchat, in the United States District Court for the Central District of California. The parents claimed that Snap was negligent in its design of the Speed Filter.

Snap moved to dismiss, arguing that the parents had failed to state a claim for negligence and that Snap was, in any event, immune from suit under Section 230.

Under Ninth Circuit authority, Section 230 immunity from liability exists for “(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.”

In February 2020, without reaching the issue of whether the parents had properly stated a negligence claim against Snap, the district court held that Snap was entitled to Section 230 immunity and dismissed the parents’ claims. The court reasoned that Section 230 immunity applies where a website merely provides a framework that users could employ for proper or improper purposes, and found that the Speed Filter is such a neutral tool: essentially a speedometer that allows users to capture and share their speeds with others.

The district court was guided by the Ninth Circuit’s decision in Dyroff v. Ultimate Software Grp., Inc. In that case, the plaintiff brought an action against the operator of the “Experience Project” website after her son died from a heroin overdose. The website allowed users to register anonymously, recommended that users join certain groups based on the content of their posts and other attributes, and sent email notifications when a user posted content to a group. The plaintiff’s son posted in a “heroin-related group” asking where to purchase heroin, received an email notification regarding his post, and ultimately connected off the site to purchase heroin from a contact he made on the site.

In Dyroff, the plaintiff argued that the website operator should not receive Section 230 immunity because she was trying to hold the website operator liable for its own “content” — namely, the recommendation and notification functions. The Ninth Circuit disagreed. It held that the website operator was immune from liability under Section 230 because its functions, including recommendations and notifications, were content-neutral tools used to facilitate communications.

The Ninth Circuit Decision

On May 4, 2021, the Ninth Circuit issued its decision in the parents’ appeal from the district court’s dismissal of their claims. The Ninth Circuit held that Section 230 immunity does not apply to bar the parents’ negligent design claim against Snap in connection with the Speed Filter.

The Ninth Circuit held that because the parents’ claim neither treats Snap as a “publisher or speaker” nor relies on “information provided by another information content provider,” Snap does not enjoy Section 230 immunity from the suit. The court noted that the parents seek to hold Snap liable for its allegedly “unreasonable and negligent” design decisions regarding Snapchat, and that such a claim rests on the premise that manufacturers have a duty to exercise due care in supplying products that do not present an unreasonable risk of injury or harm to the public.

The Ninth Circuit held that website and app providers continue to face the prospect of liability, even for their “neutral tools,” so long as plaintiffs’ claims do not blame them for the content that third parties generate with those tools.

Key Takeaways

In recent years, Section 230 immunity has come under increased scrutiny with calls for its reform or repeal. Courts have generally recognized that the language of Section 230 is broadly worded and that the immunity that Congress granted to promote the growth of the Internet should be broadly construed. It remains to be seen whether the Ninth Circuit’s Snap decision reflects a movement away from interpreting Section 230 immunity in broad terms. The decision certainly shows that Section 230 immunity has its limits and that those limits will continue to be tested.

The Supreme Court has issued its much-anticipated ruling in Facebook v. Duguid, impacting many pending Telephone Consumer Protection Act (TCPA) cases nationwide and providing guidance to the many businesses that engage in calling and texting campaigns. The TCPA generally requires an individual’s prior consent to use an automatic telephone dialing system (an “autodialer”) to call or text his or her mobile phone, including for non-marketing purposes.

The definition of an autodialer is thus crucial to whether and when a business may call or text its customers, prospects, or even employees. Unanimously reversing the Ninth Circuit, the Court held that a “necessary feature of an autodialer . . . is the capacity to use a random or sequential number generator to either store or produce phone numbers to be called.” The Court further held that Facebook’s text-notification system should not be considered an autodialer because it sent “targeted or individualized” texts to “numbers linked to specific accounts,” instead of randomly or sequentially storing or producing those numbers.

We explore this decision in our most recent client alert.