Clubhouse, the formerly invitation-only social media darling that has captured the attention of investors, early adopters, and competitors since its introduction in April 2020, now faces significant challenges as it strives to remain relevant and attract new, engaged users.

Since our previous report on Clubhouse in March 2021, the social media app has rolled out some significant changes and upgrades to its platform.

Based on trends from other social media apps and platforms, these changes would seem to predict a significant uptick in downloads, new users, and activity.

While Clubhouse added more than 1 million Android users within weeks of its launch in May 2021, more recent numbers indicate that the app’s popularity may be declining. Wired, citing a report from analytics firm Sensor Tower, reported that Clubhouse had only 484,000 new installs globally during the five-day period of July 21–25, 2021, compared to the 10 million iOS downloads it received in February 2021. (To date, Clubhouse has 30.2 million installations, 18.7 million of them on iOS, also according to Sensor Tower as reported by TechCrunch.)

Yet this slowdown in installs and usage may simply indicate the rapid maturation and leveling off of an app that is competing for attention against a plethora of other offerings providing similar experiences, such as Spotify. ScreenRant, the online review publication, speculated that the significant decrease in downloads could be attributed to “the early novelty of the app wearing off, competitors offering their own takes on the curated audio rooms concept, and maybe even people leaving their homes a little more as COVID-19 restrictions are lifted.”

Even with those concerns, Clubhouse continues to capture the interest of investors and users. It enjoys some key advantages over larger competitors: early mover status, a smaller size, and a more nimble model that will enable it to introduce additional features and functions, such as its recent introduction of payments (for iOS users) and Backchannel, its direct messaging feature.

These upgrades will remain crucial to its continued growth and success. Even with pressure from tech giant competitors that are considering functionality similar to Clubhouse’s, many predict that Clubhouse will remain a strong competitor.

Some analysts, such as Bloomberg’s Alex Webb, raised a critical question about Clubhouse and its content monetization strategy.

Webb described one model akin to the subscription-based SiriusXM digital radio service, where users would pay for content either on a broad-based plan or by individual channel. Clubhouse recently rolled out a payments feature that lets users send money directly to other club members (primarily as tips to conversation and room hosts), but only for iOS users. Clubhouse will not take any percentage of those payments, raising additional questions about its future monetization strategy. The move does, however, give popular and influential content hosts an incentive to join Clubhouse and contribute to the app’s rising popularity.

Brands, too, are catching on quickly to the potential for marketing products on the app. The targeted focus that Clubhouse provides (with its moderated rooms) could help it capitalize on brands’ desire to reach early adopter influencers.

Noted author and technology analyst Nir Eyal unpacked Clubhouse’s appeal and how it follows his hook model, described as “a way of describing a user’s interactions with a product as they pass through four phases: a trigger to begin using the product, an action to satisfy the trigger, a variable reward for the action, and some type of investment that, ultimately, makes the product more valuable to the user. As [they go] through these phases, [they build] habits in the process.” Eyal explains how Clubhouse follows this four-step cycle, citing key aspects of the model that Clubhouse has clearly mastered in its early days, including internal triggers, variable rewards, scarcity, and rewards of the tribe.

Clubhouse will continue to be a social media phenomenon, and one to watch as it moves to its next level of adoption, innovation, and investment. We’ll keep a close eye on the app’s evolution, how it continues to push boundaries, and where it may be headed in light of its competitors’ developments.

With a judgment dated April 27 and published on June 4, 2021, the German Federal Court (Bundesgerichtshof – the “Court”) declared unfair and therefore illegal and unenforceable a common way to make changes to terms and conditions (“T&Cs”) used vis-à-vis consumers in Germany.

For more information, read the full client alert.

While Section 230 of the Communications Decency Act continues to face significant calls for reform or even elimination, the recent Coffee v. Google case illustrates that Section 230 continues to provide broad protection to online service providers.

In Coffee, the Northern District of California invoked Section 230 to dismiss a putative class action against Google alleging various claims premised on the theory that video games in the Google Play store with a gaming feature called “loot boxes” constituted illegal “slot machines or devices” under California state law.

To obtain these loot boxes, players must purchase virtual in-game currency through Google Play’s payment system. Players can then exchange the virtual currency for loot boxes, which give them a chance to obtain rare virtual items. Google charges a 30% commission on purchases of such virtual currency.
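To make the mechanic concrete, here is a toy sketch of the purchase-and-reward flow described above. Everything other than the 30% commission (the coin exchange rate, the loot box price, and the drop odds) is an invented assumption for illustration, not Google’s actual implementation.

```python
import random

COMMISSION = 0.30  # Google's cut of virtual-currency purchases

def buy_currency(usd: float) -> tuple[int, float]:
    """Exchange real money for in-game coins; the platform keeps 30%."""
    coins = int(usd * 100)               # hypothetical rate: 100 coins per dollar
    platform_cut = usd * COMMISSION
    return coins, platform_cut

def open_loot_box(coins: int, price: int = 50) -> str:
    """Spend coins on a randomized reward, the chance element behind
    the plaintiffs' 'slot machine' theory."""
    if coins < price:
        raise ValueError("not enough coins")
    return random.choices(["common item", "rare item"], weights=[95, 5])[0]

coins, cut = buy_currency(5.00)  # $5 buys 500 coins; the platform keeps $1.50
print(open_loot_box(coins))      # e.g., 'common item'
```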

The plaintiffs asserted that these loot boxes “entice[d] consumers, including children, to engage in gambling and similar addictive conduct.” Because Google profited from the loot boxes through commission it charged on sales of virtual currency, the plaintiffs argued that Google should be held liable under a variety of state law claims.

In response, Google moved to dismiss the plaintiffs’ claims, arguing that it was immune under Section 230, which provides a safe harbor from claims that treat an online intermediary as the publisher or speaker of any information provided by another party.

The court evaluated Google’s Section 230 defense using the standard three-prong test as enunciated by the Ninth Circuit in Barnes v. Yahoo!, Inc.: Immunity exists for “(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.”

On the first prong—whether Google is a provider of an interactive computer service—the court determined that Google was a provider of such a service because it maintains a virtual online store where consumers can download various software applications that are generally created by other developers.

On the second prong—whether the plaintiffs seek to treat Google as a publisher or speaker under a state law cause of action—the court cited Fair Hous. Council of San Fernando Valley v. Roommates.Com for the proposition that publication includes “any activity that can be boiled down to deciding whether to exclude material that third parties seek to post online.” Because the plaintiffs apparently sought an order requiring Google to screen apps offered through its Google Play store for those containing the loot boxes, the court reasoned that the plaintiffs’ claims did treat Google as the publisher of the video game apps at issue.

The plaintiffs argued that Section 230 only protects publishers of “speech” rather than publishers of other content such as software. But the court rejected that argument, citing Evans v. Hewlett-Packard Co., which held that the defendant enjoyed immunity under Section 230 in connection with the operation of a web-based store that distributed an app developed by a third party.

The plaintiffs also argued that Section 230 did not apply because their claims did not treat Google as a publisher of another’s content but rather sought to “hold Google accountable for permitting and facilitating illegal gambling.” The plaintiffs cited Barnes for the proposition that Section 230 does not insulate interactive computer service providers from liability for their own wrongful conduct that goes beyond merely publishing another’s content.

Unconvinced, the court noted that Barnes denied Section 230 immunity to Yahoo! with respect to a promise that Yahoo! had made to the plaintiff to remove certain third-party content subsequent to, and separately from, the initial publication of the content. Google had made no such promise to the plaintiffs in the instant case.

Finally, on the third prong—whether the information was provided by another information content provider—the plaintiffs noted the holding in Roommates that “[a] website operator is immune [under Section 230] only with respect to content created entirely by third parties.” Specifically, a provider that materially contributes to the illegality of the content at issue is not entitled to immunity under Section 230. However, the court held that the plaintiffs failed to allege any conduct by Google that would constitute a material contribution to the video games on its app store.

The court accordingly held that Google met all three prongs of the test and was entitled to immunity under Section 230 with respect to the apps in the Google Play store, and it dismissed the case with leave to amend.

Companies contracting with consumers have to take care to ensure their agreement terms are enforceable. In one of the first post-Brexit decisions on issues in an online consumer contract, a UK court recently showed that principles of fairness and transparency remain vital in the terms and conditions of consumer digital contracts.

In Europe, drafting digital consumer contracts requires extra care and thought to be given to incorporation, meanings, and additional regulations as compared with B2B contracts. This is as true in a post-Brexit world as it was back in 2012 when we reported on something similar. Plus ça change, plus c’est la même chose (“the more things change, the more they stay the same”) – as no Brexiteer would ever say.

Any consumer-facing company doing business online must step back and ask the question: Would this hold up in court? We’ve outlined some key takeaways from this case for organizations to consider when drafting digital consumer contracts for UK-based customers.

For more information, read the full client alert.

Partner Christiane Stuetzle, senior associate Patricia Ernst, and research assistant Susan Bischoff authored an article for Law360 on how online content-sharing service providers must act to mitigate risks and avoid liability under the European Union’s Copyright Directive, which was created to strengthen the rights of copyright holders by making certain platforms that host user-uploaded content (UUC) liable for copyright infringements.

This article was first published on Law360 on May 14, 2021. It is also available as a download on our website. (Please note that Law360 may require a subscription to access this article.)

===

Until now, throughout the European Union, platforms hosting user-uploaded content have profited from the safe harbor privilege under the EU E-Commerce Directive, which has been shielding platforms from liability for copyright-infringing user uploads for more than 20 years.[1]

This safe harbor privilege means that online content-sharing service providers, or OCSSPs, need only remove copyright-infringing content upon notice to avoid liability for copyright infringement.[2]

In an effort to strengthen the rights of copyright holders, the EU legislator recently decided, however, that certain platform providers will be on the hook for copyright infringements pertaining to user-uploaded content.[3] The directive’s rationale is to close the so-called value gap, a term describing the remuneration that rights holders miss out on when their works are uploaded and shared online by users.

While rights holders have a remuneration claim against such users — though it is difficult to enforce and rarely valuable commercially — they did not have a claim against OCSSPs until now. The directive’s new liability overhauls this substantially: OCSSPs will be considered to commit copyright infringement by making illegal user-uploaded content available.

Liability means, among other things, that OCSSPs can be subject to substantial remuneration and damages claims. Companies need to be aware that this applies even if the OCSSP has properly instructed its users in its standard terms and conditions that only non-infringing content may be uploaded.

Not all EU member states are supportive of the overhaul. In fact, the Polish government has even challenged the directive before the European Court of Justice.[4] While there is a lot of debate on the details of Article 17, the new liability regime is just around the corner.

The directive requires implementation into national law by June 7. National implementations vary both in timing (some member states will not meet the deadline) and in their liability exemptions. Failure by an EU member state to transpose the directive within the deadline will mean that the directive applies directly.

Therefore, it is high time companies had a plan of action to mitigate risks and avoid liability.

Who Is Affected and What Is the Exposure?

The new liability scheme is relevant to OCSSPs only, i.e., service providers whose main purpose, or at least one of whose main purposes, is to store and give the public access to a large amount of copyright-protected works or other subject matter uploaded by their users, and which organize and promote that content for profit-making purposes.

For example, categorizing content and using targeted promotion within the service would qualify as such organizing. This definition includes certain social media platforms.

For international companies, it is important to keep in mind that these new obligations will apply to any OCSSP hosting copyright-infringing user-uploaded content targeted at the European market regardless of whether its seat is situated within or outside the EU. An OCSSP headquartered in the U.S. but also doing business in Europe will therefore be subject to the new liability scheme.

Sanctions for copyright infringement committed by an OCSSP will take the form of legal remedies and claims determined by the respective EU member state. In Germany, for instance, such sanctions include not only cease-and-desist orders and penalties for recurring infringement, but also information claims and damages claims in the amount of a hypothetical license fee for the infringing use.

How to Avoid Liability

Under the new liability regime, OCSSPs will have to comply with proactive obligations in order to avoid direct liability for copyright-infringing user-uploaded content.

First and foremost, an OCSSP can avoid direct liability by obtaining a license for the third-party content uploaded by its users.

If no license has been secured, the OCSSP must be able to prove that it complied with the following three obligations:

  • The OCSSP must make “best efforts” to obtain a license. The directive’s wording is less precise than some of the national draft implementations. Still, on this basis alone, it is likely that an OCSSP will be required to actively seek licenses from collective rights societies and large rights holders. License offers do not have to be accepted at any price, but rejecting an offer means that the OCSSP remains liable for infringement of the relevant works in user-uploaded content, subject to the conditions described below.
  • The OCSSP must make best efforts to block copyright-infringing content for which the rights holder has provided the relevant information. Rights holders can give the OCSSP information on works that they do not wish to be included in user-uploaded content. In practice, OCSSPs will likely have to implement some sort of reference database fed with that information, and content uploaded by the OCSSP’s users will then have to be checked against that database (a simplified sketch of such a check follows this list). OCSSPs will most likely only be able to comply with this obligation by employing advanced filter technologies.
  • Finally, the OCSSP must act expeditiously to disable access to or remove content upon receiving a corresponding request from the rights holder. Subsequently, the OCSSP must block future attempts to upload the removed content (the “stay-down” obligation) by suitable technical means. While the directive does not call such means an “automated upload filter,” most stakeholders characterize this obligation as precisely that. This underlines the economic dimension of Article 17: there is no one-size-fits-all filter technology, and the directive leaves open how much companies have to invest in filter technologies.
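For a sense of what such a database check involves, here is a minimal sketch in Python. The data model, the fingerprint function, and the matching logic are all simplifying assumptions: real systems rely on perceptual audio/video fingerprinting rather than exact hashes, which only catch byte-identical copies.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class ReferenceEntry:
    """A work a rights holder has asked the platform to block (hypothetical model)."""
    work_id: str
    fingerprint: str
    rights_holder: str

class ReferenceDatabase:
    """Stores rights holders' blocking requests, keyed by content fingerprint."""

    def __init__(self):
        self._by_fingerprint = {}

    def register(self, entry: ReferenceEntry) -> None:
        self._by_fingerprint[entry.fingerprint] = entry

    def match(self, fp: str):
        return self._by_fingerprint.get(fp)

def fingerprint(content: bytes) -> str:
    # Stand-in for a perceptual fingerprint; a cryptographic hash
    # only detects exact copies, not re-encodes or excerpts.
    return hashlib.sha256(content).hexdigest()

def check_upload(content: bytes, db: ReferenceDatabase) -> str:
    """Return 'block' if the upload matches pre-notified content, else 'publish'."""
    entry = db.match(fingerprint(content))
    return "block" if entry is not None else "publish"

# Usage: a rights holder pre-notifies a work, and a matching upload is blocked.
db = ReferenceDatabase()
work = b"example audio bytes"
db.register(ReferenceEntry("work-001", fingerprint(work), "Example Label"))
print(check_upload(work, db))               # block
print(check_upload(b"unrelated clip", db))  # publish
```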

Adherence to these obligations by the OCSSP shall be determined in accordance with high industry standards of professional diligence and the principle of proportionality. This broad approach leaves EU member states with significant leeway when implementing the directive.

27 EU Member States, 27 Different Sets of Rules

As some of the member states’ implementations contain detailed specifications of the obligations imposed on OCSSPs, providers must consider the national approaches.

For companies operating internationally, it goes without saying that adhering to 27 different national compliance concepts will not be feasible. Instead, they will likely require a compliance concept that aligns with the strictest national set of rules or with their main market.

Germany is one of the most economically relevant markets within the EU, and its draft implementation of the directive, the German Draft Act,[5] is therefore one of the implementations that OCSSPs should closely monitor when developing their risk strategy.

The German Draft Act at a Glance

With regard to licensing best efforts, the German Draft Act adds the requirement that OCSSPs must accept licenses available through a collective management organization or a dependent collecting body established in Germany, as well as licenses offered individually by rights holders, provided that the licenses:

  • Concern content that the OCSSP typically makes available in more than minor quantities, e.g., audio-visual content, music;
  • Cover a considerable repertoire of works and rights holders as well as the German territory; and
  • Allow for use under reasonable terms and conditions, including a reasonable remuneration.

In addition, the OCSSP has to proactively seek licenses from rights holders known to the OCSSP from prior business relationships or other circumstances. The German Draft Act does not stop there, though: it also includes a direct compensation claim for authors and performers against the OCSSP, to be asserted by collecting societies only.

This even applies where the OCSSP obtains the license from a rights holder such as a record label or a publisher. Even though the OCSSP has not entered into a contract with the author in that case, the author can claim appropriate remuneration from the OCSSP via its collecting society. Business insiders expect that the validity of such double payments will be among the first questions to be presented to the courts.

As regards the obligation to make best efforts to block pre-notified content, the German Draft Act provides for a nuanced procedure.

The user must be given the opportunity to flag the content as statutorily (parody, quotation, etc.) or contractually permitted use. In addition, the German Draft Act introduces a new statutory copyright exemption for minor uses (up to 15 seconds of video, 160 characters of text, or 125 kilobytes of graphics) against a statutory license fee that the OCSSP has to pay.
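Because the draft states the minor-use thresholds as hard numbers, they lend themselves to a simple pre-check before any blocking decision. The sketch below assumes clean metadata is available for each upload; it illustrates the thresholds only, not how any platform actually implements the exemption.

```python
# Minor-use thresholds stated in the German Draft Act.
MAX_VIDEO_SECONDS = 15
MAX_TEXT_CHARS = 160
MAX_IMAGE_BYTES = 125 * 1024  # 125 kilobytes

def is_minor_use(kind: str, *, seconds: float = 0, chars: int = 0,
                 size_bytes: int = 0) -> bool:
    """Return True if an excerpt stays within the draft's minor-use limits."""
    if kind == "video":
        return seconds <= MAX_VIDEO_SECONDS
    if kind == "text":
        return chars <= MAX_TEXT_CHARS
    if kind == "image":
        return size_bytes <= MAX_IMAGE_BYTES
    return False

# A 12-second clip qualifies (against the statutory fee); a 20-second clip does not.
assert is_minor_use("video", seconds=12)
assert not is_minor_use("video", seconds=20)
```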

This minor-use approach is remarkable for a number of reasons. At present, fundamental copyright exemptions such as the right of parody do not, under German law, require extra payment. That principle remains valid, except for uses of parody on OCSSPs. Further, the minor-use exemption is a creation of the German legislator without basis in either the Copyright Directive or the Information Society Directive.

The European Court of Justice only recently held in its “Metal on Metal” decision that copyright exemptions are to be conclusively determined by the Information Society Directive.[6] It remains to be seen whether the German legislator’s inconsistent and much-criticized approach will make it into the final implementation.

Another German specificity is the “red buzzer,” a term that until recently was more likely to be associated with game shows than with the law. If content is flagged by the user or qualifies as minor use, the OCSSP must upload the content and inform the rights holder.

The rights holder may then file a complaint with the OCSSP, starting a decision process lasting at most one week. In such a case, trustworthy rights holders can invoke the red buzzer procedure, although the German Draft Act does not define “trustworthy.”

Once the red buzzer is pushed, the OCSSP must immediately block the content until the conclusion of the complaint procedure. In theory, this may sound appealing to rights holders, especially for live broadcasts or content premieres, since even a few days of illegal exploitation can have a huge commercial impact.

In practice, the hurdle of the red buzzer is that the decision to use it needs to be made by a natural person, not an algorithm. That means rights holders need to staff up in order to benefit from the red buzzer.

If no red buzzer procedure applies, content will stay online until the complaint procedure’s conclusion.

Where content is neither flagged nor qualifies as minor use, the OCSSP must block the content and inform the user, who in turn may file a complaint with the OCSSP.

If works in user-uploaded content do not match any blocking requests, the content will be uploaded.
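Putting the preceding steps together, the draft’s upload procedure reduces to a decision flow along the following lines. This is our simplified reading of the draft, with hypothetical stand-in logic (the matching step is a plain set lookup); the statutory procedure contains further nuances.

```python
def handle_upload(content: str, blocking_requests: set,
                  user_flagged_permitted: bool, minor_use: bool,
                  red_buzzer_pressed: bool) -> str:
    """Simplified upload decision flow under the German Draft Act (our reading)."""
    if content not in blocking_requests:
        return "published"  # no blocking request on file

    if user_flagged_permitted or minor_use:
        # Flagged or minor use: the content goes up and the rights holder
        # is informed and may complain; a trustworthy rights holder can
        # press the "red buzzer" to take it offline pending the complaint.
        if red_buzzer_pressed:
            return "blocked pending complaint procedure"
        return "published pending complaint procedure"

    # Neither flagged nor minor use: block and inform the user,
    # who may in turn file a complaint.
    return "blocked"

requests = {"pre-notified work"}
print(handle_upload("pre-notified work", requests, True, False, False))
# -> published pending complaint procedure
```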

Practical Considerations

The EU Commission announced that it would publish guidelines on the interpretation and implementation of Article 17.[7] Yet, with less than a month to go before the deadline, these guidelines are still in the making.

Since the directive will become directly applicable upon expiry of the implementation deadline, OCSSPs are well advised to have a strategy in place.

Regardless of national nuances, it seems very likely that the use of advanced filter technologies to meet the stay-down requirements will be a common requirement across the EU. Depending on the business model of the respective OCSSP, in an ideal scenario such filter technology already exists and can be licensed from a vendor; some companies already offer filter solutions for music, for instance. In other cases, there may be a need to develop tailor-made filter technology.

With regard to licensing best efforts, it makes sense for OCSSPs to prioritize the most relevant and most frequently used content categories, or even rights catalogs, and to actively approach those rights holders first, so as to secure licenses or, at least, to evidence best efforts.

Footnotes

[1] Article 14 of Directive 2000/31/EC of the European Parliament and of the Council of June 8, 2000, on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (E-Commerce Directive).

[2] Under the nuanced and well-established case law of the European Court of Justice (“ECJ”), the Safe Harbor privilege of the E-Commerce Directive (see footnote 1) implies a neutral position of the host provider. This requires that the platform confines itself to providing its service neutrally and passively by a merely technical and automatic processing of the data provided by its customers (Case C-324/09, Judgment July 12, 2011, point 113; Joint Cases C-236/08 to C-238/08, Judgment March 23, 2010, point 114). Where a service provider plays an active role, e.g., by optimizing or promoting user content (Case C-324/09, point 116), it has presumed knowledge of or control over unlawful content stored and thus does not benefit from the Safe Harbor protection.

[3] Article 17 of the EU Copyright Directive: Directive (EU) 2019/790 of the European Parliament and of the Council of April 17, 2019, on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC.

[4] Action brought on May 24, 2019, Case C-401/19. The opinion of the Advocate General that precedes each decision of the ECJ is scheduled for July 15, 2021.

[5] An English language version of the latest draft “Act on the Copyright Liability of Online Content Sharing Service Providers” can be found here. The draft is currently under discussion in the relevant committees of the Parliament.

[6] Case C-476/17, Judgment July 29, 2019.

[7] The directive requires the European Commission to issue guidance on the understanding of the directive’s provisions, including best practices for its implementation by the member states. The guidance is to be based on a stakeholder dialogue process. Accordingly, the Commission held several stakeholder dialogue meetings between October 2019 and February 2020 (further information on the meetings can be found here) and called for written statements from stakeholders.

After the presentation of a general “European Approach to Artificial Intelligence” by the EU Commission in March 2021, a detailed draft regulation aimed at safeguarding fundamental EU rights and user safety was published today (“Draft Regulation”). The Draft Regulation’s main provisions are the following:

  • A binding regulation for AI Systems (defined below) that directly applies to Providers and Users (both defined below), importers, and distributors of AI Systems in the EU, regardless of their seat.
  • A blacklist of certain AI practices.
  • Fines of up to EUR 30 million or up to 6% of annual turnover, whichever is higher.
  • Transparency obligations and, for High-Risk AI Systems (defined below), registration and extensive compliance obligations.

For more information, read the full client alert.

A recent ruling by the Ninth Circuit Court of Appeals in Lemmon v. Snap provides a reminder that while Section 230 of the Communications Decency Act provides broad immunity to the owners and operators of websites and mobile apps, that immunity is not without limits.

As a refresher, Section 230 provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Because of the broad immunity that website owners have enjoyed in the 25 years since its enactment, this provision has been hailed as “the 26 words that created the Internet.”

The District Court Action

The case centers on the “Speed Filter” that Snap offered on its popular Snapchat messaging app. Users of the app could send a message to their friends that displayed the speed at which the user was traveling at the moment the message was sent.

In May 2017, Jason Davis (age 17), Hunter Morby (age 17), and Landen Brown (age 20) were driving down Cranberry Road in Walworth County, Wisconsin, at speeds of up to 123 MPH. They sped along for several minutes before eventually running off the road and crashing into a tree. The car burst into flames, and all three boys died. Shortly before the crash, one of the boys had opened Snapchat to document how fast they were traveling.

In 2019, the parents of two of the boys commenced an action against Snap, the owner and operator of Snapchat, in the United States District Court for the Central District of California. The parents claimed that Snap was negligent in its design of the Speed Filter.

Snap moved to dismiss, arguing that the parents had failed to state a claim for negligence and that Snap was, in any event, immune from suit under Section 230.

Under Ninth Circuit authority, Section 230 immunity from liability exists for “(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.”

In February 2020, without reaching the issue of whether the parents had properly stated a negligence claim against Snap, the district court held that Snap was entitled to Section 230 immunity and dismissed the parents’ claims. The district court reasoned that Section 230 immunity applies where a website merely provides a framework that could be utilized for proper or improper purposes by the user, and it found that the Speed Filter is such a neutral tool: essentially a speedometer that allows users to capture and share their speeds with others.

The district court was guided by the Ninth Circuit’s decision in Dyroff v. Ultimate Software Grp., Inc. In that case, the plaintiff brought an action against the operator of the “Experience Project” website after her son died from a heroin overdose. The website allowed users to register anonymously, recommended that users join certain groups based on the content of their posts and other attributes, and sent email notifications when a user posted content to a group. The plaintiff’s son posted in a “heroin-related group” about where to purchase heroin, received an email notification regarding his post, and ultimately connected off the site to purchase heroin from a contact he had made on the site.

In Dyroff, the plaintiff argued that the website operator should not receive Section 230 immunity because she was trying to hold the website operator liable for its own “content” — namely, the recommendation and notification functions. The Ninth Circuit disagreed. It held that the website operator was immune from liability under Section 230 because its functions, including recommendations and notifications, were content-neutral tools used to facilitate communications.

The Ninth Circuit Decision

On May 4, 2021, the Ninth Circuit issued its decision in the parents’ appeal from the district court’s dismissal of their claims. The Ninth Circuit held that Section 230 immunity does not apply to bar the parents’ negligent design claim against Snap in connection with the Speed Filter.

The Ninth Circuit held that because the parents’ claim neither treats Snap as a “publisher or speaker” nor relies on “information provided by another information content provider,” Snap does not enjoy Section 230 immunity from this suit. The court noted that the parents seek to hold Snap liable for its allegedly “unreasonable and negligent” design decisions regarding Snapchat, and that such claims rest on the premise that manufacturers have a duty to exercise due care in supplying products that do not present an unreasonable risk of injury or harm to the public.

The Ninth Circuit held that website and app providers continue to face the prospect of liability, even for their “neutral tools,” so long as plaintiffs’ claims do not blame them for the content that third parties generate with those tools.

Key Takeaways

In recent years, Section 230 immunity has come under increased scrutiny with calls for its reform or repeal. Courts have generally recognized that the language of Section 230 is broadly worded and that the immunity that Congress granted to promote the growth of the Internet should be broadly construed. It remains to be seen whether the Ninth Circuit’s Snap decision reflects a movement away from interpreting Section 230 immunity in broad terms. The decision certainly shows that Section 230 immunity has its limits and that those limits will continue to be tested.

The Supreme Court has issued its much-anticipated ruling in Facebook v. Duguid, impacting many pending Telephone Consumer Protection Act (TCPA) cases nationwide and providing guidance to the many businesses that engage in calling and texting campaigns. The TCPA generally requires an individual’s prior consent to use an automatic telephone dialing system (an “autodialer”) to call or text his or her mobile phone, including for non-marketing purposes.

The definition of an autodialer is thus crucial to whether and when a business may call or text its customers, prospects, or even employees. Unanimously reversing the Ninth Circuit, the Court held that a “necessary feature of an autodialer . . . is the capacity to use a random or sequential number generator to either store or produce phone numbers to be called.” The Court further held that Facebook’s text-notification system should not be considered an autodialer because it sent “targeted or individualized” texts to “numbers linked to specific accounts,” instead of randomly or sequentially storing or producing those numbers.
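The line the Court drew is easy to illustrate in code. In this purely hypothetical sketch (the TCPA regulates equipment capacity, not any particular implementation), the first function generates numbers randomly, the hallmark of an autodialer under Duguid, while the second merely texts numbers already linked to specific accounts, as Facebook’s notification system did.

```python
import random

def autodialer_targets(count: int) -> list[str]:
    # Produces phone numbers with a random number generator: the
    # "necessary feature" of an autodialer under Facebook v. Duguid.
    return [f"+1{random.randint(2_000_000_000, 9_999_999_999)}"
            for _ in range(count)]

def targeted_notifications(accounts: dict[str, str],
                           to_notify: list[str]) -> list[str]:
    # Texts numbers linked to specific accounts (e.g., login alerts);
    # per the Court, such targeted, individualized texting is not autodialing.
    return [accounts[user] for user in to_notify if user in accounts]

accounts = {"alice": "+12065550100", "bob": "+12065550101"}
print(targeted_notifications(accounts, ["alice"]))  # ['+12065550100']
```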

We explore this decision in our most recent client alert.

We’ve all been there: How many times have we downloaded a new social media app, only to have one of the sign-up steps ask for access to our contacts or address book? While on the surface the request seems innocent enough – the whole point of social media is to be social and connect with others – some apps may take that access too far, raising both legal and ethical questions around personal data privacy and security.

Take, for example, the new, wildly popular, invitation-only audio chat app Clubhouse. One of the first things the app requests of the user is access to his or her contacts. While granting access is not required, declining defeats the purpose of signing up for the service in the first place, as the app won’t let the user invite others to join.

Many apps require access to features on one’s device to work properly: for example, without location access, ride-share services, such as Lyft and Uber, won’t work.

But granting access to one’s entire address book – past and present contacts alike – gives the platform usage rights over that data in ways the user may not intend or even be aware of.

A recent article from OneZero unpacked the issues surrounding these data-handling practices and provided several useful – and sometimes disturbing – examples of Clubhouse members who unwittingly granted access to their contacts, only to witness uncomfortable and sometimes embarrassing connections surface on the invitation-only app.

Many other social connection apps that have proliferated in the last year or two (especially in light of the COVID-19 pandemic) use similar methods. Apps such as Houseparty, GroupMe, Yubo, Hoop, Telegram, Discord, and Line all use connection algorithms to suggest friends and contacts from users’ address books.

The proliferation of these social apps presents tremendous business opportunities to platform developers. With those opportunities come responsibilities to develop a robust and transparent set of disclosures and disclaimers, ensuring these apps maintain high standards of data privacy for subscribers and for those subscribers’ contacts, be they direct or indirect users.

In Stover v. Experian Holdings, the Ninth Circuit decided an issue of first impression for the circuit, holding that a party’s single visit to a website four years after her original visit—when she agreed to an online contract containing a change-of-terms provision—is not enough to bind her to an arbitration provision that she wasn’t aware of and that appeared in a later version of the contract.

The panel held “a mere website visit after the end of a business relationship” is not enough “to bind parties to changed terms in a contract pursuant to a change-of-terms provision in the original contract.”

In something of a departure from the typical case involving modification of an online contract, the plaintiff in the case, Rachel Stover, asserted that the updated arbitration provision did apply, while the defendant website operator, Experian, argued that the parties remained subject to the original terms.