Recent Senate hearings on social media safety have spotlighted the urgent need to protect children online, a concern that’s increasingly challenging the legal frameworks governing online platforms. Against this backdrop, the immunity that online platforms enjoy under Section 230 of the Communications Decency Act remains a contentious issue.
Here at Socially Aware, we have been tracking the backlash against Section 230 for many years, and the pressure has only intensified recently. But to paraphrase Mark Twain, rumors of Section 230’s death have (so far) been greatly exaggerated, and the statute still provides platforms with robust protection in many cases, as illustrated by a recent opinion out of the Central District of California dismissing claims against the dating app Grindr. This is not the first time that Grindr has benefited from Section 230 protection.
Grindr is a popular LGBTQ+ dating application. The app requires users to provide an email address and to verify that they are over 18, and it then permits them to create a profile that includes their geolocation and other information. The app uses the information that users provide to match them with other users who are “geographically proximate.”
The case, John Doe v. Grindr, arose when the plaintiff, a minor, was matched with four adult men, with whom he chatted and whom he then met in person; each of those encounters ended in his sexual assault and rape. Doe sued Grindr for child sex trafficking and for distributing a defective product, asserting claims of strict product liability, negligence, negligent misrepresentation, and violation of the Trafficking Victims Protection Reauthorization Act (TVPRA). He alleged that Grindr’s app is an inherently dangerous software product, that Grindr knows that minors use the app, and that sexual predators use it to target minors.
Grindr moved to dismiss Doe’s claims, arguing that they were all barred by Section 230. The court applied the Ninth Circuit’s standard three-prong test, as originally articulated in Barnes v. Yahoo!, to determine Section 230’s applicability. This approach asks three questions: (1) is the defendant a provider or user of an “interactive computer service,” (2) does the plaintiff’s claim seek to treat the defendant as a “publisher or speaker,” and (3) was the content at issue provided by “another information content provider”? If these conditions are met, then the defendant is entitled to Section 230 immunity (assuming no other exception applies).
The court easily determined that Grindr is an interactive computer service for purposes of the first prong of the test. The more interesting questions involved the second and third prongs, with Doe arguing that his claims did not treat Grindr as a publisher or speaker of content provided by a third party.
Doe argued that his claims arose from Grindr’s design, development, and sale of a defective product rather than from Grindr’s activities as a publisher or speaker, relying on the Ninth Circuit case Lemmon v. Snap (which we wrote about previously). In Lemmon, the Ninth Circuit held that Section 230 did not apply to negligent design claims based on Snapchat’s “Speed Filter” feature, because those claims did not treat Snap as a publisher or speaker of information provided by another information content provider, but rather targeted Snap’s allegedly “unreasonable and negligent” design decisions. Similarly, Doe argued, Grindr’s app is defective in that it matches children with adults for in-person sexual encounters and facilitates the exchange of sexually explicit material.
The court, however, distinguished Lemmon, noting that the allegedly defective match function in Grindr’s product relies on and publishes a user’s profile and geolocation data, which is content generated by users and thus provided by third parties. To support this conclusion, the court cited Dyroff v. Ultimate Software (finding that an app’s features, functions, and algorithms that analyze user content and recommend connections “are tools meant to facilitate the communication and content of others”), Herrick v. Grindr (noting that Grindr’s geolocation feature is based on the longitude and latitude of a user’s mobile device), and Doe v. Myspace (rejecting the argument that negligence claims predicated on the lack of safety features to protect minors from communicating with predators fell outside Section 230). According to the court,
Grindr received the user content from Doe and the adult men and published it via the match feature’s notification. If Grindr had not published that user-provided content, Doe and the adult men would never have met and the sexual assaults never [would have] occurred. [Citation omitted.] Thus, Doe’s claims require the Court to treat Grindr as a publisher of user content.
On the third prong of the test—i.e., whether the relevant published material came from a third-party content provider—Doe argued that Grindr materially contributed to his harm by matching closely located children and adults on its app and that his claim therefore was not based solely on third-party content. The court was not convinced, finding that the match feature relied entirely on the geolocation information and user profiles supplied by users, so that the matching amounted to publication of information created by third parties. According to the court, “Grindr is not an ‘information content provider’ because it did not create or develop information, but rather published information created or developed by third parties.” [Internal citations omitted.]
Regarding the TVPRA claim, Doe argued that Section 230 did not immunize Grindr because the Allow States and Victims to Fight Online Sex Trafficking Act of 2017 (FOSTA) exempts child sex trafficking claims from the statutory safe harbor. Citing Does v. Reddit, however, the court noted that the FOSTA exception applies only when the platform itself participated in the sex trafficking by knowingly benefiting from facilitating the illegal activities (we wrote about Does v. Reddit here).
Here, because Doe told Grindr he was over 18, Grindr did not know that Doe was a minor, so Grindr was not directly liable under the TVPRA. Moreover, the court held, Doe’s allegations that Grindr knowingly benefited (e.g., by selling ads) from sexual predators using the app did not mean that Grindr knowingly benefited from participating in child sex trafficking. According to the court, Grindr’s alleged generalized knowledge that sexual predators used the app was not sufficient to establish a “causal relationship between its own affirmative conduct furthering the sex-trafficking venture and its receipt of resulting ad revenues.” [Internal citations omitted.] Therefore, the FOSTA exception to Section 230 immunity did not apply.
This case underscores the robust protection that Section 230 continues to provide to online platforms, even as societal concerns regarding social media’s impact on children reach fever pitch. As debates around Section 230 rage, Doe v. Grindr illustrates that the safe harbor provided by the statute remains potent, highlighting the complex interplay between ensuring online safety and maintaining the internet’s open, interactive nature.