Recently, the “trolley problem,” a decades-old thought experiment in moral philosophy, has been enjoying a second career of sorts, appearing in nightmare visions of a future in which cars make life-and-death decisions for us. Among many driverless car experts, however, talk of trolleys is très gauche. They call the trolley problem sensationalist and irrelevant. But this attitude is unfortunate. Thanks to the arrival of autonomous vehicles, the trolley problem will be answered—that much is unavoidable. More importantly, though, that answer will profoundly reshape the way law is administered in America.

To understand the trolley problem, first consider this scenario: You are standing on a bridge. Underneath you, a railroad track divides into a main route and an alternative. On the main route, 50 people are tied to the rails. A trolley rushes under the bridge on the main route, hurtling towards the captives. Fortunately, there’s a lever on the bridge that, when pulled, will divert the trolley onto the alternative route. Unfortunately, the alternative route is not clear of captives, either — but only one person is tied to it, rather than 50. Do you pull the lever?

Recent challenges to the Federal Trade Commission’s (FTC) authority to police data security practices have criticized the agency’s failure to provide adequate guidance to companies.

In other words, the criticism goes, businesses do not know what they need to do to avoid a charge that their data security programs fall short of the law’s requirements.

A series of blog posts that the FTC began on July 21, 2017, titled “Stick with Security,” follows promises from acting Chair Maureen Ohlhausen to provide more transparency about practices that contribute to reasonable data security. Some of the posts provide insight into specific data security practices that businesses should take, while others merely suggest what, in general, the FTC sees as essential to a comprehensive data security program.

Two bills designed to facilitate the removal of minors’ personal information from social networking sites are currently under consideration in the California State Assembly, after being approved in the upper house of the state’s legislature, the Senate, in early 2013. The first of the two bills, S.B. 501, would require a “social networking Internet Web site” to remove, within 96 hours of receiving a registered user’s request, any of that user’s personal identifying information that is accessible online. The site would also be required to remove the personal identifying information of a user who is under the age of 18 upon request of the user’s parent or guardian. The second bill, S.B. 568, would require an “Internet Web site” to remove, upon request of a user under age 18, any content that user posted on the site.

Web site operators, whether they consider themselves to be in the social networking space or not, should remain alert to any forthcoming guidance from state agencies on the language contained in each of these bills. For instance, S.B. 501, as currently drafted, defines a “social networking” site as one that “allows an individual to construct a public or partly public profile within a bounded system, articulate a list of other users with whom the individual shares a connection, and view and traverse his or her list of connections . . . .” On its face, this definition would include not only the likes of Facebook and Twitter, but a host of other sites that primarily offer services such as, for example, ecommerce, gaming or blogging, and additionally provide to their users the ability to maintain profiles and interact with one another.

Furthermore, those who use social networking sites should be aware that S.B. 568 is not the Internet equivalent of an “undo” function for ill-advised content uploads. The bill expressly provides that site operators need only remove a minor’s original posting, and not content that “remains visible because a third party has copied the posting or reposted the content.” Therefore, anything uploaded to sites that facilitate rapid dissemination through “sharing” or “re-tweeting” is likely there to stay.

If S.B. 568 is passed by the Assembly, site operators will have until January 1, 2015, to develop the infrastructure necessary to ensure compliance. However, no such grace period is currently written into S.B. 501, so companies may benefit from reviewing prior instances in which social networking sites were required to implement new privacy policies rapidly in response to enforcement actions and changing laws. In one noteworthy episode in late 2011, Facebook was audited by Ireland’s Office of the Data Protection Commissioner (DPC) in response to complaints over the site’s retention of data that users believed they had deleted. Guided by DPC recommendations, Facebook rolled out over the next year a series of 45 privacy-related policy and operational changes, including changes involving whether and how long user data would be retained.

Lastly, companies should understand these two bills in the context of an expanding body of online privacy laws being enacted at both the state and federal levels, and in key foreign jurisdictions. One question likely to be addressed in coming years is whether laws such as S.B. 501 and 568, as well as similar legislation passed in other states—for example, Maine’s Act to Prevent Predatory Marketing Practices against Minors—are preempted by the federal Children’s Online Privacy Protection Act (COPPA), which contains broad language barring state-level imposition of liability “in connection with an activity” discussed in COPPA and that is inconsistent with COPPA’s mandates. Even if these state laws are found to be preempted, however, social networking companies should nonetheless prepare themselves to adapt to an evolving regulatory landscape in the area of privacy protection, as negotiations proceed in the European Union over a new General Data Protection Regulation that would likewise require the removal of users’ data upon request—and levy fines of up to two percent of global revenue for failure to comply.