Last week, the Federal Trade Commission made clear that child-directed parts of an otherwise general audience service will subject the operator of the service to the Children’s Online Privacy Protection Act (COPPA).

Just six months after the FTC’s record-setting settlement against TikTok, the FTC announced a $170 million fine against Google and its subsidiary YouTube to settle allegations that YouTube had collected personal information from children without first obtaining parental consent, in violation of the FTC’s rule implementing COPPA. This $170 million fine—$136 million to the FTC and $34 million to the New York Attorney General, with whom the FTC brought the enforcement action—dwarfs the $5.7 million levied against TikTok earlier this year. It is by far the largest amount that the FTC has obtained in a COPPA case since Congress enacted the law in 1998. The settlement puts operators of general-audience websites on notice that they are not automatically excluded from COPPA’s coverage: they are required to comply with COPPA if particular parts of their websites or content (including content uploaded by others) are directed to children under age 13.


Continue Reading

The French data protection authority, the CNIL, continues to fine organizations for failing to adopt what the CNIL considers to be fundamental data security measures. In May 2019, the CNIL imposed a EUR 400,000 fine on a French real estate company for failing to implement basic authentication measures on a server and for retaining information for too long. This is the second fine imposed by the CNIL under the EU General Data Protection Regulation 2016/679 (GDPR), following the one against Google. The decision follows a long line of pre-GDPR fines imposed by the CNIL for failing to meet security standards, and shows that data security continues to be a high enforcement priority for the CNIL.

Background

French real estate company Sergic operated a website where individuals could upload information about themselves for their property rental applications. Responding to a complaint by an applicant, the CNIL investigated Sergic in September 2018, as it appeared that applicants’ documents were freely accessible without authentication, simply by modifying a value in the website URL. The CNIL confirmed the vulnerability and found that almost 300,000 documents were accessible in a master file containing information such as individuals’ government-issued IDs, Social Security numbers, marriage and death certificates, divorce judgments, and tax, bank and rental statements. The CNIL also discovered that Sergic had been informed of the vulnerability back in March 2018 but did not fix it until September 2018.
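The flaw described above is what security practitioners commonly call an insecure direct object reference: the server returned whatever document identifier appeared in the URL, without checking who was asking. As a purely illustrative sketch of the kind of basic server-side check at issue, the hypothetical snippet below (written with the Flask web framework) refuses to serve a document unless the requester is logged in and actually owns it; the route, field names and data are assumptions for illustration, not details of Sergic’s actual system.

```python
# Hypothetical illustration only: a document route that authenticates the
# requester and verifies ownership before serving a file, so that changing
# the ID in the URL does not expose someone else's documents.
from flask import Flask, abort, send_file, session

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"  # required for session cookies

# Stand-in for a database lookup; structure and values are illustrative.
DOCUMENTS = {
    1001: {"owner_id": 42, "path": "/secure-store/1001.pdf"},
    1002: {"owner_id": 77, "path": "/secure-store/1002.pdf"},
}

@app.route("/documents/<int:doc_id>")
def get_document(doc_id):
    user_id = session.get("user_id")
    if user_id is None:
        abort(401)  # no authenticated session at all
    doc = DOCUMENTS.get(doc_id)
    if doc is None or doc["owner_id"] != user_id:
        abort(403)  # authenticated, but not this applicant's document
    return send_file(doc["path"])
```

The point is simply that every document request is tied to an authenticated user and checked for ownership on the server side, so changing an ID in the URL yields an error rather than someone else’s file.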

Continue Reading

Nevada just became the second state, after California, to enact a right for consumers to opt out of the “sale” of their personal information. Senate Bill 220, which was signed into law on May 29, 2019, is scheduled to take effect on October 1, 2019, three months before its precursor, the California Consumer Privacy Act (CCPA), takes effect. The opt-out right is one of several changes made to Nevada’s existing online privacy law, which requires operators of commercial websites and other online services to post a privacy policy. In addition to the new opt-out right, the revised law exempts from its requirements certain financial institutions, HIPAA-covered entities, and motor vehicle businesses.
Continue Reading

The cost of violating the Children’s Online Privacy Protection Act (COPPA) has been steadily rising, and companies subject to the law should take heed. Last week, the Federal Trade Commission (FTC) announced a record-setting $5.7 million settlement with the mobile app company Musical.ly for a host of COPPA violations, exceeding even the $4.95 million COPPA settlement obtained by the New York Attorney General in December 2018. Notably, two Commissioners issued a statement accompanying the settlement, arguing that in future cases the FTC should prioritize holding executives personally responsible for their roles in deliberate violations of the law.

COPPA is intended to ensure parents are informed about, and can control, the online collection of personal information (PI) from their children under age thirteen. Musical.ly (now operating as “TikTok”) is a popular social media application that allows users to create and share lip-sync videos to popular songs. The FTC cited the Shanghai-based company for numerous violations of COPPA, including failure to obtain parental consent and failure to properly delete children’s PI upon a parent’s request.


Continue Reading

The California Attorney General continued its series of public forums regarding the California Consumer Privacy Act (CCPA), with forums last week in Riverside (January 24, 2019) and Los Angeles (January 25, 2019). As in the previous forums, there were a significant number of attendees, but few elected to speak publicly regarding their views on the Act. You can read our reports on the public forums held earlier this month in San Francisco and San Diego.

Lisa Kim, Deputy Attorney General for the AG’s Privacy Unit, provided opening remarks at both forums and identified the areas of the AG’s rulemaking on which speakers should focus their comments, namely those areas of the Act that expressly call for AG rules.  Ms. Kim encouraged interested parties to provide written comments and proposed regulatory language during this pre-rulemaking phase. Consistent with the prior forums, she noted that the AG’s office would be listening, and not responding, to comments made in Riverside and Los Angeles.

Of note, the presentation slides made available at the forum (also available here) state that the AG anticipates publishing proposed rules in Fall 2019, followed by a public comment period and additional public hearings.


Continue Reading

In anticipation of preparing rules to implement the California Consumer Privacy Act, the California Attorney General recently announced six public forums that he will host in January and February 2019 across California.  On January 8, 2019, the AG hosted the first of these forums in San Francisco.  The following provides an overview of that forum and the comments made there.

Overview of the January 8, 2019, San Francisco Forum 

Stacey Schesser, the Supervising Deputy Attorney General for the AG’s Privacy Unit, provided opening remarks.  Ms. Schesser confirmed that the AG’s office is at the very beginning of its rulemaking process.  Although the AG’s office will solicit formal comments after it prepares proposed rules, the AG is interested in receiving detailed written comments from the public with proposed language during this informal period.

These forums appear to be designed to inform the AG’s rulemaking and potentially streamline the process by allowing public input before rules are drafted.  In this regard, Ms. Schesser clarified that she and other AG representatives in attendance at the San Francisco forum were there only to listen to the public comments and would not respond to questions or engage with speakers.  As a result, if the remaining forums follow a similar approach, it is unlikely that they will elicit meaningful intelligence regarding the AG’s approach to, or the substance of, the anticipated rulemaking.


Continue Reading

“My Google Home Mini was inadvertently spying on me 24/7 due to a hardware flaw,” wrote a tech blogger who purchased Google Inc.’s latest internet of things (IoT) device. Following the incident, a coalition of consumer advocacy groups urged the U.S. Consumer Product Safety Commission (CPSC) to recall the Google smart speaker, citing privacy concerns arising from the device recording all audio without voice-command prompts.

The CPSC is charged with protecting consumers from products that pose potential hazards. Traditionally, this has meant hazards that may cause physical injury or property damage. But as internet-connected household products continue to proliferate, issues like the “always-on” Google Home Mini raise an important question: Where does cybersecurity of consumer IoT devices fit within the current legal framework governing consumer products?

The Explosion of IoT

Forecasts predict that by 2020 IoT devices will account for 24 billion of the 34 billion devices connected to the internet. According to a recent Gemalto survey, “[a] hacker controlling IoT devices is the most common concern for consumers (65%), while six in ten (60%) worry about their data being stolen.”

The rapid growth of the IoT market and continued integration into daily life raises the question of which regulatory body or bodies, if any, should be responsible for consumer safety when it comes to cybersecurity for consumer IoT devices.

The Intersection of Consumer Product Safety, Privacy and Cybersecurity

The CPSC’s jurisdiction has traditionally been limited to physical injury and property damage. It is “charged with protecting the public from unreasonable risks of injury or death associated with the use of the thousands of types of consumer products under the agency’s jurisdiction.”
Continue Reading


The global WannaCry ransomware attack should be a wake-up call for all companies about the threat ransomware poses. While WannaCry was one of the first highly publicized attacks in which ransomware was weaponized and used against numerous companies at once, there will undoubtedly be future attacks.  Companies can take proactive steps to reduce their

The latest issue of our Socially Aware newsletter is now available here.

In this edition, we examine a spate of court decisions that appear to rein in the historically broad scope of the Communications Decency Act’s Section 230 safe harbor for website operators; we outline ten steps companies can take to be better prepared for

The European Commission has published two draft directives on the supply of digital content and the online sale of goods that aim to help harmonise consumer law across Europe. In proposing these new laws, the European Union is making progress towards one of the main goals in its Digital Single Market Strategy (announced in May 2015), which is concerned with strengthening the European digital economy and increasing consumer confidence in online trading across EU Member States. According to the Commission, only 12% of EU retailers sell online to consumers in other EU countries, while more than three times as many sell online in their own country. The Commission has also announced a plan to carry out a fitness check of other existing European consumer protection laws.

This article outlines the potential implications of these latest developments, with a particular focus on the UK and Germany.

DIGITAL CONTENT AND ONLINE SALES OF GOODS

This is not the first time that the Commission has tried to align consumer laws across the EU: the Commission’s last attempt at a Common European Sales Law faltered in 2015. But the Commission has now proposed two new directives dealing with contracts for the supply of digital content (the “Draft Digital Content Directive”) and the online sale of goods (the “Draft Online Goods Directive”) (together, the “Proposed Directives”). The Draft Online Goods Directive would replace certain aspects of the existing Sales of Consumer Goods and Associated Guarantees Directive (the “Existing Goods Directive”), whereas the Draft Digital Content Directive introduces a new set of rights for consumers when they buy digital content across the EU.

Part of the issue with previous EU legislative initiatives in this area is that “harmonised” has really meant “the same as long as a country doesn’t want to do anything different”. This time, the Proposed Directives have been drafted as so-called “maximum harmonisation measures”, which would preclude Member States from providing any greater or lesser protection for the matters falling within their scope. The Commission hopes that this consistent approach across Member States will encourage consumers to enter into transactions across EU borders, while also allowing suppliers to simplify their legal documentation by using a single set of terms and conditions for all customers within the EU.

The Proposed Directives will need to be adopted by the EU Parliament and Council before becoming law. Member States would then have two years to transpose the Proposed Directives into national law.


Continue Reading