Consumers have the right to opt out of the sale of their personal information: the law gives Nevada consumers the right to request that website operators refrain from selling their covered information.
A recent decision from the Ninth Circuit Court of Appeals in a dispute between LinkedIn and hiQ Labs has spotlighted the thorny legal issues involved in unauthorized web scraping of data from public websites. While some may interpret the LinkedIn decision as greenlighting such activity, this would be a mistake. On close review of the decision, and in light of other decisions that have held unauthorized web scrapers liable, the conduct remains vulnerable to legal challenge.
hiQ and LinkedIn
Founded in 2012, hiQ is a data analytics company that uses automated bots to scrape information from LinkedIn’s website. hiQ targets the information that users have made public for all to see in their LinkedIn profiles. hiQ pays nothing to LinkedIn for the data, which it uses, along with its own predictive algorithm, to yield “people analytics,” which it then sells to clients.
In May 2017, LinkedIn sent a cease-and-desist letter to hiQ demanding that it stop accessing and copying data from LinkedIn’s servers. LinkedIn also implemented technical measures to prevent hiQ from accessing the site, which hiQ circumvented.
Shortly thereafter, with its entire business model under threat, hiQ filed suit in the United States District Court for the Northern District of California seeking injunctive relief and a declaration that LinkedIn had no right to prevent it from accessing public LinkedIn member profiles.…
Continue Reading Ninth Circuit’s LinkedIn Decision Does Not Greenlight the Unauthorized Web Scraping of Public Websites
Last week, the Federal Trade Commission made clear that child-directed parts of an otherwise general audience service will subject the operator of the service to the Children’s Online Privacy Protection Act (COPPA).
Just six months after the FTC’s record-setting settlement against TikTok, the FTC announced a $170 million fine against Google and its subsidiary YouTube to settle allegations that YouTube had collected personal information from children without first obtaining parental consent, in violation of the FTC’s rule implementing COPPA. This $170 million fine—$136 million to the FTC and $34 million to the New York Attorney General, with whom the FTC brought the enforcement action—dwarfs the $5.7 million levied against TikTok earlier this year. It is by far the largest amount that the FTC has obtained in a COPPA case since Congress enacted the law in 1998. The settlement puts operators of general-audience websites on notice that they are not automatically excluded from COPPA’s coverage: they are required to comply with COPPA if particular parts of their websites or content (including content uploaded by others) are directed to children under age 13.
Advancements in technology appear to have spurred the Federal Trade Commission to initiate a review of its rule promulgated pursuant to the Children’s Online Privacy Protection Act (the “COPPA Rule” or “Rule”) four years ahead of schedule. Last week, the FTC published a Federal Register notice seeking comments on the Rule. Although the FTC typically reviews a rule only once every 10 years and the last COPPA Rule review ended in 2013, the Commission voted unanimously, 5-0, to seek comments ahead of its next scheduled review. The Commission cited the education technology sector, voice-enabled connected devices, and general audience platforms hosting third-party, child-directed content as developments warranting reexamination of the Rule at this time.
The COPPA Rule, which first went into effect in 2000, generally requires operators of online services to obtain verifiable parental consent before collecting personal information from children under the age of 13. In 2013, the FTC amended the COPPA Rule to address changes in the way children use and access the internet, including through the increased use of mobile devices and social networking. Its amendments included the expansion of the definition of “personal information” to include persistent identifiers that track online activity, geolocation information, photos, videos, and audio recordings. The new review could result in similarly significant amendments.
Questions for Public Comment
In addition to standard questions about the effectiveness of the COPPA Rule and whether it should be retained, eliminated, or modified, the FTC is seeking comment on all major provisions of the Rule, including its definitions, notice and parental consent requirements, exceptions, and security requirements.…
Continue Reading Back to School Early: FTC Seeks Comments to COPPA Rule Ahead of Schedule
The French data protection authority, the CNIL, continues to fine organizations for failing to adopt what the CNIL considers to be fundamental data security measures. In May 2019, the CNIL imposed a EUR 400,000 fine on a French real estate company for failing to have basic authentication measures on a server and for retaining information too long. This is the second fine imposed by the CNIL under the EU General Data Protection Regulation 2016/679 (GDPR), after the one against Google. The decision follows a long line of pre-GDPR fines imposed by the CNIL for failing to meet security standards, and shows that data security remains a high enforcement priority for the CNIL.
French real estate company Sergic operated a website where individuals could upload information about themselves for their property rental applications. Responding to a complaint by an applicant, the CNIL investigated Sergic in September 2018, as it appeared that applicants’ documents were freely accessible without authentication (by modifying a value in the website URL). The CNIL confirmed the vulnerability and found that almost 300,000 documents were accessible in a master file containing information such as individuals’ government-issued IDs, Social Security numbers, marriage and death certificates, divorce judgments, and tax, bank and rental statements. The CNIL also discovered that Sergic had been informed of the vulnerability back in March 2018 but did not fix it until September 2018.
Continue Reading The CNIL Strikes Again – Mind Your Security
Continue Reading Nevada Enacts CCPA-Style Opt-Out Right for Consumers—but Similarities Are Few
New York is now one of the 43 states where “revenge porn,” the posting of explicit photographs or videos to the Internet without the subject’s consent, is punishable by law. See how far the states have come – find out how many had criminalized revenge porn as of 2014, when Socially Aware first covered the…
The European Data Protection Board (EDPB), the successor to the Article 29 Working Party, has issued a non-binding opinion that the use of cookie walls should be prohibited under new EU ePrivacy rules. The EDPB argues that cookie walls run contrary to the General Data Protection Regulation (GDPR): “In order for consent to be freely given as required by the GDPR, access to services and functionalities must not be made conditional on the consent of a user to the processing of personal data or the processing of information related to or processed by the terminal equipment of end-users, meaning that cookie walls should be explicitly prohibited.”
The cost for violating the Children’s Online Privacy Protection Act (COPPA) has been steadily rising, and companies subject to the law should take heed. Last week, the Federal Trade Commission (FTC) announced a record-setting $5.7 million settlement with the mobile app company Musical.ly for a myriad of COPPA violations, exceeding even the December 2018 $4.95 million COPPA settlement by the New York Attorney General. Notably, two Commissioners issued a statement accompanying the settlement, arguing that the FTC should prioritize holding executives personally responsible for their roles in deliberate violations of the law in the future.
COPPA is intended to ensure parents are informed about, and can control, the online collection of personal information (PI) from their children under age thirteen. Musical.ly (now operating as “TikTok”) is a popular social media application that allows users to create and share lip-sync videos to popular songs. The FTC cited the Shanghai-based company for numerous violations of COPPA, including failure to obtain parental consent and failure to properly delete children’s PI upon a parent’s request.
The California Attorney General continued its series of public forums regarding the California Consumer Privacy Act (CCPA), with forums last week in Riverside (January 24, 2019) and Los Angeles (January 25, 2019). As in the previous forums, there were a significant number of attendees, but few elected to speak publicly regarding their views on the Act. You can read our reports on the public forums held earlier this month in San Francisco and San Diego.
Lisa Kim, Deputy Attorney General for the AG’s Privacy Unit, provided opening remarks at both forums and identified the areas of the AG’s rulemaking on which speakers should focus their comments, specifically those areas of the Act that call for specific AG rules. Ms. Kim encouraged interested parties to provide written comments and proposed regulatory language during this pre-rulemaking phase. Consistent with the prior forums, she noted that the AG’s office would be listening, and not responding, to comments made in Riverside and Los Angeles.
Of note, the presentation slides made available at the forum (and available here) state that the AG anticipates publishing proposed rules in Fall 2019, and that after that there will be a period for public comment and additional public hearings.