As regular readers of Socially Aware already know, there are many potential traps for companies that use photographs or other content without authorization from the copyright owners. For example, companies have faced copyright infringement claims based on use of photos pulled from Twitter. Claims have even arisen from the common practice of embedding tweets on blogs and websites, and we have seen a flurry of stories recently about photographers suing celebrities for posting photos of themselves.

Now there is another potential source of liability: the appearance of murals in the background of photographs used in advertisements. In at least two recent cases, automotive companies have faced claims of copyright infringement from the creators of murals painted on buildings that appear in the backgrounds of ads.

Most recently, in a federal district court in the Eastern District of Michigan, Mercedes-Benz sought a declaratory judgment that its photographs, taken in Detroit (with permits from the city) and later posted on Instagram, did not infringe the copyrights of three defendants whose murals appeared in the backgrounds of those photographs.

Continue Reading Insta-Mural Infringement: Public Art in Instagram Ad Leads to Copyright Claim

Singapore has enacted a law granting government ministers the power to require social media platforms to completely remove or place warnings alongside posts the authorities designate as false.

Unlike the compensation earned by child stars who perform on television, in films, or on other traditional media in California, the income generated by children who perform on social media—“kidfluencers”—still isn’t protected under California law.

The Federal Trade Commission is suing Match Group, the owner of dating sites including Match.com, Tinder, OkCupid and Plenty of Fish, for allegedly tricking hundreds of thousands of Match.com users into subscribing by disingenuously implying that their profiles were getting a lot of attention from other users.

Speaking of dating sites, OkCupid is auditing photos posted by its users and banning the ones that employ filters.

Instagram and Facebook are testing the practice of making the “likes” on a person’s posts invisible to other users. Some marketers say that eliminating engagement metrics such as “likes” will have a significant effect on the influencer marketing industry.

YouTube is modifying its popularity metrics too, citing a concern for its users’ mental health.

Also motivated by concern for their users’ wellbeing, Instagram and Facebook have adopted a new policy regarding posts promoting weight loss products and certain types of cosmetic surgery.

Based in China, the social media network TikTok is incredibly popular, having been downloaded more than 104 million times in the United States since its U.S. debut in 2017. Although the network has sparked controversy in several ways—including its parent company’s $1 billion spend on ads to achieve TikTok’s meteoric rise—the revolutionary artificial intelligence that the network employs to gather data about its users might be the biggest cause for concern, according to Hootsuite CEO Ryan Holmes.

Librarians are not happy with LinkedIn. Here’s why.

Is antagonizing your brand’s competition on social media a sound marketing strategy? It worked for Popeyes last summer.

In a landmark ruling, the European Court of Justice—Europe’s highest court—dealt Google a clear win by placing a territorial limit on the “right to be forgotten” in the EU. The court’s holding in Google v. Commission nationale de l’informatique et des libertés (CNIL) clarifies that a search engine operator that is obligated to honor an individual’s request for erasure by “de-referencing” links to his or her personal data (i.e., removing links to web pages containing that personal data from search results) is only required, under the GDPR, to de-reference results on its EU domains (e.g., google.fr in France and google.it in Italy), and not on all of its domains globally.

However, in the same ruling, the Court also stated that the GDPR applies to Google’s data processing on all of its domains globally (by virtue of such processing comprising “a single act of processing”). Therefore, an EU Member State’s supervisory authority and courts are free to treat the ECJ’s EU-wide de-referencing requirement as a “floor” and go one step further, requiring search engines to implement the right to be forgotten on all of their domains worldwide, including those outside the EU.

Background – The Right to Be Forgotten

The right to be forgotten—codified at Article 17 of the GDPR—grants individuals the right to obtain erasure of their personal data without undue delay, where, for example, the data are no longer necessary for the purpose for which they were collected or processed. However, the right is not unlimited; exceptions apply if the processing is deemed necessary for the exercise of freedom of expression, compliance with a legal obligation, public interests such as public health, scientific or historical research, or the establishment or defense of legal claims.

Continue Reading Forget Me…or Not: Europe’s High Court Limits Territorial Reach of Right to Be Forgotten, But Not of GDPR

In just over a week, on October 1, 2019, key amendments to Nevada’s online privacy law will take effect. We previously detailed the amendments here. In brief:

  • Consumers have the right to opt out of the sale of their personal information. The law gives Nevada consumers the right to request that website operators refrain from a “sale” of their personal information. The right is much narrower than that offered by the California Consumer Privacy Act, which becomes operative on Jan. 1, 2020. Specifically, under the Nevada law, “sale” means the disclosure of covered information for monetary consideration to a recipient that then licenses or resells the information to another person. Transfers to vendors and affiliates are exceptions to the definition.
  • All operators—even those who do not “sell”—must provide a point of contact to receive opt-out requests. Website operators must provide a “designated request address” (by email, online form, or toll-free number) to which Nevada consumers may submit do-not-sell requests. Importantly, and unlike the CCPA, the law does not distinguish between operators who actually “sell” covered information and those who do not, and thus the Nevada law appears to require that all operators have a designated request address in place, even if they do not sell covered information.
  • Operators must grant verified requests within 60 days. Similar to the CCPA, operators must honor only those requests they can verify. They must act on verified requests within 60 days, with a 30-day extension if such an extension is “reasonably necessary.”

There is no private right of action under the Nevada law. The Attorney General is authorized to enforce the law and may seek civil penalties of up to $5,000 per violation.

A recent decision from the Ninth Circuit Court of Appeals in a dispute between LinkedIn and hiQ Labs has spotlighted the thorny legal issues involved in unauthorized web scraping of data from public websites. While some may interpret the LinkedIn decision as greenlighting such activity, this would be a mistake. On close review of the decision, and in light of other decisions that have held unauthorized web scrapers liable, the conduct remains vulnerable to legal challenge.

hiQ and LinkedIn

Founded in 2012, hiQ is a data analytics company that uses automated bots to scrape information from LinkedIn’s website. hiQ targets the information that users have made public for all to see in their LinkedIn profiles. hiQ pays nothing to LinkedIn for the data, which it combines with its own predictive algorithm to yield “people analytics” that it then sells to clients.

In May 2017, LinkedIn sent a cease-and-desist letter to hiQ demanding that it stop accessing and copying data from LinkedIn’s servers. LinkedIn also implemented technical measures to prevent hiQ from accessing the site, which hiQ circumvented.

Shortly thereafter, with its entire business model under threat, hiQ filed suit in the United States District Court for the Northern District of California seeking injunctive relief and a declaration that LinkedIn had no right to prevent it from accessing public LinkedIn member profiles.

Continue Reading Ninth Circuit’s LinkedIn Decision Does Not Greenlight the Unauthorized Web Scraping of Public Websites

Last week, the Federal Trade Commission made clear that child-directed parts of an otherwise general audience service will subject the operator of the service to the Children’s Online Privacy Protection Act (COPPA).

Just six months after the FTC’s record-setting settlement against TikTok, the FTC announced a $170 million fine against Google and its subsidiary YouTube to settle allegations that YouTube had collected personal information from children without first obtaining parental consent, in violation of the FTC’s rule implementing COPPA. This $170 million fine—$136 million to the FTC and $34 million to the New York Attorney General, with whom the FTC brought the enforcement action—dwarfs the $5.7 million fine levied against TikTok earlier this year. It is by far the largest amount that the FTC has obtained in a COPPA case since Congress enacted the law in 1998. The settlement puts operators of general-audience websites on notice that they are not automatically excluded from COPPA’s coverage: they are required to comply with COPPA if particular parts of their websites or content (including content uploaded by others) are directed to children under age 13.

Continue Reading The Company Who Cried “General Audience”: Google and YouTube to Pay $170 Million for Alleged COPPA Violations

A recent Second Circuit decision makes clear that the safe harbor that social media and other Internet companies enjoy under Section 230 of the Communications Decency Act broadly applies to a wide variety of claims.

When you think about the Section 230 safe harbor, don’t just think defamation or other similar state law claims. Consider whether the claim—be it federal, state, local, or foreign—seeks to hold a party that publishes third-party content on the Internet responsible for publishing the content. If, after stripping it all down, this is the crux of the cause of action, the safe harbor should apply (absent a few statutory exclusions discussed below). The safe harbor should apply even if the party uses its discretion as a publisher in deciding how best to target its audience or to display the information provided by third parties.

In 2016, Facebook was sued by the estates of four U.S. citizens who died in terrorist attacks in Israel and one who narrowly survived but was grievously injured. The plaintiffs claimed that Facebook should be held liable under the federal Anti-Terrorism Act and the Justice Against Sponsors of Terrorism Act, which provide a private right of action against those who aid and abet acts of international terrorism, conspire in furtherance of acts of terrorism, or provide material support to terrorist groups. The plaintiffs also asserted claims arising under Israeli law.

Continue Reading CDA Section 230 Immunizes Platform From Liability for Friend and Content Suggestion Algorithms

It is likely no surprise to regular readers of Socially Aware that posting content to social media can, in some cases, generate significant income. But those who make their living on social media may find their livelihood threatened if they fail to comply with the law and with the relevant platform’s terms of use.

For example, we often see trouble arise when social media users fail to follow the Federal Trade Commission’s disclosure rules in connection with receiving compensation in exchange for a promotional post, or when users purchase followers—a practice that violates most social media platforms’ terms of use, and might be illegal. As we have noted previously, the social media platform, and not the user, sets the rules. If your business model is built on a social media platform, you have to play by the platform’s rules.

Earning an honest living is what Instagram user “Ben” (the pseudonym assigned to him by MarketWatch) claims to have been doing when he was taking in approximately $4,000 per month by operating and curating several accounts containing memes originally created by third parties. (For those who have somehow managed to avoid this ubiquitous Internet phenomenon, Wikipedia describes a meme as a “piece of media that spreads, often . . . for humorous purposes, from person to person via the Internet.”)

Continue Reading The Meme Generation: Social Media Platforms Address Content Curation

Advancements in technology appear to have spurred the Federal Trade Commission to initiate a review of its rule promulgated pursuant to the Children’s Online Privacy Protection Act (the “COPPA Rule” or “Rule”) four years ahead of schedule. Last week, the FTC published a Federal Register notice seeking comments on the Rule. Although the FTC typically reviews a rule only once every 10 years and the last COPPA Rule review ended in 2013, the Commission unanimously voted 5-0 to seek comments ahead of its next scheduled review. The Commission cited the education technology sector, voice-enabled connected devices, and general audience platforms hosting third-party, child-directed content as developments warranting reexamination of the Rule at this time.

Background

The COPPA Rule, which first went into effect in 2000, generally requires operators of online services to obtain verifiable parental consent before collecting personal information from children under the age of 13. In 2013, the FTC amended the COPPA Rule to address changes in the way children use and access the internet, including through the increased use of mobile devices and social networking. Its amendments included the expansion of the definition of “personal information” to include persistent identifiers that track online activity, geolocation information, photos, videos, and audio recordings. The new review could result in similarly significant amendments.

Questions for Public Comment

In addition to standard questions about the effectiveness of the COPPA Rule and whether it should be retained, eliminated, or modified, the FTC is seeking comment on all major provisions of the Rule, including its definitions, notice and parental consent requirements, exceptions, and security requirements.

Continue Reading Back to School Early: FTC Seeks Comments to COPPA Rule Ahead of Schedule

A federal district court dismissed a case against supermodel Gigi Hadid for posting to Instagram a photo of herself that was taken by a paparazzo. The reason for the court’s decision was simple: The party claiming copyright ownership of the photo failed to get it registered with the U.S. Copyright Office, a prerequisite to filing an infringement suit against alleged violators.

If the plaintiff in the suit had complied with the copyright registration process and required the court to make a substantive decision, the court’s opinion would, as the Hollywood Reporter wrote, have had to clarify a celebrity’s “right to control how others profit from [that celebrity’s] likeness” and address a “battle that involves a copyright law written before the dawn of the internet, before legislators could imagine social phenomena like Instagram’s billion users and hundreds of millions of daily photo uploads.”

The facts of Hadid’s case are common; celebrities are sued by paparazzi for posting photos of themselves to social media all the time. In this particular suit, an independent photo agency claimed that Hadid had violated its copyright in a photo of herself when she posted the picture to social media, even though Hadid had arguably contributed to the image by smiling for the photo, selecting the outfit she’s wearing in it, and even cropping the photo for posting.

The suit had the potential to test legal theories, such as the “fair use” doctrine, that could protect celebrities from copyright infringement liability for posting paparazzi-taken photos of themselves to social media. Although those theories weren’t tested this time around, “it’s an imminent fight that could spark the type of legal rethinking needed when the old rules fail to accommodate new realities.”