A new law in Australia makes a social media company’s failure to remove “abhorrent violent material” from its platform punishable by significant fines. The law also states that the executives at social media companies who fail to remove the content could be sentenced to jail time.

The European Parliament voted to approve the Copyright Directive, a directive that, although vaguely worded, affords copyright holders significant new protections online, and requires online platforms to police content more thoroughly than ever before. Find out exactly what impact industry advocates predict the law will have, and how long it will be until it’s implemented.

Learn how companies can collect and use biometric data without becoming an easy target for litigation, according to my co-editor Julie O’Neill and our colleague Max Phillip Zidel.

As part of the FTC’s continuing efforts to ensure consumers are aware of when an online endorser has been compensated in connection with an endorsement, the agency recently settled a complaint against a subscription service that allegedly offered its product for free to consumers who posted positive online reviews.

In the wake of reports about social media influencers purchasing fake followers and fake likes, as well as failing to adequately label endorsed content, online celebrities are embracing more relatable posts, potentially in an effort to appear more trustworthy.

To better compete with digital media platforms, broadcasters in the top 40 television markets in the United States will introduce a broadcasting standard that will enable interactive and targeted advertising.

Snap Inc., whose Snapchat app currently excludes users younger than 13 but generally does not verify ages, has announced that it is working with British lawmakers to prevent underage children from signing up for its service.

What’s up with Google’s new streaming game platform?

A photographer is suing supermodel Gigi Hadid for copyright infringement for posting a photo of herself to Instagram.

Fruit of the Loom is holding a contest on Instagram in search of the best jingle for its Breathable Boxer Briefs. See how much the underwear manufacturer promises to award the winning songwriter.

In early March 2019, the Department of Justice (DOJ) revised its Foreign Corrupt Practices Act (FCPA) Corporate Enforcement Policy (the Policy). First announced in November 2017, the Policy is designed to encourage companies to self-report FCPA violations and to cooperate with the DOJ’s FCPA investigations. The Policy and its recent revisions were incorporated into the United States Attorneys’ Manual (USAM), now referred to as the Justice Manual (JM), which is the internal DOJ document that sets forth policies and guidance for federal prosecutors.

One of the most notable aspects of the original Policy was its requirement that companies seeking to obtain remediation credit prohibit employees from using ephemeral messaging systems unless appropriate retention mechanisms were put in place. According to the original Policy, a company would receive full credit for remediation only “if [it] prohibit[ed] employees from using software that generates but does not appropriately retain business records or communications.”

As consumers increasingly communicate and interact through social media platforms, courts have had to grapple with how to apply existing laws to new ways of communicating, as well as disseminating and using content. Sometimes, however, traditional legal standards apply to these new platforms in a straightforward manner. At least, that is what the court found in Dancel v. Groupon, Inc., a putative class action against Groupon, Inc., alleging that Groupon’s use of images originally posted on the social media site Instagram violated users’ rights under the Illinois Right of Publicity Act (IRPA).

Groupon, a website that offers consumers deals on goods and services, built a widget intended to give its users a window into the businesses for which Groupon offered deals. The widget used Instagram’s API to find photos that Instagram users had taken at particular locations, and then displayed those images under the corresponding deals on Groupon’s own website. When a visitor to the Groupon page hovered over one of the Instagram images, the visitor could see the username of the person who posted the photo on Instagram and an associated caption, if there was one.
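
The widget logic described above can be sketched as follows. This is a hypothetical illustration only: the function name, data shapes, and venue data are invented stand-ins, not Groupon’s actual code or Instagram’s actual API response format.

```python
def build_widget_entries(venue, photos_by_venue):
    """Pair each photo tagged at a venue with the hover text shown on the deal page:
    the poster's username plus the caption, if any."""
    entries = []
    for photo in photos_by_venue.get(venue, []):
        caption = photo.get("caption") or ""
        hover = photo["username"] + (": " + caption if caption else "")
        entries.append({"image_url": photo["url"], "hover_text": hover})
    return entries

# Stand-in for data a location-based photo lookup might return.
PHOTOS_BY_VENUE = {
    "Some Restaurant": [
        {"url": "https://example.com/p1.jpg", "username": "meowchristine",
         "caption": "dinner out"},
        {"url": "https://example.com/p2.jpg", "username": "someuser",
         "caption": None},
    ]
}

entries = build_widget_entries("Some Restaurant", PHOTOS_BY_VENUE)
```

As the sketch shows, the username travels with the image automatically, which is exactly the association at issue in the IRPA claim.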

Dancel, who maintains an Instagram account with the username “meowchristine,” took a selfie with her boyfriend in front of a restaurant and posted it on Instagram with a tag noting the restaurant’s name. Groupon later displayed this photograph, among others, in connection with its deal for the same restaurant.

New York is now one of the 43 states where “revenge porn,” the posting of explicit photographs or videos to the Internet without the subject’s consent, is punishable by law. See how far the states have come – find out how many had criminalized revenge porn as of 2014, when Socially Aware first covered the issue.

YouTube announced that it will not allow channels that promote anti-vaccination videos to run advertisements because such videos violate the platform’s policy, which, among other things, disallows the monetization of “dangerous content.” Many of the companies whose ads appeared alongside anti-vaccination content say they were not aware it was happening. Find out how that could be possible.

Senator John Kennedy (R-LA) has introduced a bill that would give Internet users considerably more control over their personal data by mandating that social media companies inform registrants—in simple, easy-to-understand terms—that they are entering into an agreement licensing their personal data to the company. Dubbed the Own Your Own Data Act, the legislation would also require social media platforms to make it easy for their registrants to cancel the licensing agreement and obtain the collected data and any analysis of it.

Another privacy bill, this one proposed by Senators Ed Markey (D-MA) and Josh Hawley (R-MO), would amend the Children’s Online Privacy Protection Act (COPPA) to prohibit outright the serving of targeted advertisements on websites directed to children. Find out how else the bill would amend COPPA, and how long companies would have to comply with the amendment if it became law.

The debate over whether politicians have a right to block people on social media rages on.

The United States isn’t the only country whose president favors social media as a vehicle for sharing his views.

A #TwitterLaw symposium is being held at the University of Idaho College of Law next month. Road trip, anyone?

Even the British Royal Family has to contend with social media trolls.

One of the next big items in Europe will be the expansion of “ePrivacy” rules, which, among other things, regulate the use of cookies on websites. While the ePrivacy reform is still being worked on by EU lawmakers, one of the items the ePrivacy Regulation is expected to update is the use of “cookie walls.” Recently, the Austrian and UK data protection authorities (DPAs) issued enforcement actions involving the use of cookie walls, albeit with different findings and conclusions.

Cookie Walls

A cookie wall blocks individuals from accessing a website unless they first accept the use of cookies and similar technologies. The practice of using cookie walls is not prohibited under the current ePrivacy Directive.
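
The pattern is simple enough to express in a few lines. The sketch below is a minimal, hypothetical illustration of how a cookie wall gates content on a consent cookie; the handler and cookie names are invented for illustration and do not reflect any particular site’s implementation.

```python
def serve_page(cookies):
    """Return the page only if a consent cookie is present; otherwise block
    access and show only the consent prompt -- the 'cookie wall' pattern."""
    if cookies.get("cookie_consent") == "accepted":
        return {"blocked": False, "body": "<html>page content</html>"}
    # No recorded consent: the visitor cannot reach the content at all.
    return {"blocked": True,
            "body": "<html>Please accept cookies to access this site.</html>"}
```

The all-or-nothing gate in the `else` branch is precisely what the EDPB argues makes consent not “freely given” under the GDPR.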

However, the European Data Protection Board (EDPB), the successor to the Article 29 Working Party, has issued a non-binding opinion that the use of cookie walls should be prohibited under new EU ePrivacy rules. The EDPB argues that cookie walls run contrary to the General Data Protection Regulation (GDPR): “In order for consent to be freely given as required by the GDPR, access to services and functionalities must not be made conditional on the consent of a user to the processing of personal data or the processing of information related to or processed by the terminal equipment of end-users, meaning that cookie walls should be explicitly prohibited.”


The cost of violating the Children’s Online Privacy Protection Act (COPPA) has been steadily rising, and companies subject to the law should take heed. Last week, the Federal Trade Commission (FTC) announced a record-setting $5.7 million settlement with the mobile app company Musical.ly for a myriad of COPPA violations, exceeding even the $4.95 million COPPA settlement obtained by the New York Attorney General in December 2018. Notably, two Commissioners issued a statement accompanying the settlement, arguing that the FTC should in the future prioritize holding executives personally responsible for their roles in deliberate violations of the law.

COPPA is intended to ensure parents are informed about, and can control, the online collection of personal information (PI) from their children under age thirteen. Musical.ly (now operating as “TikTok”) is a popular social media application that allows users to create and share lip-sync videos to popular songs. The FTC cited the Shanghai-based company for numerous violations of COPPA, including failure to obtain parental consent and failure to properly delete children’s PI upon a parent’s request.


In 2019, the European Court of Justice (CJEU) is expected to clarify one of the key open issues in EU copyright law: the extent to which online platforms such as YouTube can be liable for copyright infringement caused by user-generated content—content such as music, videos, literature, and photos uploaded to the Internet by users, or the streaming of live events such as concerts. The CJEU decisions are eagerly awaited both by media and copyright owners and by online platform operators—and will mark yet another stage in the ongoing battle of the creative industries against copyright infringement in the online world.

SUMMARY

In September 2018, the German Federal Court of Justice (Bundesgerichtshof, BGH) suspended proceedings in a widely publicized case concerning YouTube’s liability for copyright-infringing user-uploaded content and referred a series of questions regarding the interpretation of several EU copyright provisions to the CJEU for a preliminary ruling. A few days later, the BGH also suspended proceedings in five other high-profile cases concerning the liability of the file-hosting service uploaded.net for user files containing copyright-infringing content and submitted the same questions to the CJEU.

Previous rulings by the CJEU have addressed both the application of the safe harbor principle set out in EU E-Commerce Directive 2000/31/EC, which shields hosting providers from liability for hosted unlawful third-party content of which they have no actual knowledge (see, for example, eBay/L’Oreal; Netlog/SABAM; and Scarlet/SABAM), and, separately, the extent of infringement of copyright by hosting of, or linking to, copyright-infringing third-party content under the EU Copyright Directive (see GS Media/Sanoma; Filmspeler; and The Pirate Bay). But it is still unclear under which conditions the providers of the various online platforms that store and make available user-generated content can rely on the safe harbor privilege applying to hosting providers to avoid liability, or whether they must not only take down infringing content when they obtain knowledge of it but also compensate the rights holders for damages for copyright infringement.

The questions that the BGH submitted to the CJEU aim to clarify these uncertainties by bringing together the different requirements established by the previous CJEU rulings for (i) affirming a direct copyright infringement by the online platform providers under the EU Copyright Directive and (ii) denying the application of the safe harbor privilege as well as the legal consequences of such a denial (such as the extent of liability for damages). The CJEU will have to consider the differences between the YouTube and uploaded.net business models. The CJEU will hopefully provide much clearer guidelines on key issues such as:

  • to what extent providers of online services may engage with the user content they host;
  • which activities will trigger liability for copyright infringement irrespective of actual knowledge of a specific infringement; and
  • whether providers must actively monitor the content uploaded by users for copyright infringements (e.g., by using state-of-the-art filter technologies) to avoid damage claims by rights holders.

In addition, we expect these cases to have an effect on the interpretation of the new Art. 13 of the revision of the EU Copyright Directive that will likely be adopted by the EU legislative institutions in the second quarter of 2019. The current trilogue negotiations among the EU institutions indicate that, under the new Art. 13, providers of online content sharing services will be directly liable for copyright infringements by content uploaded to the platform by their users and will not be granted safe harbor under the EU E-Commerce Directive. The providers would then have to ensure that content for which they have not obtained a license from the respective rights holders cannot be displayed on their platforms. This means that providers would have to screen all content files when uploaded to their platforms, making filter technology mandatory for the majority of platforms (see our previous Client Alert on the draft amendment to the EU Copyright Directive).
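
The screening obligation described above can be sketched, in deliberately simplified form, as a fingerprint lookup at upload time. This is an illustrative assumption, not how any particular platform works: production systems (e.g., YouTube’s Content ID) use perceptual audio/video matching rather than the exact hashing shown here, which is used only to illustrate the control flow.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint of an uploaded file (illustrative only;
    real filters match perceptually, so re-encoded copies also match)."""
    return hashlib.sha256(data).hexdigest()

def may_publish(upload: bytes, unlicensed_fingerprints: set) -> bool:
    """Block the upload if it matches content the platform has no license for."""
    return fingerprint(upload) not in unlicensed_fingerprints

# Catalog of fingerprints supplied for works the platform has not licensed.
unlicensed = {fingerprint(b"unlicensed music track")}
```

Even in this toy form, the sketch shows why the obligation is described as mandatory filtering: every upload must pass through the check before it can be displayed.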


In what is being described as “the first settlement to deem such sales illegally deceptive,” New York Attorney General Letitia James has entered into a settlement with a company that had been selling fake followers, likes and views on several social media platforms. Read how much revenue the sales were generating for the defendant companies.

Twitter is requiring parties interested in posting ads related to the European Parliament elections to verify their identities and confirm that they are based in the EU.

The online pinboard Pinterest has confidentially filed for an initial public offering, the Wall Street Journal reports. Find out the valuation the social media company is reportedly seeking, and how the company monetizes its platform.

A state appeals court in New York widened the scope of the discovery allowable in a personal injury case to include these types of posts on social media platforms.

Washington state senators have proposed a bill that would make “social media extortion”—defined as attempting to acquire property from someone in return for removing negative social media communications—a class C felony, punishable by up to five years in prison or a fine of up to $10,000. Read about the restaurant owner’s experience that inspired the bill.

The app maker Niantic Inc. may have reached a settlement with homeowners who brought a nationwide class action suit based on trespass and nuisance allegedly caused by the Pokémon Go craze that Socially Aware reported on in the summer of 2016. Find out what the settlement would require Niantic to do—and possibly pay.

California has made it easier for wineries to promote events over social media without running afoul of state law.

The up-and-coming generation of “unicorn” start-ups—new companies on track to quickly reach $1 billion in value—is looking very different from the first generation, which included now-household names like Uber and Airbnb. Find out how in this New York Times article.

A new app called “Tudder” is basically just like the dating app Tinder, but for cows. We’re serious.

The California Attorney General continued its series of public forums regarding the California Consumer Privacy Act (CCPA), with forums last week in Riverside (January 24, 2019) and Los Angeles (January 25, 2019). As in the previous forums, there were a significant number of attendees, but few elected to speak publicly regarding their views on the Act. You can read our reports on the public forums held earlier this month in San Francisco and San Diego.

Lisa Kim, Deputy Attorney General for the AG’s Privacy Unit, provided opening remarks at both forums and identified the areas of the AG’s rulemaking on which speakers should focus their comments, namely those areas of the Act that call for specific AG rules. Ms. Kim encouraged interested parties to provide written comments and proposed regulatory language during this pre-rulemaking phase. Consistent with the prior forums, she noted that the AG’s office would be listening, and not responding, to comments made in Riverside and Los Angeles.

Of note, the presentation slides from the forum (available here) state that the AG anticipates publishing proposed rules in Fall 2019, and that there will then be a period for public comment and additional public hearings.


In anticipation of preparing rules to implement the California Consumer Privacy Act, the California Attorney General recently announced six public forums that he will host in January and February 2019 across California. On January 8, 2019, the AG hosted the first of these forums in San Francisco. The following provides an overview of the forum and the comments made there.

Overview of the January 8, 2019, San Francisco Forum 

Stacey Schesser, the Supervising Deputy Attorney General for the AG’s Privacy Unit, provided opening remarks.  Ms. Schesser confirmed that the AG’s office is at the very beginning of its rulemaking process.  Although the AG’s office will solicit formal comments after it prepares proposed rules, the AG is interested in receiving detailed written comments from the public with proposed language during this informal period.

These forums appear to be designed to inform the AG’s rulemaking and potentially streamline the process by allowing public input before rules are drafted. In this regard, Ms. Schesser clarified that she and other AG representatives in attendance at the San Francisco forum were there only to listen to the public comments and would not respond to questions or engage with speakers. As a result, if the remaining forums follow a similar approach, it is unlikely that the forums will elicit meaningful intelligence regarding the AG’s approach to, or the substance of, the anticipated rulemaking.
