Last week, German regulators decided to no longer accept the widely used “JusProg” software as a sufficient means for online service providers to comply with statutory youth protection requirements. The decision is effective immediately, although it will most likely be challenged in court. If the decision is upheld, it puts video-sharing platforms, distributors of gaming content, and online media services at risk of being held accountable for failing to properly protect minors from potentially harmful content. For the affected providers, this is particularly challenging because it will be hard, if not impossible, to implement alternative youth protection tools that meet the redefined regulatory standards.

This alert provides the legal background for the decision and discusses its implications in more detail. It is relevant to all online service providers that target German users. Continue Reading Youth Protection in Germany: Are Online Age Checks & Daytime Blackouts Ahead?

A federal district court in California has added to the small body of case law addressing whether it’s permissible for one party to use another party’s trademark as a hashtag. The court held that, for several reasons, the 9th Circuit’s nominative fair use analysis did not cover one company’s use of another company’s trademarks as hashtags. Whether a hashtag is capable of functioning as a trademark, the topic of two of Socially Aware’s most popular posts, is—of course—another issue entirely.

In what The New York Times describes as “the latest in a line of rulings allowing companies to use arbitration provisions to bar both class actions in court and class-wide arbitration proceedings,” the Supreme Court held that, under their employment agreements, workers at the lighting fixture retailer Lamps Plus can’t band together to sue the company for allegedly failing to protect their data. The details of the data breach make for an interesting read.

The U.K.’s data regulator has proposed rules that would prevent social media platforms from allowing children to “like” posts. Here’s why.

Officials in a Georgia city might pass a law that would allow elected and appointed officials and employees of the city to sue—at the city’s taxpayers’ expense—anyone who defames them on social media.

Instagram influencer Gianluca Vacchi, who is not a fictional character, but—according to the complaint he filed in a federal court in New York—“an international social media celebrity, influencer, fashionista, and disk jockey”—is suing E*Trade for allegedly depicting a character in its commercials who is “stunningly identical” to him. The suit claims copyright infringement, Lanham Act false association and unfair competition, and violation of New York’s right of publicity and privacy.

The Chinese social media company Weibo restored access to this type of content on its platform after significant backlash from its users.

In the age of smartphones and social media, what can trial lawyers do to secure a jury that will rely only on the evidence presented in court?

Artificial intelligence is informing which items McDonald’s includes on its outdoor digital menu displays.

The California State Bar is considering using artificial intelligence, too. The bar hopes that AI can help it to more efficiently determine which attorney misconduct complaints to pursue, and perform a function that affects every wannabe lawyer.

Should Wendy’s put Spicy Chicken Nuggets back on its menu? Social media users have spoken (with some prompting from Chance the Rapper).

The Directive on Copyright in the Digital Single Market (Directive) was finally approved by all EU legislative bodies on April 15, 2019. Introducing “modernizing EU copyright rules for European culture to flourish and circulate” was a key initiative of the European Commission’s Digital Single Market (DSM) strategy, which, according to Commission President Jean-Claude Juncker, the Directive has now completed as “the missing piece of the puzzle.” The approval came just in time for the European Parliament elections taking place in May 2019. Member States are required to implement the Directive’s provisions into national law within 24 months.

Along with their approval of the Directive, various Member States have issued statements setting out their interpretation of the Directive and voicing quite different views about the upcoming implementation process. While Germany strongly opposes the notion of upload filters, France appears to favor a copyright protection mechanism that includes them. At the same time, it remains a pressing question whether currently available algorithm-based filters would even be able to sufficiently differentiate between infringing and non-infringing content. Continue Reading The EU Copyright Directive Passes – But Member States Remain Split on Upload Filters

Often hailed as the law that gave us the modern Internet, Section 230 of the Communications Decency Act generally protects online platforms from liability for content posted by third parties. Many commentators, including us here at Socially Aware, have noted that Section 230 has faced significant challenges in recent years. But Section 230 has proven resilient (as we previously noted here and here), and that resiliency was again demonstrated by the Second Circuit’s recent opinion in Herrick v. Grindr, LLC.

As we noted in our prior post following the district court’s order dismissing plaintiff Herrick’s claims on Section 230 grounds, the case arose from fake Grindr profiles allegedly set up by Herrick’s ex-boyfriend. According to Herrick, these fake profiles resulted in Herrick facing harassment from over 1,000 strangers who showed up at his door over the course of several months seeking violent sexual encounters. Continue Reading Appeals Court Again Upholds Section 230 Protections in Case Against Grindr

A new law in Australia makes a social media company’s failure to remove “abhorrent violent material” from its platform punishable by significant fines. The law also provides that executives at social media companies that fail to remove such content could face jail time.

The European Parliament voted to approve the Copyright Directive, a directive that, although vaguely worded, affords copyright holders significant new protections online, and requires online platforms to police content more thoroughly than ever before. Find out exactly what impact industry advocates predict the law will have, and how long it will be until it’s implemented.

Learn how companies can collect and use biometric data without becoming an easy target for litigation, according to my co-editor Julie O’Neill and our colleague Max Phillip Zidel.

As part of the FTC’s continuing efforts to ensure consumers are aware of when an online endorser has been compensated in connection with an endorsement, the agency recently settled a complaint against a subscription service that allegedly offered its product for free to consumers who posted positive online reviews.

In the wake of reports about social media influencers purchasing fake followers and fake likes, as well as failing to adequately label endorsed content, online celebrities are embracing more relatable posts, potentially in an effort to appear more trustworthy.

To better compete with digital media platforms, broadcasters in the top 40 television markets in the United States will introduce a broadcasting standard that enables interactive and targeted advertising.

Snap Inc., whose Snapchat app currently excludes users younger than 13 but generally does not verify ages, has announced that it is working with British lawmakers to prevent underage children from signing up for its service.

What’s up with Google’s new streaming game platform?

A photographer is suing supermodel Gigi Hadid for copyright infringement for posting a photo of herself to Instagram.

Fruit of the Loom is holding a contest on Instagram in search of the best jingle for its Breathable Boxer Briefs. See how much the underwear manufacturer promises to award the winning songwriter.

In early March 2019, the Department of Justice (DOJ) revised its Foreign Corrupt Practices Act (FCPA) Corporate Enforcement Policy (the Policy). First announced in November 2017, the Policy is designed to encourage companies to self-report FCPA violations and to cooperate with the DOJ’s FCPA investigations. The Policy and its recent revisions were incorporated into the United States Attorneys’ Manual (USAM), now referred to as the Justice Manual (JM), which is the internal DOJ document that sets forth policies and guidance for federal prosecutors.

One of the most notable aspects of the original Policy was its requirement that companies seeking to obtain remediation credit prohibit employees from using ephemeral messaging systems unless appropriate retention mechanisms were put in place. According to the original Policy, a company would receive full credit for remediation only “if [it] prohibit[ed] employees from using software that generates but does not appropriately retain business records or communications.” Continue Reading How to Comply with the Revised Ephemeral-Messaging Provision in the DOJ’s Corporate Enforcement Policy

As consumers increasingly communicate and interact through social media platforms, courts have had to grapple with how to apply existing laws to new ways of communicating, as well as disseminating and using content. Sometimes, however, traditional legal standards apply to these new platforms in a straightforward manner. At least, that is what the court found in Dancel v. Groupon, Inc., a putative class action against Groupon, Inc., alleging that Groupon’s use of images originally posted on the social media site Instagram violated users’ rights under the Illinois Right of Publicity Act (IRPA).

Groupon, a website that offers consumers deals on goods and services, built a widget intended to provide its users a window into businesses for which Groupon offered deals. The widget used Instagram’s API to find photos that Instagram users had taken at particular locations, and then displayed those images under the deals offered on Groupon’s own website. When a visitor to the Groupon page hovered his or her mouse over one of the Instagram images, the visitor could see the username of the person who posted the photo on Instagram and an associated caption, if there was one.
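For readers curious about the mechanics the court described, the sketch below illustrates the general pattern at issue: pulling location-tagged photos from a third-party API and rendering them with the poster’s username and caption revealed on hover. It is a minimal, hypothetical illustration only; the endpoint, field names, and helper functions are assumptions and do not reflect Groupon’s actual code or Instagram’s API.

```typescript
// Hypothetical sketch of the widget pattern described above.
// The endpoint and response shape are assumptions, not Instagram's real API.
interface LocationPhoto {
  imageUrl: string;
  username: string;   // handle of the person who posted the photo
  caption?: string;   // optional caption supplied by the poster
}

async function fetchLocationPhotos(locationId: string): Promise<LocationPhoto[]> {
  // Placeholder endpoint standing in for a photo-by-location API call.
  const res = await fetch(`https://api.example.com/locations/${locationId}/media`);
  if (!res.ok) throw new Error(`Photo lookup failed: ${res.status}`);
  return res.json();
}

async function renderDealWidget(container: HTMLElement, locationId: string): Promise<void> {
  const photos = await fetchLocationPhotos(locationId);
  for (const photo of photos) {
    const img = document.createElement("img");
    img.src = photo.imageUrl;
    // Username and caption surface only on hover, as described in the case.
    img.title = photo.caption ? `${photo.username}: ${photo.caption}` : photo.username;
    container.appendChild(img);
  }
}
```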

Dancel, who maintains an Instagram account with the username “meowchristine,” took a selfie of herself and her boyfriend in front of a restaurant and posted it on Instagram with a tag noting the name of the restaurant. Groupon later displayed this photograph, among others, in connection with its deal for the same restaurant. Continue Reading What’s in a (User)Name?

New York is now one of the 43 states where “revenge porn,” the posting of explicit photographs or videos to the Internet without the subject’s consent, is punishable by law. See how far the states have come – find out how many had criminalized revenge porn as of 2014, when Socially Aware first covered the issue.

YouTube announced that it will not allow channels that promote anti-vaccination videos to run advertisements because such videos violate the platform’s policy, which, among other things, disallows the monetization of “dangerous content.” Many of the companies whose ads appeared alongside anti-vaccination content say they were not aware it was happening. Find out how that could be possible.

Senator John Kennedy (R-LA) has introduced a bill that would give Internet users considerably more control over their personal data by mandating that social media companies inform registrants—in simple, easy-to-understand terms—that they are entering into an agreement licensing their personal data to the company. Dubbed the Own Your Own Data Act, the legislation would also require social media platforms to make it easy for their registrants to cancel the licensing agreement and obtain the collected data and any analysis of it.

Another privacy bill, this one proposed by Senators Ed Markey (D-MA) and Josh Hawley (R-MO), would amend the Children’s Online Privacy Protection Act (COPPA) to prohibit outright the running of targeted advertisements on websites directed to children. Find out how else the bill would amend COPPA, and how long companies would have to comply with the amendment if it became law.

The debate over whether politicians have a right to block people on social media rages on.

The United States isn’t the only country whose president favors social media as a vehicle for sharing his views.

A #TwitterLaw symposium is being held at the University of Idaho College of Law next month. Road trip, anyone?

Even the British Royal Family has to contend with social media trolls.

One of the next big items in Europe will be the reform of the EU’s ePrivacy rules, which, among other things, regulate the use of cookies on websites. While the ePrivacy reform is still being worked on by EU lawmakers, one of the items the ePrivacy Regulation is expected to address is the use of “cookie walls.” Recently, the Austrian and UK data protection authorities (DPAs) issued enforcement actions involving the use of cookie walls, albeit with different findings and conclusions.

Cookie Walls

A cookie wall blocks individuals from accessing a website unless they first accept the use of cookies and similar technologies. The practice of using cookie walls is not prohibited under the current ePrivacy Directive.
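As a purely illustrative sketch of the pattern at issue (not based on any particular site’s implementation), a cookie wall might gate page content behind a consent check along the following lines. The cookie name and overlay markup are assumptions made for illustration only.

```typescript
// Hypothetical cookie-wall gate: page content stays inaccessible until the
// visitor accepts cookies. Cookie name and markup are assumptions for illustration.
const CONSENT_COOKIE = "cookie_consent";

function hasConsent(): boolean {
  return document.cookie
    .split("; ")
    .some((entry) => entry === `${CONSENT_COOKIE}=accepted`);
}

function showCookieWall(): void {
  const wall = document.createElement("div");
  // Full-screen overlay that covers the underlying page.
  Object.assign(wall.style, {
    position: "fixed",
    inset: "0",
    background: "rgba(0, 0, 0, 0.85)",
    color: "#fff",
    display: "flex",
    alignItems: "center",
    justifyContent: "center",
    zIndex: "9999",
  });
  wall.textContent = "Please accept cookies to access this site. ";

  const accept = document.createElement("button");
  accept.textContent = "Accept";
  accept.onclick = () => {
    // Persist consent for one year, then remove the wall.
    document.cookie = `${CONSENT_COOKIE}=accepted; path=/; max-age=31536000`;
    wall.remove();
  };
  wall.appendChild(accept);
  document.body.appendChild(wall);
}

if (!hasConsent()) {
  showCookieWall();
}
```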

However, the European Data Protection Board (EDPB), the successor to the Article 29 Working Party, has issued a non-binding opinion that the use of cookie walls should be prohibited under new EU ePrivacy rules. The EDPB argues that cookie walls run contrary to the General Data Protection Regulation (GDPR): “In order for consent to be freely given as required by the GDPR, access to services and functionalities must not be made conditional on the consent of a user to the processing of personal data or the processing of information related to or processed by the terminal equipment of end-users, meaning that cookie walls should be explicitly prohibited.”

Continue Reading The Cookie Wall Must Go Up. Or Not?

The cost of violating the Children’s Online Privacy Protection Act (COPPA) has been steadily rising, and companies subject to the law should take heed. Last week, the Federal Trade Commission (FTC) announced a record-setting $5.7 million settlement with the mobile app company Musical.ly for a myriad of COPPA violations, exceeding even the $4.95 million COPPA settlement obtained by the New York Attorney General in December 2018. Notably, two Commissioners issued a statement accompanying the settlement, arguing that, in future cases, the FTC should prioritize holding executives personally responsible for their roles in deliberate violations of the law.

COPPA is intended to ensure parents are informed about, and can control, the online collection of personal information (PI) from their children under age thirteen. Musical.ly (now operating as “TikTok”) is a popular social media application that allows users to create and share lip-sync videos to popular songs. The FTC cited the Shanghai-based company for numerous violations of COPPA, including failure to obtain parental consent and failure to properly delete children’s PI upon a parent’s request.

Continue Reading Thank You, Next Enforcement: Music Video App Violates COPPA, Will Pay $5.7 Million