This is the famous Monkey selfie.

I confess: I have mixed emotions regarding the iconic “monkey-selfie” photo and all the hubbub it has created.

Don’t get me wrong; I think monkeys are wonderful, and the photo deserves its iconic status. Who can resist smiling while viewing that famous image of Naruto, the macaque monkey who allegedly snapped the self-portrait?

And the monkey selfie has been a boon to legal blogs. Our own posts regarding the photo have been among the most viewed content on Socially Aware (one of our posts prompted a call from my mother, who felt strongly that Naruto should be entitled to a copyright in the photo).

But, let’s face it, in an era when technology disruption is generating so many critical and difficult copyright issues, the law relevant to the monkey selfie is pretty straightforward, at least in the United States. As the U.S. Copyright Office states in its Compendium II of Copyright Office Practices, for a work to be copyrightable, it must “owe its origin to a human being”; materials produced solely by nature, by plants, or by animals do not qualify. U.S. courts have reached the same conclusion. (Although I note that David Slater, the nature photographer whose camera was used to take the photo, claims that he—and not the macaque—is in fact the author of the photo for copyright purposes.)

A recent decision of the German Federal Court of Justice may have a significant impact on content providers’ business models. The court held that offering software that allows users to block advertising does not constitute an unfair commercial practice, and that even providing advertisers with the option to pay to have certain ads shown—a practice known as whitelisting—does not violate the unfair competition rules.

Issued on April 19, the decision involved a legal dispute between the ad blocking software provider Eyeo GmbH and the online-content provider Axel Springer (which also happens to be Germany’s largest publishing house). The decision overruled the Higher Regional Court of Cologne’s previous decision, which, like the Federal Court of Justice, did not categorize Eyeo’s offer of its ad blocking product as an unfair competition practice, but did categorize paid whitelisting as unlawful.

Axel Springer is now left with the final option of taking the case to the Federal Constitutional Court.

Background and core arguments of the parties

Eyeo, a German software company, offers the product AdBlock Plus, which allows Internet users to block ads online. The product became the most popular ad blocking software in Germany and abroad, with over 500 million downloads and 100 million users worldwide.

In 2011, the company started to monetize its product by offering a whitelisting service that gives advertisers the option to pay to show their ads. To get on Eyeo’s list of companies whose ads are not blocked, advertisers have to comply with Eyeo’s “acceptable advertising” conditions and share their ad revenue with the company. The conditions dictate the advertising’s features such as its placement, size, and—in the case of text advertising—color.
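For readers curious about the mechanics, here is a rough, purely conceptual Python sketch of how an ad blocker with a paid whitelist might decide whether to let an ad through. The advertiser names, criteria, and logic below are hypothetical and do not reflect Eyeo’s actual filter rules or the Acceptable Ads criteria.

```python
# Conceptual sketch only: not Eyeo's implementation or the real Acceptable Ads rules.
from dataclasses import dataclass

@dataclass
class Ad:
    advertiser: str   # domain of the advertiser
    placement: str    # e.g., "sidebar", "in-feed", "pop-up"
    width: int        # pixels
    height: int       # pixels

# Hypothetical whitelist of advertisers that have agreed to the
# "acceptable advertising" conditions (and share revenue with the vendor).
WHITELISTED_ADVERTISERS = {"example-ads.com", "acme-marketing.net"}

def meets_acceptable_ads_criteria(ad: Ad) -> bool:
    """Very rough stand-in for placement and size rules an ad must satisfy."""
    return ad.placement in {"sidebar", "footer"} and ad.height <= 120

def should_block(ad: Ad) -> bool:
    """Block everything except whitelisted ads that satisfy the criteria."""
    if ad.advertiser in WHITELISTED_ADVERTISERS and meets_acceptable_ads_criteria(ad):
        return False  # allowed through via the whitelist
    return True       # default behavior: block

if __name__ == "__main__":
    print(should_block(Ad("example-ads.com", "sidebar", 300, 100)))  # False: whitelisted, compliant
    print(should_block(Ad("unknown-ads.com", "pop-up", 800, 600)))   # True: blocked
```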

A federal district court in California awarded $6.4 million to a victim of revenge porn (the posting of explicit material without the subject’s consent) based on copyright infringement, emotional distress, and other claims. The judgment is believed to be one of the largest awards relating to revenge porn. A Socially Aware post that we wrote back in 2014 explains the difficulties of using causes of action like copyright infringement—and state laws—as vehicles for fighting revenge porn.

The highest court in New York State held that whether a personal injury plaintiff’s Facebook photos are discoverable does not depend on whether the photos were set to “private,” but rather on “the nature of the event giving rise to the litigation and the injuries claimed, as well as any other information specific to the case.”

A federal district court held that Kentucky’s governor did not violate the free speech rights of two Kentucky citizens when he blocked them from commenting on his Facebook page and Twitter account. The opinion underscores differences among courts as to the First Amendment’s application to government officials’ social media accounts; for example, a Virginia federal district court’s 2017 holding reached the opposite conclusion in a case involving similar facts.

Having witnessed social media’s potential to escalate gang disputes, judges in Illinois have imposed limitations on some juvenile defendants’ use of popular social media platforms, a move that some defense attorneys argue violates the young defendants’ First Amendment rights.

A bill proposed by California State Sen. Bob Hertzberg would require social media platforms to identify bots—automated accounts that appear to be owned by real people but are actually computer programs capable of simulating human dialog. Bots can spread a message across social media faster and more widely than would be humanly possible, and have been used in efforts to manipulate public opinion.

This CIO article lists the new strategies, job titles and processes that will be popular this year among businesses transforming into data-driven enterprises.

A solo law practitioner in Chicago filed a complaint claiming defamation and false light against a former client who she alleges posted a Yelp review calling her a “con artist” and a “legal predator” after she billed $9,000 to his credit card for a significant amount of legal work, allegedly pursuant to the terms of his retainer.

Carnival Cruise Line put up signs all over the hometown of the 15-year-old owner of the Snapchat handle @CarnivalCruise in order to locate him and offer him and his family a luxurious free vacation in exchange for the transfer of his Snapchat handle—and the unusual but innovative strategy paid off. Who knew that old-school billboards could be so effectively used for one-on-one marketing?

Does a search engine operator have to delist websites hosting, without authorization, your trade secret materials or other intellectual property? The answer may depend on where you sue—just ask Google. The U.S. District Court for the Northern District of California recently handed the company a victory over plaintiff Equustek Solutions Inc. in what has turned into an international battle where physical borders can have very real consequences on the Internet.

The dispute began when a rival company, Datalink, allegedly misappropriated Equustek’s trade secrets in developing competing products. Equustek also alleged that Datalink misled customers who thought they were buying Equustek products. In 2012, Equustek obtained numerous court orders in Canada against Datalink. Datalink refused to comply, and a Canadian court issued an arrest warrant for the primary defendant, who has yet to be apprehended.

Section 230 of the Communications Decency Act continues to act as one of the strongest legal protections that social media companies have to avoid being saddled with crippling damage awards based on the misdeeds of their users.

The strong protections afforded by Section 230(c) were recently reaffirmed by Judge Caproni of the Southern District of New York in Herrick v. Grindr. The case involved a dispute between the social networking platform Grindr and an individual who was maliciously targeted through the platform by his former lover. For the unfamiliar, Grindr is a mobile app directed to gay and bisexual men that, using geolocation technology, helps them connect with other users who are located nearby.

Plaintiff Herrick alleged that his ex-boyfriend set up several fake profiles on Grindr that claimed to be him. Over a thousand users responded to the impersonating profiles. Herrick’s ex-boyfriend, pretending to be Herrick, would then direct the men to Herrick’s workplace and home. The ex-boyfriend, still posing as Herrick, would also tell these would-be suitors that Herrick had certain rape fantasies, that he would initially resist their overtures, and that they should attempt to overcome Herrick’s initial refusals. The impersonating profiles were reported to Grindr (the app’s operator), but Herrick claimed that Grindr did not respond, other than to send an automated message.

In a decision that has generated considerable controversy, a federal court in New York has held that the popular practice of embedding tweets into websites and blogs can result in copyright infringement. Plaintiff Justin Goldman had taken a photo of NFL quarterback Tom Brady, which Goldman posted to Snapchat. Snapchat users “screengrabbed” the image for use in tweets on Twitter. The defendants—nine news outlets—embedded tweets featuring the Goldman photo into online articles so that the photo itself was never hosted on the news outlets’ servers; rather, it was hosted on Twitter’s servers (a process known as “framing” or “inline linking”). The court found that, even absent any copying of the image onto their own servers, the news outlets’ actions had resulted in a violation of Goldman’s exclusive right to authorize the public display of his photo.
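To make the technical distinction at the heart of the case concrete, here is a minimal, purely illustrative Python sketch of inline linking: the publisher’s article merely references an image URL hosted on a third party’s servers, so no copy of the image file is ever stored on the publisher’s own server. The function name and URLs below are hypothetical.

```python
# Illustrative sketch of "inline linking" / "framing"; the URLs are made up.

def build_article_html(headline: str, remote_image_url: str) -> str:
    """Return article HTML that embeds an image by reference only."""
    return f"""
    <article>
      <h1>{headline}</h1>
      <!-- The reader's browser fetches the image directly from the remote host;
           the publisher never stores a copy of the image file. -->
      <img src="{remote_image_url}" alt="Embedded photo" />
    </article>
    """

if __name__ == "__main__":
    html = build_article_html(
        "Tom Brady Spotted in the Hamptons",
        "https://pbs.twimg.com/media/EXAMPLE-photo.jpg",  # hypothetical third-party URL
    )
    print(html)
```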

If legislation recently introduced in California passes, businesses whose password-protected apps or websites enable Golden State residents younger than 18 to share content could be prohibited from asking those minors to agree to the site’s or the app’s terms and conditions of use.

After a lawyer was unable to serve process by delivering court documents to a defendant’s physical and email addresses, the Ontario Superior Court granted the lawyer permission to serve process by mailing a statement of claim to the defendant’s last known address and by sending the statement of claim through private messages to the defendant’s Instagram and LinkedIn accounts. This is reportedly the first time an Ontario court has permitted service of process through social media. The first time we at Socially Aware heard of a U.S. court permitting a plaintiff to serve process on a domestic, U.S.-based defendant through a social media account was back in 2014.

Videos that superimpose the faces of celebrities and non-famous people onto porn performers’ bodies to produce believable footage have surfaced on the Internet, and they are on the verge of proliferating. Unlike the non-consensual dissemination of explicit photos that haven’t been manipulated—sometimes referred to as “revenge porn”—this fake porn is technically not a privacy issue, and making it illegal could raise First Amendment issues.

By mining datasets and social media to recover millions of dollars lost to tax fraud and errors, the IRS may be violating common law and the Electronic Communications Privacy Act, according to an op-ed piece in The Hill.

A woman is suing her ex-husband, a sheriff’s deputy in Georgia, for having her and her friend arrested and briefly jailed for posting on Facebook about his alleged refusal to drop off medication for his sick children on his way to work. The women had been charged with “criminal defamation of character,” but the case was ultimately dropped after a state court judge ruled there was no basis for the arrest.

During a hearing in a Manhattan federal court over a suit brought by seven Twitter users who say President Trump blocked them on Twitter for having responded to his tweets, the plaintiffs’ lawyer compared Twitter to a “virtual town hall” where “blocking is a state action and violates the First Amendment.” A lawyer for the government, on the other hand, analogized the social media platform to a convention where the presiding official can decide whether or not to engage with someone. The district court judge who heard the arguments declined to decide the case on the spot and encouraged the parties to settle out of court.

Have your social media connections been posting headshots of themselves alongside historical portraits of people who look just like them? Those posts are the product of a Google app that matches the photo of a person’s face to a famous work of art, and the results can be fun. But not for people who live in Illinois or Texas, where access to the app isn’t available. Experts believe it’s because laws in those states restrict how companies can use biometric data.

The stock market is apparently keeping up with the Kardashians. A day after Kim Kardashian’s half-sister Kylie Jenner tweeted her frustration with Snapchat’s recent redesign, the company’s market value decreased by $1.3 billion.

In February, the U.S. Supreme Court heard oral arguments in United States v. Microsoft. At issue is Microsoft’s challenge to a warrant issued by a U.S. court directing it to produce emails stored in Ireland. With implications for government investigations, privacy law, and multi-national tech companies’ ability to compete globally, the case has attracted significant attention.

Over the course of the oral arguments it became clear that rendering a decision in United States v. Microsoft would require the justices to choose between two less-than-satisfactory outcomes: denying the U.S. government access to necessary information, or potentially harming U.S. technology companies’ ability to operate globally.

The conundrum the justices face is largely due to the fact that the 1986 law at issue, the Stored Communications Act (SCA), never envisioned the kind of complex, cross-border data storage practices of today.

Find out more about the case and how recently introduced legislation known as the CLOUD Act could wind up superseding the Court’s decision in United States v. Microsoft by, among other things, clarifying the SCA’s applicability to foreign-stored data while also providing technology companies with a new vehicle for challenging certain orders that conflict with the laws of the country where data is stored.

Read my article in Wired.

Companies that offer services, whether online or offline, to consumers on a subscription or other automatic renewal basis should be aware that such offers are heavily regulated at both the federal and state levels. A recent amendment to Section 17602 of California’s Business and Professions Code provides a good opportunity for businesses that make subscription offers to review their practices. As of July 1, 2018, the obligations under California law will expand in two ways that may require businesses to update those practices.

The first change relates to the information that businesses must provide to consumers regarding the terms of a subscription offer. The current law already requires a business to provide certain information about the renewal process—such as the amount of the recurring charges, the length of the renewal period, and the cancellation policy—both before the consumer accepts the agreement, and afterwards in an acknowledgement. The amendment provides that, as of July 1, 2018, if the offer includes any free trial or gift component, the information provided to consumers must also include a “clear and conspicuous explanation of the price that will be charged after the trial ends or the manner in which the subscription or purchasing agreement pricing will change upon conclusion of the trial.”

Following a recent decision from the Sixth Circuit, anonymous bloggers and other Internet users who post third-party copyrighted material without authorization have cause for concern: they may be unable to preserve their anonymity.

In Signature Management Team, LLC v. John Doe, the majority of a panel of the U.S. Court of Appeals for the Sixth Circuit established a new “presumption in favor of unmasking anonymous defendants when judgment has been entered for a plaintiff” in a copyright infringement case. This unmasking presumption is intended to protect the openness of judicial proceedings. Whether to unmask the defendant in such circumstances requires an examination of factors such as the plaintiff’s and public’s interest in knowing the defendant’s identity.

Happy 2018 to our readers! It has become a Socially Aware tradition to start the New Year with some predictions from our editors and contributors. With smart contracts on the horizon, the Internet of Things and cryptocurrencies in the spotlight, and a number of closely watched lawsuits moving toward resolution, 2018 promises to be an exciting year in the world of emerging technology and Internet law.

Here are some of our predictions regarding tech-related legal developments over the next twelve months. As always, the views expressed are not to be attributed to Morrison & Foerster or its clients.

From John Delaney, Co-Founder and Co-Editor, Socially Aware, and Partner at Morrison & Foerster:
Regarding Web Scraping

Web scraping is an increasingly common activity among businesses (by one estimate, web-scraping bots account for as much as 46% of Internet traffic), and it is helping to fuel the “Big Data” revolution. Despite the growing popularity of web scraping, courts have generally been unsympathetic to web scrapers. Last August, however, web scrapers finally received a huge victory, when the U.S. District Court for the Northern District of California enjoined LinkedIn from blocking hiQ Labs’ scraping of publicly available user profiles from the LinkedIn website in the hiQ Labs, Inc. v. LinkedIn Corp. litigation. The case is now on appeal to the Ninth Circuit. My sense is that the Ninth Circuit will reject the broad scope and rationale of the lower court’s ruling; if the Ninth Circuit nevertheless ultimately sides with hiQ Labs, the web scraper, the decision could be a game changer, bringing online scraping out of the shadows and perhaps spurring more aggressive uses of scraping tools and scraped data. On the other hand, if the Ninth Circuit reverses, we may see companies reexamining and perhaps curtailing their scraping initiatives. Either way, 2018 promises to bring greater clarity to this murky area of the law.
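For context on what this activity involves, below is a minimal, hypothetical Python sketch of web scraping: fetching a publicly available page and extracting structured data from its HTML. The URL and extraction pattern are made up for illustration, and any real-world scraper would also need to consider robots.txt, rate limits, and the target site’s terms of use.

```python
# Hypothetical scraping sketch; URL and HTML pattern are illustrative only.
import re
import urllib.request

def fetch_public_profile(url: str) -> str:
    """Download the raw HTML of a publicly accessible page."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def extract_job_titles(html: str) -> list[str]:
    """Pull job titles out of elements like <span class="title">...</span>."""
    return re.findall(r'<span class="title">([^<]+)</span>', html)

if __name__ == "__main__":
    # A real scraper would call fetch_public_profile() on a live URL, e.g.:
    # html = fetch_public_profile("https://example.com/public-profile/jane-doe")
    # Here we parse a canned snippet so the example runs offline.
    sample_html = '<div><span class="title">Data Scientist</span><span class="title">Recruiter</span></div>'
    print(extract_job_titles(sample_html))  # ['Data Scientist', 'Recruiter']
```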

Regarding the Growing Challenges for Social Media Platforms

2017 was a tough year for social media platforms. After years of positive press, immense consumer goodwill and a generally “hands off” attitude from regulators, last year saw a growing backlash against social media for a number of reasons: the continued rise of trolling, creating an ever more toxic online environment; criticism of social media’s role in the dissemination of fake news; growing concern over social media “filter bubbles” and “echo chambers”; and worries about the potential societal impact of social media’s algorithm-driven effectiveness in attracting and keeping a grip on our attention. Expect to see in 2018 further efforts by social media companies to get out ahead of most, if not all, of these issues, in the hopes of winning over critics and discouraging greater governmental regulation.

Regarding the DMCA Safe Harbor for Hosting of User-Generated Content

The backlash against social media noted in my prior item may also be reflected, to some extent, in several 2017 court decisions regarding the DMCA safe harbor shielding website operators and other online service providers from copyright damages in connection with user-generated content (and perhaps in the CDA Section 230 case law discussed by Aaron Rubin below). After nearly two decades of court decisions generally taking an ever more expansive approach to this particular DMCA safe harbor, the pendulum began to swing in the other direction in 2016, and the trend picked up steam in 2017, culminating in the Ninth Circuit’s Mavrix decision, which held that a social media platform provider’s use of volunteer curators to review user posts deprived the provider of DMCA safe harbor protection. Expect to see the pendulum continue to swing in favor of copyright owners in DMCA safe harbor decisions over the coming year.

Regarding Smart Contracts

Expect to see broader, mainstream adoption of “smart contracts,” especially in the B2B context—and perhaps litigation over smart contracts in 2019 . . . .

From Aaron Rubin, Co-Editor, Socially Aware, and Partner at Morrison & Foerster:
Regarding the CDA Section 230 Safe Harbor

We noted previously that 2016 was a particularly rough year for Section 230 of the Communications Decency Act and the immunity that the statute provides website operators against liability arising from third-party or user-generated content. Now that 2017 is in the rear view mirror, Section 230 is still standing but its future remains imperiled. We have seen evidence of Section 230’s resiliency in recent cases where courts rejected plaintiffs’ creative attempts to find chinks in the immunity’s armor by arguing, for example, that websites lose immunity when they use data analytics to direct users to content, or when they fail to warn users of potential dangers, or when they share ad revenue with content developers. Nonetheless, it is clear that the knives are still out for Section 230, including in Congress, where a number of bills are under consideration that would significantly limit the safe harbor in the name of combatting sex trafficking. I predict that 2018 will only see these efforts to rein in Section 230 increase.