Based on copyright infringement, emotional distress and other claims, a federal district court in California awarded $6.4 million to a victim of revenge porn (the posting of explicit material without the subject’s consent). The judgment is believed to be one of the largest awards relating to revenge porn. A Socially Aware post that we wrote back in 2014 explains the difficulties of using causes of action like copyright infringement—and state laws—as vehicles for fighting revenge porn.

The highest court in New York State held that whether a personal injury plaintiff’s Facebook photos are discoverable does not depend on whether the photos were set to “private,” but rather on “the nature of the event giving rise to the litigation and the injuries claimed, as well as any other information specific to the case.”

A federal district court held that Kentucky’s governor did not violate the free speech rights of two Kentucky citizens when he blocked them from commenting on his Facebook page and Twitter account. The opinion underscores differences among courts as to the First Amendment’s application to government officials’ social media accounts; a Virginia federal district court, for example, reached the opposite conclusion in 2017 in a case involving similar facts.

Having witnessed social media’s potential to escalate gang disputes, judges in Illinois have imposed limitations on some juvenile defendants’ use of popular social media platforms, a move that some defense attorneys argue violates the young defendants’ First Amendment rights.

A bill proposed by California State Sen. Bob Hertzberg would require social media platforms to identify bots—automated accounts that appear to be owned by real people but are actually computer programs capable of simulating human dialog. Bots can spread a message across social media faster and more widely than would be humanly possible, and have been used in efforts to manipulate public opinion.

This CIO article lists the new strategies, job titles and processes that will be popular this year among businesses transforming into data-driven enterprises.

A solo law practitioner in Chicago filed a complaint claiming defamation and false light against a former client who she alleges posted a Yelp review calling her a “con artist” and a “legal predator” after she billed $9,000 to his credit card, allegedly pursuant to the terms of his retainer, for a significant amount of legal work.

Carnival Cruise Line put up signs all over the hometown of the 15-year-old owner of the Snapchat handle @CarnivalCruise in order to locate him and offer him and his family a luxurious free vacation in exchange for the transfer of his Snapchat handle—and the unusual but innovative strategy paid off. Who knew that old-school billboards could be so effectively used for one-on-one marketing?

Does a search engine operator have to delist websites hosting, without authorization, your trade secret materials or other intellectual property? The answer may depend on where you sue—just ask Google. The U.S. District Court for the Northern District of California recently handed the company a victory over plaintiff Equustek Solutions Inc. in what has turned into an international battle where physical borders can have very real consequences on the Internet.

The dispute began when a rival company, Datalink, allegedly misappropriated Equustek’s trade secrets in developing competing products. Equustek also alleged that Datalink misled customers who thought they were buying Equustek products. In 2012, Equustek obtained numerous court orders in Canada against Datalink. Datalink refused to comply, and a Canadian court issued an arrest warrant for the primary defendant, who has yet to be apprehended. Continue Reading The Coming Border Wars: U.S. Court Decision Refusing to Enforce Canadian Court Order Highlights the Growing Balkanization of the Internet

Geo-blocking is the practice of preventing Internet users in one jurisdiction from accessing services elsewhere based on the user’s geographic location. The European Commission wants to eliminate geo-blocking within the EU—and has taken a significant step forward in its plans to do so by clearing key votes in the EU legislative process.
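
For readers curious about the mechanics, geo-blocking is typically implemented by mapping a visitor’s IP address to a country and then refusing or redirecting the request. The sketch below is purely illustrative and is not any particular retailer’s implementation; the lookupCountry helper, the blocked-country list, and the response text are assumptions invented for the example.

```typescript
import express, { Request, Response, NextFunction } from "express";

// Hypothetical helper that maps an IP address to an ISO country code,
// e.g., via a local GeoIP database. Stubbed here for illustration only.
function lookupCountry(ip: string): string | undefined {
  return ip.startsWith("192.0.2.") ? "DE" : undefined; // placeholder logic
}

const BLOCKED_COUNTRIES = new Set(["DE", "FR"]); // example markets to refuse

const app = express();

// Middleware that refuses requests based on the caller's geolocated IP,
// the kind of behavior the proposed EU rules would largely prohibit.
app.use((req: Request, res: Response, next: NextFunction) => {
  const country = lookupCountry(req.ip ?? "");
  if (country && BLOCKED_COUNTRIES.has(country)) {
    res.status(403).send("This shop is not available in your country.");
    return;
  }
  next();
});

app.get("/", (_req, res) => res.send("Welcome to the shop."));
app.listen(3000);
```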

By the end of 2018, we expect that online retailers will need to phase out the use of geo-blocking across the EU except in limited circumstances.

These changes are part of a wider programme of reform affecting all businesses operating in the Technology, Media, and Telecoms sectors in Europe.

Background

The European Commission launched its Digital Single Market (“DSM”) strategy in May 2015. We have written a number of articles following the DSM’s progress: at its inception, one year in, and in 2017 following a mid-term review.

Continue Reading EU Regulation Reform—Unjustified Geo-Blocking to Be Phased Out by End of 2018

Section 230 of the Communications Decency Act continues to act as one of the strongest legal protections that social media companies have to avoid being saddled with crippling damage awards based on the misdeeds of their users.

The strong protections afforded by Section 230(c) were recently reaffirmed by Judge Caproni of the Southern District of New York, in Herrick v. Grindr. The case involved a dispute between the social networking platform Grindr and an individual who was maliciously targeted through the platform by his former lover. For the unfamiliar, Grindr is a mobile app directed to gay and bisexual men that, using geolocation technology, helps them to connect with other users who are located nearby.

Plaintiff Herrick alleged that his ex-boyfriend set up several fake profiles on Grindr that claimed to be him. Over a thousand users responded to the impersonating profiles. Herrick’s ex-boyfriend, pretending to be Herrick, would then direct the men to Herrick’s workplace and home. The ex-boyfriend, still posing as Herrick, would also tell these would-be suitors that Herrick had certain rape fantasies, that he would initially resist their overtures, and that they should attempt to overcome Herrick’s initial refusals. The impersonating profiles were reported to Grindr (the app’s operator), but Herrick claimed that Grindr did not respond, other than to send an automated message. Continue Reading Lawsuit Against Online Dating App Grindr Dismissed Under Section 230 of the Communications Decency Act

In a decision that has generated considerable controversy, a federal court in New York has held that the popular practice of embedding tweets into websites and blogs can result in copyright infringement. Plaintiff Justin Goldman had taken a photo of NFL quarterback Tom Brady, which Goldman posted to Snapchat. Snapchat users “screengrabbed” the image for use in tweets on Twitter. The defendants—nine news outlets—embedded tweets featuring the Goldman photo into online articles so that the photo itself was never hosted on the news outlets’ servers; rather, it was hosted on Twitter’s servers (a process known as “framing” or “inline linking”). The court found that, even absent any copying of the image onto their own servers, the news outlets’ actions had resulted in a violation of Goldman’s exclusive right to authorize the public display of his photo.
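
To make the technology concrete: with embedding, the publisher’s page does not contain a copy of the photo; it contains only a reference that the reader’s browser resolves against Twitter’s servers. The sketch below is a simplified illustration of that arrangement using Twitter’s publicly available widgets.js embed script; the tweet URL is a placeholder, and this is not the defendants’ actual code.

```typescript
// Simplified illustration of "inline linking" / "framing": the publisher's
// page stores no copy of the photo, only a reference to the tweet that the
// reader's browser resolves against Twitter's servers at page-load time.
// The tweet URL below is a placeholder, not the tweet at issue in the case.
const TWEET_URL = "https://twitter.com/example_user/status/1234567890";

function embedTweet(container: HTMLElement): void {
  // The article markup contains only a blockquote pointing at the tweet...
  const quote = document.createElement("blockquote");
  quote.className = "twitter-tweet";
  const link = document.createElement("a");
  link.href = TWEET_URL;
  quote.appendChild(link);
  container.appendChild(quote);

  // ...plus Twitter's widget script, which fetches and renders the tweet
  // (including any photo) from Twitter's servers, not the publisher's.
  const script = document.createElement("script");
  script.src = "https://platform.twitter.com/widgets.js";
  script.async = true;
  container.appendChild(script);
}

embedTweet(document.body);
```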

If legislation recently introduced in California passes, businesses with apps or websites requiring passwords and enabling Golden State residents younger than 18 to share content could be prohibited from asking those minors to agree to the site’s or the app’s terms and conditions of use.

After a lawyer was unable to serve process by delivering court documents to a defendant’s physical and email addresses, the Ontario Superior Court granted the lawyer permission to serve process by mailing a statement of claim to the defendant’s last known address and by sending the statement of claim through private messages to the defendant’s Instagram and LinkedIn accounts. This is reportedly the first time an Ontario court has permitted service of process through social media. The first time we at Socially Aware heard of a U.S. court permitting a plaintiff to serve process on a domestic, U.S.-based defendant through a social media account was back in 2014.

Videos that superimpose the faces of celebrities and non-famous people onto porn performers’ bodies to produce believable results have surfaced on the Internet, and are on the verge of proliferating. Unlike the non-consensual dissemination of explicit photos that haven’t been manipulated—sometimes referred to as “revenge porn”—this fake porn is technically not a privacy issue, and making it illegal could raise First Amendment issues.

By mining datasets and social media to recover millions of dollars lost to tax fraud and errors, the IRS may be violating common law and the Electronic Communications Privacy Act, according to an op-ed piece in The Hill.

A woman is suing her ex-husband, a sheriff’s deputy in Georgia, for having her and her friend arrested and briefly jailed for posting on Facebook about his alleged refusal to drop off medication for his sick children on his way to work. The women had been charged with “criminal defamation of character,” but the case was ultimately dropped after a state court judge ruled there was no basis for the arrest.

During a hearing in a Manhattan federal court over a suit brought by seven Twitter users who say President Trump blocked them on Twitter for having responded to his tweets, the plaintiffs’ lawyer compared Twitter to a “virtual town hall” where “blocking is a state action and violates the First Amendment.” A Justice Department lawyer, on the other hand, analogized the social media platform to a convention where the presiding official can decide whether or not to engage with someone. The district court judge who heard the arguments declined to decide the case on the spot and encouraged the parties to settle out of court.

Have your social media connections been posting headshots of themselves alongside historical portraits of people who look just like them? Those posts are the product of a Google app that matches the photo of a person’s face to a famous work of art, and the results can be fun. But not for people who live in Illinois or Texas, where the app isn’t available. Experts believe that’s because laws in those states restrict how companies can use biometric data.

The stock market is apparently keeping up with the Kardashians. A day after Kim Kardashian’s half-sister Kylie Jenner tweeted her frustration with Snapchat’s recent redesign, the company’s market value decreased by $1.3 billion.

In February, the U.S. Supreme Court heard oral arguments in United States v. Microsoft. At issue is Microsoft’s challenge to a warrant issued by a U.S. court directing it to produce emails stored in Ireland. With implications for government investigations, privacy law, and multi-national tech companies’ ability to compete globally, the case has attracted significant attention.

Over the course of the oral arguments it became clear that rendering a decision in United States v. Microsoft would require the justices to choose between two less-than-satisfactory outcomes: denying the U.S. government access to necessary information, or potentially harming U.S. technology companies’ ability to operate globally.

The conundrum the justices face is largely due to the fact that the 1986 law at issue, the Stored Communications Act (SCA), never envisioned the kind of complex, cross-border data storage practices of today.

Find out more about the case and how recently introduced legislation known as the CLOUD Act could wind up superseding the Court’s decision in United States v. Microsoft by, among other things, clarifying the SCA’s applicability to foreign-stored data while also providing technology companies with a new vehicle for challenging certain orders that conflict with the laws of the country where data is stored.

Read my article in Wired.

In the last few years, as advertising has followed consumers from legacy media such as television to online video and social media platforms, the Federal Trade Commission has been attempting to ensure that participants in this new advertising ecosystem understand the importance of complying with the FTC’s “Guides Concerning the Use of Endorsements and Testimonials in Advertising,” or the endorsement guides. The endorsement guides require advertisers and endorsers (also referred to as influencers) to, among other things, clearly and conspicuously disclose when the advertiser has provided an endorser with any type of compensation in exchange for an endorsement.

A failure to make appropriate disclosures may be a violation of Section 5 of the Federal Trade Commission Act, which prohibits unfair or deceptive acts or practices. In recent enforcement actions, press releases, guidance, closing letters and letters sent directly to endorsers (including prominent public figures), the FTC has made clear its belief that: (1) appropriate disclosures by influencers are essential to protecting consumers; and (2) in too many instances, such disclosures are absent from celebrity or other influencer endorsements. Continue Reading The FTC’s Quest for Better Influencer Disclosures

In U.S. copyright law circles, one of the hottest topics of debate is the degree to which the fair use doctrine—which allows for certain unauthorized uses of copyrighted works—should protect companies building commercial products and services based on content created by others, especially where such products or services are making transformative uses of such content.

This debate is likely to become even more heated in the wake of the Second Circuit Court of Appeals’ issuance last week of its long-awaited decision in the copyright dispute between Fox News and TVEyes, in which the court sided with the copyright owner over the creator of a digital “search engine” for identifying and viewing television content. But regardless of which side of the debate you are on (or if you are just standing on the sidelines), the court’s decision provides important guidance on the scope of the fair use doctrine as applied to commercial products and services.

The Dispute

Using the closed-captioning data that accompanies most television programming, TVEyes provides a searchable database of video clips. TVEyes’ subscribers—who pay $500 a month—can search the database for keywords in order to identify and view video clips from the service; such video clips may be as long as ten minutes in duration.
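
In other words, the service is essentially a full-text search over caption transcripts keyed to recorded clips. The toy sketch below illustrates the general idea only; it is not TVEyes’ actual system, and the ClipRecord shape and the sample data are invented for the example.

```typescript
// Toy illustration of a caption-indexed clip search. Purely illustrative;
// this is not TVEyes' implementation, and the data below is made up.
interface ClipRecord {
  clipId: string;
  channel: string;
  airedAt: string;     // ISO timestamp of the broadcast
  captionText: string; // closed-captioning transcript for the clip
}

// Return the clips whose closed-caption text contains the keyword.
function searchClips(index: ClipRecord[], keyword: string): ClipRecord[] {
  const needle = keyword.toLowerCase();
  return index.filter((clip) => clip.captionText.toLowerCase().includes(needle));
}

const index: ClipRecord[] = [
  { clipId: "clip-1", channel: "Example News A", airedAt: "2014-07-01T13:00:00Z",
    captionText: "...panel discussion of the fair use doctrine..." },
  { clipId: "clip-2", channel: "Example News B", airedAt: "2014-07-01T14:00:00Z",
    captionText: "...markets opened higher this morning..." },
];

console.log(searchClips(index, "fair use").map((c) => c.clipId)); // ["clip-1"]
```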

In July 2013, Fox sued TVEyes for copyright infringement and, in August 2015, Judge Hellerstein of the U.S. District Court for the Southern District of New York held that the key features of the TVEyes service are protected under the fair use doctrine. Continue Reading All Eyes on Fair Use: The Second Circuit Delivers a Victory for Copyright Owners

As we have noted previously, YouTube users sometimes object when the online video giant removes their videos based on terms-of-use violations, such as artificially inflated view counts. In a recent California case, Bartholomew v. YouTube, LLC, the court rejected a user’s claim that the statement YouTube posted after it removed her video, which allegedly gave the impression that the video contained offensive content, was defamatory.

Joyce Bartholomew is a musician who creates what she calls “original Christian ministry music.” Ms. Bartholomew produced a video for the song “What Was Your Name” and posted the video on YouTube in January 2014. YouTube assigned a URL to the video, which Ms. Bartholomew began sharing with her listeners and viewers. She claims that, by April 2014, the video had amassed over 30,000 views.

Shortly afterwards, however, YouTube removed the video and replaced it with the image of a “distressed face” and the following removal statement: “This video has been removed because its content violated YouTube’s Terms of Service.” The removal statement also provided a hyperlink to YouTube’s “Community Guideline Tips,” which identifies 10 categories of prohibited content: “Sex and Nudity,” “Hate Speech,” “Shocking and Disgusting,” “Dangerous Illegal Acts,” “Children,” “Copyright,” “Privacy,” “Harassment,” “Impersonation” and “Threats.” Continue Reading California Court Holds That YouTube’s Removal Notice Is Not Defamatory

The music industry came out on top in one of its first attempts to hold an internet service provider liable for its subscribers’ unauthorized peer-to-peer file sharing.

The decision, handed down by the Fourth Circuit Court of Appeals in a dispute between BMG Rights Management and Cox Communications, outlines the obligations an ISP must fulfill to receive safe harbor protection under the Digital Millennium Copyright Act for a subscriber’s infringement. It also explains when an ISP can be held contributorily liable for its subscribers’ actions.

Read my full analysis here.