In Ganske v. Mensch, a defamation suit stemming from a “battle by Tweet,” a federal district court in New York held that the allegedly defamatory statements in the defendant’s Tweet were nonactionable statements of opinion and dismissed the case. The case illustrates that courts in such “Twibel” (Twitter + libel) cases may view Tweets and similar statements on social media as informal and “freewheeling” in nature, which reasonable readers would understand to be expressions of opinion rather than statements of fact.

Charles Ganske, a former Associated Press (AP) journalist, sued Louise Mensch, a blogger and former member of the British Parliament, for defamation and tortious interference. Ganske argued that Mensch defamed him and interfered with his employment at AP based on a single Tweet that she posted on July 27, 2018, by which she “interjected herself” into a Twitter thread between Ganske and another Twitter user with the handle @Conspirator0.

Mensch’s Tweet from her @patribotics Twitter account stated: “To this xenophobic tweet of yours, sir, I fear we must tell @APCentral ‘citation needed’. You clearly personally spread Russian bots on your own site; and @Conspirator0 work on it has sent you into a frenzy of tweeting and trying to discredit him.”

Ganske claimed that Mensch’s Tweet contained false and defamatory statements about him because neither he nor his Tweets were xenophobic and he never spread Russian bots on any website. He also alleged that Mensch deliberately tagged his employer, AP, and published the Tweet to @APCentral in order to interfere with his employment. Ganske’s employment with AP was later terminated, and Ganske argued that this was the result of Mensch’s Tweet. Continue Reading S.D.N.Y. Dismisses Defamation Case Arising Out of “Battle by Tweet”

A recent ruling in Parziale v. HP, Inc., arising out of the implementation by Hewlett-Packard (“HP”) of a remote firmware update on many models of the company’s printers, highlights the potentially broad application of the Computer Fraud and Abuse Act (“CFAA”). It also serves as a reminder to technology companies that when distributing software and firmware updates, they must be mindful of providing specific advance notice that such updates may impact product or computer performance.

The Computer Fraud and Abuse Act

In 1986, Congress enacted the CFAA, reportedly in response to concerns arising from the Matthew Broderick film WarGames, in which a teenage computer hacker accesses a U.S. Defense Department computer, unintentionally starts the launch sequence on the U.S. nuclear arsenal thinking it is a computer game, and comes close to starting World War III. Saving us all from annihilation, Broderick teaches the computer that when it comes to global thermonuclear war, “the only winning move is not to play.”

The CFAA is the primary computer crime law in the United States. Over the years, it has been amended several times and has broad application. The CFAA criminalizes fraud and certain other specified activities in connection with unauthorized access to computers. The CFAA also provides for civil remedies based on the same prohibited conduct. Continue Reading Avoiding Claims Under the Computer Fraud and Abuse Act in Connection with Software and Firmware Updates

In an attempt to shut down free speech online, Turkey enacted a law that requires social media platforms with more than a million daily users in Turkey to open an office there or assign a representative who is legally accountable to Turkish authorities. Among other things, the law also requires companies to respond within two days to complaints about posts that “violate personal and privacy rights.” Learn what social media companies risk if they don’t comply.

Significantly deviating from his former choices for FCC Commissioner, President Trump nominated Nathan Simington to replace current Republican Commissioner Mike O’Reilly. Unlike previous nominees, who were not heavily involved in technology policy, Simington “played a significant role in drafting” an order instructing the FCC to limit Section 230 of the Communications Decency Act’s protections for technology companies, according to The Verge. Find out why Trump revoked O’Reilly’s nomination for a third term.

Spending on advertising in general plummeted in Q2 of 2020 compared to the same quarter last year, with newspaper ad spend dropping by nearly 50% and radio ad spend dropping by about 42%. On the other hand, the dip in social media ad spend was far less significant. Analysts attribute advertisers’ continued interest in social media to COVID-19, saying that social media use skyrocketed during quarantine. Find out how little social media’s advertising revenue dropped in Q2 2020.

Some of YouTube’s top earners are children. France just passed a law to limit the hours they work and place their earnings in a bank account until they turn 16. You won’t believe the staggering amounts of cash some of these kids rake in.

Harvard Law School has a new social media policy precluding students from posting statements made in class together with enough information to make the speaker identifiable by someone who was not present in class. Learn the school’s reason for implementing this new policy.

A private investigator in Sacramento, California—where a new law classifies COVID-19 as an “injury” under workers’ compensation when certain circumstances apply—is using people’s social media posts as evidence that they didn’t contract COVID-19 at work.

“Doomswiping” is the latest dating-app trend.

The USPTO recently released the report “Public Views on Artificial Intelligence and Intellectual Property Policy”. The report is part of the USPTO’s effort to engage with the innovation community and experts on AI and to promote innovation of AI through appropriate intellectual property incentives.

The report includes the analysis of nearly 200 responses received from individuals and organizations to federal notices published in August and October 2019 to solicit public comments on patenting AI inventions and the impact of AI on other areas of intellectual property policy. The USPTO requested feedback on issues such as whether current laws and regulations regarding patent inventorship and authorship of copyrighted work should be revised to take into account contributions other than by natural persons.

AI in Evolution. As an initial matter, commenters noted that AI has no universally recognized definition, and any definition used as part of an AI policy must be dynamic enough to evolve as AI technology evolves. Some suggested that the USPTO revisit the question of non-human inventions when artificial general intelligence (AGI)—AI that mimics human intelligence—is a reality and not just “purely hypothetical.”

Sufficient and Not Necessary. The majority of respondents took the view that current U.S. IP laws provide sufficient protection for development using current AI technology. To many, existing contract law principles can be used to adequately fill in any gaps as AI technology further advances. Generally, commenters were divided on the need for new intellectual property rights to address AI inventions. Those focusing on new protections were concerned mostly with data, with some suggesting that advances in AI warrant more protection for data rights, including sui generis protection.

Human Not Machine Inventors. With respect to patents, commenters agreed in large part that, for now, humans, not machines, must be inventors. Further, most agreed that only a natural person or a company, through an assignment, should be considered the owner of a patent or an invention, although some suggested extending ownership to those who train an AI process or own or control an AI system. Other respondents were concerned about the practical effects of recognizing non-natural inventors, e.g., how would a machine sign an oath?

Continue Reading In the Public Eye: USPTO Issues Report on AI

Thomson Reuters’ The Daily Docket interviewed Cecillia Xie about her popular TikToks, where she mentors followers and helps them navigate law school and legal careers.

“What I want to encourage young attorneys to do is take a step back from the environment of law school and think about what really makes them happy and what practices would make their career rewarding,” Cecillia said.

Read the full article.

SUMMARY
  • On June 7, 2019, the highly controversial EU Copyright Directive (“Directive”) came into force, requiring EU Member States to transpose its provisions into national law by June 7, 2021.
  • To recap, the most relevant provisions of the Directive require the implementation of the following rules into national law:
    • Online content-sharing service providers’ liability for copyright-infringing content, Article 17
      Online content-sharing services are subject to direct liability for copyright-infringing content uploaded by their users if they fail to prove that they made “best efforts” to obtain the rights-holder’s authorization or fail to evidence that they made “best efforts” to ensure the unavailability of such content. They are also liable if they fail to act expeditiously to take down uploads of work for which they have received a takedown notice.
    • Exceptions and limitations to copyright protection
      The Directive introduces exceptions and limitations (e.g., for text and data mining, including in favor of commercial enterprises); provisions regarding collective licensing; and recall, transparency, and fair remuneration rights for authors.
    • Ancillary copyright for press publishers, Article 15
      Press publishers are granted an ancillary copyright for press publications, covering the reproduction and making available of such content by information society service providers (excluding only hyperlinks accompanied by “individual words or very short extracts”).
  • The German Federal Ministry of Justice and Consumer Protection’s latest draft act (Referentenentwurf) for the national implementation of the Directive was leaked in September 2020 (“German Draft Act”).  In summary, the current German Draft Act proposes to implement Article 17 as follows, deviating in part from the EU Copyright Directive:
    • De minimis use: In contrast to both the EU Copyright Directive and the German Copyright Act, the German Draft Act additionally provides for a de minimis copyright exemption for non-commercial minor uses, such as uses of up to 20 seconds (against a statutory license fee to be paid to collecting societies). This is the most criticized provision of the German Draft Act and makes it very likely that the draft will be revised again by the German legislature shortly.
    • Flagging: Furthermore, without any legal basis in the Directive, and in order to avoid the risk of over-blocking, the German Draft Act provides that users shall be technically enabled to flag their content as (i) contractually authorized or (ii) authorized based on copyright exemptions, if such content is identified to them as blocked content. If content is flagged, the provider is not obligated to block or remove the content unless the flagging is obviously incorrect.
    • Online content-sharing service providers’ liability for copyright-infringing content: The German Draft Act follows the provisions of the Directive on the scope of liability for online content-sharing service providers.
    • Licensing: Going beyond the Directive, the Draft Act imposes a unilateral obligation on online service providers to contract with representative rights-holders. Effectively, online service providers will have to accept licenses available through a collecting society or a major rights-holder under certain conditions such as the appropriateness of the requested remuneration.
    • Blocking and Removing: If a rights-holder has provided corresponding information to an online service provider, online service providers are obligated to block non-authorized uses of rights-holder’s work (“stay down”). Similarly, following a rights-holder’s request after a work has already been uploaded without authorization, online service providers are obligated to remove such work (“take down”) and to block the work in the future (“stay down”). Factually, the German Draft Act thereby embraces the use of upload filters.
    • Copyright Exemptions: The Draft Act expressly determines copyright exemptions under the German Copyright Act (e.g., caricature, parody, pastiche) as being applicable.

Continue Reading EU Copyright Directive – Quo Vadis: First Steps Towards its German Implementation

A federal district court judge in Brooklyn, N.Y., dismissed the complaint in a case filed by Genius, a platform that lets users share and annotate lyrics, holding that the plaintiff’s claims were preempted by copyright law. The suit alleged that Google had stolen transcriptions of song lyrics from Genius and displayed those lyrics in information boxes on Google’s search results pages when users search for a song. Such actions, Genius alleged, amounted to “unfair and anticompetitive practices.” Genius did not allege copyright infringement, however, because the relevant songwriters and publishers, not Genius, own the copyright in the song lyrics at issue.

In an effort to help bloggers and other online writers who create and publish more short-form content than they have time to register with the Copyright Office, the office has established a new copyright registration option that affords a simplified way to obtain copyright protection for blog entries, social media posts, web articles, and even comments to social media posts if they meet certain criteria. Those criteria include the requirement that each of the works separately contains between 50 and 17,500 words. Also, all the works in a single application must have been created by the same individual, or have been created jointly by the same group of individuals. All of the works also must first have been published as part of a website or online platform, such as a blog, online magazine, or social networking site.

In response to the Federal Communications Commission’s request for comments on potential changes to §230 of the Communications Decency Act, AT&T—which now owns Time Warner—is telling the FCC to shrink the protections that §230 affords technology companies from liability for content posted by third parties.

In a criminal case in which an alleged murderer sought to exonerate himself by seeking the gun-shot-wound victim’s Facebook messages, the California Supreme Court refused to answer the Constitutional question of whether a social media platform’s refusal to turn over messages violates the alleged criminal’s right to a fair trial. The court’s reasoning: In this particular case, the validity of the underlying subpoena was questionable.

YouTube has re-enabled monetization of the account of ultra-conservative commentator Steven Crowder, who—among other things—for years targeted former Vox writer and current YouTuber Carlos Maza, making insulting comments about Maza’s ethnicity and sexuality. The Washington Post reports that several of the platform’s moderators said that “their recommendations to strip advertising from videos that violate the site’s rules were frequently overruled by higher-ups within YouTube when the videos involved higher profile content creators who draw more advertising.”

The Florida Fourth District Court of Appeal reversed a cyberstalking injunction against Florida lawyer Ashley Ann Krapacs. The injunction was imposed on Krapacs for publishing disparaging posts on social media about another lawyer, Nisha Bacchus, including the posts Krapacs published during a four-hour period in one day while Bacchus “untagged herself in real time,” according to Law.com. Read the brief explanation of the court’s reasoning.

In Elliott v. Donegan, a federal district court in New York held that Section 230 of the Communications Decency Act does not warrant the dismissal of a defamation claim where the plaintiff’s complaint did not “foreclose[] the possibility that Defendant created or developed the allegedly unlawful content.” The content at issue was a “publicly accessible, shared Google spreadsheet” titled “Shitty Media Men” that the defendant Moira Donegan had started containing a list of men in the media business who had allegedly committed some form of sexual violence.

Donegan, a media professional, circulated the list via email and other electronic means to women in her industry in an effort to share information about “people to avoid.” The list included the headings “NAME, AFFILIATION, ALLEGED MISCONDUCT, and NOTES.” It also included a disclaimer at the top of the spreadsheet stating, “This document is only a collection of misconduct allegations and rumors. Take everything with a grain of salt.”

After Donegan first published the list, “Shitty Media Men” assumed a life of its own. The spreadsheet went viral and a flurry of allegations were swiftly submitted. Soon, 70 men were named and major media outlets were preparing to publish related stories. As a result, the defendant took the spreadsheet offline about 12 hours after she posted it. Continue Reading EDNY Refuses to Dismiss on § 230 Grounds in “Shitty Media Men” Defamation Case

It is another win for social media platforms in the realm of the Communications Decency Act’s Section 230. In a case of first impression within the Third Circuit, the Eastern District of Pennsylvania in Hepp v. Facebook ruled that social media platforms are immune under the Communications Decency Act for right of publicity violations under state law by users of such platforms.

Karen Hepp, a television news anchor for FOX 29 News, filed a complaint against several social media platforms, including Facebook, Imgur, Reddit, and Giphy (collectively, “social media defendants”), alleging that the social media defendants violated Pennsylvania’s right of publicity statute and Hepp’s common law right of publicity, based on such defendants’ “unlawful use of her image.”

Two years before filing her complaint, Hepp discovered that a photograph of her was taken without her consent by a security camera in a New York City convenience store. The photograph was subsequently used in online advertisements for erectile dysfunction and dating websites. For example, Hepp’s photograph was featured: (a) on Imgur under the heading “milf,” and (b) on a Reddit post titled “Amazing” in the subgroup r/obsf (“older but still $#^able”). Hepp alleged that, as a public figure, she suffered harm from the unauthorized publication of her image on the platforms hosted by the social media defendants, but she did not allege that such defendants created, authored, or directly published the photograph at issue.

Continue Reading District Court in 3rd Circuit Sides with 9th Circuit: §230 Protects Social Platforms from State Law Intellectual Property Claims

Expressing concern about the spread of disinformation related to COVID-19, Federal Trade Commissioner Rohit Chopra said Congress may need “to reassess the special privileges afforded to tech platforms, especially given their vast power to curate and present content in ways that may manipulate users.” His words implicate one of our favorite topics here at Socially Aware: Section 230 of the Communications Decency Act, which generally protects websites from liability for content posted by third parties.

Florida Governor Ron DeSantis signed into law legislation requiring Florida state agencies, local governments and firms that contract with them to use the E-Verify system, an online database operated by the U.S. Department of Homeland Security that can confirm a person’s eligibility to work in the United States.

In the wake of a series of tweets insulting the family of Turkey’s President, Recep Tayyip Erdogan, the president submitted legislation to parliament that would require social media companies with more than 1 million daily users in Turkey to appoint someone responsible for, among other things, responding to the company’s alleged violations of privacy laws.

China’s recently imposed security law—which outlaws subversion, secession, terrorism and colluding with foreign forces—had many Hong Kongers “scrubbing their social media accounts.”

Pop-up brokers tried to capitalize on the scarcity of personal protective equipment—especially masks—meant to safeguard people against COVID-19 by connecting with suppliers on LinkedIn. Learn what makes that social media platform convenient for sellers and scammers interested in sourcing goods.

In a suit that Socially Aware covered last year, a federal district court in New York was able to avoid addressing whether a defamation claim against television show host Joy Reid should be dismissed based on the safe harbor in Section 230 of the Communications Decency Act. The reason: The plaintiff in the suit failed to prove actual malice, which is required to succeed on a defamation claim against a public figure. On remand, the U.S. Court of Appeals in July 2020 once again passed on the opportunity to answer the re-tweet question, holding, “[The plaintiff’s] initial complaint included Reid’s retweet of the Vargas tweet; but since [the plaintiff] later dropped that claim, we need not decide whether a retweet qualifies for Section 230 immunity. Nor are we called to decide whether Section 230 protects a social media user who copies verbatim (and without attribution) another user’s post, a question that may be complicated by issues as to malice and status as a public figure.”

This opinion piece argues that, because facial recognition technology can be inaccurate and biased, and “privacy-encroaching facial recognition companies rely on social media platforms to scrape and collect user facial data,” social media platforms should add a “do-not-track” option for users’ faces.