Thomson Reuters’ The Daily Docket interviewed Cecillia Xie about her popular TikToks, where she mentors followers and helps them navigate law school and legal careers.

“What I want to encourage young attorneys to do is take a step back from the environment of law school and think about what really makes them happy and what practices would make their career rewarding,” Cecillia said.

Read the full article.

  • On June 7, 2019, the highly controversial EU Copyright Directive (“Directive”) came into force, requiring EU Member States to transpose its provisions into national law by June 7, 2021.
  • To recap, the most relevant provisions of the Directive require the implementation of the following rules into national law:
    • Online content-sharing service providers’ liability for copyright-infringing content, Article 17
      Online content-sharing services are subject to direct liability for copyright-infringing content uploaded by their users if they fail to prove that they made “best efforts” to obtain the rights-holder’s authorization or fail to demonstrate that they made “best efforts” to ensure the unavailability of such content. They are also liable if they fail to act expeditiously to take down uploads of works for which they have received a takedown notice.
    • Exceptions and limitations to copyright protection
      The Directive introduces exceptions and limitations (e.g., for text and data mining (incl. in favor of commercial enterprises)); provisions regarding collective licensing; and recall, transparency, and fair remuneration rights for authors.
    • Ancillary copyright for press publishers, Article 15
      Press publishers are granted an ancillary copyright for press publications, covering the reproduction and making available of such content by information society service providers (excluding only hyperlinks accompanied by “individual words or very short extracts”).
  • The German Federal Ministry of Justice and Consumer Protection’s latest draft act (Referentenentwurf) for the national implementation of the Directive was leaked in September 2020 (“German Draft Act”).  In summary, the current German Draft Act proposes to implement Article 17 as follows, deviating in part from the EU Copyright Directive:
    • De minimis use: In contrast to both the EU Copyright Directive and the German Copyright Act, the German Draft Act additionally provides for a de minimis copyright exemption for non-commercial minor uses (such as uses of up to 20 seconds), against a statutory license fee to be paid to collecting societies. This is the most heavily criticized provision of the German Draft Act and makes it very likely that the draft will be revised again by the German legislature shortly.
    • Flagging: Furthermore, without any legal basis in the Directive, and in order to avoid the risk of over-blocking, the German Draft Act provides that users shall be technically enabled to flag their content as (i) contractually authorized or (ii) authorized based on copyright exemptions, if such content is identified to them as blocked content. If content is flagged, the provider is not obligated to block or remove it unless the flagging is obviously incorrect.
    • Online content-sharing service providers’ liability for copyright-infringing content: The German Draft Act follows the provisions of the Directive on the scope of liability for online content-sharing service providers.
    • Licensing: Going beyond the Directive, the Draft Act imposes a unilateral obligation on online service providers to contract with representative rights-holders. Effectively, online service providers will have to accept licenses available through a collecting society or a major rights-holder under certain conditions such as the appropriateness of the requested remuneration.
    • Blocking and Removing: If a rights-holder has provided corresponding information to an online service provider, the provider is obligated to block non-authorized uses of the rights-holder’s work (“stay down”). Similarly, following a rights-holder’s request after a work has already been uploaded without authorization, online service providers are obligated to remove such work (“take down”) and to block the work in the future (“stay down”). In effect, the German Draft Act thereby embraces the use of upload filters.
    • Copyright Exemptions: The Draft Act expressly determines copyright exemptions under the German Copyright Act (e.g., caricature, parody, pastiche) as being applicable.

Continue Reading EU Copyright Directive – Quo Vadis: First Steps Towards its German Implementation

A federal district court judge in Brooklyn, N.Y., dismissed the complaint in a case filed by Genius, a platform that lets users share and annotate lyrics, holding that the plaintiff’s claims were preempted by copyright law. The suit alleged that Google had stolen from Genius transcriptions of song lyrics, and included those song lyrics in Google’s website boxes when Google users search for a song. Such actions, Genius alleged, amounted to “unfair and anticompetitive practices.” Genius did not allege copyright infringement, however, because the relevant songwriters and publishers, not Genius, own the copyright in the song lyrics at issue.

In an effort to help bloggers and other online writers who create and publish more short-form content than they have time to register with the Copyright Office, the office has established a new copyright registration option that affords a simplified way to obtain copyright protection for blog entries, social media posts, web articles, and even comments to social media posts if they meet certain criteria. Those criteria include the requirement that each of the works separately contains between 50 and 17,500 words. Also, all the works in a single application must have been created by the same individual, or have been created jointly by the same group of individuals. All of the works also must first have been published as part of a website or online platform, such as a blog, online magazine, or social networking site.

In response to the Federal Communications Commission’s request for comments on potential changes to §230 of the Communications Decency Act, AT&T—which now owns Time Warner—is telling the FCC to shrink the protections that §230 affords technology companies from liability for content posted by third parties.

In a criminal case in which an alleged murderer sought to exonerate himself by seeking the gunshot-wound victim’s Facebook messages, the California Supreme Court declined to answer the constitutional question of whether a social media platform’s refusal to turn over messages violates a criminal defendant’s right to a fair trial. The court’s reasoning: In this particular case, the validity of the underlying subpoena was questionable.

YouTube has re-enabled monetization of the account of ultra-conservative commentator Steven Crowder, who—among other things—for years targeted former Vox writer and current YouTuber Carlos Maza, making insulting comments about Maza’s ethnicity and sexuality. The Washington Post reports that several of the platform’s moderators said that “their recommendations to strip advertising from videos that violate the site’s rules were frequently overruled by higher-ups within YouTube when the videos involved higher profile content creators who draw more advertising.”

The Florida Fourth District Court of Appeal reversed a cyberstalking injunction against Florida lawyer Ashley Ann Krapacs. The injunction was imposed on Krapacs for publishing disparaging posts on social media about another lawyer, Nisha Bacchus, including posts Krapacs published during a four-hour period in one day while Bacchus “untagged herself in real time.” Read the brief explanation of the court’s reasoning.

In Elliott v. Donegan, a federal district court in New York held that Section 230 of the Communications Decency Act does not warrant the dismissal of a defamation claim where the plaintiff’s complaint did not “foreclose[] the possibility that Defendant created or developed the allegedly unlawful content.” The content at issue was a “publicly accessible, shared Google spreadsheet” titled “Shitty Media Men” that defendant Moira Donegan had created, containing a list of men in the media business who had allegedly committed some form of sexual violence.

Donegan, a media professional, circulated the list via email and other electronic means to women in her industry in an effort to share information about “people to avoid.” The list included the headings “NAME, AFFILIATION, ALLEGED MISCONDUCT, and NOTES.” It also included a disclaimer at the top of the spreadsheet stating, “This document is only a collection of misconduct allegations and rumors. Take everything with a grain of salt.”

After Donegan first published the list, “Shitty Media Men” assumed a life of its own. The spreadsheet went viral and a flurry of allegations were swiftly submitted. Soon, 70 men were named and major media outlets were preparing to publish related stories. As a result, the defendant took the spreadsheet offline about 12 hours after she posted it. Continue Reading EDNY Refuses to Dismiss on § 230 Grounds in “Shitty Media Men” Defamation Case

It is another win for social media platforms in the realm of Section 230 of the Communications Decency Act. In a case of first impression within the Third Circuit, the Eastern District of Pennsylvania in Hepp v. Facebook ruled that social media platforms are immune under the Communications Decency Act from state law right of publicity claims based on content posted by users of such platforms.

Karen Hepp, a television news anchor for FOX 29 News, filed a complaint against several social media platforms, including Facebook, Imgur, Reddit, and Giphy (collectively, “social media defendants”), alleging that the social media defendants violated Pennsylvania’s right of publicity statute and Hepp’s common law right of publicity, based on such defendants’ “unlawful use of her image.”

Two years before filing her complaint, Hepp discovered that a photograph of her was taken without her consent by a security camera in a New York City convenience store. The photograph was subsequently used in online advertisements for erectile dysfunction and dating websites. For example, Hepp’s photograph was featured: (a) on Imgur under the heading “milf,” and (b) on a Reddit post titled “Amazing” in the subgroup r/obsf (“older but still $#^able”). Hepp alleged that, as a public figure, she suffered harm from the unauthorized publication of her image on the platforms hosted by the social media defendants, but she did not allege that such defendants created, authored, or directly published the photograph at issue.

Continue Reading District Court in 3rd Circuit Sides with 9th Circuit: §230 Protects Social Platforms from State Law Intellectual Property Claims

Expressing concern about the spread of disinformation related to COVID-19, Federal Trade Commissioner Rohit Chopra said Congress may need “to reassess the special privileges afforded to tech platforms, especially given their vast power to curate and present content in ways that may manipulate users.” His words implicate one of our favorite topics here at Socially Aware: Section 230 of the Communications Decency Act, which generally protects websites from liability for content posted by third parties.

Florida Governor Ron DeSantis signed into law legislation requiring Florida state agencies, local governments and firms that contract with them to use the E-Verify system, an online database operated by the U.S. Department of Homeland Security that can confirm a person’s eligibility to work in the United States.

In the wake of a series of tweets insulting the family of Turkey’s President, Recep Tayyip Erdogan, the president submitted legislation to parliament that would require social media companies with more than 1 million daily users in Turkey to appoint someone responsible for, among other things, responding to the company’s alleged violations of privacy laws.

China’s recently imposed security law—which outlaws subversion, secession, terrorism and colluding with foreign forces—had many Hong Kongers “scrubbing their social media accounts.”

Pop-up brokers tried to capitalize on the scarcity of personal protective equipment—especially masks—meant to safeguard people against COVID-19 by connecting with suppliers on LinkedIn. Learn what makes that social media platform convenient for sellers and scammers interested in sourcing goods.

In a suit that Socially Aware covered last year, a federal district court in New York was able to avoid addressing whether a defamation claim against television show host Joy Reid should be dismissed based on the safe harbor in Section 230 of the Communications Decency Act. The reason: The plaintiff in the suit failed to prove actual malice, which is required to succeed on a defamation claim against a public figure. On appeal, the U.S. Court of Appeals for the Second Circuit in July 2020 once again passed on the opportunity to answer the retweet question, holding: “[The plaintiff’s] initial complaint included Reid’s retweet of the Vargas tweet; but since [the plaintiff] later dropped that claim, we need not decide whether a retweet qualifies for Section 230 immunity. Nor are we called to decide whether Section 230 protects a social media user who copies verbatim (and without attribution) another user’s post, a question that may be complicated by issues as to malice and status as a public figure.”

This opinion piece argues that, because facial recognition technology can be inaccurate and biased, and “privacy-encroaching facial recognition companies rely on social media platforms to scrape and collect user facial data,” social media platforms should add a “do-not-track” option for users’ faces.

Foreign websites that use geotargeted advertising may be subject to personal jurisdiction in the United States, even if they have no physical presence in the United States and do not specifically target their services to the United States, according to a new ruling from the Fourth Circuit Court of Appeals.

In UMG Recordings, Inc. v. Kurbanov, twelve record companies sued Tofig Kurbanov, who owns and operates two websites that enable visitors to rip audio tracks from videos on various platforms, like YouTube, and convert the audio tracks into downloadable files.

The record companies sued Kurbanov for copyright infringement and argued that a federal district court in Virginia had specific personal jurisdiction over Kurbanov because of his contacts with Virginia and with the United States more generally. Kurbanov moved to dismiss for lack of personal jurisdiction, and the district court granted his motion. Continue Reading Stretching the Bounds of Personal Jurisdiction, 4th Circuit Finds Geotargeted Advertising May Subject Foreign Website Owner to Personal Jurisdiction in the U.S.

In a purported attempt to safeguard free speech, President Trump has issued an order, “Preventing Online Censorship,” that would eliminate the protections afforded by one of our favorite topics here at Socially Aware: Section 230 of the Communications Decency Act, which generally protects online platforms from liability for content posted by third parties. President Trump issued the order after Twitter tagged some of his tweets as misleading and linked the tweets to text contradicting their substance. Democratic presidential candidate Joseph Biden is also in favor of abolishing the law, but for very different reasons. Find out what they are.

There’s more than one way to strip a law of its teeth. In what Politico describes as “the latest GOP-led plan to target Section 230” of the Communications Decency Act, Senator Josh Hawley is proposing legislation that would make websites eligible for the Section 230 protections only if they stop the sale of ads that target users based on behavioral data.

And a Circuit Court in Virginia dismissed a suit that Representative Devin Nunes, a California Republican, brought against Twitter for defamation stemming from two parody accounts—@DevinNunesMom, and @DevinCow—that posted unflattering things about Rep. Nunes during President Trump’s impeachment hearings last year. The court held that Section 230 of the Communications Decency Act insulated Twitter from liability.

After the first Supreme Court oral argument held over the phone, Justice Ruth Bader Ginsburg wrote in a majority opinion that “adding ‘.com’ to a generic word can make the entire combination eligible for trademark protection.”

Twitter’s new “request verification” option will allow run-of-the-mill users to acquire the blue checkmark next to their names that the platform formerly reserved for public figures. Read about how else Twitter plans to improve its verification system.

Tik My Day, a marketing agency that TikTok launched in Australia, says it can provide all the services necessary for a branded TikTok campaign within 24 hours.

Users of the dating app OkCupid can now add a #BlackLivesMatter badge to their profiles. The social media firm also donated $1 million in advertising to Black American civil rights organizations, and added several social-justice-related questions to its matching process. How did the app’s users answer? Read the statistics.

Eric Akira Tate spoke to TechRepublic about how businesses should think about establishing or updating corporate social media policies to account for the changing standards, especially as the U.S. is in the midst of a civil rights movement.

“Reviewing social media policies so that there are no misunderstandings about what use of social media is acceptable or not for an employer is all the more appropriate to do now,” Eric said, adding that employers should have policies in place to prevent employees from inadvertently or purposefully interfering with the employer’s desired image.

“A policy puts employees on notice of what conduct or use of social media is allowed and not allowed within the parameters of their employment and work for the employer,” he said. “It allows employers to have more consistent enforcement in the event of improper behavior.”

Eric also recommended reviewing existing social media policies to cover the increase in remote work: “Being careful about taking selfies with computer screens or work papers in the background, for example, could inadvertently reveal confidential business information.”

Read the full article.

Online service providers typically seek to mitigate risk by including arbitration clauses in their user agreements. For such agreements to be effective, however, they must be implemented properly. Babcock v. Neutron Holdings, Inc., a recent Southern District of Florida case involving a plaintiff who was injured while riding one of the defendant’s Lime e-scooters, illustrates that courts will closely scrutinize the details of how an online contract is presented to users to determine whether or not it is enforceable. Continue Reading Sweating the Details: Court Analyzes User Interface to Uphold Online Arbitration Clause