Computer scientist and legal scholar Nick Szabo first proposed the idea of “smart contracts” in 1996. Szabo published his initial paper on the topic in a publication called Extropy, a journal of transhumanism, a movement seeking to enhance human intellect and physiology by means of sophisticated technologies. At the time, the idea was nothing if not futuristic.

Fast forward 22 years, and even if the actual use of smart legal contracts remains largely in the future, the idea of them has gone mainstream. What follows is our list of the top five things you need to know about this quickly evolving area.

  1. Their Name Is Somewhat Confusing

When lawyers speak of contracts, they generally mean agreements that are intended to be legally enforceable. In contrast, when most people use the term “smart contract” they’re not referring to a contract in the legal sense, but instead to computer coding that may effectuate specified results based on “if, then” logic.

Advocates of smart legal contracts envision a day when coding will automatically exercise real-world remedies if one of the parties to a smart contract fails to perform. For example, if a borrower under an auto loan were to fail to make a payment, coding within the smart loan agreement could automatically trigger a computer controlling the car to prevent the borrower from driving it, or could cause the car to drive autonomously to the lender’s garage.
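To make that “if, then” logic concrete, here is a minimal, purely hypothetical sketch of how a coded loan agreement might decide on a remedy. The grace period, dates, and vehicle-control action below are invented for illustration and are not drawn from any real platform.

```python
from datetime import date, timedelta

# Hypothetical smart loan agreement: if a payment is missed past a grace
# period, then a coded remedy (disabling the vehicle) is triggered.
GRACE_PERIOD = timedelta(days=10)  # invented term, for illustration only

def loan_action(last_payment_received: date, payment_due: date, today: date) -> str:
    """Return the action the coded agreement would take today."""
    if last_payment_received >= payment_due:
        return "no_action"       # borrower has performed
    if today <= payment_due + GRACE_PERIOD:
        return "send_reminder"   # late, but not yet in default
    return "disable_vehicle"     # the remedy clause executes automatically

# Example: the June 1 payment never arrived, and it is now June 15.
print(loan_action(date(2018, 5, 1), date(2018, 6, 1), date(2018, 6, 15)))
# -> "disable_vehicle"
```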

Even then, whether coding itself could ever satisfy the requirements of a legally binding contract is up for debate. Continue Reading Five Things to Know About Smart Contracts

Most companies are familiar with the Children’s Online Privacy Protection Act (COPPA) and its requirement to obtain parental consent before collecting personal information online from children under 13.  Yet COPPA also includes an information deletion requirement of which companies may be unaware.  On May 31, 2018, the Federal Trade Commission (FTC) published a blog post addressing this requirement, clarifying (i) when children’s personal information must be deleted and (ii) how the requirement applies, as well as (iii) recommending that covered companies review their information retention policies to ensure they are in compliance.

(i) COPPA’s information deletion requirement.  The FTC clarifies that, under Section 312.10 of the COPPA Rule, companies may retain children’s personal information “for only as long as is reasonably necessary to fulfill the purpose for which the information was collected.”  After that, a company must use reasonable measures to ensure such personal information is securely destroyed.

(ii) Application of the deletion requirement to children’s outdated subscription information.  In its post, the FTC applies the deletion requirement to the example of a subscription-based app directed to children under 13.  If the subscription period ends, and a parent decides not to renew the service, can the company keep the child’s personal information?  The answer, the FTC confirms, is “no”:  the information is no longer “reasonably necessary” to provide the app’s services, so it must be deleted.  This is true regardless of whether a parent affirmatively requests deletion.

(iii) Recommendation to review information retention policies in light of the deletion requirement.  The FTC recommends that companies review their information retention policies with COPPA’s deletion requirement in mind.  It lists questions to help guide companies as they navigate this requirement:

  • What types of personal information are you collecting from children?
  • What is your stated purpose for collecting the information?
  • How long do you need to hold onto the information to fulfill the purpose for which it was initially collected? For example, do you still need information you collected a year ago?
  • Does the purpose for using the information end with an account deletion, subscription cancellation, or account inactivity?
  • When it’s time to delete information, are you doing it securely?

Key takeaway.  If a company possesses personal information collected online from a child under 13, and the information no longer serves the purpose for which it was collected, the company must delete it.  Companies should review their information retention policies to ensure compliance with this COPPA requirement.
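For companies that automate retention, the FTC’s questions above translate roughly into a periodic deletion check. The following is a minimal sketch under assumed facts; the record fields, the 90-day inactivity cutoff, and the secure-destruction step are hypothetical examples, not values prescribed by the FTC.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ChildRecord:
    user_id: str
    subscription_active: bool
    last_activity: date

# Hypothetical retention cutoff; each company must tie this value to the
# stated purpose for which the information was originally collected.
INACTIVITY_CUTOFF_DAYS = 90

def should_delete(record: ChildRecord, today: date) -> bool:
    """Flag records that no longer serve their original purpose."""
    if record.subscription_active:
        return False  # still reasonably necessary to provide the service
    return (today - record.last_activity).days >= INACTIVITY_CUTOFF_DAYS

# Records flagged here would then be passed to a secure-destruction routine.
```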

*       *       *

For more on the Children’s Online Privacy Protection Act, please read the following Socially Aware posts: FTC Issues Substantially Revised COPPA Rule: Review of Changes and Compliance Tips; and Mobile App Legal Terms & Conditions: Six Key Considerations.

Finding that President Trump’s Twitter feed constitutes a public forum, a federal judge in New York City held that it’s a First Amendment violation when the President or one of his assistants blocks a Twitter user from viewing or responding to one of the President’s tweets. As the New York Times points out, the decision “is likely to have implications far beyond Mr. Trump’s feed and its 52 million followers.” A blog post on the online version of the monthly magazine Reason provides some tips for politicians with social media accounts who want to stay on the right side of the law.

Speaking of President Trump, the former secretary of a state court judge is claiming the President got her fired. Okay, not exactly. The secretary, Olga Zuniga, who worked for a judge on Texas’s highest criminal court, filed a lawsuit alleging that the judge—a member of the GOP—terminated her employment because he found Facebook posts in which Zuniga criticized President Trump’s and other Republican politicians’ immigration policies. A post on Popehat, a fellow ABA Web 100 honoree, explores the strength of Zuniga’s case.

Unless you’ve been living in a cave, you know that the EU’s General Data Protection Regulation (GDPR) took effect last Friday, May 25th. Now that the dust has cleared, if you are interested in up-to-date information regarding GDPR developments and compliance insights, check out our GDPR Readiness Center. If you want details on what GDPR means for your outsourcing and other vendor agreements, you might want to attend our upcoming webinar.

The impact of GDPR is being felt across social media platforms in all sorts of ways. For example, in a move reportedly prompted by GDPR, Twitter has shut down the accounts of users who were under 13 when they joined the platform, based on date-of-birth information those users voluntarily provided during registration.

Facing an inbox full of companies’ privacy policy updates? You can blame that on the GDPR too. In fact, the onslaught of GDPR-induced privacy-policy updates inspired some pretty creative memes on Twitter.

Wait… the GDPR will also affect tourists taking photos with their phones?

Instagram is expanding its anti-bullying initiatives by using a machine-learning algorithm to filter out harassing comments and reviewing the accounts with an especially high number of blocked comments to determine whether the owners of those accounts have violated the platform’s community guidelines.

The still-unprofitable Snapchat will begin running six-second advertisements that its users will not be able to skip. These un-skippable commercials will not run during users’ personal stories, only during select Snapchat Shows—highly produced three-to-five minute programs from well-known entertainment companies.

The fascinating story of how Wired lost a small fortune in Bitcoin. . . . (Well, the bitcoins are still there, but the key has been destroyed.)

The Royal Wedding was a bigger topic on Pinterest than it was on Facebook. FastCompany speculates that it’s because Pinterest’s audience is predominantly women and reveals the subject of most of the Royal Wedding pins.

This is the famous monkey selfie.

I confess: I have mixed emotions regarding the iconic “monkey-selfie” photo and all the hubbub it has created.

Don’t get me wrong; I think monkeys are wonderful, and the photo deserves its iconic status. Who can resist smiling while viewing that famous image of Naruto, the macaque monkey who allegedly snapped the self-portrait?

And the monkey selfie has been a boon to legal blogs. Our own posts regarding the photo have been among the most viewed content on Socially Aware (one of our posts prompted a call from my mother, who felt strongly that Naruto should be entitled to a copyright in the photo).

But, let’s face it, in an era when technology disruption is generating so many critical and difficult copyright issues, the law relevant to the monkey selfie is pretty straightforward, at least in the United States. As the U.S. Copyright Office states in its Compendium II of Copyright Office Practices, for a work to be copyrightable, it must “owe its origin to a human being”; materials produced solely by nature, by plants, or by animals do not qualify. U.S. courts have reached the same conclusion. (Although I note that David Slater, the nature photographer whose camera was used to take the photo, claims that he—and not the macaque—is in fact the author of the photo for copyright purposes.) Continue Reading Monkey-Selfie Case Returns—To Court & (Maybe) a Theater Near You

With the effective date of the EU’s General Data Protection Regulation (GDPR) less than one month away, companies subject to the GDPR are racing to comply with the regulation’s requirements. But, for those companies, May 25 doesn’t represent a finish line as much as it does a starting gate.

In the coming months, as the most thorough and efficient methods of complying with the GDPR’s requirements come to light, the compliance processes that companies rushed to implement will need to evolve and change.

Do your company’s GDPR-compliance practices require an overhaul or just a few minor tweaks? Find out at Morrison & Foerster’s Data Protection Masterclass, a webinar that will help you to avoid wasting your organization’s precious resources by busting GDPR myths.

Join Socially Aware contributors Miriam Wugmeister, Christine Lyon, Alex van der Wolk, and Alja Poler De Zwart on Tuesday, June 19, from 12:00 pm until 1:00 pm ET to learn about data processors’ obligations, the GDPR’s impact on outsourcing and vendor agreements, and more. If you are interested in attending this webinar, please register here. There is no charge to attend.

A recent decision of the German Federal Court of Justice may have a significant impact on content providers’ business models. According to the court, offering software that allows users to block advertising does not constitute an unfair commercial practice. Nor does giving advertisers the option to pay to have certain ads shown—a practice known as whitelisting—violate the unfair competition rules.

Issued on April 19, the decision involved a legal dispute between the ad blocking software provider Eyeo GmbH and the online-content provider Axel Springer (which also happens to be Germany’s largest publishing house). The decision overruled the Higher Regional Court of Cologne’s previous decision, which, like the Federal Court of Justice, did not categorize Eyeo’s offer of its ad blocking product as an unfair competition practice, but did categorize paid whitelisting as unlawful.

Axel Springer is now left with the final option of taking the case to the Federal Constitutional Court.

Background and core arguments of the parties

Eyeo, a German software company, offers the product AdBlock Plus, which allows Internet users to block ads online. The product became the most popular ad blocking software in Germany and abroad, with over 500 million downloads and 100 million users worldwide.

In 2011, the company started to monetize its product by offering a whitelisting service that gives advertisers the option to pay to show their ads. To get on Eyeo’s list of companies whose ads are not blocked, advertisers have to comply with Eyeo’s “acceptable advertising” conditions and share their ad revenue with the company. The conditions dictate the advertising’s features such as its placement, size, and—in the case of text advertising—color. Continue Reading German Federal Court: Unfair Competition Law No Basis to Ban Ad Blocking and Whitelisting
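Reduced to its mechanics, the model described above is a default-block rule with a paid exception list. The sketch below is a hypothetical simplification; the domains and the format check are invented and do not reflect Eyeo’s actual criteria.

```python
# Hypothetical whitelist of advertisers that have accepted the
# "acceptable advertising" conditions and the revenue-sharing terms.
WHITELISTED_ADVERTISERS = {"example-ads.com", "another-network.net"}

def should_block(ad_domain: str, meets_format_rules: bool) -> bool:
    """Block by default; show the ad only if the advertiser is whitelisted
    and the ad satisfies the placement, size, and format conditions."""
    return not (ad_domain in WHITELISTED_ADVERTISERS and meets_format_rules)

print(should_block("example-ads.com", True))   # -> False (ad is shown)
print(should_block("unlisted-ads.com", True))  # -> True  (ad is blocked)
```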

We’re proud to announce that the online platform JD Supra has named Socially Aware co-founder and co-editor John Delaney as the recipient of one of its 2018 Readers’ Choice Awards. John was chosen from among the nearly 50,000 writers who publish on JD Supra as the leading author of content in the emerging field of artificial intelligence.

As witnesses of John’s passion for emerging technologies and the law, we at Socially Aware aren’t at all surprised that JD Supra and its readers appreciate John’s articles and blog posts. But we’re glad they do.

Based on copyright infringement, emotional distress and other claims, a federal district court in California awarded $6.4 million to a victim of revenge porn, the posting of explicit material without the subject’s consent. The judgment is believed to be one of the largest awards relating to revenge porn. A Socially Aware post that we wrote back in 2014 explains the difficulties of using causes of action like copyright infringement—and state laws—as vehicles for fighting revenge porn.

The highest court in New York State held that whether a personal injury plaintiff’s Facebook photos are discoverable does not depend on whether the photos were set to “private,” but rather on “the nature of the event giving rise to the litigation and the injuries claimed, as well as any other information specific to the case.”

A federal district court held that Kentucky’s governor did not violate the free speech rights of two Kentucky citizens when he blocked them from commenting on his Facebook page and Twitter account. The opinion underscores differences among courts as to the First Amendment’s application to government officials’ social media accounts; for example, a Virginia federal district court’s 2017 holding reached the opposite conclusion in a case involving similar facts.

Having witnessed social media’s potential to escalate gang disputes, judges in Illinois have imposed limitations on some juvenile defendants’ use of popular social media platforms, a move that some defense attorneys argue violates the young defendants’ First Amendment rights.

A bill proposed by California State Sen. Bob Hertzberg would require social media platforms to identify bots—automated accounts that appear to be owned by real people but are actually computer programs capable of simulating human dialog. Bots can spread a message across social media faster and more widely than would be humanly possible, and have been used in efforts to manipulate public opinion.

This CIO article lists the new strategies, job titles and processes that will be popular this year among businesses transforming into data-driven enterprises.

A solo law practitioner in Chicago filed a complaint claiming defamation and false light against a former client who, she alleges, posted a Yelp review calling her a “con artist” and a “legal predator” after she billed $9,000 to his credit card for a significant amount of legal work, allegedly pursuant to the terms of his retainer.

Carnival Cruise Line put up signs all over the hometown of the 15-year-old owner of the Snapchat handle @CarnivalCruise in order to locate him and offer him and his family a luxurious free vacation in exchange for the transfer of his Snapchat handle—and the unusual but innovative strategy paid off. Who knew that old-school billboards could be so effectively used for one-on-one marketing?

Does a search engine operator have to delist websites hosting, without authorization, your trade secret materials or other intellectual property? The answer may depend on where you sue—just ask Google. The U.S. District Court for the Northern District of California recently handed the company a victory over plaintiff Equustek Solutions Inc. in what has turned into an international battle where physical borders can have very real consequences on the Internet.

The dispute began when a rival company, Datalink, allegedly misappropriated Equustek’s trade secrets in developing competing products. Equustek also alleged that Datalink misled customers who thought they were buying Equustek products. In 2012, Equustek obtained numerous court orders in Canada against Datalink. Datalink refused to comply, and a Canadian court issued an arrest warrant for the primary defendant, who has yet to be apprehended. Continue Reading The Coming Border Wars: U.S. Court Decision Refusing to Enforce Canadian Court Order Highlights the Growing Balkanization of the Internet

Geo-blocking is the practice of preventing Internet users in one jurisdiction from accessing services elsewhere based on the user’s geographic location. The European Commission wants to eliminate geo-blocking within the EU—and has taken a significant step forward in its plans to do so by clearing key votes in the EU legislative process.
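In technical terms, geo-blocking is usually a simple gate keyed to the country inferred from a visitor’s IP address or billing details. The sketch below illustrates the kind of check the new rules would largely prohibit for intra-EU trade; the blocked-country list is hypothetical, and the IP-to-country lookup is left as an assumed placeholder.

```python
# Hypothetical configuration: countries this storefront refuses to serve.
BLOCKED_COUNTRIES = {"FR", "IT"}

def country_from_ip(ip_address: str) -> str:
    """Placeholder for an IP-geolocation lookup (implementation assumed)."""
    raise NotImplementedError

def is_geo_blocked(country_code: str) -> bool:
    """Return True if the storefront refuses or redirects this visitor."""
    return country_code in BLOCKED_COUNTRIES

print(is_geo_blocked("FR"))  # -> True: visitor is refused or rerouted
print(is_geo_blocked("DE"))  # -> False: visitor may shop normally
```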

By the end of 2018, we expect that online retailers will need to ensure that they phase out the use of geo-blocking across the EU except in limited circumstances.

These changes are part of a wider programme of reform affecting all businesses operating in the Technology, Media, and Telecoms sectors in Europe.

Background

The European Commission launched its Digital Single Market (“DSM”) strategy in May 2015. We have written a number of articles following the DSM’s progress: at its inception, one year in, and in 2017 following a mid-term review.

Continue Reading EU Regulation Reform—Unjustified Geo-Blocking to Be Phased Out by End of 2018