New York City’s Conflicts of Interest Board has issued guidelines prohibiting elected officials from using official social media accounts for political purposes or having their staff draft content for their personal social media accounts.
Congress has begun paving the way for the deployment of autonomous vehicles.
Twitter has begun temporarily limiting account features for users the company identifies as abusive.
Google Maps now allows users to create lists and share them with followers.
U.S. and Canadian companies can post job openings to their Facebook pages for free.
Thanks to millennials, online chat rooms are making a comeback.
With Tinder’s acquisition of the video-sharing startup Wheel, a video-dating revival is likely in store, too.
Yelp will be offering a new feature that allows users to ask questions about a venue.
Reportedly used by political operatives ranging from White House staffers to EPA workers, encrypted messaging apps have become popular in Washington, and their use is raising legal questions.
Warren Buffett is getting into the wearables game.
Some tips for small businesses on how to manage a social media presence.
In a major development for cloud and other data storage providers, and one that further complicates the legal landscape for the cross-border handling of data, a federal magistrate judge in the Eastern District of Pennsylvania ruled for the Department of Justice and ordered Google, Inc., to comply with two search warrants for foreign-stored user data. The order was issued on February 3, 2017, pursuant to the Stored Communications Act (SCA), and the court’s reasoning rested heavily on its statutory analysis of the SCA. The ruling is a marked departure from a recent, high-profile Second Circuit decision holding that Microsoft could refuse to comply with a similar court order for user data stored overseas.
The SCA regulates how service providers like Google and Microsoft that store user data can disclose user information. The magistrate judge issued two warrants under the SCA for emails sent from Google users in the United States to recipients in the United States. Google refused to comply fully, invoking Microsoft, and the Government moved to compel. In its briefing, Google argued that the SCA can reach only data stored in the United States and that, because Google constantly shuffles “shards” of incomplete user data among its servers across the world, it could never know for certain what information is stored domestically and what is stored overseas. Therefore, Google argued, the data sought under the warrants was beyond the reach of the SCA.
Can the mere offering of a mobile app subject the provider of such app to the privacy laws of countries in the European Union (EU)—even if the provider does not have any establishments or presence in the EU? The answer from the District Court of The Hague to that question is yes. The court confirmed on November 22, 2016, that app providers are subject to the Dutch Privacy Act by virtue of the mere offering of an app that is available on the phones of users in the Netherlands, even if they have no establishment or employees there.
Context. EU privacy laws generally apply on the basis of two triggers: (i) if a company has a physical presence in the EU (in the form of an establishment or office or otherwise) and that physical presence is involved in the collection or other handling of personal information; or (ii) if a company doesn’t have a physical presence but makes use of equipment and means located in the EU to handle personal information.
Well over a year after holding a workshop addressing privacy issues associated with cross-device tracking, Federal Trade Commission (FTC) staff have issued a report. The report sets the stage by describing how cross-device tracking works, noting its “benefits and challenges,” and reviewing (and largely commending) current industry self-regulatory efforts.
The report also makes recommendations, which—while building upon the staff’s traditional themes of transparency and choice—do not introduce any materially new suggestions for compliance.
The staff’s recommendations do not have the force of law, but they do indicate the steps that the staff believes a company should take in order to avoid a charge of unfairness or deception under Section 5 of the FTC Act.
A Quick Review of Cross-Device Tracking
As more consumers utilize multiple devices in their daily lives, more and more companies are using new technologies to attempt to ascertain that multiple devices are connected to the same person. This is generally done through the use of either deterministic information (e.g., by recognizing a user through the log-in credentials he or she uses across different devices) or probabilistic information (i.e., by inferring that multiple devices are used by the same person based on information about the devices, such as IP address, location, and activities on the devices).
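To make the distinction concrete, here is a minimal, purely illustrative sketch of the two matching approaches described above. The function names, device attributes, and threshold are hypothetical and are not drawn from the FTC staff report or any actual tracking product.

```python
# Illustrative sketch only: deterministic vs. probabilistic device matching.
# Attribute names ("login", "ip_address", "location") and the threshold
# are assumptions for demonstration, not a real tracking implementation.

def deterministic_match(device_a, device_b):
    # Deterministic: the same log-in credentials appear on both devices.
    return device_a["login"] is not None and device_a["login"] == device_b["login"]

def probabilistic_match(device_a, device_b, threshold=2):
    # Probabilistic: infer a link from shared signals such as IP address
    # and location; require at least `threshold` matching signals.
    signals = ["ip_address", "location"]
    score = sum(1 for s in signals if device_a.get(s) == device_b.get(s))
    return score >= threshold

phone = {"login": "user@example.com", "ip_address": "203.0.113.7", "location": "NYC"}
laptop = {"login": "user@example.com", "ip_address": "203.0.113.7", "location": "NYC"}

print(deterministic_match(phone, laptop))  # True: same log-in on both devices
print(probabilistic_match(phone, laptop))  # True: IP address and location both match
```

Real probabilistic systems weigh many more signals and use statistical models rather than a simple count, but the basic trade-off is the same: deterministic links are precise but require a shared identifier, while probabilistic links are broader but only inferred.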
A New Jersey court rules that state police can examine a suspect’s private social media messages without having to apply for an order under the state’s wiretapping laws.
Technology companies are remotely exercising a great deal of control over users’ devices, raising privacy issues.
Social media companies are reportedly considering adding police icons to their platforms that would allow police officers in the UK to access the chats of users who click the icons when they feel threatened.
With the help of bots and cyborgs, a 68-year-old Chicago retiree posts more than 1,000 pro-Republican messages to Twitter a day.
Interested in a good, basic—but comprehensive—overview of blockchain and its potential impact that can be read in one sitting? Here it is.
Here are some tips on how brands can use social artificial intelligence to their advantage.
Weary of all the political posts in your newsfeed? Try this app.
Guess which celebrity posted the first Instagram photo to win more than 7.2 million likes in less than 24 hours?
A robot developed by Google could be representative of how robots “look and operate in the future.” It’s also a little creepy.
In the wake of a successful social media conference in San Francisco, Socially Aware co-editors John Delaney and Aaron Rubin are revved up and ready to chair (John) and present (Aaron and John) at another Practicing Law Institute (PLI) 2017 Social Media conference! This one will be held in New York City on Wednesday, February 15, and will be webcast.
Attendees and webcast listeners will learn how to leverage social-media-marketing opportunities while minimizing their companies’ risks from entirely new panels of industry experts, lawyers and regulators.
Topics to be addressed will include:
- Key developments shaping social media law
- Emerging best practices for staying out of trouble
- Risk mitigation strategies regarding user-generated content and online marketing
- Legal considerations regarding use of personal devices and other workplace issues
Other special features of the conference include:
- Regulators panel: guidance on enforcement priorities for social media and mobile apps
- In-house panel: practical tips for handling real-world issues
- Potential ethical issues relating to the use of social media by attorneys
The conference will end with a networking cocktail reception—a great way to meet others who share your interest in social media, mobile apps and other emerging technologies.
Don’t miss this opportunity to get up-to-date information on the fast-breaking developments in the critical area of social media and mobile apps so that you can most effectively meet the needs of your clients.
For more information or to register, please visit PLI’s website here. We hope to see you there!
The latest issue of our Socially Aware newsletter is now available here.
In this edition, we examine a spate of court decisions that appear to rein in the historically broad scope of the Communications Decency Act’s Section 230 safe harbor for website operators; we outline ten steps companies can take to be better prepared for a security breach incident; we describe the implications of the Second Circuit’s recent opinion in Microsoft v. United States regarding the U.S. government’s efforts to require Microsoft to produce email messages stored outside the country; we explore the EU’s draft regulation prohibiting geo-blocking; and we take a look at UK Consumer Protection regulators’ efforts to combat undisclosed endorsements on social media.
All this—plus an infographic highlighting the most popular social-media-post topics in 2016.
Read our newsletter.
We have been monitoring a trend of cases narrowing the immunity provided to website operators under Section 230 of the Communications Decency Act (CDA). A recent decision by a state court in Georgia, however, demonstrates that Section 230 continues to be applied expansively in at least some cases.
The case, Maynard v. McGee, arose from an automobile collision in Clayton County, Georgia. Christal McGee, the defendant, had allegedly been using Snapchat’s “speed filter” feature, which tracks a car’s speed in real time and superimposes the speed on a mobile phone’s camera view. According to the plaintiffs, one of whom had been injured in the collision, McGee was using the speed filter when the accident occurred, with the intention of posting a video on Snapchat showing how fast she was driving. The plaintiffs sued McGee and Snapchat for negligence, and Snapchat moved to dismiss based on the immunity provided by Section 230.
The plaintiffs alleged that Snapchat was negligent because it knew its users would use the speed filter “in a manner that might distract them from obeying traffic or safety laws” and that “users might put themselves or others in harm’s way in order to capture a Snap.” To demonstrate that Snapchat had knowledge, the plaintiffs pointed to a previous automobile collision that also involved the use of Snapchat’s speed filter. The plaintiffs claimed that “[d]espite Snapchat’s actual knowledge of the danger from using its product’s speed filter while driving at excessive speeds, Snapchat did not remove or restrict access to the speed filter.”
Some industry observers are asking whether the post-inauguration tweets that President Trump is sending from his personal Twitter account may be subject to the same Presidential Records Act standards as official presidential communications.
Spending on mobile ads is expected to reach how much by 2021?!
Google recently banned 200 publishers from its AdSense network for either publishing fake news or impersonating real news organizations by using shortened top-level domains such as .co instead of .com.
Perhaps due in part to the fake news phenomenon, most Americans do not trust the news that they read on social media platforms, according to a recent study.
In an effort to keep users on the site, YouTube is testing an in-app messaging platform that allows users to chat about and comment on YouTube content.
Interested in how your brand can best respond to the complaints dissatisfied customers lodge on Twitter? Take a cue from Dippin’ Dots’ savvy response to the new White House Press Secretary’s multiple disses of the company’s product.
An artificially intelligent algorithm that can easily be adapted for smartphones might detect cancerous skin moles as effectively as a dermatologist.
A recent study reveals that the United States has more immigrant inventors than all other countries combined.
Google just made it possible to view an instantaneous Japanese-to-English translation by holding your smartphone in front of the relevant text.
These tools can help you to take a look beyond your personal social media bubble and understand how the other half thinks.
Congress enacted the Digital Millennium Copyright Act (“DMCA”) nearly two decades ago seeking to balance the needs of two factions: content creators, who were struggling to protect their intellectual property in the digital age, and fledgling Internet companies, which feared being held liable for the misdeeds of their customers.
For the Internet companies, Congress offered relief by creating a number of “safe harbors” shielding such companies from copyright-related damages arising from their customers’ infringing activities.
In particular, the DMCA established four distinct safe harbors for online service providers, each safe harbor aimed at a different type of online activity (i.e., transitory digital network communications; system caching; online hosting; and provision of information location tools) and each with its own set of eligibility requirements.
To qualify for any of these DMCA safe harbors, however, the DMCA requires that service providers “reasonably implement” a policy that provides for the termination of “repeat infringers” in “appropriate circumstances.”
Despite the threshold importance of repeat infringer policies, the DMCA left many questions unanswered. Who exactly counts as an “infringer”? Does it include every user accused of infringement or only those found culpable in court? If it’s somewhere in between, what level of proof is required before a service provider is required to take action? Can the repeat infringer policy differentiate between those who upload infringing content for others to copy and share and those who only download such content for their own personal viewing? And how many acts of infringement does it take to become a “repeat infringer” anyway?