  • Filling a gap. Gap Inc., the clothing retailer, is using social media to establish a web presence focused on feminism, equal pay for equal work, and progressive values. For example, the company produced an Instagram video, which it also shared on its other social media channels, to support equal pay for women. The video is an extension of a company initiative to highlight “the missing 23 cents” – a reference to the statistic that women earn just 77 cents for every dollar their male counterparts earn. This is a bold use of the Gap’s social media soapbox to promote its company values and, it appears, a very successful one.
  • Uber mess. A passenger in an Uber car in San Francisco was allegedly bashed in the head last month by the driver during an argument. The passenger suffered serious eye injuries and has said that he is likely to sue Uber. In similar cases, Uber has invoked Section 230 of the Communications Decency Act as a defense, claiming that it is merely an online marketplace and not a transportation provider. But while Section 230 has been interpreted broadly, even some of the statute’s staunchest defenders have questioned whether Uber can claim its protection in this case.
  • Tweeting your wish. Amazon and Twitter have just rolled out a new feature that enables consumers to use the new hashtag #AmazonWishList to add tweeted products to their Amazon Wish Lists, so long as they have first connected their Amazon and Twitter accounts. The companies appear to be betting that this new “wish list” functionality will be a natural extension of the way in which people already use Twitter to express interest in, and opinions about, products.

In 2012, the National Labor Relations Board (NLRB or the “Board”) found a “courtesy” policy unlawful. Since then, the NLRB has continued to create more and more tension between the National Labor Relations Act (NLRA or the “Act”) and employers’ legitimate interests in maintaining and enforcing workplace guidelines governing courtesy in a nondiscriminatory fashion.

This article focuses on the maintenance and enforcement of courtesy and civility rules. In these cases, the Board has taken extreme positions that increasingly ignore employers’ competing interests and obligations. Chief among the obligations that can conflict with Section 7 in this context is the duty to protect employees from harassment, including harassment based on sex and race, both by disciplining employees who make harassing comments or engage in harassing behavior and by maintaining civil workplaces that are not conducive to harassment. Employers also have a legitimate interest in maintaining a civil workplace simply to promote employee productivity and job satisfaction, as well as to ensure appropriate levels of customer service.

The Framework: Regulating Workplace Rules Under the NLRA

Employees have the right to engage in concerted activity under Section 7 of the NLRA. Concerted activity is activity undertaken for the employees’ mutual aid and protection, including, for example, discussing the terms and conditions of employment, such as wages, policies, and workplace treatment. Under Section 8(a)(1) of the Act, it is an unfair labor practice for an employer “to interfere with, restrain, or coerce employees in the exercise of the rights guaranteed in section 7.”

Under the general framework of the Act, the National Labor Relations Board regulates employer maintenance and enforcement of generally applicable workplace rules in several ways.

First, an employer commits an unfair labor practice under Section 8(a)(1) if it maintains a rule that would reasonably tend to chill employees in the exercise of their Section 7 rights. A rule that expressly restricts Section 7 activity is unlawful. A rule that does not expressly restrict Section 7 activity is still unlawful under Lutheran Heritage Village if “(1) employees would reasonably construe the language to prohibit Section 7 activity; (2) the rule was promulgated in response to union activity; or (3) the rule has been applied to restrict the exercise of Section 7 rights.” In reading a rule, the Board should “refrain from reading particular phrases in isolation.” Similarly, the Board should not seek out “arguable ambiguity . . . through parsing the language of the rule, viewing [a] phrase . . . in isolation, and attributing to the [employer] an intent to interfere with employee rights.” Lafayette Park Hotel.

Second, employers may not discipline employees for engaging in protected activity. In the event that “the very conduct for which employees are disciplined is itself protected concerted activity,” then the discipline violates Section 8(a)(1) regardless of the employer’s motive or a showing of animus. Burnup & Sims, Inc. Similarly, if an employee violates a workplace rule and is disciplined, the discipline is unlawful if the employee “violated the rule by (1) engaging in protected conduct or (2) engaging in conduct that otherwise implicates the concerns underlying Section 7 of the Act.” Continental Group, Inc.

Continue Reading The Death of Courtesy and Civility Under the National Labor Relations Act

  • Status check.  In the recently released Corporate Directors Survey from PricewaterhouseCoopers, 41% of corporate board members reported that their companies monitor social media for adverse publicity.  That’s up from 32% in 2012.  One commentator suggests that a company’s entire board of directors—not just the members of its audit or risk committees—should be charged with social media oversight, given the reputational risk social media chatter poses and the medium’s potential as an effective investor relations tool.
  • Fightin’ words?  An Indonesian law student landed in a police detention cell for criticizing a historic city online because police in that country suspected her of running afoul of the 2008 Law on Information and Electronic Transactions, Indonesian legislation that provides prison time for anyone convicted of using electronic media—including social media networks—“to intimidate or defame others.”  Many criticize the law as being inconsistent with Indonesia’s successful transition from an authoritarian state to a robust democracy.
  • The wrong number.  Twitter users sometimes give the social media company their cell phone numbers in order to be able to view tweets as text messages. But when a cell phone number that has been submitted to Twitter for that purpose is reassigned to a new user, do Twitter’s text messages to that number violate the Telephone Consumer Protection Act? Beverly Nunes claims they do. In a suit she filed in the U.S. District Court for the Northern District of California, Nunes is seeking class certification, and at least $500 in damages for each unsolicited Twitter text she received.  In a Sept. 16 motion to dismiss Nunes’s complaint, Twitter contends that the texts do not violate the TCPA because, among other things, they were not sent using an “automatic telephone dialing system or an artificial or prerecorded voice,” as the statute requires.

A 2013 CareerBuilder survey of hiring managers and human resource professionals reports that more than two in five companies use social networking sites to research job candidates. This interest in social networking does not end when the candidate is hired: to the contrary, companies are seeking to leverage the personal social media networks of their existing employees, as well as to inspect personal social media in workplace investigations.

As employer social media practices continue to evolve, individuals and privacy advocacy groups have grown increasingly concerned about employers intruding upon applicants’ or employees’ privacy by viewing restricted access social media accounts. A dozen states already have passed special laws restricting employer access to personal social media accounts of applicants and employees (“state social media laws”), and similar legislation is pending in at least 28 states. Federal legislation is also under discussion.

These state social media laws restrict an employer’s ability to access personal social media accounts of applicants or employees, to ask an employee to “friend” a supervisor or other employer representative and to inspect employees’ personal social media. They also have broader implications for common practices such as applicant screening and workplace investigations, as discussed below. Continue Reading Employer Access to Employee Social Media: Applicant Screening, ‘Friend’ Requests and Workplace Investigations

The latest issue of our Socially Aware newsletter is now available here.

In this issue of Socially Aware, our Burton Award-winning guide to the law and business of social media, we summarize the FFIEC’s recently-issued final guidance on social media use by financial institutions; we report on a new NLRB decision holding that particularly egregious social media postings by employees may fall outside the protections of the NLRA; we provide an update on the California Attorney General’s guidance regarding compliance with the state’s “do-not-track” disclosure requirements for websites; we discuss a recent case that calls into question the status of domain names as intangible property; we take a look at the latest in a string of cases exploring the First Amendment status of social media activity by government employees; and we highlight an important FTC settlement with a mobile app publisher related to data collection and sharing disclosures. All this plus a collection of surprising statistics about the most popular people, videos, tweets and hashtags of 2013.

Our global privacy + data security group’s Data Protection Masterclass Webinar series is turning the spotlight on social media marketing and policies in January.

Please join Socially Aware contributors Christine Lyon and Karin Retzer, along with Ann Bevitt in our London office for a webinar that will examine the laws and regulations in the United States and Europe relating to consumer-facing issues that arise from the use of social media in advertising and marketing. This presentation will also address the challenges that employers and employees face resulting from the use of social media in the workplace and in the recruitment process.

Topics Will Include:

  • Privacy issues for social media advertising, blogging and tweeting
  • Data sharing in relation to social plug-ins
  • Data protection requirements for social media market research
  • Targeting and analytics
  • Social media policies
  • Monitoring of social media use, including misuse of social media by employees
  • Use of social media in the application process

Date & Time:

Tuesday, January 21, 2014

4:30 p.m. – 6:00 p.m. GMT
11:30 a.m. – 1:00 p.m. EST
8:30 a.m. – 10:00 a.m. PST

Speakers:

Christine Lyon
Karin Retzer
Ann Bevitt

Registration:

To register for this webinar, please click here.

For more information, please contact Kay Burgess at kburgess@mofo.com or +44 20 7920 4067.

Social media platforms have become an increasingly important means for companies to build and manage their brands and to interact with their customers, in many cases eclipsing companies’ traditional “.com” websites. Social media providers typically make their platforms available to users without charge, but companies nevertheless invest significant time and other resources to create and maintain their presences on those providers’ platforms. A company’s social media page or profile and its associated followers, friends and other connections are often considered to be valuable business assets.

But who owns these valuable assets – the company or the individual employee who manages the company’s page or profile? Social media’s inherently interactive nature has created an important role for these individual employees. Such an employee essentially acts as the “voice” of the company and his or her style and personality may be essential to the success and popularity of that company’s social media presence. As a result, the lines between “company brand” and “personal brand” may become blurred over time. And when the company and the individual part ways, that blurring can raise difficult issues, both legal and logistical, regarding the ownership and valuation of business-related social media accounts.

Such issues have arisen in a number of recent cases, several of which we discuss below. Although these cases leave open a number of questions, the message to companies that use social media is loud and clear: it is imperative to proactively establish policies and practices that address ownership and use of business-related social media accounts.

PhoneDog v. Kravitz

A recently settled California case, PhoneDog v. Kravitz, Case No. C 11-03474 (N.D. Cal.), raised a number of interesting issues around the ownership and valuation of social media accounts. The defendant, Noah Kravitz, worked for the plaintiff, PhoneDog, a mobile news and reviews website. While he was employed by PhoneDog, Kravitz used the Twitter handle “@PhoneDog_Noah” to provide product reviews, eventually accumulating 17,000 Twitter followers over a period of approximately four and a half years. Kravitz then left PhoneDog to work for one of its competitors, but he maintained control of the Twitter account and changed the account handle to “@noahkravitz.” When Kravitz refused PhoneDog’s request to relinquish the Twitter account previously associated with the “@PhoneDog_Noah” handle, PhoneDog filed a complaint against Kravitz asserting various claims, including trade secret misappropriation, conversion, and intentional and negligent interference with economic advantage.

Kravitz filed a motion to dismiss the complaint based on a number of arguments, including PhoneDog’s inability to establish that it had suffered damages in excess of the $75,000 jurisdictional threshold. Kravitz also disputed PhoneDog’s ownership interest in either the Twitter account or its followers, based on Twitter’s terms of service, which state that Twitter accounts belong to Twitter and not to Twitter users such as PhoneDog. Finally, Kravitz argued that Twitter followers are “human beings who have the discretion to subscribe and/or unsubscribe” to the account and are not PhoneDog’s property, and asserted that “[t]o date, the industry precedent has been that absent an agreement prohibiting any employee from doing so, after an employee leaves an employer, they are free to change their Twitter handle.”

With respect to the amount-in-controversy issue, PhoneDog asserted that Kravitz’s continued use of the “@noahkravitz” handle resulted in at least $340,000 in damages, an amount that was calculated based on the total number of followers, the time during which Kravitz had control over the account, and a purported “industry standard” value of $2.50 per Twitter follower. Kravitz argued that any value attributed to the Twitter account came from his efforts in posting tweets and the followers’ interest in him, not from the account itself. Kravitz also disputed PhoneDog’s purported industry standard value of $2.50 per Twitter follower, and contended that valuation of the account required consideration of a number of factors, including (1) the number of followers, (2) the number of tweets, (3) the content of the tweets, (4) the person publishing the tweets, and (5) the person placing the value on the account.
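
To see how the parties’ competing positions translate into numbers, here is a minimal sketch of the apparent arithmetic behind PhoneDog’s claim. The 17,000-follower count and the purported $2.50 “industry standard” come from the case discussion above; the eight-month period is an assumption inferred from the $340,000 total rather than a figure stated in this article.

```python
# Sketch of the apparent arithmetic behind PhoneDog's claimed damages.
# The eight-month period is an assumption inferred from the $340,000 total;
# it is not a figure stated in the article above.

followers = 17_000                # followers of the disputed Twitter account
value_per_follower_month = 2.50   # purported "industry standard" ($ per follower per month)
months_of_control = 8             # assumed months Kravitz controlled the account after leaving

claimed_damages = followers * value_per_follower_month * months_of_control
print(f"Claimed damages: ${claimed_damages:,.0f}")  # -> Claimed damages: $340,000
```

Kravitz’s objections go directly to these inputs: if the per-follower figure, or the premise that the account (rather than his own efforts) generated the value, falls away, so does the total.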

With respect to the ownership issue, PhoneDog claimed that it had an ownership interest in the account based on the license to use and access the account granted to it in the Twitter terms of service, and that it also had an ownership interest in the content posted on the account. PhoneDog also pointed to a purported “intangible property interest” in the Twitter account’s list of followers, which PhoneDog compared to a business customer list. Finally, PhoneDog asserted that, regardless of any ownership interest in the account, PhoneDog was entitled to damages based on Kravitz’s interference with PhoneDog’s access to and use of the account, which (among other things) purportedly affected PhoneDog’s economic relations with its advertisers.

The court determined that the amount-in-controversy issue was intertwined with factual and legal issues raised by PhoneDog’s claims and, therefore, could not be resolved at the motion-to-dismiss stage. Accordingly, the court denied without prejudice Kravitz’s motion to dismiss for lack of subject matter jurisdiction. The court also denied Kravitz’s motion to dismiss PhoneDog’s trade secret and conversion claims, but granted Kravitz’s motion to dismiss PhoneDog’s claims of interference with prospective economic advantage.

The parties subsequently settled the dispute, so, unfortunately, we will never know how the court would have ruled on the variety of interesting issues that the case presented. Interestingly, although the terms of the settlement remain confidential, as of mid-September, Kravitz appears to have kept control of the Twitter account and its attendant followers. It is worth noting that the case might have been more straightforward—and the result more favorable to the company—had PhoneDog established clear policies regarding the ownership of business-related social media accounts.

Ardis Health, LLC et al. v. Nankivell

A New York case, Ardis Health, LLC et al. v. Nankivell, Case No. 11 Civ. 5013 (S.D.N.Y.), more clearly illustrates the fundamental point that companies should proactively establish policies and practices that address the ownership and use of business-related social media accounts.

The plaintiffs in Ardis Health were a group of closely affiliated online marketing companies that develop and market herbal and beauty products. The defendant was a former employee who had held a position at Ardis Health, LLC as a “Video and Social Media Producer.” Following her termination, the defendant refused to turn over to the plaintiffs the login information and passwords for the social media accounts that she had managed for the plaintiffs during her employment. The plaintiffs then filed a lawsuit against the defendant and sought a preliminary injunction to compel her, among other things, to provide them with that access information.

Fortunately for the plaintiffs, they had required the defendant to execute an agreement at the commencement of her employment that stated in part that all work created or developed by defendant “shall be the sole and exclusive property” of one of the plaintiffs, and that required the defendant to return all confidential information to the company upon request. This employment agreement also stipulated that “actual or threatened breach . . . will cause [the plaintiff] irreparable injury and damage.” On these facts, the court noted that “[i]t is uncontested that plaintiffs own the rights to” the social media account access information that the defendant had refused to provide. Interestingly, the court held that the plaintiffs were likely to prevail on their conversion claim, effectively treating the disputed social media account access information as a form of intangible personal property. The court also determined that plaintiffs were suffering irreparable harm as a result of the defendant’s refusal to turn over that access information. Accordingly, the court granted the plaintiffs’ motion for a preliminary injunction ordering the defendant to turn over the disputed login information and passwords to the plaintiffs.

As far as we can tell from the reported decision in Ardis Health, the defendant’s employment agreement did not expressly address the ownership or use of social media accounts or any related access information. Nonetheless, even the fairly generic work product ownership and confidentiality language included in the defendant’s employment agreement, as noted above, appears to have been an important factor in the favorable outcome for the plaintiffs, which illustrates the advantages of addressing these issues contractually with employees—in advance, naturally. And as discussed below, companies can put themselves in an even stronger position by incorporating more explicit terms concerning social media into their employment agreements.

Eagle v. Morgan and Maremont v. Fredman

Former employers aren’t always the plaintiffs in cases regarding the ownership of business-related social media accounts.  In an interesting twist, two other cases – Eagle v. Morgan, Case No. 11-4303 (E.D. Pa.), and Maremont v. Fredman, Case No. 10 C 7811 (N.D. Ill.) – were brought by employees who alleged that their employers had taken over and started using social media accounts that the employees considered to be personal accounts.

Eagle began as a dispute over an ex-employee’s LinkedIn account and her related LinkedIn connections. The plaintiff, Dr. Linda Eagle, was a founder of the defendant company, Edcomm. Dr. Eagle alleged that, following her termination, Edcomm personnel changed her LinkedIn password and account profile, including by replacing her name and photograph with the name and photo of the company’s new CEO. Each party filed various claims; in pretrial rulings, the court granted Dr. Eagle’s motion to dismiss Edcomm’s trade secret claim and granted Edcomm’s motion for summary judgment on Dr. Eagle’s Computer Fraud and Abuse Act (CFAA) and Lanham Act claims.

Regarding the trade secret claim, the court held that LinkedIn connections did not constitute trade secrets because they were “either generally known in the wider business community or capable of being easily derived from public information.” Regarding her CFAA claims, the court concluded that the damages Dr. Eagle claimed she had suffered – putatively arising from harm to reputation, goodwill and business opportunities – were insufficient to satisfy the “loss” element of a CFAA claim, which requires some relation to “the impairment or damage to a computer or computer system.” Finally, in rejecting the plaintiff’s claim that Edcomm violated the Lanham Act by posting the new CEO’s name and picture on the LinkedIn account previously associated with Dr. Eagle, the court found that Dr. Eagle could not demonstrate that Edcomm’s actions caused a “likelihood of confusion,” as required by the Act.

Eventually, the Eagle case proceeded to trial. The court ultimately held for Dr. Eagle on her claim of unauthorized use of name under the Pennsylvania statute that protects a person’s commercial interest in his or her name or likeness, her claim of invasion of privacy by misappropriation of identity, and her claim of misappropriation of publicity. The court also rejected Edcomm’s counterclaims for misappropriation and unfair competition. Meanwhile, the court held for the defendants on Dr. Eagle’s claims of identity theft, conversion, tortious interference with contract, civil conspiracy, and civil aiding and abetting. Although the court’s decision reveals that Edcomm did have certain policies in place regarding establishment and use of business-related social media accounts by employees, unfortunately for Edcomm, those policies do not appear to have clearly addressed ownership of those accounts or the disposition of those accounts after employees leave the company.

In any event, although Dr. Eagle prevailed on a number of her claims, the court concluded that she was unable to establish that she had suffered any damages. Dr. Eagle put forth a creative damages formula that attributed her total past revenue to business generated by her LinkedIn contacts in order to establish a per-contact value, and then used that value to calculate her damages for the period of time when she was unable to access her account. But the court held that Dr. Eagle’s damages request was insufficient for a number of reasons, primarily that she was unable to establish the fact of damages with reasonable certainty. The court also denied Dr. Eagle’s request for punitive damages. Her victory in the case was therefore somewhat pyrrhic.
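
To make the structure of that rejected damages theory concrete, here is a minimal sketch using purely hypothetical figures; none of the actual numbers from Eagle v. Morgan appear in this article, so every value below is an assumption for illustration only.

```python
# Illustrative sketch of the damages theory described above, with entirely
# hypothetical numbers; the actual Eagle v. Morgan figures are not given here.

total_past_revenue = 1_000_000   # hypothetical revenue attributed to LinkedIn contacts
num_contacts = 4_000             # hypothetical number of LinkedIn connections
years_of_revenue = 4             # hypothetical period over which that revenue was earned
lockout_months = 5               # hypothetical months without access to the account

# Per-contact value per year, then prorated over the lockout period.
value_per_contact_per_year = total_past_revenue / (num_contacts * years_of_revenue)
claimed_damages = value_per_contact_per_year * num_contacts * (lockout_months / 12)

print(f"Hypothetical per-contact value per year: ${value_per_contact_per_year:.2f}")
print(f"Hypothetical claimed damages: ${claimed_damages:,.2f}")
```

As the sketch makes plain, the resulting figure is only as strong as the underlying assumption that the LinkedIn contacts generated the revenue in the first place.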

In Maremont, the plaintiff, Jill Maremont, was seriously injured in a car accident and had to spend several months rehabilitating away from work. While she was recovering, Ms. Maremont’s employer, Susan Fredman Design Group, posted and tweeted promotional messages on Ms. Maremont’s personal Facebook and Twitter accounts, where she had developed a large following as a well-known interior designer. Although Ms. Maremont asked her employer to stop posting and tweeting, the defendant continued to do so. Ms. Maremont then brought claims against Susan Fredman Design Group under the Lanham Act, the Illinois Right of Publicity Act, and the Stored Communications Act, as well as a common law right to privacy claim. The parties filed cross-motions for summary judgment, which the court denied with respect to the Lanham Act and Stored Communications Act claims, largely due to lack of evidence on whether Ms. Maremont suffered actual damages as a result of her employer’s actions. The court granted Susan Fredman Design Group’s motion for summary judgment with respect to Ms. Maremont’s right of publicity claim because the defendant did not actually impersonate Ms. Maremont when it used her accounts. The court also granted Susan Fredman Design Group’s motion for summary judgment with respect to Ms. Maremont’s right of privacy claim because it found that the “matters discussed in Maremont’s Facebook and Twitter posts were not private” and that “Maremont did not try to keep any such facts private.”

Proactive Steps

Considering how vital social media accounts are to today’s companies, and given the lack of clear applicable law concerning the ownership of such accounts, companies should take proactive steps to protect these valuable business assets.

For example, companies should consider clearly addressing the ownership of company social media accounts in agreements with their employees, such as employee proprietary information and invention assignment agreements. Such agreements should state, in part, that all social media accounts that employees register or manage as part of their job duties or using company resources – including all associated account names and handles, pages, profiles, followers and content – are the property of the company, and that all login information and passwords for such accounts are both the property and the confidential information of the company and must be returned to the company upon termination or at any other time upon the company’s request. In general, companies should not permit employees to post under their own names on company social media accounts or use their own names as account names or handles. If particular circumstances require an employee or other individual to post under his or her own name – for example, where the company has engaged a well-known industry expert or commentator to manage the account – the company might want to go a step further and include even more specific contractual provisions that address ownership rights to the account at issue.

In parallel, companies should implement and enforce social media policies that provide employees with clear guidance regarding the appropriate use of business-related social media accounts, including instructions on how to avoid blurring the lines between company and personal accounts. (Keep in mind, however, that social media policies need to be carefully drafted so as not to run afoul of the National Labor Relations Act, state laws restricting employers’ access to employees’ personal social media accounts, or the applicable social media platforms’ terms of use.) Finally, companies should control employee access to company social media accounts and passwords, including by taking steps to prevent individual employees from changing account usernames or passwords without authorization.

Peer-to-peer (“P2P”) business models based on the Internet and technology platforms have become increasingly innovative.  As such models have proliferated, they frequently result in clashes with regulators or established market competitors using existing laws as a defensive tactic.  The legal battles that result illustrate the need for proactive planning and consideration of the likely legal risks during the early structuring phase of any new venture.

Collaborative consumption, or the “sharing economy” as it is also known, refers to the business model that involves individuals sharing their resources with strangers, often enabled by a third-party platform.  In recent years, there has been an explosion of these P2P businesses.  The more established businesses include online marketplaces for goods and services (eBay, Taskrabbit) and platforms that provide P2P accommodation (Airbnb, One Fine Stay), social lending (Zopa), crowdfunding (Kickstarter) and car sharing (BlaBlaCar, Lyft, Uber).  But these days, new sharing businesses are appearing at an unprecedented rate; you can now find a sharing platform for almost anything.  People are sharing meals, dog kennels, boats, driveways, bicycles, musical instruments – even excess capacity in their rucksacks (cyclists becoming couriers).

The Internet and, more specifically, social media platforms and mobile technology have brought about this economic and cultural shift.  Some commentators are almost evangelical about the potential disruption to traditional economic models that the sharing economy provides, and it’s clear that collaborative consumption offers a compelling proposition for many individuals.  It helps people to make money from under-utilized assets and tap into global markets; it gives people the benefits of ownership but with reduced costs and less environmental impact; it helps to empower the under-employed; and it brings strangers together and offers potentially unique experiences.  There’s clearly both supply and demand, and a very happy set of users for a great many of these new P2P services.

However, not everyone is in favor of the rapid growth of this new business model.  Naturally, most of the opposition comes from incumbent businesses or entrenched interests that are threatened by the new competition or those that have genuine concerns about the risk posed by unregulated entrants to the market.  Authorities and traditional businesses are challenging sharing economy businesses in a variety of ways, including arguing that the new businesses violate applicable laws, with accommodation providers and car-sharing companies appearing to take the brunt of the opposition to date.

Bed Surfing

One of the most successful P2P marketplaces, San Francisco-founded Airbnb, is a platform that enables individuals to rent out part or all of their house or apartment.  It currently operates in 192 countries and 40,000 cities.  Other accommodation-focused P2P models include One Fine Stay, a London-based platform that allows home owners to rent out empty homes while they are out of town.

Companies such as these have faced opposition from hoteliers and local regulators who complain that home owners using these platforms have an unfair advantage by not being subject to the same laws as a traditional hotel.  City authorities have also cited zoning regulations and other rules governing short-term rentals as obstacles to this burgeoning market.  It has been reported that some residents have been served with eviction notices by landlords for renting out their apartments in violation of their leases, and some homeowner and neighborhood associations have adopted rules to restrict this type of short-term rental.

These issues are not unique to the United States.  Commentators have reported similar resistance with mixed responses from local or municipal governments in cities such as Barcelona, Berlin and Montreal.

It’s not particularly surprising that opposition to P2P accommodation platforms would come from incumbent operators; after all, that’s typical of most new disruptive business models in the early stages, before mainstream acceptance.  But the approaches taken by P2P opponents illustrate that most regulations were originally devised to apply to full-time commercial providers of goods and services, and apply less well to casual or occasional providers.

This has consequences for regulators, who are likely to have to apply smarter regulatory techniques to affected markets.  Amsterdam is piloting such an approach to accommodation-sharing platforms, realizing the benefits that a suitably-managed approach to P2P platforms could have on tourism and the local economy.

Car Sharing

Companies that enable car-sharing services have also faced a barrage of opposition, both from traditional taxi companies and local authorities.  In many U.S. cities, operators such as Lyft and Uber have faced bans, fines and court battles.

It was reported in August 2013 that eleven Uber drivers and one Lyft driver had recently been arrested at San Francisco airport on trespassing charges.  In addition, during summer 2013, the Washington, D.C. Taxicab Commission proposed new restrictions that would prevent Uber and its rivals from operating there.  Further, in November 2012, the California Public Utilities Commission (“CPUC”) issued $20,000 fines against Lyft, SideCar and Uber for “operating as passenger carriers without evidence of public liability and property damage insurance coverage” and “engaging employee-drivers without evidence of workers’ compensation insurance.”

All three firms appealed these fines, arguing that outdated regulations should not be applied to peer-rental services, and the CPUC allowed the companies to keep operating while it drafted new regulations, which were eventually issued in July 2013.  In August 2013, the Federal Trade Commission intervened and wrote to the Commissions, arguing that the new rules were too restrictive and could stifle innovation.  The CPUC rules (approved on September 19, 2013) require operators to be licensed and to meet certain criteria relating to background checks, training and insurance.  The ridesharing companies will be allowed to operate legally under the jurisdiction of the CPUC, and will now fall under a newly created category called “Transportation Network Company.”

Some operators have structured their businesses in an attempt to avoid at least some of the regulatory obstacles.  For example, Lyft does not set a price for a given journey; instead, riders are prompted to give drivers a voluntary “donation.”  Lyft receives an administrative fee in respect of each donation.  In addition, in its terms, Lyft states that it does not provide transportation services and is not a transportation carrier; rather, it is simply a platform that brings riders and drivers together.  In BlaBlaCar’s model, drivers cannot make a profit, just offset their actual costs, which helps to ensure that drivers are not considered to be traditional taxi drivers, thereby helping them avoid the regulation that applies to the provision of taxi services.

Traditional Players Embracing the New Model

Interestingly, not all traditional players are taking a completely defensive approach.  From recent investment decisions, it appears that some companies appreciate that it could make sense for them to work closely with their upstart rivals, rather than oppose them.  For example, in 2011, GM Ventures invested $13 million in RelayRides and, in January 2013, Avis acquired Zipcar, giving Avis a stake in Wheelz, a P2P car rental firm in which Zipcar has invested $14 million.

The incentive for incumbent operators to embrace P2P models will likely vary by sector.  Perhaps it’s no surprise that this is best illustrated in the car rental industry, where there already exists a financial “pull” and a regulatory “push” towards greener and more sustainable models of service provision.

Legal and Regulatory Issues

Lawmakers and businesses around the world are currently grappling with how to interpret existing laws in the context of P2P sharing economy business models and considering whether new regulation is required.  For example, the European Union is preparing an opinion on collaborative consumption in the light of the growth of P2P businesses there.  One hopes that European policy makers focus more on incentivizing public investment in P2P projects via grants or subsidies than on prescriptive regulation of the sector.

Importantly, however, it’s a particular feature of the market for P2P platforms that much of the regulatory activity tends to be at the municipal or local level, rather than national.  This tends to make for a less cohesive regulatory picture.

In the meantime, anyone launching a social economy business will need to consider whether and how various thorny legal and regulatory issues will affect both the platform operator and the users of that platform.  Often, this may mean tailoring services to anticipate particular legal or regulatory concerns.

  • Consumer protection.  Operators will need to consider the extent to which their platforms comply with applicable consumer protection laws, for example when drafting appropriate terms of use for the platform.
  • Privacy.  Operators will need to address issues of compliance with applicable privacy laws in terms of the processing of the personal data of both users and users’ customers, and prepare appropriate privacy policies and cookie notices.
  • Employment.  Where services are being provided, the operator will need to consider compliance with any applicable employment or recruitment laws, e.g., rules governing employment agencies, worker safety and security, and minimum wage laws.
  • Discrimination.  Operators will need to consider potential discrimination issues, e.g., what are the consequences if a user refuses to loan their car or provide their spare room on discriminatory grounds, such as a person’s race or sexuality?  Could the operator attract liability under anti-discrimination laws?
  • Laws relating to payments.  One key to success for a P2P business model is to implement a reliable and effective payment model.  But most countries impose restrictions on certain types of payment structures in order to protect consumers’ money.  Where payments are made via the P2P platform rather than directly between users, operators will need to address compliance with applicable payment rules, and potentially deal with local payment services laws.  Fundamentally, it needs to be clear whose obligation it is to comply with these laws.
  • Taxation.  Operators will need to consider taxation issues that may apply – both in terms of the operator and its users.  Some sectors of the economy – hotels, for example – are subject to special tax rates by many cities or tax authorities.  In such cases, the relevant authorities can be expected to examine closely – and potentially challenge, or assess municipal, state or local taxes against – P2P models that provide equivalent services.  In some places, collection of such taxes can be a joint and several responsibility of the platform operator and its users.
  • Safety and security.  When strangers are being brought together via a platform, security issues will need to be addressed.  Most social economy businesses rely on ratings and reciprocal reviews to build accountability and trust among users.  However, some platforms also mitigate risks by carrying out background and/or credit checks on users.  Airbnb also takes a practical approach, employing a full-time Trust & Safety team to provide extra assurance for its users.
  • Liability.  One of the key questions to be considered is who is legally liable if something goes wrong.  Could the platform attract liability if a hired car crashes or a host’s apartment is damaged?
  • Insurance.  Responsibility for insurance is also a key consideration.  The issue of insurance for car-sharing ventures made headlines in April 2013 when it was reported that a Boston resident had crashed a car that he had borrowed via RelayRides.  The driver was killed in the collision and four other people were seriously injured. RelayRides’ liability insurance was capped at $1 million, but the claims threaten to exceed that amount.  Given these types of risks, some insurance companies are refusing to provide insurance coverage if policyholders engage in P2P sharing.  Three U.S. states (California, Oregon and Washington) have passed laws relating to car sharing, placing liability squarely on the shoulders of the car-sharing service and its insurers.
  • Industry-specific law and regulation.  Companies will need to consider issues of compliance with any sector-specific laws, whether existing laws or new regulations that are specifically introduced to deal with their business model (such as crowd-funding rules under the JOBS Act in the United States, and P2P lending rules to be introduced shortly in the United Kingdom).  As noted above, some social economy businesses have already experienced legal challenges from regulators, and as collaborative consumption becomes even more widely adopted, regulatory scrutiny is likely to increase.  Accordingly, rather than resist regulation, the best approach for sharing economy businesses may be to create trade associations for their sector and/or engage early on with lawmakers and regulators in order to design appropriate, smarter policies and frameworks for their industry.

Conclusion

Erasmus said, “There is no joy in possession without sharing.”  Thanks to collaborative consumption, millions of strangers are now experiencing both the joy – and the financial benefits – of sharing their resources.  However, the legal challenges will need to be carefully navigated in order for the sharing economy to move from being merely disruptive to become a firmly established business model.

Businesses are increasingly using social networks and online forums for marketing, recruiting, customer service, branding and PR purposes. The UK privacy regulator, the Information Commissioner (ICO), has recently published new guidance on the use of social networking and online forums. This replaces its 2007 guidance on the subject and is a helpful reminder to companies of their data protection obligations when operating online in the UK.

The UK Data Protection Act 1998 (DPA) sets out an extensive data protection regime by imposing broad obligations on those who collect personal data, as well as conferring rights on individuals about whom data are collected. Broadly speaking, data controllers must ensure that personal data are processed in accordance with a number of “data protection principles” set out in Schedule 1 to the DPA, which require that:

  1. Data must be processed fairly and lawfully;
  2. Data must be obtained only for specified lawful purposes and not further processed in a manner which is incompatible with those purposes;
  3. Data must be adequate, relevant and not excessive in relation to the purposes for which they are processed;
  4. Data must be accurate and, where necessary, kept up to date;
  5. Data must not be kept for longer than is necessary;
  6. Data must be processed in accordance with the rights of data subjects;
  7. Appropriate technical and organizational security measures must be taken to prevent unauthorized or unlawful processing of, and accidental loss or destruction of, or damage to, personal data; and
  8. Personal data must not be transferred outside the EEA unless the destination country ensures an adequate level of protection for the rights of the data subject in relation to the processing of personal data.

The DPA details certain circumstances where processing is exempt from these data protection principles.

The new guidance

The new guidance confirms that Section 36 of the DPA (which provides a domestic purposes exemption) does not cover: (i) an organization’s use of social networks and online forums, or (ii) an individual’s use of social networks/forums for non-domestic purposes (e.g., in connection with running a sole trader business).

Accordingly, organizations and businesses that use social media and online forums will have privacy compliance obligations under the DPA: (i) if they post personal data on their own or a third party’s website, (ii) if they download and use personal data from a third party’s website or (iii) if they run a website (e.g. a blog) which allows third parties to add comments or posts about living individuals and they are a data controller of that third-party content.

The ICO states that the DPA will apply even where an organization asks an employee to carry out processing via their own personal social media page (e.g., their Facebook or LinkedIn page, or a personal blog) for business purposes. This is because, in such circumstances, the employee will be acting on behalf of the organization, so the processing will be for the organization’s corporate purposes and not for domestic purposes. However, the ICO acknowledges that there are situations where the purpose of using a social network or online forum will not be particularly clear-cut. Some users of social media may use it for mixed purposes, e.g., not only for personal, family and recreational purposes, but also to promote business interests. The ICO states that in such circumstances, individuals will need to ensure that any posts that involve personal data and are not made for domestic purposes comply with the DPA.

Of course, the first issue that businesses will need to consider if running an online forum or social networking site is the extent to which they will be considered a data controller under the DPA. Section 1 of the DPA states that a “data controller” means “the person (who either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed”.

The ICO confirms that the operator of a social network or online forum will be a data controller in relation to any contact information or other personal data that the site operator processes about the users of such network or forum, and will need to comply with the DPA.

In relation to any personal data that is posted on the social network or online forum by third parties, the ICO acknowledges that the position is less clear-cut. For an illustration of a situation where an operator would be considered a data controller in respect of third-party posts, the ICO refers to The Law Society and Ors v Rick Kordowski [2011] EWHC 3185 (QB). In that case, Mr Kordowski ran a website, “Solicitors from Hell”, which encouraged the public to name and shame solicitors. Mr Kordowski moderated posts made by users and charged fees for adding or removing posts. The court held that Mr Kordowski was a data controller under the DPA, and this was not disputed by either party. It was clear that Mr Kordowski had decided the purpose and manner in which the personal data was processed.

However, the ICO states that this does not mean that moderation is always an essential factor – even if the site operator does not carry out moderation in advance, it could still be deemed to be a data controller. For example, if a site only allowed posts subject to terms and conditions which covered acceptable content, and posts could be removed if they breached those terms and conditions, then the operator would still be determining, to a certain extent, the purposes and manner in which personal data was processed and therefore would be deemed to be a data controller.

If a business is deemed to be a data controller of third-party posts, it will need to take reasonable steps to ensure that any personal data posted on its site are accurate and, where necessary, kept up-to-date in order to be compliant with the DPA. The ICO indicates that what amounts to “reasonable steps” will depend on the nature of the site and how active a role the operator takes in selecting, allowing or moderating content. For example, if (i) the vast majority of posts on a site are third-party posts, (ii) the volume of such posts is significant, (iii) posts are not moderated in advance and (iv) the site relies upon users complying with user policies and reporting problems to the site operator, the ICO acknowledges that it would not be reasonable to check every individual post for accuracy.

However, the ICO would consider “reasonable steps” to include:

  1. having clear and prominent policies for users about acceptable and non-acceptable posts;
  2. having clear and easy-to-find procedures through which users can dispute the accuracy of posts and ask for them to be removed; and
  3. responding to disputes about accuracy quickly, and having procedures which enable access to content to be suspended until the dispute has been settled.

In addition, the ICO states that it expects site operators to have appropriate internal policies in place to deal with: (a) complaints from individuals who believe that their personal data may have been processed unfairly or unlawfully because of derogatory, threatening or abusive third-party posts, (b) disputes between individuals regarding the factual accuracy of posts; and (c) complaints about how the organization processes personal data provided by users.

Best practices

Businesses should take note of the new guidance when using social networks and online forums. In addition, we suggest the following best practices to ensure privacy compliance.

  • Understand the privacy terms that apply in respect of your use of third-party social networks and online forums.
  • Understand the privacy risks associated with users posting third-party content and take appropriate steps to minimise such risks.
  • Put in place appropriate terms of use and privacy policies.
  • Ensure that privacy policies comply with the DPA and:
    • include a description of how user data will be processed/used;
    • ensure that appropriate consents are obtained from users; and
    • limit the company’s liability for privacy breaches by users.
  • Don’t keep personal data that has been collected via social networks and online forums for longer than necessary.
  • Obtain the consent of employees if any of their information will be posted to social networks or online forums.
  • Create an appropriate social media policy, and provide training to ensure that employees understand their obligations in respect of their use of social networks and online forums, including:
    • setting out rules about accessing social media sites at work;
    • making clear in what circumstances employees will represent the company and in what circumstances employees will act in a personal capacity and the rules relating to each (e.g., when acting in a personal capacity, employees should write in the first person (“I” as opposed to “we”) and should include a prominent disclaimer stating that the views expressed are their own, and not necessarily those of their company); and
    • making clear to employees that they must not post personal information about colleagues on social networks and online forums.
  • Put in place practical security measures for employees who have been approved to administrate/operate blogs and social media pages.

Other social media issues

Of course, in addition to data protection and privacy, there are a whole host of other issues that businesses need to consider as part of their social media strategy, such as:

  1. employees’ use of social media (monitoring, policies, training, liability, disclosure of confidential/proprietary information, etc.);
  2. use of social media in recruitment;
  3. security;
  4. crisis management and damage to company reputation;
  5. protection and infringement of intellectual property rights;
  6. advertising, marketing and promotion rules;
  7. consumer protection/unfair terms and trading rules;
  8. user-generated and third-party content (user terms and conditions, notice and takedown policies and procedures, disclosure of material connections with third-party bloggers, etc.);
  9. insurance; and
  10. applicable industry rules and regulations.

Lastly, where a UK organization is a global business, it will be important to ensure that individuals outside the UK are not unintentionally targeted by any UK social media campaigns; where such individuals are targeted intentionally, the organization will need to comply with any specific rules that apply in the relevant jurisdiction.

We have written before about cases involving disputes between employers and employees over work-related social media accounts, but a new case out of Arizona federal court raises issues that appear to be unlike those we have addressed previously.

In Castle Megastore Group, Inc. v. Wilson, plaintiff Castle Megastore Group (CMG), a retailer of novelty and adult-themed merchandise, brought suit against three former employees for various causes of action related to the employees’ alleged misuse of CMG’s confidential information. Among its allegations, CMG claimed that one of the defendants, Michael Flynn, uploaded a video of a confidential CMG managers’ meeting to Flynn’s private Vimeo account and shared access to this video with the other two defendants (who had both been fired from CMG prior to the sharing of the video). CMG also alleged that Flynn, after having been fired, changed the username and password of the Facebook page he created for CMG while employed as CMG’s Social Media Specialist.

CMG appears to have brought its social media-related claims solely under the Stored Communications Act (SCA), a federal statute that provides for a cause of action against anyone who “intentionally accesses without authorization a facility through which an electronic communication service is provided; or intentionally exceeds an authorization to access that facility, and thereby obtains . . . access to a wire or electronic communication while it is in electronic storage in such system.”

The SCA protects individuals’ privacy in their electronic communications by making it criminally punishable for hackers and other unauthorized individuals to obtain, alter or destroy such communications. The statute, however, also provides relief to aggrieved parties in civil causes of action. The SCA has, for instance, been invoked by employees whose employers have improperly accessed, and read messages from, the employees’ private email accounts.

CMG alleged that Flynn violated the SCA when he posted the managers’ meeting on his Vimeo account and when he shared access to the site with the other two defendants. CMG also alleged that the other two defendants violated the SCA when they accessed the posted video.

In its ruling on the defendants’ motions to dismiss, however, the court found that, while Vimeo might be an “electronic communication service” within the meaning of the SCA, CMG failed to allege that Flynn lacked authority to authorize others to view his Vimeo account, a required element for SCA liability. Accordingly, CMG failed to state a claim that the two former employees with whom Flynn shared access to the video violated the SCA. Further, while CMG alleged that Flynn was not authorized to have or to share the video, it did not allege that Flynn obtained the video through unauthorized access of an electronic communication service—also necessary to state a claim under the SCA. The court therefore dismissed CMG’s SCA claims related to the uploading and accessing of the managers’ meeting video.

Regarding Flynn’s alleged changing of the Facebook account password, the court held that CMG failed to allege facts about the company’s use of the Facebook page from which the court could conclude that the page was an electronic communication service under the SCA. The court therefore dismissed the claim, finding that “[t]he threadbare statement that Flynn changed the Facebook password . . . does not state a claim under the SCA.”

With the dismissal of the plaintiff’s SCA claims (the only federal law claims brought in the action), the court declined to exercise supplemental jurisdiction over the remaining state law claims, and granted the defendants’ motions to dismiss the action. In dismissing the case, however, the court granted CMG leave to file an amended complaint. Will CMG be able to re-state its SCA claims so as to address the court’s concerns?  Stay tuned.