Federal Trade Commission

The Federal Trade Commission (FTC) announced this week that it sent warning letters to more than 60 national advertisers regarding the inadequacy of disclosures in their television and print ads. The letters are part of an initiative named “Operation Full Disclosure,” which the FTC implemented to review fine print disclosures and other disclosures that it believed were difficult to read or easy for consumers to overlook, yet included critical information that consumers would need to avoid being misled.

What Does It Mean for a Disclosure to Be “Clear and Conspicuous”?

Disclosures may be necessary to clarify a claim or to ensure that the full terms of an offer are adequately disclosed, in order to avoid a charge of deception by material omission. In FTC jurisprudence, disclosures must be “clear and conspicuous,” and while they may modify claims in the text of an ad itself, they may not contradict any such claims. The most recent pronouncement on how to make effective disclosures (this one was focused on online disclosures, but the general principles are the same) was issued in March 2013. The key is that if a disclosure is necessary to make an ad truthful and not misleading, it must be clear and conspicuous; otherwise, it is as though the disclosure was not made at all.

Continue Reading FTC Warns Advertisers to Check the Fine Print in “Operation Full Disclosure”; Shot Across the Bow Could Signal Law Enforcement Actions to Come

  • Better shop around. In connection with a new staff report, the Federal Trade Commission (FTC) examined 121 popular apps used to comparison shop, find online deals and pay with mobile devices; the FTC concluded that many of these apps failed, prior to download, to disclose important information to users, such as how the apps deal with payment-related disputes and how consumer data is collected, used and shared. The FTC urged developers of mobile shopping apps to be more transparent in how they deal with privacy, security and consumer protection issues. If your company is involved with apps designed to facilitate online shopping, you’ll definitely want to check out the report.
  • Social notworking. Does social media undermine office productivity? Surprisingly, a study suggests that social media is responsible for a mere 5% of wasted time at work, well behind “water cooler talk” (14% of office time wasted), IT problems (12%) and – my favorite – pointless meetings (11%).
  • Chat or Tweet? For years, the direct message (DM) function within Twitter has been dormant – hardly used at all, much less for commercial purposes. Now, Twitter is trying to upgrade DM and position itself as a real-time chat option that will appeal to advertisers who want to communicate directly with consumers.

Article courtesy of Morrison & Foerster’s Mobile Payments Practice

Lawmakers in Washington, D.C., continue to show interest in understanding and developing regulatory proposals relating to mobile apps. The interest appears to be driven, at least in part, by policymakers’ concerns about consumer privacy when using mobile phones and other smart hand-held devices. The issues of consumer privacy, the security of financial information, and the use of mobile apps have also been raised in the context of Congressional hearings held to understand the new ways in which consumers are paying, and taking payments, via smartphone.

The recent introduction of a bill focusing on mobile apps and privacy issues is another indicator of ongoing legislative interest in mobile phone technology and ways in which smartphones are used. On May 9, 2013, Representative Hank Johnson (D-GA) introduced H.R. 1913, the “Application Privacy, Protection, and Security Act of 2013” (“APPS Act”). H.R. 1913 was referred to the House Committee on Energy and Commerce for consideration. As of June 4, 2013, the bill had five co-sponsors.

Representative Johnson’s introduction of the APPS Act follows the release, in January 2013, of a discussion draft of the bill that was developed through an Internet-based legislative project launched by the congressman’s office in July 2012. The following provides a brief overview of the APPS Act, as introduced.

User Notices

Under the APPS Act, app developers would be required to provide users with a notice, before collecting their personal data, describing the terms and conditions governing the collection, use, storage and sharing of personal data. Developers would also be required to obtain the consent of the users to these terms and conditions.

The bill would require this notice to users to include the following:

  • The categories of personal data that the app will collect;
  • The purposes for which the personal data will be used;
  • The categories of third parties with which the personal data will be shared; and
  • A “data retention policy” that governs the length of time for which the personal data will be stored and a description of the user’s rights under the bill to notify the app developer and request that the developer refrain from collecting additional personal data through the app.

The APPS Act would direct the Federal Trade Commission (FTC) to issue regulations specifying the format, manner and timing of the notice. In promulgating the regulations, the FTC would consider how to ensure the “most effective and efficient” communication to the user regarding the treatment of personal data.

Data Security

The APPS Act would also require app developers to take reasonable and appropriate measures to prevent unauthorized access to personal data collected by apps. This provision demonstrates that concerns about consumer privacy continue to be a driving force for policymakers in crafting legislative proposals.

FTC Enforcement and Safe Harbor

The APPS Act would provide for FTC enforcement, pursuant to the FTC’s unfair or deceptive acts or practices authority under the FTC Act, but would not foreclose private rights of action, or actions by state attorneys general or other state officials. Pursuant to a safe harbor provision, app developers would satisfy the APPS Act’s requirements, and requirements of implementing regulations, by adopting and following a code of conduct for consumer data privacy developed in the multi-stakeholder process convened by the U.S. Department of Commerce’s National Telecommunications and Information Administration (NTIA). The NTIA process is an outgrowth of the White House white paper, “Consumer Data Privacy in a Networked World,” which advocated the coupling of voluntary privacy codes of conduct with federal legislation establishing consumer “Bill of Rights” principles.

The full text of H.R. 1913 is accessible on the Web site of the Government Printing Office at: http://www.gpo.gov/fdsys/pkg/BILLS-113hr1913ih/pdf/BILLS-113hr1913ih.pdf.

On March 7, 2013, a federal court in Manhattan ruled, in Federal Trade Commission v. PCCare247 Inc., that service via Facebook is an acceptable alternative means of serving court documents on foreign defendants. Although this is a watershed ruling in many respects, in other ways, it is a natural extension of current authority in a factual situation where such a ruling posed little risk.

In this case, the FTC brought suit against nine parties, including five India-based defendants—two entities and three individuals. The FTC alleged that the defendants violated a provision of the FTC Act by operating a scheme, run largely out of call centers in India, that tricked American consumers into spending money to fix non-existent computer problems. At the time of the decision, the FTC had already secured a temporary restraining order enjoining the defendants’ business practices and freezing various assets.

In addition, following procedures outlined in Rule 4(f)(1) of the Federal Rules of Civil Procedure and the Hague Convention on the Service Abroad of Judicial and Extrajudicial Documents in Civil or Commercial Matters, the FTC had provided the summons, complaint and related court documents to the Indian Central Authority for service on the defendants and had also sent these documents by three alternative means: by email to the defendants’ last known addresses, by Federal Express and by personal service through an Indian process server. The process server had successfully delivered the documents to all five defendants and FedEx had confirmed delivery for most, but the Indian Central Authority had still not confirmed delivery nor responded to the FTC’s inquiries more than four months later. Nevertheless, defendants, on notice of the action, hired counsel to represent them at a preliminary injunction hearing, only to have counsel withdraw two months later due to nonpayment.

In the motion decided by this ruling, the FTC sought the court’s permission to serve documents other than the summons and complaint by alternative means on the five defendants located in India. In particular, the FTC sought to serve the defendants by email and Facebook. Analyzing Rule 4(f)(3) of the Federal Rules of Civil Procedure, concerning service of an individual in a foreign country by alternative means, the court first concluded that service by email and Facebook was not prohibited by international agreement and that India had not specifically objected to such service.

The court next turned to a due process analysis, considering whether the proposed means of service was reasonably calculated to notify defendants of future filings in the case. The court found that it was. Reasoning that the defendants used email frequently to run their Internet-based business and that the FTC had identified email addresses—some used for scheme-related tasks—for each of the individuals (who served as directors of the defendant corporations), the court concluded that it was highly likely that defendants would actually receive and respond to emails sent to these addresses. Thus, the court found that service by email alone would satisfy due process.

For the sake of “thoroughness,” according to the court, the FTC had proposed, in addition to service by email, service by personal message via Facebook, attaching the relevant documents. The court found that the FTC had also demonstrated a high likelihood that Facebook messages would reach the defendants, given that two of the defendants had registered their Facebook accounts with the known email addresses, two had listed their job titles at the defendant companies on their profiles, and two were Facebook friends with the third individual defendant.

Although the court recognized that Facebook service was a “relatively novel concept” and might not actually reach the defendants, it drew comfort from the fact that such service was a “backstop” to email service. In addition, the defendants were already on notice of the lawsuit. Where the defendants had embraced new technology in operating their scheme, it was only fitting that service by both email and Facebook should satisfy due process: “Where defendants run an online business, communicated with customers via email, and advertise their business on their Facebook pages, service by email and Facebook together presents a means highly likely to reach defendants.”

As reported in June 2012 by Socially Aware, the PCCare247 court is not the first court in the Southern District of New York to consider whether service by Facebook is an acceptable means of alternative service. In fact, in the June 7, 2012 decision of Fortunato v. Chase Bank USA, N.A., another judge in the Southern District of New York considered the issue and concluded that service of a third party complaint by Facebook message and email to an address listed on an individual’s Facebook profile (in addition to service on the woman’s estranged mother) would not satisfy due process.

As the PCCare247 court noted in distinguishing the earlier decision, the facts in Fortunato were much different. In Fortunato, the plaintiff had failed to show that the Facebook profile was authentic—that the account in fact belonged to or was maintained by the individual in question, who had a history of providing fictional or out-of-date addresses to state and private parties—or that the email address listed on the Facebook profile was operational or regularly used by the individual. By contrast, in PCCare247, the court had multiple indicia that the Facebook profiles actually belonged to the defendants and that the defendants regularly used both their Facebook accounts and their email addresses.

Even while concluding that Facebook service was an acceptable alternative means of service, the PCCare247 court struck a cautionary note about its ruling: “To be sure, if the FTC were proposing to serve defendants only by means of Facebook, as opposed to using Facebook as a supplemental means of service, a substantial question would arise whether that service comports with due process.” Other courts have not been so bothered by Facebook service alone. For example, in a May 10, 2011 ruling, a Minnesota state court concluded that it would be considered sufficient service for a woman who had unsuccessfully been seeking to serve divorce papers on her husband to serve them by publication on the Internet, in whatever format she believed it most likely that he would receive such notice, including by “[c]ontact via Facebook, Myspace, or other social networking site.” In addition, beginning with an Australian ruling in December 2008, courts in Australia, Canada, New Zealand and the United Kingdom have permitted service via Facebook. And a bill has recently been introduced in Texas that would permit service of process through social media sites.

The PCCare247 decision is indeed an important moment—the first time a federal court has endorsed service via Facebook as an alternative means of service. But it was also a safe decision in many ways. The service was of documents other than the summons and complaint after the foreign defendants had already appeared through counsel and proved themselves to be on notice of the case. By their online behavior, defendants had shown themselves to be highly likely to access Facebook messages. Most crucial, Facebook service was permitted only as a backstop to email service, which itself was highly likely to reach the defendants; the court took pains to note that Facebook service alone might not satisfy due process. And the permitted method of service was private Facebook message—quite similar to email service—not by a wall post or other means arguably more similar to traditional publication.

Nonetheless, the PCCare247 decision will likely serve as a springboard for more decisions endorsing service by social media because, as the court explained, “history teaches that, as technology advances and modes of communication progress, courts must be open to considering requests to authorize service via technological means of then-recent vintage, rather than dismissing them out of hand as novel.”

The Federal Trade Commission (FTC) announced a potentially groundbreaking settlement with the social networking app Path and released an important new staff report on Mobile Privacy Disclosures late last week.

The FTC’s Settlement with Path suggests a new standard may be on the near-term horizon: out-of-policy, just-in-time notice and express consent for the collection of data that is not obvious to consumers in context. The FTC has long encouraged heightened notice and consent prior to the collection and use of sensitive data, such as health and financial information. This settlement, however, requires such notice and consent for the collection and use of information that is not inherently sensitive, but that, from the Commission’s perspective, at least, might surprise consumers based on the context of the collection. Only time will tell, but historically Order provisions like this have tended to become cemented as FTC common law. Moreover, although the Children’s Online Privacy Protection Act (COPPA) portions of the settlement do not break new ground, they do serve as a potent—and expensive—reminder that the FTC is highly focused on kids’ privacy online, particularly in the mobile space.

The FTC’s Report reinforces this sentiment by encouraging all the major players in the mobile ecosystem—including app developers, ad networks, and trade associations—to increase the transparency of the mobile ecosystem through clear, accessible disclosures about information collection and sharing at appropriate times.

To continue reading this post, click here.

In the latest issue of Socially Aware, our Burton Award-winning guide to the law and business of social media, we look at recent First Amendment, intellectual property, labor and privacy law developments affecting corporate users of social media and the Internet. We also recap major events from 2012 that have had a substantial impact on social media law, and we take a look at some of the big numbers racked up by social media companies over the past year.

To read the latest issue of our newsletter, click here.

For an archive of previous issues of Socially Aware, click here.

On December 19, 2012, the Federal Trade Commission (“Commission”) announced long-awaited amendments to its rule implementing the Children’s Online Privacy Protection Act (“Rule”). The changes—which take effect on July 1, 2013—are significant. They alter the scope and obligations of the Rule in a number of ways. We discuss the revisions in greater detail below.

  • The Commission revised the Rule’s definition of “personal information” to include more types of data that trigger the Rule’s notice, consent, and other obligations. These include persistent identifiers when used for online behavioral advertising and other purposes not necessary to support the internal operations of the site or online service.
  • The Commission expanded the Rule’s coverage to third-party services—such as ad networks and social plug-ins—that collect personal information through a site or service that is subject to COPPA. The host site or service is strictly liable for the third party’s compliance, while the third party must comply only if it has actual knowledge that it is collecting personal information through a child-directed site or from a child.
  • The Commission streamlined the content of the parental notice and simplified the privacy policy.
  • The Commission retained the “email plus” method of obtaining parental consent. It also added new methods of obtaining consent and established a process for pre-clearance of other consent mechanisms.
  • The Commission imposed new data security pass-through requirements, as well as data retention obligations.
  • The Commission revised the Rule to permit certain sites that are “directed to children” to comply only with respect to those users who self-identify as under 13.

To continue reading this post, click here.

The Federal Trade Commission (FTC) has cracked down on a company that was engaged in “history sniffing,” a means of online tracking that digs up information embedded in web browsers to reveal the websites that users have visited. In a proposed settlement with Epic Marketplace, Inc. and Epic Media Group (together, “EMG”) announced on December 5, 2012, the FTC settled charges that EMG had improperly used history sniffing to collect sensitive information regarding unsuspecting consumers. 

EMG functions as an intermediary between publishers—i.e., websites that publish ads—and the advertisers who want to place their ads on those websites. It does this through online behavioral advertising, which typically entails placing cookies on websites a consumer visits in order to collect information about his or her use of the website and then using that information to serve targeted ads to the user when he or she visits other websites within the EMG Marketplace Network.

What got EMG into trouble was that it also used history sniffing to collect information regarding the websites that users visited. Here’s how the technique works. In your web browser, hyperlinks to websites change color once you have visited them. After you have visited a webpage, the hyperlink to it will most likely appear in one color (e.g., purple). If you haven’t been to a particular webpage before, any link to it will probably show up in another color (e.g., blue). History sniffing code exploits this feature to go through your browser—that is, to “sniff” around—to see what color your hyperlinks are. When the code finds purple links, it knows that you’ve been to those websites.
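As a rough illustration of the logic described above, the sketch below separates “visited” from “unvisited” domains based on link color. The `getLinkColor` callback and the sample domains are hypothetical stand-ins for the browser step (rendering an `<a>` element for each URL and reading its computed style); modern browsers deliberately misreport `:visited` styles, so this technique no longer works in practice.

```javascript
// Hypothetical sketch of classic history-sniffing logic (circa 2010).
// getLinkColor(url) stands in for rendering <a href="url"> in a page
// and reading its computed color via window.getComputedStyle.
function sniffHistory(domains, getLinkColor) {
  // rgb(85, 26, 139) is the purple many browsers used for visited links.
  const VISITED_COLOR = "rgb(85, 26, 139)";
  return domains.filter((url) => getLinkColor(url) === VISITED_COLOR);
}

// Stubbed color lookup standing in for an actual browser DOM:
const stubColors = {
  "http://example-fertility.test": "rgb(85, 26, 139)", // renders "visited"
  "http://example-bank.test": "rgb(0, 0, 238)",        // renders "unvisited"
};
const visited = sniffHistory(Object.keys(stubColors), (url) => stubColors[url]);
// visited now holds only the domains whose links rendered in the visited color
```

A tracker embedding code like this in a served ad could test thousands of URLs per page load, which is how, according to the FTC, EMG was able to check visits to more than 54,000 domains and sort the results into interest segments.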

According to the FTC, for almost 18 months—from March 2010 until August 2011—EMG included history sniffing code in ads it served to website visitors on at least 24,000 webpages within its network, including webpages associated with name brand websites. EMG used the code to determine whether consumers had visited more than 54,000 different domains, including websites “relating to fertility issues, impotence, menopause, incontinence, disability insurance, credit repair, debt relief, and personal bankruptcy.” EMG used this sensitive information to sort consumers into “interest segments” that, in turn, included sensitive categories like “Incontinence,” “Arthritis,” “Memory Improvement,” and “Pregnancy-Fertility Getting Pregnant.” EMG then used the sensitive interest segments to deliver targeted ads to consumers.

History sniffing is not per se illegal under U.S. law. What got EMG in trouble was that it allegedly misrepresented how it tracked consumers. First, EMG’s privacy policy at the time stated that the company only collected information about visits to websites within the EMG network; however, the FTC alleged that the history sniffing code enabled EMG to “determine whether consumers had visited webpages that were outside the [EMG] Marketplace Network, information it would not otherwise have been able to obtain.” EMG’s tracking of users in a manner inconsistent with its privacy policy was therefore allegedly deceptive, in violation of Section 5 of the FTC Act.

Second, EMG’s privacy policy did not disclose that the company was engaged in history sniffing; it disclosed only that it “receives and records anonymous information that your browser sends whenever you visit a website which is part of the [EMG] Marketplace Network.” According to the FTC, the fact that the company engaged in history sniffing would have been material to consumers in deciding whether to use EMG’s opt-out mechanism. EMG’s failure to disclose the practice was therefore also allegedly deceptive in violation of Section 5 of the FTC Act.

The proposed consent order would, among other things, require EMG to destroy all the information that it collected using history sniffing, bar it from collecting any data through history sniffing, prohibit it from using or disclosing any information that was collected through history sniffing, and bar misrepresentations regarding how the company collects and uses data from consumers or about its use of history sniffing code.

EMG stopped its history sniffing in August 2011, and most new versions of web browsers have technology that blocks this practice. Nonetheless, the FTC made it clear in the complaint that it wanted to highlight the problem because history sniffing “circumvents the most common and widely known method consumers use to prevent online tracking: deleting cookies.” Mark Eichorn, assistant director of the FTC’s Division of Privacy and Identity Protection, told the Los Angeles Times that the FTC “really wanted to make a statement with this case.” He added, “People, I think, really didn’t know that this was going on and didn’t have any reason to know.” The proposed consent order puts online tracking and advertising companies on notice: If you collect data in a manner inconsistent with—or not disclosed in—your privacy policy, you run the risk of a charge of deception.

On October 30, 2012, California Attorney General Kamala Harris announced that her office would begin notifying the developers of as many as 100 mobile apps that their apps do not comply with the state’s Online Privacy Protection Act (OPPA) and that they have 30 days to bring them into compliance.

The announcement does not come as a surprise. Earlier this year, the Attorney General published a Joint Statement of Principles with the major platforms that distribute and sell mobile apps, providing that they will distribute only apps that have privacy policies that consumers are able to review prior to download. At that time, her office told app developers that they had six months to come into compliance or to be notified of violations. Shortly thereafter, Attorney General Harris formed a Privacy Enforcement and Protection Unit, intended specifically to enforce OPPA and other privacy laws.

In light of the Attorney General’s announcement and her continued focus on privacy, companies that collect personal information online from California residents—whether through a website, online service, or app—should take steps to ensure that they are in compliance. According to the Attorney General’s sample non-compliance letter attached to her press release, failure to comply could subject a company to a fine of up to $2,500 each time a non-compliant app is downloaded.

The Law’s Requirements

OPPA requires a commercial website operator or online service provider, including a mobile app developer, that collects personally identifiable information (PII) from consumers residing in California to post a conspicuous privacy policy. Because OPPA applies to any company that collects data online about California residents, companies both within and outside of California may be subject to enforcement activity.

Under OPPA, the privacy policy must include:

  • The categories of PII that the website, online service, or app collects from its users;
  • The third parties with whom such PII may be shared;
  • The process by which the consumer can review and request changes to his or her PII, if the website operator, online service provider, or app developer maintains such a process;
  • The process by which the operator, provider, or developer notifies consumers of material changes to its privacy policy; and
  • Its effective date.

Additional Considerations

Compliance with OPPA does not necessarily ensure compliance with all applicable laws. In particular, the Federal Trade Commission (FTC) has long taken the position that privacy policies should describe, in a way that consumers can easily understand, all material collection, use, and disclosure practices. This means that, in addition to the information required by OPPA, a privacy policy should include other disclosures, such as:

  • Its scope;
  • How PII may be used;
  • How “other information”—information that may not be considered PII but the collection of which may be material to users—is collected, used, and disclosed. This may include, for instance, users’ clickstream information or other information derived from their interaction with the website, service, or app and collected for purposes of personalizing content or displaying targeted ads;
  • How PII is secured and for how long it may be retained;
  • How the user may exercise various rights, such as to opt out of receiving direct marketing or to opt out of the sharing of his or her PII with third parties;
  • How the user may access the PII collected from him or her and the control that he or she has with respect to it; and
  • How the user can contact the operator or developer.

Drafting a compliant privacy policy is only the first step. A company must also implement measures to ensure that it complies with the representations it makes in its privacy policy, to avoid claims that its privacy policy is deceptive or misleading.

In light of the increased enforcement activity by the California Attorney General and the FTC, mobile app developers will want to ensure that their apps include a privacy policy, that the policy is conspicuously posted within the apps, and that it is followed in practice.

The Federal Trade Commission (FTC) recently reached an $800,000 settlement with the data broker Spokeo, Inc. (“Spokeo”).  The FTC’s complaint alleged violations not normally seen together:  First, that Spokeo distributed personal information for background checks by employers in ways that failed to comply with the Fair Credit Reporting Act (FCRA) and, second, that Spokeo’s employees posted Spokeo product endorsements without revealing their connection to the company, in violation of Section 5 of the FTC Act, which prohibits “unfair or deceptive acts or practices in or affecting commerce.”

The Alleged FCRA Violations

The FCRA imposes certain obligations on “consumer reporting agencies,” which are generally defined as businesses that assemble or evaluate certain information about a consumer and furnish it to third parties for their use in determining the consumer’s eligibility for credit, insurance, or employment.  The FCRA requires a consumer reporting agency to follow specified procedures to help protect consumers’ rights, including steps to ensure that each report it sells is used for a purpose specifically permitted by the law and that the information contained in the report is accurate.  The law also requires a consumer reporting agency to inform each recipient of a consumer report of its obligations under the Act, including that it notify a consumer in the event that it takes adverse action against him or her based on information in the report (such as a decision to deny him or her credit or not to hire him or her).

Spokeo collects personal information about consumers from hundreds of online and offline sources—including social networks and marketing databases—and combines this information to create profiles on those consumers.  Spokeo then sells access to these profiles.  The FTC alleged that, because Spokeo marketed the profiles to human resources departments and others for use in the hiring process, it was a consumer reporting agency subject to the FCRA.  According to the FTC, Spokeo did not, however, comply with the Act’s requirements.  Moreover, even though Spokeo had changed its website terms of service to state that it was not a consumer reporting agency and to prohibit clients from using its information for purposes protected by the FCRA, Spokeo did not actually enforce those terms, such as by revoking the access of companies that it knew—or should have known—were using its consumer reports for employment purposes.

Although this is the FTC’s first FCRA case involving the sale of data collected for employment purposes from social media and other online sources, it should not have come as a complete surprise, as this was not the first time that the agency had weighed in on the subject.  In May 2011, FTC staff wrote to a company described as “an Internet and social media background screening service used by employers in pre-employment background screening,” reminding it of the FCRA’s applicability.  Even in light of these FTC activities, however, businesses may not appreciate just how broad the law’s definition of a “consumer reporting agency” is.  Companies that compile or evaluate and then distribute consumer data should seek to determine whether they need to comply with the FCRA’s requirements.  Further, companies that receive consumer reports from consumer reporting agencies—whether to make employment decisions or otherwise—are also bound by certain obligations under the FCRA and, potentially, state laws.

The Allegedly Deceptive Endorsements

Just a few years ago, the FTC updated its Endorsement Guides (“Guides”) to address issues specific to social media marketing.  Although the Guides do not have the force of law, they provide marketers with guidance from the FTC on avoiding potentially deceptive practices under Section 5 of the FTC Act.  Even prior to this update, however, the Guides made clear that any connection between an endorser and the seller of the advertised product—such as an employment relationship—must be disclosed, as such a connection affects the weight that consumers give to the endorsement.  The message to companies:  Create and enforce a social media policy that requires your employees to disclose the fact of their employment when talking about your products or services.

Spokeo allegedly did just the opposite:  The FTC asserted in its complaint that Spokeo directed its employees to pose as ordinary consumers and post endorsements praising the company’s products.  What’s more, Spokeo managers actually reviewed the endorsements and supplied the accounts that were used to make them—all to give the public the misleading impression that Spokeo had numerous happy customers.  In the FTC’s view, this practice was deceptive because, had consumers known that the endorsements were posted by the seller’s own employees, they would have known that they should probably take the endorsements with a grain of salt.  In its settlement with the FTC, Spokeo agreed not only to comply with the Guides going forward but to also remove all of the fake endorsements already posted.

A myth has developed among many companies seeking to exploit social media that the old rules do not apply in this new age.  The Spokeo settlement is a stark reminder that the old rules do in fact apply, and that companies ignore those rules at their peril.