Can the mere offering of a mobile app subject the app’s provider to the privacy laws of countries in the European Union (EU)—even if the provider has no establishment or presence in the EU? According to the District Court of The Hague, the answer is yes. In a decision involving WhatsApp issued on November 22, 2016, the court confirmed that app providers are subject to the Dutch Privacy Act by virtue of merely offering an app that is available on the phones of users in the Netherlands, even if they have no establishment or employees there.

Context. EU privacy laws generally apply on the basis of two triggers: (i) if a company has a physical presence in the EU (in the form of an establishment or office or otherwise) and that physical presence is involved in the collection or other handling of personal information; or (ii) if a company doesn’t have a physical presence but makes use of equipment and means located in the EU to handle personal information.

On February 27, 2013, the Article 29 Working Party (a group comprising representatives of the data protection authorities of all EU Member States, referred to in this article as “WP29”) issued an Opinion on the privacy and data protection implications of the use of apps on mobile devices (“the Opinion”). The Opinion primarily targets app developers, but provides recommendations for all players in the app ecosystem, including operating system (“OS”) developers and device manufacturers, app owners, app stores, and other third parties such as analytics and advertising providers. The Opinion sets out “musts” and “recommendations” for each player and is to some extent consistent with the U.S. Federal Trade Commission’s staff report on Mobile Privacy Disclosures (February 2013).

The Opinion considers the key data protection risks of apps to be a lack of transparency and a lack of the ability to provide meaningful consent. The Opinion recognizes that the “real estate” on a mobile device is limited because of small screen sizes, but nonetheless states that users should be appropriately and adequately informed about how their personal information is used and that, where required, users’ consent should be obtained.

Scope and Applicable Law

The Opinion states that mobile apps are covered by both the Data Protection Directive 95/46/EC and the ePrivacy Directive 2002/58/EC (as amended by Directive 2009/136/EC). Most data processed via apps is personal data, including unique device identifiers, user IDs, browsing history, contact data, phone calls, SMS, pictures and videos. Any player in the app ecosystem, regardless of its location, must comply with EU/EEA laws if it targets individuals resident in the EU/EEA. However, non-EU/EEA controllers are exempt where data is processed only on the device itself, without generating traffic to the applicable data controller(s). Importantly, the user’s acceptance of terms of use or a contractual agreement will not exclude the applicability of EU law or the data controller’s or processor’s obligation to comply. Accordingly, app stores must “warn” app developers about EU/EEA obligations before apps are submitted and made available to EU/EEA residents, and must reject apps that do not comply.

Notice

The Opinion emphasizes the need to provide comprehensive, easy-to-understand, and timely notice. Notice must be provided “at the point when it matters to consumers, just prior to collection of such information by apps.” In practice, this will mean prior to installation of the app. This notice requirement not only applies to app developers but also to app stores and any OS or device manufacturers who provide pre-installed apps.

Notices should at least contain information about:

  • The entity that is legally responsible for the processing of the data, and how that entity can be contacted. Where there are multiple entities involved, apps should provide for a single point of contact.
  • The categories of personal information that will be processed through the app, in particular where such categories are not intuitively obvious.
  • The purposes for which information is processed. WP29 notes that such purposes should be described narrowly and specifically, and warns against “purpose-elasticity.”
  • Whether the information will be shared with third parties.

The Opinion further states that “the essential scope of the processing must be available to the users before app installation via the app store. The relevant information about the data processing must also be available from within the app, after installation.” In practice, this will mean that both app developers and app stores carry the biggest burden of ensuring adequate notification, both prior to and after installation. WP29 favors a layered approach that provides essential information in an initial notice and further information via (links to) a complete privacy policy. The Opinion also suggests the development of industry-wide visual logos, icons or images.

Consent

Consent is required for any processing of data via apps. The Opinion states that the two different consent regimes overlap: Under Article 5(3) of the ePrivacy Directive, consent is required to access or store any information on a user’s device; and under the Data Protection Directive, consent is required to process personal data. In practice, a single consent can be obtained for both types of processing. Consent should be “granular” and simply clicking an “install” button would not suffice. In order for consent to be valid, it needs to be freely given, specific and informed (which places additional importance on the quality and scope of the notice). Other legal bases may be used for processing at a later stage (during use of the app) but only by app developers.
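The “granular” consent requirement can be illustrated with a toy data model in which consent is recorded per processing purpose rather than inferred from a single “install” click. This is only an illustrative sketch; the class and method names are hypothetical and do not come from the Opinion.

```python
from dataclasses import dataclass, field


@dataclass
class ConsentRecord:
    """Per-purpose consent, recorded separately rather than bundled
    into a single install-time acceptance."""
    granted: dict[str, bool] = field(default_factory=dict)

    def grant(self, purpose: str) -> None:
        """Record the user's explicit consent to one specific purpose."""
        self.granted[purpose] = True

    def allows(self, purpose: str) -> bool:
        # Absent an explicit, specific grant, processing is not allowed.
        return self.granted.get(purpose, False)
```

The point of the design is that consenting to one purpose (e.g., analytics) says nothing about any other purpose (e.g., advertising), which must be asked about and granted separately.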

Children

The Opinion also calls for extra attention to applicable national age requirements. Many national privacy laws in EU Member States require parental consent for minors under certain ages. In addition, even when consent can be legally obtained from a minor and the app is intended to be used by a minor, developers should be particularly mindful of the minor’s potentially limited understanding of, and attention to, information about data processing. Developers and app stores should adapt their notices and data processing practices accordingly. WP29 further notes that children’s data should never, whether directly or indirectly, be used for behavioral advertising purposes, as this falls outside the scope of a child’s understanding.

Security and Retention

App developers should pay specific attention to the security of their apps, and implement security considerations at the design stage of the app. They should also carefully consider where data will be stored (locally on the device or remotely), and not use persistent (device-specific) identifiers, but instead use app-specific or temporary device identifiers to avoid tracking users over time.
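The recommendation to prefer app-specific identifiers over persistent device identifiers can be sketched as follows. This is a minimal, platform-agnostic illustration (not actual mobile-platform code); the function name and storage scheme are hypothetical.

```python
import json
import uuid
from pathlib import Path


def get_app_identifier(storage: Path) -> str:
    """Return an app-specific identifier, generating one on first use.

    Unlike a hardware serial number or other persistent device ID, this
    identifier is scoped to a single app install: deleting the storage
    file (e.g., on uninstall) severs any link to prior activity.
    """
    if storage.exists():
        return json.loads(storage.read_text())["app_id"]
    app_id = str(uuid.uuid4())  # random, not derived from the device
    storage.write_text(json.dumps({"app_id": app_id}))
    return app_id
```

Because the identifier is random and lives only in the app’s own storage, it is stable within one install but cannot be used to track the user across apps or across reinstalls.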

Also, app developers must consider appropriate retention periods for the personal information they collect, taking into account that users may lose their devices or switch devices. App developers are recommended to implement procedures that will treat accounts as expired after defined periods of inactivity.
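One simple way to operationalize the inactivity recommendation is to compare each account’s last-activity timestamp against a defined retention window. A minimal sketch, in which the one-year threshold and the field names are illustrative assumptions, not figures from the Opinion:

```python
from datetime import datetime, timedelta


def is_expired(last_active: datetime, now: datetime,
               inactivity_limit: timedelta = timedelta(days=365)) -> bool:
    """Treat an account as expired after a defined period of inactivity."""
    return now - last_active > inactivity_limit
```

A periodic job could apply this check to all accounts and erase (or anonymize) the data of those flagged as expired, which also covers users who have lost or switched devices and will never log in again.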

Different Players

Although the Opinion references app developers in many of its requirements and recommendations, WP29 acknowledges that responsibilities are shared between different players. The Opinion states that every app should provide a single point of contact for users, “taking responsibility for all the data processing that takes place via the app.” The Opinion provides the following recommendations:

  • App stores are usually data controllers, in particular when they facilitate upfront payments for apps, support in-app purchases and require user registration. App stores should: (i) collaborate with OS and device manufacturers in developing user control tools (such as symbols representing access to data) and display them in the app store; (ii) implement checks in their admissions policy to eliminate malicious apps before making them available in the store (and provide detailed information on such submission checks to users); (iii) implement a privacy-friendly remote uninstall mechanism based on notice and consent; (iv) collaborate with app developers to proactively inform users about data security breaches; and (v) consider the use of public reputation mechanisms whereby users rate apps not only on their popularity but also on privacy and security.
  • OS and device manufacturers are usually the data controllers (or joint controllers) when they process data for their own purposes, for example for smooth running of the device, security, back-ups or remote device location. OS and device manufacturers should: (i) develop technical mechanisms and interfaces that offer sufficient user control, in particular via built-in consent mechanisms at the first launch of the app or the first time an app attempts to access data that has a significant impact on privacy (this also applies to pre-installed apps); (ii) ensure that the app developer implements sufficiently granular control and can access only the data necessary for the functioning of the app; (iii) ensure that the user can block the access to the data and uninstall the app in a simple manner; (iv) implement mechanisms to inform users about what the app does, what data the app can access, and provide settings to change parameters of processing (OS and device manufacturers share this responsibility with app stores); (v) develop clear audit trails into the devices, such that end users can see which apps have been accessing which data on their devices; (vi) prevent covert monitoring of users and put in place a mechanism to avoid online tracking by advertisers and other third parties (in particular, default settings must be “such as to avoid any tracking”); and (vii) ensure security by strengthening authentication mechanisms, enabling strong encryption mechanisms, and providing security updates.
  • Ad providers, analytics providers and communications service providers act as data processors where they execute operations for app owners such as analytics, provided they do not process data for their own purposes or share data across app developers. In this context, such third parties have limited obligations, mostly related to data security. However, third parties are data controllers where they collect or share data across apps, provide additional services, or provide analytics figures “at a larger scale,” such as for app popularity and personalized recommendation. In such cases, third parties should: (i) obtain consent for behavioral or targeted advertising, as well as accessing or storing any information on the device; and (ii) apply security requirements, in particular secure data transmission and encrypted storage of unique device and app identifiers and other personal data. Where communications service providers issue branded devices, they must ensure that consent is obtained for any pre-installed apps. Ad providers must not deliver ads outside the context of the app, such as by delivering ads through modified browser settings or placing icons on the mobile desktop.  Advertisers should further refrain from using unique device or subscriber IDs for tracking purposes.

Europe is currently undergoing a significant reform of its privacy regime. Under the current European Union (EU) Privacy Directive, individuals already have broad rights curtailing companies’ ability to process their personal data. The proposed EU Privacy Regulation seeks to broaden these rights even further. In particular, the proposed “right to be forgotten” may ultimately impose substantial new burdens on companies, especially social media and Internet businesses.

European privacy laws restrict the information that companies can process regarding individuals, and grant to individuals several rights with respect to their personal data (e.g., access and correction rights). The current EU Privacy Directive came into force in 1995 and has continued to apply ever since with various updates in the intervening years. The Europeans, however, are currently discussing a proposed EU Privacy Regulation that would further strengthen the protection of personal data of individuals by, among other things, introducing new rights. Among the new rights being proposed is the “right to be forgotten.” Essentially, under this proposed new right, individuals would be able to request—under certain circumstances—that companies erase all information in their systems and databases regarding such individuals. Companies receiving such requests would be obligated to comply.

The right to request removal from a company’s records is not new. Under the current EU Privacy Directive, an individual can request that a company remove his or her data from its system under certain circumstances, for example, because there is no legal basis for the company having such data in the first place or because the individual no longer has a relationship with that company (e.g., if a customer switches mobile phone carriers). However, this current right of removal is not absolute and can take a backseat to other interests, such as a company’s duty to maintain books and records of its business.

The new right to be forgotten would strengthen and expand the current right of removal. In particular, the new right would require a company to not only erase the applicable information and cease any further dissemination of the information but also take all reasonable steps necessary to inform third parties to whom the company has made the data available and to request that such third parties also remove the data from their systems. In other words, the new right would require a complete cleanup of the data originating from the company. A phone company receiving the request would therefore have to not only remove the data from its systems, but also inform, for example, its collections agencies, advertising and marketing agencies and outsourcing providers (such as installation services companies) that the request was made and that they should also remove the applicable data from their systems (as currently drafted, the company would only have to pass along the request, and would not be required to verify compliance with such request by other companies).

The right to be forgotten has been conceived in particular to address social media companies and other online businesses. Regarding such providers, European legislators find it of paramount importance that individuals be able to control what information is online about them (even when they have put the information online themselves), especially with respect to minors under the age of 18. While the rationale for this approach may be understandable, as the right is currently drafted, a social media site that receives a request to be forgotten could be obligated to inform third parties about the request, including other users of the social media site, other social media sites to which the data has been linked (e.g., via Twitter feeds or integration), search engines and any other website that the social media site knows has received the data. Given its expansive scope as currently drafted, the right could create burdensome and costly compliance obligations for social media sites and other online services once the proposed EU Privacy Regulation is in force.

The proposed reform is currently being discussed in the European Parliament and is not expected to be finalized until 2014 at the earliest, after which there will be another two years before it would take effect. The proposals in the Regulation may still change pending ongoing debate, although it is expected that many of the new rights and requirements, including the right to be forgotten, will be maintained in some form.