For many companies, the main question about cloud computing is no longer whether to move their data to the “cloud,” but how they can accomplish this transition. Cloud (or Internet-based on-demand) computing involves a shift away from reliance on a company’s own local computing resources, in favor of greater reliance on shared servers and data centers. Well-known examples of cloud computing services include Google Apps, Salesforce.com, and Amazon Web Services. In principle, a company also may maintain its own internal “private cloud” without using a third-party provider. Since many companies choose to use third-party cloud providers, however, this article will focus on that cloud computing model.

Cloud computing offerings range from the provision of IT infrastructure alone (servers, storage, and bandwidth) to the provision of complete software-enabled solutions. Cloud computing can offer significant advantages in cost, efficiency, and accessibility of data. The pooling and harnessing of processing power provides companies with flexible and cost-efficient IT systems. At the same time, however, cloud computing arrangements tend to reduce a company’s direct control over the location, transfer, and handling of its data.

The flexibility and easy flow of data that characterize the cloud can raise challenging issues related to protection of data in the cloud. A company’s legal obligations and risks will be shaped by the nature of the data to be moved to the cloud, whether the data involve personal information, trade secret information, customer data, or other competitively sensitive information. This article describes the special legal considerations that apply when moving personal information to the cloud. It also offers a framework to help companies navigate these issues to arrive at a solution that meets their own legal and business needs.

Determine the categories of personal information to be moved to the cloud

As a general principle, personal information includes any information that identifies or can be associated with a specific individual. Some types of personal information involve much greater legal and business risks than other types of personal information. For example, a database containing health information will involve greater risks than a database containing names and business contact information of prospective business leads. Also, financial regulators in many countries require specific security standards for financial information. Accordingly, a cloud computing service that may be sufficient for the business lead data may fail to provide the legally required level of protection for health, financial, or other sensitive types of information.

A company will want to develop a strategy that provides sufficient protection to the most sensitive personal information to be transmitted to the cloud. In some cases, a company may elect to maintain certain types of personal information internally, in order to take advantage of more cost-efficient cloud computing services for its less-sensitive data.

Identify applicable laws affecting your outsourcing of personal information

Cloud computing, by its nature, can implicate a variety of laws, including privacy laws, data security and breach notification laws, and laws limiting cross-border transfers of personal information.

(a) Privacy Laws

Companies operating in the United States will need to consider whether they are subject to sector-specific privacy laws or regulations, such as the Gramm-Leach-Bliley Act (GLBA) or the Health Insurance Portability and Accountability Act (HIPAA). Such laws impose detailed privacy and data security obligations, and may require more specialized cloud-based offerings.

Europe-based companies, as well as companies working with providers in or with infrastructure in Europe, will need to account for the broad-reaching requirements under local omnibus data protection laws that protect all personal information, even basic details like business contact information. These requirements can include notifying employees, customers, or other individuals about the outsourcing and processing of their data; obligations to consult with works councils before outsourcing employee data; and registering with local data protection authorities. Similar requirements arise under data protection laws of many other countries, including countries throughout Europe, Asia, the Middle East, and the Americas.

(b) Data Security Requirements

Even if a company is not subject to these types of privacy laws, it will want to ensure safeguards for personal information covered by data security and breach notification laws. In the United States, these laws tend to focus on personal information such as social security numbers, driver’s license numbers, and credit or debit card or financial account numbers. Encryption is a key safeguard, because many (although not all) of the U.S. state breach notification laws provide an exception for encrypted data.

In contrast, many other countries require protection of all personal information, and do not necessarily provide an exception for encrypted data. Consequently, companies operating outside of the United States may have broader-reaching obligations to protect all personal information. While data protection obligations vary significantly from law to law, both U.S. and international privacy laws commonly require the following types of safeguards:

i. Conducting appropriate due diligence on providers;

ii. Restricting access, use, and disclosure of personal information;

iii. Establishing technical, organizational, and administrative safeguards;

iv. Executing legally sufficient contracts with providers; and

v. Notifying affected individuals (and potentially regulators) of a security breach compromising personal information.

The topic of data security in the cloud has received significant industry attention. Industry groups, such as the Cloud Security Alliance, have suggested voluntary guidelines for improving data security in the cloud. For example, please refer to the CSA’s Security Guidance for Critical Areas of Focus in Cloud Computing, available at https://cloudsecurityalliance.org/download/security-guidance-for-critical-areas-of-focus-in-cloud-computing-v3/. In Europe, the Cloud Select Industry Group (CSIG), an industry group sponsored by the European Commission, recently issued the Cloud Service Level Agreement Standardization Guidelines, available at http://ec.europa.eu/digital-agenda/en/news/cloud-service-level-agreement-standardisation-guidelines. The Guidelines recommend contractual stipulations covering (1) business continuity, disaster recovery, and data loss prevention controls; (2) authentication/authorization controls, including access provision/revocation, and access storage protection; (3) encryption controls; (4) security incident management and reporting controls and metrics; (5) logging and monitoring parameters and log retention periods; (6) auditing and security certification; (7) vulnerability management metrics; and (8) security governance metrics. Providers also may choose to be certified under standards such as ISO 27001, although such certifications may not address all applicable legal requirements.

(c) Restrictions on Cross-Border Data Transfers

A number of countries—e.g., all the European Economic Area (EEA) Member States and certain neighboring countries (including Albania, the Channel Islands, Croatia, the Faroe Islands, the Isle of Man, Macedonia, Russia, and Switzerland), as well as countries in North Africa (e.g., Morocco), the Middle East (e.g., Israel), Latin America (e.g., Argentina and Uruguay), and Asia (e.g., South Korea)—restrict the transfer or sharing of personal information beyond their borders. These restrictions can present significant challenges for multinational companies seeking to move their data to the cloud. Recognizing these challenges, some providers are starting to offer geographic-specific clouds, in which the data are maintained within a given country or jurisdiction. Some U.S. providers have also certified to the U.S.-European Union Safe Harbor program, in order to accommodate EU-based customers. However, as the Safe Harbor only permits transfers from the EU to the United States, it is not a global solution. Accordingly, a company should assess carefully whether the options offered by a provider are sufficient to meet the company’s own legal obligations in the countries where it operates.

To complicate matters, international data protection authorities, particularly in the EEA, have expressed concerns about use of the cloud model for personal information. The Article 29 Working Party (WP29), the assembly of EEA data protection authorities, and many local EEA authorities have issued guidance on cloud computing, covering purpose and transfer restrictions, notification requirements, mandatory security requirements, and the content of the contract to be concluded with cloud providers. This guidance includes WP29 Opinion 05/2012 on Cloud Computing, which is discussed further below. The draft Data Protection Regulation currently under discussion among the EEA Member States reflects this guidance and should be accounted for before engaging cloud providers.

Review contractual obligations affecting your outsourcing of personal information

If your company is seeking to outsource to a cloud provider applications that involve third-party data, such as personal information maintained on behalf of customers or business partners, it is important to consider any limitations imposed by contracts with those third parties. Such agreements might require third-party consent to the outsourcing or subcontracting of data processing activities, or may require your company to impose specific contractual obligations on the new provider or subcontractor.

Select an appropriate cloud computing solution

Cloud services tend to be offered on a take-it-or-leave-it basis, with little opportunity to negotiate additional contractual protections or customized terms of service. As a result, companies may find themselves unable to negotiate the types of privacy and data security protections that they typically include in contracts with other service providers. Companies will need to evaluate whether the contract fulfills their applicable legal and contractual obligations, as discussed above. Beyond that, companies will want to evaluate the practical level of risk to their data, and what steps they might take to reduce those risks.

(a) Public vs. Private Cloud

Broadly speaking, a private cloud maintains the data on equipment that is owned, leased, or otherwise controlled by the provider. Private cloud models can be compared with many other well-established forms of IT outsourcing and do not tend to raise the same level of concerns as a public cloud model.

A public cloud model disperses data more broadly across computers and networks of unrelated third parties, which might include business competitors or individual consumers. While offering maximum flexibility and expansion capabilities, the public cloud model raises heightened concerns about the inability to know who holds your company’s data, the lack of oversight over those parties, and the absence of standardized data security practices on the hosting equipment. Given these challenges, companies outsourcing personal information will want to understand whether the proposed service involves a private or public cloud, as well as evaluate what contractual commitments the provider is willing to make about data security.

(b) Securing Data Before Transmission to the Cloud

Companies also may be able to take measures themselves to protect personal information before it is transmitted to the cloud. Some provider agreements instruct or require customers to encrypt their data before uploading the data to the cloud, for example. If it is feasible to encrypt the data prior to transmission to the provider, this may provide substantial additional protections, as long as the encryption keys are not available to the provider.
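
To make the idea concrete, the sketch below shows what client-side encryption might look like in Python using the third-party `cryptography` package (an assumption for illustration; any vetted encryption library would serve). The point is that the key is generated and kept locally, so the provider only ever receives ciphertext; the upload step is a hypothetical placeholder, not a real provider API.

```python
# Sketch: encrypt personal information locally before sending it to a
# cloud provider. Requires the "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# Generate and retain the key locally -- it is never shared with the provider.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"name=Jane Doe; ssn=000-00-0000"   # sensitive personal information
ciphertext = cipher.encrypt(record)          # what the provider would receive

# upload_to_cloud(ciphertext)                # hypothetical upload step:
                                             # provider stores ciphertext only

# After retrieval, only the key holder can recover the plaintext.
assert cipher.decrypt(ciphertext) == record
```

Because the provider never holds the key, even a breach of the provider’s systems would expose only ciphertext, which is why some breach notification laws treat properly encrypted data differently.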

It is also important to account for applicable security requirements. Several countries in Europe impose very specific statutory requirements for security measures, and some regulators have issued detailed security standards for cloud computing providers. Pursuant to WP29 Opinion 05/2012, all contracts should include security measures in accordance with EU data protection laws, including requirements for cloud providers on technical and organizational security measures, access controls, disclosure of data to third parties, cooperation with the cloud client, details on cross-border transfers of data, logging, and auditing of processing. The recent CSIG guidelines recommend including the following provisions in processing agreements: (1) the standards or certification mechanisms with which the cloud service provider complies; (2) a precise description of the purposes of processing; (3) clear provisions regarding retention and erasure of data; (4) reference to instances of disclosure of personal data to law enforcement and notification to the customer of such disclosures; (5) a full list of subcontractors involved in the processing and a right of the customer to object to changes to the list, with special attention to requirements for processing of special or sensitive data; (6) a description of the data breach policies implemented by the cloud service provider, including documentation suitable to demonstrate compliance with legal requirements; (7) a clear description of the geographical locations where personal data are stored or processed, so that appropriate cross-border transfer mechanisms can be implemented; and (8) the time period within which the cloud service provider must respond to access, rectification, erasure, blocking, or objection requests from data subjects.

(c) Contract Issues

In the majority of cloud computing services, the client is the data controller and the cloud provider is the data processor. However, in certain scenarios (in particular Platform as a Service (PaaS) and Software as a Service (SaaS) in public computing models), the client and the cloud provider may be joint controllers. Under EU guidance, the responsibilities of joint controllers must be very clearly set out in the contract to avoid any “dilution” of legal responsibility.

The contract with the cloud services provider needs to set out clearly the roles and responsibilities of the parties. Unlike many outsourcing arrangements, cloud service contracts usually do not distinguish between personal information and other types of data. These contracts may still include at least basic data protection concepts, even if they are not expressly identified as such. At a minimum, companies will want to look for provisions preventing the provider from using the information for its own purposes, restricting the provider from sharing the information except in narrowly specified cases, and confirming appropriate data security and breach notification measures. Various European data protection authorities have underscored that access to cloud data by public authorities must comply with national data protection law, that the contract should require notification of any such requests unless prohibited under criminal law, and that the contract should prohibit any non-mandatory sharing. Given the difficulty of negotiating special arrangements with cloud providers, it is important to select a cloud offering that is appropriately tailored to the nature of the data and the related legal obligations. As cloud computing matures, more offerings tailored to specific business requirements, including compliance with privacy and similar laws, are likely to become available to companies.

Concluding thoughts

While cloud computing can substantially improve the efficiency of IT solutions, particularly for small and medium-sized businesses, the specific offerings need to be examined closely. There is no “one-size-fits-all” solution to cloud computing, especially for companies operating in highly regulated sectors or internationally. By understanding their legal compliance obligations, companies can make informed decisions in selecting cloud computing services or suites of services that best meet their needs.

From our sister blog, MoFo Tech:

Within a decade, analysts say, the “Internet of Things” will have transformed our lives. Billions of Internet-connected devices will monitor our homes, businesses, cars, and even our bodies, using the data to manage everything from appliances to heart monitors. Companies like Google—which recently paid $3.2 billion for smart-thermostat company Nest Labs—are already racing to build the IoT. But businesses face fundamental questions regarding the ownership of data, protecting customer privacy, liability when devices fail, and more.

The IoT will connect product developers and manufacturers in countless new ways, creating uncertainty about ownership of and rights to customer data. If a company contracts with a big data vendor to store and process consumer information, for instance, each party will need to know that its partner has the legal rights to collect or share data, says Alistair Maughan, a partner in Morrison & Foerster’s London office who is co-chair of the Technology Transactions Group. Then there is the question of who owns the data. “There is a whole supply chain the law is only beginning to grapple with,” Maughan says. “Manufacturers will need to understand the risks when there aren’t clear government standards.”

An area of major interest is how companies will protect customer privacy when so much data is in play. Companies need to make sure that what they say about their use of data collected from connected devices is accurate, complete, and up to date. “There is no one-size-fits-all approach to data security,” says Morrison & Foerster partner D. Reed Freeman Jr., who specializes in privacy matters. “The burden for a company is to consider what kind of data you have and how to protect against reasonably foreseeable, unauthorized access to personal information.”

Liability, of course, is a paramount concern when connected businesses adjust their use of data for new business or consumer products, says Stephanie Sharron, a Morrison & Foerster partner and a member of the firm’s Technology Transactions Group. Using vast sets of data to find patterns and targets will leave open all sorts of possibilities for technical and human mistakes. “There are questions about who should bear responsibility for inaccurate inferences or patterns that give rise to harm,” Sharron says. “Or who is responsible if a pattern comes from inaccurate data from a malfunctioning sensor.”

Then there is the question of who will manage and monitor the electrical systems needed to operate such vast networks—traditional public utility companies, new electricity market participants, or a combination. “Customers will want more choices to accommodate the new technologies and services they get to use” in the IoT, says Robert S. Fleishman, senior of counsel for Morrison & Foerster and an expert on energy regulation law. “Generally it will be up to state public utility commissions to decide who gets to provide the traffic control function and related activities for these things to operate within the system for distributing energy.” Some state utility commissions have already started to look at reforming their regulations and policies.

  • Doctor in the mouse. What if you could input a list of your current symptoms to Google, and quickly be connected with a doctor for a brief consultation? For a limited trial period, Google seems to have set up such a system for people who are looking for medical advice online. A lot of the details aren’t known yet, but a Google spokesperson told a Gizmodo reporter, “When you’re searching for basic health information — from conditions like insomnia or food poisoning — our goal is provide you with the most helpful information available.” The feature is part of Google’s Helpouts video-chat service.
  • Just shoot me. Data mining has reached the world of selfies. Social media users may not know this, but unless they have marked their photos posted on social media sites as private, the photos can be analyzed in bulk by third parties and used for marketing purposes. Privacy advocates say people should assume that their photos, unless clearly marked as private, are being scanned by market researchers. The rules and regulations applicable to this practice, including the privacy policies of the relevant social media platforms, are not always clear. So if you’ve posted a photo of yourself wearing a particular brand of ski gear on the mountain, some company may be making marketing decisions based on your photo and thousands of others. Soon, it may be targeting ads to you on that basis as well.
  • Mere threats? In 2010, Anthony Elonis, a man from western Pennsylvania, made a series of rants on Facebook in the form of rap lyrics that threatened to kill his wife, an FBI agent, and children in a kindergarten class. He claimed that he never intended to kill anyone and that he was merely venting. He also claimed that his comments were protected by the First Amendment. Elonis was nonetheless charged and convicted under a federal threat statute and sentenced to 44 months in prison. The U.S. Supreme Court will hear his appeal in December. The case raises important issues, including whether statements on social media should be treated differently from statements made on the phone or in person. Elonis wrote to the Court, for example, “Modern media allow personal reflections intended for a small audience (or no audience) to be viewed widely by people who are unfamiliar with the context in which the statements were made and thus who may interpret the statements much differently than the speakers intended.”
  • Buy local. Facebook has just announced that it’s going to provide hyper-local advertising services for merchants who want to reach consumers in very specific geographic areas. This new feature reportedly will allow a business to target just those consumers who are within a mile of that business’s physical location. Facebook is able to roll out this new service because so many of Facebook’s one billion plus mobile users permit Facebook to collect their location information, or otherwise provide Facebook with the data needed to allow hyper-local ads. This new feature should launch in the United States in just a few weeks.
  • Psst – wanna know a secret? Secret is a hot new social network designed to permit people to share their secrets online in a completely anonymous setting, without letting anyone know who has made the post. But how secure is it actually? According to a Wired article, not very secure. “White hat” hackers – those who try to find the vulnerabilities of a network without doing harm – have repeatedly found out people’s supposed secrets by using basic hacking techniques. The best-known hack works only one way; the hacker can find a person’s secret if the hacker knows the person’s e-mail address, but can’t tie a posted secret to any particular individual. The Wired article raises an interesting question as to whether any app or platform can be truly social and truly secret at the same time.
  • Nyet. The U.S. Court of Appeals for the Second Circuit recently rejected an effort by prosecutors to use a profile page from a popular Russian social media platform, Vk.com, to link a defendant with the sending of an allegedly fake birth certificate from a particular e-mail address. The Vk.com profile page at issue included a photograph of the defendant and the name “azmadeuz,” which was part of the e-mail address in question. The trial court had admitted the page into evidence, but the Second Circuit reversed, finding that, although it doesn’t take much to authenticate evidence, the page at issue could not be authenticated. In particular, the Second Circuit found that there could be no “reasonable conclusion” that the page belonged to the defendant and wasn’t bogus in some way. The truly interesting question is whether there should be a higher standard for authenticating social media and other Internet-based evidence; the Second Circuit, however, declined the opportunity to set such a standard, holding instead that the focus should remain on the specific facts surrounding the particular item of evidence to be authenticated.

Not to be outdone by Florida, California has yet again amended its data security breach law and again in groundbreaking (yet confusing) fashion. On September 30, 2014, California Governor Brown signed into law a bill (“AB 1710”) that appears to impose the country’s first requirement to provide free identity theft protection services to consumers in connection with certain data security breaches. The law also amends the state’s personal information safeguards law and Social Security number (“SSN”) law. The amendments will become effective on January 1, 2015.

Free Identity Theft Protection Services Required for Certain Breaches

Most significantly, AB 1710 appears to amend the California breach law to require that a company offer a California resident “appropriate identity theft prevention and mitigation” services, at no cost, if a breach involves that individual’s name in combination with his or her SSN, driver’s license number, or California identification card number. Specifically, AB 1710 provides, in pertinent part, that if a company providing notice of such a breach was “the source of the breach”:

an offer to provide appropriate identity theft prevention and mitigation services, if any, shall be provided at no cost to the affected person for not less than 12 months, along with all information necessary to take advantage of the offer to any person whose information was or may have been breached.

The drafting of this requirement is far from clear and is open to multiple readings. In particular, the phrase “if any” can be read in more than one way. It could modify “appropriate identity theft prevention and mitigation services,” in which case the law would impose an obligation to provide free identity theft protection services whenever any such services are appropriate. Alternatively, “if any” could modify the “offer” itself, in which case the law would provide that if a company chooses to offer identity theft protection services, those services must be at no cost to the consumer. It is difficult to predict how the California Attorney General (“AG”) or the California courts will resolve this ambiguity. One thing is clear: until the AG or the courts opine, the standard will remain unsettled.

The drafting of the requirement is unclear in other ways as well. For example, the statute does not specify what types of services would qualify as “appropriate identity theft prevention and mitigation services.” Would a credit monitoring product alone be sufficient to meet the requirement, or would the law require something in addition to credit monitoring, such as an identity theft insurance element?

That said, state AGs historically have encouraged companies to provide free credit monitoring to consumers following breaches. In addition, even though not legally required, free credit monitoring has become a common practice, particularly for breaches involving SSNs and, increasingly, for high-profile breaches. California, however, appears to be the first state to legally require that companies offer some type of free identity theft protection service for certain breaches.

AB 1710 is particularly notable in its approach. First, the offer of free identity theft protection services will be required only for breaches involving SSNs, driver’s license numbers, or California identification card numbers; it will not be required for breaches involving other types of covered personal information, such as payment card information or usernames and passwords. This approach endorses a position that many companies have long held: that credit monitoring is appropriate only when the breach creates an actual risk of new account identity theft (as opposed to fraud on existing accounts). In addition, the offer of free identity theft protection services will be required only for a period of one year (as opposed to, for example, two years). The length of the offer of free credit monitoring has always been a subject of debate, and California has now endorsed the position that a one-year offer is sufficient.

  • Yik yuck. As we’ve discussed on this blog, secrecy is all the rage these days in the online world. Yik Yak – a particularly edgy social media app that seeks to preserve user anonymity – is sweeping the country, or at least the nation’s college campuses. With users’ identities concealed, the app has reportedly become a popular means for communicating deeply offensive remarks and even threats of violence. At one school, Colgate University, students launched a sit-in to protest against the ugliness they found on the app. And at the University of Tennessee, Dean of Students Melissa Shivers recently sent an email to students warning about the app and emphasizing the importance of civility on campus. With the growing popularity of anonymous social media platforms such as Yik Yak, expect to see increased tensions between anonymous speech rights and efforts to limit hateful or violent speech.
  • Listening in. For some time, many pharmaceutical companies have reportedly “listened in” on patients’ social media conversations to obtain a sense of how their products were actually being used, and data-packaging and data-mining companies have sprung up to help pharma firms get a handle on the discussions on Facebook, Twitter and other social media platforms. Now, investors too are starting to jump into this emerging field of “social listening” to get ideas about where to put their money. An Israeli company, Treato, is actively courting fund managers with its pitch that it can tap into these aggregate conversations. Even though no actual patient names are apparently used in these reports, a lot of people are raising privacy concerns. Big Data, meet Big Privacy – should be an interesting battle to watch.
  • Legally social. The Pennsylvania Bar Association just issued a formal ethics opinion on the use of social media by attorneys. Among the key provisions are ones requiring lawyers to have a basic working knowledge of social media as part of their competence in the law, prohibiting lawyers from disclosing confidential client information in response to negative online reviews, and permitting attorneys to access the public portion of jurors’ social media profiles while prohibiting efforts to access private information from such profiles. The formal opinion is one of the most comprehensive on the subject in any state, and is recommended reading for attorneys, regardless of their practice focus.

Hooray for Hollywood. According to a new study by KPMG, television and movie viewers have never had it better. A report by the consulting company found that the overwhelming majority of well-known movies and television shows are available legally to U.S. viewers through online services such as Amazon Prime, Netflix and Hulu. The study found that fully 94 percent of popular titles were legally available through one channel or another. The study also pointed out that these legal services provide high-quality viewing, which is generally not true of illegal services. Maybe it is possible to compete with free? After all, sellers of bottled water have done well vis-à-vis tap water by offering products that are high-quality, reasonably priced, convenient and ubiquitous.

There’s an app for that too. Entertainers, celebrities and others who are popular with the teen-age and young-adult demographics are increasingly choosing to promote themselves on mobile app-based social networks, or so-called “chat apps,” rather than through Facebook or Twitter, where it can be hard to stand out amidst the sheer volume of posts. These new chat apps, including Line, Kik, Snapchat, WeChat and Viber, allow for more direct engagement with followers. For example, when Paul McCartney and his band recently headed to Japan, he used Line to interact with fans, personally responding to inquiries and offering a free pack of stickers featuring cartooned images of himself.

Hipster’s paradise. Here at Socially Aware, we’ve taken a keen interest in Ello, the incredibly hip new social media platform that is generating a big buzz in the tech community. Indeed, some have dubbed Ello the “anti-Facebook,” because it does not sell ads based on user data, does not require users to use their real names and has a business model that relies on users to pay for premium features that they select. It is also invitation-only, further sparking interest in the new platform. Ello’s founders say they are aiming the network at artists, designers and programmers – not at the whole universe.  They also report that Ello is doubling in size every three or four days. Will Ello become the next big thing? Will consumer concerns regarding online privacy fuel the growth of alternative platforms such as Ello and, in the search space, DuckDuckGo, services that purport to provide greater privacy protections for users?

  • Is Tumblr trendier? A survey released by Tumblr says the users of that social media platform have higher average incomes than users of Facebook, Twitter, or Pinterest, and a report from Adobe says that this translates into cash: The average revenue per visit from a Tumblr referral is $2.57 on tablets and 67 cents on smartphones. Both figures are higher than the numbers for Facebook, Twitter, or Pinterest. According to an Adobe digital analyst, “the fact that [Tumblr] produces the highest revenue per visit from mobile devices is likely due to its user base, which is skewed to young, trendy and well-educated urbanites with a greater affinity for online purchases and the disposable income to spend more.”
  • Come together. At a tech conference in San Francisco on September 15, Facebook announced that it, along with Google, Twitter, Square Inc., and other companies, is launching an initiative to jointly develop software programs that can be shared for free. This move has a great deal in common with Facebook’s strategy of offering its technology, including hardware technology, to other companies in an effort to reduce the costs of development and broaden Internet use. Facebook’s Open Compute concept, unveiled in 2011, already permits it to share the designs for more efficient products such as servers and network switches.
  • Expert needed? How esoteric are forensic methods of technologically linking a person to an online video in a criminal case? A New Jersey man was on trial for invading his ex-girlfriend’s privacy by posting nude pictures of her on Twitvid.com, a video-sharing service. Through several steps, a police detective was allegedly able to tie the uploading of the photos to a particular IP address that was linked to the defendant. The defense objected on the grounds that these technical aspects were not fully understandable to the average juror and required an expert witness to present them. The trial judge, as well as a New Jersey state appeals panel, agreed that a hearing was necessary to consider the nature and extent of the detective’s evidence and whether she was qualified to testify about it or whether an expert was needed.
  • School discipline. The California legislature has passed a law that, if signed by Gov. Jerry Brown (or not vetoed by him before the end of September), would significantly expand privacy protections for students from kindergarten through high school. In particular, among other things, the law would bar education technology companies used by K-12 schools from knowingly engaging in targeted advertising to students or their parents and guardians; using certain student-related information to create a profile regarding a K-12 student; or selling or otherwise disclosing such student-related information. Will other states follow California’s lead?
  • Taxi wars. The upstart P2P ride-sharing service Uber and its allies – including the D.C.-based trade group the Internet Association – have begun a public relations campaign to “brand” traditional taxicabs in a negative light and to enhance the public image of ride-sharing apps. Their online campaign, known as “Taxi Facts,” refers to “Big Taxi” as if it were “Big Oil” or “Big Steel,” and states the public deserves to know the truth about the industry. Not to be outdone, the traditional taxi industry has launched a campaign that refers to the new entrants as simply “unregulated taxicabs.”
  • This Bud’s for you. Anheuser-Busch and Facebook have teamed up on a new promotion in which people will be able to go onto the social network and buy their friends beers for their birthdays, to be redeemed at a nearby bar or restaurant. The giver simply enters credit card information, and the recipient redeems an online voucher – as long as he or she is of legal age to drink. “The program was born of A-B’s desire to remain relevant with millennial consumers of legal drinking age – and strengthen our position as the perfect beer for connecting with friends around any occasion,” said Anheuser-Busch’s VP of consumer connections.
  • Blind spots. Self-driving cars are an excellent example of innovation, and the ones with Google technology have already traveled more than 700,000 miles. But what if a self-driving car doesn’t “see” a new traffic light or a previously nonexistent traffic sign? This could result in traffic citations, or worse. But Google says it’s taking steps towards eliminating this type of problem and that the future of self-driving cars is essentially unlimited.
  • Getting personal. As the name suggests, Michigan’s Video Rental Privacy Act limits the ability of companies to disclose information regarding customers’ video rental activities. But does the law cover magazines as well as videos? In a case filed by a consumer who alleged that a magazine company had improperly disclosed her personal information, along with information about the magazines to which she subscribed, the U.S. District Court for the Eastern District of Michigan recently held that the law does in fact apply to magazines. The court noted that the statute is directed to companies “engaged in the business of selling at retail, renting, or lending books or other written materials, sound recordings, or video recordings,” and that magazines constitute “other written materials.”
  • Geotargeting crime. In a new effort to use technology to foil credit-card fraud, a company called BillGuard is testing a system that monitors the precise whereabouts of mobile devices to detect possible payment fraud, aiming to stay one step ahead of fraudsters. Because smartphones are almost always near their owners, the technology would register and flag those occasions when a phone is not near the owner’s credit card. The technology would only be used with the consumer’s consent.
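The basic check described in the BillGuard item, flagging a charge when the cardholder’s phone is far from the point of sale, can be sketched in a few lines. This is a minimal illustration only; the function names, distance threshold, and coordinates below are our own assumptions, not details of BillGuard’s actual system:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two latitude/longitude points, in km.
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_transaction(phone_loc, transaction_loc, threshold_km=5.0):
    """Flag a card transaction when the owner's phone is far from it.

    threshold_km is a hypothetical tuning parameter, not a real system value.
    """
    distance = haversine_km(*phone_loc, *transaction_loc)
    return distance > threshold_km

# Phone in midtown Manhattan, card swiped in Los Angeles: flagged.
print(flag_transaction((40.75, -73.99), (34.05, -118.24)))  # True
```

A real deployment would of course need consent handling, location-accuracy margins, and fallbacks for phones that are off or out of coverage, which is precisely where the privacy questions raised in the item come in.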