The Five Most Common Questions Clients Asked Us About Privacy Compliance in 2015

By Tanya Forsheit on December 23, 2015
Posted in Online Privacy

As we wrap up 2015, we thought it might be helpful to talk about some of the most common questions we received this year with respect to privacy compliance. Here is a sampling of the questions that topped the charts.

My company is self-certified under the Safe Harbor Framework. Now that the Court of Justice of the European Union (CJEU) has invalidated that framework in the October 2015 Schrems decision, how can I legally transfer data from the European Union to the United States?

Don’t panic. Let’s enjoy a nice German beer while we talk through this one. Although the practical implications of the CJEU’s judgment were (and still are) uncertain for companies that previously had availed themselves of the Safe Harbor Framework to transfer personal data from the EU, in the weeks following the decision a number of European data protection authorities (DPAs) and other regulators issued opinions and guidance.
Of particular note, on October 16, 2015, the Article 29 Working Party indicated that although transfers pursuant to the Safe Harbor Framework were no longer permitted, EU DPAs would hold off on coordinated enforcement through the end of January 2016 to give EU and U.S. authorities time to develop a replacement mechanism or alternative solution (often referred to as “Safe Harbor 2.0”).

In the interim, other data transfer mechanisms, such as binding corporate rules (BCRs) and standard contractual clauses (SCCs), offer potential alternatives for entities to consider. Although some German DPAs have called into question the validity of both BCRs and SCCs as data transfer mechanisms for sending EU personal data to the United States, and have indicated that they will not authorize BCRs or other data export contracts for transfers of personal data from Germany to the U.S., SCCs remain a valid and legal mechanism for data transfers. If you have not done so already, you should seriously consider entering into SCCs as appropriate (controller-to-controller for intracompany data transfers, or controller-to-processor for service provider arrangements). And let’s revisit this early next year. I will bring the French wine to that meeting. In the meantime, here is some additional information on what has happened in the aftermath of Schrems.

We are a major retailer. Our marketing group is launching a campaign in conjunction with a new microsite that will include text messaging regarding offers that may be of interest to our rewards members. Our rewards members all gave us their mobile phone numbers when they signed up for the rewards program over the past four years. Can we go ahead and start sending text messages?

Hold the phone. For text messaging campaigns going forward, it is likely not enough that a consumer voluntarily gave you his or her mobile phone number. In February 2012, the Federal Communications Commission (FCC) issued a Report and Order (the “February 2012 Order”), effective October 16, 2013, that altered the definition of “prior express written consent” for autodialed calls (which include text messages) under the Telephone Consumer Protection Act (TCPA) by adding new content requirements for obtaining such consent. Such consent now must include a “clear and conspicuous disclosure” that (1) the telemarketing calls (or texts) may be made using an autodialer or an artificial or prerecorded voice and (2) providing consent is not required to make a purchase.

In the February 2012 Order, the commission stated that once its new written consent rules became effective, companies could be liable for making autodialed or prerecorded voice telemarketing calls “absent prior written consent.” Based on that language, many companies considered the written consents they had obtained previously to be sufficient, even though those previously obtained consents did not contain the new disclosures as required by the February 2012 Order.

On July 10, 2015, the FCC released an Omnibus Declaratory Ruling and Order (the “July 2015 Order”), adopted on June 18, 2015, addressing requests for clarification regarding this requirement, among other issues. In the July 2015 Order, the FCC acknowledged that its “absent prior written consent” language “could have reasonably been interpreted to mean that written consent obtained prior to the current rule’s effective date would remain valid even if it does not satisfy the current rule.” However, the FCC has now made clear that prior written consents that did not include the new disclosures are not valid, and companies that had been relying on those old consents must obtain new consents that include the required disclosures.

It is true that some courts have continued to find that voluntary provision of a mobile number constitutes consent. See, e.g., Murphy v. DCI Biologicals Orlando, LLC, 797 F.3d 1302, 1306 (11th Cir. 2015) (“By voluntarily providing his cell phone number to [the defendant], [the plaintiff] gave his prior express consent to be contacted.”); Reardon v. Uber Technologies, Inc., No. 14-CV-05678-JST, 2015 WL 4451209, at *7 (N.D. Cal. July 19, 2015) (“The FCC’s most recent order, released on July 10, 2015, elucidated that there is not a specific method by which a caller must obtain prior express consent, only that the consent must be express and not implied or presumed. Express consent can be demonstrated when the called party gives her wireless number to the person initiating the phone call without instructions to the contrary.”).

But better to be safe than sorry. The FCC’s February 2012 and July 2015 Orders are clear that certain disclosures should be made in connection with the call to action, and best practice would also call for a double opt-in with very specific disclosures to confirm that the consumer wants to receive the messages. Given that the volume of TCPA class action litigation has skyrocketed over the past few years, with the potential for hundreds of millions of dollars in damages (at a rate of $500 to $1,500 per text) without any showing of injury, you would be well advised to prepare a more explicit opt-in before you start the campaign; the back-of-the-envelope math below shows how quickly those numbers add up. Let’s put together some language for your website sign-up and for the follow-up confirmation text message.
You can read more about the FCC’s July 2015 Order here.
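
For a sense of scale, here is a quick back-of-the-envelope calculation in Python. The campaign size is a made-up assumption for illustration; the per-text figures are the TCPA’s statutory damages of $500 per violation, up to $1,500 for willful or knowing violations.

    # Rough TCPA exposure estimate. The per-text damages figures come from the
    # statute; the campaign size below is a hypothetical for illustration only.
    STATUTORY_DAMAGES_PER_TEXT = 500   # dollars per violating text
    TREBLE_DAMAGES_PER_TEXT = 1500     # dollars per text for willful or knowing violations

    texts_without_valid_consent = 250000  # assumed number of texts sent (hypothetical)

    low = texts_without_valid_consent * STATUTORY_DAMAGES_PER_TEXT   # $125,000,000
    high = texts_without_valid_consent * TREBLE_DAMAGES_PER_TEXT     # $375,000,000

    print("Potential exposure: ${:,} to ${:,}".format(low, high))

Even a modest rewards list puts potential exposure squarely in the hundreds of millions, which is a big part of why the plaintiffs’ bar has embraced these cases.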

We are transferring our HR data to a third-party cloud platform. Is there anything in particular we need to check with the cloud provider with respect to privacy issues before we move the data over?

I am going to assume you already have a data security schedule that requires your cloud provider to implement and maintain appropriate and reasonable security controls, and that your information security team has fully vetted the platform and determined that it provides sufficient security for your purposes. Here is some additional information about what you should be discussing with your vendors with respect to data security. But let’s focus on privacy. Does your agreement restrict how your cloud provider can use the data? Does the provider want to be able to use information about how you use the service to help it improve the service or develop new products and services? Will that information be personally identifiable or de-identified and aggregated? At what level will it be aggregated? Are you really comfortable with the provider being able to use your data for purposes unrelated to providing you with the services, even if the data is de-identified? On a different note, will any data be transferred from overseas? And where are the data centers? If Europe is involved, have you entered into controller-to-processor standard contractual clauses with your provider? (See number 1 above.)

These are all important considerations from a business risk perspective, not to mention the potential legal liability, especially if you are in a highly regulated industry like financial services or healthcare and if the data of non-U.S. employees will be transferred to the U.S. or other jurisdictions. Let’s take a closer look at the draft agreement and work through these issues before it is finalized.
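
To make the de-identification and aggregation questions above a bit more concrete, here is a minimal sketch in Python of one control you might ask the provider to describe: reporting usage only as aggregate counts, with direct identifiers discarded and small groups suppressed. The field names and the minimum group size of 10 are assumptions for illustration, not features of any particular provider’s platform.

    from collections import defaultdict

    MIN_GROUP_SIZE = 10  # illustrative threshold; suppress any group smaller than this

    def aggregate_usage(events):
        """Roll raw usage events up into per-feature counts, discarding
        user-level identifiers and suppressing small groups."""
        counts = defaultdict(int)
        for event in events:
            # Keep only the non-identifying dimension being reported on;
            # user_id and email are deliberately dropped.
            counts[event["feature"]] += 1
        return {feature: n for feature, n in counts.items() if n >= MIN_GROUP_SIZE}

    # The provider's report would show {"benefits_portal": 12} and nothing about
    # the single employee who ran a salary export.
    sample = [{"user_id": i, "email": "u%d@example.com" % i, "feature": "benefits_portal"}
              for i in range(12)]
    sample.append({"user_id": 99, "email": "u99@example.com", "feature": "salary_export"})
    print(aggregate_usage(sample))

Whether that level of aggregation is acceptable for HR data is exactly the judgment call the questions above are meant to surface before the agreement is signed.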

Remember us, the ones planning that text message marketing campaign? It turns out that our privacy policy does not really address the fact that we are sharing some of our rewards members’ data with trusted third-party rewards partners that will send the customers their own offers. Also, we will be using a new analytics provider to engage in cross-device tracking. Do we need to do anything?

Yes. We need to revise your privacy policy to make sure these things are disclosed. Failure to do so could be construed as unfair or deceptive and get you in trouble with the Federal Trade Commission (FTC) or state attorneys general.

As for the data sharing with the trusted third party, California’s “Shine the Light” law, Civil Code section 1798.83, also comes into play if you are collecting personal information from California customers (and I bet you are). Under that law, you must either give those customers the choice to opt in to or opt out of such data sharing, or respond within 30 days to any request from a California customer asking you to identify the third parties with whom you shared his or her information for the third parties’ own direct marketing purposes during the past calendar year. What is your preference for how to address California law?

Cross-device tracking also raises a host of potential privacy implications, and you can read more about those here.

If you are already engaged in this sharing or in cross-device tracking, the changes that we make to your privacy policy could be considered to be material and retroactive. In those situations, the FTC takes the position that you must provide notice of these changes to the affected users and obtain their express affirmative consent to the changes. Let’s discuss how to go about doing that.

We are developing an amazing technology (app, platform, API, device) that can provide in-depth predictive analytics reports based on (fill in the blank) customer behavior. Are there any privacy issues that we need to consider in building the product and taking it to market?

I am so glad you called me while you are early in the development process. Yes, there are lots of things we can do to help mitigate privacy risk. Let’s talk about how to implement privacy by design with respect to this new product/service so that we consider privacy concerns right from the start. Now, I am going to ask you a lot of questions, but the answers to these will help us reduce the risk of violating laws or consumer trust and build toward best practices so that the company can be seen as a privacy champion and can use privacy as a competitive differentiator.

What data will the technology collect? Is it personally identifiable? That’s a loaded question, of course, because some regulators now consider things like IP addresses and unique device identifiers to be personally identifiable, so let’s think about those nuances and not assume we are OK because you are not going to be collecting Social Security numbers or driver’s license numbers. Can you minimize the amount of personal information you need to collect? Can you get rid of it after a certain period of time? What about de-identification or aggregation? Will you be sharing the reports with third parties or just with your customers? Will these reports be available to the end-user consumers or just corporate customers that collect the data and provide it to you? Will the process be transparent to the end user even if you are not consumer-facing? Do we need to address this in your customer contracts to make sure that your customers are providing appropriate notices and obtaining consents from their consumer end users where appropriate? In what jurisdictions will you be marketing the product/service?
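
To make a few of those questions concrete, here is a minimal sketch in Python of the kind of minimization you could build in from day one: collect only the fields the analytics actually need, replace raw device identifiers with a keyed hash, and truncate IP addresses. The field names, the key handling, and the 90-day retention figure are illustrative assumptions, not requirements drawn from any particular law.

    import hashlib
    import hmac

    SECRET_KEY = b"rotate-me-and-keep-me-in-a-secrets-manager"  # hypothetical key
    RETENTION_DAYS = 90          # illustrative window; a separate purge job (not shown) would enforce it
    FIELDS_NEEDED = {"event", "timestamp"}  # collect only what the reports actually use

    def pseudonymize_device_id(device_id):
        """Replace a raw device identifier with a keyed hash so behavior can be
        linked per device without storing the identifier itself."""
        return hmac.new(SECRET_KEY, device_id.encode(), hashlib.sha256).hexdigest()

    def truncate_ip(ip):
        """Zero out the last octet of an IPv4 address to reduce identifiability."""
        parts = ip.split(".")
        return ".".join(parts[:3] + ["0"]) if len(parts) == 4 else ""

    def minimize(raw_event):
        """Build the record that is actually retained."""
        kept = {k: v for k, v in raw_event.items() if k in FIELDS_NEEDED}
        kept["device"] = pseudonymize_device_id(raw_event["device_id"])
        kept["ip_prefix"] = truncate_ip(raw_event["ip"])
        return kept  # anything not needed (like the email below) never reaches storage

    print(minimize({"device_id": "A1B2-C3D4", "ip": "198.51.100.23",
                    "event": "report_viewed", "timestamp": "2015-12-23T10:00:00Z",
                    "email": "alice@example.com"}))

None of this is a substitute for appropriate notice and consent where those are required, but decisions like these are far easier to make now than to retrofit after launch.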

While we are working together to talk to the relevant stakeholders in your organization to get a better understanding of the product/service, I am going to recommend to you the FTC’s report on Data Brokers. It’s a 2014 report, but it remains highly relevant, and it will give you a sense of why the FTC is so interested in companies that collect large amounts of data and, in many cases, put that data to highly beneficial use for consumers in fraud prevention or relevant marketing. This can be done in a way that is both legally compliant and privacy-friendly.


