Rajeev Chand 

Wow, what a great room. Good afternoon, everyone. Welcome to the Chief Privacy Officer Supersession at CES 2020. We are extremely honored to have our four speakers here. As introduction: my name is Rajeev Chand, and I serve as head of research for Wing Venture Capital. Immediately to my left is Erin Egan, who serves as Vice President of Public Policy and Chief Privacy Officer for Facebook. Immediately to her left is Jane Horvath, who serves as Senior Director of Global Privacy at Apple. Immediately to her left is Susan Shook, who serves as Global Privacy Officer at Procter and Gamble. And immediately to her left is Rebecca Slaughter, Commissioner of the FTC. Everybody, please join me in welcoming Erin, Jane, Susan, and the Commissioner to CES. So we've got a lot of ground to cover, and I'm going to go through a bunch of questions. I'm also going to leave time at the end, the last 20 minutes or so, for Q&A from the group directly. Commissioner, maybe I'll start with you, because the topic of the session is: what do consumers want? And maybe I'll start with a question that I got from a reporter last week, rephrased if you would. The question was: given the high number of high-profile security and privacy incidents even last year, in 2019, do you believe that the consumer tech industry is doing enough to address privacy issues?

 
Rebecca Slaughter 

Well, thank you. Thank you for the question, and thank you all so much for having me here. Before I start talking, I am working really hard on remembering to give the disclaimer that I am here speaking for myself and not for the FTC generally, or my fellow Commissioners. So with that disclaimer out of the way: listen, given the fact that almost every day, when we read the newspaper, we see different concerning stories about privacy or security breaches, it would be impossible to conclude that enough is being done. It wouldn't make a lot of sense to draw that conclusion. The question is, what needs to be done differently? What needs to be done better? And I don't think that's something we can generalize about, because it's a question that is very company-specific, or even industry-specific and practice-specific. There are different things that need to be done to address security issues, to address privacy issues, and to protect consumers, not just from harms as we know about them today but from harms that can flow downstream from practices that are concerning today.

 
Rajeev Chand 

So maybe, Jane, let me turn to you next. And just as background, Jane has the tail end of laryngitis, so if you're not able to hear her perfectly, she'll whisper (inaudible) and we will relay to everyone else. So Jane, maybe even backing up one more step: how do you define privacy? And what do consumers want?

 
Jane Horvath 

At Apple, the way we define privacy is to put the consumers in the driver's seat. They should have control over their data; they should have choices about their data. And one of the things that we really, really focus on at Apple is privacy by design. We do that in a number of different ways, and it really starts from the beginning. We have a team of privacy lawyers that report to me, and we also have a team of privacy engineers, and for every new product, even at the very beginning design phases, we have a privacy engineer and a privacy lawyer assigned to work with the team. The other thing that is critically important is we have the support of our executives. Tim is incredibly committed to privacy, and it flows through the company. So we are a team that the engineering teams want present when they're designing a product.

 
Rajeev Chand 

Do you think the consumer tech industry is doing enough in privacy?

 
Jane Horvath 

I don't think we can ever say that we're doing enough. I think that we should always be doing more. Things are changing, and there's no way to say that, at this point in time, we've reached a panacea. We always have to be pushing the envelope. We always have to be looking at new innovations and figuring out how we are going to put the consumer in control of their data.

 
Erin Egan 

If I can build on that: everything that Jane said at Apple completely resonates with how we approach privacy at Facebook. So I don't want to repeat the privacy by design piece, and we could talk, if there's interest, about our new FTC order, which is not yet final, but the accountability that's built in there is super interesting, I think, for many companies. But what I want to add to what Jane said, which I think is also important, is that, as we all know, the landscape is evolving. For example, I got this amazing Ring, this new doorbell system. I'm here at CES, it's a snow day back in DC, and I have no idea who's going to come to the door when my kids are home, and now I can see that. That's hugely, hugely valuable. But the question is, what do people expect? It's all about what people understand about how their data is being collected and used, and that's something that is hard. And that's something that's going to take constant work, to make sure that people understand what we're collecting and how we're using their data. So for example, and this is one of many things we're all trying to do, we recently launched a privacy checkup tool, and we've expanded it. This is a tool where we say to everybody: hey, just like you take a health checkup, let's take a privacy checkup. Let's take a look at who is seeing stuff. Let's make sure that you're comfortable with this. And we added a bunch to it this year in light of feedback about security and so forth, but we're constantly iterating, because expectations are evolving.

 
Rajeev Chand 

Susan, your thoughts on what consumers want, then we'll jump into some details.

 
Susan Shook 

So as an advertiser, we're seeking to communicate the benefit of our products to our consumers every day, to make their everyday lives better, right? And part of doing that is communicating in meaningful and impactful ways. But foundational to that is our consumer trust and our consumer transparency. So for us, what we're trying to champion, like Jane, is a consumer-centric privacy framework, and there are three parts to that. First and foremost, it's consumer choice: is she in the driver's seat, does she know how we're processing the data, what we're using it for, who we're sharing it with, and are we making sure that our ecosystem is following through on that, whether it's us handling it or our service providers? We want her to engage with us for the benefits she wants, sharing in ways that she believes will be beneficial to her and her engagement with us. Second, if she doesn't want to engage anymore, if she doesn't perceive that the benefits are worth the data sharing, we want to empower her to disengage from that relationship in a way that works for her, and we will follow through, along with our vendors and service providers. And then third, and that's what I've been saying about the service providers and us, whether it's in our ecosystem or in the ecosystem of our service providers, we want to make sure that our consumer understands we're being strong stewards of her data, and that the vendors we're using are also doing that.

 
Rebecca Slaughter 

Can I maybe push back a little bit on some of the things these ladies have said? Because I think they raised really good points, and all very consistently talked about the importance of consumer control and consumer choice, all of which is true. But I also am concerned about a universe where the entirety of the burden to protect one's data lies with the consumer. Because I think that today, even if consumers can walk through a privacy checkup, or think about all of these things, the amount of information that you have to process to figure out what is happening with your data is untenable for most people. I mean, the way I think about it is: I'm a relatively well educated person who specializes in privacy, and I can't possibly figure out all the things that are being done with all of my data across different services. And that's just by the companies with whom I have a first-party relationship; it doesn't even touch the backbone infrastructure where there's third-party data sharing. So I think it's also important that we think about ways that the burden be placed not just on the consumer, but that the collectors and stewards of data have the responsibility to, for example, minimize what's collected, minimize what's retained, and minimize how it's shared, consistent with still providing the product or service that they're offering that the consumer wants, without creating this endless trove of data that can disappear into the ether. Because the other point that has become clear to me over the time I've been working on these issues is that it's very difficult to separate the harms that come from sharing data beyond what you would want, the kind we all feel, that can feel intrusive or violative, from the other downstream harms that can flow back to consumers because of what is done with that data after the fact. So, decisions that are made for consumers about jobs or credit or things like that, or the targeting of content to consumers in ways that could be manipulative or problematic. I think those are all questions that we have to consider together, and they are broader than just the immediate question of what consumers want on privacy.

 
Rajeev Chand 

Jane, your thoughts?

 
Jane Horvath 

I'm happy to take this one, because I wanted to talk about three different things that we do so that we don't have to rely on the consumer to consent at all, because we've used data minimization principles. The first is differential privacy. Differential privacy is something that we use when we're collecting data: we inject noise into the data set, so there is no way to know whether it's really your data or not. One example of where we've used differential privacy is the emojis on your iPhone. If you look at your emojis, you will see the emojis that are most commonly used, which is really helpful. But figuring out which emojis are most commonly used could be a privacy problem if we're collecting data every time you use an emoji. So we use differential privacy to inject noise into the data set. That is one way we're protecting the consumer without making them make a choice: they get the emojis they use most frequently, but they don't have to sacrifice privacy. The second thing that we do is on-device processing, and we have been doing more and more on-device processing these days. Phones are more powerful than a mainframe computer from five years ago; they have a lot of computing power. So you can build your models on your servers and send the models down onto the phone, and you can sync the learning on one phone across an encrypted cloud to all your other devices. So all your devices are smart, but Apple is none the wiser, because Apple doesn't see what's going on. An example of that is Photos. We have facial recognition algorithms in Photos; all of that is done on device and synced across an encrypted cloud. So all of your devices know who's in your photos, but Apple doesn't. The third thing that we do is use identifiers that are created by a random number generator on your device. We use this for both Siri and Maps. So all of your Siri data and all of your Maps data is not sent up to Apple servers associated with your Apple ID; instead it is sent up to Apple servers under a random number. These are three different techniques we've used to minimize the privacy impact.
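
To make the noise-injection idea concrete, here is a minimal sketch of one classic local differential privacy mechanism, randomized response, applied to emoji reporting. Apple's production mechanism is considerably more elaborate; the epsilon value, the emoji indexing, and the estimator below are illustrative assumptions, not Apple's implementation.

```python
import math
import random
from collections import Counter

def report_emoji(true_index: int, num_emojis: int, epsilon: float) -> int:
    """Randomized response: send the true emoji index with probability p,
    otherwise a uniformly random *other* index. Any single report is deniable."""
    p = math.exp(epsilon) / (math.exp(epsilon) + num_emojis - 1)
    if random.random() < p:
        return true_index
    other = random.randrange(num_emojis - 1)
    return other if other < true_index else other + 1

def estimate_counts(reports: list[int], num_emojis: int, epsilon: float) -> list[float]:
    """Invert the known noise rates to recover unbiased aggregate counts,
    even though no individual report can be trusted."""
    n = len(reports)
    p = math.exp(epsilon) / (math.exp(epsilon) + num_emojis - 1)
    q = (1 - p) / (num_emojis - 1)  # chance a report lands on a specific wrong index
    observed = Counter(reports)
    return [(observed[i] - n * q) / (p - q) for i in range(num_emojis)]
```

With many devices reporting, the estimated counts converge on the true emoji popularity, while any one report reveals almost nothing about its sender.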

 
Rajeev Chand 

Jane, jumping off of that: one of the other interesting parts of privacy is the differences that various tech companies have in their approaches to consumer privacy. Would you argue that tech companies that are not focused on those types of techniques, let's say a Facebook or a Google, like your friend to the right, are using too much personal data in their services?

 
Jane Horvath 

I don't want to opine on what my competitors are doing. I'm very, very much focused on what Apple's doing.

 
Erin Egan 

If I may, Rajeev: at Facebook, we have a different business model than Apple, but both business models are privacy protective, and we're very committed to protecting privacy in our advertising business model. And just to pick up on a lot of the pieces: I think what's absolutely important is figuring out how not to put burdens on people. It's really interesting in the policy environment; you're actually seeing legislation move away from this idea of consent, of having people have to consent all the time, and moving to the idea of responsibility, fiduciary obligations, accountability, and the kinds of de-identification and differential privacy techniques that tech companies use. We also, for example, store information on the device: your Oculus, your Rift, there's a bunch of stuff on the device there. It doesn't always make sense, though, to be on the device, and it's not necessarily more protective or less whether you're on the device or on the server. For example, we have to centralize: if you come to Facebook, you come because you want to share, you want to connect. I wanted to share my holiday photos with a bunch of people; you need centralization to do that well. So you can't always do things on the device. It's a different service we offer, but that doesn't mean that one is more privacy protective than the other. We are committed to privacy, and we build privacy by design into all of our products, just like Jane.

 
Rajeev Chand 

So there's been a good amount of discussion, especially over the last year, of this term, the age of surveillance capitalism. Shoshana Zuboff has obviously been the most vocal; she coined it, and she talks about unilateral incursion into personal privacy, the human experience as free raw material. Even in my research last week, a very good friend of mine, a software engineer in the Bay Area, said, "Hey, I'm a software engineer, and I believe engineers have done a disservice to users over the past two decades that we're now catching up on." Jane, do you think we live in an age of surveillance capitalism?

 
Jane Horvath 

Jane? Or...

 
Rajeev Chand 

Oh, sorry, Erin. Yeah, sorry. Sorry.

 
Erin Egan 

Either one. We can...

 
Rajeev Chand 

Yeah, you can both...

 
Erin Egan 

Surveillance capitalism captures a broad concept, but I take real issue with the idea that what we do at Facebook, the advertising that we serve, is somehow surveilling people. We work so hard to be transparent; surveillance connotes surreptitious activity that people don't know about. Does that mean that we've done enough? We're constantly working to be transparent with people about our advertising. And there are a couple of pieces to this idea of surveillance capitalism, and I think it's an interesting and important piece of our policy conversation. One is about value: are people deriving value from advertising business models? And yes, we believe that people are. We are able to offer a service to people for free. I don't know if you remember, but with WhatsApp, for example, people used to have to pay 10 cents for messaging, and now they don't. There is value in free services, we believe, for people and for small business. The other piece of surveillance capitalism, which is interesting, is this idea of manipulation: do behavioral advertisements generally manipulate people in some way? We obviously don't want to be manipulating anybody; that's not our business model. Our business model is to enable people to see things that they want to see and that are relevant. And one of the ways to address that, again: it's not enough just to tell people what's going on, but it is really important. So we built the Off-Facebook Activity tool, which enables people to see all of the information we are getting from third parties in one place, and to control that information and what we do with it. Again, that is a piece of a broader set of work that we need to do around differential privacy and de-identification, but I think those pieces help address the kind of concerns that you're hearing and that you just mentioned.

 
Rajeev Chand 

Erin, staying with you for one more moment: in terms of Facebook services and, more broadly, tech services, do you believe that those services should be using much less personalized levels of detail than they are today?

 
Erin Egan 

Data minimization is a core privacy concept, and it's something that we embody in everything we do. So we collect the data we need to serve people and to serve relevant advertising. And underlying your question is...

 
Rajeev Chand 

Are we using too much? Are we using too much personal data?

 
Erin Egan 

We are not. We adhere to the concept of data minimization: we collect what we need to serve people, we give people control and choice over that data, we're clear with people about it, and we work to de-identify it, which again is key. This is about us: what are we doing to not put all the burden on people? What are we doing to try to deal with the downstream effects? It is taking the data, de-identifying it in our systems, and deleting it after a certain retention period. So I think your question inherently assumes there's something wrong with the business model, and there isn't. Again, you can offer a privacy-protective ad business model, and we do.

 
Rajeev Chand 

Let me jump to another element here in privacy, which is location data. And Susan, I'll throw this to you. There was an excellent piece in The New York Times, I believe last month; I think the title was something like "One Nation, Tracked." The journalists had gotten hold of a small amount of data: 50 billion location pings from 13 million Americans. And through that piece of data, they were able to find out information on celebrities' residences; were able to find that a DOD official was attending a women's march with his wife; found individuals who were having affairs, by virtue of the fact that they were going to a hotel for two-hour periods of time; and found a Microsoft employee who was interviewing at Amazon. And the end conclusion was this statement by Paul Ohm from Georgetown University, who said that describing location data as anonymous is a completely false claim. Do you agree with that?

 
Susan Shook 

I don't agree with it, because I think you can have location data that's being used at macro levels. So I'll give an example of something we do at P&G. We may get zip code data about a consumer; we may have an application gathering that in order to find out what products we would want to serve to that consumer, to Erin's point, where and when she wants that product. For our hair care products, we may have products that work well in a humid environment, and getting that data just in time, minimizing the data so that we're only delivering the offer at the point when the consumer is in a zip code where humidity is high, and delivering a product that resonates with her at that point in time, is what we'd be seeking to achieve. We would not be trying to put together anything more, and our consumers, from a notice and choice standpoint, would not anticipate anyone using the data for any purpose other than providing that value, that product, to them. And to the earlier point people were talking about: for P&G, consumer trust is paramount to what we're doing and to our privacy framework. And for us, unlike some of the tech companies, a consumer has easy switching costs. If we don't comply with the purposes for which we've told the consumer we're using his or her data, if we go off from that, the consumer can shift to another product. They can shift from Pampers to Huggies. They can shift in a (inaudible); a consumer can say, what other products does P&G offer, and I want to be out of that equation forever. So it's very important to us, beyond the legal requirements. If we were to look at what we're doing, what we're processing, what we're saying we're doing with our consumers' data, what we and our service providers are doing, and we depart from that, we understand that the consumer can easily exit the situation. For us, trust is fundamental to our equation from a privacy standpoint, and we never want to breach that with the consumer and our consumer choice, transparency, and control framework.

 
Rebecca Slaughter 

So I think allowing and enabling consumers to vote with their feet is a key principle of competition as well as consumer protection law. The concern in the data area that I have is that consumers don't always know. How does a consumer know that P&G is doing exactly what it says with its data, when a lot of what happens in data sharing goes on in this sort of opaque infrastructure behind the scenes? You don't necessarily know that it was P&G who shared this specific data. And then to your point, Rajeev, about location data: first, as a matter of principle, the FTC, and I think our law and policy generally, understands that different types of data are more and less sensitive, and location data is certainly highly sensitive data that merits particular kinds of protection. De-identification sounds like a really good idea. The problem that I think Professor Ohm is pointing to, and that the article illustrated, is that de-identification is only meaningful if data can't be re-identified or associated with other data sets to create identification. And I don't think we have reason to have confidence, for most forms of de-identified data collection, that re-identification is impossible, or improbable, or even unlikely. So that's where the scrutiny and the concern have to come in. Now, Jane pointed to some technological solutions for some of that. There certainly are; it's an opportunity for a lot of innovation, and that's a good thing and something that we should be thinking about and excited about. But just as there are opportunities for innovation on the protection side, there are also opportunities for innovation on the bad side of data use and abuse. So we need to be really, really sensitive, particularly when it comes to location data, which implicates not only sensitivity but security, personal security and safety, in ways that are really meaningful. The FTC brought a case against a manufacturer of stalking apps earlier this year that I think was really important and something that I think about a lot in the domestic violence context: there are a lot of these technologies being applied, not just location tracking, which is what this set of apps allowed. Also, when you think about connected homes, The New York Times did a really compelling article a couple of years ago about the ways in which abusive domestic partners were using connected home devices to target and terrorize their victims, by turning the temperature all the way up or making the lights go on and off. That's a real, meaningful problem that we need to think about, in ways that aren't immediately obvious when you think about the benefits of some of these developments.
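
A hypothetical sketch of why "anonymous" location traces re-identify so easily: the most frequent overnight coordinate in a pseudonymous trace is usually a home address, and joining it against any record that ties a location to a person recovers a name. Every name, coordinate, and directory entry below is invented for illustration.

```python
from collections import Counter

# Pseudonymous pings: (advertising_id, hour_of_day, rounded_lat, rounded_lon)
pings = [
    ("device-7f3a", 2, 38.9101, -77.0362),
    ("device-7f3a", 3, 38.9101, -77.0362),
    ("device-7f3a", 14, 38.8977, -77.0365),  # daytime: an office
    ("device-7f3a", 23, 38.9101, -77.0362),
]

# Any record that maps a location to a person will do:
# property records, voter files, social media check-ins, ...
address_directory = {(38.9101, -77.0362): "J. Example, 123 Hypothetical St"}

def likely_home(device_pings):
    """The modal overnight (10pm-6am) location is almost always home."""
    overnight = [(lat, lon) for _, hr, lat, lon in device_pings
                 if hr >= 22 or hr < 6]
    return Counter(overnight).most_common(1)[0][0]

home = likely_home(pings)
print(address_directory.get(home, "unknown"))
# -> "J. Example, 123 Hypothetical St": the 'anonymous' ID now has a name.
```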

 
Jane Horvath 

I just wanted to add, on location: data minimization is also a really important thing to do with location data. If you don't need lat/long, don't collect it. Siri, for example: if you ask Siri, "What is the weather nearby?", it will only send up city-level data; it does not send up your lat/long. But if you ask, "Siri, where is the nearest gas station?", then Siri needs to know lat/long. So that's an example of data minimization, where you don't always have to send up the most precise location data.
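
A minimal sketch of the coarsening Jane describes: send only as much location precision as the query actually needs. The query types and the on-device city lookup below are illustrative stand-ins, not Apple's API.

```python
def location_payload(query_type: str, lat: float, lon: float) -> dict:
    """Data minimization: a weather query needs only a coarse locality,
    while a nearby-search genuinely needs precise coordinates."""
    if query_type == "weather":
        # Resolve to a city on the device; the precise coordinates
        # never leave the phone for this kind of query.
        return {"city": city_for(lat, lon)}
    if query_type == "nearest_gas_station":
        return {"lat": lat, "lon": lon}  # precision is required here
    raise ValueError(f"unknown query type: {query_type}")

def city_for(lat: float, lon: float) -> str:
    # Stand-in for an on-device reverse-geocoding lookup.
    return "Las Vegas"

print(location_payload("weather", 36.1147, -115.1728))  # {'city': 'Las Vegas'}
```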

 
Rajeev Chand 

Erin, your thoughts on location data: is location data too fine-grained in terms of its current cloud storage?

 
Erin Egan 

Location data is sensitive, and I agree with everything that these experts next to me have said. As Jane said, it's about context, and it's about value to people: what are you using location data for, and at what level? So if someone checks in on Facebook someplace, that's checking in specifically to a specific place, and that's of value to people, so we're collecting it at that level. Otherwise, generally, no, we don't collect it at a detailed level. So again, this gets into what's important: having, as we've all talked about, programs and privacy by design, where you're thinking, in every context, in every new product, in every new data collection, about how you are using and storing and deleting data, how people can access it, and how you're minimizing, ultimately, the impact to people. And one thing that I think is going to be interesting moving forward, and the Commissioner mentioned this too, is how technology can help us. We've all built privacy by design, and I'm sure many of you in the room at companies have privacy by design programs. What does that mean? Well, it means lots of different things at lots of different companies. But what I'm seeing that's really exciting is how you can annotate code: you can actually attach specific uses of data to code, and ensure that those uses and those rules around retention and storage attach to the code and flow all the way through the system. It's really fascinating work, and it's on par with the differential privacy work that's happening by engineers around the world. So it's interesting.
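
One way to read Erin's annotation idea is as data that carries its own policy: each value travels with the purposes it may be used for and a retention period, and any access for an unlisted purpose fails. This is a toy sketch of the general concept, not Facebook's system; the class and field names are invented.

```python
import time
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Annotated:
    """A value tagged with permitted purposes and a retention period;
    the tags travel with the data wherever it flows in the system."""
    value: object
    purposes: frozenset
    collected_at: float = field(default_factory=time.time)
    retention_seconds: float = 90 * 24 * 3600  # illustrative 90-day default

    def use(self, purpose: str):
        if purpose not in self.purposes:
            raise PermissionError(f"data not annotated for purpose: {purpose!r}")
        if time.time() - self.collected_at > self.retention_seconds:
            raise PermissionError("retention period expired; value must be deleted")
        return self.value

email = Annotated("user@example.com", frozenset({"login", "security_alerts"}))
email.use("security_alerts")   # allowed
# email.use("ad_targeting")    # raises PermissionError: purpose not annotated
```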

 
Rajeev Chand 

So is privacy solvable? Whether it's through consents, or through personal privacy data management tools that are very complex (even Facebook's privacy portal is very complex), is privacy solvable? What model will work at scale to give consumers informed consent, where they're not just pressing the agree button or ignoring the controls that are available to them?

 
Erin Egan 

By "solve": when I think of privacy, there's a range of principles to protect people's privacy. We believe privacy is a fundamental right that people have. So when you say "solvable," I'm not quite sure, but...

 
Rajeev Chand 

Do you think privacy is protected today for people?

 
Erin Egan 

I think privacy is protected today for people, yes. On Facebook today? Yes. And we will continue to evolve our processes and programs to make sure that continues to be the case.

 
Rajeev Chand 

For sure.

 
Rebecca Slaughter 

I don't want to talk about specific services or products, but as a general matter, no, I don't think privacy is generally protected. Given the amount of data that is collected about any individual in this room, I don't think anyone here could tell you accurately who has what data about them and how it is being used. And I bet if you took a survey or a quiz, we'd all get a lot of it wrong, unless we guessed that everyone has everything. And it's being used in ways that I don't know and can't anticipate. So in that framework, no, I don't think privacy is being fully protected. The question is, does it need to be, right? And to what degree does it need to be? Do we need to be in a universe where no data is ever shared? No, I don't think that's the case. I think the question is: what is the minimum amount of data that can be collected, shared, and used about people to make sure that they are not being harmed, in ways that are either specifically privacy-related, like we've talked about, or that flow from data collection, as alluded to earlier? And, you know, Erin talked a little bit about the advertising business model, and Chairman Simons referenced this in the last session too: we've had ad-supported businesses for a very long time, television, newspapers, media. That's been true. And there can be enormous value in businesses that are based on serving ads and providing services through an ad-based model. The question is, what's the differential value of highly targeted advertising? And what is the cost of that? And are those things in balance with each other? I don't think we have enough data. I hear a lot of citations all the time about the increased value of behavioral advertising, and I've seen very little data to back that up, whether it's a question of value to the consumers, value to the advertisers, and where the costs come from. So I think this is an area where we need to apply a lot of scrutiny and a lot of care, to understand exactly what the marginal benefit is and what the marginal cost is, and whether those two things are in an appropriate balance with each other. And I'm concerned that we operate in a universe where we are asking those questions maybe a little too late, rather than thinking ex ante about what those effects would be.

 
Rajeev Chand 

Commissioner, what's your gut instinct? Is there a balance in terms of the incremental value versus the incremental cost in today's digital ecosystem?

 
Rebecca Slaughter 

I think we can live in a universe where we have a more appropriate balance than we have today. But the concern I have right now is that the weight of that balance, under today's business models, really falls to the detriment of consumers rather than to the benefit of consumers, and often without the knowledge of consumers. Or even, as we've been saying, where they know, they don't feel like they have options. Consent isn't particularly meaningful; notices aren't particularly meaningful. So I think it's very difficult to conclude that the status quo is an effective balance. I think there are a lot of ways that we can, as a society, nationally and globally, think about enhancing that balance and creating an environment and an ecosystem where privacy can be an aspect on which companies compete meaningfully. Because we want competition, we want competitive innovation, and I think that is something that we can strive towards and achieve. But I also think it is, as we've all said here, a moving target. As technology develops, the challenges change, and we can't ever stop and say, oh, well, we've solved this problem, so there's not an issue anymore.

 
Rajeev Chand 

Erin, Susan, what are your thoughts on this? Would you agree there's an imbalance?

 
Susan Shook 

No. I guess what I would say is, the question you posed, is privacy solvable, is a very binary question; you're saying yes or no. And to me, it's the consumer engagement that I was talking about earlier. If the consumer understands what's going on, and the ecosystem is following suit with what that consumer anticipates, not using it for other purposes, right, then it's solvable for that consumer, and the consumer should be in the driver's seat in defining what benefits she or he wants to obtain. So for our consumer, getting meaningful data about our products to help improve their everyday lives is something we want to deliver, but more importantly, something the consumer also wants to get where and how she's interacting with her world. If she's online, interest-based advertising would be one methodology to do it. I do think, to your point, we can all agree there are certain toxic things that happen with data, right, that are harmful, where everyone says we shouldn't be doing these things. And there are certain things that we would say, like using snail mail to deliver a product that the consumer has ordered, are pretty innocuous. There's a whole lot of in-between, and I think it's presumptuous for us to step in and say, for everyone, this is what we deem beneficial or harmful. It should lie with the consumer to decide what he or she wants to engage in, and what data she or he is willing to give in order to get the benefits that he or she wants to get. And one thing I think would be helpful is probably more transparency in the notice and choice mechanisms. I've been following, you know, the old privacy nutrition label they talked about 10 years ago. Is there a way to be more transparent with the consumer, in a way where she doesn't have to look at 3,000 privacy policies over the course of the year and read them word for word? Is there a way industry can lead a more meaningful, transparent way to say, here are all your options, what exactly do you want to engage with, and how do you want to engage, and make sure that everyone follows suit? Because I do think, speaking for myself, when I look at all of the different tech disclosures and what's going on, it is hard, and I'm a privacy professional. I do think industry needs to lead to a more transparent way of communicating with the consumer, and then letting him or her decide what to engage with, how to engage, and whether to exit that equation if they don't think the engagement is flowing the right way.

 
Rajeev Chand 

Erin, your quick thoughts on the Commissioner's comments on whether there's a balance or an imbalance.

 
Erin Egan 

Just quickly, on Susan's point: I couldn't agree more. What's interesting, too, in policy circles, is we're seeing regulation get into the idea of design, and dark design, and interface, and how we talk to people about this. None of the companies in this room want people to be surprised; that's not in our interest. We all want to have trusted brands and have people want to engage in our services, and it's incumbent upon us to explain how things work. But it is a challenge, and it is hard. And I do think it is something that all of us, regulators, companies, academics, are thinking about: how do we talk to people in a way that they can understand, whether it's the nutrition label or some other way. It's interesting: in some of the draft privacy bills I'm seeing in Congress, they're focusing on certain sensitive data types, like location, like biometrics, and calling those out. In other areas, we talk more about de-identification and differential privacy. So we're struggling with how we make sure people understand what they need to understand, recognizing that there's a lot of data that's flowing, and how we can then minimize that. So it takes a comprehensive approach, is what I would say.

 
Rajeev Chand 

Do you see a viewpoint that maybe there is an imbalance, more to the harm of consumers today, as Commissioner Slaughter said? Or is that something that doesn't resonate?

 
Erin Egan 

At Facebook, we provide real value to people in the advertising that we deliver, and we do it in a privacy-protective way. That's how I'll answer your question, Rajeev.

 
Rajeev Chand 

There you go. There you go. Okay, so I'm going to jump to the next topic, which is end-to-end encryption. Jane, I'll throw this to you first. The question that often comes up with end-to-end encryption is: what's the right approach to abusive content, such as terrorism communications or child abuse content? What are your views on that?

 
Jane Horvath 

So, end-to-end encryption is critically important to the services that we have come to rely on. At Apple, we have the ability to see into the future: as we're designing, we're always about a year to two years ahead in the design process. So we know the things that we want to put on our devices: health data, payment data, very sensitive data. And the other thing that we know is that our phones are relatively small, and they get lost and stolen. If we're going to be able to rely on having our health data and finance data on our devices, then we need to make sure that if you misplace that device, you're not losing your sensitive data. Now, that being said, terrorism and child sexual abuse material are abhorrent. I also manage the law enforcement compliance team at Apple, and I have a team that works 24 hours a day, seven days a week, responding to requests from law enforcement. We have helped in solving cases, preventing suicides, and so on. So we are very dedicated, and none of us want that kind of material on our platforms. But building backdoors into encryption is not the way we are going to solve those other issues.

 
Rajeev Chand 

Should content be screened when it's uploaded to iCloud or Dropbox or any other cloud service? Should content be screened for, for example, child sexual abuse content? Jane?

 
Jane Horvath 

Yeah, sorry. We have started utilizing some technologies to help screen for child sexual abuse material.

 
Rajeev Chand 

So you've started to use...

 
Jane Horvath 

Utilizing some of those technologies.

 
Rajeev Chand 

Gotcha, gotcha, gotcha.

 
 
Erin Egan 

And just to build on what Jane said: at WhatsApp, we very much believe that encryption in messaging services is the privacy future. We encrypt messages in WhatsApp, and we do believe that that's critical. It's really important, though, that we get it right with abusive content and how we balance that. One of the things we're working on is reporting mechanisms, which I'm sure you all have as well. And detection ahead of time is really important too: looking for signals, which I'm sure Apple does as well in encrypted contexts, where you can look for signals in the unencrypted data that you have available, to find misuse and take action. It happens. But encryption is super important to privacy.

 
Rajeev Chand 

Do you view that tech companies have a responsibility for the content that's going through their encrypted systems?

 
Erin Egan 

We have a responsibility to help protect the safety of people. And if we are seeing content and have information about bad content on our platform, we do everything we can to address it.

 
Rajeev Chand 

Commissioner?

 
Rebecca Slaughter 

So this is one where I don't actually have any pushback to give on the comments that they've raised about encryption. I think what it boils down to is: while I am really sensitive to the desire for a backdoor for good, legal law enforcement reasons, you can't create a backdoor for the good guys that doesn't also create a backdoor for the bad guys. And I think people are increasingly appreciating the importance of being able to protect their content with encryption, whether it's sensitive financial data, health data, or personal communications. I think it is really important. I also agree that there need to be other solutions for abusive content: detecting, identifying, reporting, and protecting against abusive content. Your last question, I think, Rajeev, subtly wasn't an encryption question. It was a Section 230 question. It was a question about the responsibilities of platforms for the content that flows over and through them. And that is a live, active, and important debate that is happening right now, different from the encryption debate, but also one where I think there's a lot of nuance that we can't gloss over. We want to make sure that there's a balance between assigning appropriate responsibility, where companies have responsibility and are actually taking action or failing to take action that they should be taking to protect their users or society more generally, and creating liability that would be misplaced with a platform. Right. So it's not an easy question, and one where I know there's actually a really interesting panel tomorrow. So I think it's going to be a live topic for a long time.

 
Rajeev Chand 

Let me jump to another element in privacy, which is the CCPA, the California law. Susan, what's your view? Let's fast forward to CES 2021: how will CCPA be viewed? What will be working, and what won't be working, for CCPA?

 
Susan Shook 

I think many companies, like P&G, have been working hard to make sure that we're complying, just like we do with GDPR: to make sure we have our inventories of what we're processing, that we have contracts in place with our ecosystem, and that we're reviewing, from a privacy by design standpoint, that everything we're communicating to the consumer is occurring. The regulations are still being finalized, and how those will be interpreted is one question that will still be gray. And then we potentially have ballot initiative number two coming our way next year. So those will be hot topics for us to explore and see where it's headed. And then also just seeing how consumers are responding to it, what the other states are going to do, as well as the potential for federal legislation protecting consumers across the board with the same standard.

 
Rajeev Chand 

Gotcha. And Erin, there's a high-profile situation in question with Facebook here. I think I saw an article yesterday that said Facebook doesn't believe in CCPA, that Facebook believes CCPA doesn't apply to it, which I actually don't think is accurate. But there was an underlying question, which is: why is Facebook Pixel not a sale as defined by CCPA? So, your thoughts on those two questions?

 
Erin Egan 

So, okay: CCPA applies to us, and we're complying with it. There are two pieces to that, just in case there's any confusion. One, CCPA has a set of rights that are really important and that are in many ways based on what we've seen in Europe and in other pieces of legislation: rights for people to know what data is being collected, the right to delete, the right to access. Those are super important rights, and we built tools for those. So that's one piece. Also, in terms of what we do at Facebook, we do not sell data; that's important. But then the question you're getting at is: when we receive data from business partners, like a Procter and Gamble, what is that transaction under CCPA? And CCPA is clear that if you're acting as a service provider and using data for business purposes pursuant to a contract with a company, like a P&G, that is not a sale. And so we are a service provider; we are acting as a service provider on behalf of our clients to serve ads on their behalf.

 
Rajeev Chand 

So most of what I've read by independent privacy experts takes a different view, which is that the pixel data being received by Facebook falls under the definition of a sale: a transfer of data in exchange for monetary or other valuable consideration. So for the data that you get from P&G through the pixel, would you say that you get monetary or other valuable consideration from it?

 
Erin Egan 

CCPA is a clear regime: it's very clear that if you act as a service provider and use the data that you receive on behalf of a company for business purposes that are articulated in CCPA, one of which is ads, then you are not engaging in a sale; you're acting as a service provider. We are clear in our contracts that we're a service provider, and when it comes to the data that we receive about California residents, we are using that data for the business purposes that are very clear in CCPA. And I'm happy to talk to anybody who has questions about our compliance with CCPA. We are complying, and proud of it.

 
Rajeev Chand 

Great. I think there might be questions on that. Commissioner Slaughter, some questions for you. What is the likelihood that we will see a federal privacy law passed in 2021?

 
Rebecca Slaughter 

So, before I worked in this job, I worked in the Senate for about a decade, and I used to think that one of my great professional strengths was doing a really good job of predicting and seeing what was going to happen and anticipating things. The last three years have taught me that maybe that is not a strength of mine, or anyone else's, necessarily, so I have become very humble about my predictions for the future. What I will say is that I think we should have federal privacy legislation, hopefully in 2021, if not before. I think we have some of the necessary, but not necessarily sufficient, conditions to get that legislation across the finish line. Many of my former colleagues and friends, the senators with whom I worked when I was in the Senate, have been working really diligently on these issues. They're incredibly dedicated, smart public servants, and so I feel really confident that if it is doable, they can do it.

 
Rajeev Chand 

Would you handicap it? A 50-50 chance for the next two years?

 
Rebecca Slaughter 

I will not handicap it, because I really hate being wrong, and whatever prediction I make will invariably be wrong. But when I say that there are necessary conditions in place, what I mean is that I think there is motivation from industry to get a federal law, because they want transparency and consistency, and there is a very real, and appropriately appreciated, fear that we will be living in a universe where companies are not only navigating the intricacies of CCPA but also slightly different laws in other states, or even worse, fundamentally incompatible laws in other states. The challenge isn't actually "how do I comply with 50 different similar but not exactly the same laws," which is what we have in data breach right now, and that seems to be a largely sustainable situation. I think the question is: what if state A tells you you have to do X, and state B tells you you have to do the opposite of X? How do you comply with that? I think that's a real challenge. And then on the other side, there's motivation from the left and the public interest community to have meaningful, strong federal privacy legislation. But what that means is it has to be meaningful, and it has to be strong, and there have to be real, meaningful consequences for violations.

 
Rajeev Chand 

So two specific questions. Should preemption be part of a federal bill when it gets passed? Should national law preempt state laws?

 
Rebecca Slaughter 

So I think the preemption question is one of those things that sort of ties everybody in knots; it's very difficult to evaluate on its own. I'm uncomfortable saying yes, we should have a federal law that preempts state law, if the federal law is kind of weak and doesn't really provide meaningful protections to consumers. If you have a very, very strong federal law that creates the opportunity for evolving regulations and standards that keep up to date with the technology and the innovations that we're seeing, then there is a better basis for specific preemption. Should we ever have field preemption? No, I don't think so. I think we should allow states the opportunity to innovate consistent with the federal standards, not inconsistent with federal standards. So I think it's very difficult to evaluate preemption ex ante.

 
Rajeev Chand 

And then the second question is: should individuals have a private right of action, to sue companies for violations?

 
Rebecca Slaughter 

So again, this goes to the question of how meaningful the law is: a law is only as meaningful as it is enforceable. And if you limit the enforceability of a law by limiting who can enforce it and how well-resourced they are, then it's very difficult. The FTC is a very small agency. I do think it is the agency that should do federal privacy enforcement, but it currently has a budget of about $300 million annually. We had about 50% more employees at the beginning of the Reagan administration than we do today at the FTC, and that is as the economy has grown and our privacy mandate has grown. So I have a really hard time limiting enforcement to under-resourced federal and state agencies. I also am a person who firmly believes that our courthouse doors should be open to people to vindicate their rights, and I think they have been systematically closed in a lot of ways over the last several decades. So I'm concerned about things that close doors to people, and I want to make sure that they can protect themselves, particularly if federal and state enforcers are not doing it. That said, I also am sensitive to the idea that you don't want to create litigation factories and lawsuit factories that incentivize settlements rather than compliance. So I put this in the category of solvable problems. I think it can be solved, and I think it will be solved, but I think there are important rights and interests that need to be protected.

 
Rajeev Chand 

And then, Jane, your last thought on this, and then I'm going to turn to the audience for questions. There are two mics in the middle row here; please line up, and let's see if we can get a bunch of questions in. Jane, last thoughts?

 
Jane Horvath 

Sure. I think Commissioner Slaughter addressed a lot of it; I agree with a lot of what you said about preemption and the private right of action. But one thing I think we also need to be aware of: people talk about GDPR all the time, and what you may not know is that before GDPR, there was what was called the Directive. In Europe, every member state had, under the Directive, to pass a law along the lines of what was in it, and what they found is that those member states passed incompatible laws, so the data wasn't flowing between all the member states. When they passed GDPR, it was with the view of encouraging the free flow of data across all the member states. I think we in the United States need to look at that model as well: a strong federal privacy law that is consistent across all 50 states, and that treats every consumer, regardless of where they live, as entitled to the same strong protections.

 
Rajeev Chand 

Name and company, and a very brief question?

 
Jeff Fowler  

Hi, I'm Jeff Fowler, tech columnist with the Washington Post. This question is for you, Jane. Last year here at CES, you guys put up a big billboard right over there that said, "What happens on your iPhone stays on your iPhone." After you did that, a bunch of journalists, including me, did investigations that found that wasn't true: that the apps that you vet and put into your store allow lots of third-party companies to track our data, in my case including while I slept at night. My question for you is: what is Apple doing to fix that for me? I know you've said you're going to stop that activity in children's apps, but what are you doing to actually make sure that what happens on my iPhone stays on my iPhone?

 
Jane Horvath 

Sorry, my voice is really going now. We are constantly trying to up the protections that you have, and to ensure that when we do send your data up, whatever we collect does not identify you. Whenever Apple collects data from an iPhone, every two weeks we have a review with teams, so a team may want to review data for crashes or something else, and we walk through it, spending an inordinate amount of time debating collections, to ensure they do not identify you. And we will require the teams to sample, to do all kinds of different things, to ensure that we are not actually collecting your data.

 
Jeff Fowler 

And what about the third party apps that you sell in your store?

 
Jane Horvath 

We have App Review, we have the developer guidelines, and we have the just-in-time notices, where we have isolated a lot of the data. We call it isolation, meaning that when you open an app and the app wants to access your location, there's a technical block: until you say yes, that API stays outside of the app's sandbox. Only when you say yes does the app get access to your location data. So we're constantly innovating here.
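
A toy sketch of the "technical block" Jane describes: the sensitive API simply refuses to hand over data until the user has granted permission to that specific app. The class and method names are invented for illustration; this is not Apple's sandbox implementation.

```python
class LocationService:
    """Location data stays behind a gate: no user grant, no data."""

    def __init__(self):
        self._granted = set()  # app ids the user has explicitly approved

    def prompt_user(self, app_id: str, user_said_yes: bool) -> None:
        # Stand-in for the just-in-time permission dialog the OS shows.
        if user_said_yes:
            self._granted.add(app_id)

    def current_location(self, app_id: str) -> tuple:
        if app_id not in self._granted:
            raise PermissionError(f"{app_id} is outside the location sandbox")
        return (36.1147, -115.1728)  # illustrative coordinates

service = LocationService()
service.prompt_user("com.example.maps", user_said_yes=True)
print(service.current_location("com.example.maps"))  # permitted
# service.current_location("com.example.game")       # raises PermissionError
```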

 
Jeff Fowler 

And are you doing anything further to vet the apps in the store, to make sure that they're not sending that data to third parties? Are you doing anything new to vet the apps, as you said, to make sure they're not sending our data?

 
Jane Horvath 

I said we're constantly innovating, including in the app review process.

 
Rajeev Chand 

Thank you very much. Next: name, company, and a brief question.

 
Speaker 

(inaudible), from Paris. Just a question on manipulation, which I think was a factor coming out. The book by (inaudible) was very helpful to bring the debate. We understand that Facebook will make everything possible not to be regulated, to remain a service, and things like that, so that falls back on the brand. The question is for P&G: do you fear that you will have a kind of backlash from consumers who don't like you using Facebook data to manipulate them?

 
Susan Shook 

I think when we're engaging with our vendors, whether it's Facebook or Apple or any vendor, we're looking at how the data is being used, and whether it's being used in the way that we've communicated to the consumer. So we sit down, we talk to the product teams, and with Facebook or any other company, we look at what they're doing and make sure that it's following through on what we anticipate. If it's not, or if we believe it doesn't engage with the consumer in the way we anticipated, we won't buy the product. And I guess the question you're asking is, do we think consumers are going to have a backlash if they...

 
Rajeev Chand 

Will the tech backlash extend to brands like P&G? And then we'll go to the next question.

 
Susan Shook 

I think if we're doing the right thing by our consumer, and if our consumer decides that she or he does not want to pass data on to the tech companies that we're working with, we will honor that request. We provide a notice and choice mechanism to the consumer, and if they decide they don't want to engage with a vendor that we're using in our ecosystem, we're going to honor that, and we're going to make sure that our vendors honor that as well. And as Jane just talked about, this has come a long way in the last 10 years, through good guidance and enforcement, so I don't want it to be all doom and gloom. When I sat down with companies 10-12 years ago and said, show me your data flows and what your info security controls look like, we got saucer-sized eyes from some of the developers, right? They were like, what are you talking about, you Midwestern company? Why are your lawyers asking us these things? When I sit down now with many of these tech companies, they come prepared: they have data flows, they tell us about their info security controls, and they tell us about what they're going to do with the data. And if it's not consistent with what we're planning, we have a decision to make. And again, if it's not transparent to our consumers and transparent to us what's happening, we're likely not going to go forward with that proposition until it can be; we're going to tweak it, or we're going to walk away.

 
Rajeev Chand 

Great question. Next question: name, company, and question.

 
Speaker 

Marguerite Johnson, (inaudible) automotive.

 
Speaker

I actually couldn't have teed up this question better. If we consider data usage as being the barrier for consumers to understand how their privacy is being used, would you ever consider a label, such as the nutrition facts label, telling me how much data I am giving up to consume your product? P&G?

 
Susan Shook 

Yeah, I mean, we're for transparency. We want the consumer to understand how she's engaging with us. And I do think it would be great if industry all got together, instead of having, you know, 400 different privacy policies that consumers are looking at that all say different things. Everyone's well intentioned; they're saying all the right things legally, and they're trying to say all the right things from a trust standpoint, but I think consumers are inundated. In some of the research that was done even in the past year, Carnegie Mellon researchers were looking at IoT devices, and they showed that consumers who were surveyed were able to make apples-to-apples comparisons across companies, right? To say, okay, this device is doing this, that device is doing that, am I okay with that? It's like being at the shelf at the grocery store and looking at the weights and measures: I'm paying, whatever, 35 cents per ounce for this and 45 cents for that, so I'm going to go with the cheaper option. Or looking at the nutrition label and saying, I have a peanut allergy, so I'm not going to buy this, I'm going to buy that product. We're not apples to apples right now in all of our privacy policies. And it's not because anyone's not well intentioned; it's just that there are a multitude of ways to solve the communication issue with our consumers, and we all think we have the best way to do it. I think industry needs to get together on what that privacy nutrition label, or info security nutrition label, may look like for the consumer.

 
Rajeev Chand

And we're going to have time for one last question, so name, company, and a very brief question.

 
Speaker 

Okay, (inaudible), Chief Technology Officer of Appleosophy. I had a question for Ms. Horvath. With Apple, they use iCloud Drive in order to have a continuous stream of data from your Mac and your iPad, connecting all of those together. But if someone were to run malicious software on, for example, a Mac, wouldn't that transfer over to, for example, the iPad? Because right now, more and more, the iPad is becoming a computer, and not only that, but it's using parts of the kernel from macOS. Another question I had was with iPhone cables...

 
Rajeev Chand 

And actually, let me pause it right there; that's a great question, and we're going to run out of time here. So, Jane, your thoughts on that first question? I apologize.

 
Jane Horvath 

Yeah, we are always looking at security vulnerabilities, particularly when we're syncing data across devices. We have a very strong team of security engineers looking at those questions. I will bring your issue back and raise it with them, but I'm certain that we have considered that.

 
Rajeev Chand 

Great. Thank you very much, everybody. This has been a fantastic session. I think it's clear that privacy is one of the most important topics in tech. Please join me in thanking Erin Egan, Jane Horvath, Susan Shook, and Commissioner Rebecca Slaughter. Thank you all.
