Lorrie Cranor

Does anyone actually read privacy policies? What’s in them, and why can’t we usually understand them?

On our second season finale, we talk with Professor Lorrie Cranor, director of the CyLab Security & Privacy Institute at Carnegie Mellon University, which brings together more than 150 faculty from across campus to study security and privacy and help shape public policy in those areas. Within CyLab, she also directs the CyLab Usable Privacy and Security Laboratory. One of her specialties is how humans interact with security and privacy technologies, making sure the mechanisms we build are not just secure in theory, but actually usable in practice. Her TED Talk about password security has been viewed more than 1.5 million times. But today, we’ll talk about another pesky aspect of our digital lives: privacy policies, those mysterious terms and conditions we sign off on, often without reading them, before we can use an app on our smartphone or laptop.

Links:

“Are they worth reading? An in-depth analysis of online trackers’ privacy policies,” Lorrie Cranor, Candice Hoke, Pedro Giovanni Leon, and Alyssa Au, I/S: A Journal of Law and Policy for the Information Society, 2015.

Lorrie Cranor’s Website

IoT Security & Privacy Label, Carnegie Mellon University

Transcript:

Aaron Nathans:
From the Princeton University School of Engineering and Applied Science, this is Cookies, a podcast about technology, privacy and security. I’m Aaron Nathans. On this podcast, we’ll discuss how technology has transformed our lives, from the way we connect with each other, to the way we shop, work, and consume entertainment, and we’ll discuss some of the hidden trade-offs we make as we take advantage of these new tools. Cookies, as you know, can be a tasty snack, but they can also be something that takes your data.

Aaron Nathans:
On today’s episode, the final episode of our second season, we’ll talk with Professor Lorrie Cranor. She is the director of the CyLab Security & Privacy Institute, which has over 150 faculty from across campus at Carnegie Mellon University. Within CyLab, she is also director of the CyLab Usable Privacy and Security Laboratory, which focuses on the human side of security and privacy. She recently became co-director of CMU and the University of Pittsburgh’s new Collaboratory Against Hate – Research and Action Center.

Aaron Nathans:
Lorrie’s TED Talk about password security has been viewed more than 1.5 million times, but today, we’ll talk about another pesky aspect of our digital lives, privacy policies, those mysterious terms and conditions we sign off on, often without reading them before we can use an app on our smartphone or laptop. Let’s get started. Lorrie, welcome to the podcast.

Lorrie Cranor:
Thank you.

Aaron Nathans:
In the year 2008, you and your graduate student calculated that it would take 244 hours a year for an average consumer to actually sit and read the entirety of every privacy policy they encounter. If everyone in America did that, it would add up to 54 billion hours per year spent just reading privacy policies, and that doesn’t even begin to account for how well people understand what they’re reading. It’s now 2021. Is it still 244 hours per year?

Lorrie Cranor:
Well, we haven’t redone the calculation, but I would guess that it probably is. It might even be worse now because now we see this real proliferation of third-party content embedded in websites, so it’s not enough to just read all those privacy policies, you really have to read the privacy policies for the third-party content as well, and so I think that’s going to really blow up the amount of time.
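
(Editors’ note: as a back-of-the-envelope check on that estimate, assuming roughly 220 million U.S. adults online, a figure we are supplying for illustration rather than taking from the study itself:)

```latex
% Rough arithmetic behind the 54 billion hours figure.
% The population estimate is an assumption, not from the study.
\[
  244\ \tfrac{\text{hours}}{\text{person}\cdot\text{year}}
  \times 2.2\times 10^{8}\ \text{people}
  \approx 5.4\times 10^{10}\ \text{hours/year}
  \approx 54\ \text{billion hours per year}
\]
```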

Aaron Nathans:
How many privacy policies do you typically encounter in, say, a given month, and to what extent do you, who actually understands this stuff, actually read the whole thing?

Lorrie Cranor:
Well, I don’t keep track and I don’t read them all. There’s just no way, so yeah, in a month I’m sure I encounter dozens of new privacy policies, but sadly, I can’t manage reading all of them.

Aaron Nathans:
Why not?

Lorrie Cranor:
Because it would take too long and there’s a lot of information in there that, even as an expert, I have trouble understanding.

Aaron Nathans:
Is there something you’re able to skim for to find the relevant information? Is there typically something that’s a red flag? Can you maybe do a search, use the search function?

Lorrie Cranor:
Well, I wish I could easily use the search function. One of the things we found in our research is that there’s very little consistency in privacy policies, so there are not key terms that you can consistently search for. We thought that maybe we would find standard terminology about the choices that you can make, because that’s something a lot of people want: to find that opt-out button. “I just want to find out what my choices are.” But some privacy policies talk about choices, some talk about opt-outs, and some talk about user rights, and they’re all using different terms, so there’s nothing you can reliably search for. That said, if I do scan a privacy policy, I am interested in choices. I’m also interested in the extent to which my personal information will be shared and why it’s being shared. Those are some of the things that I look for.

Aaron Nathans:
Hmm. How often are you able to opt out of the more objectionable portions of a privacy policy?

Lorrie Cranor:
It varies quite a bit. I think there are some companies that will let you opt out of advertising and sharing for marketing and things like that pretty easily. In certain industries like the financial industry, there are requirements, and so in many cases, you can opt out of a lot of that, but in other industries, the companies are frankly making their money from selling your data, and so that may not be an option.

Aaron Nathans:
Well, a cynic would say that a lot of these policies are designed to frustrate users to make their eyes glaze over, to make them metaphorically throw up their hands in surrender and click the “I Agree” box without having to read more than a few sentences at most. I mean, is that true? Are these policies confusing by design?

Lorrie Cranor:
I think, for the most part, they are not confusing by design. I go to privacy conferences and there are representatives from big companies that have privacy policies and they have privacy lawyers and I’ve met a lot of these people and they’re not evil people, right? I don’t think they’re, for the most part, trying to confuse people. But I do think that a lot of these companies are trying to protect themselves, and so they want to make sure that they have disclosed everything that they legally have to disclose using all the good lawyer language, and that’s their priority, not clear communication with the end-user.

Aaron Nathans:
Mm-hmm (affirmative). I mean, I’m almost afraid to ask this, but if we did understand what we were reading in these terms and conditions, in these privacy policies, what would we typically see?

Lorrie Cranor:
Well, I think we would see for a lot of companies that they make money from selling our data for advertising and marketing-related purposes. I think increasingly people have some awareness that that’s happening, but people might be surprised about the extent to which it happens. A lot of what’s in there are things they need to tell you for legal reasons, some of which probably don’t have a lot of privacy consequences, but it’s really confusing to see all of that. But really, the extent to which our personal information is bought and sold and traded may surprise a lot of people.

Aaron Nathans:
When I find a free app on my smartphone, I feel lucky. I feel like, “Wow, I can get a free metronome. Wow, I can get a free Pac-Man.” I mean, are a lot of these apps not really free at all? Are they making tons of money off of you?

Lorrie Cranor:
Well, a lot of these apps are advertising-supported. Not all of them are making tons of money. The popular ones are making tons of money, the not-so-popular ones are making a small amount of money, but yes, they’re making their money from selling your information or from allowing advertisements to appear in their app.

Aaron Nathans:
Is it just a matter of human nature? I mean, do people care what’s in a privacy policy? If people don’t feel like they can get their hands around what’s in it, why do we say, “Go ahead”? I mean, do people ever just say, “Well, I don’t understand what this policy means. I’m not going to use it”? Why do we consent to this?

Lorrie Cranor:
Yeah, that’s a great question. I think there’s a big sense of resignation, where people feel like they are not empowered to do anything to protect their privacy. People feel, “I can’t really understand this. Even if I read it, there’s probably some legalese I wouldn’t understand.” And unless you want to just stop using apps and stop engaging in the online world, which some people have chosen to do, most people still want to engage. So a lot of people have given up and have just said, “Well, they’re going to get my data one way or another, so I guess I’ll just do it and not bother.”

Aaron Nathans:
I mean, what’s your personal answer to this sort of thing? This is a good bookend to a conversation I had with journalist Barton Gellman earlier this season. He was the first guest of the season, and because he’s Mr. Privacy (he uses only the best privacy technology), I asked him, “Do you just put away your phone so that it won’t track you?” and he said, “No, I want to be a full participant in the digital economy.” What’s your answer to that? You know that you can’t get your hands around everything in these privacy policies. Do you choose to participate anyway? Where do you draw the line?

Lorrie Cranor:
Yeah. I do not take some of the more extreme measures. I do want to participate. I opt out where I can. I use privacy tools to block some of the tracking. I refrain from giving out some of my personal information when I have that option. Sometimes I ask for that option. I definitely do those sorts of things, but I still carry a cell phone, and I turn on location tracking because I want to be able to use maps on my cell phone. So I do things that I know are giving away my personal information, and I do it because I want the convenience of some of these services.

Aaron Nathans:
What are some examples of some good privacy policies and what are some ways to improve the ones that aren’t?

Lorrie Cranor:
By “good,” you mean that they’re privacy-protective or that they’re easy to understand?

Aaron Nathans:
Well, let’s take them one at a time. Go ahead.

Lorrie Cranor:
Yeah, yeah. I don’t have a list of my top 10 good privacy policies. It’s hard to find really good privacy policies for companies with complex business models, because there’s just so much data that they collect and so many different partners that they have. You tend to find the more privacy-protective policies at companies that have a very simple business model. They sell you some physical good; that’s their business, selling you that one thing, right? A small business is usually going to have a more privacy-protective policy.

Lorrie Cranor:
As far as the clarity of the policies, again, it’s often the companies with simpler business models that have the clearer policies; the more complicated the business model, the more things they feel they need to tell you. But I have seen some companies try to turn their policy into a table, or take a layered approach where they give you the highlights at the top and then you can click on things and drill down to get more details. That’s a nice approach. One company I’ve seen do this recently is Strava, the fitness app. They have a table with a bunch of questions, and it says, “Yes/no, yes/no. We do this. We don’t do that.” Then you can click on those yeses and nos to get the details about why they do or don’t do each of these things. There’s another company called Juro that does online contracts, and they have a really nicely designed top layer that summarizes their use of data; again, you can click into it to get more detailed information.

Aaron Nathans:
You’re listening to Cookies, a podcast about technology, security and privacy. We’re talking with Lorrie Cranor. Lorrie is a professor at Carnegie Mellon University in computer science and in engineering and public policy. It’s the hundredth anniversary of Princeton’s School of Engineering and Applied Science. To celebrate, we’re providing 100 facts about our past, our present, and our future, including some quiz questions to test your knowledge about the people, places, and discoveries that have made us who we are. Join the conversation by following us on Instagram at @eprinceton, that’s the letter “E” Princeton. But for now, back to our conversation with Lorrie Cranor.

Aaron Nathans:
Well, at least websites that have confusing privacy policies have a privacy policy. We are rapidly moving into an Internet of Things world. Do users of these devices have an opportunity to read up on the privacy trade-offs before they use, say, their smart thermostat?

Lorrie Cranor:
Yeah. So, IoT devices pose a big problem for understanding privacy. Certainly, if you walk into a smart building where there are smart light bulbs on the ceiling and smart thermostats, and drones fly by, there’s no way to find out about their privacy practices. You can’t stop the drone flying by and ask it for its privacy policy, so that’s somewhat problematic. Even when you’re going to buy an IoT device, it’s problematic. If you walk into a brick-and-mortar store and pick up an IoT device, a smart thermostat or whatnot, off the shelf, you can look on all sides of the packaging. Most of the packages that I’ve looked at say nothing about privacy. Then if you go online to Best Buy or Amazon or a company like that and try to buy these IoT devices, they’ve got all sorts of information about what protocols they follow and how much they weigh and their dimensions, but pretty much nothing about privacy. There are some manufacturers whose websites have a bit about privacy, but that’s about it.

Aaron Nathans:
Why do you think that is? Why do my apps have a privacy policy when these IoT devices do not? Is there some sort of legal requirement?

Lorrie Cranor:
I think IoT has been largely unregulated, and there hasn’t really been the pressure to do that yet. But I think that’s about to change. There is pressure from regulators to disclose information about both the privacy and the security of IoT devices. There was actually a White House executive order issued just last week (editors’ note: this conversation took place in the spring) that mentioned this idea. NIST is looking into it. I think that may change, and we’re doing some research at Carnegie Mellon to help move us in that direction.

Aaron Nathans:
Do you happen to know whether that kind of regulation would require an act of Congress?

Lorrie Cranor:
The most straightforward way for it to happen would be for Congress to pass a law that would require some sort of privacy and security labeling on devices. I don’t know the extent to which an executive order can mandate something like that.

Aaron Nathans:
Speaking of labels, you’ve been working on a so-called Internet of Things nutrition label to help people better understand the security and privacy features of these amazing new devices. Can you tell us what that kind of label would look like, and whether the manufacturers of these devices are willing to use them without being required to?

Lorrie Cranor:
Yeah, we did a lot of research into what information you should put on such a label. My students interviewed experts in IoT security and privacy, and they did studies with consumers to find out what information they wanted to know. We came up with drafts of a label and then did consumer testing, so we now have a label design with two layers. The top layer is designed for simple, consumer-facing information: it tells you what kinds of information the IoT device collects, what the purpose of collecting that data is, and whether it’s shared, and there’s a little bit of information about security as well, such as whether you can set a password on the device. Then there’s a link and a QR code that take you to the secondary layer, which has about 47 different pieces of security and privacy information in a lot of detail. That’s going to be useful mostly for experts, but some consumers may want that information as well. You can check it out on our website at iotsecurityprivacy.org, where you can see an example of this label.
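
(Editors’ note: to make the two-layer design concrete, here is a minimal sketch of how such a label’s contents might be represented as a data structure. The field names and example values are our illustrative assumptions, not the actual CMU label specification; see iotsecurityprivacy.org for the real design.)

```python
from dataclasses import dataclass, field

@dataclass
class PrimaryLayer:
    """Consumer-facing summary: the label's top layer."""
    data_collected: list[str]       # kinds of information the device collects
    purposes: list[str]             # why that data is collected and used
    shared_with: list[str]          # parties the data is shared with, if any
    password_configurable: bool     # a basic security fact shown up front
    details_url: str                # link / QR-code target for the second layer

@dataclass
class SecondaryLayer:
    """Expert-facing detail layer (~47 attributes in the CMU design)."""
    attributes: dict[str, str] = field(default_factory=dict)

@dataclass
class IoTLabel:
    device_name: str
    primary: PrimaryLayer
    secondary: SecondaryLayer

# Hypothetical label for an invented smart thermostat:
label = IoTLabel(
    device_name="Acme Smart Thermostat",
    primary=PrimaryLayer(
        data_collected=["temperature", "presence"],
        purposes=["device function", "usage analytics"],
        shared_with=["analytics provider"],
        password_configurable=True,
        details_url="https://example.com/labels/acme-thermostat",
    ),
    secondary=SecondaryLayer(
        attributes={"firmware_updates": "automatic", "data_retention": "2 years"}
    ),
)
print(label.primary.data_collected)  # ['temperature', 'presence']
```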

Aaron Nathans:
Can you get this information without the participation of the manufacturers?

Lorrie Cranor:
Yeah, unfortunately, it’s going to require some participation from the manufacturers. We spent some time trying to get some of this information and constructing these labels ourselves for some popular devices. We had some students go and read all the manuals and try to get all the information they could, and there’s certainly some information that we could get out of that, but to really fill out the full label, we need the manufacturers to do it.

Aaron Nathans:
We’ve already talked a little bit about regulation. Is there any other work we haven’t discussed that needs to be done in this area generally?

Lorrie Cranor:
Yeah, I think regulation would certainly help with the adoption. There are other paths to adoption as well. You could have an industry group declare a label as a standard. You could have some of the big retailers basically say, “Hey, if you want to be on our store shelves, or if you want to be prominently featured, you need to have a label.” These are all things that could happen. They’re not happening yet, but those are other paths to adoption.

Aaron Nathans:
Is there anything else on the topic of privacy policies that we haven’t mentioned that you’d like to say?

Lorrie Cranor:
Another project we’re working on at CMU addresses the problem of walking into a space where you don’t own the IoT devices but you want to know whether there are devices collecting your information. This is work that my colleague Norman Sadeh is leading. The idea is that you could set up your smartphone or your smartwatch to be on the lookout for these devices, and the devices themselves could broadcast information about what their privacy policy is and what data they collect. Or there could be a registry in which all of the devices in the building are registered, and you could find out what’s there. We’ve developed a protocol for this and some demos.
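
(Editors’ note: a minimal sketch of the registry idea described above. The endpoint URL and JSON schema here are hypothetical illustrations, not the actual protocol developed in this project.)

```python
import json
from urllib.request import urlopen

# Hypothetical endpoint where a building publishes its registered devices.
REGISTRY_URL = "https://registry.example.org/buildings/{building_id}/devices"

def list_devices(building_id: str) -> list[dict]:
    """Fetch the building's declared devices and their privacy disclosures."""
    with urlopen(REGISTRY_URL.format(building_id=building_id)) as resp:
        return json.load(resp)

def summarize(devices: list[dict]) -> None:
    """Print a one-line privacy summary for each registered device."""
    for d in devices:
        print(f"{d['name']}: collects {', '.join(d['data_collected'])}; "
              f"policy at {d['policy_url']}")

# Example usage, assuming such a registry actually existed:
# summarize(list_devices("example-hall"))
```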

Aaron Nathans:
What are some common devices that people might not know about when they enter a building and what kind of information are they collecting?

Lorrie Cranor:
Increasingly these days, buildings have smart light bulbs, smart thermostats, video cameras. Sometimes they may have audio sensors or vibration sensors, so there are lots of sensors in a building, lots of environmental sensors. Many of these sensors are not collecting personal information; they’re really just helping maintain the HVAC system in the building. But some of them are collecting personal information. They may be collecting video and doing face detection on the video they collect. They may be tracking your cell phone so they can track your path through the building. In a shopping mall, they may be interested in where you go in the mall. There are lots of reasons for doing the face detection, some of which might be things that people appreciate, and some might be things that people would really rather not happen.

Aaron Nathans:
Are these IoT devices able to figure out who I am if I walk into a store, if I walk into a building? Can they reach into my phone and figure out personally identifiable information?

Lorrie Cranor:
They may be able to, especially if they overlay the information they collect with other databases. If they’re doing face recognition and they have a database of faces, then they can match your face against that database. If you are buying something in a store and you hand over your credit card at a point of sale, now they have your name, and they might be able to match that with the ID on your phone. So yeah, they may not be able to recognize your phone without any other help, but there are multiple ways that they could get that help to connect it with you.
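
(Editors’ note: a toy illustration of the kind of record linkage just described, joining a sensed phone identifier to a name via a point-of-sale record. All names, identifiers, and fields are invented.)

```python
# A log of anonymous phone identifiers seen by in-building sensors:
sensor_log = [
    {"phone_id": "aa:bb:cc", "time": "14:02", "zone": "entrance"},
    {"phone_id": "aa:bb:cc", "time": "14:10", "zone": "register 3"},
]

# Point-of-sale records, which carry a real name from the credit card:
pos_records = [
    {"register": "register 3", "time": "14:10", "name": "J. Doe"},
]

# Naive join: a phone seen at a register at the time of a purchase is
# tentatively linked to the purchaser's name.
for ping in sensor_log:
    for sale in pos_records:
        if ping["zone"] == sale["register"] and ping["time"] == sale["time"]:
            print(f"Phone {ping['phone_id']} possibly belongs to {sale['name']}")
```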

Aaron Nathans:
Why would they want to connect it with you?

Lorrie Cranor:
Well, I can think of all sorts of reasons why they might want to connect it with you. The biggest reasons are probably related to marketing and sales. Stores are interested in knowing their customers better so that they can better target you and sell you more things. But there may be other reasons. There may be public safety reasons, knowing where people are in a building in case they have to evacuate in an emergency. There may be law enforcement reasons, finding out who was around when a crime was committed. All sorts of reasons.

Aaron Nathans:
Well, finally, you’ve been a big advocate for diversity and inclusion in computer science. What advice do you have for women who are thinking about a career in this area, but might hesitate because they don’t see equal representation in the field today?

Lorrie Cranor:
Yeah, women have been underrepresented in this field for a while, but I think we’re starting to see some movement. In undergraduate computer science classes at my university, we now have 50/50 men and women, and an increasing number of universities are reaching that, so things are looking up. I would encourage young women who are interested in getting into this field to pursue it. I don’t want to say that there are no barriers, but I think the barriers are decreasing, and I think the computer science field is very much open to people of all kinds.

Aaron Nathans:
Well, this has been a great conversation. I really enjoyed speaking with you.

Lorrie Cranor:
Thank you.

Aaron Nathans:
We’ve been speaking with Lorrie Cranor. Lorrie is a professor at Carnegie Mellon University in computer science and in engineering and public policy. I want to thank Lorrie as well as our recording engineer, Dan Kearns. Thanks as well to Emily Lawrence, Molly Sharlach, Neil Adelantar, and Steve Schultz. Cookies is a production of the Princeton University School of Engineering and Applied Science. This podcast is available on iTunes, Spotify, Stitcher, and other platforms. Show notes and an audio recording of this podcast are available at our website, engineering.princeton.edu. If you get a chance, please leave a review; it helps. The views expressed on this podcast do not necessarily reflect those of Princeton University. I’m Aaron Nathans, digital media editor at Princeton Engineering. This concludes our second season of Cookies. Thanks for listening. Peace.
