Barton Gellman deletes his account
To kick off our second season, we’re honored to welcome Barton Gellman, Princeton Class of 1982.
Gellman has won multiple Pulitzer Prizes, including for his groundbreaking work with The Washington Post in 2013 to reveal widespread surveillance by the National Security Agency. The stories showed that even though they weren’t the targets, law-abiding American citizens could still find their private email, social media content, and online activity swept up by our national security apparatus. Privacy has long been a passion of Gellman’s, and today we’ll ask him for tips we can use to make our own digital lives more private, from email to text messaging to apps and the cloud. He talks about trade-offs he’s willing to make to be a full participant in the digital revolution, as well as one popular service he distrusts so much, he vows to delete his account entirely. And we’ll also talk about his book, “Dark Mirror: Edward Snowden and the American Surveillance State.” Bart Gellman was a visiting fellow at Princeton’s Center for Information Technology Policy a few years back.
Links: BartonGellman.com, his website
During this episode, he discusses various pro-privacy services, including the Tor and Brave browsers, the Signal messaging app, the Tresorit cloud service, the YOPmail disposable email service, and SecureDrop.
“U.S., British Intelligence Mining Data From Nine U.S. Internet Companies in Broad Secret Program,” Washington Post, June 7, 2013.
Pulitzer Prizes:
2014, The Washington Post
2008, The Washington Post
Transcript:
Aaron Nathans:
From the Princeton University School of Engineering and Applied Science, welcome to season two of Cookies, a podcast about technology, privacy, and security. I’m Aaron Nathans. On this podcast, we’ll discuss how technology has transformed our lives, from the way we connect with each other, to the way we shop, work, and consume entertainment, and we’ll discuss some of the hidden trade-offs we make as we take advantage of these new tools. Cookies, as you know, can be a tasty snack, but they can also be something that takes your data.
On today’s episode, we’re honored to welcome Barton Gellman, Princeton Class of 1982. Bart has won multiple Pulitzer Prizes, including for his groundbreaking work with The Washington Post in 2013 to reveal widespread surveillance by the National Security Agency. The stories showed that even though they weren’t the targets, law-abiding American citizens could still find their private email, social media content, and online activities swept up by our national security apparatus.
After the 9/11 attacks, the NSA was handed the ability to tap the records held by leading U.S. internet companies without a warrant based on probable cause. Gellman’s source later famously unmasked himself as Edward Snowden, a former national security contractor, who, accused of espionage and theft of government property, is now living in Russia. Both Gellman and Snowden share a strong appreciation for using private and secure digital communications. That helped them build trust and allowed Gellman and his collaborators to evade anyone who might’ve tried to interfere with their reporting. But privacy has long been a passion of Gellman’s, and today we’ll ask him for tips we can use to make our own lives more private, as well as talk about his book, “Dark Mirror: Edward Snowden and the American Surveillance State.”
Bart Gellman was a visiting fellow at Princeton’s Center for Information Technology Policy a few years back. Let’s get started. Bart, welcome to the podcast.
Barton Gellman:
Thanks for having me.
Aaron Nathans:
All right. So you noted in your book that your inclination toward privacy has been so strong that your former colleagues at the Post used to tease you that you may have worn a tinfoil sleeping cap to ward off radio beams. Now, I’m sure you don’t, but can you remember the origin of your appreciation for privacy? And does it predate the digital revolution or is this only about technology?
Barton Gellman:
It coincided with the digital revolution, and my growing awareness of the digital exhaust that everyone leaves behind. And also my clear awareness that where I stored my notes mattered a great deal. I mean, I took care with my handwritten notebooks to keep them in my control. And I realized that the default setup in the Washington Post newsroom put all my notes on a network drive that administrators and supervisors had access to, which meant that potentially other snoops around the newsroom had access and potentially outside forces. So I started off thinking about stored communications and I began saving only local copies of my notes. And then I learned about encryption, so I began saving only encrypted copies of my notes.
And gradually I became aware of more and more attack surface, more and more vulnerabilities to interception. The reason I worried about it was that there were government investigations after stories that disclosed secrets, and I knew that they were trying to find out who our sources were and from time to time they were arresting those sources. I thought it’s all well and good for us to promise anonymity and keep our word, but if we’re leaving their identities or our conversations out there in plain view for someone with surveillance authorities, then we’re not doing a very good job.
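(A minimal sketch of the note-protection habit Gellman describes above, keeping only an encrypted local copy of a notes file, written in Python with the “cryptography” package. The file names and the choice to keep the key in a local file are illustrative assumptions, not a description of his actual setup.)

```python
# Sketch: store notes only as an encrypted local file, with the key kept
# under your own control. Requires: pip install cryptography
from pathlib import Path
from cryptography.fernet import Fernet

KEY_FILE = Path("notes.key")        # hypothetical key file you keep in your control
NOTES_FILE = Path("notes.txt.enc")  # only the encrypted copy is ever written to disk

def load_or_create_key() -> bytes:
    if KEY_FILE.exists():
        return KEY_FILE.read_bytes()
    key = Fernet.generate_key()
    KEY_FILE.write_bytes(key)
    return key

def save_notes(plaintext: str) -> None:
    fernet = Fernet(load_or_create_key())
    NOTES_FILE.write_bytes(fernet.encrypt(plaintext.encode("utf-8")))

def read_notes() -> str:
    fernet = Fernet(load_or_create_key())
    return fernet.decrypt(NOTES_FILE.read_bytes()).decode("utf-8")

if __name__ == "__main__":
    save_notes("Interview notes: confirmed the timeline with the source.")
    print(read_notes())
```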
Aaron Nathans:
If the authorities wanted your encrypted notes, would you be required to help them decrypt them?
Barton Gellman:
Well, that’s a question mark, but for sure I’d be aware. The worst case scenario for me was that somebody would go with a warrant to the newsroom IT department and say, “Hand us these notes from Gellman,” and I wouldn’t even find out about it, or I might not find out until afterwards, or it wouldn’t be my decision. In the case of Scooter Libby, who was Vice President Cheney’s chief of staff and was suspected of leaking and lying about it, Time Magazine handed over notes belonging to its reporter over the reporter’s objection. Time was facing a subpoena and penalties for contempt if it did not comply. Those penalties included stiff fines, and the company decided to comply. There were circumstances in which I might not have complied myself if I were that reporter and the reporter was objecting.
So I realized I wanted to leave this as my decision. And yes, that exposed me to the risk of contempt if I refused to decrypt my notes, but again, the decision would be mine and I would have control.
Aaron Nathans:
So here’s a strange question, but it bears asking, do you use a cell phone?
Barton Gellman:
I do. So I’m well aware that I’ve already given up a lot of privacy just by the nature of using a cell phone. It’s become so important to basic functioning in society that even though I do know a couple of people in the privacy sphere who either minimize use of cell phones or still use an old dumb flip phone, I can’t make that decision myself. The power of the information revolution in my pocket is so great that I’m just not willing to give it up, except for certain kinds of communications that need to be more secure.
Aaron Nathans:
Can you elaborate? What are those communications?
Barton Gellman:
Well, those are source communications. Those are communications that need to remain more private. If I’m talking to someone or I want to talk to someone whom I hope to have confidential conversations with that might attract the interest of an investigator down the road, then I try to use means that are more secure. I offer a bunch of options to people on the contacts page on my website.
Aaron Nathans:
Where do you draw the line between the sort of… like if you’re just going to the bakery, or if you’re going to a gas station, the phone leaves a trail of breadcrumbs, someone might know that you made that trip. I mean, do you object on principle to that or are you willing to accept that loss of privacy because it really just doesn’t matter?
Barton Gellman:
I do object. I object in principle, I rebel against it in my mind, but if it’s a trip to the bakery, I just swallow hard and accept it. But I don’t believe that our location information should belong to anyone but us. And although the phone company needs to have it in order to connect my phone to cell towers and may conceivably need to store it briefly for billing purposes, I don’t think they should have the right to do anything whatsoever with that information other than what’s required to perform the service that I’m paying for. That’s not the state of the law right now, and I wish it were.
Aaron Nathans:
Are there ways that you use your phone that may be different from the rest of us?
Barton Gellman:
Well, I have more restrictive settings than the defaults on location information. For one thing, I use a VPN that I’ve satisfied myself is not itself a consumer of data to sell about me. I think very carefully about what I put in email. I prefer Apple iMessage over email, and I prefer Signal over iMessage for routine communications because they’re more private. Email in particular is a crazy way to talk about things that you don’t want someone else to read. If someone sends me something by email that I wish they hadn’t sent, I delete it from the server, because I am a target of hacking.
I mean, both in the sense that we all are and the sense that there are times when I am of particular interest to outsiders, and I just don’t want them, if they happen to manage to get into my accounts, I don’t want them to see personal details of my life, even if it… I mean, I know that my confidential source communications are not going to be open to view that way, but I’ve got plenty to hide in terms of personal privacy. And I think everyone should stop and think about it.
Aaron Nathans:
In the book, you talk about the potential. It might be a far-flung potential, but maybe you can expand on this a little bit. Can someone really turn on your phone without you knowing it and use it as a remote controlled microphone to spy on you?
Barton Gellman:
Yes. That’s a well-known capability of the NSA and now the FBI. There have been cases of what’s known as a lawful intercept in the intelligence and law enforcement community, meaning they’ve got legal authority to do it, in which they have exploited vulnerabilities in the system that allow them to use this remote microphone. There have also been cases overseas in Europe, I’m trying to remember which country it was, I think it was Greece, where hackers found and took over, or gained persistent access to, this surveillance technique. So they were actually able to turn on the remote mics of telephone numbers at will, using tools that had been created for government.
Aaron Nathans:
Do ordinary Americans have to worry about that sort of thing?
Barton Gellman:
Well, that’s very much a targeted means of surveillance. It doesn’t lend itself to bulk surveillance. So unless you become of interest to government or sophisticated hackers, then you don’t have to worry about that particular method. But there’s lots of surveillance that happens to everyone, and the more they learn about that at a granular level, the more alarmed most people become. We’re accustomed to not thinking about it, and we’re discouraged from thinking about it by extremely vague and opaque terms of service and privacy policies that no one reads, especially because they are long and opaque and you can’t figure out what they mean in practice. But if you were to become aware of how much so many people know about you, it would creep you out.
Aaron Nathans:
Sure. It seems like the ordinary American is overwhelmed by this whack-a-mole: you secure yourself in one area and you leave yourself wide open in another. I mean, do you think that’s on purpose? Are they trying to overwhelm you into complacency?
Barton Gellman:
Yeah, I think they’re counting on that feeling of being overwhelmed. And every time someone comes up with a privacy technique that reduces surveillance, clever people with very high incentives financially come up with new ways around that. And so these days, the one self-defense mechanism that most people know, which is to delete the cookies on your browser isn’t very effective anymore because there are multiple generations of tracking technology that don’t rely on cookies.
Aaron Nathans:
In the book, you describe the internet as a TV that watches you. What did you mean by that? And what kind of internet browser do you use when you’re off the clock? And are you worried about yourself being watched?
Barton Gellman:
It’s a TV that watches you because your browsing habits are minutely tracked: not only the pages you go to, but in many cases how long you spend on the page and where the focus of your cursor is. They’re watching you watch them, and they’re drawing conclusions from that, and they’re aggregating it with enormous amounts of other data about you to form a profile of your interests and your demographics. And some of those things touch on highly personal matters, medical or sexual, or communications with people, that you’d rather not advertise.
For me, when I’m browsing for something sensitive, I mean, suppose a close friend tells me they’ve got a dread disease and I want to know more about it, I’ll use the Tor Browser, which is simple. There’s very little sacrifice in using the Tor Browser now. It’s slower than regular browsing because it bounces your communications around the world to disguise where you’re going and who you are, but it’s built on Firefox and it works just like a regular browser and it’s not a big sacrifice to use that.
Except for speed. So for my everyday browsing, I use something called Brave, which is built on the Chromium open source code, so it works quite a lot like Chrome. But all the built-in spying that Google puts in Chrome is gone, and there’s a lot of protection by default to block tracking technology from the websites you go to. So it works well, it works smoothly. It’s a good clean browser and it protects you from a lot of, essentially, the spy industry that tracks you.
Aaron Nathans:
What’s the name of that browser again?
Barton Gellman:
Brave.
Aaron Nathans:
Brave.
Barton Gellman:
I think it’s either brave.net or brave.com to get it. It’s as easy to use as any other browser and I highly recommend it. The people who run that project are privacy advocates known to me and I trust them. And I mean, I avoid Google Search for most things. I use DuckDuckGo, which is almost as good and highly private.
Aaron Nathans:
Do you ever use your credit card to buy something over the internet? And how do you feel about that?
Barton Gellman:
I do use a credit card to buy things on the internet, I do internet banking. I’m not an all-cash person, but there are things that I’ll buy for cash because I prefer not to keep a record of it. And depending on what my threat model is, if I was worried about government surveillance, for example, I’ve bought throwaway phones and throwaway laptops for cash. But for everyday transactions, if I think a site is a tiny bit sketchy or especially nosy about my life, I’ll use a disposable credit card, one of these places that gets you a temporary number that doesn’t track back to your real address and you don’t use for anything else. But most of the time I just use regular credit cards. I mean, I’m willing to participate in the digital economy.
Aaron Nathans:
I guess that speaks to the overall question of where your threshold is. During COVID, we’re all talking about our own risk threshold. Where do you think in general your risk threshold sits in the digital world? Is there something that you’re willing to risk? Is there a risk you’re willing to take where, if worse comes to worst, you can live with it?
Barton Gellman:
Well, I mean, I want to be a full participant in the modern information and financial economy so I have to accept certain compromises. There are privacy people who take many more precautions than I do. All but one of the people I know who still use flip phones worked for the NSA, so they know the capabilities and they just choose not to participate. But I find smartphone apps so powerful a tool in my life that I’m not willing to give them up. I just, I am conscious every time I use it that there is something of a public nature in what I’m doing. And so there might be products I wouldn’t buy using a smartphone or online at all. There may be internet searches that I’ll use only on Tor, things like that.
Aaron Nathans:
You spoke before about the “contact me” page on your website. And it’s really fascinating, you say the best way to reach you if you’re a source or a potential source is to send you a message on Signal. And you list your number and you state that you don’t use it as a phone. Do you ever use ordinary text messages? And why don’t you trust them for that purpose?
Barton Gellman:
We’re talking about three things here, really: SMS, iMessage, and Signal. The iPhone is so prevalent that a lot of people are using iMessage, which is different from an ordinary text message in that it travels encrypted on Apple’s network. SMS, the short message service that is the literal text message, travels in the clear over multiple phone company networks and is easily readable and storable, and is read and stored by phone companies, so the content is not secure at all. iMessage is pretty secure. The thing is, it is stored in the Apple cloud, hard to delete, subject to law enforcement subpoenas, and could theoretically be read by Apple employees under some circumstances.
Aaron Nathans:
Is that only Apple to Apple communication?
Barton Gellman:
That’s only Apple to Apple. So you can’t use iMessage unless you’re going iPhone to iPhone or Mac to Mac or Mac to iPhone. Then there’s Signal. Signal is very well encrypted, using an elegant and well-audited encryption technology. It’s open source. And Signal doesn’t store messages at all after they’re sent and received. The only thing that Signal stores is when your account was created and the last time it connected to the Signal network. And so even under subpoena, and this has been demonstrated with numerous subpoenas, that’s the only thing that Signal can provide to law enforcement. It can’t provide any content and it can’t provide metadata such as the fact that Joe and Jane sent 35 messages to each other on this date and have been talking periodically for the last six months. Signal doesn’t store that information and can’t provide it, so I like it.
Now, you still have to give a phone number; you have to have a phone number to create a Signal account. But for example, my public Signal number is not my actual phone number. You can use, for example, a phone booth to create a Signal account. And so if someone dials the number using a regular phone, they’ll reach the phone booth, but if they send a Signal message, it’ll come to your Signal device. That way you can keep anonymity. I’m not anonymous, because I’m attaching my name to that number on my website, but it saves me from being spammed. If need be, I could abandon that number and get a new one without changing my real phone number.
Aaron Nathans:
So you write on your site the… this is really interesting, a quote that, “Before you reach out, give some thought to what you want to keep private, who might try to listen, and how much you care.” And then you write, “If you have a genuine reason to fear surveillance by a well-resourced adversary, you may reach me on SecureDrop, a system designed for maximum privacy.” Edward Snowden clearly was attracted to your work because you know how to protect information. In your work, have you started to attract others who actually contact you like this? And is the information typically of much value?
Barton Gellman:
Well, that’s interesting. Advertising a SecureDrop instance is a mixed blessing. First of all, you have to give an enormous amount of credit to the Freedom of the Press Foundation, which took over the SecureDrop project and puts a lot of resources into it and has created the most secure system that is not out of reach of ordinary mortals. You don’t want to require someone to take a six-month course on computer security to be able to make contact with you. And there are a lot of pitfalls if someone tries to come up with their own operational security, mistakes that will cause the whole thing to fail.
SecureDrop is built so that it is idiot-proof for both the sender and receiver of information. It is almost not possible to make a mistake that would compromise your security. It requires someone to download the Tails operating system, and it takes care of anonymity and encryption. You could also use the Tor Browser, although a bit less securely.
But in any case, it does solve the initial problem of, how do you initiate first contact with someone without leaving a trail that can be traced back to you? Which is a hard problem and SecureDrop solves that. The trouble is that people who are attracted to secret spooky things are often paranoid or misguided or deluded, or obsessive about something. And so the honest truth is, a large fraction, a very large fraction of the communications I receive through SecureDrop are not of interest to me and many of them are just deluded people or obsessed people who think I’m going to be deeply interested in their struggles with their phone company or whatever the case may be. I burn a decent amount of time going through the process of accessing and decrypting these communications and deciding what to do with them. But from time to time, something very interesting comes along.
Aaron Nathans:
I don’t want to get in the way of your ongoing projects, but suffice it to say, do you think that something of ongoing use has come through that platform?
Barton Gellman:
I don’t have anything at the moment that’s an ongoing project that’s using SecureDrop, but I have had.
Aaron Nathans:
I see you use Twitter. Why do you feel comfortable on that platform? And are you on any of the other social media platforms?
Barton Gellman:
I’m least uncomfortable with Twitter. I maintain a LinkedIn page just passively. I don’t post anything and I don’t do a lot to maintain my network on there, but I use it almost as a directory. If you want to know some things about me and my employment history, here it is. I have a Facebook account that I keep meaning to delete. It’s entirely passive. I do nothing but lurk once a month or so to see what my kids are doing, or a few relatives or close friends. I now have, it could be a thousand, unanswered friend requests. There are many hundreds. I just don’t interact with it at all. I can’t stand Facebook and I think it’s a force that does a lot more harm than good and does everything it can to track me, and I’m just not interested in engaging my social life or communications life on there.
Twitter is just so useful a tool that I accept the amount of metadata I’m giving away by using it. I find it very useful for direct communications with people I want to reach, and I find it even more useful for tracking news and events that I care about. I’m very likely to find out something of considerable interest to me first on Twitter before I find out any other way. And there are some things I might never have learned because I curate who I follow and I make lists. I have Twitter lists, which I use primarily by interest.
Aaron Nathans:
Twitter is usually how I hear about news first.
Barton Gellman:
Yeah, it’s remarkable. I remember when there was an earthquake that did palpably reach New York City some years back. It originated down closer to Washington, D.C. It was a large one. I learned about it on Twitter before the tremors even reached New York. So that was pretty good speed.
Aaron Nathans:
Is it possible to lurk on Facebook and not post without giving up your privacy?
Barton Gellman:
You’re giving up quite a bit of privacy just being on Facebook and having an account. And certainly, if you click on links and if you make and accept friend requests, you’re creating a psychometric profile that Facebook exploits. Certainly, if you send messages or receive messages, and it’s hard to be on Facebook without those things happening. I think inspired by this conversation, I’m going to delete my account. I don’t know when I last logged on. Yes, and I do thank you for this public service.
Aaron Nathans:
Anytime. You’re listening to Cookies, a podcast about technology, security, and privacy. We’re speaking with Barton Gellman, a journalist whose groundbreaking work with The Washington Post in 2013 revealed widespread surveillance by the National Security Agency and brought the name Edward Snowden into the national consciousness. On our next episode, we’ll talk with Ruby Lee, the Forrest G. Hamrick Professor in Engineering and a Professor of Electrical and Computer Engineering here at Princeton. As a chief computer architect at Hewlett-Packard in the 1980s, she was a leader in changing the way computers are built, and she revolutionized the way computers use multimedia. We’ll talk about how to make the hardware in computers more secure without sacrificing performance.
It’s the hundredth anniversary of the Princeton School of Engineering and Applied Science. To celebrate, we’re providing 100 facts about our past, our present, and our future, including some quiz questions to test your knowledge about the people, places, and discoveries that have made us who we are. Join the conversation by following us on Instagram at @eprinceton. That’s the letter E, Princeton. But for now, back to our conversation with Bart Gellman.
In the book, I read an interesting passage about how Snowden covered his keyboard with a blanket to type in his password. What did you learn from Edward Snowden about keeping your communication secure? And do you recommend going to that extreme?
Barton Gellman:
Well, Edward Snowden had the highest threat model that it’s possible to imagine… You ask who your adversary is, well, his adversary was the world’s most capable surveillance agency. It was already watching him in the sense that it watches all its employees. He was doing something inside the NSA that he didn’t want anyone to find out about until he was ready. The stakes were his freedom. He even believed that he could be killed, something I did not personally believe but it entered his thinking. So he had very high stakes, a very capable adversary, and so he did everything possible to avoid surveillance. And one of the threats if you are under surveillance is that someone’s got a pinhole camera stuck in your wall.
Now, if the FBI is surveilling you, that is quite possible. That’s one of their standard techniques, to record clandestine video of a surveillance target, in which case they could watch you type your password on the keyboard. And so he thought it’s probably not necessary. It’s a little bit of a pain in the ass, but it’s not that big a deal to cover your keyboard with a blanket. It doesn’t cost you anything except a few seconds’ time and maybe feeling silly, and it might help in some circumstances, so why not? That was the way he thought about it. I don’t cover my keyboard with a blanket. I tend to think that if there’s a pinhole camera in my wall, my operational security is so thoroughly blown that I no longer have to worry about precautions.
Aaron Nathans:
True. In the book, you seem to be learning things by necessity as the investigation to report the story goes on, things like using laptops that aren’t connected to the internet or taking the battery out of your phone. Was there anything that he showed you that seemed like it has ongoing use in your professional life?
Barton Gellman:
Yeah. I mean, there are levels and levels of using encryption, for example when… We used primarily two methods. SecureDrop, by the way, didn’t exist at this time. We used email with anonymous accounts, that is, accounts using fake and randomized names, accessed only over the Tor anonymity network, and of course encrypted using PGP. So someone watching would see an encrypted blob of communications going from Donald Duck to Mickey Mouse and would not be able to tell where in the world those accounts were being accessed. That was one thing. And the other was live chat using Jabber, again with anonymity switched on and encryption and fake names.
The thing is, each of those methods uses an exchange of encryption keys, which you have to verify: “Yes, I recognize this key as being valid.” Well, how do you exchange keys? To do that, and Snowden showed me how, you use a second channel. So for example, you could use an encrypted email to verify a Jabber key or vice versa, so you’re using two different channels of communication to authenticate yourselves. And for example, sometimes he would say, “Let’s change accounts.” And so he would write me as Mickey Mouse and say, “I’m going to create a new account called Pluto. Here’s the encryption key.” That would be encrypted to me at Donald Duck and I would reply back saying, “I’m going to be Daffy,” and send him the encryption key to Daffy.
And so now you’ve got two completely unrelated accounts, and let’s just assume that we were not using characters from the same cartoon universe. So it was Mickey and Donald talking, now it’s Pluto and Daffy talking, and there’s no external reason based on metadata for any observer to link the two. So we would periodically change keys that way.
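(The second-channel trick works because both sides can compare a short fingerprint of a newly exchanged key over a channel an attacker would have to compromise separately. The Python sketch below illustrates that general idea with a SHA-256 fingerprint; it is an assumed, generic example, not the PGP and Jabber tooling Gellman and Snowden actually used.)

```python
# Generic illustration of out-of-band key verification: channel A delivers the
# key material, channel B delivers its fingerprint for comparison.
import hashlib

def fingerprint(public_key_bytes: bytes) -> str:
    """Short, human-comparable digest of a public key."""
    digest = hashlib.sha256(public_key_bytes).hexdigest()
    # Group into 4-character blocks so it is easy to read aloud or paste into a chat.
    return " ".join(digest[i:i + 4] for i in range(0, 32, 4))

def verify(received_key: bytes, fingerprint_from_second_channel: str) -> bool:
    """True only if the key from channel A matches the fingerprint quoted on channel B."""
    return fingerprint(received_key) == fingerprint_from_second_channel.strip().lower()

# Hypothetical example: a key arrives by encrypted email, and its fingerprint
# is read back over an encrypted chat session.
key_from_email = b"...public key bytes received on the first channel..."
print("Confirm on the second channel:", fingerprint(key_from_email))
print("Verified:", verify(key_from_email, fingerprint(key_from_email)))
```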
Aaron Nathans:
Let’s talk about the cloud. You write that putting your information in the cloud is akin to giving up control. How so?
Barton Gellman:
There’s a British security expert, his name is Graham Cluley, and he coined the very simple but revealing aphorism that the cloud is just another name for somebody else’s computer. So you are deciding to store material that may be confidential to you, and maybe even in a very consequential way. You’re saying, “Instead of storing it here on my computer, I’m just going to store it on someone else’s computer, a stranger’s computer. And there will be other people who can read my file. And in fact, I can’t read my own file without going to these other people and asking for them to send it back to me.” And that just seems like a bad deal to me when you’re dealing with confidential material.
Now, it’s the business model of these cloud companies to persuade large companies who are going to be paying customers that the information is secure enough that a company can be comfortable with it, but it is not secure, again, against law enforcement subpoenas, and it’s not potentially as secure against hackers as your own storage might be if you’re taking good precautions. And it’s a very attractive hacking target because it’s got such an enormous volume of valuable information on it.
So I use cloud services for some things, for some routine things. I use them especially for storing encrypted volumes. So I will back up encrypted files to the cloud, because the cloud has big advantages in that if you have a fire at your house, you’re not going to burn your only backup copy in the fire. It’s good to have remote backups when possible. But I don’t want to have my unencrypted files backed up that way, not most of the time.
Aaron Nathans:
So should those of us who keep our photos or music in the cloud but don’t deal in national security matters reconsider our casual use of the cloud?
Barton Gellman:
I don’t know, photos can be pretty revealing. If I had my druthers, people would use something called a zero knowledge cloud system. That’s a system that has been engineered so that the hosting company, and you’re going to have to pay for this, it’s not going to be a free service, the hosting company doesn’t have access to your files, can’t look at your photos no matter what, because before they leave your computer, they’re encrypted with a key that only you know. Now, that means two things. It means that you can’t forget your password because the company can’t rescue you if you do. And you’re probably going to have to pay, let’s say a hundred bucks a year. And for that price, you get remote redundant storage that only you can access, and you don’t have all your personal information and your tax returns and your shopping habits and so forth and all your kids’ school files and medical files stored where other people can read them.
So I would recommend for people who are interested in that, there are products that I’ve used myself that are, again, they’re built with this zero knowledge principle. One is called SpiderOak and one’s called Tresorit.
Aaron Nathans:
What’s that second one?
Barton Gellman:
Tresorit. It’s T-R-E-S-O-R-I-T.
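(The “zero knowledge” property Gellman describes comes down to encrypting on your own machine, with a key derived from a passphrase only you know, before anything reaches the provider. Here is a minimal Python sketch of that principle using the “cryptography” package; the sync-folder path and file names are hypothetical, and this illustrates the idea rather than how SpiderOak or Tresorit are actually built.)

```python
# Sketch of zero-knowledge storage: encrypt client-side with a passphrase-derived
# key so the cloud provider only ever holds ciphertext. Requires: pip install cryptography
import base64, os
from pathlib import Path
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

SYNC_DIR = Path("CloudSyncFolder")  # hypothetical folder your provider syncs and uploads

def key_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase.encode("utf-8")))

def store_encrypted(local_file: Path, passphrase: str) -> None:
    salt = os.urandom(16)  # stored alongside the ciphertext; it is not a secret
    token = Fernet(key_from_passphrase(passphrase, salt)).encrypt(local_file.read_bytes())
    SYNC_DIR.mkdir(exist_ok=True)
    (SYNC_DIR / (local_file.name + ".enc")).write_bytes(salt + token)

def retrieve_decrypted(name: str, passphrase: str) -> bytes:
    blob = (SYNC_DIR / (name + ".enc")).read_bytes()
    salt, token = blob[:16], blob[16:]
    # If you forget the passphrase, this fails: no one, including the provider,
    # can recover the plaintext for you.
    return Fernet(key_from_passphrase(passphrase, salt)).decrypt(token)
```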
Aaron Nathans:
Do you get the sense that ordinary Americans’ information is any more or less secure with the new administration or after the January 6 Capitol invasion?
Barton Gellman:
I don’t think anything’s changed with the change in government that we know of or after the Capitol insurrection. I think what has changed pretty dramatically in recent years is that the security and privacy of the internet were both considerably enhanced by the Snowden disclosures and by the reactions they provoked in the technology companies. Before people understood the nature of the privacy threat, there was no market demand for privacy. There was no market demand for security, or not nearly as much. As soon as the government’s spying was disclosed, people started talking more about commercial spying as well. And then in order to protect their market shares, companies like Google and Microsoft and Facebook had to step up their security measures considerably.
There had been laggards like Yahoo, which had refused to encrypt connections between Yahoo servers and user computers for years. It had been asked to do so by privacy advocates and it hadn’t bothered to do so. The reason why almost every site on the internet that you’re likely to encounter now has a little lock icon in the browser’s address bar, meaning that it’s running over HTTPS, the S standing for secure, the reason that all these websites are encrypted by default now, is Edward Snowden. It changed dramatically in just a couple of years, beginning in mid-2013 when his disclosures began.
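(The lock icon Gellman mentions just means the browser negotiated an encrypted TLS connection to the site. For the curious, here is a small Python sketch, standard library only, that makes that handshake visible; the hostname is an arbitrary example.)

```python
# Peek at what the browser's lock icon represents: a TLS handshake with the site,
# plus a certificate identifying it. Standard library only.
import socket
import ssl

hostname = "engineering.princeton.edu"  # any HTTPS site; this one is just an example
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        print("Protocol negotiated:", tls.version())
        print("Certificate subject:", dict(item[0] for item in cert["subject"]))
        print("Certificate expires:", cert["notAfter"])
```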
Aaron Nathans:
And that was remarkable. I know that you’ve covered this extensively in other podcasts and interviews about the legacy of the Snowden revelations and I don’t want to waste a lot of your time repeating what you’ve already said many times, but it is very remarkable, the legacy of this outstanding reporting that you did and the risks that he took to get the word out.
What kind of general advice do you have? Given everything that’s changed since your stories came out and everything that you just told me in the last couple minutes about the changes that each one of these platforms has made in response to your reporting, given the state of the art right now, what general advice do you have for anybody who’s looking to up their privacy game?
Barton Gellman:
Well, I would at least glance through the terms of service and the privacy policies, because they do differ in material ways from place to place. And if the privacy is good, there will be clear, plain-English explanations of that near the top. And if it’s not good, just consider: do you really need to use this service? Is it really important for you to play this game? If you do want to do it, consider using throwaway email addresses. Most websites don’t really need to know, from your point of view, what your email address is.
So you can go to something like YOPmail, Y-O-P-mail.com, and create a throwaway address to get verified by the website, and then you never need to check it again. Don’t give out your cell phone number if you can help it. And if you can’t help it, then get one of the free or cheap apps that let you create a throwaway number on your smartphone and give that out routinely, because it’s almost like a social security number. Your email address and your telephone number are the easiest ways you’re tracked around the internet.
Use DuckDuckGo for your searches, use Signal for your texts, use Brave for your browser. Look at the privacy permissions on your smartphone and turn off access to personal information for any app that doesn’t need it. That takes maybe 10 minutes. Use a password manager, I prefer 1Password, to save your passwords; that enables you to choose a unique and hard-to-guess password for every account. It simplifies it. It actually speeds up your logins to websites compared to not using a password manager, and it’s highly secure. Those would be… Well, actually, my number one before all of that is to install updates for your computer operating system and software as soon as they arrive. Don’t be one of those people who has, like, 17 uninstalled updates because you don’t feel like bothering.
Aaron Nathans:
Why is it important?
Barton Gellman:
Almost every single update includes security fixes. It’s when hackers have figured out a way to break into a system made by Adobe or Microsoft or Apple or Google, and the manufacturer has become aware of this and has patched the system, patched the hole. It has closed the vulnerability, but you have to upgrade to get that patch. I mean, in general, things will work better, but in particular, there are security implications to just about every upgrade there is.
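(Circling back to the password-manager advice a moment ago: the point of a manager is that every account can get its own long, random password. This short Python sketch, standard library only, shows what “unique and hard to guess” looks like in practice; it is a generic illustration, not a feature of 1Password.)

```python
# Generate a unique, hard-to-guess password per account, the way a password
# manager would. The length and character set here are arbitrary choices.
import secrets
import string

def new_password(length: int = 20) -> str:
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(new_password())  # a different 20-character password every time it runs
```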
Aaron Nathans:
Good advice. So finally, as someone who isn’t a tenured professor but knows our community quite well, what is the role of folks at the Center for Information Technology Policy in improving security and privacy?
Barton Gellman:
It’s a fantastic center. It’s something that, I don’t know that it’s unique, but it’s unusual at universities to have world-class technical people collaborating with public policy people and thinking about public policy questions. So it’s the intersection of policy, law, and technology, and it’s very thoughtful about the big issues, which also include things like artificial intelligence, that come at those intersections. There’s been great work done there to show unintended consequences of policies, gaps in the law, technical flaws that belie consumer claims made by various companies, ways that your smart devices at home are insecure or are tracking you that the companies haven’t advertised. So it’s a great center. It’s a real national resource.
Aaron Nathans:
Terrific. Anything else you want to mention before we wrap up?
Barton Gellman:
No, I think you’ve covered it well.
Aaron Nathans:
Thank you. Well, when I was a kid looking at journalism school and then starting out, when I grew up I wanted to do exactly what you’ve ended up doing. So I have a lot of admiration for the work you’ve done, and it’s an honor to have you on the podcast.
Barton Gellman:
Well, it’s an honor to be here. You’re thinking all the time about exactly the issues that interest me.
Aaron Nathans:
Terrific. Well, thank you so much.
Barton Gellman:
Take care now.
Aaron Nathans:
You too. Well, we’ve been speaking with journalist Barton Gellman, a 1982 Princeton graduate. His book is “Dark Mirror: Edward Snowden and the American Surveillance State.” We’ll list a link to it in our show notes. I want to thank Bart, as well as our recording engineer, Dan Kearns. Thanks as well to Emily Lawrence, Molly Sharlach, Neil Adelantar, and Steven Schultz.
Cookies is a production of the Princeton University School of Engineering and Applied Science. This podcast is available on iTunes, Spotify, Stitcher, and other platforms. Show notes and an audio recording of this podcast are available at our website, engineering.princeton.edu. If you get a chance, please leave a review. It helps.
The views expressed on this podcast do not necessarily reflect those of Princeton University. I am Aaron Nathans, digital media editor at Princeton Engineering. Watch your feed for another episode of Cookies soon. Peace.