
Student projects use computing to ensure technology serves society

Elena Lucherini achieved high grades in her computer science master’s courses in Italy, yet she couldn’t help but feel that something crucial was missing. “You code all day and don’t have to care about anything else,” she said. “It was just the nerdy stuff with basically no focus on ethics or policy.”

For Lucherini, technology for technology’s sake wasn’t enough. She was considering quitting the field altogether when a chance meeting with Arvind Narayanan, an associate professor of computer science at Princeton University’s Center for Information Technology Policy (CITP), profoundly changed her trajectory. “Arvind told me that the emphasis at CITP was not just on coding and technology, but on the effect of technology on society as a whole and the impact it can have on your everyday life,” Lucherini said. “That was exactly what I was interested in.”

Now a doctoral candidate in computer science at CITP, Lucherini is building a simulator that mimics algorithmic recommendation systems employed by YouTube, Amazon and more. On social media sites, such systems have been implicated in pushing viewers toward extreme political viewpoints, yet their technical workings and social impacts are poorly understood. Lucherini hopes to use her simulator to better understand the impacts those systems have on users, and to produce results that will inform real-world decisions. “Everything we do at CITP is geared toward having a policy impact or having an impact on society,” Lucherini said. “That’s really unique, and to me, much more interesting than playing computer science.”
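The article does not describe the simulator’s internals, but the feedback loop such systems can create is easy to caricature. In the minimal sketch below (every name and parameter is illustrative, not drawn from Lucherini’s tool), a toy recommender repeatedly serves each user the item that best matches their current tastes, and each interaction nudges those tastes toward what was served:

```python
import numpy as np

rng = np.random.default_rng(0)
N_USERS, N_ITEMS, N_TOPICS, STEPS = 100, 500, 10, 50

# Users and items are random points in a shared "topic" space.
users = rng.dirichlet(np.ones(N_TOPICS), size=N_USERS)
items = rng.dirichlet(np.ones(N_TOPICS), size=N_ITEMS)

concentration = []
for _ in range(STEPS):
    # The recommender scores every item for every user and
    # serves each user their single highest-scoring item.
    scores = users @ items.T
    chosen = scores.argmax(axis=1)

    # Consuming an item pulls the user's tastes toward it,
    # which sharpens the next round of recommendations.
    users = 0.95 * users + 0.05 * items[chosen]
    users /= users.sum(axis=1, keepdims=True)

    # Track how concentrated the average user's interests are.
    concentration.append(users.max(axis=1).mean())

print(f"avg. strongest-topic weight: {concentration[0]:.2f} -> {concentration[-1]:.2f}")
```

Even this toy model exhibits the dynamic researchers worry about: because the system only ever reinforces each user’s current strongest interest, users’ taste vectors grow more concentrated over time.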

Lucherini’s research epitomizes CITP’s mission: to dig into the complicated questions of how technology and computing affect society, both positively and negatively. “It’s not about making computers faster, better and smaller,” Narayanan said. Instead, the program’s “bread and butter” topics, he continued, include privacy, online deception, free speech and expression, freedom to modify devices, and AI policy. CITP’s professors and fellows include not just computer scientists, but also a mix of philosophers, sociologists, political scientists, economists and psychologists.

CITP’s mix of disciplines reflects the complexities of the real world, said Matthew Salganik, a professor of sociology at Princeton University and interim director of CITP.

“I think there’s now a broader realization in our field that it isn’t sufficient anymore to focus only on technology,” said Ben Kaiser, a doctoral candidate in computer science at CITP. “The most challenging problems in modern computer science can’t be solved by technology alone.”

Kaiser’s research in Jonathan Mayer’s group, for example, focuses on the emerging discipline of disinformation studies: how false information spreads on the web, how people interact with it, and how it influences voting, electoral participation and more. While computer scientists have made strides in designing interventions that deter people from clicking on suspected malware or spam – email filters, for example, or pop-up web pages warning that a site may be insecure – they have yet to determine how best to alert users that they may be encountering disinformation. “We need a whole suite of methods for responding to disinformation, and warnings will be a key part of that toolset,” Kaiser said. “My study is the first attempt to look at whether techniques used for other types of security warnings might also work for disinformation, and how to refine them to be more effective.”

To explore this, Kaiser and Jerry Wei, a master’s student at CITP, designed various alerts based on existing warnings built into Google Chrome. Kaiser recruited students from Princeton’s campus to participate in an online information-retrieval task in which some encountered warnings and others did not, then analyzed whether those who saw the warnings changed their behavior, which warnings worked best, and what type of influence each had. “One thing we realized from that study is that clickthrough alone is not a sufficient metric for measuring whether a warning is effective,” Kaiser said. “Unlike a security warning that warns of a direct threat of having your bank account hacked, there are a lot of reasons people might click through a disinformation warning.”

Kaiser is now working on a larger-scale online study using a web tool that mimics a Google search but gives him complete control over the content. By placing participants in a simulated, controlled environment, he can better delve into the nuances of how people respond to warnings in a realistic setting. He is testing how people’s responses vary when the warnings are subtly altered, such as swapping a triangle for a stop sign or the phrase “disinformation” for “fake news,” by drawing on a statistical framework called the multi-armed bandit problem. The framework models how to make efficient decisions with limited resources when multiple options are available, such as a gambler trying to maximize winnings across a row of slot machines.
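In a bandit design, each warning variant is an “arm,” and the experiment adaptively routes more participants to the variants that appear most effective, rather than splitting participants evenly. The sketch below uses epsilon-greedy, one standard bandit strategy (Kaiser’s study may use a different algorithm); the warning labels and response rates are made up for illustration:

```python
import random

# Hypothetical warning variants; labels and rates are illustrative only.
ARMS = ["stop-sign icon, 'disinformation'", "triangle icon, 'disinformation'",
        "stop-sign icon, 'fake news'", "triangle icon, 'fake news'"]
TRUE_DETER_RATES = [0.45, 0.35, 0.40, 0.30]  # assumed, not measured

counts = [0] * len(ARMS)    # trials per variant
values = [0.0] * len(ARMS)  # running estimate of each variant's deter rate
EPSILON = 0.1               # fraction of trials spent exploring

random.seed(1)
for _ in range(5000):
    # Mostly exploit the best-looking variant; occasionally explore.
    if random.random() < EPSILON:
        arm = random.randrange(len(ARMS))
    else:
        arm = max(range(len(ARMS)), key=lambda a: values[a])

    # Simulate one participant either heeding or ignoring the warning.
    reward = 1.0 if random.random() < TRUE_DETER_RATES[arm] else 0.0

    # Incremental mean update for the chosen arm's estimate.
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]

best = max(range(len(ARMS)), key=lambda a: values[a])
print(f"best variant so far: {ARMS[best]} (estimated rate {values[best]:.2f})")
```

The appeal of this design for a study like Kaiser’s is efficiency: participants are a limited resource, and a bandit spends fewer of them on warning designs that are clearly underperforming.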

Kaiser’s work could one day be the basis of warnings deployed throughout the web, which reflects the center’s aim to identify and address “the things that people will be talking about tomorrow,” as Salganik put it, and to translate its findings into real-world changes.

In 2014, for example, Narayanan, his former doctoral student Steven Englehardt, who now works as a Firefox privacy engineer at Mozilla, and his former postdoctoral researcher Güneş Acar released OpenWPM, an open-source program that sends a bot to around 1 million websites every month, pretending to be a real user. The bot collects data about cookies, fingerprinting scripts and other behind-the-scenes tricks that a human user never sees but that affect their privacy. Narayanan, Englehardt, Acar and their colleagues have since published a number of highly cited papers about their findings, as have other researchers; more than 50 papers have been based on OpenWPM. The New York Times and The Washington Post have written investigative stories using the tool.
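OpenWPM itself is a full measurement platform, but the core idea (drive an automated browser to a list of sites and record what they do behind the scenes) can be sketched in a few lines. The example below is written against plain Selenium rather than OpenWPM’s own API, and the site list is a placeholder:

```python
from selenium import webdriver
from selenium.webdriver.firefox.options import Options

SITES = ["https://example.com", "https://example.org"]  # placeholder list

options = Options()
options.add_argument("--headless")  # crawl without opening a window
driver = webdriver.Firefox(options=options)

try:
    for url in SITES:
        driver.get(url)
        # Cookies are one of the behind-the-scenes signals a
        # measurement crawl records at each visited site.
        for cookie in driver.get_cookies():
            print(url, cookie["name"], cookie.get("domain", ""))
finally:
    driver.quit()
```

A production crawl like OpenWPM’s adds far more instrumentation, such as recording the JavaScript calls that fingerprinting scripts make, and scales the visits into the millions.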

“This is an example of CITP’s unusual mode of working, in which we put public interest first and worry about the academic brownie points second,” Narayanan said of his decision to release OpenWPM prior to publishing a research paper about it.

Influencing policy is another way of effecting change. CITP benefits from several faculty members who have spent time working in government, and from numerous graduates who have pursued careers at the federal and state levels. “We’ve built up an understanding of how government works and what levers are available to make a policy impact,” Narayanan said.

To do this even more effectively, in 2019, CITP opened a technology policy clinic – perhaps the first such program outside a law school. “There’s really nothing like it anywhere else,” said Mihir Kshirsagar, lead of the clinic, who previously worked as a lawyer in the Bureau of Internet and Technology at the New York State Office of the Attorney General. “My goal is to help technologists understand how policy functions and to contribute to real-world policy debates, and at the same time have those debates inform the research agenda.”


Arunesh Mathur, a graduate student in computer science at CITP, is drawing on the policy clinic’s resources to more thoroughly investigate how organizations around the world engage in manipulative, sometimes illegal practices. In one study, he and collaborators used machine learning techniques to analyze YouTube influencers who are paid to promote products. The researchers found that just 10 percent actually disclosed that they were advertising a product, leaving the rest in potential violation of several disclosure laws. In another study, Mathur and collaborators identified nearly 1,200 instances of dark patterns: interfaces on big platforms such as Facebook and Google that are designed to trick users into doing things they would not normally do, such as handing over private information or buying something.

Princeton undergraduate Michael Swart, who graduated in 2019, then worked with the researchers to develop Adtuition, an extension for the Chrome and Firefox web browsers that alerts users to undisclosed advertising.
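The article does not spell out how Adtuition recognizes affiliate content, but link structure is one common signal for this kind of tool. The sketch below checks a URL against a couple of well-known affiliate patterns; the specific domains and the ‘tag’ parameter are illustrative examples, not Adtuition’s actual rule set:

```python
from urllib.parse import urlparse, parse_qs

def looks_like_affiliate_link(url: str) -> bool:
    """Heuristic check for URL shapes commonly used in affiliate marketing."""
    parsed = urlparse(url)
    params = parse_qs(parsed.query)
    # Amazon Associates links carry the partner's ID in a 'tag' parameter.
    if "amazon." in parsed.netloc and "tag" in params:
        return True
    # Many affiliate networks route clicks through known redirect domains.
    redirectors = {"go.redirectingat.com", "click.linksynergy.com"}
    return parsed.netloc in redirectors

# A URL shaped like an Amazon affiliate link triggers the check.
print(looks_like_affiliate_link(
    "https://www.amazon.com/dp/B000000000?tag=examplechannel-20"))  # True
print(looks_like_affiliate_link("https://www.amazon.com/dp/B000000000"))  # False
```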

“CITP is a center that looks into issues that matter to the world,” Mathur said. “We study things that are very technical along with the social science, psychology, political science and other disciplines that help us understand how technology impacts society at a much more informed level.”
