Welcome to the Penn Theory Group

Theoretical computer science (TCS) looks at the computational universe around us through the lens of mathematics. Problems in TCS span both the design of new models for computational problems and the study of efficient algorithms (and computational complexity) for various tasks in well-established models. In addition to being central to computer science, in recent decades TCS has forged strong connections with several areas including biology, economics, physics, and law.

The Theory Group at Penn has world-renowned researchers working on core areas of algorithms and complexity as well as applications to areas including algorithmic fairness, cryptography, computational biology, databases, game theory and data privacy.

We host Theory Lunch every Friday! More information can be found here.

Recent Research Collaborators

Theory Lunch at Penn

The theory group will be hosting talks by resident researchers and invited speakers every Friday at noon in Room 401B, 3401 Walnut Street. More details will be posted here a week before each talk. Announcements about each talk will be sent to the theory group listserv, which you can sign up for at this link. The list of speakers for September is below.

September 9: Sanjeev Khanna, University of Pennsylvania 

September 16: Aaron Roth, University of Pennsylvania

September 23: Matt Weinberg, Princeton University

September 30: Rachel Cummings, Columbia University

Abstract for Rachel Cummings’s talk (September 30):

Title: Improving Technical Communication with End Users about Differential Privacy

Differential privacy (DP) is widely regarded as a gold standard for privacy-preserving computation over users’ data.  A key challenge with DP is that its mathematical sophistication makes its privacy guarantees difficult to communicate to users, leaving them uncertain about how and whether they are protected. Despite recent widespread deployment of differential privacy, relatively little is known about what users think of differential privacy and how to effectively communicate the practical privacy guarantees it offers.

This talk will cover a series of recent and ongoing user studies aimed at measuring and improving communication with non-technical end users about differential privacy. The first set explores users’ privacy expectations related to differential privacy and measures the efficacy of existing methods for communicating the privacy guarantees of DP systems. We find that users care about the kinds of information leaks against which differential privacy protects and are more willing to share their private information when the risk of these leaks is reduced. Additionally, we find that the ways in which differential privacy is described in-the-wild set users’ privacy expectations haphazardly, which can be misleading depending on the deployment. Motivated by these findings, the second set of user studies develops and evaluates prototype descriptions designed to help end users understand DP guarantees. These descriptions target two important technical details in DP deployments that are often poorly communicated to end users: the privacy parameter epsilon (which governs the level of privacy protection) and the distinction between the local and central models of DP (which governs who can access exact user data).
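For readers unfamiliar with the two details the abstract mentions, here is a minimal, generic sketch (not from the talk) of how epsilon and the local-versus-central distinction play out for a simple counting query. In the central model a trusted curator sees exact user data and adds one dose of noise; in the local model each user randomizes their own report before sharing it. All names and numbers below are illustrative.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def central_dp_count(bits: list, epsilon: float) -> float:
    """Central model: a trusted curator sees the exact bits and adds
    Laplace(1/epsilon) noise to the true count (a counting query has
    sensitivity 1, so this is epsilon-DP)."""
    return sum(bits) + laplace_noise(1.0 / epsilon)

def local_dp_count(bits: list, epsilon: float) -> float:
    """Local model: each user flips their own bit via randomized
    response, so no party ever sees exact user data; each individual
    report is epsilon-DP on its own."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)  # prob. of truthful report
    reports = [b if random.random() < p else 1 - b for b in bits]
    # Debias the noisy sum to get an unbiased estimate of the true count.
    return (sum(reports) - len(bits) * (1 - p)) / (2 * p - 1)

# Hypothetical data: 600 of 1000 users answer "yes".
bits = [1] * 600 + [0] * 400
print(central_dp_count(bits, epsilon=1.0))  # close to 600 (noise scale 1/epsilon)
print(local_dp_count(bits, epsilon=1.0))    # also near 600, but much noisier
```

The sketch makes the trade-off concrete: smaller epsilon means stronger privacy but more noise in both models, and the local model pays a much larger accuracy cost because every user adds noise independently.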

Based on joint work with Gabriel Kaptchuk, Priyanka Nanayakkara, Elissa Redmiles, and Mary Anne Smart, including https://arxiv.org/abs/2110.06452 and other ongoing work.

Bio: Dr. Rachel Cummings Shavit is an Assistant Professor of Industrial Engineering and Operations Research and (by courtesy) Computer Science at Columbia University. Before joining Columbia, she was an Assistant Professor of Industrial and Systems Engineering and (by courtesy) Computer Science at the Georgia Institute of Technology. Her research interests lie primarily in data privacy, with connections to machine learning, algorithmic economics, optimization, statistics, and public policy. Her work has focused on problems such as strategic aspects of data generation, incentivizing truthful reporting of data, privacy-preserving algorithm design, impacts of privacy policy, and human decision-making. Dr. Cummings received her Ph.D. in Computing and Mathematical Sciences from the California Institute of Technology, her M.S. in Computer Science from Northwestern University, and her B.A. in Mathematics and Economics from the University of Southern California. She is the recipient of an NSF CAREER award, a DARPA Young Faculty Award, an Apple Privacy-Preserving Machine Learning Award, a JP Morgan Chase Faculty Award, a Google Research Fellowship for the Simons Institute program on Data Privacy, a Mozilla Research Grant, the ACM SIGecom Doctoral Dissertation Honorable Mention, the Amori Doctoral Prize in Computing and Mathematical Sciences, and best paper awards at the 2021 ACM Conference on Computer and Communications Security and the 2014 International Symposium on Distributed Computing. Dr. Cummings also serves on the ACM U.S. Public Policy Council’s Privacy Committee and the Future of Privacy Forum’s Advisory Board.