“Rigorous Foundations for Privacy in Statistical Databases”

Mon Apr 18, 2016 3:00 PM

Location: Room 4172 A.V. Williams Building

Speaker:
Adam D. Smith
Computer Science and Engineering
Penn State

Abstract:
Consider an agency holding a large database of sensitive personal information—medical records, census survey answers, web search records, or genetic data, for example. The agency would like to discover and publicly release global characteristics of the data (say, to inform policy or business decisions), while protecting the privacy of individuals' records. This problem is known variously as “statistical disclosure control,” “privacy-preserving data mining,” or “private data analysis.”

I will begin by discussing what makes this problem difficult, and exhibit some of the nontrivial problems that plague simple attempts at anonymization and aggregation. Motivated by this, I will present differential privacy, a rigorous definition of privacy in statistical databases that has received significant attention. I'll explain some recent results on the design of differentially private algorithms, as well as the application of these ideas in contexts with no (previously) apparent connection to privacy.
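For readers who want a concrete sense of what a differentially private algorithm looks like, below is a minimal sketch of the Laplace mechanism, the textbook approach for answering a numeric query of bounded sensitivity. The function name, parameters, and the epsilon value in the usage line are illustrative assumptions, not details drawn from the talk.

    import numpy as np

    def laplace_mechanism(true_answer, epsilon, sensitivity=1.0):
        # Add Laplace noise with scale sensitivity/epsilon; for a counting
        # query (sensitivity 1) this satisfies epsilon-differential privacy.
        scale = sensitivity / epsilon
        return true_answer + np.random.laplace(loc=0.0, scale=scale)

    # Hypothetical usage: release an approximate count under epsilon = 0.1.
    noisy_count = laplace_mechanism(true_answer=1234, epsilon=0.1)

Smaller epsilon means stronger privacy but noisier answers; the talk's discussion of algorithm design revolves around managing exactly this trade-off.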

Bio:
Adam Smith is an associate professor in the Department of Computer Science and Engineering at Penn State. His research interests lie in data privacy and cryptography and their connections to information theory, statistical learning, and quantum computing. He received his Ph.D. from MIT in 2004 and has been a visiting scholar at the Weizmann Institute of Science and at UCLA, as well as a visiting professor at Boston University and Harvard University. Smith received a 2009 Presidential Early Career Award for Scientists and Engineers (PECASE) and the 2016 Theory of Cryptography Test of Time Award (with Dwork, McSherry, and Nissim).