This talk will be aimed at an audience unfamiliar with the literature on privacy preservation (as I was a few weeks ago). The goal of the talk will be to first illustrate that whether or not the output of some interaction with real data is privacy-preserving is not as simple a concept as it may first seem, motivating the need for a precise definition of privacy preservation. Then I will present one possible definition, differential privacy, introduced in 2006, for which the authors were awarded the Gödel Prize in 2017. I will give some examples of database queries that are differentially private, and others that are not, to instill an intuition for what it means for a randomized algorithm to be “differentially private”. I will also note some key properties of differential privacy. Lastly, I will describe how the authors of the DPGAN paper prove that their generative adversarial network is differentially private.
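For reference, a sketch of the standard (\epsilon, \delta)-differential privacy definition from the 2006 work cited above, which is presumably the formulation used in the talk: a randomized mechanism $M$ is $(\epsilon, \delta)$-differentially private if, for every pair of datasets $D$ and $D'$ differing in at most one record and every measurable set $S$ of outputs,
$$\Pr[M(D) \in S] \;\le\; e^{\epsilon}\,\Pr[M(D') \in S] + \delta.$$
Here $\epsilon$ and $\delta$ are privacy parameters; smaller values correspond to stronger privacy guarantees.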
Joe Pedersen is from Pittsburgh, PA. He earned a B.S. in mathematics from Penn State in 2006, after which he was commissioned into the U.S. Army as an Infantry officer. After being selected by the U.S. Army to attend graduate school, he earned master’s degrees in math and physics from RPI in 2013, followed by a three-year teaching position in the Department of Mathematical Sciences at the United States Military Academy. From 2016 to 2019, he served as an operations research analyst at Fort Benning, GA. He is in his first year of a PhD program in the Department of Industrial and Systems Engineering at RPI, after which he will return to the United States Military Academy as an assistant professor in the Department of Systems Engineering. His studies focus on machine learning and data analytics.