UVM Theses and Dissertations
Format:
Online
Author:
Abuah, Chukwunweike (Chiké)
Dept./Program:
Computer Science
Year:
2021
Degree:
Ph.D.
Abstract:
Differential privacy (Dwork, 2006; Dwork et al., 2006a) has achieved prominence over the past decade as a rigorous formal foundation upon which diverse tools and mechanisms for private data analysis can be built. Differential privacy protects privacy at the individual level: if the result of a differentially private query or operation on a dataset is publicly released, any individual present in that dataset can claim plausible deniability. That is, any participating individual can deny the presence of their information in the dataset, because differentially private queries introduce enough random noise to make the result indistinguishable from that of the same query run on a dataset that does not contain the individual's information. Differential privacy guarantees are also resilient to linking attacks, even in the presence of auxiliary information about individuals.

Both static and dynamic tools have been developed to help non-experts write differentially private programs. Static analysis tools construct a privacy proof without running the program; many take the form of statically-typed programming languages in which correct privacy analysis is built into the soundness of the type system. Dynamic analysis tools construct a proof while running the program, using a dynamic monitor executed by the unmodified runtime system; the resulting proof may apply only to that execution of the program. Dynamic systems typically take either a prescriptive or a descriptive approach to analysis at run time.

This dissertation proposes new techniques for language-based analysis of differential privacy in a variety of contexts spanning static and dynamic analysis. Our approach draws on ideas from linear type systems and static/dynamic taint analysis.
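To make the "random noise" intuition concrete, the following is a minimal sketch of the classic Laplace mechanism from the differential privacy literature (Dwork et al., 2006), not one of the dissertation's tools. A counting query has sensitivity 1 (adding or removing one individual changes the true count by at most 1), so adding Laplace noise with scale 1/epsilon yields an epsilon-differentially private release. The function names and the example dataset are illustrative.

```python
import math
import random

def laplace_noise(scale):
    """Draw one sample from the Laplace distribution via inverse-CDF sampling."""
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon):
    """Epsilon-differentially private counting query.

    A counting query has sensitivity 1, so noise with scale
    1/epsilon suffices for the Laplace mechanism's guarantee.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Illustrative dataset: one age per individual (true count of ages >= 40 is 3)
ages = [34, 29, 51, 42, 27, 63]
noisy = private_count(ages, lambda a: a >= 40, epsilon=0.5)
```

Because the released value is noisy, an individual can plausibly deny membership: the same noisy output could have arisen from the dataset with their record removed.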
While several prior approaches to differential privacy analysis exist, this dissertation proposes techniques designed to be, in several regards, more flexible and usable than prior work.
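To illustrate the flavor of the taint-analysis idea mentioned above, here is a hypothetical sketch of a dynamic monitor that tracks an upper bound on a value's sensitivity (how much it can change when one individual's record changes) as the program runs. This is an assumption-laden illustration of the general technique, not the dissertation's system; the class name and operations are invented for the example.

```python
class Sensitive:
    """Hypothetical dynamic monitor: wraps a value together with an
    upper bound on its sensitivity, propagating the bound through
    operations in the spirit of taint analysis."""

    def __init__(self, value, sensitivity):
        self.value = value
        self.sensitivity = sensitivity

    def __add__(self, other):
        # Adding two tracked values: sensitivities add.
        if isinstance(other, Sensitive):
            return Sensitive(self.value + other.value,
                             self.sensitivity + other.sensitivity)
        # Adding a public constant does not change sensitivity.
        return Sensitive(self.value + other, self.sensitivity)

    def scale(self, k):
        # Scaling by a public constant scales sensitivity by |k|.
        return Sensitive(self.value * k, self.sensitivity * abs(k))

x = Sensitive(10, 1)   # e.g. a per-individual count: sensitivity 1
y = x + x              # sensitivity bound grows to 2
z = y.scale(3)         # sensitivity bound grows to 6
```

A monitor like this lets the runtime check, at the point of release, that enough noise is added for the accumulated sensitivity bound; a linear type system can enforce the analogous discipline statically.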