
UVM Theses and Dissertations

Format: Online
Author: Abuah, Chukwunweike (Chiké)
Dept./Program: Computer Science
Year: 2021
Degree: Ph.D.
Abstract:
Differential privacy (Dwork, 2006; Dwork et al., 2006a) has achieved prominence over the past decade as a rigorous formal foundation upon which diverse tools and mechanisms for performing private data analysis can be built. The guarantee of differential privacy is that it protects privacy at the individual level: if the result of a differentially private query or operation on a dataset is publicly released, any individual present in that dataset can claim plausible deniability. This means that any participating individual can deny the presence of their information in the dataset based on the query result, because differentially private queries introduce enough random noise/bias to make the result indistinguishable from that of the same query run on a dataset which actually does not contain the individual's information. Additionally, differential privacy guarantees are resilient against any form of linking attack in the presence of auxiliary information about individuals. Both static and dynamic tools have been developed to help non-experts write differentially private programs: static analysis tools construct a proof without needing to run the program; dynamic analysis tools construct a proof while running the program, using a dynamic monitor executed by the unmodified runtime system. The resulting proof may apply only to that execution of the program. Many of the static tools take the form of statically-typed programming languages, where correct privacy analysis is built into the soundness of the type system. Meanwhile dynamic systems typically take either a prescriptive or descriptive approach to analysis when running the program. This dissertation proposes new techniques for language-based analysis of differential privacy of programs in a variety of contexts spanning static and dynamic analysis. Our approach towards differential privacy analysis makes use of ideas from linear type systems and static/dynamic taint analysis. 
While several prior approaches to differential privacy analysis exist, this dissertation proposes techniques designed to be, in several respects, more flexible and usable than that earlier work.
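The plausible-deniability guarantee described in the abstract is usually achieved by calibrating random noise to a query's sensitivity. As a minimal illustrative sketch (not code from the dissertation; the names `laplace_noise` and `dp_count` are invented here), the classic Laplace mechanism for a counting query looks like this: a count changes by at most 1 when one individual is added or removed, so adding Laplace noise with scale 1/ε yields ε-differential privacy.

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5            # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(data, predicate, epsilon):
    """Differentially private count.

    A counting query has sensitivity 1 (one person changes the count
    by at most 1), so Laplace noise with scale 1/epsilon suffices for
    epsilon-differential privacy.
    """
    true_count = sum(1 for x in data if predicate(x))
    return true_count + laplace_noise(1.0 / epsilon)

# The noisy answer is statistically close to what the same query would
# return on a neighboring dataset, which is what lets any one
# participant deny that their record influenced the result.
ages = [25, 30, 41, 18, 50]
noisy = dp_count(ages, lambda a: a >= 30, epsilon=0.5)
```

Smaller values of ε give stronger privacy (larger noise scale) at the cost of accuracy; the type systems and taint analyses surveyed in the dissertation aim to verify that programs composing such mechanisms stay within a stated privacy budget.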