Thesis Defense - Frank Cangialosi: Privacy-Preserving Video Analytics
Host
Hari Balakrishnan
MIT CSAIL
As video cameras have become pervasive in public settings and accurate computer vision has become commonplace, there has been increasing interest in collecting and processing data from these cameras at scale (“video analytics”). While these trends enable many useful applications (such as monitoring the mobility patterns of cars and pedestrians to improve road safety), they also enable detailed surveillance of citizens at an unprecedented level. Prior solutions fail to practically resolve this tension between utility and privacy, as they rely on perfect detection of all private information in each video frame—an unrealistic assumption.
In this dissertation, we present Privid, a privacy-preserving video analytics system that aims to provide both a meaningful privacy guarantee and an expressive query interface that is practical for analysts. In particular, Privid’s privacy definition does not require perfect detection of private information, and its query interface allows analysts to supply their own arbitrary (untrusted) machine learning (ML) processing models.
The key takeaway from our evaluation is that Privid can strike a balance between privacy and utility: across a variety of queries over both real surveillance videos and a simulated city-wide camera network, Privid protects the appearance of all people with differential privacy while maintaining accuracy within 79-99% of a non-private system. As a result, we hope Privid offers a practical path forward for video analytics while ensuring it cannot aid mass surveillance.
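To make the differential-privacy claim above concrete, here is a minimal, hypothetical sketch of the standard Laplace mechanism applied to an aggregate count produced by an analyst-supplied (untrusted) model. It is not Privid's actual interface or implementation; the function and parameter names, and the assumption that each video chunk's contribution can be bounded, are illustrative only.

```python
import numpy as np

def private_count(per_chunk_counts, max_contribution, epsilon, rng=None):
    """Hypothetical sketch of the Laplace mechanism for an aggregate count.

    per_chunk_counts: raw outputs of an analyst-supplied (untrusted) model,
        one value per video chunk.
    max_contribution: assumed bound on how much any single chunk can change
        the released total (the query's sensitivity).
    epsilon: privacy budget spent on this query.
    """
    rng = rng or np.random.default_rng()
    # Clamp each chunk's output so the sensitivity bound actually holds,
    # regardless of what the untrusted model reports.
    clamped = np.clip(per_chunk_counts, 0, max_contribution)
    true_total = float(clamped.sum())
    # Laplace mechanism: noise scale = sensitivity / epsilon.
    noise = rng.laplace(loc=0.0, scale=max_contribution / epsilon)
    return true_total + noise
```

Under this kind of scheme, a larger privacy budget or a tighter contribution bound yields less noise and therefore higher accuracy, which is the privacy/utility trade-off the evaluation quantifies.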
Thesis Committee: Hari Balakrishnan, Ravi Netravali, Henry Corrigan-Gibbs