Manipulation Attacks in Local Differential Privacy

Local differential privacy is a widely studied restriction on distributed algorithms that collect aggregates about sensitive user data, and is now deployed in several large systems. We initiate a systematic study of a fundamental limitation of locally differentially private protocols: they are highly vulnerable to adversarial manipulation. While any algorithm can be manipulated by adversaries who lie about their inputs, we show that any noninteractive locally differentially private protocol can be manipulated to a much greater extent: when the privacy level is high, or the domain size is large, a small fraction of users in the protocol can completely obscure the distribution of the honest users' inputs. We also construct protocols that are optimally robust to manipulation for a variety of common tasks in local differential privacy. Finally, we give simple experiments validating our theoretical results, and demonstrating that protocols that are optimal without manipulation can have dramatically different levels of robustness to manipulation. Our results suggest caution when deploying local differential privacy and reinforce the importance of efficient cryptographic techniques for the distributed emulation of centrally differentially private mechanisms.
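To make the vulnerability concrete, here is a minimal sketch assuming binary randomized response as the protocol and an always-report-1 coalition as the attack; neither choice is taken from the paper, which constructs more general attacks and defenses. Honest users randomize their bits with the standard eps-LDP randomizer, a small coalition bypasses it, and the analyst's unbiased estimator amplifies the resulting skew by a factor of (e^eps + 1)/(e^eps - 1), which grows as eps shrinks, i.e., as privacy increases.

import math
import random

def randomized_response(bit, eps):
    # Report the true bit with probability e^eps / (e^eps + 1); otherwise flip it.
    p_truth = math.exp(eps) / (math.exp(eps) + 1)
    return bit if random.random() < p_truth else 1 - bit

def estimate_mean(reports, eps):
    # Unbiased estimate of the fraction of 1s, inverting the randomizer's bias.
    p = math.exp(eps) / (math.exp(eps) + 1)
    return (sum(reports) / len(reports) - (1 - p)) / (2 * p - 1)

random.seed(0)
n, eps, frac_bad = 100_000, 0.5, 0.05  # hypothetical parameters: 5% manipulators
true_mean = 0.30                       # honest users hold a 1 with probability 0.3

honest = [randomized_response(1 if random.random() < true_mean else 0, eps)
          for _ in range(int(n * (1 - frac_bad)))]
malicious = [1] * (n - len(honest))    # coalition skips the randomizer, always sends 1

print("honest-only estimate:", round(estimate_mean(honest, eps), 3))
print("with 5% manipulation:", round(estimate_mean(honest + malicious, eps), 3))

With eps = 0.5 the amplification factor is about 4, so the 5% coalition shifts the estimated mean from roughly 0.30 to roughly 0.41; at smaller eps the same coalition can dominate the estimate entirely, matching the high-privacy regime described in the abstract.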

Bibliographic Details
Main Authors: Albert Cheu, Adam Smith, Jonathan Ullman
Format: Article
Language: English
Published: Labor Dynamics Institute, 2021-02-01
Series: The Journal of Privacy and Confidentiality
ISSN: 2575-8527
Subjects: Differential Privacy
Online Access: http://www.journalprivacyconfidentiality.org/index.php/jpc/article/view/754