Towards practical differential privacy in data analysis: Understanding the effect of epsilon on utility in private ERM
References
Backstrom, 2011, Wherefore art thou R3579X? Anonymized social networks, hidden patterns, and structural steganography, Commun. ACM, 54, 133, 10.1145/2043174.2043199
Bassily, 2014, Private empirical risk minimization: efficient algorithms and tight error bounds, 464
Chaudhuri, 2011, Differentially private empirical risk minimization, J. Mach. Learn. Res., 12, 1069
Dua, 2017, UCI machine learning repository
Dwork, 2006, Calibrating noise to sensitivity in private data analysis, 265
Fredrikson, 2015, Model inversion attacks that exploit confidence information and basic countermeasures, 1322
Gaboardi, 2016, PSI (Ψ): a private data sharing interface, CoRR
Ganju, 2018, Property inference attacks on fully connected neural networks using permutation invariant representations, 619
Ge, 2019, APEx: accuracy-aware differentially private data exploration, 177
Guyon, 2004, Result analysis of the NIPS 2003 feature selection challenge, 545
Hay, 2008, Resisting structural re-identification in anonymized social networks, Proc. VLDB, 1, 102, 10.14778/1453856.1453873
Hsu, 2014, Differential privacy: an economic method for choosing epsilon, 398
Iyengar, 2019, Towards practical differentially private convex optimization, 299
Jimenez Rezende, 2014, Stochastic backpropagation and approximate inference in deep generative models, 1278
Kifer, 2012, Private convex empirical risk minimization and high-dimensional regression, 25.1
Kingma, 2014, Auto-encoding variational Bayes, CoRR
Koh, 2017, Understanding black-box predictions via influence functions, 1885
Kohli, 2018, Epsilon voting: mechanism design for parameter selection in differential privacy, 19
Lee, 2011, How much is enough? Choosing ϵ for differential privacy, 325
Li, 2012, On sampling, anonymization, and differential privacy or, k-anonymization meets differential privacy, 32
Ligett, 2017, Accuracy first: selecting a differential privacy level for accuracy-constrained ERM, 2563
Liu, 2008, Towards identity anonymization on graphs, 93
Lobo-Vesga, 2020, A programming framework for differential privacy with accuracy concentration bounds, 411
Melis, 2019, Exploiting unintended feature leakage in collaborative learning, 691
Mohan, 2012, GUPT: privacy preserving data analysis made easy, 349
Naldi, 2015, Differential privacy: an estimation theory-based method for choosing epsilon, CoRR
Narayanan, 2009, De-anonymizing social networks, 173
Shalev-Shwartz, 2008, SVM optimization: inverse dependence on training set size, 928
Shokri, 2017, Membership inference attacks against machine learning models, 3
Srivatsa, 2012, Deanonymizing mobility traces: using social network as a side-channel, 628
Stamper, 2010, Challenge data set from KDD Cup 2010 educational data mining challenge
Sweeney, 2002, k-anonymity: a model for protecting privacy, 557
Tramèr, 2016, Stealing machine learning models via prediction APIs, 601
Wang, 2017, Differentially private empirical risk minimization revisited: faster and more general, 2722
Yeom, 2018, Privacy risk in machine learning: analyzing the connection to overfitting, 268