- About
- Secure Machine Learning
- Secure Federated Learning
- Libraries and Frameworks
- General Research
- Blogs

## About

This is a curated list of resources related to the research and development of privacy-preserving machine learning.

## Secure Machine Learning

- Machine Learning Classification over Encrypted Data, NDSS'15
- Oblivious Multi-Party Machine Learning on Trusted Processors, USENIX SECURITY'16
- Prio: Private, Robust, and Scalable Computation of Aggregate Statistics, NSDI'17
- SecureML: A System for Scalable Privacy-Preserving Machine Learning, S&P'17
- MiniONN: Oblivious Neural Network Predictions via MiniONN Transformations, CCS'17
- Chameleon: A Hybrid Secure Computation Framework for Machine Learning Applications, AsiaCCS'18
- DeepSecure: Scalable Provably-Secure Deep Learning, DAC'18
- Secure Computation for Machine Learning With SPDZ, NIPS'18
- ABY3: A Mixed Protocol Framework for Machine Learning, CCS'18
- SecureNN: Efficient and Private Neural Network Training, PETS'19
- Gazelle: A Low Latency Framework for Secure Neural Network Inference, USENIX SECURITY'18
- CHET: an optimizing compiler for fully-homomorphic neural-network inferencing, PLDI'19
- New Primitives for Actively-Secure MPC over Rings with Applications to Private Machine Learning, S&P'19
- FLASH: Fast and Robust Framework for Privacy-preserving Machine Learning, PETS'20
- Helen: Maliciously Secure Coopetitive Learning for Linear Models, S&P'19
- Efficient Multi-Key Homomorphic Encryption with Packed Ciphertexts with Application to Oblivious Neural Network Inference, CCS'19
- QUOTIENT: two-party secure neural network training and prediction, CCS'19
- ASTRA: High Throughput 3PC over Rings with Application to Secure Prediction, CCSW'19
- Trident: Efficient 4PC Framework for Privacy Preserving Machine Learning, NDSS'20
- BLAZE: Blazing Fast Privacy-Preserving Machine Learning, NDSS'20
- Delphi: A Cryptographic Inference Service for Neural Networks, USENIX SECURITY'20
- FALCON: Honest-Majority Maliciously Secure Framework for Private Deep Learning
- MP2ML: A Mixed-Protocol Machine Learning Framework for Private Inference, ARES'20
- SANNS: Scaling Up Secure Approximate k-Nearest Neighbors Search, USENIX Security'20
- PySyft: A Generic Framework for Privacy Preserving Deep Learning
- Private Deep Learning in TensorFlow Using Secure Computation
- CryptoDL: Deep Neural Networks over Encrypted Data
- CryptoNets: Applying Neural Networks to Encrypted Data with High Throughput and Accuracy
- CrypTFlow: Secure TensorFlow Inference
- CrypTFlow2: Practical 2-Party Secure Inference, CCS'20
- AriaNN: Low-Interaction Privacy-Preserving Deep Learning via Function Secret Sharing
- Practical Privacy-Preserving K-means Clustering, PETS'20
- ABY2.0: Improved Mixed-Protocol Secure Two-Party Computation (Full Version), USENIX Security'21
- Secure Evaluation of Quantized Neural Networks, PETS'20
- SWIFT: Super-fast and Robust Privacy-Preserving Machine Learning
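
Many of the systems above (e.g., SecureML, ABY3, SecureNN, FALCON) build on additive secret sharing over a ring such as Z_2^64. The following is a minimal, illustrative Python sketch of that primitive alone, not the protocol of any particular paper; it covers only sharing, reconstruction, and local addition, with no networking, fixed-point encoding, or malicious security.

```python
import secrets

RING = 2 ** 64  # arithmetic modulo 2^64, a common choice in the frameworks above

def share(x: int) -> tuple[int, int]:
    """Split x into two additive shares such that x = (s0 + s1) mod 2^64."""
    s0 = secrets.randbelow(RING)
    s1 = (x - s0) % RING
    return s0, s1

def reconstruct(s0: int, s1: int) -> int:
    """Recombine the two shares."""
    return (s0 + s1) % RING

# Addition of shared values is local: each party simply adds its own shares.
a0, a1 = share(20)
b0, b1 = share(22)
assert reconstruct((a0 + b0) % RING, (a1 + b1) % RING) == 42
```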

## Secure Federated Learning

- Privacy-Preserving Deep Learning, CCS'15
- Practical Secure Aggregation for Privacy Preserving Machine Learning, CCS'17
- Privacy-Preserving Deep Learning via Additively Homomorphic Encryption, TIFS'17
- NIKE-based Fast Privacy-preserving High-dimensional Data Aggregation for Mobile Devices, CACR'18
- PrivFL: Practical Privacy-preserving Federated Regressions on High-dimensional Data over Mobile Networks, CCSW'19
- VerifyNet: Secure and verifiable federated learning, TIFS'19
- PrivColl: Practical Privacy-Preserving Collaborative Machine Learning
- NPMML: A Framework for Non-interactive Privacy-preserving Multi-party Machine Learning, TDSC'20
- SAFER: Sparse secure Aggregation for FEderated leaRning
- Secure Byzantine-Robust Machine Learning
- Secure Single-Server Aggregation with (Poly)Logarithmic Overhead, CCS'20
- BatchCrypt: Efficient Homomorphic Encryption for Cross-Silo Federated Learning, USENIX ATC'20
- FedSel: Federated SGD under Local Differential Privacy with Top-k Dimension Selection, DASFAA'20
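
The secure-aggregation papers above (starting with Practical Secure Aggregation, CCS'17) let a server learn only the sum of client updates. Below is a toy Python sketch of the underlying pairwise-masking idea under simplifying assumptions: the pairwise seeds are pre-shared dictionary entries rather than the output of a key agreement, updates are scalars, and client dropout is ignored.

```python
import random

RING = 2 ** 32  # toy modulus; real protocols derive masks from agreed keys

def masked_update(client_id, update, all_ids, pairwise_seeds):
    """Mask an update so that the pairwise masks cancel when all clients are summed."""
    masked = update % RING
    for other in all_ids:
        if other == client_id:
            continue
        mask = random.Random(pairwise_seeds[frozenset((client_id, other))]).randrange(RING)
        # the lower-id client adds the shared mask, the higher-id client subtracts it
        masked = (masked + mask) % RING if client_id < other else (masked - mask) % RING
    return masked

ids = [1, 2, 3]
seeds = {frozenset(p): random.randrange(2 ** 31) for p in [(1, 2), (1, 3), (2, 3)]}
updates = {1: 10, 2: 20, 3: 30}
aggregate = sum(masked_update(i, updates[i], ids, seeds) for i in ids) % RING
assert aggregate == 60  # the server sees only masked values, yet the sum is exact
```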

## Libraries and Frameworks

- TinyGarble: Logic Synthesis and Sequential Descriptions for Yao's Garbled Circuits
- SPDZ-2: Multiparty computation with SPDZ, MASCOT, and Overdrive offline phases
- ABY: A Framework for Efficient Mixed-Protocol Secure Two-Party Computation
- Obliv-C: A C compiler for embedding privacy-preserving protocols
- TFHE: Fast Fully Homomorphic Encryption Library over the Torus
- SEAL: Simple Encrypted Arithmetic Library
- PySEAL: Python interface to SEAL
- HElib: An implementation of homomorphic encryption
- EzPC: programmable, efficient, and scalable secure two-party computation for machine learning
- CUDA-accelerated Fully Homomorphic Encryption Library
- CrypTen: A framework for Privacy Preserving Machine Learning
- tf-encrypted: A Framework for Machine Learning on Encrypted Data
- Sharemind
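
As a quick taste of what these libraries look like in practice, here is a short snippet following CrypTen's basic tutorial usage (encrypt a PyTorch tensor, compute on it, decrypt). It assumes `crypten` and `torch` are installed, and it is only a single-process sketch; real deployments run multiple parties.

```python
import torch
import crypten

crypten.init()  # set up CrypTen's runtime (single process by default)

x = torch.tensor([1.0, 2.0, 3.0])
x_enc = crypten.cryptensor(x)      # secret-share the tensor
y_enc = x_enc + 2.0                # arithmetic on encrypted values
y_enc = y_enc * x_enc

print(y_enc.get_plain_text())      # decrypt: approximately tensor([ 3.,  8., 15.])
```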

## General Research

- Multiparty computation from somewhat homomorphic encryption, Crypto'12
- Practical covertly secure MPC for dishonest majority - or: breaking the SPDZ limits, ESORICS'13
- MASCOT: faster malicious arithmetic secure computation with oblivious transfer, CCS'16
- SPDZ^2k: Efficient MPC mod 2^k for Dishonest Majority, Crypto'18
- Overdrive: Making SPDZ Great Again, EUROCRYPT'18
- High-Throughput Semi-Honest Secure Three-Party Computation with an Honest Majority, CCS'16
- Sharemind: A framework for fast privacy-preserving computations, ESORICS'08
- Efficiently Verifiable Computation on Encrypted Data, CCS'14
- Membership inference attacks against machine learning models, S&P'17
- Comprehensive privacy analysis of deep learning: Passive and active white-box inference attacks against centralized and federated learning, S&P'19
- TernGrad: Ternary gradients to reduce communication in distributed deep learning, NIPS'17
- Machine Learning with Adversaries: Byzantine Tolerant Gradient Descent, NIPS'17
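
The last entry describes Krum, a Byzantine-tolerant aggregation rule that selects the gradient closest to its nearest neighbors instead of averaging. A compact NumPy sketch of the selection rule is given below; the parameter names are ours, and the Multi-Krum variant is omitted.

```python
import numpy as np

def krum(gradients: np.ndarray, f: int) -> np.ndarray:
    """Return the gradient whose n - f - 2 nearest neighbors are closest (squared L2)."""
    n = len(gradients)
    dists = np.sum((gradients[:, None, :] - gradients[None, :, :]) ** 2, axis=-1)
    scores = []
    for i in range(n):
        neighbor_dists = np.sort(np.delete(dists[i], i))[: n - f - 2]
        scores.append(neighbor_dists.sum())
    return gradients[int(np.argmin(scores))]

# Toy usage: four honest gradients near 1.0 and one Byzantine outlier.
grads = np.array([[1.0], [1.1], [0.9], [1.05], [100.0]])
print(krum(grads, f=1))  # selects one of the honest gradients
```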