57 results sorted by ID

2024/1196 (PDF) Last updated: 2024-07-24
Client-Aided Privacy-Preserving Machine Learning
Peihan Miao, Xinyi Shi, Chao Wu, Ruofan Xu
Cryptographic protocols

Privacy-preserving machine learning (PPML) enables multiple distrusting parties to jointly train ML models on their private data without revealing any information beyond the final trained models. In this work, we study the client-aided two-server setting where two non-colluding servers jointly train an ML model on the data held by a large number of clients. By involving the clients in the training process, we develop efficient protocols for training algorithms including linear regression,...

2024/1089 (PDF) Last updated: 2024-07-04
Juliet: A Configurable Processor for Computing on Encrypted Data
Charles Gouert, Dimitris Mouris, Nektarios Georgios Tsoutsos
Applications

Fully homomorphic encryption (FHE) has become progressively more viable in the years since its original inception in 2009. At the same time, leveraging state-of-the-art schemes in an efficient way for general computation remains prohibitively difficult for the average programmer. In this work, we introduce a new design for a fully homomorphic processor, dubbed Juliet, to enable faster operations on encrypted data using the state-of-the-art TFHE and cuFHE libraries for both CPU and GPU...

2024/866 (PDF) Last updated: 2024-05-31
Ripple: Accelerating Programmable Bootstraps for FHE with Wavelet Approximations
Charles Gouert, Mehmet Ugurbil, Dimitris Mouris, Miguel de Vega, Nektarios Georgios Tsoutsos
Cryptographic protocols

Homomorphic encryption can address key privacy challenges in cloud-based outsourcing by enabling potentially untrusted servers to perform meaningful computation directly on encrypted data. While most homomorphic encryption schemes offer addition and multiplication over ciphertexts natively, any non-linear functions must be implemented as costly polynomial approximations due to this restricted computational model. Nevertheless, the CGGI cryptosystem is capable of performing arbitrary...

2024/723 (PDF) Last updated: 2024-05-11
$\mathsf{OPA}$: One-shot Private Aggregation with Single Client Interaction and its Applications to Federated Learning
Harish Karthikeyan, Antigoni Polychroniadou
Applications

Our work aims to minimize interaction in secure computation due to the high cost and challenges associated with communication rounds, particularly in scenarios with many clients. In this work, we revisit the problem of secure aggregation in the single-server setting where a single evaluation server can securely aggregate client-held individual inputs. Our key contribution is One-shot Private Aggregation ($\mathsf{OPA}$) where clients speak only once (or even choose not to speak) per...

2024/124 (PDF) Last updated: 2024-07-23
Perceived Information Revisited II: Information-Theoretical Analysis of Deep-Learning Based Side-Channel Attacks
Akira Ito, Rei Ueno, Naofumi Homma
Attacks and cryptanalysis

Previous studies on deep-learning-based side-channel attacks (DL-SCAs) have shown that traditional performance evaluation metrics commonly used in DL, like accuracy and F1 score, are not effective in evaluating DL-SCA performance. Therefore, some previous studies have proposed new alternative metrics for evaluating the performance of DL-SCAs. Notably, perceived information (PI) and effective perceived information (EPI) are major metrics based on information theory. While it has been...

2023/1546 (PDF) Last updated: 2023-10-09
Performance Evaluation of Machine Learning Algorithms for Intrusion Detection System
Sudhanshu Sekhar Tripathy, Bichitrananda Behera
Implementation

The escalation of threats to safety and the hijacking of digital networks are among the most pressing challenges that must be addressed today. Numerous security measures have been put in place to track and recognize illicit activity on network infrastructure, and intrusion detection systems (IDS) are a primary means of resisting and recognizing intrusions on internet connections and digital technologies. To classify network traffic as normal or anomalous, Machine Learning (ML) classifiers are increasingly utilized. An...

2023/1462 (PDF) Last updated: 2023-09-25
High-precision RNS-CKKS on fixed but smaller word-size architectures: theory and application
Rashmi Agrawal, Jung Ho Ahn, Flavio Bergamaschi, Ro Cammarota, Jung Hee Cheon, Fillipe D. M. de Souza, Huijing Gong, Minsik Kang, Duhyeong Kim, Jongmin Kim, Hubert de Lassus, Jai Hyun Park, Michael Steiner, Wen Wang
Cryptographic protocols

A prevalent issue in the residue number system (RNS) variant of the Cheon-Kim-Kim-Song (CKKS) homomorphic encryption (HE) scheme is the challenge of efficiently achieving high precision on hardware architectures with a fixed, yet smaller, word-size of bit-length $W$, especially when the scaling factor satisfies $\log\Delta > W$. In this work, we introduce an efficient solution termed composite scaling. In this approach, we group multiple RNS primes as $q_\ell:= \prod_{j=0}^{t-1}...
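
As a rough illustration of the composite-scaling idea (covering a scaling factor with $\log\Delta > W$ bits by a product of primes that each fit in a $W$-bit word), here is a minimal Python sketch. The parameters (W = 32, log Δ = 58) and the naive prime search are hypothetical and for demonstration only; real RNS-CKKS primes must additionally be NTT-friendly and chosen per the scheme's constraints.

```python
# Toy illustration of composite scaling: cover a scaling factor with
# log(Delta) > W bits using t primes that each fit in a W-bit word.
# NTT-friendliness and other CKKS constraints are ignored here.

def is_prime(n: int) -> bool:
    """Trial division; adequate for the small word sizes used in this sketch."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def pick_composite_scale(log_delta: int, W: int):
    """Pick t primes < 2**W whose product is close to 2**log_delta."""
    t = -(-log_delta // (W - 1))        # number of word-size primes needed (ceil)
    target_bits = log_delta / t         # spread the scale evenly over the primes
    primes, p = [], round(2 ** target_bits)
    while len(primes) < t:
        if p < 2 ** W and p not in primes and is_prime(p):
            primes.append(p)
        p += 1
    return primes

if __name__ == "__main__":
    import math
    log_delta, W = 58, 32               # scaling factor exceeds the word size
    primes = pick_composite_scale(log_delta, W)
    q = math.prod(primes)
    print(primes, math.log2(q))         # each prime fits in W bits, product ~ 2**58
```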

2023/1345 (PDF) Last updated: 2023-09-08
Experimenting with Zero-Knowledge Proofs of Training
Sanjam Garg, Aarushi Goel, Somesh Jha, Saeed Mahloujifar, Mohammad Mahmoody, Guru-Vamsi Policharla, Mingyuan Wang
Cryptographic protocols

How can a model owner prove they trained their model according to the correct specification? More importantly, how can they do so while preserving the privacy of the underlying dataset and the final model? We study this problem and formulate the notion of zero-knowledge proof of training (zkPoT), which formalizes rigorous security guarantees that should be achieved by a privacy-preserving proof of training. While it is theoretically possible to design zkPoT for any model using generic...

2023/1203 (PDF) Last updated: 2023-08-08
Collaborative Privacy-Preserving Analysis of Oncological Data using Multiparty Homomorphic Encryption
Ravit Geva, Alexander Gusev, Yuriy Polyakov, Lior Liram, Oded Rosolio, Andreea Alexandru, Nicholas Genise, Marcelo Blatt, Zohar Duchin, Barliz Waissengrin, Dan Mirelman, Felix Bukstein, Deborah T. Blumenthal, Ido Wolf, Sharon Pelles-Avraham, Tali Schaffer, Lee A. Lavi, Daniele Micciancio, Vinod Vaikuntanathan, Ahmad Al Badawi, Shafi Goldwasser
Applications

Real-world healthcare data sharing is instrumental in constructing broader-based and larger clinical data sets that may improve clinical decision-making research and outcomes. Stakeholders are frequently reluctant to share their data without guaranteed patient privacy, proper protection of their data sets, and control over the usage of their data. Fully homomorphic encryption (FHE) is a cryptographic capability that can address these issues by enabling computation on encrypted data without...

2023/804 (PDF) Last updated: 2023-06-01
Falkor: Federated Learning Secure Aggregation Powered by AES-CTR GPU Implementation
Mariya Georgieva Belorgey, Sofia Dandjee, Nicolas Gama, Dimitar Jetchev, Dmitry Mikushin
Cryptographic protocols

We propose a novel protocol, Falkor, for secure aggregation in Federated Learning in the multi-server scenario, based on masking local models via a stream cipher built on AES in counter mode and accelerated by GPUs running on the aggregating servers. The protocol is resilient to client dropout and reduces the client/server communication cost by a factor equal to the number of aggregating servers (compared to the naïve baseline method). It scales simultaneously in the two major...
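
For intuition only, the sketch below masks an integer-encoded model update with an AES-CTR keystream and removes the mask again; it uses the `cryptography` package and a single shared key, which is a drastic simplification of the actual multi-server protocol, where masks are arranged to cancel in the aggregate.

```python
# Minimal sketch of masking a model update with an AES-CTR keystream
# (conceptually related to, but much simpler than, the Falkor protocol).
import os
import numpy as np
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

MOD = 2 ** 32  # work in a 32-bit ring; fixed-point encoding is assumed elsewhere

def keystream_words(key: bytes, nonce: bytes, n: int) -> np.ndarray:
    """Derive n 32-bit mask words from AES-CTR (encrypting zeros yields the keystream)."""
    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    ks = enc.update(bytes(4 * n))
    return np.frombuffer(ks, dtype=np.uint32).astype(np.uint64)

# Client side: mask the (integer-encoded) local model before sending it.
key, nonce = os.urandom(16), os.urandom(16)   # assumed to be agreed in a setup phase
model = np.arange(8, dtype=np.uint64)         # stand-in for an encoded model update
masked = (model + keystream_words(key, nonce, len(model))) % MOD

# Removing the mask with the same key/nonce material recovers the model
# (in the real protocol, masks across clients/servers cancel in the aggregate).
recovered = (masked - keystream_words(key, nonce, len(model))) % MOD
assert np.array_equal(recovered, model)
```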

2022/866 (PDF) Last updated: 2024-05-13
Communication-Efficient Secure Logistic Regression
Amit Agarwal, Stanislav Peceny, Mariana Raykova, Phillipp Schoppmann, Karn Seth
Cryptographic protocols

We present a novel construction that enables two parties to securely train a logistic regression model on private secret-shared data. Our goal is to minimize online communication and round complexity, while still allowing for an efficient offline phase. As part of our construction, we develop many building blocks of independent interest. These include a new approximation technique for the sigmoid function that results in a secure protocol with better communication, protocols for secure...
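
For context, the snippet below shows the generic baseline the abstract alludes to: a least-squares polynomial fit of the sigmoid over a bounded interval. It is not the paper's new approximation technique; the interval [-8, 8] and degree 7 are arbitrary illustrative choices.

```python
# Baseline: least-squares polynomial approximation of the sigmoid on [-8, 8].
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

xs = np.linspace(-8, 8, 4001)
coeffs = np.polyfit(xs, sigmoid(xs), deg=7)     # degree-7 fit on the interval
approx = np.polyval(coeffs, xs)
print("max abs error on [-8, 8]:", np.max(np.abs(approx - sigmoid(xs))))
```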

2022/280 (PDF) Last updated: 2022-07-18
Efficient Homomorphic Evaluation on Large Intervals
Jung Hee Cheon, Wootae Kim, Jai Hyun Park
Public-key cryptography

Homomorphic encryption (HE) is being widely used for privacy-preserving computation. Since HE schemes only support polynomial operations, it is prevalent to use polynomial approximations of non-polynomial functions. We cannot monitor the intermediate values during the homomorphic evaluation; as a consequence, we should utilize polynomial approximations with sufficiently large approximation intervals to prevent the failure of the evaluation. However, the large approximation interval...

2022/146 (PDF) Last updated: 2022-02-12
Training Differentially Private Models with Secure Multiparty Computation
Sikha Pentyala, Davis Railsback, Ricardo Maia, Rafael Dowsley, David Melanson, Anderson Nascimento, Martine De Cock
Cryptographic protocols

We address the problem of learning a machine learning model from training data that originates at multiple data owners, while providing formal privacy guarantees regarding the protection of each owner's data. Existing solutions based on Differential Privacy (DP) achieve this at the cost of a drop in accuracy. Solutions based on Secure Multiparty Computation (MPC) do not incur such accuracy loss but leak information when the trained model is made publicly available. We propose an MPC solution...
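
As a plaintext reference for the DP side of such a combination, here is a minimal sketch of the standard DP-SGD recipe (per-example gradient clipping plus Gaussian noise) for logistic regression. The clipping bound and noise multiplier are illustrative, there is no privacy accounting, and the paper performs these steps inside MPC rather than in the clear.

```python
# Minimal DP-SGD-style update for logistic regression: clip per-example
# gradients to L2 norm C, add Gaussian noise, average.  Plaintext sketch only.
import numpy as np

rng = np.random.default_rng(0)
n, d = 256, 5
X = rng.normal(size=(n, d))
y = (X @ rng.normal(size=d) + 0.1 * rng.normal(size=n) > 0).astype(float)

w = np.zeros(d)
C, sigma, lr, batch = 1.0, 1.0, 0.5, 32      # clipping bound / noise multiplier (illustrative)

for step in range(200):
    idx = rng.choice(n, size=batch, replace=False)
    p = 1.0 / (1.0 + np.exp(-X[idx] @ w))
    per_example = (p - y[idx])[:, None] * X[idx]            # per-example gradients
    norms = np.linalg.norm(per_example, axis=1, keepdims=True)
    clipped = per_example / np.maximum(1.0, norms / C)      # clip to norm at most C
    noisy_sum = clipped.sum(axis=0) + rng.normal(scale=sigma * C, size=d)
    w -= lr * noisy_sum / batch

print("trained weights:", w)
```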

2022/099 (PDF) Last updated: 2022-01-31
Performance of Hierarchical Transforms in Homomorphic Encryption: A case study on Logistic Regression inference
Pedro Geraldo M. R. Alves, Jheyne N. Ortiz, Diego F. Aranha
Implementation

Recent works challenged the Number-Theoretic Transform (NTT) as the most efficient method for polynomial multiplication in GPU implementations of Fully Homomorphic Encryption schemes such as CKKS and BFV. In particular, these works argue that the Discrete Galois Transform (DGT) is a better candidate for this particular case. However, these claims were never rigorously validated, and only intuition was used to argue in favor of each transform. This work sheds some light on the discussion...
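
For readers unfamiliar with the NTT baseline being compared, here is a tiny reference implementation of NTT-based multiplication in Z_q[x]/(x^n + 1), checked against schoolbook negacyclic convolution. The parameters (n = 4, q = 17, psi = 2) are toy values chosen so the standard identities hold; the naive O(n^2) transforms bear no resemblance to the GPU kernels studied in the paper.

```python
# Toy reference for NTT-based multiplication in Z_q[x]/(x^n + 1),
# checked against schoolbook negacyclic convolution.
n, q = 4, 17
psi = 2                      # primitive 2n-th root of unity mod q (psi**n == -1 mod q)
omega = pow(psi, 2, q)       # primitive n-th root of unity

def ntt(a, root):
    """Naive O(n^2) number-theoretic transform."""
    return [sum(a[i] * pow(root, i * j, q) for i in range(n)) % q for j in range(n)]

def negacyclic_ntt_mul(a, b):
    # Twist by powers of psi, transform, multiply pointwise, transform back, untwist.
    at = ntt([a[i] * pow(psi, i, q) % q for i in range(n)], omega)
    bt = ntt([b[i] * pow(psi, i, q) % q for i in range(n)], omega)
    ct = ntt([x * y % q for x, y in zip(at, bt)], pow(omega, -1, q))
    n_inv, psi_inv = pow(n, -1, q), pow(psi, -1, q)
    return [c * n_inv * pow(psi_inv, i, q) % q for i, c in enumerate(ct)]

def negacyclic_schoolbook(a, b):
    c = [0] * n
    for i in range(n):
        for j in range(n):
            k, sign = (i + j) % n, -1 if i + j >= n else 1
            c[k] = (c[k] + sign * a[i] * b[j]) % q
    return c

a, b = [1, 2, 3, 4], [5, 6, 7, 8]
assert negacyclic_ntt_mul(a, b) == negacyclic_schoolbook(a, b)
print(negacyclic_ntt_mul(a, b))
```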

2021/1683 (PDF) Last updated: 2021-12-22
PUBA: Privacy-Preserving User-Data Bookkeeping and Analytics
Valerie Fetzer, Marcel Keller, Sven Maier, Markus Raiber, Andy Rupp, Rebecca Schwerdt
Cryptographic protocols

In this paper we propose Privacy-preserving User-data Bookkeeping & Analytics (PUBA), a building block destined to enable the implementation of business models (e.g., targeted advertising) and regulations (e.g., fraud detection) requiring user-data analysis in a privacy-preserving way. In PUBA, users keep an unlinkable but authenticated cryptographic logbook containing their historic data on their device. This logbook can only be updated by the operator while its content is not...

2021/733 (PDF) Last updated: 2021-12-06
GenoPPML – a framework for genomic privacy-preserving machine learning
Sergiu Carpov, Nicolas Gama, Mariya Georgieva, Dimitar Jetchev
Applications

We present a framework GenoPPML for privacy-preserving machine learning in the context of sensitive genomic data processing. The technology combines secure multiparty computation techniques based on the recently proposed Manticore secure multiparty computation framework for model training and fully homomorphic encryption based on TFHE for model inference. The framework was successfully used to solve breast cancer prediction problems on gene expression datasets coming from distinct private...

2021/555 (PDF) Last updated: 2022-06-20
Neural-Network-Based Modeling Attacks on XOR Arbiter PUFs Revisited
Nils Wisiol, Bipana Thapaliya, Khalid T. Mursi, Jean-Pierre Seifert, Yu Zhuang
Foundations

By revisiting, improving, and extending recent neural-network based modeling attacks on XOR Arbiter PUFs from the literature, we show that XOR Arbiter PUFs, (XOR) Feed-Forward Arbiter PUFs, and Interpose PUFs can be attacked faster, up to larger security parameters, and with fewer challenge-response pairs than previously known both in simulation and in silicon data. To support our claim, we discuss the differences and similarities of recently proposed modeling attacks and offer a fair...
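
To make the attack setting concrete, the sketch below simulates a k-XOR Arbiter PUF under the standard additive delay model and fits a small scikit-learn MLP on challenge-response pairs. The parameters (64 stages, k = 2, 30,000 CRPs, the (32, 32) network) are illustrative stand-ins, far below the security parameters and tuned attacks studied in the paper.

```python
# Toy XOR Arbiter PUF simulation (additive delay model) and a neural-network
# modeling attack with scikit-learn.  Illustrative parameters only.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n_stages, k, n_crps = 64, 2, 30_000

def parity_features(challenges):
    """Map {0,1} challenges to the usual {-1,+1} parity feature vectors."""
    phi = 1.0 - 2.0 * challenges                      # to +/-1
    feat = np.cumprod(phi[:, ::-1], axis=1)[:, ::-1]  # suffix products
    return np.hstack([feat, np.ones((len(challenges), 1))])

weights = rng.normal(size=(k, n_stages + 1))          # one delay vector per arbiter chain
challenges = rng.integers(0, 2, size=(n_crps, n_stages))
features = parity_features(challenges)
responses = (np.prod(np.sign(features @ weights.T), axis=1) > 0).astype(int)

split = int(0.9 * n_crps)
clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=300)
clf.fit(features[:split], responses[:split])
print("prediction accuracy:", clf.score(features[split:], responses[split:]))
```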

2021/508 (PDF) Last updated: 2021-04-23
Over 100x Faster Bootstrapping in Fully Homomorphic Encryption through Memory-centric Optimization with GPUs
Wonkyung Jung, Sangpyo Kim, Jung Ho Ahn, Jung Hee Cheon, Younho Lee
Implementation

Fully homomorphic encryption (FHE) has been gaining popularity as an emerging way of enabling an unlimited number of operations on encrypted messages without decryption. A major drawback of FHE is its high computational cost; in particular, bootstrapping, which refreshes the noise accumulated through successive FHE operations on a ciphertext, can take minutes. This significantly limits the practical use of FHE in numerous real applications. By exploiting massive parallelism available...

2021/200 (PDF) Last updated: 2021-02-24
Manticore: Efficient Framework for Scalable Secure Multiparty Computation Protocols
Sergiu Carpov, Kevin Deforth, Nicolas Gama, Mariya Georgieva, Dimitar Jetchev, Jonathan Katz, Iraklis Leontiadis, M. Mohammadi, Abson Sae-Tang, Marius Vuille
Implementation

We propose a novel MPC framework, Manticore, in the multiparty setting, with full threshold and semi-honest security model, supporting a combination of real number arithmetic (arithmetic shares), Boolean arithmetic (Boolean shares) and garbled circuits (Yao shares). In contrast to prior work [MZ17, MR18], Manticore never overflows, an important feature for machine learning applications. It achieves this without compromising efficiency or security. Compared to other overflow-free...

2020/1514 (PDF) Last updated: 2020-12-11
Improved privacy-preserving training using fixed-Hessian minimisation
Tabitha Ogilvie, Rachel Player, Joe Rowell
Applications

The fixed-Hessian minimisation method can be used to implement privacy-preserving machine learning training from homomorphic encryption. This is a relatively under-explored part of the literature, with the main prior work being that of Bonte and Vercauteren (BMC Medical Genomics, 2018), who proposed a simplified Hessian method for logistic regression training over the BFV homomorphic encryption scheme. Our main contribution is to revisit the fixed-Hessian approach for logistic regression...
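
For intuition, here is a plaintext numpy sketch of the classical fixed-Hessian idea this line of work builds on: replace the data-dependent logistic Hessian by the constant bound H̃ = -(1/4) XᵀX so its inverse is computed once. This is the textbook Böhning-Lindsay bound, not the paper's (or Bonte-Vercauteren's further simplified diagonal) HE construction, and the data is synthetic.

```python
# Plaintext sketch of fixed-Hessian logistic regression training:
# replace the data-dependent Hessian -X^T S X by the constant bound
# H = -(1/4) X^T X, so the "inverse Hessian" is computed only once.
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 4
X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, d - 1))])   # includes intercept
beta_true = rng.normal(size=d)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)

H_inv = np.linalg.inv(-0.25 * X.T @ X)       # fixed Hessian bound, inverted once
beta = np.zeros(d)
for _ in range(30):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)                     # gradient of the log-likelihood
    beta = beta - H_inv @ grad               # Newton-like step with the fixed Hessian

print("estimated coefficients:", beta)
```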

2020/1483 (PDF) Last updated: 2024-05-01
A Low-Depth Homomorphic Circuit for Logistic Regression Model Training
Eric Crockett
Applications

Machine learning is an important tool for analyzing large data sets, but its use on sensitive data may be limited by regulation. One solution to this problem is to perform machine learning tasks on encrypted data using homomorphic encryption, which enables arbitrary computation on encrypted data. We take a fresh look at one specific task: training a logistic regression model on encrypted data. The most important factor in the efficiency of a solution is the multiplicative depth of the...

2020/1396 (PDF) Last updated: 2020-11-10
Efficient Privacy Preserving Logistic Regression Inference and Training
Kyoohyung Han, Jinhyuck Jeong, Jung Hoon Sohn, Yongha Son
Public-key cryptography

Recently, privacy-preserving logistic regression techniques on data distributed among several data owners have drawn attention for their applicability in federated learning environments. Many of them are built upon cryptographic primitives such as secure multiparty computation (MPC) and homomorphic encryption (HE) to protect the privacy of data. Secure multiparty computation provides fast and secure unit operations for arithmetic and bit operations, but it often does not scale...

2020/1225 (PDF) Last updated: 2022-01-26
ABY2.0: Improved Mixed-Protocol Secure Two-Party Computation
Arpita Patra, Thomas Schneider, Ajith Suresh, Hossein Yalame
Cryptographic protocols

Secure Multi-party Computation (MPC) allows a set of mutually distrusting parties to jointly evaluate a function on their private inputs while maintaining input privacy. In this work, we improve semi-honest secure two-party computation (2PC) over rings, with a focus on the efficiency of the online phase. We propose an efficient mixed-protocol framework, outperforming the state-of-the-art 2PC framework of ABY. Moreover, we extend our techniques to multi-input multiplication gates without...
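
As a point of reference for secret-shared multiplication (ABY2.0 itself uses a different sharing with a cheaper online phase), here is the classical Beaver-triple multiplication on additive shares over the ring Z_{2^64}; the code is a single-process simulation of both parties, for illustration only.

```python
# Classical Beaver-triple multiplication on additive shares over Z_{2^64}
# (reference baseline only; not ABY2.0's own sharing or protocol).
import secrets

MASK = (1 << 64) - 1                     # arithmetic modulo 2^64

def share(x):
    r = secrets.randbits(64)
    return r, (x - r) & MASK             # (party 0 share, party 1 share)

def reconstruct(s0, s1):
    return (s0 + s1) & MASK

# Offline: a correlated triple a*b = c, secret-shared between the parties.
a, b = secrets.randbits(64), secrets.randbits(64)
c = (a * b) & MASK
a_sh, b_sh, c_sh = share(a), share(b), share(c)

# Online: multiply secret-shared x and y.
x, y = 1234567, 7654321
x_sh, y_sh = share(x), share(y)

# Each party opens d = x - a and e = y - b (uniformly random, so nothing leaks).
d = reconstruct((x_sh[0] - a_sh[0]) & MASK, (x_sh[1] - a_sh[1]) & MASK)
e = reconstruct((y_sh[0] - b_sh[0]) & MASK, (y_sh[1] - b_sh[1]) & MASK)

# z = c + d*b + e*a + d*e, computed locally on shares (d*e added by one party only).
z_sh = [
    (c_sh[0] + d * b_sh[0] + e * a_sh[0] + d * e) & MASK,
    (c_sh[1] + d * b_sh[1] + e * a_sh[1]) & MASK,
]
assert reconstruct(*z_sh) == (x * y) & MASK
```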

2020/957 (PDF) Last updated: 2020-10-15
Combining Optimization Objectives: New Machine-Learning Attacks on Strong PUFs
Johannes Tobisch, Anita Aghaie, Georg T. Becker
Applications

Strong Physical Unclonable Functions (PUFs), as a promising security primitive, are supposed to be a lightweight alternative to classical cryptography for purposes such as device authentication. Most of the proposed candidates, however, have been plagued by machine-learning attacks breaking their security claims. The Interpose PUF (iPUF), which has been introduced at CHES 2019, was explicitly designed with state-of-the-art machine-learning attacks in mind and is supposed to be impossible to...

2020/592 (PDF) Last updated: 2021-02-17
SWIFT: Super-fast and Robust Privacy-Preserving Machine Learning
Nishat Koti, Mahak Pancholi, Arpita Patra, Ajith Suresh
Cryptographic protocols

Performing machine learning (ML) computation on private data while maintaining data privacy, aka Privacy-preserving Machine Learning (PPML), is an emergent field of research. Recently, PPML has seen a visible shift towards the adoption of the Secure Outsourced Computation (SOC) paradigm due to the heavy computation that it entails. In the SOC paradigm, computation is outsourced to a set of powerful and specially equipped servers that provide service on a pay-per-use basis. In this work, we...

2020/171 (PDF) Last updated: 2020-03-03
High Performance Logistic Regression for Privacy-Preserving Genome Analysis
Martine De Cock, Rafael Dowsley, Anderson C. A. Nascimento, Davis Railsback, Jianwei Shen, Ariel Todoki
Cryptographic protocols

In this paper, we present a secure logistic regression training protocol and its implementation, with a new subprotocol to securely compute the activation function. To the best of our knowledge, we present the fastest existing secure Multi-Party Computation implementation for training logistic regression models on high dimensional genome data distributed across a local area network.

2020/042 (PDF) Last updated: 2021-01-06
BLAZE: Blazing Fast Privacy-Preserving Machine Learning
Arpita Patra, Ajith Suresh
Cryptographic protocols

Machine learning tools have illustrated their potential in many significant sectors such as healthcare and finance, to aid in deriving useful inferences. The sensitive and confidential nature of the data in such sectors raises natural concerns about data privacy. This motivated the area of Privacy-preserving Machine Learning (PPML), where the privacy of the data is guaranteed. Typically, ML techniques require large computing power, which leads clients with limited infrastructure to rely...

2019/1365 (PDF) Last updated: 2020-02-20
FLASH: Fast and Robust Framework for Privacy-preserving Machine Learning
Megha Byali, Harsh Chaudhari, Arpita Patra, Ajith Suresh
Cryptographic protocols

Privacy-preserving machine learning (PPML) via Secure Multi-party Computation (MPC) has gained momentum in the recent past. Assuming a minimal network of pair-wise private channels, we propose an efficient four-party PPML framework over rings $\mathbb{Z}_{2^{\ell}}$, FLASH, the first of its kind in the regime of PPML frameworks, that achieves the strongest security notion of Guaranteed Output Delivery (all parties obtain the output irrespective of the adversary's behaviour). The state of the art ML...

2019/1315 (PDF) Last updated: 2021-06-08
Trident: Efficient 4PC Framework for Privacy Preserving Machine Learning
Harsh Chaudhari, Rahul Rachuri, Ajith Suresh
Cryptographic protocols

Machine learning has started to be deployed in fields such as healthcare and finance, which involves dealing with a lot of sensitive data. This propelled the need for and growth of privacy-preserving machine learning (PPML). We propose an actively secure four-party protocol (4PC), and a framework for PPML, showcasing its applications on four of the most widely-known machine learning algorithms -- Linear Regression, Logistic Regression, Neural Networks, and Convolutional Neural Networks. Our...

2019/1148 (PDF) Last updated: 2019-10-07
On the Feasibility and Impact of Standardising Sparse-secret LWE Parameter Sets for Homomorphic Encryption
Benjamin R. Curtis, Rachel Player
Public-key cryptography

In November 2018, the HomomorphicEncryption.org consortium published the Homomorphic Encryption Security Standard. The Standard recommends several sets of Learning with Errors (LWE) parameters that can be selected by application developers to achieve a target security level \( \lambda \in \{128,192,256\} \). These parameter sets all involve a power-of-two dimension \( n \leq 2^{15} \), an error distribution of standard deviation \( \sigma \approx 3.19 \), and a secret whose coefficients are...
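
To make the parameter shape concrete, the sketch below samples a toy sparse-secret LWE instance with a power-of-two dimension, σ ≈ 3.19, and a ternary secret of a fixed Hamming weight. A rounded Gaussian stands in for the Standard's discrete Gaussian, the modulus and weight are arbitrary illustrative choices, and nothing here constitutes a security estimate.

```python
# Sample a toy sparse-secret LWE instance (A, b = A s + e mod q) in the general
# shape discussed by the Standard.  Illustrative parameters only.
import numpy as np

rng = np.random.default_rng(0)
n, q, sigma, h = 1024, 2 ** 32, 3.19, 64      # dimension, modulus, error stddev, secret weight
m = 2 * n                                     # number of samples

# Sparse ternary secret with Hamming weight h.
s = np.zeros(n, dtype=np.int64)
support = rng.choice(n, size=h, replace=False)
s[support] = rng.choice([-1, 1], size=h)

A = rng.integers(0, q, size=(m, n), dtype=np.int64)
e = np.rint(rng.normal(0.0, sigma, size=m)).astype(np.int64)   # rounded-Gaussian stand-in
b = (A @ s + e) % q

print("secret weight:", np.count_nonzero(s), "error std:", e.std().round(2))
```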

2019/979 (PDF) Last updated: 2019-08-29
PrivFL: Practical Privacy-preserving Federated Regressions on High-dimensional Data over Mobile Networks
Kalikinkar Mandal, Guang Gong
Cryptographic protocols

Federated Learning (FL) enables a large number of users to jointly learn a shared machine learning (ML) model, coordinated by a centralized server, where the data is distributed across multiple devices. This approach enables the server or users to train and learn an ML model using gradient descent, while keeping all the training data on users' devices. We consider training an ML model over a mobile network where user dropout is a common phenomenon. Although federated learning was aimed at...

2019/744 (PDF) Last updated: 2021-03-12
Privacy-Preserving Classification of Personal Text Messages with Secure Multi-Party Computation: An Application to Hate-Speech Detection
Devin Reich, Ariel Todoki, Rafael Dowsley, Martine De Cock, Anderson C. A. Nascimento
Cryptographic protocols

Classification of personal text messages has many useful applications in surveillance, e-commerce, and mental health care, to name a few. Giving applications access to personal texts can easily lead to (un)intentional privacy violations. We propose the first privacy-preserving solution for text classification that is provably secure. Our method, which is based on Secure Multiparty Computation (SMC), encompasses both feature extraction from texts, and subsequent classification with logistic...

2019/429 (PDF) Last updated: 2020-02-02
ASTRA: High Throughput 3PC over Rings with Application to Secure Prediction
Harsh Chaudhari, Ashish Choudhury, Arpita Patra, Ajith Suresh
Cryptographic protocols

The concrete efficiency of secure computation has been the focus of many recent works. In this work, we present concretely efficient protocols for secure $3$-party computation (3PC) over a ring of integers modulo $2^{\ell}$ tolerating one corruption, both with semi-honest and malicious security. Owing to the fact that computation over rings emulates computation over real-world system architectures, secure computation over rings has gained momentum of late. Cast in the offline-online...

2019/425 (PDF) Last updated: 2019-08-19
Homomorphic Training of 30,000 Logistic Regression Models
Flavio Bergamaschi, Shai Halevi, Tzipora T. Halevi, Hamish Hunt
Applications

In this work, we demonstrate the use of the CKKS homomorphic encryption scheme to train a large number of logistic regression models simultaneously, as needed to run a genome-wide association study (GWAS) on encrypted data. Our implementation can train more than 30,000 models (each with four features) in about 20 minutes. To that end, we rely on a similar iterative Nesterov procedure to what was used by Kim, Song, Kim, Lee, and Cheon to train a single model [KSKLC18]. We adapt this method to...
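
As a plaintext reference for the iteration mentioned here, the sketch below runs a Nesterov-accelerated gradient ascent on the logistic log-likelihood for a single four-feature model on synthetic data. It mirrors only the high-level update rule, not the encrypted CKKS arithmetic or the batching of tens of thousands of models.

```python
# Plaintext Nesterov-accelerated gradient iteration for one 4-feature
# logistic regression model (the encrypted version batches many such models).
import numpy as np

rng = np.random.default_rng(0)
n, d = 400, 4
X = rng.normal(size=(n, d))
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ np.array([1.0, -2.0, 0.5, 0.0])))).astype(float)

def grad(beta):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    return X.T @ (y - p) / n                 # gradient of the average log-likelihood

beta = np.zeros(d)
beta_prev = beta.copy()
lr = 1.0
for t in range(1, 101):
    momentum = (t - 1) / (t + 2)
    look_ahead = beta + momentum * (beta - beta_prev)   # Nesterov look-ahead point
    beta_prev = beta
    beta = look_ahead + lr * grad(look_ahead)           # ascent step on the log-likelihood

print("fitted coefficients:", beta)
```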

2019/294 (PDF) Last updated: 2020-05-23
Semi-parallel Logistic Regression for GWAS on Encrypted Data
Miran Kim, Yongsoo Song, Baiyu Li, Daniele Micciancio
Applications

The sharing of biomedical data is crucial to enable scientific discoveries across institutions and improve health care. For example, genome-wide association studies (GWAS) based on a large number of samples can identify disease-causing genetic variants. The privacy concern, however, has become a major hurdle for data management and utilization. Homomorphic encryption is one of the most powerful cryptographic primitives which can address the privacy and security issues. It supports the...

2019/281 (PDF) Last updated: 2019-12-13
Make Some ROOM for the Zeros: Data Sparsity in Secure Distributed Machine Learning
Phillipp Schoppmann, Adria Gascon, Mariana Raykova, Benny Pinkas
Cryptographic protocols

Exploiting data sparsity is crucial for the scalability of many data analysis tasks. However, while there is an increasing interest in efficient secure computation protocols for distributed machine learning, data sparsity has so far not been considered in a principled way in that setting. We propose sparse data structures together with their corresponding secure computation protocols to address common data analysis tasks while utilizing data sparsity. In particular, we define a Read-Only...

2019/223 (PDF) Last updated: 2020-07-24
Optimized Homomorphic Encryption Solution for Secure Genome-Wide Association Studies
Marcelo Blatt, Alexander Gusev, Yuriy Polyakov, Kurt Rohloff, Vinod Vaikuntanathan
Implementation

Genome-Wide Association Studies (GWAS) refer to observational studies of a genome-wide set of genetic variants across many individuals to see if any genetic variants are associated with a certain trait. A typical GWAS analysis of a disease phenotype involves iterative logistic regression of a case/control phenotype on a single-nucleotide polymorphism (SNP) with quantitative covariates. GWAS have been a highly successful approach for identifying genetic-variant associations with many...

2019/145 (PDF) Last updated: 2019-08-01
Achieving GWAS with Homomorphic Encryption
Jun Jie Sim, Fook Mun Chan, Shibin Chen, Benjamin Hong Meng Tan, Khin Mi Mi Aung
Applications

One way of investigating how genes affect human traits is with a genome-wide association study (GWAS). Genetic markers, known as single-nucleotide polymorphisms (SNPs), are used in GWAS. This raises privacy and security concerns, as these genetic markers can be used to identify individuals uniquely. The problem is further exacerbated by the large number of SNPs needed to produce reliable results, which comes at a higher risk of compromising the privacy of participants. We describe a method using...

2019/140 (PDF) Last updated: 2019-02-14
CodedPrivateML: A Fast and Privacy-Preserving Framework for Distributed Machine Learning
Jinhyun So, Basak Guler, A. Salman Avestimehr, Payman Mohassel
Applications

How can a machine learning model be trained while keeping the data private and secure? We present CodedPrivateML, a fast and scalable approach to this critical problem. CodedPrivateML keeps both the data and the model information-theoretically private, while allowing efficient parallelization of training across distributed workers. We characterize CodedPrivateML's privacy threshold and prove its convergence for logistic (and linear) regression. Furthermore, via experiments over Amazon EC2, we...

2019/101 (PDF) Last updated: 2019-01-31
Privacy-preserving semi-parallel logistic regression training with Fully Homomorphic Encryption
Sergiu Carpov, Nicolas Gama, Mariya Georgieva, Juan Ramon Troncoso-Pastoriza
Applications

Background Privacy-preserving computations on genomic data, and more generally on medical data, is a critical path technology for innovative, life-saving research to positively and equally impact the global population. It enables medical research algorithms to be securely deployed in the cloud because operations on encrypted genomic databases are conducted without revealing any individual genomes. Methods for secure computation have shown significant performance improvements over the last...

2018/931 (PDF) Last updated: 2018-10-02
A Full RNS Variant of Approximate Homomorphic Encryption
Jung Hee Cheon, Kyoohyung Han, Andrey Kim, Miran Kim, Yongsoo Song

The technology of homomorphic encryption has improved rapidly in a few years. The cutting edge implementations are efficient enough to use in practical applications. Recently, Cheon et al. (ASIACRYPT'17) proposed a homomorphic encryption scheme which supports an arithmetic of approximate numbers over encryption. This scheme shows the current best performance in computation over the real numbers, but its implementation could not employ core optimization techniques based on the Residue Number...
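
For readers new to the Residue Number System, the sketch below shows the core idea the RNS variant exploits: represent a big integer by its residues modulo several word-size primes, operate component-wise, and reconstruct with the Chinese Remainder Theorem. The three primes form a toy "modulus chain" chosen only for illustration; none of the scheme's actual rescaling or modulus-switching machinery appears here.

```python
# Residue Number System in a nutshell: split a big integer into residues modulo
# word-size primes, operate component-wise, reconstruct with the CRT.
from math import prod

primes = [65537, 131071, 524287]         # well-known small primes, used as a toy modulus chain
Q = prod(primes)

def to_rns(x):
    return [x % p for p in primes]

def from_rns(residues):
    """Chinese Remainder Theorem reconstruction modulo Q."""
    x = 0
    for r, p in zip(residues, primes):
        Qi = Q // p
        x += r * Qi * pow(Qi, -1, p)
    return x % Q

a, b = 123456789012, 987654321098
c_rns = [(ra * rb) % p for ra, rb, p in zip(to_rns(a), to_rns(b), primes)]   # word-size ops only
assert from_rns(c_rns) == (a * b) % Q
```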

2018/662 (PDF) Last updated: 2018-07-10
Efficient Logistic Regression on Large Encrypted Data
Kyoohyung Han, Seungwan Hong, Jung Hee Cheon, Daejun Park
Applications

Machine learning on encrypted data is a cryptographic method for analyzing private and/or sensitive data while preserving privacy. In the training phase, it takes encrypted training data as input and outputs an encrypted model without using the decryption key. In the prediction phase, it uses the encrypted model to predict results on new encrypted data. In each phase, no decryption key is needed, and thus the privacy of the data is guaranteed as long as the underlying encryption is secure. It has...

2018/589 (PDF) Last updated: 2019-03-06
Implementation and Performance Evaluation of RNS Variants of the BFV Homomorphic Encryption Scheme
Ahmad Al Badawi, Yuriy Polyakov, Khin Mi Mi Aung, Bharadwaj Veeravalli, Kurt Rohloff
Implementation

Homomorphic encryption is an emerging form of encryption that provides the ability to compute on encrypted data without ever decrypting them. Potential applications include aggregating sensitive encrypted data on a cloud environment and computing on the data in the cloud without compromising data privacy. There have been several recent advances resulting in new homomorphic encryption schemes and optimized variants. We implement and evaluate the performance of two optimized variants, namely...

2018/462 (PDF) Last updated: 2018-05-21
Logistic regression over encrypted data from fully homomorphic encryption
Hao Chen, Ran Gilad-Bachrach, Kyoohyung Han, Zhicong Huang, Amir Jalali, Kim Laine, Kristin Lauter
Applications

One of the tasks in the $2017$ iDASH secure genome analysis competition was to enable training of logistic regression models over encrypted genomic data. More precisely, given a list of approximately $1500$ patient records, each with $18$ binary features containing information on specific mutations, the idea was for the data holder to encrypt the records using homomorphic encryption, and send them to an untrusted cloud for storage. The cloud could then apply a training algorithm on the...

2018/403 (PDF) Last updated: 2022-01-10
ABY3: A Mixed Protocol Framework for Machine Learning
Payman Mohassel, Peter Rindal
Cryptographic protocols

Machine learning is widely used to produce models for a range of applications and is increasingly offered as a service by major technology companies. However, the required massive data collection raises privacy concerns during both training and prediction stages. In this paper, we design and implement a general framework for privacy-preserving machine learning and use it to obtain new solutions for training linear regression, logistic regression and neural network models. Our protocols are...

2018/350 (PDF) Last updated: 2019-07-09
The Interpose PUF: Secure PUF Design against State-of-the-art Machine Learning Attacks
Phuong Ha Nguyen, Durga Prasad Sahoo, Chenglu Jin, Kaleel Mahmood, Ulrich Rührmair, Marten van Dijk
Implementation

The design of a silicon Strong Physical Unclonable Function (PUF) that is lightweight and stable, and which possesses a rigorous security argument, has been a fundamental problem in PUF research since its very beginnings in 2002. Various effective PUF modeling attacks, for example at CCS 2010 and CHES 2015, have shown that currently, no existing silicon PUF design can meet these requirements. In this paper, we introduce the novel Interpose PUF (iPUF) design, and rigorously prove its...

2018/254 (PDF) Last updated: 2018-03-07
Logistic Regression Model Training based on the Approximate Homomorphic Encryption
Andrey Kim, Yongsoo Song, Miran Kim, Keewoo Lee, Jung Hee Cheon
Applications

Security concerns have been raised since big data became a prominent tool in data analysis. For instance, many machine learning algorithms aim to generate prediction models using training data which contain sensitive information about individuals. The cryptography community is considering secure computation as a solution for privacy protection. In particular, practical requirements have triggered research on the efficiency of cryptographic primitives. This paper presents a practical method to...

2018/233 (PDF) Last updated: 2019-01-02
Privacy-Preserving Logistic Regression Training
Charlotte Bonte, Frederik Vercauteren
Applications

Logistic regression is a popular technique used in machine learning to construct classification models. Since the construction of such models is based on computing with large datasets, it is an appealing idea to outsource this computation to a cloud service. The privacy-sensitive nature of the input data requires appropriate privacy preserving measures before outsourcing it. Homomorphic encryption enables one to compute on encrypted data directly, without decryption, and can be used to...

2018/202 (PDF) Last updated: 2018-02-22
Doing Real Work with FHE: The Case of Logistic Regression
Jack L. H. Crawford, Craig Gentry, Shai Halevi, Daniel Platt, Victor Shoup
Implementation

We describe our recent experience, building a system that uses fully-homomorphic encryption (FHE) to approximate the coefficients of a logistic-regression model, built from genomic data. The aim of this project was to examine the feasibility of a solution that operates "deep within the bootstrapping regime," solving a problem that appears too hard to be addressed just with somewhat-homomorphic encryption. As part of this project, we implemented optimized versions of many "bread and butter"...

2018/074 (PDF) Last updated: 2018-03-26
Secure Logistic Regression Based on Homomorphic Encryption: Design and Evaluation
Miran Kim, Yongsoo Song, Shuang Wang, Yuhou Xia, Xiaoqian Jiang

Learning a model without accessing raw data has been an intriguing idea to the security and machine learning researchers for years. In an ideal setting, we want to encrypt sensitive data to store them on a commercial cloud and run certain analysis without ever decrypting the data to preserve the privacy. Homomorphic encryption technique is a promising candidate for secure data outsourcing but it is a very challenging task to support real-world machine learning tasks. Existing frameworks can...

2017/1234 (PDF) Last updated: 2017-12-22
High-Precision Privacy-Preserving Real-Valued Function Evaluation
Christina Boura, Ilaria Chillotti, Nicolas Gama, Dimitar Jetchev, Stanislav Peceny, Alexander Petric
Cryptographic protocols

We propose a novel multi-party computation protocol for evaluating continuous real-valued functions with high numerical precision. Our method is based on approximations with Fourier series and uses at most two rounds of communication during the online phase. For the offline phase, we propose a trusted-dealer and honest-but-curious aided solution, respectively. We apply our algorithm to train a logistic regression classifier via a variant of Newton's method (known as IRLS) to compute...
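
As a plaintext point of reference for the IRLS iteration mentioned here, the sketch below runs iteratively reweighted least squares for logistic regression on synthetic data. The secure protocol evaluates this kind of iteration under MPC with Fourier-series approximations in place of the exact sigmoid; none of that machinery appears in this sketch.

```python
# Plaintext IRLS (iteratively reweighted least squares) for logistic regression,
# i.e., the Newton-type iteration that the secure protocol approximates.
import numpy as np

rng = np.random.default_rng(0)
n, d = 300, 3
X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, d - 1))])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ np.array([0.5, 1.5, -1.0])))).astype(float)

beta = np.zeros(d)
for _ in range(8):                                # IRLS converges in a handful of steps
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = p * (1.0 - p)                             # weights: the diagonal of S
    z = X @ beta + (y - p) / np.maximum(W, 1e-9)  # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

print("IRLS coefficients:", beta)
```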

2017/396 (PDF) Last updated: 2017-06-07
SecureML: A System for Scalable Privacy-Preserving Machine Learning
Payman Mohassel, Yupeng Zhang
Cryptographic protocols

Machine learning is widely used in practice to produce predictive models for applications such as image processing, speech and text recognition. These models are more accurate when trained on large amounts of data collected from different sources. However, the massive data collection raises privacy concerns. In this paper, we present new and efficient protocols for privacy preserving machine learning for linear regression, logistic regression and neural network training using the stochastic...

2016/736 (PDF) Last updated: 2017-03-05
Efficient and Private Scoring of Decision Trees, Support Vector Machines and Logistic Regression Models based on Pre-Computation
Martine De Cock, Rafael Dowsley, Caleb Horst, Raj Katti, Anderson C. A. Nascimento, Stacey C. Newman, Wing-Sea Poon
Cryptographic protocols

Many data-driven personalized services require that private data of users is scored against a trained machine learning model. In this paper we propose a novel protocol for privacy-preserving classification of decision trees, a popular machine learning model in these scenarios. Our solutions are composed out of building blocks, namely a secure comparison protocol, a protocol for obliviously selecting inputs, and a protocol for evaluating polynomials. By combining some of the building blocks...

2016/428 (PDF) Last updated: 2016-05-01
An Efficient and Scalable Modeling Attack on Lightweight Secure Physically Unclonable Function
Phuong Ha Nguyen, Durga Prasad Sahoo
Applications

The Lightweight Secure Physically Unclonable Function (LSPUF) was proposed as a secure composition of Arbiter PUFs with additional XOR based input and output networks. But later, researchers proposed a Machine Learning (ML) based modeling attack on $x$-XOR LSPUF, and they also empirically showed that pure ML based modeling is not computationally scalable if the parameter $x$ of $x$-XOR LSPUF is larger than nine. Besides this pure computational attack using only challenge-response pairs...

2016/111 (PDF) Last updated: 2017-03-31
Scalable and Secure Logistic Regression via Homomorphic Encryption
Yoshinori Aono, Takuya Hayashi, Le Trieu Phong, Lihua Wang
Applications

Logistic regression is a powerful machine learning tool to classify data. When dealing with sensitive data such as private or medical information, care is necessary. In this paper, we propose a secure system for protecting both the training and predicting data in logistic regression via homomorphic encryption. Perhaps surprisingly, despite the non-polynomial tasks of training and predicting in logistic regression, we show that only additively homomorphic encryption is needed to build our...

2013/112 (PDF) Last updated: 2013-08-20
PUF Modeling Attacks on Simulated and Silicon Data
Ulrich Rührmair, Jan Sölter, Frank Sehnke, Xiaolin Xu, Ahmed Mahmoud, Vera Stoyanova, Gideon Dror, Jürgen Schmidhuber, Wayne Burleson, Srinivas Devadas
Implementation

We show in this paper how several proposed Strong Physical Unclonable Functions (PUFs) can be broken by numerical modeling attacks. Given a set of challenge-response pairs (CRPs) of a Strong PUF, our attacks construct a computer algorithm which behaves indistinguishably from the original PUF on almost all CRPs. This algorithm can subsequently impersonate the PUF, and can be cloned and distributed arbitrarily. This breaks the security of almost all applications and protocols that are based on...

2010/251 (PDF) Last updated: 2010-05-03
Modeling Attacks on Physical Unclonable Functions
Ulrich Rührmair, Frank Sehnke, Jan Sölter, Gideon Dror, Srinivas Devadas, Jürgen Schmidhuber
Implementation

We show in this paper how several proposed Physical Unclonable Functions (PUFs) can be broken by numerical modeling attacks. Given a set of challenge-response pairs (CRPs) of a PUF, our attacks construct a computer algorithm which behaves indistinguishably from the original PUF on almost all CRPs. This algorithm can subsequently impersonate the PUF, and can be cloned and distributed arbitrarily. This breaks the security of essentially all applications and protocols that are based on the...
