Piyush Malhotra
@piyush2896
piyush2896
Homomorphic Encryption in Python
Optimizing Encryption
Building a Neural Network
How to make it Easy?
Deep Learning is a family of algorithms that learn to convert one form of data into another using Neural Networks
Without being told how good its predictions are, a network cannot learn. This will be important to remember.
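To make the point concrete, here is a minimal sketch (my own toy example, not from the talk): a single sigmoid neuron with a bias input trained by gradient descent. The `error` term is exactly the "how good were my predictions" feedback; without it, the weight update would be zero and nothing would be learned.

```python
import numpy as np

np.random.seed(0)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Toy dataset (last column is a constant bias input):
# the target is simply the first input column.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
y = np.array([[0], [0], [1], [1]], dtype=float)

weights = np.random.randn(3, 1)

for _ in range(1000):
    pred = sigmoid(X @ weights)
    error = y - pred  # the feedback signal the network needs in order to learn
    # Gradient step: scale the error by the sigmoid's derivative
    # and push it back onto the weights.
    weights += X.T @ (error * pred * (1 - pred))

print(pred.ravel())  # predictions end up close to [0, 0, 1, 1]
```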
A form of encryption that lets you perform computations (such as addition and multiplication) directly on encrypted values, without decrypting them first.
When you homomorphically encrypt data, you can't read it, but it still retains most of its interesting statistical structure. This has allowed people to train models on encrypted data (CryptoNets).
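As a flavor of what "computing on ciphertexts" means, here is a deliberately insecure toy (my own illustration, not a real HE scheme): additive masking is additively homomorphic, so the sum of two ciphertexts decrypts to the sum of the plaintexts.

```python
import random

MOD = 2**32  # work modulo a fixed word size

def encrypt(m, key):
    # "Encrypt" by adding a random mask (toy scheme, illustration only).
    return (m + key) % MOD

def decrypt(c, key):
    return (c - key) % MOD

k1, k2 = random.randrange(MOD), random.randrange(MOD)
c1, c2 = encrypt(20, k1), encrypt(22, k2)

# Add the ciphertexts without ever seeing the plaintexts...
c_sum = (c1 + c2) % MOD

# ...and decrypt the result with the combined key.
print(decrypt(c_sum, (k1 + k2) % MOD))  # → 42
```

Real homomorphic schemes support richer operations (and actual security), but the shape of the trick is the same: arithmetic on ciphertexts mirrors arithmetic on plaintexts.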
Many state-of-the-art neural networks can be created using only a handful of operations: addition, subtraction, multiplication, division, and a few nonlinear functions (such as sigmoid and tanh).
A Taylor Series allows one to approximate a complicated (nonlinear) function using an infinite series of additions, subtractions, multiplications, and divisions; truncating the series gives a polynomial that uses only those basic operations.
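For example, the degree-5 Taylor (Maclaurin) polynomial of the sigmoid function needs only additions, multiplications, and divisions, which makes it computable under a homomorphic scheme:

```python
import math

def sigmoid_taylor(x):
    # sigmoid(x) ≈ 1/2 + x/4 - x^3/48 + x^5/480 around x = 0
    return 1/2 + x/4 - x**3/48 + x**5/480

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

for x in (0.0, 0.5, 1.0):
    print(x, sigmoid(x), sigmoid_taylor(x))
```

The approximation is very close near 0 but degrades as |x| grows, so inputs need to be kept in a small range in practice.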
The quest for a fully homomorphic scheme seeks to find an algorithm that can efficiently and securely compute the various logic gates required to run arbitrary computation. The general hope is that people would be able to securely offload work to the cloud with no risk that the data being sent could be read by anyone other than the sender.
In general, most Fully Homomorphic Encryption schemes are incredibly slow relative to plaintext computation (not yet practical). This has sparked an interesting thread of research into Somewhat Homomorphic schemes that limit the supported operations so that at least some computations can be performed. Less flexible but faster: a common tradeoff in computation.
The best one to use here is likely either YASHE or FV. YASHE was the method used for the popular CryptoNets algorithm, and it has great support for floating-point operations.
YASHE and FV are pretty complex. For the purpose of keeping this talk easy and fun to play around with, we're going to go with the slightly less advanced (and possibly less secure) Efficient Integer Vector Homomorphic Encryption scheme.
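The core identity in integer-vector HE is S @ c = w*x + e, where S is a secret key matrix, w is a large scaling factor, and e is small noise. A stripped-down sketch (my simplification: taking S to be the identity, so c = w*x + e; a real deployment would use a secret S and a key-switching procedure) shows why addition and scalar multiplication of ciphertexts work:

```python
import numpy as np

np.random.seed(1)
w = 2**10  # scaling factor: must dwarf the accumulated noise

def encrypt(x):
    # Toy encryption with identity secret key: c = w*x + small noise.
    e = np.random.randint(-5, 6, size=x.shape)
    return w * x + e

def decrypt(c):
    # Rounding strips the noise as long as |noise| << w.
    return np.round(c / w).astype(int)

x1 = np.array([1, 2, 3])
x2 = np.array([10, 20, 30])
c1, c2 = encrypt(x1), encrypt(x2)

print(decrypt(c1 + c2))  # ciphertext addition → [11 22 33]
print(decrypt(3 * c1))   # scalar multiplication → [3 6 9]
```

Each operation grows the noise term, so only a limited number of operations can be chained before decryption fails, which is exactly the "somewhat homomorphic" limitation mentioned above.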
Let's go to the notebook...