Private AI

Kristin Lauter
Microsoft Research

As the world rushes towards the adoption of Artificial Intelligence in all aspects of our lives, the promise is great, but the risks are many. AI can help to improve and simplify our daily lives and introduce new types of safety measures, but at the cost of sharing, and potentially leaking or misusing, our private data. Cloud services built on AI algorithms need to make use of customer or enterprise data to train models and make predictions. It is up to us to protect that data, both in storage and in use. Private AI is a set of tools being developed by the Cryptography Group at Microsoft Research that can help protect data while in use by encrypting it. The leading edge of Private AI is based on Homomorphic Encryption (HE), a new encryption paradigm that allows the cloud to operate on private data in encrypted form, without ever decrypting it, enabling private training and private prediction. This talk will give a snapshot of the state of the art and show some compelling demos of HE in action.
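The core idea of operating on encrypted data can be illustrated with a toy additively homomorphic scheme. The sketch below implements the classic Paillier cryptosystem, in which multiplying two ciphertexts yields an encryption of the sum of the plaintexts; it is an illustrative assumption on my part, not the Microsoft SEAL library or the schemes demonstrated in the talk, and the tiny primes make it insecure outside of a demo.

```python
# Toy Paillier cryptosystem: an illustration of additively homomorphic
# encryption. NOT secure at these parameter sizes and NOT the scheme or
# library used in the talk -- for intuition only.
from math import gcd
import random

def lcm(a, b):
    return a * b // gcd(a, b)

# Tiny demo primes (real deployments use primes of 1024+ bits).
p, q = 347, 379
n = p * q
n2 = n * n
g = n + 1                      # standard choice of generator
lam = lcm(p - 1, q - 1)        # private key component

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # private key component

def encrypt(m):
    # c = g^m * r^n mod n^2 for a random r coprime to n
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts,
# so the "cloud" can compute on data it never sees in the clear.
c1, c2 = encrypt(20), encrypt(22)
c_sum = (c1 * c2) % n2
print(decrypt(c_sum))  # 42
```

Fully homomorphic schemes such as those behind Private AI go further, supporting both addition and multiplication on ciphertexts, which is what makes encrypted model training and prediction possible.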

