

Differential Privacy for Free? Harnessing the Noise In Approximate Homomorphic Encryption

by Tabitha Ogilvie - 2023.06.08

Video recording (YouTube) | Slides (GitHub) | Paper (IACR) | Join the discussion (Discord)


Abstract

In this meetup, we’ll make a connection between two important ideas in the Privacy Enhancing Technologies ecosystem – Homomorphic Encryption and Differential Privacy. While Homomorphic Encryption ensures that sensitive data is not exposed during computation, Differential Privacy guarantees that each data subject can maintain their privacy when we share the result of that computation.
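For readers who would like the precise statement of the second notion, the standard (ε, δ)-formulation of Differential Privacy (the notation below is the usual textbook one, not taken verbatim from the talk) says that a randomised mechanism M is (ε, δ)-differentially private if, for every pair of datasets D and D′ differing in a single individual’s record and every set of outputs S,

    \Pr[M(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[M(D') \in S] + \delta.

Smaller values of ε and δ correspond to a stronger privacy guarantee.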

During this talk, we’ll look at noise growth in Homomorphic Encryption (HE), and investigate the possibility that this inherent noise can give Differential Privacy (DP) “for free”. We will recap what we mean by HE, noise, and DP, before examining new results on the DP guarantees of the Approximate HE setting. We’ll finish by applying our results to a case study: Ridge Regression Training via Gradient Descent.
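As a rough sketch of the case study, using standard ridge regression notation rather than the exact parameterisation from the paper: given a data matrix X, labels y, and regularisation strength λ > 0, ridge regression minimises

    L(w) = \lVert Xw - y \rVert_2^2 + \lambda \lVert w \rVert_2^2,

and gradient descent with learning rate η repeats the update

    w_{t+1} = w_t - \eta\,\nabla L(w_t) = w_t - \eta\bigl(2X^{\top}(Xw_t - y) + 2\lambda w_t\bigr).

When these updates are carried out under an approximate HE scheme such as CKKS, every iteration also accumulates encryption noise, and the question examined in the talk is when that inherent noise is already enough to make the released model differentially private.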

About the speaker

Tabitha is a PhD student in the Information Security Group at Royal Holloway, University of London, and has just completed a year-long internship at Intel as part of the Security and Privacy Research Group within Intel Labs. Her area of research is Privacy Enhancing Technologies and Privacy Preserving Machine Learning, with a focus on Homomorphic Encryption.

Never miss an update

The newsletter where we post community announcements: https://fheorg.substack.com/

The Discord server where you can discuss FHE-related topics with the community: https://discord.fhe.org

Make sure to join either (or both) of these to stay informed about future events!