🤖 AI Summary
This work addresses the excessive noise injected by traditional differential privacy mechanisms in machine unlearning, which calibrate to worst-case sensitivity and consequently degrade model utility. The authors propose an adaptive noise injection method that calibrates noise to the individual sensitivity of each data point, achieving instance-specific certified unlearning for ridge regression trained via Langevin dynamics. The key innovation is establishing, for the first time, high-probability upper bounds on the individual sensitivity of each sample slated for deletion, which substantially reduces the required noise while preserving formal unlearning guarantees. The analysis yields tight per-instance sensitivity bounds, and experiments on both ridge regression and deep learning models demonstrate a superior trade-off between utility preservation and privacy assurance.
📝 Abstract
Certified machine unlearning can be achieved via noise injection that yields differential privacy guarantees, with noise calibrated to worst-case sensitivity. Such conservative calibration often results in performance degradation, limiting practical applicability. In this work, we investigate an alternative approach based on adaptive per-instance noise calibration, tailored to the individual contribution of each data point to the learned solution. This raises the following challenge: how can one establish formal unlearning guarantees when the mechanism depends on the specific point to be removed? To define individual data point sensitivities in noisy gradient dynamics, we consider the use of per-instance differential privacy. For ridge regression trained via Langevin dynamics, we derive high-probability per-instance sensitivity bounds, yielding certified unlearning with substantially less noise injection. We corroborate our theoretical findings through experiments in linear settings and provide further empirical evidence on the relevance of the approach in deep learning settings.
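To make the setting concrete, the sketch below illustrates the general pipeline the abstract describes: ridge regression trained with noisy gradient (Langevin) dynamics, followed by an unlearning step that fine-tunes on the remaining data and injects Gaussian noise scaled by a per-instance sensitivity value. This is not the authors' algorithm; the function names, the fine-tuning schedule, and the `sens_i` scaling are illustrative assumptions (in the paper, the sensitivity bound is derived analytically per deleted sample, not supplied by hand).

```python
import numpy as np

def ridge_grad(theta, X, y, lam):
    """Gradient of the ridge objective (1/n)||X theta - y||^2 + lam * ||theta||^2."""
    n = X.shape[0]
    return (2.0 / n) * X.T @ (X @ theta - y) + 2.0 * lam * theta

def langevin_ridge(X, y, lam=0.01, eta=0.01, beta=1e4, steps=500, rng=None):
    """Langevin dynamics: gradient descent plus Gaussian noise of scale sqrt(2*eta/beta)."""
    rng = np.random.default_rng(0) if rng is None else rng
    theta = np.zeros(X.shape[1])
    for _ in range(steps):
        noise = np.sqrt(2.0 * eta / beta) * rng.standard_normal(theta.shape)
        theta = theta - eta * ridge_grad(theta, X, y, lam) + noise
    return theta

def unlearn_point(theta, X, y, i, sens_i, lam=0.01, eta=0.01, steps=50, rng=None):
    """Illustrative unlearning of point i: fine-tune on the reduced dataset, then add
    Gaussian noise scaled by sens_i, a stand-in for a per-instance sensitivity bound."""
    rng = np.random.default_rng(1) if rng is None else rng
    Xr, yr = np.delete(X, i, axis=0), np.delete(y, i)
    for _ in range(steps):
        theta = theta - eta * ridge_grad(theta, Xr, yr, lam)
    return theta + sens_i * rng.standard_normal(theta.shape)
```

The point of the per-instance approach is visible in the last line: a data point with small individual sensitivity gets a small `sens_i`, hence much less noise than a worst-case bound would force.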