Forget What's Sensitive, Remember What Matters: Token-Level Differential Privacy in Memory Sculpting for Continual Learning

📅 2025-09-16
📈 Citations: 0 · Influential: 0
🤖 AI Summary
In continual learning (CL), the accumulation of historical data poses severe privacy leakage risks, while conventional uniform differential privacy (DP) mechanisms suffer from substantial utility degradation due to coarse-grained privacy budget allocation. To address this, we propose a semantic-aware, token-level dynamic DP mechanism: it adaptively allocates per-token DP budgets based on input semantic sensitivity and integrates a privacy-guided memory sculpting module that selectively forgets sensitive historical information during parameter updates—thereby jointly enhancing privacy preservation and mitigating catastrophic forgetting. To our knowledge, this is the first token-level DP framework for CL that incorporates dynamic budget allocation. Evaluated on multiple CL benchmarks, our method achieves strong privacy guarantees (ε ≤ 4) while improving average accuracy by 3.2–7.8% over baselines, demonstrating significantly superior privacy–utility trade-offs compared to existing approaches.

📝 Abstract
Continual Learning (CL) models, while adept at sequential knowledge acquisition, face significant and often overlooked privacy challenges as they accumulate diverse information. Traditional privacy methods, such as a uniform Differential Privacy (DP) budget, protect all data indiscriminately, leading to substantial model utility degradation and hindering CL deployment in privacy-sensitive areas. To overcome this, we propose a privacy-enhanced continual learning (PeCL) framework that forgets what's sensitive and remembers what matters. Our approach first introduces a token-level dynamic Differential Privacy strategy that adaptively allocates privacy budgets based on the semantic sensitivity of individual tokens. This ensures robust protection for private entities while minimizing noise injection for non-sensitive, general knowledge. Second, we integrate a privacy-guided memory sculpting module. This module leverages the sensitivity analysis from our dynamic DP mechanism to intelligently forget sensitive information from the model's memory and parameters, while explicitly preserving the task-invariant historical knowledge crucial for mitigating catastrophic forgetting. Extensive experiments show that PeCL achieves a superior balance between privacy preservation and model utility, outperforming baseline models by maintaining high accuracy on previous tasks while ensuring robust privacy.
Problem

Research questions and friction points this paper is trying to address.

Adaptively allocates token-level privacy budgets according to semantic sensitivity
Integrates privacy-guided memory sculpting to forget sensitive information
Balances privacy protection with model utility in continual learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Token-level dynamic Differential Privacy allocation
Adaptive privacy budgets driven by semantic sensitivity
Privacy-guided memory sculpting for forgetting
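The token-level dynamic allocation described above can be sketched in a few lines. Note that the paper's exact formulas are not reproduced on this page, so everything below is an illustrative assumption: the weighting scheme (a token's share of the total budget shrinks as its sensitivity score grows), the Laplace mechanism applied to token embeddings, and all function names are hypothetical.

```python
import random

def allocate_token_budgets(scores, total_eps, floor=0.05):
    """Split a total privacy budget across tokens. Higher semantic-sensitivity
    scores (in [0, 1]) receive a smaller share, i.e. a tighter per-token
    epsilon and therefore more noise. Hypothetical weighting scheme."""
    weights = [max(1.0 - s, floor) for s in scores]  # floor avoids a zero budget
    total_w = sum(weights)
    return [total_eps * w / total_w for w in weights]

def laplace_noise(scale):
    """Sample Laplace(0, scale) as the difference of two exponentials
    (stdlib only, no NumPy dependency)."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def perturb_token_embedding(embedding, eps, delta_f=1.0):
    """Laplace mechanism on one token embedding: noise scale = delta_f / eps,
    so a sensitive token (small eps) is perturbed more strongly."""
    scale = delta_f / eps
    return [x + laplace_noise(scale) for x in embedding]

# Example: three tokens, the first flagged as highly sensitive.
scores = [0.9, 0.1, 0.5]
budgets = allocate_token_budgets(scores, total_eps=4.0)  # sums to 4.0
noisy = perturb_token_embedding([0.0, 0.0, 0.0], eps=budgets[0])
```

Under this sketch, the per-token budgets sum to the overall guarantee reported in the summary (ε ≤ 4), and the most sensitive token gets the smallest epsilon, hence the strongest perturbation.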
Bihao Zhan (East China Normal University)
Jie Zhou (School of Computer Science and Technology, East China Normal University, Shanghai)
Junsong Li (East China Normal University)
Yutao Yang (School of Computer Science and Technology, East China Normal University, Shanghai)
Shilian Chen (School of Computer Science and Technology, East China Normal University, Shanghai)
Qianjun Pan (East China Normal University)
Xin Li (Shanghai AI Laboratory)
Wen Wu (School of Computer Science and Technology, East China Normal University, Shanghai)
Xingjiao Wu (East China Normal University)
Qin Chen (School of Computer Science and Technology, East China Normal University, Shanghai)
Hang Yan (The Chinese University of Hong Kong)
Liang He (School of Computer Science and Technology, East China Normal University, Shanghai)