Privacy on the Fly: A Predictive Adversarial Transformation Network for Mobile Sensor Data

📅 2025-11-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
Mobile sensor data, when accessed in real time by third-party applications, risks exposing sensitive user attributes (e.g., gender, identity). Existing privacy-preserving methods either sacrifice real-time performance (they require the full sequence before acting) or distort spatiotemporal semantics, degrading downstream tasks such as activity recognition. This paper proposes the first online predictive adversarial privacy protection framework: lightweight adversarial perturbations are predicted and generated *at the moment of data acquisition*, using historical signals rather than waiting for sequence completion. By integrating time-series forecasting with generative adversarial networks, and by applying gradient masking and adversarial training, the method achieves low-distortion, low-latency perturbation generation. Experiments show that the method achieves attack success rates of 40.11% and 44.65% against sensitive-attribute inference models and raises equal error rates to 41.65% and 46.22%, significantly outperforming baselines while preserving accuracy on downstream activity recognition.

📝 Abstract
Mobile motion sensors such as accelerometers and gyroscopes are now ubiquitously accessible to third-party apps via standard APIs. While enabling rich functionalities like activity recognition and step counting, this openness also permits unregulated inference of sensitive user traits, such as gender, age, and even identity, without user consent. Existing privacy-preserving techniques, such as GAN-based obfuscation or differential privacy, typically require access to the full input sequence, introducing latency that is incompatible with real-time scenarios. Worse, they tend to distort temporal and semantic patterns, degrading the utility of the data for benign tasks like activity recognition. To address these limitations, we propose the Predictive Adversarial Transformation Network (PATN), a real-time privacy-preserving framework that leverages historical signals to generate adversarial perturbations proactively. The perturbations are applied immediately upon data acquisition, enabling continuous protection without disrupting application functionality. Experiments on two datasets demonstrate that PATN substantially degrades the performance of privacy inference models, achieving Attack Success Rates (ASR) of 40.11% and 44.65% (reducing inference accuracy to near-random) and increasing the Equal Error Rate (EER) from 8.30% and 7.56% to 41.65% and 46.22%. On ASR, PATN outperforms baseline methods by 16.16% and 31.96%, respectively.
Problem

Research questions and friction points this paper is trying to address.

Protecting mobile sensor data privacy against unauthorized inference attacks
Enabling real-time privacy protection without disrupting application functionality
Addressing latency and utility limitations of existing privacy-preserving techniques
Innovation

Methods, ideas, or system contributions that make the work stand out.

Predictive Adversarial Transformation Network for real-time privacy
Generates adversarial perturbations using historical sensor data
Applies perturbations immediately to protect without disrupting utility
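The predict-then-perturb loop described above can be sketched minimally as follows. This is an illustrative toy, not the paper's architecture: the linear extrapolation stands in for PATN's time-series forecaster, the bounded random noise stands in for its adversarially trained GAN generator, and the names `EPSILON` and `protect_stream` are hypothetical. The one property the sketch does capture is the online constraint: the perturbation for sample *t* depends only on history, so it can be added the instant sample *t* arrives.

```python
import numpy as np

EPSILON = 0.05  # illustrative L-inf perturbation budget, not a value from the paper

def predict_next(history):
    # Toy stand-in for PATN's forecaster: linear extrapolation
    # of the next sample from the last two observed samples.
    return 2 * history[-1] - history[-2]

def generate_perturbation(predicted, rng):
    # Toy stand-in for the adversarial generator: bounded noise
    # shaped like the predicted sample. The real generator is
    # trained adversarially against a privacy-inference model.
    return np.clip(rng.normal(0.0, EPSILON, predicted.shape), -EPSILON, EPSILON)

def protect_stream(stream, rng):
    # Online loop: each incoming sample is released with its
    # perturbation immediately, using only past samples.
    history, out = [], []
    for sample in stream:
        if len(history) >= 2:
            delta = generate_perturbation(predict_next(history), rng)
        else:
            delta = np.zeros_like(sample)  # warm-up: no prediction yet
        out.append(sample + delta)
        history.append(sample)
    return np.asarray(out)

# Simulated 3-axis accelerometer stream (200 samples).
rng = np.random.default_rng(0)
stream = np.sin(np.linspace(0, 8 * np.pi, 200))[:, None] * np.ones(3)
protected = protect_stream(stream, rng)
```

Because the generator's output is clipped, the released signal never deviates from the raw signal by more than the budget, which is what keeps benign tasks such as activity recognition usable.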
👥 Authors
Tianle Song
Xi’an Jiaotong University, China
Chenhao Lin
Xi’an Jiaotong University, China
Yang Cao
Tokyo Institute of Technology, Japan
Zhengyu Zhao
Xi’an Jiaotong University, China (Adversarial Machine Learning, Computer Vision)
Jiahao Sun
Xi’an Jiaotong University, China
Chong Zhang
Xi’an Jiaotong University, China
Le Yang
Xi’an Jiaotong University, China
Chao Shen
Xi’an Jiaotong University, China