BlowPrint: Blow-Based Multi-Factor Biometrics for Smartphone User Authentication

📅 2025-07-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
Addressing the challenge of achieving high accuracy, usability, non-invasiveness, spoof resistance, and low computational overhead in multi-factor biometric authentication, this paper proposes the first behavioral biometric authentication method based on acoustic features of exhalation. Specifically, it captures unique airflow-induced acoustic signals generated when users exhale toward a smartphone screen using the built-in microphone, extracts time-frequency domain features, and performs score-level fusion with facial image-based physiological features. Its key innovation lies in modeling exhalation as a stable, discriminative behavioral biometric—offering natural interaction, strong liveness assurance, and minimal computational cost. Experimental results show that the exhalation subsystem achieves 99.35% accuracy, facial recognition attains 99.96%, and their fused system reaches 99.82%. The proposed approach significantly outperforms unimodal baselines in security, robustness, and user experience.
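The summary describes extracting time-frequency features from the blow-induced acoustic signal captured by the built-in microphone. The paper does not publish its exact feature pipeline, so the following is only a minimal illustrative sketch: frame the signal, take the magnitude FFT of each windowed frame, and average the resulting spectrogram into a handful of frequency bands to form a compact feature vector. All parameter values (frame length, hop size, band count) are assumptions, not the authors' settings.

```python
import numpy as np

def blow_features(signal, frame_len=256, hop=128, n_bands=16):
    """Toy time-frequency feature extractor (illustrative only):
    frame the signal, take the magnitude FFT of each Hann-windowed
    frame, then average the spectrogram into n_bands frequency bands."""
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len] * np.hanning(frame_len)
        frames.append(np.abs(np.fft.rfft(frame)))
    spec = np.array(frames)                      # shape: (n_frames, frame_len//2 + 1)
    bands = np.array_split(spec.mean(axis=0), n_bands)
    return np.array([b.mean() for b in bands])   # (n_bands,) feature vector

# Example: synthetic noise standing in for a recorded blow
rng = np.random.default_rng(0)
feat = blow_features(rng.normal(size=8000))
print(feat.shape)  # (16,)
```

A real system would compare such vectors across enrollment and authentication sessions with a similarity measure, as the abstract below outlines.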

📝 Abstract
Biometric authentication is a widely used security mechanism that leverages unique physiological or behavioral characteristics to authenticate users. In multi-factor biometrics (MFB), multiple biometric modalities, e.g., physiological and behavioral, are integrated to mitigate the limitations inherent in single-factor biometrics. The main challenge in MFB lies in identifying novel behavioral techniques capable of meeting critical criteria, including high accuracy, high usability, non-invasiveness, resilience against spoofing attacks, and low use of computational resources. Despite ongoing advancements, current behavioral biometric techniques often fall short of fulfilling one or more of these requirements. In this work, we propose BlowPrint, a novel behavioral biometric technique that allows us to authenticate users based on their phone blowing behaviors. In brief, we assume that the way users blow on a phone screen can produce distinctive acoustic patterns, which can serve as a unique biometric identifier for effective user authentication. It can also be seamlessly integrated with physiological techniques, such as facial recognition, to enhance its robustness and security. To assess BlowPrint's effectiveness, we conduct an empirical study involving 50 participants from whom we collect blow-acoustic and facial feature data. Subsequently, we compute the similarity scores of the two modalities using various similarity algorithms and combine them through score-level fusion. Finally, we compute the accuracy using a machine learning-based classifier. As a result, the proposed method demonstrates an accuracy of 99.35% for blow acoustics, 99.96% for facial recognition, and 99.82% for the combined approach. The experimental results demonstrate BlowPrint's high effectiveness in terms of authentication accuracy, spoofing attack resilience, usability, non-invasiveness, and other aspects.
Problem

Research questions and friction points this paper is trying to address.

Develops blow-based behavioral biometrics for smartphone authentication
Integrates acoustic and facial features for multi-factor security
Evaluates accuracy and resilience against spoofing attacks empirically
Innovation

Methods, ideas, or system contributions that make the work stand out.

Blow-based acoustic patterns for authentication
Integration with facial recognition for robustness
Score-level fusion of multi-modal biometrics
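Score-level fusion, named above as a contribution, combines the per-modality similarity scores rather than the raw features. A common recipe, sketched here as an assumption about the general technique (not the paper's exact fusion rule or weights), is to min-max normalize each modality's scores and take a weighted sum before thresholding:

```python
import numpy as np

def min_max_norm(scores):
    """Map raw similarity scores into [0, 1] via min-max normalization."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def fuse(blow_scores, face_scores, w_blow=0.5):
    """Weighted-sum score-level fusion of two normalized score sets.
    w_blow is an illustrative weight, not a value from the paper."""
    return w_blow * min_max_norm(blow_scores) + (1 - w_blow) * min_max_norm(face_scores)

blow = [0.62, 0.91, 0.15, 0.78]   # hypothetical blow-acoustic similarities
face = [0.70, 0.95, 0.20, 0.65]   # hypothetical facial similarities
fused = fuse(blow, face)
accepted = fused >= 0.5           # illustrative decision threshold
print(accepted)                   # [ True  True False  True]
```

In the paper, the fused scores feed a machine-learning classifier rather than a fixed threshold; the threshold here only stands in for that final decision step.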