🤖 AI Summary
Addressing the challenge of age estimation for occluded faces in unconstrained real-world scenarios, this paper proposes a robust multi-task framework integrating generative adversarial networks (GANs) and Transformers. Our key contributions are: (1) a novel SN-Patch GAN-based de-occlusion module that effectively reconstructs occluded facial regions; (2) an ARCM-Swin hybrid feature enhancement architecture that jointly models local texture details and global semantic context; and (3) a Multi-Task Age Head (MTAH) that unifies age regression with probabilistic age distribution learning, regularized by KL divergence. Extensive experiments demonstrate state-of-the-art performance: mean absolute errors (MAE) of 3.00, 4.54, and 2.53 years on the FG-NET, UTKFace, and MORPH benchmarks, respectively, surpassing prior methods. These results validate the proposed framework's superior robustness and accuracy under complex, realistic occlusion conditions.
📝 Abstract
Facial age estimation has achieved considerable success under controlled conditions. In unconstrained real-world ("in-the-wild") scenarios, however, it remains challenging, especially when faces are partially occluded. To address this limitation, we propose a new approach that integrates generative adversarial networks (GANs) and Transformer architectures to enable robust age estimation from occluded faces. We employ an SN-Patch GAN to effectively remove occlusions, while an Attentive Residual Convolution Module (ARCM), paired with a Swin Transformer, enhances feature representation. Additionally, we introduce a Multi-Task Age Head (MTAH) that combines regression and distribution learning, further improving age estimation under occlusion. Experimental results on the FG-NET, UTKFace, and MORPH datasets demonstrate that our proposed approach surpasses existing state-of-the-art techniques for occluded facial age estimation, achieving MAEs of $3.00$, $4.54$, and $2.53$ years, respectively.
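To make the MTAH objective concrete, the sketch below illustrates one common way such a multi-task loss can be formed: an MAE regression term plus a KL-divergence term between a Gaussian label distribution centred on the true age and the predicted age distribution. The Gaussian label construction, the bandwidth `sigma`, and the weighting `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def mtah_loss(pred_ages, pred_logits, true_ages, age_bins, sigma=2.0, lam=1.0):
    """Illustrative multi-task age loss (assumed form, not the paper's exact one):
    MAE regression + KL(label distribution || predicted distribution)."""
    # Regression branch: mean absolute error on the scalar age predictions.
    mae = np.mean(np.abs(pred_ages - true_ages))

    # Distribution branch: softmax over discrete age bins (numerically stable).
    logits = pred_logits - pred_logits.max(axis=1, keepdims=True)
    p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

    # Gaussian label distribution centred on each true age (label distribution learning).
    q = np.exp(-((age_bins[None, :] - true_ages[:, None]) ** 2) / (2.0 * sigma ** 2))
    q /= q.sum(axis=1, keepdims=True)

    # KL(q || p) regularizer, averaged over the batch.
    kl = np.mean(np.sum(q * (np.log(q + 1e-12) - np.log(p + 1e-12)), axis=1))
    return mae + lam * kl
```

In this formulation the distribution term softly penalizes predictions that are close but not exact, which complements the pointwise regression term; the actual network heads and loss weights would follow the paper's architecture.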