🤖 AI Summary
Skeleton-based action recognition relies heavily on large-scale manually annotated data, which is costly and slow to label. To address this, we propose a label-efficient framework that employs a reversible graph convolutional network (RevGCN) to construct a stable latent-space mapping, jointly modeling data representativeness, diversity, and uncertainty. We design an optimizable informativeness scoring function that actively selects high-quality samples for labeling. Our key contributions are: (i) the first integration of invertible neural networks into active learning for skeleton action recognition, ensuring bijective feature mappings and distribution fidelity; and (ii) unified multi-objective optimization of the sampling criteria. Experiments on NTU-RGB+D and NTU-120 demonstrate that our method matches fully supervised performance using only 30% of the labeled data, significantly outperforming state-of-the-art active and semi-supervised learning approaches while substantially reducing annotation overhead.
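The active-selection idea above can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's actual scoring function): uncertainty is taken as prediction entropy, representativeness as proximity to the feature centroid, and diversity as average pairwise distance, mixed with illustrative weights `w_rep`, `w_div`, `w_unc`.

```python
import numpy as np

def informativeness_scores(features, probs, w_rep=1.0, w_div=1.0, w_unc=1.0):
    """Score unlabeled samples by a weighted mix of representativeness,
    diversity, and uncertainty (weights are illustrative hyperparameters)."""
    eps = 1e-12
    # Uncertainty: Shannon entropy of the predicted class distribution.
    uncertainty = -np.sum(probs * np.log(probs + eps), axis=1)

    # Representativeness: closeness to the dataset centroid in feature space.
    centroid = features.mean(axis=0)
    dists = np.linalg.norm(features - centroid, axis=1)
    representativeness = 1.0 / (1.0 + dists)

    # Diversity: mean distance to all other samples (spread in feature space).
    pairwise = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=2)
    diversity = pairwise.mean(axis=1)

    # Normalize each criterion to [0, 1] before mixing.
    def norm(x):
        rng = x.max() - x.min()
        return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

    return (w_rep * norm(representativeness)
            + w_div * norm(diversity)
            + w_unc * norm(uncertainty))

def select_for_labeling(features, probs, budget):
    """Pick the `budget` highest-scoring samples to send to annotators."""
    scores = informativeness_scores(features, probs)
    return np.argsort(scores)[::-1][:budget]
```

In the paper this scoring function is itself learned and optimized jointly over the three criteria, rather than fixed as above.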
📝 Abstract
Skeleton-based action recognition is an active research topic in image processing. A key challenge of this task lies in its dependence on large, manually labeled datasets whose acquisition is costly and time-consuming. This paper devises a novel, label-efficient method for skeleton-based action recognition using graph convolutional networks (GCNs). The contribution of the proposed method resides in learning a novel acquisition function -- scoring the most informative subsets for labeling -- as the optimum of an objective function mixing data representativeness, diversity, and uncertainty. We also extend this approach by learning the most informative subsets using an invertible GCN, which maps data from the ambient space to a latent space where the inherent distribution of the data is more easily captured. Extensive experiments, conducted on two challenging skeleton-based recognition datasets, demonstrate the effectiveness of our label-frugal GCNs, which outperform the related work.
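The invertibility property underlying the ambient-to-latent mapping can be illustrated with a RevNet-style additive coupling block. This is a simplified sketch, not the paper's RevGCN: the graph convolutions are replaced by a placeholder tanh transform, but the bijection mechanism is the same, so the inverse recovers the input exactly.

```python
import numpy as np

class AdditiveCoupling:
    """Minimal additive coupling block: split the feature vector into two
    halves and update each half with a function of the other. The mapping is
    bijective by construction, so latent codes map back to the ambient space
    exactly; no information is lost."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        half = dim // 2
        self.W1 = rng.normal(scale=0.1, size=(half, half))
        self.W2 = rng.normal(scale=0.1, size=(half, half))

    def _f(self, x, W):
        # Placeholder sub-network; a GCN layer would go here in the real model.
        return np.tanh(x @ W)

    def forward(self, x):
        x1, x2 = np.split(x, 2, axis=-1)
        y1 = x1 + self._f(x2, self.W1)
        y2 = x2 + self._f(y1, self.W2)
        return np.concatenate([y1, y2], axis=-1)

    def inverse(self, y):
        # Undo the two additive updates in reverse order.
        y1, y2 = np.split(y, 2, axis=-1)
        x2 = y2 - self._f(y1, self.W2)
        x1 = y1 - self._f(x2, self.W1)
        return np.concatenate([x1, x2], axis=-1)
```

Because `inverse(forward(x)) == x` up to floating-point error, distribution estimates computed in the latent space remain faithful to the ambient data, which is the property the acquisition function exploits.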