GarmentTracking: Category-Level Garment Pose Tracking

πŸ“… 2023-03-24
πŸ›οΈ Computer Vision and Pattern Recognition
πŸ“ˆ Citations: 12
✨ Influential: 1
πŸ“„ PDF
πŸ€– AI Summary
This work addresses the challenging problem of category-level non-rigid garment pose estimation and real-time 3D tracking. Methodologically, we propose the first end-to-end online 3D tracking framework, which: (1) employs a point-cloud-based deep network to jointly regress full non-rigid poses in both canonical and task-specific spaces; (2) introduces the VR-Garment acquisition system and the VR-Folding dataset, the first benchmark capturing complex manipulation-induced deformations such as folding and flattening; and (3) incorporates a real-time non-rigid deformation modeling mechanism. Extensive experiments demonstrate that our method maintains high accuracy and real-time frame rates under large-scale deformations, outperforming the existing baseline in both tracking precision and speed. To foster reproducibility and further research, we release all code and datasets publicly.
πŸ“ Abstract
Garments are important to humans. A visual system that can estimate and track the complete garment pose can be useful for many downstream tasks and real-world applications. In this work, we present a complete package to address the category-level garment pose tracking task: (1) A recording system, VR-Garment, with which users can manipulate virtual garment models in simulation through a VR interface. (2) A large-scale dataset, VR-Folding, with complex garment pose configurations arising in manipulation tasks like flattening and folding. (3) An end-to-end online tracking framework, GarmentTracking, which predicts complete garment pose in both canonical space and task space given a point cloud sequence. Extensive experiments demonstrate that the proposed GarmentTracking achieves strong performance even when the garment undergoes large non-rigid deformation. It outperforms the baseline approach in both speed and accuracy. We hope our proposed solution can serve as a platform for future research. Code and datasets are available at https://garment-tracking.robotflow.ai.
Problem

Research questions and friction points this paper is trying to address.

Tracking complete garment pose for real-world applications
Handling large non-rigid deformations in garment movements
Improving speed and accuracy in garment pose estimation
Innovation

Methods, ideas, or system contributions that make the work stand out.

VR-Garment system for virtual garment manipulation
VR-Folding dataset with complex garment poses
End-to-end GarmentTracking framework for pose prediction
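The paper's framework is not detailed on this page, but the core idea of online tracking from a point cloud sequence can be illustrated with a minimal sketch: each frame, the previous garment vertex estimate is refined against the newly observed point cloud. The nearest-neighbor update below is a hypothetical stand-in for the paper's learned refinement network, using made-up names (`track_frame`, `blend`) for illustration only.

```python
import numpy as np

def track_frame(prev_verts, cloud, blend=0.5):
    """One hypothetical tracking step: pull each estimated garment
    vertex toward its nearest observed point. This is a crude
    geometric stand-in for GarmentTracking's learned per-frame
    pose refinement, not the paper's actual method."""
    # Pairwise distances between estimated vertices (V, 3) and
    # observed cloud points (N, 3).
    d = np.linalg.norm(prev_verts[:, None, :] - cloud[None, :, :], axis=-1)
    nearest = cloud[d.argmin(axis=1)]  # (V, 3) closest observed point each
    # Blend the old estimate toward the correspondences.
    return (1.0 - blend) * prev_verts + blend * nearest

# Toy usage: refine a perturbed 4-vertex "garment" over 3 noisy frames.
gt = np.array([[0.0, 0.0, 0.0],
               [1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0]])          # ground-truth vertices
est = gt + 0.1                            # perturbed initial estimate
rng = np.random.default_rng(0)
for _ in range(3):
    cloud = gt + rng.normal(0.0, 0.001, gt.shape)  # noisy observation
    est = track_frame(est, cloud)
```

In the actual system, the correspondence and update steps are replaced by a point-cloud network that regresses the full non-rigid pose in canonical and task space, which is what lets it survive the large deformations where pure nearest-neighbor matching fails.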
Authors

Han Xue — Shanghai Qi Zhi Institute, Shanghai Jiao Tong University
Wenqiang Xu — Shanghai Jiao Tong University (Computer vision, Robotics)
Jieyi Zhang — Shanghai Jiao Tong University
Tutian Tang — Shanghai Jiao Tong University (Robotics)
Yutong Li — Shanghai Jiao Tong University
Wenxin Du — Shanghai Jiao Tong University
Ruolin Ye — Cornell University (Human-robot interaction, Robotics, Simulation)
Cewu Lu — Qing Yuan Research Institute and MoE Key Lab of Artificial Intelligence, AI Institute, Shanghai Jiao Tong University, China and Shanghai Qi Zhi Institute