🤖 AI Summary
Heterogeneity in tactile sensor structures, materials, and transduction principles creates domain gaps that prevent force perception models from transferring across devices, severely limiting generalizability. To address this, the authors propose GenForce, a framework built on a marker-to-marker tactile signal translation paradigm. It uniformly encodes multi-source tactile signals as binary marker images and employs a lightweight generative cross-domain translation network trained with few-shot paired supervision, allowing existing force labels to transfer to uncalibrated target sensors. The method achieves cross-sensor adaptation with as few as five paired samples, reducing average force prediction error by 47% across six heterogeneous tactile sensors. This substantially lowers manual calibration overhead and, for the first time, enables generalized force perception across sensors with disparate geometries, material compositions, and sensing principles.
📝 Abstract
Robotic tactile sensors, including vision-based and taxel-based sensors, enable agile manipulation and safe human-robot interaction through force sensation. However, variations in structural configurations, measured signals, and material properties create domain gaps that limit the transferability of learned force sensation across different tactile sensors. Here, we introduce GenForce, a general framework for achieving transferable force sensation across both homogeneous and heterogeneous tactile sensors in robotic systems. By unifying tactile signals into marker-based binary tactile images, GenForce enables the transfer of existing force labels to arbitrary target sensors using a marker-to-marker translation technique with only a few paired samples. This process equips uncalibrated tactile sensors with force prediction capabilities through spatiotemporal force prediction models trained on the transferred data. Extensive experimental results validate GenForce's generalizability, accuracy, and robustness across sensors with diverse marker patterns, structural designs, material properties, and sensing principles. The framework significantly reduces the need for costly and labor-intensive labeled data collection, enabling the rapid deployment of multiple tactile sensors on robotic hands requiring force sensing capabilities.
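The data flow the abstract describes — unify raw tactile signals into binary marker images, translate them into the target sensor's marker domain, then reuse the source sensor's force labels — can be sketched as below. This is a minimal illustration only: the thresholding encoder, function names, and the `translate` callable (standing in for the trained marker-to-marker translation network) are assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

def to_binary_marker_image(signal, threshold=0.5):
    """Unify a raw tactile signal (any 2D array) into a binary marker image.

    Hypothetical encoder: simple thresholding stands in for whatever
    marker/taxel extraction a given sensor actually requires.
    """
    return (np.asarray(signal, dtype=float) > threshold).astype(np.uint8)

def transfer_force_labels(source_images, source_forces, translate):
    """Re-render source marker images in the target sensor's domain using a
    marker-to-marker translation model (trained on a few paired samples),
    so existing force labels carry over to the uncalibrated target sensor.
    """
    return [(translate(img), force)
            for img, force in zip(source_images, source_forces)]

# Usage sketch: an identity mapping stands in for the trained network.
raw = np.array([[0.1, 0.9],
                [0.7, 0.2]])
binary = to_binary_marker_image(raw)
labeled_pairs = transfer_force_labels([binary], [1.5], lambda img: img)
```

The resulting `(target_image, force)` pairs would then serve as training data for a spatiotemporal force prediction model on the target sensor.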