🤖 AI Summary
This paper introduces Referring Camouflaged Object Detection (Ref-COD), a new task that aims at precise pixel-level segmentation of a specified camouflaged object, given only a few referring images containing salient objects. To support this task, the authors construct R2C7K, the first large-scale benchmark dataset for Ref-COD, comprising 7K images covering 64 object categories in real-world scenarios, with fine-grained masks. They further propose R2CNet, a simple but strong end-to-end dual-branch framework: (i) a Referring Mask Generation module provides pixel-level prior guidance, and (ii) a Referring Feature Enrichment module strengthens the recognition of the specified camouflaged objects. Extensive experiments on R2C7K show that R2CNet clearly outperforms conventional COD methods in both segmenting specified camouflaged objects and localizing their main body. The code and dataset are publicly available.
📝 Abstract
We consider the problem of referring camouflaged object detection (Ref-COD), a new task that aims to segment specified camouflaged objects based on a small set of referring images with salient target objects. We first assemble a large-scale dataset, called R2C7K, which consists of 7K images covering 64 object categories in real-world scenarios. Then, we develop a simple but strong dual-branch framework, dubbed R2CNet, with a reference branch embedding the common representations of target objects from referring images and a segmentation branch identifying and segmenting camouflaged objects under the guidance of these common representations. In particular, we design a Referring Mask Generation module to generate a pixel-level prior mask and a Referring Feature Enrichment module to enhance the capability of identifying specified camouflaged objects. Extensive experiments show the superiority of our Ref-COD methods over their COD counterparts in segmenting specified camouflaged objects and identifying the main body of target objects. Our code and dataset are publicly available at https://github.com/zhangxuying1004/RefCOD.
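To make the dual-branch idea concrete, the sketch below illustrates one plausible reading of it in plain NumPy: a reference branch pools features from several referring images into a single common representation, and a segmentation branch correlates that representation with per-pixel image features to produce a pixel-level prior mask that modulates the segmentation features. All function names, feature shapes, and the cosine-similarity fusion are illustrative assumptions, not the authors' R2CNet implementation.

```python
import numpy as np

def common_representation(ref_feats):
    """Pool K referring-image feature vectors (K, C) into one unit-norm (C,) vector.
    Hypothetical stand-in for the reference branch's embedding step."""
    v = ref_feats.mean(axis=0)
    return v / (np.linalg.norm(v) + 1e-8)

def prior_mask(img_feats, rep):
    """Cosine-similarity map between per-pixel features (H, W, C) and rep (C,),
    rescaled to [0, 1] as a rough pixel-level prior mask."""
    norms = np.linalg.norm(img_feats, axis=-1) + 1e-8
    sim = (img_feats @ rep) / norms   # (H, W), values in [-1, 1]
    return (sim + 1.0) / 2.0

def guided_features(img_feats, mask):
    """Modulate segmentation features with the prior mask (broadcast over channels)."""
    return img_feats * mask[..., None]

rng = np.random.default_rng(0)
ref_feats = rng.standard_normal((5, 16))      # 5 referring images, C = 16
img_feats = rng.standard_normal((8, 8, 16))   # 8x8 feature map of the camouflaged image

rep = common_representation(ref_feats)
mask = prior_mask(img_feats, rep)
fused = guided_features(img_feats, mask)
print(mask.shape, fused.shape)  # (8, 8) (8, 8, 16)
```

In the actual framework these steps are learned end-to-end inside the Referring Mask Generation and Referring Feature Enrichment modules; this sketch only conveys how a reference-derived representation can guide pixel-level segmentation.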