🤖 AI Summary
High-quality underwater robotic perception data for marine aquaculture is critically scarce. Method: This work introduces the first open-source, multimodal dataset specifically designed for offshore aquaculture, capturing the full spectrum of manual and autonomous inspection operations in real fish farms. It systematically acquires and annotates synchronized, multi-source data of intact net cages under realistic marine conditions—including live fish motion and biofouling—using a comprehensive sensor suite: Doppler Velocity Log (DVL), Ultra-Short Baseline (USBL), multibeam sonar, monocular/stereo cameras, IMU, pressure, and temperature sensors. High-precision time synchronization, acoustic-optical co-calibration, and sea-state-adaptive collaborative operation ensure data fidelity. Contribution/Results: The dataset comprises over 10 TB of high-fidelity data, establishing the first benchmark for underwater aquaculture robotics perception and navigation. It has enabled advances in SLAM, net integrity assessment, biofouling detection, and autonomous navigation, and is actively used by multiple international research groups for algorithm benchmarking.
📝 Abstract
This paper presents a dataset gathered with an underwater robot in a sea-based aquaculture setting. The data were collected at an operational fish farm and include measurements from sensors such as the Waterlinked A50 DVL, the Nortek Nucleus 1000 DVL, the Sonardyne Micro Ranger 2 USBL, the Sonoptix Multibeam Sonar, and monocular and stereo cameras, together with vehicle sensor data such as power usage, IMU, pressure, temperature, and more. Data acquisition was performed during both manual and autonomous traversal of the net pen structure. The collected vision data show undamaged nets with some fish and marine growth present, and both the research community and the aquaculture industry are expected to benefit from the proposed SOLAQUA dataset.
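Working with synchronized multi-sensor streams like these typically requires pairing measurements from sensors that sample at different rates. As a minimal sketch (not the authors' pipeline; the timestamps, rates, and tolerance below are hypothetical), one simple approach is nearest-timestamp matching between two sorted streams:

```python
from bisect import bisect_left

def nearest_match(ts_a, ts_b, tol):
    """For each timestamp in sorted ts_a, find the nearest timestamp in
    sorted ts_b; keep the pair only if the gap is within tol."""
    pairs = []
    for t in ts_a:
        i = bisect_left(ts_b, t)
        # The nearest neighbor is either just before or at the insertion point
        candidates = [c for c in (i - 1, i) if 0 <= c < len(ts_b)]
        best = min(candidates, key=lambda c: abs(ts_b[c] - t))
        if abs(ts_b[best] - t) <= tol:
            pairs.append((t, ts_b[best]))
    return pairs

# Hypothetical timestamps in milliseconds: camera frames vs. DVL pings
camera_ms = [0, 100, 200, 300, 400]
dvl_ms = [20, 150, 270, 410]
print(nearest_match(camera_ms, dvl_ms, tol=50))
# → [(0, 20), (100, 150), (200, 150), (300, 270), (400, 410)]
```

For hardware-synchronized data such as this dataset's, the residual offsets should be small, so a tight tolerance mainly serves to drop frames recorded during sensor dropouts.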