🤖 AI Summary
This work investigates non-adaptive combinatorial group testing under adversarial deletions of test outcomes, with the goal of robustly identifying a small number of defective items in a large population. Such deletions cause asynchrony between tests, a failure mode that arises in wireless communication and data storage systems. Methodologically, the work introduces a construction of deletion-resilient testing matrices grounded in combinatorial design and coding theory, together with a polynomial-time recovery algorithm. Theoretically, it characterizes necessary and sufficient conditions for exactly identifying the defective items even after the adversarial deletion of a bounded number of test outputs, and proves that the proposed scheme meets these conditions. The framework thus pairs recovery guarantees with computational efficiency.
📝 Abstract
The study of group testing aims to develop strategies that identify a small set of defective items in a large population using a small number of pooled tests. Established techniques have proved highly beneficial in a broad spectrum of applications, ranging from channel communication to efficiently identifying COVID-19-infected individuals. Despite significant research on group testing and its variants since the 1940s, testing strategies robust to deletion noise have yet to be studied. Many practical systems exhibit deletion errors, for instance, wireless communication and data storage systems. Such deletions of test outcomes lead to asynchrony between the tests, which current group testing strategies cannot handle. In this work, we initiate the study of non-adaptive group testing strategies resilient to deletion noise. We characterize the necessary and sufficient conditions for successfully identifying the defective items even after the adversarial deletion of some test outputs. We also provide constructions of testing matrices along with an efficient recovery algorithm.
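To make the asynchrony problem concrete, here is a minimal sketch of classical non-adaptive group testing with the standard COMP decoder, followed by a demonstration of how deleting a single test outcome shifts all later outcomes and breaks decoding. This is illustrative only: the testing matrix (constant-weight-2 columns over 4 tests, which is 1-disjunct and so handles one defective), the item/test counts, and all function names are assumptions of this sketch, not the paper's construction.

```python
from itertools import combinations

# t = 4 pooled tests; each item's column has exactly two 1s, so no column
# is contained in another (a 1-disjunct matrix: exact recovery of 1 defective).
t = 4
columns = list(combinations(range(t), 2))  # 6 items; columns[i] = tests pooling item i
n = len(columns)

def run_tests(defectives):
    """Outcome of each pooled test: positive iff the pool contains a defective."""
    return [any(row in columns[i] for i in defectives) for row in range(t)]

def comp_decode(outcomes):
    """COMP decoder: declare item i defective iff every test containing i is positive."""
    return [i for i in range(n) if all(outcomes[row] for row in columns[i])]

defectives = [4]                      # item 4 is pooled into tests {1, 3}
y = run_tests(defectives)             # [False, True, False, True]
assert comp_decode(y) == defectives   # exact recovery when all outcomes arrive

# Deletion noise: losing one outcome shifts the remaining ones, so the decoder
# attributes outcome k to test k and misidentifies the defective set.
y_deleted = y[1:]                     # outcome of test 0 adversarially deleted
misaligned = y_deleted + [False]      # decoder still expects t outcomes
assert comp_decode(misaligned) != defectives
```

The second assertion is the point of the abstract: unlike a flipped (noisy) outcome, a deleted outcome desynchronises every subsequent test, which is why erasure- or error-correcting testing matrices do not directly apply.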