🤖 AI Summary
This paper addresses the Nonogram (logic grid) puzzle-solving problem by proposing the first end-to-end differentiable neural solver. Methodologically, row- and column-wise constraints are embedded directly into the training objective via a differentiable block-length matching loss, a soft Boolean logic encoding, and a progressive confidence distillation mechanism, enabling joint optimization of grid prediction and rule consistency. The key contribution is a paradigm shift from traditional backtracking search to a backtrack-free, interpretable, and fully differentiable solving framework that supports zero-shot generalization to unseen puzzle sizes. On standard benchmarks, the solver achieves 98.7% accuracy, runs 42× faster at inference time than classical backtracking algorithms, and demonstrates strong robustness to input noise.
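To make the core idea concrete, here is a minimal, hypothetical sketch of how a soft Boolean encoding keeps Nonogram constraints differentiable. The operator definitions (product t-norm) and the `row_mass_loss` surrogate are illustrative assumptions, not the paper's actual block-length matching loss, which is necessarily richer: this toy version only penalizes the expected number of filled cells in a row for deviating from the total implied by the clues.

```python
import numpy as np

# Soft Boolean operators over probabilities in [0, 1] (product logic).
# Hard {0, 1} cell values are relaxed so gradients can flow through
# the constraint terms. These definitions are an illustrative choice.
def soft_and(a, b):
    return a * b

def soft_or(a, b):
    return a + b - a * b

def soft_not(a):
    return 1.0 - a

def row_mass_loss(p_row, clues):
    """Toy relaxed row constraint: the expected count of filled cells
    should equal the sum of the clue block lengths. This is a far
    weaker surrogate than a true block-length matching loss; it is
    shown only to demonstrate a differentiable constraint penalty."""
    return float((p_row.sum() - sum(clues)) ** 2)

# Example: a 5-cell row whose clue is [3] (a single block of length 3).
p = np.array([0.1, 0.8, 0.9, 0.85, 0.2])
penalty = row_mass_loss(p, [3])  # small but nonzero: row mass is 2.85
```

Because every operation above is smooth in the cell probabilities, a penalty like this can be summed over all rows and columns and minimized by gradient descent alongside the grid-prediction loss, which is the backtrack-free training recipe the summary describes.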