🤖 AI Summary
To address the low training efficiency in optical neural network acceleration, this paper proposes a hardware-in-the-loop forward-only training method tailored for 4f optical correlators—the first co-design integrating forward-only learning algorithms with optical hardware. By eliminating the backward pass of gradient computation required by conventional backpropagation, the method achieves O(n²) training complexity, a log n improvement over the O(n² log n) complexity of standard backpropagation. The approach jointly respects the physical constraints of the optical system—such as non-negativity, limited dynamic range, and spatial bandwidth limitations—and the architectural characteristics of CNNs, enabling end-to-end optical-CNN co-optimization. Evaluated on MNIST, the method attains 87.6% classification accuracy (versus 88.8% for the backpropagation baseline), demonstrating a significant training speedup without substantial accuracy degradation. This work establishes a new paradigm for energy-efficient, high-throughput optical neural network training.
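The paper's exact update rule is not reproduced here, but the core idea of forward-only training—updating each layer from a purely local objective evaluated on forward passes, with no backward pass through the network—can be sketched as below. This is a minimal, hypothetical Forward-Forward-style example on synthetic data (the layer size, `goodness` objective, threshold `theta`, and learning rate are all illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single layer whose weights are trained from forward passes only.
W = rng.normal(scale=0.1, size=(16, 4))

def goodness(x):
    """Forward pass plus the layer-local objective: sum of squared activations."""
    h = np.maximum(x @ W, 0.0)  # ReLU activations
    return h, (h ** 2).sum(axis=1)

def forward_only_step(x_pos, x_neg, lr=0.05, theta=2.0):
    """One forward-only update: raise goodness on 'positive' inputs and lower
    it on 'negative' ones, using only quantities local to this layer -- no
    gradients are propagated backward from any other layer."""
    global W
    for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
        h, g = goodness(x)
        # Logistic margin on the local objective; grad of goodness w.r.t. W.
        p = 1.0 / (1.0 + np.exp(-sign * (g - theta)))
        grad = x.T @ (((1.0 - p) * sign)[:, None] * 2.0 * h)
        W += lr * grad / len(x)

# Synthetic positive/negative inputs drawn from shifted Gaussians.
x_pos = rng.normal(loc=+0.5, size=(64, 16))
x_neg = rng.normal(loc=-0.5, size=(64, 16))

g0 = goodness(x_pos)[1].mean() - goodness(x_neg)[1].mean()
for _ in range(50):
    forward_only_step(x_pos, x_neg)
g1 = goodness(x_pos)[1].mean() - goodness(x_neg)[1].mean()
print(g1 > g0)  # the goodness gap should widen after training
```

Because each update uses only one forward evaluation per layer, the per-step cost is dominated by the forward pass itself—the source of the claimed log n complexity saving over backpropagation in the 4f optical setting.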
📝 Abstract
This work evaluates a forward-only learning algorithm on the MNIST dataset with hardware-in-the-loop training of a 4f optical correlator, achieving 87.6% accuracy with O(n²) complexity, compared to backpropagation, which achieves 88.8% accuracy with O(n² log n) complexity.