🤖 AI Summary
To address the low efficiency and poor stability of phase retrieval in low-light, dynamic scenes under coherent illumination, this work proposes a defocus-event-based phase reconstruction method leveraging an event-based vision sensor (EVS). By precisely scanning the optical system along the optical axis, the EVS captures logarithmic intensity-change events induced by defocus, exploiting its high temporal resolution and logarithmic response. We formulate the first event-domain propagation equation, establishing a linear partial differential relationship between the defocus event stream and the object-plane phase distribution. A dedicated event-data inversion algorithm is then designed to enable rapid and stable phase reconstruction. Experiments demonstrate that, under extremely low illumination, the proposed method achieves a 3.2× improvement in phase recovery accuracy and a 5.8× acceleration in convergence speed over conventional frame-based sensors. This work establishes the first event-driven framework for coherent diffraction imaging phase retrieval.
📝 Abstract
To time-efficiently and stably acquire the intensity information for phase retrieval under a coherent illumination, we leverage an event-based vision sensor (EVS) that can detect changes in logarithmic intensity at the pixel level with a wide dynamic range. In our optical system, we translate the EVS along the optical axis, where the EVS records the intensity changes induced by defocus as events. To recover phase distributions, we formulate a partial differential equation, referred to as the transport of event equation, which presents a linear relationship between the defocus events and the phase distribution. We demonstrate through experiments that the EVS is more advantageous than the conventional image sensor for rapidly and stably detecting the intensity information, defocus events, which enables accurate phase retrieval, particularly under low-lighting conditions.