Transport of Event Equation: Phase Retrieval from Defocus Events

📅 2025-10-03
🤖 AI Summary
To address the low efficiency and poor stability of phase retrieval in low-light, dynamic scenes under coherent illumination, this work proposes a defocus-event-based phase reconstruction method built on an event-based vision sensor (EVS). By translating the EVS along the optical axis, the system captures logarithmic intensity changes induced by defocus as events, exploiting the sensor's high temporal resolution and logarithmic response. The authors formulate the first event-domain propagation equation, the transport of event equation, which establishes a linear partial differential relationship between the defocus event stream and the object-plane phase distribution, and design a dedicated event-data inversion algorithm for rapid and stable phase reconstruction. Experiments demonstrate that, under extremely low illumination, the proposed method achieves a 3.2× improvement in phase recovery accuracy and a 5.8× faster convergence compared with a conventional frame-based sensor, establishing an event-driven framework for phase retrieval in coherent diffraction imaging.

📝 Abstract
To time-efficiently and stably acquire the intensity information for phase retrieval under coherent illumination, we leverage an event-based vision sensor (EVS) that can detect changes in logarithmic intensity at the pixel level with a wide dynamic range. In our optical system, we translate the EVS along the optical axis, where the EVS records the intensity changes induced by defocus as events. To recover phase distributions, we formulate a partial differential equation, referred to as the transport of event equation, which presents a linear relationship between the defocus events and the phase distribution. We demonstrate through experiments that the EVS is more advantageous than the conventional image sensor for rapidly and stably detecting the intensity information (defocus events), which enables accurate phase retrieval, particularly under low-lighting conditions.
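The exact form of the transport of event equation is not reproduced on this page. As a hedged sketch of how such a linear relation can arise, one can start from the standard transport-of-intensity equation (TIE) and pass to logarithmic intensity, which is the quantity an EVS responds to:

```latex
% Standard TIE, with k = 2\pi/\lambda and \nabla_\perp the transverse gradient:
k \frac{\partial I}{\partial z} = -\nabla_\perp \cdot \left( I \, \nabla_\perp \phi \right)
% Dividing by I (where I > 0) and assuming a slowly varying intensity yields
% a relation in the log-intensity that the event stream samples directly:
k \frac{\partial \ln I}{\partial z} \approx -\nabla_\perp^{2} \phi
```

Under this reading, the defocus events accumulated over an axial step Δz approximate the left-hand side, giving a linear (Poisson-type) equation for the phase φ. Note this derivation is an assumption based on the TIE, not the paper's stated formulation.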
Problem

Research questions and friction points this paper is trying to address.

Developing rapid phase retrieval using event-based vision sensors
Establishing linear relationship between defocus events and phase
Enabling accurate phase recovery under low-lighting conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Event-based vision sensor detects defocus intensity changes
Transport of event equation links events to phase
Optical axis translation enables stable phase retrieval
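The paper's actual inversion algorithm is not described on this page. As an illustrative sketch only, the following assumes the accumulated event map approximates the axial derivative of log-intensity, and inverts a TIE-like relation with an FFT-based Poisson solver under a uniform-intensity assumption; the function name and parameters are hypothetical:

```python
import numpy as np

def phase_from_defocus_events(event_map, dz, wavelength, pixel_pitch):
    """Hypothetical sketch: recover phase from an accumulated defocus-event map.

    Assumes event_map ~ (d ln I / dz) * dz, accumulated while translating the
    sensor by dz along the optical axis. Under a uniform-intensity assumption,
    k * d(ln I)/dz = -laplacian(phi), with k = 2*pi/wavelength, which is
    inverted here with an FFT-based Poisson solver.
    """
    k = 2.0 * np.pi / wavelength
    H, W = event_map.shape
    fy = np.fft.fftfreq(H, d=pixel_pitch)
    fx = np.fft.fftfreq(W, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    q2 = (2.0 * np.pi) ** 2 * (FX ** 2 + FY ** 2)  # |q|^2 for the Laplacian
    q2[0, 0] = np.inf  # suppress the unobservable DC (piston) component
    dlnI_dz = event_map / dz
    phi_hat = k * np.fft.fft2(dlnI_dz) / q2
    return np.real(np.fft.ifft2(phi_hat))
```

The DC term is set aside because a constant phase offset (piston) produces no intensity change and is therefore unrecoverable from events; any regularization of low spatial frequencies beyond this is a design choice not taken from the paper.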
Kaito Hori
Department of Information and Communication Engineering, Nagoya University, Furo-cho, Chikusa-ku, Nagoya, 464-8603, Japan

Chihiro Tsutake
Department of Information and Communication Engineering, Nagoya University, Furo-cho, Chikusa-ku, Nagoya, 464-8603, Japan

Keita Takahashi
Associate Professor, Nagoya University, Japan
Image Processing, Computer Vision

Toshiaki Fujii
Nagoya University
Image Processing