🤖 AI Summary
This work addresses the challenging problem of single-view, non-intrusive reconstruction of indoor 3D airflow fields. We propose a novel method integrating Background-Oriented Schlieren (BOS) imaging with Physics-Informed Neural Networks (PINNs). By modeling the optical distortions of projected patterns induced by refractive index variations in inhomogeneous airflow, our approach jointly reconstructs the 3D refractive index field and enforces physical consistency via the buoyancy-driven Navier–Stokes equations. We design a differentiable ray-tracing and physics-driven rendering pipeline, incorporating PINN-based regularization derived from the governing PDEs and a multi-physics loss function coupling velocity, density, and optical measurements. Our method overcomes the ill-posedness inherent in single-view tomography, enabling high-fidelity, calibration-free, tracer-free 3D reconstruction of velocity and density fields. Experiments demonstrate accurate recovery of complex indoor flow features, including thermal plumes and vortical structures, with significant improvements in reconstruction accuracy and physical fidelity.
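The PINN-based regularization described above penalizes the residual of the governing PDEs evaluated on the reconstructed fields. As a minimal illustrative sketch (not the paper's implementation), the snippet below evaluates the incompressibility residual ∂u/∂x + ∂v/∂y of a 2-D velocity field by finite differences and forms the mean-squared physics loss that would be added to the data terms; the field, grid, and function names are all assumptions for illustration.

```python
import numpy as np

def continuity_residual(u, v, dx, dy):
    """Finite-difference residual of the 2-D continuity equation
    du/dx + dv/dy = 0 (one term of a PINN-style physics loss)."""
    du_dx = np.gradient(u, dx, axis=1)  # x varies along axis 1
    dv_dy = np.gradient(v, dy, axis=0)  # y varies along axis 0
    return du_dx + dv_dy

# Synthetic divergence-free test field: u = sin(x)cos(y), v = -cos(x)sin(y)
n = 64
x = np.linspace(0.0, np.pi, n)
y = np.linspace(0.0, np.pi, n)
X, Y = np.meshgrid(x, y)  # default indexing: Y along axis 0, X along axis 1
u = np.sin(X) * np.cos(Y)
v = -np.cos(X) * np.sin(Y)

res = continuity_residual(u, v, x[1] - x[0], y[1] - y[0])
physics_loss = np.mean(res**2)  # near zero for a divergence-free field
```

In a full PINN pipeline this scalar would be one of several residual terms (momentum, energy, optical re-projection) weighted against the measurement loss; here it only demonstrates the mechanics of a physics residual.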
📝 Abstract
We develop a framework for non-invasive volumetric indoor airflow estimation from a single viewpoint using background-oriented schlieren (BOS) measurements and physics-informed reconstruction. Our framework uses a light projector that casts a pattern onto a target back wall and a camera that observes small distortions in the projected pattern. While the single-view BOS tomography problem is severely ill-posed, our proposed framework addresses this using: (1) an improved ray-tracing scheme, (2) a physics-based light-rendering approach and loss formulation, and (3) physics-based regularization via a physics-informed neural network (PINN) that ensures the reconstructed airflow is consistent with the governing equations for buoyancy-driven flows.
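The BOS measurement principle underlying the framework can be sketched with a straight-ray forward model: the apparent shift of the background pattern is proportional to the path integral of the transverse refractive-index gradient along each ray, ε_y ≈ (1/n₀) ∫ ∂n/∂y dz. The toy example below (all values and the synthetic "plume" are illustrative assumptions, not the paper's pipeline) integrates this deflection through a Gaussian refractive-index perturbation such as hot air would produce.

```python
import numpy as np

# Straight-ray BOS forward model sketch:
#   eps_y(y) ≈ (1/n0) * integral of dn/dy along the viewing direction z.
n0 = 1.000277  # approximate refractive index of ambient air (assumed)

# Synthetic refractive-index perturbation: a Gaussian "thermal plume".
# Hot air is less dense, so n drops below ambient inside the plume.
nz, ny = 128, 128
z = np.linspace(-1.0, 1.0, nz)
y = np.linspace(-1.0, 1.0, ny)
Z, Y = np.meshgrid(z, y, indexing="ij")  # z along axis 0, y along axis 1
dn = -1e-5 * np.exp(-(Y**2 + Z**2) / 0.1)

dy = y[1] - y[0]
dz = z[1] - z[0]
dn_dy = np.gradient(dn, dy, axis=1)        # transverse gradient dn/dy
eps_y = dn_dy.sum(axis=0) * dz / n0        # Riemann-sum path integral

# The plume bends rays away from its centre, so eps_y is antisymmetric in y.
```

Inverting this relation from a single camera view, for the full 3-D field, is exactly the ill-posed problem the three listed components are designed to regularize.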