🤖 AI Summary
This work addresses the challenges of low computational efficiency and poor generalization in multi-task optimization of parameterized differential-algebraic equations (DAEs), where strong coupling between objectives and constraints impedes effective solution strategies. To overcome this, we propose a dual-coupled physics-informed neural network (PINN) architecture that introduces slack variables with global error bounds to rigorously decouple constraints from the objective function. A genetic algorithm is integrated into the training process to improve optimization. The resulting framework enables high-accuracy solutions across diverse tasks through a single training run, eliminating the need for repeated retraining while significantly enhancing both solution precision and real-time responsiveness. This approach provides an efficient and general-purpose solver for complex multi-task DAE systems.
📝 Abstract
Simulation and modeling are essential in product development, integrated into the design and manufacturing process to enhance efficiency and quality. These models are typically represented as complex nonlinear differential-algebraic equations. The growing diversity of product requirements demands multi-task optimization, a key challenge in simulation modeling research. A dual physics-informed neural network architecture is proposed to decouple constraints and objective functions in parametric differential-algebraic equation optimization problems. Theoretical analysis shows that introducing a relaxation variable with a global error bound ensures solution equivalence between the network and the optimization problem. A genetic algorithm-enhanced training framework for physics-informed neural networks improves training precision and efficiency, avoiding redundant solving of differential-algebraic equations. This approach enables generalization across multi-task objectives with a single training run, maintaining real-time responsiveness to product requirements.
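The core idea of decoupling a constraint from the objective via a slack variable, with the relaxed problem then handled by a genetic algorithm, can be illustrated on a toy problem. This is a minimal sketch, not the paper's implementation: the objective, constraint, penalty weights, and GA settings are all illustrative assumptions, and a scalar algebraic constraint stands in for a full DAE system.

```python
import numpy as np

# Toy sketch (assumed problem, not the paper's): minimize
#   f(x, y) = (x - 2)^2 + (y - 1)^2
# subject to the algebraic constraint
#   g(x, y) = x^2 + y^2 - 1 = 0.
# A non-negative slack variable s bounds the violation (|g| <= s); the
# objective is augmented with lam * s, so the constraint is decoupled
# from the objective and the relaxed problem is solved by a simple GA.

rng = np.random.default_rng(0)

def objective(x, y):
    return (x - 2.0) ** 2 + (y - 1.0) ** 2

def constraint(x, y):
    return x ** 2 + y ** 2 - 1.0

def fitness(ind, lam=10.0, big=1e3):
    x, y, s = ind
    s = abs(s)                                   # slack must be non-negative
    excess = max(0.0, abs(constraint(x, y)) - s) # amount |g| exceeds the slack
    return objective(x, y) + lam * s + big * excess

def genetic_minimize(pop_size=60, gens=300, mut=0.1):
    pop = rng.normal(0.0, 1.0, size=(pop_size, 3))  # individuals: (x, y, s)
    n_elite = pop_size // 4
    for _ in range(gens):
        scores = np.array([fitness(ind) for ind in pop])
        elites = pop[np.argsort(scores)[:n_elite]]   # elitist selection
        children = []
        while len(children) < pop_size - n_elite:
            a, b = elites[rng.integers(n_elite, size=2)]
            w = rng.random()
            child = w * a + (1.0 - w) * b            # blend crossover
            child += rng.normal(0.0, mut, size=3)    # Gaussian mutation
            children.append(child)
        pop = np.vstack([elites, np.array(children)])
    return pop[np.argmin([fitness(ind) for ind in pop])]

x, y, s = genetic_minimize()
print(f"x={x:.3f}, y={y:.3f}, |g|={abs(constraint(x, y)):.4f}")
```

Because the slack penalty weight exceeds the marginal gain from relaxing the constraint, the slack collapses toward zero and the recovered solution approximates the true constrained minimizer; in the paper's setting, the same relaxation is applied to DAE residuals inside the PINN loss rather than to a scalar constraint.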