Deep Legendre Transform

📅 2025-12-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the curse of dimensionality in computing convex conjugates (Legendre transforms) of high-dimensional convex functions. It proposes a deep learning framework grounded in an implicit Fenchel duality formula, bypassing both conventional numerical discretization and existing optimal transport-based approaches. The conjugate function is approximated directly through differentiable implicit modeling, which enables gradient-based optimization and a posteriori error estimation. To enhance interpretability, the framework combines Kolmogorov-Arnold networks with symbolic regression, automatically recovering closed-form analytical expressions from the numerical approximations. Experiments on several high-dimensional benchmarks demonstrate high-accuracy conjugate approximation; notably, the method recovers exact analytical conjugates, such as those of quadratic, exponential, and entropy functions, directly from data. This establishes a computational paradigm for convex analysis that is differentiable, interpretable, and scalable.
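For reference, the standard notions behind this summary, written in generic notation (the paper's exact formulation of the implicit Fenchel identity may differ):

```latex
% Convex conjugate (Legendre transform) of f : \mathbb{R}^d \to \mathbb{R}
f^*(y) = \sup_{x \in \mathbb{R}^d} \big( \langle x, y \rangle - f(x) \big)

% For differentiable, strictly convex f the supremum is attained where \nabla f(x) = y,
% which gives an implicit characterization of f^* along the range of \nabla f:
f^*\big(\nabla f(x)\big) = \langle \nabla f(x), x \rangle - f(x)

% Example: f(x) = \tfrac{1}{2}\lVert x \rVert^2 has conjugate f^*(y) = \tfrac{1}{2}\lVert y \rVert^2.
```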

📝 Abstract
We introduce a novel deep learning algorithm for computing convex conjugates of differentiable convex functions, a fundamental operation in convex analysis with various applications in different fields such as optimization, control theory, physics and economics. While traditional numerical methods suffer from the curse of dimensionality and become computationally intractable in high dimensions, more recent neural network-based approaches scale better, but have mostly been studied with the aim of solving optimal transport problems and require the solution of complicated optimization or max-min problems. Using an implicit Fenchel formulation of convex conjugation, our approach facilitates an efficient gradient-based framework for the minimization of approximation errors and, as a byproduct, also provides a posteriori error estimates for the approximation quality. Numerical experiments demonstrate our method's ability to deliver accurate results across different high-dimensional examples. Moreover, by employing symbolic regression with Kolmogorov-Arnold networks, it is able to obtain the exact convex conjugates of specific convex functions.
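As a rough illustration of the gradient-based framework described in the abstract, the following is a minimal sketch, not the authors' architecture or loss: a small network g approximating f* is trained so that the Fenchel-Young identity g(∇f(x)) ≈ ⟨∇f(x), x⟩ - f(x) holds on sampled points. The test function f(x) = ½‖x‖², the network size, and the sampling distribution are all illustrative assumptions.

```python
import torch

d = 10  # input dimension (illustrative)

# Illustrative test function: f(x) = 0.5 * ||x||^2, whose conjugate is f*(y) = 0.5 * ||y||^2.
def f(x):
    return 0.5 * (x ** 2).sum(dim=-1)

# Small MLP standing in for the approximation of the conjugate f*.
g = torch.nn.Sequential(
    torch.nn.Linear(d, 64), torch.nn.Softplus(),
    torch.nn.Linear(64, 64), torch.nn.Softplus(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(g.parameters(), lr=1e-3)

for step in range(2000):
    x = torch.randn(256, d, requires_grad=True)     # sampling distribution is an assumption
    fx = f(x)
    (grad_fx,) = torch.autograd.grad(fx.sum(), x)   # y = grad f(x)
    y = grad_fx.detach()
    # Fenchel-Young identity for differentiable convex f:
    #   f*(grad f(x)) = <grad f(x), x> - f(x)
    target = (y * x.detach()).sum(dim=-1) - fx.detach()
    loss = (g(y).squeeze(-1) - target).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this reading, the averaged residual of the identity on fresh samples acts as a simple a posteriori proxy for the approximation quality, in the spirit of the error estimates mentioned in the abstract.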
Problem

Research questions and friction points this paper is trying to address.

Computes convex conjugates of differentiable convex functions efficiently
Addresses the curse of dimensionality in high-dimensional convex analysis (illustrated by the brute-force sketch after this list)
Provides gradient-based framework with a posteriori error estimates
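To see where the curse of dimensionality comes from, here is a brute-force discrete Legendre transform on a tensor-product grid; each query costs n^d function evaluations, which becomes intractable already in moderate dimensions. The grid bounds and test function are illustrative.

```python
import itertools
import numpy as np

def discrete_conjugate(f, y, n=50, lo=-3.0, hi=3.0):
    """Brute-force f*(y) ~ max over an n**d tensor-product grid of <x, y> - f(x)."""
    d = len(y)
    axis = np.linspace(lo, hi, n)
    best = -np.inf
    for x in itertools.product(axis, repeat=d):  # n**d grid points: exponential in d
        x = np.asarray(x)
        best = max(best, float(x @ y - f(x)))
    return best

f = lambda x: 0.5 * np.dot(x, x)                     # conjugate is 0.5 * ||y||^2
print(discrete_conjugate(f, np.array([1.0, -0.5])))  # ~ 0.625; already 50**2 evaluations for d = 2
```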
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses implicit Fenchel formulation for convex conjugation
Employs gradient-based minimization of approximation errors
Applies symbolic regression with Kolmogorov-Arnold networks to recover closed-form conjugates (see the simplified sketch after this list)
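The closed-form recovery step can be illustrated with a deliberately simplified stand-in: instead of the paper's Kolmogorov-Arnold-network-based symbolic regression, the sketch below matches numerically approximated conjugate values against a small dictionary of candidate expressions by least squares. The candidate list and the one-dimensional setting are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

# Hypothetical dictionary of candidate closed forms; each is fitted up to a scalar coefficient.
candidates = {
    "a * y^2":            lambda y: y ** 2,
    "a * (y*log(y) - y)":  lambda y: np.where(y > 0, y * np.log(np.clip(y, 1e-12, None)) - y, np.inf),
    "a * exp(y)":          lambda y: np.exp(y),
}

def recover_closed_form(y, g_values):
    """y: 1-D sample points, g_values: numerically approximated conjugate values at y."""
    best = (None, None, np.inf)
    for name, phi in candidates.items():
        basis = phi(y)
        if not np.all(np.isfinite(basis)):
            continue  # candidate not defined on the whole sample
        coef, *_ = np.linalg.lstsq(basis[:, None], g_values, rcond=None)
        err = float(np.mean((coef[0] * basis - g_values) ** 2))
        if err < best[2]:
            best = (name, float(coef[0]), err)
    return best

# Example: feed in values of the true conjugate of f(x) = 0.5*x^2, i.e. f*(y) = 0.5*y^2.
y = np.linspace(-2.0, 2.0, 200)
print(recover_closed_form(y, 0.5 * y ** 2))  # picks "a * y^2" with a ~ 0.5
```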
Aleksey Minabutdinov
Center of Economic Research and RiskLab, ETH Zurich, Switzerland
Patrick Cheridito
ETH Zurich