The Symmetric Perceptron: a Teacher-Student Scenario

📅 2026-03-26
📈 Citations: 0
✹ Influential: 0
đŸ€– AI Summary
This work reformulates the traditional storage-oriented symmetric perceptron as a solvable teacher–student inference model to address the feasibility of learning at arbitrary sample densities. By introducing a two-region potential function, the study analyzes Bayes-optimal learning under both noiseless and noisy conditions, establishing—through annealed and quenched free-entropy analyses, high-dimensional asymptotics, and Monte Carlo optimization—the first teacher–student framework for the symmetric perceptron. The research fully characterizes the three-parameter phase diagram governed by sample density α, hyperplane offset Îș, and temperature T, revealing the dynamical interplay between suboptimal metastable states induced by second-order instabilities and full alignment driven by first-order phase transitions. It further elucidates the critical role of the choice of potential function in shaping learning trajectories and identifies robust learning regimes alongside their melting pathways toward planted solutions.

📝 Abstract
We introduce and solve a teacher-student formulation of the symmetric binary Perceptron, turning a traditionally storage-oriented model into a planted inference problem with a guaranteed solution at any sample density. We adapt the formulation of the symmetric Perceptron, which traditionally considers either the u-shaped potential or the rectangular one, by including labels in both regions. With this formulation, we analyze both the Bayes-optimal regime for noiseless examples and the effect of thermal noise under two different potential/classification rules. Using annealed and quenched free-entropy calculations in the high-dimensional limit, we map the phase diagram in the three control parameters, namely the sample density $α$, the distance between the origin and one of the symmetric hyperplanes $Îș$, and the temperature $T$, and identify a robust scenario where learning is organized by a second-order instability that creates teacher-correlated suboptimal states, followed by a first-order transition to full alignment. We show how this structure depends on the choice of potential and on the interplay between the metastability of the suboptimal solution and its melting towards the planted configuration, which is relevant for Monte Carlo-based optimization algorithms.
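The planted setup described in the abstract can be made concrete with a small sketch. Assuming the usual conventions for a symmetric binary perceptron (binary teacher weights, Gaussian inputs, and the two-region rule that labels an example by whether its preactivation falls inside the band $|h| \le Îș$), the snippet below generates teacher-labeled data and runs a simple single-spin-flip Metropolis dynamics at temperature T. All names and parameter values here are illustrative, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200            # dimension
alpha = 2.0        # sample density: M = alpha * N examples
kappa = 0.5        # hyperplane offset (half-width of the central band)
M = int(alpha * N)

# Binary teacher and Gaussian inputs (assumed conventions).
w_teacher = rng.choice([-1, 1], size=N)
X = rng.standard_normal((M, N))

def labels(w, X, kappa):
    """Two-region classification rule: +1 inside the band |h| <= kappa, -1 outside."""
    h = X @ w / np.sqrt(len(w))
    return np.where(np.abs(h) <= kappa, 1, -1)

# Because the teacher labels every example itself, the planted problem
# is feasible at any sample density alpha.
y = labels(w_teacher, X, kappa)

def energy(w):
    """Number of training examples the student misclassifies."""
    return int(np.sum(labels(w, X, kappa) != y))

def metropolis_sweep(w, T):
    """One sweep of single-spin-flip Metropolis at temperature T."""
    E = energy(w)
    for i in rng.permutation(N):
        w[i] = -w[i]                # propose a flip
        E_new = energy(w)
        dE = E_new - E
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            E = E_new               # accept the flip
        else:
            w[i] = -w[i]            # reject: undo the flip
    return w, E

w_student = rng.choice([-1, 1], size=N)
for _ in range(5):
    w_student, E = metropolis_sweep(w_student, T=0.5)
```

This kind of Monte Carlo dynamics is exactly the setting where the metastability of teacher-correlated suboptimal states matters: the chain can lower the energy quickly but may linger in a partially aligned state before melting towards the planted configuration.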
Problem

Research questions and friction points this paper is trying to address.

Symmetric Perceptron
Teacher-Student
Planted Inference
Phase Transition
Statistical Physics
Innovation

Methods, ideas, or system contributions that make the work stand out.

teacher-student model
symmetric perceptron
phase transition
free-entropy calculation
planted inference