Maintaining Difficulty: A Margin Scheduler for Triplet Loss in Siamese Network Training

📅 2026-03-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses a limitation of the fixed-margin triplet loss: a static margin parameter μ fails to adapt to training dynamics and often underestimates the true separation achieved by most triplets, thereby reducing learning difficulty and causing premature convergence on easy samples. To mitigate this, the paper introduces a training-difficulty maintenance mechanism that dynamically adjusts μ based on the proportion of "easy" triplets observed in each training epoch, sustaining an effective level of learning difficulty. Integrated with a Siamese network and the triplet margin ranking loss, the proposed adaptive margin scheduling strategy consistently outperforms both fixed-margin and monotonically increasing margin approaches across four standard benchmarks, enhancing feature discriminability.
📝 Abstract
The Triplet Margin Ranking Loss is one of the most widely used loss functions in Siamese Networks for solving Distance Metric Learning (DML) problems. This loss function depends on a margin parameter μ, which defines the minimum distance that should separate positive and negative pairs during training. In this work, we show that, during training, the effective margin of many triplets often exceeds the predefined value of μ, provided that a sufficient number of triplets violating this margin is observed. This behavior indicates that fixing the margin throughout training may limit the learning process. Based on this observation, we propose a margin scheduler that adjusts the value of μ according to the proportion of easy triplets observed at each epoch, with the goal of maintaining training difficulty over time. We show that the proposed strategy leads to improved performance when compared to both a constant margin and a monotonically increasing margin scheme. Experimental results on four different datasets show consistent gains in verification performance.
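The scheduling idea described in the abstract can be sketched in a few lines. The paper does not reproduce its exact update rule here, so the following is a minimal illustration under an assumed proportional rule: count the triplets that already satisfy the margin ("easy" triplets, zero loss), and nudge μ up when their fraction exceeds a target level, down when it falls below. The class name `MarginScheduler`, the `target_easy` parameter, and the additive `step` are all hypothetical choices for this sketch, not the authors' implementation.

```python
class MarginScheduler:
    """Hedged sketch: keep the triplet margin mu at a level where the
    fraction of easy triplets stays near a target, so training does not
    converge prematurely on samples that already satisfy the margin."""

    def __init__(self, mu=0.2, target_easy=0.5, step=0.05,
                 mu_min=0.05, mu_max=2.0):
        self.mu = mu                    # current margin
        self.target_easy = target_easy  # desired fraction of easy triplets
        self.step = step                # additive adjustment per epoch (assumed rule)
        self.mu_min, self.mu_max = mu_min, mu_max

    def triplet_loss(self, d_ap, d_an):
        # Standard triplet margin ranking loss for one triplet:
        # max(0, d(anchor, positive) - d(anchor, negative) + mu)
        return max(0.0, d_ap - d_an + self.mu)

    def epoch_update(self, distances):
        """distances: list of (d_ap, d_an) pairs observed this epoch.
        Returns the adjusted margin for the next epoch."""
        easy = sum(1 for d_ap, d_an in distances
                   if d_ap - d_an + self.mu <= 0.0)  # zero-loss triplets
        frac_easy = easy / len(distances)
        # Too many easy triplets -> raise mu to restore difficulty;
        # too few -> lower mu so the margin stays attainable.
        if frac_easy > self.target_easy:
            self.mu = min(self.mu + self.step, self.mu_max)
        elif frac_easy < self.target_easy:
            self.mu = max(self.mu - self.step, self.mu_min)
        return self.mu
```

In use, `epoch_update` would be called once per epoch with the anchor-positive and anchor-negative distances collected during training; a monotonically increasing schedule, by contrast, would raise μ regardless of how many triplets actually satisfy it.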
Keywords

Triplet Loss
Margin Scheduling
Siamese Networks
Distance Metric Learning
Training Difficulty
👥 Authors
Roberto Sprengel Minozzo Tomchak
Departamento de Informática, Universidade Federal do Paraná, Curitiba, Brazil
Oge Marques
Affiliate Professor of Computer Science and Engineering, Florida Atlantic University
Lucas Garcia Pedroso
Departamento de Matemática, Universidade Federal do Paraná, Curitiba, Brazil
Luiz Eduardo Oliveira
Departamento de Informática, Universidade Federal do Paraná, Curitiba, Brazil
Paulo Lisboa de Almeida
Departamento de Informática, Universidade Federal do Paraná, Curitiba, Brazil