Morphological Perceptron with Competitive Layer: Training Using Convex-Concave Procedure

📅 2025-09-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
Morphological perceptrons with competitive layers (MPCLs) are difficult to train for multi-class classification because mathematical morphology operators are non-differentiable, rendering standard gradient-based optimization inapplicable. To address this, the paper proposes a training framework based on the convex-concave procedure (CCP), which expresses the nonsmooth training objective as a difference of convex functions and learns the parameters by iteratively solving linear programming subproblems. This is the first systematic application of CCP to training the competitive layer of MPCLs, circumventing the differentiability requirement of gradient methods. The approach couples morphological structural modeling with a competitive output mechanism, retaining both theoretical interpretability and computational tractability. Experiments demonstrate substantial performance gains for MPCLs on multi-class classification benchmarks, validating CCP as an effective training method for networks where gradient information is unavailable.
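To make the architecture concrete, here is a minimal sketch of what an MPCL forward pass might look like, assuming dilation-based (max-plus) hidden neurons and a winner-take-all output. The function names and weights are illustrative, not the paper's implementation:

```python
# Illustrative MPCL forward pass (a sketch, not the paper's code).
# A dilation (max-plus) neuron computes y = max_i (x_i + w_i), and the
# competitive output layer picks the class whose neuron fires highest
# (winner-take-all).

def dilation_neuron(x, w):
    """Max-plus neuron: maximum over inputs of (x_i + w_i)."""
    return max(xi + wi for xi, wi in zip(x, w))

def mpcl_forward(x, weights_per_class):
    """Return the index of the winning class (winner-take-all)."""
    scores = [dilation_neuron(x, w) for w in weights_per_class]
    return scores.index(max(scores))

# Toy example: two classes, two inputs (weights are hypothetical).
weights = [
    [0.0, -1.0],   # class 0
    [-2.0, 0.5],   # class 1
]
print(mpcl_forward([1.0, 0.0], weights))  # class 0 wins: max(1.0, -1.0) > max(-1.0, 0.5)
```

Note that the max operation is what makes these networks non-differentiable at the points where the active input switches, which is precisely why the paper avoids gradient-based training.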

📝 Abstract
A morphological perceptron is a multilayer feedforward neural network in which neurons perform elementary operations from mathematical morphology. For multiclass classification tasks, a morphological perceptron with a competitive layer (MPCL) is obtained by integrating a winner-take-all output layer into the standard morphological architecture. The non-differentiability of morphological operators renders gradient-based optimization methods unsuitable for training such networks. Consequently, alternative strategies that do not depend on gradient information are commonly adopted. This paper proposes the use of the convex-concave procedure (CCP) for training MPCL networks. The training problem is formulated as a difference of convex (DC) functions and solved iteratively using CCP, resulting in a sequence of linear programming subproblems. Computational experiments demonstrate the effectiveness of the proposed training method in addressing classification tasks with MPCL networks.
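The CCP iteration described in the abstract can be illustrated on a toy scalar DC objective. The sketch below uses f(x) = x⁴ − 2x² with the decomposition g(x) = x⁴, h(x) = 2x²; this is an illustrative example only, as the paper's actual MPCL objective yields linear programming subproblems rather than this closed-form update:

```python
# Toy convex-concave procedure (CCP) on a scalar DC objective:
#   f(x) = g(x) - h(x),  with g(x) = x**4 and h(x) = 2*x**2 (both convex).
# CCP linearizes the concave part -h at the current iterate x_k and minimizes
# the convex surrogate  g(x) - [h(x_k) + h'(x_k) * (x - x_k)].
# Here h'(x_k) = 4*x_k, so the surrogate minimizer solves 4*x**3 = 4*x_k,
# giving the closed-form update x_{k+1} = x_k ** (1/3).

def ccp_step(x_k):
    # argmin_x  x**4 - 4*x_k*x  (the linearized convex subproblem)
    return x_k ** (1.0 / 3.0)

def ccp(x0, iters=50):
    x = x0
    for _ in range(iters):
        x = ccp_step(x)
    return x

x_star = ccp(0.5)
print(round(x_star, 6))  # iterates climb toward the stationary point x = 1
```

Each CCP step solves a convex problem whose value can only decrease the true objective, which is what makes the procedure attractive when the original objective is nonsmooth: in the MPCL setting, each such subproblem is a linear program.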
Problem

Research questions and friction points this paper is trying to address.

Training morphological perceptrons without gradient-based optimization
Solving non-differentiable network training using convex-concave procedure
Addressing multiclass classification with competitive layer networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Morphological perceptron with competitive layer for classification
Convex-concave procedure training without gradient optimization
Minimizing DC training objectives via linear programming subproblems