Configurable p-Neurons Using Modular p-Bits

📅 2026-01-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes a modular p-bit architecture that overcomes a key limitation of conventional p-bit neurons: their fixed stochastic activation mechanism, which restricts them to a single (sigmoidal) probabilistic behavior. By decoupling the stochastic signal path from the input data path, the design enables configurable probabilistic neurons (p-neurons) that implement stochastic variants of canonical activation functions, including Sigmoid, Tanh, and ReLU. The authors present a spintronic (CMOS + sMTJ) implementation with wide, tunable probabilistic operating ranges, and an FPGA prototype in which a shared stochastic unit substantially reduces hardware overhead. Experimental results demonstrate an order-of-magnitude (10×) reduction in resource consumption compared to conventional digital p-bit implementations, confirming the approach's efficiency and feasibility.
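To make the baseline concrete, a conventional p-bit can be modeled behaviorally as a binary unit whose output is +1 with a sigmoidal probability of its input. The sketch below is an illustrative software model (the `p_bit` name and the comparison-against-uniform-noise formulation are assumptions for exposition), not the paper's circuit-level design; it shows the fixed sigmoidal behavior that the modular architecture generalizes.

```python
import math
import random

def p_bit(I, rng=random.random):
    """Behavioral model of a conventional p-bit.

    Returns +1 with probability (1 + tanh(I)) / 2, i.e. a sigmoidal
    activation: the deterministic drive tanh(I) is compared against a
    uniform noise sample r in (-1, 1).
    """
    r = 2.0 * rng() - 1.0  # uniform noise in (-1, 1)
    return 1 if math.tanh(I) > r else -1
```

At zero input the output is an unbiased coin flip, while a strong positive or negative input pins the output, which is the characteristic "tunable randomness" of p-bits.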

📝 Abstract
Probabilistic bits (p-bits) have recently been employed in neural networks (NNs) as stochastic neurons with sigmoidal probabilistic activation functions. Nonetheless, there remains a wealth of other probabilistic activation functions yet to be explored. Here we re-engineer the p-bit by decoupling its stochastic signal path from its input data path, giving rise to a modular p-bit that enables the realization of probabilistic neurons (p-neurons) with a range of configurable probabilistic activation functions, including probabilistic versions of the widely used Logistic Sigmoid, Tanh, and Rectified Linear Unit (ReLU) activation functions. We present spintronic (CMOS + sMTJ) designs that show wide and tunable probabilistic ranges of operation. Finally, we experimentally implement digital-CMOS versions on an FPGA, with stochastic unit sharing, and demonstrate an order of magnitude (10x) saving in required hardware resources compared to conventional digital p-bit implementations.
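The configurability described in the abstract can be sketched at a behavioral level: the activation function maps the input to a firing probability, and a shared random source decides the binary outcome. The names (`p_neuron`, `ACTIVATIONS`), the 0/1 output coding, and the clipping of ReLU to a valid probability are illustrative assumptions, not the paper's hardware design.

```python
import math
import random

# Candidate activations, each rescaled/clipped to a valid probability in [0, 1].
ACTIVATIONS = {
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x)),
    "tanh":    lambda x: 0.5 * (1.0 + math.tanh(x)),  # rescaled from (-1, 1) to (0, 1)
    "relu":    lambda x: min(max(x, 0.0), 1.0),       # clipped to [0, 1]
}

def p_neuron(x, kind="sigmoid", rng=random.random):
    """Fire (return 1) with probability given by the selected activation.

    Comparing the activation output against an independent random sample
    mimics the decoupling of the stochastic signal path from the input
    data path; the same `rng` can be shared across many neurons, echoing
    the paper's stochastic-unit-sharing idea.
    """
    return 1 if rng() < ACTIVATIONS[kind](x) else 0
```

Because the random source is a separate argument, one stochastic unit can serve many p-neurons while the activation choice remains a per-neuron configuration, which is the behavioral essence of the modular design.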
Problem

Research questions and friction points this paper is trying to address.

p-bits
probabilistic activation functions
configurable p-neurons
stochastic neurons
neural networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

modular p-bit
configurable p-neurons
probabilistic activation functions
stochastic computing
hardware-efficient implementation
Saleh Bunaiyan
ECE, UCSB, Santa Barbara, CA, USA; EE, KFUPM, Dhahran, KSA
Mohammad Alsharif
COE, KFUPM, Dhahran, KSA; CEMSE, KAUST, Thuwal, KSA
Abdelrahman S. Abdelrahman
ECE, UCSB, Santa Barbara, CA, USA
Hesham Elsawy
School of Computing, Queen’s University, Kingston, ON, Canada
Suraj S. Cheema
Research Laboratory of Electronics, MIT, Cambridge, MA, USA
Suhaib A. Fahmy
CEMSE, KAUST, Thuwal, KSA
Kerem Y. Çamsarı
University of California, Santa Barbara
Probabilistic Computing, Quantum Computing, Spintronics, Ising Machines, AI Hardware
Feras Al-Dirini
Massachusetts Institute of Technology (MIT) | Queen's University
Nanoelectronics, AI Hardware, Neuromorphic AI, Probabilistic Computing, Probabilistic Sensing