🤖 AI Summary
To address key bottlenecks hindering stochastic computing (SC) and hyperdimensional computing (HDC) on resource-constrained edge devices—including poor-quality random number generation, high encoding correlation, and excessive hardware overhead—this paper introduces, for the first time, a hardware-friendly deterministic encoding scheme based on the base-$2^n$ Van der Corput (VDC-$2^n$) low-discrepancy sequence for SC/HDC data representation. The proposed method achieves strong decorrelation, controllable stochasticity, and minimal logic overhead, thereby unifying computational predictability with statistical robustness. Experimental evaluation across multiple benchmark tasks demonstrates that, compared to baseline approaches, the method improves accuracy by 12.7% and reduces energy consumption by 38.4%. These gains significantly enhance the practicality, energy efficiency, and scalability of SC/HDC in edge AI applications.
📝 Abstract
Data encoding is a fundamental step in emerging computing paradigms, particularly in stochastic computing (SC) and hyperdimensional computing (HDC), where it plays a crucial role in determining the overall system performance and hardware cost efficiency. This study presents an advanced encoding strategy that leverages a hardware-friendly class of low-discrepancy (LD) sequences, specifically powers-of-2 bases of Van der Corput (VDC) sequences (VDC-2^n), as sources for random number generation. Our approach significantly enhances the accuracy and efficiency of SC and HDC systems by addressing challenges associated with randomness. By employing LD sequences, we improve correlation properties and reduce hardware complexity. Experimental results demonstrate significant improvements in accuracy and energy savings for SC and HDC systems. Our solution provides a robust framework for integrating SC and HDC in resource-constrained environments, paving the way for efficient and scalable AI implementations.