Exploiting Boosting in Hyperdimensional Computing for Enhanced Reliability in Healthcare

📅 2024-11-21
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address overfitting, poor noise robustness, and weak resilience to class imbalance in hyperdimensional computing (HDC) models for healthcare applications, this paper proposes BoostHD, a framework that integrates boosting mechanisms into HDC. BoostHD partitions the hyperdimensional space into subspaces and trains an ensemble of weak learners, combining hypervector encoding, online learning, and ensemble decision-making to use the high-dimensional space efficiently and correct errors collaboratively. This overcomes key limitations of conventional HDC, including noise sensitivity and weak generalization. Evaluated on the WESAD dataset, BoostHD achieves 98.37% accuracy, outperforming Random Forest, XGBoost, and OnlineHD. It also maintains robust performance under noise corruption and severe class imbalance, and delivers person-specific assessment with a mean accuracy of 96.19%.

📝 Abstract
Hyperdimensional computing (HDC) enables efficient data encoding and processing in high-dimensional space, benefiting machine learning and data analysis. However, underutilization of these spaces can lead to overfitting and reduced model reliability, especially in data-limited systems, a critical issue in sectors like healthcare that demand robustness and consistent performance. We introduce BoostHD, an approach that applies boosting algorithms to partition the hyperdimensional space into subspaces, creating an ensemble of weak learners. By integrating boosting with HDC, BoostHD enhances performance and reliability beyond existing HDC methods. Our analysis highlights the importance of efficient utilization of hyperdimensional spaces for improved model performance. Experiments on healthcare datasets show that BoostHD outperforms state-of-the-art methods. On the WESAD dataset, it achieved an accuracy of 98.37%, surpassing Random Forest, XGBoost, and OnlineHD. BoostHD also demonstrated superior inference efficiency and stability, maintaining high accuracy under data imbalance and noise. In person-specific evaluations, it achieved an average accuracy of 96.19%, outperforming other models. By addressing the limitations of both boosting and HDC, BoostHD expands the applicability of HDC in critical domains where reliability and precision are paramount.
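The partition-and-boost idea described in the abstract can be sketched in code. The following is a hedged toy illustration only, not the authors' implementation: the random-projection bipolar encoding, the class-centroid weak learners, the AdaBoost-style sample reweighting, and all names (`boost_hd`, `SubspaceHDLearner`, `boost_hd_predict`) are assumptions for a binary-classification example.

```python
import numpy as np

class SubspaceHDLearner:
    """Weak learner: class-centroid HD classifier restricted to one
    contiguous slice (subspace) of the hypervector dimensions."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def fit(self, H, y, w):
        sl = H[:, self.lo:self.hi]
        self.classes = np.unique(y)
        # weighted class prototypes ("bundling" of training hypervectors)
        self.proto = np.stack(
            [(w[y == c][:, None] * sl[y == c]).sum(axis=0) for c in self.classes]
        )

    def predict(self, H):
        sl = H[:, self.lo:self.hi]
        sims = sl @ self.proto.T          # dot-product similarity to each prototype
        return self.classes[np.argmax(sims, axis=1)]

def boost_hd(X, y, D=512, k=4, seed=0):
    """Encode X into D-dim bipolar hypervectors, split the space into k
    subspaces, and fit one weak learner per subspace with AdaBoost-style
    sample reweighting (binary labels assumed)."""
    rng = np.random.default_rng(seed)
    proj = rng.standard_normal((X.shape[1], D))
    H = np.sign(X @ proj)                 # random-projection bipolar encoding
    n = len(y)
    w = np.full(n, 1.0 / n)
    learners, alphas, step = [], [], D // k
    for i in range(k):
        lrn = SubspaceHDLearner(i * step, (i + 1) * step)
        lrn.fit(H, y, w)
        miss = lrn.predict(H) != y
        err = np.clip(w[miss].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w = w * np.exp(np.where(miss, alpha, -alpha))
        w /= w.sum()                      # upweight misclassified samples
        learners.append(lrn)
        alphas.append(alpha)
    return proj, learners, np.array(alphas)

def boost_hd_predict(X, proj, learners, alphas, classes=(0, 1)):
    """Combine the subspace learners by alpha-weighted voting."""
    H = np.sign(X @ proj)
    votes = np.zeros((len(X), len(classes)))
    for lrn, a in zip(learners, alphas):
        p = lrn.predict(H)
        for ci, c in enumerate(classes):
            votes[:, ci] += a * (p == c)
    return np.asarray(classes)[np.argmax(votes, axis=1)]
```

Each weak learner sees only its own slice of the hyperdimensional space, so the ensemble as a whole uses the full dimensionality while no single learner can overfit all of it; the boosting weights then focus later subspaces on the samples earlier ones got wrong.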
Problem

Research questions and friction points this paper is trying to address.

Hyperdimensional Computing
Imbalanced Data
Noise Handling
Innovation

Methods, ideas, or system contributions that make the work stand out.

BoostHD
Hyperdimensional Computing
Overfitting Prevention