Title: HDVQ-VAE: Binary Codebook for Hyperdimensional Latent Representations
Hyperdimensional computing (HDC) has emerged as a promising paradigm offering lightweight yet powerful computing capabilities with inherent learning characteristics. By leveraging binary hyperdimensional vectors, HDC enables efficient and robust data processing, surpassing traditional machine learning (ML) approaches in both speed and resilience. This letter addresses key challenges in HDC systems, particularly the conversion of data into the hyperdimensional domain and the integration of HDC with conventional ML frameworks. We propose a novel solution, the hyperdimensional vector quantized variational autoencoder (HDVQ-VAE), which seamlessly merges binary encodings with codebook representations in ML systems. Our approach significantly reduces memory overhead while improving training by replacing traditional codebooks with binary (−1, +1) counterparts. Leveraging this architecture, we demonstrate improved encoding-decoding procedures that produce high-quality images within acceptable peak signal-to-noise ratio (PSNR) ranges. Our work advances HDC toward efficient deployment of ML systems on embedded hardware.
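As a concrete illustration of the binary-codebook idea in the abstract above, the following sketch quantizes a real-valued latent vector to its nearest (−1, +1) codeword. The dimensionality `D`, codebook size `K`, and the random codebook are illustrative assumptions, not values from the letter:

```python
import numpy as np

rng = np.random.default_rng(0)

D = 1024   # hypervector dimensionality (illustrative, not from the letter)
K = 64     # number of codebook entries (illustrative)

# Binary (-1, +1) codebook, as the abstract describes.
codebook = rng.choice([-1, 1], size=(K, D))

def quantize(z):
    """Map a real-valued encoder output z (shape [D]) to its nearest
    binary codeword.  For (-1, +1) codes, maximizing the dot product
    with sign(z) is equivalent to minimizing Hamming distance."""
    scores = codebook @ np.sign(z)
    return codebook[np.argmax(scores)]

z = rng.standard_normal(D)   # stand-in for an encoder output
code = quantize(z)           # a (-1, +1) hypervector of length D
```

Because each codeword is a D-dimensional sign vector, storing the codebook costs one bit per element instead of a 32-bit float, which is the memory saving the abstract points to.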
Award ID(s):
2019511
PAR ID:
10648788
Author(s) / Creator(s):
 ;  
Publisher / Repository:
IEEE Embedded Systems Letters
Date Published:
Journal Name:
IEEE Embedded Systems Letters
Volume:
16
Issue:
4
ISSN:
1943-0663
Page Range / eLocation ID:
325 to 328
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Hyperdimensional vector processing is a nascent computing approach that mimics the brain's structure and offers lightweight, robust, and efficient hardware solutions for different learning and cognitive tasks. For image recognition and classification, hyperdimensional computing (HDC) utilizes the intensity values of captured images and the positions of image pixels. Traditional HDC systems represent the intensities and positions with binary hypervectors of 1K–10K dimensions. The intensity hypervectors are cross-correlated for closer values and uncorrelated for distant values in the intensity range. The position hypervectors are pseudo-random binary vectors generated iteratively for the best classification performance. In this study, we propose a radically new approach for encoding image data in HDC systems: position hypervectors are no longer needed, because pixel intensities are encoded with a deterministic approach based on quasi-random sequences. The proposed approach significantly reduces the number of operations by eliminating the position hypervectors and the multiplication operations in the HDC system. Additionally, we suggest a hybrid technique for generating hypervectors by combining two deterministic sequences, achieving higher classification accuracy. Our experimental results show up to a 102× reduction in runtime and significant memory-usage savings with improved accuracy compared to a baseline HDC system with conventional hypervector encoding.
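The correlated intensity hypervectors described above can be illustrated with a minimal sketch. One caveat: the paper builds its level hypervectors from deterministic quasi-random sequences, whereas this sketch starts from a random base vector and only reproduces the correlation structure (nearby intensities similar, distant ones uncorrelated); `D` is likewise an illustrative choice:

```python
import numpy as np

D = 1024       # hypervector dimensionality (illustrative)
LEVELS = 256   # 8-bit pixel intensities

rng = np.random.default_rng(0)
base = rng.choice([-1, 1], size=D)
flips_per_level = D // (2 * LEVELS)   # total flips across all levels ~ D/2

# Each intensity step flips a fixed, disjoint slice of bits, so the
# hypervectors of nearby intensities stay similar while those of
# distant intensities become uncorrelated.
levels = [base]
for i in range(1, LEVELS):
    v = levels[-1].copy()
    lo = (i - 1) * flips_per_level
    v[lo:lo + flips_per_level] *= -1
    levels.append(v)
levels = np.array(levels)

def encode_image(img):
    """Bundle the level hypervectors of all pixels.  No position
    hypervectors and no multiplications: addition only."""
    return np.sign(levels[img.ravel()].sum(axis=0))
```

Note how the encoder touches only an addition per pixel, which is the operation-count saving the abstract claims from dropping position hypervectors.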
  2. Abstract: Although the connectivity offered by the industrial internet of things (IIoT) enables enhanced operational capabilities, the exposure of systems to significant cybersecurity risks poses critical challenges. Recently, machine learning (ML) algorithms such as feature-based support vector machines and logistic regression, together with end-to-end deep neural networks, have been implemented to detect intrusions, including command injection, denial of service, reconnaissance, and backdoor attacks, by capturing anomalous patterns. However, ML algorithms not only fall short in agile identification of intrusions with few samples but also fail to adapt to new data or environments. This paper introduces hyperdimensional computing (HDC) as a new cognitive computing paradigm that mimics brain functionality to detect intrusions in IIoT systems. HDC encodes real-time data into a high-dimensional representation, allowing for ultra-efficient learning and analysis with limited samples and a few passes. Additionally, we incorporate the concept of regenerating brain cells into hyperdimensional computing to further improve learning capability and reduce the required memory. Experimental results on the WUSTL-IIOT-2021 dataset show that HDC detects intrusions with an accuracy of 92.6%, which is superior to multi-layer perceptron (40.2%), support vector machine (72.9%), logistic regression (84.2%), and Gaussian process classification (89.1%), while requiring only 300 data samples and 5 training iterations.
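The few-pass learning flow the abstract describes can be sketched generically: encode each sample into a hypervector, keep one prototype per class, and make mistake-driven updates over a few epochs. The record-style encoding, dimensionality, and feature count below are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
D = 2048          # hypervector dimensionality (illustrative)
N_FEATURES = 8    # stand-in for the dataset's feature count

# One fixed bipolar hypervector per input feature (a generic HDC
# encoding, not the paper's exact scheme).
projection = rng.choice([-1.0, 1.0], size=(N_FEATURES, D))

def encode(x):
    # Weight each feature hypervector by the feature value and bundle.
    return np.sign(x @ projection)

def train(X, y, n_classes, epochs=5):
    """Mistake-driven prototype training in a few passes, as the
    abstract emphasizes."""
    protos = np.zeros((n_classes, D))
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            h = encode(xi)
            pred = int(np.argmax(protos @ h))
            if pred != yi:                 # update only on mistakes
                protos[yi] += h
                protos[pred] -= h
    return protos

def predict(protos, x):
    return int(np.argmax(protos @ encode(x)))
```

Because training is a handful of additions and subtractions per sample, the model can be refreshed online as traffic drifts, which is the adaptability argument the abstract makes against retraining-heavy ML baselines.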
  3. Precise seizure identification plays a vital role in understanding cortical connectivity and informing treatment decisions. Yet the manual diagnostic methods for epileptic seizures are both labor-intensive and highly specialized. In this study, we propose a Hyperdimensional Computing (HDC) classifier for accurate and efficient multi-type seizure classification. Whereas previous seizure-analysis efforts using HDC were limited to binary detection (seizure or no seizure), our work breaks new ground by utilizing HDC to classify seizures into multiple distinct types. HDC offers significant advantages, such as lower memory requirements, a reduced hardware footprint for wearable devices, and decreased computational complexity. Owing to these attributes, HDC can be an alternative to traditional machine learning methods, making it a practical and efficient solution, particularly in resource-limited scenarios or applications involving wearable devices. We evaluated the proposed technique on the latest version of the TUH EEG Seizure Corpus (TUSZ), and the evaluation results demonstrate noteworthy performance, achieving a weighted F1 score of 94.6%. This outcome matches, or even exceeds, the performance achieved by state-of-the-art traditional machine learning methods.
  4. Brain-inspired HyperDimensional Computing (HDC) is an alternative computation model based on the observation that the human brain operates on high-dimensional representations of data. Existing HDC solutions rely on expensive pre-processing algorithms for feature extraction. In this paper, we propose StocHD, a novel end-to-end hyperdimensional system that supports accurate, efficient, and robust learning over raw data. StocHD expands HDC functionality to the computing area by mathematically defining stochastic arithmetic over HDC hypervectors. StocHD enables an entire learning application (including the feature extractor) to process data using the HDC representation, enabling uniform, efficient, robust, and highly parallel computation. We also propose a novel fully digital and scalable Processing In-Memory (PIM) architecture that exploits the memory-centric nature of HDC to support extensively parallel computation.
  5. Abstract: Hyperdimensional Computing (HDC) is a neurally-inspired computation model based on the observation that the human brain operates on high-dimensional representations of data, called hypervectors. Although HDC is significantly powerful in reasoning over and association of abstract information, it is weak at feature extraction from complex data such as images and video. As a result, most existing HDC solutions rely on expensive pre-processing algorithms for feature extraction. In this paper, we propose StocHD, a novel end-to-end hyperdimensional system that supports accurate, efficient, and robust learning over raw data. Unlike prior work that used HDC only for learning tasks, StocHD expands HDC functionality to the computing area by mathematically defining stochastic arithmetic over HDC hypervectors. StocHD enables an entire learning application (including the feature extractor) to process data using the HDC representation, enabling uniform, efficient, robust, and highly parallel computation. We also propose a novel fully digital and scalable Processing In-Memory (PIM) architecture that exploits the memory-centric nature of HDC to support extensively parallel computation. Our evaluation over a wide range of classification tasks shows that StocHD provides, on average, 3.3x and 6.4x (52.3x and 143.5x) faster computation and higher energy efficiency compared to a state-of-the-art HDC algorithm running on PIM (an NVIDIA GPU), while providing 16x higher computational robustness.
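The idea of defining arithmetic over hypervectors can be sketched with classic bipolar stochastic coding, where the elementwise product of long, independent ±1 streams multiplies the values they encode. This is a generic stochastic-computing construction under illustrative parameters, not StocHD's actual formulation:

```python
import numpy as np

rng = np.random.default_rng(2)
D = 100_000   # longer streams give tighter estimates (illustrative)

def to_hv(v):
    """Encode v in [-1, 1] as a bipolar hypervector whose mean
    approximates v (classic bipolar stochastic coding)."""
    p = (v + 1.0) / 2.0                  # probability of a +1 element
    return np.where(rng.random(D) < p, 1, -1)

def hv_mul(a, b):
    # Elementwise product of independent streams multiplies the encoded
    # values (XNOR in 0/1 notation): fully parallel, no carry chains.
    return a * b

def decode(h):
    return float(h.mean())

a, b = to_hv(0.5), to_hv(-0.4)
approx = decode(hv_mul(a, b))   # close to 0.5 * -0.4 = -0.2
```

Each output element depends only on the corresponding input elements, which is why this style of arithmetic maps so naturally onto the bitwise-parallel PIM architecture the abstract describes.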