Brain-inspired Hyperdimensional (HD) computing models cognition by exploiting the statistical properties of high-dimensional vectors instead of the numeric values used in contemporary processors. A fundamental weakness of existing HD computing algorithms is that they require floating point models to provide acceptable accuracy on realistic classification problems, and working with floating point values significantly increases the HD computation cost. To address this issue, we propose QuantHD, a novel framework for quantizing the HD computing model during training. QuantHD enables HD computing to work with a low-cost quantized model (binary or ternary) while providing accuracy similar to the floating point model. We accordingly propose an FPGA implementation that accelerates HD computing in both the training and inference phases. We evaluate the accuracy and efficiency of QuantHD on various real-world applications and observe that it achieves on average 17.2% higher accuracy than existing binarized HD computing algorithms with a similar computation cost. In terms of efficiency, the QuantHD FPGA implementation achieves on average 42.3× better energy efficiency and 4.7× speedup during inference (34.1× and 4.1× during training) compared to state-of-the-art HD computing algorithms.
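To make the binary/ternary model idea concrete, here is a minimal, hypothetical NumPy sketch (not QuantHD's released code) of ternarizing a single trained class hypervector; the 30% zero band and the dimensionality are illustrative assumptions, not the paper's exact rule.

```python
import numpy as np

def ternarize(class_hv: np.ndarray, zero_fraction: float = 0.3) -> np.ndarray:
    """Map a floating-point class hypervector to {-1, 0, +1}.

    Elements whose magnitude falls in the smallest `zero_fraction` of
    magnitudes are set to 0; the rest keep only their sign. The 30% zero
    band is an illustrative choice, not QuantHD's exact quantization rule.
    """
    threshold = np.quantile(np.abs(class_hv), zero_fraction)
    quantized = np.sign(class_hv)
    quantized[np.abs(class_hv) < threshold] = 0
    return quantized.astype(np.int8)

# Example: quantize an illustrative D = 10,000-dimensional class hypervector.
rng = np.random.default_rng(0)
float_model = rng.normal(size=10_000)
ternary_model = ternarize(float_model)
print(np.unique(ternary_model))  # -> [-1  0  1]
```

A binary model follows the same pattern with `zero_fraction = 0`, i.e. keeping only the sign of each element.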
DeepER-HD: An Error Resilient HyperDimensional Computing Framework with DNN Front-End for Feature Selection
Brain-inspired hyperdimensional (HD) computing models mimic cognition through combinatorial bindings of biological neuronal data represented by high-dimensional vectors and related operations. However, the efficacy of HD computing depends strongly on the input signal and the data features used to realize such bindings. In this paper, we propose a new HD-computing framework built from a co-trainable DNN-based feature-extraction pre-processor and a hyperdimensional computing system. When trained with restrictions on the ranges of hypervector elements for resilience to memory access errors, the framework achieves up to 135% accuracy improvement over baseline HD computing for error-free operation and up to three orders of magnitude improvement in error resilience compared to the state-of-the-art. Results for a range of applications, including image classification, face recognition, human activity recognition, and medical diagnosis, demonstrate the viability of the proposed ideas.
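As a rough illustration of the pipeline this abstract describes, the sketch below uses a fixed linear map as a stand-in for the co-trained DNN feature extractor, a random-projection HD encoder, and a clamp on class-hypervector elements to model the restriction on element ranges. All sizes and the bound `B` are assumed values for illustration, not DeepER-HD's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

D = 10_000        # hypervector dimensionality (illustrative)
N_FEATURES = 64   # DNN feature width (illustrative)
RAW_DIM = 784     # raw input size, e.g. a flattened image (illustrative)
B = 16            # clamp bound on class-hypervector elements (assumed)

# Stand-in for the co-trained DNN feature extractor: a fixed linear map + tanh.
W_dnn = rng.normal(size=(N_FEATURES, RAW_DIM))

# Random-projection HD encoder: features -> bipolar hypervector.
projection = rng.choice([-1.0, 1.0], size=(D, N_FEATURES))

def encode(raw_sample: np.ndarray) -> np.ndarray:
    features = np.tanh(W_dnn @ raw_sample)   # DNN front-end (placeholder)
    return np.sign(projection @ features)    # bipolar sample hypervector

def train_class_hv(samples: np.ndarray) -> np.ndarray:
    # Bundle (sum) the encoded training samples of one class, then clamp
    # every element to [-B, B] so a memory error on any single element
    # has bounded impact on the stored class hypervector.
    class_hv = np.sum([encode(x) for x in samples], axis=0)
    return np.clip(class_hv, -B, B)
```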
- Award ID(s):
- 2128419
- PAR ID:
- 10541882
- Publisher / Repository:
- IEEE
- Date Published:
- ISBN:
- 979-8-3503-6555-9
- Page Range / eLocation ID:
- 1 to 6
- Format(s):
- Medium: X
- Location:
- Maceio, Brazil
- Sponsoring Org:
- National Science Foundation
More Like this
-
Emerging brain-inspired hyperdimensional computing (HDC) algorithms are vulnerable to timing and soft errors in the associative memory used to store high-dimensional data representations. Such errors can significantly degrade HDC performance, and a key challenge is error correction after an error in computation is detected. This work presents two novel error resilience frameworks for hyperdimensional computing systems. The first, the checksum hypervector encoding (CHE) framework, relies on the creation of a single additional hypervector that is a checksum of all the class hypervectors of the HDC system. For error resilience, elementwise validation of the checksum property is performed, and those elements across all class vectors for which the property fails are removed from consideration. For an HDC system with K class hypervectors of dimension D, the second framework, cross-hypervector clustering (CHC), clusters the D K-dimensional vectors formed from the i-th element of each of the K HDC class hypervectors, 1 ≤ i ≤ D. Statistical properties of these vector clusters are checked prior to each hypervector query, and all elements of the K-dimensional vectors identified as statistical outliers are removed as before. The choice of framework is dictated by the complexity of the dataset to classify. Up to three orders of magnitude better error resilience than the state-of-the-art is demonstrated across multiple HDC high-dimensional encoding (representation) systems.
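A minimal sketch of the CHE idea as stated above, assuming integer class hypervectors stored as a (K, D) array: the checksum is the elementwise sum of the class hypervectors, and dimensions where the stored checksum no longer matches the recomputed sum are masked out of the similarity computation. Function names and the masking detail are assumptions for illustration.

```python
import numpy as np

def build_checksum(class_hvs: np.ndarray) -> np.ndarray:
    """class_hvs has shape (K, D); the checksum is the elementwise sum."""
    return class_hvs.sum(axis=0)

def valid_dimensions(class_hvs: np.ndarray, checksum: np.ndarray) -> np.ndarray:
    """Boolean mask of dimensions that still satisfy the checksum property.

    Dimensions where the (possibly corrupted) class vectors no longer sum
    to the stored checksum are dropped from the similarity computation.
    """
    return class_hvs.sum(axis=0) == checksum

def query(class_hvs: np.ndarray, checksum: np.ndarray, q: np.ndarray) -> int:
    valid = valid_dimensions(class_hvs, checksum)
    scores = class_hvs[:, valid] @ q[valid]   # dot product over valid dims only
    return int(np.argmax(scores))
```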
-
Hyperdimensional (HD) computing is a brain-inspired form of computing based on the manipulation of high-dimensional vectors. Offering robust data representation and relatively fast learning, HD computing is a promising candidate for energy-efficient classification of biological signals. This paper describes the application of HD computing-based machine learning to the classification of biological gender from resting-state and task functional magnetic resonance imaging (fMRI) data in the publicly available Human Connectome Project (HCP). The developed HD algorithm derives predictive features through mean dynamic functional connectivity (dFC) analysis, and record encoding is employed to map the features onto hyperdimensional space. Utilizing adaptive retraining techniques, the HD computing-based classifier achieves an average biological gender classification accuracy of 87%, compared to 84% achieved by the edge entropy measure.
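The record encoding mentioned here can be sketched as follows, assuming a bipolar representation, feature values pre-scaled to [0, 1], and an arbitrary number of quantization levels: each feature index gets a fixed random ID hypervector, each value is quantized to a level hypervector, and the bound ID/level pairs are bundled into one sample hypervector. Sizes and level count below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
D, N_FEATURES, N_LEVELS = 10_000, 32, 8   # illustrative sizes

id_hvs = rng.choice([-1, 1], size=(N_FEATURES, D))     # one ID HV per feature
level_hvs = rng.choice([-1, 1], size=(N_LEVELS, D))    # one HV per value level

def record_encode(feature_vector: np.ndarray) -> np.ndarray:
    """Bind each feature's ID with its quantized level HV and bundle."""
    # Quantize each feature value (assumed scaled to [0, 1]) to a level index.
    levels = np.clip((feature_vector * N_LEVELS).astype(int), 0, N_LEVELS - 1)
    bound = id_hvs * level_hvs[levels]    # elementwise binding, shape (N_FEATURES, D)
    return np.sign(bound.sum(axis=0))     # bundle and re-binarize (ties map to 0)
```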
-
Hyperdimensional computing (HD) is an emerging brain-inspired paradigm used for machine learning classification tasks. It manipulates ultra-long vectors (hypervectors) using simple operations, which allows for fast learning, energy efficiency, noise tolerance, and a highly parallel distributed framework. HD computing has shown significant promise in the area of biological signal classification. This paper addresses group-specific premature ventricular contraction (PVC) beat detection with HD computing using data from the MIT-BIH arrhythmia database. Temporal, heart rate variability (HRV), and spectral features are extracted, and minimal redundancy maximum relevance (mRMR) is used to rank and select features for classification. Three encoding approaches are explored for mapping the features into the HD space. The HD computing classifiers achieve a PVC beat detection accuracy of 97.7%, compared to 99.4% achieved by more computationally complex methods such as convolutional neural networks (CNNs).
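All of the classifiers above share the same associative-memory inference step: encode the query sample and return the class whose stored hypervector is most similar, typically by cosine similarity. A generic sketch of that step (not specific to any one of these papers) is:

```python
import numpy as np

def classify(query_hv: np.ndarray, class_hvs: np.ndarray) -> int:
    """Return the index of the class hypervector most similar to the query.

    class_hvs has shape (K, D); cosine similarity is the usual HDC metric.
    """
    norms = np.linalg.norm(class_hvs, axis=1) * np.linalg.norm(query_hv)
    sims = class_hvs @ query_hv / np.maximum(norms, 1e-12)
    return int(np.argmax(sims))
```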