Search for: All records

Award ID contains: 1814759

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the publisher's embargo period.

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. Hyperdimensional computing (HDC) has drawn significant attention because its performance is comparable to that of traditional machine learning techniques. HDC classifiers achieve high parallelism, consume little power, and are well suited to edge applications. Encoding approaches such as record-based encoding and N-gram-based encoding have been used to generate features from input signals and images; these features are mapped to hypervectors and input to HDC classifiers. This paper considers the group-based classification of graphs constructed from time series. Each graph is encoded into a hypervector, and the graph hypervectors are used to train the HDC classifier. The paper applies HDC to brain graph classification using fMRI data, exploring both record-based encoding and GrapHD encoding. Experimental results show that (1) among HDC encoding approaches, GrapHD encoding achieves classification performance comparable to record-based encoding while requiring significantly less memory, and (2) exploiting sparsity achieves higher performance than using fully connected brain graphs. Both a threshold strategy and the minimum redundancy maximum relevance (mRMR) algorithm are employed to generate sub-graphs; mRMR achieves higher performance for the three binary classification problems, emotion vs. gambling, emotion vs. no-task, and gambling vs. no-task, with corresponding AUCs of 0.87, 0.88, and 0.88, respectively.
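    To make the encoding concrete, here is a minimal numpy sketch of GrapHD-style graph-to-hypervector encoding; the dimension D, the bipolar representation, and the weighting scheme are illustrative assumptions, not the paper's exact configuration.

    ```python
    import numpy as np

    D = 10000                          # hypervector dimension (illustrative)
    rng = np.random.default_rng(0)

    def random_hv():
        """Random bipolar hypervector in {-1, +1}^D."""
        return rng.choice([-1, 1], size=D)

    def encode_graph(adjacency, node_hvs):
        """Bundle edge bindings into a single graph hypervector.

        Each edge (i, j) contributes the element-wise binding of its two
        node hypervectors, scaled by the edge weight; all edge terms are
        summed (bundled) and binarized back to bipolar form.
        """
        n = adjacency.shape[0]
        graph_hv = np.zeros(D)
        for i in range(n):
            for j in range(i + 1, n):
                if adjacency[i, j] != 0:
                    graph_hv += adjacency[i, j] * (node_hvs[i] * node_hvs[j])
        return np.sign(graph_hv)

    # Example: a small brain graph with random connectivity weights
    n_nodes = 4
    node_hvs = [random_hv() for _ in range(n_nodes)]
    upper = np.triu(rng.random((n_nodes, n_nodes)), k=1)
    A = upper + upper.T                # symmetric weighted adjacency
    graph_hv = encode_graph(A, node_hvs)
    ```

    Under this scheme, a thresholded or mRMR-selected sub-graph simply corresponds to zeroing entries of the adjacency matrix before encoding.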
  2. Hyperdimensional computing (HDC) is attractive for time-series classification: HDC classifiers are well suited to one-shot or few-shot learning, require fewer computational resources, and have been demonstrated to be useful for seizure detection. This paper investigates subject-specific seizure prediction using HDC on intracranial electroencephalogram (iEEG) recordings from the publicly available Kaggle dataset. Compared to seizure detection (interictal vs. ictal), seizure prediction (interictal vs. preictal) is a more challenging problem. Two HDC-based encoding strategies are explored: local binary pattern (LBP) and power spectral density (PSD). The average performance of HDC classifiers using the two encoding approaches is computed using leave-one-seizure-out cross-validation. Experimental results show that the PSD method, using a small number of features selected by minimum redundancy maximum relevance (mRMR), achieves better seizure prediction performance than the LBP method on the training and validation data.
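    As a concrete illustration of the PSD encoding path, the sketch below extracts log band-power features from one iEEG window with scipy's Welch estimator; the sampling rate, band edges, and window length are illustrative assumptions, not the paper's exact configuration.

    ```python
    import numpy as np
    from scipy.signal import welch

    FS = 400                                    # sampling rate in Hz (assumed)
    BANDS = [(1, 4), (4, 8), (8, 13), (13, 30), (30, 70)]  # delta..gamma (assumed)

    def psd_band_features(window):
        """Log band power per (channel, band) for a window shaped
        (n_channels, n_samples)."""
        freqs, pxx = welch(window, fs=FS, nperseg=FS)   # per-channel PSD
        feats = []
        for lo, hi in BANDS:
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(np.log(pxx[:, mask].sum(axis=1) + 1e-12))
        return np.concatenate(feats)            # n_channels * n_bands features

    # Example: 16 channels, 10-second window of synthetic data
    x = np.random.randn(16, 10 * FS)
    features = psd_band_features(x)             # 16 * 5 = 80 features
    ```

    mRMR then ranks these features, and only the top few are mapped to hypervectors for classification.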
  3. Hyperdimensional (HD) computing is an emerging brain-inspired paradigm used for machine learning classification tasks. It manipulates ultra-long vectors, called hypervectors, using simple operations, which allows for fast learning, energy efficiency, noise tolerance, and a highly parallel distributed framework. HD computing has shown significant promise in the area of biological signal classification. This paper addresses group-specific premature ventricular contraction (PVC) beat detection with HD computing using data from the MIT-BIH arrhythmia database. Temporal, heart rate variability (HRV), and spectral features are extracted, and minimum redundancy maximum relevance (mRMR) is used to rank and select features for classification. Three encoding approaches are explored for mapping the features into HD space. The HD computing classifiers achieve a PVC beat detection accuracy of 97.7%, compared to 99.4% achieved by more computationally complex methods such as convolutional neural networks (CNNs).
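    Since mRMR recurs throughout these papers, a minimal greedy sketch follows; it uses sklearn's mutual information for relevance and absolute Pearson correlation as the redundancy proxy, which is one common mRMR variant and not necessarily the exact estimator used in the paper.

    ```python
    import numpy as np
    from sklearn.feature_selection import mutual_info_classif

    def mrmr_rank(X, y, k):
        """Greedily pick k feature indices, maximizing relevance to y
        minus mean redundancy with the already-selected features."""
        relevance = mutual_info_classif(X, y, random_state=0)
        selected, remaining = [], list(range(X.shape[1]))
        for _ in range(k):
            def score(f):
                if not selected:
                    return relevance[f]
                redundancy = np.mean([abs(np.corrcoef(X[:, f], X[:, s])[0, 1])
                                      for s in selected])
                return relevance[f] - redundancy
            best = max(remaining, key=score)
            selected.append(best)
            remaining.remove(best)
        return selected

    # Example: rank 5 of 20 synthetic features
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    y = (X[:, 3] + 0.5 * X[:, 7] > 0).astype(int)
    print(mrmr_rank(X, y, 5))                   # indices 3 and 7 rank early
    ```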
  4. Hyperdimensional (HD) computing is a brain-inspired form of computing based on the manipulation of high-dimensional vectors. Offering robust data representation and relatively fast learning, HD computing is a promising candidate for energy-efficient classification of biological signals. This paper describes the application of HD computing-based machine learning to the classification of biological gender from resting-state and task functional magnetic resonance imaging (fMRI) data from the publicly available Human Connectome Project (HCP). The developed HD algorithm derives predictive features through mean dynamic functional connectivity (dFC) analysis. Record-based encoding is employed to map the features onto hyperdimensional space. Using adaptive retraining techniques, the HD computing-based classifier achieves an average biological gender classification accuracy of 87%, compared to 84% achieved by an edge entropy measure.
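    A minimal sketch of record-based encoding follows, assuming bipolar hypervectors, 16 quantization levels, and correlated level hypervectors; all of these parameter choices are illustrative.

    ```python
    import numpy as np

    D, LEVELS = 10000, 16              # dimension and levels (assumed)
    rng = np.random.default_rng(1)

    def make_level_hvs():
        """Correlated level hypervectors: flip a fixed fraction of
        components per step so nearby levels remain similar."""
        hvs = [rng.choice([-1, 1], size=D)]
        flips = D // (2 * (LEVELS - 1))
        for _ in range(LEVELS - 1):
            hv = hvs[-1].copy()
            hv[rng.choice(D, size=flips, replace=False)] *= -1
            hvs.append(hv)
        return hvs

    def record_encode(features, pos_hvs, lvl_hvs, lo, hi):
        """Bind each feature's position hypervector with the hypervector
        of its quantized value, then bundle all bindings."""
        q = np.clip(((features - lo) / (hi - lo) * (LEVELS - 1)).astype(int),
                    0, LEVELS - 1)
        bound = [pos_hvs[i] * lvl_hvs[q[i]] for i in range(len(features))]
        return np.sign(np.sum(bound, axis=0))

    # Example: encode 80 dFC-style features scaled to [0, 1]
    features = rng.random(80)
    pos_hvs = [rng.choice([-1, 1], size=D) for _ in range(80)]
    sample_hv = record_encode(features, pos_hvs, make_level_hvs(), 0.0, 1.0)
    ```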
  5. Hyperdimensional (HD) computing holds promise for two-group classification. This paper explores seizure detection from the electroencephalogram (EEG) of subjects with epilepsy using HD computing based on power spectral density (PSD) features. Publicly available intracranial EEG (iEEG) data collected from 4 dogs and 8 human patients in the Kaggle seizure detection contest are used. Two classification methods are explored. First, a small number of top-ranked PSD features from a few channels, selected by a prior classification, are used for HD classification. Second, all PSD features extracted from all channels are used. For about half the subjects, the small feature set outperforms the full feature set; for the other half, the reverse holds. HD classification achieves above 95% accuracy for six of the 12 subjects and between 85% and 95% accuracy for four subjects. For the remaining two subjects, the classification accuracy of HD computing is not as good as that of classical approaches such as support vector machine classifiers.
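    The classification stage shared by these studies can be sketched as follows: class prototypes are bundled from encoded training hypervectors, a query goes to the most similar prototype, and a simple retraining pass (in the spirit of the adaptive retraining mentioned above) nudges prototypes on misclassified samples. The epoch count and the random stand-in data are illustrative.

    ```python
    import numpy as np

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    def train(encoded, labels):
        """Bundle each class's hypervectors into an integer prototype."""
        return {c: encoded[labels == c].sum(axis=0) for c in np.unique(labels)}

    def classify(hv, protos):
        return max(protos, key=lambda c: cosine(hv, protos[c]))

    def retrain(protos, encoded, labels, epochs=5):
        """On each mistake, add the sample to the true class prototype
        and subtract it from the wrongly predicted one."""
        for _ in range(epochs):
            for hv, c in zip(encoded, labels):
                pred = classify(hv, protos)
                if pred != c:
                    protos[c] += hv
                    protos[pred] -= hv
        return protos

    # Illustrative use with random stand-in hypervectors
    rng = np.random.default_rng(2)
    X = rng.choice([-1, 1], size=(40, 10000)).astype(float)
    y = np.repeat([0, 1], 20)
    protos = retrain(train(X, y), X, y)
    print(classify(X[0], protos))
    ```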
  6. This paper addresses the design of systolic-array accelerators for training neural networks using a novel gradient interleaving approach. Training a neural network involves backpropagation of error and computation of gradients with respect to the activations and the weights. It is shown that the gradient with respect to the activations can be computed using a weight-stationary systolic array, while the gradient with respect to the weights can be computed using an output-stationary systolic array. The novelty of the proposed approach lies in interleaving the computations of these two gradients on the same configurable systolic array. This enables reuse of variables between the two computations and eliminates unnecessary memory accesses. The proposed approach leads to savings of 1.4–2.2× in cycle count and 1.9× in memory accesses, thus reducing latency and energy consumption.
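    A minimal numpy sketch of the two gradients being interleaved; the layer shapes are illustrative, and the numpy products only model what the weight-stationary and output-stationary systolic arrays compute.

    ```python
    import numpy as np

    # For a layer z = W @ a, backpropagation needs two gradients that
    # share operands (W, a, and the output error delta_z):
    #   (1) w.r.t. the layer input:  delta_a = W.T @ delta_z
    #       -- matrix-vector product, suited to a weight-stationary array
    #   (2) w.r.t. the weights:      dW = delta_z @ a.T
    #       -- outer product, suited to an output-stationary array
    # Interleaving both on one array lets the shared operands be reused.
    rng = np.random.default_rng(3)
    n_out, n_in = 4, 6
    W = rng.standard_normal((n_out, n_in))
    a = rng.standard_normal((n_in, 1))          # layer input activations
    delta_z = rng.standard_normal((n_out, 1))   # error at the layer output

    delta_a = W.T @ delta_z                     # gradient w.r.t. activations
    dW = delta_z @ a.T                          # gradient w.r.t. weights
    assert delta_a.shape == (n_in, 1) and dW.shape == W.shape
    ```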
  7. Hyperdimensional (HD) computing is built upon a unique data type, the hypervector, whose dimension is typically in the tens of thousands. Proposed to solve cognitive tasks, HD computing aims to compute similarity among its data. Data transformation is realized by three operations: addition, multiplication, and permutation. The ultra-wide data representation introduces redundancy against noise; since information is evenly distributed over every bit of a hypervector, HD computing is inherently robust. Additionally, the nature of these three operations gives HD computing fast learning, high energy efficiency, and acceptable accuracy in learning and classification tasks. This paper introduces the background of HD computing and reviews data representation, data transformation, and similarity measurement. The near-orthogonality of random vectors in high dimensions presents opportunities for flexible computing. To balance the tradeoff between accuracy and efficiency, strategies include, but are not limited to, encoding, retraining, binarization, and hardware acceleration. Evaluations indicate that HD computing shows great potential for problems whose data take the form of letters, signals, and images. HD computing shows particular promise for replacing conventional machine learning algorithms as a lightweight classifier in the Internet of Things (IoT).
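    The three operations and the similarity measure described in this review can be sketched in a few lines of numpy; the bipolar representation and the dimension are illustrative choices.

    ```python
    import numpy as np

    D = 10000                          # illustrative dimension
    rng = np.random.default_rng(4)
    x, y = rng.choice([-1, 1], size=(2, D))

    bind = x * y                       # multiplication: dissimilar to x and y
    bundle = np.sign(x + y + bind)     # addition: similar to its inputs
    perm = np.roll(x, 1)               # permutation: encodes sequence order

    def sim(a, b):
        """Normalized similarity in [-1, 1]."""
        return (a @ b) / D

    print(round(sim(x, y), 3))         # near 0: random hypervectors are
    print(round(sim(x, perm), 3))      # quasi-orthogonal in high dimensions
    print(round(sim(x, bundle), 3))    # clearly positive: bundling preserves inputs
    ```

    The near-zero similarity of independent random hypervectors is the orthogonality property the review highlights; it is what lets many items share one vector space with little interference.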