Title: Applicability of Hyperdimensional Computing for Seizure Prediction Using LBP and PSD Features from iEEG
Hyperdimensional computing (HDC) is an attractive approach for time-series classification. HDC classifiers are well suited to one-shot and few-shot learning, require few computational resources, and have been demonstrated to be useful for seizure detection. This paper investigates subject-specific seizure prediction using HDC from intracranial electroencephalogram (iEEG) recordings in the publicly available Kaggle dataset. Compared to seizure detection (interictal vs. ictal), seizure prediction (interictal vs. preictal) is a more challenging problem. Two HDC-based encoding strategies are explored: local binary pattern (LBP) and power spectral density (PSD). The average performance of HDC classifiers using the two encoding approaches is computed with leave-one-seizure-out cross-validation. Experimental results show that the PSD method, using a small number of features selected by minimum redundancy maximum relevance (mRMR), achieves better seizure prediction performance than the LBP method on the training and validation data.
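As a rough illustration of the LBP-based encoding strategy described in the abstract, the sketch below binarizes consecutive sample differences of a signal into LBP codes, maps each code to a random bipolar hypervector, and bundles a window into a single query hypervector. The hypervector dimension, LBP width, window length, and the single-class "training" loop are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def lbp_codes(signal, width=6):
    """Binarize consecutive sample differences, then pack each
    window of `width` bits into an integer LBP code."""
    bits = (np.diff(signal) > 0).astype(int)
    return [int("".join(map(str, bits[i:i + width])), 2)
            for i in range(len(bits) - width + 1)]

D = 1000  # hypervector dimension (HDC papers often use ~10,000)
# One random bipolar hypervector per possible 6-bit LBP code.
item_memory = rng.choice([-1, 1], size=(2 ** 6, D))

def encode_window(signal):
    """Bundle (sum, then sign) the code hypervectors of one window."""
    return np.sign(item_memory[lbp_codes(signal)].sum(axis=0))

# "Train" a class hypervector by bundling a few encoded windows,
# then score a query window by cosine similarity (toy random data).
class_hv = np.sign(sum(encode_window(rng.standard_normal(64)) for _ in range(5)))
query = encode_window(rng.standard_normal(64))
similarity = float(query @ class_hv) / (np.linalg.norm(query) * np.linalg.norm(class_hv))
```

In a real pipeline one class hypervector would be bundled per label (interictal, preictal) and a window would be assigned to the most similar class.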
Award ID(s):
1814759
PAR ID:
10488927
Author(s) / Creator(s):
;
Publisher / Repository:
IEEE
Date Published:
Journal Name:
Proc. of 2023 Midwest Symposium on Circuits and Systems
ISBN:
979-8-3503-0210-3
Page Range / eLocation ID:
1065 to 1069
Subject(s) / Keyword(s):
Hyperdimensional computing (HDC), local binary pattern (LBP), power spectral density (PSD), minimum redundancy maximum relevance (mRMR), seizure prediction
Format(s):
Medium: X
Location:
Tempe, AZ, USA
Sponsoring Org:
National Science Foundation
More Like this
  1. Hyperdimensional computing (HDC) has drawn significant attention due to its comparable performance with traditional machine learning techniques. HDC classifiers achieve high parallelism, consume less power, and are well suited for edge applications. Encoding approaches such as record-based encoding and N-gram-based encoding have been used to generate features from input signals and images. These features are mapped to hypervectors and are input to HDC classifiers. This paper considers the group-based classification of graphs constructed from time series. The graph is encoded to a hypervector, and the graph hypervectors are used to train the HDC classifier. This paper applies HDC to brain graph classification using fMRI data. Both record-based encoding and GrapHD encoding are explored. Experimental results show that 1) among HDC encoding approaches, GrapHD encoding achieves comparable classification performance while requiring significantly less memory than record-based encoding, and 2) exploiting sparsity achieves higher performance than fully connected brain graphs. Both a threshold strategy and the minimum redundancy maximum relevance (mRMR) algorithm are employed to generate sub-graphs, where mRMR achieves higher performance for three binary classification problems: emotion vs. gambling, emotion vs. no-task, and gambling vs. no-task. The corresponding AUCs are 0.87, 0.88, and 0.88, respectively.
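A minimal sketch of graph-to-hypervector encoding in the spirit of GrapHD (the sizes and the specific bind/bundle choices below are assumptions): each node gets a random bipolar hypervector, each edge is encoded by binding (elementwise multiplication) its two endpoints, and the edge hypervectors are bundled by summation and binarized.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 2000          # hypervector dimension
n_nodes = 8       # toy brain-graph size

node_hv = rng.choice([-1, 1], size=(n_nodes, D))

def encode_graph(edges):
    """Bind the two node hypervectors of each edge (elementwise
    multiply), bundle the edge hypervectors by summation, and
    binarize with the sign function."""
    acc = np.zeros(D)
    for u, v in edges:
        acc += node_hv[u] * node_hv[v]
    return np.sign(acc)

g1 = encode_graph([(0, 1), (1, 2), (2, 3)])
g2 = encode_graph([(0, 1), (1, 2), (2, 3)])   # same edge set
g3 = encode_graph([(4, 5), (5, 6), (6, 7)])   # disjoint edge set

same = float(g1 @ g2) / D   # identical graphs: similarity 1.0
diff = float(g1 @ g3) / D   # unrelated graphs: near 0
```

Only the node item memory and one hypervector per graph need to be stored, which is the source of the memory savings the abstract reports relative to record-based encoding.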
  2. Precise seizure identification plays a vital role in understanding cortical connectivity and informing treatment decisions. Yet manual diagnosis of epileptic seizures is both labor-intensive and highly specialized. In this study, we propose a hyperdimensional computing (HDC) classifier for accurate and efficient multi-type seizure classification. While previous seizure analysis efforts using HDC have been limited to binary detection (seizure or no seizure), our work breaks new ground by utilizing HDC to classify seizures into multiple distinct types. HDC offers significant advantages, such as lower memory requirements, a reduced hardware footprint for wearable devices, and decreased computational complexity. Due to these attributes, HDC can be an alternative to traditional machine learning methods, making it a practical and efficient solution, particularly in resource-limited scenarios or applications involving wearable devices. We evaluated the proposed technique on the latest version of the TUH EEG Seizure Corpus (TUSZ), and the evaluation results demonstrate noteworthy performance, achieving a weighted F1 score of 94.6%. This outcome matches, or even exceeds, the performance achieved by state-of-the-art traditional machine learning methods.
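Multi-class HDC inference generalizes binary detection in a straightforward way: one class hypervector per seizure type, with a query assigned to the most similar class. The sketch below (toy dimensions; the class prototypes stand in for bundled training data, which is an assumption) shows the nearest-class-hypervector rule by cosine similarity.

```python
import numpy as np

rng = np.random.default_rng(2)
D, n_classes = 2000, 4   # e.g. 4 seizure types (toy sizes)

# In training, each class hypervector would be the bundled sum of
# its encoded examples; random bipolar prototypes stand in here.
class_prototypes = rng.choice([-1, 1], size=(n_classes, D))

def classify(query_hv):
    """Return the class whose hypervector is most similar (cosine)."""
    sims = (class_prototypes @ query_hv
            / np.linalg.norm(query_hv)
            / np.linalg.norm(class_prototypes, axis=1))
    return int(np.argmax(sims))

# A noisy copy of class 2's hypervector (about 10% of elements
# flipped) should still map back to class 2.
noise = rng.choice([-1, 1], size=D, p=[0.1, 0.9])
predicted = classify(class_prototypes[2] * noise)
```

Robustness to elementwise noise is exactly what makes this nearest-prototype rule cheap and hardware-friendly: inference is one matrix-vector product and an argmax.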
  3. Hyperdimensional (HD) computing holds promise for classifying two groups of data. This paper explores seizure detection from electroencephalogram (EEG) recordings of subjects with epilepsy using HD computing based on power spectral density (PSD) features. Publicly available intracranial EEG (iEEG) data collected from 4 dogs and 8 human patients in the Kaggle seizure detection contest are used. Two classification methods are explored. First, a few ranked PSD features from a small number of channels, selected in a prior classification, are used for HD classification. Second, all PSD features extracted from all channels are used. It is shown that for about half the subjects the small feature set outperforms the full feature set in the context of HD classification, while for the other half the full feature set outperforms the small one. HD classification achieves above 95% accuracy for six of the 12 subjects, and between 85% and 95% accuracy for four subjects. For two subjects, the classification accuracy using HD computing is not as good as that of classical approaches such as support vector machine classifiers.
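PSD features of the kind used here are commonly computed with Welch's method. The sketch below extracts band-power features from one synthetic single-channel clip with SciPy; the sampling rate, clip length, and band edges are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np
from scipy.signal import welch

fs = 400   # assumed sampling rate; Kaggle iEEG rates vary by subject
rng = np.random.default_rng(3)
segment = rng.standard_normal(fs * 10)   # one 10 s single-channel clip

# Welch PSD with 1 s segments (1 Hz resolution), then average power
# in standard clinical bands.
freqs, psd = welch(segment, fs=fs, nperseg=fs)
bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 100)}
features = {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in bands.items()}
```

Per-channel band powers such as these form the candidate feature pool from which a ranking scheme (e.g. mRMR, as in the main paper above) selects a small subset.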
  4. Objective: Anterior temporal lobectomy (ATL) is a widely performed and successful intervention for drug-resistant temporal lobe epilepsy (TLE). However, up to one third of patients experience seizure recurrence within 1 year after ATL. Despite the extensive literature on presurgical electroencephalography (EEG) and magnetic resonance imaging (MRI) abnormalities to prognosticate seizure freedom following ATL, the value of quantitative analysis of visually reviewed normal interictal EEG in such prognostication remains unclear. In this retrospective multicenter study, we investigate whether machine learning analysis of normal interictal scalp EEG studies can inform the prediction of postoperative seizure freedom outcomes in patients who have undergone ATL. Methods: We analyzed normal presurgical scalp EEG recordings from 41 Mayo Clinic (MC) and 23 Cleveland Clinic (CC) patients. We used an unbiased automated algorithm to extract eyes-closed awake epochs from scalp EEG studies that were free of any epileptiform activity and then extracted spectral EEG features representing (a) spectral power and (b) interhemispheric spectral coherence in frequencies between 1 and 25 Hz across several brain regions. We analyzed the differences between the seizure-free and non-seizure-free patients and employed a Naïve Bayes classifier using multiple spectral features to predict surgery outcomes. We trained the classifier using a leave-one-patient-out cross-validation scheme within the MC data set and then tested using the out-of-sample CC data set. Finally, we compared the predictive performance of normal scalp EEG-derived features against MRI abnormalities. Results: We found that several spectral power and coherence features showed significant differences correlated with surgical outcomes and that they were most pronounced in the 10–25 Hz range. The Naïve Bayes classification based on those features predicted 1-year seizure freedom following ATL with area under the curve (AUC) values of 0.78 and 0.76 for the MC and CC data sets, respectively. Subsequent analyses revealed that (a) interhemispheric spectral coherence features in the 10–25 Hz range provided better predictability than other combinations and (b) normal scalp EEG-derived features provided superior and potentially distinct predictive value when compared with MRI abnormalities (>10% higher F1 score). Significance: These results support that quantitative analysis of even a normal presurgical scalp EEG may help prognosticate seizure freedom following ATL in patients with drug-resistant TLE. Although the mechanism for this result is not known, the scalp EEG spectral and coherence properties predicting seizure freedom may represent activity arising from the neocortex or from the networks responsible for temporal lobe seizure generation within vs. outside the margins of an ATL.
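The feature-plus-classifier pipeline this abstract describes can be sketched with SciPy and scikit-learn: interhemispheric coherence in the 10–25 Hz band for a homologous channel pair, fed to a Gaussian Naïve Bayes model. Everything below is a toy stand-in (synthetic signals, one feature, made-up labels), not the study's data or full feature set.

```python
import numpy as np
from scipy.signal import coherence
from sklearn.naive_bayes import GaussianNB

fs = 256   # assumed scalp EEG sampling rate
rng = np.random.default_rng(5)

def coherence_feature(left, right):
    """Mean interhemispheric coherence in the 10-25 Hz range for one
    homologous channel pair (the band the study found most useful)."""
    f, cxy = coherence(left, right, fs=fs, nperseg=fs)
    return cxy[(f >= 10) & (f <= 25)].mean()

# Toy training set: one coherence feature per synthetic "patient".
X = np.array([[coherence_feature(rng.standard_normal(fs * 4),
                                 rng.standard_normal(fs * 4))]
              for _ in range(20)])
y = np.array([0] * 10 + [1] * 10)   # 1 = seizure-free (made-up labels)

clf = GaussianNB().fit(X, y)
probs = clf.predict_proba(X)        # per-patient outcome probabilities
```

In the study itself, training used leave-one-patient-out cross-validation on the MC cohort and testing used the held-out CC cohort.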
  5. Emerging brain-inspired hyperdimensional computing (HDC) algorithms are vulnerable to timing and soft errors in the associative memory used to store high-dimensional data representations. Such errors can significantly degrade HDC performance. A key challenge is error correction after an error in computation is detected. This work presents two novel error resilience frameworks for hyperdimensional computing systems. The first, called the checksum hypervector encoding (CHE) framework, relies on the creation of a single additional hypervector that is a checksum of all the class hypervectors of the HDC system. For error resilience, elementwise validation of the checksum property is performed, and those elements across all class vectors for which the property fails are removed from consideration. For an HDC system with K class hypervectors of dimension D, the second framework, cross-hypervector clustering (CHC), clusters the D K-dimensional vectors formed by taking the i-th element of each of the K class hypervectors, 1 ≤ i ≤ D. Statistical properties of these vector clusters are checked prior to each hypervector query, and all elements corresponding to statistical outlier vectors are removed as before. The choice of framework is dictated by the complexity of the dataset to classify. Up to three orders of magnitude better resilience to errors than the state of the art is demonstrated across multiple HDC high-dimensional encoding (representation) systems.
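The CHE idea can be sketched in a few lines (sizes and the simple sum-checksum below are illustrative assumptions about the framework, not its exact definition): store an elementwise checksum of the class hypervectors, and before a query, drop every element position where the stored class vectors no longer sum to the checksum.

```python
import numpy as np

rng = np.random.default_rng(4)
K, D = 4, 1000   # K class hypervectors of dimension D

class_hvs = rng.choice([-1, 1], size=(K, D)).astype(float)
checksum = class_hvs.sum(axis=0)   # CHE: elementwise checksum hypervector

# Inject a memory error: flip one element of one class hypervector.
corrupted = class_hvs.copy()
corrupted[1, 123] = -corrupted[1, 123]

# Validation: element positions where the checksum property fails
# are excluded from every class hypervector before similarity queries.
valid = corrupted.sum(axis=0) == checksum
masked = corrupted[:, valid]
```

Because HDC similarity is distributed across thousands of near-independent elements, discarding the few positions that fail validation costs almost no accuracy, which is what makes this cheap masking strategy viable.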