-
Detecting vulnerable code blocks has become a highly popular topic in computer-aided design, especially with the advancement of natural language processing (NLP). Analyzing hardware description languages (HDLs), such as Verilog, involves dealing with lengthy code. This letter introduces an innovative method for identifying attack-vulnerable hardware through opcode processing. By leveraging architecturally defined opcodes and expressing every operation at the beginning of each code line, the word-processing problem is efficiently transformed into an opcode-processing problem. This research converts a benchmark dataset into an intermediary code stack and then classifies secure and fragile code using NLP techniques. The results reveal a framework that achieves up to 94% accuracy when employing a sophisticated convolutional neural network (CNN) architecture with extra embedding layers. The framework thus provides a means for users to quickly verify the vulnerability of their HDL code against a supervised learning model trained on predefined vulnerabilities. Analysis of the outcomes of the model trained on the HDL dataset further supports the superior efficacy of opcode-based processing for Trojan detection.
Free, publicly accessible full text available June 1, 2025.
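As a rough illustration of the opcode-based classification flow described above, the sketch below feeds integer-encoded opcode sequences to a small CNN with an embedding layer. The vocabulary size, sequence length, network depth, and toy data are illustrative assumptions, not the configuration or dataset used in the letter.

```python
# Minimal sketch: classifying opcode sequences as secure vs. vulnerable with a
# small CNN over an embedding layer (Keras). All sizes and the random data are
# placeholders, not the letter's actual setup.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

VOCAB_SIZE = 64   # number of distinct opcodes (assumed)
SEQ_LEN = 256     # opcodes per HDL sample, padded/truncated (assumed)

model = tf.keras.Sequential([
    layers.Embedding(VOCAB_SIZE, 32),                      # opcode -> dense vector
    layers.Conv1D(64, kernel_size=5, activation="relu"),   # local opcode patterns
    layers.GlobalMaxPooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),                 # P(vulnerable)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# x: integer-encoded opcode sequences; y: 0 = secure, 1 = vulnerable (toy data)
x = np.random.randint(0, VOCAB_SIZE, size=(512, SEQ_LEN))
y = np.random.randint(0, 2, size=(512,))
model.fit(x, y, epochs=3, batch_size=32, validation_split=0.2)
```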
-
Hyperdimensional computing (HDC) is a novel computational paradigm that operates on high-dimensional vectors known as hypervectors. Hypervectors are constructed as long bit-streams and form the basic building blocks of HDC systems. In HDC, hypervectors are generated from scalar values without considering bit significance. HDC is efficient and robust for various data-processing applications, especially computer vision tasks. To construct HDC models for vision applications, the current state-of-the-art practice encodes data with two parameters: pixel intensity and pixel position. However, the intensity and position information embedded in the high-dimensional vectors is generally not generated dynamically in HDC models. Consequently, designing hypervectors that yield high model accuracy requires powerful computing platforms for training. A more efficient approach is to generate hypervectors dynamically during the training phase. To this end, this work uses low-discrepancy sequences to generate intensity hypervectors while avoiding position hypervectors altogether. Doing so eliminates the multiplication step in vector encoding, resulting in a power-efficient HDC system. For the first time in the literature, the proposed approach employs lightweight vector generators that use unary bit-streams to encode data efficiently, instead of conventional comparator-based generators.
Free, publicly accessible full text available March 25, 2025.
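The sketch below shows one way an intensity value could be mapped to a binary hypervector by thresholding against a low-discrepancy (Sobol) sequence, producing a unary-style bit-stream, with pixels bundled by majority vote and no position hypervectors or binding multiplications. The dimension, the choice of Sobol points, and the bundling step are assumptions for illustration, not the paper's exact encoder.

```python
# Minimal sketch: low-discrepancy intensity hypervectors (assumed D = 1024),
# bundled without position hypervectors or multiplications.
import numpy as np
from scipy.stats import qmc

# 2**10 = 1024 quasi-random thresholds -> hypervector dimension D = 1024
lds = qmc.Sobol(d=1, scramble=False).random_base2(m=10).ravel()

def intensity_hypervector(pixel):
    """Unary-style bit-stream: the fraction of ones tracks the normalized intensity."""
    return (lds < pixel / 255.0).astype(np.uint8)

def encode_image(image):
    """Bundle per-pixel hypervectors by majority vote (no position hypervectors)."""
    hvs = np.stack([intensity_hypervector(px) for px in image.ravel()])
    return (hvs.sum(axis=0) > hvs.shape[0] // 2).astype(np.uint8)

img = np.random.randint(0, 256, size=(28, 28))
print(encode_image(img)[:16])
```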
-
Stochastic computing (SC) division circuits have gained importance in recent years relative to other arithmetic circuits because they trade a small amount of accuracy for low hardware complexity. Designing a division circuit is already complex in conventional binary hardware, and developing an accurate and efficient SC division circuit remains an open research problem. Prior work proposed different SC division circuits based on multiplexers and JK flip-flop units, which may require correlated or uncorrelated input bit-streams. This study centers on a cost-effective and highly efficient bit-stream generator designed specifically for SC division circuits. In conjunction with this objective, we assess the performance of multiple bit-stream generators and analyze the impact of correlation on SC division, comparing the designs in terms of accuracy and hardware cost. Moreover, we discuss a low-cost and energy-efficient bit-stream generator based on powers-of-2 Van der Corput (VDC) sequences. Among the tested sequence generators, our best results were achieved with VDC sequences. Our evaluation results demonstrate that the novel VDC-based design yields promising outputs, providing a 15.5% reduction in area-delay product and an 18.05% saving in energy consumption at the same accuracy level compared to conventional bit-stream generators. Significantly, our investigation reveals that the proposed generator improves precision compared to the state-of-the-art. We validate the proposed architecture with an image-processing case study, achieving high PSNR and structural similarity values.
Free, publicly accessible full text available March 7, 2025.
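For reference, the snippet below sketches a software model of a base-2 Van der Corput bit-stream generator: a value in [0, 1) is converted to a bit-stream by comparing it against successive VDC samples, so the mean of the stream approximates the value. The stream length and the downstream division circuit are assumptions for illustration, not the hardware design evaluated in the paper.

```python
# Minimal sketch: base-2 Van der Corput (VDC) bit-stream generation for SC.
def vdc(i):
    """i-th base-2 Van der Corput sample: bit-reverse the index about the radix point."""
    v, f = 0.0, 0.5
    while i:
        if i & 1:
            v += f
        i >>= 1
        f /= 2
    return v

def to_bitstream(x, length=256):
    """Emit 1 where x exceeds the quasi-random threshold; mean(stream) ~ x."""
    return [1 if x > vdc(i) else 0 for i in range(length)]

a = to_bitstream(0.75)
print(sum(a) / len(a))   # ~0.75
```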
-
Hyperdimensional vector processing is a nascent computing approach that mimics the brain structure and offers lightweight, robust, and efficient hardware solutions for different learning and cognitive tasks. For image recognition and classification, hyperdimensional computing (HDC) utilizes the intensity values of captured images and the positions of image pixels. Traditional HDC systems represent the intensities and positions with binary hypervectors of 1K–10K dimensions. The intensity hypervectors are cross-correlated for closer values and uncorrelated for distant values in the intensity range. The position hypervectors are pseudo-random binary vectors generated iteratively for the best classification performance. In this study, we propose a radically new approach for encoding image data in HDC systems: pixel intensities are encoded with a deterministic approach based on quasi-random sequences, so position hypervectors are no longer needed. The proposed approach significantly reduces the number of operations by eliminating the position hypervectors and the multiplication operations in the HDC system. Additionally, we suggest a hybrid technique for generating hypervectors by combining two deterministic sequences, achieving higher classification accuracy. Our experimental results show up to 102× reduction in runtime and significant memory-usage savings, with improved accuracy compared to a baseline HDC system with conventional hypervector encoding.
Free, publicly accessible full text available December 1, 2024.
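To give a flavor of the hybrid idea, the sketch below builds each intensity hypervector from two deterministic quasi-random sequences (here Sobol and Halton via SciPy), thresholding a normalized pixel value against each and concatenating the halves. The particular pair of sequences, the 50/50 split, and the dimension are assumptions, not necessarily the combination used in the study.

```python
# Minimal sketch: "hybrid" deterministic intensity encoding from two
# quasi-random sequences (assumed D = 1024, split evenly).
import numpy as np
from scipy.stats import qmc

HALF = 512  # half of the hypervector dimension (assumed)
sobol_thr = qmc.Sobol(d=1, scramble=False).random_base2(m=9).ravel()   # 512 thresholds
halton_thr = qmc.Halton(d=1, scramble=False).random(HALF).ravel()      # 512 thresholds

def hybrid_intensity_hv(pixel):
    """Concatenate two deterministic unary-style halves into one hypervector."""
    p = pixel / 255.0
    part_a = (sobol_thr < p).astype(np.uint8)
    part_b = (halton_thr < p).astype(np.uint8)
    return np.concatenate([part_a, part_b])   # 1024-bit hypervector

hv_low, hv_high = hybrid_intensity_hv(30), hybrid_intensity_hv(200)
print(hv_low.sum(), hv_high.sum())   # closer intensities share more 1-bits
```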