
Title: 2022 5th International Conference on Artificial Intelligence and Pattern Recognition
Researchers are investigating solutions to support the enormous demand for wireless communication, which has grown exponentially alongside advances in technology. The sixth-generation (6G) network has emerged as the leading candidate for satisfying the requirements placed on telecommunications systems. 6G technology depends heavily on various machine learning and artificial intelligence techniques. Although these machine learning algorithms perform well, their security has largely been neglected, leaving the door open to vulnerabilities that attackers can exploit to compromise systems. It is therefore essential to evaluate the security of machine learning algorithms to prevent them from being spoofed by malicious hackers. Prior research has shown that the decision tree is one of the most popular algorithms, used by 80% of researchers for classification problems. In this work, we collect a dataset from a laboratory testbed of over 100 Internet of Things (IoT) devices, including smart cameras, smart light bulbs, Alexa devices, and others. We evaluate classifiers on the original dataset and record 98% accuracy. We then poison the dataset using a label-flipping attack and record the output: flipping 10%, 20%, 30%, 40%, and 50% of the labels yields accuracies of 86%, 74%, 64%, 54%, and 50%, respectively.
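A label-flipping experiment of this kind can be sketched as follows. This is an illustrative reconstruction, not the paper's code: synthetic data from `make_classification` stands in for the IoT testbed capture, and the flip fractions mirror the reported experiment.

```python
# Sketch of a label-flipping poisoning experiment on a decision tree.
# Synthetic binary data stands in for the real IoT traffic dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

def flip_labels(y, fraction, rng):
    """Flip the labels of a random `fraction` of the training samples."""
    y = y.copy()
    idx = rng.choice(len(y), size=int(fraction * len(y)), replace=False)
    y[idx] = 1 - y[idx]  # binary labels: 0 <-> 1
    return y

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

results = {}
for frac in [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]:
    clf = DecisionTreeClassifier(random_state=0)
    clf.fit(X_tr, flip_labels(y_tr, frac, rng))
    results[frac] = accuracy_score(y_te, clf.predict(X_te))
    print(f"{int(frac * 100):>2}% flipped -> test accuracy {results[frac]:.2f}")
```

As in the paper, accuracy degrades steadily as a larger share of the training labels is flipped, approaching chance level near 50%.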
Journal Name: AIPR Workshop Proceedings
Sponsoring Org: National Science Foundation
More Like this
  1. The Internet of Things (IoT) is a network of sensors that collects data 24/7 without human intervention. However, the network may suffer from problems such as low battery life, heterogeneity, and connectivity issues due to the lack of standards. Even though these problems can cause several performance hiccups, security issues need immediate attention because hackers can access vital personal and financial information and then misuse it. Such security issues can allow hackers to hijack IoT devices and use them to establish a botnet that launches a Distributed Denial of Service (DDoS) attack. Blockchain technology can secure IoT devices by providing authentication with public keys, and Smart Contracts (SCs) can improve the performance of the IoT-blockchain network through automation. However, the surveyed work shows that blockchain and SCs do not provide foolproof security; attackers sometimes defeat these mechanisms and initiate DDoS attacks. Thus, developers and security software engineers must be aware of the different techniques for detecting DDoS attacks. In this survey paper, we highlight these detection techniques. The novelty of our work is to classify DDoS detection techniques according to blockchain technology, so that researchers can enhance their systems with blockchain-based support for detecting threats. In addition, we provide general information about the studied systems and their workings. We also position our work against recent surveys by comparing the state-of-the-art DDoS surveys on their data collection techniques and on the DDoS attacks they discuss across IoT subsystems. The study of different IoT subsystems shows that DDoS attacks also impact other computing systems, such as SCs, networking devices, and power grids. Hence, our work briefly describes DDoS attacks and their impacts on these subsystems and on IoT.
For instance, due to DDoS attacks, targeted computing systems suffer delays that cause tremendous financial and utility losses to their subscribers. Hence, we discuss the impacts of DDoS attacks in the context of associated systems. Finally, we discuss machine-learning algorithms, performance metrics, and the underlying technology of IoT systems so that readers can grasp the detection techniques and the attack vectors. Moreover, associated systems such as Software-Defined Networking (SDN) and Field-Programmable Gate Arrays (FPGAs) are a good source of security enhancement for IoT networks. Thus, we include a detailed discussion of future developments encompassing all major IoT subsystems.
  2. Smart ear-worn devices (called earables) are being equipped with various onboard sensors and algorithms, transforming earphones from simple audio transducers into multi-modal interfaces that make rich inferences about human motion and vital signals. However, developing sensory applications with earables is currently quite cumbersome, with several barriers in the way. First, time-series data from earable sensors capture physical phenomena in complex settings, requiring machine-learning (ML) models learned from large-scale labeled data. This is challenging in the context of earables because large-scale open-source datasets are missing. Second, the small size and compute constraints of earable devices make on-device integration of many existing algorithms for tasks such as human activity recognition and head-pose estimation difficult. To address these challenges, we introduce Auritus, an extendable and open-source optimization toolkit designed to enhance and replicate earable applications. Auritus serves two primary functions. First, it handles data collection, pre-processing, and labeling for creating customized earable datasets using graphical tools. The system includes an open-source dataset with 2.43 million inertial samples related to head and full-body movements, covering 34 head poses and 9 activities from 45 volunteers. Second, Auritus provides a tightly integrated hardware-in-the-loop (HIL) optimizer and TinyML interface to develop lightweight, real-time ML models for activity detection and filters for head-pose tracking. To validate the utility of Auritus, we showcase three sample applications: fall detection, spatial audio rendering, and augmented reality (AR) interfacing. Auritus recognizes activities with 91% leave-one-out test accuracy (98% test accuracy) using real-time models as small as 6-13 kB. Our models are 98-740x smaller and 3-6% more accurate than the state-of-the-art.
We also estimate head pose with absolute errors as low as 5 degrees using 20 kB filters, achieving up to 1.6x precision improvement over existing techniques. We make the entire system open source so that researchers and developers can contribute to any layer of the system or rapidly prototype their applications using our dataset and algorithms.
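To give a flavor of the lightweight head-pose filters described above, here is a generic complementary filter for pitch from IMU data. This is our own sketch of the standard technique, not Auritus's actual filter; the sample rate and blending factor are assumed values.

```python
# Complementary filter: blend integrated gyroscope rate (smooth, drifts)
# with the accelerometer's gravity-derived pitch (noisy, drift-free).
import math

def complementary_pitch(accel, gyro, dt=0.01, alpha=0.98):
    """Estimate final pitch (degrees) from accel (ax, ay, az in g) and
    gyro pitch-rate samples (deg/s), starting from 0 degrees."""
    pitch = 0.0
    for (ax, ay, az), gy in zip(accel, gyro):
        # Pitch implied by the gravity vector alone.
        accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
        # Trust the gyro short-term, the accelerometer long-term.
        pitch = alpha * (pitch + gy * dt) + (1 - alpha) * accel_pitch
    return pitch

# A device held level: gravity entirely on the z axis, no rotation.
level = complementary_pitch([(0.0, 0.0, 1.0)] * 100, [0.0] * 100)

# A device tilted 30 degrees nose-up, held still long enough to converge.
s, c = math.sin(math.radians(30)), math.cos(math.radians(30))
tilted = complementary_pitch([(-s, 0.0, c)] * 500, [0.0] * 500)
```

Filters of this form are attractive on earables precisely because they need only a few multiplications per sample and a few bytes of state.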
  3. Over the past decades, the major objectives of computer design have been to improve performance and to reduce cost, energy consumption, and size, while security has remained a secondary concern. Meanwhile, malicious attacks have grown rapidly as the number of Internet-connected devices, ranging from personal smart embedded systems to large cloud servers, has increased. Traditional antivirus software cannot keep up with the rising incidence of these attacks, especially exploits targeting hardware design vulnerabilities. For example, as DRAM process technology scales down, it becomes easier for DRAM cells to electrically interact with each other: in Rowhammer attacks, repeatedly reading the same row in DRAM can corrupt data in nearby rows. Because Rowhammer exploits a computer hardware weakness, no software patch can completely fix the problem. Similarly, there is no efficient software mitigation for the recently reported Spectre attack, which exploits microarchitectural design vulnerabilities to leak protected data through side channels. In general, completely fixing hardware-level vulnerabilities would require a redesign of the hardware, which cannot be backported. In this paper, we demonstrate that by monitoring deviations in microarchitectural events such as cache misses and branch mispredictions from existing CPU performance counters, hardware-level attacks such as Rowhammer and Spectre can be efficiently detected at runtime with promising accuracy and reasonable performance overhead using various machine learning classifiers.
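The counter-based detection idea can be sketched in a few lines. This is a hypothetical illustration, not the paper's pipeline: synthetic event rates stand in for real performance-counter reads, and the assumed signature (Rowhammer driving cache-miss rates far above baseline) is deliberately exaggerated for clarity.

```python
# Train a classifier on sampled hardware-event rates and flag windows
# whose counter profile deviates from benign behavior.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Features per sample window: [LLC misses / kilo-instruction,
#                              branch mispredictions / kilo-instruction].
benign = rng.normal(loc=[5.0, 8.0], scale=1.0, size=(300, 2))
# Rowhammer hammers DRAM rows, inflating last-level cache misses.
rowhammer = rng.normal(loc=[40.0, 9.0], scale=2.0, size=(300, 2))

X = np.vstack([benign, rowhammer])
y = np.array([0] * 300 + [1] * 300)  # 0 = benign, 1 = attack

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
```

In a real deployment the features would come from periodic reads of CPU performance counters (e.g. via `perf` on Linux), with one classification per sampling window.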
  4.
    The COVID-19 pandemic has profoundly impacted communities worldwide, reprioritizing how various societal sectors operate. Among these sectors, healthcare providers and medical workers have been affected most prominently due to the massive increase in demand for medical services under unprecedented circumstances. Hence, any tool that helps compliance with social guidelines for preventing the spread of COVID-19 will have a positive impact on managing and controlling the outbreak and on reducing the excessive burden on the healthcare system. This perspective article presents the authors' views on using novel biosensors and intelligent algorithms, embodied in wearable IoMT frameworks, to tackle this issue. We discuss how smart IoMT wearables can track certain biomarkers for detecting COVID-19 in exposed individuals. We enumerate several machine learning algorithms that can process a wide range of collected biomarkers for detecting (a) multiple symptoms of SARS-CoV-2 infection and (b) the dynamic likelihood of contracting the virus through interpersonal interaction. Finally, we explain how systematic use of smart wearable IoMT devices across social sectors can intelligently help control the spread of COVID-19 in communities as they enter the reopening phase: the framework can benefit individuals and their medical correspondents through Systems for Symptom Decoding (SSD), and its use can be generalized at the societal level for spread control through Systems for Spread Tracing (SST).
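A toy version of the symptom-detection idea can be sketched with a logistic model over a few wearable biomarkers. This is a hedged illustration, not the authors' system: the biomarker choice (skin temperature, SpO2, resting heart rate), the distributions, and the model are all assumptions for demonstration.

```python
# Toy symptom screen: logistic regression over wearable biomarkers
# [temperature (deg C), SpO2 (%), resting heart rate (bpm)].
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

healthy = np.column_stack([rng.normal(36.7, 0.2, 200),   # normal temp
                           rng.normal(97.5, 0.8, 200),   # normal SpO2
                           rng.normal(65.0, 5.0, 200)])  # resting HR
symptomatic = np.column_stack([rng.normal(38.3, 0.4, 200),   # fever
                               rng.normal(93.0, 1.2, 200),   # low SpO2
                               rng.normal(85.0, 7.0, 200)])  # elevated HR

X = np.vstack([healthy, symptomatic])
y = np.array([0] * 200 + [1] * 200)  # 0 = healthy, 1 = symptomatic

model = LogisticRegression(max_iter=1000).fit(X, y)
```

A real SSD-style system would of course validate any such model clinically and combine far richer signals than these three.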
  5. The Internet of Things (IoT) is a connected network of devices that exchange data using different protocols. IoT applications range from smart TVs and refrigerators to smart transportation. This research aims to provide students with hands-on training in developing an IoT platform that supports device management, connectivity, and data management. People tend to build interconnected devices without a basic understanding of how the IoT platform backend functions; studying Arm Pelion helps students understand how IoT devices operate under the hood. This past summer, Morgan State University hosted undergraduate engineering students and high school STEM teachers to conduct IoT security research in the Cybersecurity Assurance & Policy (CAP) Center. The research project involved integrating various hardware sensor devices and real-time data monitoring using the Arm Pelion IoT development platform. Student/teacher outcomes from the project include: 1) learning about IoT technology and security; 2) programming an embedded system using the Arm Mbed development board and IDE; 3) developing a network of connected IoT devices using protocols such as LWM2M, MQTT, and CoAP; 4) investigating the cybersecurity risks associated with the platform; and 5) using data analysis and visualization to understand the network data and packet flow. First, the students/teachers must consider the IoT framework to understand how to address security. The IoT framework describes the essential functions of an IoT network, breaking it down into separate layers: an application layer, a middleware layer, and a connectivity layer. The application layer lets users access the platform via a smartphone or any other dashboard. The middleware layer represents the backend system that provides edge devices with data management, messaging, application services, and authentication.
Finally, the connectivity layer includes the devices that connect the user to the network, over Bluetooth or WiFi for example. The platform consists of several commercial IoT devices such as a smart camera, a baby monitor, and a smart light. We then create algorithms to classify the network data flows, and to visualize the packet flow in the network and the structure of the packet data frames over time.
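The usual first step in classifying and visualizing such traffic is to group captured packets into flows. As a minimal sketch (ours, not the course's code), packets can be aggregated by 5-tuple; the addresses, ports, and packet records below are invented for illustration, with port 1883 standing in for MQTT and 5683 for CoAP.

```python
# Group captured packets into flows keyed by the classic 5-tuple
# (src IP, dst IP, src port, dst port, protocol).
from collections import defaultdict

def group_flows(packets):
    """Aggregate packet dicts into per-flow packet and byte counts."""
    flows = defaultdict(lambda: {"packets": 0, "bytes": 0})
    for p in packets:
        key = (p["src"], p["dst"], p["sport"], p["dport"], p["proto"])
        flows[key]["packets"] += 1
        flows[key]["bytes"] += p["len"]
    return dict(flows)

# Invented capture: two MQTT packets from one device, one CoAP request.
capture = [
    {"src": "192.168.1.10", "dst": "192.168.1.1", "sport": 50000,
     "dport": 1883, "proto": "TCP", "len": 120},  # MQTT publish
    {"src": "192.168.1.10", "dst": "192.168.1.1", "sport": 50000,
     "dport": 1883, "proto": "TCP", "len": 80},   # MQTT publish
    {"src": "192.168.1.11", "dst": "8.8.8.8", "sport": 40000,
     "dport": 5683, "proto": "UDP", "len": 60},   # CoAP request
]
flows = group_flows(capture)
```

Per-flow counts like these are what feed the classification and packet-flow visualizations mentioned above.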