

Title: A cyber-physical system approach for photovoltaic array monitoring and control
In this paper, we describe a cyber-physical system approach to photovoltaic (PV) array control. A machine learning and computer vision framework is proposed for improving the reliability of utility-scale PV arrays by leveraging video analysis of local skyline imagery, customized machine learning methods for fault detection, and monitoring devices that sense data and actuate at each individual panel. Our approach promises to improve efficiency in renewable energy systems using cyber-enabled sensory analysis and fusion.
Award ID(s):
1646542
NSF-PAR ID:
10076702
Journal Name:
IEEE IISA 2017
Page Range / eLocation ID:
1 to 6
Sponsoring Org:
National Science Foundation
More Like this
  1. Photovoltaic (PV) array analytics and control have become necessary for remote solar farms and for intelligent fault detection and power optimization. The management of a PV array requires auxiliary electronics that are attached to each solar panel. A collaborative industry-university-government project was established to create a smart monitoring device (SMD) and to develop the associated algorithms and software for fault detection and solar array management. First-generation SMDs were built in Japan. At the same time, Arizona State University initiated research on algorithms and software to monitor and control individual solar panels. Second-generation SMDs were developed later and included sensors for monitoring voltage, current, temperature, and irradiance at each individual panel. The latest SMDs include a radio and relays that allow modifying solar array connection topologies. With each panel equipped with such a sophisticated SMD, the panels in a PV array behave essentially as nodes in an Internet of Things (IoT) topology. This solar energy IoT system is programmable and can: a) provide mobile analytics, b) enable solar farm control, c) detect and remedy faults, d) optimize power under different shading conditions, and e) reduce inverter transients. A series of federal and industry grants sponsored research on statistical signal analysis, communications, and optimization of this system. A cyber-physical project, whose aim is to improve solar array efficiency and robustness using new machine learning and imaging methods, was launched recently.
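The per-panel monitoring described above can be illustrated with a minimal sketch. The `PanelReading` fields mirror the quantities the SMDs sense (voltage, current, temperature, irradiance); the panel area, efficiency, and tolerance values are hypothetical placeholders, and the threshold rule is a simple stand-in for the customized machine learning fault detectors the project develops.

```python
from dataclasses import dataclass

@dataclass
class PanelReading:
    """One sample from a hypothetical per-panel smart monitoring device (SMD)."""
    voltage: float      # volts
    current: float      # amperes
    temperature: float  # degrees Celsius
    irradiance: float   # W/m^2

def expected_power(reading, panel_area_m2=1.6, efficiency=0.18):
    """Rough expected output from irradiance alone (illustrative model only)."""
    return reading.irradiance * panel_area_m2 * efficiency

def is_faulty(reading, tolerance=0.5):
    """Flag a panel whose measured power falls well below the irradiance-based
    expectation -- a simple stand-in for a learned fault detector."""
    measured = reading.voltage * reading.current
    return measured < tolerance * expected_power(reading)

# Example: a faulty (or heavily shaded) panel produces far less than irradiance predicts.
healthy = PanelReading(voltage=30.0, current=8.0, temperature=45.0, irradiance=1000.0)
faulty = PanelReading(voltage=30.0, current=1.0, temperature=45.0, irradiance=1000.0)
```

In a full system, per-panel flags like this would feed the relay logic that reconfigures the array topology.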
  2. Communication networks in power systems are a major part of the smart grid paradigm. They enable and facilitate the automation of power grid operation as well as self-healing in contingencies. Such dependence on communication networks, though, creates room for cyber-threats. An adversary can launch an attack on the communication network, which in turn affects power grid operation. Attacks can take the form of false data injection into system measurements, flooding the communication channels with unnecessary data, or intercepting messages. Machine learning-based processing of data gathered from communication networks and the power grid is a promising solution for detecting cyber-threats. In this paper, a cyber-security co-simulation framework for a cross-layer strategy is presented. The advantage of such a framework is the augmentation of valuable data, which enhances the detection as well as the identification of anomalies in the operation of the power grid. The framework is implemented on the IEEE 118-bus system. The system is constructed in Mininet to simulate a communication network and obtain data for analysis. A distributed three-controller software-defined networking (SDN) framework is proposed that utilizes the Open Network Operating System (ONOS) cluster. According to our findings, the suggested architecture outperforms a single-SDN-controller framework by achieving more than ten times the throughput. This allows a higher flow of data throughout the network while decreasing the congestion caused by a single controller's processing restrictions. Furthermore, our CECD-AS approach outperforms state-of-the-art physics-based and machine learning-based techniques in terms of attack classification. The performance of the framework is investigated under various types of communication attacks.
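As a toy illustration of the false-data-injection detection mentioned above, the sketch below flags measurements that deviate sharply from their recent trailing statistics. This is a minimal stand-in, assuming a single measurement stream and a fixed deviation threshold; the paper's approach fuses communication-network and grid data with learned classifiers rather than a simple statistical rule.

```python
from statistics import mean, stdev

def detect_injections(measurements, window=5, k=3.0):
    """Flag indices whose value deviates more than k standard deviations
    from the trailing window mean -- a minimal anomaly-detection stand-in."""
    flagged = []
    for i in range(window, len(measurements)):
        hist = measurements[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(measurements[i] - mu) > k * sigma:
            flagged.append(i)
    return flagged

# A steady per-unit bus-voltage stream with one injected false reading at index 8.
stream = [1.00, 1.01, 0.99, 1.00, 1.02, 1.01, 1.00, 0.99, 1.35, 1.00]
```

A detector like this would run on measurements collected from the simulated Mininet network before any heavier classification stage.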
  3. Cyber-threats are continually evolving and growing in number and complexity with the increasing connectivity of the Internet of Things (IoT). Existing cyber-defense tools seem not to deter the number of successful cyber-attacks reported worldwide. If defense tools are not scarce, why does the cyber-chase trend favor bad actors? Although cyber-defense tools monitor and try to defuse intrusion attempts, research shows that the required agility against evolving threats is far too slow. One of the reasons is that many intrusion detection tools focus on the accuracy of anomaly alerts, assuming that pre-observed attacks and subsequent security patches are adequate. That is not the case. In fact, there is a need for techniques that go beyond intrusion accuracy against specific vulnerabilities to the prediction of cyber-defense performance for improved proactivity. This paper proposes a combination of cyber-attack projection and cyber-defense agility estimation to dynamically but reliably augur intrusion detection performance. Since cyber-security is buffeted with many unknown parameters and rapidly changing trends, we apply a machine learning (ML)-based Hidden Markov Model (HMM) to predict intrusion detection agility. HMMs are known for robust prediction of temporal relationships amid noise and for training brevity, which corroborates our high prediction accuracy on three major open-source network intrusion detection systems, namely Zeek, OSSEC, and Suricata. Specifically, we present a novel approach for combined projection, prediction, and cyber-visualization to enable precise agility analysis of cyber defense. We also evaluate the performance of the developed approach using numerical results.
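The HMM machinery behind such predictions can be sketched with the standard forward algorithm, which scores how likely an observed sequence of detection outcomes is under a hidden-state model. The two-state "agile"/"lagging" defense model and all probabilities below are illustrative assumptions, not parameters from the paper.

```python
def forward_likelihood(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: probability of an observation sequence under an HMM."""
    # Initialize with the start distribution weighted by the first emission.
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    # Propagate: sum over predecessor states, then weight by the emission.
    for o in obs[1:]:
        alpha = {s: sum(alpha[p] * trans_p[p][s] for p in states) * emit_p[s][o]
                 for s in states}
    return sum(alpha.values())

# Hypothetical two-state model of defense agility.
states = ("agile", "lagging")
start_p = {"agile": 0.6, "lagging": 0.4}
trans_p = {"agile": {"agile": 0.7, "lagging": 0.3},
           "lagging": {"agile": 0.4, "lagging": 0.6}}
emit_p = {"agile": {"detected": 0.9, "missed": 0.1},
          "lagging": {"detected": 0.3, "missed": 0.7}}

p = forward_likelihood(("detected", "detected", "missed"),
                       states, start_p, trans_p, emit_p)
```

In practice the transition and emission probabilities would be fit to alert logs from systems such as Zeek, OSSEC, or Suricata rather than set by hand.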
  4. Modern digital manufacturing processes, such as additive manufacturing, are cyber-physical in nature and utilize complex, process-specific simulations for both design and manufacturing. Although computational simulations can be used to optimize these complex processes, they can take hours or days, an unreasonable cost for engineering teams leveraging iterative design processes. Hence, more rapid computational methods are necessary in areas where computation time presents a limiting factor. When existing data from historical examples is plentiful and reliable, supervised machine learning can be used to create surrogate models that can be evaluated orders of magnitude more rapidly than comparable finite element approaches. However, for applications that necessitate computationally intensive simulations, even generating the training data necessary to train a supervised machine learning model can pose a significant barrier. Unsupervised methods, such as physics-informed neural networks, offer a shortcut in cases where training data is scarce or prohibitively expensive. These novel neural networks are trained without the use of potentially expensive labels. Instead, physical principles are encoded directly into the loss function. This method substantially reduces the time required to develop a training dataset, while still achieving the evaluation speed that is typical of supervised machine learning surrogate models. We propose a new method for stochastically training and testing a convolutional physics-informed neural network using the transient 3D heat equation to model temperature throughout a solid object over time. We demonstrate this approach by applying it to a transient thermal analysis model of the powder bed fusion manufacturing process.
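The key idea, encoding the governing equation in the loss rather than using labels, can be shown in one dimension. The sketch below builds a physics residual loss for the transient heat equation u_t = alpha * u_xx from finite differences; the paper works in 3D with a convolutional network and automatic differentiation, so this is only a minimal illustration of the loss construction.

```python
import math

def heat_residual_loss(u, alpha, dx, dt):
    """Physics-informed loss for the 1D transient heat equation u_t = alpha*u_xx.
    u is a 2D list u[t][x] of predicted temperatures; no labeled data is used --
    the loss penalizes violation of the PDE itself."""
    loss, count = 0.0, 0
    for t in range(len(u) - 1):
        for x in range(1, len(u[0]) - 1):
            u_t = (u[t + 1][x] - u[t][x]) / dt                      # forward difference in time
            u_xx = (u[t][x + 1] - 2 * u[t][x] + u[t][x - 1]) / dx ** 2  # central difference in space
            loss += (u_t - alpha * u_xx) ** 2
            count += 1
    return loss / count

# An exact solution of the heat equation, u = exp(-alpha*pi^2*t) * sin(pi*x),
# should score a near-zero loss (up to discretization error).
alpha, dx, dt, nx, nt = 0.1, 0.05, 0.001, 21, 11
exact = [[math.exp(-alpha * math.pi ** 2 * t * dt) * math.sin(math.pi * x * dx)
          for x in range(nx)] for t in range(nt)]
```

Training would then adjust the network producing `u` to drive this residual (plus boundary and initial-condition terms) toward zero.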
  5. Exploring many execution paths in a binary program is essential to discover new vulnerabilities. Dynamic Symbolic Execution (DSE) is useful to trigger complex input conditions and enables an accurate exploration of a program while providing extensive crash replayability and semantic insights. However, scaling this type of analysis to complex binaries is difficult. Current methods suffer from the path explosion problem, despite many attempts to mitigate this challenge (e.g., by merging paths when appropriate). Still, in general, this challenge is not yet surmounted, and most bugs discovered through such techniques are shallow. We propose a novel approach to address the path explosion problem: a smart triaging system that leverages supervised machine learning techniques to replicate human expertise, leading to vulnerable path discovery. Our approach monitors the execution traces in vulnerable programs and extracts relevant features—register and memory accesses, function complexity, system calls—to guide the symbolic exploration. We train models to learn the patterns of vulnerable paths from the extracted features, and we leverage their predictions to discover interesting execution paths in new programs. We implement our approach in a tool called SyML, and we evaluate it on the Cyber Grand Challenge (CGC) dataset—a well-known dataset of vulnerable programs—and on 3 real-world Linux binaries. We show that the knowledge collected from the analysis of vulnerable paths, without any explicit prior knowledge about vulnerability patterns, is transferable to unseen binaries, and leads to outperforming prior work in path prioritization by triggering more, and different, unique vulnerabilities.
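The train-then-rank loop described above can be sketched with a tiny linear model. Here a hand-rolled perceptron learns from labeled trace features and then scores unseen paths for exploration priority; the feature names and training data are hypothetical, and SyML's actual models and feature set are richer than this stand-in.

```python
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Tiny perceptron over execution-trace features (illustrative stand-in
    for the supervised models trained on vulnerable-path traces)."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            if pred != y:  # perceptron update on misclassification
                w = [wi + lr * (y - pred) * xi for wi, xi in zip(w, x)]
                b += lr * (y - pred)
    return w, b

def score_path(w, b, features):
    """Higher score -> explore this symbolic path first."""
    return sum(wi * xi for wi, xi in zip(w, features)) + b

# Hypothetical per-path features: (memory_writes, syscall_count, function_complexity),
# normalized to [0, 1]; label 1 marks a path that reached a known crash.
train_x = [(0.9, 0.8, 0.7), (0.8, 0.9, 0.9), (0.1, 0.2, 0.1), (0.2, 0.1, 0.3)]
train_y = [1, 1, 0, 0]
w, b = train_perceptron(train_x, train_y)
```

During DSE on a new binary, the symbolic executor would call `score_path` on each pending state and pop the highest-scoring one first.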