Title: WTC2: Impact-Aware Threat Analysis for Water Treatment Centers
A water treatment center (WTC) removes contaminants and unwanted components from water and makes it acceptable to end-users. A modern WTC is equipped with various water sensors and uses a combination of wired and wireless communication networks. During the water treatment process, controllers periodically collect sensor measurements and make important operational decisions. Since accuracy is vital, a WTC also uses different data validation mechanisms to validate the incoming sensor measurements. However, like any other cyber-physical system, water treatment facilities are prone to cyberattacks, and an intelligent adversary can stealthily alter sensor measurements and corrupt the water treatment process. In this work, we propose WTC Checker (WTC2), an impact-aware formal analysis framework that demonstrates the impact of stealthy false data injection attacks on water treatment sensors. We show that if an adversary has sufficient access to sensor measurements and can evade the data validation process, they can compromise the sensor measurements, change the water disinfectant contact time, and damage the clean water production process. We model this attack as a constraint satisfaction problem (CSP) and encode it using Satisfiability Modulo Theories (SMT). We evaluate the proposed framework for its threat analysis capability as well as its scalability by executing experiments on different synthetic test cases.
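The constraint-satisfaction formulation can be illustrated with a toy brute-force search (not the paper's actual SMT encoding; all sensor names, thresholds, and numbers below are illustrative assumptions). The adversary seeks an injected bias on a chlorine-residual sensor that simultaneously satisfies a stealth constraint (the falsified reading passes a simple deviation check) and an impact constraint (the computed disinfectant CT value drops below the required minimum):

```python
# Toy constraint-satisfaction sketch of a stealthy false data injection
# attack on a disinfectant sensor. Illustrative only -- the paper encodes
# this as an SMT problem; here a grid search stands in for the solver.

TRUE_RESIDUAL_MG_L = 1.2    # actual chlorine residual (mg/L), assumed
CONTACT_TIME_MIN = 30.0     # disinfectant contact time (minutes), assumed
REQUIRED_CT = 30.0          # required CT value (mg*min/L), assumed
VALIDATION_TOL = 0.4        # max deviation the data validator accepts, assumed

def find_stealthy_attack(step=0.01):
    """Search injected biases that stay stealthy yet violate the CT requirement."""
    bias = 0.0
    while bias <= VALIDATION_TOL:          # stealth constraint
        reported = TRUE_RESIDUAL_MG_L - bias
        ct = reported * CONTACT_TIME_MIN
        if ct < REQUIRED_CT:               # impact constraint
            return bias, reported, ct
        bias += step
    return None                            # no feasible attack (UNSAT analogue)

attack = find_stealthy_attack()
```

An SMT solver would search the same constraint space symbolically rather than by enumeration, and report unsatisfiability when no stealthy attack exists.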
Award ID(s):
1929183
PAR ID:
10145192
Author(s) / Creator(s):
Date Published:
Journal Name:
IEEE Computer Society Signature Conference on Computers, Software and Applications (COMPSAC)
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Wind energy is one of the major sources of renewable energy. Countries around the world are increasingly deploying large wind farms that can generate a significant amount of clean energy. A wind farm consists of many turbines, often spread across a large geographical area. Modern wind turbines are equipped with meteorological sensors. The wind farm control center monitors the turbine sensors and adjusts the power generation parameters for optimal power production. The turbine sensors are prone to cyberattacks, and as large wind farms account for a growing share of power generation, it is crucial to analyze such potential cyber threats. In this paper, we present a formal framework to verify the impact of false data injection attacks on wind farm meteorological sensor measurements. The framework formulates this verification as a maximization problem in which the adversary's goal is to maximize the wind farm's power production loss with its limited attack capability, while remaining stealthy to the wind farm's bad data detection mechanism. We evaluate the proposed framework for its threat analysis capability as well as its scalability by executing experiments on synthetic test cases.
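The stealthy maximization described above can be sketched with a toy grid search (illustrative numbers, not the paper's formal model): the adversary tampers with at most K turbine sensors, the bad-data detector flags any reading deviating by more than a threshold, and turbine power is approximated by a simplified cubic curve:

```python
# Toy sketch of stealthy false-data-injection as a maximization problem.
# All speeds, thresholds, and the power curve are illustrative assumptions.

from itertools import combinations

TRUE_SPEEDS = [12.0, 11.5, 13.0, 10.8]   # m/s, one reading per turbine
DETECT_TOL = 1.0                          # detector's deviation threshold (m/s)
K = 2                                     # attacker can tamper with <= K sensors

def power(speed):
    return 0.5 * speed ** 3               # simplified cubic power curve

def max_stealthy_loss():
    """Maximize power-production loss while each injection stays under DETECT_TOL."""
    base = sum(power(s) for s in TRUE_SPEEDS)
    best = 0.0
    for targets in combinations(range(len(TRUE_SPEEDS)), K):
        attacked = list(TRUE_SPEEDS)
        for i in targets:
            # Worst stealthy injection: push the reading down by exactly the tolerance.
            attacked[i] -= DETECT_TOL
        best = max(best, base - sum(power(s) for s in attacked))
    return best
```

Because power grows with the cube of wind speed, the optimal stealthy attack here targets the two fastest turbines; a formal framework replaces this enumeration with a solver-backed optimization.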
  2. The field of oceanography is transitioning from data-poor to data-rich, thanks in part to increased deployment of in-situ platforms and sensors, such as those that instrument the US-funded Ocean Observatories Initiative (OOI). However, generating science-ready data products from these sensors, particularly those making biogeochemical measurements, often requires extensive end-user calibration and validation procedures, which can present a significant barrier. Openly available community-developed and -vetted Best Practices contribute to overcoming such barriers, but collaboratively developing user-friendly Best Practices can be challenging. Here we describe the process undertaken by the NSF-funded OOI Biogeochemical Sensor Data Working Group to develop Best Practices for creating science-ready biogeochemical data products from OOI data, culminating in the publication of the GOOS-endorsed OOI Biogeochemical Sensor Data Best Practices and User Guide. For Best Practices related to ocean observatories, engaging observatory staff is crucial, but having a "user-defined" process ensures the final product addresses user needs. Our process prioritized bringing together a diverse team and creating an inclusive environment where all participants could effectively contribute. Incorporating the perspectives of a wide range of experts and prospective end users through an iterative review process that included "Beta Testers" enabled us to produce a final product that combines technical information with a user-friendly structure that illustrates data analysis pipelines via flowcharts and worked examples accompanied by pseudo-code. Our process and its impact on improving the accessibility and utility of the end product provides a roadmap for other groups undertaking similar community-driven activities to develop and disseminate new Ocean Best Practices.
  3. Suppose an agent asserts that it will move through an environment in some way. When the agent executes its motion, how does one verify the claim? The problem arises in a range of contexts, including validating safety claims about robot behavior, security and surveillance applications, and the conception, physical design, and logistics of scientific experiments. Given a set of feasible sensors to select from, we ask how to choose sensors optimally in order to ensure that the agent's execution does indeed fit its pre-disclosed itinerary. Our treatment is distinguished from prior work in sensor selection by two aspects: the form the itinerary takes (a regular language of transitions) and the fact that families of sensor choices can be grouped as a single choice. The two are intimately tied together, permitting construction of a product automaton, because the same physical sensors (i.e., the same choice) can appear multiple times. This paper establishes the hardness of sensor selection for itinerary validation within this treatment and proposes an exact algorithm based on an integer linear programming (ILP) formulation that is capable of solving problem instances of moderate size. We demonstrate its efficacy on small-scale case studies, including one motivated by wildlife tracking.
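The core selection problem has a set-cover flavor, which is one intuition for its hardness. A toy brute-force sketch (the paper uses an ILP and a product automaton; the sensor names, costs, and coverage sets below are made up for illustration) treats validation as covering every transition of the disclosed itinerary with at least one selected sensor:

```python
# Illustrative brute-force sensor selection: find the minimum-cost subset
# of sensors whose observed transitions cover the disclosed itinerary.
# Hypothetical sensors and costs; the paper's exact method is an ILP.

from itertools import combinations

SENSORS = {                     # sensor -> (cost, transitions it observes)
    "camera_A": (3, {"t1", "t2"}),
    "camera_B": (2, {"t2", "t3"}),
    "beacon_C": (1, {"t1"}),
    "beacon_D": (1, {"t3", "t4"}),
}
ITINERARY = {"t1", "t2", "t3", "t4"}   # transitions the agent pre-discloses

def cheapest_validating_set():
    """Return (cost, sensor subset) of the minimum-cost cover of the itinerary."""
    best = None
    names = list(SENSORS)
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            covered = set().union(*(SENSORS[s][1] for s in subset))
            if ITINERARY <= covered:
                cost = sum(SENSORS[s][0] for s in subset)
                if best is None or cost < best[0]:
                    best = (cost, set(subset))
    return best
```

Enumeration is exponential in the number of sensors; an ILP formulation lets an off-the-shelf solver prune this search for instances of moderate size.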
  4. Endpoint sensors play an important role in an organization's network defense. However, endpoint sensors may be disabled or sabotaged if an adversary gains root-level access to the endpoint running the sensor. While traditional sensors cannot reliably defend against such compromises, this work explores an approach to detect these compromises in applications where multiple sensors can be correlated. We focus on the OpenFlow protocol and show that endpoint sensor data can be corroborated using a remote endpoint's sensor data or that of in-network sensors, like an OpenFlow switch. The approach allows end-to-end round trips of less than 20ms for around 90% of flows, which includes all flow elevation and processing overheads. In addition, the approach can detect flows from compromised nodes if there is a single uncompromised sensor on the network path. This approach allows defenders to quickly identify and quarantine nodes with compromised endpoint sensors. 
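The corroboration idea can be sketched in miniature (illustrative only, not the paper's OpenFlow implementation; the addresses and flow tuples are made up): flows reported by an endpoint sensor are checked against flows observed by an in-network switch, and flows the switch saw but the endpoint did not report suggest the endpoint sensor is compromised:

```python
# Toy cross-sensor corroboration: flag flows an in-network switch observed
# that a (possibly compromised) endpoint sensor failed to report.
# Flow tuples (src, dst, dst_port) are hypothetical.

def corroborate(endpoint_flows, switch_flows):
    """Return flows the switch observed that the endpoint did not report."""
    return switch_flows - endpoint_flows

endpoint = {("10.0.0.5", "10.0.0.9", 443)}
switch = {("10.0.0.5", "10.0.0.9", 443),
          ("10.0.0.5", "evil.example", 80)}   # hidden by a rooted endpoint
suspicious = corroborate(endpoint, switch)
```

The set difference captures the detection condition: as long as one uncompromised sensor on the path observes the flow, the discrepancy is visible to the defender.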
  5. Roll-to-roll printing has significantly shortened the time from design to production of sensors and IoT devices, while being cost-effective for mass production. However, because fewer manufacturing tolerance controls are available, properties such as sensor thickness, composition, and roughness cannot be precisely controlled. Since these properties likely affect sensor behavior, roll-to-roll printed sensors require validation testing before they can be deployed in the field. In this work, we improve the testing of nitrate sensors, which need to be calibrated in a solution of known nitrate concentration for around 1–2 days. To accelerate this process, we observe the initial behavior of the sensors for a few hours and use a physics-informed machine learning method to predict their measurements 24 hours into the future, thus saving valuable time and testing resources. Due to the variability in roll-to-roll printing, this prediction task requires models that are robust to changes in the properties of new test sensors. We show that existing methods fail at this task and describe a physics-informed machine learning method that improves prediction robustness across testing conditions (prediction error ≈ 1.7× lower on real-world data and ≈ 5× lower on synthetic data compared with the current state-of-the-art physics-informed machine learning method).
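A minimal sketch of the extrapolation idea (not the paper's method; the settling-model form, time constant, and simulated data are all assumptions): fit a physics-motivated model y(t) = a + b·exp(−t/τ) to the first few hours of readings, which is linear in a and b, then extrapolate to the 24-hour mark:

```python
# Toy physics-informed extrapolation: fit an assumed exponential settling
# model to a sensor's first hours of readings and predict 24 hours ahead.

import math

TAU = 6.0   # assumed settling time constant (hours), supplied by "physics"

def fit_settling_model(times, readings):
    """Ordinary least-squares fit of y = a + b*exp(-t/TAU), linear in a and b."""
    xs = [math.exp(-t / TAU) for t in times]
    n = len(times)
    sx, sy = sum(xs), sum(readings)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, readings))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def predict(a, b, t):
    return a + b * math.exp(-t / TAU)

# Simulated first 4 hours of a sensor settling toward 50 mV (hypothetical).
times = [0.5 * i for i in range(9)]                     # 0 .. 4 h
readings = [50.0 + 20.0 * math.exp(-t / TAU) for t in times]
a, b = fit_settling_model(times, readings)
forecast = predict(a, b, 24.0)                          # extrapolated 24 h value
```

Constraining the model to a physically plausible settling curve is what lets a few hours of data extrapolate a day ahead; a purely data-driven model fit to the same short window has no such structure to lean on.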