A water treatment center (WTC) removes contaminants and unwanted components from water, making it acceptable to end-users. A modern WTC is equipped with a variety of water sensors and uses a combination of wired and wireless communication networks. During the water treatment process, controllers periodically collect sensor measurements and make important operational decisions. Since accuracy is vital, a WTC also uses data validation mechanisms to vet incoming sensor measurements. However, like any other cyber-physical system, water treatment facilities are prone to cyberattacks: an intelligent adversary can stealthily alter sensor measurements and corrupt the water treatment process. In this work, we propose WTC Checker (WTC2), an impact-aware formal analysis framework that demonstrates the impact of stealthy false data injection attacks on water treatment sensors. We show that if an adversary has sufficient access to sensor measurements and can evade the data validation process, they can compromise the measurements, change the water disinfectant contact time, and damage the clean water production process. We model this attack as a constraint satisfaction problem (CSP) and encode it using Satisfiability Modulo Theories (SMT). We evaluate the proposed framework for its threat analysis capability as well as its scalability through experiments on different synthetic test cases.
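The attack constraints sketched in the abstract can be made concrete with a toy model. The paper encodes the problem with an SMT solver; the pure-Python brute-force search below plays the solver's role over the same two constraint families (stealthiness and impact) for a single contact-time sensor. All names and numbers here (validation tolerance, required and minimum safe contact times) are illustrative assumptions, not values from the paper.

```python
# Toy stand-in for the CSP described above: find an attack bias that
# satisfies both the stealthiness and the impact constraints.

def find_stealthy_bias(required_ct, min_safe_ct, tol, step=0.05):
    """Search for the smallest bias b (in minutes) an attacker can add
    to the reported disinfectant contact time such that:
      - stealthiness: 0 <= b <= tol, so the inflated reading still
        passes a tolerance-based validation check, and
      - impact: the actual contact time (required_ct - b) falls below
        the minimum safe value min_safe_ct.
    Returns None when the constraints are jointly unsatisfiable."""
    k = 0
    while k * step <= tol:
        b = k * step
        if required_ct - b < min_safe_ct:
            return b
        k += 1
    return None
```

With a 3-minute validation tolerance and a 2-minute safety margin a stealthy attack exists; widen the margin beyond the tolerance and the search fails, mirroring the SAT/UNSAT answer an SMT encoding would return.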
Sensor Selection for Detecting Deviations from a Planned Itinerary
Suppose an agent asserts that it will move through an environment in some way. When the agent executes its motion, how does one verify the claim? The problem arises in a range of contexts including validating safety claims about robot behavior, applications in security and surveillance, and for both the conception and the (physical) design and logistics of scientific experiments. Given a set of feasible sensors to select from, we ask how to choose sensors optimally in order to ensure that the agent's execution does indeed fit its pre-disclosed itinerary. Our treatment is distinguished from prior work in sensor selection by two aspects: the form the itinerary takes (a regular language of transitions) and that families of sensor choices can be grouped as a single choice. Both are intimately tied together, permitting construction of a product automaton because the same physical sensors (i.e., the same choice) can appear multiple times. This paper establishes the hardness of sensor selection for itinerary validation within this treatment, and proposes an exact algorithm based on an integer linear programming (ILP) formulation that is capable of solving problem instances of moderate size. We demonstrate its efficacy on small-scale case studies, including one motivated by wildlife tracking.
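As a rough illustration of the selection problem (not the paper's actual ILP, which operates over a product automaton built from the regular-language itinerary), the sketch below reduces validation to a covering question: every off-itinerary transition must be observable by some chosen sensor, and each choice carries one cost even if it covers several transitions. The sensor names, coverage sets, and costs are invented for the example, and a brute-force search stands in for the ILP solver.

```python
from itertools import combinations

def cheapest_cover(off_itinerary, coverage, cost):
    """coverage maps a sensor choice to the set of off-itinerary
    transitions it can observe; cost maps it to a price. Return the
    cheapest subset of choices that observes every off-itinerary
    transition, with its total cost, or (None, inf) if none suffices."""
    sensors = list(coverage)
    best, best_cost = None, float("inf")
    for r in range(len(sensors) + 1):
        for subset in combinations(sensors, r):
            seen = set().union(*(coverage[s] for s in subset)) if subset else set()
            total = sum(cost[s] for s in subset)
            if off_itinerary <= seen and total < best_cost:
                best, best_cost = set(subset), total
    return best, best_cost
```

Exhaustive enumeration is exponential in the number of sensor choices, which is exactly why the paper resorts to an ILP formulation for instances of moderate size.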
- Award ID(s):
- 1849291
- PAR ID:
- 10333050
- Date Published:
- Journal Name:
- IEEE/RSJ International Conference on Intelligent Robots and Systems
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
In deep learning (DL) based human activity recognition (HAR), sensor selection seeks to balance prediction accuracy and sensor utilization (how often a sensor is used). With advances in on-device inference, sensors have become tightly integrated with DL, often restricting access to the underlying model. Given only sensor predictions, how can we derive a selection policy that classifies efficiently while maximizing accuracy? We propose a cascaded inference approach which, given the prediction of any one sensor, determines whether to query all other sensors. Typically, cascades use a sequence of classifiers that terminates once the confidence of a classifier exceeds a threshold. However, a threshold-based policy for sensor selection may be suboptimal; we define a more general class of policies that can surpass the threshold policy. We extend to settings where little or no labeled data is available for tuning the policy. Our analysis is validated on three HAR datasets by improving upon the F1-score of a threshold policy across several utilization budgets. Overall, our work enables practical analytics for HAR by relaxing the requirement of labeled data for sensor selection and reducing sensor utilization to directly extend a sensor system's lifetime.
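The threshold policy that this work improves upon can be sketched in a few lines. This is a generic cascade, not the authors' learned policy: the confidence measure (max probability), the fusion rule (probability averaging), and the 0.8 threshold are all illustrative assumptions.

```python
def cascade_predict(p_first, query_rest, threshold=0.8):
    """Threshold cascade: accept the first sensor's class when its
    confidence (max probability) clears `threshold`; otherwise query
    the remaining sensors and average all probability vectors.
    query_rest() stands in for the expensive all-sensor pass and
    returns a list of per-sensor probability vectors."""
    conf = max(p_first)
    if conf >= threshold:
        return p_first.index(conf), False   # (predicted class, queried all?)
    rest = query_rest()
    fused = [sum(ps) / len(ps) for ps in zip(p_first, *rest)]
    return fused.index(max(fused)), True
```

The second return value is the utilization signal: how often it is True is exactly the sensor-utilization cost the selection policy trades off against accuracy.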
-
As the market for autonomous vehicles advances, the need for robust safety protocols also increases. Autonomous vehicles rely on sensors to understand their operating environment. Active sensors such as camera, LiDAR, ultrasonic, and radar are vulnerable to physical-channel attacks. One way to counter these attacks is to pattern-match the sensor data with the sensor's own unique physical distortions, commonly referred to as a fingerprint. This fingerprint exists because of how the sensor was manufactured, and it can be used to determine the transmitting sensor from the received waveform. In this paper, using an ultrasonic sensor, we establish that there exists a specific distortion profile in the transmitted waveform, called a physical fingerprint, that can be attributed to the sensor's intrinsic characteristics. We propose a joint time-frequency analysis-based framework for ultrasonic sensor fingerprint extraction and use the fingerprint as a feature to train a Naive Bayes classifier. The trained model is then used for transmitter identification from the received physical waveform.
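To make the classification stage concrete, here is a minimal Gaussian Naive Bayes written from scratch. The joint time-frequency feature extraction itself is omitted; the feature layout (plain lists of floats, e.g. per-band energies), the class labels, and the variance floor are illustrative assumptions, not details from the paper.

```python
import math

class TinyGaussianNB:
    """Minimal Gaussian Naive Bayes over fingerprint feature vectors."""

    def fit(self, X, y):
        self.params = {}
        for c in set(y):
            rows = [x for x, label in zip(X, y) if label == c]
            stats = []
            for col in zip(*rows):
                mu = sum(col) / len(col)
                # Variance floor guards against zero-variance features.
                var = max(sum((v - mu) ** 2 for v in col) / len(col), 1e-9)
                stats.append((mu, var))
            self.params[c] = stats
        return self

    def predict(self, x):
        def log_likelihood(c):
            return sum(
                -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
                for v, (mu, var) in zip(x, self.params[c])
            )
        return max(self.params, key=log_likelihood)
```

Each transmitter becomes one class; a received waveform is reduced to a feature vector and assigned to the transmitter whose per-feature Gaussians explain it best.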
-
On minimal tests of sensor veracity for dynamic watermarking-based defense of cyber-physical systems
We address the problem of security of cyber-physical systems where some sensors may be malicious. We consider a multiple-input, multiple-output stochastic linear dynamical system controlled over a network of communication and computational nodes which contains (i) a controller that computes the inputs to be applied to the physical plant, (ii) actuators that apply these inputs to the plant, and (iii) sensors which measure the outputs of the plant. Some of these sensors, however, may be malicious. The malicious sensors do not report the true measurements to the controller. Rather, they report false measurements that they fabricate, possibly strategically, so as to achieve any objective that they may have, such as destabilizing the closed-loop system or increasing its running cost. Recently, it was shown that under certain conditions, an approach of "dynamic watermarking" can secure such a stochastic linear dynamical system in the sense that either the presence of malicious sensors in the system is detected, or the malicious sensors are constrained to adding a distortion that can only be of zero power to the noise already entering the system. The first contribution of this paper is to generalize this result to partially observed MIMO systems with both process and observation noises, a model which encompasses some of the previous models for which dynamic watermarking was established to guarantee security. This result, similar to the prior ones, is shown to hold when the controller subjects the reported sequence of measurements to two particular tests of veracity. The second contribution of this paper is in showing, via counterexamples, that both of these tests are needed in order to secure the control system in the sense that if any one of these two tests of sensor veracity is dropped, then the above guarantee does not hold. The proposed approach has several potential applications, including in smart grids, automated transportation, and process control.
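A scalar simulation conveys the watermarking idea, though it is far simpler than the partially observed MIMO setting analyzed in the paper. The dynamics, noise levels, and the single correlation statistic below are illustrative assumptions; the paper's actual defense uses two veracity tests.

```python
import random

def watermark_stat(attack, n=20000, seed=0):
    """Scalar sketch of dynamic watermarking. The controller injects a
    private excitation e_t on top of a stabilizing input; an honest
    sensor's report of the state correlates with the one-step-delayed
    watermark, while a fabricated report does not.
    Plant: x_{t+1} = a*x_t + u_t + w_t with u_t = -a*x_t + e_t, so the
    state reduces to x_{t+1} = e_t + w_t. Test statistic: the
    empirical mean of e_{t-1} * y_t."""
    rng = random.Random(seed)
    a, x, e_prev, stat = 0.5, 0.0, 0.0, 0.0
    for _ in range(n):
        # Sensor reports the current state (honest) or a fabrication.
        y = rng.gauss(0, 1) if attack else x
        stat += e_prev * y
        e = rng.gauss(0, 1)                  # private watermark, unknown to attacker
        w = rng.gauss(0, 0.1)                # process noise
        x = a * x + (-a * x + e) + w         # next state = e + w
        e_prev = e
    return stat / n
```

With honest sensors the statistic concentrates near the watermark power (here 1); a fabricated measurement stream drives it toward 0, which is roughly what a correlation-based veracity test detects.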
-
Explaining why animal groups vary in size is a fundamental problem in behavioral ecology. One hypothesis is that life-history differences among individuals lead to sorting of phenotypes into groups of different sizes where each individual does best. This hypothesis predicts that individuals should be relatively consistent in their use of particular group sizes across time. Little is known about whether animals' choice of group size is repeatable across their lives, especially in long-lived species. We studied consistency in choice of breeding-colony size in colonially nesting cliff swallows (Petrochelidon pyrrhonota) in western Nebraska, United States, over a 32-year period, following 6,296 birds for at least four breeding seasons. Formal repeatability of size choice for the population was about 0.41. About 45% of individuals were relatively consistent in choice of colony size, while about 40% varied widely in the colony size they occupied. Birds using the smaller and larger colonies appeared more consistent in size use than birds occupying more intermediate-sized colonies. Consistency in colony size was also influenced by whether a bird used the same physical colony site each year and whether the site had been fumigated to remove ectoparasites. The difference between the final and initial colony sizes for an individual, a measure of the net change in its colony size over its life, did not significantly depart from 0 for the dataset as a whole. However, different year-cohorts did show significant net change in colony size, both positive and negative, that may have reflected fluctuating selection on colony size among years based on climatic conditions. The results support phenotypic sorting as an explanation for group size variation, although cliff swallows also likely use past experience at a given site and the extent of ectoparasitism to select breeding colonies.

