Title: Persistence landscapes of affine fractals
Abstract: We develop a method for calculating the persistence landscapes of affine fractals using the parameters of the corresponding transformations. Given an iterated function system of affine transformations that satisfies a certain compatibility condition, we prove that there exists an affine transformation acting on the space of persistence landscapes which intertwines the action of the iterated function system. This latter affine transformation is a strict contraction, and its unique fixed point is the persistence landscape of the affine fractal. We present several examples of the theory and confirm the main results through simulations.
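As background for the abstract above, a minimal sketch of the standard persistence-landscape definition in plain Python (this is the general textbook definition, not the paper's fractal-specific construction; the toy diagram is hypothetical):

```python
# Standard persistence landscape of a diagram {(b_i, d_i)}:
# lambda_k(t) = k-th largest value of max(0, min(t - b_i, d_i - t)).
# Toy example only -- not the paper's method for affine fractals.

def landscape(diagram, k, t):
    """k-th landscape function (k = 1 is the outermost layer) at parameter t."""
    vals = sorted((max(0.0, min(t - b, d - t)) for b, d in diagram),
                  reverse=True)
    return vals[k - 1] if k <= len(vals) else 0.0

diagram = [(0.0, 4.0), (1.0, 3.0)]   # hypothetical persistence diagram
print(landscape(diagram, 1, 2.0))    # peak of the (0, 4) tent: 2.0
print(landscape(diagram, 2, 2.0))    # second layer, from (1, 3): 1.0
```

Each diagram point contributes a "tent" function, and the k-th landscape takes the k-th largest tent value pointwise.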
Award ID(s):
1830254 1934884
PAR ID:
10329468
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
Demonstratio Mathematica
Volume:
55
Issue:
1
ISSN:
2391-4661
Page Range / eLocation ID:
163 to 192
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Chechik, M.; Katoen, JP.; Leucker, M. (Ed.)
    Efficient verification algorithms for neural networks often depend on various abstract domains such as intervals, zonotopes, and linear star sets. The choice of the abstract domain presents an expressiveness vs. scalability trade-off: simpler domains are less precise but yield faster algorithms. This paper investigates the octatope abstract domain in the context of neural net verification. Octatopes are affine transformations of n-dimensional octagons, i.e., sets of unit-two-variable-per-inequality (UTVPI) constraints. Octatopes generalize the idea of zonotopes, which can be viewed as affine transformations of a box. On the other hand, octatopes can be considered a restriction of linear star sets, which are affine transformations of arbitrary H-polytopes. This distinction places octatopes firmly between zonotopes and star sets in their expressive power, but what about the efficiency of decision procedures? An important analysis problem for neural networks is the exact range computation problem, which asks to compute the exact set of possible outputs given a set of possible inputs. For this, three computational procedures are needed: 1) optimization of a linear cost function; 2) affine mapping; and 3) over-approximating the intersection with a half-space. While zonotopes allow an efficient solution for these operations, star sets solve them via linear programming. We show that these operations are faster for octatopes than for the more expressive linear star sets. For octatopes, we reduce these problems to min-cost flow problems, which can be solved in strongly polynomial time using the Out-of-Kilter algorithm. Evaluating exact range computation on several ACAS Xu neural network benchmarks, we find that octatopes show promise as a practical abstract domain for neural network verification.
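To illustrate one of the three operations discussed above, a hedged sketch of why linear optimization is cheap for the simpler zonotope domain: over a zonotope {c + sum_i xi_i * g_i : xi_i in [-1, 1]}, a linear cost d attains the closed-form maximum d.c + sum_i |d.g_i|, so no LP is required (the example center and generators are hypothetical):

```python
# Closed-form linear optimization over a zonotope (center c, generators G).
# Each xi_i is chosen independently as sign(d . g_i), giving the maximum
# d.c + sum_i |d.g_i|. Example data is hypothetical.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def zonotope_max(d, c, G):
    """Exact maximum of <d, x> over {c + sum_i xi_i g_i : xi_i in [-1, 1]}."""
    return dot(d, c) + sum(abs(dot(d, g)) for g in G)

c = [1.0, 0.0]
G = [[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]]   # three generators in R^2
d = [1.0, -1.0]
print(zonotope_max(d, c, G))  # 1 + |1| + |-2| + |0| = 4.0
```

For UTVPI-constrained domains such as octatopes, the abstract states that the analogous optimization reduces instead to a min-cost flow problem.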
  2. This paper demonstrates the fundamental vulnerability of networked linear control systems to perfectly undetectable false data injection attacks (FDIAs) based on affine transformations. The work formulates a generalized FDIA framework that coordinates multiplicative and additive data injections targeting both control commands and observables in networked systems. The paper derives mathematical conditions for executing affine-transformation-based perfectly undetectable attacks (ATPAs) on state-feedback and output-feedback control systems, with attack capabilities varying based on the attacker's knowledge of plant dynamics and control gains. The paper examines several attack scenarios, including scaling and general affine transformations, and characterizes the range of system knowledge, from minimum to full, required for different attack types. The paper classifies ATPAs into four types based on the feedback structure (state or output) and knowledge requirements: those that match plant dynamics without controller knowledge and those that match closed-loop dynamics by exploiting controller information. Finally, the paper shows how carefully crafted ATPAs can create the illusion of normal system operation while the actual system behavior deviates significantly from the intended trajectories.
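A toy illustration of the flavor of such attacks (a minimal additive "covert" injection on a scalar state-feedback loop, not the paper's general ATPA framework; the plant and gain values are assumed for the example):

```python
# Toy covert false-data injection: the attacker adds delta to the actuator
# command and subtracts its simulated effect e from the sensor reading,
# so the operator's view follows the nominal closed loop exactly while the
# true state deviates. Plant x[k+1] = a*x + b*u, controller u = -K*y.
# All parameter values are hypothetical.
a, b, K = 1.1, 1.0, 0.9
N = 20
delta = 0.5                       # attacker's constant actuator injection

# Nominal (attack-free) closed loop.
x = 1.0
nominal = []
for _ in range(N):
    u = -K * x
    x = a * x + b * u
    nominal.append(x)

# Attacked loop.
x, e = 1.0, 0.0
reported, actual = [], []
for _ in range(N):
    y = x - e                     # falsified measurement seen by controller
    u = -K * y
    x = a * x + b * (u + delta)   # plant receives the corrupted command
    e = a * e + b * delta         # attacker's internal model of the deviation
    reported.append(x - e)
    actual.append(x)

undetectable = all(abs(r - n) < 1e-9 for r, n in zip(reported, nominal))
deviated = abs(actual[-1] - nominal[-1]) > 1.0
print(undetectable, deviated)    # True True
```

Because the reported measurement obeys the same recursion as the nominal loop, any detector watching only the sensor stream sees normal behavior.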
  3. Vedaldi, A. (Ed.)
    In state-of-the-art deep neural networks, both feature normalization and feature attention have become ubiquitous, but they are usually studied as separate modules. In this paper, we propose a light-weight integration of the two schemas and present Attentive Normalization (AN). Instead of learning a single affine transformation, AN learns a mixture of affine transformations and uses their weighted sum as the final affine transformation applied to re-calibrate features in an instance-specific way. The weights are learned by leveraging channel-wise feature attention. In experiments, we test the proposed AN using four representative neural architectures on the ImageNet-1000 classification benchmark and the MS-COCO 2017 object detection and instance segmentation benchmark. AN obtains consistent performance improvements for different neural architectures in both benchmarks, with an absolute increase in top-1 accuracy on ImageNet-1000 of between 0.5% and 2.7%, and absolute increases of up to 1.8% and 2.2% in bounding-box and mask AP on MS-COCO, respectively. We observe that the proposed AN provides a strong alternative to the widely used Squeeze-and-Excitation (SE) module. The source code is publicly available at \href{https://github.com/iVMCL/AOGNet-v2}{the ImageNet Classification Repo} and \href{https://github.com/iVMCL/AttentiveNorm\_Detection}{the MS-COCO Detection and Segmentation Repo}.
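A hedged, much-simplified sketch of the core AN idea for a single already-normalized channel (batch statistics, convolutions, and the learned attention sub-network are omitted; all values are hypothetical):

```python
# Simplified Attentive Normalization idea: keep K candidate affine pairs
# (gamma_k, beta_k) and blend them with instance-specific attention weights,
# instead of using one fixed (gamma, beta). Toy values only.
import math

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def attentive_affine(x_norm, gammas, betas, logits):
    """Apply the attention-weighted mixture of K affine transformations."""
    w = softmax(logits)                       # instance-specific weights
    gamma = sum(wi * g for wi, g in zip(w, gammas))
    beta = sum(wi * b for wi, b in zip(w, betas))
    return [gamma * v + beta for v in x_norm]

x = [0.0, 1.0, -1.0]                # an already-normalized feature channel
out = attentive_affine(x, gammas=[1.0, 2.0], betas=[0.0, 0.5],
                       logits=[0.0, 0.0])
print(out)  # equal weights -> gamma = 1.5, beta = 0.25: [0.25, 1.75, -1.25]
```

In the real module, the logits come from a small channel-attention network conditioned on the input, so the effective affine transformation differs per instance.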
  4. Efficient verification algorithms for neural networks often depend on various abstract domains such as intervals, zonotopes, and linear star sets. The choice of the abstract domain presents an expressiveness vs. scalability trade-off: simpler domains are less precise but yield faster algorithms. This paper investigates the hexatope and octatope abstract domains in the context of neural net verification. Hexatopes are affine transformations of higher-dimensional hexagons, defined by difference constraint systems, and octatopes are affine transformations of higher-dimensional octagons, defined by unit-two-variable-per-inequality constraint systems. These domains generalize the idea of zonotopes, which can be viewed as affine transformations of hypercubes. On the other hand, they can be considered a restriction of linear star sets, which are affine transformations of arbitrary H-polytopes. This distinction places hexatopes and octatopes firmly between zonotopes and linear star sets in their expressive power, but what about the efficiency of decision procedures? An important analysis problem for neural networks is the exact range computation problem, which asks to compute the exact set of possible outputs given a set of possible inputs. For this, three computational procedures are needed: (1) optimization of a linear cost function; (2) affine mapping; and (3) over-approximating the intersection with a half-space. While zonotopes allow an efficient solution for these operations, star sets solve them via linear programming. We show that these operations are faster for hexatopes and octatopes than for the more expressive linear star sets by reducing the linear optimization problem over these domains to a minimum-cost network flow problem, which can be solved in strongly polynomial time using the Out-of-Kilter algorithm. Evaluating exact range computation on several ACAS Xu neural network benchmarks, we find that hexatopes and octatopes show promise as practical abstract domains for neural network verification.
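Of the three procedures above, affine mapping is exact and essentially free for star-set-style representations x = c + V*alpha: the map x -> Ax + b transforms only the center and basis matrix and leaves the constraints on alpha untouched. A hedged pure-Python sketch (the example matrices are hypothetical):

```python
# Exact affine mapping of a set represented as {c + V alpha : alpha in P}:
# applying x -> A x + b gives center A c + b and basis A V, with the
# constraint set P on alpha unchanged. Example data is hypothetical.

def mat_vec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def affine_map(A, b, c, V):
    """Map the set {c + V alpha} through x -> A x + b."""
    new_c = [ci + bi for ci, bi in zip(mat_vec(A, c), b)]
    new_V = [[sum(A[i][k] * V[k][j] for k in range(len(V)))
              for j in range(len(V[0]))] for i in range(len(A))]
    return new_c, new_V

c = [1.0, 0.0]
V = [[1.0, 0.0], [0.0, 1.0]]       # basis (generator) matrix
A = [[2.0, 0.0], [0.0, 3.0]]
b = [1.0, -1.0]
print(affine_map(A, b, c, V))      # ([3.0, -1.0], [[2.0, 0.0], [0.0, 3.0]])
```

This is why the cost differences between the domains show up in the other two operations, linear optimization and half-space intersection, rather than in affine mapping.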
  5. Abstract: We provide necessary and sufficient conditions for joint ergodicity results for systems of commuting measure-preserving transformations with an iterated Hardy field function of polynomial growth. Our method builds on and improves recent techniques due to Frantzikinakis and Tsinas, who dealt with multiple ergodic averages along Hardy field functions; it also enhances an approach introduced by the authors and Ferré Moragues to study polynomial iterates. We also study the more general expression in which the iterate is a linear combination of a Hardy field function of polynomial growth and a tempered function.
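For orientation, the multiple ergodic averages referred to above typically take the following form (a hedged sketch; the floor function, the number of transformations, and the Hardy field function a are generic notation, not taken from this paper's statements):

```latex
% Multiple ergodic averages along a Hardy field function a(n) of
% polynomial growth, for commuting measure-preserving transformations
% T_1, ..., T_\ell on a probability space:
\frac{1}{N} \sum_{n=1}^{N}
  T_1^{\lfloor a(n) \rfloor} f_1 \cdot
  T_2^{\lfloor a(n) \rfloor} f_2 \cdots
  T_\ell^{\lfloor a(n) \rfloor} f_\ell
```

Joint ergodicity asks when such averages converge in L^2 to the product of the integrals of the f_i.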