Title: TROPHY: A Topologically Robust Physics-Informed Tracking Framework for Tropical Cyclones
Award ID(s):
2145499 1910733
PAR ID:
10440939
Author(s) / Creator(s):
Publisher / Repository:
IEEE
Date Published:
Journal Name:
IEEE Transactions on Visualization and Computer Graphics
Volume:
30
Issue:
1
ISSN:
1077-2626
Subject(s) / Keyword(s):
Feature tracking; robustness; topology-based methods in visualization; applications; climate science; tropical cyclones
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Existing error-bounded lossy compression techniques control the pointwise error during compression to guarantee the integrity of the decompressed data. However, they typically do not explicitly preserve the topological features in the data. When decompressed data are analyzed post hoc with topological methods, it is desirable to preserve topology during compression so that the analysis yields topologically consistent and correct scientific insights. In this paper, we introduce TopoSZ, an error-bounded lossy compression method that preserves the topological features in 2D and 3D scalar fields. Specifically, we aim to preserve the types and locations of local extrema, as well as the level set relations among critical points captured by contour trees, in the decompressed data. The main idea is to derive topological constraints from a contour-tree-induced segmentation of the data domain and to incorporate these constraints into a customized error-controlled quantization strategy based on the SZ compressor (version 1.4). Our method allows users to control both the pointwise error and the loss of topological features during compression via a global error bound and a persistence threshold.
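     To make the guarantee above concrete, here is a minimal, illustrative Python sketch of the property such a compressor targets: the decompressed field stays within a global error bound and keeps the same strict local extrema. All names are hypothetical, and the extremum test is a simple neighborhood check rather than the contour-tree comparison (with persistence-based simplification) the paper actually uses.

        import numpy as np

        def local_extrema(field):
            # Boolean masks of strict local maxima/minima of a 2D scalar
            # field over the 4-neighborhood (interior vertices only).
            c = field[1:-1, 1:-1]
            nbrs = [field[:-2, 1:-1], field[2:, 1:-1],
                    field[1:-1, :-2], field[1:-1, 2:]]
            is_max = np.logical_and.reduce([c > n for n in nbrs])
            is_min = np.logical_and.reduce([c < n for n in nbrs])
            return is_max, is_min

        def topology_preserved(original, decompressed, error_bound):
            # Pointwise error control plus matching extremum locations and
            # types; a persistence threshold would additionally let small
            # extrema be simplified away before this comparison.
            if np.max(np.abs(original - decompressed)) > error_bound:
                return False
            omax, omin = local_extrema(original)
            dmax, dmin = local_extrema(decompressed)
            return bool(np.array_equal(omax, dmax) and np.array_equal(omin, dmin))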
  2. The objective of this work is to develop error-bounded lossy compression methods that preserve topological features in 2D and 3D vector fields. Specifically, we explore the preservation of critical points in piecewise linear and bilinear vector fields. We define critical point preservation as requiring that the decompressed data contain no false positives, false negatives, or false types: (1) each critical point remains in its original cell, and (2) each critical point retains its type (e.g., saddle or attracting node). The key to our method is to adapt a vertex-wise error bound for each grid point and to compress the input data together with the error bound field using a modified lossy compressor. Our compression algorithm can also be embarrassingly parallelized for large-data handling and in situ processing. We benchmark our method against existing lossy compressors in terms of false positive/negative/type rates, compression ratio, and various vector field visualizations across several scientific applications.
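     A small sketch of the two ingredients in item 2, under simplifying assumptions: a 2D grid, a corner sign-change test as a stand-in for exact piecewise linear/bilinear critical point detection, and an illustrative safety factor for the vertex-wise bound. The function names and interface are invented, not from the paper.

        import numpy as np

        def candidate_critical_cells(u, v):
            # A cell can contain a critical point only if both vector
            # components change sign among its four corners.
            def sign_change(f):
                corners = np.stack([f[:-1, :-1], f[1:, :-1],
                                    f[:-1, 1:], f[1:, 1:]])
                return (corners.min(axis=0) < 0) & (corners.max(axis=0) > 0)
            return sign_change(u) & sign_change(v)

        def vertexwise_error_bound(u, v, global_bound, safety=0.5):
            # Perturbing a vertex by less than safety * min(|u|, |v|) cannot
            # flip the sign of either component there, so the sign-change
            # pattern (and hence the candidate cells) survives compression.
            return np.minimum(global_bound,
                              safety * np.minimum(np.abs(u), np.abs(v)))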
  3. Vast volumes of data are produced by today's scientific simulations and advanced instruments. These data cannot be stored and transferred efficiently because of limited I/O bandwidth, network speed, and storage capacity. Error-bounded lossy compression can be an effective method for addressing these issues: not only can it significantly reduce data size, but it can also control the data distortion based on user-defined error bounds. In practice, many scientific applications have specific requirements or constraints for lossy compression in order to guarantee that the reconstructed data are valid for post hoc analysis. For example, some datasets contain irrelevant data that should be isolated, and users often have intuition regarding value ranges, geospatial regions, and other data subsets that are crucial for subsequent analysis. Existing state-of-the-art error-bounded lossy compressors, however, do not consider these constraints during compression, resulting in inferior compression ratios with respect to the user's post hoc analysis, because bits are spent faithfully encoding data that provide little or no value for that analysis. In this work we address this issue by proposing an optimized framework that can preserve diverse constraints during error-bounded lossy compression, e.g., cleaning the irrelevant data, efficiently preserving different precision for multiple value intervals, and allowing users to set diverse precision over both regular and irregular regions. We perform our evaluation on a supercomputer with up to 2,100 cores. Experiments with six real-world applications show that our diverse-constraint-based error-bounded lossy compressor can obtain higher visual quality or data fidelity on reconstructed data at the same or even higher compression ratios compared with the traditional state-of-the-art compressor SZ. Our experiments also demonstrate very good scalability in compression performance relative to the I/O throughput of the parallel file system.
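     The constraint types listed in item 3 can be pictured as a per-point error-bound map handed to an error-bounded compressor. The hypothetical helper below shows one way to combine a default bound, value-interval rules, region rules, and an "irrelevant data" mask; the interface is invented for illustration and is not the framework's actual API.

        import numpy as np

        def build_error_map(data, default_bound, interval_rules=(),
                            region_rules=(), irrelevant_mask=None):
            # interval_rules: iterable of ((lo, hi), bound), keyed on value range.
            # region_rules:   iterable of (boolean_mask, bound), keyed on location.
            bounds = np.full(data.shape, float(default_bound))
            for (lo, hi), b in interval_rules:
                bounds[(data >= lo) & (data <= hi)] = b
            for mask, b in region_rules:
                bounds[mask] = np.minimum(bounds[mask], b)
            if irrelevant_mask is not None:
                bounds[irrelevant_mask] = np.inf  # "cleaned" data: any distortion is fine
            return bounds

     For instance, build_error_map(temp, 1e-2, interval_rules=[((270.0, 310.0), 1e-4)]) would protect a physically interesting temperature band with a hundredfold tighter bound while compressing the rest aggressively.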
  4. Data compression is a powerful solution for addressing big data challenges in databases and data management. In scientific data compression for vector fields, preserving topological information is essential for accurate analysis and visualization. The topological skeleton, a fundamental component of vector field topology, consists of critical points and their connectivity, known as separatrices. While previous work has focused on preserving critical points in error-controlled lossy compression, little attention has been given to preserving separatrices, which are equally important. In this work, we introduce TspSZ, an efficient error-bounded lossy compression framework designed to preserve both critical points and separatrices. Our key contributions are threefold. First, we propose TspSZ, a topological-skeleton-preserving lossy compression framework that integrates two algorithms, allowing existing critical-point-preserving compressors to also retain separatrices and thereby significantly enhancing their ability to preserve topological structures. Second, we optimize TspSZ for efficiency through tailored improvements and parallelization: we introduce a new error control mechanism to achieve high compression ratios and implement a shared-memory parallelization strategy to boost compression throughput. Third, we evaluate TspSZ against state-of-the-art lossy and lossless compressors on four real-world scientific datasets. Experimental results show that TspSZ achieves compression ratios of up to 7.7x while effectively preserving the topological skeleton, enabling efficient storage and transmission of scientific data without compromising topological integrity.
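     Separatrix preservation, as targeted in item 4, can be checked by tracing the same streamline seed in the original and decompressed fields and comparing the paths. This is a deliberately naive sketch (forward-Euler steps, nearest-neighbor sampling, an assumed (H, W, 2) array layout); TspSZ's actual tracing and error control mechanism are not shown.

        import numpy as np

        def trace_streamline(vec, seed, step=0.25, n_steps=400):
            # Forward-Euler streamline tracing with nearest-neighbor
            # sampling; production tracers interpolate and adapt the step.
            path = [np.asarray(seed, dtype=float)]
            hi = np.array(vec.shape[:2]) - 1
            for _ in range(n_steps):
                i, j = np.clip(np.round(path[-1]).astype(int), 0, hi)
                d = vec[i, j]
                norm = np.linalg.norm(d)
                if norm < 1e-12:
                    break  # reached (numerically) a critical point
                path.append(path[-1] + step * d / norm)
            return np.array(path)

        def separatrix_preserved(vec_orig, vec_dec, saddle_seed, tol=1.0):
            # Compare the separatrix traced from the same saddle seed in
            # the original and decompressed fields, point by point.
            p0 = trace_streamline(vec_orig, saddle_seed)
            p1 = trace_streamline(vec_dec, saddle_seed)
            m = min(len(p0), len(p1))
            return float(np.max(np.linalg.norm(p0[:m] - p1[:m], axis=1))) <= tol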
  5. This research explores a novel paradigm for preserving topological segmentations in existing error-bounded lossy compressors. Today's lossy compressors rarely consider preserving topological abstractions such as Morse-Smale complexes, and discrepancies in topology between the original and decompressed datasets can lead to erroneous interpretations or even incorrect scientific conclusions. In this paper, we focus on preserving Morse-Smale segmentations in 2D/3D piecewise linear scalar fields, targeting the precise reconstruction of the minimum/maximum labels induced by the integral line of each vertex. The key is to derive a series of edits during compression; the edits are applied to the decompressed data, leading to an accurate reconstruction of the segmentation while keeping the error within the prescribed bound. To this end, we developed a workflow that fixes extrema and integral lines alternately until convergence within finitely many iterations, and we accelerate each workflow component with shared-memory/GPU parallelism to make the performance practical for coupling with compressors. We demonstrate use cases with fluid dynamics, ocean, and cosmology application datasets, achieving significant acceleration with an NVIDIA A100 GPU.
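     The "derive edits until the segmentation matches" loop of item 5 can be sketched for a 2D scalar field using discrete steepest ascent to a maximum as the per-vertex label. The edit used here simply restores mislabeled vertices to their original values (trivially within any error bound); the paper's actual edits are more frugal, and its alternating extremum/integral-line fixes are more refined. All names are illustrative.

        import numpy as np

        def ascent_labels(field):
            # Label each vertex with the flat index of the local maximum
            # reached by discrete steepest ascent over the 4-neighborhood.
            H, W = field.shape
            labels = np.empty((H, W), dtype=np.int64)
            for i in range(H):
                for j in range(W):
                    x, y = i, j
                    while True:
                        bx, by = x, y
                        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            nx, ny = x + dx, y + dy
                            if 0 <= nx < H and 0 <= ny < W and field[nx, ny] > field[bx, by]:
                                bx, by = nx, ny
                        if (bx, by) == (x, y):
                            break  # vertex is a local maximum
                        x, y = bx, by
                    labels[i, j] = x * W + y
            return labels

        def repair_segmentation(original, decompressed, max_iters=10):
            # Alternate between recomputing labels and editing mislabeled
            # vertices until the segmentation matches the original.
            target = ascent_labels(original)
            dec = decompressed.copy()
            for _ in range(max_iters):
                bad = ascent_labels(dec) != target
                if not bad.any():
                    break
                dec[bad] = original[bad]
            return dec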