A finite element elasticity complex on tetrahedral meshes and the corresponding commutative diagram are devised. The H^1-conforming finite element is the element developed by Neilan for the velocity field in a discrete Stokes complex. The symmetric div-conforming finite element is the Hu-Zhang element for stress tensors. The focus of this paper is the construction of an H(inc)-conforming finite element for symmetric tensors of minimum polynomial degree 6. Our construction appears to be the first H(inc)-conforming finite element on tetrahedral meshes without further splitting. The key tools of the construction are the decomposition of polynomial tensor spaces and the characterization of the trace of the inc operator. A polynomial elasticity complex and a Koszul elasticity complex are created to derive the decomposition. The trace of the inc operator is induced from a Green's identity. Trace complexes and bubble complexes are also derived to facilitate the construction. Two-dimensional smooth finite element Hessian and div div complexes are constructed as well.
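For context, the three-dimensional elasticity complex underlying such a diagram is commonly written in the following form (a standard presentation; the paper's precise spaces and regularity assumptions may differ):

```latex
\boldsymbol{RM} \xrightarrow{\;\subset\;} H^1(\Omega;\mathbb{R}^3)
\xrightarrow{\;\operatorname{def}\;} H(\operatorname{inc},\Omega;\mathbb{S})
\xrightarrow{\;\operatorname{inc}\;} H(\operatorname{div},\Omega;\mathbb{S})
\xrightarrow{\;\operatorname{div}\;} L^2(\Omega;\mathbb{R}^3) \longrightarrow 0,
```

where RM is the six-dimensional space of rigid motions, def u = (∇u + (∇u)^T)/2 is the symmetric gradient, and inc σ = ∇×σ×∇ is the incompatibility operator on symmetric tensor fields S.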
Two Decades of Industrializing Formal Verification: The Reactis Story
Reactis® is a suite of tools produced by Reactive Systems, Inc. (RSI), for automated test generation from, and verification of, systems given in either the modeling languages MATLAB® / Simulink® / Stateflow® of The MathWorks, Inc., or ANSI C. RSI was founded by three of the authors of this paper in 1999, with the first release of Reactis coming in 2002; the tools are used in the testing and validation of embedded control systems in a variety of industries, including automotive and aerospace / defense. This paper traces the development of the Reactis tool suite from earlier research on model-checking tools undertaken by the authors and others, highlighting the importance of both the foundational basis of Reactis and the essential adaptations and extensions needed for a commercially successful product.
- Award ID(s):
- 1918225
- PAR ID:
- 10556851
- Editor(s):
- Springer
- Publisher / Repository:
- https://doi.org/10.1007/978-3-031-66149-5_5
- Date Published:
- Subject(s) / Keyword(s):
- Reactis Model-based testing MATLAB / Simulink / Stateflow Guided simulation Instrumentation-based specification and verification
- Format(s):
- Medium: X
- Location:
- Luxembourg City, Luxembourg
- Sponsoring Org:
- National Science Foundation
More Like this
-
Recent advances in complex automated handwriting identification systems have led to a lack of understandability of these systems' computational processes and features by the forensic handwriting examiners that they are designed to support. To mitigate this issue, this research studied the relationship between two systems: FLASH ID®, an automated handwriting/black box system that uses measurements extracted from a static image of handwriting, and MovAlyzeR®, a system that captures kinematic features from pen strokes. For this study, 33 writers each wrote 60 phrases from the London Letter using cursive writing and handprinting, which led to thousands of sample pairs for analysis. The dissimilarities between pairs of samples were calculated using two score functions (one for each system). The observed results indicate that dissimilarity scores based on kinematic spatial-geometric pen stroke features (e.g., amplitude and slant) have a statistically significant relationship with dissimilarity scores obtained using static, graph-based features used by the FLASH ID® system. Similar relationships were observed for temporal features (e.g., duration and velocity) but not pen pressure, and for both handprinting and cursive samples. These results strongly imply that both the current implementation of FLASH ID® and MovAlyzeR® rely on similar feature sets when measuring differences in pairs of handwritten samples. These results suggest that studies of biometric discrimination using MovAlyzeR®, specifically those based on the spatial-geometric feature set, support the validity of biometric matching algorithms based on FLASH ID® output.
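The score-relationship analysis described above can be sketched with a rank (Spearman) correlation between the two systems' per-pair dissimilarity scores. The score values and function names below are hypothetical illustrations, not the study's actual data or statistical method:

```python
# Hypothetical sketch: comparing dissimilarity scores from two handwriting
# systems using a Spearman rank correlation. All numbers are fabricated.

def _ranks(values):
    # 0-based rank of each value; ties are not averaged in this sketch
    # (real score data with repeats would need tie-averaged ranks).
    order = sorted(range(len(values)), key=values.__getitem__)
    ranks = [0.0] * len(values)
    for r, i in enumerate(order):
        ranks[i] = float(r)
    return ranks

def spearman(x, y):
    """Spearman rank correlation of two equal-length score sequences."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Per-pair dissimilarity scores (made-up numbers, for illustration only):
static_scores = [0.12, 0.80, 0.45, 0.33, 0.91]     # graph-based features
kinematic_scores = [0.20, 0.75, 0.30, 0.50, 0.88]  # pen-stroke features
print(spearman(static_scores, kinematic_scores))   # → 0.9
```

A strong positive rank correlation between the two score sequences is the kind of evidence the study uses to argue that the two systems respond to similar underlying features.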
-
This data set consists of 3,244 gridded, daily averaged temperature, practical salinity, potential density, and dissolved oxygen profiles. These profiles were collected from October 2014 to May 2025 by the NSF Ocean Observatories Initiative Washington Offshore Profiler Mooring (CE09OSPM) located at 46.8517°N, 124.982°W between approximately 35 and 510 meters water depth using a McLane® Moored Profiler (MMP). The MMP was equipped with a Sea-Bird Scientific 52-MP (SBE 52-MP) CTD instrument and an associated Sea-Bird Scientific (SBE 43F) dissolved oxygen sensor. Raw binary data files [C*.DAT (CTD data); E*.DAT (engineering data plus auxiliary sensor data) and A*.DAT (current meter data)] were converted to ASCII text files using the McLane® Research Laboratories, Inc. Profile Unpacker v3.10 application. Dissolved oxygen calibration files for each of the twenty deployments were downloaded from the Ocean Observatories Initiative asset-management GitHub® repository. The unpacked C*.TXT (CTD data); E*.TXT (engineering data plus auxiliary sensors) and A*.TXT (current meter data) ASCII data files associated with each deployment were processed using a MATLAB® toolbox that was specifically created to process OOI MMP data. The toolbox imports MMP A*.TXT, C*.TXT, and E*.TXT data files, and applies the necessary calibration coefficients and data corrections, including adjusting for thermal-lag, flow, and sensor time constant effects. mmp_toolbox calculates dissolved oxygen concentration using the methods described in Owens and Millard (1985) and Garcia and Gordon (1992). Practical salinity and potential density are derived using the Gibbs-SeaWater Oceanographic Toolbox. After the corrections and calculations for each profile are complete, the data are binned in space to create a final, 0.5-dbar binned data set. 
The more than 24,000 individual temperature, practical salinity, pressure, potential density, and dissolved oxygen profiles were temporally averaged to form the final, daily averaged data set presented here. Using the methods described in Risien et al. (2023), daily temperature, practical salinity, potential density, and dissolved oxygen climatologies were calculated for each 0.5-dbar depth bin using a three-harmonic fit (1, 2, and 3 cycles per year) based on the 10-year period January 2015 to December 2024.
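The three-harmonic climatology step can be sketched as follows. This is an illustrative projection-based fit that assumes roughly uniform daily sampling over whole years; the dataset's actual processing follows Risien et al. (2023), and the function names here are hypothetical:

```python
import math

def harmonic_climatology(day_of_year, values, n_harmonics=3, period=365.25):
    """Fit mean + n_harmonics annual harmonics by direct projection.
    Assumes near-uniform sampling over whole years; a least-squares
    solve is more robust for gappy records."""
    n = len(values)
    mean = sum(values) / n
    coeffs = []
    for k in range(1, n_harmonics + 1):
        a = 2.0 / n * sum(v * math.cos(2 * math.pi * k * d / period)
                          for d, v in zip(day_of_year, values))
        b = 2.0 / n * sum(v * math.sin(2 * math.pi * k * d / period)
                          for d, v in zip(day_of_year, values))
        coeffs.append((a, b))
    return mean, coeffs

def evaluate_climatology(mean, coeffs, d, period=365.25):
    """Climatological value on day d from the fitted harmonics."""
    return mean + sum(
        a * math.cos(2 * math.pi * k * d / period)
        + b * math.sin(2 * math.pi * k * d / period)
        for k, (a, b) in enumerate(coeffs, start=1))

# Synthetic check: a pure annual cycle should be recovered exactly.
days = list(range(365))
temps = [10.0 + 2.0 * math.cos(2 * math.pi * d / 365.0) for d in days]
mean, coeffs = harmonic_climatology(days, temps, period=365.0)
```

In the real workflow this fit would be applied independently to each 0.5-dbar depth bin of each variable, yielding a smooth day-of-year climatology per bin.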
-
This dataset includes aerosol microphysics and chemical measurements collected at Mt. Soledad and Scripps Pier during the Eastern Pacific Cloud Aerosol Precipitation Experiment (EPCAPE) from February 2023 to February 2024. The measurements include the following instruments at Mt. Soledad: High-Resolution Time-of-Flight Aerosol Mass Spectrometer (HR-ToF-AMS, Aerodyne), Scanning Electrical Mobility Spectrometer (SEMS, Brechtel Manufacturing Inc.), Aerodynamic Particle Sizer (APS, Droplet Measurement Technologies), Single Particle Soot Photometer (SP2, Droplet Measurement Technologies), Meteorological Station (WXT520, Vaisala), Ozone (Teco), and trace gas proxies (Teledyne). In addition, the analyses of particle filters collected at Mt. Soledad for three dry-diameter size cuts (<1 micron, <0.5 micron, <0.18 micron) and at Scripps Pier for one dry-diameter size cut (<1 micron) by Fourier Transform Infrared (FTIR) and X-ray Fluorescence (XRF) are reported. A differential mobility analyzer operated as a scanning mobility particle sizer (SMPS, TSI Inc.), a printed particle optical spectrometer (POPS, Grimm), and a continuous flow diffusion cloud condensation nuclei (CCN, DMT) counter provide the mobility aerosol size distribution (30-360 nm), the optical size distribution (150-6000 nm), and the size-resolved CCN distribution (30-360 nm) at 0.2, 0.4, 0.6, 0.8, and 1.0% supersaturation. Measurements are reported for both sampling from an isokinetic aerosol inlet and from a Counterflow Virtual Impactor (CVI, Brechtel Manufacturing Inc.). Users of these measurements are encouraged to consult with the authors about appropriate interpretation before submitting for publication, offering coauthorship where appropriate.
-
Containerized applications have exploded in popularity in recent years, due to their ease of deployment, reproducible nature, and speed of startup. Accordingly, container orchestration tools such as Kubernetes have emerged as resource providers and users alike try to organize and scale their work across clusters of systems. This paper documents some real-world experiences of building, operating, and using self-hosted Kubernetes Linux clusters. It aims at comparisons between Kubernetes and single-node container solutions and traditional multi-user, batch-queue Linux clusters. The authors of this paper have background experience first running traditional HPC Linux clusters and queuing systems like Slurm, and later virtual machines using technologies such as OpenStack. Much of the experience and perspective below is informed by this background. We will also provide a use case from a researcher who deployed on Kubernetes without being as opinionated about other potential choices.