Title: Asymptotically matched extrapolation of fishnet failure probability to continuum scale
Motivated by the extraordinary strength of nacre, which exceeds the strength of its fragile constituents by an order of magnitude, the fishnet statistics became in 2017 the only analytically solvable probabilistic model of structural strength other than the weakest-link and fiber-bundle models. These two models lead, respectively, to the Weibull and Gaussian (or normal) distributions at the large-size limit, which are hardly distinguishable in the central range of failure probability. But they differ enormously at the failure probability level of 10⁻⁶, considered the maximum tolerable for engineering structures. Under the assumption that no more than three fishnet links fail prior to the peak load, the preceding studies led to exact solutions intermediate between the Weibull and Gaussian distributions. Here massive Monte Carlo simulations are used to show that these exact solutions do not apply to fishnets with more than about 500 links. The simulations show that, as the number of links becomes larger, the likelihood of having more than three failed links up to the peak load is no longer negligible and becomes large for fishnets with many thousands of links. A differential equation is derived for the probability distribution of not-too-large fishnets, characterized by the size effect, the mean, and the coefficient of variation. Although the large-size asymptotic distribution is beyond the reach of the Monte Carlo simulations, it can be illuminated by approximating the large-scale fishnet as a continuum with a crack or a circular hole. For the former, instability is proven via complex variables, and for the latter via a known elasticity solution for a hole in a continuum under antiplane shear. The fact that rows or enclaves of link failures acting as cracks or holes can form in the large-scale continuum at many random locations necessarily leads to the Weibull distribution of the large fishnet, given that these cracks or holes become unstable as soon as they reach a certain critical size. The Weibull modulus of this continuum is estimated to be more than triple that of the central range of small fishnets. The new model is expected to allow spin-offs for printed materials with octet architecture maximizing the strength–weight ratio.
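The gap between the Weibull and Gaussian tails at the 10⁻⁶ level can be made concrete with a short numerical check. The sketch below is only a rough illustration, not the paper's fishnet model: the nominal mean, the 7% coefficient of variation, and the resulting Weibull modulus are arbitrary assumptions chosen to compare the 10⁻⁶ strength quantiles of a Weibull and a Gaussian distribution matched in mean and CoV.

```python
# Illustrative sketch (not the paper's model): compare the strength at a 1e-6
# failure probability for a Weibull and a Gaussian distribution sharing the
# same mean and coefficient of variation (CoV). Mean and CoV are assumptions.
import numpy as np
from scipy.stats import norm
from scipy.special import gamma
from scipy.optimize import brentq

mean, cov = 1.0, 0.07                     # assumed nominal mean strength and CoV

# Weibull modulus m that reproduces the assumed CoV (CoV decreases with m).
cov_of_m = lambda m: np.sqrt(gamma(1 + 2 / m) / gamma(1 + 1 / m) ** 2 - 1)
m = brentq(lambda x: cov_of_m(x) - cov, 2.0, 200.0)
s0 = mean / gamma(1 + 1 / m)              # Weibull scale parameter

pf = 1e-6                                  # tolerable failure probability
x_weibull = s0 * (-np.log(1 - pf)) ** (1 / m)        # Weibull 1e-6 quantile
x_gauss = norm.ppf(pf, loc=mean, scale=cov * mean)   # Gaussian 1e-6 quantile
print(f"m = {m:.1f}, Weibull quantile = {x_weibull:.3f}, "
      f"Gaussian quantile = {x_gauss:.3f}")
```

Running the sketch shows the two matched distributions prescribing very different design strengths at the 10⁻⁶ level, which is the practical reason the tail regime matters.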
Award ID(s):
2029641
PAR ID:
10525831
Publisher / Repository:
Elsevier
Date Published:
Journal Name:
Journal of the Mechanics and Physics of Solids
Volume:
182
Issue:
C
ISSN:
0022-5096
Page Range / eLocation ID:
105479
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract: The investigation of statistical scaling in localization-induced failures dates back to da Vinci's speculation on the length effect on rope strength in the 1500s. The early mathematical description of statistical scaling emerged with the birth of extreme value statistics. The most commonly known mathematical model for statistical scaling is the Weibull size effect, which is a direct consequence of the infinite weakest-link model. However, abundant experimental observations on various localization-induced failures have shown that the Weibull size effect is inadequate. Over the last two decades, two mathematical models were developed to describe the statistical size effect in localization-induced failures. One is the finite weakest-link model, in which the random structural resistance is expressed as the minimum of a set of independent discrete random variables. The other is the level excursion model, a continuum description of the finite weakest-link model, in which the structural failure probability is calculated as the probability of the upcrossing of a random field over a barrier. This paper reviews the mathematical formulation of these two models and their applications to various engineering problems, including the strength distributions of quasi-brittle structures, failure statistics of micro-electromechanical systems (MEMS) devices, breakdown statistics of high-k gate dielectrics, and the probability distribution of the buckling pressure of spherical shells containing random geometric imperfections. In addition, the implications of statistical scaling for stochastic finite element simulations and reliability-based structural design are discussed. In particular, the recent development of size-dependent safety factors is reviewed.
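As a minimal formal sketch of the finite weakest-link model described in this abstract (the notation P_1, σ_i, and the representative-volume-element reading are assumptions, not taken from the paper):

```latex
% A structure of N elements survives only if every element survives; assuming
% independent element strengths S_i with common CDF P_1,
\[
  P_f(\sigma) \;=\; \Pr\!\Bigl[\min_{i=1,\dots,N} S_i \le \sigma\Bigr]
             \;=\; 1 \;-\; \prod_{i=1}^{N}\bigl[\,1 - P_1(\sigma_i)\,\bigr],
\]
% where \sigma_i is the stress acting on the i-th element. Letting
% N \to \infty recovers the infinite weakest-link limit, i.e. the Weibull
% size effect mentioned above.
```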
  2. Fracture patterns experienced under a dynamic uniaxial compressive load are highly sensitive to rock microstructural defects, owing to the rock's brittleness and the absence of macroscopic stress concentration points. We propose two different approaches for modeling rock microstructural defects and inhomogeneity. In the explicit realization approach, microcracks with certain statistics are incorporated in the computational domain. In the implicit realization approach, fracture strength values are sampled from a Weibull probability distribution. We use the Mohr-Coulomb failure criterion to define an effective stress in the context of an interfacial damage model. This model predicts crack propagation at angles ±φ_ch = ±(45° − φ/2) relative to the direction of the compressive load, where φ is the friction angle. By using appropriate models for fracture strength anisotropy, we demonstrate the interaction of the rock's weakest plane and φ_ch. Numerical results demonstrate the greater effect of strength anisotropy on the fracture pattern when the explicit approach is employed. In addition, the density of fractures increases as the angle of the weakest planes approaches ±φ_ch. The fracture simulations are performed by an h-adaptive asynchronous spacetime discontinuous Galerkin (aSDG) method that can accommodate crack propagation in any direction.
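A quick numerical reading of the characteristic angle quoted above; the friction-angle values below are arbitrary examples, not taken from the paper.

```python
# Minimal sketch: evaluate phi_ch = 45 - phi/2 (degrees) for a few assumed
# friction angles, giving the predicted crack orientations +/- phi_ch
# relative to the compressive load axis.
for phi in (20.0, 30.0, 40.0):          # assumed friction angles in degrees
    phi_ch = 45.0 - phi / 2.0
    print(f"friction angle {phi:4.1f} deg -> cracks at ±{phi_ch:4.1f} deg "
          f"to the load direction")
```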
  3. In password security, a defender would like to identify and warn users with weak passwords. Similarly, the defender may also want to predict what fraction of passwords would be cracked within B guesses as the attacker's guessing budget B varies from small (online attacker) to large (offline attacker). Toward each of these goals, the defender would like to quickly estimate the guessing number for each user password pwd assuming that the attacker uses a password cracking model M, i.e., how many password guesses the attacker will check before he or she cracks the user password pwd. Since naïve brute-force enumeration can be prohibitively expensive when the guessing number is very large, Dell'Amico and Filippone [1] developed an efficient Monte Carlo algorithm to estimate the guessing number of a given password pwd. While Dell'Amico and Filippone proved that their estimator is unbiased, there is no guarantee that the Monte Carlo estimates are accurate, nor does the method provide confidence ranges on the estimated guessing number or even indicate if/when there is a higher degree of uncertainty. Our contributions are as follows: First, we identify theoretical examples where, with high probability, Monte Carlo strength estimation produces highly inaccurate estimates of individual guessing numbers as well as the entire guessing curve. Second, we introduce Confident Monte Carlo Strength Estimation as an extension of Dell'Amico and Filippone [1]. Given a password, our estimator generates an upper and lower bound with the guarantee that, except with probability δ, the true guessing number lies within the given confidence range. Our techniques can also be used to characterize the attacker's guessing curve. In particular, given a probabilistic password cracking model M, we can generate high-confidence upper and lower bounds on the fraction of passwords that the attacker will crack as the guessing budget B varies.
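For context, here is a sketch of one way an unbiased Monte Carlo guessing-number estimator of the kind described above can be built; it is consistent with the description in the abstract but is not necessarily the exact algorithm of [1]. The helpers model_sample() and model_prob(pwd) are hypothetical stand-ins for a cracking model M that can sample passwords and report their probabilities.

```python
# Sketch of a Monte Carlo guessing-number estimator (assumed interface):
#   model_sample()    -> one password drawn from the model's distribution
#   model_prob(pwd)   -> the probability the model assigns to pwd
def estimate_guess_number(pwd, model_sample, model_prob, n=100_000):
    """Estimate how many passwords the model would guess before pwd."""
    p_target = model_prob(pwd)
    samples = [model_sample() for _ in range(n)]
    # Importance-sampling count: each sample s with p(s) > p(target) stands in
    # for roughly 1/(n * p(s)) passwords that precede pwd in the model's
    # guessing order (most probable guessed first).
    return sum(1.0 / (n * model_prob(s))
               for s in samples if model_prob(s) > p_target)
```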
  4. Baraldi, P.; Zio, E. (Ed.)
    Critical infrastructure networks are becoming increasingly interdependent, which adversely impacts their performance through the cascading effect of initial failures. Failing to account for these complex interactions could lead to an underestimation of the vulnerability of interdependent critical infrastructure (ICI). The goal of this research is to assess how important interdependent links are by evaluating the interdependency strength using a dynamic network flow redistribution model, which accounts for the dynamic and uncertain aspects of interdependencies. Specifically, a vulnerability analysis is performed considering two scenarios, one with interdependent links and the other without. The initial failure is set to be the same under both scenarios. Cascading failure is modeled through a flow redistribution until the entire system reaches a stable state in which cascading failure no longer occurs. The vulnerability is defined as the ratio of the networks' unmet demand at the stable state to the initial demand. The difference between the vulnerability of each network under these two scenarios is used as the metric to quantify interdependency strength. A case study of a real power-water-gas system subject to earthquake risk is conducted to illustrate the proposed method. Uncertainty is incorporated by considering failure probabilities via Monte Carlo simulation. By varying the location and magnitude of earthquake disruptions, we show that interdependency strength is determined not only by the topology and flow of ICIs but also by the characteristics of the disruptions. This compound system-disruption effect on interdependency strength can inform the design, assessment, and restoration of ICIs.
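A minimal sketch of the two metrics defined above. The function simulate_cascade and the network.initial_demand attribute are hypothetical placeholders for the dynamic flow-redistribution model; they are assumptions, not an interface from the cited work.

```python
# Sketch of the vulnerability and interdependency-strength metrics described
# above. `simulate_cascade(network, initial_failure, interdependent)` is
# assumed to run flow redistribution to a stable state and return the total
# unmet demand at that state.
def vulnerability(network, initial_failure, simulate_cascade, interdependent):
    unmet = simulate_cascade(network, initial_failure, interdependent)
    return unmet / network.initial_demand   # fraction of demand left unserved

def interdependency_strength(network, initial_failure, simulate_cascade):
    # Same initial failure under both scenarios; the resulting difference in
    # vulnerability quantifies how much the interdependent links matter.
    v_with = vulnerability(network, initial_failure, simulate_cascade, True)
    v_without = vulnerability(network, initial_failure, simulate_cascade, False)
    return v_with - v_without
```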
  5. We propose an analytic approach for the steady-state dynamics of Markov processes on locally tree-like graphs. It is based on time-translation invariant probability distributions for edge trajectories, which we encode in terms of infinite matrix products. For homogeneous ensembles on regular graphs, the distribution is parametrized by a single d×d×r^2 tensor, where r is the number of states per variable, and d is the matrix-product bond dimension. While the method becomes exact in the large-d limit, it typically provides highly accurate results even for small bond dimensions d. The d^2r^2 parameters are determined by solving a fixed point equation, for which we provide an efficient belief-propagation procedure. We apply this approach to a variety of models, including Ising-Glauber dynamics with symmetric and asymmetric couplings, as well as the SIS model. Even for small d, the results are compatible with Monte Carlo estimates and accurately reproduce known exact solutions. The method provides access to precise temporal correlations, which, in some regimes, would be virtually impossible to estimate by sampling. 
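One conventional way such a matrix-product parametrization can be written is sketched below; the notation is an assumption consistent with the abstract's description, not taken from the paper.

```latex
% Sketch: a time-translation-invariant distribution over an edge trajectory
% with states x_t in {1,...,r} is encoded by a single d x d x r^2 tensor A,
% e.g. in the form
\[
  P(x_1, x_2, \dots, x_T) \;\propto\;
  \operatorname{Tr}\!\Bigl[\, A^{(x_1,x_2)} A^{(x_2,x_3)} \cdots
                              A^{(x_{T-1},x_T)} \Bigr],
\]
% where each A^{(x,x')} is a d x d matrix slice. This gives the d^2 r^2
% parameters quoted above, which are then fixed by the belief-propagation
% fixed-point equation.
```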