
This content will become publicly available on March 29, 2024

Title: Emergence of pseudo-time during optimal Monte Carlo sampling and temporal aspects of symmetry breaking and restoration

We argue that one can associate a pseudo-time with the sequence of configurations generated in the course of a classical Monte Carlo simulation of a single-minimum bound state, provided the sampling is optimal. In this regime, the sampling rates can, under special circumstances, be calibrated against the relaxation rate and the frequency of motion of an actual physical system. The latter possibility is linked to the optimal sampling regime being a universal crossover separating two distinct suboptimal sampling regimes, analogous to the physical phenomena of diffusion and effusion, respectively. Bound states break symmetry; one may thus regard the pseudo-time as a quantity that emerges together with the bound state. Conversely, when transport among distinct bound states takes place, thus restoring symmetry, a pseudo-time can no longer be defined. One can still quantify activation barriers if those barriers are smooth, but the simulation becomes impractically slow and pertains to overdamped transport only. Specially designed Monte Carlo moves that bypass activation barriers, so as to accelerate sampling of the thermodynamics, amount to effusive transport; they lead to severe under-sampling of the transition-state configurations that separate distinct bound states while destroying the aforementioned universality. Implications of the present findings for simulations of glassy liquids are discussed.
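The diffusion-versus-effusion crossover referred to above can be illustrated with an ordinary Metropolis walk in a single-minimum potential; below is a minimal sketch (the harmonic well, the step sizes, and all parameter values are illustrative assumptions, not taken from the paper):

```python
import math
import random


def metropolis_harmonic(step_size, beta=1.0, n_steps=20000, seed=0):
    """Metropolis sampling of a 1-D harmonic well U(x) = x^2 / 2.

    Small trial steps give diffusion-like exploration (high acceptance,
    slow decorrelation); very large steps give effusion-like jumps
    (low acceptance). The 'optimal' sampling regime lies at the
    crossover between the two. Returns the acceptance rate.
    """
    rng = random.Random(seed)
    x, accepted = 0.0, 0
    for _ in range(n_steps):
        trial = x + rng.uniform(-step_size, step_size)
        # Metropolis criterion for Delta U = (trial^2 - x^2) / 2
        if rng.random() < math.exp(-beta * (trial**2 - x**2) / 2.0):
            x, accepted = trial, accepted + 1
    return accepted / n_steps


# Acceptance falls monotonically as the trial step grows:
r_small = metropolis_harmonic(0.1)   # diffusion-like regime
r_large = metropolis_harmonic(10.0)  # effusion-like regime
```

Sweeping `step_size` between these two extremes traces out the crossover in which, per the abstract, sampling rates can be meaningfully calibrated.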

Journal Name:
The Journal of Chemical Physics
Publisher:
American Institute of Physics
Sponsoring Org:
National Science Foundation
More Like this
  1. Faceted nanoparticles can be used as building blocks to assemble nanomaterials with exceptional optical and catalytic properties. Recent studies have shown that surface functionalization of such nanoparticles with organic molecules, polymer chains, or DNA can be used to control the separation distance and orientation of particles within their assemblies. In this study, we computationally investigate the mechanism of assembly of nanocubes grafted with short-chain molecules. Our approach involves computing the interaction free energy landscape of a pair of such nanocubes via Monte Carlo simulations and using the Dijkstra algorithm to determine the minimum free energy pathway connecting key states in the landscape. We find that the assembly pathway of nanocubes is very rugged, involving multiple energy barriers and metastable states. Analysis of nanocube configurations along the pathway reveals that the assembly mechanism is dominated by sliding motion of the nanocubes relative to each other, punctuated by their local dissociation at grafting points involving lineal separation and rolling motions. The height of energy barriers between metastable states depends on factors such as the interaction strength and surface roughness of the nanocubes and the steric repulsion from the grafts. These results imply that the observed assembly configuration of nanocubes depends not only on their globally stable minimum free energy state but also on the assembly pathway leading to this state. The free energy landscapes and assembly pathways presented in this study, along with the proposed guidelines for engineering such pathways, should be useful to researchers aiming to achieve uniform nanostructures from self-assembly of faceted nanoparticles.
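The landscape-plus-Dijkstra procedure described in item 1 can be sketched in a few lines. Here the 2-D grid, the energy values, and the cost convention (each move pays the energy of the cell entered) are illustrative assumptions, not the paper's actual discretization:

```python
import heapq


def min_energy_path(energy, start, goal):
    """Dijkstra's algorithm on a 2-D free-energy grid.

    Each move to a neighbouring cell costs that cell's energy, so the
    shortest path in this metric is a discrete stand-in for the minimum
    free energy pathway between two states in the landscape.
    """
    rows, cols = len(energy), len(energy[0])
    dist, prev, visited = {start: 0.0}, {}, set()
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            break
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + energy[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk predecessors back from the goal to recover the pathway
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]


# Toy landscape: a high-energy ridge (the 9s) with a gap at one end
landscape = [
    [1, 1, 1, 1],
    [9, 9, 9, 1],
    [1, 1, 1, 1],
]
path, cost = min_energy_path(landscape, (0, 0), (2, 0))
```

The returned path detours through the gap rather than climbing the ridge, which is the qualitative behaviour the paper exploits to identify barriers and metastable states along the pathway.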
  2. Abstract

    Background

    No versatile web app exists that allows epidemiologists and managers around the world to comprehensively analyze the impacts of COVID-19 mitigation. The app presented here fills this gap.


    Our web app uses a model that explicitly identifies susceptible, contact, latent, asymptomatic, symptomatic and recovered classes of individuals, and a parallel set of response classes, subject to lower pathogen-contact rates. The user inputs a CSV file of incidence and, if of interest, mortality rate data. A default set of parameters is available that can be overwritten through input or online entry, and a user-selected subset of these can be fitted to the model using maximum-likelihood estimation (MLE). Model fitting and forecasting intervals are specifiable and changes to parameters allow counterfactual and forecasting scenarios. Confidence or credible intervals can be generated using stochastic simulations, based on MLE values, or on an inputted CSV file containing Markov chain Monte Carlo (MCMC) estimates of one or more parameters.


    We illustrate the use of our web app in extracting social distancing, social relaxation, surveillance or virulence switching functions (i.e., time-varying drivers) from the incidence and mortality rates of COVID-19 epidemics in Israel, South Africa, and England. The Israeli outbreak exhibits four distinct phases: initial outbreak, social distancing, social relaxation, and a second-wave mitigation phase. An MCMC projection of this latter phase suggests the Israeli epidemic will continue to produce into late November an average of around 1500 new cases per day, unless the population practices social-relaxation measures at least 5-fold below the level in August, which itself is 4-fold below the level at the start of July. Our analysis of the relatively late South African outbreak, which became the world's fifth largest COVID-19 epidemic in July, revealed that the decline through late July and early August was characterised by a social distancing driver operating at more than twice the per-capita applicable-disease-class (pc-adc) rate of the social relaxation driver. Our analysis of the relatively early English outbreak identified a more than 2-fold improvement in surveillance over the course of the epidemic. It also identified a pc-adc social distancing rate in early August that, though nearly four times the pc-adc social relaxation rate, appeared to barely contain a second wave that would break out if social distancing was further relaxed.


    Our web app provides policy makers and health officers who have no epidemiological modelling or computer coding expertise with an invaluable tool for assessing the impacts of different outbreak mitigation policies and measures. This includes an ability to generate an epidemic-suppression or curve-flattening index that measures the intensity with which behavioural responses suppress or flatten the epidemic curve in the region under consideration.

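The forward model at the heart of a fitting tool like the one in item 2 can be sketched with a plain SEIR reduction; note this is a simplified stand-in for the richer susceptible/contact/latent/asymptomatic/symptomatic/recovered structure the abstract describes, and all rates here are made-up illustrative values:

```python
def seir_step(state, beta, sigma, gamma, dt=1.0):
    """One forward-Euler step of a minimal SEIR compartment model.

    beta, sigma, gamma are the contact, incubation, and recovery
    rates; the flows below conserve the total population exactly.
    """
    s, e, i, r = state
    n = s + e + i + r
    new_exposed = beta * s * i / n * dt   # S -> E via contact
    new_infectious = sigma * e * dt       # E -> I after latency
    new_recovered = gamma * i * dt        # I -> R on recovery
    return (s - new_exposed,
            e + new_exposed - new_infectious,
            i + new_infectious - new_recovered,
            r + new_recovered)


# 10 initial cases in a population of 10,000; simulate 160 days.
state = (9990.0, 0.0, 10.0, 0.0)
for _ in range(160):
    state = seir_step(state, beta=0.4, sigma=0.2, gamma=0.1)
```

In a fitting workflow of the kind the abstract describes, trajectories like this would be compared against the uploaded incidence data inside a maximum-likelihood or MCMC loop, with `beta` and friends as the fitted parameters.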
  3. Abstract

    Numerical simulations of neutron star–neutron star and neutron star–black hole binaries play an important role in our ability to model gravitational-wave and electromagnetic signals powered by these systems. These simulations have to take into account a wide range of physical processes, including general relativity, magnetohydrodynamics, and neutrino radiation transport. The latter is particularly important in order to understand the properties of the matter ejected by many mergers, the optical/infrared signals powered by nuclear reactions in the ejecta, and the contribution of that ejecta to astrophysical nucleosynthesis. However, accurate evolutions of the neutrino transport equations that include all relevant physical processes remain beyond our current reach. In this review, I will discuss the current state of neutrino modeling in general relativistic simulations of neutron star mergers and of their post-merger remnants. I will focus on the three main types of algorithms used in simulations so far: leakage, moments, and Monte Carlo schemes. I will review the advantages and limitations of each scheme, as well as the various neutrino–matter interactions that should be included in simulations. We will see that the quality of the treatment of neutrinos in merger simulations has greatly increased over the last decade, but also that many potentially important interactions remain difficult to take into account in simulations (pair annihilation, oscillations, inelastic scattering).

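The core sampling step shared by Monte Carlo transport schemes of the kind reviewed in item 3 can be sketched as follows; a purely absorbing slab is an illustrative simplification (real merger simulations couple many interaction channels), but the exponential sampling of interaction depths is the standard ingredient:

```python
import math
import random


def mc_escape_fraction(tau_slab, n_packets=200000, seed=1):
    """Monte Carlo estimate of the fraction of packets escaping a
    purely absorbing slab of total optical depth tau_slab.

    Each packet draws an interaction depth tau = -ln(u) from the
    exponential distribution; it escapes if tau exceeds the slab's
    depth. The analytic answer is exp(-tau_slab).
    """
    rng = random.Random(seed)
    escaped = sum(1 for _ in range(n_packets)
                  if -math.log(rng.random()) > tau_slab)
    return escaped / n_packets


est = mc_escape_fraction(2.0)  # analytic answer: exp(-2) ~ 0.135
```

Leakage and moment schemes replace this packet-by-packet sampling with calibrated loss rates and angular moments of the radiation field, respectively, trading accuracy for cost.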
  4. Many problems in the physical sciences, machine learning, and statistical inference necessitate sampling from a high-dimensional, multimodal probability distribution. Markov Chain Monte Carlo (MCMC) algorithms, the ubiquitous tool for this task, typically rely on random local updates to propagate configurations of a given system in a way that ensures that generated configurations will be distributed according to a target probability distribution asymptotically. In high-dimensional settings with multiple relevant metastable basins, local approaches require either immense computational effort or intricately designed importance sampling strategies to capture information about, for example, the relative populations of such basins. Here, we analyze an adaptive MCMC, which augments MCMC sampling with nonlocal transition kernels parameterized with generative models known as normalizing flows. We focus on a setting where there are no preexisting data, as is commonly the case for problems in which MCMC is used. Our method uses 1) an MCMC strategy that blends local moves obtained from any standard transition kernel with those from a generative model to accelerate the sampling and 2) the data generated this way to adapt the generative model and improve its efficacy in the MCMC algorithm. We provide a theoretical analysis of the convergence properties of this algorithm and investigate numerically its efficiency, in particular in terms of its propensity to equilibrate fast between metastable modes whose rough location is known a priori but respective probability weight is not. We show that our algorithm can sample effectively across large free energy barriers, providing dramatic accelerations relative to traditional MCMC algorithms.
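The blended local/non-local kernel of item 4 can be sketched with a fixed broad Gaussian standing in for the trained normalizing flow (the bimodal target, the proposal width, and the mixing probability are illustrative assumptions, not the paper's setup):

```python
import math
import random


def log_target(x):
    # Bimodal target: equal mixture of two well-separated Gaussians
    return math.log(0.5 * math.exp(-0.5 * (x - 4.0) ** 2)
                    + 0.5 * math.exp(-0.5 * (x + 4.0) ** 2))


def log_proposal(x):
    # Broad "generative model" covering both modes: here a fixed
    # N(0, 5^2) density standing in for a trained normalizing flow
    return -0.5 * (x / 5.0) ** 2 - math.log(5.0 * math.sqrt(2 * math.pi))


def mixed_mcmc(n_steps=50000, p_nonlocal=0.2, seed=3):
    rng = random.Random(seed)
    x, samples = 4.0, []
    for _ in range(n_steps):
        if rng.random() < p_nonlocal:
            # Non-local independence proposal from the generative
            # model, accepted with the full Metropolis-Hastings ratio
            y = rng.gauss(0.0, 5.0)
            log_a = (log_target(y) - log_target(x)
                     + log_proposal(x) - log_proposal(y))
        else:
            # Standard local random-walk move
            y = x + rng.gauss(0.0, 0.5)
            log_a = log_target(y) - log_target(x)
        if math.log(rng.random()) < min(0.0, log_a):
            x = y
        samples.append(x)
    return samples


samples = mixed_mcmc()
frac_left = sum(1 for s in samples if s < 0) / len(samples)
```

A purely local walker started at x = 4 would almost never cross to the mode at x = -4; the non-local proposals let the chain hop between modes and recover their relative weights, which is the acceleration the paper quantifies. The adaptive step (refitting the generative model to the accumulated samples) is omitted here.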
  5. Several important biological processes are initiated by the binding of a protein to a specific site on the DNA. The strategy adopted by a protein, called a transcription factor (TF), for searching for its specific binding site on the DNA has been investigated over several decades. In recent times, the effects of obstacles, such as DNA-binding proteins, on the search by a TF have begun to receive attention. RNA polymerase (RNAP) motors collectively move along a segment of the DNA during a genomic process called transcription. This RNAP traffic is bound to affect the diffusive scanning of the same segment of the DNA by a TF searching for its binding site. Motivated by this phenomenon, here we develop a kinetic model where a 'particle', which represents a TF, searches for a specific site on a one-dimensional lattice. On the same lattice another species of particles, each representing an RNAP, hop from left to right exactly as in a totally asymmetric simple exclusion process (TASEP), which forbids simultaneous occupation of any site by more than one particle, irrespective of their identities. Although the TF is allowed to attach to or detach from any lattice site, the RNAPs can attach only to the first site at the left edge and detach from only the last site on the right edge of the lattice. We formulate the search as a first-passage process; the time taken to reach the target site for the first time, starting from a well-defined initial state, is the search time. By approximate analytical calculations and Monte Carlo (MC) computer simulations, we calculate the mean search time. We show that RNAP traffic rectifies the diffusive motion of the TF to that of a Brownian ratchet, and the mean time of successful search can be even shorter than that required in the absence of RNAP traffic. Moreover, we show that there is an optimal rate of detachment that corresponds to the shortest mean search time.
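The basic facilitated-search effect underlying item 5 (detachment and rebinding shortening the mean first-passage time relative to pure sliding) can be sketched without the RNAP/TASEP traffic; the lattice size, the rates, and the uniform-rebinding rule below are illustrative assumptions:

```python
import random


def mean_search_time(L=50, target=37, p_detach=0.05, n_runs=300, seed=7):
    """Mean first-passage time of a searcher ('TF') on a 1-D lattice.

    At each step the searcher either hops to a neighbouring site
    (reflecting boundaries) or, with probability p_detach, detaches
    and rebinds at a uniformly random site, a crude stand-in for 3-D
    excursions. The RNAP/TASEP traffic of the paper is omitted here.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(n_runs):
        x, t = 0, 0
        while x != target:
            t += 1
            if rng.random() < p_detach:
                x = rng.randrange(L)   # detach and rebind anywhere
            elif rng.random() < 0.5:
                x = max(0, x - 1)
            else:
                x = min(L - 1, x + 1)
        total += t
    return total / n_runs


t_no_jump = mean_search_time(p_detach=0.0)  # pure 1-D sliding
t_jump = mean_search_time(p_detach=0.05)    # sliding + rebinding
```

Sweeping `p_detach` reproduces the qualitative result the abstract reports: a nonzero, intermediate detachment rate minimizes the mean search time, since too little rebinding leaves the searcher diffusing slowly and too much discards the local scan before it can pay off.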