One-dimensional persistent homology is arguably the most important and heavily used computational tool in topological data analysis. Additional information can be extracted from datasets by studying multi-dimensional persistence modules and by utilizing cohomological ideas, e.g., the cohomological cup product. In this work, given a single-parameter filtration, we investigate a certain 2-dimensional persistence module structure associated with persistent cohomology, in which one parameter is the cup-length and the other is the filtration parameter.
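Since the abstract leans on one-dimensional persistent homology as the baseline computational tool, a minimal sketch of the standard boundary-matrix reduction over GF(2) may be useful for orientation. This is textbook background, not the paper's own method; the function name and the toy filtration below are illustrative.

```python
from itertools import combinations

def persistence_pairs(filtration):
    """Compute persistence pairs by the standard column reduction
    of the total boundary matrix over GF(2).

    `filtration` lists simplices (tuples of vertex ids) in filtration
    order, every face appearing before its cofaces.  Returns
    (pairs, essential): the (birth, death) index pairs, plus the
    indices of essential classes that never die.
    """
    index = {tuple(sorted(s)): i for i, s in enumerate(filtration)}
    # Column j stores the GF(2) boundary of simplex j as a set of row indices.
    cols = []
    for s in filtration:
        s = tuple(sorted(s))
        faces = combinations(s, len(s) - 1) if len(s) > 1 else []
        cols.append({index[f] for f in faces})
    pivot_of = {}   # lowest nonzero row -> column owning that pivot
    pairs = []
    for j in range(len(cols)):
        col = cols[j]
        while col and max(col) in pivot_of:
            col = col ^ cols[pivot_of[max(col)]]   # GF(2) column addition
        cols[j] = col
        if col:   # nonzero: simplex j kills the class born at max(col)
            pivot_of[max(col)] = j
            pairs.append((max(col), j))
    paired = {i for p in pairs for i in p}
    essential = [j for j in range(len(cols)) if not cols[j] and j not in paired]
    return pairs, essential
```

On the filtration of a filled triangle, for example, each edge except the last is paired with the younger vertex it merges, the closing edge creates a loop, and the 2-simplex fills it, leaving one essential connected component.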
- Award ID(s): 1901360
- NSF-PAR ID: 10468088
- Publisher / Repository: Springer Science + Business Media
- Date Published:
- Journal Name: Journal of Applied and Computational Topology
- Volume: 8
- Issue: 1
- ISSN: 2367-1726
- Size: p. 93-148
- Sponsoring Org: National Science Foundation
More Like this
- Abstract: This article presents a search for new resonances decaying into a Z or W boson and a 125 GeV Higgs boson h, and it targets the $\ell^{\pm}\nu b\overline{b}$, $\nu \overline{\nu}b\overline{b}$, or $\ell^{+}\ell^{-}b\overline{b}$ final states, where $\ell = e$ or $\mu$, in proton-proton collisions at $\sqrt{s} = 13$ TeV. The data used correspond to a total integrated luminosity of 139 fb$^{-1}$ collected by the ATLAS detector during Run 2 of the LHC at CERN. The search is conducted by examining the reconstructed invariant or transverse mass distributions of Zh or Wh candidates for evidence of a localised excess in the mass range from 220 GeV to 5 TeV. No significant excess is observed, and 95% confidence-level upper limits between 1.3 pb and 0.3 fb are placed on the production cross section times branching fraction of neutral and charged spin-1 resonances and CP-odd scalar bosons. These limits are converted into constraints on the parameter space of the Heavy Vector Triplet model and the two-Higgs-doublet model.
- Abstract: The notion of generalized rank in the context of multiparameter persistence has become an important ingredient for defining interesting homological structures such as generalized persistence diagrams. However, its efficient computation has not yet been studied in the literature. We show that the generalized rank over a finite interval I of a $\textbf{Z}^2$-indexed persistence module M is equal to the generalized rank of the zigzag module that is induced on a certain path in I tracing mostly its boundary. Hence, we can compute the generalized rank of M over I by computing the barcode of the zigzag module obtained by restricting to that path. If M is the homology of a bifiltration F of $t$ simplices (while accounting for multi-criticality) and I consists of $t$ points, this computation takes $O(t^\omega)$ time, where $\omega \in [2, 2.373)$ is the exponent of matrix multiplication. We apply this result to obtain an improved algorithm for the following problem: given a bifiltration inducing a module M, determine whether M is interval decomposable and, if so, compute all intervals supporting its indecomposable summands.
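As a toy illustration of generalized rank (not the paper's zigzag algorithm): when the interval is totally ordered, i.e. in the one-parameter special case, the generalized rank reduces to the rank of the composite structure map from the smallest to the largest index. A minimal numpy sketch, with illustrative names:

```python
import numpy as np

def generalized_rank_1d(maps):
    """Rank of the composite of structure maps M_a -> ... -> M_b,
    given as a list of matrices in path order.  For a one-parameter
    persistence module restricted to the interval [a, b], this equals
    the generalized rank (the rank of the limit-to-colimit map):
    the number of features alive across the whole interval."""
    comp = maps[0]
    for A in maps[1:]:
        comp = A @ comp          # compose left to right along the path
    return int(np.linalg.matrix_rank(comp))
```

For a genuinely 2-parameter interval, the paper's point is that the same quantity can be read off the barcode of the zigzag module along a boundary path; the sketch above only covers the totally ordered case.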
- Abstract: This paper reports a search for Higgs boson pair (hh) production in association with a vector boson ($W\;\text{or}\;Z$) using 139 fb$^{-1}$ of proton–proton collision data at $\sqrt{s} = 13\,\text{TeV}$ recorded with the ATLAS detector at the Large Hadron Collider. The search is performed in final states in which the vector boson decays leptonically ($W \rightarrow \ell\nu$, $Z \rightarrow \ell\ell, \nu\nu$ with $\ell = e, \mu$) and the Higgs bosons each decay into a pair of b-quarks. It targets Vhh signals from both non-resonant hh production, present in the Standard Model (SM), and resonant hh production, as predicted in some SM extensions. A 95% confidence-level upper limit of 183 (87) times the SM cross-section is observed (expected) for non-resonant Vhh production when assuming the kinematics are as expected in the SM. Constraints are also placed on Higgs boson coupling modifiers. For the resonant search, upper limits on the production cross-sections are derived for two specific models: one is the production of a vector boson along with a neutral heavy scalar resonance H, in the mass range 260–1000 GeV, that decays into hh, and the other is the production of a heavier neutral pseudoscalar resonance A that decays into a Z boson and an H boson, where the A boson mass is 360–800 GeV and the H boson mass is 260–400 GeV. Constraints are also derived in the parameter space of two-Higgs-doublet models.
- Abstract: The double differential cross sections of Drell–Yan lepton pair ($\ell^+\ell^-$, dielectron or dimuon) production are measured as functions of the invariant mass $m_{\ell\ell}$, transverse momentum $p_{\textrm{T}}(\ell\ell)$, and $\varphi^{*}_{\eta}$. The $\varphi^{*}_{\eta}$ observable, derived from angular measurements of the leptons and highly correlated with $p_{\textrm{T}}(\ell\ell)$, is used to probe the low-$p_{\textrm{T}}(\ell\ell)$ region in a complementary way. Dilepton masses up to 1 TeV are investigated. Additionally, a measurement is performed requiring at least one jet in the final state. To benefit from partial cancellation of the systematic uncertainty, the ratios of the differential cross sections for various $m_{\ell\ell}$ ranges to those in the Z mass peak interval are presented. The collected data correspond to an integrated luminosity of 36.3 fb$^{-1}$ of proton–proton collisions recorded with the CMS detector at the LHC at a centre-of-mass energy of 13 TeV. Measurements are compared with predictions based on perturbative quantum chromodynamics, including soft-gluon resummation.
- Abstract: Matrix reduction is the standard procedure for computing the persistent homology of a filtered simplicial complex with m simplices. Its output is a particular decomposition of the total boundary matrix, from which the persistence diagrams and generating cycles are derived. Persistence diagrams are known to vary continuously with respect to their input, motivating the study of their computation for time-varying filtered complexes. Computing persistence dynamically can be reduced to maintaining a valid decomposition under adjacent transpositions in the filtration order. Since there are $O(m^2)$ such transpositions, this maintenance procedure exhibits limited scalability and is often too fine for many applications. We propose a coarser strategy for maintaining the decomposition over a 1-parameter family of filtrations. By reduction to a particular longest common subsequence problem, we show that the minimal number of decomposition updates d can be found in $O(m \log\log m)$ time and $O(m)$ space, and that the corresponding sequence of permutations (which we call a schedule) can be constructed in $O(dm \log m)$ time. We also show that, in expectation, the storage needed to employ this strategy is actually sublinear in m. Exploiting this connection, we show experimentally that the decrease in operations to compute diagrams across a family of filtrations is proportional to the difference between the expected quadratic number of states and the proposed sublinear coarsening. Applications to video data, dynamic metric space data, and multiparameter persistence are also presented.
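The longest-common-subsequence connection can be made concrete. Assuming, as the abstract suggests, that the minimal number of coarse updates between two filtration orders is m minus the length of their longest common subsequence, and noting that for two orderings of the same m simplices LCS reduces to a longest increasing subsequence, a short sketch (function name illustrative, not from the paper) is:

```python
from bisect import bisect_left

def min_moves(order_a, order_b):
    """Minimal number of coarse reordering updates taking filtration
    order_a to order_b, computed as m - LCS(order_a, order_b).
    Both orders are permutations of the same simplices, so the LCS
    reduces to a longest increasing subsequence, found here in
    O(m log m) by patience sorting."""
    pos = {s: i for i, s in enumerate(order_b)}
    seq = [pos[s] for s in order_a]   # order_a rewritten in order_b coordinates
    tails = []                        # tails[k]: smallest tail of an increasing subseq of length k+1
    for x in seq:
        k = bisect_left(tails, x)
        if k == len(tails):
            tails.append(x)
        else:
            tails[k] = x
    return len(seq) - len(tails)      # simplices outside the LCS must each be moved
```

For instance, turning the order a, b, c, d into b, c, d, a needs a single move (slide a to the end), matching m - LCS = 4 - 3.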