Abstract We state a general-purpose algorithm for quickly finding primes in evenly divided sub-intervals. Legendre's conjecture claims that for every positive integer $$n$$, there exists a prime between $$n^2$$ and $$(n+1)^2$$. Oppermann's conjecture subsumes Legendre's conjecture by claiming there are primes between $$n^2$$ and $$n(n+1)$$ and also between $$n(n+1)$$ and $$(n+1)^2$$. Using Cramér's conjecture as the basis for a heuristic run-time analysis, we show that our algorithm can verify Oppermann's conjecture, and hence also Legendre's conjecture, for all $$n\le N$$ in time $$O(N \log N \log \log N)$$ and space $$N^{O(1/\log \log N)}$$. We implemented a parallel version of our algorithm and improved the empirical verification of Oppermann's conjecture from the previous $$N = 2\cdot 10^{9}$$ up to $$N = 7.05\cdot 10^{13} > 2^{46}$$, so we were finding 27-digit primes. The computation ran for about half a year on each of two platforms: four Intel Xeon Phi 7210 processors using a total of 256 cores, and a 192-core cluster of Intel Xeon E5-2630 2.3 GHz processors.
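The verification task itself is simple to state: for each $$n$$ up to the bound, exhibit a prime strictly between $$n^2$$ and $$n(n+1)$$ and another strictly between $$n(n+1)$$ and $$(n+1)^2$$. A minimal brute-force sketch in Python, assuming sympy is available for nextprime, illustrates the check; it is not the paper's sieve-based algorithm and is only practical for small bounds.

```python
from sympy import nextprime  # assumed dependency; any primality test would do

def verify_oppermann(N: int) -> bool:
    """Check Oppermann's conjecture for all 2 <= n <= N by brute force.

    For each n we look for a prime strictly between n^2 and n(n+1), and
    another strictly between n(n+1) and (n+1)^2.  (The conjecture is
    stated for n > 1.)  This is only a sanity check for small N; the
    paper's algorithm is sieve-based and vastly faster.
    """
    for n in range(2, N + 1):
        lo, mid, hi = n * n, n * (n + 1), (n + 1) * (n + 1)
        if nextprime(lo) >= mid:   # no prime in (n^2, n(n+1))
            return False
        if nextprime(mid) >= hi:   # no prime in (n(n+1), (n+1)^2)
            return False
    return True

if __name__ == "__main__":
    print(verify_oppermann(10_000))   # True
```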
The two-center problem of uncertain points on cactus graphs
Abstract We study the two-center problem on cactus graphs in facility location, which aims to place two facilities on the graph network to serve customers so as to minimize the maximum transportation cost. In our problem, the location of each customer is uncertain and may appear at $$O(m)$$ points on the network with probabilities. More specifically, given are a cactus graph $$G$$ and a set $$\mathcal {P}$$ of $$n$$ (weighted) uncertain points, where every uncertain point has $$O(m)$$ possible locations on $$G$$, each associated with a probability, and is of a non-negative weight. The problem aims to compute two centers (points) on $$G$$ so that the maximum (weighted) expected distance of the $$n$$ uncertain points to their own expected closest center is minimized. No previous algorithms are known for this problem. In this paper, we present the first algorithm for this problem; it solves the problem in $$O(|G|+ m^{2}n^{2}\log mn)$$ time.
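To make the objective concrete, the following Python sketch evaluates the maximum weighted expected distance of a set of uncertain points to a pair of candidate centers and brute-forces the best pair over the vertices of a toy cactus. It only illustrates the objective function, not the paper's $$O(|G|+ m^{2}n^{2}\log mn)$$ algorithm, and restricting centers to vertices is a simplifying assumption (the problem allows centers at arbitrary points of $$G$$).

```python
import heapq
from itertools import combinations

def dijkstra(graph, src):
    """Shortest-path distances from src in an undirected weighted graph
    given as {u: {v: length, ...}, ...}."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def objective(graph, uncertain_points, centers):
    """Max over uncertain points of weight * min over centers of the
    expected distance from that uncertain point to the center."""
    dists = {c: dijkstra(graph, c) for c in centers}
    worst = 0.0
    for weight, locations in uncertain_points:   # locations: [(vertex, prob), ...]
        best = min(sum(p * dists[c][v] for v, p in locations) for c in centers)
        worst = max(worst, weight * best)
    return worst

def brute_force_two_center(graph, uncertain_points):
    """Try every pair of vertices as centers (toy-sized inputs only)."""
    return min(
        (objective(graph, uncertain_points, pair), pair)
        for pair in combinations(graph, 2)
    )

# A tiny cactus: two cycles sharing vertex 'c'.
G = {
    "a": {"b": 1, "c": 2}, "b": {"a": 1, "c": 1},
    "c": {"a": 2, "b": 1, "d": 1, "e": 2}, "d": {"c": 1, "e": 1},
    "e": {"c": 2, "d": 1},
}
P = [
    (1.0, [("a", 0.5), ("b", 0.5)]),   # uncertain point: weight, (location, prob) pairs
    (2.0, [("d", 0.7), ("e", 0.3)]),
]
print(brute_force_two_center(G, P))
```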
- Award ID(s): 2339371
- PAR ID: 10583947
- Publisher / Repository: Springer Science + Business Media
- Journal Name: Journal of Combinatorial Optimization
- Volume: 49
- Issue: 4
- ISSN: 1382-6905
- Sponsoring Org: National Science Foundation
More Like this
Abstract In a Merlin–Arthur proof system, the proof verifier (Arthur) accepts valid proofs (from Merlin) with probability 1, and rejects invalid proofs with probability arbitrarily close to 1. The running time of such a system is defined to be the length of Merlin's proof plus the running time of Arthur. We provide new Merlin–Arthur proof systems for some key problems in fine-grained complexity. In several cases our proof systems have optimal running time. Our main results include:
- Certifying that a list of $$n$$ integers has no 3-SUM solution can be done in Merlin–Arthur time $$\tilde{O}(n)$$. Previously, Carmosino et al. [ITCS 2016] showed that the problem has a nondeterministic algorithm running in $$\tilde{O}(n^{1.5})$$ time (that is, there is a proof system with proofs of length $$\tilde{O}(n^{1.5})$$ and a deterministic verifier running in $$\tilde{O}(n^{1.5})$$ time).
- Counting the number of $$k$$-cliques with total edge weight equal to zero in an $$n$$-node graph can be done in Merlin–Arthur time $$\tilde{O}(n^{\lceil k/2\rceil })$$ (where $$k\ge 3$$). For odd $$k$$, this bound can be further improved for sparse graphs: for example, counting the number of zero-weight triangles in an $$m$$-edge graph can be done in Merlin–Arthur time $$\tilde{O}(m)$$. Previous Merlin–Arthur protocols by Williams [CCC'16] and Björklund and Kaski [PODC'16] could only count $$k$$-cliques in unweighted graphs, and had worse running times for small $$k$$.
- Computing the All-Pairs Shortest Distances matrix for an $$n$$-node graph can be done in Merlin–Arthur time $$\tilde{O}(n^2)$$. Note this is optimal, as the matrix can have $$\Omega (n^2)$$ nonzero entries in general. Previously, Carmosino et al. [ITCS 2016] showed that this problem has an $$\tilde{O}(n^{2.94})$$ nondeterministic time algorithm.
- Certifying that an $$n$$-variable $$k$$-CNF is unsatisfiable can be done in Merlin–Arthur time $$2^{n/2 - n/O(k)}$$. We also observe an algebrization barrier for the previous $$2^{n/2}\cdot \textrm{poly}(n)$$-time Merlin–Arthur protocol of R. Williams [CCC'16] for $$\#$$SAT: in particular, his protocol algebrizes, and we observe there is no algebrizing protocol for $$k$$-UNSAT running in $$2^{n/2}/n^{\omega (1)}$$ time. Therefore we have to exploit non-algebrizing properties to obtain our new protocol.
- Certifying a Quantified Boolean Formula is true can be done in Merlin–Arthur time $$2^{4n/5}\cdot \textrm{poly}(n)$$. Previously, the only nontrivial result known along these lines was an Arthur–Merlin–Arthur protocol (where Merlin's proof depends on some of Arthur's coins) running in $$2^{2n/3}\cdot \textrm{poly}(n)$$ time.

Due to the centrality of these problems in fine-grained complexity, our results have consequences for many other problems of interest. For example, our work implies that certifying there is no Subset Sum solution to $$n$$ integers can be done in Merlin–Arthur time $$2^{n/3}\cdot \textrm{poly}(n)$$, improving on the previous best protocol by Nederlof [IPL 2017], which took $$2^{0.49991n}\cdot \textrm{poly}(n)$$ time.
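The flavor of Merlin–Arthur verification, a short proof checked by a fast randomized verifier that always accepts a correct proof and is rarely fooled by an incorrect one, can be illustrated with the classic Freivalds check for matrix products (a standard textbook example, not one of the protocols in this paper): Merlin sends a matrix $$C$$ claimed to equal $$AB$$, and Arthur tests $$A(Bx) = Cx$$ for random vectors $$x$$ in $$O(n^2)$$ time per trial instead of recomputing the product.

```python
import random

def freivalds_verify(A, B, C, trials=20):
    """Randomized check that C equals the matrix product A * B over the integers.

    A correct C is always accepted; an incorrect C is rejected with
    probability at least 1 - 2**(-trials).  Each trial costs O(n^2),
    versus O(n^3) for recomputing the product naively.
    """
    n = len(A)
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(n)]
        Bx = [sum(B[i][j] * x[j] for j in range(n)) for i in range(n)]
        ABx = [sum(A[i][j] * Bx[j] for j in range(n)) for i in range(n)]
        Cx = [sum(C[i][j] * x[j] for j in range(n)) for i in range(n)]
        if ABx != Cx:
            return False   # caught a wrong "proof"
    return True

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
good = [[19, 22], [43, 50]]          # the true product
bad  = [[19, 22], [43, 51]]          # an incorrect claim
print(freivalds_verify(A, B, good))  # True
print(freivalds_verify(A, B, bad))   # False (with overwhelming probability)
```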
Abstract The theory of forbidden 0–1 matrices generalizes Turán-style (bipartite) subgraph avoidance, Davenport-Schinzel theory, and Zarankiewicz-type problems, and has been influential in many areas, such as discrete and computational geometry, the analysis of self-adjusting data structures, and the development of the graph parameter twin width. The foremost open problem in this area is to resolve the Pach-Tardos conjecture from 2005, which states that if a forbidden pattern $$P\in \{0,1\}^{k\times l}$$ is acyclic, meaning it is the bipartite incidence matrix of a forest, then $$\operatorname {Ex}(P,n) = O(n\log ^{C_P} n)$$, where $$\operatorname {Ex}(P,n)$$ is the maximum number of 1s in a $$P$$-free $$n\times n$$ 0–1 matrix and $$C_P$$ is a constant depending only on $$P$$. This conjecture has been confirmed on many small patterns, specifically all $$P$$ with weight at most 5, and all but two with weight 6. The main result of this paper is a clean refutation of the Pach-Tardos conjecture. Specifically, we prove that $$\operatorname {Ex}(S_0,n),\operatorname {Ex}(S_1,n) \ge n2^{\Omega (\sqrt{\log n})}$$, where $$S_0,S_1$$ are the outstanding weight-6 patterns. We also prove sharp bounds on the entire class of alternating patterns $$(P_t)$$, specifically that for every $$t\ge 2$$, $$\operatorname {Ex}(P_t,n)=\Theta (n(\log n/\log \log n)^t)$$. This is the first proof of an asymptotically sharp bound that is $$\omega (n\log n)$$.
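For concreteness, a 0–1 matrix $$M$$ contains a pattern $$P$$ if one can choose rows and columns of $$M$$, kept in order, whose intersection has a 1 wherever $$P$$ does; $$M$$ is $$P$$-free otherwise, and $$\operatorname {Ex}(P,n)$$ is the maximum number of 1s over all $$P$$-free $$n\times n$$ matrices. The brute-force containment check below in Python is exponential and meant only for tiny examples; it is unrelated to the extremal constructions in the paper.

```python
from itertools import combinations

def contains(M, P):
    """Return True if the 0-1 matrix M contains the pattern P.

    M contains P if there are rows r_1 < ... < r_k and columns
    c_1 < ... < c_l of M such that M[r_i][c_j] == 1 whenever P[i][j] == 1.
    Brute force over all row/column choices, so only usable for tiny inputs.
    """
    k, l = len(P), len(P[0])
    m, n = len(M), len(M[0])
    for rows in combinations(range(m), k):
        for cols in combinations(range(n), l):
            if all(M[rows[i]][cols[j]] == 1
                   for i in range(k) for j in range(l) if P[i][j] == 1):
                return True
    return False

# A small pattern: two 1s going down and to the right.
P = [[1, 0],
     [0, 1]]
M = [[1, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]
print(contains(M, P))   # True: rows (0, 1) and columns (0, 2) realize the pattern
```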
Abstract We obtain new optimal estimates for the $$L^{2}(M)\to L^{q}(M)$$, $$q\in (2,q_{c}]$$, $$q_{c}=2(n+1)/(n-1)$$, operator norms of spectral projection operators associated with spectral windows $$[\lambda ,\lambda +\delta (\lambda )]$$, with $$\delta (\lambda )=O((\log \lambda )^{-1})$$, on compact Riemannian manifolds $$(M,g)$$ of dimension $$n\ge 2$$ all of whose sectional curvatures are nonpositive or negative. We show that these two different types of estimates are saturated on flat manifolds or manifolds all of whose sectional curvatures are negative. This allows us to classify compact space forms in terms of the size of $$L^{q}$$-norms of quasimodes for each Lebesgue exponent $$q\in (2,q_{c}]$$, even though it is impossible to distinguish between those of negative or zero sectional curvature for any $$q>q_{c}$$.
Abstract It is natural to generalize the online $$k$$-Server problem by allowing each request to specify not only a point $$p$$, but also a subset $$S$$ of servers that may serve it. To date, only a few special cases of this problem have been studied. The objective of the work presented in this paper has been to more systematically explore this generalization in the case of uniform and star metrics. For uniform metrics, the problem is equivalent to a generalization of Paging in which each request specifies not only a page $$p$$, but also a subset $$S$$ of cache slots, and is satisfied by having a copy of $$p$$ in some slot in $$S$$. We call this problem Slot-Heterogenous Paging. In realistic settings only certain subsets of cache slots or servers would appear in requests. Therefore we parameterize the problem by specifying a family $${\mathcal {S}}\subseteq 2^{[k]}$$ of requestable slot sets, and we establish bounds on the competitive ratio as a function of the cache size $$k$$ and family $${\mathcal {S}}$$:
- If all request sets are allowed ($${\mathcal {S}}=2^{[k]}\setminus \{\emptyset \}$$), the optimal deterministic and randomized competitive ratios are exponentially worse than for standard Paging ($${\mathcal {S}}=\{[k]\}$$).
- As a function of $$|{\mathcal {S}}|$$ and $$k$$, the optimal deterministic ratio is polynomial: at most $$O(k^2|{\mathcal {S}}|)$$ and at least $$\Omega (\sqrt{|{\mathcal {S}}|})$$.
- For any laminar family $${\mathcal {S}}$$ of height $$h$$, the optimal ratios are $$O(hk)$$ (deterministic) and $$O(h^2\log k)$$ (randomized).
- The special case of laminar $${\mathcal {S}}$$ that we call All-or-One Paging extends standard Paging by allowing each request to specify a specific slot to put the requested page in. The optimal deterministic ratio for weighted All-or-One Paging is $$\Theta (k)$$. Offline All-or-One Paging is NP-hard.

Some results for the laminar case are shown via a reduction to the generalization of Paging in which each request specifies a set $$P$$ of pages, and is satisfied by fetching any page from $$P$$ into the cache. The optimal ratios for the latter problem (with laminar family of height $$h$$) are at most $$hk$$ (deterministic) and $$hH_k$$ (randomized).
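The model itself is easy to simulate. The Python sketch below processes requests of the form (page, allowed slots), counts a fault whenever the page is not already in an allowed slot, and evicts from the allowed slot used least recently; the eviction rule is an arbitrary heuristic chosen for illustration, not one of the competitive policies analyzed in the paper.

```python
def slot_heterogeneous_paging(requests, k):
    """Simulate Slot-Heterogenous Paging with a naive eviction rule.

    Each request is (page, allowed_slots).  It is a hit if the page already
    sits in one of its allowed slots; otherwise we load the page into the
    allowed slot whose current occupant was used least recently and count a
    fault.  Returns the total number of faults.
    """
    cache = [None] * k            # cache[i] = page currently in slot i
    last_used = [-1] * k          # time of last use per slot
    faults = 0
    for t, (page, allowed) in enumerate(requests):
        hit = next((s for s in allowed if cache[s] == page), None)
        if hit is None:
            faults += 1
            hit = min(allowed, key=lambda s: last_used[s])  # evict "LRU" allowed slot
            cache[hit] = page
        last_used[hit] = t
    return faults

# k = 3 slots; All-or-One style requests: either any slot ([0, 1, 2]) or one
# specific slot.
reqs = [("a", [0, 1, 2]), ("b", [0, 1, 2]), ("a", [2]), ("c", [0, 1, 2]), ("a", [2])]
print(slot_heterogeneous_paging(reqs, 3))   # 4 faults on this toy trace
```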