Title: Algorithms yield upper bounds in differential algebra
Abstract: Consider an algorithm computing in a differential field with several commuting derivations such that the only operations it performs with the elements of the field are arithmetic operations, differentiation, and zero testing. We show that, if the algorithm is guaranteed to terminate on every input, then there is a computable upper bound for the size of the output of the algorithm in terms of the size of the input. We also generalize this to algorithms working with models of good enough theories (including, for example, difference fields). We then apply this to differential algebraic geometry to show that there exists a computable uniform upper bound for the number of components of any variety defined by a system of polynomial PDEs. Finally, we use this bound to show the existence of a computable uniform upper bound for the elimination problem in systems of polynomial PDEs with delays.
Award ID(s): 1760448, 1853650, 1760413, 1853482
PAR ID: 10336668
Author(s) / Creator(s):
Date Published:
Journal Name: Canadian Journal of Mathematics
ISSN: 0008-414X
Format(s): Medium: X
Sponsoring Org: National Science Foundation
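
As a concrete illustration of the model of computation described in the abstract above, the sketch below sets up a differential field with two commuting derivations (the rational functions in x and y, with d/dx and d/dy) together with the only operations the abstract allows: arithmetic, differentiation, and zero testing. This is an illustrative sketch using SymPy; the helper names are ours, not the paper's.

```python
# Illustrative sketch (helper names are ours, not the paper's): a differential
# field with two commuting derivations, here the rational functions in x and y
# with d/dx and d/dy, together with the only permitted operations:
# arithmetic, differentiation, and zero testing.
import sympy as sp

x, y = sp.symbols('x y')

def add(f, g):              # arithmetic
    return sp.together(f + g)

def mul(f, g):
    return sp.together(f * g)

def d_x(f):                 # the two commuting derivations
    return sp.diff(f, x)

def d_y(f):
    return sp.diff(f, y)

def is_zero(f):             # zero testing
    return sp.simplify(f) == 0

# The derivations commute, as required:
f = (x**2 * y + 1) / (y - x)
assert is_zero(d_x(d_y(f)) - d_y(d_x(f)))
```

Any terminating algorithm built only from such calls falls within the scope of the result above: the size of its output admits a computable bound in terms of the size of its input.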
More Like this
  1. Cardinality estimation and conjunctive query evaluation are two of the most fundamental problems in database query processing. Recent work proposed, studied, and implemented a robust and practical information-theoretic cardinality estimation framework. In this framework, the estimator is the cardinality upper bound of a conjunctive query subject to "degree constraints", which model a rich set of input data statistics. For general degree constraints, computing this bound is computationally hard. Researchers have naturally sought efficiently computable relaxed upper bounds that are as tight as possible. The polymatroid bound is the tightest among those relaxed upper bounds. While it is an open question whether the polymatroid bound can be computed in polynomial time in general, it is known to be computable in polynomial time for some classes of degree constraints. Our focus is on a common class of degree constraints called simple degree constraints. Researchers had not previously determined how to compute the polymatroid bound in polynomial time for this class of constraints. Our first main result is a polynomial-time algorithm to compute the polymatroid bound given simple degree constraints. Our second main result is a polynomial-time algorithm to compute a "proof sequence" establishing this bound. This proof sequence can then be incorporated into the PANDA framework to give a faster algorithm to evaluate a conjunctive query. In addition, we show computational limitations to extending our results to broader classes of degree constraints. Finally, our technique leads naturally to a new relaxed upper bound, called the flow bound, which is computationally tractable.
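
For a much simpler special case of the information-theoretic cardinality bounds this entry refers to, consider cardinality constraints only (no general degree constraints): the AGM bound for the triangle query follows from the fractional edge cover (1/2, 1/2, 1/2). The sketch below is illustrative background only; it does not compute the polymatroid or flow bounds from the paper.

```python
import math

# AGM bound for the triangle query Q(a, b, c) :- R(a, b), S(b, c), T(c, a):
# the fractional edge cover (1/2, 1/2, 1/2) gives |Q| <= sqrt(|R| * |S| * |T|).
def agm_triangle_bound(n_r: int, n_s: int, n_t: int) -> float:
    return math.sqrt(n_r * n_s * n_t)

print(agm_triangle_bound(1000, 1000, 1000))  # at most ~31622.78 output tuples
```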
  2. Megow, Nicole; Smith, Adam (Ed.)
    In this paper, we study the weighted k-server problem on the uniform metric in both the offline and online settings. We start with the offline setting. In contrast to the (unweighted) k-server problem which has a polynomial-time solution using min-cost flows, there are strong computational lower bounds for the weighted k-server problem, even on the uniform metric. Specifically, we show that assuming the unique games conjecture, there are no polynomial-time algorithms with a sub-polynomial approximation factor, even if we use c-resource augmentation for c < 2. Furthermore, if we consider the natural LP relaxation of the problem, then obtaining a bounded integrality gap requires us to use at least 𝓁 resource augmentation, where 𝓁 is the number of distinct server weights. We complement these results by obtaining a constant-approximation algorithm via LP rounding, with a resource augmentation of (2+ε)𝓁 for any constant ε > 0. In the online setting, an exp(k) lower bound is known for the competitive ratio of any randomized algorithm for the weighted k-server problem on the uniform metric. In contrast, we show that 2𝓁-resource augmentation can bring the competitive ratio down by an exponential factor to only O(𝓁² log 𝓁). Our online algorithm uses the two-stage approach of first obtaining a fractional solution using the online primal-dual framework, and then rounding it online.
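
To fix the problem studied in this entry, the toy sketch below evaluates the cost of a given schedule for the weighted k-server problem on the uniform metric: all pairwise distances are 1, so every move of server i costs its weight w[i]. This is a hypothetical helper for orientation only, not any of the algorithms discussed above.

```python
# Toy cost evaluator for the weighted k-server problem on the uniform metric.
def schedule_cost(weights, start, requests, schedule):
    """weights[i]: weight of server i; start[i]: initial point of server i;
    requests[t]: point requested at time t; schedule[t]: server chosen at time t."""
    pos = list(start)
    cost = 0
    for t, r in enumerate(requests):
        i = schedule[t]
        if pos[i] != r:          # on the uniform metric every move has distance 1
            cost += weights[i]
            pos[i] = r
    return cost

# Two servers of weights 1 and 10; serving every request with the light server
# costs 1 per move.
print(schedule_cost([1, 10], ['a', 'b'], ['c', 'a', 'c'], [0, 0, 0]))  # 3
```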
  3. Recently, a broad class of linear delayed and ODE-PDE systems was shown to have an equivalent representation using Partial Integral Equations (PIEs). In this paper, we use this PIE representation, combined with algorithms for convex optimization of Partial Integral (PI) operators, to bound the H2-norm for input-output systems of this class. Specifically, the methods proposed here apply to delayed and ODE-PDE systems (including delayed PDE systems) in one or two spatial variables where the disturbance does not enter through the boundary. For such systems, we define a notion of H2-norm using an initial state-to-output framework and show that this notion reduces to more traditional concepts under the assumption that a strongly continuous semigroup exists. Next, we consider input-output systems for which there exists a PIE representation and, for such systems, show that computing a minimal upper bound on the H2-norm of delayed and PDE systems can be equivalently formulated as a convex optimization problem subject to linear PI operator inequalities (LPIs). We then convert these optimization problems to Semi-Definite Programming (SDP) problems using the PIETOOLS toolbox. Finally, we apply the results to several numerical examples, focusing on time-delay systems (TDS) for which comparable H2 approximation results are available in the literature. The numerical results demonstrate the accuracy of the computed upper bound on the H2-norm.
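
For readers more familiar with the finite-dimensional case, the LPI formulation described above generalizes the standard matrix characterization of the H2-norm of an ODE system; the display below is that classical background, with matrices in place of the entry's PI operators and matrix inequalities in place of LPIs.

```latex
% Classical finite-dimensional background (not the paper's operator-valued LPIs):
% system \dot{x} = A x + B w, \; z = C x, with A Hurwitz.
\begin{align*}
  \|G\|_{H_2}^2 &= \operatorname{trace}\bigl(C\,W\,C^{\top}\bigr),
      & A W + W A^{\top} + B B^{\top} &= 0, \\
  \|G\|_{H_2}^2 &\le \operatorname{trace}\bigl(C\,P\,C^{\top}\bigr)
      & \text{for any } P \succ 0 \text{ with } A P + P A^{\top} + B B^{\top} &\preceq 0 .
\end{align*}
```

Minimizing the trace over feasible P is a semidefinite program in the matrix case; the entry above studies the analogous convex problem in which A, B, and C are replaced by PI operators and the matrix inequality by an LPI.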
  4. Composition is one of the most important properties of differential privacy (DP), as it allows algorithm designers to build complex private algorithms from DP primitives. We consider precise composition bounds of the overall privacy loss for exponential mechanisms, one of the fundamental classes of mechanisms in DP. We give explicit formulations of the optimal privacy loss for both the adaptive and non-adaptive settings. For the non-adaptive setting in which each mechanism has the same privacy parameter, we give an efficiently computable formulation of the optimal privacy loss. Furthermore, we show that there is a difference in the privacy loss when the exponential mechanism is chosen adaptively versus non-adaptively. To our knowledge, it was previously unknown whether such a gap existed for any DP mechanisms with fixed privacy parameters, and we demonstrate the gap for a widely used class of mechanisms in a natural setting. We then improve upon the best previously known upper bounds for adaptive composition of exponential mechanisms with efficiently computable formulations and show the improvement.
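
As background for this entry, the sketch below implements the single-shot exponential mechanism in its standard form, sampling an outcome r with probability proportional to exp(ε·u(x, r) / (2Δu)). It illustrates the primitive whose composition the paper analyzes, not the paper's composition bounds; the helper names are ours.

```python
# Standard exponential mechanism (McSherry-Talwar form), for illustration only.
import math
import random

def exponential_mechanism(data, outcomes, utility, eps, sensitivity=1.0):
    # Sample r with probability proportional to exp(eps * u(data, r) / (2 * sensitivity)).
    scores = [math.exp(eps * utility(data, r) / (2.0 * sensitivity)) for r in outcomes]
    total = sum(scores)
    return random.choices(outcomes, weights=[s / total for s in scores], k=1)[0]

# Example: privately pick the most common item in a small dataset.
data = ["a", "a", "b", "c", "a", "b"]
outcomes = ["a", "b", "c"]
utility = lambda d, r: d.count(r)   # sensitivity 1: one record changes a count by 1
print(exponential_mechanism(data, outcomes, utility, eps=1.0))
```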
  5. The multiplicity Schwartz-Zippel lemma bounds the total multiplicity of zeroes of a multivariate polynomial on a product set. This lemma motivates the multiplicity codes of Kopparty, Saraf and Yekhanin [J. ACM, 2014], who showed how to use this lemma to construct high-rate locally-decodable codes. However, the algorithmic results about these codes crucially rely on the fact that the polynomials are evaluated on a vector space and not an arbitrary product set. In this work, we show how to decode multivariate multiplicity codes of large multiplicities in polynomial time over finite product sets (over fields of large characteristic and zero characteristic). Previously such decoding algorithms were not known even for a positive fraction of errors. In contrast, our work goes all the way to the distance of the code and in particular exceeds both the unique-decoding bound and the Johnson radius. For errors exceeding the Johnson radius, even combinatorial list-decodability of these codes was not known. Our algorithm is an application of the classical polynomial method directly to the multivariate setting. In particular, we do not rely on a reduction from the multivariate to the univariate case as is typical of many of the existing results on decoding codes based on multivariate polynomials. However, a vanilla application of the polynomial method in the multivariate setting does not yield a polynomial upper bound on the list size. We obtain a polynomial bound on the list size by taking an alternative view of multivariate multiplicity codes. In this view, we glue all the partial derivatives of the same order together using a fresh set z of variables. We then apply the polynomial method by viewing this as a problem over the field F(z) of rational functions in z.
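
To make the object in this entry concrete, the sketch below encodes a bivariate multiplicity code over a small product set in characteristic zero: the codeword symbol at each grid point is the vector of all partial derivatives of total order less than s of the message polynomial, evaluated at that point (over small positive characteristic one would use Hasse derivatives instead). This is an illustrative encoder only; the decoding algorithm is the paper's contribution and is not sketched here.

```python
# Illustrative encoder for a bivariate multiplicity code over a product set.
import itertools
import sympy as sp

x, y = sp.symbols('x y')

def encode(poly, grid_x, grid_y, s):
    codeword = {}
    for a, b in itertools.product(grid_x, grid_y):
        symbol = []
        for i in range(s):
            for j in range(s - i):        # all (i, j) with i + j < s
                d = poly
                for _ in range(i):
                    d = sp.diff(d, x)
                for _ in range(j):
                    d = sp.diff(d, y)
                symbol.append(d.subs({x: a, y: b}))
        codeword[(a, b)] = tuple(symbol)
    return codeword

# A low-degree message polynomial encoded with multiplicity s = 2
# on the product set {0, 1, 2} x {0, 1, 2}.
cw = encode(x**2 * y + 3*x + y, [0, 1, 2], [0, 1, 2], s=2)
print(cw[(1, 1)])   # (value, d/dy, d/dx) evaluated at the point (1, 1)
```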