Title: Optimal Algorithms for Computing Average Temperatures
Abstract: A numerical algorithm is presented for computing average global temperature (or other quantities of interest, such as average precipitation) from measurements taken at specified locations and times. The algorithm is proven to be optimal in a certain sense. The analysis of the optimal algorithm provides a sharp a priori bound on the error between the computed value and the true average global temperature. This a priori bound involves a computable compatibility constant that assesses the quality of the measurements for the chosen model. The optimal algorithm is constructed by solving a convex minimization problem and involves a set of functions selected a priori in relation to the model. It is shown that the solution promotes sparsity and hence utilizes a smaller number of well-chosen data sites than the number provided. The algorithm is then applied to canonical data sets and mathematically generic models for the computation of average temperature and average precipitation over given regions and given time intervals. A comparison is provided between the proposed algorithms and existing methods.
Award ID(s): 1817603
PAR ID: 10181890
Author(s) / Creator(s): ; ; ; ;
Date Published:
Journal Name: Mathematics of Climate and Weather Forecasting
Volume: 5
Issue: 1
ISSN: 2353-6438
Page Range / eLocation ID: 34 to 44
Format(s): Medium: X
Sponsoring Org: National Science Foundation
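As a rough illustration of the kind of convex program the abstract describes, the sketch below recovers sparse quadrature weights by l1 minimization: the weights must reproduce the true averages of a small model basis, and minimizing the l1 norm tends to zero out most data sites. The 1-D setup, monomial basis, and all names are illustrative assumptions, not the paper's construction.

```python
# Hypothetical sketch: find sparse weights w so that sum_i w_i * f_j(x_i)
# reproduces the true average of each model function f_j, while minimizing
# the l1 norm of w (which promotes sparsity in the selected data sites).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_sites, n_basis = 50, 8                       # measurement sites, model functions
x = np.sort(rng.uniform(0.0, 1.0, n_sites))    # 1-D stand-in for data sites

# Model basis: monomials on [0, 1]; the true average of x^j is 1/(j+1).
A = np.vstack([x**j for j in range(n_basis)])  # shape (n_basis, n_sites)
b = np.array([1.0 / (j + 1) for j in range(n_basis)])

# min ||w||_1  s.t.  A w = b, via the standard split w = u - v with u, v >= 0.
c = np.ones(2 * n_sites)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n_sites))
w = res.x[:n_sites] - res.x[n_sites:]

print("active sites:", np.sum(np.abs(w) > 1e-8), "of", n_sites)
print("max constraint residual:", np.max(np.abs(A @ w - b)))
```

Because a basic optimal solution of this linear program has at most as many nonzeros as there are equality constraints, the recovered weights use at most n_basis of the 50 available sites, mirroring the sparsity behavior the abstract describes.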
More Like this
  1. This paper describes, along with some validation results, the one-dimensional variational method (1D-Var) in use at the University Corporation for Atmospheric Research (UCAR) to retrieve atmospheric profiles of temperature, pressure, and humidity from Global Navigation Satellite System (GNSS) radio occultation (RO) observations. The retrieved profiles are physically consistent among the variables and statistically optimal with respect to the a priori error statistics. Tests with idealized data demonstrate that the 1D-Var is highly effective in spreading the observational information and confirm that the method works as designed, provided that correct input data are given. Tests on real-world data sets show that the retrieved profiles agree remarkably well with global weather analyses and collocated high-vertical-resolution radiosonde observations, and that the 1D-Var can produce value-added retrievals with respect to the a priori profiles. We also find that the retrieved profiles exhibit exceptional long-term stability, suggesting that the 1D-Var can provide an excellent climate data record.
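A 1D-Var retrieval minimizes a cost J(x) that balances departure from an a priori (background) profile against misfit to the observations. The toy sketch below uses a linear observation operator, for which the minimizer has a closed form; the dimensions, covariances, and operator are illustrative stand-ins, not UCAR's implementation.

```python
# Toy 1D-Var: minimize J(x) = (x-xb)^T B^{-1} (x-xb) + (y-Hx)^T R^{-1} (y-Hx).
# For a linear operator H the minimizer has the closed form used below.
import numpy as np

n, m = 20, 5                       # state levels, observations (illustrative)
rng = np.random.default_rng(1)
xb = rng.normal(size=n)            # a priori (background) profile
B = 0.5 * np.eye(n)                # background error covariance (illustrative)
H = rng.normal(size=(m, n)) / n    # linear observation operator (stand-in)
R = 0.1 * np.eye(m)                # observation error covariance
y = H @ xb + rng.normal(scale=0.3, size=m)

# Optimal analysis: x = xb + K (y - H xb), with gain K = B H^T (H B H^T + R)^{-1}.
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_analysis = xb + K @ (y - H @ xb)
print("background misfit:", np.linalg.norm(y - H @ xb))
print("analysis   misfit:", np.linalg.norm(y - H @ x_analysis))
```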
  2. Kumar, Amit; Ron-Zewi, Noga (Ed.)
    We consider the problem in which n points arrive online over time and, upon arrival, must be irrevocably assigned to one of k clusters, where the objective is the standard k-median objective. Lower-bound instances show that for this problem no online algorithm can achieve a competitive ratio bounded by any function of n. We therefore turn to a beyond-worst-case analysis: we assume that the online algorithm is provided a priori with a predicted budget B that is an upper bound on the optimal objective value (e.g., obtained from past instances). Our main result is an online algorithm whose competitive ratio (measured against B) is a function of k alone. We also give a lower bound showing that the competitive ratio of every algorithm must depend on k.
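To make the role of the predicted budget B concrete, here is a deliberately simple budget-driven rule (an illustrative sketch, not the paper's algorithm): open a new cluster only when a point is farther than a threshold derived from B, and otherwise assign it irrevocably to the nearest existing center.

```python
# Hypothetical budget-driven online clustering rule (not the paper's method).
import math

def online_assign(points, k, B):
    """Assign 1-D points online; open at most k centers, guided by budget B."""
    centers, cost = [], 0.0
    thresh = B / max(k, 1)              # hypothetical threshold from the budget
    for p in points:
        d = min((abs(p - c) for c in centers), default=math.inf)
        if d > thresh and len(centers) < k:
            centers.append(p)           # far from all centers: open a new cluster
        elif d < math.inf:
            cost += d                   # irrevocable assignment to nearest center
    return centers, cost

centers, cost = online_assign([0.1, 0.12, 5.0, 5.2, 9.9], k=3, B=1.0)
print(centers, round(cost, 3))
```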
    The HTTP adaptive streaming technique copes with fluctuating network conditions during streaming by dynamically adjusting the size of the future chunks to be downloaded. The bitrate selection in this adjustment inevitably involves predicting the future throughput of a video session, for which various heuristic solutions have been explored. The goal of the present work is to establish the theoretical upper bound on the QoE that any ABR algorithm can possibly reach, providing an essential step toward benchmarking the performance of ABR algorithms. In our setting, the QoE is defined as a linear combination of the average perceptual quality and the buffering ratio. The optimization problem is proven to be NP-hard when the perceptual quality is defined by chunk size, and conditions are given under which the problem becomes polynomially solvable. Enriched by a global lower bound, a pseudo-polynomial-time dynamic programming algorithm is presented. When minimum buffering is given higher priority than perceptual quality, the problem is shown to remain NP-hard, and the above algorithm is simplified and enhanced by a sequence of lower bounds on the completion time of chunk downloading, which, according to our experiments, brings a 36.0% improvement in computation time. To handle large amounts of data more efficiently, a polynomial-time algorithm is also introduced to approximate the optimal values when minimum buffering is prioritized. Besides its performance guarantee, this algorithm is shown to reach 99.938% of the optimal value while taking only 0.024% of the computation time of the exact dynamic programming algorithm.
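The dynamic-programming idea can be sketched as follows: sweep over chunks, carry the best achievable QoE for each (discretized) buffer level, and for each bitrate choice account for download time, stalls, and perceptual quality. The state space, discretization, and QoE weights below are illustrative simplifications, not the paper's exact formulation.

```python
# Simplified offline DP for bitrate selection (illustrative sketch).
import math

def optimal_abr(sizes, quality, throughput, chunk_sec=4.0, lam=1.0, buf_cap=8):
    """sizes[i][r]: bits of chunk i at rate r; quality[i][r]: perceptual quality."""
    n = len(sizes)
    best = {0: 0.0}                               # buffer level (chunks) -> best QoE
    for i in range(n):
        nxt = {}
        for buf, q in best.items():
            for r in range(len(sizes[i])):
                dl = sizes[i][r] / throughput              # download time (s)
                stall = max(0.0, dl - buf * chunk_sec)     # rebuffering time
                drained = min(buf, math.ceil(dl / chunk_sec))
                new_buf = min(buf_cap, buf - drained + 1)  # +1: chunk i arrives
                val = q + quality[i][r] / n - lam * stall  # linear QoE combination
                nxt[new_buf] = max(nxt.get(new_buf, -math.inf), val)
        best = nxt
    return max(best.values())

# Toy run: 5 chunks, two rates each, constant predicted throughput.
print(optimal_abr([[4e6, 8e6]] * 5, [[1.0, 2.0]] * 5, throughput=2e6))
```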
  4. A comprehensive method is provided for smoothing noisy, irregularly sampled data using smoothing splines, with allowance for non-Gaussian noise. We demonstrate how the spline order and tension parameter can be chosen a priori from physical reasoning. We also show how to accommodate the non-Gaussian noise and outliers that are typical of global positioning system (GPS) signals. We demonstrate the effectiveness of our methods on GPS trajectory data obtained from oceanographic floating instruments known as drifters.
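One standard way to accommodate non-Gaussian noise and outliers is to wrap a smoothing spline in iteratively reweighted least squares with robust (Huber-type) weights. The sketch below does this with SciPy's cubic smoothing spline; the penalty value, Huber constant, and iteration count are illustrative, and this is not the paper's exact tension-spline formulation.

```python
# Robust smoothing via iteratively reweighted smoothing splines (sketch).
import numpy as np
from scipy.interpolate import make_smoothing_spline

def robust_spline(t, y, lam=1.0, k=1.345, iters=5):
    """Fit a smoothing spline, downweighting outliers with Huber weights."""
    w = np.ones_like(y)
    for _ in range(iters):
        spl = make_smoothing_spline(t, y, w=w, lam=lam)  # t strictly increasing
        r = y - spl(t)
        s = 1.4826 * np.median(np.abs(r)) + 1e-12        # robust scale (MAD)
        u = np.abs(r) / s
        w = np.where(u <= k, 1.0, k / u)                 # Huber weights
    return spl

# Irregular samples of a smooth signal, with a few large outliers injected.
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 10.0, 200))
y = np.sin(t) + 0.1 * rng.normal(size=200)
y[::25] += 3.0
spl = robust_spline(t, y)
```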
  5.
    Simulated annealing is an effective and general means of optimization. It is inspired by annealing in metallurgy, where the temperature of a material determines its thermodynamic behavior. Likewise, in simulated annealing, the actions the algorithm takes depend entirely on the value of a variable that captures the notion of temperature. Typically, simulated annealing starts with a high temperature, which makes the algorithm largely exploratory, and gradually cools down to become more stable. A key component that plays a crucial role in the performance of simulated annealing is the rule by which the temperature changes, namely the cooling schedule. Motivated by this, we study the following question: "Given enough samples of the instances of a specific class of optimization problems, can we design optimal (or approximately optimal) cooling schedules that minimize the runtime or maximize the success rate of the algorithm on average, when the underlying problem is drawn uniformly at random from the same class?" We provide positive results in terms of both sample complexity and simulation complexity. For sample complexity, we show that O(m^{1/2}) samples suffice to find an approximately optimal cooling schedule of length m. We complement this result with a lower bound of Ω(m^{1/3}) on the sample complexity of any learning algorithm that provides an almost optimal cooling schedule. These results are general and rely on no assumptions. For simulation complexity, however, we make additional assumptions in order to measure the success rate of an algorithm. To this end, we introduce the monotone stationary graph, which models the performance of simulated annealing. Based on this model, we present polynomial-time algorithms with provable guarantees for the learning problem.
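To make explicit what a learned cooling schedule would tune, here is a minimal simulated-annealing loop that takes the schedule T_1, ..., T_m as input; the toy objective, proposal distribution, and geometric schedule are illustrative assumptions.

```python
# Minimal simulated annealing driven by an explicit cooling schedule.
import math, random

def anneal(energy, neighbor, x0, schedule):
    """Run SA for len(schedule) steps; schedule supplies the temperature T_i."""
    x, e = x0, energy(x0)
    for T in schedule:
        y = neighbor(x)
        ey = energy(y)
        # Accept downhill moves always; uphill moves with probability e^{-dE/T}.
        if ey <= e or random.random() < math.exp(-(ey - e) / T):
            x, e = y, ey
    return x, e

# Toy instance: minimize (x - 3)^2 with a geometric cooling schedule of length 500.
schedule = [10.0 * 0.95**i for i in range(500)]
x, e = anneal(lambda x: (x - 3.0)**2,
              lambda x: x + random.gauss(0.0, 0.5),
              x0=0.0, schedule=schedule)
print(round(x, 2), round(e, 4))
```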