Land surface air temperature is an essential climate variable for understanding rapid global environmental changes. Sparse network coverage prior to the 1950s is a significant source of uncertainty in global climate change evaluations. Recognizing the importance of spatial coverage, more stations are continually being added to global climate networks. A challenge is how to best use the information introduced by the new station observations to enhance our understanding and assessment of global climate states and changes, particularly for times prior to the mid-20th century. In this study, the Data INterpolating Empirical Orthogonal Functions (DINEOF) method was used to reconstruct mean monthly air temperatures from the Global Historical Climatology Network-monthly (GHCNm version 4) over the land surface from 1880 through 2017. The final reconstructed air temperature dataset covers about 95% of the global land surface area, improving the spatial coverage by ~80% during 1880–1900 and by 10%–20% since the 1950s. Validation tests show that the mean absolute error of the reconstructed data is less than 0.82°C. Comparison with the Coupled Model Intercomparison Project Phase 5 (CMIP5) climate model output shows that the reconstructed dataset substantially reduces the bias in global datasets caused by sparse station coverage, particularly before the 1950s.
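As a rough illustration of the gap-filling idea behind DINEOF (not the actual GHCNm v4 pipeline, which operates on gridded monthly anomalies with cross-validated mode selection), the sketch below initializes missing entries to zero in anomaly space and iteratively replaces them with a truncated-SVD reconstruction until the filled values converge. The function name and all parameter choices are illustrative.

```python
import numpy as np

def dineof_fill(data, n_modes=2, n_iter=50, tol=1e-6):
    """Minimal DINEOF-style gap filling (illustrative sketch).

    data: 2-D array (time x stations) with np.nan at missing entries,
    assumed to be anomalies so that zero is a neutral initial guess.
    Missing values are iteratively replaced by a truncated-SVD
    reconstruction until the update at the gaps stabilizes.
    """
    mask = np.isnan(data)
    filled = np.where(mask, 0.0, data)
    prev_change = np.inf
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(filled, full_matrices=False)
        # Reconstruct from the leading n_modes EOF modes only.
        recon = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes]
        # RMS update at the missing entries, used as a convergence measure.
        change = np.sqrt(np.mean((recon[mask] - filled[mask]) ** 2)) if mask.any() else 0.0
        filled[mask] = recon[mask]
        if abs(prev_change - change) < tol:
            break
        prev_change = change
    return filled
```

Observed entries are never modified; only the gaps are updated, so the reconstruction is anchored to the station record wherever data exist.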
We outline a new and improved uncertainty analysis for the Goddard Institute for Space Studies Surface Temperature product version 4 (GISTEMP v4). Historical spatial variations in surface temperature anomalies are derived from historical weather station data and ocean data from ships, buoys, and other sensors. Uncertainties arise from measurement uncertainty, changes in spatial coverage of the station record, and systematic biases due to technology shifts and land cover changes. Previously published uncertainty estimates for GISTEMP included only the effect of incomplete station coverage. Here, we update this term using currently available spatial distributions of source data and state-of-the-art reanalyses, and incorporate independently derived estimates for ocean data processing, station homogenization, and other structural biases. The resulting 95% uncertainties are near 0.05 °C in the global annual mean for the last 50 years and increase going back in time, reaching 0.15 °C in 1880. In addition, we quantify the benefits and inherent uncertainty of the GISTEMP interpolation and averaging method. We use the total uncertainties to estimate the probability that each record year in GISTEMP is the true record year (to that date) and conclude with 87% likelihood that 2016 was indeed the hottest year of the instrumental period (so far).
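The two quantitative steps described here — combining independent uncertainty components and turning annual means plus uncertainties into a record-year probability — can be sketched as follows. The function names and the example values in the test are illustrative; the published GISTEMP v4 analysis uses its own component estimates and methodology.

```python
import numpy as np

def combined_sigma(components):
    """Combine independent 1-sigma uncertainty terms in quadrature."""
    return float(np.sqrt(np.sum(np.square(components))))

def record_probability(anomalies, sigmas, n_draws=100_000, seed=0):
    """Monte Carlo probability that the final year is the true record,
    assuming an independent Gaussian error on each annual mean."""
    rng = np.random.default_rng(seed)
    # Draw perturbed realizations of the whole annual series at once.
    draws = rng.normal(anomalies, sigmas, size=(n_draws, len(anomalies)))
    # Fraction of realizations in which the last year is the warmest.
    return float(np.mean(np.argmax(draws, axis=1) == len(anomalies) - 1))
```

With a clear margin between the record year and the runner-up relative to the combined uncertainty, this probability approaches 1; overlapping error distributions pull it toward the 87% type of figure quoted above.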
- PAR ID: 10457404
- Publisher / Repository: DOI PREFIX: 10.1029
- Date Published:
- Journal Name: Journal of Geophysical Research: Atmospheres
- Volume: 124
- Issue: 12
- ISSN: 2169-897X
- Page Range / eLocation ID: p. 6307-6326
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Ocean temperature observations are crucial for a host of climate research and forecasting activities, such as climate monitoring, ocean reanalysis and state estimation, seasonal-to-decadal forecasts, and ocean forecasting. For all of these applications, it is essential to understand the uncertainty attached to each observation, accounting for changes in instrument technology and observing practices over time. Here, we describe the rationale behind the uncertainty specification provided for all in situ ocean temperature observations in the International Quality-controlled Ocean Database (IQuOD) v0.1, a value-added data product served alongside the World Ocean Database (WOD). We collected information from manufacturer specifications and other publications, providing the end user with uncertainty estimates based mainly on instrument type, along with extant auxiliary information such as calibration and collection method. The provision of a consistent set of observation uncertainties will provide a more complete understanding of historical ocean observations used to examine the changing environment. Moving forward, IQuOD will continue to work with the ocean observation, data assimilation, and ocean climate communities to further refine uncertainty quantification. We encourage submissions of metadata and information about historical practices to the IQuOD project and WOD.
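The per-observation assignment described above amounts to a lookup keyed mainly on instrument type, optionally refined by auxiliary metadata. A minimal sketch — the table values, field names, and the calibration rule below are hypothetical placeholders, not the actual IQuOD v0.1 assignments:

```python
# Hypothetical 1-sigma temperature uncertainties (degC) by instrument type;
# the real IQuOD v0.1 values come from manufacturer specifications and the
# literature, and this table does NOT reproduce them.
DEFAULT_UNCERTAINTY_C = 0.5  # fallback when the instrument type is unknown
UNCERTAINTY_BY_INSTRUMENT_C = {
    "CTD": 0.01,
    "XBT": 0.10,
    "MBT": 0.20,
}

def temperature_uncertainty(instrument_type, calibrated=False):
    """Return a 1-sigma uncertainty (degC) for a single observation.

    The 'calibrated' flag halves the table value purely to illustrate how
    extant auxiliary metadata (e.g. a recent calibration) could refine the
    estimate; the real refinement rules are more involved.
    """
    sigma = UNCERTAINTY_BY_INSTRUMENT_C.get(instrument_type, DEFAULT_UNCERTAINTY_C)
    return sigma / 2 if calibrated else sigma
```

The design point is that a consistent, documented mapping — however it is populated — is what lets downstream users propagate observation uncertainty uniformly across the whole database.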
-
Historical estimates of ocean heat content (OHC) are important for understanding the climate sensitivity of the Earth system and for tracking changes in Earth’s energy balance over time. Prior to 2004, these estimates rely primarily on temperature measurements from mechanical and expendable bathythermograph (BT) instruments that were deployed on large scales by naval vessels and ships of opportunity. These BT temperature measurements are subject to well-documented biases, but even the best calibration methods still exhibit residual biases when compared with high-quality temperature datasets. Here, we use a new approach to reduce biases in historical BT data after binning them to a regular grid such as would be used for estimating OHC. Our method consists of an ensemble of artificial neural networks that corrects biases with respect to depth, year, and water temperature in the top 10 m. A global correction and corrections optimized to specific BT probe types are presented for the top 1800 m. Our approach differs from most prior studies by accounting for multiple sources of error in a single correction instead of separating the bias into several independent components. These new global and probe-specific corrections perform on par with widely used calibration methods on a series of metrics that examine the residual temperature biases with respect to a high-quality reference dataset. However, distinct patterns emerge across these various calibration methods when they are extrapolated to BT data that are not included in our cross-instrument comparison, contributing to uncertainty that will ultimately impact estimates of OHC.
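The ensemble step — averaging the bias predicted by several independently trained correction models before subtracting it from gridded BT temperatures — can be sketched as below. The member interface (a callable of depth, year, and top-10 m temperature returning a predicted bias) is an assumed stand-in for the trained neural networks, not the authors' actual implementation:

```python
import numpy as np

def ensemble_bias(depth_m, year, top10m_temp_c, members):
    """Average the bias (degC) predicted by an ensemble of correction models.

    Each member is a callable (depth_m, year, top10m_temp_c) -> bias; these
    stand in for the trained neural networks described in the abstract.
    """
    predictions = np.array([m(depth_m, year, top10m_temp_c) for m in members])
    return predictions.mean(axis=0)

def correct_bt(temps_c, depth_m, year, top10m_temp_c, members):
    """Subtract the ensemble-mean predicted bias from gridded BT temperatures."""
    return np.asarray(temps_c) - ensemble_bias(depth_m, year, top10m_temp_c, members)
```

Averaging over members smooths out the idiosyncrasies of any single trained model, which is the usual motivation for using an ensemble rather than one network.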
-
The uncertainties in sea ice extent (total area covered by sea ice with concentration > 15%) derived from passive microwave sensors are assessed in two ways. Absolute uncertainty (accuracy) is evaluated based on the comparison of the extent between several products. There are clear biases between the extent from the different products that are of the order of 500 000 to 1 × 10⁶ km², depending on the season and hemisphere. These biases are due to differences in the algorithm sensitivity to ice edge conditions and the spatial resolution of different sensors. Relative uncertainty is assessed by examining extents from the National Snow and Ice Data Center Sea Ice Index product. The largest source of uncertainty, ∼100 000 km², is between near-real-time and final products due to different input source data and different processing and quality control. For consistent processing, the uncertainty is assessed using different input source data and by varying concentration algorithm parameters. This yields a relative uncertainty of 30 000–70 000 km². The Arctic minimum extent uncertainty is ∼40 000 km². Uncertainties in comparing with earlier parts of the record may be higher due to sensor transitions. For the first time, this study provides a quantitative estimate of sea ice extent uncertainty.
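The extent definition in this abstract (total area of grid cells with ice concentration above 15%) translates directly into code, and varying the threshold is a crude analogue of the parameter-sensitivity tests behind the relative-uncertainty estimate. A minimal sketch; the grid values in the test and the alternative thresholds are illustrative:

```python
import numpy as np

def sea_ice_extent_km2(concentration, cell_area_km2, threshold=0.15):
    """Sea ice extent: total area (km^2) of grid cells whose ice
    concentration exceeds the threshold (15% per the definition above)."""
    concentration = np.asarray(concentration)
    cell_area_km2 = np.asarray(cell_area_km2)
    return float(cell_area_km2[concentration > threshold].sum())

def extent_spread_km2(concentration, cell_area_km2, thresholds=(0.12, 0.15, 0.18)):
    """Spread of extent across alternative thresholds -- a rough stand-in
    for varying concentration algorithm parameters."""
    extents = [sea_ice_extent_km2(concentration, cell_area_km2, t) for t in thresholds]
    return max(extents) - min(extents)
```

Because extent is a thresholded quantity, cells near 15% concentration flip in and out of the total, which is why the ice-edge sensitivity of the concentration algorithm dominates the inter-product biases described above.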
-
Abstract Since the Paris Agreement, climate policy has focused on 1.5° and 2°C maximum global warming targets. However, the agreement lacks a formal definition of the nineteenth-century “pre-industrial” temperature baseline for these targets. If global warming is estimated with respect to the 1850–1900 mean, as in the latest IPCC reports, uncertainty in early instrumental temperatures affects the quantification of total warming. Here, we analyze gridded datasets of instrumental observations together with large-scale climate reconstructions from tree rings to evaluate nineteenth-century baseline temperatures. From 1851 to 1900 warm season temperatures of the Northern Hemisphere extratropical landmasses were 0.20°C cooler than the twentieth-century mean, with a range of 0.14°–0.26°C among three instrumental datasets. At the same time, proxy-based temperature reconstructions show on average 0.39°C colder conditions with a range of 0.19°–0.55°C among six records. We show that anomalously low reconstructed temperatures at high latitudes are underrepresented in the instrumental fields, likely due to the lack of station records in these remote regions. The nineteenth-century offset between warmer instrumental and colder reconstructed temperatures is reduced by one-third if spatial coverage is reduced to those grid cells that overlap between the different temperature fields. The instrumental dataset from Berkeley Earth shows the smallest offset to the reconstructions indicating that additional stations included in this product, due to more liberal data selection, lead to cooler baseline temperatures. The limited early instrumental records and comparison with reconstructions suggest an overestimation of nineteenth-century temperatures, which in turn further reduces the probability of achieving the Paris targets.
Significance Statement
The warming targets formulated in the Paris Agreement use a “pre-industrial” temperature baseline that is affected by significant uncertainty in the instrumental temperature record. During the second half of the nineteenth century, much of the continental landmasses were not yet covered by the observational station network and existing records were often subject to inhomogeneities and biases, thus resulting in uncertainty regarding the large-scale mean temperature estimate. By analyzing summer temperature reconstructions from tree rings for the Northern Hemisphere extratropical land areas, we examine an independent climate archive with a typically broader and more continuous spatial extent during the “pre-industrial” period. Despite the additional uncertainty when using climate reconstructions instead of direct observations, there is evidence for an overestimation of land temperature during the summer season in early instrumental data. Colder early instrumental temperatures would reduce the probability of reaching the Paris targets.
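The baseline comparison described above reduces to a difference of window means. A minimal sketch using the windows from the text (1851–1900 baseline versus the twentieth-century mean); the function name is illustrative, and the actual study works with gridded, spatially weighted fields rather than a single series:

```python
import numpy as np

def baseline_offset_c(years, temps_c, base=(1851, 1900), ref=(1901, 2000)):
    """Mean temperature of the baseline window minus the reference-window
    mean (degC); a negative value means a cooler 'pre-industrial' baseline,
    i.e. more total warming relative to that baseline."""
    years = np.asarray(years)
    temps_c = np.asarray(temps_c, dtype=float)
    in_base = (years >= base[0]) & (years <= base[1])
    in_ref = (years >= ref[0]) & (years <= ref[1])
    return float(temps_c[in_base].mean() - temps_c[in_ref].mean())
```

Applied to the instrumental versus reconstructed series discussed above, the same calculation yields the −0.20°C and −0.39°C offsets whose disagreement motivates the study.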