
Title: Changes in monthly baseflow across the U.S. Midwest
Characterizing streamflow changes in the agricultural U.S. Midwest is critical for effective planning and management of water resources throughout the region. The objective of this study is to determine if and how baseflow has responded to land alteration and climate changes across the study area during the 50-year study period by exploring hydrologic variations based on long-term stream gage data. This study evaluates monthly contributions to annual baseflow along with possible trends over the 1966–2016 period for 458 U.S. Geological Survey streamflow gages within 12 different Midwestern states. It also examines the influence of climate and land use factors on the observed baseflow trends. Monthly contribution breakdowns demonstrate how the majority of baseflow is discharged into streams during the spring months (March, April, and May) and is overall more substantial throughout the spring (especially in April) and summer (June, July, and August). Baseflow has not remained constant over the study period, and the results of the trend detection from the Mann–Kendall test reveal that baseflows have increased and are the strongest from May to September. This analysis is confirmed by quantile regression, which suggests that for most of the year, the largest changes are detected in the central part of the distribution. Although increasing baseflow trends are widespread throughout the region, decreasing trends are few and limited to Kansas and Nebraska. Further analysis reveals that baseflow changes are being driven by both climate and land use change across the region. Increasing trends in baseflow are linked to increases in precipitation throughout the year and are most prominent during May and June. Changes in agricultural intensity (in terms of harvested corn and soybean acreage) are linked to increasing trends in the central and western Midwest, whereas increasing temperatures may lead to decreasing baseflow trends in spring and summer in northern Wisconsin, Kansas, and Nebraska.
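The trend detection above relies on the Mann–Kendall test. As a rough, self-contained illustration (not the study's code), the test statistic and a Sen's-slope estimate, a common companion estimator not named in the abstract, can be computed as follows; the toy series and function name are invented for the example:

```python
import math
from itertools import combinations

def mann_kendall(series):
    """Return (S, Z, Sen's slope) for a 1-D numeric series.
    Illustrative sketch: tie corrections and seasonal variants omitted."""
    n = len(series)
    pairs = list(combinations(enumerate(series), 2))
    # S: concordant minus discordant pairs (ties contribute 0)
    s = sum((xj > xi) - (xj < xi) for (i, xi), (j, xj) in pairs)
    # Variance of S without the tie correction (omitted for brevity)
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    # Sen's slope: median of all pairwise slopes
    slopes = sorted((xj - xi) / (j - i) for (i, xi), (j, xj) in pairs)
    m = len(slopes)
    sen = slopes[m // 2] if m % 2 else (slopes[m // 2 - 1] + slopes[m // 2]) / 2
    return s, z, sen

# Invented, strictly increasing toy series: S is positive and
# Z exceeds 1.96 (significant at the 5% level).
s, z, sen = mann_kendall([1.0, 2.1, 2.9, 4.2, 5.0, 6.1])
```

A positive Z above the 1.96 critical value is what "increasing baseflow trend" means operationally in analyses like this one.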
Journal Name: Hydrological Processes
Sponsoring Org: National Science Foundation
More Like this
  1. Complete transformations of land cover from prairie, wetlands, and hardwood forests to row crop agriculture and urban centers are thought to have caused profound changes in hydrology in the Upper Midwestern US since the 1800s. In this study, we investigate four large (23,000–69,000 km²) Midwest river basins that span climate and land use gradients to understand how climate and agricultural drainage have influenced basin hydrology over the last 79 years. We use daily, monthly, and annual flow metrics to document streamflow changes and discuss those changes in the context of precipitation and land use changes. Since 1935, flow, precipitation, artificial drainage extent, and corn and soybean acreage have increased across the region. In extensively drained basins, we observe 2- to 4-fold increases in low flows and 1.5- to 3-fold increases in high and extreme flows. Using a water budget, we determined that the storage term has decreased in intensively drained and cultivated basins by 30–200% since 1975, but increased by roughly 30% in the less agricultural basin. Storage has generally decreased during spring and summer months and increased during fall and winter months in all watersheds. Thus, the loss of storage and the enhanced hydrologic connectivity and efficiency imparted by artificial agricultural drainage appear to have amplified the streamflow response to precipitation increases in the Midwest. Future increases in precipitation are likely to further intensify drainage practices and increase streamflows. Increased streamflow has implications for flood risk, channel adjustment, and sediment and nutrient transport and presents unique challenges for agriculture and water resource management in the Midwest. Better documentation of existing and future drain tile and ditch installation is needed to further understand the role of climate versus drainage across multiple spatial and temporal scales.
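The storage term in the abstract above comes from a basin water budget. A minimal sketch of that bookkeeping, assuming the common annual residual form dS = P − ET − Q with all terms as basin-averaged depths; the function name and numbers are invented, not the study's data:

```python
# Illustrative water-budget residual (not the study's method):
# annual storage change dS = P - ET - Q, in mm over the basin.
def storage_change(precip_mm, et_mm, discharge_mm):
    """Residual storage change for one water year (mm)."""
    return precip_mm - et_mm - discharge_mm

# Invented example values: a year in which the basin loses storage
# would instead give a negative residual; here dS = -50 mm.
ds = storage_change(precip_mm=900.0, et_mm=600.0, discharge_mm=350.0)
```

A shrinking (or negative) residual over time is the signal the abstract interprets as storage loss in drained basins.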
  2. Climate studies based on global climate models (GCMs) project a steady increase in annual average temperature and severe heat extremes in central North America during the mid-century and beyond. However, the agreement of observed trends with climate model trends varies substantially across the region. The present study focuses on two different locations: Des Moines, IA and Austin, TX. In Des Moines, annual extreme temperatures have not increased over the past three decades unlike the trend of regionally-downscaled GCM data for the Midwest, likely due to a “warming hole” over the area linked to agricultural factors. This warming hole effect is not evident for Austin over the same time period, where extreme temperatures have been higher than projected by regionally-downscaled climate (RDC) forecasts. In consideration of the deviation of such RDC extreme temperature forecasts from observations, this study statistically analyzes RDC data in conjunction with observational data to define for these two cities a 95% prediction interval of heat extreme values by 2040. The statistical model is constructed using a linear combination of RDC ensemble-member annual extreme temperature forecasts with regression coefficients for individual forecasts estimated by optimizing model results against observations over a 52-year training period.
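The prediction-interval construction described in the abstract above, a least-squares linear combination of ensemble-member forecasts fit against observations, can be sketched roughly as follows. The data are synthetic and the interval formula (point estimate ± 1.96 residual standard errors) is a simplification of whatever the study actually used:

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_members = 52, 4                             # 52-year training period
X = rng.normal(35.0, 2.0, size=(n_years, n_members))   # member forecasts (deg C)
true_w = np.array([0.4, 0.3, 0.2, 0.1])                # invented blend weights
y = X @ true_w + rng.normal(0.0, 0.5, size=n_years)    # synthetic "observations"

# Regression coefficients for the ensemble blend, by least squares
w, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ w
sigma = resid.std(ddof=n_members)                      # residual std. error

# Approximate 95% interval for one hypothetical future forecast set
x_future = np.array([36.0, 35.5, 37.0, 36.2])
point = float(x_future @ w)
lo, hi = point - 1.96 * sigma, point + 1.96 * sigma
```

The interval width is governed by how well the blended ensemble tracked observations over the training period, which is why the warming-hole mismatch in Des Moines matters for the forecast spread.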
  3. Documenting how ground- and surface water systems respond to climate change is crucial to understanding water resources, particularly in the U.S. Great Lakes region, where drastic temperature and precipitation changes are observed. This study presents baseflow and baseflow index (BFI) trend analyses for 10 undisturbed watersheds in Michigan using (1) multi-objective optimization (MOO) and (2) modified Mann–Kendall (MK) tests corrected for short-term autocorrelation (STA). Results indicate a variability in mean baseflow (0.09–8.70 m³/s) and BFI (67.9–89.7%) that complicates regional-scale extrapolations of groundwater recharge. Long-term (>60 years) MK trend tests indicate a significant control of total precipitation (P) and snow- to rainfall transitions on baseflow and BFI. In the Lower Peninsula Rifle River watershed, increasing P and a transition from snow- to rainfall have increased baseflow at a lower rate than streamflow, an overall pattern that may contribute to documented flood frequency increases. In the Upper Peninsula Ford River watershed, decreasing P and a transition from rain- to snowfall had no significant effects on baseflow and BFI. Our results highlight the value of an objectively constrained BFI parameter for shorter-term (<50 years) hydrologic trend analysis because of a lower STA susceptibility.
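Baseflow separation and the BFI, central to the abstract above, are typically computed with a recursive digital filter. The sketch below uses a single forward pass of the Lyne–Hollick filter with a conventional default parameter (alpha = 0.925); the study instead constrained its parameters by multi-objective optimization, which is not reproduced here, and the flow series is invented:

```python
def baseflow_index(q, alpha=0.925):
    """Single forward pass of a Lyne-Hollick-style filter.
    Returns (baseflow series, BFI) for a streamflow series q (m^3/s)."""
    base = [q[0]]        # attribute the first value to baseflow
    f = 0.0              # filtered quickflow component
    for t in range(1, len(q)):
        f = alpha * f + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        b = q[t] - max(f, 0.0)               # quickflow cannot be negative
        base.append(min(max(b, 0.0), q[t]))  # constrain 0 <= baseflow <= q
    bfi = sum(base) / sum(q)                 # baseflow fraction of total flow
    return base, bfi

# Invented daily flows: a storm peak superimposed on a slow recession.
q = [2.0, 2.0, 10.0, 6.0, 4.0, 3.0, 2.5, 2.2, 2.0, 2.0]
base, bfi = baseflow_index(q)
```

In practice the filter is run in multiple passes (forward and backward) and the parameters calibrated per watershed, which is where the optimization the abstract describes comes in.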
  4. RCMs produced at ~0.5° (available in the NA-CORDEX database) address issues related to the coarse resolution of GCMs (produced at 2° to 4°). Nevertheless, due to systematic and random model errors, bias correction is needed for regional study applications. However, an acceptable threshold for the magnitude of bias correction that will not affect future RCM projection behavior is unknown. The goal of this study is to evaluate the implications of a bias correction technique (distribution mapping) for four GCM-RCM combinations for simulating regional precipitation and, subsequently, streamflow, surface runoff, and water yield when integrated into Soil and Water Assessment Tool (SWAT) applications for the Des Moines River basin (31,893 km²) in Iowa-Minnesota, U.S. The climate projections tested in this study are an ensemble of 2 GCMs (MPI-ESM-MR and GFDL-ESM2M) and 2 RCMs (WRF and RegCM4) for historical (1981–2005) and future (2030–2050) projections in the NA-CORDEX CMIP5 archive. The PRISM dataset was used for bias correction of GCM-RCM historical precipitation and for SWAT baseline simulations. We found bias correction improves historical total annual volumes for precipitation, seasonality, spatial distribution, and mean error for all GCM-RCM combinations. However, improvement of the correlation coefficient occurred only for the RegCM4 simulations. Monthly precipitation was overestimated for all raw models from January to April, and WRF overestimated monthly precipitation from January to August. The bias correction method improved monthly average precipitation for all four GCM-RCM combinations. The ability to detect occurrence of precipitation events was slightly better for the raw models, especially for the GCM-WRF combinations. Simulated historical streamflow was compared across 26 monitoring stations: Historical GCM-RCM outputs were unable to replicate PRISM KGE statistical results (KGE > 0.5).
However, the Pbias streamflow results matched the PRISM simulation for all bias-corrected models and for the raw GFDL-RegCM4 combination. For future scenarios there was no change in the annual trend, except for raw WRF models that estimated an increase of about 35% in annual precipitation. Seasonal variability remained the same, indicating wetter summers and drier winters. However, most models predicted an increase in monthly precipitation from January to March, and a reduction in June and July (except for raw WRF models). The impact on hydrological simulations based on future projected conditions was observed for surface runoff and water yield. Both variables were characterized by monthly volume overestimation; the raw WRF models predicted up to three times greater volume compared to the historical run. RegCM4 projected increased surface runoff and water yield for winter and spring by two times, and a slight volume reduction in summer and autumn. Meanwhile, the bias-corrected models showed changes in prediction signals: In some cases, raw models projected an increase in surface runoff and water yield but the bias-corrected models projected a reduction of these variables. These findings underscore the need for more extended research on bias correction and transposition between historical and future data.
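The "distribution mapping" bias correction discussed in the abstract above belongs to the quantile-mapping family: a raw model value is replaced by the observed value at the same empirical quantile. A hedged, synthetic-data sketch of that general idea (the study's exact transfer functions are not reproduced, and the function name is invented):

```python
import numpy as np

def quantile_map(raw_future, raw_hist, obs_hist):
    """Map raw model values onto the observed distribution via their
    empirical quantile in the raw historical record (illustrative only)."""
    # Quantile of each value within the sorted raw historical record
    q = np.searchsorted(np.sort(raw_hist), raw_future) / len(raw_hist)
    q = np.clip(q, 0.0, 1.0)
    # Read off the observed value at the same quantile
    return np.quantile(obs_hist, q)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 2.0, 1000)      # synthetic "observed" precip (mm/day)
raw = obs * 1.4 + 1.0                # synthetic model output with a wet bias
corrected = quantile_map(raw, raw, obs)
```

After mapping, the corrected series shares the observed distribution's mean and spread, which is why the abstract reports improved annual volumes and seasonality but notes that correlation (timing) is not automatically improved.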
  5. Abstract

    California’s Central Valley is one of the world’s most productive agricultural regions. Its high-value fruit, vegetable, and nut crops rely on surface water imports from a vast network of reservoirs and canals as well as groundwater, which has been substantially overdrafted to support irrigation. The region has undergone a shift to perennial (tree and vine) crops in recent decades, which has increased water demand amid a series of severe droughts and emerging regulations on groundwater pumping. This study quantifies the expansion of perennial crops in the Tulare Lake Basin, the southern region of the Central Valley with limited natural water availability. A gridded crop type dataset is compiled on a 1 mi² spatial resolution from a historical database of pesticide permits over the period 1974–2016 and validated against aggregated county-level data. This spatial dataset is then analyzed by irrigation district, the primary spatial scale at which surface water supplies are determined, to identify trends in planting decisions and agricultural water demand over time. Perennial crop acreage has nearly tripled over this period, and currently accounts for roughly 60% of planted area and 80% of annual revenue. These trends show little relationship with water availability and have been driven primarily by market demand. From this data, we focus on the increasing minimum irrigation needs each year to sustain perennial crops. Results indicate that under a range of plausible future regulations on groundwater pumping ranging from 10% to 50%, water supplies may fail to consistently meet demands, increasing losses by up to 30% of annual revenues. More broadly, the datasets developed in this work will support the development of dynamic models of the integrated water-agriculture system under uncertain climate and regulatory changes to understand the combined impacts of water supply shortages and intensifying irrigation demand.
