
Title: How does water yield respond to mountain pine beetle infestation in a semiarid forest?
Abstract. Mountain pine beetle (MPB) outbreaks in the western United States result in widespread tree mortality, transforming forest structure within watersheds. While there is evidence that these changes can alter the timing and quantity of streamflow, there is substantial variation in both the magnitude and direction of hydrologic responses, and the climatic and environmental mechanisms driving this variation are not well understood. Herein, we coupled an eco-hydrologic model (RHESSys) with a beetle effects model and applied it to a semiarid watershed, Trail Creek, in the Bigwood River basin in central Idaho, USA, to examine how varying degrees of beetle-caused tree mortality influence water yield. Simulation results show that water yield during the first 15 years after beetle outbreak is controlled by interactions between interannual climate variability, the extent of vegetation mortality, and long-term aridity. During wet years, water yield after a beetle outbreak increased with greater tree mortality; this was driven by mortality-caused decreases in evapotranspiration. During dry years, water yield decreased at low-to-medium mortality but increased at high mortality. The mortality threshold for the direction of change was location-specific. The change in water yield also varied spatially along aridity gradients during dry years. In wetter areas of the Trail Creek basin, post-outbreak water yield decreased at low mortality (driven by an increase in ground evaporation) and increased when vegetation mortality was greater than 40 % (driven by a decrease in canopy evaporation and transpiration). In contrast, in more water-limited areas, water yield typically decreased after beetle outbreaks, regardless of mortality level (although the driving mechanisms varied). Our findings highlight the complexity and variability of hydrologic responses and suggest that long-term (i.e., multi-decadal mean) aridity can be a useful indicator for the direction of water yield changes after a disturbance.
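The water-balance reasoning in the abstract (mortality reduces transpiration and canopy evaporation but increases ground evaporation, and the net effect on yield depends on site aridity and mortality level) can be illustrated with a toy annual water budget. The sketch below is purely conceptual: it is not RHESSys, and every partitioning coefficient is an illustrative assumption rather than a value from the study.

```python
# Toy annual water balance (not RHESSys): yield = precipitation - ET, with ET
# partitioned into transpiration, canopy evaporation, and ground evaporation.
# Mortality removes live canopy (lowering the first two terms) but exposes the
# ground surface (raising the third). All coefficients are illustrative assumptions.

def post_outbreak_yield(precip_mm, mortality_frac, aridity):
    """Annual water yield (mm) under a given mortality fraction (0-1).

    aridity is a 0-1 index: higher values mean a more water-limited site.
    """
    # Pre-outbreak ET partition (illustrative): wetter sites transpire more.
    transpiration = (1.0 - aridity) * 0.6 * precip_mm
    canopy_evap = 0.15 * precip_mm
    ground_evap = 0.10 * precip_mm

    # Beetle-caused mortality reduces the canopy terms ...
    transpiration *= 1.0 - mortality_frac
    canopy_evap *= 1.0 - mortality_frac
    # ... and increases ground evaporation, more strongly at drier sites.
    ground_evap *= 1.0 + 4.0 * mortality_frac * aridity

    et = transpiration + canopy_evap + ground_evap
    return max(precip_mm - et, 0.0)


for mortality in (0.0, 0.2, 0.5, 0.9):
    wet = post_outbreak_yield(800, mortality, aridity=0.3)
    dry = post_outbreak_yield(400, mortality, aridity=0.8)
    print(f"mortality {mortality:.0%}: wetter site {wet:6.1f} mm, drier site {dry:6.1f} mm")
```

With these illustrative coefficients, yield rises with mortality at the wetter site and falls at the drier site; the location-specific thresholds reported in the study emerge from far richer process interactions in the coupled model.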
Authors:
Award ID(s):
1916658
Publication Date:
NSF-PAR ID:
10352169
Journal Name:
Hydrology and Earth System Sciences
Volume:
25
Issue:
9
Page Range or eLocation-ID:
4681 to 4699
ISSN:
1607-7938
Sponsoring Org:
National Science Foundation
More Like this
  1. Since the late 1990s, extensive outbreaks of native bark beetles (Curculionidae: Scolytinae) have affected coniferous forests throughout Europe and North America, driving changes in carbon storage, wildlife habitat, nutrient cycling, and water resource provisioning. Remote sensing is a crucial tool for quantifying the effects of these disturbances across broad landscapes. In particular, Landsat time series (LTS) are increasingly used to characterize outbreak dynamics, including the presence and severity of bark beetle-caused tree mortality, though broad-scale LTS-based maps are rarely informed by detailed field validation. Here we used spatial and temporal information from LTS products, in combination with extensive field data and Random Forest (RF) models, to develop 30-m maps of the presence (i.e., any occurrence) and severity (i.e., cumulative percent basal area mortality) of beetle-caused tree mortality 1997–2019 in subalpine forests throughout the Southern Rocky Mountains, USA. Using resultant maps, we also quantified spatial patterns of cumulative tree mortality throughout the region, an important yet poorly understood concept in beetle-affected forests. RF models using LTS products to predict presence and severity performed well, with 80.3% correctly classified (Kappa = 0.61) and R2 = 0.68 (RMSE = 17.3), respectively. We found that ≥10,256 km2 of subalpine forest area (39.5% of the study area) was affected by bark beetles and 19.3% of the study area experienced ≥70% tree mortality over the twenty-three-year period. Variograms indicated that severity was autocorrelated at scales < 250 km. Interestingly, cumulative patch-size distributions showed that areas with a near-total loss of the overstory canopy (i.e., ≥90% mortality) were relatively small (<0.24 km2) and isolated throughout the study area. Our findings help to inform an understanding of the variable effects of bark beetle outbreaks across complex forested regions and provide insight into patterns of disturbance legacies, landscape connectivity, and susceptibility to future disturbance.
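The mapping workflow described above pairs Landsat time-series predictors with field plots in two Random Forest models: a classifier for the presence of beetle-caused mortality and a regressor for its severity. The sketch below shows that general pattern with scikit-learn on simulated data; the predictors and values are assumptions for illustration, not the study's actual features or results.

```python
# Minimal sketch (not the study's workflow): Random Forest presence classifier and
# severity regressor trained on hypothetical LTS-derived predictors per 30-m pixel.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score, r2_score, mean_squared_error

rng = np.random.default_rng(0)
n = 500
# Hypothetical predictors, e.g. change magnitude, duration, pre-disturbance index values.
X = rng.normal(size=(n, 4))
presence = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)
severity = np.clip(40 + 25 * X[:, 0] + 10 * X[:, 2] + rng.normal(scale=10, size=n), 0, 100)

X_tr, X_te, p_tr, p_te, s_tr, s_te = train_test_split(X, presence, severity, random_state=0)

clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, p_tr)
reg = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, s_tr)

p_hat = clf.predict(X_te)
s_hat = reg.predict(X_te)
print("presence accuracy:", accuracy_score(p_te, p_hat))
print("presence kappa   :", cohen_kappa_score(p_te, p_hat))
print("severity R^2     :", r2_score(s_te, s_hat))
print("severity RMSE    :", mean_squared_error(s_te, s_hat) ** 0.5)
```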
  2. Abstract. Reducing the risk of large, severe wildfires while also increasing the security of mountain water supplies and enhancing biodiversity are urgent priorities in western US forests. After a century of fire suppression, Yosemite and Sequoia-Kings Canyon National Parks, located in California’s Sierra Nevada, initiated programs to manage wildfires, and these areas present a rare opportunity to study the effects of restored fire regimes. Forest cover decreased during the managed wildfire period and meadow and shrubland cover increased, especially in Yosemite’s Illilouette Creek basin, which experienced a 20% reduction in forest area. These areas now support greater pyrodiversity and, consequently, greater landscape and species diversity. Soil moisture increased and drought-induced tree mortality decreased, especially in Illilouette, where wildfires have been allowed to burn more freely, resulting in a 30% increase in summer soil moisture. Modeling suggests that the ecohydrological co-benefits of restoring fire regimes are robust to projected climatic warming. Support will be needed from the highest levels of government and the public to maintain existing programs and expand them to other forested areas.
  3. Bark beetles naturally inhabit forests and can cause large-scale tree mortality when they reach epidemic population numbers. A recent epidemic (1990s–2010s), primarily driven by mountain pine beetles (Dendroctonus ponderosae), was a leading mortality agent in western United States forests. Predictive models of beetle populations and their impact on forests largely depend on host-related parameters, such as stand age, basal area, and density. We hypothesized that bark beetle attack patterns are also dependent on inferred beetle population densities: large epidemic populations of beetles will preferentially attack large-diameter trees, and successfully kill them with overwhelming numbers. Conversely, small endemic beetle populations will opportunistically attack stressed and small trees. We tested this hypothesis using 12 years of repeated field observations of three dominant forest species (lodgepole pine Pinus contorta, Engelmann spruce Picea engelmannii, and subalpine fir Abies lasiocarpa) in subalpine forests of southeastern Wyoming paired with a Bayesian modeling approach. The models provide probabilistic predictions of beetle attack patterns that are free of assumptions required by frequentist models that are often violated in these data sets. Furthermore, we assessed seedling/sapling regeneration in response to overstory mortality and hypothesized that higher seedling/sapling establishment occurs in areas with the highest overstory mortality because resources are freed from competing trees. Our results indicate that large-diameter trees were more likely to be attacked and killed by bark beetles than small-diameter trees during epidemic years for all species, but there was no shift toward preferentially attacking small-diameter trees in post-epidemic years. However, probabilities of bark beetle attack and mortality increased for small-diameter lodgepole pine and Engelmann spruce trees in post-epidemic years compared to epidemic years. We also show an increase in overall understory growth (graminoids, forbs, and shrubs) and seedling/sapling establishment in response to beetle-caused overstory mortality, especially in lodgepole pine dominated stands. Our observations provide evidence of the trajectories of attack and mortality as well as early forest regrowth of three common tree species during the transition from epidemic to post-epidemic stages of bark beetle populations in the field.
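The Bayesian attack-probability analysis described above can be caricatured as a logistic regression of attack outcome on tree diameter and outbreak phase, with the posterior sampled numerically. The sketch below uses simulated data and a hand-rolled Metropolis sampler; it is not the authors' model, and the coefficients, priors, and data are illustrative assumptions.

```python
# Minimal sketch: Bayesian logistic regression of attack probability on tree
# diameter and outbreak phase (epidemic vs post-epidemic), fit by random-walk
# Metropolis. All data are simulated; priors and true coefficients are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 400
diameter = rng.uniform(5, 50, size=n)        # tree diameter (cm), simulated
epidemic = rng.integers(0, 2, size=n)        # 1 = epidemic year, 0 = post-epidemic
true_logit = -4.0 + 0.08 * diameter + 1.5 * epidemic
attacked = rng.random(n) < 1 / (1 + np.exp(-true_logit))

X = np.column_stack([np.ones(n), diameter, epidemic])
y = attacked.astype(float)

def log_post(beta):
    """Log posterior: Bernoulli likelihood plus weak Normal(0, 10) priors."""
    logits = X @ beta
    loglik = np.sum(y * logits - np.log1p(np.exp(logits)))
    logprior = -0.5 * np.sum((beta / 10.0) ** 2)
    return loglik + logprior

beta = np.zeros(3)
lp = log_post(beta)
samples = []
for i in range(20000):
    prop = beta + rng.normal(scale=0.1, size=3)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
        beta, lp = prop, lp_prop
    if i >= 5000:                             # discard burn-in
        samples.append(beta)

samples = np.array(samples)
print("posterior means (intercept, diameter, epidemic):", samples.mean(axis=0).round(3))
# A positive posterior mean on the diameter coefficient is consistent with larger
# trees being more likely to be attacked, the pattern the study tests in the field.
```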
  4. Complete transformations of land cover from prairie, wetlands, and hardwood forests to row crop agriculture and urban centers are thought to have caused profound changes in hydrology in the Upper Midwestern US since the 1800s. In this study, we investigate four large (23 000–69 000 km2) Midwest river basins that span climate and land use gradients to understand how climate and agricultural drainage have influenced basin hydrology over the last 79 years. We use daily, monthly, and annual flow metrics to document streamflow changes and discuss those changes in the context of precipitation and land use changes. Since 1935, flow, precipitation, artificial drainage extent, and corn and soybean acreage have increased across the region. In extensively drained basins, we observe 2- to 4-fold increases in low flows and 1.5- to 3-fold increases in high and extreme flows. Using a water budget, we determined that the storage term has decreased in intensively drained and cultivated basins by 30–200 % since 1975, but increased by roughly 30 % in the less agricultural basin. Storage has generally decreased during spring and summer months and increased during fall and winter months in all watersheds. Thus, the loss of storage and enhanced hydrologic connectivity and efficiency imparted by artificial agricultural drainage appear to have amplified the streamflow response to precipitation increases in the Midwest. Future increases in precipitation are likely to further intensify drainage practices and increase streamflows. Increased streamflow has implications for flood risk, channel adjustment, and sediment and nutrient transport and presents unique challenges for agriculture and water resource management in the Midwest. Better documentation of existing and future drain tile and ditch installation is needed to further understand the role of climate versus drainage across multiple spatial and temporal scales.
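The water-budget reasoning in this abstract treats the change in basin storage as the residual of precipitation minus streamflow minus evapotranspiration, and characterizes streamflow change with low-, high-, and extreme-flow metrics. The snippet below sketches that bookkeeping on synthetic numbers; the values and quantile choices are illustrative assumptions, not data from the four study basins.

```python
# Illustrative water-budget bookkeeping: dS = P - Q - ET, plus simple flow metrics
# from a daily streamflow record. All arrays are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1935, 2014)
precip = rng.normal(850, 120, size=years.size)            # mm/yr, synthetic
et = rng.normal(550, 60, size=years.size)                 # mm/yr, synthetic
flow = precip - et + rng.normal(0, 40, size=years.size)   # mm/yr, synthetic runoff

storage_change = precip - flow - et                       # water-budget residual
print("mean annual storage change (mm):", storage_change.mean().round(1))

# Flow metrics from a daily record: low flow (5th percentile) and high flow (95th).
daily_q = rng.lognormal(mean=2.0, sigma=0.8, size=365)    # m^3/s, synthetic
low_flow, high_flow = np.percentile(daily_q, [5, 95])
print(f"low flow: {low_flow:.2f} m^3/s, high flow: {high_flow:.2f} m^3/s")
```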
  5. Green Lake is the deepest natural inland lake in Wisconsin, with a maximum depth of about 72 meters. In the early 1900s, the lake was believed to have very good water quality (low nutrient concentrations and good water clarity) with low dissolved oxygen (DO) concentrations occurring in only the deepest part of the lake. Because of increased phosphorus (P) inputs from anthropogenic activities in its watershed, total phosphorus (TP) concentrations in the lake have increased; these changes have led to increased algal production and low DO concentrations not only in the deepest areas but also in the middle of the water column (metalimnion). The U.S. Geological Survey has routinely monitored the lake since 2004 and its tributaries since 1988. Results from this monitoring led the Wisconsin Department of Natural Resources (WDNR) to list the lake as impaired because of low DO concentrations in the metalimnion, and they identified elevated TP concentrations as the cause of impairment. As part of this study by the U.S. Geological Survey, in cooperation with the Green Lake Sanitary District, the lake and its tributaries were comprehensively sampled in 2017–18 to augment ongoing monitoring that would further describe the low DO concentrations in the lake (especially in the metalimnion). Empirical and process-driven water-quality models were then used to determine the causes of the low DO concentrations and the magnitudes of P-load reductions needed to improve the water quality of the lake enough to meet multiple water-quality goals, including the WDNR’s criteria for TP and DO. Data from previous studies showed that DO concentrations in the metalimnion decreased slightly as summer progressed in the early 1900s but, since the late 1970s, have typically dropped below 5 milligrams per liter (mg/L), which is the WDNR criterion for impairment. During 2014–18 (the baseline period for this study), the near-surface geometric mean TP concentration during June–September in the east side of the lake was 0.020 mg/L and in the west side was 0.016 mg/L (both were above the 0.015-mg/L WDNR criterion for the lake), and the metalimnetic DO minimum concentrations (MOMs) measured in August ranged from 1.0 to 4.7 mg/L. The degradation in water quality was assumed to have been caused by excessive P inputs to the lake; therefore, the TP inputs to the lake were estimated. The mean annual external P load during 2014–18 was estimated to be 8,980 kilograms per year (kg/yr), of which monitored and unmonitored tributary inputs contributed 84 percent, atmospheric inputs contributed 8 percent, waterfowl contributed 7 percent, and septic systems contributed 1 percent. During fall turnover, internal sediment recycling contributed an additional 7,040 kilograms that increased TP concentrations in shallow areas of the lake by about 0.020 mg/L. The elevated TP concentrations then persisted until the following spring. On an annual basis, however, there was a net deposition of P to the bottom sediments. Empirical models were used to describe how the near-surface water quality of Green Lake would be expected to respond to changes in external P loading. Predictions from the models showed a relatively linear response between P loading and TP and chlorophyll-a (Chl-a) concentrations in the lake, with the changes in TP and Chl-a concentrations being less on a percentage basis (50–60 percent for TP and 30–70 percent for Chl-a) than the changes in P loading. 
Mean summer water clarity, quantified by Secchi disk depths, had a greater response to decreases in P loading than to increases in P loading. Based on these relations, external P loading to the lake would need to be decreased from 8,980 kg/yr to about 5,460 kg/yr for the geometric mean June–September TP concentration in the east side of the lake, with higher TP concentrations than in the west side, to reach the WDNR criterion of 0.015 mg/L. This reduction of 3,520 kg/yr is equivalent to a 46-percent reduction in the potentially controllable external P sources (all external sources except for precipitation, atmospheric deposition, and waterfowl) from those measured during water years 2014–18. The total external P loading would need to decrease to 7,680 kg/yr (a 17-percent reduction in potentially controllable external P sources) for near-surface June–September TP concentrations in the west side of the lake to reach 0.015 mg/L. Total external P loading would need to decrease to 3,870–5,320 kg/yr for the lake to be classified as oligotrophic, with a near-surface June–September TP concentration of 0.012 mg/L. Results from the hydrodynamic water-quality model GLM–AED (General Lake Model coupled to the Aquatic Ecodynamics modeling library) indicated that MOMs are driven by external P loading and internal sediment recycling that lead to high TP concentrations during spring and early summer, which in turn lead to high phytoplankton production, high metabolism and respiration, and ultimately DO consumption in the upper, warmer areas of the metalimnion. GLM–AED results indicated that settling of organic material during summer might be slowed by the colder, denser, and more viscous water in the metalimnion and thus increase DO consumption. Based on empirical evidence from a comparison of MOMs with various meteorological, hydrologic, water quality, and in-lake physical factors, MOMs were lower during summers, when metalimnetic water temperatures were warmer, near-surface Chl-a and TP concentrations were higher, and Secchi depths were lower. GLM–AED results indicated that the external P load would need to be reduced to about 4,060 kg/yr, a 57-percent reduction from that measured in 2014–18, to eliminate the occurrence of MOMs less than 5 mg/L during more than 75 percent of the years (the target provided by the WDNR). Large reductions in external P loading are expected to have an immediate effect on the near-surface TP concentrations and metalimnetic DO concentrations in Green Lake; however, it may take several years for the full effects of the external-load reduction to be observed because internal sediment recycling is an important source of P for the following spring.
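The load-reduction arithmetic for the east-side TP criterion can be checked directly: the 3,520 kg/yr reduction (8,980 minus 5,460 kg/yr) corresponds to about 46 percent of the potentially controllable external sources once the atmospheric (8 percent) and waterfowl (7 percent) shares are excluded. The short snippet below reproduces that calculation; the source fractions are taken from the abstract.

```python
# Check of the stated load-reduction arithmetic (values from the abstract above).
total_load = 8980            # kg/yr, mean annual external P load, 2014-18
target_load = 5460           # kg/yr, load needed to meet the 0.015 mg/L TP criterion (east side)
reduction = total_load - target_load                  # 3,520 kg/yr

uncontrollable = total_load * (0.08 + 0.07)           # atmospheric + waterfowl shares
controllable = total_load - uncontrollable            # ~7,630 kg/yr
print(f"reduction: {reduction} kg/yr "
      f"({reduction / controllable:.0%} of controllable sources)")   # ~46%
```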