

Title: Construction of Critical Periods for Water Resources Management and Their Application in the FEW Nexus
Amidst growing population, urbanization, globalization, and economic growth, along with the impacts of climate change, decision-makers, stakeholders, and researchers need tools for better assessment and communication of the highly interconnected food–energy–water (FEW) nexus. This study aimed to identify critical periods for water resources management to support robust decision-making at the nexus. Using a 4610 ha agricultural watershed as a pilot site, historical data (2006–2012), values from the scientific literature, and SWAT model simulations were used to map critical periods throughout the growing season of corn and soybeans. The results indicate that soil water deficits occur primarily in June and July, with average deficits and surpluses ranging from −134.7 to +145.3 mm during the study period. Corresponding water quality impacts include average monthly surface nitrate-N, subsurface nitrate-N, and soluble phosphorus losses of up to 0.026, 0.26, and 0.0013 kg/ha, respectively, over the growing season. Estimated fuel requirements for the agricultural practices ranged from 24.7 to 170.3 L/ha, while estimated carbon emissions ranged from 0.3 to 2.7 kg CO2/L. A composite look at all the FEW nexus elements showed that critical periods for water management in the study watershed occurred in the early and late season, primarily related to water quality, and in mid-season, related to water quantity. This suggests the need to adapt agricultural and other management practices across the growing season in line with the respective water management needs. The FEW nexus assessment methodologies developed in this study provide a framework in which spatial, temporal, and literature data can be combined for improved water resources management in other areas.
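A minimal sketch of the two simple calculations the abstract summarizes: a monthly soil water balance (deficit/surplus) and a fuel-to-carbon conversion. The monthly precipitation and evapotranspiration values below are hypothetical placeholders; only the emission factor range (0.3–2.7 kg CO2/L) and fuel range (24.7–170.3 L/ha) come from the abstract.

```python
def water_balance(precip_mm, et_mm):
    """Monthly soil water deficit (negative) or surplus (positive), in mm."""
    return precip_mm - et_mm

def fuel_emissions(fuel_l_per_ha, ef_kg_co2_per_l=2.7):
    """Carbon emissions from field operations, kg CO2/ha.
    2.7 kg CO2/L is the upper end of the range reported in the abstract."""
    return fuel_l_per_ha * ef_kg_co2_per_l

# Hypothetical dry July: 60 mm rain against 150 mm crop evapotranspiration
print(water_balance(60, 150))   # -90 (mm deficit)
# Lowest reported fuel use, converted at the upper emission factor
print(fuel_emissions(24.7))     # 66.69 kg CO2/ha
```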
Award ID(s):
1735282 1855882
PAR ID:
10285671
Author(s) / Creator(s):
Date Published:
Journal Name:
Water
Volume:
13
Issue:
5
ISSN:
2073-4441
Page Range / eLocation ID:
718
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Abstract Excessive phosphorus (P) applications to croplands can contribute to eutrophication of surface waters through surface runoff and subsurface (leaching) losses. We analyzed leaching losses of total dissolved P (TDP) from no-till corn, hybrid poplar (Populus nigra × P. maximowiczii), switchgrass (Panicum virgatum), miscanthus (Miscanthus giganteus), native grasses, and restored prairie, all planted in 2008 on former cropland in Michigan, USA. All crops except corn (13 kg P ha−1 year−1) were grown without P fertilization. Biomass was harvested at the end of each growing season except for poplar. Soil water at 1.2 m depth was sampled weekly to biweekly for TDP determination during March–November 2009–2016 using tension lysimeters. Soil test P (STP; 0–25 cm depth) was measured every autumn. Soil water TDP concentrations were usually below levels at which eutrophication of surface waters is frequently observed (>0.02 mg L−1) but often higher than in deep groundwater or nearby streams and lakes. Rates of P leaching, estimated from measured concentrations and modeled drainage, did not differ statistically among cropping systems across years; 7-year cropping system means ranged from 0.035 to 0.072 kg P ha−1 year−1 with large interannual variation. Leached P was positively related to STP, which decreased over the 7 years in all systems. These results indicate that both P-fertilized and unfertilized cropping systems may leach legacy P from past cropland management.
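A minimal sketch of the load estimate described above (measured concentration × modeled drainage). The concentration and drainage values are hypothetical, chosen so the result falls inside the reported 0.035–0.072 kg P ha−1 yr−1 range; the unit conversion is standard.

```python
def leaching_load(tdp_mg_per_l, drainage_mm):
    """Annual P leaching load, kg P/ha.
    1 mm of drainage over 1 ha = 10,000 L; 1 mg = 1e-6 kg."""
    litres_per_ha = drainage_mm * 10_000
    return tdp_mg_per_l * 1e-6 * litres_per_ha

# Hypothetical year: 0.02 mg/L mean TDP, 250 mm modeled drainage
print(leaching_load(0.02, 250))  # ~0.05 kg P/ha/yr
```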
  2.
    Technical best management practices are the dominant approach promoted to mitigate agriculture’s significant contributions to environmental degradation. Yet very few social science studies have examined how farmers actually use these practices. This study focuses on the outcomes of farmers’ adoption of technical best management practices for synthetic nitrogen (N) fertilizer management in the context of Midwestern corn agriculture in the United States. Moving beyond predicting the adoption of nitrogen best management practices, I use structural equation modeling and data from a sample of over 2500 farmers to analyze how the number of growing season applications a farmer uses influences the rate at which synthetic nitrogen is applied at the field level. I find that each additional application of N during the growing season is associated with an average increase of 2.4 kg/ha in farmers’ average N application rate. This result counters expectations for this practice and may suggest that structural pressures are leading farmers to use additional growing season applications to ensure sufficiently high N rates, rather than allowing them to reduce rates. I conclude by discussing the implications of this study for future research and policy.
  3. Abstract Agriculture is a key contributor to gaseous emissions causing climate change, the degradation of water quality, and biodiversity loss. The extant climate change crisis is driving a focus on mitigating agricultural gaseous emissions, but wider policy objectives, beyond net zero, mean that evidence on the potential co-benefits or trade-offs associated with on-farm intervention is warranted. For novelty, aggregated data on farm structure and spatial distribution for different farm types were integrated with high-resolution data on the natural environment to generate representative model farms. Accounting for existing mitigation effects, the Catchment Systems Model was then used to quantify global warming potential (GWP), emissions to water, and other outcomes for water management catchments across England under both business-as-usual and a maximum technically feasible mitigation potential scenario. Mapped spatial patterns were overlain with the distributions of areas experiencing poor water quality and biodiversity loss to examine potential co-benefits. The median business-as-usual GWP20 and GWP100, excluding embedded emissions, were estimated to be 4606 kg CO2eq. ha−1 (inter-quartile range 4240 kg CO2eq. ha−1) and 2334 kg CO2eq. ha−1 (inter-quartile range 1462 kg CO2eq. ha−1), respectively. The ratios of business-as-usual GHG emissions to monetized farm production ranged between 0.58 and 8.89 kg CO2eq. £−1 for GWP20, compared with 0.53–3.99 kg CO2eq. £−1 for GWP100. The maximum mitigation potentials ranged between 17 and 30% for GWP20 and 19–27% for GWP100, with both corresponding medians estimated to be ~24%. Here, we show for the first time that the co-benefits for water quality associated with reductions in phosphorus and sediment loss were both equivalent to around a 34% reduction, relative to business-as-usual, in specific management catchment reporting units where excess water pollutant loads were identified.
Several mitigation measures included in the mitigation scenario were also identified as having the potential to deliver co-benefits for terrestrial biodiversity. 
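A minimal sketch of the emissions-intensity ratio (kg CO2eq per £ of monetized farm production) reported in the abstract above. The median GWP100 value (2334 kg CO2eq/ha) is from the abstract; the production value per hectare is a hypothetical placeholder.

```python
def emissions_intensity(gwp_kg_co2eq_per_ha, production_gbp_per_ha):
    """GHG emissions per pound sterling of farm production, kg CO2eq/£."""
    return gwp_kg_co2eq_per_ha / production_gbp_per_ha

# Reported median GWP100 against a hypothetical production value of 1500 £/ha;
# the result sits inside the reported 0.53–3.99 kg CO2eq/£ range for GWP100.
print(round(emissions_intensity(2334, 1500), 2))  # 1.56
```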
  4. Abstract Efforts to reduce nitrogen and carbon loading from developed watersheds typically target specific flows or sources, but across gradients in development intensity there is no consensus on the contribution of different flows to total loading or on the sources of nitrogen export. This information is vital to optimize management strategies leveraging source reductions, stormwater controls, and restorations. We investigate how solute loading and sources vary across flows and land use using high-frequency monitoring and stable nitrate isotope analysis from five catchments with different sanitary infrastructure, along a gradient in development intensity. High-frequency monitoring allowed estimation of annual loading and its attribution to storm versus baseflows. Nitrate loads were 16 kg/km2/yr from the forested catchment and ranged from 68 to 119 kg/km2/yr across the developed catchments, with the highest load from the septic-served site. Across the developed catchments, baseflow contributions ranged from 40% of N loading up to 75% in the septic-served catchment, and the contribution from high stormflows increased with development intensity. Stormflows mobilized and mixed many surface and subsurface nitrate sources, while baseflow nitrate was dominated by fewer sources, which varied by catchment (soil, wastewater, or fertilizer). To help inform future sampling designs, we demonstrate that grab sampling and targeted storm sampling would likely have failed to accurately predict annual loadings within the study period. The dominant baseflow loads and subsurface stormflows are not treated by surface water management practices primarily targeted at surface stormflows. Using a balance of green and gray infrastructure and stream/riparian restoration may target specific flow paths and improve management.
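A minimal sketch of the baseflow/stormflow load partition the high-frequency monitoring above enables. The 75% baseflow share matches the septic-served catchment figure in the abstract; pairing it with the 119 kg/km2/yr upper load is an assumption for illustration.

```python
def partition_load(total_kg_km2_yr, baseflow_frac):
    """Split an annual nitrate load into baseflow and stormflow components."""
    base = total_kg_km2_yr * baseflow_frac
    storm = total_kg_km2_yr - base
    return base, storm

# Septic-served catchment: 75% of load delivered in baseflow (per abstract)
base, storm = partition_load(119, 0.75)
print(base, storm)  # 89.25 29.75 (kg/km2/yr)
```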
  5. Excessive nitrate loading from agricultural runoff leads to substantial environmental and economic harm, and although hydrological models are used to mitigate these effects, the influence of various satellite precipitation products (SPPs) on nitrate load simulations is often overlooked. This study addresses this research gap by evaluating the impacts of different precipitation products (ERA5, IMERG, and gridMET) on flow and nitrate load simulations with the Soil and Water Assessment Tool Plus (SWAT+), using the Tar-Pamlico watershed as a case study. Although agricultural activity is higher in the summer, this study found the lowest nitrate load during this season due to reduced runoff. In contrast, the nitrate load was higher in the winter because of increased runoff, highlighting the dominance of water flow in driving riverine nitrate load. This study found that although IMERG predicts the highest annual average flow (120 m3/s in Pamlico Sound), it unexpectedly results in the lowest annual average nitrate load (1750 metric tons/year). In contrast, gridMET estimates significantly higher annual average nitrate loads (3850 metric tons/year). This discrepancy underscores the crucial impact of the choice of rainfall dataset on nitrate transport predictions.
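A minimal sketch of the cross-forcing comparison above. The IMERG and gridMET loads are the values reported in the abstract; the ERA5 load is a hypothetical placeholder, since the abstract does not report it.

```python
# Annual average nitrate loads (metric tons/yr) simulated under each forcing.
# IMERG and gridMET values are from the abstract; ERA5 is assumed.
loads_t_per_yr = {"IMERG": 1750, "gridMET": 3850, "ERA5": 2500}

lowest = min(loads_t_per_yr, key=loads_t_per_yr.get)
highest = max(loads_t_per_yr, key=loads_t_per_yr.get)
spread = loads_t_per_yr[highest] - loads_t_per_yr[lowest]

print(lowest, highest)  # IMERG gridMET
print(spread)           # 2100 metric tons/yr between forcings
```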