Modeling the Impacts of Historical Weather and Management on Nitrogen Loss
There is a continuing need for more detailed information on where in the landscape, and by how much, recommended practices to reduce N loss can realistically be expected to mitigate water quality challenges, as well as on their impacts on crop production.
This project will provide actionable data to inform stakeholders on
1) the contribution of controllable (management) and uncontrollable factors to water quality from 1980 to 2020, and
2) the effectiveness of three N loss reduction practices at scale: N fertilization amount, N fertilization timing, and inclusion of cover crops.
Researchers will use the Agricultural Production Systems sIMulator (APSIM), a well-calibrated cropping systems model for Iowa, to perform a regional scale systems analysis (4,000 fields). The model will be driven by weather, soil, and management databases that include tile drainage in the landscape, synthetic and organic N fertilizer by county and by year, and annual changes in planting densities, cultivars, and planting dates. By holding one factor at a time constant, we will determine the contribution of each factor to N loss, while a model sensitivity analysis will provide additional insights.
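The one-factor-at-a-time attribution described above can be sketched as follows. The `run_model` function and the factor values below are hypothetical stand-ins for the full APSIM regional runs; only the bookkeeping logic (hold one factor at its reference value, difference against the baseline) reflects the approach.

```python
# Hypothetical one-factor-at-a-time (OFAT) attribution sketch.
# run_model() is a toy stand-in for an APSIM regional simulation,
# so the attribution bookkeeping is runnable end to end.

def run_model(factors):
    # Toy response: N loss (kg/ha) as a simple function of the inputs.
    # The real study would run APSIM over ~4,000 fields instead.
    return (0.30 * factors["n_rate"]
            + 12.0 * factors["precip_anomaly"]
            - 5.0 * factors["cover_crop"])

# Baseline scenario with all factors at their observed values.
baseline = {"n_rate": 180.0, "precip_anomaly": 1.2, "cover_crop": 0.0}
# Reference (held-constant) value for each factor.
constants = {"n_rate": 150.0, "precip_anomaly": 0.0, "cover_crop": 1.0}

base_loss = run_model(baseline)
contributions = {}
for name, const_value in constants.items():
    scenario = dict(baseline)
    scenario[name] = const_value      # hold this one factor constant
    contributions[name] = base_loss - run_model(scenario)

for name, delta in contributions.items():
    print(f"{name}: {delta:+.1f} kg N/ha attributable")
```

The difference between the baseline run and each held-constant run is read as that factor's contribution; the sensitivity analysis mentioned above would probe how robust these attributions are to interactions between factors.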
Note: Project reports published on the INRC website are often revised from researchers' original reports to increase consistency.
During July-December 2022, researchers used the regionally calibrated model to understand temporal dynamics in the nitrogen balance (mineralization, leaching, crop N uptake) and to identify correlations with precipitation. The modeling results were presented in the December 2022 INRC seminar, which is available online. In brief, yield-scaled N losses were found to be improving over the years because of improved agronomic management and the use of better cultivars; however, the absolute amount of N loss is increasing because of wetter conditions. The evaluation period was 1984 to 2019.
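The distinction between improving yield-scaled losses and rising absolute losses can be made concrete with a small numeric sketch (the numbers below are illustrative, not results from the study): when yields grow faster than total N loss, the loss per unit of grain falls even as total loss rises.

```python
# Illustrative numbers only: decadal corn yield (t/ha) and total N loss (kg/ha).
years = [1984, 1994, 2004, 2014]
yields = [6.0, 7.5, 9.0, 10.5]     # rising yields (better management, cultivars)
n_loss = [30.0, 32.0, 34.0, 36.0]  # rising absolute loss (wetter conditions)

# Yield-scaled loss: kg of N lost per tonne of grain produced.
yield_scaled = [loss / y for loss, y in zip(n_loss, yields)]

for yr, ys in zip(years, yield_scaled):
    print(f"{yr}: {ys:.2f} kg N per t grain")
# Absolute loss increases over the series, yet yield-scaled loss declines.
```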
Other activities included one field day, two presentations and one workshop.
After incorporating the regional scale N inputs into the model (5-arc-minute spatial resolution), researchers performed a series of simulations using the coupled pSIMS-APSIM software. Each model run comprised about 1 million annual point-level simulations per crop. After executing these simulations, results were processed using ArcGIS and R. The next step was to compare the regional scale simulations with existing datasets at scale, mostly USDA NASS yield and phenology records, along with some soil water datasets.
After a couple of iterations, the model parameters were calibrated and good agreement was reached between simulated and observed corn yields. Most importantly, the model captured the historical yield increase over the last 50 years well.
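A minimal sketch of this kind of simulated-versus-observed comparison is shown below. The data are synthetic, and the metrics (RMSE and a fitted linear trend) are common calibration checks, not necessarily the exact ones used in the project.

```python
import math

# Synthetic stand-ins for observed (NASS-like) and simulated corn yields (t/ha).
years = list(range(1970, 2020, 5))
observed = [5.0 + 0.080 * (y - 1970) for y in years]   # long-term yield trend
simulated = [5.2 + 0.078 * (y - 1970) for y in years]  # model after calibration

# Root-mean-square error between simulated and observed yields.
rmse = math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed)) / len(years))

# Least-squares slope (t/ha per year), to check that the model
# reproduces the historical yield increase, not just the mean level.
def slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

obs_trend = slope(years, observed)
sim_trend = slope(years, simulated)
print(f"RMSE = {rmse:.2f} t/ha; trend obs = {obs_trend:.3f}, sim = {sim_trend:.3f}")
```

Agreement in both the error metric and the trend slope is what supports the statement that the model captures the historical yield increase, not only year-by-year yields.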
Other activities included two presentations and two workshops.
This progress report covers the project's first year (2021). Datasets were compiled covering the major nitrogen input sources across the US Corn Belt states. The datasets include annual N input rates per county for both manure and synthetic fertilizer N. A total of 1,056 counties and 36 years (1984-2019) for two crops (corn and soybean) and two sources of N resulted in 152,064 records. These inputs were converted to the APSIM format for further use in simulation analysis. We are currently analyzing the spatial (irrigated, rainfed, tile-drained fields) and temporal dynamics of N losses to better understand G x E x M effects on N losses. Two papers are in preparation: one demonstrating the performance of the model at regional scale (yield and N loss prediction), and a second separating and explaining the contributions of management and weather to N loss and yield increase.
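The record count reported above follows directly from the dataset's dimensions. The sketch below builds the same county-year-crop-source index with hypothetical field names; the actual APSIM input format differs.

```python
from itertools import product

n_counties = 1056
years = range(1984, 2020)            # 1984-2019 inclusive: 36 years
crops = ["corn", "soybean"]
n_sources = ["synthetic", "manure"]

# One record per (county, year, crop, N source) combination.
# Field names here are illustrative, not the APSIM schema.
records = [
    {"county": c, "year": y, "crop": crop, "source": src}
    for c, y, crop, src in product(range(n_counties), years, crops, n_sources)
]
print(len(records))  # 1,056 counties x 36 years x 2 crops x 2 sources
```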