
Using average net returns and risk measures to compare irrigation management strategies

Master of Agribusiness / Department of Agricultural Economics / Nathan P. Hendricks

Risk and uncertainty are inherent in agriculture, especially where the precipitation needed for crop production is often lacking. Precipitation in the High Plains is highly variable. To supplement precipitation, the Ogallala Aquifer, a large underground water storage reservoir, was developed for irrigation. However, as the saturated thickness of the aquifer decreases, the rate at which water can be extracted (i.e., well capacity) also decreases. Limited well capacities introduce risk into agricultural production because producers may not be able to irrigate sufficiently in dry years.
This study's objective was to develop a method to help producers compare alternative irrigation management strategies in the face of risk arising from limited well capacity. The objective was accomplished by simulating average net returns for 172 different irrigation strategies across 30 years (1986-2015) of historical weather (Kansas Mesonet 2016). Management strategies included different combinations of corn and wheat production under full irrigation, moderate irrigation, deficit irrigation, and dryland production. Each strategy was evaluated with three risk measures: Value at Risk (VaR), expected shortfall, and standard deviation.
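As an illustration only (the thesis does not provide code), the sketch below shows one way the three risk measures could be computed from simulated annual net returns for a single strategy. The function name, the 5% tail probability, and the placeholder return values are assumptions for the example, not figures from the study.

```python
import numpy as np

def risk_summary(net_returns, alpha=0.05):
    """Summarize simulated annual net returns ($/acre) for one strategy.

    net_returns : array of net returns, one per simulated weather year.
    alpha       : tail probability assumed for VaR / expected shortfall.
    """
    r = np.asarray(net_returns, dtype=float)
    mean = r.mean()                 # average net return across weather years
    std = r.std(ddof=1)             # sample standard deviation
    var = np.quantile(r, alpha)     # Value at Risk: lower-tail (e.g., 5th percentile) net return
    es = r[r <= var].mean()         # expected shortfall: mean of returns at or below the VaR
    return {"mean": mean, "std": std, "VaR": var, "ES": es}

# Hypothetical example: 30 simulated years for one strategy (placeholder values, not thesis data)
rng = np.random.default_rng(0)
simulated = rng.normal(loc=250.0, scale=90.0, size=30)
print(risk_summary(simulated))
```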
The risk-return tradeoff is estimated for each management strategy at two well capacities, 300 GPM (gallons per minute) and 600 GPM. Estimating these risk measures can help producers evaluate the optimal management strategy more effectively than comparing average net returns alone.
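A minimal sketch of how that comparison might be organized follows. The strategy names, return distributions, and the choice to tabulate mean net return against the 5% VaR are hypothetical; they only illustrate weighing average returns against a lower-tail risk measure, as the thesis describes.

```python
import numpy as np

def risk_return_table(strategy_returns, alpha=0.05):
    """Tabulate mean net return vs. lower-tail VaR for each strategy.

    strategy_returns : dict mapping strategy name -> array of simulated net returns.
    """
    rows = []
    for name, r in strategy_returns.items():
        r = np.asarray(r, dtype=float)
        rows.append((name, r.mean(), np.quantile(r, alpha)))
    # Sort by mean so the tradeoff is easy to scan: a risk-averse producer may
    # prefer a lower mean paired with a higher (less severe) VaR.
    return sorted(rows, key=lambda row: row[1], reverse=True)

# Hypothetical strategies at a 300 GPM well capacity (placeholder numbers, not thesis results)
rng = np.random.default_rng(1)
strategies = {
    "full-irrigation corn": rng.normal(260, 120, 30),
    "deficit corn + dryland wheat": rng.normal(230, 70, 30),
    "dryland wheat": rng.normal(120, 40, 30),
}
for name, mean, var in risk_return_table(strategies):
    print(f"{name:32s} mean ${mean:7.2f}/acre   5% VaR ${var:7.2f}/acre")
```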

Identifier: oai:union.ndltd.org:KSU/oai:krex.k-state.edu:2097/35548
Date: January 1900
Creators: Bretz, Frances
Publisher: Kansas State University
Source Sets: K-State Research Exchange
Language: en_US
Detected Language: English
Type: Thesis
