The last decade has been a period of great expansion and land use intensification for the New Zealand dairy farming industry, with a 44% increase in national dairy cow numbers. Intensive dairy farming is now considered to be a major contributor to the deterioration in the quality of surface and ground water resources in some regions of New Zealand. Previous research has demonstrated that intensive dairy farming is responsible for accelerated contamination of waterways by nutrients, suspended solids, pathogenic organisms and faecal material. A number of common dairy farming practices increase the risk of nutrient leaching. In particular, farm dairy effluent (FDE) has been implicated as a major contributor to the degradation of water quality. With the introduction of the Resource Management Act in 1991, the preferred treatment for FDE shifted away from traditional two-pond systems to land application. However, on most farms, irrigation of FDE has occurred on a daily basis, often without regard for soil moisture status. It has therefore been commonplace for partially treated effluent to drain through, or run off, the soil and contaminate fresh water bodies.

The objectives of this thesis were to design and implement a sustainable land application system for FDE on difficult-to-manage, mole and pipe drained soils, and to assess the impacts of FDE application, urea application and cattle grazing events on nutrient losses via artificial drainage and surface runoff from dairy cattle grazed pasture. To meet these objectives, a research field site was established on Massey University's No. 4 Dairy Farm near Palmerston North. The soil type was Tokomaru silt loam, a Fragiaqualf with poor natural drainage. Eight experimental plots (each 40 m x 40 m) were established with two treatments. Four of the plots represented standard farm practice, including grazing and fertiliser regimes. The remaining four plots were subjected to the same farm practices but received no fertiliser and were instead irrigated with FDE. Each plot had an isolated mole and pipe drainage system. Four surface runoff plots (each 5 m x 10 m) were established as subplots (two on the fertilised plots and two on the plots irrigated with FDE) in the final year of the study. Plots were instrumented to allow the continuous monitoring of drainage and surface runoff and the collection of water samples for nutrient analyses.

An application of 25 mm of FDE to a soil with a limited soil water deficit (simulating a 'daily' irrigation regime) resulted in considerable drainage of partially treated FDE. Approximately 70% of the applied FDE left the experimental plots as 10 mm of drainage and 8 mm of surface runoff. The resulting concentrations of N and P in drainage and runoff were approximately 45% and 80% of the original concentrations in the applied FDE, respectively. From this single irrigation event, a total of 12.1 kg N ha⁻¹ and 1.9 kg P ha⁻¹ was lost to surface water, representing 45% of the expected annual N loss and 100% of the expected annual P loss.

An improved system for applying farm dairy effluent to land, called 'deferred irrigation', was successfully developed and implemented at the research site. Deferred irrigation involves the storage of effluent in a two-pond system during periods of small soil moisture deficits and the scheduling of irrigation at times of suitable soil water deficits.
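As a minimal sketch of the scheduling logic behind deferred irrigation (not the thesis's actual decision tool; the feasibility assessment in the thesis used the APSIM model described below), the example that follows applies a simple rule: effluent accumulates in pond storage every day, and an irrigation of a fixed application depth is triggered only when the soil water deficit is at least as large as that depth and enough effluent has been stored. The application depth, daily effluent volume, pond capacity, deficit series and function name are illustrative assumptions, not values or code from the study.

    # Illustrative sketch of a deferred-irrigation scheduling rule.
    # All numbers below (application depth, daily effluent, pond capacity,
    # deficit series) are assumed for demonstration, not thesis values.

    APPLICATION_DEPTH_MM = 15.0   # depth applied per irrigation event (assumed)
    DAILY_EFFLUENT_MM = 3.0       # FDE produced per day, as mm over the effluent block (assumed)
    POND_CAPACITY_MM = 120.0      # pond storage capacity in the same units (assumed)

    def schedule_deferred_irrigation(soil_water_deficit_mm):
        """Return (day, action, pond storage) records for a daily series of soil water deficits (mm)."""
        pond = 0.0
        log = []
        for day, deficit in enumerate(soil_water_deficit_mm, start=1):
            pond += DAILY_EFFLUENT_MM                      # effluent flows to storage every day
            if deficit >= APPLICATION_DEPTH_MM and pond >= APPLICATION_DEPTH_MM:
                pond -= APPLICATION_DEPTH_MM               # irrigate only when the soil can absorb the full depth
                log.append((day, "irrigate", round(pond, 1)))
            elif pond > POND_CAPACITY_MM:
                log.append((day, "storage exceeded", round(pond, 1)))
            else:
                log.append((day, "store", round(pond, 1)))
        return log

    # Example: deficits grow from 0 mm (wet soil, FDE is stored) to 30 mm (dry enough to irrigate).
    deficits = [0, 2, 5, 8, 12, 16, 20, 25, 30, 30]
    for record in schedule_deferred_irrigation(deficits):
        print(record)

Requiring the deficit to exceed the full application depth before irrigating is the design choice that prevents direct drainage of partially treated FDE; choosing a smaller application depth lets the rule trigger earlier in spring at the cost of more frequent events, which is consistent with the APSIM-based finding reported below that depths of 15 mm or less reduce the required storage capacity.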
Deferred irrigation of FDE all but eliminated direct drainage losses: on average, less than 1% of the volume of effluent and of the nutrients applied left the experimental plots. Adopting an approach of applying 'little and often' resulted in no drainage and, therefore, zero direct loss of the nutrients applied. A modelling exercise, using the APSIM simulation model, was conducted to study the feasibility of practising deferred irrigation at the farm scale on No. 4 Dairy Farm. Using climate data for the past 30 years, the simulation demonstrated that small application depths of FDE (15 mm or less) allowed irrigations to be scheduled earlier in spring and decreased the required effluent storage capacity. A rotating travelling irrigator, of the type commonly used to apply FDE, produced 2-3 fold differences in application depth and increased the risk of generating FDE-contaminated drainage. Newer irrigator technology (an oscillating travelling irrigator) provided a more uniform application pattern, giving greater confidence that an irrigation depth less than the soil water deficit could be applied. This allowed a greater volume to be irrigated whilst avoiding direct drainage of FDE when soil moisture deficits are low in early spring and late autumn. A recommendation arising from this work is that, during these periods of low soil water deficit, all irrigators should be set to travel at their fastest speed (lowest application depth) to minimise the potential for direct drainage of partially treated FDE and the associated nutrient losses.

The average concentrations of N and P in mole and pipe drainage water from grazed dairy pastures in both the 2002 and 2003 winters were well above the levels required to prevent aquatic weed growth in fresh water bodies. Total N losses from plots representing standard farm practice were 28 kg N ha⁻¹ and 34 kg N ha⁻¹ for 2003 and 2004, respectively. Total P losses in 2003 and 2004 were 0.35 kg P ha⁻¹ and 0.7 kg P ha⁻¹, respectively. Surface runoff, measured in 2003, contributed a further 3.0 kg N ha⁻¹ and 0.6 kg P ha⁻¹. A number of common dairy farm practices caused immediate increases in the losses of N and P in artificial drainage water. Recent grazing events increased NO₃-N and dissolved inorganic P (DIP) concentrations in drainage by approximately 5 mg litre⁻¹ and 0.1 mg litre⁻¹, respectively. The interval between grazing and drainage events influenced the form of N lost, owing to a likely urine contribution when grazing and drainage coincided, but had little impact on the total quantity of N lost. Nitrogen loss from an early spring application of urea in 2002 was minimal, whilst a mid-June application in 2003 resulted in an increased loss of NO₃-N throughout 80 mm of cumulative drainage, suggesting that careful timing of urea applications in winter is required to prevent unnecessary N leaching.

Storage and deferred irrigation of FDE during the lactation season caused no real increase in either total N concentrations or total N losses in the winter drainage water of 2002 and 2003. In contrast, land application of FDE using the deferred irrigation system resulted in a gradual increase in total P losses over the 2002 and 2003 winter drainage seasons. However, this increase represented less than 4% of the P applied in FDE during the lactation season.
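The area-based losses quoted above combine a drainage (or runoff) depth with a flow-weighted nutrient concentration. As a hedged worked example of that arithmetic only, 1 mm of water over 1 ha is 10,000 L, so a load in kg ha⁻¹ is depth (mm) multiplied by concentration (mg litre⁻¹) divided by 100. The concentrations in the sketch are back-calculated or assumed for illustration, not measured values from the thesis.

    def load_kg_per_ha(depth_mm, conc_mg_per_litre):
        """Nutrient load (kg/ha) from a water depth (mm) at a given concentration (mg/L).
        1 mm over 1 ha = 10,000 L and 1 kg = 1,000,000 mg, hence the factor of 100."""
        return depth_mm * conc_mg_per_litre / 100.0

    # Worked check against the single 25 mm FDE application described earlier:
    # 10 mm drainage + 8 mm runoff carried 12.1 kg N/ha, implying a flow-weighted
    # concentration of roughly 12.1 * 100 / 18 = 67 mg N/L (back-calculated, not measured).
    implied_n_conc = 12.1 * 100 / (10 + 8)
    print(round(implied_n_conc, 1), "mg N/L (implied)")

    # Conversely, a 5 mg/L rise in NO3-N concentration over, say, 50 mm of winter
    # drainage (an assumed depth for illustration) would add about 2.5 kg N/ha:
    print(load_kg_per_ha(50, 5.0), "kg N/ha")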
An assessment of likely nutrient losses at the whole-farm scale suggests that standard dairy farming practice (particularly intensive cattle grazing) is responsible for the great majority of N and P loss at the farm scale. When expressed as a proportion of whole-farm losses, only a very small quantity of N is lost under an improved land treatment technique for FDE such as deferred irrigation. The management of FDE plays a greater role in likely P losses at the farm scale, with deferred irrigation contributing about 5% of whole-farm P losses.
Identifier: oai:union.ndltd.org:ADTP/290080
Date: January 2005
Creators: Houlbrooke, David John
Source Sets: Australasian Digital Theses Program
Language: English
Detected Language: English