• Visual drainage assessment: A standardised visual soil assessment method for use in land drainage design in Ireland

      Tuohy, P.; Humphreys, James; Holden, N.M.; O'Loughlin, James; Reidy, B.; Fenton, Owen T. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-08-20)
      The implementation of site-specific land drainage system designs is usually disregarded by landowners in favour of locally established ‘standard practice’ land drainage designs. This is due to a number of factors such as a limited understanding of soil–water interactions, lack of facilities for the measurement of soil physical or hydrological parameters, and perceived time wastage and high costs. Hence there is a need for a site-specific drainage system design methodology that does not rely on inaccessible, time-consuming and/or expensive measurements of soil physical or hydrological properties. This requires a standardised process for deciphering the drainage characteristics of a given soil in the field. As an initial step, a new visual soil assessment method, referred to as visual drainage assessment (VDA), is presented whereby an approximation of the permeability of specific soil horizons is made using seven indicators (water seepage, pan layers, texture, porosity, consistence, stone content and root development) to provide a basis for the design of a site-specific drainage system. Across six poorly drained sites (1.3 ha to 2.6 ha in size) in south-west Ireland, a VDA-based design was compared with (i) an ideal design (utilising soil physical measurements to elucidate soil hydraulic parameters) and (ii) a standard design (0.8 m deep drains at a 15 m spacing) by model estimates of water table control and rainfall recharge/drain discharge capacity. The VDA method, unlike the standard design equivalents, provided a good approximation of the ideal design (derived from measured hydrological properties) and prescribed an equivalent land drainage system in the field. Mean modelled rainfall recharge/drain discharge capacities for the VDA (13.3 mm/day) and ideal (12.0 mm/day) designs were significantly higher (P < 0.001, s.e. 1.42 mm/day) than for the standard design (0.5 mm/day), when assuming a design minimum water table depth of 0.45 m.
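Drainage design of the kind described above trades off drain depth and spacing against a design recharge rate. As a hedged sketch of that relationship, the Hooghoudt steady-state spacing equation (a standard drainage-design tool; the abstract does not specify the authors' model) can be applied with purely illustrative parameter values:

```python
import math

def hooghoudt_spacing(K, d, h, q):
    """Steady-state drain spacing L (m) via the Hooghoudt equation for
    homogeneous soil: q = (8*K*d*h + 4*K*h**2) / L**2.
    K: hydraulic conductivity (m/day), d: equivalent depth to the
    impermeable layer below drain level (m), h: design midpoint water
    table height above drain level (m), q: design recharge rate (m/day)."""
    return math.sqrt((8 * K * d * h + 4 * K * h ** 2) / q)

# Illustrative values only (not from the study): drains at 0.8 m depth
# with a design water table depth of 0.45 m give h = 0.8 - 0.45 = 0.35 m.
L = hooghoudt_spacing(K=0.1, d=1.0, h=0.35, q=0.012)  # q of ~12 mm/day
print(round(L, 1))  # about 5.2 m between drains under these assumptions
```

The sketch shows why a mis-estimated conductivity K leads directly to an over- or under-spaced system, which is the gap the VDA indicators are intended to close in the field.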
    • Impact of slurry application method on phosphorus loss in runoff from grassland soils during periods of high soil moisture content

      McConnell, D.A.; Doody, D.G.; Elliott, C.T.; Matthews, D.I.; Ferris, C.P. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-08-27)
      Previous studies have reported that the trailing-shoe application technique reduces phosphorus (P) in runoff following slurry application when compared to the traditional splash-plate application technique. However, the effectiveness of the trailing-shoe technique as a means of reducing P losses has not been evaluated when slurry is applied during periods of high soil moisture and low herbage covers. To address this issue, a 3 × 4 factorial split-plot experiment was conducted, comprising three slurry treatments (control [no slurry], splash-plate and trailing-shoe) and four slurry application dates (7 December, 18 January, 1 March and 10 April). Dairy cow slurry was applied at a rate of 20 m3/ha, while simulated runoff was generated 2, 9 and 16 days later and analysed for a range of P fractions. Dissolved reactive P concentrations in runoff at day 2 were 41% lower when slurry was applied using the trailing-shoe technique compared to the splash-plate technique (P < 0.05). In addition, P concentrations in runoff were higher (P < 0.05) from slurry applied in December and March than from slurry applied in January or April, coinciding with periods of higher soil moisture content. While the latter highlights that ‘calendar’-based non-spreading periods might not always achieve the desired outcomes, the study demonstrated that further field-scale investigations into the trailing shoe as a mitigation measure to reduce P loss from agricultural soils are warranted.
    • Additive genetic, non-additive genetic and permanent environmental effects for female reproductive performance in seasonal calving dairy females

      Kelleher, M.M.; Buckley, Frank; Evans, R.D.; Berry, Donagh P. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-09-08)
      Excellent reproductive performance (i.e. a 365-day calving interval) is paramount to herd profit in seasonal-calving dairy systems. Reproductive targets are currently not being achieved in Irish dairy herds. Furthermore, most research on the genetics of reproductive performance in dairy cattle has focused primarily on lactating cows, and relatively few studies have attempted to quantify the genetic contribution to differences in reproductive performance in nulliparae. The objective of the present study was to estimate the contribution of both the additive and non-additive genetic components, as well as the permanent environmental component, to phenotypic variation in reproductive traits in nulliparous, primiparous and multiparous seasonal-calving dairy females. Reproductive phenotypes were available on up to 202,525 dairy females. Variance components were estimated using linear animal mixed models (repeatability models where appropriate); fixed effects included in the mixed models were contemporary group, parity (where appropriate), breed proportion, breed-specific heterosis coefficients and breed-specific recombination loss coefficients. Heritability of the reproductive traits ranged from 0.004 (pregnancy rate to first service) to 0.17 (age at first service in nulliparae), while repeatability estimates for the reproductive traits in cows ranged from 0.01 (calving interval) to 0.11 (pregnant in the first 42 days of the breeding season). Breed-specific heterosis regression coefficients suggest that, relative to the parental mean, a first-cross Holstein–Jersey crossbred was almost 7 days younger at first calving, had a 9-day shorter calving interval, a 6 percentage unit greater pregnancy rate in the first 42 days of the breeding season and a 3 percentage unit greater survival rate to the next lactation. Heifer calving rate traits were strongly genetically correlated with age at first calving (–0.97 to –0.66) and with calving rate in the first 42 days of the calving season for first-parity cows (0.56 to 0.77), but genetic correlations with other cow reproductive traits were weak and inconsistent. Calving interval was strongly genetically correlated with the majority of the cow traits; 56%, 40% and 92% of the genetic variation in calving interval was explained by the calving to first service interval, the number of services and pregnancy in the first 42 days of the breeding season, respectively. Permanent environmental correlations between the reproductive performance traits were generally moderate to strong. The existence of contributions from non-additive genetic and permanent environmental effects to phenotypic differences among cows suggests the usefulness of such information for ranking cows on future expected performance; this was evidenced by a stronger correlation with future reproductive performance for an individual cow index that combined additive genetic, non-additive genetic and permanent environmental effects than for an index based solely on additive genetic effects (i.e. estimated breeding values).
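The cow-index comparison above can be sketched numerically. This is an illustrative toy, not the study's model: the cow names and effect sizes are hypothetical, and the index simply sums the three estimated effects (read as deviations in, say, calving interval days, where lower is better):

```python
# Hypothetical per-cow effect estimates (illustrative values only).
# ebv = additive genetic (estimated breeding value),
# non_additive = non-additive genetic, perm_env = permanent environment.
cows = {
    "cow_A": {"ebv": -2.0, "non_additive": -3.0, "perm_env": -4.0},
    "cow_B": {"ebv": -3.0, "non_additive": +1.0, "perm_env": +1.5},
}

def combined_index(effects):
    """Predicted future phenotypic deviation for the cow herself:
    the sum of additive, non-additive and permanent environmental effects."""
    return effects["ebv"] + effects["non_additive"] + effects["perm_env"]

# Ranking on EBV alone vs. on the combined index (ascending: best first).
rank_ebv = sorted(cows, key=lambda c: cows[c]["ebv"])
rank_combined = sorted(cows, key=lambda c: combined_index(cows[c]))
print(rank_ebv, rank_combined)  # the two rankings differ
```

The point of the sketch: cow_B has the better EBV, but cow_A's favourable non-additive and permanent environmental effects reverse the ranking when predicting her own future performance, which is the study's argument for the combined index.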
    • The interactive effects of fertiliser nitrogen with dung and urine on nitrous oxide emissions in grassland

      Hyde, B.P.; Forrestal, Patrick J.; Jahangir, M.M.R.; Ryan, M.; Fanning, A.F.; Carton, Owen T.; Lanigan, Gary; Richards, Karl G. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-09-08)
      Nitrous oxide (N2O) is an important and potent greenhouse gas (GHG). Although application of nitrogen (N) fertiliser is a feature of many grazing systems, limited data are available on N2O emissions in grassland arising from the interaction between urine, dung and fertiliser N. A small plot study was conducted to identify the individual and interactive effects of calcium ammonium nitrate (CAN) fertiliser, dung and urine. Application of CAN with dung and urine significantly increased the mass of N2O-N emission. Importantly, the sum of N2O-N emitted from dung and CAN applied individually approximated the emission from dung and CAN fertiliser applied together, that is, an additive effect. However, in the case of urine and CAN applied together, the emission was more than double the sum of the emissions from urine and CAN fertiliser applied individually, that is, a multiplicative effect. Nitrous oxide emissions from dung, urine and fertiliser N are typically derived individually, and these individual emission estimates are aggregated to produce estimates of N2O emission. The presented findings have important implications for how individual emission factors are aggregated; they suggest that the multiplicative effect of the addition of CAN fertiliser to urine patches needs to be taken into account to refine the estimation of N2O emissions from grazing grasslands.
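The aggregation point can be made concrete with a toy calculation. All emission values and the interaction multiplier below are hypothetical illustrations of the additive vs. multiplicative distinction, not the study's measurements:

```python
# Hypothetical N2O-N emissions (kg/ha) from each source applied alone;
# values are illustrative, not the study's data.
e_dung, e_urine, e_can = 0.4, 1.0, 0.6

# Dung + CAN behaved additively: the combined-plot emission roughly
# equalled the sum of the individual emissions.
dung_plus_can = e_dung + e_can

# Urine + CAN behaved multiplicatively: the combined emission exceeded
# double the additive sum, so a simple sum under-predicts it. The
# multiplier of 2.0 here is an illustrative placeholder.
interaction = 2.0
urine_plus_can = interaction * (e_urine + e_can)

print(dung_plus_can, urine_plus_can)  # 1.0 vs 3.2
```

An inventory that sums per-source emission factors would report 1.6 kg/ha for the urine + CAN case in this toy example, roughly half the interaction-corrected figure, which is the refinement the authors argue for.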
    • Effect of nitrogen fertilizer application timing on nitrogen use efficiency and grain yield of winter wheat in Ireland.

      Efretuei, A.; Gooding, M.; White, E.; Spink, John; Hackett, Richard (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-01)
      The objectives of this work were to determine the effects of initiating application of fertilizer nitrogen (N) to winter wheat at different growth stages (GSs) on grain yield and N use efficiency (NUE). A factorial experiment was carried out in two growing seasons (2011 and 2012) with five timings of first N application (GS 24/26 [tillering], GS 30, GS 31, GS 32 or GS 37) and an unfertilized control, two sowing densities (100 and 400 seeds/m2) and a cattle slurry treatment (with or without slurry). The latter was included to simulate variation in soil N supply (SNS). Delaying the first application of N from the tillering stage until GS 30 had no significant effect on grain yield in either year. Further delaying the initial N application until GS 31 caused a significant yield reduction in 2011, in comparison to GS 30 application, but not in 2012. Differences in the efficiency of recovery and use of fertilizer N by the crop among the first three application timings were small. There was no evidence to support altering the timing of the first application of N in response to low plant density. Slurry application did not influence SNS, so the interaction between SNS and fertilizer N application timing could not be determined. It is concluded that, in order to maximise yield and NUE, the first N application should be made to winter wheat between late tillering and GS 30, and that delaying the first application until GS 31 can lead to yield reductions compared with earlier application.
    • Effect of a bacteriophage cocktail in combination with modified atmosphere packaging in controlling Listeria monocytogenes on fresh-cut spinach

      Boyacioglu, O.; Sulakvelidza, A.; Sharma, M.; Goktepe, I. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-14)
      A Listeria monocytogenes-specific bacteriophage cocktail was evaluated for its activity against a nalidixic acid-resistant L. monocytogenes (Lm-NalR) isolate on fresh-cut spinach stored under modified atmosphere packaging at various temperatures. Pieces (~2 cm × 2 cm) of fresh spinach inoculated with 4.5 log CFU/cm2 Lm-NalR were sprayed with the phage cocktail (6.5 log plaque-forming units [PFU]/cm2) or a control. The samples were stored at 4°C or 10°C for up to 14 d in sealed packages filled with either atmospheric air (AA) or modified atmosphere (MA). At 4°C under AA, the phages significantly (P ≤ 0.05) lowered the Lm-NalR populations on spinach, compared to control-treated inoculated samples, by 1.12 and 1.51 log CFU/cm2 after 1 and 14 d, respectively. At 4°C under MA, Lm-NalR was significantly reduced by 1.95 log CFU/cm2 compared to control leaves after both 1 and 14 d. At 10°C under AA, the phages significantly reduced Lm-NalR by 1.50 and 2.51 log CFU/cm2 after 1 and 14 d compared to the control. At 10°C under MA, the phages significantly reduced Lm-NalR by 1.71 and 3.24 log CFU/cm2 compared to the control after 1 and 14 d, respectively. The results support the potential of lytic bacteriophages in effectively reducing populations of L. monocytogenes on fresh-cut leafy produce, under both AA and MA conditions.
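The reported reductions are on a log10 (CFU) scale; a small sketch shows how a log reduction translates into surviving counts. For arithmetic illustration it applies the reduction to the initial inoculum, whereas the study's reductions are measured relative to control samples at the same time point:

```python
def surviving_cfu(initial_log, log_reduction):
    """CFU/cm^2 remaining after a given log10 reduction from an
    initial population expressed in log10 CFU/cm^2."""
    return 10 ** (initial_log - log_reduction)

# Initial inoculum from the abstract: 4.5 log CFU/cm^2 (~31,600 CFU/cm^2).
# The largest reported reduction (10 degC, MA, day 14) was 3.24 log.
remaining = surviving_cfu(4.5, 3.24)
print(round(remaining))  # roughly 18 CFU/cm^2 left under this toy reading
```

The exponential scale is the point: a 3.24-log reduction removes about 99.94% of the population, while a 1-log reduction removes only 90%.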
    • Trends, over 14 years, in the ground cover on an unimproved western hill grazed by sheep, and associated trends in animal performance

      Walsh, M.; Hanrahan, J.P.; O'Malley, L.; Moles, R. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-14)
      The frequency of individual plant species at ground level and the species composition of the unimproved vegetation on a western hill farm, stocked with Scottish Blackface sheep, were monitored from 1995 to 2008. Performance criteria of the flock that relied totally, or almost totally, on this vegetation for sustenance from 1994 to 2011 were evaluated. The frequency of vegetation increased over time (from 65% to 82% of the surface area; P < 0.05), with a corresponding decline in the frequency of bare soil, thus reducing vulnerability to soil erosion. This increased incidence of vegetation cover reflected increases in ‘other forbs’ (P < 0.01), heather (P < 0.05) and grass (P < 0.08). A significant change (P < 0.05) also occurred in the species composition of the vegetation, reflecting an increase in the proportions of ‘other forbs’ (P < 0.05) and heather (P = 0.14), and a decline in the proportion of sedges (P = 0.14). A similar pattern occurred in the two main habitats: blanket bog and wet heath. Annual stocking rate (ewes per hectare, based on actual ewe grazing days) on the unimproved hill grazing averaged 0.9 (0.13 livestock units) per hectare prior to 1999 and 0.78 (0.11 livestock units) per hectare subsequently. There was no trend in weight gain of replacement females while confined to the unimproved hill area between weaning (14 weeks old) and first joining at 18 months of age. A negative trend (P < 0.01) occurred in the pre-weaning growth rate of lambs on the hill. The average number of lambs reared per ewe joined (reflecting fertility, litter size and ewe/lamb mortality) was 1.0, and this showed no evidence of change over time. The study flock performed 10% to > 60% better, depending on the variable, than similar flocks in the National Farm Survey at comparable stocking rates.
A well-defined rational management system can sustain a productive sheep enterprise on unimproved hill land without negative consequences for the frequency or composition of the vegetation.