
    • Electronic feeding behavioural data as indicators of health status in dairy calves

      Johnston, D.; Kenny, David A.; McGee, Mark; Waters, Sinead; Kelly, A.K.; Earley, Bernadette (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-30)
      The objectives of this study were (i) to characterise clinical health in dairy calves on an Irish research farm during the artificial calf-rearing period and (ii) to determine whether calves’ pre-weaning intakes and feeding behaviour, recorded by electronic calf feeders, change in response to incidents of bovine respiratory disease (BRD). Holstein-Friesian (H-F) and Jersey (J) calves were fed by automatic milk replacer (MR) and concentrate feeders. Feeding behaviour, including MR consumption, drinking speed, number of rewarded and unrewarded visits to the feeder as well as concentrate consumption, was recorded by the feeders. A modified version of the Wisconsin calf health scoring criteria chart was used to score calves’ clinical measurements and identify incidences of BRD. In total, 40% of calves were found to have at least one incident of BRD. Feeding behaviour was altered during incidents of BRD. The number of unrewarded visits to the feeder was reduced, by approximately four visits, for calves with BRD during the 3 d prior to the identification of BRD (P < 0.05) and tended to be reduced during the 7 d following the identification of BRD (P = 0.05), compared with healthy calves. Additionally, calves with BRD had a tendency for reduced net energy intake (approximately 8%) during the 3 d prior to the identification of BRD, compared with healthy calves. Therefore, calf feeding behavioural data, recorded by electronic feeders during the pre-weaning period, can indicate cases of BRD.
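
      The reported signal (a drop of roughly four unrewarded feeder visits in the 3 d before diagnosis) lends itself to a simple screening rule. The sketch below is a hypothetical illustration of such a rule, not the authors' method; the visit counts, baseline window and 4-visit threshold are all assumed.

      ```python
      # Hypothetical screening rule: flag a calf when its unrewarded feeder visits
      # over the most recent days fall well below its own earlier baseline.
      # Thresholds and data are illustrative assumptions, not taken from the study.

      def flag_possible_brd(daily_unrewarded_visits, baseline_days=7, recent_days=3, drop_threshold=4.0):
          """daily_unrewarded_visits: chronological list of daily counts for one calf."""
          if len(daily_unrewarded_visits) < baseline_days + recent_days:
              return False  # not enough history to compare against
          baseline = daily_unrewarded_visits[-(baseline_days + recent_days):-recent_days]
          recent = daily_unrewarded_visits[-recent_days:]
          baseline_mean = sum(baseline) / len(baseline)
          recent_mean = sum(recent) / len(recent)
          return (baseline_mean - recent_mean) >= drop_threshold

      # Example: a calf whose unrewarded visits collapse from ~8/day to ~3/day
      history = [8, 7, 9, 8, 8, 7, 8, 3, 2, 4]
      print(flag_possible_brd(history))  # True
      ```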
    • A field-based comparison of ammonia emissions from six Irish soil types following urea fertiliser application

      Burchill, William; Lanigan, Gary; Forrestal, Patrick J.; Reville, F.; Misselbrook, T.; Richards, Karl G. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-30)
      Ammonia (NH3) emissions from a range of soil types have been found to differ under laboratory conditions. However, there is a lack of studies comparing NH3 emissions from different soil types under field conditions. The objective was to compare NH3 emissions from six different soil types under similar environmental conditions in the field following urea fertiliser application. The study was conducted on a lysimeter unit and NH3 emissions were measured, using wind tunnels, from six different soil types with varying soil characteristics following urea fertiliser application (80 kg N/ha). On average, 17.6% (% total N applied) was volatilised, and there was no significant difference in NH3 emissions across all soil types. Soil variables, including pH, cation exchange capacity and volumetric moisture, were not able to account for the variation in emissions. Further field studies are required to improve the urea-NH3 emission factor used for Ireland’s NH3 inventory.
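
      The headline result translates directly into a per-hectare loss. A minimal worked calculation, using only the figures quoted above (80 kg N/ha applied, 17.6% of applied N volatilised):

      ```python
      # Worked example using only the figures quoted in the abstract.
      n_applied = 80.0         # kg N/ha of urea fertiliser N applied
      emission_factor = 0.176  # fraction of applied N volatilised as NH3-N (17.6%)

      nh3_n_loss = n_applied * emission_factor
      print(f"NH3-N volatilised: {nh3_n_loss:.1f} kg N/ha")  # ~14.1 kg N/ha
      ```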
    • Detection of presumptive Bacillus cereus in the Irish dairy farm environment

      O'Connell, A.; Lawton, E.M.; Leong, Dara; Cotter, Paul D.; Gleeson, David; Guinane, Catriona M. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-01-30)
      The objective of the study was to isolate potential Bacillus cereus sensu lato (B. cereus s.l.) from a range of farm environments. Samples of tap water, milking equipment rinse water, milk sediment filter, grass, soil and bulk tank milk were collected from 63 farms. In addition, milk liners were swabbed at the start and the end of milking, and swabs were taken from cows’ teats prior to milking. The samples were plated on mannitol egg yolk polymyxin agar (MYP) and presumptive B. cereus s.l. colonies were isolated and stored in nutrient broth with 20% glycerol and frozen at -80 °C. These isolates were then plated on chromogenic medium (BACARA) and colonies identified as presumptive B. cereus s.l. on this medium were subjected to 16S ribosomal RNA (rRNA) sequencing. Of the 507 isolates presumed to be B. cereus s.l. on the basis of growth on MYP, only 177 showed growth typical of B. cereus s.l. on BACARA agar. The use of 16S rRNA sequencing to identify isolates that grew on BACARA confirmed that the majority of isolates belonged to B. cereus s.l. A total of 81 of the 98 isolates sequenced were tentatively identified as presumptive B. cereus s.l. Pulsed-field gel electrophoresis was carried out on milk and soil isolates from seven farms that were identified as having presumptive B. cereus s.l. No pulsotype was shared by isolates from soil and milk on the same farm. Presumptive B. cereus s.l. was widely distributed within the dairy farm environment.
    • Developing farm-level sustainability indicators for Ireland using the Teagasc National Farm Survey

      Ryan, Mary; Hennessy, Thia; Buckley, Cathal; Dillon, Emma J.; Donnellan, Trevor; Hanrahan, Kevin; Moran, Brian (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-30)
      In the context of an expanding, export-dependent agri-food sector, indicators of sustainable development and intensification are necessary to measure, assess and verify the comparative advantage afforded by Ireland’s natural pastoral-based food production systems. Such indicators are also necessary to ensure that we produce more food with fewer adverse impacts on the Irish environment, climate and society. This article outlines the development of farm-level indicators that reflect the multifaceted nature of sustainability, which is encompassed in economic, environmental and social indicators. The role of innovation in farm sustainability was also examined. A comparison of indicators across Irish farm systems showed that dairy farms, followed by tillage farms, tended to be the most economically and socially sustainable farm systems. In relation to greenhouse gas emissions in particular, the top-performing dairy farms, in an economic sense, also tended to be the best-performing farms from an environmental sustainability perspective. This trend was also evident in relation to the adoption of innovative practices on farm, which was found to be strongly correlated with economic performance.
    • Dairy product production and lactose demand in New Zealand and Ireland under different simulated milk product-processing portfolios

      Sneddon, N.W.; Lopez-Villalobos, N.; Hickson, R.E.; Davis, S.R.; Geary, U.; Garrick, D.J.; Shalloo, Laurence (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-30)
      Maximising dairy industry profitability involves maximising product returns for a specific set of costs or minimising costs for a certain level of output. A strategy currently utilised by the New Zealand dairy industry to optimise the value of exports is to incorporate imported lactose along with local milk to maximise the production of whole milk powder (WMP) while complying with the Codex Alimentarius (Codex) standards, in addition to increasing the exported product for every litre of milk. This study investigated the impact of different product portfolio strategies on lactose requirements for the Irish and New Zealand dairy industries for current and predicted 2020 milk output projections. A mass balance processing sector model that accounts for all inputs, outputs and losses involved in dairy processing was used to simulate the processing of milk into WMP, skim milk powder (SMP), cheese, butter and fluid milk of different proportions. All scenarios investigated projected an increase in production and revenue from 2012 to 2020. Higher cheese production reduced industry lactose demand through whey processing, while scenarios reliant on an increase in the proportion of WMP were associated with increased lactose deficits.
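
      The lactose demand arises from a protein-standardisation mass balance: Codex permits milk-powder protein to be standardised down to a minimum of 34% of milk solids-not-fat (MSNF), and adding lactose is one way to reach that floor. The sketch below shows only that balance, with illustrative composition figures; it is not the authors' full processing-sector model.

      ```python
      # Minimal sketch of why whole milk powder (WMP) production creates a lactose
      # demand: adding lactose dilutes protein in MSNF towards the Codex minimum
      # of 34%, stretching the milk pool into more powder.
      # Composition figures below are illustrative assumptions, not study data.

      def lactose_needed(protein_kg, msnf_kg, target_protein_in_msnf=0.34):
          """kg of lactose to add so that protein / (MSNF + lactose) equals the target."""
          required_msnf = protein_kg / target_protein_in_msnf
          return max(0.0, required_msnf - msnf_kg)

      # Example: 1,000 kg of milk at ~3.4% protein and ~8.8% MSNF (assumed values)
      protein = 1000 * 0.034   # 34 kg protein
      msnf = 1000 * 0.088      # 88 kg MSNF
      print(f"Lactose to add: {lactose_needed(protein, msnf):.1f} kg")  # ~12 kg
      ```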
    • Response of two-row and six-row barley to fertiliser N under Irish conditions

      Hackett, Richard (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-30)
      A range of cultivar types, including two-row and six-row types as well as line and hybrid types, are used for winter barley production in Ireland. There is little information available on the fertiliser nitrogen (N) requirements or the N use efficiency of these different types, particularly under Irish conditions. The objectives of the work presented here were to compare the response to fertiliser N of a two-row line cultivar, a six-row line cultivar and a six-row hybrid cultivar in terms of grain yield and aspects of N use efficiency. Experiments were carried out over three growing seasons, in the period 2012-2014, on a light-textured soil, comparing the response of the three cultivars of winter barley to fertiliser N application rates ranging from 0 to 260 kg N/ha. There was no evidence that cultivar type, regardless of whether it was a two-row or six-row line cultivar or a six-row hybrid cultivar, influenced the response to fertiliser N of winter barley. There were some indications that six-row cultivars were less efficient at recovering soil N but used accumulated N more efficiently than the two-row cultivar. This work provided no evidence to support adjustment of fertiliser N inputs to winter barley based on cultivar type.
    • The costs of seasonality and expansion in Ireland’s milk production and processing

      Heinschink, K.; Shalloo, Laurence; Wallace, M. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-30)
      Ireland’s milk production sector relies on grass-based spring-calving systems, which facilitate cost advantages in milk production but entail a high degree of supply seasonality. Among other implications, this supply seasonality involves extra costs in the processing sector, including elevated plant capacities and varying levels of resource utilisation throughout the year. If national raw milk production increased substantially (e.g. post-milk quota) and a high degree of seasonality persisted, extra processing capacities would be required to cope with peak supplies. Alternatively, existing capacities could be used more efficiently by distributing the milk volume more evenly during the year. In this analysis, an optimisation model was applied to analyse the costs and economies arising to an average Irish milk-processing business due to changes to the monthly distribution of milk deliveries and/or the total annual milk pool. Of the situations examined, changing from a seasonal supply prior to expansion to a smoother pattern combined with an increased milk pool emerged as the most beneficial option to the processor because both the processor’s gross surplus and the marginal producer milk price increased. In practice, it may, however, be the case that the extra costs arising to the producer from smoothing the milk intake distribution exceed the processor’s benefit. The interlinkages between the stages of the dairy supply chain mean that, nationally, the seasonality trade-offs are complex and equivocal. Moreover, the prospective financial implications of such strategies will be dependent on the evolving and uncertain nature of international dairy markets in the post-quota environment.
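
      Much of the processing-cost penalty from seasonality comes from sizing plant for the peak month and leaving it under-used in the trough. The sketch below illustrates that idea only; the monthly delivery profiles are invented and the study's optimisation model is not reproduced.

      ```python
      # Illustration of the capacity-utilisation effect of supply seasonality.
      # Monthly delivery shares (% of annual supply) are invented examples.

      def capacity_utilisation(monthly_deliveries):
          """Plant is sized for the peak month; return average utilisation of that capacity."""
          peak = max(monthly_deliveries)
          return sum(monthly_deliveries) / (len(monthly_deliveries) * peak)

      seasonal = [2, 5, 10, 14, 15, 14, 12, 10, 8, 5, 3, 2]  # spring-calving peak
      smoothed = [7, 8, 9, 9, 10, 10, 9, 9, 8, 8, 7, 6]      # flatter hypothetical profile

      print(f"Seasonal profile utilisation: {capacity_utilisation(seasonal):.0%}")  # ~56%
      print(f"Smoothed profile utilisation: {capacity_utilisation(smoothed):.0%}")  # ~83%
      ```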
    • A methodological framework to determine optimum durations for the construction of soil water characteristic curves using centrifugation

      Vero, Sara E.; Healy, Mark G.; Henry, Tiernan; Creamer, Rachel E.; Ibrahim, Tristan G.; Forrestal, Patrick J.; Richards, Karl G.; Fenton, Owen T. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-30)
      During laboratory assessment of the soil water characteristic curve (SWCC), determining equilibrium at various pressures is challenging. This study establishes a methodological framework to identify the appropriate experimental duration at each pressure step for the construction of SWCCs via centrifugation. Three common temporal approaches to equilibrium – 24-, 48- and 72-h – are examined for a grassland and an arable soil. The framework highlights the differences in equilibrium duration between the two soils. For both soils, the 24-h treatment significantly overestimated saturation. For the arable site, no significant difference was observed between the 48- and 72-h treatments. Hence, a 48-h treatment was sufficient to determine ‘effective equilibrium’. For the grassland site, the 48- and 72-h treatments differed significantly. This highlights that a more prolonged duration is necessary for some soils to conclusively determine that effective equilibrium has been reached. This framework can be applied to other soils to determine the optimum centrifuge durations for SWCC construction.
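
      The core of the framework is a pairwise comparison of replicate water contents after successive spin durations: if they still differ, equilibrium had not been reached at the shorter duration. A minimal sketch of that comparison is given below, assuming replicate volumetric water contents and SciPy; it is not the published protocol.

      ```python
      # Minimal sketch of the equilibrium check between two spin durations.
      # Water-content values are invented for illustration, not study data.
      from scipy.stats import ttest_rel

      theta_48h = [0.305, 0.298, 0.310, 0.302, 0.300]  # volumetric water content after 48 h
      theta_72h = [0.296, 0.290, 0.301, 0.295, 0.292]  # same cores after 72 h

      t_stat, p_value = ttest_rel(theta_48h, theta_72h)
      if p_value < 0.05:
          print(f"48 h and 72 h differ (P = {p_value:.3f}): extend the spin duration.")
      else:
          print(f"No significant difference (P = {p_value:.3f}): treat 48 h as effective equilibrium.")
      ```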
    • Comparison of methods for the identification and sub-typing of O157 and non-O157 Escherichia coli serotypes and their integration into a polyphasic taxonomy approach

      Prieto-Calvo, M.A.; Omer, M.K.; Alveseike, O.; Lopez, M.; Alvarez-Ordonez, A.; Prieto, M. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-30)
      Phenotypic, chemotaxonomic and genotypic data from 12 strains of Escherichia coli were collected, including carbon source utilisation profiles, ribotypes, sequencing data of the 16S–23S rRNA internal transcribed spacer (ITS) and Fourier transform-infrared (FT-IR) spectroscopic profiles. The objectives were to compare several identification systems for E. coli and to develop and test a polyphasic taxonomic approach using the four methodologies combined for the sub-typing of O157 and non-O157 E. coli. The nucleotide sequences of the 16S–23S rRNA ITS regions were amplified by polymerase chain reaction (PCR), sequenced and compared with reference data available at the GenBank database using the Basic Local Alignment Search Tool (BLAST). Additional information comprising the utilisation of carbon sources, riboprint profiles and FT-IR spectra was also collected. The capacity of the methods for the identification and typing of E. coli to species and subspecies levels was evaluated. Data were transformed and integrated to present polyphasic hierarchical clusters and relationships. The study reports the use of an integrated scheme comprising phenotypic, chemotaxonomic and genotypic information (carbon source profile, sequencing of the 16S–23S rRNA ITS, ribotyping and FT-IR spectroscopy) for a more precise characterisation and identification of E. coli. The results showed that identification of E. coli strains by each individual method was limited mainly by the extent and quality of reference databases. In contrast, the polyphasic approach, whereby heterogeneous taxonomic data were combined and weighted, improved the identification results, gave more consistency to the final clustering and provided additional information on the taxonomic structure and phenotypic behaviour of strains, as shown by the close clustering of strains with similar stress resistance patterns.
    • Effect of a bacteriophage cocktail in combination with modified atmosphere packaging in controlling Listeria monocytogenes on fresh-cut spinach

      Boyacioglu, O.; Sulakvelidze, A.; Sharma, M.; Goktepe, I. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-14)
      A Listeria monocytogenes-specific bacteriophage cocktail was evaluated for its activity against a nalidixic acid-resistant L. monocytogenes (Lm-NalR) isolate on fresh-cut spinach stored under modified atmosphere packaging at various temperatures. Pieces (~2 × 2 cm2) of fresh spinach inoculated with 4.5 log CFU/cm2 Lm-NalR were sprayed with the phage cocktail (6.5 log plaque-forming units [PFU]/cm2) or a control. The samples were stored at 4°C or 10°C for up to 14 d in sealed packages filled with either atmospheric air (AA) or modified atmosphere (MA). At 4°C under AA, the phages significantly (P ≤ 0.05) lowered the Lm-NalR populations on spinach, compared to control-treated inoculated samples, by 1.12 and 1.51 log CFU/cm2 after 1 and 14 d, respectively. At 4°C under MA, Lm-NalR was significantly reduced by 1.95 log CFU/cm2 compared to control leaves after both 1 and 14 d. At 10°C under AA, the phages significantly reduced Lm-NalR by 1.50 and 2.51 log CFU/cm2 after 1 and 14 d compared to the control. Similarly, at 10°C under MA, the phages significantly reduced Lm-NalR by 1.71 and 3.24 log CFU/cm2 compared to control after 1 and 14 d, respectively. The results support the potential of lytic bacteriophages in effectively reducing populations of L. monocytogenes on fresh-cut leafy produce, under both AA and MA conditions.
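
      The reductions quoted above are differences in log10 counts per unit area between control and phage-treated samples. A short worked sketch of that arithmetic, with invented counts:

      ```python
      # Log reduction = log10(control CFU) - log10(treated CFU), per cm2 of leaf.
      # Counts below are invented to illustrate the arithmetic, not study data.
      import math

      def log_reduction(control_cfu_per_cm2, treated_cfu_per_cm2):
          return math.log10(control_cfu_per_cm2) - math.log10(treated_cfu_per_cm2)

      control = 3.2e4  # CFU/cm2 on control-treated spinach
      treated = 1.0e3  # CFU/cm2 on phage-treated spinach
      print(f"Reduction: {log_reduction(control, treated):.2f} log CFU/cm2")  # ~1.51
      ```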
    • Effect of nitrogen fertilizer application timing on nitrogen use efficiency and grain yield of winter wheat in Ireland.

      Efretuei, A.; Gooding, M.; White, E.; Spink, John; Hackett, Richard (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-01)
      The objectives of this work were to determine the effects of initiating application of fertilizer nitrogen (N) to winter wheat at different growth stages (GSs) on grain yield and N use efficiency (NUE). A factorial experiment was carried out in two growing seasons (2011 and 2012) with five timings of first N application (GS 24/26 [tillering], GS 30, GS 31, GS 32 or GS 37) and an unfertilized control, two sowing densities (100 and 400 seeds/m2) and a cattle slurry treatment (with or without slurry). The latter was included to simulate variation in soil N supply (SNS). Delaying the first application of N from the tillering stage until GS 30 had no significant effect on grain yield in either year. Further delaying the initial N application until GS 31 caused a significant yield reduction in 2011, in comparison to GS 30 application, but not in 2012. Differences in efficiency of recovery and use of fertilizer N by the crop among the first three application timings were small. There was no evidence to support alteration in the timing of the first application of N in response to low plant density. Slurry application did not influence SNS, so the interaction between SNS and fertilizer N application timing could not be determined. It is concluded that, in order to maximise yield and NUE, the first application of N to winter wheat should be made between late tillering and GS 30, and that delaying it until GS 31 can lead to yield reductions compared to the yield obtained with earlier application.
    • Trends, over 14 years, in the ground cover on an unimproved western hill grazed by sheep, and associated trends in animal performance

      Walsh, M.; Hanrahan, J.P.; O'Malley, L; Moles, R. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-12-14)
      The frequency of individual plant species at ground level and the species composition of the unimproved vegetation on a western hill farm, stocked with Scottish Blackface sheep, were monitored from 1995 to 2008. Performance criteria of the flock that relied totally, or almost totally, on this vegetation for sustenance from 1994 to 2011 were evaluated. The frequency of vegetation increased over time (from 65% to 82% of the surface area; P < 0.05), with a corresponding decline in the frequency of bare soil, thus reducing vulnerability to soil erosion. This increased incidence of vegetation cover reflected increases in ‘other forbs’ (P < 0.01), heather (P < 0.05) and grass (P < 0.08). A significant change (P < 0.05) also occurred in the species composition of the vegetation, reflecting an increase in the proportions of ‘other forbs’ (P < 0.05) and heather (P = 0.14), and a decline in the proportion of sedges (P = 0.14). A similar pattern occurred in the two main habitats: blanket bog and wet heath. Annual stocking rate (ewes per hectare, based on actual ewe grazing days) on the unimproved hill grazing averaged 0.9 (0.13 livestock units) per hectare prior to 1999 and 0.78 (0.11 livestock units) per hectare subsequently. There was no trend in weight gain of replacement females while confined to the unimproved hill area between weaning (14 weeks old) and first joining at 18 months of age. A negative trend (P < 0.01) occurred in the pre-weaning growth rate of lambs on the hill. The average number of lambs reared per ewe joined (reflecting fertility, litter size and ewe/lamb mortality) was 1.0, and this showed no evidence of change over time. The study flock performed 10% to > 60% better, depending on the variable, than similar flocks in the National Farm Survey at comparable stocking rates. A well-defined rational management system can sustain a productive sheep enterprise on unimproved hill land without negative consequences for the frequency or composition of the vegetation.
    • Impact of slurry application method on phosphorus loss in runoff from grassland soils during periods of high soil moisture content

      McConnell, D.A.; Doody, D.G.; Elliott, C.T.; Matthews, D.I.; Ferris, C.P. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-08-27)
      Previous studies have reported that the trailing-shoe application technique reduces phosphorus (P) in runoff following slurry application when compared to the traditional splash-plate application technique. However, the effectiveness of the trailing-shoe technique as a means of reducing P losses has not been evaluated when slurry is applied during periods of high soil moisture levels and lower herbage covers. To address this issue, a 3 × 4 factorial split-plot experiment was conducted, comprising three slurry treatments (control (no slurry), splash-plate and trailing-shoe) and four slurry application dates (7 December, 18 January, 1 March and 10 April). Dairy cow slurry was applied at a rate of 20 m3/ha, while simulated runoff was generated 2, 9 and 16 days later and analysed for a range of P fractions. Dissolved reactive P concentrations in runoff at day two were 41% lower when slurry was applied using the trailing-shoe technique, compared to the splash-plate technique (P < 0.05). In addition, P concentrations in runoff were higher (P < 0.05) from slurry applied in December and March compared to slurry applied in January or April, coinciding with periods of higher soil moisture contents. While the latter highlights that ‘calendar’-based non-spreading periods might not always achieve the desired consequences, the study demonstrated that further field-scale investigations into the trailing shoe as a mitigation measure to reduce P loss from agricultural soils are warranted.
    • Visual drainage assessment: A standardised visual soil assessment method for use in land drainage design in Ireland

      Tuohy, P.; Humphreys, James; Holden, N.M.; O'Loughlin, James; Reidy, B.; Fenton, Owen T. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-08-20)
      The implementation of site-specific land drainage system designs is usually disregarded by landowners in favour of locally established ‘standard practice’ land drainage designs. This is due to a number of factors such as a limited understanding of soil–water interactions, a lack of facilities for the measurement of soil physical or hydrological parameters, and perceived time wastage and high costs. Hence, there is a need for a site-specific drainage system design methodology that does not rely on inaccessible, time-consuming and/or expensive measurements of soil physical or hydrological properties. This requires a standardised process for deciphering the drainage characteristics of a given soil in the field. As an initial step, a new visual soil assessment method, referred to as visual drainage assessment (VDA), is presented whereby an approximation of the permeability of specific soil horizons is made using seven indicators (water seepage, pan layers, texture, porosity, consistence, stone content and root development) to provide a basis for the design of a site-specific drainage system. Across six poorly drained sites (1.3 ha to 2.6 ha in size) in south-west Ireland, a VDA-based design was compared with (i) an ideal design (utilising soil physical measurements to elucidate soil hydraulic parameters) and (ii) a standard design (0.8 m deep drains at a 15 m spacing) by modelled estimates of water-table control and rainfall recharge/drain discharge capacity. The VDA method, unlike standard design equivalents, provided a good approximation of an ideal (from measured hydrological properties) design and prescribed an equivalent land drainage system in the field. Mean modelled rainfall recharge/drain discharge capacities for the VDA (13.3 mm/day) and ideal (12.0 mm/day) designs were significantly higher (P < 0.001, s.e. 1.42 mm/day) than for the standard designs (0.5 mm/day), when assuming a design minimum water-table depth of 0.45 m.
    • Additive genetic, non-additive genetic and permanent environmental effects for female reproductive performance in seasonal calving dairy females

      Kelleher, M.M.; Buckley, Frank; Evans, R.D.; Berry, Donagh P. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-09-08)
      Excellent reproductive performance (i.e. 365-day calving interval) is paramount to herd profit in seasonal-calving dairy systems. Reproductive targets are currently not being achieved in Irish dairy herds. Furthermore, most research on the genetics of reproductive performance in dairy cattle has focused primarily on lactating cows and relatively few studies have attempted to quantify the genetic contribution to differences in reproductive performance in nulliparae. The objective of the present study was to estimate the contribution of both the additive and non-additive genetic components, as well as the permanent environmental component, to phenotypic variation in the reproductive traits in nulliparous, primiparous and multiparous seasonal-calving dairy females. Reproductive phenotypes were available on up to 202,525 dairy females. Variance components were estimated using linear animal mixed models (repeatability models where appropriate); fixed effects included in the mixed models were contemporary group, parity (where appropriate), breed proportion, inter-breed specific heterosis coefficients and inter-breed specific recombination loss coefficients. Heritability of the reproductive traits ranged from 0.004 (pregnancy rate to first service) to 0.17 (age at first service in nulliparae), while repeatability estimates for the reproductive traits in cows ranged from 0.01 (calving interval) to 0.11 (pregnant in the first 42 days of the breeding season). Breed-specific heterosis regression coefficients suggest that, relative to the parental mean, a first-cross Holstein–Jersey crossbred was almost 7 days younger at first calving, had a 9-day shorter calving interval, a 6 percentage unit greater pregnancy rate in the first 42 days of the breeding season and a 3 percentage unit greater survival rate to next lactation. Heifer calving rate traits were strongly genetically correlated with age at first calving (–0.97 to –0.66) and calving rate in the first 42 days of the calving season for first parity cows (0.77 to 0.56), but genetic correlations with other cow reproductive traits were weak and inconsistent. Calving interval was strongly genetically correlated with the majority of the cow traits; 56%, 40%, and 92% of the genetic variation in calving interval was explained by calving to the first service interval, number of services and pregnant in the first 42 days of the breeding season, respectively. Permanent environmental correlations between the reproductive performance traits were generally moderate to strong. The existence of contributions from non-additive genetic and permanent environmental effects to phenotypic differences among cows suggests the usefulness of such information to rank cows on future expected performance; this was evidenced by a stronger correlation with future reproductive performance for an individual cow index that combined additive genetic, non-additive genetic and permanent environmental effects compared to an index based solely on additive genetic effects (i.e. estimated breeding values).
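
      The heritability and repeatability figures reported above are ratios of estimated variance components. The sketch below shows only those ratios, with invented component values; fitting the linear animal mixed models themselves is not shown.

      ```python
      # Heritability and repeatability as ratios of variance components from a
      # repeatability animal model. Component values are invented for illustration.

      def heritability(var_additive, var_perm_env, var_nonadd, var_residual):
          total = var_additive + var_perm_env + var_nonadd + var_residual
          return var_additive / total

      def repeatability(var_additive, var_perm_env, var_nonadd, var_residual):
          total = var_additive + var_perm_env + var_nonadd + var_residual
          return (var_additive + var_perm_env + var_nonadd) / total

      # Example components for a lowly heritable, modestly repeatable trait
      va, vpe, vna, ve = 40.0, 110.0, 50.0, 1800.0
      print(f"h2 = {heritability(va, vpe, vna, ve):.3f}")              # 0.020
      print(f"repeatability = {repeatability(va, vpe, vna, ve):.3f}")  # 0.100
      ```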
    • The interactive effects of fertiliser nitrogen with dung and urine on nitrous oxide emissions in grassland

      Hyde, B.P.; Forrestal, Patrick J.; Jahangir, M.M.R.; Ryan, M.; Fanning, A.F.; Carton, Owen T.; Lanigan, Gary; Richards, Karl G. (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-09-08)
      Nitrous oxide (N2O) is an important and potent greenhouse gas (GHG). Although application of nitrogen (N) fertiliser is a feature of many grazing systems, limited data is available on N2O emissions in grassland as a result of the interaction between urine, dung and fertiliser N. A small plot study was conducted to identify the individual and interactive effects of calcium ammonium nitrate (CAN) fertiliser, dung and urine. Application of CAN with dung and urine significantly increased the mass of N2O-N emission. Importantly, the sum of N2O-N emitted from dung and CAN applied individually approximated the emission from dung and CAN fertiliser applied together, that is, an additive effect. However, in the case of urine and CAN applied together, the emission was more than double the sum of the emission from urine and CAN fertiliser applied individually, that is, a multiplicative effect. Nitrous oxide emissions from dung, urine and fertiliser N are typically derived individually and these individual emission estimates are aggregated to produce estimates of N2O emission. The presented findings have important implications for how individual emission factors are aggregated; they suggest that the multiplicative effect of the addition of CAN fertiliser to urine patches needs to be taken into account to refine the estimation of N2O emissions from grazing grasslands.
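
      The practical point lies in how patch-level emissions are aggregated. The sketch below contrasts the additive assumption with the multiplicative interaction reported for urine plus CAN; all emission figures are invented placeholders, not the study's measurements.

      ```python
      # Additive vs multiplicative aggregation of N2O-N emissions from
      # excreta and fertiliser patches. Values are invented placeholders.

      e_urine = 1.0  # emission from urine applied alone (e.g. g N2O-N per plot)
      e_dung = 0.6   # emission from dung applied alone
      e_can = 0.8    # emission from CAN fertiliser applied alone

      # Additive assumption (held for dung + CAN in the study):
      dung_can_additive = e_dung + e_can

      # Multiplicative interaction (urine + CAN emitted more than double the sum):
      interaction_factor = 2.2  # assumed illustrative multiplier
      urine_can_with_interaction = interaction_factor * (e_urine + e_can)

      print(f"Dung + CAN, additive estimate:    {dung_can_additive:.2f}")
      print(f"Urine + CAN, if wrongly additive: {e_urine + e_can:.2f}")
      print(f"Urine + CAN, with interaction:    {urine_can_with_interaction:.2f}")
      ```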
    • A note on the Hybrid Soil Moisture Deficit Model v2.0

      Schulte, Rogier P. O.; Simo, Iolando; Creamer, Rachel E.; Holden, Nicholas M. (Teagasc (Agriculture and Food Development Authority), Ireland, 2015-12-30)
      The Hybrid Soil Moisture Deficit (HSMD) model has been used for a wide range of applications, including modelling of grassland productivity and utilisation, assessment of agricultural management opportunities such as slurry spreading, predicting nutrient emissions to the environment and risks of pathogen transfer to water. In the decade since its publication, various ad hoc modifications have been developed and the recent publication of the Irish Soil Information System has facilitated improved assessment of the spatial soil moisture dynamics. In this short note, we formally present a new version of the model (HSMD2.0), which includes two new soil drainage classes, as well as an optional module to account for the topographic wetness index at any location. In addition, we present a new Indicative Soil Drainage Map for Ireland, based on the Irish Soil Classification system, developed as part of the Irish Soil Information System.
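
      For orientation, the daily bookkeeping underlying soil moisture deficit models of this family is simple: yesterday's deficit, minus rainfall, plus evapotranspiration, bounded by drainage-class limits. The sketch below shows only that generic structure with assumed bounds; it is not the published HSMD2.0 equations or drainage classes.

      ```python
      # Generic daily soil-moisture-deficit (SMD) bookkeeping, for orientation only.
      # The bounds are assumed placeholders; HSMD2.0's actual equations and
      # drainage classes are defined in the paper, not here.

      def step_smd(smd_prev, rain_mm, et_mm, smd_min=-10.0, smd_max=110.0):
          """One daily update: the deficit shrinks with rain and grows with evapotranspiration."""
          smd = smd_prev - rain_mm + et_mm
          return min(max(smd, smd_min), smd_max)  # clamp between saturation and maximum deficit

      # Example week of alternating wet and dry days (values in mm, invented)
      rain = [12.0, 0.0, 0.0, 5.0, 0.0, 0.0, 20.0]
      et = [1.0, 2.5, 3.0, 2.0, 3.5, 3.0, 1.0]

      smd = 0.0
      for r, e in zip(rain, et):
          smd = step_smd(smd, r, e)
          print(f"rain={r:5.1f} mm  et={e:3.1f} mm  SMD={smd:6.1f} mm")
      ```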
    • A note on challenge trials to determine the growth of Listeria monocytogenes on mushrooms (Agaricus bisporus)

      Leong, Dara; Alvarez-Ordonez, Avelino; Jordan, Kieran (Teagasc (Agriculture and Food Development Authority), Ireland, 2015-12-30)
      In the EU, food is considered safe with regard to Listeria monocytogenes if the number of micro-organisms does not exceed 100 colony forming units (cfu)/g throughout its shelf-life. Therefore, it is important to determine if a food supports growth of L. monocytogenes. Guidelines for conducting challenge tests for growth assessment of L. monocytogenes on foods were published by the European Union Reference Laboratory (EURL) in 2014. The aim of this study was to use these guidelines to determine if refrigerated, fresh, whole, closed-cap, prepackaged mushrooms (Agaricus bisporus) support the growth of L. monocytogenes. Three batches of mushrooms were artificially inoculated at approximately 100 cfu/g with a three-strain mix of L. monocytogenes and incubated for 2 days at 8°C followed by 4 days at 12°C. L. monocytogenes numbers were determined (in triplicate for each batch) on days 0, 2 and 6. Water activity, pH and total bacterial counts were also determined. There was no increase in the number of L. monocytogenes above the threshold of 0.5 log cfu/g in any of the replicates. In 8 of 9 replicates, the numbers decreased, indicating that A. bisporus does not support the growth of L. monocytogenes. As the EU regulations allow < 100 cfu/g if the food cannot support growth of L. monocytogenes, the significance of this study is that mushrooms with < 100 cfu/g may be within the regulations and, therefore, quantitative rather than qualitative determination may be required.
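
      The decision rule in the EURL guidance reduces to a 'growth potential': the largest increase in log10 counts observed across replicate batches, compared against the 0.5 log cfu/g threshold. A minimal sketch of that rule, with invented counts:

      ```python
      # Growth potential per the EURL challenge-test logic: the largest increase in
      # log10 counts across replicate batches, compared against 0.5 log cfu/g.
      # Counts below are invented for illustration, not the study's data.
      import math

      def growth_potential(day0_counts_cfu_g, end_counts_cfu_g):
          increases = [math.log10(end) - math.log10(start)
                       for start, end in zip(day0_counts_cfu_g, end_counts_cfu_g)]
          return max(increases)

      day0 = [95, 110, 105]  # cfu/g at inoculation, three batches
      day6 = [60, 130, 80]   # cfu/g at the end of incubation

      gp = growth_potential(day0, day6)
      print(f"Growth potential: {gp:.2f} log cfu/g; supports growth: {gp > 0.5}")
      ```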
    • The response of sward-dwelling arthropod communities to reduced grassland management intensity in pastures

      Helden, Alvin J.; Anderson, Annette; Finn, John; Purvis, Gordon (Teagasc (Agriculture and Food Development Authority), Ireland, 2015-12-30)
      We compared arthropod taxon richness, diversity and community structure of two replicated grassland husbandry experiments to investigate effects of reduced management intensity, as measured by nutrient input levels (390, 224 and 0 kg/ha per year N in one experiment, and 225 and 88 kg/ha per year N in another). Suction sampling was used to collect Araneae, Coleoptera, Hemiptera and Hymenoptera, with Araneae and Coleoptera also sampled with pitfall trapping. Univariate analyses found no significant differences in abundance and species density between treatments. However, with multivariate analysis, there were significant differences in arthropod community structure between treatments in both experiments. Reducing N input and associated stocking rates, as targeted by agri-environment schemes, can significantly alter arthropod communities but without increasing the number of species present. Other approaches that may be necessary to achieve substantial enhancement of sward arthropod biodiversity are suggested.
    • Distribution and incidence of viruses in Irish seed potato crops

      Hutton, Fiona; Spink, John H.; Griffin, Denis; Kildea, Stephen; Bonner, D.; Doherty, G.; Hunter, A. (Teagasc (Agriculture and Food Development Authority), Ireland, 2015-12-30)
      Virus diseases are of key importance in potato production and in particular for the production of disease-free potato seed. However, little is known about the frequency and distribution of potato virus diseases in Ireland. Despite a large number of samples being tested each year, the data have never been collated either within or across years. Information from all known potato virus testing carried out in the years 2006–2012 by the Department of Agriculture, Food and the Marine was collated to give an indication of the distribution and incidence of potato viruses in Ireland. It was found that there was significant variation between regions, varieties, years and seed classes. A definition of daily weather data suitable for aphid flight was developed, which accounted for a significant proportion of the variation in virus incidence between years. This use of weather data to predict virus risk could be developed to form the basis of an integrated pest management approach for aphid control in Irish potato crops.