• Validation and Improvement of the Beef Production Sub-index in Ireland for Beef Cattle

      Drennan, Michael J; McGee, Mark; Clarke, Anne Marie; Kenny, David A.; Evans, R. D.; Berry, Donagh (Teagasc, 2009-12-01)
      The objectives of this study were to: (a) quantify the effect of sire genetic merit for the BCI on (1) feed intake, growth and carcass traits of progeny managed under bull or steer beef production systems, and (2) live animal scores, carcass composition and plasma hormone and metabolite concentrations in their progeny; and (b) compare the progeny of (1) late-maturing beef and dairy breeds and (2) Charolais (CH), Limousin (LM), Simmental (SM) and Belgian Blue (BB) sires bred to beef suckler dams, for feed intake, blood hormones and metabolites, live animal measurements, carcass traits and carcass value in bull and steer production systems.
    • Variance components for bovine tuberculosis infection and multi-breed genome-wide association analysis using imputed whole genome sequence data

      Ring, S. C.; Purfield, D. C.; Good, M.; Breslin, P.; Ryan, E.; Blom, A.; Evans, R. D.; Doherty, M. L.; Bradley, D. G.; Berry, Donagh; et al. (Public Library of Science (PLoS), 2019-02-14)
      Bovine tuberculosis (bTB) is an infectious disease of cattle generally caused by Mycobacterium bovis, a bacterium that can also elicit disease in humans. Since the 1950s, the objective of the national bTB eradication program in the Republic of Ireland has been the biological extinction of bTB; that goal has yet to be achieved. The objectives of the present study were to develop the statistical methodology and estimate the variance components needed to undertake routine genetic evaluations for resistance to bTB; also of interest was the detection of regions of the bovine genome putatively associated with bTB infection in dairy and beef breeds. The novelty of the present study, in terms of research on bTB infection, was the use of beef breeds in the genome-wide association analyses and the use of imputed whole genome sequence data. Phenotypic bTB data on 781,270 animals, together with imputed whole genome sequence data on 7,346 of these animals’ sires, were available. Linear mixed models were used to quantify variance components for bTB, and estimated breeding values (EBVs) were validated. Within-breed and multi-breed genome-wide associations were undertaken using a single-SNP regression approach. The estimated genetic standard deviation (0.09), heritability (0.12), and repeatability (0.30) substantiate that genetic selection could help to eradicate bTB. The multi-breed genome-wide association analysis identified 38 SNPs and 64 QTL regions associated with bTB infection; two QTL regions (both on BTA23) identified in the multi-breed analysis overlapped with the within-breed analyses of Charolais, Limousin, and Holstein-Friesian. Results from the association analysis, coupled with previous studies, suggest that bTB infection is controlled by a very large number of loci, each having a small effect. The methodology and results from the present study will be used to develop national genetic evaluations for bTB in the Republic of Ireland. In addition, the results can help uncover the biological architecture underlying resistance to bTB infection in cattle.
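The headline estimates above can be reproduced from the variance components of a standard repeatability animal model. The sketch below is illustrative rather than the authors' code: the component values are back-calculated from the reported genetic standard deviation (0.09), heritability (0.12) and repeatability (0.30).

```python
def h2_and_repeatability(var_a, var_pe, var_e):
    """Heritability and repeatability from the variance components of a
    repeatability model: additive genetic (var_a), permanent environmental
    (var_pe) and residual (var_e) variances."""
    var_p = var_a + var_pe + var_e          # phenotypic variance
    h2 = var_a / var_p                      # heritability
    rep = (var_a + var_pe) / var_p          # repeatability
    return h2, rep

# Components back-calculated from the abstract's reported estimates
var_a = 0.09 ** 2               # genetic SD 0.09 -> additive variance 0.0081
var_p = var_a / 0.12            # heritability 0.12 -> phenotypic variance
var_pe = 0.30 * var_p - var_a   # repeatability 0.30 -> permanent-env variance
var_e = var_p - var_a - var_pe  # residual variance

h2, rep = h2_and_repeatability(var_a, var_pe, var_e)
print(round(h2, 2), round(rep, 2))  # 0.12 0.3
```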
    • The variation in morphology of perennial ryegrass cultivars throughout the grazing season and effects on organic matter digestibility

      Beecher, Marion; Hennessy, Deirdre; Boland, T. M.; McEvoy, Mary; O'Donovan, Michael; Lewis, Eva (Wiley, 2013-09-19)
      The grass plant comprises leaf, pseudostem, true stem (including inflorescence) and dead material. These components differ in digestibility, and variations in their relative proportions can affect sward quality. The objective of this study was to determine the change in the proportion and organic matter digestibility (OMD) of the leaf, pseudostem, true stem and dead components of four perennial ryegrass cultivars (two tetraploids: Astonenergy and Bealey; two diploids: Abermagic and Spelga) throughout a grazing season. The DM proportions and in vitro OMD of leaf, pseudostem, true stem and dead material in all cultivars were determined during ten grazing rotations between May 2011 and March 2012. There was an interaction between rotation and cultivar for leaf, pseudostem, true stem and dead proportions. In May and June, Astonenergy had the highest leaf and lowest true stem proportion (P < 0·05). From July onwards, there was no difference in leaf or true stem proportion between cultivars. Bealey had the highest annual mean OMD (752 g kg−1) and Spelga the lowest (696 g kg−1; P < 0·05). The OMD followed the order leaf > pseudostem > true stem > dead. Bealey had the highest combined leaf and pseudostem proportion (0·92), which explains why it had the highest OMD. In this study, the tetraploid cultivars had the highest leaf and pseudostem proportions and OMD. For accurate descriptions of a sward in grazing studies, and to accurately determine sward morphological composition, pseudostem should be separated from true stem, particularly during the reproductive stage when true stem is present.
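The link between morphological composition and whole-sward digestibility follows from treating sward OMD, to a first approximation, as the DM-proportion-weighted mean of the component OMDs. A minimal sketch with hypothetical component values (not the study's data), honouring the reported ranking leaf > pseudostem > true stem > dead:

```python
def sward_omd(proportions, component_omd):
    """Whole-sward OMD (g/kg) as the DM-proportion-weighted mean of component OMDs."""
    assert abs(sum(proportions) - 1.0) < 1e-6, "DM proportions must sum to 1"
    return sum(p * o for p, o in zip(proportions, component_omd))

# Hypothetical values: leaf, pseudostem, true stem, dead
proportions = [0.70, 0.22, 0.05, 0.03]        # DM proportions
component_omd = [830.0, 760.0, 600.0, 500.0]  # g/kg, leaf highest, dead lowest
omd = sward_omd(proportions, component_omd)
```

With these assumed values, shifting DM from true stem and dead material into leaf and pseudostem raises whole-sward OMD, which is the mechanism the study invokes for Bealey's high digestibility.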
    • Variations in travel time for N loading to groundwaters in four case studies in Ireland: Implications for policy makers and regulators

      Fenton, Owen; Coxon, Catherine E.; Haria, Atul H.; Horan, Brendan; Humphreys, James; Johnston, Paul; Murphy, Paul N. C.; Necpalova, Magdalena; Premrov, Alina; Richards, Karl G. (School of Agriculture, Food Science and Veterinary Medicine, University College Dublin in association with Teagasc, 2009)
      Mitigation measures to protect waterbodies must be implemented by 2012 to meet the requirements of the EU Water Framework Directive. The efficacy of these measures will be assessed in 2015. Whilst diffuse N pathways between source and receptor are generally long and complex, EU legislation does not account for differences in hydrological travel time distributions that may result in different water quality response times. The “lag time” between introducing mitigation measures and the first improvements in water quality is likely to differ between catchments, a factor that should be considered by policy makers and catchment managers. Many examples of travel time variations have been quoted in the literature, but no Irish-specific examples are available. Lag times, based on initial nutrient breakthrough, were estimated at four contrasting sites for a receptor 500 m from the source. Vertical travel times were estimated using a combination of depth-of-infiltration calculations, based on effective rainfall and subsoil physical parameters, and existing hydrological tracer data. Horizontal travel times were estimated using a combination of Darcian linear velocity calculations and existing tracer migration data. Total travel times, assuming no biogeochemical processes, ranged from months to decades between the contrasting sites; the shortest times occurred under thin soil/subsoil on karst limestone and the longest through thick low-permeability soils/subsoils over poorly productive aquifers. Policy makers should consider hydrological lag times when assessing the efficacy of mitigation measures introduced under the Water Framework Directive. This lag time reflects complete flushing of a particular nutrient from source to receptor. Further research is required to assess the potential mitigation of nitrate through denitrification along the pathway from source to receptor.
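The horizontal component of such lag times follows from the Darcian linear (seepage) velocity, v = K·i/n_e, and travel time t = L/v. A minimal sketch with hypothetical parameter sets (the site values below are assumptions, not reproduced from the paper):

```python
def darcy_travel_time_years(distance_m, k_m_day, gradient, effective_porosity):
    """Travel time over distance_m at the Darcian linear velocity v = K*i/n_e,
    with hydraulic conductivity K in m/day, hydraulic gradient i (dimensionless)
    and effective porosity n_e (dimensionless)."""
    velocity_m_day = k_m_day * gradient / effective_porosity
    return distance_m / velocity_m_day / 365.25

# Hypothetical contrasts for a receptor 500 m from the source:
fast = darcy_travel_time_years(500, 10.0, 0.02, 0.02)  # e.g. thin subsoil over karst
slow = darcy_travel_time_years(500, 0.5, 0.01, 0.10)   # e.g. thick low-permeability till
```

These assumed inputs span months to decades, the same order as the range reported across the four sites.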
    • Visual drainage assessment: A standardised visual soil assessment method for use in land drainage design in Ireland

      Tuohy, Patrick; Humphreys, James; Holden, Nicholas M.; O'Loughlin, James; Reidy, Brian; Fenton, Owen (Teagasc (Agriculture and Food Development Authority), Ireland, 2016-08-20)
      The implementation of site-specific land drainage system designs is usually disregarded by landowners in favour of locally established ‘standard practice’ land drainage designs. This is due to a number of factors, such as a limited understanding of soil–water interactions, a lack of facilities for measuring soil physical or hydrological parameters, and perceived time wastage and high costs. Hence there is a need for a site-specific drainage system design methodology that does not rely on inaccessible, time-consuming and/or expensive measurements of soil physical or hydrological properties. This requires a standardised process for deciphering the drainage characteristics of a given soil in the field. As an initial step, a new visual soil assessment method, referred to as visual drainage assessment (VDA), is presented whereby an approximation of the permeability of specific soil horizons is made using seven indicators (water seepage, pan layers, texture, porosity, consistence, stone content and root development) to provide a basis for the design of a site-specific drainage system. Across six poorly drained sites (1.3 ha to 2.6 ha in size) in south-west Ireland, a VDA-based design was compared with (i) an ideal design (utilising soil physical measurements to elucidate soil hydraulic parameters) and (ii) a standard design (0.8 m deep drains at a 15 m spacing) by model estimates of water table control and rainfall recharge/drain discharge capacity. The VDA method, unlike its standard design equivalents, provided a good approximation of the ideal design (derived from measured hydrological properties) and prescribed an equivalent land drainage system in the field. Mean modelled rainfall recharge/drain discharge capacities for the VDA (13.3 mm/day) and ideal (12.0 mm/day) designs were significantly higher (P < 0.001, s.e. 1.42 mm/day) than for the standard designs (0.5 mm/day), when assuming a design minimum water table depth of 0.45 m.
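Drain discharge capacities of the kind compared above can be estimated with the steady-state Hooghoudt equation, a standard tool in drain spacing design; the abstract does not attribute a specific model, and the soil parameters below are assumed values, not the study's measurements:

```python
def hooghoudt_discharge_mm_day(k_m_day, spacing_m, head_m, d_equiv_m):
    """Steady-state Hooghoudt drain discharge for a homogeneous profile, mm/day:
    q = (8*K*d*h + 4*K*h^2) / L^2, with K in m/day and all lengths in m,
    where L is drain spacing, h the midpoint head above drain level and
    d the equivalent depth to the impermeable layer."""
    q_m_day = (8 * k_m_day * d_equiv_m * head_m
               + 4 * k_m_day * head_m ** 2) / spacing_m ** 2
    return q_m_day * 1000.0

# Standard design: 0.8 m deep drains at 15 m spacing; a design minimum water
# table depth of 0.45 m leaves a head of 0.8 - 0.45 = 0.35 m above drain level.
# K and equivalent depth d are assumed values for a poorly drained soil.
q = hooghoudt_discharge_mm_day(k_m_day=0.06, spacing_m=15.0,
                               head_m=0.35, d_equiv_m=0.5)
```

With these assumed inputs the standard design yields roughly 0.5 mm/day, the same order as the capacity reported for the standard designs; tighter spacing or higher conductivity raises the capacity quadratically and linearly, respectively.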