• Understanding and using somatic cell counts to improve milk quality

      Ruegg, P.L.; Pantoja, J.C.F. (Teagasc (Agriculture and Food Development Authority), Ireland, 2013)
      The production of high quality milk is a requirement to sustain a profitable dairy industry and somatic cell count (SCC) values are routinely used to identify subclinical mastitis and define quality standards. The objective of this paper is to review the use of SCC as a diagnostic tool for subclinical mastitis in order to improve milk quality on dairy farms. Mastitis is detected based on inflammation subsequent to intramammary infection (IMI) by pathogenic organisms. Individual cow SCC values are used to detect the inflammation that results from IMI and are necessary to define the prevalence and incidence of subclinical IMI. A threshold of <200,000 cells/mL is considered the most practical value for defining a mammary quarter as healthy. The development of IMI is the most significant factor that influences milk SCC, and assessment of monthly values to determine newly and chronically increased SCC can be highly diagnostic for resolving problems with increased bulk tank SCC. Methods to reduce the development of new IMI are well known, and adoption of best management practices for milking and herd management has consistently been shown to result in reductions in bulk tank SCC. Implementation of mastitis control programmes can be improved by focusing on three practical recommendations: 1) Farmers should work with their advisors to develop an annual udder health plan that includes clear goals for milk quality. 2) The annual udder health plan should emphasise prevention of new IMI. 3) Farmers must identify and manage chronically infected cows. Proactive management of IMI can be extremely effective in helping farmers produce milk that meets industry standards for milk quality.
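      As a rough illustration of how monthly test-day SCC records can be screened against the 200,000 cells/mL threshold to separate healthy cows from newly and chronically infected ones, the sketch below applies that rule to two consecutive monthly values per cow. The record layout, cow identifiers and classification labels are illustrative assumptions, not taken from the paper.

```python
# Sketch: classify cows as healthy, newly infected or chronically infected
# from two consecutive monthly somatic cell counts (SCC, cells/mL).
# The 200,000 cells/mL threshold is from the abstract; the data layout
# and labels are illustrative assumptions.
from typing import Optional

THRESHOLD = 200_000  # cells/mL; below this a quarter/cow is regarded as healthy

def classify_scc(previous_scc: Optional[int], current_scc: int) -> str:
    """Classify one cow from last month's and this month's SCC."""
    if current_scc < THRESHOLD:
        return "healthy"
    if previous_scc is None or previous_scc < THRESHOLD:
        return "new infection"      # SCC increased only this month
    return "chronic infection"      # SCC increased in both months

# Hypothetical test-day records: cow id -> (last month's SCC, this month's SCC)
herd = {
    "cow_101": (150_000, 180_000),
    "cow_102": (120_000, 450_000),
    "cow_103": (600_000, 550_000),
}

for cow, (prev, curr) in herd.items():
    print(cow, classify_scc(prev, curr))
```

      Counting the share of cows falling into the "new infection" and "chronic infection" groups each month gives the kind of newly/chronically increased SCC summary the abstract describes for diagnosing bulk tank SCC problems.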
    • Urine patch distribution under dairy grazing at three stocking rates in Ireland

      Dennis, S.J.; Moir, J.L.; Cameron, K.C.; Di, H.J.; Hennessy, D.; Richards, Karl G. (Teagasc (Agriculture and Food Development Authority), Ireland, 2011)
      Nitrate pollution of water is a serious global environmental issue. Grassland agriculture is a major source of diffuse nitrate pollution, with much of this nitrate originating from the urine patches of grazing animals. To study nitrate losses from grassland it is necessary to consider the areas of grassland that are affected by urine separately from the remainder of the pasture. Urine patches can be observed in the field as areas of vigorously growing pasture; however, the pasture may continue to respond for several months, making it difficult to determine when the observed patch was actually deposited. A global positioning system was used to record the location of all urine and dung patches in a pasture at every second grazing on an Irish dairy farm during the grazing season. Any patches reappearing were removed from the data, allowing the fresh urine patches to be identified. Dairy cows deposited 0.359 urine patches per grazing hour, a value that may be used to predict the distribution of urine patches under any grazing regime. This equated to 14.1 to 20.7% of the soil surface being wetted by urine annually at stocking rates of 2.0 to 2.94 cows per hectare, consistent with previous research. These values may be used in conjunction with values for nitrate loss from urine and non-urine areas to calculate nitrate losses from grazed pasture at a range of stocking rates.
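      To make the reported figures concrete, the short calculation below shows how the 0.359 patches per cow grazing hour rate translates into an annual fraction of the soil surface wetted by urine at a given stocking rate. The grazing hours per year and the effective area of a single urine patch are illustrative assumptions chosen so that the output falls near the 14.1 to 20.7% range quoted in the abstract; they are not values reported in the paper.

```python
# Sketch: annual % of the soil surface wetted by urine at a given stocking rate.
# The patch rate (0.359 patches per cow grazing hour) is from the abstract;
# grazing hours per year and mean patch area are illustrative assumptions.

PATCHES_PER_GRAZING_HOUR = 0.359   # from the abstract
GRAZING_HOURS_PER_YEAR = 2900.0    # assumed time at pasture per cow per year
PATCH_AREA_M2 = 0.68               # assumed effective area of one urine patch (m^2)
M2_PER_HA = 10_000.0

def urine_coverage_percent(stocking_rate_cows_per_ha: float) -> float:
    """Percentage of the soil surface wetted by urine in one year."""
    patches_per_ha = (stocking_rate_cows_per_ha
                      * GRAZING_HOURS_PER_YEAR
                      * PATCHES_PER_GRAZING_HOUR)
    return 100.0 * patches_per_ha * PATCH_AREA_M2 / M2_PER_HA

for rate in (2.0, 2.5, 2.94):
    print(f"{rate:.2f} cows/ha -> {urine_coverage_percent(rate):.1f}% of area wetted")
```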
    • The use of near infrared reflectance spectroscopy (NIRS) for prediction of the nutritive value of barley for growing pigs

      McCann, M.E.E.; McCracken, K.J.; Agnew, R.E. (Teagasc, Oak Park, Carlow, Ireland, 2006)
      There is a need in the feed industry for a rapid means of evaluating the nutritive value of feeds and feed ingredients. Chemical analysis provides only basic information and most of the laboratory techniques take too long for this information to be of use in feed formulation at the feed mill. Near infrared reflectance spectroscopy (NIRS) has been proposed as an alternative means of predicting nutritive value. In this study, NIRS was used to predict the digestible energy (DE) concentration, the in vitro ileal digestibility of crude protein (CP) and the in vitro total-tract digestibility of energy of locally produced barley. The calibration and validation statistics were developed using modified partial least squares (MPLS). Derivatisation and scatter correction procedures were carried out to reduce interference from external effects. The correlations between actual and predicted DE values, based on both calibration (R2 0.93) and validation (R2 0.69), were strong with corresponding low standard errors of calibration (SEC) and cross validation (SECV) (SEC 0.128, SECV 0.279). Strong correlations were also observed between predicted and actual in vitro digestibility values for both calibration and validation exercises. It was noted that validation weakened the correlations (R2 0.73 vs. 0.50 for in vitro ileal digestibility of CP and 0.80 vs. 0.68 for in vitro total tract digestibility of energy) and fractionally increased the standard errors (0.016 vs. 0.020 for in vitro ileal digestibility of CP and 0.018 vs. 0.024 for in vitro total tract digestibility of energy). The correlations obtained by cross validation of the lowest SECV equations were not significantly different to those obtained by the scatter correction treatments. The strong relationships and low standard errors obtained between the actual and predicted values indicate that NIRS may be of use in predicting the nutritive value of barley for growing pigs, although more research is required to include larger sample sets.
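      A minimal sketch of the calibration and cross-validation workflow described above is given below, using ordinary partial least squares regression from scikit-learn in place of the modified PLS (MPLS) algorithm used in the paper, and with simulated spectra and DE values. R2, SEC and SECV are computed here simply as squared correlations and root mean square errors, a simplification of the statistics reported in the study.

```python
# Sketch: NIRS-style calibration of digestible energy (DE) from spectra.
# Ordinary PLS regression (scikit-learn) stands in for the modified PLS
# (MPLS) used in the paper; spectra and DE values below are simulated.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 80, 200
spectra = rng.normal(size=(n_samples, n_wavelengths))   # simulated absorbance spectra
de = (13.0 + 0.5 * spectra[:, :5].sum(axis=1)
      + rng.normal(scale=0.2, size=n_samples))          # simulated DE values (MJ/kg)

pls = PLSRegression(n_components=5)
pls.fit(spectra, de)

# Calibration statistics (model applied to its own calibration set)
de_cal = pls.predict(spectra).ravel()
r2_cal = np.corrcoef(de, de_cal)[0, 1] ** 2
sec = np.sqrt(np.mean((de - de_cal) ** 2))

# Cross-validation statistics (10-fold)
de_cv = cross_val_predict(pls, spectra, de, cv=10).ravel()
r2_cv = np.corrcoef(de, de_cv)[0, 1] ** 2
secv = np.sqrt(np.mean((de - de_cv) ** 2))

print(f"calibration  R2 = {r2_cal:.2f}  SEC  = {sec:.3f}")
print(f"validation   R2 = {r2_cv:.2f}  SECV = {secv:.3f}")
```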
    • Variogram investigation of covariance shape within longitudinal data with possible use of a krigeage technique as an interpolation tool: Sheep growth data as an example

      Chalh, A.; El Gazzah, M. (Teagasc (Agriculture and Food Development Authority), Ireland, 2014)
      Most quantitative traits considered in livestock evolve over time and several continuous functions have been proposed to model this change. For individual records (longitudinal data), it is evident that measures taken at close dates are generally more related than those taken further apart in time. Since milk production involves several parities, the covariance structure within this trait has been analysed by time series methodology. However, the covariance structure within traits that are not repeated during life, such as those linked to growth, has not yet been formally modelled by considering time lags as is done in time series analysis. We propose an adaptation of the variogram concept to shape this structure, which makes it possible to krige missing data at any particular time. A new parameter, the halftime variogram, has been proposed to characterise the growing potential of a given population. The weight records of a Barbarine male lamb population were used to illustrate the methodology. The variogram covering the whole growth process in this population could be modelled by a logistic equation. To estimate the missing data from birth to 105 days of age, a simple linear interpolation was sufficient, since kriging on a linear model basis gives a relatively more accurate estimation than kriging on a logistic model basis. Nevertheless, when both known records around the missing data are distant, a krigeage on the basis of the logistic model provides a more accurate estimation.
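      The empirical variogram the authors adapt can be sketched as follows: for each time lag in the weighing schedule, half the mean squared difference between weights recorded that many days apart is computed across animals, and a missing weight can then be estimated from the neighbouring records. The weighing schedule and weights below are invented, and the interpolation shown is a plain linear one rather than the full krigeage described in the paper.

```python
# Sketch: empirical variogram of lamb weights over time lags, plus a simple
# linear interpolation of a missing record. The records and weighing schedule
# are invented; this is not the paper's krigeage procedure.
import numpy as np

# Hypothetical weights (kg) of 4 lambs weighed every 15 days from birth to 105 d
ages = np.array([0, 15, 30, 45, 60, 75, 90, 105])
weights = np.array([
    [4.1, 7.0, 10.2, 13.1, 15.8, 18.2, 20.1, 21.9],
    [3.8, 6.5,  9.4, 12.0, 14.6, 16.9, 18.8, 20.4],
    [4.5, 7.6, 11.0, 14.2, 17.1, 19.6, 21.7, 23.6],
    [4.0, 6.8,  9.9, 12.7, 15.3, 17.6, 19.5, 21.2],
])

def empirical_variogram(t, y):
    """Semivariance for each distinct time lag (assumes equally spaced weighings)."""
    lags, gammas = [], []
    for k in range(1, len(t)):
        diffs = y[:, k:] - y[:, :-k]              # all record pairs k steps apart
        lags.append(int(t[k] - t[0]))             # lag in days
        gammas.append(0.5 * np.mean(diffs ** 2))  # semivariance at this lag
    return np.array(lags), np.array(gammas)

lags, gammas = empirical_variogram(ages, weights)
print(dict(zip(lags.tolist(), np.round(gammas, 2).tolist())))

# Linear interpolation of a missing weight at 37 days for the first lamb, using
# the surrounding records at 30 and 45 days (the abstract notes that a linear
# estimate is adequate when the neighbouring records are close in time).
print(np.interp(37, ages, weights[0]))
```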
    • Visual drainage assessment: A standardised visual soil assessment method for use in land drainage design in Ireland

      Tuohy, P.; Humphreys, James; Holden, N.M.; O'Loughlin, James; Reidy, B.; Fenton, Owen T. (Teagasc (Agriculture and Food Development Authority), Ireland, 20/08/2016)
      The implementation of site-specific land drainage system designs is usually disregarded by landowners in favour of locally established ‘standard practice’ land drainage designs. This is due to a number of factors, such as a limited understanding of soil–water interactions, a lack of facilities for the measurement of soil physical or hydrological parameters, and perceived time wastage and high costs. Hence there is a need for a site-specific drainage system design methodology that does not rely on inaccessible, time-consuming and/or expensive measurements of soil physical or hydrological properties. This requires a standardised process for deciphering the drainage characteristics of a given soil in the field. As an initial step, a new visual soil assessment method, referred to as visual drainage assessment (VDA), is presented whereby an approximation of the permeability of specific soil horizons is made using seven indicators (water seepage, pan layers, texture, porosity, consistence, stone content and root development) to provide a basis for the design of a site-specific drainage system. Across six poorly drained sites (1.3 ha to 2.6 ha in size) in south-west Ireland, a VDA-based design was compared with (i) an ideal design (utilising soil physical measurements to elucidate soil hydraulic parameters) and (ii) a standard design (0.8 m deep drains at a 15 m spacing) by modelled estimates of water table control and rainfall recharge/drain discharge capacity. The VDA method, unlike the standard design equivalents, provided a good approximation of the ideal design (derived from measured hydrological properties) and prescribed an equivalent land drainage system in the field. Mean modelled rainfall recharge/drain discharge capacities for the VDA (13.3 mm/day) and ideal (12.0 mm/day) designs were significantly higher (P < 0.001, s.e. 1.42 mm/day) than for the standard design (0.5 mm/day), when assuming a design minimum water table depth of 0.45 m.
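      The design comparisons above are summarised as drain discharge capacities in mm/day. As a hedged illustration of how such a capacity can be estimated for a given drain depth and spacing, the sketch below uses the steady-state Hooghoudt drainage equation; the paper's own modelling approach is not reproduced here, and the hydraulic conductivities and equivalent depth are assumed values, not measurements from the study sites.

```python
# Sketch: steady-state drain discharge capacity via the Hooghoudt equation.
# Drain depth, spacing and the 0.45 m design water table depth are from the
# abstract; conductivities and the equivalent depth are assumed values.

def hooghoudt_discharge(k_above, k_below, d_equiv, head, spacing):
    """Discharge q (m/day) that holds the mid-spacing water table at `head`
    metres above drain level for drains `spacing` metres apart."""
    return (8.0 * k_below * d_equiv * head + 4.0 * k_above * head ** 2) / spacing ** 2

drain_depth = 0.8                        # m, standard design drain depth
water_table_depth = 0.45                 # m, design minimum water table depth
head = drain_depth - water_table_depth   # 0.35 m of head above the drains

# Assumed soil properties (illustrative only)
k_above = 0.05   # m/day, hydraulic conductivity above drain level
k_below = 0.05   # m/day, hydraulic conductivity below drain level
d_equiv = 1.0    # m, equivalent depth to the impermeable layer

for spacing in (15.0, 8.0):   # standard 15 m spacing vs. an assumed closer spacing
    q = hooghoudt_discharge(k_above, k_below, d_equiv, head, spacing)
    print(f"{spacing:>4.0f} m spacing -> {1000.0 * q:.1f} mm/day discharge capacity")
```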
    • Yield losses caused by late blight (Phytophthora infestans (Mont.) de Bary) in potato crops in Ireland

      Dowley, L.J.; Grant, Jim; Griffin, D. (Teagasc, Oak Park, Carlow, Ireland, 2008)
      Field experiments, using foliage blight susceptible cultivars, were conducted at Oak Park, Carlow from 1983 to 2007 to determine the loss in potato production caused by crop infection with Phytophthora infestans. In each of the 25 years an untreated control was compared with protectant and with systemic fungicide programmes to determine the effect of late blight on the defoliation percentage at the end of the season, the area under the disease progress curve, marketable tuber yield, total tuber yield and yield of blighted tubers. The earliest date of first recorded late blight was 22 June and the latest was 15 September, but in 15 of the 25 years, blight was first recorded between 17 July and 13 August. Disease reached epidemic proportions in all but 4 of the years. Yields varied considerably among years. The mean loss in total yield from not using a fungicide was 10.1 t/ha. Differences in yield were significant across the 25 seasons. No overall increase in aggressiveness of the pathogen could be detected over the 25-year period.
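      The area under the disease progress curve (AUDPC) referred to above is conventionally obtained by trapezoidal integration of periodic disease assessments; the sketch below illustrates the calculation with invented assessment dates and percentage defoliation scores for an untreated and a fungicide-treated plot.

```python
# Sketch: area under the disease progress curve (AUDPC) by the trapezoidal
# rule. Assessment days and % defoliation values below are invented.

def audpc(days, severity):
    """Trapezoidal AUDPC (%-days) from assessment days and % defoliation."""
    total = 0.0
    for i in range(1, len(days)):
        total += (severity[i] + severity[i - 1]) / 2.0 * (days[i] - days[i - 1])
    return total

# Hypothetical weekly assessments after first blight was recorded
days = [0, 7, 14, 21, 28, 35]
untreated = [0, 2, 10, 35, 75, 100]   # % foliage destroyed, no fungicide
treated   = [0, 0,  1,  3,  8,  15]   # % foliage destroyed, fungicide programme

print("untreated AUDPC:", audpc(days, untreated))
print("treated AUDPC:  ", audpc(days, treated))
```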