• Assessing the role of artificially drained agricultural land for climate change mitigation in Ireland

      Paul, Carsten; Fealy, Reamonn; Fenton, Owen; Lanigan, Gary; O’Sullivan, Lilian; Schulte, Rogier P.; Irish Dairy Research Fund; Teagasc Greenhouse Gas Working Group; Department of Agriculture, Food and the Marine (Elsevier, 2017-12-19)
      In 2014 temperate zone emission factor revisions were published in the IPCC Wetlands Supplement. Default values for direct CO2 emissions of artificially drained organic soils were increased by a factor of 1.6 for cropland sites and by factors ranging from 14 to 24 for grassland sites. This highlights the role of drained organic soils as emission hotspots and makes their rewetting more attractive as a climate change mitigation measure. Drainage emissions of humic soils are lower on a per-hectare basis and not covered by IPCC default values. However, drainage of large areas can turn them into nationally relevant emission sources. National policy making that recognizes the importance of preserving organic and humic soils’ carbon stock requires data that are not readily available. Taking Ireland as a case study, this article demonstrates how a dataset of policy-relevant information can be generated. The total area of histic and humic soils drained for agriculture, the resulting greenhouse gas emissions and the climate change mitigation potential were assessed. For emissions from histic soils, calculations were based on IPCC emission factors; for humic soils, a modified version of the ECOSSE model was used. Results indicated 370,000 ha of histic and 426,000 ha of humic soils under drained agricultural land use in Ireland (8% and 9% of the total farmed area). Calculated annual drainage emissions were 8.7 Tg CO2e from histic and 1.8 Tg CO2e from humic soils (equal to 56% of Ireland’s agricultural emissions in 2014, excluding emissions from land use). If half the area of drained histic soils were rewetted, annual savings would amount to 3.2 Tg CO2e. If drain spacing were decreased on half of the deep-drained, nutrient-rich grasslands to control the average water table at −25 cm or higher, annual savings would amount to 0.4 Tg CO2e.
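The headline totals reported in this abstract imply per-hectare emission rates that are easy to back out. A minimal sketch (all input figures are taken from the abstract; the derivation itself is our illustration, not part of the study):

```python
# Back-of-envelope check of the reported national totals.
histic_area_ha = 370_000   # drained histic soils (from abstract)
humic_area_ha = 426_000    # drained humic soils (from abstract)
histic_emissions_tg = 8.7  # Tg CO2e per year (from abstract)
humic_emissions_tg = 1.8   # Tg CO2e per year (from abstract)

# 1 Tg = 1e6 t, so the implied per-hectare rates in t CO2e ha^-1 yr^-1 are:
histic_rate = histic_emissions_tg * 1e6 / histic_area_ha
humic_rate = humic_emissions_tg * 1e6 / humic_area_ha
print(f"histic: {histic_rate:.1f} t CO2e/ha/yr")  # roughly 23.5
print(f"humic:  {humic_rate:.1f} t CO2e/ha/yr")   # roughly 4.2
```

The roughly five-fold difference between the two implied rates is consistent with the abstract's point that humic-soil drainage emissions are much lower per hectare but nationally relevant because of the large area involved.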
    • Characterization of Potato Virus Y Isolates and Assessment of Nanopore Sequencing to Detect and Genotype Potato Viruses

      Della Bartola, Michele; Byrne, Stephen; Mullins, Ewen; Department of Agriculture, Food and the Marine; 15/S/618 SCOPE (MDPI AG, 2020-04-23)
      Potato virus Y (PVY) is the most economically important virus infecting cultivated potato (Solanum tuberosum L.). Accurate diagnosis is crucial to regulate the trade of tubers and for the sanitary selection of plant material for propagation. However, the high genetic diversity of PVY represents a challenge for the detection and classification of isolates. Here, the diversity of Irish PVY isolates from a germplasm collection and commercial sites was investigated using conventional molecular and serological techniques. Recombinant PVY isolates were prevalent, with PVYNTNa being the predominant genotype. In addition, we evaluated Nanopore sequencing to detect and reconstruct the whole genome sequence of four viruses (PVY, PVX, PVS, PLRV) and five PVY genotypes in a subset of eight potato plants. De novo assembly of Nanopore sequencing reads produced single contigs covering greater than 90% of the viral genome and sharing greater than 99.5% identity to the consensus sequences obtained with Illumina sequencing. Interestingly, single near full genome contigs were obtained for different isolates of PVY co-infecting the same plant. Mapping reads to available reference viral genomes enabled us to generate near complete genome sequences sharing greater than 99.90% identity to the Illumina-derived consensus. This is the first report describing the use of Oxford Nanopore’s MinION to detect and genotype potato viruses. We reconstructed the genome of PVY and other RNA viruses, indicating the technology’s potential for virus detection in potato production systems and for the study of the genetic diversity of highly heterogeneous viruses such as PVY.
    • Comparison of photosynthetic performance of Fagus sylvatica seedlings under natural and artificial shading

      Sevillano, Ignacio; Short, Ian; Campion, Jerry; Grant, Olga M.; Grant, Jim; O’Reilly, Conor; Department of Agriculture, Food and the Marine; Teagasc Walsh Fellowship Programme (Elsevier, 2018-03-14)
      Commitment to sustainable forest management (alternatives to clearfelling) has led to a renewed interest in continuous cover forestry systems, which promote the control of light to produce stand benefits. The physiological performance of shade-tolerant European beech (Fagus sylvatica L.) in response to light availability was investigated in natural regeneration below the canopy, in contrast to planted seedlings under artificial-shade conditions. Although beech seedlings had higher photosynthetic capacity with increasing light availability, they were able to maintain positive CO2 assimilation rates under low light levels in both field and controlled conditions. Leaves of seedlings under low light had the ability to use light more efficiently (higher PSII efficiency) than those in high light, which offers a physiological explanation for the ability of beech seedlings to grow under very low light conditions. Whilst caution is advised when extrapolating results from controlled studies to the field, the overall correspondence in the trend of the physiological response to light levels between beech grown below the canopy and under artificial-shade conditions suggests that it might be possible to extrapolate results from studies performed under artificial shade (nets) to field conditions. Hence, the use of nets may be an alternative way of assessing the potential physiological responses of seedlings to light availability.
    • CropQuest: Minor Crops Report

      Zahoor, Faisal; Forristal, Dermot; Gillespie, Gary; Department of Agriculture, Food and the Marine; 11/S/119 (Teagasc, 2015)
      This report, part of the DAFM-funded CROPQUEST desk study, presents a brief description of the characteristics of a range of minor crops, their uses and markets, and their potential, where known, for production in Ireland. The crops include: Amaranth, Borage, Calendula, Camelina, Crambe, Echium, Flax/Linseed, Hemp, Hops, Lentils, Lupins, Oats, Poppy and Quinoa.
    • Defining optimal DEM resolutions and point densities for modelling hydrologically sensitive areas in agricultural catchments dominated by microtopography

      Thomas, Ian; Jordan, Philip; Shine, Oliver; Fenton, Owen; Mellander, Per-Erik; Dunlop, Paul; Murphy, Paul N. C.; Department of Agriculture, Food and the Marine; Teagasc Walsh Fellowship Programme (Elsevier, 2016-09-16)
      Defining critical source areas (CSAs) of diffuse pollution in agricultural catchments depends upon the accurate delineation of hydrologically sensitive areas (HSAs) at highest risk of generating surface runoff pathways. In topographically complex landscapes, this delineation is constrained by digital elevation model (DEM) resolution and the influence of microtopographic features. To address this, optimal DEM resolutions and point densities for spatially modelling HSAs were investigated, for onward use in delineating CSAs. The surface runoff framework was modelled using the Topographic Wetness Index (TWI) and maps were derived from 0.25 m LiDAR DEMs (40 bare-earth points m−2), resampled 1 m and 2 m LiDAR DEMs, and a radar generated 5 m DEM. Furthermore, the resampled 1 m and 2 m LiDAR DEMs were regenerated with reduced bare-earth point densities (5, 2, 1, 0.5, 0.25 and 0.125 points m−2) to analyse effects on elevation accuracy and important microtopographic features. Results were compared to surface runoff field observations in two 10 km2 agricultural catchments for evaluation. Analysis showed that the accuracy of modelled HSAs using different thresholds (5%, 10% and 15% of the catchment area with the highest TWI values) was much higher using LiDAR data compared to the 5 m DEM (70–100% and 10–84%, respectively). This was attributed to the DEM capturing microtopographic features such as hedgerow banks, roads, tramlines and open agricultural drains, which acted as topographic barriers or channels that diverted runoff away from the hillslope scale flow direction. Furthermore, the identification of ‘breakthrough’ and ‘delivery’ points along runoff pathways where runoff and mobilised pollutants could be potentially transported between fields or delivered to the drainage channel network was much higher using LiDAR data compared to the 5 m DEM (75–100% and 0–100%, respectively). 
Optimal DEM resolutions of 1–2 m were identified for modelling HSAs, which balanced the need for microtopographic detail as well as surface generalisations required to model the natural hillslope scale movement of flow. Little loss of vertical accuracy was observed in 1–2 m LiDAR DEMs with reduced bare-earth point densities of 2–5 points m−2, even at hedgerows. Further improvements in HSA models could be achieved if soil hydrological properties and the effects of flow sinks (filtered out in TWI models) on hydrological connectivity are also considered.
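The Topographic Wetness Index used in this study has a standard form, TWI = ln(a / tan β), where a is the specific upslope contributing area and β the local slope. A minimal sketch of that formula (our illustration on toy values; the study itself derived both terms from full LiDAR DEM flow-routing workflows):

```python
import numpy as np

def twi(specific_catchment_area, slope_rad, eps=1e-6):
    """Topographic Wetness Index: ln(a / tan(beta)).

    specific_catchment_area -- upslope contributing area per unit contour length (m)
    slope_rad -- local slope in radians
    eps guards against division by zero on perfectly flat cells.
    """
    return np.log(specific_catchment_area / np.maximum(np.tan(slope_rad), eps))

# Higher TWI = wetter, runoff-prone: large contributing area, gentle slope.
print(twi(500.0, np.radians(1.0)))   # gentle footslope cell: high index
print(twi(5.0, np.radians(15.0)))    # steep well-drained cell: low index
```

Cells in the top 5–15% of catchment TWI values were the thresholds used in the study to delineate hydrologically sensitive areas.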
    • Detection of Novel QTLs for Late Blight Resistance Derived from the Wild Potato Species Solanum microdontum and Solanum pampasense

      Meade, Fergus; Hutten, Ronald; Wagener, Silke; Prigge, Vanessa; Dalton, Emmet; Kirk, Hanne Grethe; Griffin, Denis; Milbourne, Dan; Department of Agriculture, Food and the Marine; IPM Potato Group; et al. (MDPI AG, 2020-06-30)
      Wild potato species continue to be a rich source of genes for resistance to late blight in potato breeding. Whilst many dominant resistance genes from such sources have been characterised and used in breeding, quantitative resistance also offers potential for breeding when the loci underlying the resistance can be identified and tagged using molecular markers. In this study, F1 populations were created from crosses between blight-susceptible parents and lines exhibiting strong partial resistance to late blight derived from the South American wild species Solanum microdontum and Solanum pampasense. Both populations exhibited continuous variation for resistance to late blight over multiple field-testing seasons. High-density genetic maps were created using single nucleotide polymorphism (SNP) markers, enabling mapping of quantitative trait loci (QTLs) for late blight resistance that were consistently expressed over multiple years in both populations. In the population created with the S. microdontum source, QTLs for resistance consistently expressed over three years and explaining a large portion (21–47%) of the phenotypic variation were found on chromosomes 5 and 6, and a further resistance QTL on chromosome 10, apparently related to foliar development, was discovered in 2016 only. In the population created with the S. pampasense source, QTLs for resistance were found over two years on chromosomes 11 and 12. For all loci detected consistently across years, the QTLs span known R gene clusters and so likely represent novel late blight resistance genes. Simple genetic models following the effect of the presence or absence of SNPs associated with consistently effective loci in both populations demonstrated that marker-assisted selection (MAS) strategies to introgress and pyramid these loci have potential in resistance breeding strategies.
    • Developing new hardwood markets for Irish timber – the Hardwood Focus group’s study tour to Wales

      Spazzi, Jonathan; Garvey, Seán; Short, Ian; Department of Agriculture, Food and the Marine (Society of Irish Foresters, 2019-12-30)
      A discussion group called Hardwood Focus (HF) was formed in Limerick in 2018 among broadleaf-forest owners in the region. This initiative is part of the Limerick Tipperary Woodland Owners (LTWO) Group and is facilitated by Jonathan Spazzi, the local Teagasc Forestry Development Officer. The group travelled to Wales between 30th September and 4th October 2019. This article discusses the outcomes.
    • The effect of entomopathogenic fungal culture filtrate on the immune response and haemolymph proteome of the large pine weevil, Hylobius abietis

      Mc Namara, Louise; Griffin, Christine T.; Fitzpatrick, David; Kavanagh, Kevin; Carolan, James C.; Department of Agriculture, Food and the Marine; Science Foundation Ireland; 10/RD/MCOP/NUIM/720; 12/RI/2346 (3) (Elsevier, 2018-07-17)
      The large pine weevil Hylobius abietis L. is a major forestry pest in 15 European countries, where it is a threat to 3.4 million hectares of forest. A cellular and proteomic analysis of the effect of culture filtrate of three entomopathogenic fungi (EPF) species on the immune system of H. abietis was performed. Injection with Metarhizium brunneum or Beauveria bassiana culture filtrate facilitated significantly increased yeast cell proliferation in larvae. Larvae co-injected with either Beauveria caledonica or B. bassiana culture filtrate and Candida albicans showed significantly increased mortality. Together these results suggest that EPF culture filtrate has the potential to modulate the insect immune system, allowing a subsequent pathogen to proliferate. Injection with EPF culture filtrate was shown to alter the abundance of protease inhibitors, detoxifying enzymes, antimicrobial peptides and proteins involved in reception/detection and development in H. abietis larvae. Larvae injected with B. caledonica culture filtrate displayed significant alterations in the abundance of proteins involved in cellulolytic and other metabolic processes in their haemolymph proteome. Screening EPF for their ability to modulate the insect immune response represents a means of assessing EPF for use as biocontrol agents, particularly if the goal is to use them in combination with other control agents.
    • Evaluation of the ‘Irish Rules’: The Potato Late Blight Forecasting Model and Its Operational Use in the Republic of Ireland

      Cucak, Mladen; Sparks, Adam; Moral, Rafael; Kildea, Steven; Lambkin, Keith; Fealy, Rowan; Department of Agriculture, Food and the Marine; 14/S/879 (MDPI AG, 2019-09-06)
      Potato late blight caused by Phytophthora infestans is one of the most important plant diseases known, requiring high pesticide inputs to prevent disease occurrence. Disease development is highly dependent on weather conditions, and as such, several forecasting schemes have been developed worldwide which seek to reduce the inputs required to control the disease. The Irish Rules, developed in the 1950s and calibrated to accommodate the meteorological network, the characteristics of potato production and the P. infestans population at the time, is still operationally utilized by the national meteorological agency, Met Éireann. However, numerous changes in the composition and dynamics of the pathosystem, and in the production and economic risks associated with potato late blight outbreaks, have occurred since the inception of the Irish Rules model. Additionally, model and decision thresholds appear to have been selected ad hoc and without clear criteria. We developed a systematic methodology to evaluate the model using empirical receiver operating characteristic (ROC) curve analysis and response surface methodology for the interpretation of the results. The methodology, written in the R language, is provided as an open, accessible and reproducible platform to facilitate the ongoing seasonal re-evaluation of the Irish Rules and corresponding decision thresholds. Following this initial analysis, based on the available data, we recommend reducing the thresholds for relative humidity and initial period duration from 90% and 12 h to 88% and 10 h, respectively. Contrary to recent reports, we found that the risk of blight epidemics remains low at temperatures below 12 °C.
If more comprehensive outbreak data and greater insight into the founder population confirm our findings as robust, the temperature threshold in the model could potentially be increased from 10 °C to 12 °C, providing more opportunities for reductions in pesticide usage. We propose a dynamic operational decision threshold of between four and 11 effective blight hours (EBH), set according to the frequency of disease outbreaks in the region of interest. Although the risk estimation according to the new model calibrations is higher, estimated chemical inputs are, on average, lower than growers’ usual practice. Importantly, the research outlined here provides a robust and reproducible methodological approach to evaluating a semi-empirical plant disease forecasting model.
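The threshold evaluation described in this abstract follows standard ROC logic: each candidate rule (a relative-humidity threshold plus a minimum duration) is scored by its true- and false-positive rates against observed outbreaks. A minimal sketch of that idea on entirely hypothetical data (the published tool is a full R workflow, not this snippet):

```python
import numpy as np

# Hypothetical infection-period records:
# relative humidity (%), duration of conducive conditions (h), outbreak observed (1/0).
rh = np.array([95, 85, 91, 89, 80, 93])
hours = np.array([14, 6, 11, 10, 4, 12])
outbreak = np.array([1, 0, 1, 1, 0, 1])

def score_threshold(rh_min, hours_min):
    """True- and false-positive rates of one candidate rule (a single ROC point)."""
    warned = (rh >= rh_min) & (hours >= hours_min)
    tpr = (warned & (outbreak == 1)).sum() / (outbreak == 1).sum()
    fpr = (warned & (outbreak == 0)).sum() / (outbreak == 0).sum()
    return float(tpr), float(fpr)

# Relaxing RH 90% -> 88% and duration 12 h -> 10 h catches more outbreaks
# in this toy data, mirroring the direction of the recommended change.
print(score_threshold(90, 12))  # stricter rule: lower sensitivity
print(score_threshold(88, 10))  # relaxed rule: higher sensitivity
```

Sweeping many such candidate pairs and plotting (fpr, tpr) traces the empirical ROC curve from which operating thresholds are chosen.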
    • Functional Land Management for managing soil functions: A case-study of the trade-off between primary productivity and carbon storage in response to the intervention of drainage systems in Ireland

      O'Sullivan, Lilian; Creamer, Rachel E.; Fealy, Reamonn; Lanigan, Gary; Simo, Iolanda; Fenton, Owen; Carfrae, J.; Schulte, Rogier; Department of Agriculture, Food and the Marine (Elsevier, 2015-09-30)
      Globally, there is growing demand for increased agricultural outputs. At the same time, the agricultural industry is expected to meet increasingly stringent environmental targets. Thus, there is an urgent pressure on the soil resource to deliver multiple functions simultaneously. The Functional Land Management framework (Schulte et al., 2014) is a conceptual tool designed to support policy making to manage soil functions to meet these multiple demands. This paper provides a first example of a practical application of the Functional Land Management concept relevant to policy stakeholders. In this study we examine the trade-offs between the soil functions ‘primary productivity’ and ‘carbon cycling and storage’ in response to the intervention of land drainage systems applied to ‘imperfectly’ and ‘poorly’ draining managed grasslands in Ireland. These trade-offs are explored as a function of the nominal price of ‘Certified Emission Reductions’ or ‘carbon credits’. The trade-offs are also characterised spatially using ArcGIS to account for spatial variability in the supply of soil functions. To manage soil functions, it is essential to understand how individual soil functions are prioritised by those that are responsible for the supply of soil functions – generally farmers and foresters – and by those who frame demand for soil functions – policy makers. Here, in relation to these two soil functions, a gap exists in this prioritisation between the two stakeholder groups. Currently, the prioritisation and incentivisation of these competing soil functions is primarily a function of the CO2 price. At current CO2 prices, the agronomic benefits outweigh the monetised environmental costs. The value of CO2 loss would only exceed productivity gains at either higher CO2 prices or at a reduced discount period rate. Finally, this study shows large geographic variation in the environmental cost to agronomic benefit ratio.
Therein, the Functional Land Management framework can support the development of policies that are tailored to contrasting biophysical environments and are therefore more effective than ‘blanket approaches’, allowing more specific and effective prioritisation of contrasting soil functions.
    • Genetic Analysis Using a Multi-Parent Wheat Population Identifies Novel Sources of Septoria Tritici Blotch Resistance

      Riaz, Adnan; KockAppelgren, Petra; Hehir, James Gerard; Kang, Jie; Meade, Fergus; Cockram, James; Milbourne, Dan; Spink, John; Mullins, Ewen; Byrne, Stephen; et al. (MDPI AG, 2020-08-04)
      Zymoseptoria tritici is the causative fungal pathogen of septoria tritici blotch (STB) disease of wheat (Triticum aestivum L.) that continuously threatens wheat crops in Ireland and throughout Europe. Under favorable conditions, STB can cause up to 50% yield losses if left untreated. STB is commonly controlled with fungicides; however, a combination of Z. tritici populations developing fungicide resistance and increased restrictions on fungicide use in the EU has led to farmers relying on fewer active substances. Consequently, this serves to drive the emergence of Z. tritici resistance against the remaining chemistries. In response, the use of resistant wheat varieties provides a more sustainable disease management strategy. However, the number of varieties offering an adequate level of resistance against STB is limited. Therefore, new sources of resistance or improved stacking of existing resistance loci are needed to develop varieties with superior agronomic performance. Here, we identified quantitative trait loci (QTL) for STB resistance in the eight-founder “NIAB Elite MAGIC” winter wheat population. The population was screened for STB response in the field under natural infection for three seasons from 2016 to 2018. Twenty-five QTL associated with STB resistance were identified in total. QTL either co-located with previously reported QTL or represent new loci underpinning STB resistance. The genomic regions identified and the linked genetic markers serve as useful resources for STB resistance breeding, supporting rapid selection of favorable alleles for the breeding of new wheat cultivars with improved STB resistance.
    • The Hardwood Focus Group: Exploring utilisation potential of Irish broadleaf forests

      Spazzi, Jonathan; O'Connell, John; Sykes, Jonathan; Short, Ian; Garvey, Seán; Department of Agriculture, Food and the Marine (Dawn Media Ltd,, 2020)
      Jonathan Spazzi of Teagasc Forestry outlines the work of the Hardwood Focus Group and lessons learned from a recent exploratory trip to counterparts in Wales.
    • The impact of cattle dung pats on earthworm distribution in grazed pastures

      Bacher, M. G.; Fenton, Owen; Bondi, G.; Creamer, Rachel; Karmarkar, M.; Schmidt, O.; Department of Agriculture, Food and the Marine; 13/S/468 (Springer Science and Business Media LLC, 2018-12-19)
      Background: Grazed grassland management regimes can have various effects on soil fauna. For example, effects on earthworms can be negative through compaction induced by grazing animals, or positive, mediated by increases in sward productivity and cattle dung pats providing a food source. Knowledge gaps exist in relation to the behaviour of different earthworm species, i.e. their movement towards and aggregation under dung pats, the legacy effects of pats and the spatial area of recruitment. The present study addressed these knowledge gaps in field experiments, over 2 years, using natural and simulated dung pats on two permanent, intensively grazed pastures in Ireland. Results: Dung pats strongly affected spatial earthworm distribution, with up to four times more earthworms aggregating beneath pats than in control locations away from pats. In these earthworm communities comprising 11 species, temporally different aggregation and dispersal patterns were observed, including the absence of individual species from control locations, but no clear successional responses. Epigeic species in general, but also certain species of the anecic and endogeic groups, aggregated under dung. Sampling after complete dung pat disappearance (27 weeks after application) suggested an absence of a dung pat legacy effect on earthworm communities. Based on species distributions, the maximum size of the recruitment area from which earthworms moved to pats was estimated to be 3.8 m2 per dung pat. Since actual grazing over 6 weeks would result in the deposition of about 300 dung pats per ha, it is estimated that a surface area of 1140 m2, or about 11% of the total grazing area, can be influenced by dung pats in a given grazing period. Conclusions: This study showed that the presence of dung pats in pastures creates temporary hot spots in spatial earthworm species distribution, which change over time.
The findings highlight the importance of considering dung pats, temporally and spatially, when sampling earthworms in grazed pastures. Published comparisons of grazed and cut grasslands probably reached incorrect conclusions by ignoring or deliberately avoiding dung pats. Furthermore, the observed intense aggregation of earthworms beneath dung pats suggests that earthworm functions need to be assessed separately at these hot spots.
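The grazed-area estimate in this abstract is simple arithmetic and can be reproduced directly (all figures are from the abstract; the script is only a cross-check):

```python
# Reproduce the influenced-area estimate from the reported figures.
recruitment_area_m2 = 3.8   # maximum recruitment area per dung pat
pats_per_ha = 300           # pats deposited over ~6 weeks of grazing
ha_m2 = 10_000              # one hectare in square metres

influenced_m2 = recruitment_area_m2 * pats_per_ha
share = influenced_m2 / ha_m2
print(f"{influenced_m2:.0f} m2 (~{share:.0%} of the grazing area)")
```

This recovers the abstract's figure of 1140 m2, about 11% of the grazing area per grazing period.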
    • Impact of competition on the early growth and physiological responses of potential short-rotation forestry species in Ireland

      Foreman, Susie; Department of Agriculture, Food and the Marine; 13/C/498 (2019-07)
      The impact of planting density on the growth and physiological response of three potential short rotation forestry species, shining gum (Eucalyptus nitens (Deane & Maiden) Maiden), Italian alder (Alnus cordata (Loisel.) Duby) and Sitka spruce (Picea sitchensis (Bong.) Carrière), was investigated in this study over a four-year period. The three species were planted in a field trial in Co. Wexford. The trial was laid down as a randomised block design containing four planting densities (1,333–40,000 stems ha-1) per species. Height, stem diameter, branch length, diameter and quantity, and crown height, along with shade-leaf-only determinations of leaf area and leaf dry weight, chlorophyll concentration (Chleaf) and photosynthesis rates (PN), were measured periodically over the four-year period. E. nitens trees produced the shallowest live crown of the three species, which decreased as planting density increased. Chleaf declined as planting density increased, but PN remained the same. E. nitens produced the greatest volume and biomass per hectare of the three species at the end of four years' growth. Height increased and stem diameter decreased as planting density increased in A. cordata, although stem volume remained about the same. However, planting density did not affect crown volume or Chleaf in A. cordata, but PN declined as density increased. Trees of P. sitchensis grew more slowly than those of the other species during the four-year period, but it produced the densest crown at all planting densities. Competition effects were apparent at leaf level in P. sitchensis. Planting density did not affect the above-ground biomass in A. cordata or P. sitchensis, which was similar for the two species and was lower than that recorded for E. nitens. Of the three species examined, E. nitens was the most productive at all planting densities.
    • Improving the identification of hydrologically sensitive areas using LiDAR DEMs for the delineation and mitigation of critical source areas of diffuse pollution

      Thomas, Ian; Jordan, Philip; Mellander, Per-Erik; Fenton, Owen; Shine, Oliver; O hUallachain, Daire; Creamer, Rachel E.; McDonald, Noeleen T.; Dunlop, Paul; Murphy, Paul N. C.; et al. (Elsevier, 2016-03-12)
      Identifying critical source areas (CSAs) of diffuse pollution in agricultural catchments requires the accurate identification of hydrologically sensitive areas (HSAs) at highest propensity for generating surface runoff and transporting pollutants. A new GIS-based HSA Index is presented that improves the identification of HSAs at the sub-field scale by accounting for microtopographic controls. The Index is based on high resolution LiDAR data and a soil topographic index (STI) and also considers the hydrological disconnection of overland flow via topographic impediment from flow sinks. The HSA Index was applied to four intensive agricultural catchments (~ 7.5–12 km2) with contrasting topography and soil types, and validated using rainfall-quickflow measurements during saturated winter storm events in 2009–2014. Total flow sink volume capacities ranged from 8298 to 59,584 m3 and caused 8.5–24.2% of overland-flow-generating-areas and 16.8–33.4% of catchment areas to become hydrologically disconnected from the open drainage channel network. HSA maps identified ‘breakthrough points’ and ‘delivery points’ along surface runoff pathways as vulnerable points where diffuse pollutants could be transported between fields or delivered to the open drainage network, respectively. Using these as proposed locations for targeting mitigation measures such as riparian buffer strips reduced potential costs compared to blanket implementation within an example agri-environment scheme by 66% and 91% over 1 and 5 years respectively, which included LiDAR DEM acquisition costs. The HSA Index can be used as a hydrologically realistic transport component within a fully evolved sub-field scale CSA model, and can also be used to guide the implementation of ‘treatment-train’ mitigation strategies concurrent with sustainable agricultural intensification.
    • Incidental nutrient transfers: Assessing critical times in agricultural catchments using high-resolution data

      Shore, Mairead; Jordan, Philip; Melland, Alice R.; Mellander, Per-Erik; McDonald, Noeleen T.; Shortle, Ger; Department of Agriculture, Food and the Marine (Elsevier, 2016-03-22)
      Managing incidental losses associated with liquid slurry applications during closed periods has significant cost and policy implications, and the environmental data required to review such a measure are difficult to capture due to storm dependencies. Over four years (2010–2014) in five intensive agricultural catchments, this study used high-resolution total and total reactive phosphorus (TP and TRP), total oxidised nitrogen (TON) and suspended sediment (SS) concentrations with river discharge data to investigate the magnitude and timing of nutrient losses. A large dataset of storm events (defined as 90th percentile discharges), and associated flow-weighted mean (FWM) nutrient concentrations and TP/SS ratios, was used to indicate when losses were indicative of residual or incidental nutrient transfers. The beginning of the slurry closed period was reflective of incidental and residual transfers, with high storm FWM P (TP and TRP) concentrations, and some catchments also showed elevated storm TP:SS ratios. This pattern diminished at the end of the closed period in all catchments. Total oxidised N behaved similarly to P during storms in the poorly drained catchments and showed a long lag time in the other catchments. Low storm FWM P concentrations and TP:SS ratios during the weeks following the closed period suggest that nutrients were either not applied during this time (best times chosen) or were applied to less risky areas (best places chosen). For other periods, such as late autumn and wet summers, where storm FWM P concentrations and TP:SS ratios were high, it is recommended that farmer knowledge of soil drainage characteristics be augmented with local, detailed data on current and forecast soil moisture conditions, to strengthen existing regulatory frameworks and help avoid storm-driven incidental nutrient transfers.
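The flow-weighted mean (FWM) concentration used throughout this abstract is the total load divided by the total flow, FWM = Σ(cᵢ·qᵢ) / Σ(qᵢ). A minimal sketch with made-up numbers (our illustration; units and values are hypothetical):

```python
def flow_weighted_mean(concentrations, discharges):
    """FWM concentration: total load divided by total discharge.

    concentrations -- nutrient concentrations at each time step (e.g. mg P/L)
    discharges -- matching discharge values (e.g. L/s)
    """
    load = sum(c * q for c, q in zip(concentrations, discharges))
    return load / sum(discharges)

# Hypothetical storm event: concentration peaks with discharge, so the FWM
# sits above the simple arithmetic mean of the concentrations.
conc = [0.02, 0.08, 0.15, 0.06]   # mg P/L through the event
flow = [100, 400, 900, 300]       # L/s at the same time steps
print(flow_weighted_mean(conc, flow))
```

Because the FWM weights each concentration by the discharge carrying it, it reflects the event's transported load rather than a snapshot concentration, which is why the study uses it to characterise storm transfers.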
    • Influence of stormflow and baseflow phosphorus pressures on stream ecology in agricultural catchments

      Shore, Mairead; Murphy, Sinead; Mellander, Per-Erik; Shortle, Ger; Melland, A. R.; Crockford, Lucy; O'Flaherty, Vincent; Williams, Lauren; Morgan, Ger; Jordan, Philip; et al. (Elsevier, 2017-03-09)
      Stormflow and baseflow phosphorus (P) concentrations and loads in rivers may exert different ecological pressures during different seasons. These pressures and subsequent impacts are important to disentangle in order to target and monitor the effectiveness of mitigation measures. This study investigated the influence of stormflow and baseflow P pressures on stream ecology in six contrasting agricultural catchments. A five-year high-resolution dataset was used, consisting of stream discharge, P chemistry, and macroinvertebrate and diatom ecology, supported by microbial source tracking and turbidity data. Total reactive P (TRP) loads delivered during baseflows were low (1–7% of annual loads), but TRP concentrations frequently exceeded the environmental quality standard (EQS) of 0.035 mg L−1 during these flows (32–100% of the time in five catchments). A pilot microbial source tracking exercise in one catchment indicated that both human and ruminant faecal effluents were contributing to these baseflow P pressures but were diluted at higher flows. Seasonally, TRP concentrations tended to be highest during summer due to these baseflow P pressures and corresponded well with declines in diatom quality during this time (R2 = 0.79). Diatoms tended to recover by late spring, when storm P pressures were most prevalent, and there was a poor relationship between antecedent TRP concentrations and diatom quality in spring (R2 = 0.23). Seasonal variations were less apparent in the macroinvertebrate indices; however, there was a good relationship between antecedent TRP concentrations and macroinvertebrate quality during spring (R2 = 0.51) and summer (R2 = 0.52). Reducing summer point source discharges may be the quickest way to improve ecological river quality, particularly diatom quality, in these and similar catchments. Aligning estimates of P sources with ecological impacts and identifying ecological signals which can be attributed to storm P pressures are important next steps for successful management of agricultural catchments at these scales.
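As a minimal illustration of the EQS exceedance statistic reported above: the 0.035 mg L−1 threshold is taken from the abstract, while the function name and sample values are hypothetical:

```python
import numpy as np

EQS_TRP = 0.035  # mg L-1, environmental quality standard for TRP

def eqs_exceedance_pct(trp_conc):
    # Share of samples (in %) whose TRP concentration exceeds the EQS;
    # applied to baseflow samples only, this gives the 32-100% figures
    # of the kind quoted in the abstract.
    trp_conc = np.asarray(trp_conc, dtype=float)
    return 100.0 * (trp_conc > EQS_TRP).mean()
```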
    • Investigating the impact of varying levels of inventory data detail on private sector harvest forecasting

      Farrelly, Niall; O'Connor, Cian; Nieuwenhuis, Maarten; Phillips, Henry; Department of Agriculture, Food and the Marine; 14/C/824 (The Society of Irish Foresters, 2019-02-01)
      A comparison was made between four methods of generating roundwood production forecasts for private sector forests in Ireland, which used varying levels of inventory data as inputs to the production model. Two methods were based on stand variables: the Irish Dynamic Yield Model (IDYM) method and the General Yield Class (GYC) method. The other two methods were based on site variables used to derive predictions of productivity from climate and map-based data, and include a local prediction (LPYC) and a national prediction of yield class (NPYC), the latter the same as that used in the All Ireland Roundwood Production Forecast 2016-2035 (Phillips et al. 2016). To determine the reliability of predictions for an individual stand, field measurements of yield class (GYC) were compared with predictions of yield class derived using the NPYC and LPYC methods for 52 privately-owned stands of Sitka spruce in the north-west of Ireland. The prediction of yield class using the NPYC method had a low probability of agreement with GYC, with a large bias towards under-predicting yield class. The LPYC method had a higher probability of agreement and lower bias, indicating a better assessment of local productivity. To assess the impact of the various productivity estimates on roundwood production forecasts, separate roundwood forecasts for the period 2016-2035 were generated. The forecast produced using the NPYC method was used as a baseline for comparison. As expected, the under-prediction of yield class using the NPYC method produced the lowest volume production estimate (318,454 m3) for the forecast period. Both the GYC and LPYC methods resulted in a significant increase in estimated volume production of between 25% and 29% over the baseline. The IDYM method provided the highest estimate of volume production (432,000 m3) for the forecast period, an increase of 35% over the baseline. The increased output predicted using the IDYM method is explained by the inclusion of stocking and basal area data, which more accurately reflected the increased growing stock of private forests than yield data derived using Forestry Commission yield models based on prescribed management. The increases in productivity associated with the use of the LPYC, GYC and IDYM methods had the effect of producing shorter rotations and resulted in an increase in the area clearfelled and the associated volume production. Perhaps more importantly, the timing of volume production was affected by using more accurate methods to assess productivity (i.e. LPYC, GYC, IDYM), owing to a higher yield-age profile of stands compared to those assessed using the NPYC predictions. The findings point to a possible under-estimation of the productivity of private stands in the All Ireland Roundwood Production Forecast and have implications for the timing of the forecasted volume, which could be brought forward by 5 to 6 years. In the absence of field or aerial laser measurement of height and age, the use of the LPYC method is recommended for future private sector roundwood production forecasts.
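The percentage increases over the NPYC baseline quoted above can be verified with simple arithmetic; the two volumes are the ones reported in the abstract, and the function name is illustrative:

```python
def pct_over_baseline(volume_m3, baseline_m3):
    # Percentage increase of a forecast volume over the baseline forecast.
    return 100.0 * (volume_m3 - baseline_m3) / baseline_m3

NPYC_BASELINE = 318_454  # m3, lowest estimate (NPYC method)
IDYM_VOLUME = 432_000    # m3, highest estimate (IDYM method)
# pct_over_baseline(IDYM_VOLUME, NPYC_BASELINE) is approximately 35-36 %,
# consistent with the ~35 % increase reported in the abstract.
```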
    • Nitrogen fertiliser interactions with urine deposit affect nitrous oxide emissions from grazed grasslands

      Maire, J.; Krol, Dominika; Pasquier, D.; Cowan, N.; Skiba, U.; Rees, R.M.; Reay, D.; Lanigan, Gary; Richards, Karl J.; Teagasc Walsh Fellowship Programme; et al. (Elsevier, 2019-12-06)
      Cattle excreta deposited on grazed pastures are responsible for one fifth of global anthropogenic nitrous oxide (N2O) emissions. One of the key nitrogen (N) sources is urine deposited by grazing animals, which contributes very large N loadings within small areas. The main objective of this plot study was to establish whether the application of N fertiliser and urine deposits from dairy cows interact synergistically to increase N2O emissions, and how such an interaction is influenced by the timing of application. The combined application of fertiliser (calcium ammonium nitrate) and urine significantly increased the cumulative N2O emissions as well as the N2O emission factor (EF), from 0.35 to 0.74% in spring and from 0.26 to 0.52% in summer. By contrast, EFs were lower when only fertiliser (0.31% in spring, 0.07% in summer) or only urine was applied (0.33% in spring, 0.28% in summer). In autumn, N2O emissions were larger than in other seasons, and emissions from the combined application were not statistically different from those of separately applied urine or N fertiliser (EF ranging from 0.72 to 0.83%, p < 0.05). The absence of a significant synergistic effect could be explained by weather conditions, particularly rainfall during the three days before and after application in autumn. This study implies that the interactive effects of N fertilisation and urine deposition, as well as the timing of application, on N2O emissions need to be taken into account in greenhouse gas emission inventories.
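For readers unfamiliar with the EF percentages quoted above, a fertiliser-induced emission factor of this kind is conventionally computed as the N2O-N emitted from a treated plot, minus that from a zero-N control, expressed as a percentage of the N applied. This is a generic sketch of that standard (IPCC-style) definition, not necessarily the study's exact protocol, and the numbers in the comment are hypothetical:

```python
def n2o_emission_factor(n2o_n_treated, n2o_n_control, n_applied):
    # EF (%) = (cumulative N2O-N from the treated plot
    #           - cumulative N2O-N from the control plot) / N applied,
    # expressed as a percentage. All inputs in the same units,
    # e.g. kg N ha-1.
    return 100.0 * (n2o_n_treated - n2o_n_control) / n_applied

# Hypothetical example: 1.74 kg N2O-N ha-1 from the treated plot,
# 1.00 kg ha-1 from the control, 100 kg N ha-1 applied -> EF ~ 0.74 %.
```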