The objective of the Food Safety department is to provide the science underpinning a total-chain, risk-based approach to food safety, focusing on microbial and chemical contaminants along the ‘farm to fork’ food chain.

Recent Submissions

  • Economic Assessment of Waterborne Outbreak of Cryptosporidiosis

    Chyzheuskaya, Aksana; Cormican, Martin; Srivinas, Raghavendra; O’Donovan, Diarmuid; Prendergast, Martina; O’Donoghue, Cathal; Morris, Dearbháile (Centers for Disease Control and Prevention (CDC), 2017-10)
    In 2007, a waterborne outbreak of Cryptosporidium hominis infection occurred in western Ireland, resulting in 242 laboratory-confirmed cases and an uncertain number of unconfirmed cases. A boil water notice was in place for 158 days, affecting 120,432 persons residing in the area, as well as businesses, visitors, and commuters. This outbreak represented the largest outbreak of cryptosporidiosis in Ireland. The purpose of this study was to evaluate the cost of this outbreak. We adopted a societal perspective in estimating costs associated with the outbreak. The economic cost estimate was based on totaling the direct and indirect costs incurred by public and private agencies, using 2007 figures. We estimate that the cost of the outbreak was >€19 million (≈€120,000/day of the outbreak). The US dollar equivalent, based on 2017 exchange rates, would be $22.44 million (≈$142,000/day of the outbreak). This study highlights the economic case for a safe drinking water supply.
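As a quick sanity check, the per-day figures quoted above follow directly from the totals and the 158-day boil water notice (a minimal sketch using only numbers given in the abstract):

```python
# Reproduce the per-day outbreak cost figures from the totals quoted above.
total_eur = 19_000_000     # estimated total cost (2007 euros)
total_usd = 22_440_000     # quoted US dollar equivalent
notice_days = 158          # duration of the boil water notice

eur_per_day = total_eur / notice_days   # ~120,000 EUR/day
usd_per_day = total_usd / notice_days   # ~142,000 USD/day
```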
  • The impact of environmental conditions on Campylobacter jejuni survival in broiler faeces and litter

    Smith, Shaun; Meade, Joseph; Gibbons, James; McGill, Kevina; Bolton, Declan; Whyte, Paul; Irish Department of Agriculture, Food and Marine; 11SF328 (PubMed Central, 2016-06-28)
    Introduction: Campylobacter jejuni is the leading bacterial food-borne pathogen within the European Union, and poultry meat is an important vehicle for its transmission to humans. However, there is limited knowledge about how this organism persists in broiler litter and faeces. The aim of this study was to assess the impact of a number of environmental parameters, such as temperature, humidity, and oxygen, on Campylobacter survival in both broiler litter and faeces. Materials and methods: Used litter was collected from a Campylobacter-negative broiler house after final depopulation, and fresh faeces were collected from transport crates. Samples were confirmed as Campylobacter negative according to modified ISO methods for veterinary samples. Both sample matrices were inoculated with 9 log10 CFU/ml C. jejuni and incubated under high (≥85%) and low (≤70%) relative humidity conditions at three different temperatures (20°C, 25°C, and 30°C) under both aerobic and microaerophilic atmospheres. Inoculated litter samples were then tested for Campylobacter concentrations at time zero and every 2 hours for 12 hours, while faecal samples were examined at time zero and every 24 hours for 120 hours. A two-tailed t-test assuming unequal variance was used to compare mean Campylobacter concentrations in samples under the various temperature, humidity, and atmospheric conditions. Results and discussion: C. jejuni survived significantly longer (P≤0.01) in faeces, with a minimum survival time of 48 hours, compared with 4 hours in used broiler litter. C. jejuni survival was significantly enhanced at 20°C in all environmental conditions in both sample matrices tested compared with survival at 25°C and 30°C. In general, survival was greater in microaerophilic compared with aerobic conditions in both sample matrices. Humidity, at the levels examined, did not appear to significantly impact C. jejuni survival in either sample matrix. The persistence of Campylobacter in broiler litter and faeces under various environmental conditions has implications for farm litter management, hygiene, and disinfection practices.
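The comparison of mean counts described above uses a two-tailed t-test with unequal variances (Welch's test). A minimal sketch of that calculation, using hypothetical log10 counts rather than the study's data, might look like:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Two-tailed t statistic assuming unequal variances (Welch),
    with Welch-Satterthwaite degrees of freedom."""
    m1, m2 = mean(a), mean(b)
    v1, v2 = variance(a), variance(b)          # sample variances
    se2 = v1 / len(a) + v2 / len(b)            # squared standard error
    t = (m1 - m2) / math.sqrt(se2)
    df = se2 ** 2 / ((v1 / len(a)) ** 2 / (len(a) - 1)
                     + (v2 / len(b)) ** 2 / (len(b) - 1))
    return t, df

# Hypothetical log10 CFU/g Campylobacter counts at 20 °C
faeces = [6.1, 5.9, 6.3, 6.0, 6.2]
litter = [4.2, 4.5, 3.9, 4.1, 4.3]
t, df = welch_t(faeces, litter)                # large t -> survival differs
```

Welch's test avoids the equal-variance assumption of Student's t-test, which matters here because the variability of counts in faeces and litter need not match.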
  • The impact of biosecurity and partial depopulation on Campylobacter prevalence in Irish broiler flocks with differing levels of hygiene and economic performance

    Smith, Shaun; Messam, Locksley L. McV.; Meade, Joseph; Gibbons, James; McGill, Kevina; Bolton, Declan; Whyte, Paul; Irish Department of Agriculture, Food and Marine; 11SF328 (PubMed, 2016-05-10)
    Background: Campylobacter jejuni is the leading bacterial food-borne pathogen within the European Union (EU), and poultry meat is the primary route for transmission to humans. Material and methods: This study examined the impact of partial depopulation (thinning), season, and farm performance (economic, hygiene, and biosecurity) on Campylobacter prevalence in Irish broilers over a 13-month period. Ten caecal samples were taken per flock, for a total of 211 flocks from 23 farms during the duration of the study. Campylobacter was isolated and enumerated according to modified published ISO methods for veterinary samples. Biosecurity was evaluated through a questionnaire based on risk factors for Campylobacter identified in previous studies. Hygiene compliance was assessed from audit records taken over the course of 1 year. All information relating to biosecurity and hygiene was obtained directly from the processing company. This was done to ensure farmers were unaware they were being monitored for Campylobacter prevalence and prevent changes to their behaviour. Results and discussion: Farms with high performance were found to have significantly lower Campylobacter prevalence at first depopulation compared with low-performance farms across all seasons (P≤0.01). Peak Campylobacter levels were observed during the summer season at first thin in both the high- and low-performance groups. Campylobacter prevalence was found to increase to ≥85% in both high- and low-performance farms across all seasons at final depopulation, suggesting that Campylobacter was introduced during the first depopulation. On low-performance farms, four biosecurity interventions were found to significantly reduce the odds of a flock being Campylobacter positive (physical step-over barrier OR=0.17, house-specific footwear OR=0.13, absence of water body within 0.5 km OR=0.13, two or more broiler houses on a farm OR=0.16), compared with farms without these interventions. 
For high-performance farms, no single biosecurity intervention was identified as significant, as this group had full compliance with multiple factors. High-performance farms had significantly better feed conversion ratios than low-performance farms (1.61 vs. 1.67; P≤0.01). No differences in flock mortality rates were observed (P≥0.05). These findings highlight the impact of season, biosecurity, partial depopulation, and farm performance on Campylobacter prevalence in Irish broilers.
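The odds ratios reported for the biosecurity interventions can be illustrated with a simple 2×2 contingency-table calculation; the flock counts below are hypothetical, chosen only to produce an OR near the reported 0.17:

```python
def odds_ratio(pos_with, neg_with, pos_without, neg_without):
    """Odds ratio of flock Campylobacter positivity with vs. without
    an intervention, from a 2x2 contingency table."""
    return (pos_with / neg_with) / (pos_without / neg_without)

# Hypothetical flock counts for a physical step-over barrier:
# 6 positive / 34 negative flocks with the barrier,
# 30 positive / 29 negative flocks without it.
or_barrier = odds_ratio(6, 34, 30, 29)   # OR < 1 means reduced odds
```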
  • Iron availability shapes the evolution of bacteriocin resistance in Pseudomonas aeruginosa

    Inglis, R Fredrik; Scanlan, Pauline; Buckling, Angus; AXA Research Fund; NERC; ERC; BBSRC (PubMed Central, 2016-02-23)
    The evolution of bacterial resistance to conventional antimicrobials is a widely documented phenomenon with gravely important consequences for public health. However, bacteria also produce a vast repertoire of natural antimicrobials, presumably in order to kill competing species. Bacteriocins are a common class of protein-based antimicrobials that have been shown to have an important role in the ecology and evolution of bacterial communities. Relative to the evolution of antibiotic resistance, little is known about how novel resistance to these toxic compounds evolves. In this study, we present results illustrating that, although resistance is able to evolve, it remains critically dependent on the environmental context. Resistance to bacteriocins, in particular the pyocin S2, evolves readily when iron is present but less so when iron is limiting, because the receptor for this pyocin is also required for iron uptake during iron limitation. This suggests that although resistance to bacteriocins can easily evolve, environmental conditions will determine how and when resistance occurs.
  • Bacteriocin production: a relatively unharnessed probiotic trait?

    Hegarty, James W.; Guinane, Caitriona M.; Paul Ross, R.; Hill, Colin; Cotter, Paul D.; Science Foundation Ireland; 11/P1/1137 (F1000 Research, 2016-10-27)
    Probiotics are “live microorganisms which, when consumed in adequate amounts, confer a health benefit to the host”. A number of attributes are highly sought after among these microorganisms, including immunomodulation, epithelial barrier maintenance, competitive exclusion, production of short-chain fatty acids, and bile salt metabolism. Bacteriocin production is also generally regarded as a probiotic trait, but it can be argued that, in contrast to other traits, it is often considered a feature that is desirable, rather than a key probiotic trait. As such, the true potential of these antimicrobials has yet to be realised.
  • Diversity of Survival Patterns among Escherichia coli O157:H7 Genotypes Subjected to Food-Related Stress Conditions

    Elhadidy, Mohamed; Álvarez-Ordóñez, Avelino; Science Foundation Ireland (SFI); 13/SIRG/2157 (Frontiers Media SA, 2016-03-15)
    The purpose of this study was to evaluate the resistance patterns to food-related stresses of Shiga toxin-producing Escherichia coli O157:H7 strains belonging to specific genotypes. A total of 33 E. coli O157:H7 strains were exposed to seven different stress conditions acting as potential selective pressures affecting the transmission of E. coli O157:H7 to humans through the food chain. These stress conditions included cold, oxidative, osmotic, acid, heat, freeze-thaw, and starvation stresses. The genotypes used for comparison included lineage-specific polymorphism, Shiga-toxin-encoding bacteriophage insertion sites, clade type, tir (A255T) polymorphism, Shiga toxin 2 subtype, and antiterminator Q gene allele. Bacterial resistance to different stressors was calculated by determining D-values (the times required for inactivation of 90% of the bacterial population), which were then subjected to univariate and multivariate analyses. In addition, a relative stress resistance value, integrating resistance values to all tested stressors, was calculated for each bacterial strain and allowed a ranking-type classification of E. coli O157:H7 strains according to their environmental robustness. Lineage I/II strains were found to be significantly more resistant to acid, cold, and starvation stress than lineage II strains. Similarly, tir (255T) and clade 8 encoding strains were significantly more resistant to acid, heat, cold, and starvation stress than tir (255A) and non-clade 8 strains. Principal component analysis, which allows grouping of strains with similar stress survival characteristics, separated strains of lineages I and I/II from strains of lineage II, which in general showed reduced survival abilities. The results obtained suggest that lineage I/II, tir (255T), and clade 8 strains, which have been previously reported to be more frequently associated with human disease cases, have greater multiple stress resistance than strains of other genotypes. The results of this study provide better insight into how selective pressures encountered through the food chain may play a role in the epidemiology of STEC O157:H7 by controlling the transmission of highly adapted strains to humans.
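A D-value of the kind used above is conventionally estimated from the slope of a log-linear survival curve. A minimal sketch with made-up survival data:

```python
def d_value(times, log_counts):
    """D-value (time for a 1-log10 reduction) from a least-squares fit
    of log10 counts against time: D = -1 / slope."""
    n = len(times)
    mt = sum(times) / n
    mc = sum(log_counts) / n
    slope = (sum((t - mt) * (c - mc) for t, c in zip(times, log_counts))
             / sum((t - mt) ** 2 for t in times))
    return -1.0 / slope

# Made-up acid-stress survival data: hours vs. log10 CFU/ml
hours = [0, 1, 2, 3, 4]
log_cfu = [8.0, 7.5, 7.0, 6.5, 6.0]    # 0.5 log10 lost per hour
D = d_value(hours, log_cfu)            # 2.0 h per log10 reduction
```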
  • Bacteriophage-based tools: recent advances and novel applications

    O'Sullivan, Lisa; Buttimer, Colin; McAuliffe, Olivia; Bolton, Declan; Coffey, Aidan; Teagasc Walsh Fellowship; 2013003 (F1000 Research Ltd, 2016-11-29)
    Bacteriophages (phages) are viruses that infect bacterial hosts, and since their discovery over a century ago they have been primarily exploited to control bacterial populations and to serve as tools in molecular biology. In this commentary, we highlight recent diverse advances in the field of phage research, going beyond bacterial control using whole phage, to areas including biocontrol using phage-derived enzybiotics, diagnostics, drug discovery, novel drug delivery systems and bionanotechnology.
  • Incorporation of commercially-derived antimicrobials into gelatin-based films and assessment of their antimicrobial activity and impact on physical film properties

    Clarke, David; Molinaro, Stefano; Tyuftin, Andrey; Bolton, Declan; Fanning, S.; Kerry, Joe P.; Department of Agriculture, Fisheries and Food, Ireland; 11/F/033 (Elsevier, 2016-06)
    Four antimicrobials, namely Articoat DLP 02 (AR), Artemix Consa 152/NL (AX), Auranta FV (AFV) and sodium octanoate (SO), were examined for their effectiveness, both before and after heat treatments, against bacterial strains Bacillus cereus, Pseudomonas fluorescens, Escherichia coli, Staphylococcus aureus and the microflora obtained from commercial beef steaks. Minimum inhibitory concentrations (MIC) of AR, AX, AFV and SO against these microbes were then obtained using the 96-well plate method. SO was the most effective against all bacterial strains, demonstrating the lowest MIC compared to the other antimicrobials. These antimicrobials were then successively incorporated into beef-derived gelatin films, and these films were subsequently tested for structural, mechanical and barrier properties. Significantly (p < 0.05) enhanced water vapour barrier properties were determined only for antimicrobial films containing AX or SO when compared to control films. On the basis of FTIR spectra, significant changes in the structure of SO-containing films were determined when compared with control gelatin films. It was shown that active antimicrobial agents could potentially serve as commercial antimicrobial coatings for application onto conventional plastic-based food packaging.
  • The pattern of Campylobacter contamination on broiler farms; external and internal sources

    Battersby, T.; Whyte, P.; Bolton, Declan; Department of Agriculture, Food and Marine (Ireland); 11/F/051 (Wiley, 2016-03-07)
    Aim: The aim of this study was to apply the most sensitive molecular techniques in combination with culture-based methods to characterize broiler farms in terms of the timeline (‘appearance’ and ‘pattern’) of Campylobacter contamination prior to and post detection in the birds. Methods and Results: Faecal and environmental samples were collected from three broiler farms (two flocks per farm). Real-time PCR was used to test for the presence of Campylobacter. Culture-based methods (enrichment and direct plating) were also applied, and isolates were subjected to a range of confirmatory tests before speciation (multiplex PCR). All flocks were colonized by Campylobacter before first thin and a similar pattern of Campylobacter contamination was observed: (day −1) a range of external and internal samples real-time PCR positive but culture negative; (day 0) chicks negative; (6–9 days pre-detection in the birds) internal samples (feeders, drinkers, barrier and/or bird weigh) culture positive; and (post broiler infection) increasing concentrations of Campylobacter in internal samples but also on the tarmac apron and anteroom. Conclusion: It was concluded that: (i) vertical transmission did not occur; (ii) the environment was a potential source of Campylobacter; (iii) testing areas frequented by all birds (e.g. feeders and drinkers) may offer an opportunity for early Campylobacter detection; and (iv) once the broilers are infected with Campylobacter, these bacteria are spread from the birds, through the anteroom, to the areas surrounding the broiler house, highlighting the need for improved biosecurity.
Significance and Impact of the Study: This study has established the pattern of Campylobacter contamination on broiler farms, identified an early detection opportunity, highlighted the need to better understand the role of viable but nonculturable Campylobacter in the ecology of Campylobacter on broiler farms and demonstrated the need for improved biosecurity to prevent the spread of Campylobacter from within the house to the surrounding environment.
  • Reagent Free Electrochemical-Based Detection of Silver Ions at Interdigitated Micro Electrodes Using in Situ pH Control

    Wasiewska, Luiza Adela; Seymour, Ian; Patella, Bernardo; Burgess, Catherine; Duffy, Geraldine; O'Riordan, Alan; Teagasc Walsh Scholarship; Science Foundation Ireland (SFI); Department of Agriculture, Food and Marine; 2016024; et al. (Elsevier, 2020-07-10)
    Herein we report on the development of an electrochemical sensor for silver ion detection in tap water using anodic sweep voltammetry with in-situ pH control, enabled by closely spaced interdigitated electrode arrays. The in-situ pH control approach allowed the pH of a test solution to be tailored to pH 3 (experimentally determined to be the optimal pH) by applying 1.65 V to a protonator electrode, with the subsequent production of protons, arising from water electrolysis, dropping the local pH value. Using this approach, an initial proof-of-concept study for silver detection in sodium acetate was undertaken where 1.25 V was applied during deposition (to compensate for oxygen production) and 1.65 V during stripping. Using these conditions, a calibration between 0.2 and 10 μM was established, with the silver stripping peak at ∼0.3 V. The calculated limit of detection was 13 nM. For the final application in tap water, 1.65 V was applied to a protonator electrode for both deposition and stripping of silver. The chloride ions present in tap water (a consequence of adding chlorine during the disinfection process) facilitated silver detection and caused the stripping peak to shift cathodically to ∼0.2 V. The combination of the complexation of silver ions with chloride and in-situ pH control resulted in a linear calibration range between 0.25 and 2 μM in tap water and a calculated limit of detection of 106 nM, without the need to add acid or supporting electrolytes.
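The reported limit of detection is consistent with the common 3σ convention, LOD = 3 × (standard deviation of the blank) / (calibration slope). The numbers below are illustrative assumptions chosen so the result lands near the reported 13 nM; they are not the paper's raw data:

```python
# 3-sigma limit of detection from a linear calibration (illustrative).
slope = 1.2e-3        # stripping peak current per nM of Ag+ (A/nM), assumed
s_blank = 5.2e-3      # standard deviation of the blank signal (A), assumed
lod_nM = 3 * s_blank / slope   # ~13 nM
```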
  • Modelling the effect of UV light at different wavelengths and treatment combinations on the inactivation of Campylobacter jejuni

    Soro, Arturo B.; Whyte, Paul; Bolton, Declan J.; Tiwari, Brijesh K.; Teagasc Walsh Fellowship program; Department of Agriculture, Food, and Marine (DAFM); DAFM/17/F/275 (Elsevier, 2021-05)
    Application of novel decontamination strategies, such as ultraviolet (UV) irradiation, is required to mitigate the risks associated with Campylobacter jejuni in food. This study evaluated the use of a light-emitting diode (LED) based technology to inactivate C. jejuni NCTC 11168 in Maximum Recovery Diluent (MRD) at wavelengths of 280, 300 and 365 nm, and combinations thereof. To assess the survival curves, two linear models (log-linear (LL) and linear with shoulder) and two non-linear models (Weibull and double Weibull) were fitted. UV exposures showed different antimicrobial effects; a combination of 280/300 nm was the most effective treatment, with a 4Dt value of 5 s observed in a bacterial suspension of 5 log CFU/mL. Moreover, the LL model was the most robust in describing the inactivation kinetics of Campylobacter exposed to UV, and therefore modelling tools could be applied to predict the efficiency of UV light in a model solution. Industrial relevance: Light-based technologies such as UV light are identified in the literature as potential alternatives for the decontamination of surfaces, liquids and solid foods. However, some of these techniques require further investigation. The present study evaluated the use of an LED system and the effect of combined wavelengths on the inactivation of Campylobacter through predictive modelling. This approach was able to predict and explain the inactivation kinetics of Campylobacter and could be key in scaling up UV light treatment at an industrial level.
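Under the log-linear (LL) model named above, log10 N(t) decreases linearly with time, so the time to any n-log10 reduction follows directly from the fitted rate constant. A sketch, back-calculating k from the reported 4Dt of 5 s:

```python
import math

def time_for_reduction(k, n_logs):
    """Time for an n-log10 reduction under log-linear kinetics,
    where N(t) = N0 * exp(-k * t)."""
    return n_logs * math.log(10) / k

# Back-calculate the first-order rate constant implied by 4Dt = 5 s
k = 4 * math.log(10) / 5.0           # s^-1
four_Dt = time_for_reduction(k, 4)   # recovers 5.0 s
one_Dt = time_for_reduction(k, 1)    # 1.25 s per log10 reduction
```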
  • Quantitative microbial human exposure model for faecal indicator bacteria and risk assessment of pathogenic Escherichia coli in surface runoff following application of dairy cattle slurry and co-digestate to grassland

    Nag, Rajat; Nolan, Stephen; O'Flaherty, Vincent; Fenton, Owen; Richards, Karl G.; Markey, Bryan K.; Whyte, Paul; Bolton, Declan; Cummins, Enda; Department of Agriculture, Food & Marine (DAFM), Ireland; et al. (Elsevier, 2021-12)
    Animal waste contains high numbers of microorganisms and can therefore present a potential biological threat to human health. During episodic rainfall events resulting in runoff, microorganisms in the waste and soil may migrate into surface runoff, contaminating surface water resources. A probabilistic human exposure (HE) model was created to determine exposure to faecal indicator bacteria (FIB): total coliforms (TC), E. coli and enterococci following application of bio-based fertiliser (dairy cattle slurry, digestate) to grassland, using a combination of experimental field results and literature-based data. This step was followed by a quantitative microbial risk assessment (QMRA) model for pathogenic E. coli based on a literature-based dose-response model. The results showed that the maximum daily HE (HEdaily) is associated with E. coli for unprocessed slurry (treatment T1) on day 1, the worst-case scenario, where the simulated mean HEdaily was calculated as 2.84 CFU day−1. The results indicate that the overall annual probability (Pannual) of illness from E. coli is very low or low based on the WHO safe limit for Pannual of 10−6. In the worst-case scenario, a moderate risk was estimated, with a simulated mean Pannual of 1.0 × 10−5. Unpasteurised digestate application showed low risk on days 1 and 2 (1.651 × 10−6 and 1.167 × 10−6, respectively). Pasteurised digestate showed very low risk in all scenarios. These results support the restriction imposed on applying bio-based fertiliser if rain is forecast within 48 h of the application time. This study proposes a future extension of the probabilistic model to include time-, intensity-, discharge-, and distance-dependent dilution factors. The information generated from this model can help policymakers ensure the safety of surface water sources through quality monitoring of FIB levels in bio-based fertiliser.
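The step from a daily to an annual probability of illness in a QMRA of this kind is typically P_annual = 1 − (1 − P_daily)^365. A one-line illustration, with P_daily chosen hypothetically so the result lands near the worst-case 1.0 × 10−5:

```python
# Aggregating a daily probability of illness to an annual probability.
# p_daily is a hypothetical value, chosen so p_annual lands near the
# worst-case 1.0e-5 figure reported in the abstract.
p_daily = 2.74e-8
p_annual = 1 - (1 - p_daily) ** 365   # ~1.0e-5 per person per year
```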
  • Risk assessment of Escherichia coli in bioaerosols generated following land application of farmyard slurry

    Nag, Rajat; Monahan, Ciaran; Whyte, Paul; Markey, Bryan K.; O'Flaherty, Vincent; Bolton, Declan; Fenton, Owen; Richards, Karl G.; Cummins, Enda; Department of Agriculture, Food and the Marine (DAFM), Ireland; et al. (Elsevier, 2021-10)
    Transfer of Escherichia coli in bioaerosols to humans during and shortly after the land application of farmyard slurry may pose a human health hazard, but this has not been extensively explored to date. The present study developed a quantitative risk assessment model for E. coli through the air exposure route. The probabilistic model assessed the predicted number of microorganisms in the air (PNair) to which humans may be exposed. A Gaussian air dispersion model was used to calculate the concentration of E. coli transmitted through aerosols. Human exposure (HE) to E. coli was estimated using a Monte Carlo simulation approach. This research predicted a mean HE of 26 CFU day−1 (95th percentile 263 CFU day−1) and suggests the importance of maintaining a distance of at least 100 m between land-spreading activities and the residential population. However, the simulated mean daily or annual (once-a-year application) risk of 2.65 × 10−7 person−1 year−1 due to land application of slurry indicates very low occupational risk for farmworkers not equipped with personal protective equipment (PPE) who are potentially exposed to E. coli indirectly. The model found that the decay constant of E. coli in air, the duration of decay, and the bio-aerosolisation efficiency factor were the three parameters with the greatest influence on HE to airborne E. coli. Furthermore, this research recommends an average time lag of at least 2.5 h following the application of farmyard slurry to the field before humans access the field again without PPE, allowing the airborne pathogen to decay and thereby ensuring occupational safety. The model suggested that the bio-aerosolisation efficiency factor (E) for other pathogens requires further investigation. The information generated from this model can help to assess likely exposure from bioaerosols triggered by land application of farmyard slurry.
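The recommended 2.5 h time lag rests on first-order decay of airborne cells, C(t) = C0·e^(−kt). A sketch with an assumed decay constant (not the study's fitted value), applied to the 95th-percentile exposure from the abstract:

```python
import math

# First-order decay of airborne E. coli after land spreading.
k = 0.03       # per-minute decay constant in air (assumed, illustrative)
t = 150        # recommended 2.5 h time lag, in minutes
c0 = 263       # 95th-percentile daily exposure (CFU) from the abstract
remaining = c0 * math.exp(-k * t)   # exposure left after the time lag
```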
  • A Bayesian inference approach to quantify average pathogen loads in farmyard manure and slurry using open-source Irish datasets

    Nag, Rajat; Markey, Bryan K.; Whyte, Paul; O'Flaherty, Vincent; Bolton, Declan; Fenton, Owen; Richards, Karl G.; Cummins, Enda; Department of Agriculture, Food and the Marine (DAFM), Ireland; 14/SF/847 (Elsevier, 2021-09)
    Farm-to-fork quantitative microbial risk assessments (QMRA) typically start with a preliminary estimate of the initial concentration (Cinitial) of microorganism loading at farm level, consisting of an initial estimate of prevalence (P) and the resulting pathogen levels in animal faeces. An average estimate of the initial concentration of pathogens can be achieved by combining P estimates in animal populations with the levels of pathogens in colonised animals' faeces and the resulting cumulative levels in herd farmyard manure and slurry (FYM&S). In the present study, 14 years of data were collated and assessed using a Bayesian inference loop to assess the likely P of pathogens. In this regard, historical and current survey data exist on P estimates for a number of pathogens, including Cryptosporidium parvum, Mycobacterium avium subspecies paratuberculosis (MAP), Salmonella spp., Clostridium spp., Campylobacter spp., pathogenic E. coli, and Listeria monocytogenes in several species (cattle, pigs, and sheep) in Ireland. The results revealed that Cryptosporidium spp. has potentially the highest mean P (Pmean) (25.93%), followed by MAP (15.68%) and Campylobacter spp. (8.80%) for cattle. The Pmean of E. coli is highest (7.42%) in pigs, while the Pmean of Clostridium spp. in sheep was estimated to be 7.94%. Cinitial values for Cryptosporidium spp., MAP, Salmonella spp., Clostridium spp., and Campylobacter spp. in cattle faeces were derived with averages of 2.69, 4.38, 4.24, 3.46, and 3.84 log10 MPN g−1, respectively. The average Cinitial of Cryptosporidium spp., Salmonella spp., Clostridium spp., and E. coli in pig slurry was estimated as 1.27, 3.12, 3.02, and 4.48 log10 MPN g−1, respectively. It was only possible to calculate the average Cinitial of Listeria monocytogenes in sheep manure, at 1.86 log10 MPN g−1. This study creates a basis on which future farm-to-fork risk assessment models can base initial pathogen loading values for animal faeces, enhancing risk assessment efforts.
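A Bayesian inference loop for prevalence of the kind described above can be sketched as a conjugate Beta-binomial update, where each survey's positives tighten the posterior; the survey counts below are hypothetical:

```python
# Conjugate Beta-binomial update for pathogen prevalence P.
a, b = 1.0, 1.0                # Beta(1, 1) uniform prior on P
surveys = [(26, 100), (31, 120), (22, 90)]   # hypothetical (positives, sampled)
for k, n in surveys:
    a, b = a + k, b + (n - k)  # posterior after each survey
p_mean = a / (a + b)           # posterior mean prevalence
```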
  • Evaluation of pathogen concentration in anaerobic digestate using a predictive modelling approach (ADRISK)

    Nag, Rajat; Auer, Agathe; Nolan, Stephen; Russell, Lauren; Markey, Bryan K.; Whyte, Paul; O'Flaherty, Vincent; Bolton, Declan; Fenton, Owen; Richards, Karl G.; et al. (Elsevier, 2021-12)
    Farmyard manure and slurry (FYM&S) is a valuable feedstock for anaerobic digestion (AD) plants. However, FYM&S may contain high concentrations of pathogens, and complete inactivation through the AD process is unlikely. Thus, following land application of digestate, pathogens may contaminate a range of environmental media, posing a potential threat to public health. The present study aimed to combine primary laboratory data with literature-based secondary data to develop an Excel-based exposure assessment model (ADRISK) using a gamma generalised linear model to predict the final microorganism count in the digestate. This research examines the behaviour of a suite of pathogens (Cryptosporidium parvum, norovirus, Mycobacterium spp., Salmonella spp., Listeria monocytogenes, Clostridium spp., and pathogenic Escherichia coli) and indicators (total coliforms, E. coli, and enterococci) during mesophilic anaerobic digestion (M-AD) at 37 °C, pre/post-AD pasteurisation, and after a period of storage (with/without lime) for different feedstock proportions (slurry:food waste: 0:1, 1:3, 2:1, and 3:1). ADRISK tool simulations of faecal indicator bacteria levels across all scenarios show that the digestate can meet the EU standard without pasteurisation if the AD runs at 37 °C or higher with a higher C:N ratio (recipe 3) and a hydraulic retention time ≥ 7 days. The storage of digestate also reduced levels of microorganisms in the digestate. The Irish pasteurisation process (60 °C for 4 days), although more energy-intensive, is more effective than the EU pasteurisation specification (70 °C for 1 h). Pre-AD pasteurisation was more effective for C. parvum, norovirus, and Mycobacterium thermoresistibile; however, based on the literature, post-AD pasteurisation is most likely to assure the safety of the digestate. The information generated from this model can inform policy-makers regarding the optimal M-AD process parameters necessary to maximise the inactivation of microorganisms, ensuring that adverse environmental impact is minimised and public health is protected.
  • An analysis of the spatio-temporal occurrence of anthelmintic veterinary drug residues in groundwater

    Mooney, D.; Richards, K.G.; Danaher, M.; Grant, J.; Gill, L.; Mellander, P.-E.; Coxon, C.E.; Science Foundation Ireland (SFI); European Regional Development Fund; 13/RC/2092 (Elsevier, 2021-05)
    Anthelmintics are antiparasitic drugs used to control helminthic parasites such as nematodes and trematodes in animals, particularly those exposed through pasture-based production systems. Even though anthelmintics have been shown to be excreted into the environment in relatively high amounts as unmetabolized drug or transformation products (TPs), there is still only limited information available on their environmental occurrence, particularly in groundwater, which has resulted in them being considered potential emerging contaminants of concern. A comprehensive study was carried out to investigate the occurrence of 40 anthelmintic residues (including 13 TPs) in groundwaters (and associated surface waters) throughout the Republic of Ireland. The study focused on investigating the occurrence of these contaminants in karst and fractured bedrock aquifers, with a total of 106 sites (88 groundwaters and 18 surface waters) sampled during spring 2017. Seventeen anthelmintic compounds, consisting of eight parent drugs and nine TPs, were detected at 22% of sites at concentrations up to 41 ng L−1. Albendazole and its TPs were the most frequently detected residues, found at 8% of groundwater sites and 28% of surface water sites. Multivariate statistical analysis identified several source and pathway factors as being significantly related to the occurrence of anthelmintics in groundwater; however, there was an evident localised effect which requires further investigation. An investigation of the temporal variations in occurrence over a 13-month period indicated a higher frequency and concentration of anthelmintics during February/March and again during August/September 2018, which coincided with periods of increased usage and intensive meteorological events. This work presents the first detections of these contaminants in Irish groundwater, and it contributes to broadening our understanding of anthelmintics in the environment. It also provides insight into seasonal trends in occurrence, which is critical for assessing potential future effects and implications of climate change.
  • The microbiology of beef from carcass chilling through primal storage to retail steaks

    McSharry, Siobhán; Koolman, Leonard; Whyte, Paul; Bolton, Declan; Meat Technology Ireland; Enterprise Ireland; Teagasc Walsh Scholarship scheme (Elsevier, 2021)
    The primary objective of this study was to investigate whether alternative time-temperature carcass chilling combinations resulted in lower microbial counts (TVC, Enterobacteriaceae, lactic acid bacteria, Pseudomonas spp. and Brochothrix thermosphacta) and, if so, whether the reduced levels would remain throughout the beef chain. Physicochemical characteristics (temperature, pH, water activity) were also recorded. A secondary objective was to investigate the effect of primal maturation periods (2 versus 5 weeks) on the sensory properties of steaks, assessed by a trained panel for colour, odour, tenderness, and flavour. While microbial populations were reduced by over 1 log10 CFU/cm2 by fast carcass chilling, these reductions were lost due to cross-contamination in the boning hall and cutting room. The pH and water activity remained stable throughout the study, and there was no significant difference in colour or sensory characteristics of retail steaks from the different treatment groups. It was concluded that there was no improvement in the microbial shelf-life of retail steaks from modified chilled carcasses or in the sensory shelf-life of primals aged for an extended period.
  • An assessment of contamination fingerprinting techniques for determining the impact of domestic wastewater treatment systems on private well supplies

    Fennell, Chris; Misstear, Bruce; O’Connell, David; Dubber, Donata; Behan, Patrice; Danaher, Martin; Moloney, Mary; Gill, Laurence; Irish Environmental Protection Agency; Irish Research Council; et al. (Elsevier, 2021-01-01)
    Private wells in Ireland and elsewhere have been shown to be prone to microbial contamination, with the main suspected sources being practices associated with agriculture and domestic wastewater treatment systems (DWWTS). While the microbial quality of private well water is commonly assessed using faecal indicator bacteria such as Escherichia coli, these organisms are not usually source-specific and therefore cannot definitively identify the origin of the contamination. This research assessed a range of chemical contamination fingerprinting techniques (ionic ratios, artificial sweeteners, caffeine, fluorescent whitening compounds, faecal sterol profiles and pharmaceuticals) for their ability to apportion contamination of private wells between human wastewater and animal husbandry wastes in rural areas of Ireland. A one-off sampling and analysis campaign of 212 private wells found that 15% were contaminated with E. coli. More extensive monitoring of 24 selected wells found 58% to be contaminated with E. coli on at least one occasion over a 14-month period. The application of fingerprinting techniques to these monitored wells found that chloride/bromide and potassium/sodium ratios are useful low-cost indicators capable of identifying impacts from human wastewater and organic agricultural contamination, respectively. The artificial sweetener acesulfame was detected on several occasions in a number of monitored wells, indicating its conservative nature and potential use as a fingerprinting technique for human wastewater. However, neither fluorescent whitening compounds nor caffeine were detected in any wells, and faecal sterol profiles proved inconclusive, suggesting limited suitability for the conditions investigated.
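    The ionic-ratio screening described above can be sketched as a simple classification rule. The sketch below is illustrative only: the threshold values and the helper function are hypothetical placeholders, not the thresholds used in the study.

    ```python
    # Illustrative sketch of ionic-ratio fingerprinting: elevated Cl/Br
    # suggests human wastewater influence, elevated K/Na suggests organic
    # agricultural waste. Thresholds below are placeholders, not the
    # study's values.

    def classify_well(chloride_mg_l, bromide_mg_l, potassium_mg_l, sodium_mg_l,
                      cl_br_threshold=400.0, k_na_threshold=0.4):
        """Return a list of suspected contamination sources for one well."""
        flags = []
        if bromide_mg_l > 0 and chloride_mg_l / bromide_mg_l > cl_br_threshold:
            flags.append("human wastewater (elevated Cl/Br)")
        if sodium_mg_l > 0 and potassium_mg_l / sodium_mg_l > k_na_threshold:
            flags.append("organic agricultural waste (elevated K/Na)")
        return flags or ["no ionic-ratio flag"]

    # A well with Cl/Br = 900 and K/Na = 0.6 trips both flags:
    print(classify_well(45.0, 0.05, 12.0, 20.0))
    ```

    In practice such ratio-based screening is only a first pass; the study pairs it with source-specific markers such as acesulfame precisely because ionic ratios alone cannot confirm an origin.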
  • The impact of key processing stages and flock variables on the prevalence and levels of Campylobacter on broiler carcasses

    Emanowicz, Malgorzata; Meade, Joseph; Bolton, Declan; Golden, Olwen; Gutierrez, Montserrat; Byrne, William; Egan, John; Lynch, Helen; O'Connor, Lisa; Coffey, Aidan; et al. (Elsevier, 2021-05)
    This study examined the impact of key processing stages and flock variables on the prevalence and levels of Campylobacter on broiler carcasses. Overall, the prevalence of Campylobacter was 62% in caeca, and 68%, 65% and 62% in neck skin samples collected after evisceration, final wash and carcass chilling, respectively. In first thin broiler batches, Campylobacter was found in 32% of caeca, and in 52%, 40% and 32% of neck skin samples collected after evisceration, final wash and carcass chilling, respectively. Final thin broiler batches were more frequently contaminated, with prevalences of 83% in caeca, 80% in neck skin samples collected after evisceration, and 83% in neck skin samples collected after both the final wash and carcass chilling stages (p < 0.05). Thinning status had a significant effect on Campylobacter counts, with significantly higher counts observed in samples from final thin batches (p < 0.05). The highest Campylobacter concentrations in neck skin samples were observed at the evisceration stage, with counts ranging from 2.0 to 3.8 log10 CFU/g and 2.3 to 4.8 log10 CFU/g in first and final thin batches, respectively. All first thin samples had counts below the European Union (EU) Process Hygiene Criterion threshold of 3 log10 CFU/g after chilling, while 52% of final thin batches had counts above this limit.
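    The 3 log10 CFU/g threshold mentioned above is the EU Process Hygiene Criterion for Campylobacter on broiler neck skin after chilling, i.e. 1,000 CFU/g. A minimal sketch of the threshold check, with a hypothetical helper function and illustrative counts (not the study's data):

    ```python
    import math

    # EU Process Hygiene Criterion for Campylobacter on broiler carcasses
    # after chilling: 1000 CFU/g, i.e. 3.0 in log10 units.
    PHC_LIMIT_LOG10 = 3.0

    def exceeds_phc(cfu_per_g):
        """True if a neck-skin count exceeds the 3 log10 CFU/g criterion."""
        return math.log10(cfu_per_g) > PHC_LIMIT_LOG10

    print(exceeds_phc(500))    # ~2.7 log10 CFU/g → False (within criterion)
    print(exceeds_phc(6300))   # ~3.8 log10 CFU/g → True (exceeds criterion)
    ```

    Framed this way, the study's finding is that every first thin sample returned False after chilling, while 52% of final thin batches returned True.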
  • Landspreading with co-digested cattle slurry, with or without pasteurisation, as a mitigation strategy against pathogen, nutrient and metal contamination associated with untreated slurry

    Nolan, S.; Thorn, C.E.; Ashekuzzaman, S.M.; Kavanagh, I.; Nag, R.; Bolton, D.; Cummins, E.; O'Flaherty, V.; Abram, F.; Richards, K.; et al. (Elsevier, 2020-11)
    North Atlantic European grassland systems are characterised by low nutrient use efficiency and high rainfall. These grasslands are typically amended with unprocessed slurry, which counteracts soil organic matter depletion and provides essential plant micronutrients, but which can be mobilised during rainfall events, thereby contributing to incidental losses of pathogens, nutrients and metals. Co-digesting slurry with waste from food processing mitigates agriculture-associated environmental impacts but may alter microbial, nutrient and metal profiles and hence their transmission to watercourses, soil persistence, grass yield and uptake. The impact of EU and alternative pasteurisation regimes on the transmission potential of these various pollutants is not clearly understood, particularly in pasture-based agricultural systems. This study used simulated rainfall (Amsterdam drip-type) at a high intensity indicative of a worst-case scenario (~11 mm hr−1), applied to plots 1, 2, 15 and 30 days after grassland application of slurry, unpasteurised digestate, pasteurised digestate (two conditions) and untreated controls. Runoff and soil samples were collected following rainfall simulation and analysed for a suite of potential pollutants, including bacteria, nutrients and metals. Grass samples were collected for three months following application to assess yield as well as nutrient and metal uptake. For each environmental parameter tested (microbial, nutrient and metal runoff losses; accumulation in soil; and uptake in grass), digestate from anaerobic co-digestion of slurry with food processing waste resulted in lower pollution potential than traditional landspreading of untreated slurry. Reduced microbial runoff was the most prominent advantage of digestate application, and pasteurisation of the digestate further augmented these environmental benefits without impacting grass output. Anaerobic co-digestion of slurry is therefore a multi-beneficial, circular approach to reducing the environmental impacts of livestock production.
