• Recreational demand modelling for agricultural resources

      Hynes, Stephen (Teagasc, 2007-07-31)
      In the last decade the demand for rural recreation has increased in Ireland as the population has become increasingly urbanised. Increased affluence, mobility and changing values have also brought new demands with respect to landscape, conservation, heritage and recreation, with a greater emphasis on consumption demands for goods and services in rural areas. This project’s contribution to the understanding of outdoor recreational pursuits in Ireland is based on the estimation of the first recreation demand functions for farm commonage walking, small-scale forestry recreation and whitewater kayaking, all popular activities that take place in Irish rural space. We use this empirical work to investigate the more general conflict between countryside recreational pursuits and farming activity. Through the estimation of travel cost models, the study derives the mean willingness to pay of outdoor enthusiasts using small-scale forestry sites in Co. Galway, farm commonage in Connemara and the Roughty river for kayaking recreation in Co. Kerry. An estimate of the gross economic value of the sites as recreational resources was also derived. The results indicate the high value of Irish farmland (and the Irish rural countryside in general) from a recreational amenity perspective. The project lasted approximately two years and was completed on time (31st July 2007).
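      In travel cost models of the kind this project estimated, trip demand is typically modelled as log-linear in travel cost, and per-trip consumer surplus follows directly from the travel-cost coefficient. The sketch below illustrates that standard relationship only; the coefficient and trip figures are hypothetical, not the report's estimates.

```python
# Minimal travel-cost sketch (illustrative values, not the report's
# estimates). In a log-linear trip-demand model
#   E[trips] = exp(a + b * travel_cost),  b < 0,
# the expected consumer surplus (willingness to pay) per trip is -1/b.

def consumer_surplus_per_trip(b_travel_cost: float) -> float:
    """Per-trip consumer surplus for a log-linear trip demand model."""
    if b_travel_cost >= 0:
        raise ValueError("travel-cost coefficient must be negative")
    return -1.0 / b_travel_cost

def gross_site_value(b_travel_cost: float, total_annual_trips: float) -> float:
    """Aggregate annual recreational value: per-trip surplus x total trips."""
    return consumer_surplus_per_trip(b_travel_cost) * total_annual_trips

# Hypothetical coefficient of -0.05 per euro => 20 euro surplus per trip.
cs = consumer_surplus_per_trip(-0.05)
print(cs)                               # 20.0
print(gross_site_value(-0.05, 10_000))  # 200000.0
```

Scaling the per-trip surplus by estimated annual visits is how a gross economic value for a site, of the kind reported above, is usually obtained.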
    • Reduced Fungicide Inputs in Winter Wheat

      Dunne, B. (Teagasc, 1998-09-01)
      Nine trials were conducted over three years at three sites to evaluate the biological efficacy of reduced rates of various fungicide products in controlling stem, foliar and ear diseases of winter wheat, as well as their effects on yield and grain quality, and to compare the relative profitability of full and reduced rates of fungicides. The results show that the use of half rates can give an economic benefit over full rates in many situations. In circumstances where variety or seasonal factors resulted in low to moderate foliar disease pressure, the use of half rates gave similar yields to full rates. Where foliar disease pressure was high, half rates generally gave lower yields than full rates, but the size of the reduction varied with the fungicide product used. The use of spray additives improved the yield response of the half-rate treatments in most cases. Disease levels (septoria) were higher in treatments where half rates were used, compared with the corresponding full rates, but the use of spray additives improved disease control in the half-rate treatments. The timing of spray applications is critical when half rates of fungicides are being used, and reduced-rate treatments need to be applied more frequently. In these trials reduced-rate treatments were applied as a three-spray programme rather than the conventional two-spray programme.
    • Reduced Herbicide Inputs in Cereals

      Mitchell, B.J. (Teagasc, 1998-09-01)
      The objective of this project was to examine if herbicides used in cereals at rates lower than recommended by the manufacturer (reduced rates) would give acceptable weed control resulting in lower crop production costs. Field trials with a number of herbicides at full and reduced rates were carried out in winter barley, winter wheat and spring barley in 1994-1996. Herbicides used at recommended rates gave the highest and most consistent levels of weed control. Herbicides used at 50% of the recommended rates gave slightly lower levels of weed control than the recommended rates but did not result in lower yields. While rates lower than 50% gave about 70% control of weeds, grain yield was reduced in some trials. Reduced rates gave higher weed control in barley than in wheat. The level of weed control was influenced by weed species and the growth stages of the weeds at the time of herbicide spraying. Thus selection of herbicides and their rates of application should be field specific. The findings show that it is possible to reduce the amount of herbicides used in cereals with considerable cost savings and reduced risk of herbicide residues in grain, soil and water.
    • Reducing The Cost of Beef Production by Increasing Silage Intake.

      O'Kiely, Padraig; Moloney, Aidan P; O'Riordan, Edward G. (Teagasc, 2002-12-01)
      Grass silage must support the predictable, consistent and profitable production of quality animal produce within environmentally sustainable farming systems. This can be quite a challenge for a crop that is so strongly influenced by the prevailing variable weather conditions, and the many interactions of the latter with farm management practices. Research and scientific progress must therefore continue to provide improved technologies if grass silage is to fulfil the above requirements. Yield, quality (including effects on intake, feed conversion efficiency, growth, meat quality, etc.), conservation losses, inputs and eligibility for EU financial supports determine the cost of providing cattle with silage, and this can have a major impact on the cost of producing milk or beef. Consequently, there has been an emphasis in the research reported here to add new information to the existing framework of knowledge on these factors.
    • Reducing the incidence of boar taint in Irish pigs

      Allen, Paul; Joseph, Robin; Lynch, Brendan (Teagasc, 2001-04)
      Boar taint is an unpleasant odour that is released during cooking from some pork and products made from the meat and fat of non-castrated male pigs. Only a proportion of boars produce this odour and not all consumers are sensitive to it. Nevertheless it is a potential problem for the industry since an unpleasant experience can mean that a sensitive consumer may not purchase pork or pork products again. Some European countries are very concerned about this problem and most castrate all the male pigs not required for breeding. Irish pig producers ceased castration more than 20 years ago because boars are more efficient converters of feed into lean meat and a research study had shown that boar taint was not a problem at the carcass weights used in this country at that time.
    • Reducing the nitrate content of protected lettuce.

      Byrne, C.; Maher, J.; Hennerty, J.; Mahon, J.; Walshe, A. (Teagasc, 2001-03-01)
      A research project was carried out jointly between Teagasc, Kinsealy Research Centre and University College Dublin, Department of Crop Science, Horticulture and Forestry which studied the effects of cultivar, nitrogen fertilisation and light intensity on the nitrate content of protected butterhead lettuce. In a series of cultivar trials of winter and summer butterhead lettuce, significant differences in the nitrate content of the lettuce between cultivars were found only in one experiment. In this instance, the differences were not consistent between successive harvests. It was concluded that screening lettuce cultivars for tissue nitrate level is unlikely to contribute to an overall reduction of nitrate levels. The application of N in a liquid feed throughout the cropping period resulted in higher nitrate levels in lettuce plants grown in soil-filled containers compared with a similar amount of N applied to the soil before planting. Withdrawing N for the final 10 days of the cropping period did not affect the nitrate content of the lettuce. In an experiment studying the effects of nitrogen source and rate on lettuce grown in containers, the use of calcium cyanamide as a N source resulted in lower nitrate levels in the lettuce and gave a reduced head weight compared with calcium ammonium nitrate (CAN) or ammonium sulphate. Increasing the rate of CAN or ammonium sulphate gave higher lettuce nitrate levels. A nitrification inhibitor reduced soil nitrate levels, especially with sulphate of ammonia as the N source, but did not affect the plant nitrate levels significantly. The addition of chloride to the soil reduced nitrate levels in the lettuce. In a further fertilisation study using containers, calcium cyanamide again resulted in lower plant nitrate levels than CAN. Increasing the rate of CAN increased soil nitrate levels, lettuce head weight and plant nitrate levels.
The relationship between soil nitrate levels, lettuce head weight and plant nitrate level indicates that the level of 100-150 mg·L⁻¹ of nitrate N in the soil, advocated in the Code of Good Practice, is a compromise between maximising plant growth and minimising lettuce nitrate content. A comparison between CAN and calcium cyanamide in a border soil experiment again showed that the latter N source resulted in lower lettuce nitrate levels. In this experiment the addition of chloride to the soil did not affect plant nitrate levels. Lettuce was grown, in late summer, in small tunnels using a range of polyethylene cladding materials. Head weight correlated well with the overall light transmission of the materials. Under one material that had low light transmission, the nitrate content of the lettuce was double that of lettuce grown under the materials with high light transmission. Under both winter and summer conditions, the nitrate content of lettuce heads was not influenced by the time of day at which harvest took place. In experiments in which multiple harvests were carried out there was no consistent trend in nitrate content as the heads developed and matured. Within individual heads of lettuce there was a steep concentration gradient, with the older outer leaves having much higher concentrations of nitrate than the younger inner leaves. Herbicides commonly used in protected lettuce production did not influence the nitrate content of the lettuce.
    • Reducing the seasonality of prime lamb production

      Grennan, Eamonn J. (Teagasc, 1998-10-01)
      Lambing part of the national lowland flock in April to late May has the potential to reduce the seasonality of supply and extend the season for prime young lamb. This would, potentially, enhance the ability to maintain and increase market share for Irish lamb. A farmlet system was operated over two years, with some 50 ewes on 4 ha of pasture. The objectives were: to assess the overall performance of a flock lambing in mid to late April; to monitor lamb growth rate and drafting patterns; to determine the changes in feed demand over the season; and to identify any savings in feed costs, and any difficulties that may arise with late lambing. The feed demand over the grazing season differs from that of normal March lambing. A grass surplus tends to occur in April/May and a deficit in November/December, and this imbalance between supply and demand increases if lambing is in late May. The balance between feed demand and supply may be more easily achieved where sheep are combined with cattle or tillage. Results show that a late-lambing flock can be managed successfully on an all-grass farm. If lambing takes place from mid-April to late May, some lambs will finish off pasture in September/October. The remainder can be finished indoors on silage with concentrate supplementation for sale from October to February. Lambing from mid-April onwards allows ewes to be at pasture for 4 to 6 weeks pre-lambing, and concentrate feeding to ewes pre- or post-lambing should not be necessary. However, this saving on concentrate input is offset by the need for concentrates to finish lambs. Lamb growth rate on pasture to weaning will be somewhat lower than with March lambing, due to deterioration in pasture digestibility in mid-season. A high standard of grassland management is critical to keep pastures leafy, in order to achieve high lamb growth rates pre- and post-weaning. Profitability will depend on supplying niche markets with younger lambs at premium prices.
    • Reducing windthrow losses in Farm Forestry

      Mulqueen, J.; McHale, J.; Rodgers, M. (Teagasc, 1999-05-01)
      The study comprised a field and laboratory investigation on the stability of Sitka spruce trees planted on a surface water gley. The field-testing was conducted at Ballyfarnon Forest in County Sligo in the north west of Ireland. Nine destructive monotonic pulling tests were conducted on trees selected from three different site preparations, namely, mole drained, double mouldboard ploughed and an uncultivated control. Dynamic testing, using a mechanical rocking device, was performed on a tree selected from the uncultivated control. A simple shear apparatus was used to conduct monotonic and cyclic tests on reconstituted samples of the Ballyfarnon soil. This allowed a comparison of soil behaviour under monotonic and cyclic loading. A computer software package was used to model the behaviour of groundwater for soil mole drained at two drain spacings. Results from this mathematical modelling were compared to experimental data gathered during a previous study. Results indicate that the use of mole drainage as a site preparation technique produces more stable trees than either double mouldboard ploughing or no cultivation.
    • Regional images and the promotion of quality food products

      McIntyre, Bridin; Henchion, Maeve; Pitts, Eamonn; European Commission; CT96 1827 (Teagasc, 2001-02)
      This research was undertaken as part of the RIPPLE (Regional images and the promotion of quality products and services in the lagging regions of the European Union) project, funded within the FAIR Programme (1994-1999). The project objective was to assist public and private institutions develop strategies, policies and structures to aid the successful marketing of quality products in the lagging regions of the EU. The project also sought to provide consumer perspectives on the issue of regional quality products using survey research.
    • Relating starch properties to boiled potato texture

      Gormley, Ronan T.; Department of Agriculture, Food and the Marine (Teagasc, Ballsbridge, Dublin 4, 1998-08)
      Basic information on starch properties may help to explain the different textural characteristics of potato cultivars, and also their suitability for different forms of processing. The study involved tests on both raw potatoes, and on starch separated from potatoes, and embraced three main activities: (i) to relate boiled-potato texture with the other test variables; (ii) to develop a rapid crush-test for assessing cooked-potato texture; (iii) to study the effect of chilling and freezing on the development of resistant starch (RS) in boiled potatoes.
    • The relationship between various live animal scores/measurements and carcass classification for conformation and fatness with meat yield and distribution, and ultimate carcass value.

      Drennan, Michael J; McGee, Michael; Conroy, S.B.; Keane, Michael G.; Kenny, David A.; Berry, Donagh (Teagasc, 2008-01-01)
      The primary objectives of this study were to: (1) determine the relationship of live animal muscular and skeletal scores, ultrasonically scanned muscle and fat depth measurements of the m. longissimus dorsi, and carcass conformation and fat scores with kill-out proportion, carcass composition and value; and (2) develop and test the accuracy of prediction equations for carcass meat, fat and bone proportions, derived from carcass conformation and fat scores, and develop prediction equations for total carcass composition from hind-quarter composition.
    • Relative Tissue Growth Patterns and Carcass Composition in Beef Cattle

      Keane, Michael G. (Teagasc, 2011-03-01)
      The main objective of the beef breed evaluation programme carried out at Grange Beef Research Centre was to compare the productive characteristics of different beef breed crosses out of Holstein-Friesian cows. In the course of this work much additional information was acquired, particularly on growth patterns of body organs and tissues, and how these affect kill-out proportion and carcass composition. The data were also used to examine relationships between carcass classification variables and carcass composition. Cattle used for beef production in Ireland can be classified into three main biological types: (i) early maturing, (ii) dairy, and (iii) late maturing. Results from an experiment that compared Friesians (dairy), Hereford × Friesians (early maturing) and Charolais × Friesians (late maturing) are used to represent these biological types. The material is organized under the following headings: (i) non carcass parts and kill-out proportion, (ii) carcass composition, (iii) carcass tissue distribution, (iv) muscle chemical composition, (v) gender, (vi) dairy breeds, and (vii) carcass classification and composition.
    • Repeated low rate herbicide applications for weed control in scallions.

      Murphy, Richard F.; Marren, Peter (Teagasc, 1998-09-01)
      The main objective of this investigation was to determine if satisfactory weed control in vegetable crops (scallions) could be achieved, and overall chemical use reduced, by repeated very-low-dose applications of contact and contact-residual herbicides. The trials showed that scallion crops could tolerate certain post-emergence herbicides better than weed populations earlier in their life cycle, provided the first true leaf of the crop had reached 1.5-2 cm in length. Various combinations or cocktails of cyanazine, oxyfluorfen, ioxynil and gesagard were tested, both at Kinsealy and at several commercial locations in Co. Dublin, at very low rates varying from 17 to 70 g/ha, and these were compared with the current standard post-emergence recommendation of single-rate ioxynil at 0.4-0.7 kg/ha. Each of the cocktail combinations, apart from ioxynil plus cyanazine, produced highly satisfactory weed control over a wide range of conditions when repeated at 7-12 day intervals commencing when the first true leaf averaged 2 cm long. The most effective and satisfactory weed control was achieved from either the oxyfluorfen (17-35 g/ha) plus cyanazine (35-70 g/ha) or the ioxynil (17-35 g/ha) plus gesagard (17-35 g/ha) combinations. These matched the weed control given by the standard recommended treatment of ioxynil, with the advantage of a reduction of up to 50% in chemical usage.
    • Replacement strategies to maximise profitability in dairying.

      Crosse, Seamus; Haran, P.; Killen, L. (Teagasc, 1998-01-01)
      The overall objective of this project was to develop a dynamic model to determine the optimum replacement rates for dairy herds under an Irish system of production. A model was designed and applied to the Irish dairy replacement problem, incorporating production, fertility, calving interval, seasonality, month of calving and various economic factors into the decision-making process. The output from the hierarchic model is a series of rankings. The dynamic programming approach can inform a farmer which cows in the herd should be replaced, on the basis that a replacement heifer (and its future successors) is expected to be more profitable than the current cow. The optimum replacement rate from this analysis was 17.8%.
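      The core comparison in any replacement model of this kind is between the discounted expected margin from keeping the current cow and that from replacing her now. The sketch below shows that comparison only; it is not the report's hierarchic/dynamic-programming model, and all figures are hypothetical.

```python
# Toy dairy-cow replacement rule (a sketch of the idea, not the report's
# model; all figures are hypothetical). Replace the cow now if the
# discounted margin from a replacement heifer, net of the replacement
# cost, exceeds the discounted margin from keeping the current cow for
# her remaining productive years.

def present_value(annual_margins, discount_rate=0.05):
    """Discounted sum of a stream of annual margins (year 1, 2, ...)."""
    return sum(m / (1 + discount_rate) ** (t + 1)
               for t, m in enumerate(annual_margins))

def should_replace(cow_margins, heifer_margins, replacement_cost,
                   discount_rate=0.05):
    """True if replacing the cow now is expected to be more profitable."""
    keep = present_value(cow_margins, discount_rate)
    replace = present_value(heifer_margins, discount_rate) - replacement_cost
    return replace > keep

# A declining cow with 2 productive years left vs a heifer with 5,
# and a net replacement cost of 700 -- hypothetical numbers.
print(should_replace(cow_margins=[600, 400],
                     heifer_margins=[500, 700, 800, 800, 700],
                     replacement_cost=700))
```

Applying such a rule across every cow in the herd each year is what yields an optimum replacement rate of the kind reported above.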
    • Risk Analysis and Stochastic Modelling of Agriculture

      Thorne, Fiona; Hennessy, Thia (Teagasc, 2007-01-01)
      This project analysed the role of risk in farmers’ production decisions and the impact of policy changes on risk in agricultural production. A stochastic budgetary farm-level model was developed using Irish National Farm Survey data and FAPRI-Ireland projections, and was used to examine the varying level of farmers’ exposure to risk under different policy regimes. Results showed that under the MacSharry and Agenda 2000 regimes of agricultural policy the major incentive for profit-maximising farmers to engage in production was to qualify for direct income support. Direct payments were a relatively risk-free source of income and therefore risk played only a minor role in the production decision. The results showed that farmers would be exposed to more risk under decoupling: the return to production post-decoupling is market based only, as the direct payment is no longer linked to production, and is therefore more exposed to price and production risk. The stochastic budgetary model, which accounts for price and production risk, was used to estimate the economic trade-off between “entitlement farming”, that is retaining farm land only to claim payments and not produce any tangible products, and conventional farming. The results showed that for less efficient farms the probability of achieving a significantly higher profit by engaging in entitlement farming is 46 percent, while further analysis shows that there is a 9 percent probability that profits from conventional farming systems would be only marginally higher than the entitlement farming option.
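      The entitlement-versus-conventional comparison described above is essentially a Monte Carlo exercise: the decoupled payment is near risk-free, while conventional farming adds a risky market margin. The sketch below illustrates that structure with invented distributions; it does not use FAPRI-Ireland or National Farm Survey parameters.

```python
# Monte Carlo sketch of the "entitlement vs conventional farming"
# trade-off (illustrative distributions only, not the report's data).
# Entitlement farming earns the decoupled payment at little risk;
# conventional farming adds a risky market margin net of costs.
import random

random.seed(42)

def simulate_profits(n_draws, payment, margin_mean, margin_sd, costs):
    """Return (entitlement, conventional) profit draws under price risk."""
    entitlement = [payment for _ in range(n_draws)]
    conventional = [payment + random.gauss(margin_mean, margin_sd) - costs
                    for _ in range(n_draws)]
    return entitlement, conventional

ent, conv = simulate_profits(n_draws=10_000, payment=10_000,
                             margin_mean=6_000, margin_sd=4_000,
                             costs=5_000)
# Estimated probability that conventional farming beats entitlement
# farming, taken over the simulated price draws.
p_conv_better = sum(c > e for c, e in zip(conv, ent)) / len(ent)
print(round(p_conv_better, 2))
```

Probabilities such as the 46 percent and 9 percent figures quoted above are read off simulated profit distributions in exactly this way, though from far richer stochastic models.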
    • A risk assessment and hazard analysis and critical control point (HACCP) study for the Irish catering industry

      Bolton, Declan; Meally, Aisling; Downey, Gerard; Safefood (Teagasc, 2007-02)
      This report provides details of a food safety knowledge survey, a microbiological survey, a chilled temperature survey and an audit conducted in 200 restaurants throughout the island of Ireland. The results suggest a low incidence of several bacterial pathogens (including Salmonella enterica) and identify areas in which food safety knowledge, procedures and practices should be improved. Salmonella enterica isolates were characterised and the results suggested distinct pockets of different serotypes. Growth curves for L. monocytogenes isolates suggest considerably reduced shelf-life for a variety of foods. For example, lettuce should not be stored at room temperature, as this reduces its shelf-life from 6.5 days (chilled storage) to 3.3 days. The predicted shelf-life for fresh milk was 4.5 days (chilled storage). Chlorine (sodium hypochlorite, 5 ppm), 1-monolauroyl-rac-glycerol and a laurate ester (ester-glucoside laurate) were also tested for application as vegetable decontaminating agents in restaurant kitchens. The report concludes with recommendations for improved food safety and hygiene in Irish restaurants.
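      Shelf-life predictions like those above come from growth curves: shelf-life is the time for a microbial count to rise from its initial level to the maximum acceptable level at a given storage temperature. The sketch below uses a simple log-linear growth model with invented rates, not the report's fitted curves.

```python
# Shelf-life sketch from a simple log-linear growth model (illustrative
# rates, not the report's fitted growth curves). Shelf-life is the time
# for the log10 count to grow from an initial level to a maximum
# acceptable level at a given storage temperature.

def shelf_life_days(n0, n_max, mu_per_day):
    """Days for the log10 count to grow from n0 to n_max at specific
    growth rate mu (log10 units/day), assuming log-linear growth."""
    return (n_max - n0) / mu_per_day

# Hypothetical rates: growth roughly twice as fast at room temperature
# as under chilled storage, roughly halving the shelf-life.
chilled = shelf_life_days(n0=2.0, n_max=6.0, mu_per_day=0.6)
ambient = shelf_life_days(n0=2.0, n_max=6.0, mu_per_day=1.2)
print(round(chilled, 2), round(ambient, 2))
```

Because the growth rate appears in the denominator, faster growth at warmer temperatures shortens shelf-life proportionally, which is why chilled storage figures dominate the predictions quoted above.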
    • Risk-based determination of critical control points for pork slaughter

      Bolton, Declan; Pearce, Rachel; Sheridan, James J.; Department of Agriculture, Food and the Marine, Ireland (Teagasc, 2002-05)
      To identify the critical control points (CCPs) during commercial pork slaughter, 60 pigs in a small abattoir (80 pigs per day) and a similar number in a larger plant (2000 pigs per day) and/or their resultant carcasses were swabbed at the ham, belly and neck. The total bacterial contamination was determined after each stage from the live pigs on the farm to chilling of the carcasses in the abattoir.
    • Role of Lactobacilli in Flavour Development of Cheddar Cheese.

      Beresford, Tom; Cogan, Tim; Rea, Mary; Drinan, Finbarr; Fitzsimons, Nora; Brennan, N.; Kenny, Owen; Fox, P.F. (Teagasc, 2001-05-01)
      Cheddar cheese is a complex microbial ecosystem. The internal cheese environment, in particular of hard and semi-hard cheeses, is not conducive to the growth of many microorganisms. At the beginning of ripening the dominant microorganisms are the starter bacteria which are present at high levels (~10⁹/g). However, during ripening, non-starter lactic acid bacteria (NSLAB) grow from relatively low levels (<10³/g) at the beginning of ripening, to 10⁸/g within 6-8 weeks. Other bacteria, e.g. enterococci and staphylococci, may also be present but in much lower numbers. In a previous study of mature and extra mature Cheddar cheeses from different manufacturers (see End of Project Report No. 1), it was found that the NSLAB population was dominated by strains of Lb. paracasei. However, their contribution to cheese flavour and their source(s) are still unclear, nor is it known if the NSLAB flora is unique to each plant. Hence, understanding the growth of this group of organisms in cheese is a key to defining their role in flavour development. The biochemistry of flavour development in cheese is poorly understood. For most cheese varieties, including Cheddar, proteolysis, which results in the accumulation of free amino acids, is of vital importance for flavour development. Increasing evidence suggests that the main contribution of amino acids is as substrates for the development of more complex flavour and aroma compounds. The manner by which such compounds are generated in cheese is currently the focus of much research. Starter bacteria have been shown to contain a range of enzymes capable of facilitating the conversion of amino acids to potential flavour compounds. However, the potential of lactobacilli (NSLAB) to produce similar enzymes has only recently been investigated. Hence, although it is generally accepted that the cheese starter flora is the primary defining influence on flavour development, the contribution of NSLAB is also considered significant.
The objectives of these studies were: - to develop a greater understanding of the behaviour of NSLAB in cheese, and - to identify suitable strains, and other cheese bacteria, to be used as starter adjuncts for flavour improvement.
    • Routine diagnostic tests for food-borne pathogens

      Duffy, Geraldine; Kilbride, Brendan; Fitzmaurice, Justine; Sheridan, James J. (Teagasc, 2001-01)
      Rapid techniques were developed and applied to the determination of total viable bacteria and to the detection of food borne pathogens (Listeria monocytogenes, Salmonella, Campylobacter jejuni and E. coli O157:H7). The method developed for total viable counts is based on membrane filtration and fluorescent staining and the technique can be performed and a result obtained within 20 min. The results correlate well with the standard plate count and the technique is now being implemented in Irish food factories.