• Factors Affecting the Cleanliness of Cattle Housed in Buildings with Concrete Slatted Floors.

      Fallon, Richard J.; Lenehan, J.J. (Teagasc, 2002-01-01)
      In a series of experiments at Grange Research Centre, cattle were cleanest at housing in the autumn; however, within 3 to 4 weeks of housing on concrete slats on a grass silage diet, cattle were at their dirtiest, with the majority in category 4 or 5. Cattle tended to be cleaner in the late March to early April period as they shed their winter hair coat. Cattle fed concentrates plus straw were significantly cleaner at slaughter than similar cattle offered grass silage plus concentrates. Cattle housed indoors on slats during the summer were cleaner than cattle on a similar diet and accommodation during the winter. High dry matter silage produced cleaner cattle than did low dry matter silages. Back and tail clipping of cattle at the commencement of the winter finishing period had no positive effect on cleanliness score or liveweight gain when the cattle were accommodated in well-ventilated slatted-floor houses. A survey of 19 farms specialising in finishing cattle failed to show any correlation between stocking density, solid floor area or level of concentrate feeding and the cleanliness of finishing cattle. A survey of 36 finishing units, designated as producers of "clean" or "dirty" cattle at slaughter, found that units with clean cattle generally had well-ventilated houses with A-type roofs and an open ridge outlet, and the grass silage offered was generally of higher dry matter. In contrast, finishing units with dirty cattle tended to be poorly ventilated and the grass silage offered had a lower dry matter. Overall in the survey, cattle cleanliness score was not affected by stocking density (2.0 m2 to 3.8 m2) or the proportion of solid floor area in the pen. Cattle accommodated on gang slats were dirtier than those accommodated on single slats.
    • Factors affecting the composition and use of camelina

      Crowley, J.G.; Frohlich, A. (Teagasc, 1998-09-01)
      Camelina (Camelina sativa), a member of the mustard family, is a summer annual oilseed plant. Winter-hardy types also exist. False flax and Gold of Pleasure are the popular common names for the crop. The crop was widely grown in Eastern Europe and Russia up to the early 1940s but was displaced following the introduction and widespread use of oilseed rape. The revival of interest in camelina oil is due to its high linolenic acid content (38%). Linolenic acid is one of the omega-3 fatty acids, which are generally found in substantial quantities only in linseed and fish oils. Camelina offers an opportunity to supply the growing demand for high-quality edible oils rich in omega-3 fatty acids. A three-year study established that camelina is a very suitable crop to grow in Ireland, producing 2.5 t/ha of high-quality seed (42-47%) with no agrochemical inputs required. The oil contains 35 to 40% linolenic acid compared to 8% in rape and soya oils. The oil does not deteriorate during refining or storage and can be used in a number of oil-based products such as spreads and salad dressings.
    • Factors affecting the yield of winter lupins.

      Crowley, J.G. (Teagasc, 1998-09-01)
      The white lupin (Lupinus albus) is a temperate legume whose seed contains high levels of protein (36-44%), oil (10-16%) and high-quality dietary fibre in the dry matter. Modern varieties contain extremely low levels of alkaloids (<0.01%) and no anti-nutritional factors. Their composition is therefore closer to that of soya bean than to peas and beans, which contain much less protein (23-27%) and no oil. Nitrogen fixation by autumn-sown, determinate varieties is large (ca. 300 kg/ha) and the harvest index for nitrogen is high (more than 85% of crop N is recovered in the grain). Lupins also have the ability to release phosphorus and iron from mineral sources in the soil. These two characteristics make the winter lupin crop an ideal choice as a low-input alternative crop, particularly in nitrogen-sensitive areas. Attempts to introduce spring-sown lupins have failed, mainly due to low yield potential, poor yield stability and late harvest. The release of the first winter-hardy determinate varieties by French breeders in 1994 promised the first real chance of success. The successful introduction of lupins offers the possibility of reducing soya bean imports and replacing them with a high-quality, home-grown protein source, with the added advantage of traceability. Autumn-sown lupins are capable of producing satisfactory yields (3.7-4.5 t/ha). The crop does require careful management, i.e. early sowing (by mid-September) at the correct seed rate (100 kg/ha), into well-structured, free-draining soil with a pH below 7. Sown in early September, the crop will mature from late August to mid-September of the following year.
    • Factors Affecting Yield and Quality of Oats.

      Burke, James I.; Browne, R.; White, E.M. (Teagasc, 2001-05-01)
      Quality evaluation of oats relies primarily on hectolitre weight and, while it is an important characteristic, work carried out at Oak Park and elsewhere has shown that it does not accurately measure grain quality. Consequently, the selection of oat lots and varieties with a high milling value has been limited, as present techniques fail to accurately determine the characteristics most closely related to milling quality. In this regard, the kernel content and the ease of husk removal, termed hullability, are the most important. This study developed a new test for assessing oat kernel content which is more rapid and cheaper than techniques currently available. Despite its obvious importance, oat hullability has not to date been assessed in quality evaluation due to the absence of a test procedure. However, this obstacle has now been overcome. The results of this work also provide a much better understanding of how the hullability of individual varieties can be assessed, and of how it could be manipulated at field level. Using the methods developed, the selection of varieties with enhanced processing characteristics can now be carried out more precisely for Irish conditions. The field trials conducted to evaluate the effect of agronomic practices on quality indicated that the effects of factors such as nitrogen rate and seed rate were small in comparison to variety, which had the largest and most consistent effect. The variation in quality could not be completely explained by variation in the panicle characteristics studied. Increasing the nitrogen rate increased yield, with the optimum being 160 kg N/ha in 1998 and 1999. However, lodging became a very significant factor at nitrogen rates above 100 kg N/ha in 1998, although it did not occur in 1999. This work supports the current Teagasc nitrogen recommendations for oats, where levels of 110-140 kg N/ha (Soil Index 1) are advised.
    • Factors Shaping Expenditure on Food-Away-from-Home in Irish and UK Households

      Keelan, Conor; Henchion, Maeve; Newman, Carol; Downey, Gerard; Teagasc Walsh Fellowship Programme (Teagasc, 2009-10-01)
      Factors influencing consumer spending in two sectors of the food-away-from-home (FAFH) market (quick-service, e.g. takeaways, and full-service, e.g. restaurants) were analysed using national household expenditure survey data. Different variables affect expenditure in the two sectors in different ways. Income has a greater effect on expenditure in the full-service sector than in the quick-service sector. Similarly, households that are health-conscious indicate a greater preference for full-service meals, while households which place more value on time (and are therefore more convenience-oriented) indicate a greater preference for quick-service meals. Households of a higher social class and those with higher education levels also appear to favour full-service expenditure. In addition, younger, urbanised households favour quick-service meal options. The results emphasise the merits of analysing different sectors within the FAFH market separately.
    • Factors shaping expenditure on meat and prepared meals

      Newman, Carol; Henchion, Maeve; Matthews, Alan (Teagasc, 2002-02)
      The factors shaping Irish households' expenditure decisions on meat and prepared meals are analysed using the two most recent datasets of the Irish Household Budget Survey (1987/8 and 1994/5). The motivation for the research stems from the changing pattern of food consumption, leading to a decline in the importance of price and income factors, and a simultaneous increase in the significance of socio-demographic factors, assumed to underpin consumers' tastes and preferences. Irish households' expenditure patterns on all meat, specific meat categories and prepared meals are analysed using tobit, double-hurdle and infrequency of purchase models.
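      A note on the models named above: in their standard textbook forms (a sketch only, not necessarily the authors' exact specification, which may differ in regressors and error assumptions), the tobit model treats latent expenditure as linear in household characteristics with observed expenditure censored at zero,

      y_i^* = x_i'\beta + \varepsilon_i, \quad \varepsilon_i \sim N(0,\sigma^2), \qquad y_i = \max(0,\, y_i^*),

      while the double-hurdle model adds a separate participation equation d_i^* = z_i'\gamma + u_i, so that positive expenditure is recorded only when both d_i^* > 0 and y_i^* > 0, allowing the decision to purchase and the amount purchased to be driven by different factors (e.g. socio-demographics versus price and income).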
    • Farm Facilities On Small - Medium Type Dairy Farms.

      Gleeson, David E (Teagasc, 2000-11-01)
      • 82% of farms with milk quota < 54,552 litres have bucket/pipeline milking plants.
      • There was a high percentage of milking machine faults on the farms surveyed.
      • Fragmented land portions are more likely to limit dairy expansion than farm size.
      • 60% of farms had beef buildings suitable for conversion to dairy housing.
      • 88% of farms had adequate cubicle spaces for present cow numbers.
      • The cost of purchasing milk quota was considered to be the biggest factor restricting expansion.
      • 67% of farms with quota > 54,552 litres had joined REPS.
      • 51% of farms had dairies registered under the dairy hygiene regulations.
      • Milk bulk tank size would limit dairy expansion without investment in larger static tanks.
      • The number of cows needed to fill the milk quota is better matched in the higher quota category.
      • The length of the working day was 12.7 hrs/day for an average herd size of 23 cows.
      • The estimated cost of extra facilities per farm to allow for scaling up milk production from 90,920 to 181,840 litres is £33,760.
    • Farm Forestry: Land Availability, Take-up Rates and Economics.

      Frawley, J.P.; Leavy, Anthony (Teagasc, 2001-02-01)
      Of the Member States in the European Union, Ireland has the lowest proportion of land area covered by forest. Given the large surpluses of agricultural commodities and expected future increases in farm productivity, less land will be needed to produce EU food requirements. The Irish government has therefore adopted a target of planting 25,000 ha of new forest annually to the year 2000 and 20,000 ha annually thereafter. Substantial incentives to promote afforestation are in place, but with the exception of 1995, the area of land planted has been considerably below target. The objectives of this study were to examine (i) the availability of land for afforestation, (ii) the factors which impede or promote the uptake of forestry and (iii) the relative economic returns from forestry in a farm context. The availability of land via the market diminished steadily between 1990 and 1998. The area of agricultural land sold in the period fell from 33,282 ha to 8,656 ha, a fall of 74 per cent. At the same time the average price increased from £3,964 per ha to £6,865, an increase of 72 per cent. Surveys of the opinions of landholders indicate that attitudes toward afforestation became more positive in the 1990s. This is reflected in a substantial increase in the area of farm forestry during the decade. However, a survey of the opinions of farmers who had already planted forestry indicated a perception that it is not a suitable replacement for conventional farm enterprises on `good' farmland. The land planted on 78 per cent of sites in this survey was previously utilised as either summer grazing or rough grazing. The principal motivation for planting was the favourable returns to forestry on land that had limited alternative use. The relative economic returns of forestry in comparison with farm enterprises such as dairying and cattle were assessed post CAP reform (2007), using linear programming techniques. Scenarios involved alternative uses of the farm resources, such as extensive/intensive land use, forestry/no forestry and off-farm job/no off-farm job. The objective was to examine the profitability of forestry on farms in situations in which livestock enterprises qualified for REPS and extensification payments and in which off-farm jobs were (a) not available and (b) available at different wage levels. Non-economic considerations, such as the perceived unsuitability of forestry as a replacement for agricultural enterprises on `good' land and the irrevocability of the decision to plant forestry, could come into play. To reflect these non-economic considerations, together with the higher risk associated with investment by individuals, a high discount rate (10%) was used in calculating returns to forestry. The analysis shows that where off-farm jobs are either not available or available only at a low wage level, extensification and REPS payments enable efficient livestock enterprises to compete with forestry. In these situations forestry maximises profit only on farms which have surplus land, having first qualified for both extensification and REPS on existing livestock enterprises. However, the availability of off-farm earnings at or near the industrial wage rate leads to increases in the forestry area, sometimes to the exclusion of cattle enterprises. On economic criteria, therefore, large areas of land could be transferred from conventional agriculture to forestry in the post-1999 CAP reform situation. Economics may not, however, be the most appropriate arbiter of such a decision.
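      A note on the method: linear programming, as used above, selects the farm plan that maximises total net margin subject to resource constraints. In generic form (an illustrative sketch only, not the study's actual activity set or constraints),

      \max_{x \ge 0} \; c'x \quad \text{subject to} \quad Ax \le b,

      where x is the vector of activity levels (e.g. hectares of dairying, cattle or forestry, hours of off-farm work), c the net margin per unit of each activity (including any REPS, extensification or forestry payments) and Ax \le b the constraints on land, labour and other farm resources.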
    • A Farm Scale integrated constructed wetland to treat farmyard dirty water.

      Dunne, E.; Culleton, Noel; O'Donovan, Grace; Harrington, Rory (Teagasc, 2005-01-01)
      In Ireland, the use of constructed wetlands to manage agricultural waters such as farmyard dirty water has been based primarily on an ecosystems approach. Integrated constructed wetlands, a design-specific variant of conventional surface-flow constructed wetlands, were first used in the Anne Valley, Waterford, Ireland (Harrington and Ryder, 2002). At present, 13 farms in the Anne Valley catchment use integrated constructed wetlands to manage farmyard dirty water (Harrington et al., 2004). Fundamental to their design are water quality improvement, landscape fit (designing the wetland into the topography of the landscape) and the provision of an ecological habitat within the agricultural landscape. Typically, integrated constructed wetlands have greater land area requirements than conventional surface-flow constructed wetlands in order to provide these additional ecological services. Few studies (Ryan, 1990) have addressed the quality and quantity of farmyard dirty water generated at farm scale in Ireland. No studies were readily available documenting the effectiveness of a farm-scale constructed or integrated constructed wetland in Ireland in removing nutrients such as phosphorus (P) from dairy farmyard dirty water on a mass basis. To address this, the main objectives of this research were to (i) determine the quality and quantity of farmyard dirty water generated at farm scale, (ii) determine the effectiveness of three treatment cells of an integrated constructed wetland in treating farmyard dirty water, using the difference between input and output mass loadings, (iii) investigate whether there were seasonal effects on the wetland’s performance in retaining phosphorus, and (iv) assess the impact of the integrated constructed wetland on the receiving environment by monitoring soil-water parameter concentrations up-gradient, down-gradient and within the wetland system using piezometers at different soil depths.
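      A note on the mass-basis assessment referred to above: nutrient removal on a mass basis is conventionally expressed as a simple mass balance (a minimal sketch, assuming flow and concentration are measured at the inlet and outlet of each treatment cell over the monitoring period),

      L = \sum_i Q_i\, C_i, \qquad \text{retention (\%)} = 100\,\frac{L_{in} - L_{out}}{L_{in}},

      where Q_i is the flow volume and C_i the nutrient (e.g. phosphorus) concentration for sampling interval i, and L_{in} and L_{out} are the input and output mass loadings.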
    • The Farmland Wildlife Survey – raising awareness of wildlife habitats

      Gabbett, Mairead; Finn, John; The Heritage Council (Teagasc, 2005-08-01)
      The Farmland Wildlife Survey involved a short visit (about 3 hours) to 19 REPS demonstration farms, and an identification of habitats and wildlife on each farm, with an emphasis on common farmland habitats such as hedgerows, ponds, watercourses, field margins, woodland, plant species and other areas of wildlife value. The survey results were provided to the farmer and Teagasc REPS advisor as a report with colour pictures of representative habitats, and an explanation of why these habitats were important for wildlife.
    • The Farmland Wildlife Survey – raising awareness of wildlife habitats.

      Gabbett, Mairead; Finn, John (Teagasc, 2005-08-01)
      The Farmland Wildlife Survey aims to support the wildlife objectives of the REPS and communicate a greater awareness of wildlife to farmers. The Farmland Wildlife Survey was conducted on 19 farms that form part of the national network of demonstration farms for farmers who participate in the Rural Environment Protection Scheme (REPS). At each farm, an ecologist conducted a survey that identified existing wildlife areas on the farm. The survey highlighted the existing management practices that were beneficial to wildlife, and pointed out any management practices that could be changed or adopted to be more beneficial. The Wildlife Survey also focused on common wildlife habitats on each farm, such as hedgerows, ponds, watercourses, field margins, woodland, mature trees and farmyard features of wildlife value. The attitudes and beliefs of the farmers were investigated with a short questionnaire. All farmers in the project farmed with some degree of sensitivity and consideration for wildlife and farm habitats. While most of the farmers were quite aware of farmland wildlife before joining REPS, most credited REPS for an increased awareness of the needs of wildlife in the farmed landscape. Most of the farmers believed there is a need for improved provision of information about identity and management of farmland habitats and wildlife. The outcome of the farm survey was provided to each farmer as a short report with colour pictures of relevant wildlife features on their farm. The results of the survey were also summarised in a leaflet for distribution to farmers who visit the REPS demonstration farms. Feedback on the farm visit or in subsequent comment cards was very positive. REPS planners have found the reports useful and interesting. In addition, some Teagasc REPS advisors are using the reports as part of farmer training visits to the demonstration farms. In this way, the Farmland Wildlife Survey complements wildlife objectives of the REPS and promotes a greater awareness of wildlife amongst farmers.
    • Feeding Prolific Ewes in Late Pregnancy and Rearing Triplet Lambs.

      Grennan, Eamonn J. (Teagasc, 2002-06-01)
      In prolific flocks a significant proportion of ewes give birth to 3 or more lambs. It was considered that the birth weight of triplet lambs, and also of twins, could be increased by offering ewes a higher than normal level of concentrate supplementation in late pregnancy. Trials to evaluate rates of supplementation were conducted during 2000 and 2001. In-wintered ewes were offered silage ad libitum. Based on scanning results, groups of twin-bearing (twins) and triplet-bearing (triplets) mature ewes were offered one of three rates of supplementation in late pregnancy. The lower rate in each case was set at a level considered appropriate for twins or triplets. Two groups of twin-bearing hoggets were offered either a low or a high rate of supplementation. Average silage intake over the last 6 weeks of pregnancy was 0.8 to 0.9 kg dry matter per day. Intake by triplets was about 95 per cent of that by twins, while intake by triplets at the high rate of supplementation was 90 per cent of that at the low rate. Total dry matter and metabolisable energy intakes were increased by supplementation. Triplets had a significantly lower condition score than twins at lambing, but the rate of supplementation did not affect condition score. The average condition score of all ewes decreased by 0.6 units between mid-pregnancy and lambing, a decline that is considered acceptable. Average birth weight of triplet lambs was about 1 kg lower than that of twins from mature ewes. The high rate of supplementation increased the birth weight of twins by 0.51 kg and of triplets by 0.26 kg. Birth weight of twins from hoggets was not affected by the rates of supplementation offered. Rearing triplet lambs on their dam, rather than cross-fostering or artificially rearing the third lamb, may be a useful option in prolific flocks. Previous research findings showed that triplets were reared successfully by selected ewes when the ewes were offered concentrate supplementation for 4 to 6 weeks at pasture post lambing and the lambs received creep feed from birth to sale. Trials were carried out over two years to assess the response to concentrate supplementation of ewes at pasture post lambing and to creep feeding of lambs, with a view to reducing the quantity and cost of concentrates used in rearing triplet lambs. Concentrates at 1 kg/ewe/day were offered to ewes at pasture for 3 or 6 weeks post lambing. Creep feed was available to lambs from week 1 and offered at one of three rates: 300 g/day to age 10 weeks, 300 g/day to sale, or 600 g/day to sale. Lambs were drafted for sale by weight and condition and carcasses were classified according to MLC standards. There was no response to feeding concentrates to ewes for 6 weeks rather than 3 weeks in this situation, where grass supply was considered adequate. Weaning weight was increased by offering creep feed to lambs to 14 rather than 10 weeks. All lambs were finished to acceptable carcass weights and grades. The main effect of continuing creep feeding of lambs to sale was to reduce the average age at sale. The medium and high levels of creep reduced the average age at sale by 20 and 45 days respectively compared with the low rate. However, the total quantity of concentrates offered per ewe plus 3 lambs was about 60, 120 and 180 kg for the low, medium and high rates of creep respectively, including 21 kg of concentrates for the ewes post lambing.
    • Feeding Techniques to Increase Calf Growth in the First Two Months of Life

      Fallon, Richard J.; Morrison, Steven; Dawson, L.; Twigge, J. (Teagasc, 2008-01-01)
      Data from Cornell University and the University of Illinois in the USA suggested that average daily liveweight gains of 900 to 1000 g/calf/day could be achieved from birth to weaning provided the calf milk replacer (CMR) is formulated to meet the calf’s amino acid requirements for such a rate of gain. Their findings suggested a daily milk replacer DM allowance of 1250 to 1500 g/d with a crude protein content of 26 to 30%. A series of studies was undertaken, at ARINI with home-born dairy calves and at Grange Beef Research Centre with purchased dairy calves, to determine the effect of increasing the daily milk replacer DM allowance and/or increasing the crude protein content of the CMR on calf performance. The main outcomes of these studies were:
      • There was no growth or intake response in any of the studies to increasing the crude protein content of the CMR from 23% to 28%.
      • Calf growth rates responded to increasing the daily milk replacer allowance from 600 to 1200 g/day for both home-bred and purchased calves. However, the effect was not significant post-weaning in any of the studies.
      • In all of the studies (for both home-reared and purchased calves) feeding a high level of CMR decreased concentrate DM intake. However, the calves’ concentrate intakes were similar post-weaning.
      • The home-bred calves with free access to the milk replacer feeders failed to consume their 1200 g/day allowance. Calves offered 600 or 1200 g of CMR/day had average consumptions of 554 and 944 g/d, respectively, in the milk feeding period.
      • Feeding a high (1200 g/d) compared to a low (600 g/d) level of CMR for the first 56 days had no significant effect on carcass weight or carcass characteristics when purchased male calves were slaughtered off an ad libitum concentrate diet after 388 days. The final carcass weights were 231 and 240 kg for the 600 and 1200 g/d CMR allowances, respectively.
      • Reducing the fat content of the CMR from 18% to 12% did not have any effect on concentrate intake or liveweight gain.
    • Field Performance and Quality of Hybrid Winter Wheat.

      Burke, James I.; Hackett, Richard; Tiernan, P. (Teagasc, 2002-10-01)
      An assessment of hybrid winter wheat was carried out over three seasons to determine the commercial potential of hybrid varieties under Irish conditions. The studies examined the effects of reduced seeding rate on hybrid grain yield and quality in comparison with pure-line varieties. A comparison of the available hybrid varieties was also carried out, and the higher-yielding hybrids were then compared with the best pure-line varieties in terms of grain yield and quality, and response to fungicide. The results indicated that in good sowing conditions hybrids can give greater yields than pure-line varieties at reduced seeding rates, but the effect is neither large nor consistent. There was generally no effect of seeding rate on the grain quality of hybrid varieties, but crop lodging occurred in one season at high seeding rates. Of the hybrid varieties examined, Mercury and Hyno Esta were the two better varieties in terms of grain yield; there was little difference between the hybrids examined in terms of grain quality. The best hybrid varieties did not give consistently higher yields or quality than the best pure-line varieties and exhibited a similar response to fungicide application. It is concluded that, when the price differential between seed of pure-line and hybrid varieties is taken into account, the winter wheat hybrid varieties currently available offer no economic advantage to commercial growers.
    • Field Performance of Winter Lupins.

      Crowley, J.G. (Teagasc, 2001-07-01)
      The yield potential of winter lupins is of the order of 3.5–4.5 t/ha. However, this potential is very dependent on sowing during an optimum sowing window of approximately fifteen days. Crops sown between 11 and 26 September gave the highest yields. However, there can be significant variation around these dates which cannot be predicted in advance. This became very evident in crops sown in autumn 2000, when the Rothamsted model identified 4-19 September as the optimum sowing window. This creates a serious risk for the commercial exploitation of winter lupins. Maximum yields are achieved at a plant density of between 20 and 29 plants/m2 in spring. Higher plant densities can result in lodging and a loss in yield. Using the recommended seed rate of 40 seeds/m2, equivalent to 98 kg/ha, will ensure the correct plant density over a wide range of conditions. Pests and diseases are not a serious problem in winter lupin production. The only serious pest, Bean Seed Fly, is easily controlled by routine use of an insecticide before sowing. Yellow Rust, which attacks the crop occasionally, can be adequately controlled by two fungicides, Alto and Folicur. Winter lupins should be grown on light to medium well-drained soils only. Sowing on heavy or poorly drained soils can result in very high seedling losses over the winter period, and can also seriously delay the natural maturation process in lupins, resulting in a late September harvest. The variety Ludet proved to be the best all-round variety in these trials. It combines good yield potential with a relatively early harvest. New varieties which are less sensitive to sowing date are required before winter lupins can be regarded as a safe, reliable crop for commercial production.
    • Field Validation Of Four Decision Support Systems For The Control Of Late Blight In Potatoes

      Dowley, L.J.; Burke, J.J. (Teagasc, 2003-12-01)
      Field experiments were carried out between 2001 and 2003 to determine the efficacy of the NegFry, SimPhyt, ProPhy and Plant Plus decision support systems (DSS) in controlling late blight of potatoes compared with routine fungicide treatments. The experiments were also used to determine the potential of the systems to reduce fungicide inputs. Over the three-year period of the experiment the 7-day routine programme received an average of 13.7 fungicide applications while the DSS programmes varied between 5.7 and 12.3 applications. All decision support systems resulted in a reduction in the number of fungicide applications (Fig. 2). Compared with the routine control, the NegFry and SimPhyt programmes resulted in a 44-58% reduction in application frequency. The ProPhy and Plant Plus programmes resulted in more modest savings of between 10 and 25% (Tables 1 & 2). All fungicide treatments significantly delayed the date of disease onset compared with the untreated control. Compared with the routine control treatment, NegFry and Plant Plus significantly delayed disease onset in King Edward in 2001, as did NegFry and ProPhy in Rooster. In 2002 there were no differences between treatments in terms of delaying disease onset, while in 2003 disease developed significantly earlier in the Plant Plus programme than in the routine control. In general, the date of disease onset did not differ significantly between routine programmes and DSS programmes, irrespective of the cultivar. In each of the three years, all fungicide treatments significantly reduced the incidence of foliage blight at the end of the season compared with the untreated control. When compared with the routine control, no decision support system resulted in significantly more foliage blight at the end of the season, irrespective of the cultivar or year. Similar results were obtained when the treatments were compared using the area under the disease progress curve (AUDPC). These results confirm that none of the DSSs resulted in inferior disease control when compared with the 7-day routine application of fluazinam. All fungicide treatments resulted in significantly higher marketable yields compared with the untreated control in all years, irrespective of the variety. Within the fungicide treatments, the DSS programmes generally out-yielded the routine fungicide treatment; however, these differences were only significant for Plant Plus in King Edward in 2001. Within the DSS treatments there were no significant differences in marketable yield in any of the years for either of the varieties. Within the fungicide treatments there were no significant differences between treatments in terms of tuber blight control for the resistant variety Rooster. In the case of the more susceptible variety, King Edward, all the DSS programmes except SimPhyt resulted in significantly lower levels of tuber blight than the routine Shirlan control in 2001. More importantly, the routine Shirlan did not give significantly better tuber blight control in any of the years when compared with any of the DSS programmes. This confirms that all DSS programmes give tuber blight control equivalent to the routine Shirlan application at 7-day intervals, even with a very tuber blight-susceptible variety.
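      A note on the AUDPC referred to above: the area under the disease progress curve is conventionally calculated by trapezoidal integration of successive disease assessments over time (the assessment scale and intervals used in these trials are not restated here),

      AUDPC = \sum_{i=1}^{n-1} \frac{y_i + y_{i+1}}{2}\,(t_{i+1} - t_i),

      where y_i is the disease severity (e.g. per cent foliage affected) at assessment time t_i in days and n is the number of assessments.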
    • Fluctuations in Energy Intake and Fertility in Cattle.

      Diskin, Michael G.; Sreenan, J.M.; Dunne, L.D.; O'Farrell, Kevin (Teagasc, 2001-12-01)
      Reproductive failure in dairy cows results in fewer calves born, lower milk sales, slower genetic progress and, consequently, significant financial loss to the industry. Dairy cattle breed improvement programmes have, at least until very recently, focused primarily on increasing the yields of milk or milk solids. The resulting genetic improvement has led to significant increases in milk yield per cow, but this increase is now associated with a significant decline in cow reproductive performance. An important part of the Teagasc research programme in this area is to determine the time at which embryo loss occurs, to determine whether the extent of the embryo loss is affected by the energy nutrition of the cow, and to devise strategies to reduce its extent. This project focused on the relationship between changes in dietary energy intake near the time of insemination and the extent and pattern of embryo survival. The main results are summarised in this report and detailed results of the several experiments involved have been published in the papers listed at the end of this report. The objectives of this project were to determine the effect of changes in energy intake near the time of insemination on embryo loss rate, on the timing of embryo loss and on the possible biological mechanisms involved. Crossbred heifers were provided with either high or low energy intakes, based on pasture allowances calculated to provide either 0.8 or 2.0 times their maintenance requirements. These energy intakes were allocated for two weeks before and about five weeks after insemination. The effect of the changes in energy intake on embryo loss, and on the time at which embryo loss occurred relative to the time of insemination, was established. Possible associations between embryo loss and blood concentrations of progesterone, NEFAs, insulin and glucose were examined. A sudden reduction from a high to a low energy intake imposed for two weeks from the day of insemination reduced the subsequent embryo survival rate by 30 percentage points, to a survival rate of 38%. When energy intake over this same period was either maintained or increased, embryo survival rate remained high (overall mean 69%), within a range of 65-71%. The time at which embryo loss occurred was established. Embryo survival or pregnancy rates measured on days 14 and 30 after insemination and at full term were 68%, 76% and 72%, respectively. These results provide new information indicating that most embryo loss, at least in heifers, had occurred on or before day 14 after insemination. There was no evidence of any association between the short-term changes in energy intake either before or after AI and blood progesterone concentration. Neither was there any evidence that the detrimental effect of the sudden reduction in energy intake on embryo survival was mediated through changes in the systemic concentrations of non-esterified fatty acids (NEFAs) or insulin. There was a suggestion, however, that the detrimental effect of the reduced energy intake may operate through a reduction in systemic glucose concentrations.
    • Food Authentication using Infrared Spectroscopic Methods

      Downey, Gerard; Kelly, J. Daniel (Teagasc, 2006-06-01)
      Confirmation of the authenticity of a food or food ingredient is an increasing challenge for food processors and regulatory authorities. This is especially the case when an added-value claim, such as one relating to geographic origin or a particular processing history, is made on the food label. Regulatory agencies are concerned with the prevention of economic fraud, while the food processor needs confirmation of such claims in order to protect a brand, the image of which could be severely damaged should an adulterated ingredient make its way into the branded food product. To be of greatest value, any analytical tool deployed to confirm authenticity claims needs to be portable, easy to use, non-destructive and accurate. Infrared spectroscopy, near and mid-infrared, is a tool which has been demonstrated to possess these properties in a wide range of situations. While some applications in food authenticity have been reported, the work undertaken in this project was designed to explore their capabilities regarding a number of products and authenticity issues of particular interest to the Irish agri-food industry, i.e. olive oil, honey, soft fruit purées and apple juice.
    • Food authentication using infrared spectroscopic methods

      Downey, Gerard; Kelly, J. Daniel (Teagasc, 2006-06)
      Confirmation of the authenticity of a food or food ingredient is an increasing challenge for food processors and regulatory authorities. This is especially the case when an added-value claim, such as one relating to geographic origin or a particular processing history, is made on the food label. Regulatory agencies are concerned with the prevention of economic fraud while the food processor needs confirmation of such claims in order to protect a brand, the image of which could be severely damaged should an adulterated ingredient make its way into the branded food product.
    • Food choice and consumer concerns about animal welfare in Ireland

      Meehan, Hilary; Cowan, Cathal; McIntyre, Bridin; European Commission; CT98-3678 (Teagasc, 2002-04)
      Consumer concerns about farm animal welfare and the impact of these concerns on food choice in Ireland were investigated. The aim was to identify and analyse the nature and level of consumer concern. The qualitative and quantitative studies demonstrated that although consumers are concerned about farm animal welfare, this concern is not a priority in food choice. Consumers use animal welfare as an indicator of other product attributes such as food safety, quality and healthiness, which they usually perceive as more important. Consequently, consumers equate good animal welfare standards with good food standards.