BSPP1997 Abstracts Booklet
Session I – Setting the scene
Plant disease, a global problem
Dr Jim M. Waller
International Mycological Institute, Egham, Surrey TW20 9TY
Despite the current agricultural sufficiency in much of the developed world, pressures for increased agricultural productivity and efficiency continue on a global scale, driven by factors such as population increase, urbanisation and related infrastructural development, civil strife, climatic change and environmental degradation. Many facets of agricultural and related development lead to an increase in the actual or potential hazards caused by plant diseases. ‘Globalisation’ is leading to greater movement of people, goods and services, with a consequent risk of pathogens spreading to new areas and a reduction in the normal epidemiological constraints restricting disease development. The expansion of agriculture and novel germplasm into new areas can lead to the emergence of new disease problems, sometimes involving the appearance of apparently new pathogens. Intensification of crop production often involves techniques which exacerbate disease problems, especially those of a soil-borne nature; where this is accompanied by monoculture, selection pressure for the emergence of virulent types increases. Greater investment in crop production raises the economic significance of diseases, as pathogens previously considered minor nuisances become recognised as yield-constraining factors. Examples of disease problems which have arisen largely as a consequence of these activities will be considered. Some of these, particularly in Africa, are currently of major significance and are able to attract the attention of funding agencies. Others are of a more insidious nature but constrain the productivity of the millions of small farmers in the developing world. Many of these problems are ill-defined and unresolved, and they represent a greater challenge for plant pathologists in a world where client-articulated demand and shorter-term impact combined with longer-term sustainability are the keys to secure research funding.
BSPP Presidential Address: Whither or wither extension plant pathology?
Dr Nigel V. Hardwick
Central Science Laboratory, Sand Hutton, York YO4 1LZ
The full script of the Presidential Address will be published in the Society’s journal Plant Pathology.
Session II – Identifying the Problem
The role of extension services – agony and ecstasy
Dr Reuben Ausher
Ministry of Agriculture and Rural Development, Extension Service, P.O.Box 7054, Tel Aviv 61070, Israel
Extension Services are supposed to cover the whole technology development continuum, namely technology generation, diffusion and adoption. Further, they should convert data and information into knowledge and technologies. The classical growers/extension/research triangle is becoming much more complex owing to a proliferation of actors interacting with growers. Extension services should catalyze the conversion of subsistence farmers into commercial growers and finally into entrepreneurs. Most extension systems are production-, environment- or community-oriented. Adviser/advised ratios differ widely, from 1:325 in Europe to 1:3500 in the Near East. Under intensive cropping systems there is an increased demand for advice to support both managerial and technical decisions. Agricultural extension faces a series of management and professional conflicts and crises. These crises debilitate extension organizations, which could dwindle and even collapse. Extension systems in the developing world are either managed by government (as community-, commodity-, integrated development and Training & Visit systems) or partially privatized and decentralized. In the industrialized world most systems are government-managed, land-grant, grower-owned or privatized. The debate on the function of privatized extension (consulting) systems and the need for agricultural extension and R&D as public goods is still not resolved. The era of ideology and of rigid extension models deemed adequate for the developing world is over. One central requirement has to be stressed in this context – professionalism.
Fortunately, extension faces new opportunities and is expected to develop leadership in areas such as Integrated Pest Management, computer-supported technologies and distance education. We see a new and clear front-line demand for integrated crop protection extension focusing on disease, pest and weed control; a good understanding of crop husbandry, protection and marketing; wide-ranging diagnostic capabilities; understanding of the whole range of chemical control and application techniques; mastery of supervised control, fully-fledged IPM and biocontrol practices; and specialization of crop protection extension by commodity.
Monitoring for disease in seed potato production
Dr Jane M. Chard
Scottish Agricultural Science Agency, East Craigs, Craigs Road, Edinburgh, EH12 8NJ
Monitoring for disease is an integral part of the Seed Potato Classification Scheme administered by the Scottish Office Agriculture, Environment and Fisheries Department in Scotland. The Scheme is based on visual assessment for disease and trueness to variety to meet tolerances specified in national regulations. Visual assessment is also used to monitor infection of stocks in the previous season.
Surveys are done to fulfil legislative requirements to confirm freedom from quarantine pathogens and also to monitor the occurrence of other pathogens in seed stocks. The data from such surveys, when compared with those of previous years, can reflect the impact of changes in technology or cultural practices.
Occasionally it is necessary to implement specific monitoring for diseases or their vectors. In Scotland an aphid monitoring scheme has been in operation since 1992. A reduction in virus incidence in stocks has been recorded following implementation of this measure.
Monitoring for diseases is a vital component of the seed potato production system in Scotland. For successful control of diseases within the Scheme, however, in addition to monitoring disease occurrence, measures and specified tolerances must be based on a sound knowledge of the biology and epidemiology of each pathogen.
Sustainable farming – are we getting there?
Prof. Martin S. Wolfe
Wakelyns Agroforestry, Fressingfield, Suffolk, IP21 5DS, England
A comprehensive concept of sustainable farming must take into account social, economic and environmental aspects of agriculture. However, the current structure of mainstream farming in Europe arose from the post Second World War policy that was directed to secure adequate food supplies at almost any cost. Subsidies and technological developments did secure a vast increase in some agricultural commodities. Unfortunately, this movement was based largely on monocultures supported by high levels of external inputs, often with unforeseen and far-reaching negative consequences. A wide range of technical solutions is now being applied to try to improve existing mainstream systems. However, in terms of long-term sustainability, there are strong arguments to suggest that this may be the wrong approach. Biological, economic and social reasoning indicates that a major alternative direction is to change towards greater diversity within farming systems. The basis for this view, some of the progress made so far and some implications for plant pathology, will be considered, together with some suggestions for further development.
The closed environment – a challenge to horticulture
Dr G Martin McPherson
Horticulture Research International, Cawood, Selby, North Yorkshire, YO8 0TZ, UK
Introduction
Salad crops have been grown intensively under protection for many decades, and production has relied increasingly on fungicides to maintain control of both foliar and root-infecting pathogens. The occurrence of persistent soil-borne root pathogens, e.g. Pyrenochaeta lycopersici and Phomopsis sclerotioides, largely uncontrolled by fungicides, stimulated the move into soil-less or hydroponic production systems. Partly as a consequence of this move into inert substrates, e.g. rockwool, the marketable yield and quality of tomato and cucumber crops improved dramatically.
In the last few years environmental issues have predominated, and attention has been focused on the increased occurrence of pesticide residues in harvested produce and the threat of ground-water contamination by both fertilisers and pesticides in the run-off from these crops. The hydroponics industry, particularly in developed countries, is now looking to respond to retailer and consumer needs by developing ‘closed’ production systems to minimise both environmental pollution and pesticide inputs into salad crops.
Fungicides are used prophylactically in both the aerial and root environment of salad crops. The challenge for horticulture, as we move towards the next millennium, will be to develop sustainable production techniques which reduce pesticide use, minimise residues in food and safeguard the environment for future generations. This must be achieved within a framework of continued economic production, delivering the quality of produce to which the consumer has become accustomed.
The aerial environment
Significant advances have been made in the area of pest control in UK-grown protected salad crops, and it is now unusual for any insecticides to be applied. Instead, natural enemies (predators) are routinely introduced to suppress pest populations. Yet, in these same crops, fungicides continue to be used routinely for the control of powdery mildew (Sphaerotheca fuliginea, Erysiphe sp.), grey mould (Botrytis cinerea) and stem rots (Didymella [Mycosphaerella] spp.). These fungicides potentially disrupt not only the predator-prey balance but also the epiphytic microflora, and prevent the establishment of myco-parasites. We should, as pathologists, perhaps be a little disappointed that further progress has not been made to improve our understanding of the role, and possible antagonism, of the epiphytic microflora on leaf surfaces and, at the same time, to establish myco-parasites, e.g. Ampelomyces quisqualis ‘AQ10’ and Sporothrix flocculosus, for use in this controlled environment. It should be acknowledged that, in the UK at least, one of the primary hurdles hindering progress in this area is the registration or authorisation process for ‘bio-pesticides’.
The root environment
Currently, in most hydroponic crops, excess nutrient solution, together with the nitrates, phosphates, pesticides and any pathogen propagules it might contain, is discarded. This approach minimises the risk of disseminating pathogen spores, though it is perceived to be environmentally unfavourable. As production costs rise, the inefficient use of both water and fertiliser is encouraging growers to consider recirculation or ‘closed’ hydroponics technology.
It has been demonstrated that various root pathogens are disseminated widely in re-used hydroponic solution, although this can be effectively countered by adopting a strategy of solution disinfection, e.g. heat, UV, ozone, albeit at a significant cost. Interestingly, the rate of disease development in ‘closed’ hydroponic crops has been observed to be much slower than in equivalent ‘open’ culture systems. This has led to a series of hypotheses to account for the observed disease suppression. Collaborative studies are now paving the way towards a better understanding of the various plant-microbe interactions in the hydroponic root environment. The hope is that the observed suppressive mechanisms may be harnessed and utilised more fully to maintain disease control in these crops without resorting to prophylactic use of fungicides in the future.
Satellite imagery – avoiding muddy boots?
Dr Mike D Steven
University of Nottingham, Department of Geography, University Park, Nottingham NG7 2RD
Satellite imagery offers the opportunity to monitor the health of crops over a wide region, but to be useful for monitoring plant disease, it must satisfy certain preconditions:
- The satellite observation must be significantly affected by the disease
- Satellite data must be available at the critical times required.
Remote sensing systems operate from optical wavelengths, through the region of thermal emissions, to microwave wavelengths. The best prospects for disease monitoring lie in detecting changes in leaf area or leaf pigmentation, which are susceptible to optical techniques; in stomatal responses, which can sometimes be detected thermally; or in canopy structural responses, which may be detectable by radar. However, a difficulty with current orbiting systems is that the spatial resolution is not fine enough to measure the spatial patterns traditionally associated with crop disease. Monitoring from a Landsat image acquired six weeks earlier suggests that satellite imaging might offer a feasible approach in a less demanding climate.
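To make the optical case concrete, the sketch below (an illustration added to this booklet page, not part of the original abstract) computes the widely used normalised difference vegetation index (NDVI), which contrasts near-infrared and red reflectance and falls as leaf area or pigmentation declines; the reflectance values are invented for illustration:

    import numpy as np

    def ndvi(nir, red):
        # Normalised difference vegetation index from near-infrared and
        # red reflectance; healthy, leafy canopies score high. Works
        # per-pixel on whole image arrays as well as on scalars.
        nir = np.asarray(nir, dtype=float)
        red = np.asarray(red, dtype=float)
        return (nir - red) / (nir + red)

    # Illustrative reflectances only: disease that reduces leaf area or
    # chlorophyll lowers NIR and raises red reflectance, so NDVI drops.
    print(ndvi(0.45, 0.05))  # healthy canopy, ~0.80
    print(ndvi(0.30, 0.12))  # stressed canopy, ~0.43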
Session III – Recognising the cause
Diagnosis – the means to an end
Dr David Stead
Central Science Laboratory, Ministry of Agriculture, Fisheries & Food, Sand Hutton, York YO4 1LZ
I will define the end in this case as production of plants and their products with minimal loss due to pests and diseases. For much of the work we do at CSL it could equally have been defined as prevention of alien pests and diseases becoming established in the UK. The use of appropriate diagnostic methods is an essential means of achieving these ends. For example, correct diagnosis facilitates selection of the best means of control. Diagnostic methods are of equal value to the extension pathologist giving advice on control based on a field diagnosis and to the pathologist screening propagating material for a pest/pathogen listed in a certification scheme.
Diagnostic methods can be placed in 3 broad categories:
- methods which allow diagnosis in symptomatic plants or products
- methods which allow detection of pests and pathogens in symptomless infections
- methods which allow indexing for pest/pathogen free material.
There is some obvious overlap between the last two.
The last decade has seen a marked reduction within the UK in diagnoses in the first category, especially through regular crop inspections for visual symptoms. However, there has been a large increase in diagnoses in the last two categories, i.e. those that test for the presence or absence of the pest or pathogen. The increased emphasis on detection has been accompanied by advances in technology, especially in the development of nucleic acid-based methods and in the development of diagnostic kits.
This talk reviews some of the key elements of modern diagnostic methods – specificity, sensitivity, sampling, speed, cost and reliability – and how the results should be used to achieve the end. The talk is illustrated with pests and diseases currently of concern to UK agriculture, and in particular to CSL. It also discusses the problems that may occur if the trend towards indexing continues.
Central diagnostic facilities in support of local problems
Dr Ghita Cordsen Nielsen
The Danish Agricultural Advisory Centre, The National Department of Plant Production, Udkaersvej 15, Skejby, DK-8200 Aarhus N., Denmark.
The Danish agricultural advisory system is organised at two levels – a national and a local level. The local level involves about 85 local advisory centres, organised and run by the local farmers’ unions and associations. The advisors working at these centres provide the individual farmers with guidance and other services.
At the national level the Danish Agricultural Advisory Centre (DAAC) provides the local centres with the latest information from both Danish and foreign research. The Centre also undertakes its own programme of investigations on practical issues. The plant clinic is located at the DAAC and operates at the national level.
The Danish Agricultural Advisory Centre has a staff of about 370, whereas a local centre typically has 20-70 employees and serves between 500 and 2,000 members. The entire advisory service serves about 70,000 farmers, which corresponds to 95% of Danish farmers.
The DAAC is responsible for the main services that are most appropriately organized at the national level and has six main tasks:
- Specialized advisory services
- Communication of knowledge and information
- Development activities
- Trials and investigations
- Education, in-service training and courses
- Service activities
The DAAC performs these duties in many ways. Today I will mention only one service activity and that is the central diagnostic facility.
The diagnostic facilities
The central plant clinic of The National Department of Plant Production employs a senior adviser and a laboratory technician, who handle the approximately 600 samples which the local advisers send in annually.
The advantages of a central diagnostic organisation are:
- We build up expert knowledge which benefits all the local advisers (and farmers). In this way we can hopefully give more qualified answers.
- We only need laboratory facilities in one place
- The total amount of time used to solve a specific problem is less as not all the local advisers have to deal with this problem.
- The clinic knows about the special problems of a specific growing season. This knowledge can be used for example in newsletters to the local advisers who can use the information in locally adapted newsletters to the farmers.
- In order to be able to follow the general problems during the growing season we operate a monitoring system for pests and diseases in the main crops in co-operation with local advisers.
- The DAAC also develops new field experiments and investigations – for instance when the local advisers present us with problems we cannot solve or prevent. We organise and carry out about 2,000 field trials every year in collaboration with the local advisers. About 40% of them are related to plant protection.
- New problems in plant production are more easily found when you get many plant samples. In the late 1980s for example we received an increasing number of samples showing symptoms of sulphur deficiency. Over the past few years we have seen black scurf and stem canker (Rhizoctonia solani) in organic potatoes.
Quantitative diagnostics – have we arrived?
Dr Paul Nicholson
Cereals Research Department, John Innes Centre, Colney Lane, Norwich NR7 7UH, UK
In an attempt to answer the question posed (by Nigel Hardwick) in the title, I will draw on examples and experience from our work with cereal diseases. Two economically important disease complexes affect cereals in the UK – ‘stem-base disease’ (SBD) and ‘Fusarium ear blight’ (FEB). Visual diagnosis of these disease complexes is difficult, with several species often occurring together in the same tissue. Even attempts to evaluate the relative proportion of each species in plant samples by isolation into axenic culture reveal only what can be grown out of the plant rather than what is within the plant.
The inability to detect, identify and quantify individual species within plant tissues has seriously hindered the study of these diseases. In addition, the inability to diagnose correctly may result in the adoption of inappropriate or poorly timed control measures.
Molecular techniques are being developed to overcome many of the problems associated with the study of SBD and FEB. Among the most sensitive techniques available is the polymerase chain reaction (PCR). We have developed PCR-based assays for detection of the SBD and FEB fungi directly in extracts from plant tissue. These assays have been designed to enable simultaneous detection of several pathogens in each reaction and so provide an integrated system. They have been refined to enable quantification of each species, allowing the relative contribution of each component to the disease of the plant to be estimated. This paper reports aspects of this work and some preliminary results achieved using these systems.
Quantifying Fusarium diseases of cereals
Dr Philip Jennings, J.A. Turner, J.N. Banks and R.H. Rizvi
Central Science Laboratory, Sand Hutton, York YO4 1LZ
The major pathogens causing Fusarium diseases of cereals in the UK include Fusarium avenaceum (Gibberella avenacea), Fusarium culmorum, Fusarium graminearum (Gibberella zeae), Fusarium poae and Microdochium nivale (Monographella nivalis, formerly Fusarium nivale). Symptoms caused by different species are often indistinguishable, and pathogen isolation using traditional methods is time-consuming. Subsequent species identification requires expertise and the results are not quantitative. Control of these pathogens is often difficult owing to differences in fungicide sensitivity and to the difficulty of determining the correct timing of fungicide applications. Development of a rapid, simple-to-use diagnostic method for the identification of these species would allow more accurate epidemiological investigation, leading to improved disease control. Enzyme-linked immunosorbent assay (ELISA) is a specific, rapid and quantitative method which has been used in the identification and detection of a range of viral and fungal diseases. Monoclonal antibodies (MAbs) were raised against F. avenaceum, F. culmorum, F. graminearum, F. poae and M. nivale, with a view to developing an ELISA for their detection in plant material. Cross-reactivity studies carried out against fourteen Fusarium species plus M. nivale, ten other field fungi and eight storage fungi indicated that cell lines had been produced that secreted MAbs specific to F. avenaceum, F. culmorum, F. poae or M. nivale, but not F. graminearum. The MAbs detected antigen from plate washings; this both simplified and reduced the time required for their identification. The limit of detection was estimated to be between 0.4 and 2 µg antigen/ml. Screening against twenty different isolates of each species showed that selected cell lines produced species-specific MAbs. However, other cell lines produced MAbs which recognised only the isolate against which they were raised. Initially, no MAb detected antigen from infected plant material. Further investigation indicated that the type of plant material and its treatment affected antigen detection. The presence of crushed grain, stem-base or root material totally inhibited antigen detection. Inhibition was reduced when plant material was soaked, although increasing the soaking time increased the inhibition. The form of the antigen also affected detection: MAbs did not detect spores or non-active mycelium in plant material, but antigen was detected when actively growing mycelium was produced from diseased stem-base and root material. The use of bio-amplification techniques was necessary for successful detection in infected grain. Further refinement of the protocol, possibly by the use of a more complex ELISA format, is being undertaken to minimise inhibition and produce a more sensitive assay.
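As an aside on how quantitative thresholds of this kind are commonly set (a generic sketch, not necessarily the method used in this work): a positive/negative cut-off for an ELISA plate is often taken as the mean absorbance of the blank wells plus three standard deviations.

    import statistics

    def elisa_cutoff(blank_ods, k=3.0):
        # Cut-off = mean blank absorbance + k standard deviations; wells
        # reading above it are scored antigen-positive.
        return statistics.mean(blank_ods) + k * statistics.stdev(blank_ods)

    # Invented absorbance readings, for illustration only.
    blanks = [0.052, 0.048, 0.061, 0.055, 0.050, 0.058]
    cutoff = elisa_cutoff(blanks)
    for name, od in {"extract A": 0.31, "extract B": 0.06}.items():
        print(name, "positive" if od > cutoff else "negative")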
Session IV – Assessing the risk
Plant disease – barrier to world trade
Mr Robert L. Griffin
Co-ordinator, Secretariat of the International Plant Protection Convention, Food & Agriculture Organisation – AGPP, Vialle Delle Terme di Caracalla, 00100, Rome, Italy.
Abating the introduction and spread of plant diseases through exclusion is a concept receiving greater attention in the dawning era of globalization. As the movement of people and goods accelerates with the liberalization of trade, the regulatory measures used to inhibit the spread of harmful plant diseases become increasingly important. The basis for the placement and strength of such measures is key to determining whether they withstand international scrutiny. It is in this respect that regulators are strongly dependent upon the research community to provide the scientific support needed to properly develop, evaluate and challenge regulatory measures for plant diseases. A harmonized process of pest risk analysis (PRA) is the recognized means to provide a scientific foundation for regulatory decision making. Increasing the awareness and sensitivity of plant pathologists to trade issues and the needs of regulators, particularly in the area of pest risk analysis, is essential to strengthening the role of science in regulatory policy making and assuring that needed protection is provided without plant diseases becoming unjustified barriers to trade.
Bacterial brown rot of potato, caused by Ralstonia (Pseudomonas) solanacearum race 3, biovar 2 in Europe
Dr Jaap D. Janse
Plantenziektenkundige Dienst, Geerjesweg 15, Postbus 9102, 6700 HC Wageningen, The Netherlands
Race 3, biovar 2 of Ralstonia (Pseudomonas) solanacearum is adapted to cooler climates (optimum growth temperature 27°C, unlike the tropical races 1 and 2 with 35-37°C). Its origin, like that of potato, is most likely South America (where it occurs in the highlands and where resistance is present in wild potato), and it spread with its main host, potato. It has a narrow host range (potato, tomato, some solanaceous weeds and, exceptionally, a few others such as eggplant and pepper).
Race 3 was first observed in the Mediterranean area in the forties and was adequately described from Portugal in 1947, where it caused severe outbreaks up to the sixties but disappeared in later years. It was reported in Greece in 1951. In Egypt the disease became endemic, and findings in early table potatoes exported to Europe were reported as early as 1962; infections were also found in potatoes coming from Cyprus and Malta at that time. In subsequent years reports were mainly from Egyptian potatoes, and recently there have also been a few cases from Turkey.
An outbreak of brown rot in Western Europe was first reported from Sweden (1972) in ware potatoes, where an apparent link was found between waste from the processing industry dumped into a river, infection of bittersweet (Solanum dulcamara) growing with its roots in the water, and irrigation of potato fields with contaminated surface water. Isolated cases in ware potatoes were reported in Belgium in 1989, and in Belgium, the Netherlands and the UK in 1992. For the UK and Belgium, epidemiological research established a link with contaminated surface water and bittersweet, not with seed. In 1995 a more severe outbreak was found in the Netherlands in both seed and ware potatoes. Here the spread of the disease could largely be explained by a heavily infected seed line (probably contaminated by the use of contaminated surface water), but in a number of cases irrigation with contaminated surface water appeared to be the cause, and the relation with bittersweet was clearly established. In the same year (with an exceptionally warm summer in Western Europe) the disease was also reported from France (possible links with contaminated surface water, also in outdoor and glasshouse tomato), Portugal (unexplained cases) and Italy (some cases perhaps connected with imported seed, others unexplained). In 1996 further findings occurred in the UK, the Netherlands (including one case in glasshouse tomato with a link to the use of contaminated irrigation water) and Spain (unexplained). In 1997 the disease was reported from Germany (unexplained), the UK (glasshouse tomato with a link to contaminated irrigation water), the Netherlands (contaminated irrigation water) and Italy (possible link to the processing industry).
Ecological research performed so far has established, for most infected countries, a possible link between industries using infected potatoes for processing, contaminated surface water and, especially in Western Europe, bittersweet. Soil appears not to play an important role in survival of the bacterium and perpetuation of the disease. Many aspects of the epidemiology are still unclear, however.
Possibilities for control, such as the use of healthy seed, testing of seed and of imported table and processing potatoes, avoidance of the use of surface water for irrigation, decontamination of waste water, hygiene and measures for infected fields, will be outlined briefly.
Quarantine strategies and new disease risks in Australia
Dr Peter Merriman
Institute for Horticultural Development, Knoxfield, Victoria, Australia
Primary industries have, for many years, benefited from Australia’s geographic isolation and relative freedom from some of the most destructive pests and diseases which are common in other western agricultural systems. This status has long been recognised by Government and Industry, and quarantine barriers are generally considered to have been effective in minimising the risk of incursions.
In the past 10 years, increased tourism, business travel and movement of cargo have imposed greater pressure on services, which may have contributed to more frequent breaches of the quarantine barrier, especially in plant industries. Papaya Fruit Fly, Race 4 of Fusarium oxysporum f. sp. cubense, Mycosphaerella fijiensis and Erwinia amylovora are currently subject to eradication programs funded by the State and Commonwealth governments. Costs are substantial: eradication of Papaya Fruit Fly in Far North Queensland is estimated at $30 M since October 1995, and the program against Erwinia amylovora, which commenced in May 1997, at $2 M.
The Commonwealth Government is responsible for the Australian Quarantine and Inspection Service (AQIS), and has responded to recommendations of a comprehensive review of AQIS (the Nairn review) by investing $76 M in strengthening the capabilities of operational and policy groups. Plant industries are major beneficiaries, through the newly created Chief Plant Protection Officer with specific responsibilities for contingency planning and incursion management, and through the proposed Australian Plant Health Council, which will commission projects to enhance plant protection programs. Increased resources are being provided for inspection at ports and mail centres, for pest risk analysis, for offshore eradication and containment of pests and pathogens on Australian islands in the Torres Strait, and for the development of more prescriptive contingency plans for exotic pests and pathogens at both generic and specific levels.
Funding for reference collections of pests and pathogens and for diagnostic services in Australia is inadequate, and a national review is seeking long-term solutions to this issue, which will undoubtedly assume increased importance under the Sanitary and Phytosanitary agreement of the GATT Uruguay round concluded in 1993 (now the WTO).
Pathogen risk assessment in the UK
Dr Claire E. Sansford
Central Science Laboratory, Sand Hutton, York, Y04 1LZ
The concept of risk analysis is not new. Examples of applications can be found in a wide range of disciplines outside plant pathology (e.g. finance and engineering), but the fundamental principle of risk analysis is that it provides a practical framework for decision making.
Pest Risk Analysis (PRA) (for “pest” read pathogen) is necessary to identify and assess risks to agricultural and horticultural crops and forestry, from economically damaging alien pests. The process of PRA includes not only the assessment of the risk of entry and establishment of the pest but also the economic and environmental damage which might result should the pest be introduced.
Until recently risks were assessed and plant health regulations were developed in an unsystematic manner. However, following the Uruguay round of the GATT (now WTO) an agreement on Sanitary and Phytosanitary Measures (SPS) was made which aims, amongst other things, for greater transparency of national policies and objectivity in the process of PRA. The intention is that phytosanitary measures that restrict trade should be applied only to the extent necessary to protect plant health, thus facilitating freer trade in plant material. The SPS agreement also aims to encourage the use of international standards. With respect to plant health any measures stricter than international standards must be scientifically justifiable and all new phytosanitary measures must be justified by an analysis of pest risk.
The development of risk analysis has been underpinned by developments in a number of other key areas, including: epidemiology – understanding the mechanisms of disease spread among plant populations; computer-based tools such as Geographical Information Systems which can help make more accurate predictions about events such as the likelihood of introduction of a new pest; improved techniques for surveillance and monitoring for the identification of pests, diseases and their vectors; and growth in international agreement and mutual understanding between trading nations about common standards for the assessment and management of risk.
National, European and international schemes for PRA are being standardised. The ultimate aim is to construct an effective system which can be used for a range of situations such as interceptions on imported plants/plant products, or requests for importation of plant material. Examples of pathogen risk analysis given in this paper include the risks associated with Tilletia indica Mitra, the cause of Karnal bunt in wheat; tospoviruses transmitted by the insect vector Thrips palmi Karny; Zoysia grass from the USA and Fusarium oxysporum f.sp. basilici on sweet basil.
At the end of the PRA process, decisions need to be made. Different interest groups have differing views on acceptable levels of risk. However, risk analysis aims to ensure that the decisions will be well informed, transparent and neutral.
Session VII – Predicting the event
Garrett Memorial Lecture: Forecasting – the ends or a means?
Dr William E Fry
Department of Plant Pathology, 334 Plant Science Building, Cornell University, Ithaca, NY 14853, USA
Disease forecasts linked to disease control decisions represent the convergence of the goals of disparate groups. Growers desire a tool to suppress disease more effectively and more efficiently. Society desires a tool that will lead to decreased use of fungicide. Plant disease epidemiologists desire tools that use the results of their research. However, disappointment results from the expectation that a forecast will work perfectly when introduced. In general, farmers have found forecasts less helpful than hoped. Therefore, adoption of forecasts by farmers has been less than desired by epidemiologists, and reductions in the use of fungicide have been less than desired by environmentalists. These pessimistic views contrast with an alternative view: forecasts are models (dramatic simplifications of nature) and as such are to be viewed as imperfect works in progress. As model predictions are compared to reality (validation), new insight develops and the model is revised. Revision is an integral part of the modelling process. New insight leads to an improved understanding of the pathosystem and better disease control – sometimes without use of the forecast system. Insight derived from previous effort in forecasting has identified new factors important to forecasts in some pathosystems. These new factors include predictions of future weather, predictions of pathogen influx, and methods for interpreting predictions. Examples of new factors in forecasting will be examined. Additionally, improvements in weather monitoring equipment and in computer technology enable the integration of spatial and temporal factors.
Potato blight – do we have the answer?
Dr Huub T. A. M. Schepers
PAV, Postbus 430, 8200 AK Lelystad, The Netherlands
In 1845 late blight made its first dramatic appearance in Europe, but it was not until 15 years later that De Bary (1861) proved beyond any doubt that the fungus Phytophthora infestans was the cause of the late blight epidemic. From the time of its first appearance, attempts were made to control late blight with chemicals. The earliest materials included compounds such as sodium chloride (Tebitt, 1847), lime (Focke, 1846) and sulphur (Kuhn, 1858). Effective control of the disease became possible with the introduction of the copper fungicides in the 1880s, and it was Bordeaux mixture (Millardet, 1885) which opened up the era of chemical control. Later the dithiocarbamates (1930s) and, more recently, the systemic fungicides with curative properties (1970s) were introduced.
It was recognised at a very early stage that seasonal weather played an important role in the development of the disease each year, but it was not until the 1920s that an empirical model of meteorological conditions was developed in The Netherlands to advise growers on what dates to apply fungicides (Van Everdingen, 1926). Subsequently, a number of empirically derived forecasting rules based on weather conditions were developed, for example the Beaumont period (1948), the “Irish rules” (1949) and the Smith period (1956). In the USA two of these empirical systems were amalgamated in 1975 to form the warning system known as Blitecast, which uses temperature, relative humidity and rainfall data to generate a recommendation for the first spray and also subsequent spray intervals. More recently, fundamental models have been developed which incorporate and combine experimental data on the pathogen’s life cycle, meteorological conditions, fungicides and cultivar resistance. During the 1980s, following the development of more powerful computers, a large number of factors and their inter-relationships could be included in forecasting systems. Not all of these systems were introduced in practice, but nowadays farmers can use such systems to support their decision whether or not to spray. Examples of such models are NegFry, Prophy, Plant-Plus, Simphyt and Guntz-Divoux. Although there is no limit to the calculating power of computer programmes as a tool for the development of forecasting models, a limiting factor remains the knowledge we have of crucial aspects of the late blight epidemic. Despite 150 years of research there are still many unanswered questions regarding epidemiology, fungicides, weather conditions and cultivar resistance and their interaction. Improvements in forecasting systems will therefore need to keep pace with our up-to-date knowledge of these factors. The cost of these systems in relation to their benefits and, most importantly, their relative ‘user-unfriendliness’ may well preclude their wide-scale use, at least in the foreseeable future. In my opinion, Decision Support Systems will play a vital role in our ability to control late blight, but they will not in themselves provide the ultimate solution to the problem.
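To illustrate how such empirical rules translate into an algorithm, here is a minimal sketch (added for this booklet page, not from the original abstract) encoding the commonly cited Smith period criteria – two consecutive days each with a minimum temperature of at least 10°C and at least 11 hours at 90% relative humidity or more; the data structures are invented for illustration:

    def meets_smith_criteria(hourly_temp_c, hourly_rh):
        # One day qualifies if its minimum temperature is >= 10 C and it
        # has at least 11 hours with relative humidity >= 90%.
        humid_hours = sum(1 for rh in hourly_rh if rh >= 90.0)
        return min(hourly_temp_c) >= 10.0 and humid_hours >= 11

    def smith_periods(days):
        # A Smith period is two consecutive qualifying days; return the
        # index of each second day, which would trigger a spray warning.
        risky = [meets_smith_criteria(t, rh) for t, rh in days]
        return [i for i in range(1, len(risky)) if risky[i - 1] and risky[i]]

    # 'days' holds (24 hourly temperatures, 24 hourly humidities) per day.
    warm_humid = ([12.0] * 24, [95.0] * 14 + [80.0] * 10)
    cool_dry = ([8.0] * 24, [70.0] * 24)
    print(smith_periods([warm_humid, warm_humid, cool_dry]))  # [1]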
Forecasting apple diseases
Dr Angela Berrie
Horticulture Research International – East Malling, West Malling, Kent, ME19 6BJ, UK
The main UK apple cultivars Cox, Jonagold and Gala (dessert) and Bramley’s Seedling (culinary) are susceptible to a range of fungal diseases which reduce yield and quality and may result in further losses post-harvest through rotting in store. The major UK diseases are scab (Venturia inaequalis) and powdery mildew (Podosphaera leucotricha), and control relies on routine application of fungicide at 7-14 day intervals to achieve the blemish-free fruit required by the market. Such practices are usually reliable, but increased public concern about pesticides and rising costs to growers have led to a reappraisal of their use. The development of disease warning systems offers scope to optimise fungicide use by better timing of sprays. ADEM (Apple Disease East Malling) is a PC-based system which forecasts the risk of scab, mildew, Nectria fruit rot and canker (Nectria galligena) and fireblight (Erwinia amylovora).
For disease warning systems to be adopted by growers for decision-making on fungicide use, they must demonstrate accurate identification of infection periods. Orchard tests over several years have shown the models in ADEM to give accurate warnings of infection periods. Trials have also been conducted to develop practical strategies which make use of warnings for scab and mildew without putting the apple crop at risk. A key-stage strategy has been developed, in which routine fungicide applications are made at the key growth stages of bud burst and petal fall. At other times spray decisions are based on disease warnings generated by ADEM, but also take into account other practical considerations, e.g. treatments for other diseases, pests or nutrient sprays, and also holidays when spray operators may be unavailable.
The key-stage strategy has been tested in large-plot orchard trials and, more recently, used to manage 15 orchards at East Malling. Use of the system has resulted in similar or better disease control compared with routine spray programmes, but with reduced fungicide inputs. ADEM has been available commercially since 1996. Uptake by fruit growers in the UK has been slow, but the advantages of the system are now being recognised and it is being used commercially.
Forecasting the effect of disease as influenced by the host
Dr Fen Beed
University of Nottingham & ADAS Boxworth, Cambridge CB3 8NN
Predictions of yield loss in cereals due to disease are often imprecise because of the variation in the relationship between percent severity measurements and final yield. This study tested the hypothesis that the yield response to a unit of disease changes with differences in the growth of the host. Shading devices, which produced levels of radiation similar to those created by continuous cloud, were used to reduce the growth of winter wheat cv. Slejpner during sequential intervals of crop development. Shading did not affect the progress of disease symptoms of introduced epidemics of Puccinia striiformis (yellow rust), but did affect the crop’s response to fungicidal control. For example, when plants were inoculated at GS 39 to produce a late epidemic, a response of 3 t ha-1 was observed for a crop shaded between GS 31 and GS 39 (during stem extension), 1.25 t ha-1 in a crop shaded between GS 39 and GS 55 (flag leaf emergence to 50% ear emergence) and 2 t ha-1 in an unshaded crop. Differences in yield response were due to differences in allometry during different intervals of development, and particularly to the contribution of current photoassimilate to components of yield. Shading between GS 31 and GS 39 reduced the production of soluble sugar stored in stems and sheaths from 2.5 t ha-1 to 0.5 t ha-1, while shading between GS 39 and GS 55 reduced the number of grains formed in each ear from 42 to 33. In terms of yield, shading between GS 31 and GS 39 produced a “source”-limited crop, as soluble sugars were eventually used to fill grain, and shading between GS 39 and GS 55 produced a “sink”-limited crop. The effect of the late epidemic, in reducing the green area available for the interception of radiation, was greater on crops shaded between GS 31 and GS 39, as compensatory growth to replenish concentrations of soluble sugar was prevented. In contrast, the yield response for crops shaded between GS 39 and GS 55 was less than for an unshaded crop, which had more grains to fill. It can be concluded that accurate predictions of the effect of disease cannot be made if host development and growth are not considered.
Session X – Controlling the problem
Inducing host resistance to pathogens
Prof. John Mansfield
Biological Sciences Department, Wye College, University of London, Wye, Ashford, Kent, TN25 5AH
This paper will include discussion of what we now know about mechanisms of disease resistance in plants and how this knowledge has been, and might be, used to develop new control strategies. The major gaps in our understanding of plant-pathogen interactions will also be considered.
Established features of the plant’s defence are antimicrobial compounds, either phytoanticipins (pre-formed) or phytoalexins (induced). It is perhaps surprising that the rich variety of chemical structures found in plants has not produced a useful fungicide. Molecular genetics has confirmed the role of secondary metabolites in plant-microbe interactions and provides routes to engineer new forms of resistance. The introduction of novel phytoalexins and the modification of structures to enhance antimicrobial activity are becoming more attainable as the complexity of biosynthetic pathways is unravelled. Activation of local accumulation of phytoalexin is, in some plants, followed by induction of systemic acquired resistance (SAR) in distant plant parts. The expression of SAR is characterized by increased speed of response in the protected tissue. Such a potentiation towards resistance has been linked to accumulation of salicylic acid. Compounds which might activate SAR or enhance natural defence responses have potential in crop protection, but not all plants respond in the same way. For example, dichloroisonicotinic acid, an effective inducer of SAR in tobacco and Arabidopsis, surprisingly can cause increased susceptibility to downy mildew in lettuce.
SAR and other defence responses are associated with the hypersensitive reaction (HR) at infection sites. The recognition processes leading to the HR are closely linked to gene-for-gene interactions between pathogens and their hosts. There has been remarkable progress in cloning genes for resistance to a range of pathogens including bacteria, fungi, nematodes and viruses. The emerging theme so far is that the proteins encoded by resistance genes are structurally related. Introduction of cloned genes into previously susceptible plants confers resistance. A notable success has been the use of the Xa21 gene to engineer resistance to bacterial blight of rice, but how durable will the introduced resistance be against rapidly evolving pathogens? Understanding the signal transduction pathways that activate the HR requires characterization of both the resistance gene in the host and the avirulence (avr) gene in the pathogen. In fungi, such ‘matching pairs’ are only available for Cladosporium fulvum. Compared with the bacterial pathogens, the avr genes from fungi are poorly understood, especially amongst the obligate parasites, the rusts and mildews, which remain of major economic importance. Cloning genes on the basis of mapped molecular markers should allow potential avr genes to be isolated from the obligate parasites. Recent results with several bacterial genes, for example avrPphB and avrPphE from Pseudomonas syringae pv. phaseolicola, have demonstrated that their expression within plant cells leads to the HR, i.e. the encoded Avr proteins act as the elicitors of the plant’s response. Expression in the plant may remove the stumbling block of transforming obligate fungal parasites, which would otherwise be necessary to confirm gene function.
Understanding the delivery of Avr proteins from bacteria has led to the discovery of a key determinant of pathogenicity, the type III secretion system. Analysis of how fungal avr genes function might lead to the discovery of similar fundamental processes in fungi which lead to the establishment of obligate parasitism. New targets for chemotherapeutic intervention should emerge from these basic studies.
In the preface to his book on Physiological Plant Pathology, R.K.S. Wood wrote in 1967,
“…. most plants resist infection and colonization by most bacteria and fungi. They are naturally in the state that we still seek to reproduce by the use of fungicides that have for the most part been discovered …. by empirical methods”. Although, 30 years on, this statement remains a useful focus for further studies, our increased understanding of resistance has revealed several direct routes and new avenues for the development of disease control strategies.
GMOs – a boon or a major risk?
Dr Philip J Dale
Cambridge Laboratory, John Innes Centre, Colney Lane, Norwich NR4 7UH, UK
Modern methods of genetic modification (GM) present opportunities to improve our crops, but also challenges to manage the technology carefully and responsibly. It is important to assess the potential impact of GM crops within the context of conventional plant breeding. Many of the issues raised by the widespread use of GM crops are familiar to the traditional plant breeder. However, some are different, and as geneticists, breeders, pathologists and agriculturalists we need to take account of this. Because we can introduce genes into our crops from viruses, bacteria, plants, animals and humans, and even make genes synthetically in the laboratory, there is international agreement that a risk assessment should be carried out to determine their possible impact on human health, the environment and food. This involves asking a series of questions about the modified crop plant and generating new data where necessary.
GM methods provide us with very important opportunities both to understand the ways in which plants defend themselves against diseases and in the design of new kinds of resistance mechanisms, some of which are likely to be more robust than those available through traditional breeding. There are also a number of challenges that genetic modification presents, including developing new agricultural strategies for their management; the extent to which their use should be governed by regulation, market forces and codes of practice; and how they can benefit developing countries.
The role of sanitation in suppressing inoculum
Mr David J. Yarham
Croxton Cereal Pathology, Fulmodeston, Fakenham, Norfolk NR21 0NP
The aim of any disease control strategy is to delay for as long as possible the epidemic development of the pathogen. This can be achieved either by slowing the rate of increase of the pathogen on the host or by reducing the initial level of inoculum available for infection. The interaction of these two approaches has been expressed mathematically by Van der Plank in the equation:

t = (2.30/r) log10(Ia/Ib)

where “t” is the delay in the development of an epidemic achieved by reducing the initial level of inoculum (“Ia”) to a lower level (“Ib”) when the rate of increase of the pathogen during the delay period is “r”.
Any strategy aimed at reducing “r” will be assisted by a reduction in the value of “Ib“. Conversely, the benefit derived from reducing “Ib” will be lessened if the value of “r” is high. Obviously, if either “Ib” or “r” can be reduced to zero the value of “t” will be increased to infinity and complete control of the pathogen will have been achieved.
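As a worked illustration of the equation above (a sketch added to this booklet page; the numbers are arbitrary):

    import math

    def epidemic_delay(r, Ia, Ib):
        # Van der Plank: t = (2.30 / r) * log10(Ia / Ib), the delay gained
        # by reducing initial inoculum from Ia to Ib when the apparent
        # infection rate during the delay period is r (per day).
        return (2.30 / r) * math.log10(Ia / Ib)

    # A 100-fold sanitation effect is worth 23 days against a slow
    # pathogen (r = 0.2) but only about 7.7 days against a fast one
    # (r = 0.6), echoing the point about high values of "r" above.
    print(epidemic_delay(0.2, 100.0, 1.0))  # 23.0
    print(epidemic_delay(0.6, 100.0, 1.0))  # ~7.67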
In protected horticulture strict hygiene can so completely eliminate inoculum of some obligate parasites that there is no need for the use of chemicals to control them. In agricultural practice elimination of indigenous pathogens is seldom a feasible option, and for some the rate of increase is so rapid as almost to obviate the benefits of inoculum reduction. In many situations, however, a reduction in the initial level of inoculum can greatly augment the use of other methods of delaying epidemic development and can thus form a vital component of a disease control strategy.
Are fungicides the ultimate answer to disease control?
Mr Andy Leadbeater
Novartis Crop Protection, CH4002 Basle, Switzerland
The use of chemical fungicides is routine practice in agriculture and horticulture throughout the world as a measure to provide protection against yield and quality reducing plant diseases. The need for fungicides should however always be questioned, as they are only a part of the integrated management of crops and are almost the final step after consideration of agronomic good practice to reduce the occurrence and effects of pathogens.
New technologies bring new opportunities for disease control. These technologies include new classes of conventional fungicides such as the anilinopyrimidines and strobilurins, transgenic crops offering disease resistance, and utilisation of plants’ natural defence mechanisms through Systemic Acquired Resistance (SAR). These have now reached the stage where practical products are available; in SAR, for example, the plant activator acibenzolar-S-methyl (“Bion”). We therefore have several new approaches to plant disease control that provide real alternatives to the conventional fungicide one. Chemical fungicides are increasingly selected for yield and quality effects rather than purely for disease control.
The current status of these new technologies is reviewed, together with their possible impact on future management strategies. A major issue with new technology such as transgenic crops is acceptance by the public and in the market place; this could be a real barrier to advances being made.
Whilst fungicides are certainly not the ultimate answer to disease control, they equally certainly have their place and will continue to do so, providing effective and flexible disease control and yield and quality improvements, whilst also in themselves protecting the new technology (and being protected) in terms of durability of control and resistance management.
Session XI – Turning research into advice
Decision support systems – the answer to the ultimate question?
Dr David H. Brooks
Screech Cottage, Northchapel, Petworth, West Sussex, GU28 9EG
A very large number of decision support systems (DSS) have been developed for use in agriculture but, almost without exception, these have fallen into disuse after a short period. Reasons include failure to pay adequate attention to users’ needs in the design phase, systems that were too difficult or demanding to use, and systems that were not perceived to give sufficient benefit to the user.
The advent of powerful PCs, and the increasing willingness of farmers and advisers to use them, is making possible a new generation of DSS based on mathematical models which simulate the development of crops and their associated pests and diseases, accounting for the impact of weather, pest pressure and other factors, and predicting the outcome of possible remedial measures such as application of pesticides.
The Decision Support System for Arable Crops (DESSAC) will, in time, provide a suite of such DSS modules which will address all of the key decision areas facing arable crop farmers. The module being developed for winter wheat fungicides illustrates some of the criteria which are thought to be important to the success of a DSS. However, ultimately it is the farmer or his adviser who takes crop management decisions: DSS can only provide support for such decisions.
Are farmers getting the message?
Mr William S. Clark
ADAS Boxworth, Cambridge CB3 8NN
Despite the technological revolution in global communication and a plethora of technologies that have engulfed the agricultural and horticultural industries, many growers still find themselves isolated. The amount of information available within a subject area can appear vast to anyone surfing the net, but in most instances information is not the problem. Gathering information does not necessarily help a user to make a decision. Historically, ‘information systems’ have been set up to ‘help’ growers make decisions, but almost without exception they have failed. DESSAC, described elsewhere in these abstracts, is one of the pioneers of a new breed of decision support systems which aim to help users rather than bombarding them with information. Support to growers is undoubtedly vital as economic pressures grow and research produces more and more complex findings which need to be interpreted and then packaged for end users. Constructing advisory messages is alien to many researchers but, having constructed the messages, the method of delivery to end users can still be a problem. The best form of delivery is face-to-face discussion, but this is grossly inefficient. Many variants of this have formed – specialist discussion groups, crop centres, focus groups, friends of research centres, research roadshows – and yet we generally reach only a very small percentage of growers, relying on a trickle-down to the majority of end users. The balance of funding for research versus funding for dissemination of advisory messages is under consideration by many funders but is not yet right. Without adequate funding for the transfer of messages from research findings there is little point in the research itself, other than as an academic exercise.
Do gardeners matter?
Pippa Greenwood
Hawkley, Liss, Hants. GU33 6NR
Given a title like this, my first instinct was simply to write ‘YES’. Gardeners may not be the majority land owners or cultivators, but their potential influence on pathology is still very significant.
Domestic gardens and allotments are often blamed by farmers and growers as the source of many disease outbreaks; the extent to which this is a fair criticism is analysed.
Whether growing edible crops or ornamentals, or a combination of both, the keen gardener should perhaps be viewed as a useful observer, perhaps not driven by commercial pressure, but all the same, often extremely observant and knowledgeable. Some examples of problems affecting garden plants, and the various methods we use to communicate information and advice about these to gardeners will be discussed. Problems associated with turning the information provided by research into information which can be easily understood and then used to the gardener’s advantage are also discussed.
At certain levels, plant pathogens which prove to be a ‘gardener’s nightmare’ can raise questions and initiate investigation and research. After all, pathogens which may be of lesser significance commercially are still fascinating, and research on these could provide useful spin-off sources of information for farmers and growers.
The role of botanic gardens in plant pathology. Looking forward to 1998, an exciting year in prospect
Prof. David S. Ingram FRSE
Royal Botanic Garden Edinburgh, 20A Inverleith Row, Edinburgh, EH3 5LR
Botanic gardens have an important role to play in applied plant pathology, as follows:
i) the maintenance of reference collections – living, preserved (in herbaria) and books;
ii) research in systematics (taxonomy and phylogeny), based upon the collections;
iii) diagnosis, including the development of novel methods;
iv) studies of the role of pathogens in natural ecosystems, and conservation;
v) education, including the training of new professionals, contributions to schools’ curricula and the provision of information and advice to the public.
These functions will be adumbrated and illustrated.
1998 will be an exciting year for the British Society for Plant Pathology, with the prospect of constitutional change and the Seventh International Congress to be held in Edinburgh in August. The talk will therefore end with a brief overview of the year to come from the perspective of the incoming President of both the Society and the Congress.
PH Gregory Prize Competition
Development of a PCR-based detection technique for Pyrenopeziza brassicae, causal agent of light leaf spot on winter oilseed rape (Brassica napus L. subsp. oleifera).
Simon J. Foster1, A.M. Ashby1 & B.D.L. Fitt2
1 Department of Plant Sciences, University of Cambridge, Downing Street, Cambridge, CB2 3EA.
2 IACR-Rothamsted, Harpenden, Hertfordshire, AL5 2JQ
The heterothallic discomycete fungus Pyrenopeziza brassicae Sutton and Rawlinson (anamorph Cylindrosporium concentricum) is the causal agent of light leaf spot of brassicas. In particular, light leaf spot is considered to be one of the most damaging diseases of winter oilseed rape (Brassica napus L. subsp. oleifera) in the UK. The work described here represents the initial stages in the development of a polymerase chain reaction (PCR) based diagnostic technique that will be used to detect P. brassicae in infected plant tissues. Degenerate PCR primers, designed to amplify a region of DNA encoding the Sex Factor Induced (SFI1) protein, were shown to amplify a 700 base pair (bp) product from DNA extracted from a range of P. brassicae isolates of mating type MAT 1-2. This PCR product was not generated upon amplification of DNA from other fungal pathogens of oilseed rape and was shown to hybridise to 5 kb (kilobase pair) and 3 kb Sal I fragments from P. brassicae isolates of mating types MAT 1-1 and MAT 1-2 respectively. Sequence analysis of the 700 bp PCR fragment enabled the design of fully homologous primers which facilitated the amplification of PCR products from P. brassicae DNA only. These primers also differentiated between the two mating types of P. brassicae. The findings detailed here are discussed in relation to a light leaf spot forecasting system currently under development.
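For illustration only, degenerate primers of this kind can be screened against candidate sequences in silico by expanding IUPAC ambiguity codes into a search pattern; the sketch below (Python) uses a hypothetical primer and template rather than the SFI1 primers, which are not given in the abstract.

import re

# IUPAC ambiguity codes expanded to regular-expression character classes.
IUPAC = {
    "A": "A", "C": "C", "G": "G", "T": "T",
    "R": "[AG]", "Y": "[CT]", "S": "[CG]", "W": "[AT]",
    "K": "[GT]", "M": "[AC]", "B": "[CGT]", "D": "[AGT]",
    "H": "[ACT]", "V": "[ACG]", "N": "[ACGT]",
}

def primer_sites(primer, template):
    """Return start positions at which a degenerate primer matches a template."""
    pattern = "".join(IUPAC[base] for base in primer.upper())
    return [m.start() for m in re.finditer(pattern, template.upper())]

# Hypothetical 12-mer with two degenerate positions (R = A/G, Y = C/T):
print(primer_sites("GGRTCYAAGGCT", "ttGGATCCAAGGCTaaGGGTCTAAGGCT"))  # -> [2, 16]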
Production, separation, toxicity and metabolism of the solanapyrone toxins produced by the chickpea pathogen, Ascochyta rabiei
Khalid Hamid
Department of Biology, Darwin Building, University College London, Gower Street, London WC1E 6BT
Ascochyta rabiei secreted the toxins solanapyrones A, B and C when grown on a medium consisting of Czapek Dox nutrients supplemented with cations. After filtering off the fungus, the toxins were partitioned into ethyl acetate and separated by flash chromatography on silica gel using (1) cyclohexane : dichloromethane : ethyl acetate 3:3:1 (v/v/v), (2) the same solvents in equal proportions and lastly (3) ethyl acetate. Samples of the crude ethyl acetate fraction, obtained by partitioning culture filtrates, and pure compounds from flash chromatography were chromatographed on a C18 HPLC column with tetrahydrofuran : methanol : water 20.6 : 23.1 : 56.3 (v/v/v) in order to quantitate the toxins and determine their recovery.
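As a simple illustration of the recovery calculation implied here, the sketch below compares the amount of each toxin recovered after chromatography with the amount measured in the crude fraction by HPLC; all quantities are invented for demonstration only.

# All quantities are invented; recovery = purified amount / crude amount x 100.
crude_ug = {"solanapyrone A": 120.0, "solanapyrone B": 85.0, "solanapyrone C": 40.0}
purified_ug = {"solanapyrone A": 96.0, "solanapyrone B": 70.0, "solanapyrone C": 26.0}

for toxin, crude in crude_ug.items():
    print(f"{toxin}: {100.0 * purified_ug[toxin] / crude:.0f}% recovery")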
The range of sensitivity of cells isolated from leaflets of 12 cultivars of chickpea to solanapyrone A varied 5-fold, and solanapyrone A was 2 – 12 times more toxic than solanapyrone B, depending on cultivar. When shoots of chickpea were placed in solutions of solanapyrone A the stems lost their turgor and became shrivelled. In contrast, the stems of shoots placed in solanapyrone B remained turgid but the leaves became twisted and chlorotic. These symptoms are typical of Ascochyta blight. Both compounds were metabolised by shoot cuttings, by cells isolated from leaflets and by a protein preparation from shoots of the plants.
Relationship between height and resistance to fusarium ear blight in wheat
Alex J. Hilton1, P. Jenkinson1, T.W. Hollins2 & D.W. Parry3
1 Crop and Environment Research Centre, Harper Adams Agricultural College, Newport, Shropshire TF10 8NB
2 Plant Breeding International, Maris Lane, Trumpington, Cambridge
3 Horticulture Research International – East Malling, West Malling, Kent
The significance of morphological characters such as straw height, peduncle length, compactness of ear and angle of flag leaf on the development of Fusarium ear blight (FEB) in eight cultivars of winter wheat was studied in a field trial in 1994/95. Straw height was shown to be significantly correlated with disease severity, with taller cultivars such as Spark and Cadenza showing fewer symptoms of FEB than shorter cultivars such as Brigadier and Genesis. No other morphological character showed any significant correlation with disease.
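For illustration, the height–severity relationship can be examined with a Pearson correlation of the kind implied above; the sketch below uses invented cultivar data, since the actual heights and severities are not given in the abstract.

import numpy as np

# Invented data for eight cultivars: straw height (cm) and FEB severity (%).
height_cm = np.array([95.0, 90.0, 84.0, 78.0, 72.0, 68.0, 66.0, 60.0])
severity = np.array([8.0, 10.0, 14.0, 18.0, 22.0, 25.0, 27.0, 31.0])

# A strongly negative Pearson r means taller cultivars showed less disease.
r = np.corrcoef(height_cm, severity)[0, 1]
print(f"r = {r:.2f}")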
To determine if this relationship was the result of genetic associations, cultivars of varying height were intercrossed, and the resulting F3 lines were assessed following artificial inoculation. Among random F3 populations, there was a clear tendency for tall straw and resistance to FEB to co-segregate. The effect of the individual dwarfing genes Rht 1 and Rht 2 on severity of FEB was studied in an artificially inoculated field trial in 1996/97 using a number of near-isogenic lines of Maris Huntsman. The presence of the dwarfing genes caused a significant increase in severity. These trials suggested that the relationship between disease and height is controlled either by linked genes or by genes with a pleiotropic effect.
The monitoring of humidity at ear height in both short and tall isogenic lines of Maris Huntsman showed no difference between lines that could explain the increased severity of FEB in the shorter lines. It was concluded that the relationship between disease severity and height is controlled by linked genes and not by micro-climate effects.
Ultrastructure and composition of the cell surface of Colletotrichum lindemuthianum conidia.
Bleddyn Hughes1, R. Carzaniga2,3, R.J. O’Connell2 and J.R. Green1
1 School of Biological Sciences, University of Birmingham, Birmingham, B15 2TT
2 IACR – Long Ashton Research Station, Department of Agricultural Sciences, University of Bristol, Long Ashton, Bristol BS18 9AF; 3 Istituto di Patologia Vegetale, Università di Milano, Via Celoria 2, 20133 Milano, Italy
The spores of Colletotrichum lindemuthianum, causal agent of bean anthracnose, are coated with a fibrillar layer arranged perpendicular to the cell wall. This spore coat is stained by silver proteinate and removed by protease digestion, indicating the presence of glycoproteins. The monoclonal antibody UB20 was raised to germlings growing in vitro and recognises a carbohydrate epitope carried by glycoproteins on the spore surface (Pain et al., 1992). Western blotting is used to demonstrate that the glycoproteins recognised by UB20 are extracted from spores by hot water, SDS and by β-mercaptoethanol at alkaline pH. Electron microscopy shows that the spore coat is also removed by these treatments. The major glycoprotein extracted from the cell is identified at 100 kD. Biotinylation of washed spores shows that this glycoprotein is located at the spore surface. Further characterisation of the glycoprotein demonstrates that it contains both O-linked and complex N-linked carbohydrate side-chains. High performance anion-exchange chromatography suggests that the carbohydrate moiety may contain rhamnose. The 100 kD glycoprotein is currently being characterised using molecular biological techniques and a monoclonal antibody is being raised. Possible functions of the 100 kD glycoprotein will be discussed.
Reference
Pain et al. (1992). Physiological and Molecular Plant Pathology, 41, 111-126.
The role of saprophytic microflora in the development of fusarium ear blight of winter wheat caused by Fusarium culmorum.
Jo Liggit
Crop and Environment Research Centre, Harper Adams Agricultural College, Newport, Shropshire TF10 8NB
A number of fungicides were shown to give effective control of F. culmorum in vitro, moderately effective control of Fusarium ear blight (FEB) in the glasshouse, but poor control under field conditions. It was proposed that saprophytic microflora may interact with ear blight pathogens of wheat and contribute to the poor performance of fungicides against this disease. A glasshouse experiment and a series of experiments in vitro were conducted to determine the relationship between saprophytic microflora and Fusarium culmorum and to determine fungicide effects on Alternaria alternata, Botrytis cinerea, Cladosporium herbarum and Fusarium culmorum. Inoculation of winter wheat ears (cultivar Avalon) in the glasshouse with either A. alternata, B. cinerea or C. herbarum at GS 59 prior to inoculation with F. culmorum at GS 69 led to a decrease in the severity of FEB. The mean percentage of spikelets infected was reduced from 19% for ears which had been inoculated with F. culmorum alone to 4, 6 and 5% for ears which had been inoculated with A. alternata, B. cinerea and C. herbarum, respectively. In vitro, the mycelial growth of F. culmorum was reduced when inoculated opposite A. alternata, B. cinerea and C. herbarum. The antagonism was shown to be due to the production of non-volatile and volatile antibiotics by the saprophytes. The saprophytes showed differential sensitivity to the fungicides benomyl, chlorothalonil, fluquinconazole, flusilazole, flutriafol, prochloraz, pyrimethanil and tebuconazole.
The possibility that fungicides differentially suppress saprophytic microflora on the ears of wheat leading to poor control of FEB in the field is discussed.
Identification and toxicity of Alternaria isolates obtained from cruciferous crops grown in Thailand
Preprame Pattanamahakul
Department of Biology, Darwin Building, University College London, Gower Street, London WC1E 6BT
Fungi isolated from annular, necrotic leaf spots on leaves of cabbage, cauliflower, Chinese Kale and Choi-Sum, grown in northern Thailand, produced pigmented multicellular spores characteristic of the genus Alternaria. Comparison of the nucleotide sequences of the ITS2 region of the ribosomal DNA of eight of the isolates, two from each host plant, with that published for an authentic isolate of Alternaria brassicicola showed complete identity.
Filtrates from cultures grown on Czapek-Dox nutrients, supplemented with cations, were toxic to cells isolated from all four host plants. An isolate from cauliflower was the least toxic and one from cabbage the most. About half the activity was retained by a dialysis membrane and the remainder was diffusible and partitioned into ethyl acetate. Purification of the ethyl acetate fraction by solid phase extraction on C18 cartridges and thin layer chromatography on silica gel led to the isolation of two toxic compounds with characteristic UV spectra.
Offered Posters
The fungicidal properties of plant extracts and essential oils
Alefyah Ali & Avice M. Hall
Department of Environmental Sciences, University of Hertfordshire
The aesthetic, medicinal and antimicrobial properties of plant essential oils have been known since ancient times. Numerous studies on the fungicidal and fungistatic activities of essential oils have indicated that many have the power to inhibit fungal growth.
Several studies have been conducted within the University of Hertfordshire on the antimicrobial effects of the essential oils of basil (Ocimum basilicum), coriander (Coriandrum sativum), lavender (Lavandula angustifolia), neem (Azadirachta indica) and thyme (Thymus vulgaris). Both in vitro and in vivo experiments were designed to determine the fungicidal, fungistatic and phytotoxic effects of the oils, both as contact and as fumigant fungicides. The test organisms were a range of economically significant fungi (Alternaria sp., Aspergillus sp., Botrytis cinerea, Erysiphe graminis). Several techniques were evaluated to find the Minimum Inhibitory Concentration (MIC) of each oil needed to inhibit the growth of the fungi; these included the droplet technique, the borehole method and the disc diffusion method. Vapour chambers were constructed to evaluate the fungicidal properties of the volatile components of the oils.
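For illustration, reading an MIC from a dilution series amounts to taking the lowest tested concentration at which growth is inhibited; the sketch below shows this with invented figures, not data from these experiments.

def mic(results):
    """Return the lowest tested concentration at which no growth was recorded."""
    inhibitory = [conc for conc, grew in results if not grew]
    return min(inhibitory) if inhibitory else None

# Hypothetical % (v/v) dilution series for one oil against one test fungus:
series = [(0.05, True), (0.1, True), (0.2, False), (0.4, False)]
print(mic(series))  # -> 0.2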
Various degrees of inhibition were observed for all oils examined except Neem oil, which did not show any fungicidal properties. Thyme oil proved to be extremely effective as a fumigant as well as a contact fungicide. The findings emphasise the fungitoxicity of various oils and pave the way for further studies and for their adoption as effective fungicides in the horticulture and agriculture industries.
The history of fireblight in England in relation to weather
Eve Billing
4 Fromandez Drive, Horsmonden, Tonbridge, Kent TN12 8LN
Movement of diseased plant material was an important factor in the spread of fireblight across England, but the amount of disease in different seasons on different hosts and cultivars was largely weather dependent. Major outbreaks of fireblight since 1957 were as follows: pear secondary blossom (1957 – 1960, 1966, 1967); late apple primary blossom (1978, 1982); hawthorn blossom (1960, 1964, 1982); pyracantha and cotoneaster blossom (1960, 1966, 1967 and some later years). Significant shoot blight on two or more hosts was associated with storms (1968, 1969, 1982, 1994). In 1982, disease was widespread and severe on all hosts except pear. Since then, outbreaks on pear, apple and hawthorn have been sporadic but rarely severe. Weather patterns associated with outbreaks will be shown and problems of cross-infection between hosts described. The importance for risk assessment of good flowering records for each host and of storm records is emphasised.
Field Resistance of Wheat Varieties to Isolates of Mycosphaerella graminicola.
J K M Brown1, H-R Forrer2, G H J Kema3, L S Arraiano1, P A Brading1, E M Foster1, E Jenny2, A Hecker2 and E C P Verstappen3
1 John Innes Centre, Colney Lane, Norwich, NR4 7UH, England
2 Swiss Federal Research Station for Agroecology and Agriculture, Reckenholzstrasse 191/211, 8046 Zürich-Reckenholz, Switzerland
3 DLO Research Institute for Plant Protection (IPO-DLO), P O Box 9060, 6700 GW Wageningen, The Netherlands
Field trials of the resistance of 69 wheat varieties to single-pycnidium isolates of Mycosphaerella graminicola were carried out in England, Switzerland and the Netherlands in 1995, 1996 and 1997. These included varieties and breeding lines from the Czech Republic, England, France, Germany, the Netherlands, Portugal, Sweden and Switzerland, varieties from Europe and America which are parents of precise genetic stocks held at the John Innes Centre, and Veranopolis and Kavkaz-K4500, which have been identified as possible sources of resistance.
There was great variation among the varieties in their resistance to M. graminicola. A Czech breeding line was the most resistant, followed by Veranopolis, but several varieties which have been grown widely in the UK also had very good resistance. This indicates that there is considerable potential for UK breeders to improve the general level of resistance using well-adapted germplasm but further improvements in resistance may be obtained by introgression from exotic varieties.
There was also considerable specificity in interactions between varieties and isolates. In particular, a number of lines were resistant to IPO323 but susceptible to other isolates; these include parents of JIC precise stocks from countries throughout western and eastern Europe and from the United States. Variety × isolate interaction must therefore be taken into account in wheat breeding, because the evolution of specifically virulent M. graminicola populations presents a threat to the durability of varietal resistance. Although very little is currently known about the frequencies of specific virulences towards wheat varieties, the trials gave a hint that there may be regional differentiation between sub-populations of M. graminicola from different parts of Europe, since varieties which are used as resistant parents in Portuguese wheat breeding programmes were very susceptible to several Dutch isolates.
Effects of meteorological conditions, sclerotial position and cropping practice on Sclerotinia in field-grown lettuce
J.M. Ll Davies1, C S Young2, J M Whipps3, S P Budge3, L C Hiron1, J A Smith2, W J Stevenson2 and M. Watling1
1 ADAS Terrington, Terrington St. Clement, Kings Lynn, Norfolk. PE34 4PW
2 ADAS Wolverhampton, “Woodthorne”, Wergs Rd, Wolverhampton. WV6 8TQ
3 Horticulture Research International, Wellesbourne, Warwick, CV35 9EF
Five sequential crops of lettuce were planted in 1996 and 1997, on a naturally infected site and a disease-free site, in both Cheshire and East Anglia. Plots of single (normal) spaced plants and double spaced plants were artificially infested with sclerotia placed at specific positions relative to the plants. Meteorological stations were set up in 1995 to measure rainfall, air and soil temperatures in the winter months and, in addition, soil moisture, soil moisture tension, surface wetness, wind speed and solar radiation during the growing seasons. The maximum sclerotial germination observed per plot in 1996 and 1997, respectively, was 100% and 52% in Cheshire, and 34% and 16% in East Anglia. There was considerable variation in the timing and quantity of apothecial development between plantings and particularly between sites. Disease only occurred at the Cheshire site, with maximum disease incidence per inoculated plot of 66% and 22% plants affected in 1996 and 1997, respectively. Disease was not related to the presence of apothecia directly beneath plants, which suggests that airborne ascospores may be responsible for infection. A qualitative examination of the meteorological data suggested that temperatures between 10 and 18°C and an increase in soil tension, i.e. a period of drying following a rain event, appeared to stimulate the formation of apothecia in the field. Laboratory and glasshouse studies under controlled conditions are underway to confirm field observations of the key factors controlling apothecial production and lettuce infection.
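The qualitative weather pattern described above can be written as a simple screening rule; the sketch below is illustrative only, and the drying test and example values are assumptions rather than validated thresholds.

def apothecia_favourable(soil_temp_c, tension_today, tension_yesterday):
    """Flag a day as favourable when soil is 10-18 °C and drying after rain."""
    drying = tension_today > tension_yesterday  # rising soil moisture tension
    return 10.0 <= soil_temp_c <= 18.0 and drying

print(apothecia_favourable(14.0, 30.0, 12.0))  # -> True (invented values, kPa)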
The effect of the spore concentration of Verticillium albo-atrum and salt treatment on infection and disease progression in tomato plants
Murat Dikilitas & C. J. Smith
School of Biological Sciences, University of Wales Swansea, Singleton Park SA2 8PP
The pathogenic effect of Verticillium albo-atrum on tomato plants was increased when plants were inoculated with a high spore concentration of the fungus at an early stage of growth (four-week-old seedlings). When plants were inoculated at a later stage (five weeks old) with a lower spore concentration, the effect of the pathogen on tomato plants was decreased, although the fungus was still pathogenic to tomato.
The severity of the disease was higher in tomato plants treated with the fungus and salt solution, than in plants given either treatment separately.
Pathogenicity of lucerne (V1) and tomato (V2) isolates of Verticillium albo-atrum to salt-tolerant lucerne strains
Murat Dikilitas, C.J. Smith & J. M. Milton
School of Biological Sciences, University of Wales Swansea, Singleton Park SA2 8PP
The pathogenicity of a lucerne (V1) isolate and a tomato (V2) isolate of Verticillium albo-atrum to salt-tolerant lucerne strains (150, 200, 250 and 300 mol/m³) was investigated under both greenhouse and field conditions. Six-week-old salt-tolerant lucerne seedlings (Medicago media) were inoculated by root dipping and wound inoculation. Height and symptom index were assessed from one week after inoculation for a period of ten weeks. Isolate V2 caused mild symptoms under greenhouse conditions but did not cause any symptoms or height reduction on salt-tolerant strains under field conditions, while isolate V1 caused severe symptoms on the salt-tolerant plants both in the greenhouse and under field conditions. Susceptibility of the salt-tolerant plants to the disease increased under both sets of conditions as their level of salt tolerance increased.
Assessing the Greening Effect of New Fungicide Chemistry on Winter Wheat
Leonor Leandro1, Rosie J. Bryson2, W. S. Clark2 & J. Craigon1
1 Dept. of Environmental Sciences, Sutton Bonington Campus, University of Nottingham, Sutton Bonington. Leics. LE12 5RD
2 ADAS Boxworth, Battlegate Road, Boxworth. Cambs. CB3 8NN
Two new groups of plant protection chemicals, strobilurin fungicides and plant activators, have been reported to have, in addition to disease control, a ‘greening’ effect on treated plants, the leaves having a more intense green colour and staying green longer than controls. The main aim of this study was to test whether the claimed greening effect was measurable on winter wheat (Triticum aestivum L.) with methods currently used to assess leaf colour. A secondary aim was to compare the performance of the available methods in detecting possible differences in greenness.
Chlorophyll content of the leaves was assessed by extraction in acetone and with a portable SPAD-502 chlorophyll meter. Spectral transmittance and reflectance were measured with a LI-COR LI-1800 spectroradiometer with an integrating sphere attachment. A high correlation was obtained for the calibration curve (R² = 0.97) between SPAD readings and extractable chlorophyll. The relation between SPAD readings and the NIR/R ratio calculated from LI-COR measurements was also highly correlated (R² = 0.92). Both the SPAD meter and the LI-COR detected differences in greenness (P<0.01) between leaf layers, with the second leaf being the greenest in all treatments. Differences were also found (P<0.01) between untreated and fungicide-treated plants. The top three leaf layers of the strobilurin-treated plants were shown to be significantly greener (P<0.05) than corresponding leaves of all other treatments, with a suggestion of a stronger effect on older leaves. This suggestion was further supported by the lower percentage (P = 0.017) of dead or very senesced fourth-layer leaves found in this treatment. Transmittance spectra provided an indication that there was also an effect of the strobilurin fungicide on internal leaf structure. No significant greening effect was found for the plant activator.
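For illustration, the calibration step reported above (a linear fit of extractable chlorophyll against SPAD readings, summarised by R²) can be sketched as follows; the data points are invented, not the study's measurements.

import numpy as np

spad = np.array([20.0, 28.0, 35.0, 41.0, 48.0, 55.0])  # SPAD-502 readings
chl = np.array([0.9, 1.4, 1.9, 2.3, 2.9, 3.4])         # extracted chlorophyll (invented units)

slope, intercept = np.polyfit(spad, chl, 1)  # linear calibration curve
predicted = slope * spad + intercept
ss_res = np.sum((chl - predicted) ** 2)
ss_tot = np.sum((chl - chl.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"chl = {slope:.3f} x SPAD + {intercept:.3f}, R2 = {r_squared:.2f}")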
Foliar fertilisers can suppress Septoria tritici
Ruth L Mann, Peter S. Kettlewell & Peter Jenkinson
Crop and Environment Research Centre, Harper Adams Agricultural College, Newport, Shropshire TF10 8NB
Septoria tritici is a widespread and damaging disease of winter wheat in the UK. It has been suggested that fertiliser application, particularly of potassium chloride, to the foliage of wheat will suppress S. tritici.
A field experiment was set up to investigate the efficacy of potassium chloride compared with a conventional fungicide. Potassium chloride or epoxiconazole (Opus, BASF) was applied to the leaves of winter wheat, cultivar Consort, either at growth stage (GS) 31, or 39, or at both growth stages, or according to a disease threshold. The percentage leaf area infected with S. tritici was assessed visually at GS 71. Although differences between timings were not significant, applications of potassium chloride significantly reduced the percentage area of leaf 2 affected with S. tritici, from 18.4% in unsprayed controls to between 10.3% and 12.3% in treated plots. A similar situation was observed following epoxiconazole applications, where between 2.1% and 9.5% of leaf 2 area was affected, depending on timing, compared with 18.4% in control plots. On leaf 3, two applications of potassium chloride or epoxiconazole proved most effective, reducing the severity of S. tritici from 28.6% in control plots to 22% for potassium chloride and 9.7% for epoxiconazole treated plots. This may be because leaf 3 was more severely infected. Therefore foliar-applied potassium chloride can suppress S. tritici, although to a lesser degree than epoxiconazole.
In vitro experiments were undertaken to investigate whether the potassium or the chloride ion was the more active against S. tritici mycelial growth and spore germination. Potassium chloride, sodium chloride and potassium nitrate were used. All three salts significantly reduced mycelial growth and spore germination. Both potassium chloride and sodium chloride caused an almost linear reduction in mycelial growth up to 2 M concentration. Potassium nitrate caused a greater reduction of mycelial growth up to 1 M concentration compared with the chloride salts. Concentrations of potassium nitrate greater than 1 M did not cause a further reduction in growth. Spore germination was significantly reduced from 98% to approximately 2% by all three salts at concentrations of 1.5 M and 2.0 M. Since all three salts were effective, it cannot be only the potassium ion or only the chloride ion that is responsible for reducing mycelial growth and spore germination. It is possible that this reduction is caused by adverse osmotic conditions.
Survival of Phytophthora infestans sporangia exposed to solar radiation
E. S. G. Mizubuti & W. E. Fry
Dept. of Plant Pathology, 334 Plant Science Building, Cornell University, Ithaca, NY 14853, USA
Sporangia of Phytophthora infestans of the US-1 and US-8 clonal lineages were collected from lesions and exposed to direct solar radiation (SR), and viability was assessed by germination. Exposures during a three-hour period from 0800 to 1100, 1100 to 1400 or 1400 to 1700 on sunny days (SR > 600 W/m²) resulted in practically complete inhibition of germination after each exposure period, regardless of the time of day. Sporangia were then exposed on sunny or cloudy days (SR < 400 W/m²) from 1100 to 1400, with samples taken at hourly intervals. On sunny days, sporangia had low germination levels after one hour of exposure. On overcast days (SR < 300 W/m²) survival after three hours was only slightly reduced. On cloudy days the average ED95 of solar radiation was 7.7 MJ/m² and the effective time (ET95) necessary to inactivate 95% of the sporangia was 7.9 h. On sunny days, one-hour exposure periods from 1100 to noon, with samples taken every 15 min, confirmed the previous results of reduced viability after one hour. Overall, on sunny days the average ED95 was 2.8 MJ/m² and the ET95 was 1.1 h. The ET95 of non-exposed sporangia was on average 16.8 h. These results suggest a differential effect of SR on cloudy days compared with sunny days. Apparently sporangia of both clonal lineages were similarly sensitive to solar radiation.
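A minimal sketch of one way an ED95 can be estimated from such exposures, assuming a simple exponential survival model (the authors' fitting method is not stated in the abstract); the data points are invented, though chosen to give a value near the reported sunny-day figure.

import numpy as np

dose = np.array([0.5, 1.0, 2.0, 3.0])          # cumulative SR dose, MJ/m2
survival = np.array([0.55, 0.32, 0.09, 0.03])  # fraction of sporangia germinating

# Fit survival = exp(-k * dose) by regressing log(survival) on dose,
# then solve for the dose at which survival falls to 5% (ED95 = ln 20 / k).
k = -np.polyfit(dose, np.log(survival), 1)[0]
ed95 = np.log(20.0) / k
print(f"k = {k:.2f} per MJ/m2, ED95 = {ed95:.1f} MJ/m2")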
A Multiplex PCR method for the simultaneous detection of Tomato Yellow Leaf Curl and Tomato Mottle geminiviruses
Jane Morris
Central Science Laboratory, Sand Hutton, York YO4 1LZ
A rapid, sensitive multiplex PCR (MPCR) test suitable for the diagnosis of Lycopersicon esculentum infected with Tomato yellow leaf curl virus (TYLCV)/Indian tomato leaf curl virus (ITmLCV) and Tomato mottle virus (TMoV) was developed. All virus isolates tested were successfully amplified using degenerate primers designed by Deng et al. (1994). MPCR generated a 377 base pair TMoV-specific DNA fragment and a 540 bp TYLCV/ITmLCV fragment. MPCR DNA products were visualised by ethidium bromide staining following agarose gel electrophoresis. The viral origin of the MPCR products was confirmed by cloning and sequencing. Additionally, sequence data were generated for a previously unsequenced isolate of TYLCV originating from Turkey. This isolate was found to be almost identical to the published sequence for an Israeli isolate of TYLCV (Navot et al., 1991), but shared less than 25% homology with the published sequence for a Sicilian isolate of TYLCV (Crespi et al., 1995). MPCR was found to have several advantages over the currently used indirect ELISA assay. The MPCR test allowed TMoV DNA to be distinguished from that of TYLCV/ITmLCV isolates, and could be carried out more rapidly than the ELISA assay, with results achievable within a working day. A comparison of the sensitivity of detection of MPCR with indirect ELISA indicated that the MPCR assay was more sensitive.
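As a simple illustration of how the two diagnostic band sizes reported above could be interpreted programmatically, the sketch below maps observed fragment sizes to virus calls; the size tolerance is an assumption for gel measurement error, not part of the published assay.

# Expected product sizes from the assay described above.
BANDS = {377: "TMoV", 540: "TYLCV/ITmLCV"}

def call_viruses(observed_bp, tol=15):
    """Map observed fragment sizes (bp) to virus calls, within a tolerance."""
    calls = {virus for size in observed_bp
             for expected, virus in BANDS.items() if abs(size - expected) <= tol}
    return sorted(calls) or ["no virus detected"]

print(call_viruses([380, 545]))  # mixed infection -> both viruses called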
Protecting Brassica plants from Plasmodiophora brassicae
Lisa Page & Geoff R. Dixon
The Department of Horticulture, SAC / University of Strathclyde, SAC-Auchincruive, Ayr, KA6 5HW, UK
Brassica seedlings raised in peat compost modules appear to be more susceptible to Plasmodiophora brassicae (clubroot) than those grown in other media. Peat is a sterile medium and hence may fail to elicit host protective mechanisms, thereby increasing vulnerability to symptom development. Hypotheses regarding root camouflage (Gilbert et al., 1994) suggest that rhizosphere communities differ between growing media, some exposing the host to increased disease risk, while soil-borne pathogens such as Pythium spp. and Colletotrichum spp. have been controlled by organic composts added to the root zone (Zhang et al., 1996).
The aim of the current project is to understand the mechanisms which lead to the suppression of P. brassicae in organic media. This may lead to improved knowledge of the manner by which the root environment elicits host resistance mechanisms. Work has commenced by surveying a range of growing media as substrates in which clubroot is either repressed or enhanced. Preliminary results support the view that peat-based composts induce greater disease development.
Use will be made of a range of crude organic extracts and other compounds, such as salicylic acid, which are well demonstrated to enhance systemic acquired resistance.
References
Gilbert, Handelsman & Parke (1994). Phytopathology, 84: 222-225.
Zhang, Dick & Hoitink (1996). Phytopathology, 86: 1066-1070.
CIH1, a biotrophy-related gene expressed specifically at the intracellular interface formed between Colletotrichum lindemuthianum and French bean
Sarah Perfect1, Jon Green1 & Richard O’Connell2
1 University of Birmingham, UK., 2IACR-Long Ashton, UK
Colletotrichum is a large genus of plant pathogenic fungi causing anthracnose on a wide range of crops. C. lindemuthianum is a hemibiotrophic species which causes anthracnose of bean, Phaseolus vulgaris. During the initial biotrophic stage of infection, the fungus differentiates infection vesicles and primary hyphae within host epidermal cells. These specialised intracellular hyphae invaginate the host plasma membrane, from which they are separated by a matrix layer. Monoclonal antibodies (MAbs) raised to isolated infection structures have been used to identify proteins present at the fungal-plant interface.
One of these MAbs, designated UB25, recognises a protein epitope in a 40kDa N-linked glycoprotein specific to intracellular hyphae. Indirect immunofluorescence and EM-immunogold labelling show that the glycoprotein is present in the infection peg, and the fungal walls and matrix surrounding the intracellular hyphae. However, it is not present in secondary necrotrophic hyphae, which suggests that it is specific to biotrophic infection structures. The glycoprotein may therefore be involved in the establishment and maintenance of biotrophy.
A cDNA library has been constructed from total RNA isolated from infected bean hypocotyl epidermis. The MAb UB25 has been used to immunoscreen the library and positive clones have been isolated and sequenced. Southern analysis indicates that the CIH1 glycoprotein recognised by UB25 is fungally encoded and is present in several Colletotrichum species. Analysis of the deduced amino acid sequence of CIH1 revealed the presence of two distinct domains, one of which is proline-rich and contains short repetitive motifs with tyrosine-lysine pairs. Tyrosine residues have been implicated in the oxidative cross-linking of proteins such as extensins. Cross-linking studies of CIH1 indicate that this glycoprotein has the potential to be oxidatively cross-linked by peroxidase in the presence of hydrogen peroxide. The functional significance of this will be discussed.
The rapid detection of Colletotrichum gloeosporioides in yam tubers by ELISA
Jeff C. Peters1, L. Kenyon2, M. James3 & A. Jarma1
1 Department of Agriculture, The University of Reading, Earley Gate, Reading, RG6 6AT
2 Natural Resources Institute, Central Avenue, Chatham Maritime, Kent, ME4 4TB.
3 Ministry of Agriculture, Graeme Hall, Barbados
Colletotrichum gloeosporioides causes the often devastating disease anthracnose on yam (Dioscorea spp.) vines. However, the pathogen can also infect tubers, causing disfigurement of the periderm and necrotic lesions in the underlying cortex (Green & Simons, 1994). As yams are almost entirely propagated by seed tubers (setts), planting material infected with C. gloeosporioides is thought to be an important source of new infections (Adebanjo & Onesirosan, 1986). Therefore, a rapid and reliable method for detecting C. gloeosporioides in yam setts would help prevent anthracnose epidemics by identifying farmers’ seed supplies which have high levels of infection. In the study presented here, yam tubers which were inoculated with C. gloeosporioides developed characteristic discoloration of the meristem and cortex. The pathogen could be re-isolated from infected tissue; however, this can take up to 72 hours. An ELISA test, modified from Hughes et al. (in press), was developed to detect C. gloeosporioides directly from tuber tissue in less than 4 hours. This diagnostic test was primarily successful in diagnosing superficial infections by detecting C. gloeosporioides in the outer periderm layer. However, the ELISA also detected the pathogen in the deeper meristematic layer in what are thought to be naturally infected tubers.
References
Adebanjo, A. & Onesirosan, P.J. (1986). Surface bore infection of Dioscorea alata tubers by Colletotrichum gloeosporioides. Journal of Plant Protection in the Tropics 3: 135-137.
Green, K.R. & Simons, S.A. (1994). ‘Dead skin’ on yams (Dioscorea alata) caused by Colletotrichum gloeosporioides. Plant Pathology 43: 1062-1065.
Hughes, K.J.D., Lane, C.R. & Cook, R.T.A. (In Press). Development of a rapid method for the detection and identification of Colletotrichum acutatum. In: Diagnosis and Identification of Plant Pathogens; Proceedings of the European Foundation for Plant Pathology, 4th International Symposium 1996, Bonn, Germany.
Experiences with blight forecasting
Moray C. Taylor1, N.V. Hardwick1, N.J. Bradshaw2 & A.M. Hall3
1 Central Science Laboratory, Sand Hutton, York YO4 1LZ
2 ADAS Cardiff, St. Agnes Road, Cardiff
3 University of Hertfordshire, College Lane, Hatfield
The Smith Period is the major late blight forecasting scheme available to UK growers and provides recommendations, based on specific weather conditions, on when to begin a protective fungicide spray programme. Warnings of impending blight attack are issued by ADAS, using weather data from a selected set of the Meteorological Office’s synoptic network of weather stations. The stations are widely separated and often located many miles from the major potato growing areas, which means that one station has to cover a large geographical area. Within such an area the conditions experienced in many potato fields may be quite different from those at the meteorological station, leading either to false alarms or, more seriously, to a failure to warn sufficiently far ahead of an outbreak to allow sprays to be applied.
Since the 1960s new, more complex, forecasting systems have been published which aim to give more accurate forecasts provided their weather data requirements can be met. The last few years have also seen the proliferation of in-crop weather stations which give an indication of the conditions on a very local scale. These provide an excellent opportunity to test forecasting systems fully and gain an indication of their usefulness to individual growers.
Using frequently recorded in-crop weather data from each of five widely dispersed locations during 1996-1997, the Blitecast, NEGFRY, Sparks, Ullrich & Schrödter and Smith Period forecasting systems were evaluated with respect to time of first blight warning, the first occurrence of blight and the number of sprays recommended. No single system proved effective at all sites in all years. The schemes varied widely in their ability to predict blight infection, particularly in the drier year of 1996. Furthermore, the number of applications recommended could be doubled or halved depending on which scheme was used, despite each being challenged with the same meteorological data.
Of all the systems evaluated only the Smith Period was consistently successful in triggering spray applications prior to blight appearing in the crop. It is concluded that the Smith Period is still an effective forecasting scheme: despite being prone to initiate sprays well ahead of the blight outbreak (especially in dry years), it is the least likely to fail.
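For reference, the Smith Period criterion is conventionally stated as two consecutive days each with a minimum temperature of at least 10°C and at least 11 hours with relative humidity at or above 90%; a minimal sketch of that rule follows, with invented daily data.

def smith_periods(days):
    """Return the index of the second day of each qualifying two-day run."""
    qualifies = [t >= 10.0 and h >= 11 for t, h in days]
    return [i for i in range(1, len(days)) if qualifies[i - 1] and qualifies[i]]

# Invented daily data: (minimum temperature in °C, hours with RH >= 90%).
weather = [(9.5, 12), (10.2, 11), (11.0, 13), (10.4, 8)]
print(smith_periods(weather))  # -> [2]: days 1 and 2 form a Smith Period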
Using disease survey data to develop schemes for predicting epidemics of light leaf spot (Pyrenopeziza brassicae) on winter oilseed rape in England and Wales
Judith A. Turner1, S. Welham2, B.D.L. Fitt2, P. Gladders3 & K. Sutherland4
1 Central Science Laboratory, Sand Hutton, York, YO4 1LZ
2 IACR-Rothamsted, Harpenden, Hertfordshire, AL5 2JQ
3 ADAS Boxworth, Cambridge, CB3 8NN
4 Scottish Agricultural College, 581 King Street, Aberdeen, AB24 5UA
Data on the incidence and severity of light leaf spot (Pyrenopeziza brassicae) of winter oilseed rape have been collected as part of the MAFF Winter Oilseed Rape Pest and Disease Survey since 1987. Light leaf spot is a serious disease of oilseed rape in the UK, but disease incidence and severity differ between regions and between seasons. During the last ten years the incidence of light leaf spot at pod ripening has varied between 5% and 46% of plants affected, with the highest levels occurring in 1987, 1988 and 1994. Based on these data, annual yield losses, after fungicide application to control the disease, were estimated to range between £13M and £49M. Fungicide inputs to control diseases of oilseed rape have increased substantially since 1992, with costs exceeding £8M in 1996 when disease risk was relatively low. The extent of the losses during epidemics and the high costs of inputs clearly indicate the need for a forecasting scheme for this disease, not only to reduce yield losses but also to reduce fungicide inputs in years with low disease risk. Analyses of the survey data indicated a strong link between the incidence of plants affected with light leaf spot on the pods and the subsequent incidence of plants with leaf infection at stem extension in the following spring. Disease levels in surveyed crops were greatly influenced by the timing of sprays to control light leaf spot; sprays applied during mid-November proved most effective. The data were further analysed to identify additional seasonal and crop risk factors involved in outbreaks of light leaf spot. Analyses identified agronomic factors such as sowing date, cultivar and geographical location as important crop risk factors. These risk factors have been integrated into a predictive model to forecast epidemics of light leaf spot which can be used at a time when spray decisions need to be made. Meteorological factors have recently been included to improve the accuracy of the model. Regional predictions made during 1996 for the 1997 growing season indicated that accurate forecasts were possible.
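The abstract does not give the model's form or coefficients; as a hedged sketch of the general shape such a risk model can take, a logistic score combining the stated risk factors (autumn pod incidence, sowing date, cultivar and region) is shown below. All names and weights are hypothetical, for illustration only.

import math

def llsp_risk(pod_incidence, early_sown, cultivar_resistance, northern_region):
    """Return a 0-1 risk index for a spring light leaf spot epidemic."""
    z = (-2.0
         + 6.0 * pod_incidence             # fraction of plants with pod lesions
         + 0.8 * (1 if early_sown else 0)  # early sowing raises risk
         - 0.3 * cultivar_resistance       # resistance rating, 1-9 scale
         + 0.6 * (1 if northern_region else 0))
    return 1.0 / (1.0 + math.exp(-z))

print(round(llsp_risk(0.46, True, 5, True), 2))    # high-risk scenario
print(round(llsp_risk(0.05, False, 7, False), 2))  # low-risk scenario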
Dispersal of the mycoparasite Coniothyrium minitans by soil mesofauna
Roger H. Williams1*, John M. Whipps1 & Roderic C. Cooke2
1 Horticulture Research International, Wellesbourne, Warwick, U.K.
2 University of Sheffield, Sheffield, U.K.
* Present address: IACR-Rothamsted, Harpenden, Hertfordshire, U.K.
Coniothyrium minitans, a mycoparasite with proven biocontrol activity against Sclerotinia sclerotiorum in the field and glasshouse, was found to spread from sites of application during glasshouse biocontrol trials. Whilst details regarding the mechanisms of dispersal are unclear, Coniothyrium minitans was recovered from collembolans collected during these trials and it was observed that numerous mites and insects were frequently associated with decaying plant material in the glasshouse. This led to speculation that soil mesofauna may be involved in transmission of the mycoparasite. Consequently, the mite, Acarus siro, and the collembolan, Folsomia candida, were used to investigate the role of soil mesofauna in the dispersal of Coniothyrium minitans. Both arthropods transmitted the mycoparasite from colonised wheat grains to uninfected sclerotia of Sclerotinia sclerotiorum in sterile and non-sterile soil adjusted to a range of water potentials (-0.25 to -3.6 MPa). Germinable conidia of Coniothyrium minitans were present in the faeces of both Acarus siro and Folsomia candida whereas only the mite carried detectable inoculum externally. When single faecal pellets produced by Acarus siro that had been feeding on Coniothyrium minitans were applied to sclerotia placed in sterile sand or non-sterile soil, 12 and 8% respectively of the sclerotia became infected with the mycoparasite. These results suggest that soil mesofauna may be important in dissemination of Coniothyrium minitans.