Garrett Memorial Lecture: Forecasting - the ends or a means?
Disease forecasts linked to disease control decisions represent the
convergence of goals of disparate groups. Growers desire a tool to suppress
disease more effectively and more efficiently. Society desires a tool that will
lead to decreased use of fungicide. Plant disease epidemiologists desire tools
that use the results of their research. However, disappointment results from the
expectation that a forecast will work perfectly when introduced. In general,
farmers have found forecasts less helpful than hoped. Therefore, adoption of
forecasts by farmers has been less than desired by epidemiologists, and
reductions in use of fungicide have been less than desired by environmentalists.
These pessimistic views contrast with an alternative view. Forecasts are models
(dramatic simplifications of nature) and as such are to be viewed as imperfect
works in progress. As model predictions are compared to reality (validation),
new insight develops and the model is revised. Revision is an integral part of
the modelling process. New insight leads to an improved understanding of the
pathosystem and better disease control - sometimes without use of the forecast
system. Insight derived from previous effort in forecasting has identified new
factors important to forecasts in some pathosystems. These new factors include:
predictions of future weather, predictions of pathogen influx, and methods for
interpreting predictions. Examples of new factors in forecasting will be
examined. Additionally, improvements in weather monitoring equipment and in
computer technology enable the integration of spatial and temporal factors.
Potato blight - do we have the answer?
In 1845 late blight made its first dramatic appearance in Europe, but it was not until 15 years later that De Bary (1861) proved beyond any doubt that the fungus Phytophthora infestans was the cause of the late blight epidemic. From the time of its first appearance, attempts were made to control late blight with chemicals. The earliest materials included compounds such as sodium chloride (Tebitt, 1847), lime (Focke, 1846) and sulphur (Kuhn, 1858). Effective control of the disease became possible with the introduction of the copper fungicides in the 1880s, and it was Bordeaux mixture (Millardet, 1885) which opened up the era of chemical control. Later the dithiocarbamates (1930s) and, more recently, the systemic fungicides with curative properties (1970s) were introduced.
It was recognised at a very early stage that seasonal weather played an
important role in the development of the disease each year, but it was not until
the 1920s that in The Netherlands an empirical model of meteorological
conditions was developed to advise growers on what dates to apply fungicides
(Van Everdingen, 1926). Subsequently, a number of empirically derived forecasting
rules based on weather conditions were developed, for example: Beaumont period
(1948), "Irish rules" (1949), Smith period (1956). In the USA two of
those empirical systems were amalgamated in 1975 to form the warning system
known as Blitecast which uses temperature, relative humidity and rainfall data
to generate a recommendation for the first spray and also subsequent spray
intervals. More recently, fundamental models have been developed which
incorporate and combine experimental data on the pathogen's life cycle,
meteorological conditions, fungicides and cultivar resistance. During the 1980s,
following the development of more powerful computers, a large number of factors
and their inter-relationships could be included in forecasting systems. Not all
of these systems were introduced into practice, but farmers can now use such
systems to support their decision on whether or not to spray.
Examples of such models are NegFry, Prophy, Plant-Plus, Simphyt and
Guntz-Divoux. Although there is no limit to the calculating power of computer
programmes as a tool for the development of forecasting models, a limiting
factor remains the knowledge we have on crucial aspects of the late blight
epidemic. Despite 150 years of research there are still many unanswered
questions regarding epidemiology, fungicides, weather conditions and cultivar
resistance and their interaction. Improvements in forecasting systems will
therefore need to keep pace with advances in our knowledge of these factors. The
cost of these systems in relation to their benefits and, most importantly, their
relative 'user-unfriendliness' may well preclude their wide-scale use, at least
in the foreseeable future. In my opinion, Decision Support Systems will play a
vital role in our ability to control late blight, but they will not in
themselves provide the ultimate solution to the problem.
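To illustrate how the empirical weather rules mentioned above operate, the criteria for a Smith period (two consecutive days, each with a minimum temperature of at least 10°C and at least 11 hours of relative humidity at or above 90%) can be sketched in a few lines; the function names and per-day data layout below are illustrative assumptions, not taken from any published system.

```python
def is_high_risk_day(hourly_temps, hourly_rh,
                     min_temp=10.0, rh_threshold=90.0, humid_hours=11):
    """True if a day meets the per-day Smith criteria: minimum
    temperature >= 10 C and at least 11 hours with RH >= 90%."""
    humid = sum(rh >= rh_threshold for rh in hourly_rh)
    return min(hourly_temps) >= min_temp and humid >= humid_hours

def smith_periods(days):
    """days: one (hourly_temps, hourly_rh) tuple per day.
    Returns the index of the second day of each qualifying
    two-consecutive-day run, i.e. the day a warning would be issued."""
    flags = [is_high_risk_day(t, rh) for t, rh in days]
    return [i for i in range(1, len(flags)) if flags[i - 1] and flags[i]]
```

Two consecutive warm, humid days followed by a cool, dry one would thus trigger a single warning on the second day; systems such as Blitecast combine several rules of this kind with rainfall records to time the first and subsequent sprays.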
Forecasting apple diseases
The main UK apple cultivars Cox, Jonagold and Gala (dessert) and Bramley's Seedling (culinary) are susceptible to a range of fungal diseases which reduce yield and quality and may result in further losses post-harvest through rotting in store. The major UK diseases are scab (Venturia inaequalis) and powdery mildew (Podosphaera leucotricha), and control relies on routine application of fungicide at 7-14 day intervals to achieve the blemish-free fruit required by the market. Such practices are usually reliable, but increased public concern about pesticides and rising costs to growers have led to a reappraisal of their use. The development of disease warning systems offers scope to optimise fungicide use by better timing of sprays. ADEM™ (Apple Disease East Malling) is a PC-based system which forecasts the risk of scab, mildew, Nectria fruit rot and canker (Nectria galligena) and fireblight (Erwinia amylovora).
For disease warning systems to be adopted by growers for decision-making on fungicide use, they must identify infection periods accurately. Orchard tests over several years have shown the models in ADEM™ to give accurate warnings of infection periods. Trials have also been conducted to develop practical strategies which make use of warnings for scab and mildew without putting the apple crop at risk. A key stage strategy has been developed. In this, routine fungicide applications are made at the key growth stages of bud burst and petal fall. At other times, spray decisions are based on disease warnings generated by ADEM™, but also take into account other practical considerations, e.g. treatments for other diseases, pests or nutrient sprays, and holidays when spray operators may be unavailable.
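The key stage strategy is essentially a small decision rule, which might be sketched as below; the growth-stage labels, the boolean warning flag and the practical-constraint parameters are hypothetical stand-ins for ADEM™ output and grower records, not the system's actual interface.

```python
# Routine fungicide applications are made at these key growth stages.
KEY_STAGES = {"bud burst", "petal fall"}

def spray_decision(growth_stage, disease_warning,
                   other_treatment_due=False, operator_available=True):
    """Return True if a fungicide application is advised."""
    if not operator_available:
        return False          # practical constraint, e.g. operator holidays
    if growth_stage in KEY_STAGES:
        return True           # routine application at a key growth stage
    # Outside key stages, spray on a disease warning, or combine the
    # fungicide with a pass already planned for pests or nutrients.
    return disease_warning or other_treatment_due
```

The ordering of the checks encodes the strategy: practical constraints are applied first, routine key-stage sprays are unconditional, and all other applications are warning-driven.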
The key stage strategy has been tested in large-plot orchard trials and, more recently, used to manage 15 orchards at East Malling. Use of the system has resulted in similar or better disease control compared with routine spray programmes, but with reduced fungicide inputs. ADEM™ has been available commercially since 1996. Uptake by fruit growers in the UK has been slow, but the advantages of the system are now being recognised and it is used commercially.
Forecasting the effect of disease as influenced by the host
Predictions of yield loss in cereals due to disease are often imprecise because of the variation in the relationship between percent severity measurements and final yield. This study tested the hypothesis that yield response to a unit of disease would change with differences in the growth of the host. Shading devices, which produced levels of radiation similar to those created by continuous cloud, were used to reduce the growth of winter wheat cv. Slejpner during sequential intervals of crop development. Shading did not affect the progress of disease symptoms of introduced epidemics of Puccinia striiformis (yellow rust), but did affect the crop's response to fungicidal control. For example, when plants were inoculated at GS 39 to produce a late epidemic, a response of 3 t ha-1 was observed for a crop shaded between GS 31 and GS 39 (during stem extension), 1.25 t ha-1 in a crop shaded between GS 39 and GS 55 (flag leaf emergence to 50% ear emergence) and 2 t ha-1 in an unshaded crop. Differences in yield response were due to differences in allometry with different intervals of development, and particularly the contribution of current photoassimilate to components of yield. Shading between GS 31 and GS 39 reduced the production of soluble sugar stored in stems and sheaths from 2.5 t ha-1 to 0.5 t ha-1, while shading between GS 39 and GS 55 reduced the number of grains formed in each ear from 42 to 33. In terms of yield, shading between GS 31 and GS 39 produced a "source"-limited crop, as soluble sugars were eventually used to fill grain, and shading between GS 39 and GS 55 produced a "sink"-limited crop. The effect of the late epidemic, in reducing the green area available for the interception of radiation, was greater on crops shaded between GS 31 and GS 39, as compensatory growth to replenish concentrations of soluble sugar was prevented. In contrast, the yield response for crops shaded between GS 39 and GS 55 was less than for an unshaded crop, which had more grains to fill.
It can be concluded that accurate predictions of the effect of disease cannot be made if host development and growth are not considered.