2.7.11 DEVELOPMENT OF IPM AGAINST RHIZOCTONIA SOLANI IN FLOWER BULBS, INCLUDING BIOCONTROL AND CULTURAL IMPROVEMENT OF SOIL SUPPRESSIVENESS

G DIJST1, PJ OYARZUN2, SN NEATE3, M DE BRUIJNE1 and TD VATCHEV4

1DLO-Research Institute for Plant Protection (IPO-DLO), PO Box 9060, 6700 GW Wageningen, the Netherlands; 2CIPP, Quito, Ecuador; 3CSIRO Division of Soils, Adelaide, South Australia; 4Plant Protection Institute, Kostinbrod, Sofia, Bulgaria

Background and objectives

Results and conclusions
In a search for ways to prevent patch formation, we studied disease spread under regulated soil water and temperature conditions [1]. Soil suppressiveness was determined in a row test as the spread of bulb rot in Iris caused by R. solani AG 2-t at 12°C and by R. solani AG 4 at 18°C. In both pathosystems, largely the same soils among 30 commercial field soil samples performed as extremely conducive or suppressive. Suppressiveness was more pronounced at 18°C. In contrast to ANOVA, principal component analysis (PCA) yielded a useful differentiation of natural soil receptivity, taking both disease severity and the type of response into account. Soil suppressiveness increased over time after amendment with wheat straw, whereas matured GFT compost was ineffective. Soil disinfestation destroyed the soil suppressiveness and the antagonist V. biguttatum, but not the pathogens; further addition of straw restored soil suppression to its original level. These effects are currently being tested in field plots. For routine use in risk prediction, a simple indicator is sought to replace the laborious row test. Disease severity in a one-bulb test was inadequate, as was colony extension on different carriers on or in soil. The differences in natural suppressiveness between the 30 field soils correlated with combinations of their chemical soil factors. Culturally induced suppressiveness in the bioassay correlated with enhanced microbial activity and biomass.
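The differentiation of soils by PCA described above can be sketched as follows. This is a minimal illustration only: the severity matrix is invented (the abstract does not report the underlying data), and the real analysis used 30 field soils and the authors' own response variables.

```python
import numpy as np

# Hypothetical disease-severity matrix: 6 soils (rows) scored on 4 response
# variables (e.g. rot spread and incidence at the two test temperatures).
# All values are invented for illustration.
X = np.array([
    [8.1, 7.5, 6.9, 7.2],   # conducive soils: uniformly high severity
    [7.8, 7.9, 7.1, 7.6],
    [7.5, 7.2, 6.5, 7.0],
    [1.2, 1.5, 0.9, 1.1],   # suppressive soils: uniformly low severity
    [1.0, 0.8, 1.3, 1.2],
    [1.4, 1.1, 1.0, 0.9],
])

# Centre the data and extract principal components via SVD.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T               # soil coordinates on the principal axes

# The first component separates conducive from suppressive soils, and the
# explained-variance ratio shows how dominant that axis is.
explained = s**2 / np.sum(s**2)
print(np.round(explained, 3))
```

Unlike a one-way ANOVA on a single severity score, the component scores keep both the overall severity level and the pattern of response across variables, which is what allows conducive and suppressive soils to be ranked along a receptivity axis.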
The reliability of such indicators for the field should be determined.

References