Terry M. Stewart
Institute of Natural Resources
Massey University
Palmerston North
New Zealand
t.stewart@massey.ac.nz
Stewart, T.M. 2004. Teaching the art and science of plant disease diagnosis: Training students with DIAGNOSIS for Crop Problems. The Plant Health Instructor. DOI:10.1094/PHI-T-2004-0426-01
INTRODUCTION
The ability to diagnose an abnormal plant condition is an important skill for the practicing plant pathologist. Those working in the area of plant pathology extension are often called upon to undertake this task in the course of their daily activities. Even lab-based plant pathologists should have an appreciation of the diagnostic process as they may also engage in it at some stage in their careers.
Plant diagnosis is both an art and a science, and it requires considerable experience to become skilled at the task (2). How, then, can students be best educated in this process? In the absence of "the real thing," one way experience can be gained is to give students the opportunity to explore and analyse diagnostic "cases" or "scenarios." The use of case studies as learning tools in plant pathology has been reported by other authors (6). In educational terms, this facilitates "goal-based" or "problem-based" learning, an effective learning paradigm that has gained wide acceptance among educators (7).
This article describes the use and development of DIAGNOSIS for Crop Problems (DIAGNOSIS), a computer-based authoring tool and scenario presenter that facilitates training in plant disease diagnosis. Constructing training scenarios is also covered.
DIAGNOSTIC REASONING
Before describing the DIAGNOSIS training system, it is useful to spend time considering the diagnostic process itself. Although little work has been done on the process of diagnostic reasoning in plant pathology, this topic has been researched extensively in medicine, where clinical diagnosis is a necessary skill for any practitioner.
The process of diagnostic reasoning follows the Select and Test Model (ST-Model) (9), shown in Figure 1. Although developed to explain clinical diagnosis in medicine, it can be applied to plant diagnosis. As the model shows, upon first observing a disease, a diagnostician takes those observations and forms hypotheses which may explain the causes behind the signs and symptoms. This process is known as abduction. Deduction is then used to narrow down the cause by seeking data which support these hypotheses. For example, if we know that crown rot of apples (Phytophthora cactorum) on a susceptible cultivar exhibits a basal rot or canker, and we suspect that the crop is affected by crown rot, then we can deduce that if we examine the base of the trees, we should see a canker! This deductive process may require the diagnostician to carry out a series of tasks or tests that will reveal more data.

Sometimes the observations produced during this deductive process do not match the hypothesis being investigated. If this is the case, new data may be sought, other hypotheses explored or new hypotheses created. Hypotheses are rejected or strengthened through a particular type of induction. The term induction is most often used to refer to a process where observable facts are synthesized into generalizations or laws (Types I and II). During diagnosis, however, a particular type of induction (Type III) is used, which compares the assertions of the hypotheses in question with the observable facts (5). For example, if many basal cankers are found in apples of a particular cultivar, and it is known that basal cankers are a feature of crown rot, a disease common in that same cultivar, then comparing these two statements strengthens the hypothesis that the condition in question is indeed crown rot.
Figure 1
The ST-Model is one representation from a host of studies on diagnostic reasoning. Other researchers describe this process as forward reasoning (i.e., abduction) and backward reasoning (i.e., deduction-induction) (3). While the ST-Model shows the general process of diagnostic reasoning, the heuristics used differ between the expert and the non-expert. Non-experts tend to search non-selectively and to seek patterns, which can be a poor way to formulate hypotheses. Experts, on the other hand, recognise patterns and can generate hypotheses quickly, on the basis of just a few observations; hence their use of forward reasoning is high. Research suggests that only about seven hypotheses are active at one time during any diagnostic episode, constrained by the limited capacity of short-term memory (4). A recent review of clinical problem-solving and diagnostic decision-making in medicine has been published by Elstein and Schwarz (1).
How does knowing about the cognitive process which takes place during diagnosis assist with training plant pathology students to better identify and assess plant diseases? Knowing the process is important both in using, and in developing, diagnostic teaching scenarios for students to explore. Realistic scenarios are required, and such scenarios should allow the students to undertake this diagnostic reasoning. Students, at the level of knowledge being taught, should have the opportunity to form hypotheses on the basis of the initial scenario information and to explore them. This means the problems should not be so easy that they are diagnosed immediately, nor so difficult, and so far beyond the student's subject knowledge (or their ability to find that knowledge), that they are unable to form any meaningful hypotheses at all.
DIAGNOSIS for Crop Problems - A description
DIAGNOSIS has been used at Massey University and at other institutions for over a decade, and has been expanded and improved across a number of operating systems (10, 11, 12, 13). Versions 2 and 3 were developed in conjunction with the Centre for Biological Information Technology, University of Queensland, Australia. The latest version is titled DIAGNOSIS for Crop Problems version 3.1.
In short, the DIAGNOSIS software presents students with a plant diagnostic problem to solve; it captures their diagnosis, justification and solution; and it provides feedback. The exercises are designed to be firmly embedded into a series of lectures and laboratory classes on the diagnostic process, within a plant pathology course. In this way the software is designed to complement, not replace, hands-on laboratory work.
Contrary to what the name might suggest, DIAGNOSIS is not an expert system, but the reverse of an expert system. It assumes students are 'experts' and provides them with simulated observations of a plant problem. The students need to come up with the diagnoses themselves. Therefore the students, not the program, carry out the interpretation and gathering of information. The software simply provides a virtual reality environment in which they can work. It uses the metaphor of the computer "adventure game," where players inspect objects and clues and draw inferences, which then enable them to progress to a conclusion.
The core of a DIAGNOSIS exercise is the instructor-constructed scenario (Figure 2). The scenario presents students with an unknown plant health problem - perhaps wilting plants in a potato field, poorly growing trees in an apple orchard or yellow leaves on wheat. In a typical scenario, the students are "dropped into" a field or environment containing this problem, usually accompanied by a grower who can supply information on management and the history of the crop. Then, by way of navigational links and icons, the student gathers observations about the plant and environment, checks equipment and asks questions of the grower. Students can collect objects such as plant parts, weeds and soil and take them to a laboratory for testing. Here they can make further observations, look up further information (such as weather data), try to extract the causal organisms and/or conduct a number of tests.
Figure 2
Once students feel they have enough information to make a diagnosis, they enter it into the computer along with a justification and recommendation. Students can then be told the answer and given immediate feedback, or (more commonly) their input is later extracted by the instructor, printed out and graded. In the latter case, feedback is appended to their input text. Feedback takes the form of the diagnosis, a correct interpretation of the significant observations, the common investigative pathway an experienced diagnostician would have taken, and a model recommendation to the grower. Some feedback is customised depending on whether or not students carried out particular tasks.
Many of the tasks in DIAGNOSIS, such as laboratory tests, have a price tag associated with them. If instructors wish it, students can be given a budget before the exercise and asked not to exceed it. This gives some incentive for the students to limit themselves to tasks and tests appropriate to the likely problem rather than trying all the menu options. It mirrors the real world, where such procedures do indeed cost real money.
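The budget mechanic amounts to simple bookkeeping: each task has a price, and a task that would push spending past the budget is refused. The sketch below is only an illustration of that idea (the task names, prices and budget are invented; this is not the actual DIAGNOSIS implementation):

```python
# Illustrative bookkeeping for a budgeted diagnostic exercise.
# Task names, prices and the budget figure are invented examples.

PRICES = {
    "weather records": 0.0,
    "soil test": 40.0,
    "fungal isolation": 80.0,
    "nutrient analysis": 120.0,
}

def run_task(task, spent, budget):
    """Charge for a task; refuse it if it would exceed the budget."""
    cost = PRICES[task]
    if spent + cost > budget:
        raise ValueError(
            f"'{task}' (${cost:.2f}) would exceed the ${budget:.2f} budget")
    return spent + cost

# A student who sticks to tasks relevant to the likely problem stays
# within budget; trying every menu option would not.
spent = 0.0
for task in ("weather records", "fungal isolation", "soil test"):
    spent = run_task(task, spent, budget=150.0)
print(f"${spent:.2f}")  # $120.00
```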
Scenarios, then, are the "data" used by the DIAGNOSIS program. The software provides a "Builder" tool for instructors so that scenarios can be constructed or altered as desired (Figure 3). A scenario "Player" program then presents the problems to students. Constructing scenarios is discussed more fully later in this paper.
Figure 3
Version 3 of DIAGNOSIS uses modern Windows technology and NT/2000/XP conventions. It considerably enhances the DIAGNOSIS experience over previous versions by allowing the creation of customized icons and interfaces (Figures 2 and 3). It also allows instructors to embed, or hyperlink to, reference information either on the hard disk or the Internet. This feature enables instructors to provide not only the problem, but also information on the significance of what the student is considering at the time. In this way, its usefulness as a problem-based scenario engine is considerably enhanced. Also, scenarios can now be written in any language, making the tool more accessible to instructors and students throughout the world.
With the previous versions of DIAGNOSIS, students were not aware of the significance of observations until the debriefing at the end. In the current version, there is the option to provide feedback as they explore the scenario.
Using DIAGNOSIS in the Classroom
At Massey University, DIAGNOSIS is mostly used in a third-year undergraduate plant pathology course for our Bachelor of Science (plant protection) and Bachelor of Applied Science degrees. An objective of this course is to give students the skills to diagnose plant problems and recommend management strategies to alleviate those problems. DIAGNOSIS is introduced towards the end of the course, by which time students should have a good understanding of the biology of the common causal agents and the symptoms they produce. What they are required to learn is the process of diagnosis and how to evaluate uncommon or complex crop problems. They must be able to integrate their existing knowledge and seek out new facts in order to solve the problem. The requirements of this section of the course lend themselves to problem-based learning.
Students begin with two lectures on the process of disease diagnosis. Towards the end of the lecture series they are introduced to the software and "walked through" a scenario, after which they undertake the exercises on their own. An Internet-connected computer station is provided as well as some diagnostic and reference literature. They are expected to use these resources, but the software has a "save session" function so they can leave the computer in order to look up things elsewhere if they wish.
Through continual monitoring of the program's suitability, we have found the exercises work best when students undertake them in pairs or small groups. This way they can "bounce" ideas off each other, working and communicating as a team. Students are not forced to work in groups, however, and we give them the option of working alone if they so desire.
Students usually complete two scenarios in the Diagnosis section of the course, one reasonably straightforward and the other more complex. After all students have completed the exercises, their files are collected and assessed. A mark from 0 to 10 is awarded based on the following criteria:
- their diagnosis
- justification for their diagnosis
- recommendations to the grower
- how they approached the diagnosis (as read from the log file)
- how much money they spent in carrying out the diagnosis (if relevant)
The justification and recommendations make up the bulk of the assessment. Having to explain why they arrived at their particular diagnosis reveals the student's true understanding of the situation. Also, identifying the problem is one thing; knowing what to do about it is another. Students are expected to suggest a remedy, or at least propose what further action should be taken.
Another useful way of using DIAGNOSIS is to ask students to construct scenarios themselves, and then get other students to use and critique them. Just as having to teach new material forces a teacher to learn it well, constructing scenarios makes students think about how a plant health problem may relate to management and the environment in which it exists. Plant pathology postgraduate students are required to do this in one of their postgraduate papers.
WRITING DIAGNOSTIC SCENARIOS
Good learning requires good scenarios. The DIAGNOSIS scenarios reflect reality, with all its ambiguities, and include situations where the solution is not clear-cut.
Writing useful scenarios takes a fair degree of planning. A plant problem must be identified which is suitable for the level of student being taught. Information should be obtained on the signs and symptoms, and thought should be given as to the circumstances that may have encouraged this condition to develop. What other causal factors could it be confused with? These will become the alternative hypotheses students can explore. After these are determined, thought must be given to the tasks a student would need to undertake if they wanted to explore these other hypotheses. These tasks should be available in the scenario.
Finally, a scenario author needs to determine which other, non-essential tasks and observations will be available. There should be some redundant tasks that students can work through, even if doing so does not advance the diagnosis. For example, in a particular scenario (say, Armillaria damage to kiwifruit), instructors may wish to allow an examination and testing of spray equipment, even though spray damage is an unlikely cause. After all, in the real world we can choose to make many observations, whether or not they are appropriate. For the instructor, monitoring students' choice and order of tasks can reveal the level of basic knowledge and diagnostic skill (or lack thereof) a student may have.
After these tasks are noted, responses to each task need to be formulated. This is the information the students will see and will need to interpret when they actually carry out the task. Feedback, background information, costs and clues also need to be determined and multimedia components collected.
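The elements an author assembles — tasks, responses, costs, relevance, feedback — can be thought of as a simple data structure. The sketch below is a hypothetical representation only (it is not the Builder's actual file format, and the Armillaria scenario details are invented for illustration):

```python
# A hypothetical, minimal representation of a training scenario.
# Each task a student may perform maps to the response they will see,
# its cost, and whether it bears on the correct diagnosis.
# This is NOT the actual DIAGNOSIS Builder file format.

scenario = {
    "title": "Poor growth in a kiwifruit block",
    "diagnosis": "Armillaria root rot",
    "tasks": {
        "question grower about land history": {
            "response": "The block was cleared of old shelter trees before planting.",
            "cost": 0.0,
            "relevant": True,
        },
        "inspect spray equipment": {  # a deliberately redundant task
            "response": "Nozzles and calibration appear normal.",
            "cost": 10.0,
            "relevant": False,
        },
        "dig around crown and look under bark": {
            "response": "White mycelial fans are visible under the bark at the crown.",
            "cost": 15.0,
            "relevant": True,
        },
    },
    "model_feedback": "White mycelial fans at the crown indicate Armillaria.",
}

# The relevant tasks form the investigative pathway an experienced
# diagnostician would follow; the rest are realistic distractions.
pathway = [name for name, task in scenario["tasks"].items() if task["relevant"]]
```

Thinking of a scenario this way makes the author's checklist concrete: every hypothesis a student might explore needs at least one task, a response, a cost and (optionally) feedback attached to it.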
Scenarios can be tailored to the level of the students concerned. For example, if students send away samples for nutrient analysis, the scenario may return the results saying "the nutrient levels appear OK" or alternatively provide a list of values for each nutrient. The former response is suitable for lower-level students, while the latter response is used with students who should know (or be able to find out) the significance of these values.
In the 12 years DIAGNOSIS has been available, formulating good scenarios has proven to be the most difficult part of using the training program. Certainly, using extension agents and consultants to provide the raw case studies is valuable. Even so, the allowable tasks/tests and the presented observations/data still need to be determined. As an aid to writing scenarios, we have developed a Microsoft Word template. Instructors can work through this template step by step, replacing the example text with their own. The template can be downloaded from the DIAGNOSIS for Crop Problems support website at http://www.diagnosis.co.nz. A demo program of DIAGNOSIS for Crop Problems, which includes a free version of the Player software, can also be found at this site.
Student evaluation of DIAGNOSIS
Informal student surveys on DIAGNOSIS have been carried out over the years. In two surveys conducted in 1992 and 1993 respectively, third-year plant protection students were asked to rate the value of DIAGNOSIS as a learning exercise on a scale of 1 to 5, where 1 = no value and 5 = excellent value. Of a combined total of 23 students over the two years, 19 gave the exercise a score of 5 and four rated it a 4. Of the latter, three awarded a 4 rather than a 5 due to the "user unfriendly" interface of the MS-DOS version, while one awarded the lower score due to the lack of procedures available. Both deficiencies were corrected in the next version (version 2), written for Windows 3.1, which was assessed in a similar survey. From a class of 15, 14 students awarded the Windows 3.1 exercise a 5, while one student rated it a 4.
An evaluation was carried out in 2002 and 2003 on DIAGNOSIS for Crop Problems version 3 with a combined total of 14 students enrolled in the Plant Pathology class in consecutive years. Table 1 is a result of a questionnaire that students completed after working through two DIAGNOSIS scenarios. The results show a positive response to the exercises.
Table 1. DIAGNOSIS for Crop Problems 2002-2003 student survey (1 = strongly agree, 5 = strongly disagree)

Statement | Mean | Std Dev
I enjoyed these exercises. | 1.4 | 0.5
The interface was intuitive and easy to use. | 1.6 | 0.5
These exercises helped me improve my knowledge of the diagnostic process. | 1.6 | 0.5
I would have learned more from just covering diagnostic examples given in formal lectures rather than using the software. | 4.3 | 0.6
I would have learned more from just being given diagnostic examples in written form to read about at home rather than using the software. | 4.4 | 0.5
I found the typed "creator's solution" feedback comments on the returned script helpful. | 1.7 | 0.7
I prefer to go through a scenario with one other person, rather than by myself. | 2.5 | 1.2

Note: sample size = 14.
Student surveys were always qualitative as there were usually no more than 10 or so students in the class at any one time. This made obtaining statistically significant data difficult. However, the comments provided by students led to improvements in the software.
CONCLUSION
Teaching the diagnostic process is an important but often overlooked aspect of the plant pathology curriculum. For this component, DIAGNOSIS has aided the move from an information-based style of teaching to a problem-based one. However, like all teaching technologies, it is how it is used that matters (8). The value of DIAGNOSIS as an educational experience is not the Builder and the Player software, useful though these are. The real value of DIAGNOSIS is in the quality of the scenarios and how the exercises are used within a course. The software is a useful tool, but it will always require good teachers and good scenarios to realize its full potential.
REFERENCES
(1) Elstein, A.S., and Schwarz, A. 2002. Clinical problem solving and diagnostic decision making: Selective review of the cognitive literature. Brit. Med. J. 324:729-732.
(2) Grogan, R.G. 1981. The science and art of pest and disease diagnosis. Annu. Rev. Phytopathol. 19:333-351.
(3) Hunt, E. 1989. Cognitive science: Definition, status and questions. Annu. Rev. Psychol. 40:603-629.
(4) Kassirer, J.P. 1989. Diagnostic reasoning. Ann. Intern. Med. 110:893-900.
(5) Magnani, L. 1997. Basic science reasoning and clinical reasoning intertwined: Epistemological analysis and consequences for medical education. Adv. Health Sci. Educ. 2:115-130.
(6) Mathre, D.E., and Grey, W.E. 2002. Naughty peat: A case study in plant pathology, with emphasis on Koch's postulates and disease etiology. The Plant Health Instructor. DOI: 10.1094/PHI-T-2002-0301-01.
(7) Schank, R., Fano, A., Bell, B., and Jona, M. 1993. The design of goal-based scenarios. J. Learning Sci. 3:305-345.
(8) Schumann, G. 2003. Innovations in teaching plant pathology. Annu. Rev. Phytopathol. 41:377-398.
(9) Stefanelli, M., and Ramoni, M. 1991. Epistemological constraints on medical knowledge-based systems. Pages 3-20 in: D.A. Evans and V.L. Patel (eds.), Proceedings of the NATO Advanced Research Workshop on Advanced Models of Cognition for Medical Training and Practice, Il Ciocco, Barga, Italy, June 19-22. Springer-Verlag. Published in cooperation with NATO Scientific Affairs Division.
(10) Stewart, T.M. 1992. Diagnosis: A microcomputer-based teaching-aid. Plant Dis. 76:644-647.
(11) Stewart, T.M., Duncan, S., and Rankin, L. 1992. Plant diagnosis - "detective work" via HyperCard. Abstracts from Pacific University Conference '92, Kyoto, Japan: 96-97.
(12) Stewart, T.M., Blackshaw, B.P., Duncan, S., Dale, M.L., Zalucki, M.P., and Norton, G.A. 1995. Diagnosis: A novel, multimedia, computer-based approach to training crop protection practitioners. Crop Prot. 14:241-246.
(13) Stewart, T., Kemp, R., and Bartrum, P. 2001. Computerised problem based scenarios in practice - a decade of DIAGNOSIS. Pages 153-156 in: Proceedings of the IEEE International Conference on Advanced Learning Technologies, Madison, WI, USA.