
Multimedia as cognitive tools: Students working with a performance support system[1]

Martyn Wild and Denise Kirkpatrick
Edith Cowan University
This paper describes a particular approach to the design of interactive multimedia software in tertiary education, rationalising the design and implementation of this software within the context of (i) computers as cognitive tools, and (ii) computers as performance support systems (PSSs). The software is specifically aimed at student teachers and is conceptualised to provide cognitive support in the completion of a complex task - the planning of lessons. In this sense, the software can also be considered to behave as a performance support system (PSS), a type of software more commonly found in commercial and industrial settings than in educational ones.


Introduction

Cognitive tools are computer based applications that are normally used as productivity software. However these applications may also function as knowledge representation formalisms that require learners to think critically when using them to represent content being studied or what they already know about a subject (Jonassen, 1995, 40).

In an extensive discussion of the value of cognitive tools, Jonassen describes how conventional applications, such as spreadsheets, databases and expert systems, might become intellectual partners and serve to expand and amplify the thinking of learners, engaging students as knowledge constructors rather than information processors (Jonassen, 1995). In one sense, Jonassen is telling us nothing that is particularly new - Briggs, Nichol, Dean and others of the 'Prolog community' have long sought to provide learners with a range of cognitive tools for the representation and exploration of knowledge (Briggs, Nichol, & Brough, 1990; Dean, 1990; Nichol, Briggs, & Dean, 1988). Furthermore, various research and development teams have similarly been involved in providing cognitive tools that engage students in a range of modelling environments, so that they might represent and manipulate knowledge according to various formalisms (Cox & Webb, 1994; Mellar, et al., 1994; Webb, 1994). The learning theories underpinning the development of such cognitive tools, and our understanding of their value, are reasonably robust: they are cognitivist in basis and, in a general sense, covered by the umbrella of constructionism. More particularly, the use and value of cognitive tools has of late been shown to owe much to mental models theories, particularly to that of Johnson-Laird (Johnson-Laird, 1983; Wild, in press).

This paper, then, describes a research project that seeks to exemplify how performance support systems (PSSs) can be used by students as cognitive tools to express and extend their thinking in a complex domain.

Performance support systems

A Performance Support System is interactive software that is intended both to train and to support the novice user in the performance of tasks. Raybould describes a PSS as a "computer based system that improves worker productivity by providing on the job access to integrated information, advice and learning experiences" (Raybould, 1990).

There exist slightly different perspectives of PSSs, each moulded by small shifts in emphasis; for example, Barker and Banerji stress the problem centred focus of PSSs (Barker & Banerji, 1995), whilst McGraw characterises PSSs in terms of their facilities, noting their integration of AI technologies, hypermedia and CBT (McGraw, 1994). PSSs can also be described in terms of the uses made of them - that is, in addition to their role in instructing and supporting novices, they might be used by those more experienced in the focus tasks to increase efficiency and quality of output, for example by serving as amplifiers of experience and knowledge (Gery, 1991). Traditionally, however, PSSs have been characterised by their structure and the software resources they provide; that is, they comprise hypermedia reference and instructional sequences, together with open ended software tools and context sensitive supporting information (Gery, 1995). Such software has been developed in training situations in medicine (eg. medical diagnostic systems), engineering (eg. computer assisted design systems) and management (eg. decision support systems). More recently, the concept of performance support has been applied to mainstream and generic software tools, such as Microsoft Excel; further, the nature of supporting functions in current and future designs of PSSs has been reconceptualised by Gery (1995) to allow for increasingly diverse applications and types of performance support.

However, these later developments in the design, application and theory of PSSs have not altered their main purpose, which is, quite simply, to facilitate satisfactory or improved performance of a task by someone with limited experience and training in that task, by providing just-in-time resources (instructional, supporting and performance resources). Moreover, PSSs, as well as the supporting functions found in more sophisticated mainstream generic software tools or applications, are more often applied to simple rather than complex tasks. In applying PSSs to complex tasks, it is argued that both instruction and performance support functions need to provide for higher order learning, and particularly for transfer of knowledge. Again, this is fundamentally different to the traditional nature and purpose of PSSs, which are concerned with tasks characterised by training in systems' use, whether in a business or software engineering sense (Raybould, 1995). It is worth noting that the uses and types of PSSs are likely to diversify in their future manifestations, when the nature of any particular PSS will be defined largely in terms of its target application, its users and the related domain.

In this light, a PSS has been developed for use by first year Education students in the academic setting and in the classroom. This PSS is intended to facilitate the development of student skills in the area of lesson planning. The basic premise underlying the development of the Lesson Planning System (LPS) is that it provides a structured environment within which student and beginning teachers are able to design lesson plans for immediate implementation and also receive instructional support in the design process. By engaging novices in the process of designing materials that impact directly on their teaching, the LPS is intended to provide for deeper processing of a complex task, resulting in a more complete understanding of the domain. This is essentially the role and purpose of all cognitive tools (Jonassen, 1994; Jonassen, 1995).

To date, instructional materials based on interactive technologies have tended to focus on the instructional aspect of task performance (Brown, 1991; Jih & Reeves, 1992). It is contended that student use of the LPS in the school and university setting will facilitate the transfer of cognitive strategies and minimise the distinction between 'learning and doing', thereby improving students' lesson planning performance.

The design of the LPS

The LPS incorporates the model of lesson planning required by Edith Cowan University, Western Australia, and further afield. It includes essential components of lesson planning such as writing learning objectives, developing learning experiences and planning evaluation. Each component is supported by activities that instruct the user about the task (eg. provision of information relating to reasons why objectives are necessary, criteria for quality objectives) and that also assist the user in performing the task (eg. provision of a database of verbs to assist in writing quality learning objectives). A set of software tools is available to support each activity. One of these, for example, is a knowledge base system that enables students to evaluate the effectiveness of their completed lesson plans. This works by prompting users to analyse and reflect upon the appropriateness of the evaluation processes they have set in relation to lesson objectives.
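By way of illustration only, the sketch below (in Python) suggests how a lesson plan, the verb database and a simple reflective check of the kind just described might be represented. The class names, verb lists and prompts are assumptions made for the sketch and do not reproduce the actual LPS implementation.

# A minimal, hypothetical sketch of the structures described above.
# Class names, verb lists and prompt wording are illustrative assumptions,
# not the actual LPS code.
from dataclasses import dataclass, field

# A small "verb database" grouped by cognitive level, in the spirit of the
# database of verbs used to help students write quality learning objectives.
VERB_DATABASE = {
    "knowledge": ["list", "name", "identify"],
    "comprehension": ["explain", "summarise", "describe"],
    "application": ["demonstrate", "use", "solve"],
}

@dataclass
class Objective:
    verb: str           # action verb, eg "explain"
    content: str        # what the learner will do, eg "the water cycle"

@dataclass
class LessonPlan:
    objectives: list[Objective] = field(default_factory=list)
    experiences: list[str] = field(default_factory=list)   # learning experiences
    evaluation: list[str] = field(default_factory=list)    # ideally one entry per objective

def review_plan(plan: LessonPlan) -> list[str]:
    """Return reflective prompts rather than verdicts, mirroring the
    knowledge base component that asks students to reconsider their plan."""
    prompts = []
    known_verbs = {v for verbs in VERB_DATABASE.values() for v in verbs}
    for obj in plan.objectives:
        if obj.verb.lower() not in known_verbs:
            prompts.append(
                f"Is '{obj.verb}' an observable, measurable verb for the objective on '{obj.content}'?")
    if len(plan.evaluation) < len(plan.objectives):
        prompts.append(
            "Some objectives have no matching evaluation activity - how will you know they were met?")
    return prompts

The point of such a check is not to mark the plan right or wrong, but to return prompts that push the student back into reflection, consistent with the role of cognitive tools described above.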

The lesson planning process can be viewed as an exercise in problem solving. An important factor in solving problems is domain specific comprehension. Glaser has suggested that one of the features distinguishing a novice from an expert is the incompleteness of the novice's knowledge base, rather than limitations in their processing capabilities (Glaser, 1984). It has been suggested that the transition from novice to expert performance is largely provided for by the acquisition of a suitable knowledge base (Glaser, 1982). A knowledge base consists of both descriptive and heuristic components: descriptive knowledge is the shared knowledge of experts and practitioners that is usually found in text books, while the heuristic component includes the knowledge of good practice and judgement constructed over years of experience. It is suggested that the description of expert performance should include two related aspects: the information structures and declarative knowledge that are required for performance, and the cognitive strategies and procedural knowledge that are required by the task.

Lesson planning is an essential cognitive skill for teachers. Effective lesson planners possess declarative knowledge about themselves as planners, about the task of lesson planning and about ways of going about the task. They also possess domain specific knowledge, such as the criteria for creating instructional objectives, the most appropriate strategies to achieve particular objectives and the range and relevance of evaluation techniques. They know how to plan lessons in the appropriate way, what is required of them in planning a lesson and they know when and why to perform particular aspects of lesson planning. In addition to this knowledge they have the skills to regulate their own performance, checking and monitoring to ensure they are meeting certain criteria. They also possess the skills and knowledge to allow themselves to correct errors. These characteristics of the lesson planner, the task and the interaction of both, are all addressed in the design of the LPS.

Figure 1: Two of the tools within the LPS in use: the Verb Database and the WorkPad.

Multimedia or hypermedia?

The title of this paper alludes to the design of interactive multimedia tools; however, there is an implicit assumption made here that the value of multimedia as cognitive tools is not centrally concerned with the various media embedded in multimedia, but rather with the use of multimedia in a hypermedia structure. That is, the cognitive value of multimedia lies in the hypermedia structure governing its application - a semantic or associative network of interlinked information, distributed across a range of media (ie. sound, graphic, animation, video).
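Such an associative network can be pictured, in outline, as a set of typed information nodes joined by labelled, non-hierarchical links. The short Python sketch below is an illustrative assumption only; the node names, media types and link labels are invented for the example and do not describe the LPS's internal representation.

# An illustrative sketch of the kind of hypermedia structure discussed above:
# nodes of different media types joined by labelled associative links.
# All identifiers and labels are assumptions for illustration only.
from collections import defaultdict

class HypermediaNetwork:
    def __init__(self):
        self.nodes = {}                      # node id -> (media type, content reference)
        self.links = defaultdict(list)       # node id -> [(relation, target id), ...]

    def add_node(self, node_id, media, content):
        self.nodes[node_id] = (media, content)

    def add_link(self, source, relation, target):
        # Associative (non-hierarchical) link between two information nodes
        self.links[source].append((relation, target))
        self.links[target].append((relation, source))

    def neighbours(self, node_id):
        """Information reachable in one associative step from a node."""
        return [(rel, self.nodes[t]) for rel, t in self.links[node_id]]

# Example: a small semantic network around 'learning objectives'
net = HypermediaNetwork()
net.add_node("objectives", "text", "Why objectives matter")
net.add_node("verb_demo", "video", "Writing an objective, a worked example")
net.add_node("evaluation", "text", "Matching evaluation to objectives")
net.add_link("objectives", "illustrated by", "verb_demo")
net.add_link("objectives", "assessed through", "evaluation")
print(net.neighbours("objectives"))

The educational claim, then, rests not on the media types themselves but on the associative links a learner can traverse and, ideally, construct.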

Indeed, the most notable, if not the most distinguishing, feature of interactive multimedia software in terms of its educational significance is this facility to provide for non-hierarchical representations. Interestingly, it was those working with knowledge representation tools who, looking for a theoretical framework in approaches to learning, initially suggested that computer based semantic representation of knowledge perhaps best mirrored the behaviour of certain higher order cognitive activities (Nichol, 1988; Nichol, Briggs, & Dean, 1988) - a suggestion that finds a basis in Minsky's theory of cognitive frame representation (Minsky, 1975); and, more recently, in mental models theories (Gentner & Stevens, 1983; Glaser, 1984; Johnson-Laird, 1983; Johnson-Laird, 1993; Wild, in press). Of course, even if one accepts this premise, it does not automatically follow, a priori, that using hypermedia structures for knowledge representation will result in better cognitive representations on the part of learners.

Modelling

The LPS is a cognitive tool that encourages problem solving through modelling, that is, the building and exploring of qualitative models. In this sense, users of the LPS are encouraged to create models of lesson plans and to explore, test and refine those models.

Modelling is an essential component of cognitive activity, of thinking, and for Craik, the originator of the concept of mental models, thinking is concerned with the organisation and functioning of mental processes and representations (Craik, 1943; Johnson-Laird, 1993). It follows that cognitive tools must necessarily provide for modelling activity. That is, they must provide the means by which learners can construct, manipulate and evaluate representations of knowledge. The modelling environment needs to be accurate and structural but not necessarily complete, enabling learners to move from their own mental representations of lesson planning to the conceptual model of that process required by an expert. In this process, novices will be able to construct a deeper understanding of a complex domain.

It is generally agreed that although a modelling environment should not be complete it is important that it remains functional; that is, it must provide the learner with some expert knowledge and it must facilitate learner predictions (L.M.M.G., 1988; Mellar, et al., 1994; Wild, in press). It is the incompleteness of the model that provides the opportunity for construction, reflection and change. In this sense, the LPS provides an environment for learners to externalise their own understanding of the lesson planning process, to identify inaccuracies or insufficiencies in their thinking and to reflect on their cognitive models without expressing a commitment to any one in particular.

Indeed, it is known that mental reasoning (propositional, relational and quantified reasoning) involves the construction and evaluation of a number of possible models to suit particular interpretations of premises to an event, before making a final inference or conclusion (Johnson-Laird, 1983; Johnson-Laird & Byrne, 1991). Since the limitation to inferential processing is the capacity of working memory, the greater the number of models needed for an inference, the harder that inference will be (Sweller & Chandler, 1994). Furthermore, learners will sometimes fail to construct all possible models for a given event - if they arrive at a conclusion that fits their available beliefs, they will tend not to search for others, with the consequence of overlooking the correct conclusion (Johnson-Laird, 1993; Johnson-Laird & Byrne, 1991). Also, in this context, learners may construct mental models based on seemingly analogous experiences, which may compound the construction of misconceived models (Jih & Reeves, 1992). Thus, by providing cognitive tools on the computer, it is possible to provide the necessary means for learners (in this case, student teachers) to externalise their thinking and consequently create strong and accurate models that otherwise might prove elusive.

Cognitive load

The greater the availability and accessibility of information within a given computer environment, the more likely users are to flounder as a result of excessive cognitive load, or cognitive overload, and consequently fail to learn. According to Jih and Reeves, learners using a hypermedia system must cope with and integrate three types of cognitive load: the content of the information, the structure of the program and the response strategies available (Jih & Reeves, 1992). How learners cope with such a load depends largely on the human-computer interface. For example, cognitive load can be reduced by: (i) reducing the number of options at any one point in the program; (ii) encouraging users to externalise their thinking, for example by use of text annotations and place marking; (iii) 'hiding' program options not likely to be needed by most users; (iv) providing strong visual cues to aid navigation; and, (v) reducing the number of hypermedia links between information nodes (Oren, 1990).
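Three of these strategies - limiting the options visible at any one point, hiding options most users will not need, and capping the links offered from a node - can be expressed in a few lines of code. The Python sketch below is illustrative only; the option names, caps and link limit are assumptions and are not drawn from the LPS.

# A hedged sketch of how strategies (i), (iii) and (v) above might be applied
# when building one screen of a hypermedia program. Values are assumptions.
ADVANCED_OPTIONS = {"export plan", "edit verb database"}   # hidden from most users
MAX_VISIBLE_OPTIONS = 5                                    # cap on options per screen

def visible_options(all_options, show_advanced=False):
    """Return the reduced option set actually presented to the learner."""
    options = [o for o in all_options if show_advanced or o not in ADVANCED_OPTIONS]
    return options[:MAX_VISIBLE_OPTIONS]

def capped_links(links, max_links=4):
    """Limit the number of hypermedia links offered from one information node."""
    return links[:max_links]

# Example usage
menu = ["write objectives", "plan experiences", "plan evaluation",
        "open WorkPad", "export plan", "edit verb database", "view structure map"]
print(visible_options(menu))            # novice view: advanced tools hidden, capped at 5
print(visible_options(menu, True))      # experienced view, still capped

Strategies (ii) and (iv), annotation tools and visual cues, are matters of interface design rather than filtering rules and are not shown here.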

The means by which users deal with the cognitive load imposed by the LPS will largely be a function of their conception of the lesson planning task as well as of the software interface. Certainly, software features such as online help (ie. help, for example, in planning the task) and dynamic structure maps (ie. maps to show a user's position in the hypermedia environment at any one point) are included in the design of the LPS to encourage learners to build strong conceptualisations, or mental models (Jih & Reeves, 1992).

Learner control

Learner control refers to the dimension of computer based education that describes the level of control exercised by the learner when interacting with a given software item. Despite the fact that learner control has been one of the most heavily researched dimensions of computer based education in recent years (Steinberg, 1989), Reeves has pointed out that many of the research studies are flawed in both their theoretical and methodological bases (Reeves, 1993). It seems to be popularly assumed that the greater the control exercised by the learner (as opposed to that exercised by the software) within a given software environment, the greater the level of learning will be. This assumption is undoubtedly a product of cognitivist learning perspectives, and is closely related to the following fundamental premises: (i) learners are active processors of information; and, (ii) knowledge is more likely to be successfully constructed when learners have control over the learning process (Rowe, 1993). However, what evidence we do have about learner control is at best contradictory and at worst negative (Reeves, 1993; Steinberg, 1989). In particular, Oliver draws attention to research that suggests that unskilled learners fare especially badly in terms of performance outcomes when the degree of learner control is high and external control (eg. control by the program) is low (Oliver, 1994).

The LPS provides for significant learner control over a range of learning processes, including: task perception, information retrieval and processing, problem solving, knowledge construction, revision, reflection and cognitive modelling. The research program to investigate the effectiveness of the LPS will, in part, consider whether the high degree of learner control invested in the software system affects performance outcomes.

An initial evaluation

A hypothesis central to the development of the LPS is that its use will result in students creating better quality lesson plans, and doing so more efficiently, than they do by traditional means. Indeed, although it is relatively simple to obtain measures to test this hypothesis, the complexity of a PSS, in terms of the ways and contexts in which it might be used, makes it difficult to isolate the effects of the use of the system on performance alone, or even to compare alternative ways of completing the same task (Collis & Verwijs, 1995). However, once this fact is recognised, it is relatively easy to account for the strategies that students employ in making use of the system; this is something planned for in the research program, the results of which will be published elsewhere.

What is perhaps of more immediate interest is evaluation of the nature of the task the LPS is directed towards, the experience of the students in the task and the context in which students use the LPS to help perform that task. This is what Collis and Verwijs conceptualise, in a protracted description, as user orientated evaluation (Collis & Verwijs, 1995). Barker and Banerji, perhaps more elegantly, suggest something similar: namely, that evaluation of a PSS should take account of task execution in terms of its context, the use of resources available to perform that task, and the skill and knowledge levels required by the task in relation to those possessed by the user (Barker & Banerji, 1995).

As part of the iterative evaluation processes employed in the life cycle of the LPS, a quasi-experiment was conducted, following an approach used by Barker and Banerji (1995), to assess the potential usefulness of the LPS in terms of context of use, types of user and system resources made available. Two groups of users were identified - an expert and a novice group; both groups were asked to make use of the LPS at university, in preparation for professional practice (where lesson planning is a required and assessed performance skill of the students), and in the field (ie. at schools, over the period of the two-week professional practice). The expert (E) and novice (N) groups were differentiated by students' experience with lesson planning and, partially, by their own perceptions of their lesson planning skills. Thus, expert students were those who had completed two years of an education degree course and who perceived themselves as 'very capable' in lesson planning, whereas novices were those who had completed only six months and who considered their lesson planning skills 'poor'.

Each of the students in the expert and novice groups (four students per group) was asked to undertake the lesson planning task twice (after initially being made familiar with the system), at their own pace. There is, of course, no control over the products generated as a result of task completion (ie. the lesson plans); however, it is assumed that they will be applied in the students' own teaching. Data were collected using independent checklists for each student. These data included the number of student interactions with both instruction (support) and performance (tools) components of the LPS (Figure 2), as well as the time taken for each student to complete the tasks (Figure 3). For purposes of analysis, the data collected for students working in the two different contexts (ie. at university and in the field) have been presented as means.
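The form of this analysis can be sketched as follows. The Python fragment below simply averages checklist counts and completion times per group and task, collapsing across the two contexts; the records shown are invented placeholders used to illustrate the calculation and are not the data reported in Figures 2 and 3.

# Illustrative only: averaging checklist measures per (group, task),
# collapsing over the university and field contexts. The records are
# placeholder values, not the study's actual data.
from statistics import mean

records = [
    # (group, task, context, instruction_uses, tool_uses, minutes)
    ("novice", "T/1", "university", 14, 9, 55),
    ("novice", "T/1", "field",      12, 8, 48),
    ("novice", "T/2", "university",  6, 7, 35),
    ("novice", "T/2", "field",       5, 7, 33),
    ("expert", "T/1", "university",  4, 6, 30),
    ("expert", "T/1", "field",       5, 7, 38),
    ("expert", "T/2", "university",  3, 6, 31),
    ("expert", "T/2", "field",       4, 6, 32),
]

def summarise(records, measure_index):
    """Mean of one measure per (group, task), collapsing across contexts."""
    cells = {}
    for group, task, _context, *measures in records:
        cells.setdefault((group, task), []).append(measures[measure_index])
    return {cell: mean(values) for cell, values in cells.items()}

print(summarise(records, 0))   # mean instruction (support) interactions
print(summarise(records, 2))   # mean completion time in minutes

Presenting the two contexts as a single mean per group and task is what allows the comparisons shown in Figures 2 and 3.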

Figure 2: Task performance characteristics

Although the combined population of the student groups was small (n=8), and therefore a limiting factor on the value of the findings, the results are of interest as part of the iterative evaluation process. For example, Figure 2 demonstrates how, for the second task (T/2) at least, the manner in which the novice students perform the task moves closer to that of the expert students; that is, the novice students make less use of the instructional components of the LPS, and the level of their use more closely resembles that of the expert students. Also, novice and expert students' use of the performance functions of the LPS converges, demonstrating comparatively consistent user behaviours across student groups. Furthermore, this pattern holds when the task performance time characteristics are taken into account (Figure 3). Here, despite a slightly erratic pattern for task 1 (T/1), novice and expert students are seen to take very similar amounts of time to complete the second task (T/2).

Figure 3: Task performance time

Obviously, the data collected are of limited scope; but the analysis does demonstrate that the LPS can operate effectively to improve the performance characteristics of novice lesson planners, aligning aspects of their performance more closely with that of expert lesson planners. There are, of course, a range of other issues and areas that need to be evaluated, for example the context of use and the strategies of use employed by both novice and expert student lesson planners over longer time frames. More importantly, it would also be necessary to tackle issues of transfer of knowledge.

Conclusion

This paper has sought to describe something of the design and development of a cognitive tool for student teachers, based on the concept of a performance support system. It is implied throughout the paper that PSSs can provide valuable cognitive tools for both novice and more experienced specialists undertaking complex tasks in a range of domains. The paper has further intended to map the value of cognitive tools in general, and the Lesson Planning System in particular, onto mental models theories, especially those advanced by Johnson-Laird (1983, 1993), Gentner and Stevens (1983) and Glaser (1982, 1984).

Within the paper, issues have been raised concerning the design and development of cognitive tools such as the LPS - issues which face all of us interested in placing the power of such tools in the hands of students. However, in this context, it is of value to note that although design decisions for the LPS have been based on what instructional psychologies suggest are likely to be effective conditions of learning, the concept of the LPS, and indeed of cognitive tools in general, is concerned more with describing how the learner might interact with the instruction to construct new knowledge. That is, the design of the LPS is less about creating the instructional conditions that Gagne and Glaser prescribe as being necessary for learning (Gagne, 1977; Glaser, 1987), and more about describing and defining how the instructor, the knowledge and the learner should interact. Indeed, this is much more in keeping with what Marton and Ramsden, and other advocates of phenomenographic approaches to researching teaching and learning, have revealed about effective learning (Laurillard, 1993; Marton & Ramsden, 1988).

Endnote

  1. An earlier version of this paper was first presented at the Australian Council for Education through Technology National Conference, Perth, 8-12 January, 1996.

References

Barker, P. & Banerji, A. (1995). Designing electronic performance support systems. Innovations in Education and Training International, 32(1), 4-12.

Briggs, J., Nichol, J. & Brough, D. (1990). PEG: A way of thinking about things. In A. McDougall & C. Dowling (Eds), Proceedings of the IFIP TC Fifth World Conference on Computers in Education - WCCE 90, (pp. 1061-1066). Sydney, Australia: Elsevier Science Publishers.

Brown, M. (1991). An investigation of the development process and costs of CBT in Australia. In R. Godfrey (Ed), Simulation and Academic Gaming in Tertiary Education. Proceedings of the 8th Annual Conference of the Australian Society for Computers in Learning in Tertiary Education, (pp. 43-54). Launceston, Tasmania: University of Tasmania.

Collis, B. & Verwijs, C. (1995). Evaluating electronic performance support systems: A methodology focused on future use-in-practice. Innovations in Education and Training International, 32(1), 23-30.

Cox, M. & Webb, M. (1994). Developing software and curriculum materials: The Modus project. In H. Mellar, J. Bliss, R. Boohan, J. Ogborn, & C. Tompsett (Eds), Learning with artificial worlds: Computer based modelling in the curriculum (pp. 188-198). London: Falmer Press.

Craik, K. (1943). The nature of explanation. Cambridge, UK: Cambridge University Press.

Dean, J. (1990). Thinking, learning and knowledge based tools. In A. McDougall & C. Dowling (Eds), Proceedings of the IFIP TC Fifth World Conference on Computers in Education - WCCE 90, (pp. 1045-1050). Sydney, Australia: Elsevier Science Publishers.

Gagne, R. M. (1977). The conditions of learning. New York: Holt, Rinehart and Winston.

Gentner, D. & Stevens, A. L. (1983). Mental models. Hillsdale, NJ: Lawrence Erlbaum.

Gery, G. (1991). Electronic performance support systems. Boston, MA: Weingarten Publications.

Gery, G. (1995). The future of EPSS. Innovations in Education and Training International, 32(1), 70-73.

Glaser, R. (1982). Instructional psychology: Past, present and future. American Psychologist, 37, 292-305.

Glaser, R. (1984). Education and thinking: The role of knowledge. American Psychologist, 39, 93-104.

Glaser, R. (Ed) (1987). Advances in instructional psychology. Hillsdale, NJ: Lawrence Erlbaum Associates.

Jih, H. J., & Reeves, T. (1992). Mental models: A research focus for interactive learning systems. Educational Technology Research and Development, 40(3), 39-53.

Johnson-Laird, P. N. (1983). Mental models: Towards a cognitive science of language, inference and consciousness. Cambridge: Cambridge University Press.

Johnson-Laird, P. N. (1993). Human and machine thinking. Hillsdale, NJ: Lawrence Erlbaum Associates.

Johnson-Laird, P. N. & Byrne, R. M. J. (1991). Deduction. Hillsdale, NJ: Lawrence Erlbaum Associates.

Jonassen, D. (1994). Technology as cognitive tools: Learners as designers. Paper presented in L. Rieber (Moderator), Instructional Technology Forum [online]. University of Georgia, 2 May 1994.

Jonassen, D. H. (1995). Computers as cognitive tools: Learning with technology, not from technology. Journal of Computing in Higher Education, 6(2), 40-73.

L. M. M. G. (1988). Tools for exploratory learning (Occasional paper No. InTER/5188). University of Lancaster. May, 1988.

Laurillard, D. (1993). Rethinking university teaching: A framework for the effective use of educational technology. London: Routledge.

Marton, F., & Ramsden, P. (1988). What does it take to improve learning? In P. Ramsden (Ed), Improving learning: New perspectives. London: Kogan Page.

McGraw, K. L. (1994). Performance support systems: Integrating AI, hypermedia and CBT to enhance user performance. Journal of Artificial Intelligence in Education, 5(1), 3-26.

Mellar, H., Bliss, J., Boohan, R., Ogborn, J. & Tompsett, C. (Eds). (1994). Learning with artificial worlds: Computer based modelling in the curriculum. London: Falmer Press.

Minsky, M. (1975). A framework for representing knowledge. In P. H. Winston (Ed), The psychology of computer vision. New York: McGraw Hill.

Nichol, J. (1988). Models, micro-worlds and minds. In J. Nichol, J. Briggs & J. Dean (Eds), Prolog, children and students. London: Kogan Page.

Nichol, J., Briggs, J., & Dean, J. (Eds) (1988). Prolog, children and students. London: Kogan Page.

Oliver, R. (1994). Measuring learning outcomes using IMM systems. In C. McBeath and R. Atkinson (Eds), Proceedings of the Second International Interactive Multimedia Symposium, 377-382. Perth, Western Australia, 23-28 January. Promaco Conventions. http://www.aset.org.au/confs/iims/1994/np/oliver.html

Oren, T. (1990). Cognitive load in hypermedia: Designing for the exploratory learner. In S. Ambron & K. Hooper (Eds), Learning with interactive multimedia (pp. 126-136). Washington: Microsoft Press.

Raybould, B. (1990). Solving human performance problems with computers. Performance and Instruction, (Nov/Dec), 4-14.

Raybould, B. (1995). Making a case for EPSS. Innovations in Education and Training International, 32(1), 65-69.

Reeves, T. (1993). Pseudoscience in computer based instruction: the case of learner control research. Journal of Computer Based Instruction, 20(2), 39-46.

Rowe, H. (1993). Learning with personal computers: Issues, observations and perspectives. Hawthorn, Victoria: Australian Council for Educational Research.

Steinberg, E. R. (1989). Cognition and learner control: A literature review, 1977-88. Journal of Computer Based Instruction, 3(3), 84-90.

Sweller, J., & Chandler, P. (1994). Why is some material difficult to learn? Cognition and Instruction, 8, 351-362.

Webb, M. E. (1994). Beginning computer-based modelling in primary schools. Computers and Education, 22(1-2), 129-144.

Wild, M. (in press). A perspective on mental models and computer modelling. Journal of Computer Assisted Learning.

Authors: Martyn Wild and Denise Kirkpatrick
Dept of Multimedia Learning Technologies
Edith Cowan University
Churchlands, Perth, Western Australia 6018
Tel: 619 273 8022 Fax: 619 387 7095
Email: m.wild@cowan.edu.au

Please cite as: Wild, M. and Kirkpatrick, D. (1996). Multimedia as cognitive tools: Students working with a performance support system. In C. McBeath and R. Atkinson (Eds), Proceedings of the Third International Interactive Multimedia Symposium, 412-418. Perth, Western Australia, 21-25 January. Promaco Conventions. http://www.aset.org.au/confs/iims/1996/ry/wild.html

