
Assessment methodologies in transition: Changing practices in web-based learning

Catherine McLoughlin
Teaching and Learning Centre, University of New England
Joe Luca
School of Communications and Multimedia, Edith Cowan University
The use of the WWW in tertiary learning environments offers greater adaptability and flexibility than traditional assessment procedures as it enables the planning and design of tasks that monitor both learning processes and learning outcomes. This paper proposes that the move towards alternative assessment paradigms has been accelerated by technology with its capacity to offer learners a broad array of activities, tasks and forums for engaging in constructivist learning.

There is now a new wave of pedagogy advocating 'alternative assessment', in which assessment is integrated into learning through engagement in real life contexts. Authentic assessment fosters understanding of learning processes through real-life performance, as opposed to a display of inert knowledge, and is solidly based on constructivism, which recognises the learner as the chief architect of knowledge building. In the constructivist learning environment of this study, assessment processes are mediated through social interaction, communication, exchange of views and collaboration, so that learners become aware of, and take responsibility for, assessing their own learning processes. In this study, alternative modes of assessment in a tertiary Web-based environment are exemplified through multiple assessment tasks that enable students to showcase achievement through portfolios, multimedia projects, skills demonstrations and teamwork.


Introduction

If assessment defines the curriculum and encapsulates the essential learning experience in higher education, the design of educative assessment tasks could be considered the most important element of tertiary teaching. As institutions move increasingly to online delivery, there is scope for technology to support authentic assessment practices in these environments. The case study reported here was designed in response to current social and economic debates in Australia about the quality of undergraduate education and the qualities that university graduates should possess (Candy, Crebert & O'Leary, 1994; Assiter, 1995).

During the last ten years there has been a major reappraisal of higher education, its purpose, outcomes and resourcing (Havard, Hughes & Clarke, 1998). There is now a more pronounced emphasis on the higher education-employment nexus, and particularly on the skills or competencies that can be transferred from a university setting to the workplace. In this changing environment, which is mirrored in New Zealand and Europe, it can be argued that peer assessment and peer learning are an appropriate response to the task of preparing tertiary students for the workplace. This study focuses on a context where peer support is integrated with assessment to develop skills through interaction and learning. These skills enable students to move more easily from formal education to the next stage of their lives and are increasingly demanded by employers. For the purposes of the present research, peer support is defined broadly as learning activities that are formalised and initiated by teachers, but developed and owned by students. This definition is deliberately chosen to signal the importance of student empowerment and ownership of the process, and to endorse a learning process which supports student autonomy, self-direction and independent learning (Boud, 1988; Nightingale et al, 1996).

Computer-based learning and authentic assessment

Information and communication technologies have the capacity to support a wide range of learning goals and are now integrated into the teaching approaches of many higher education institutions. Laurillard (1993) suggests that computer-based learning has a major role to play in promoting student learning. Through computer facilitated learning, students can access WWW sites, bulletin boards and on-line resources to support their own learning in generic research skills, information literacy, and the retrieval and management of data. However, many students find their experience in tertiary institutions too general or out of context, and cannot transfer these skills into their own professional disciplines (Hicks, Reid & George, 1999). The integration of generic competencies into contextualised, disciplinary areas offers learners a context in which to anchor their learning.

The shift to student self-direction and autonomy means that students need to take more responsibility for their own learning, but many need assistance in developing this capacity. Shaffer & Resnick (1999) maintain that technology can be used to create authentic contexts for learning, and to provide resources that give students opportunities for:

Applied to assessment, the representational pluralism enabled by computer technology expands the range of channels available to students to demonstrate understanding (Gardner, 1993; Greeno & Hall, 1997). For example, instead of narrowly defined learning outcomes tested by examinations, technology offers an environment in which real life skills such as written and verbal communication, collaboration and team work can be assessed by the team and the tutor, because learners have multiple channels of expression, such as visualisation and multimedia presentations that include audio and video elements. Thus, information technologies can be closely interwoven with the quality of the learning experience, and can be used to support authentic tasks for assessment.

Fostering deep learning through peer work and authentic assessment

Traditional university education has operated within a "transmissive paradigm", emphasising the transfer of knowledge from lecturer to student. Such a view of learning is not conducive to meaningful, active learning where students take a pro-active role in questioning, sharing ideas and applying prior knowledge to new ideas. However, the increased emphasis on generic transferable skills has required a re-alignment of teaching practices with desired learning outcomes (Biggs, 1999).

In contemporary education, one influential group of researchers has characterised students' approaches to learning as either surface or deep (Biggs, 1994; Ramsden, 1992). A deep approach is consistent with a search for knowledge and understanding, whereas a surface approach is concerned only with passing exams and memorising facts. Applied to assessment and teaching in higher education, the implication is that the creation of an appropriate learning environment can foster a deep approach. This can be achieved by enabling learners to take an active role in learning by initiating, managing, monitoring, reflecting on and evaluating learning tasks and processes. Gibbs (1992) emphasises that a focus on process, rather than content, is essential in promoting active learning, and that evaluation and assessment procedures are central to these issues because students interpret the objectives of a course of study according to the demands of the assessment system. For example, an exam requiring recall of facts will encourage learners to adopt a surface approach, whereas assessment of collaborative problem solving or teamwork on a project will emphasise communication skills, planning and decision making, and foster a deep approach.

The relevance of this to educational technology is that the attributes of technology can be used to increase learner autonomy and independence through the design of authentic assessment tasks. Online learning environments offer the scope to equip learners with professional skills and attributes. In addition, by making assessment a 'learning event' that develops process knowledge (rather than rote learning), we bring it closer to the context of the workplace, where professionals are expected to have self-management skills and to be able to make judgements about their own and others' work (Eraut, 1994). Moreover, traditional university examinations do not assess deep conceptual understanding and process skills (Entwistle & Entwistle, 1991). Indeed, the capacity of technology to foster professional skills through authentic assessment is an area of research that is only beginning to be explored.

Alternative assessment using technology

In recognition of the limitations of traditional university assessment, there is a new wave of pedagogy advocating 'alternative assessment', in which assessment is integrated with learning and emphasises real-life performance as opposed to the display of inert knowledge (Wiggins, 1998). This form of authentic assessment is solidly based on constructivism, which recognises the learner as the chief architect of knowledge building.

In constructivist learning environments there is social interaction, communication, exchange of views, collaboration and support for learners to take more responsibility for the learning process through learner-centred tasks (McLoughlin & Oliver, 1998; Collis, 1998). Socio-cultural theory is based on similar assumptions, and many theorists have highlighted the importance of reciprocal understanding and transactional dialogue in which knowledge is exchanged and modified in the light of peer feedback (Crook, 1994; Bruner, 1990). Salient features of constructivist learning environments include an emphasis on the following aspects:

Technology can be used effectively in constructivist learning environments as it affords situated learning contexts, communication channels, group work, learner control and the creation of Computer Supported Intentional Learning Environments (CSILEs), which foster higher-order cognition and self-directed learning (Scardamalia & Bereiter, 1992; Birenbaum, 1999; Reeves & Laffey, 1999).

The use of the WWW to create assessment tasks offers greater adaptability and flexibility than traditional assessment procedures, as it enables the collection and storage of continuous data and the creation of micro-environments in which learners solve real life problems. It can be argued that the move towards alternative assessment paradigms has been accelerated by technology, with its capacity to support a broad array of activities, tasks and forums for assessment. For instance, alternative modes of assessment encourage the use of multiple assessment tasks and multiple modes of showcasing student achievement, through portfolios, multimedia projects, skills demonstrations and teamwork. A further important contribution made by technology to these new modes of assessment is the capacity to support the evaluation of learning processes, such as communication, group work and collaborative problem solving, as opposed to a narrow focus on a single outcome as an indicator of competence. In this study, multiple forms of assessing student learning were facilitated by computer technology, while a focus was maintained on learning processes and professional skills rather than content-based outcomes. Table 1 contrasts typical tests with authentic assessment tasks, and shows how Web-based environments can support authentic assessment.

Table 1: Ensuring Authentic Performance through Web-based assessment
(based on Wiggins, 1998)

Typical tests: Require correct responses only.
Authentic tasks: Require quality product and/or performance, and justification.
Indicators of authenticity: Assess whether the student can explain, apply, self-adjust, or justify answers, not just the correctness of answers using facts and algorithms.
Web-based support: Allows students to articulate viewpoints in text-based conversation that can be archived as a learning resource.

Typical tests: Must be unknown in advance to ensure validity.
Authentic tasks: Are known as much as possible in advance; involve excelling at predictable, demanding and core tasks; are not "gotcha!" experiences.
Indicators of authenticity: The tasks, criteria, and standards by which work will be judged are predictable or known, like a project proposal for a client, etc.
Web-based support: Web-based teaching allows access to multiple sources of information about the task, while allowing learners to explore alternatives.

Typical tests: Are disconnected from a realistic context and realistic constraints.
Authentic tasks: Require real-world use of knowledge: the student must "do" history, science, etc. in realistic simulations or actual use.
Indicators of authenticity: The task presents a challenge and a set of constraints that are authentic, likely to be encountered by the professional (know-how, not plugging in, is required).
Web-based support: The task is a challenge that can extend the confines of the classroom to involve complex, ill-defined tasks and collaboration.

Typical tests: Contain isolated items requiring use or recognition of known answers or skills.
Authentic tasks: Are integrated challenges in which knowledge and judgment must be used innovatively to fashion a quality product or performance.
Indicators of authenticity: The task is multifaceted and non-routine, even if there is a "right" answer. It thus requires problem clarification, trial and error, adjustments, adapting to the case or facts at hand, etc.
Web-based support: The Web provides access to information, databases and course notes, and learners have control.

Typical tests: Are simplified so as to be easy to score reliably.
Authentic tasks: Involve complex and non-arbitrary tasks, criteria, and standards.
Indicators of authenticity: The task involves the important aspects of performance and/or core challenges of the field of study, not the easily scored.
Web-based support: Web-based learning provides multiple vehicles for showcasing student achievement, including portfolios and skills demonstrations.

Typical tests: Are one-shot.
Authentic tasks: Are iterative: contain recurring essential tasks, genres, and learning processes.
Indicators of authenticity: The work is designed to reveal whether the student has achieved real versus surface mastery, or understanding versus mere familiarity, over time.
Web-based support: Web-based teaching enables the gathering of continuous process data on student achievement.

Typical tests: Depend on highly technical correlations.
Authentic tasks: Provide direct evidence, involving tasks that have been validated against core skills and discipline-based challenges.
Indicators of authenticity: The task is valid and has face validity: it evokes student interest and persistence, and seems apt and challenging to students and teachers.
Web-based support: Web-based teaching allows students to be active and to demonstrate divergent as well as convergent thinking.

Typical tests: Provide a score.
Authentic tasks: Provide useable, diagnostic feedback: the student is able to confirm results and self-adjust as needed.
Indicators of authenticity: The assessment is designed not merely to audit performance but to improve future performance. The student is seen as the primary 'customer' of information.
Web-based support: Web-based teaching allows intervention and continuous feedback on processes.
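The right-hand entries of Table 1 repeatedly point to one practical affordance: the Web can accumulate continuous process data (postings, feedback, revisions) rather than a single end-point score. As an illustration only, the sketch below shows one way such an event log might be structured. It is not the system used in this study, and the names ProcessEvent, ProcessLog and events_for are hypothetical.

```python
# A minimal sketch (assumed, not the authors' system) of logging continuous
# process data: every assessable event is time-stamped so a tutor can review a
# student's process over the semester rather than a single end-point score.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class ProcessEvent:
    student: str
    kind: str          # e.g. "solution_posted", "peer_feedback", "revision"
    detail: str        # free-text description or a URL to the artefact
    when: datetime = field(default_factory=datetime.now)

class ProcessLog:
    """Accumulates assessable events for later qualitative review."""
    def __init__(self) -> None:
        self._events: List[ProcessEvent] = []

    def log(self, event: ProcessEvent) -> None:
        self._events.append(event)

    def events_for(self, student: str) -> List[ProcessEvent]:
        """Return one student's events in chronological order."""
        return sorted((e for e in self._events if e.student == student),
                      key=lambda e: e.when)

# Example: the tutor can see that a student posted, received feedback, then revised.
log = ProcessLog()
log.log(ProcessEvent("kim", "solution_posted", "Draft solution to problem 3"))
log.log(ProcessEvent("kim", "peer_feedback", "Comment from a team member on structure"))
log.log(ProcessEvent("kim", "revision", "Revised solution after peer comments"))
for e in log.events_for("kim"):
    print(e.when.isoformat(timespec="seconds"), e.kind, "-", e.detail)
```

The point of the design is that assessment evidence is gathered as a sequence of events over time, which supports the iterative, feedback-driven tasks described in Table 1.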

Context of the study: Course description

The creation of the peer support tasks took place in a tertiary academic unit called "Interactive Multimedia Development Methodologies", intended to develop student expertise and knowledge in project management for developing multimedia in a team-based environment. The unit is currently offered through a combination of delivery modes, including face-to-face small group work, team projects and problem solving tasks facilitated by a tutor, together with an on-line component. The on-line component provides access to course notes, the syllabus, assessment details and previous projects, as well as group communication facilities such as email, bulletin boards and problem solving tasks. The course objectives were closely linked to the professional competencies required for multimedia development in industry and were integrated with authentic assessment tasks in which learners develop multimedia products in a team environment, thus replicating the skills required of them in the workplace. A further feature of the assessment tasks was their focus on learning processes, rather than mastery of content. The assessment tasks and associated learning processes are presented in Table 2.

Table 2: The assessment tasks and associated learning processes

Assessment task: Web-based product development
Learning processes:
  • Team work, planning, decision making, project management

Assessment task: Peer assessment of problem solving
Learning processes:
  • Higher order thinking and analysis
  • Development of criteria for assessment
  • Giving feedback to peers
  • Reflection on feedback leading to revision of ideas

Assessment task: Searching for, and applying, relevant knowledge by posting a URL that supports the solution to the problem
Learning processes:
  • Analysis, reorganising information, testing for usability and applying it to an authentic task

The learning objective of the peer assessment tasks was that participants would evaluate peer responses to problem solving and discuss issues and revisions in order to support each other through an on-line facility. The unit thus integrated process-based assessment tasks in order to contribute to the development of team skills, inquiry and reflection (Biggs, 1999). The unit also aimed to teach content knowledge and thinking strategies using real life scenarios. This approach to on-line learning can be regarded as process-oriented, as it focuses on the processes of knowledge construction and utilisation (Volet, McGill & Pears, 1995; Vermunt, 1995), as opposed to simple mastery of content knowledge.

Participants and study design

The nine student participants, five female and four male, were all postgraduates undertaking the Project Management unit, a core unit in the Graduate Diploma of Interactive Multimedia Technologies. Through the design of peer-supported learning tasks, the unit introduced learners to situations and ways of working with knowledge that were experientially based. Students participated in the decision making process of assessment in various ways: they could choose and define a topic for their project, create and manage their own development team, and negotiate peer assessment as part of their contribution to presentations. The design was based on pedagogical and curriculum philosophies that acknowledge peer response, peer feedback and support as essential to the development of independent learning and self-direction (Candy, 1991; Boud, 1988).

The assessment tasks involved groups of students posting solutions to ill-structured problems on the class Web page, participating in team work, and facilitating team building through discussion forums where team members could deliberate, plan and exchange ideas. The problems were loosely framed, ill-defined and open-ended, and required learners to analyse existing knowledge, suggest a course of action and then post a solution. In addition, students were asked to post a URL or information source that specifically supported the standpoint adopted in solving the problem. This was done before students attended a seminar where they presented their solutions, justified their approaches and commented on other students' solutions. Students were required to submit their solution at a specific time during the week, and then to evaluate each other's solutions and provide qualitative feedback. During the seminar the strengths and weaknesses of each solution were discussed and students had an opportunity to defend their solutions.
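To make the weekly cycle concrete, the sketch below models the sequence just described: a solution and supporting URL posted before a deadline, followed by qualitative peer comments. It is a hypothetical illustration rather than the actual class Web page, and the names WeeklyCycle, post_solution and give_feedback are invented for the example.

```python
# A minimal sketch (assumed, not the authors' implementation) of one weekly
# peer-assessment cycle: solutions with a supporting URL are posted before a
# deadline, then peers attach qualitative feedback to each solution.

from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Dict, List

@dataclass
class Solution:
    author: str
    problem_id: str
    text: str
    supporting_url: str
    posted_at: datetime
    feedback: List[str] = field(default_factory=list)   # qualitative peer comments

class WeeklyCycle:
    """One problem-solving cycle: submissions close at `deadline`, feedback follows."""
    def __init__(self, problem_id: str, deadline: datetime) -> None:
        self.problem_id = problem_id
        self.deadline = deadline
        self.solutions: Dict[str, Solution] = {}

    def post_solution(self, author: str, text: str, supporting_url: str) -> None:
        now = datetime.now()
        if now > self.deadline:
            raise ValueError(f"Submission from {author} is past the deadline")
        self.solutions[author] = Solution(author, self.problem_id, text,
                                          supporting_url, posted_at=now)

    def give_feedback(self, reviewer: str, author: str, comment: str) -> None:
        if reviewer == author:
            raise ValueError("Peer feedback must come from another student")
        self.solutions[author].feedback.append(f"{reviewer}: {comment}")

# Example use
cycle = WeeklyCycle("week-4", deadline=datetime.now() + timedelta(days=7))
cycle.post_solution("lee", "Proposed project schedule and risk plan", "http://example.edu/resource")
cycle.give_feedback("kim", "lee", "Clear structure, but the budget needs justification")
print(cycle.solutions["lee"].feedback)
```

In the unit itself this cycle was mediated by the class Web page, discussion forums and seminars described above, rather than by purpose-built software.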

The process of developing peer support and feedback was an important part of the Project Management Unit, which replicated the socio-cognitive processes of a learning community. Sharing of ideas, resources, and offering reciprocal support created social ties, which helped students to work in teams. McConnell's (1999) features of an on-line community influenced the design of the whole environment:

The following four 'Snapshots' of the assessment process and the fostering of peer-feedback show how these elements were integrated into the technology-supported environment of the course.

Snapshot 1: Criteria used by students for peer assessment

Students were asked to develop and articulate the criteria that they used when assessing other students' on-line solutions and then defend these in class discussion. It was important that students formulated criteria against which they could measure peer solutions to the problem solving tasks. It was anticipated that students would be able to complete this task without much difficulty as they had experience of writing project specifications and engaging in project evaluation.

The results in Figure 1 show the issues that the majority of students considered most important when presenting solutions.

These results were later used as a 'springboard' to refine the processes of evaluation and to negotiate agreed criteria that could be applied to all posted solutions. For example, all students mentioned style and format as essential criteria in their judgements of others' work. In class, they were asked to justify this response in an attempt to develop metacognitive processes (Lieberman & Linn, 1991; Schraw & Dennison, 1994; Schraw, 1998).


Figure 1: Criteria used by students when assessing other students' work
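As a simple illustration of how the criteria nominated by students (summarised in Figure 1) could be tallied before the class negotiates a shared set, the following sketch counts how often each criterion is mentioned. The data are invented for the example; only 'style and format' is named in the text above, and the point is the aggregation step, not the numbers.

```python
# Hypothetical sketch: tallying free-text criteria nominated by students to see
# which are most commonly cited before negotiating an agreed set of criteria.

from collections import Counter

# Illustrative nominations only (not the study's data).
student_criteria = {
    "student_1": ["style and format", "relevance", "justification"],
    "student_2": ["style and format", "clarity"],
    "student_3": ["relevance", "style and format", "feasibility"],
}

tally = Counter(criterion
                for criteria in student_criteria.values()
                for criterion in criteria)

# Most frequently nominated criteria become candidates for the shared rubric.
for criterion, count in tally.most_common():
    print(f"{criterion}: nominated by {count} student(s)")
```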

Snapshot 2: Student reflections on problem solving strategies

Students were asked to reflect on the processes they had engaged in while solving problems and then to articulate these during class discussions. This enabled the students to reflect on their own strategies, to identify areas of weakness and to conceptualise ways of addressing those weaknesses. Students were asked to write down the major strategies they used when solving problems; the salient features of these strategies are shown in Figure 2. While these processes represent some of the major elements of expert problem solving strategies (Lipman, 1991; Paul, 1994), they do not include all such strategies, for example planning, testing possible solutions, revising and checking details. By encouraging students to identify their own problem solving approaches and then compare them with the criteria they had applied to their peers, students developed greater awareness of discrepancies in their judgements of others' work.


Figure 2: Student reflections on problem solving strategies

Snapshot 3: Student opinions of the assessment methods used

This example relates to the social function of the learning community, in which learners shared ideas and opinions about the value of the assessment tasks. It was important that learners came to regard the assessment tasks as a way of engaging with the content and a means of developing new skills, through ownership and acceptance of the assessment process.

Students were asked to consider the value and relevance of the problem solving tasks for their own learning; their views of the problem-based tasks are summarised in Figure 3.

Sharing of these views in class sessions consolidated group thinking and fostered a sense of self-direction and awareness of learning processes.


Figure 3: Student opinions of the assessment methods used

Snapshot 4: Student perceptions of peer feedback as support for learning

Students were encouraged to articulate the changes they would make to their solutions in the light of feedback from peers, and to use this feedback to improve their own learning and problem solving. This form of reflective practice is part of experiential learning and is supported by the work of Mezirow (1990), who emphasises the role of critical reflection in self-directed learning. On-line learning can support critical self-reflection by providing access to others' work, but the actual reflective process needs to be facilitated by a tutor. Comments made by learners on how they would improve their problem solving strategies included the following:

In addition, students were asked to document and explain the impact of peer feedback on their own learning approaches. They found this task unusual, and had not previously considered that other students' views could influence their own ways of thinking and problem solving. The responses demonstrate that students did in fact adopt a deep approach to learning and that peer support promoted reflection and further dialogue. Table 3 provides examples of the comments made and links them to the implicit learning processes. The learning processes identified in Table 3 are indicators that student learning was enhanced by peer support and feedback, and that participants became more aware both of other perspectives and of their own strategies.

Table 3: Student views on peer feedback and its influence on their learning

Student comment: "I learn to see things from a different angle."
Learning process: Conceptual change

Student comment: "I found it helpful to see what others had said and how they had different views."
Learning process: Consideration of multiple perspectives

Student comment: "Critical feedback helps me develop my solutions and see my poor points."
Learning process: Self evaluation

Student comment: "I learnt that there were many aspects that I might have considered by brainstorming."
Learning process: Strategic learning

Student comment: "You learn how people's positions can vary greatly and you have to be open."
Learning process: Openness and a sense of inquiry

Student comment: "Being able to see and hear others' feedback is good for my learning and brings up points that I had not considered."
Learning process: Acting on feedback

Discussion

Encouraging and implementing change in assessment practices is essential in fostering self-directed learning, as assessment drives student motivation and learning. A narrowly focused, content-based assessment process can interfere with other strategies for assisting deep learning. Laurillard (1993) maintains that fostering an appropriate conception of learning is fundamental, and that this is particularly important in the use of educational technology, as it often presupposes a reduction in teacher-learner dialogue. On the basis of the research conducted here, we maintain that using technology to create an environment of peer support and feedback is an effective way of fostering a deep learning approach without increasing teacher direction and intervention in the learning process.

Online technologies have functionalities that enable display and sharing of ideas, open discussion of solutions and articulation of strategies by participants. Such functionalities support greater visibility and openness in the learning process, which in turn foster reflection and conceptualisation among learners. In the study presented here, the integration of technology into an innovative assessment approach resulted in deep learning and the 'Snapshots' showed that students were engaged in active, reflective learning.

The technology helped to foster the processes of learning by exposing learners to multiple views, achieved through assessment design and online discussion. From these discussions students came to realise that the correctness of a solution was not the only issue: the way in which it was viewed and presented was also important. Exposure to the many and varied ways that solutions could be presented created further scope for reflection on problem solving approaches. Thus, openness, flexibility and the monitoring of alternative views became an integral part of the learning experience.

Used appropriately, technology can extend authentic models of assessment to encompass a broad range of learning experiences. Web-based learning environments can create contexts and tools for exploring alternative worldviews, increase the visibility of thinking processes, lead to the articulation and refinement of solutions, and foster peer dialogue in the learning process. We believe that new models of technology enhanced assessment, providing scope for multiple expressions of learning achievement, are just beginning to emerge. These forms of assessment will provide indicators of a broad array of competencies and recognise that learning processes and learning outcomes are inseparable.

References

Assiter, A. (1995). Transferable skills in higher education. London: Kogan Page.

Birenbaum, M. (1999). Reflective active learning in a graduate course on assessment. Higher Education Research and Development, 18(2), 201-219.

Biggs, J. (1999). Teaching for quality learning at university. Oxford: Oxford University Press.

Biggs, J. B. (1994). Student learning theory and research: Where do we currently stand? In G. Gibbs (Ed.), Improving student learning: Theory and practice. Oxford: Oxford Brookes University.

Boud, D. (1988). Developing student autonomy in learning. London: Kogan Page.

Bruner, J. (1990). Acts of meaning. Cambridge, MA: Harvard University Press.

Candy, P., Crebert, G., & O'Leary, J. (1994). Developing lifelong learners through undergraduate education. Canberra: Australian Government Publishing Service.

Collis, B. (1998). WWW-based environments for collaborative group work. Education and Information Technologies, 3, 231-245.

Crook, C. (1994). Computers and the collaborative experience of learning. London: Routledge.

Entwistle, N. J., & Entwistle, A. C. (1991). Contrasting forms of understanding for degree examination: The student experience and its implications. Higher Education, 22, 205-227.

Eraut, M. (1994). Developing professional knowledge and competence. London: Falmer Press.

Gardner, H. (1993). Frames of mind: the theory of multiple intelligences. London: Fontana Press.

Gibbs, G. (1992). Improving the quality of student learning. Bristol: Technical and Educational Services.

Greeno, J. P., & Hall, R. P. (1997). Practicing representation: Learning with and about representational forms. Phi Delta Kappan, 78(5), 361-367.

Havard, M., Hughes, M. & Clarke, J. (1998). The introduction and evaluation of key skills in undergraduate courses. Journal of Further and Higher Education, 22(1), 61-68.

Hicks, M., Reid, I., & George, R. (1999). Enhancing on-line teaching: Designing responsive learning environments. Proceedings of Cornerstones 1999, the International HERDSA Conference, Melbourne, Victoria. http://herdsa.org.au/vic/cornerstones/pdf/Hicks.PDF

Laurillard, D. (1993). Rethinking University Teaching. London: Routledge.

Lieberman, D. A., & Linn, M. C. (1991). Learning to learn revisited: computers and the development of self-directed learning skills. Journal of Research on Computing in Education, 23(3), 373-395.

Lipman, M. (1991). Thinking in education. Cambridge: Cambridge University Press.

McConnell, D. (1999). Examining collaborative assessment process in networked lifelong learning. Journal of Computer Assisted Learning, 15(2), 232-243.

McLoughlin, C., & Oliver, R. (1998). Maximising the language and learning link in computer learning environments. British Journal of Educational Technology, 29(2), 125-136.

Mezirow, J. (1990). Fostering critical reflection in adulthood: A guide to transformative and emancipatory learning. San Francisco: Jossey Bass.

Nightingale, P., Wiata, I. T., Toohey, S., Ryan, G., Hughes, C., & Magin, D. (1996). Assessing learning in universities. Sydney: University of New South Wales Press.

Paul, R. (1994). Cultivating the reasoning mind: teaching for the logic of creative and critical thinking. In J. Edwards (Ed), Thinking: International Interdisciplinary Perspectives (pp. 73-82). Victoria: Hawker Brownlow.

Ramsden, P. (1992). Learning to teach in higher education. London: Routledge.

Reeves, T. & Laffey, J. M. (1999). Design, assessment and evaluation of a problem-based learning environment in undergraduate engineering.

Scardamalia, M., & Bereiter, C. (1992). An architecture for collaborative knowledge building. In E. D. Corte, M. C. Linn, H. Mandl, & L. Verschaffel (Eds), Computer-Based Learning Environments and Problem Solving (pp. 41-66). Berlin: Springer-Verlag.

Schraw, G. (1998). On the development of adult metacognition. In M. C. Smith & T. Pourchot (Eds), Adult learning and development: Perspectives from educational psychology (pp. 89-108). Mahwah, NJ: Lawrence Erlbaum.

Schraw, G., & Dennison, R. S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19, 460-475.

Shaffer, D. W., & Resnick, M. (1999). "Thick" authenticity: New media and authentic learning. Journal of Interactive Learning Research, 10(2), 195-215.

Vermunt, J. D. (1995). Process-oriented instruction in thinking and learning strategies. European Journal of Psychology of Education, 10, 325-349.

Volet, S., McGill, T., & Pears, H. (1995). Implementing process-based instruction in regular university instruction: Conceptual, methodological and practical issues. European Journal of Psychology of Education, 10, 385-400.

Wiggins, G. P. (1998). Educative assessment. San Francisco: Jossey Bass.

Contact details: Dr Catherine McLoughlin, University of New England
Phone (02) 6773 2670 Fax (02) 6773 3269 Email mcloughlin@metz.une.edu.au

Please cite as: McLoughlin, C. and Luca, J. (2001). Assessment methodologies in transition: Changing practices in web-based learning. In L. Richardson and J. Lidstone (Eds), Flexible Learning for a Flexible Society, 516-526. Proceedings of ASET-HERDSA 2000 Conference, Toowoomba, Qld, 2-5 July 2000. ASET and HERDSA. http://www.aset.org.au/confs/aset-herdsa2000/procs/mcloughlin1.html

