The use of the WWW in tertiary learning environments offers greater adaptability and flexibility than traditional assessment procedures as it enables the planning and design of tasks that monitor both learning processes and learning outcomes. This paper proposes that the move towards alternative assessment paradigms has been accelerated by technology, with its capacity to offer learners a broad array of activities, tasks and forums for engaging in constructivist learning. There is now a new wave of pedagogy advocating 'alternative assessment', in which assessment is integrated into learning through engagement in real life contexts. Authentic assessment fosters understanding of learning processes through real-life performance, as opposed to a display of inert knowledge. Authentic assessment is solidly based on constructivism, which recognises the learner as the chief architect of knowledge building. In the constructivist learning environment of this study, assessment processes are mediated through social interaction, communication, exchange of views and collaboration, so that learners become aware of, and take responsibility for, assessing their own learning processes. In this study, alternative modes of assessment in a tertiary Web-based environment are exemplified through the use of multiple assessment tasks, enabling multiple modes of showcasing student achievement through portfolios, multimedia projects, skills demonstrations and teamwork.
During the last ten years there has been a major reappraisal of higher education, its purpose, outcomes and resourcing (Havard, Hughes & Clarke, 1998). There is now a more pronounced emphasis on the higher education-employment nexus, and particularly on the skills or competencies that can be transferred from a university setting to the workplace. In this changing environment, which is mirrored in New Zealand and Europe, it can be argued that peer assessment and peer learning are an appropriate response to the preparation of tertiary students for the workplace. This study focuses on a context where peer support is integrated with assessment to develop skills through interaction and learning. These skills enable students to move more easily from formal education to the next stage of their lives, and are increasingly being demanded by employers. For the purposes of the present research, the following broad working definition of peer support is adopted: learning activities that are formalised and initiated by teachers, but developed and owned by students. This definition is deliberately chosen to signal the importance of student empowerment and ownership of the process, and to endorse a learning process which supports student autonomy, self-direction and independent learning (Boud, 1988; Nightingale et al., 1996).
The shift to student self-direction and autonomy means that students need to take more responsibility for their own learning, but many need assistance in developing this capacity. Shaffer & Resnick (1999) maintain that technology can be used to create authentic contexts for learning, and to provide resources that give students opportunities for:
In contemporary education one influential group of researchers has identified students' approaches to learning as either surface level or deep level (Biggs, 1994; Ramsden, 1992). A deep learning approach is consistent with a search for knowledge and understanding, whereas surface learning is concerned only with passing exams and memorising facts. Applied to assessment and teaching approaches in higher education, the implication is that the creation of an appropriate learning environment can foster a deep approach. This can be achieved by enabling learners to take an active role in learning by initiating, managing, monitoring, reflecting on and evaluating learning tasks and processes. Gibbs (1992) emphasises that a focus on process, rather than content, is essential in promoting active learning, and that evaluation and assessment procedures are central to these issues because students interpret the objectives of a course of study according to the demands of the assessment system. For example, an exam requiring recall of facts will encourage learners to adopt a surface approach, whereas assessment of collaborative problem solving or teamwork on a project will emphasise communication skills, planning and decision making, and so foster a deep approach.
The relevance of this to educational technology is that we can use the attributes of technology to increase learner autonomy and independence by designing authentic assessment tasks. Online learning environments offer the scope to equip learners with professional skills and attributes. In addition, by making assessment a 'learning event' that develops process knowledge (rather than rote learning) we bring it closer to the context of the workplace, where professionals are expected to have self-management skills and to be able to make judgements about their own and others' work (Eraut, 1994). Moreover, traditional university examinations do not assess deep conceptual understanding and process skills (Entwistle & Entwistle, 1991). Indeed, the capacity of technology to foster professional skills through authentic assessment is an area of research that is just beginning to be explored.
In constructivist learning environments there is social interaction, communication, exchange of views, collaboration and support for learners to take more responsibility for the learning process through learner-centred tasks (McLoughlin & Oliver, 1998; Collis, 1998). Socio-cultural theory is based on similar assumptions, and many theorists have highlighted the importance of reciprocal understanding and transactional dialogue where knowledge is exchanged and modified in the light of peer feedback (Crook, 1994; Bruner, 1990). Salient features of constructivist learning environments include an emphasis on the following aspects:
The use of the WWW to create assessment tasks offers greater adaptability and flexibility than traditional assessment procedures, as it enables the collection and storage of continuous data and the easy creation of micro-environments in which learners solve real life problems. It can be argued that the move towards alternative assessment paradigms has been accelerated by technology, with its capacity to support a broad array of activities, tasks and forums for assessment. For instance, alternative modes of assessment encourage the use of multiple assessment tasks, and multiple modes of showcasing student achievement through portfolios, multimedia projects, skills demonstrations and teamwork. A further important contribution made by technology to these new modes of assessment is its capacity to support the evaluation of learning processes, such as communication, group work and collaborative problem solving, as opposed to a narrow focus on a single outcome as an indicator of competence. In this study, multiple forms of assessing student learning were facilitated by computer technology, while the focus was maintained on learning processes and professional skills rather than content-based outcomes. Table 1 contrasts typical tests with authentic assessment tasks, and provides examples of how Web-based environments support authentic assessment.
Table 1: Typical tests contrasted with authentic tasks and the corresponding Web-based support
Typical Tests | Authentic Tasks | Indicators of Authenticity | Web-based Support |
Require correct responses only | Require quality product and/or performance, and justification. | Assess whether the student can explain, apply, self-adjust, or justify answers, not just the correctness of answers using facts and algorithms. | Allows students to articulate viewpoints in text-based conversation that can be archived as a learning resource |
Must be unknown in advance to ensure validity | Are known as much as possible in advance; involve excelling at predictable demanding and core tasks; are not "gotcha!" experiences. | The tasks, criteria, and standards by which work will be judged are predictable or known-like a project proposal for a client, etc. | Web-based teaching allows access to multiple sources of information about the task, while allowing learners to explore alternatives. |
Are disconnected from a realistic context and realistic constraints | Require real-world use of knowledge: the student must "do" history, science, etc. in realistic simulations or actual use. | The task is a challenge and a set of constraints that are authentic- likely to be encountered by the professional. (Know-how, not plugging in, is required.) | The task is a challenge and can extend the confines of the classroom to involve complex, ill-defined tasks and collaboration |
Contain isolated items requiring use or recognition of known answers or skills | Are integrated challenges in which knowledge and judgment must be innovatively used to fashion a quality product or performance. | The task is multifaceted and non routine, even if there is a "right" answer. It thus requires problem clarification, trial and error, adjustments, adapting to the case or facts at hand, etc. | Web provides access to information, databases and course notes; learners have control over how they use these resources |
Are simplified so as to be easy to score reliably | Involve complex and non-arbitrary tasks, criteria, and standards. | The task involves the important aspects of performance and/or core challenges of the field of study, not the easily scored. | Web-based learning provides multiple vehicles for showcasing student achievement, including portfolios and skills demonstrations |
Are one shot | Are iterative: contain recurring essential tasks, genres, and learning processes. | The work is designed to reveal whether the student has achieved real versus surface mastery, or understanding versus mere familiarity, over time. | Web-based teaching enables gathering of continuous process data on student achievement |
Depend on highly technical correlations | Provide direct evidence, involving tasks that have been validated against core skills and discipline-based challenges. | The task is valid and has face validity. It evokes student interest and persistence, and seems apt and challenging to students and teachers. | Web-based teaching allows students to be active and to demonstrate divergent as well as convergent thinking |
Provide a score | Provide useable, diagnostic feedback: the student is able to confirm results and self-adjust as needed. | The assessment is designed not merely to audit performance but to improve future performance. The student is seen as the primary 'customer' of information. | Web-based teaching allows intervention and continuous feedback on processes |
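The 'Web-based Support' column above refers repeatedly to archiving text-based discussion and gathering continuous process data. As a purely illustrative sketch (not the system used in this study), the following Python fragment shows one way such an archive might be structured so that timestamped contributions can later be reviewed both as a learning resource and as process data; all class and method names are hypothetical.

```python
# Illustrative sketch only: a minimal archive of timestamped forum
# contributions, so that discussion can be reviewed later as a learning
# resource and as continuous process data. Names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class Contribution:
    """One archived posting in a text-based discussion."""
    author: str
    text: str
    posted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class DiscussionArchive:
    """Stores contributions so tutors can examine processes, not just outcomes."""
    topic: str
    contributions: List[Contribution] = field(default_factory=list)

    def post(self, author: str, text: str) -> Contribution:
        entry = Contribution(author=author, text=text)
        self.contributions.append(entry)
        return entry

    def by_student(self, author: str) -> List[Contribution]:
        """Continuous process data: everything one student has contributed."""
        return [c for c in self.contributions if c.author == author]


# Usage: the archive doubles as a resource that later cohorts can revisit.
archive = DiscussionArchive(topic="Project management scenario")
archive.post("Student A", "We propose a phased rollout because ...")
archive.post("Student B", "Have you considered the budget constraint in the brief?")
print(len(archive.by_student("Student A")))  # -> 1
```

Because every contribution carries an author and a timestamp, assessment of participation and group processes can draw on the full record rather than on a single end product.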
Assessment task | Learning processes |
Web-based product development | |
Peer assessment of problem solving | Search for, and apply relevant knowledge by posting a URL that supports the solution to the problem. |
The learning objective of the peer assessment tasks was that participants would evaluate peer responses to problem solving and discuss issues and revisions in order to support each other through an on-line facility. The unit thus integrated process-based assessment tasks in order to contribute to the development of team skills, inquiry and reflection (Biggs, 1999). The unit also aimed to teach content knowledge and thinking strategies using real life scenarios. This approach to on-line learning can be regarded as process-oriented, as it focuses on the processes of knowledge construction and utilisation (Volet, McGill & Pears, 1995; Vermunt, 1995), as opposed to simple mastery of content knowledge.
The assessment tasks involved groups of students posting solutions to ill-structured problems on the class Web page, participating in team work and facilitating team building through discussion forums where team members could deliberate, plan and exchange ideas. The problems themselves were loosely framed, ill-defined and open-ended, and required learners to analyse existing knowledge, suggest a course of action and then post a solution. In addition, students were asked to post a URL or information source that specifically supported the standpoint adopted in solving the problem. This was done before students attended a seminar where they presented their solutions, justified their approaches and commented on other students' solutions. Students were required to submit their solution at a specific time during the week, and then evaluate each other's solutions and provide qualitative feedback. During the seminar the strengths and weaknesses of each solution were discussed and students had an opportunity to defend their solutions.
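The workflow just described, in which a team posts a solution and a supporting URL by a weekly deadline and peers then attach qualitative feedback before the seminar, can be summarised in a short sketch. The Python below is an illustrative assumption about how such a workflow might be modelled, not the actual course software; the class names, fields and deadline handling are hypothetical.

```python
# Illustrative model of the peer assessment workflow described above:
# a team submits a solution plus a supporting URL before a deadline,
# and peers then attach qualitative feedback ahead of the seminar.
# All names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class PeerFeedback:
    reviewer: str
    comments: str            # qualitative feedback, not a mark


@dataclass
class Solution:
    team: str
    problem_id: str
    text: str
    supporting_url: str      # information source backing the standpoint taken
    submitted_at: datetime
    feedback: List[PeerFeedback] = field(default_factory=list)


def submit_solution(posted: List[Solution], solution: Solution,
                    deadline: datetime) -> bool:
    """Accept a solution only if it arrives before the weekly deadline."""
    if solution.submitted_at > deadline:
        return False
    posted.append(solution)
    return True


def add_feedback(solution: Solution, reviewer: str, comments: str) -> None:
    """Peers evaluate each other's solutions and give qualitative comments."""
    solution.feedback.append(PeerFeedback(reviewer=reviewer, comments=comments))


# Usage: one team's submission receives feedback from another team.
posted: List[Solution] = []
deadline = datetime(2000, 5, 12, 17, 0)
entry = Solution(team="Team 1", problem_id="week-7", text="We recommend ...",
                 supporting_url="http://example.org/source",
                 submitted_at=datetime(2000, 5, 12, 9, 30))
submit_solution(posted, entry, deadline)
add_feedback(entry, reviewer="Team 2", comments="Consider the client's budget limit.")
```

The key design point reflected here is that feedback is stored alongside the solution itself, so the record of revision and peer commentary remains available for later reflection and tutor review.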
The process of developing peer support and feedback was an important part of the Project Management Unit, which replicated the socio-cognitive processes of a learning community. Sharing of ideas, resources, and offering reciprocal support created social ties, which helped students to work in teams. McConnell's (1999) features of an on-line community influenced the design of the whole environment:
The results in Figure 1 show that the majority of students considered the following issues as being the most important when presenting solutions:
Figure 1: Criteria used by students when assessing other students work
Figure 2: Student reflections on problem solving strategies
Students were asked to consider the value and relevance of the problem solving tasks for their own learning. As shown in Figure 3, most students considered the problem-based tasks to be:
Figure 3: Student opinions of the assessment methods used
Students were encouraged to articulate the changes they would make to their solutions in the light of feedback from peers, and to use this feedback to improve their own learning and problem solving. This form of reflective practice is part of experiential learning and supported by the work of Mezirow (1990) who emphasises the role of critical reflection in self-directed learning. On-line learning can support critical self-reflection by providing access to others' work, but the actual reflective process needs to be facilitated by a tutor. Comments made by learners on how they would improve their problem solving strategies included the following:
Student comment | Learning process |
I learn to see things from a different angle | Conceptual change |
I found it helpful to see what others had said and how they had different views | Consideration of multiple perspectives |
Critical feedback helps me develop my solutions and see my poor points | Self evaluation |
I learnt that there were many aspects that I might have considered by brainstorming | Strategic learning |
You learn how people's positions can vary greatly and you have to be open | Openness and a sense of inquiry |
Being able to see and hear others' feedback is good for my learning and brings up points that I had not considered | Acting on feedback |
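The table above pairs each student comment with a learning process. As a purely hypothetical illustration (the study's own coding of comments was presumably done by hand), a naive keyword-based tagger along the following lines could sort similar reflective comments into those categories; the keyword lists and category labels are assumptions made for the example.

```python
# Hypothetical sketch: naive keyword matching that assigns reflective
# comments to some of the learning-process categories shown above.
# This is not the coding method used in the study.
CATEGORY_KEYWORDS = {
    "Consideration of multiple perspectives": ["different views", "others"],
    "Self evaluation": ["feedback", "poor points"],
    "Strategic learning": ["brainstorm", "aspects"],
}


def tag_comment(comment: str) -> list:
    """Return every category whose keywords appear in the comment text."""
    lowered = comment.lower()
    return [category
            for category, keywords in CATEGORY_KEYWORDS.items()
            if any(keyword in lowered for keyword in keywords)]


print(tag_comment("I found it helpful to see what others had said "
                  "and how they had different views"))
# -> ['Consideration of multiple perspectives']
```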
Online technologies have functionalities that enable the display and sharing of ideas, open discussion of solutions and articulation of strategies by participants. Such functionalities support greater visibility and openness in the learning process, which in turn foster reflection and conceptualisation among learners. In the study presented here, the integration of technology into an innovative assessment approach resulted in deep learning, and the snapshots of student responses presented above showed that students were engaged in active, reflective learning.
The technology helped to foster the processes of learning by exposing learners to multiple views, achieved by assessment design and online discussion. From these discussions students came to realise that the correctness of a solution was not the only issue, but the way in which it was viewed and presented was also important. Exposure to the many and varied ways that solutions could be presented created further scope for reflection on problem-solving approaches. Thus, openness, flexibility and monitoring of alternative views became an integral part of the learning experience.
Used appropriately, technology can extend authentic models of assessment to encompass a broad range of learning experiences. Web-based learning environments can create contexts and tools for the exploration of alternative worldviews, increase the visibility of thinking processes, lead to the articulation and refinement of solutions, and foster peer dialogue in the learning process. By providing scope for multiple expressions of learning achievement, we believe that new models of technology-enhanced assessment are just beginning to emerge. These forms of assessment will provide indicators of a broad array of competencies and recognise that learning processes and learning outcomes are inseparable.
Birenbaum, M. (1999). Reflective active learning in a graduate course on assessment. Higher Education Research and Development, 18(2), 201-219.
Biggs, J. (1999). Teaching for quality learning at university. Oxford: Oxford University Press.
Biggs, J. B. (1994). Student learning theory and research: Where do we currently stand? In G. Gibbs (Ed.), Improving student learning: Theory and practice. Oxford: Oxford Brookes University.
Boud, D. (1988). Developing student autonomy in learning. London: Kogan Page.
Bruner, J. (1990). Acts of meaning. Cambridge, MA: Harvard University Press.
Candy, P., Crebert, G., & O'Leary, J. (1994). Developing lifelong learners through undergraduate education. Canberra: Australian Government Publishing Service.
Collis, B. (1998). WWW-based environments for collaborative group work. Education and Information Technologies, 3, 231-245.
Crook, C. (1994). Computers and the collaborative experience of learning. London: Routledge.
Entwistle, N. J., & Entwistle, A. C. (1991). Contrasting forms of understanding for degree examination: The student experience and its implications. Higher Education, 22, 205-227.
Eraut, M. (1994). Developing professional knowledge and competence. London: The Falmer Press.
Gardner, H. (1993). Frames of mind: the theory of multiple intelligences. London: Fontana Press.
Gibbs, G. (1992). Improving the quality of student learning. Bristol: Technical and Educational Services.
Greeno, J. P., & Hall, R. P. (1997). Practicing representation: Learning with and about presentational forms. Phi Delta Kappan, 78(5), 361-367.
Havard, M., Hughes, M., & Clarke, J. (1998). The introduction and evaluation of key skills in undergraduate courses. Journal of Further and Higher Education, 22(1), 61-68.
Hicks, M., Reid, I., & George, R. (1999). Enhancing on-line teaching: Designing responsive learning environments. Proceedings of Cornerstones1999. The International HERDSA conference, Melbourne, Victoria. http://herdsa.org.au/vic/cornerstones/pdf/Hicks.PDF
Laurillard, D. (1993). Rethinking University Teaching. London: Routledge.
Lieberman, D. A., & Linn, M. C. (1991). Learning to learn revisited: computers and the development of self-directed learning skills. Journal of Research on Computing in Education, 23(3), 373-395.
Lipman, M. (1991). Thinking in education. Cambridge: Cambridge University Press.
McConnell, D. (1999). Examining collaborative assessment process in networked lifelong learning. Journal of Computer Assisted Learning, 15(2), 232-243.
McLoughlin, C., & Oliver, R. (1998). Maximising the language and learning link in computer learning environments. British Journal of Educational Technology, 29(2), 125-136.
Mezirow, J. (1990). Fostering critical reflection in adulthood: A guide to transformative and emancipatory learning. San Francisco: Jossey Bass.
Nightingale, P., Wiata, I. T., Toohey, S., Ryan, G., Hughes, C., & Magin, D. (1996). Assessing learning in universities. Sydney: University of New South Wales Press.
Paul, R. (1994). Cultivating the reasoning mind: teaching for the logic of creative and critical thinking. In J. Edwards (Ed), Thinking: International Interdisciplinary Perspectives (pp. 73-82). Victoria: Hawker Brownlow.
Ramsden, P. (1992). Learning to teach in higher education. London: Routledge.
Reeves, T. & Laffey, J. M. (1999). Design, assessment and evaluation of a problem-based learning environment in undergraduate engineering.
Scardamalia, M., & Bereiter, C. (1992). An architecture for collaborative knowledge building. In E. D. Corte, M. C. Linn, H. Mandl, & L. Verschaffel (Eds), Computer-Based Learning Environments and Problem Solving (pp. 41-66). Berlin: Springer-Verlag.
Schraw, G. (1998). On the development of adult metacognition. In M. C. Smith & T. Pourchot (Eds), Adult learning and development: Perspectives from educational psychology (pp. 89-108). Mahwah, NJ: Lawrence Erlbaum.
Schraw, G., & Dennison, R. S. P. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19, 460-475.
Shaffer, D. W., & Resnick, M. (1999). "Thick" authenticity: New media and authentic learning. Journal of Interactive Learning Research, 10(2), 195-215.
Vermunt, J. D. (1995). Process-oriented instruction in thinking and learning strategies. European Journal of Psychology of Education, 10, 325-349.
Volet, S., McGill, T., & Pears, H. (1995). Implementing process-based instruction in regular university instruction: Conceptual, methodological and practical issues. European Journal of Psychology of Education, 10, 385-400.
Wiggins, G. P. (1998). Educative assessment. San Francisco: Jossey Bass.