
Computer based assessment: Design considerations

Peter Poteralski
Torrens Valley Institute of TAFE

Computer based assessment (CBA) development is often viewed as a subset of computer assisted learning (CAL) development since the curriculum has already been transferred into learning objectives and the learning strategies and methodologies have been selected. However, as all educators are well aware, well designed assessment is an integral part of the learning process, eg providing feedback.

With the moves to open learning, with the characteristics of

the emphasis has shifted from lecturer centred teaching to the learning process and to the assessment of the learning objectives. Educators are finding that the time once devoted to teaching has been transferred to assessment, not to assistance with the learning process as was the intent (see also Trollip, 1992, who argues that too little attention is paid to assessment in any case).

CBA is a means of redressing some of these assessment commitments. However, just as CAL is not the panacea for all learning strategies, neither is CBA the panacea for all assessment. Many of the issues raised in this paper will be discussed further, and illustrated with screen examples, in the presentation.

Developing the assessment from the learning objectives

Generally, technological and economic constraints have restricted assessment to the cognitive domain (there are possibilities for the psychomotor and affective domains, but they require a higher order of sophistication, eg flight simulators, virtual reality). Simple cognitive skills assessment focuses on knowledge (simple recall) and comprehension; assessment of increased sophistication extends to application, analysis, synthesis and evaluation (Kemp 1985, p84). When developing CBA, use a reference such as Kemp (1985) to obtain the action type verbs required for phrasing and constructing your questions.
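The idea of matching an objective's action verb to a cognitive level can be sketched in code. This is an illustrative sketch only: the verb lists below are sample verbs commonly associated with each level, not Kemp's actual tables, and the function name is hypothetical.

```python
# Illustrative sketch: check which cognitive level an objective's opening
# action verb belongs to. Verb lists are samples, not Kemp's (1985) tables.
ACTION_VERBS = {
    "knowledge": {"define", "list", "name", "recall"},
    "comprehension": {"describe", "explain", "summarise"},
    "application": {"apply", "demonstrate", "use"},
    "analysis": {"compare", "differentiate", "analyse"},
    "synthesis": {"design", "construct", "formulate"},
    "evaluation": {"judge", "justify", "evaluate"},
}

def level_of(objective):
    """Return the cognitive level whose verb starts the objective, if any."""
    first_word = objective.lower().split()[0]
    for level, verbs in ACTION_VERBS.items():
        if first_word in verbs:
            return level
    return None

print(level_of("List the safety steps"))      # knowledge
print(level_of("Justify the chosen method"))  # evaluation
```

A check like this can flag draft questions whose phrasing does not match the intended level of the objective being assessed.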

The development of assessment instruments requires a direct relationship between the learning objectives and the assessment items, to ensure the validity of the questions.
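That objective-to-item relationship can be audited mechanically. The following is a minimal sketch, with hypothetical identifiers and field names, of cross-checking a question bank before release.

```python
# Hypothetical sketch: verify every objective has at least one assessment item
# and every item traces back to a known objective.
objectives = {"OBJ1": "Recall the safety procedure",
              "OBJ2": "Explain why isolation comes first"}
items = [
    {"id": "Q1", "objective": "OBJ1", "stem": "Which step comes first?"},
    {"id": "Q2", "objective": "OBJ2", "stem": "Why is step 3 required?"},
]

def audit(objectives, items):
    covered = {item["objective"] for item in items}
    untested = set(objectives) - covered   # objectives with no item
    orphans = covered - set(objectives)    # items citing unknown objectives
    return untested, orphans

untested, orphans = audit(objectives, items)
assert not untested and not orphans
```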

Most cognitive objectives can be tested using print based assessment instruments, but the inclusion of sound, animation and video in posing the question adds realism. However, we need to allow for flexible delivery and accessibility for all students, ie can the question content easily be transcribed into print/text form? For a video clip, this could be achieved by providing the conversation script. Not all question content will be readily convertible, eg a video clip requiring the identification of a hazardous practice may not be readily convertible to a graphic.
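One way to plan for this is to store a text equivalent alongside each media asset, so a print version can be generated where one exists and the unconvertible items are flagged. A sketch, with hypothetical field names:

```python
# Sketch: each question carries its media asset plus a print/text equivalent.
# Items whose media has no text equivalent are flagged for rework.
question = {
    "stem": "Identify the hazardous practice shown.",
    "media": {"type": "video", "file": "workshop.avi"},
    "text_equivalent": None,  # not all media converts readily, as noted above
}

def printable(q):
    """Return a print-ready form of the question, or None if unconvertible."""
    if q["media"] is None:
        return q["stem"]
    if q["text_equivalent"]:
        return q["stem"] + "\n" + q["text_equivalent"]
    return None  # must be reworked or excluded from the print version

assert printable(question) is None  # flagged: no script supplied for the video
```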

Competency based assessment

Competency based assessment is objective assessment, ie the appraisal of knowledge and skills which are both observable and assessable.

To provide this objective testing and avoid ambiguity, the question types are generally restricted to

but all of these need to be

We have some flexibility in designing the questions and answers.

Another consideration associated with competency based assessment is whether questions which could possibly reinforce incorrect procedures should be used, eg a picture of an incorrect method may subconsciously be embedded in the student's mind, with disastrous consequences. How often have students remembered precisely what they were expressly told not to remember, when this technique was being used to emphasise or contrast some points?
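The reason restricted question types suit objective testing is that marking them requires no human judgement: the response either matches the key or it does not. A minimal sketch (item content is hypothetical):

```python
# Sketch: restricted question types mark objectively - a response either
# matches the key or it does not, with no marker interpretation involved.
def mark_multiple_choice(selected, correct):
    """A single keyed answer: true/false result, no judgement required."""
    return selected == correct

def mark_matching(pairs, key):
    """Matching-type item: every pairing must agree exactly with the key."""
    return pairs == key

assert mark_multiple_choice("B", "B")
assert not mark_multiple_choice("C", "B")
assert mark_matching({"fuse": "protection", "switch": "isolation"},
                     {"fuse": "protection", "switch": "isolation"})
```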

Media selection considerations

Whether CBA is an appropriate assessment tool depends on the following considerations

Design considerations - purpose and delivery

In the CBA design process we need to ask the following questions and address the related issues

User information and user considerations


Administrative considerations include

Question design and navigation

With question design and navigational aids, be mindful of the following points. There are occasions where you really do want to know whether the learner knows how to do something before they actually attempt it.
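One common navigation decision is whether the learner can move freely between questions, flag items for review, and see what remains unanswered before final submission. A sketch of such a model (class and method names are hypothetical):

```python
# Hypothetical sketch of a free-navigation test session: the learner may jump
# between questions, flag items for review, and list unanswered items before
# submitting.
class TestSession:
    def __init__(self, question_ids):
        self.order = list(question_ids)
        self.answers = {}     # question id -> response
        self.flagged = set()  # questions marked for review
        self.current = 0

    def goto(self, index):
        """Jump directly to a question (wraps around the list)."""
        self.current = index % len(self.order)

    def answer(self, response):
        self.answers[self.order[self.current]] = response

    def flag(self):
        self.flagged.add(self.order[self.current])

    def unanswered(self):
        return [q for q in self.order if q not in self.answers]

s = TestSession(["Q1", "Q2", "Q3"])
s.answer("A")           # answers Q1
s.goto(2); s.flag()     # jump to Q3 and flag it for review
assert s.unanswered() == ["Q2", "Q3"]
```

A strictly linear test, by contrast, would omit `goto` entirely, which is the choice hinted at above when you need to know what the learner can do before they attempt it.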

Screen design

Make similar considerations to those for print/ text page and poster design, ie


The provision of feedback is one of the principles of adult learning (refer to Race, 1994, for the purposes of feedback). Feedback should be
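In CBA, feedback can be tied to the specific distractor chosen rather than a bare right/wrong. A sketch of this (the wording and option labels are illustrative):

```python
# Sketch: distractor-specific feedback - each wrong choice gets a message
# explaining the misconception, not just "incorrect".
FEEDBACK = {
    "A": "Correct - isolating power is always the first step.",
    "B": "Not quite: that step is performed after isolation. Review module 2.",
    "C": "No - this choice skips the safety check entirely.",
}
CORRECT = "A"

def give_feedback(choice):
    """Return (is_correct, feedback message) for the selected option."""
    return choice == CORRECT, FEEDBACK[choice]

ok, msg = give_feedback("B")
assert not ok and msg.startswith("Not quite")
```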

Programming skills versus authoring

Most authoring packages are claimed to be easy to use. They can usually produce impressive displays fairly simply; however, they quickly become very complex once you move away from linear presentations and flow of data. Therefore, while programming skills are not absolutely necessary, a clear understanding of
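The jump in complexity comes from branching: once the next screen depends on performance, the presentation is a graph rather than a sequence. A minimal sketch of performance-dependent branching (node names are hypothetical):

```python
# Sketch: once flow depends on performance, the presentation becomes a graph.
# Each node maps a result (correct / incorrect) to the next node.
flow = {
    "Q1": lambda correct: "Q2" if correct else "remedial_1",
    "remedial_1": lambda _: "Q1_retry",
    "Q1_retry": lambda correct: "Q2" if correct else "refer_lecturer",
    "Q2": lambda correct: "end",
}

def next_node(node, correct):
    return flow[node](correct)

assert next_node("Q1", False) == "remedial_1"
assert next_node("Q1_retry", True) == "Q2"
```

Even this tiny graph has four paths through it; a realistic test multiplies them, which is where linear-minded authoring tools start to strain.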

Error trapping and debugging
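A basic form of error trapping in CBA is validating learner input: a stray keystroke should be rejected and re-prompted, not crash the test or be recorded as a spurious answer. An illustrative sketch (function name and valid set are hypothetical):

```python
# Sketch: trap invalid learner input rather than recording it or crashing.
def read_choice(raw, valid=("A", "B", "C", "D")):
    """Normalise a keyed response; reject anything outside the valid set."""
    cleaned = raw.strip().upper()
    if cleaned not in valid:
        raise ValueError(f"'{raw}' is not one of {valid}; please re-enter")
    return cleaned

assert read_choice(" b ") == "B"   # stray whitespace and case are tolerated

trapped = False
try:
    read_choice("7")               # out-of-range keystroke is trapped
except ValueError:
    trapped = True
assert trapped
```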


Design issues checklist

As a final checklist, ensure that all of the following points have been considered

Why CBA? - the advantages

Some of the reasons for employing CBA as an assessment tool are

Why CBA? - the disadvantages

It is also important to realise that there are some limiting aspects in using CBA for assessment

Future directions

With the greater use of multimedia, artificial intelligence, virtual reality, expert systems, spoken voice recognition and networking, we have greater access to knowledge and skills. Being able to discerningly collate and assimilate information is an emerging challenge, and we need to develop complementary CBA to assist in this process.


Alessi, S. M. & Trollip, S. R. (1991). Computer Based Instruction: Methods and Development, 2nd Ed. Prentice Hall, Englewood Cliffs, New Jersey.

Beech, G. (1985). Computer Based Learning. Sigma Technical Press, Cheshire.

Electronic Publishing (1989). CALS, Adelaide Institute of TAFE, South Australia.

Gery, G. (1988). Making CBT Happen. Weingarten Publications, Boston.

Grant, P. (1992). Instructional Systems Design for CBT Software. Learnware Technologies.

Hall, W. & Saunders, J. (1993). Getting to grips with assessment. National Centre for Vocational Education Research Ltd, South Australia.

Hannafin, M. J. & Peck, K. L. (1988). The Design, Development, and Evaluation of Instructional Software. Macmillan Publishing Co, New York.

Kemp, J. E. (1985). The Instructional Design Process. New York: Harper & Row.

Lally, M. (1993). Navigation in Rich Visual Data Spaces. University of SA seminar.

Mager, R. F. (1991). Making Instruction Work. Kogan Page, London.

Metros, S. E. (1992). Interface-lift: Elective or compulsory? In J. G. Hedberg and J. Steele (eds), Educational Technology for the Clever Country: Selected papers from EdTech'92, 110-150. Canberra: AJET Publications.

Phillips, J. & Crock, M. (1992). Interactive screen design principles. ASCILITE'92 Conference Proceedings (this is a particularly comprehensive paper covering many aspects of CAL screen design).

Race, P. (1994). The Open Learning Handbook. 2nd Ed. Kogan Page, London (very useful for factors to consider in the development of open learning materials).

Reigeluth, C. M. (ed) (1984). Instructional Design Theories and Models. Erlbaum, Hillsdale.

Sims, R. (1993). Authorware with Style. Knowledgecraft, 2nd Ed 8/93.

Spencer, K. (1991). Modes, media and methods: The search for educational effectiveness. British Journal of Educational Technology, 22(1).

Trollip, S. R. (1992). Testing - the forgotten component of instruction. ASCILITE'92 Conference Proceedings.

Wheildon, C. (1989/90). Communicating or just making pretty shapes. Newspaper Advertising Bureau of Australia Ltd. 3rd Ed (this discusses print/text design considerations).

Author: Peter Poteralski, Instructional Designer, Learning Materials Development Unit, Torrens Valley Institute of TAFE, 100 Smart Road, Modbury 5092, South Australia. Phone: 61 8 207 8075; Fax: 61 8 207 8008; Email:

Please cite as: Poteralski, P. (1994). Computer based assessment: Design considerations. In J. Steele and J. G. Hedberg (eds), Learning Environment Technology: Selected papers from LETA 94, 248-254. Canberra: AJET Publications.
