Computer based assessment: Design considerations
Peter Poteralski
Torrens Valley Institute of TAFE
Computer based assessment (CBA) development is often viewed as a subset of computer assisted learning (CAL) development since the curriculum has already been transferred into learning objectives and the learning strategies and methodologies have been selected. However, as all educators are well aware, well designed assessment is an integral part of the learning process, eg providing feedback.
With the move to open learning, characterised by
- resource based learning
- competency based assessment
- student centred delivery
the emphasis has shifted from lecturer centred teaching to the learning process and the assessment of learning objectives. Educators are finding that the time once devoted to teaching has been transferred to assessment rather than to assistance with the learning process, as was the intent (refer also to Trollip (1992), who believes that too little attention is paid to assessment anyway).
CBA is a means of redressing some of these assessment commitments. However, just as CAL is not the panacea of all learning strategies, so too CBA is not the panacea of all assessment. Many of the issues raised in this paper will be discussed further and illustrated with screen examples in the presentation.
Developing the assessment from the learning objectives
Generally, technological and economic constraints have restricted assessment to the cognitive domain (there are possibilities for the psychomotor and affective domains, but they require a higher order of sophistication, eg flight simulators, virtual reality). Simple cognitive skills assessment focuses on the assessment of knowledge (simple recall) and comprehension; more sophisticated assessment extends to application, analysis, synthesis and evaluation (Kemp 1985, p84). When developing CBA, use a reference such as Kemp (1985) to find the action type verbs required for phrasing and constructing your questions.
The development of assessment instruments requires a direct relationship between learning objectives and assessment items to ensure the validity of the questions.
Most cognitive objectives can be tested using print based assessment instruments, but the inclusion of sound, animation and video in posing the question adds to the realism. However, we need to allow for flexible delivery and accessibility for all students, ie can the question content easily be transcribed into print/text form? For a video clip this could be achieved by providing the conversation script. Not all question content will be readily convertible, eg a video clip requiring the identification of a hazardous practice may not be readily convertible to a graphic.
Competency based assessment
Competency based assessment is objective assessment, ie the appraisal of knowledge and skills which are both observable and assessable.
To provide this objective testing and avoid ambiguity, the question types are generally restricted to
- true/false
- multiple choice (can they be extended to include a why?)
- matching items (eg steps, locating, placing, fault finding)
- categorising (locating, placing, fault finding)
- ordering/sequencing
- short answer
but all of these need to be
- unambiguously worded (similarly for feedback)
- carefully worded to elicit the appropriate response without suggesting the correct response.
We have some flexibility in designing the questions and answers
- we can use wildcard characters to select key words and phrases and retain the correct order (see the sketch following this list);
- for short answer we can have fill in, completion, restricted choice (choose or select from the range of responses provided?), numerical, formulae, and completion of tables, charts, forms, etc;
- all content items can be text or graphics or photos or animations or video clips.
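To illustrate the wildcard point above, here is a minimal sketch in Python (rather than an authoring package of the era) of judging a short answer by checking that key phrases appear in the correct order, with any characters permitted between them; the judge_short_answer helper and the sample phrases are hypothetical.

    import re

    def judge_short_answer(response, key_phrases):
        # Build a pattern such as "isolate.*power.*test": the ".*" acts
        # as the wildcard, so the key phrases must appear in order but
        # any other wording may surround them.
        pattern = ".*".join(re.escape(p) for p in key_phrases)
        return re.search(pattern, response, re.IGNORECASE) is not None

    # Accepted: all key phrases present and in the correct order.
    print(judge_short_answer("First isolate the power, then test the circuit",
                             ["isolate", "power", "test"]))  # True
    # Rejected: right words, wrong order.
    print(judge_short_answer("Test the circuit, then isolate the power",
                             ["isolate", "power", "test"]))  # False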
Another consideration associated with competency based assessment is whether questions which could possibly reinforce incorrect procedures should be used at all, eg a picture of an incorrect method may subconsciously be embedded in the student's mind with disastrous consequences. How often have students remembered precisely what they were told expressly not to remember, when you were using this technique to emphasise or contrast some points?
Media selection considerations
Whether CBA is an appropriate assessment tool depends on the following considerations
- are objectives adequately assessed?
- are development costs reasonable?
- will the CBA be cost effective, ie in terms of a comparison with the development costs of other assessment instruments using other media, hardware and software costs, and human resource time savings?
- is a suitable delivery platform readily available to a majority of learners?
- what are the characteristics of the user group?
Design considerations - purpose and delivery
In the CBA design process we need to ask the following questions and address the related issues
User information and user considerations
- In the introductory screens give clear instructions related to
- the number of questions
- the method of responding, ie for each answer type it should be clear how the response needs to be provided to be a 'valid' response
- performance/competence criteria
- the number of tries possible before being 'terminated'
- time constraints, if any
- feedback
- branching possibilities and whether they are determined by criteria or choice
- the degrees of difficulty available in question banks or sections
Note: Most of this information could be provided in the form of accompanying printed instruction sheets rather than requiring on screen reading.
- provide performance reporting as appropriate ie indicate whether progressive reporting and/or final reporting is provided
- flag ability to exit and resume assessment at later stages
- list any prerequisites
- remember to tailor the CBA to the learner's needs
- the optimum length of time required to complete the assessment should be approximately 15-20 minutes, ie the developer should be aware of eye fatigue and other OH & S issues of learners spending too long gazing at a screen (perhaps commercial television has some lessons, ie the realisation that after 10 minutes or longer, attention span lapses; or perhaps we have been conditioned by television?).
Administration
Administrative considerations include
- are there any special location or facility requirements?
- what are the supervision requirements?
- how will results recording/transfer be achieved? (a minimal sketch follows this list)
- does access need to be via network, standalone, remote or portable?
- are there any time constraints on the availability of the CBA?
- what is the dependence/interrelationship/linkage between the components of the assessment, ie can it be done independently, subsequent to, or concurrently with other assessments?
- what is the availability of hard copy alternatives, eg print plus photos or graphics or audio or video?
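By way of illustration, a minimal sketch of the results recording step, assuming a simple shared CSV log; the file name, field layout and record_result helper are hypothetical, and a real networked installation would also need access control and file locking.

    import csv
    import datetime

    def record_result(path, student_id, assessment_id, score, max_score):
        # Append one timestamped result row to the shared log file.
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow([
                datetime.datetime.now().isoformat(timespec="seconds"),
                student_id, assessment_id, score, max_score])

    record_result("results.csv", "S1234", "CBA-01", 17, 20)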
Question design and navigation
With question design and navigational aids be mindful of the following points
- use a personalised approach, eg in the introductory/welcome screen and perhaps the final report screen use the user's first name, but be wary of using it in feedback as it can be perceived as patronising (Hannafin & Peck, pp 198-199). Instead you can use speech bubbles to achieve this effect, or even include cartoon characters who appear as part of the generic design of your learning packages, eg a chef character in a hospitality assessment
- decide whether questions are to have time constraints or be self paced.
- stylise photos or graphics to still deliver the message but simplify the design complexity. Simple animations can sometimes replace video, or vice versa
- make questions brief, ie they should not require a great deal of analysis for solution - keep computer screen reading and the time spent at the computer to a minimum, unless you want to develop a question which is expanding, ie presents additional questions following on from the original. Alternatively you may wish to have a layering of presentation, using screens/boxes to present additional elements of the original question
- use cartoons or drawings to create scenarios or situations
- break questions requiring lots of input into several clearly distinguishable subsections eg use of different colours, one set for core information and layout but different sets for different question sub parts
- avoid the overuse of boxes, borders, font types and styles, patterns and other effects which distract the user from the prime purpose. This matters for the overstimulation and fatigue reasons above, but overuse also annoys users, who must both filter and locate the essential information, and it detracts from the learning process. Therefore, while making errors is a useful tool in the learning process, we need to minimise errors caused by poor question design. This overuse of features can be compared with the first examples of desktop publishing (DTP), where attempts were made to squeeze as many features as possible onto every page. The use of boxes can also compartmentalise information, which is not always desirable, because in some cases you want the user to sort and assimilate the information in the way that suits them. Furthermore, support and navigation areas may already be presented as buttons, boxes and sub screens, so additional boxes and borders only further clutter the screen. Although cliched, the KISS principle is applicable.
- pay attention to screen design and layout (see below)
- check reading age (as for text and print design)
- provide learners with the option of attempting all questions first, and the ability to retrace, check and modify answers before presenting them for judging
- number or label each question or section clearly for easy location and cross reference (eg use pull down menus, icons on the perimeter of the screen, or a navigation button option such as a map). This achieves the preview, review and skip features that would be available in print/text assessments. Too often questions are presented only in linear fashion, with no option for students to traverse the question bank as they wish. Reserve a linear sequence for questions which have an evolutionary presentation
- don't necessarily ask students about concepts using familiar examples and contexts; rather, use different situations to check their understanding of facts, concepts, procedures or principles and their application (see Beech, p87; on divergence, p101)
- provide practice problems, hints and help; ensure format consistency for each question type
- maximise user involvement. Think carefully about how you structure your questions and the required responses. As the developer you want the situation to mimic the real life situation as closely as possible, and thus reinforce the knowledge and skills and assist retention, eg which method do you think would provide more effective learning - asking the user to give the correct order of steps by numbering them, or enabling the user to actually move the steps into the correct order? (a minimal sketch of judging such a response follows this section). By enabling the user to actively mimic or simulate an action you enable them to
- externalise the thought processes
- make the process a concrete one, not just an abstract one
- provide mental imagery to reinforce long term memory retention
- use mnemonics to assist in recall (see Beech)
- avoid negatives in the question stem (Trollip 1992)
- use the various question types imaginatively, eg attempt to create a realistic scenario or at least a reality simulation. As Mager (1991, p89) says:
There is often a temptation to want to use multiple choice and true-false items for testing competence. After all, didn't we spend a lifetime in school (ie we tend to teach as we were taught) answering this type of item? Yes, we did. And aren't multiple choice and true-false items easily scoreable by computer? Yes, they are. And isn't that a useful type of item for spreading students out on a curve? Yes, indeed. But all of this is irrelevant and very time consuming. The most reliable way to find out whether students can change a tire is to ask them to do it. If you used multiple choice or true-false items you might find out what they know (ie knowledge checking) about tire changing, but you won't find out whether they can do it.
There are occasions where you really do want to know whether the learner knows how to do something before they actually attempt it.
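As a minimal sketch of the 'move the steps into the correct order' idea raised earlier (in Python; the maintenance steps and judge_ordering helper are invented for illustration), the judging logic can compare the learner's arrangement with the correct sequence and note which positions are wrong, ready for feedback.

    CORRECT_ORDER = ["drain the oil", "replace the filter",
                     "refill the oil", "check the level"]

    def judge_ordering(user_order):
        # Compare position by position and collect the 1-based
        # positions that differ, so feedback can point at them.
        wrong = [i + 1 for i, (user, correct)
                 in enumerate(zip(user_order, CORRECT_ORDER))
                 if user != correct]
        return (len(wrong) == 0), wrong

    ok, wrong_positions = judge_ordering(
        ["drain the oil", "refill the oil",
         "replace the filter", "check the level"])
    print(ok, wrong_positions)  # False [2, 3]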
Screen design
Make similar considerations to those for print/ text page and poster design, ie
- the flow of text and graphics should allow the eye to move diagonally (as per the Gutenberg diagram; DTP - CALS 1989; Phillips & Crock 1992). Take care with the use of fully justified or centre justified text.
- keep screen design simple, clear and uncluttered to avoid distracting from the focal point. Keep the features subtle unless you are specifically demanding attention.
- use space judiciously
- use soft, neutral or pastel colours for backgrounds and use stronger and contrasting colours for emphasis
- consider type size and style carefully (avoid underlining - it is a carry over from the old typewriters, is unnecessary now, and reduces readability; it is better to use bold, italic, different colours or even reverse to emphasise and highlight).
- minimise text; instead use the object movement and click/point features to allow greater involvement of the user (since there is also less time spent writing), and also utilise the drawing features, eg draw boxes and lines to link and group items.
- use the techniques of flashing, inversing, zooming and panning to focus on items requiring attention (Hannafin & Peck, pp 185-188)
- ensure graphics are of sufficient size and scale to ensure that the desired features are discernible
Feedback
The provision of feedback is one of the principles of adult learning (refer to Race (1994) for the purposes of feedback). Feedback should be
- meaningful (not comments like 'you must study harder');
- both clearly related and connected to the answer given, ie the user's response is clearly displayed or denoted and the feedback visually linked to it;
- a clear explanation of why the chosen response is fully or partially correct/incorrect (see the sketch following this list);
- unambiguous (although it is not always easy to pre-empt the reasons for incorrect responses);
- discreet if using sound - nobody wants to have their progress blared out to all their peers/colleagues, nor should neighbouring learners be disturbed; we often like to perform our self check exercises in 'privacy', therefore use sound appropriately;
- immediate where possible; it can also be used to provide hints;
- recognised as an essential element of the learning process, ie learning from errors can at times be a more meaningful and resilient lesson (refer to the constructivist theory of learning).
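A minimal sketch of the 'clear explanation' point above: each option of a multiple choice item carries its own explanation of why it is correct or incorrect, so the feedback is meaningful rather than a bare right/wrong. The item content and give_feedback helper are invented for illustration.

    # Each choice maps to (is_correct, explanation).
    FEEDBACK = {
        "a": (False, "No - a fuse protects the circuit; it does not store charge."),
        "b": (True, "Correct - a capacitor stores electrical charge."),
        "c": (False, "No - a resistor limits current flow."),
    }

    def give_feedback(choice):
        # Unrecognised input gets a prompt rather than a judgement.
        return FEEDBACK.get(choice.lower(),
                            (False, "Please choose a, b or c."))

    correct, message = give_feedback("a")
    print(message)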
Programming skills versus authoring
Most authoring packages are claimed to be easy to use. They can usually produce impressive displays fairly simply; however, they quickly become very complex once moved away from linear presentations and flows of data. Therefore, while programming skills are not absolutely necessary, a clear understanding of
- manipulation of data and dataflow
- functions
- variables
- logic states and logic flow
- sequencing flow
- looping and repeating flow and control mechanisms
- branching/decisions flow and control mechanisms
- error tracing and testing procedures
- use of main structures and sub structures, and organisational structure and design, eg flow charts, logic diagrams, top down and bottom up design principles
greatly assists the development process (a minimal sketch of these flow concepts follows).
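The following is a minimal sketch of those flow concepts in Python; the questions, the 70% mastery criterion and the run_section helper are all hypothetical. A variable accumulates the score inside a loop, and a branch decides between remediation and the next section.

    def ask(question):
        # Stand-in for presenting a question; judges a yes/no response.
        return input(question + " (y/n)? ").strip().lower() == "y"

    def run_section(questions):
        score = sum(1 for q in questions if ask(q))  # variable updated in a loop
        if score / len(questions) >= 0.7:            # branch on a criterion
            return "next section"
        return "remediation"                         # loop back over the material

    print(run_section(["Sample question 1", "Sample question 2"]))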
Error trapping and debugging
Always
- check your questions and expected answers for clarity and ambiguity
- ensure that the appropriate answer type is expected, ie numeric, character string, several parts
- check that your and/or logic is correctly phrased, ie use Venn diagrams or truth tables (be aware of the different interpretations of 'or', ie inclusive versus exclusive; see the sketch following this list)
- be aware that some programs allow you to trace the executed stages of the lesson, eg the use of IconTitle in Authorware Professional
- during prototyping, use variables to display answer judgements and logic values to check the correct recording and flow of data and logic.
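To make the inclusive versus exclusive 'or' point concrete, a small Python sketch (the response text is invented): Python's 'or' is inclusive, ie true if either or both conditions hold, while '!=' applied to two booleans behaves as exclusive or, ie true only if exactly one holds.

    response = "use the lever or the jack"
    has_lever = "lever" in response
    has_jack = "jack" in response

    print("inclusive or:", has_lever or has_jack)  # True if either or both
    print("exclusive or:", has_lever != has_jack)  # True only if exactly one

    # The full truth table makes the difference explicit.
    for a in (False, True):
        for b in (False, True):
            print(a, b, "inclusive:", a or b, "exclusive:", a != b)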
Design issues checklist
As a final checklist, ensure that all of the following points have been considered
- purpose (what is the purpose of the assessment - self check exercise, sample questions, diagnostic assessment?)
- user (has the CBA been designed with the learner's requirements being paramount?)
- hardware/software (will both these components be 'transparent'?)
- educational considerations (what is it you are trying to achieve with the assessment?)
- question types (have the attributes of CBA been used to maximum effect?)
- feedback (has this been adequately provided?)
- navigation (does the learner have appropriate control and know their location at all times?)
Why CBA? - the advantages
Some of the reasons for employing CBA as an assessment tool are
- cost effective (performs repetitive tasks, freeing up the facilitator/educator to become more involved in the learning process);
- allows the learner to self assess;
- requires little or no human resource;
- allows for individualisation, eg randomisation, different response weightings, different numeric values, similar but subtly different questions, variations in wording, use of converse/complementary questions (a minimal sketch of numeric randomisation follows this list)
- enables the presentation of graded degrees of difficulty by use of different question banks or even sections within a question bank
- accessibility, branching and choice of pathways can be controlled to enable remediation, enrichment, extension, skipping of sections, fast tracking, etc; the assessment can also be divided into several sections
- enables a variety of media integration eg graphics, cartoons, photographs, audio, and video
- encourages variety of effects and learning devices, eg zooming in, panning, exploding windows for further clarification of questions, animation, hints; even the possibility of providing different language options (both text & audio ie use string variables to access different databases) within a single package
- provides immediate and appropriate feedback
- modifying or changing flow structures or content is relatively easy
- possesses in built calculation facilities
- can be linked to other software for example word processing, spreadsheets, databases or other specialised packages for checking proficiencies
- ensures repeatability
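As a minimal sketch of the individualisation point above (the Ohm's law item and make_item helper are invented for illustration), each learner can receive the same concept with different numeric values.

    import random

    def make_item():
        # Same concept for every learner, different values per attempt.
        volts = random.choice([6, 12, 24])
        ohms = random.choice([2, 3, 4])
        question = (f"A {volts} V supply is connected across a {ohms} ohm "
                    "resistor. What current flows, in amperes?")
        answer = volts / ohms
        return question, answer

    question, answer = make_item()
    print(question, "->", answer)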
Why CBA? - the disadvantages
It is also important to realise that there are some limiting aspects to using CBA for assessment.
Future directions
With the greater use of multimedia, artificial intelligence, virtual reality, expert systems, spoken voice recognition and networking, we have greater access to knowledge and skills. Being able to discerningly collate and assimilate information is an emerging challenge, and we need to develop complementary CBA to assist in this process.
References
Alessi, S. M. & Trollip, S. R. (1991). Computer Based Instruction: Methods and Development, 2nd Ed. Prentice Hall, Englewood Cliffs, New Jersey.
Beech, G. (1985). Computer Based Learning. Sigma Technical Press, Cheshire.
Electronic Publishing (1989). CALS, Adelaide Institute of TAFE, South Australia.
Gery, G. (1988). Making CBT Happen. Weingarten Publications, Boston.
Grant, P. (1992). Instructional Systems Design for CBT Software. Learnware Technologies.
Hall, W. & Saunders, J. (1993). Getting to grips with assessment. National Centre for Vocational Education Research Ltd, South Australia.
Hannafin, M. J. & Peck, K. L. (1988). The Design, Development, and Evaluation of Instructional Software. Macmillan Publishing Co, New York.
Kemp, J. E. (1985). The Instructional Design Process. New York: Harper & Row.
Lally, M. (1993). Navigation in Rich Visual Data Spaces. University of SA seminar.
Mager, R. F. (1991). Making Instruction Work. Kogan Page, London.
Metros, S. E. (1992). Interface-lift: Elective or compulsory? In J. G. Hedberg and J. Steele (eds), Educational Technology for the Clever Country: Selected papers from EdTech'92, 110-150. Canberra: AJET Publications. http://www.aset.org.au/confs/edtech92/metros/metros.html
Phillips, J. & Crock, M. (1992). Interactive screen design principles. ASCILITE'92 Conference Proceedings (this is a particularly comprehensive paper covering many aspects of CAL screen design).
Race, P. (1994). The Open Learning Handbook, 2nd Ed. Kogan Page, London (very useful for factors to consider in the development of open learning materials).
Reigeluth, C. M. (ed) (1984). Instructional Design Theories and Models. Erlbaum, Hillsdale.
Sims, R. (1993). Authorware with Style. Knowledgecraft, 2nd Ed 8/93.
Spencer, K. (1991). Modes, media and methods: The search for educational effectiveness. British Journal of Educational Technology, 22(1).
Trollip, S. R. (1992). Testing - the forgotten component of instruction. ASCILITE'92 Conference Proceedings.
Wheildon, C. (1989/90). Communicating or just making pretty shapes. Newspaper Advertising Bureau of Australia Ltd, 3rd Ed (this discusses print/text design considerations).
Author: Peter Poteralski, Instructional Designer, Learning Materials Development Unit, Torrens Valley Institute of TAFE, 100 Smart Road, Modbury 5092, South Australia. Phone: 61 8 207 8075; Fax: 61 8 207 8008; Email: peterpot@tafe.sa.edu.au
Please cite as: Poteralski, P. (1994). Computer based assessment: Design considerations. In J. Steele and J. G. Hedberg (eds), Learning Environment Technology: Selected papers from LETA 94, 248-254. Canberra: AJET Publications. http://www.aset.org.au/confs/edtech94/mp/poteralski.html