Design Patterns for digital item types in Higher Education
S. Draaijer
Centre for Educational Training, Assessment and Research, Vrije Universiteit Amsterdam
s.draaijer@ond.vu.nl
R.J.M. Hartog
Wageningen MultiMedia Research Centre, Wageningen University
rob.hartog@wur.nl
Abstract
A set of design patterns for digital item types has been developed in response to challenges identified in various projects by teachers in higher education. The goal of the projects in question was to design and develop formative and summative tests, and to develop interactive learning material in the form of quizzes. The subject domains involved were mainly the life sciences, medical sciences and engineering sciences. Making good use of digital item types and facilitating the process of designing items were typical of the challenges involved. From the viewpoint of subject matter experts, the main challenge in digital item type design was to design items that test for understanding. Furthermore, lecturers want to reduce student behaviour that is based on guesswork. With these conditions in mind, this paper presents a set of design patterns for digital items, together with a standard format for describing these patterns.
Introduction
New opportunities for designing items for computer based assessment and learning management systems
Currently available Computer Based Assessment (CBA) systems offer a great variety of digital item types (Bull & McKenna, 2001; Mills, Potenza et al., 2002; Parshall, Spray et al., 2002), such as multiple answer, drop-down list, numeric, hot-spot and drag-and-drop; several authors have referred to such item types as innovative. These systems also enable a variety of item types to be deployed within a single assessment. The availability of CBA systems and the Internet make it easier than ever before for Subject Matter Experts (SME's – professors, academics, lecturers, tutors, instructors) to use these innovative item types, as well as other digital options such as the inclusion of images. SME's in many higher education courses are already using digital item types that are made available via CBA systems and Learning Management Systems (LMS's). One recurring problem, however, is how to make optimal use of these new possibilities.
User roles in designing digital items for higher education
Within the field of higher education, digital test items are usually developed within the context of a course taught by SME’s and their assistants. In general, it must be assumed that SME’s and their assistants have limited time for designing and developing such items, as well as limited skill and experience in this area. In practice, Educational Technologists (ET’s) are increasingly being asked to advise on, and participate in, small-scale projects to design and develop pools of digital test items. These items are generally used for summative assessment, and in quizzes aimed at stimulating active learning. ET’s need a methodology for the design and development of digital items if they are to provide the best possible advice to those involved in projects of this kind.
ALTB project
The SURF ALTB project (Hartog, 2005) was carried out in 2005 and 2006. That project incorporated fifteen small-scale projects on the design and development of digital items. The aim of these various subprojects was to develop sets of questions for summative use, and for use in quizzes intended for formative applications. A systematic approach to the design and development of digital items was used under a range of conditions, in situations involving various forms of collaboration and types of task division. The intention was to identify the potential of digital items and to determine how they can best be used, to collate people’s experiences, and to formulate the lessons learned. These experiences were used as input for the development of a methodology for digital item design.
Information sources on the Design and Development of digital items
A methodology for the design and development of digital items as envisioned by Hartog (2005) should provide (1) a set of design requirements, (2) a set of design guidelines, (3) definitions of available components and item types, (4) a library of paradigm examples, (5) a library of design patterns, and (6) task structures and scenarios in which resources are allocated to subtasks along a time-line. In the ALTB project, attempts were made to collect information on these methodology ingredients. In this section we explore the usefulness of available information that is intended to support the process of designing and developing innovative digital items.
Design guidelines
The literature contains long lists of design guidelines for multiple choice items (T/F, alternate choice, four options) to be used in assessments. See, for example, Haladyna, Downing and Rodriguez (2002). During the ALTB project, however, it was found that SME's regard most of these guidelines as unhelpful. This is because such guidelines are often actually requirements instead of pointers for inspiration. The projects showed that ET's should avoid focusing their advice and participation on the promotion of such guidelines.
Available item type taxonomies
Some researchers have sought to develop a framework within which both traditional and innovative question types can be categorized (Haladyna, 2004; Scalise & Gifford, 2006). Such categorizations should preferably lead to the appropriate development and use of the items in question. These frameworks offer a perspective based on a combination of stimulus presentation and item format, and categorize item formats ranging from very low complexity (e.g. True/False questions) to greater complexity (e.g. drag-and-drop items, constructed response and essay-type items). Additional dimensions involving knowledge and cognitive processes are sometimes added to these frameworks as an overlay. Parshall, Spray et al. (2002) have indicated five dimensions in which digital items can be described as “innovative”: the item format (the response obtained), the response action (for example key presses or mouse clicks), media inclusion (images, photographs, graphs, video, animation, etc.), the level of interactivity (system responses) and the scoring method (how responses are converted to scores).
In the ALTB project, these frameworks were used to help SME’s and their assistants get their projects up and running. Although helpful in this way, the frameworks were not able to provide those involved with inspiration. The project participants regarded these frameworks as interesting instruments for the analysis and categorization of items, but not as a means of conceiving items for use in their own particular courses.
Examples of digital items
During the project, desk research was undertaken to identify possible sources of sample digital items for use in higher education. The number of such sources was found to be relatively limited (Bull & McKenna, 2001; King & Duke-Williams, 2001; Mills, Potenza et al., 2002; Parshall, Spray et al., 2002; Scalise & Gifford, 2006). For the most part, the samples available from these sources are derived from secondary education and from subject domains other than those involved in the fifteen small-scale projects (life sciences, medical sciences and engineering sciences). The ALTB project showed that ET's and SME's were seldom able to use these examples as paradigm examples or as a source of inspiration. One major problem was that SME's encountered great difficulty in abstracting from the examples, which impedes the subsequent transformation of those examples for use in their own courses.
Another issue that was often encountered in the cases dealt with by the ALTB project involved indicators for the effort needed to develop questions beyond the stage of the initial concept. “How much time will it take to flesh out that question within my own authoring environment?”, “Can I author it myself or do I need a specialist for this?”. Not one of the sources consulted was able to provide a satisfactory answer or approach to this problem.
The importance of the concept of design patterns as an instrument for a methodology derives from the limitations described above: the limitations of individual examples, of guidelines, and of frameworks. In the next section, which explores the concept of design patterns, it is argued that one of their functions is to bridge the gap between abstract guidelines and isolated examples.
Design patterns
The term “Design Pattern” was introduced by Alexander (1979) in the 1970s as a concept in architectural design. It was adopted in software engineering about fifteen years later (Gamma, Helm et al., 1995). Relations between components that occur repeatedly in different designs, in answer to specific design challenges, are called design patterns. The central idea is that it is not realistic to suppose that designers design from scratch. On the contrary, an experienced designer is supposed to have very many design patterns in mind. "It is only because a person has a pattern language in his mind, that he can be creative when he builds" (Alexander, 1979: p. 206).
Design patterns are generic combinations of solutions to recurring problems within problem-solving or design domains. Competent designers can instantly match a problem to the appropriate design pattern to arrive at satisfactory solutions to given problems and contexts. Design patterns are therefore an integral component of design methodology.
Design patterns for item design
Thinking in terms of design patterns for digital items takes the associated thought processes to another level. When applied to the design of digital items, design patterns bridge the gap between learning objectives and the item types currently available in CBA systems and LMS's. Design patterns span the divide between guidelines for item designers and examples that are already available. They also reinforce the importance of the distinction between the design of digital items on the one hand and their development on the other. Lastly, by sharing design patterns, designers are able to learn from one another. In the interests of an efficient flow of information among ET's, a shared and accepted pattern language, or format for describing patterns, is necessary.
With regard to question design, the present authors found just a single publication that intentionally adopts a design-pattern-based approach. The design pattern concept is used in the Principled Assessment Designs for Inquiry (PADI) project, which focuses on designing high-quality assessments of scientific inquiry. “The design patterns that are being developed as part of the PADI system are intended to serve as a bridge or in-between layer for translating educational goals into an operational assessment” (Mislevy, Hamel et al., 2003: p. 5).
To date, most ET's are likely to have internalized only a few design patterns for digital item design, or to have very limited numbers of these resources to hand. Yet ET's have the most to gain from the design pattern approach. It would enable them to provide better support for SME's, by supplying appropriate design patterns at just the right moment in item-development projects. The design pattern approach allows for a faster, more economical, yet more varied deployment of digital items.
Overview of the remainder of this paper
This paper presents one of the results of the ALTB project (2005), the aim of which was to develop a methodology for the design and development of digital items. The methodology is intended to bridge the gap between currently available literature and the day-to-day work of designing digital items in higher education. A number of design patterns which were brought to light by this project, and which have now been incorporated into the methodology, are presented here.
Design patterns are intended to reduce the cost of designing and developing digital items. They are intended to enhance the validity of questions by reducing the chance that someone could arrive at the correct answer by means of guesswork and by enabling the intended objective to be measured more directly. In the next section, the concept of design pattern will be explained in more detail and applied to the design of a number of digital items. A template for describing design patterns is presented. Its purpose is to support the design and development of digital items. A number of design patterns are also presented, together with arguments in support of their instructive value and versatility of purpose.
A template for describing design patterns for digital items
Introduction
A common way to describe a design pattern is to provide a set of attributes and to describe the particular characteristics of each design pattern in terms of those attributes. To a large extent, the value of design patterns is determined by the ease with which a designer can identify a match between a pattern and a given problem. Accordingly, the set of attributes selected must provide adequate support for this process. In the case of a large set of patterns, we assume that the approach would be to use a browser to search for patterns in an online database. This might, for example, involve entering specific values to search for specific attributes. Alternatively, free text searches could be conducted across all attributes.
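As a minimal sketch of what such attribute-based retrieval could look like, assuming patterns are stored as simple records (the attribute names anticipate the template described below; the data structure and function names are illustrative assumptions, not features of any existing system):

```python
from dataclasses import dataclass, fields

@dataclass
class DesignPattern:
    """One record in a hypothetical pattern database; the attribute
    names follow the template described in this paper."""
    title: str
    context: str
    ksa_summative: str
    ksa_quiz: str
    pattern_core: str
    design_effort: str       # "Low" or "High"
    realization_effort: str  # "Low", "Medium" or "High"
    extraneous_load: str     # "Yes" or "No"
    guess_chance: str        # "Low", "Medium" or "High"

def search_by_attributes(patterns, **criteria):
    """Attribute search: return patterns whose attributes equal the given values."""
    return [p for p in patterns
            if all(getattr(p, name) == value for name, value in criteria.items())]

def free_text_search(patterns, term):
    """Free text search across all attributes of all patterns."""
    term = term.lower()
    return [p for p in patterns
            if any(term in str(getattr(p, f.name)).lower() for f in fields(p))]
```

A call such as search_by_attributes(db, design_effort="Low", guess_chance="Low") would then correspond to entering specific values for specific attributes, while free_text_search(db, "diagram") corresponds to a free text search across all attributes.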
The PADI project (Mislevy, Hamel et al., 2003) describes design patterns on the basis of quite a large number of attributes: Title, Summary, Rationale, Focal KSA's (Knowledge, Skills and Abilities), Additional KSA's, Potential observations, Potential work products, Potential rubrics, Characteristic features, Variable features, I am a kind of, These are kinds of me, I am a part of, Educational standards, Templates (task/evidence shells), Exemplar tasks, Online resources, References, Miscellaneous associations. A worked-out design pattern consists of tabulated text that takes up as much as two pages of A4, yet contains few specific item and task examples.
In most cases within the ALTB project, the implementation of the design pattern concept of Mislevy, Hamel et al. was felt to be too abstract for digital item design. ET's in the field of higher education require design patterns that are less elaborate, to facilitate the process of searching through them. Another factor is the finding that design patterns must provide a clearer bridge to actual examples. At the same time, innovative digital items require greater emphasis on item format, in combination with the use of media. Lastly, the time required to design and develop real items is vitally important if design teams are to allocate resources effectively. Therefore, it was decided to:
limit the number of attributes;
be more specific concerning the components of items (stimuli, prompts, item formats);
add attributes relating to the design and development effort;
add an attribute relating to the chance of arriving at the correct answer by guesswork alone;
add an attribute relating to the possible presence or absence of extraneous cognitive load;
provide more examples.
All of the attributes are listed and described below.
Title
The Title is intended to be a short description of the pattern’s core concept.
Context
The Context attribute describes the situation in which the design pattern in question can be used. It can contain information on the type of learning objective involved, together with details of the relevant domain of interest. It also describes the conditions in which the design pattern would be of use. The context provides references to specific sources, for further discussion of the design pattern in question.
KSA focus in a Summative Test
The focus on measuring Knowledge, Skills and Abilities (KSA) is a short description of the type of learning objectives that are to be measured. It is a combination of subject matter (i.e. domain knowledge), knowledge types, and cognitive processes. The descriptions of this attribute incorporate suggestions regarding the classification of the pattern within the taxonomy proposed by Anderson and Krathwohl (2001). As it is increasingly being used to classify objectives within education, this taxonomy is expected to remain a stable indicator for the foreseeable future. Its core concept is that educational tasks can be categorized on the basis of two factors, the knowledge dimension and the cognitive process dimension. This concept results in the following table.
The knowledge dimension | 1: Remember | 2: Understand | 3: Apply | 4: Analyze | 5: Evaluate | 6: Create
A: Factual knowledge | A1 | A2 | A3 | A4 | A5 | A6
B: Conceptual knowledge | B1 | B2 | B3 | B4 | B5 | B6
C: Procedural knowledge | C1 | C2 | C3 | C4 | C5 | C6
D: Meta-cognitive knowledge | D1 | D2 | D3 | D4 | D5 | D6
Table 1. The two-dimensional framework of Anderson & Krathwohl (2001): the knowledge dimension (rows) by the cognitive process dimension (columns).
Within the context of design patterns for digital items, the range of questions turned out to be bounded by knowledge dimensions A, B and C and by cognitive process dimensions 1 to 4. This is in line with observations by King and Duke-Williams (2001).
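To make these cell codes concrete, the following minimal sketch simply transcribes Table 1 into code; the helper function is our illustration and not part of the taxonomy itself:

```python
# Transcription of Table 1: cell codes such as "B2" combine a knowledge
# dimension (letter) with a cognitive process dimension (digit).
KNOWLEDGE = {"A": "Factual", "B": "Conceptual",
             "C": "Procedural", "D": "Meta-cognitive"}
PROCESS = {"1": "Remember", "2": "Understand", "3": "Apply",
           "4": "Analyze", "5": "Evaluate", "6": "Create"}

def describe_cell(code: str) -> str:
    """Expand a cell code from Table 1, e.g. 'B2', into its two dimensions."""
    return f"{code}: {KNOWLEDGE[code[0]]} knowledge / {PROCESS[code[1]]}"

print(describe_cell("B2"))  # B2: Conceptual knowledge / Understand
print(describe_cell("C3"))  # C3: Procedural knowledge / Apply
```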
KSA focus in a Quiz
The learning focus is a short description of the type of cognitive process or line of reasoning that can be induced by a question based on this pattern and knowledge type. With regard to the descriptions of this attribute, here too suggestions are made concerning their classification within the taxonomy table proposed by Anderson and Krathwohl (2001).
Pattern Core
The pattern core is a description of the pattern that is sufficiently generic to enable items to be generated for various specific situations within the context. At the same time, the description is very tangible, in that it lists the individual components of the question (stimulus, prompt, item format). Furthermore, this list sometimes contains suggestions regarding the spatial arrangement of these components.
Design Effort
Design Effort is the amount of time needed to arrive at, or compile, the main conceptual idea of a question. On the basis of the experience gained in the fifteen small projects on the design and development of closed questions, we are able to distinguish two levels of Design Effort:
Low: Less than 15 minutes. Design Effort can be minimal if, for example, use of the pattern does not require the designer to develop distractors or new representations of knowledge.
High: From 15 minutes to several hours. This type of effort usually involves finding and formulating distractors or new representations of knowledge.
Realization Effort
The Realization Effort is the estimated amount of time required during the ALTB project to develop and implement the conceptual idea of a question in an authoring environment. It also comprises the time needed to check, discuss and revise the question. We distinguish three levels of Realization Effort:
Low: Less than 10 minutes. On average, this amount of development effort is needed for text-only, standard question formats such as True/False, alternate choice, multiple choice and fill-in-the-blank.
Medium: Between 10 and 40 minutes. On average, this amount of development effort is required for more elaborate question formats such as hot spot, matching, multiple drop-down lists, numeric and calculated formula. Available media resources, such as images, will often still need to be processed to make them suitable for display on screen.
High: More than 40 minutes, up to 3 hours. This level of development effort might, for instance, be due to the fact that the questions involve the integration of video and animation. The creation of drag-and-drop questions with multiple markers also tended to require considerable effort.
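The three bands can be summarized as a simple lookup, as in the following sketch; the groupings transcribe the text above, while the function and its name are illustrative assumptions:

```python
# Realization Effort bands as reported in the ALTB project.
REALIZATION_EFFORT = {
    "Low (< 10 minutes)": {"true/false", "alternate choice",
                           "multiple choice", "fill-in-the-blank"},
    "Medium (10-40 minutes)": {"hot spot", "matching",
                               "multiple drop-down lists",
                               "numeric", "calculated formula"},
    "High (40 minutes - 3 hours)": {"video or animation integration",
                                    "drag-and-drop with multiple markers"},
}

def realization_effort(item_format: str) -> str:
    """Return the estimated Realization Effort band for a given item format."""
    for band, formats in REALIZATION_EFFORT.items():
        if item_format.lower() in formats:
            return band
    return "unknown format"

print(realization_effort("matching"))  # Medium (10-40 minutes)
```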
Extraneous Cognitive Load
One of the most essential requirements for any item is validity. In particular, the option of measuring the intended construct more directly (Parshall, Spray et al., 2002) is put forward as an argument in favour of the design, development and deployment of digital items. Extraneous cognitive load occurs when the student is required to allocate cognitive processing capacity to cognitive actions that are actually irrelevant to the correct answer. This is particularly the case when the spatial arrangement of stimuli and response mechanisms requires a lot of eye movement or mental rearrangement of facts and concepts. Eliminating such demands as far as possible results in questions with no extraneous cognitive load.
Guess Chance
The high probability of arriving at the correct answer by pure guesswork is often seen as a drawback of multiple choice questions. A number of design patterns have a set-up that decreases this probability, which makes this an interesting attribute for ET's. A high guess chance is assigned to the traditional T/F and 4-option multiple choice questions (~0.5 to ~0.25). The value intermediate is assigned to design patterns that decrease this chance somewhat (~0.2 to ~0.1). The value is set to low if the chance is decreased much further (< ~0.1).
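These bands can be illustrated with elementary probability calculations. The following is a minimal sketch that assumes blind guessing, exactly one fully correct response per item and, for multiple response items, that the student knows how many options must be selected:

```python
from math import comb, factorial

def p_true_false() -> float:
    return 1 / 2                      # one of two options is correct

def p_multiple_choice(n_options: int) -> float:
    return 1 / n_options              # one of n options is correct

def p_multiple_response(n_options: int, n_correct: int) -> float:
    # Assumes the student knows how many options must be selected.
    return 1 / comb(n_options, n_correct)

def p_ordering(n_steps: int) -> float:
    return 1 / factorial(n_steps)     # exactly one permutation is correct

print(p_true_false())              # 0.5    -> high
print(p_multiple_choice(4))        # 0.25   -> high
print(p_multiple_response(6, 3))   # 0.05   -> low
print(p_ordering(5))               # ~0.008 -> low
```

The last two values indicate why formats such as multiple response and ordering can reach a low guess chance, as reflected in several of the design patterns presented below.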
Iconic Examples
The Iconic Examples section is an important attribute of design patterns. Iconic Examples clarify the semantics of the pattern definition: they give details of real situations, past or present, in which the design pattern in question has been used, and of the solutions that were generated. In some examples, extra directives are mentioned as noteworthy aspects. However, we would like to emphasize the importance of abstracting from the example, rather than regarding the example as identical to the pattern.
Scoring Rules
Scoring is of major importance for summative purposes, and must be considered carefully. Many of the fifteen projects showed that various design patterns give rise to time-consuming discussions about scoring rules. It is good practice to inform students upfront about how an item will be scored. Accordingly, decisions about scoring should be made before the items in question are deployed in an actual test. Firstly, the scoring of questions should be discussed in relation to the goal of the item, and to that of the test in which it has to function. Secondly, characteristics such as answering time and the probability of guessing the correct answer should be considered. Thirdly, the mutual interdependence of answering options must be taken into account when deciding on scoring rules. Finally, it is important to note that the specific characteristics of the CBA system in question impose limitations on the options for devising scoring rules. During the ALTB project, no useful information was found in the literature that might lighten this task, nor could clear and univocal scoring rules be devised for most patterns.
In general, SME’s were comfortable with the idea of providing as much transparency for students as possible when it comes to scoring rules. For that reason, it is proposed that the following rules be applied (regardless of the type of design pattern involved):
Let S_i be the maximum number of points that a student can get for question i;
Let p_i be a rational number between 0 and 1. Call p_i the partial credit factor for question i; the score awarded for question i is then p_i × S_i.
Now, S_i should be:
proportional to the weight allocated to a specific question within a test;
proportional to the amount of time that a student is supposed to allocate to this question within the test.
Now, p_i should be:
proportional to the number of correctly chosen or constructed elements of an item.
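A minimal sketch of these rules in code is given below. It assumes that the time component of S_i is folded into the question weight and that points scale linearly with that weight; the points_per_weight constant and the data structure are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class ItemResult:
    weight: float     # relative weight of question i within the test
    n_elements: int   # number of choosable or constructible elements
    n_correct: int    # number of elements the student handled correctly

def score(result: ItemResult, points_per_weight: float = 10.0) -> float:
    """Score for question i = partial credit factor p_i times maximum S_i."""
    # S_i proportional to the weight of the question (the time component
    # is assumed to be folded into the weight, for simplicity).
    s_i = result.weight * points_per_weight
    # p_i proportional to the number of correctly chosen/constructed elements.
    p_i = result.n_correct / result.n_elements
    return p_i * s_i

# A drag-and-drop item with five markers, of which four are placed correctly:
print(score(ItemResult(weight=1.5, n_elements=5, n_correct=4)))  # 12.0
```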
Given the above-mentioned aspects, the Scoring Rules attribute has been left out of the design patterns. Ideally, however, SME's, their assistants and ET's should not have to invest any time in establishing scoring rules for questions.
Selected Design Patterns for digital items
About thirty design patterns were identified and described in the fifteen small-scale projects on the design and development of digital items. In this section we present ten archetypal design patterns. These patterns were selected on the basis of the instructional qualities that they bring to item design and their usefulness across a number of contexts, such as domain, task structure, and knowledge and cognitive characteristics. They:
require little design effort;
allow much of the design and development work to be allocated to assistants and ET's;
minimize guessing behaviour and unintended answering strategies (such as the elimination of options);
are aimed at those knowledge categories and cognitive processes that are considered important by many SME’s in higher education (B2, B3 and C2, C3 of Bloom’s taxonomy, as revised by Anderson and Krathwohl).
Each pattern takes up two pages of A4. On the first page, the values of the attributes are described. The facing page illustrates one or more examples derived from the pattern in question. This presentation format allows for easy browsing, retrieval and presentation of the design patterns.
Indicating positions of sub processes in a process diagram.
ID | Context | KSA summative | KSA Quiz | Pattern Core | Design Effort | Development Effort | Extraneous Cognitive load | Guess chance | |
001 | Any type of subject matter that uses process diagrams. | Measuring the ability of a student to position a specific sub process within a given process. A&K: B2, B3 C2, C3 See also Roid & Haladyna (1982: pp. 169-170) | Stimulating the student to think about the function, inputs and outputs of a specific sub process. The student must also be aware of the inputs and outputs of each of the other sub processes. Stimulates the student to scan the whole process. A&K: A2, A3, A4 B2, B3, B4 C2, C3, C4 | A diagram of the whole process. An indication of possible placements of the sub process with symbols. A name or description of a specific sub process. A prompt that tells the student to indicate which of the indicated possible placements of the specific sub process makes sense, given the function of the whole process. Multiple Response. OR Drag-and-drop. | Low | Medium | No | Medium |
Course Drinking Water Treatment, L. Rietveld, Delft University of Technology. | Course Drinking Water Treatment, L. Rietveld, Delft University of Technology. | Course Process Technology, H.vd. Schaaf / R. Hartog, Wageningen University. |
Indicating relationships between qualitative changes of variables in a model.
ID | Context | KSA summative | KSA Quiz | Pattern Core | Design Effort | Development Effort | Extraneous Cognitive load | Guess chance |
002 | Any type of subject matter that uses quantitative or qualitative models. This pattern is useful in any type of subject matter that uses diagrams to illustrate the qualitative relationship between changes of process variables. | Measuring the ability of a student to indicate qualitative relationships between process variables, between processes, or between individual phenomena within a process. The student is forced to demonstrate his mastery of the process as a whole. A&K: B2, B3 C2, C3 See also Roid & Haladyna (1982: pp. 169-170). | Stimulates qualitative reasoning with respect to quantitative and qualitative models. Stimulates the student to think about the process as a whole. A&K: B2, B3 C2, C3 | A symbol or passage of text representing a qualitative change of each process variable. A graphical configuration of most of these symbols or texts indicating the relationships between process variables. Placeholders for some of these symbols or passages of text. A prompt asking the student to drag the appropriate markers to the correct positions. Drag-and-drop. | Low | High | No | Low |
Course Physiology, S. Draaijer, Vrije Universiteit Amsterdam. Note that all boxes are of equal size, in order to prevent any cuing based on text length. Note also that foil text markers are present; this lowers the probability of a correct guess. | Course Phase 1, N.J. Part, University of Dundee. |
Recognizing characteristics of a phenomenon in a graph.
ID | Context | KSA summative | KSA Quiz | Pattern Core | Design Effort | Development Effort | Extraneous Cognitive load | Guess chance |
005 | This pattern is useful in any type of subject matter that uses graphs to visualize recordings of natural phenomena or to depict deviations from normal situations (in economics, medicine, earth sciences, chemistry, physics). | Measuring the ability of a student to recognize the characteristics of a specific phenomenon in a graph. A&K: A1, A2 B1, B2 | Stimulates the student to look carefully at the graph and to search for the characteristics of a phenomenon. Stimulates the student to attach the label of a phenomenon in his mind to a specific set of characteristics. A&K: A1, A2 B1, B2 | A graph that represents a recording of the actual behaviour of a system over time or some other variable. A label of a phenomenon. A prompt requesting the student to indicate the characteristics of the phenomenon. A marker. Drag-and-drop. OR Hot-Spot. | Low | High | No | Low |
Course The Heart, R.J.M.P. Musters, Vrije Universiteit Amsterdam. | Course The Heart, R.J.M.P. Musters, Vrije Universiteit Amsterdam. | Course The Heart, R.J.M.P. Musters, Vrije Universiteit Amsterdam. |
Recognizing or recalling the legend of a diagram, graph or table.
ID | Context | KSA summative | KSA Quiz | Pattern Core | Design Effort | Development Effort | Extraneous Cognitive load | Guess chance |
008 | This pattern is useful in any type of subject matter that uses diagrams, graphs and tables to denote important characteristics of concepts. | Measures whether the student knows which variable belongs to which axis and/or which phenomenon belongs to which landmark point and/or which phenomenon belongs to which set of landmark points. A landmark point might be a maximum, a minimum, an intersection or some other “special” point in the graph. A&K: B1, B2, B3 | Stimulates students to focus on the meaning of a graph whose visual representation is already well known. Might make students aware that they have not yet fully grasped the meaning of the graph. A&K: B1, B2, B3 | A diagram (or graph or table). A prompt that asks the student to analyze the diagram and to determine what relations it depicts. Drag-and-drop. OR Drop-down list. OR Fill-in-the-blank. | Low | High for Drag-and-drop. Low for Drop-down list and Fill-in-the-blank. | No | Low - Medium |
Course Food Safety Economics, A. Velthuis / R. Hartog, Wageningen University. Note that all boxes are of equal size, in order to prevent cuing based on the length of the passage of text. Note that a single combination of this design pattern and the same graph may give rise to several digital items. | Course Sampling and Monitoring, E. Boer / R. Hartog, Wageningen University. |
Ordering steps in a process or procedure.
ID | Context | KSA summative | KSA Quiz | Pattern Core | Design Effort | Development Effort | Extraneous Cognitive load | Guess chance | ||
016 | This pattern is useful for any type of subject matter that deals with specific linear or cyclical processes or with the sequencing of events. | Measuring the ability of a student to remember or deduce the specific ordering of a specific process. Many instructors feel that a student who can provide an ordering that makes sense “understands” the related subject matter. A&K: B1, B2, B3 C1, C2, C3 See also Roid & Haladyna (1982: p. 170). | Stimulates the student to scan each process step and to consider possible orderings, based on matching inputs and outputs of process steps and on the intended function of the whole process. May also stimulate the student to learn about specific process steps, and about specific inputs and outputs. Is perceived as “creative” by some students. Finding the correct answer is believed to be more satisfying than answering a traditional multiple choice question. A&K: B1, B2, B3 C1, C2, C3 | A set of process or procedural steps in terms of a verbal or diagrammatic description. A definition of the function or intended output of the process or procedure. A prompt that asks the student to present an ordering of the steps such that the sequence of steps constitutes a complete process that realizes the given function or procedure. Ordering. OR Drag-and-drop. | Low | Medium for Ordering. High for Drag-and-drop. | Yes for Ordering. No for Drag-and-drop. | Low |
Course Drinking Water Treatment, L. Rietveld, Delft University of Technology. | Genetics course, T. Aegerter-Wilmsen / T. Bisseling, Wageningen University. Note that, in this example, use is made of the ordering question format. A drag-and-drop format is depicted, for example, in design pattern 002. | Course Sampling and Monitoring, E. Boer / R. Hartog, Wageningen University. Note that, in this example, use is made of the ordering question format. A drag-and-drop format is depicted, for example, in design pattern 002. |
Identifying the error in a process design.
ID | Context | KSA summative | KSA Quiz | Pattern Core | Design Effort | Development Effort | Extraneous Cognitive load | Guess chance |
018_2 | This pattern is useful with any type of subject matter that uses diagrams to describe processes. | Measures the ability of the student to detect errors in a process design. For large models, designs etc., the effort required of the student might be out of proportion to the information generated by measurements using this question. A&K: B2, C2 | Stimulates the student to study a design, model or process as a whole and to write a critique of it. A&K: B2, C2 | A model OR a design. An error introduced into the model or design. A representation in the form of a diagram or a picture. A prompt requesting the student to identify and indicate any errors. Hot Spot. OR Drag-and-drop. | Low | Low | No | Low |
Course Drinking Water Treatment, L. Rietveld, Delft University of Technology. | Course Process Technology, H.vd. Schaaf / R. Hartog, Wageningen University. |
Identifying a detail error in a model-based calculation.
ID | Context | KSA summative | KSA Quiz | Pattern Core | Design Effort | Development Effort | Extraneous Cognitive load | Guess chance |
018_3 | This pattern is useful with any type of subject matter in which model-based calculations are performed. See also pattern ID 018_4. | Measures the ability of the student to detect errors in a calculation. For elaborate calculations, the effort required of the student might be out of proportion to the information generated by measurements using this question. A&K: B2, C2 | Stimulates the student to study a computation as a whole and to become aware of forms of accuracy. A&K: B2, C2 | A given problem. A computation for solving the problem. A detail error introduced into the computation. A prompt requesting the student to identify any errors. Hot Spot. OR Drag-and-drop. | Low | Low | No | Low |
Course Drinking Water Treatment, L. Rietveld. Delft University of Technology. Note that the calculation contains a detail error regarding the use of units within it. |
Selecting the primary problem-solving strategy for a calculation problem.
ID | Context | KSA summative | KSA Quiz | Pattern Core | Design Effort | Development Effort | Extraneous Cognitive load | Guess chance |
032 | This design pattern is useful for any type of subject matter that requires a specific problem-solving strategy and that categorizes problems and solutions. Examples can be found in statistics, mechanics, mathematics etc. Successful problem solving is conditional on the ability to select a strategy that is appropriate to the problem in question. See also the literature on factors for successful problem solving (Gick & Holyoak, 1983; Sweller, 1989). | Measuring the ability of a student to select the primary problem-solving strategy. A&K: B2, B3 C2, C3 | Stimulating the student to acquire factual knowledge about the functions and goals of processes. A&K: B2, B3 C2, C3 | A prompt asking the student to select the correct options. An option list that gives the standard set of tools and/or operations and/or processes available in the subject matter domain. Multiple Response. | Low | Low | No | Medium |
Course Sampling and Monitoring, E. Boer / R. Hartog, Wageningen University. | Course Sampling and Monitoring, E. Boer / R. Hartog, Wageningen University. |
Distinguishing relevant laws, values, formulas etc. from irrelevant ones to solve a calculation problem.
ID | Context | KSA summative | KSA Quiz | Pattern Core | Design Effort | Development Effort | Extraneous Cognitive load | Guess chance |
029 | This design pattern is useful in situations where the subject matter calls for the application and execution of subject-matter relevant mathematical operations. It can be used in situations where it is necessary to perform calculations, but where additional information needs to be retrieved from the answer given. Compare this pattern with pattern ID 019. | Measuring whether students could potentially arrive at a correct answer to questions requiring calculations: understanding the role of specific variables in calculations without having to apply them, and selecting what is necessary for a computation. A&K: A2, A3 B2, B3 | Stimulates the student to study and apply subject-matter-specific mathematical and problem-solving algorithms. A&K: A2, A3 B2, B3 | A prompt presenting a question about what is needed for a given calculation. A list of possible constants, variables or operations. Note that many textbooks include such a list as an appendix. Multiple Response. | Low | Low | No Note that the student needs to work on paper to be able to determine the correct choices. The student may be allowed to use a sheet containing formulas that are relevant to the subject matter. | Medium |
Course Drinking Water Treatment, L. Rietveld, Delft University of Technology. | Course Drinking Water Treatment, L. Rietveld, Delft University of Technology. Note that in this example, the same formulae are used as in the example to the left. |
Distinguishing relevant classes of information for problem solving from irrelevant ones. | ||||||||
ID | Context | KSA summative | KSA Quiz | Pattern Core | Design Effort | Development Effort | Extraneous Cognitive load | Guess chance |
030 | Any subject matter that relates problem solving to classes of information. | Measuring whether a student knows what information is relevant to finding or creating solutions to a given problem. A&K: B2, B3 C2, C3 Also: direct measurement focussing on the highest level of the SOLO taxonomy (Biggs, 1999) | Stimulating students to be aware of the distinction between information that is relevant to a given problem and information that is irrelevant to it, and encouraging them to apply this awareness. A&K: B2, B3 C2, C3 | A list of information classes. A problem. A prompt asking which classes from the list of information classes are relevant to attempts to deal with this problem. Multiple Response. | Low | Low | No | Medium |
Course Food Safety Economics, A. Velthuis / R. Hartog, Wageningen University. | Course on Drinking Water Treatment, L. Rietveld, Delft University of Technology. |
Conclusions
About thirty design patterns were identified in fifteen small-scale projects on the design and development of digital items. Ten design patterns are presented in full. It is thought that many more design patterns can be devised. A format has been developed and used to describe the set of design patterns. The format helps ET's to quickly scan through the patterns and to make matches between given learning material, a given learning objective and a given pattern.
A scan of the selected set of design patterns shows that some patterns use the drag-and-drop item format. This supports statements by other researchers (King & Duke-Williams, 2001; Scalise & Gifford, 2006) that item types involving drag-and-drop operations hold great potential for use in digital environments. The design patterns described also demonstrate how the drag-and-drop format allows for a more direct measurement of the intended construct, through the alignment of conceptual, spatial and textual information. In this way, for example, the effects of construct-irrelevant variance based on students' reading ability (Downing & Haladyna, 2004) and extraneous cognitive load are avoided. At the same time, developing drag-and-drop items requires more development effort.
A number of the selected design patterns are related to performing calculations. Calculation problems represent a particular challenge for question design. To date, most calculation problems have been implemented as multiple choice questions in which students have to select, or enter, the correct numerical or algebraic answer to the given problem. Some design patterns described in this article show options that go beyond that approach, by presenting problems in which students have to identify the mistake in a calculation, or in which they have to select the appropriate laws and formulas needed to arrive at the correct answer for a given calculation problem.
One aspect of the concept of design patterns is that there is a great number of possible patterns. Scanning patterns to find one that matches a specific and detailed learning objective is time-consuming as long as the patterns are only available on paper. This problem was already encountered with the thirty patterns developed during the ALTB project. It is also unreasonable to expect SME's to learn and internalize every single pattern. This is one area in particular in which ET's in higher education can prove their worth, by internalizing as many design patterns as possible. In interviews with SME's, they will then be able to offer an appropriate design pattern on a “just-in-time” basis. This can be expected to boost the level and efficiency of item design and development.
The next step in the development of design patterns for item design is to familiarize a group of ET's with the concept, and to increase the number of available patterns. The ET's will then have to invest effort in memorizing a large set of design patterns and in working with them, which will enable them to internalize these patterns effectively. In addition to this paper, a tutorial has been developed to instruct participants in the use of design patterns for digital item design. The first workshop based on this tutorial, which attracted fifteen participants, has already been evaluated; average overall satisfaction was rated just above 8 on a scale of 1 to 10.
The problem of determining scoring rules for some of the design patterns has had an impact on the extent to which design patterns are perceived to be useful. Furthermore, the lack of generally accepted scoring rules for the most promising design patterns has given rise to considerable debate on the validity of some of the design patterns in question. Further progress in the use of design patterns and digital item types will require considerable input from the field of psychometrics.
References
Alexander, C. (1979). The timeless way of building. Oxford University Press.
Anderson, L.W., & Krathwohl, D.R. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.
Bull, J., & McKenna, C. (2001). Blueprint for computer-assisted assessment. RoutledgeFalmer.
Downing, S.M., & Haladyna, T.M. (2004). Validity threats: Overcoming interference with proposed interpretations of assessment data. Medical Education, 38(3), 327-333.
Gamma, E., Helm, R., Johnson, R., & Vlissides, J. (1995). Design patterns: Elements of reusable object-oriented software. Addison-Wesley Professional Computing Series.
Haladyna, T.M. (2004). Developing and validating multiple-choice test items. London: Lawrence Erlbaum Associates.
Haladyna, T.M., Downing, S.M., & Rodriguez, M.C. (2002). A review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education, 15(3), 309-334.
Hartog, R. (2005). Actief leren transparant beoordelen [Active learning, transparent assessment]. SURF Foundation of the Netherlands. Retrieved December 2006 from the project website: http://fbt.wur.nl/altb.
King, T., & Duke-Williams, E. (2001). Using computer aided assessment to test higher level learning outcomes. In M. Danson (Ed.), 5th CAA Conference. Loughborough: Loughborough University.
Mills, C.N., Potenza, M.T., Fremer, J.J., & Ward, W.C. (2002). Computer-based testing: Building the foundation for future assessments. London: Lawrence Erlbaum Associates.
Mislevy, R.J., Hamel, L., Fried, R., Gaffney, T., Haertel, G., Hafter, A., Murphy, R., Quellmatz, E., Rosenquist, A., Schank, P., Draney, K., Kennedy, C., Long, K., Wilson, M., Chudowski, N., Morrison, A.L., Pena, P., Songer, N.B., & Wenk, A. (2003). Design patterns for assessing science inquiry. Principled Assessment Designs for Inquiry (PADI) Technical Report.
Parshall, C.G., Spray, J.A., Kalohn, J.C., & Davey, T. (2002). Practical considerations in computer-based testing. New York: Springer-Verlag.
Scalise, K., & Gifford, B. (2006). Computer-based assessment in e-learning: A framework for constructing "intermediate constraint" questions and tasks for technology platforms. The Journal of Technology, Learning, and Assessment, 4(6).
Acknowledgements
The ALTB project has been realized with the support of the SURF Foundation. The SURF Foundation is the higher education and research partnership organisation for network services and information and communications technology (ICT) in the Netherlands. For more information about the SURF Foundation, see http://www.surf.nl.
We are grateful to Prof. dr. J.J. Beishuizen for reading an earlier version of the manuscript and for his stimulating comments.