
The elaboration theory model of instructional design: HyperCard - new perspectives on learner controlled CAL

Malcolm J. Morrison
Learning Laboratory Pty Ltd
and

Pascal Grant
Learnware Technologies Pty Ltd

Currently, most CAL courseware is developed using a procedural model based upon a hierarchical structure of learning outcomes. Although useful as a framework, the resulting instruction tends to be highly prescriptive and provides little if any flexibility to the learner. This paper explores an alternative to the current approach by presenting the Elaboration Theory Model and its application through HyperCard, in order to allow greater learner control and more flexible CAL. The authors believe that HyperCard offers great potential for the development of computer assisted instruction: it allows the designer to organise large amounts of data in a non-linear fashion and gives the user full control in accessing and retrieving information. The paper discusses ways in which HyperCard can become an effective tool for the development of computer based instruction.


Instructional design is now recognised as the essential foundation of any successful CAL undertaking. In order to take full advantage of new technologies, instructional design must be sufficiently flexible to meet the needs of a wide range of learners, yet have adequate controls to ensure that all learners make the most effective use of their training program. Whilst advanced authoring systems, powerful hardware and exciting peripheral devices such as videodisc, digitised audio systems and touch screens significantly enhance the development, delivery and management of CAL, the key to quality courseware remains firmly locked within the instructional design considerations of the developer.

A number of instructional strategy models have been applied to computer based courseware design. They include Block's (1971) Mastery Model, Lumsdaine and Glaser's (1960) Adaptive Model, Gagne and Briggs' (1979) Cognitive Skills Model and Reigeluth and Merrill's Elaboration Theory Model.

The formation of models of instruction requires that variables such as the sequencing of knowledge, pacing, motivation, the types of questions, the role of questions, time on task and learner characteristics be taken into account. The treatment of such variables in one manner will lead to a procedural model of instruction; treatment in another manner will give rise to a conceptual model of instruction.

Currently, the procedural model is the form most commonly used for computer based learning systems. An example of such a model is that delineated by Gagne and Briggs (1979), which is based upon a hierarchical structure of learning outcomes and nine events of instruction. These events are:

  1. Gaining attention
  2. Informing the learner of the objective
  3. Stimulating recall of prerequisite learning
  4. Presenting stimulus material
  5. Providing learning guidance
  6. Eliciting the performance
  7. Providing feedback about performance
  8. Assessing the performance
  9. Enhancing retention and transfer
Conceptual models originate from the same base as procedural models, but differ significantly in the way they describe the process. They are analytic in nature and describe relevant events based upon deductive processes of logic and analysis.

Although these models have been highly useful frameworks for the development of computer based instruction, the resulting courseware has tended to be highly prescriptive and structured in its approach. True, this environment can be highly effective for some training applications; however, it becomes inappropriate for developing higher order, problem solving training where the student's cognitive processes need to be monitored.

It is arguable that, working from within the procedural and conceptual models, one can best make use of recent developments in data base tools such as HyperCard and advanced authoring systems to produce individualised, learner controlled courseware with adaptive elements capable of meeting the needs of a wider range of training applications.

Perhaps the most complex conceptual model is the Reigeluth-Merrill Elaboration Theory, which is an extension of Merrill's Component Display Theory. This model may have much potential for use in the computer based learning environment.

The Elaboration Theory comprises three models of instruction and a system for prescribing those models on the basis of the goals for a full course of instruction. Currently, the instructional designer makes the decision about which model of instruction to adopt; however, given a sufficiently smart management system with appropriate data analysis capability, it would be possible to allow the computer to decide which instructional strategy was most appropriate to impart that knowledge to the student.

To understand the concept of Elaboration Theory it helps to think of the action of a zoom lens. One begins with a broad picture of the whole and the major relationships among its parts (the overview, or epitome), without noticing the detail. As one zooms in one level, on a given part of the picture, more detail (the elaboration) is exposed. After analysing the subparts and their interrelationships, one can move deeper into the substrata or retreat to the wider canvas to view or review other parts of the whole. However, unlike the zoom lens, Elaboration Theory restrains the learner from moving into a lower stratum without first viewing the stratum immediately above it.

Whilst such an instructional model might seem to leave instruction to chance, this need not be so; in fact, it should not be so. The system for prescribing instruction might force a student to complete all of one level before proceeding to the next, or force her to explore to the full depth of detail in one part of the whole before moving to the next, or allow the opportunity for her to follow her own path.

If the system is sufficiently intelligent, it could evoke any of the above alternatives by deciding the best course of action based on the interactions made by the student at each respective level of the elaboration.

Following Bruner's (1960) concept of the spiral curriculum, very few learning prerequisites exist at the level of overview. As the learner works to deeper levels of complexity, increasingly complex prerequisites emerge, but, given appropriate management capability, they will have been taught as parts of the previous lessons. If prerequisites are held back until the lesson for which they are immediately necessary, there will be only a few prerequisites to learn at any one level of complexity. Consequently, the learner will want to master them because she will see their significance. This simple-to-complex approach to sequencing instruction ensures that learning tasks are not beyond the present capability of the learner. Hence it promotes learning efficiency.

All three models include seven major strategy components:

  1. a simple-to-complex sequence
  2. learning prerequisites sequence
  3. summarisers
  4. synthesisers
  5. analogies
  6. cognitive strategy activators
  7. learner control format
A simple-to-complex sequence (also known as an elaborative sequence) helps ensure that the learner always is aware of the context and importance of the different ideas being taught. It allows the instruction to be presented at the level of complexity that is most appropriate and meaningful to the learner at any given stage in the learner's development.

Within an elaborative sequence the general ideas epitomise rather than summarise the ideas that follow, and this is done on the basis of a single type of content. Whilst a summary presents a large number of ideas at a superficial, abstract level, an epitome presents a smaller number of ideas at a concrete, meaningful application level.

Content can be of three types: concepts, procedures and principles. In Elaboration Theory only one type of content is dealt with in any single epitome.

The process of epitomising involves:

  1. selecting one type of content
  2. listing all the organising content to be taught
  3. selecting a few organising content ideas that are the most basic, simple and/or fundamental
  4. presenting those ideas at the application level rather than the more superficial and abstract memorisation level.
Gagne (1968) states that a learning prerequisite sequence is based on a learning structure, which shows what facts or ideas must be learned before a new idea can be learned. For example, one cannot learn how to calculate the interest owing on a sum of money until she knows the concepts of time, interest rates and principal, and has mastered the process of multiplication.
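
To make the prerequisite concrete, simple interest is just principal multiplied by rate multiplied by time, and a practice item testing this could be authored on a single HyperCard card. The following HyperTalk button script is only an illustrative sketch; the field names ("principal", "rate", "years", "result") are hypothetical and not drawn from any particular course.

    -- Illustrative button script for a self test practice item on simple interest.
    -- Field names are hypothetical; interest = principal * rate * time.
    on mouseUp
      put card field "principal" into p
      put card field "rate" into r
      put card field "years" into t
      if p is a number and r is a number and t is a number then
        put p * r * t into card field "result"
      else
        answer "Please enter numbers for principal, rate and years."
      end if
    end mouseUp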

A summariser provides a concise statement of each idea and fact to be taught, an example for that idea and some diagnostic, self test practice items for that idea.

A synthesiser is a strategy component for relating and integrating ideas of a single type. This is achieved by presenting a generality in the form of one (or more) of the types of knowledge structures and explaining what it means, then following the explanation with some integrated reference examples and self test practice items. In this manner, new ideas are placed within the context of the previous instruction so the learner continually is aware of the structure of the ideas in the course and of their significance in relation to one another.

An analogy is a powerful instructional strategy because it makes a newly presented idea or fact easier to understand. An analogy describes similarities between some new ideas and some familiar ones that are outside the knowledge domain of immediate interest. For example, in teaching the novice about the operations of a computer one often compares the computer with the human brain. For those with a statistical bent, another commonly used analogy compares experimental error (referred to as noise) with static in a television or radio transmission.

Much has been written about cognitive strategies and how they relate to the way students process the instructional inputs (Bruner, 1966; Gagne, 1977; Rigney, 1978). Cognitive strategies include generic learning skills and thinking skills that can be applied across a wide variety of content areas, such as creating mental images and identifying analogies.

Merrill (1979) defined learner control as the freedom the learner is given to take control of the selection and sequencing of:

  1. the content to be learned
  2. the rate at which she will learn
  3. the particular instructional strategy component she selects
  4. the particular cognitive strategies she employs when interacting with the content
The quality of interactions in computer based learning depends on many factors, perhaps the most important of which is the degree of learner control. Interactivity is enhanced by greater learner control of both pacing (time) and sequencing (branching). The quest, therefore, is to find authoring tools which can increase the scope of branching, offering the student a wider reach into the subject matter.

Whilst the Elaboration Theory may seem to be a complex model, applying it to the development of computer based learning materials need not be difficult. To demonstrate the point, a simplified version of the process follows.

STEP 1: Competency specification

As with all other instructional models, the first task of any courseware development project is to specify the criterion abilities that are to be included in the course.

STEP 2: Develop the first epitome

Locate the criterion competency with the highest general weighting (its measure of uniqueness in relation to the abilities underlying all performances). Then determine in which other performances it occurs.

This is the first thing to be taught and should be sufficiently inclusive to relate to the learner's previous knowledge and experience. It should also be capable of translating to new knowledge and experiences.

STEP 3: Sequence remaining epitomes

Work from criterion abilities with high general weightings to those with low general weightings, and from those with high specific weightings to those with low specific weightings. By analysing these abilities, one can determine their relative contribution to the abilities which underlie any given performance.

STEP 4: Define prerequisite abilities

This step expands the epitome by including all the sub-abilities for that epitome at the first and subsequent levels.

STEP 5: Identify strategy components

These will include the seven strategy components referred to previously.

STEP 6: Plan lesson logic

This stage defines the types of presentation frames (screens) and the lesson logic which will analyse student responses, provide feedback, and initiate animation and any supporting audio and video materials.

The logic will also make provision for branching to other sections of the course, either as a learner controlled option or as a computer controlled direction.
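
As a purely illustrative sketch, lesson logic of this kind might be scripted in HyperCard along the following lines. The card names, answer field and mastery flag are hypothetical and not part of any particular course; the script simply shows response analysis, feedback and a branch decision in one handler.

    -- Illustrative lesson logic for a single question card: analyse the student's
    -- response, give feedback, record mastery and branch forward or back.
    on mouseUp
      global masteredEpitome1
      get card field "studentAnswer"
      if it is "interest" then
        answer "Correct."
        put true into masteredEpitome1
        go to card "Elaboration 1.1"
      else
        answer "Not quite. Review the overview and try again."
        go to card "Epitome Overview"
      end if
    end mouseUp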

STEP 7: Author the lesson

In CBT, this means the process of programming, debugging and testing the courseware. Authoring can be carried out using traditional programming languages (Pascal, BASIC, C) or by using authoring languages and systems (Best Course of Action, HyperCard), or through a combination of both.

The key feature of the Elaboration Theory model is the way in which the epitome is presented. Reigeluth (1983) suggests that one might start with a motivational strategy component, followed by one or more analogies and an examination of all the learning prerequisites to be mastered by the learner. The organising content ideas are then presented and supported by other content ideas, which are arranged using a simple-to-complex strategy. Finally, summarisers and synthesisers are used to ensure that the learning is efficient and effective.

Following the epitome, the content gradually unfolds through a series of elaborations, each of which follows a format not unlike that of the epitome.

Sequencing instruction under Elaboration Theory provides almost unlimited flexibility, since the only restriction placed on the designer is to ensure that the learner does not enter a deeper level of elaboration before completing the immediately preceding level. Unlike the simple, sequential or tree structure of the hierarchical model, Elaboration Theory allows a lattice structure to evolve, with movement possible in both the vertical and horizontal directions.

As stated previously, the model is founded on the principle of allowing the learner to take control of her own learning. Consequently, any forced instructional sequencing should guide the learner towards deciding for herself how best to move through the subject matter: a guided discovery approach to learning.

HyperCard: An authoring tool for the application of the Elaboration Theory Model

HyperCard has been described as one of the most exciting pieces of software launched onto the market in recent years. Its potential use for the development of computer based training should undoubtedly grow as courseware developers become aware of its capabilities.

HyperCard belongs to a growing technology called Hypertext. If you have an Apple Macintosh, you may have encountered this technology either in the form of HyperCard or of Guide, which is also available for the IBM PC and compatibles.

Hypertext is a term and idea coined in the 1960s by Ted Nelson. The idea behind Hypertext is simple: organise data so that it can be accessed in a non-linear fashion. Nelson's belief was that even large amounts of data should be accessible in flexible and intuitive ways (Carr, 1988). Not only could one create text retrieval facilities, but in a fully fledged system the designer could incorporate graphics, even videodisc footage, and access to other parts of a file or to completely different files. An additional feature of this technology is that as a user moves through the data the system keeps track of the path which he has followed.

What differentiates this technology from other retrieval systems is that a user can "browse" through information, moving from one place or idea to another through a web of associations. It is this unlimited branching capability which has led some to suggest that such a facility more closely supports the way the human mind works than conventional, linear data base managers.

In the last several months, Learnware Technologies has explored the use of HyperCard in the design of computer based training. As developers of CBT, we believe that HyperCard has tremendous potential in the training area, where information needs to be retrieved and presented to a student.

Earlier in this paper, learner control was noted as an important determinant of quality interactions. HyperCard, to a large extent, leaves control of the process in the hands of the student. Those used to a structured learning environment may argue that this is a serious drawback; after all, is not teaching or training an activity which should be teacher or instructor led?

In order to deflect such criticism, one needs to look at the flexibility of HyperCard. It can be modified to serve as an effective tutor. The student can be led through the material by offering avenues and paths. Although the student retains control, the designer can direct him or her through the material, or make certain elements available only after certain prerequisites have been met. A brief scenario may help to show how HyperCard could be used effectively.

HyperCard presents the student with screens containing text, graphics and questions to determine comprehension. Through the "hot buttons" facility, the student is given the opportunity to explore links to other information. Control can be provided by designing links to certain epitomes (stacks) and not to others.
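
By way of illustration only, a "hot button" is simply a button whose script takes the student to a linked card or stack; links to epitomes the designer wishes to withhold are simply not provided. The stack name in the following sketch is hypothetical.

    -- Illustrative "hot button" script placed over a key term on a card:
    -- clicking it takes the student to the stack containing the related epitome.
    on mouseUp
      go to stack "Epitome - Interest"
    end mouseUp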

The student is presented with a generalised overview of the topic and has the facility of "zooming in" to a given substratum and interacting at that level. Movement between the strata (stacks) can be restricted by requiring mastery of certain elements (embedded "gates") before making available any other related concept or elaboration.
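
A gate of this kind might be sketched as follows. The mastery flag and stack name are hypothetical; the flag would be set by the self test items on the preceding level, as in the lesson logic sketch above.

    -- Illustrative "gate": the button leading to a deeper stratum opens only
    -- when the flag set by the previous level's self test is true.
    on mouseUp
      global masteredLevel1
      if masteredLevel1 is true then
        go to stack "Elaboration Level 2"
      else
        answer "Please complete Level 1 before moving deeper."
      end if
    end mouseUp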

Furthermore, HyperCard can keep track of the student's path within the various levels of the elaboration and consequently evoke the most appropriate series of interactions within specified domains of the knowledge base. Levels of the elaboration can be designed using a simple-to-complex approach, ensuring that interactions are not beyond the intellectual reach of the student. As the student progresses from one prerequisite to another, more elaboration is offered, which in turn allows the student to explore further links and associations. Throughout the entire process, the student is left to concentrate on the content and establish his or her own cognitive map within the confines of the various levels of the elaboration. At any time the student has the option of retreating to the overview level and reviewing the contextual framework, or branching to related epitomes.
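
Path tracking of this kind can be handled with a pair of stack-level handlers, sketched below. The global variable name is hypothetical, and what the lesson logic then does with the recorded path is left to the designer.

    -- Illustrative path tracking placed in the stack script: each card the student
    -- visits is appended to a global list that the lesson logic can inspect later.
    on openStack
      global studentPath
      put empty into studentPath
    end openStack

    on openCard
      global studentPath
      put the short name of this card & return after studentPath
    end openCard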

From our initial investigations of HyperCard, there is no doubt that the product has the inherent capabilities and flexibility to create highly effective courseware. By combining a sound theoretical framework such as the Elaboration Theory model with appropriate software tools such as HyperCard, the authors believe that the prescriptive tutorial approach to CBT development may have a viable alternative.

Summary

The Elaboration Theory Model is not a new instructional paradigm. Indeed, references to this instructional approach can be found in literature dating back to 1979. Its application to CBT development, however, has been largely ignored because of the difficulty of implementing non sequential, associative branching with past authoring systems.

HyperCard is based on giving the user maximum control in accessing, retrieving and interacting with data. This characteristic offers significant opportunities to apply non sequential, associative learning paths that allow the learner to select his or her desired learning sequences. Not only can students select their respective learning strategies, but the software can keep track of the student's path and so allow the user to be modelled. The system can retain this information for future interactions, assisting designers and students alike to establish the most appropriate sequencing of information. What is perhaps more important, however, is that HyperCard offers this capability without the need to be an expert programmer or to spend huge sums of money to author instructional materials.

References

Block, J. H. (1971). Mastery Learning: Theory and Practice. New York: Holt, Rinehart & Winston.

Briggs, L. J. (Ed.) (1977). Instructional Design: Principles and Applications. Englewood Cliffs, NJ: Educational Technology Publications.

Bruner, J. S. (1960). The Process of Education. New York: Vintage Books.

Bruner, J. S. (1966). Toward a Theory of Instruction. New York: Norton.

Carr, C. (1988). Hypertext: A new training tool? Educational Technology, 28(8), 7-11.

Cronbach, L. J. & Snow, R. E. (1977). Aptitudes and Instructional Methods: A Handbook for Research on Interactions. New York: Irvington.

Gagne, R. M. (1968). Learning hierarchies. Educational Psychologist, 6(1), 1-6.

Gagne, R. M. (1977). The Conditions of Learning. New York: Holt, Rinehart & Winston.

Lumsdaine, A. & Glaser, R. (1960). Teaching Machines and Programmed Learning. Washington, DC: Department of Audiovisual Instruction, NEA.

Merrill, M. D. (1979). Learner Controlled Instructional Strategies: An Empirical Investigation. Final report on NSF Grant No. SED76-01650, February.

O'Neil, H. F. Jr. (Ed) (1978). Learning Strategies. New York: Academic Press.

Reigeluth, C. M. (Ed.) (1983). Instructional Design Theories and Models: An Overview of their Current Status. Hillsdale, NJ: Erlbaum.

Rigney, J. W. (1978). Learning Strategies: A Theoretical Perspective. In H.F. O'Neil Jr. (Ed). Learning Strategies. New York: Academic Press.

Authors: Malcolm Morrison is the Managing Director of the Learning Laboratory Pty Ltd, an Adelaide based company specialising in the development and delivery of computer based training and education. Malcolm's special fields of interest are human information processing, instructional design and the development of intelligent computer based learning systems.

Pascal Grant is the Managing Director of Learnware Technologies Pty Ltd, a technology based training consultancy company designing and developing computer and video based training systems for corporate and educational clients. His specific areas of interest include instructional design methodologies and the development of "user friendly" courseware.

Please cite as: Morrison, M. J. and Grant, P. (1988). The elaboration theory model of instructional design: HyperCard - new perspectives on learner controlled CAL. In J. Steele and J. G. Hedberg (Eds), Designing for Learning in Industry and Education, 51-57. Proceedings of EdTech'88. Canberra: AJET Publications. http://www.aset.org.au/confs/edtech88/morrison.html

