Not that 'technology' hasn't for a long time played a major role in all our educational systems; after all, a book requires an enormous amount of machinery and mechanical ingenuity to bring it into existence and a Biro, or even a 'lead' pencil, is a technological masterpiece. So what is it about the 'new' technologies that is so different? It is here that the word 'interactivity' enters the debate. In so doing, it has opened a Pandora's Box of myths, monsters and misunderstandings.
In the world of mathematics there is a useful concept of infeasibility. Infeasible things have the annoying property that whilst you can imagine them, and sometimes even prove useful things about them, you can't realise them. For example, there are some computational problems that are perfectly well defined and specified but for which, in order to perform the calculations, more time and resources would be required than are available in principle, never mind in practice. It just may be that some of the educational tasks that people have set for the new technologies fall into that class. It is appropriate to inquire, therefore, into the nature of the problems that these new technologies have been asked to solve.
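The notion of infeasibility can be made concrete with a small sketch. The example below is illustrative only: it counts the steps needed to examine every subset of n items, a perfectly well-specified task whose cost of 2^n steps outruns any machine for quite modest n. The assumed machine speed of 10^9 operations per second is an optimistic round figure, not a claim about any particular hardware.

```python
# A sketch of computational infeasibility: exhaustively checking every
# subset of n items takes 2**n steps. The problem is perfectly well
# specified, yet for modest n the time required is unobtainable.
def brute_force_steps(n):
    """Steps needed to examine every subset of n items."""
    return 2 ** n

OPS_PER_SECOND = 10 ** 9            # an optimistic machine (assumption)
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for n in (20, 50, 100):
    years = brute_force_steps(n) / OPS_PER_SECOND / SECONDS_PER_YEAR
    print(f"n = {n:3d}: about {years:.3g} years")
```

At n = 100 the answer runs to some 10^13 years, many times the age of the universe: the task is imaginable and specifiable, but not realisable.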
There is an important sense in which the field of Interactive Systems is technology led, or technology driven. We have invented machines which have, in addition to their original purpose, opened up unlooked-for possibilities; now, we ask, what extra use can we make of them? This technology push has driven us up some unpleasantly blind alleys, perhaps the worst of which is 'Interactive Multimedia'. Only now, when there are serious problems to be tackled, are people beginning to ask what the properties of such machines should be in order that they are fit for the tasks required.
There is a subtle interplay between the exact nature of a machine and the things it can do. No matter how you try, you can't boil an egg in a toaster. Everybody knows that because we all have a good working knowledge of eggs, water and toasters. Unfortunately, far too few people have a good working knowledge of computers and the things that they can reasonably be expected to do. Even amongst the community of those who should know better there is a reluctance to deny the possibility that one day (real soon now) the computer will be able to do it, whatever 'it' is. These people have been overwhelmed by the pace of technological advance and have lost the power to reason. In their world everything is possible, if not now, then with the next generation of computers, or perhaps the one after.... Such technoptimism is dangerous and often goes hand in hand with the idea that there is no real need to understand exactly how something works in order to decide on its suitability.
If we are to make the best use of the failures appositely to apply technology to teaching and learning over the past 10 years, we must be prepared to confront these issues and reconcile human needs with mechanical possibilities.
One essential feature of educational communication is that it is asymmetric. It is not, by definition, a dialogue amongst equals. There are those with a story to tell and those with a desire to understand it. This inequality is not a moral, spiritual or ethical pronouncement; it is a logical necessity. The expectation of learners is that they will receive more information than they will impart. This is certainly a quantitative and probably a qualitative assessment. The significance of this inequality is that it has implications for any communication channels that have to be implemented technologically. Its significance is embodied in the traditional forms of teaching, where the various schemes:
    One --> One     Tutorial
    One --> Few     Seminar
    One --> Many    Lecture
differ in the capacity of the 'return path' from the student(s) to the teacher. Even in a tutorial, if there were no net flux in the direction of the learner, the question might be raised as to who should be paying whom!
There is another important factor in educational dialogue. It must necessarily be the case that, whereas the responses by the teacher must be articulate, accurate and appropriate, those of the learner cannot be expected to be so. Indeed, they must, by definition, be largely inchoate since, if they were not, this would surely indicate a sufficient understanding of the subject under discussion. It was the failure properly to appreciate these two asymmetries that guaranteed the failure of Computer Based Instruction. The aspirations of those heady days of the early 1970s, when computers would revolutionise the teaching of everything, are long gone. There is a ghetto of embattled diehards defending a redoubt called CBT, but, whenever I look at any of their products, I'm greatly relieved that I can't be coerced by my employer into using them. Almost invariably, these products take no account of individual differences, enforce a regime of prescribed choices and deliver responses that are insensitive to the evolving context of the training. It is not that the designers of these systems are themselves brutal or insensitive, it is just that the machinery that they have to work with only admits of brutish constructions.
In the face of this problem various reductionist strategies have been tried to split a problem into byte-sized chunks. Training has been separated off from education by the strategy: 'training' is all the 'education' that you can measure. Training has been decomposed into components, or 'modules', whose sole properties are that they appear to be testable by one-line multiple-choice questions. The end result bears little resemblance to a human process, and the empirical results as to the efficacy of such methods can, in general, be expressed by the observation that if the subject is easy to teach in the real world, then doing it by computer isn't much worse and is probably cheaper and faster. Unfortunately, the sum of such tasks is a small proportion of all that must be imparted in today's bewildering world.
I have written at length elsewhere (Clark, 1991, 1987) of the structural problems of today's teaching machines. It is sufficient here to remark that the key (re-)discovery of recent times is the importance of Narrative in the presentation of mediated information. This finding comes from a consideration of one of the ubiquitous metaphors of the interactive learning fraternity: Browsing.
From a consideration of how people deployed the physical resources at their disposal whilst engaged in constructive thinking sprang the notion of autonomous objects co-existing in an organisation which permitted their selection, activation and interconnection. The need to enable such a world in a computer universe led to the concept of Object-Oriented Programming and the development of Smalltalk. The precise ways in which objects on this 'desktop' could be activated, and the ways in which the status of the various objects could be represented, led to the development of Icons and the Graphical User Interface (GUI). Amidst all this pioneering work, the word "browsing" was adopted to connote the activity of seeking out the particular aspect of the system important at that particular point in the user's thought process. Perhaps it is because there are not many sheep in Palo Alto that the pioneers were misled into that word: the essential feature of 'browsing' is that it is, in essence, a random activity undirected by any ontological strategy. If only they had used the word "hunting", with all its connotations of tactics, strategy and purpose, how much misbegotten thinking would have been saved.
We have a great many of our teaching styles as the legacy of Empire. Perhaps the greatest disabling feature of Empire has been the generation of a class of overseers, themselves incapable of carrying out the tasks they were to oversee. The Craft and Guild systems, with their emphasis on learning at the hand of the master by practical means, have demised; in our current society they are retained only in the professions that we value most: medicine and the law. It is impossible to become a Barrister or Doctor without an apprenticeship. No amount of scholarly endeavour is sufficient, and the License to Practice is not conferred by an educational institution but by a Guild. If, however, one has oneself not to 'do', but only to give orders to do, then sitting in a classroom is sufficient. The idea that knowledge exists divorced from practical skill is one of the most damaging heresies abroad today. "Just give me the broad brush and don't bother me with the details" is the most lethal utterance by a manager. It is essential to realise that there is no such thing as 'the general idea' of something. There is only a large number of minute particulars which, when fully grasped, can be referred to in shorthand by a single name. Possession of the name without the attendant mastery of the essential particulars is nugatory. The only way to own those particulars is through personal practical experience.
The heart of the power of the new technologies is the leverage that they give to those who are masters of the subject to change the direction of students' thoughts and ideas so that they come into line with the prevailing orthodoxy. (There is a delicate boundary between education and training from this point of view: whereas training is getting someone to see something in your way, education is enabling them to see it in their own. It is an inescapable fact to me that there can be no education without prior training.)
Until you know what it is that you don't understand, there is no hope whatever of learning anything. A 'good explanation' takes as its starting point a precise locus of incomprehension. To generate that 'good explanation', a teacher has to know what it is that the student(s) find difficult, and the context in which that problem is embedded. Having located that nexus, it is often comparatively easy for an experienced teacher to respond appropriately; this task is, however, quite outwith the foreseeable range of computation, not because of questions of megaflops or gigabytes but because teachers don't know what they are doing in any analytical way. Teaching is an art. The place to deploy the computer power is in the first stage: helping students to discover for themselves what it is that they currently don't know about the question under study; then, when that point is reached, being able to generate a record of the process in such a way as to indicate in a concise fashion to the teacher some useful indicators of the current difficulty. In this way the total time spent by the teachers in the essentially irreplaceable task of teaching is maximised by minimising the time spent finding out what it is that the student doesn't understand.
There is nothing new in this idea, in that it is the function of the exercise, examination or worked example to reveal a student's lack of comprehension. However, it may well be that this activity should be conducted before, rather than after, a piece of teaching. The reason for this is as follows. When the reasonable presumption was that the class was of largely uniform ability and following an ordered curriculum, then the most efficient way of proceeding was to teach to the majority first and catch the (few) stragglers afterwards. Now, however, when the backgrounds and abilities of individuals seeking education and training can no longer be presumed to be homogeneous, it is not possible to construct a plan of teaching that will be effective for a worthwhile majority: most people will be exceptions requiring attention. It is thus more efficient to offer a pre-filter to ascertain individual points of difficulty and only then to devise appropriate teaching strategies. The great hope for the new technologies is that it is now possible to devise such stratagems.
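The pre-filter idea can be sketched in a few lines. Everything here is invented for illustration: the topic names, the scores (taken to be fractions of probe questions answered correctly), and the pass mark of 0.8 are assumptions, not a description of any real system. The point is only that two learners probed on the same material yield quite different teaching plans.

```python
# A hedged sketch of the pre-filter: probe each learner first, then
# derive a teaching plan from the observed gaps rather than from an
# assumed uniform curriculum. Topic names and scores are invented.
def pre_filter(responses, pass_mark=0.8):
    """Return the probed topics that still need teaching attention."""
    return [topic for topic, score in responses.items() if score < pass_mark]

# Two learners with different backgrounds yield different plans.
plans = {
    "learner A": pre_filter({"fractions": 0.9, "ratios": 0.4, "graphs": 0.7}),
    "learner B": pre_filter({"fractions": 0.3, "ratios": 0.85, "graphs": 0.9}),
}
print(plans)
```

Under these assumed scores, learner A needs work on ratios and graphs while learner B needs only fractions; a single plan of teaching aimed at "the majority" would serve neither well.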
At the frontier of incomprehension, where you don't know what it is that you need to know, being asked questions isn't a lot of help. What is much more useful is if you can just try things and experiment a little with thoughts that come to you as you try to puzzle out a way to make the current state make sense. If these experiments have consequences that are properly representative of the system under study, then, like as not, if you have the relevant understanding filed under an unrelated set of thoughts, you will invoke it in this new context and see connections. The net result is that a history of your private investigation of a system will reveal to the experienced teacher those concepts, if any, that are, by their absence, making it difficult or impossible for you to 'get the point'.
Here 'interaction' is the essential enabling factor. This is not the interaction of 'press the spacebar to continue' or mouse clicking on one of a number of textual alternatives, but a way of bringing a worthwhile representation of a real system into the classroom. Note that there is no element of 'diagnosis' here: no 'artificial intelligence' or 'knowledge based system' working out what it is that you don't know or attempting to put you right. All that is envisaged, and that is difficult enough, is to create an appropriate emulation of some aspect of the subject under study. Such systems are now entering the realm of the feasible.
By devising mechanical means by which students can determine to a reasonable degree what it is that currently obstructs their comprehension, and by making this information easily accessible to the teacher, every minute of teacher time is used in the essentially human task of giving guidance and direction, rather than in diagnosing the problem.
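One minimal form such mechanical means might take is an interaction log: record the student's exploratory actions against the emulated system, then condense the trail into a short summary the teacher can scan before intervening. The class and method names below (ActionLog, record, summarise) and the sample actions are hypothetical, invented purely to illustrate the shape of the idea.

```python
# A hypothetical sketch: log a student's actions against an emulated
# system, then condense the trail for the teacher. No diagnosis is
# attempted; the summary merely points at what was tried and what
# kept failing, leaving interpretation to the human teacher.
from collections import Counter

class ActionLog:
    def __init__(self, student):
        self.student = student
        self.actions = []          # (action, outcome) pairs, in order

    def record(self, action, outcome):
        self.actions.append((action, outcome))

    def summarise(self):
        """Condense the trail: how much was tried, what kept failing."""
        failures = Counter(a for a, o in self.actions if o == "failed")
        return {
            "student": self.student,
            "attempts": len(self.actions),
            "repeated_failures": [a for a, n in failures.items() if n > 1],
        }

log = ActionLog("student-1")
log.record("set gear ratio", "failed")
log.record("set gear ratio", "failed")
log.record("increase torque", "ok")
print(log.summarise())
```

Note the deliberate absence of any 'artificial intelligence': the machine only gathers and compresses the history of the student's experiments; the teacher supplies the interpretation.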
It is appropriate to reconsider the strengths of the apprenticeship model of the acquisition of learning. Machines are now sufficiently powerful to deliver emulations of real world systems, and the narrative expression of the received way of understanding it can be expanded to take account of the different levels of knowledge, experience and understanding that individual students bring to the task.
The key enabling factor in this model of educational support is interaction, with the system specifically tailored to identify the individual actions so as to provide the teacher with an indication of the students' problem solving strategies.
There is more to this than adding a few 'buttons' to a graphic. The secret is to enable those with the gift of teaching to create narratives of their own which students can explore. The investment of time in the creation of such programs is large: larger than writing a conventional book, and requires skills not yet common in the profession. The concentration of effort is, nevertheless, essential, because the conventional methods are about to fail us. Our civilisation cannot withstand that collapse.
Clark, D. R. (1987). Twenty-first Century Books: An assessment of the role of videodisc in the next 25 years. In D. Laurillard (ed), Interactive Media: Working methods and practical applications. Chichester: John Wiley, 60-73.
Clark, D. R. and Sandford N. (1986). Semantic Descriptors and Maps of Meaning for Videodisc Images. Programmed Learning and Educational Technology (PLET), 23(Feb), 84-90.
Nelson, T. H. (1987). Literary Machines. South Bend: The Distributors, 702 South Michigan, South Bend, IN 46618.
Please cite as: Clark, D. R. (1992). The future of interactivity - is it really a hardware issue? In Promaco Conventions (Ed.), Proceedings of the International Interactive Multimedia Symposium, 547-556. Perth, Western Australia, 27-31 January. Promaco Conventions. http://www.aset.org.au/confs/iims/1992/clark.html