
The future of interactivity - is it really a hardware issue?

David R Clark, PhD
Managing Director, i-Media


Introduction

There is a consensus amongst teachers in all the westernised educational systems that technology-based products have a role to play. There is, however, no unanimity as to what that role should be. This situation is paradoxical: how are we all so sure that technology will help if we don't agree how? It is in the resolution of this paradox that we can find the pointers to the ways in which the modern technologies can be profitably harnessed for educational purposes.

Not that 'technology' hasn't for a long time played a major role in all our educational systems; after all, a book requires an enormous amount of machinery and mechanical ingenuity to bring it into existence and a Biro, or even a 'lead' pencil, is a technological masterpiece. So what is it about the 'new' technologies that is so different? It is here that the word 'interactivity' enters the debate. In so doing, it has opened a Pandora's Box of myths, monsters and misunderstandings.

In the world of mathematics there is a useful concept of infeasibility. Infeasible things have the annoying property that whilst you can imagine them, and sometimes even prove useful things about them, you can't realise them. For example, there are some computational problems that are perfectly well defined and specified but for which, in order to perform the calculations, more time and resources would be required than are available in principle, never mind in practice. It just may be that some of the educational tasks that people have set for the new technologies fall into that class. It is appropriate to inquire, therefore, into the nature of the problems that these new technologies have been asked to solve.

There is an important sense in which the field of Interactive Systems is technology led, or technology driven. We have invented machines which have, in addition to their original purpose, opened up unlooked-for possibilities; now, we ask, what extra use can we make of them? This technology push has driven us up some unpleasantly blind alleys, perhaps the worst of which is 'Interactive Multimedia'. Only now, when there are serious problems to be tackled, are people beginning to ask what the properties of such machines should be in order that they are fit for the tasks required.

There is a subtle interplay between the exact nature of a machine and the things it can do. No matter how you try, you can't boil an egg in a toaster. Everybody knows that because we all have a good working knowledge of eggs, water and toasters. Unfortunately, far too few people have a good working knowledge of computers and the things that they can reasonably be expected to do. Even amongst the community of those who should know better there is a reluctance to deny the possibility that one day (real soon now) the computer will be able to do it, whatever 'it' is. These people have been overwhelmed by the pace of technological advance and have lost the power to reason. In their world everything is possible, if not now, then with the next generation of computers, or perhaps the one after.... Such technoptimism is dangerous and often goes hand in hand with the idea that there is no real need to understand exactly how something works in order to decide on its suitability.

If we are to learn from the failures of the past 10 years to apply technology appositely to teaching and learning, we must be prepared to confront these issues and effect a reconciliation between human needs and mechanical possibilities.

The nature of communication

It might be handy if we had, in the English language, the word 'munication'. It would denote the one-way transmission of information, or describe that dialogue of the deaf in which the interlocutors do not modify their locutions in response to incoming data. If we had such a word, then the 'com-' prefixing 'communication' would emphasise the reciprocal, responsive nature of the current sense of the word. (In fact, 'municating' sounds just the right sort of word to describe the kind of thing that computers spend their time doing.) The etymology of 'communication' is, in fact, derived from the root 'common', which in turn carries the sense of 'binding with'; in contemporary language, the sense of 'common property making' has, sadly, almost entirely disappeared. In the context of education and training it is particularly important to understand the properties of the prevalent forms of communication in order that the suitability of machines to undertake some of the human tasks can properly be assessed.

One essential feature of educational communication is that it is asymmetric. It is not, by definition, a dialogue amongst equals. There are those with a story to tell and those with a desire to understand it. This inequality is not a moral, spiritual or ethical pronouncement; it is a logical necessity. The expectation of learners is that they will receive more information than they will impart. This is certainly a quantitative and probably a qualitative assessment. The significance of this inequality is that it has implications for any communication channels that have to be implemented technologically. Its significance is embodied in the traditional forms of teaching, where the various schemes:

     One --> One     Tutorial

     One --> Few     Seminar

     One --> Many    Lecture

differ in the capacity of the 'return path' from the student(s) to the teacher. Even in a tutorial, if there were no net flux in the direction of the learner, the question might be raised as to who should be paying whom!

There is another important factor in educational dialogue. It must necessarily be the case that, whereas the responses by the teacher must be articulate, accurate and appropriate, those of the learner cannot be expected to be so. Indeed, they must, by definition, be largely inchoate since, if they were not, this would surely indicate a sufficient understanding of the subject under discussion. It was the failure properly to appreciate these two asymmetries that guaranteed the failure of Computer Based Instruction. The aspirations of those heady days of the early 1970s, when computers would revolutionise the teaching of everything, are long gone. There is a ghetto of embattled diehards defending a redoubt called CBT, but, whenever I look at any of their products, I'm greatly relieved that I can't be coerced by my employer into using them. Almost invariably, these products take no account of individual differences, enforce a regime of prescribed choices and deliver responses that are insensitive to the evolving context of the training. It is not that the designers of these systems are themselves brutal or insensitive, it is just that the machinery that they have to work with only admits of brutish constructions.

'Teaching' is essentially a human activity

One of the fundamental tenets of computing is that only those processes for which there is an explicit procedure are amenable to computation. If we don't know how to do it then we can't program computers to do it either [pace Wittgenstein: 'Wovon man nicht sprechen kann, darüber muss man schweigen' ('Whereof one cannot speak, thereof one must be silent'), Tractatus 7]. In this sense it is safe to say that we don't have any reliable universal algorithm for 'teaching'. In the last analysis, there are just 'teachers', individuals who have a gift not only for understanding issues but also for imparting them to others. Like musicians, who have no worthwhile idea of how they are able to make their music, and artists, who talk the most abominable rubbish when asked to articulate their particular skills in language, so teachers have no sure knowledge of how they do what they do. In a society that depends on the effective transmission of information, this is a most frustrating and inconvenient circumstance; unfortunately, this makes it no less true.

In the face of this problem various reductionist strategies have been tried to split the problem into byte-sized chunks. Training has been separated off from education by the strategy: 'training' is all the 'education' that you can measure. Training has been decomposed into components, or 'modules', whose sole properties are that they appear to be testable by one-line multiple-choice questions. The end result bears little resemblance to a human process, and the empirical results as to the efficacy of such methods can, in general, be expressed by the observation that if the subject is easy to teach in the real world, then doing it by computer isn't much worse and is probably cheaper and faster. Unfortunately, the sum of such tasks is a small proportion of all that must be imparted in today's bewildering world.

'Multimedia'

In order to ameliorate this failure of the computer to live up to its early promise, various attempts have been made to employ some of the new sound and image techniques that, for entirely other reasons, have entered the computer domain. This is very much like the 'Bolt-on Goodies' market for motor cars. Adding a 'go faster' stripe on your old banger, or attaching a fancy tailpipe, won't actually do anything for the performance of your car; nevertheless, someone must be fooled, because there is a whole industry based on it. As 'Multimedia' is a bit more expensive, but often just as useless, that industry is taking a little longer to get off the ground. There is just time to rescue it before it becomes entirely discredited.

I have written at length elsewhere (Clark, 1991, 1987) of the structural problems of today's teaching machines. It is sufficient here to remark that the key (re-)discovery of recent times is the importance of Narrative in the presentation of mediated information. This finding comes from a consideration of one of the ubiquitous metaphors of the interactive learning fraternity: Browsing.

Browsing

If there is one notion that has done more damage to rational educational thinking than 'Hypertext' over the past few years it is probably 'HyperMedia'. Those who have taken the trouble to seek out and read the seminal documents (eg Nelson, 1987) will know that building on the idea of hypertext is far worse than building on quicksand. There may, and this is an article of faith, be patches of firm ground in the morass, but, to judge by the number of optimistic ventures that have vanished without trace into the mire, they must be few and far between. There is, however, one metaphor that has underpinned a great deal of the work on which the hyper-industry is based that does repay study; that metaphor is The Desktop. The computer modelling of this essentially human entity owes almost all its inspiration to the work carried out during the 1970s at the Palo Alto Research Center (PARC) of the Xerox Corporation.

From a consideration of how people deployed the physical resources at their disposal whilst engaged in constructive thinking sprang the notion of autonomous objects co-existing in an organisation which permitted their selection, activation and interconnection. The need to enable such a world in a computer universe led to the concept of Object-Oriented Programming and the development of Smalltalk. The precise ways in which objects on this 'desktop' could be activated, and the ways in which the status of the various objects could be represented, led to the development of Icons and the Graphical User Interface (GUI). Amidst all this pioneering work, the word 'browsing' was adopted to connote the activity of seeking out the particular aspect of the system important at that particular point in the user's thought process. Perhaps it is because there are not many sheep in Palo Alto that the pioneers were misled into that word: the essential feature of 'browsing' is that it is, in essence, a random activity undirected by any ontological strategy. If only they had used the word 'hunting', with all its connotations of tactics, strategy and purpose, how much misbegotten thinking would have been saved.

Hypertext

The key concept behind hypertext is that, whilst in pursuit of information held in a document, if an unfamiliar or unknown term is encountered, activity in the current document may be suspended and an efficient search for clarification of the problematic term undertaken; on satisfactory completion of this subsidiary task, the original one can be resumed without displacement. Interesting extensions to an originally bibliographic model have been made by the incorporation of images into the universe of annotated documents, but this has in itself posed a new and much more difficult set of problems (see eg Clark and Sandford, 1986). The underlying difficulty in this model of information provision is that the universe of possible annotations is large, the limiting case being one entry per new word of the original document. Moreover, the universe is recursive, in that the defining documents might themselves be hypertexts, offering the dreadful possibility of the circular reference and the near certainty of getting lost in hyperspace. These considerations lead me to suggest that generalised hypertext is infeasible, and that a more carefully constrained use of these ideas must be implemented if the system is to have practical utility. It is beginning to appear that the required constraint is the construction of a narrative.
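
To make the suspend-and-resume model, and its recursion hazard, concrete, here is a minimal sketch in Python. The Document class, its link table and the sample titles are invented for illustration; the essential points are the suspension of the current document without displacement, and the visited-set guard against the circular references just described.

    # A hypothetical document: some text, plus a table of links from
    # problematic terms to the documents that clarify them.
    class Document:
        def __init__(self, title, text, links=None):
            self.title = title
            self.text = text
            self.links = links or {}    # term -> clarifying Document

    def read(doc, visited=None, depth=0):
        """Read a document, suspending it at each problematic term to
        pursue the clarifying document, then resuming without
        displacement; the visited set refuses circular references."""
        if visited is None:
            visited = set()
        if doc.title in visited:
            return                       # circular reference: do not loop
        visited.add(doc.title)
        print('  ' * depth + doc.title)  # trace of the reading order
        for term, target in doc.links.items():
            read(target, visited, depth + 1)    # the subsidiary task
        # returning from the recursion resumes the suspended document

    glossary = Document('Glossary', '...')
    glossary.links['hypertext'] = glossary   # a circular reference
    read(Document('Paper', '...', {'hypertext': glossary}))

Even this toy shows the difficulty: with one link per new word the recursion fans out combinatorially, and only an external constraint, such as the narrative suggested below, keeps the traversal finite and purposeful.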

Teaching or learning?

It is a view commonly held that in today's world the emphasis must be on learning, and that the needs of the learner are paramount. 'Teaching' carries with it an authoritarian air and a notion of regimentation and inflexibility of approach. In today's world, with its rapid changes of technology and the consequent need for retraining, the concept of Education Permanente, as the French have it, is very persuasive. Continuing Education, the idea that it is no longer enough to go to school as a child to equip yourself for the rest of your life, is gaining ground in all technological societies (such an approach has always been de rigueur in 'primitive' societies). In the face of this pressure, the 19th Century model of the classroom full of compliant children absorbing by rote the dicta of the teacher has to go, but it is important not to throw the baby out with the bathwater.

A great many of our teaching styles are the legacy of Empire. Perhaps the greatest disabling feature of Empire has been the generation of a class of overseers, themselves incapable of carrying out the tasks they were to oversee. The Craft and Guild systems, with their emphasis on learning at the hand of the master by practical means, are in our current society retained only in the professions that we value most: medicine and the law. It is impossible to become a Barrister or Doctor without an apprenticeship. No amount of scholarly endeavour is sufficient, and the Licence to Practise is not conferred by an educational institution but by a Guild. If, however, one has oneself not to 'do', but only to give orders to do, then sitting in a classroom is sufficient. The idea that knowledge exists divorced from practical skill is one of the most damaging heresies abroad today. "Just give me the broad brush and don't bother me with the details" is the most lethal utterance by a manager. It is essential to realise that there is no such thing as 'the general idea' of something. There is only a large number of minute particulars which, when fully grasped, can be referred to in shorthand by a single name. Possession of the name without the attendant mastery of the essential particulars is nugatory. The only way to own those particulars is through personal practical experience.

Narrative

It is useful to see the relation between master and apprentice as a narrative. The apprenticeship is an individual unfolding of the story of the craft. The precise way in which the revelation takes place is under the control of the master, subtly modified in response to the performance of the apprentice. This is an idealised vision. Tradition, custom and practice and vested interest serve to distort it, as in all human activity. Nevertheless, it is a vision worth pursuing. The essential feature of this model of the acquisition of learning is that control is in the hands of the master. This must be accepted as appropriate and all tests of the applicability of new technology to the process of instruction judged against their ability to empower the teacher. It is not possible to empower the learner. The essence of the concept of 'power' is directed energy. To be in the position of a learner is to be in a position from which the next step is not obvious, for, if that were otherwise, the next step would be immediate and the concept of 'learning' void - you knew what to do already! Thus the learner is, in this sense, powerless, in that the direction in which energy must be dissipated is either unknown or unclear. It is for the teacher to impart that direction. The trajectory through the given field of knowledge is a narrative whose direction at each point is set by the master in response to the local requirements of the apprentice. It is no accident that the etymology of 'education' is from 'to lead'.

The heart of the power of the new technologies is the leverage that they give to those who are masters of the subject to change the direction of students' thoughts and ideas so that they come into line with the prevailing orthodoxy. (There is a delicate boundary between education and training from this point of view: whereas training is getting someone to see something in your way, education is enabling them to see it in their own. It is an inescapable fact to me that there can be no education without prior training.)

What can machines do?

I hope that the foregoing analysis has led you to the conclusion that, in the educational context, the answer to that question is: "not a lot". Things are not, however, entirely hopeless. It is all a matter of setting appropriate goals. Given that machines can't 'teach' in any worthwhile sense of that word, as there is as yet no algorithm that can be programmed, what they can do is facilitate the task of a teacher in the modern world, where it is no longer possible to presume a uniform level of knowledge when establishing pre-requisites for a given program of instruction. The manner of this facilitation is based on the division of a teaching act into two distinct parts: reaching a point of shared misunderstanding, then rectifying that misunderstanding.

Until you know what it is that you don't understand, there is no hope whatever of learning anything. A 'good explanation' takes as its starting point a precise locus of incomprehension. To generate that 'good explanation', a teacher has to know what it is that the student(s) find difficult, and the context in which that problem is embedded. Having located that nexus, it is often comparatively easy for an experienced teacher to respond appropriately; this task is, however, quite outwith the foreseeable range of computation, not because of questions of megaflops or gigabytes but because teachers don't know what they are doing in any analytical way. Teaching is an art. The place to deploy the computer power is in the first stage: helping students to discover for themselves what it is that they currently don't know about the question under study; then, when that point is reached, being able to generate a record of the process in such a way as to indicate in a concise fashion to the teacher some useful indicators of the current difficulty. In this way the total time spent by the teachers in the essentially irreplaceable task of teaching is maximised by minimising the time spent finding out what it is that the student doesn't understand.

There is nothing new in this idea, in that it is the function of the exercise, examination or worked example to reveal a student's lack of comprehension. However, it may well be that this activity should be conducted before, rather than after, a piece of teaching. The reason for this is as follows. When the reasonable presumption was that the class was of largely uniform ability and following an ordered curriculum, then the most efficient way of proceeding was to teach to the majority first and catch the (few) stragglers afterwards. Now, however, when the backgrounds and abilities of individuals seeking education and training can no longer be presumed to be homogeneous, it is not possible to construct a plan of teaching that will be effective for a worthwhile majority: most people will be exceptions requiring attention. It is thus more efficient to offer a pre-filter to ascertain individual points of difficulty and only then to devise appropriate teaching strategies. The great hope for the new technologies is that it is now possible to devise such stratagems.
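
As a concrete illustration of such a pre-filter, consider the following Python sketch. The student names, topic labels and data structure are all invented; the point is only that the results of preliminary exercises are condensed into groups of shared difficulty, from which the teacher can devise a strategy per real need rather than per presumed majority.

    from collections import defaultdict

    def group_by_difficulty(results):
        """results maps each student to the topics that preliminary
        exercises revealed as points of difficulty. Returns a mapping
        of topic -> students: the teacher's working groups."""
        groups = defaultdict(list)
        for student, missed in results.items():
            for topic in missed:
                groups[topic].append(student)
        return dict(groups)

    results = {
        'Ann':    ['tensors'],
        'Bashir': ['tensors', 'elasticity'],
        'Chen':   [],     # no detected difficulty; needs no attention yet
    }
    print(group_by_difficulty(results))
    # {'tensors': ['Ann', 'Bashir'], 'elasticity': ['Bashir']}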

The importance of interactivity

The notion underlying this model of learning is that most of the activity associated with developing an understanding of a topic consists in re-arranging or re-interpreting things already known in some other context. The fraction of events entailing the acquisition of entirely new information is comparatively small. Lack of this information is, by definition, nevertheless fatal to subsequent understanding. The problem is that different people will lack different bits, so to speak. However, one can't say what it is that one lacks, because the lack renders accurate utterance impossible. It is important here to distinguish clearly between poor performance and incomprehension: being bad at Tensor Calculus is quite different from realising that Tensor Calculus is the posh name for the things you had worked out for yourself (in a limited way) when you tried to understand elasticity. Once it is clear to you that Tensor Calculus (by any name) is what you need, getting good enough at it is a different kind of task.

At the frontier of incomprehension, where you don't know what it is that you need to know, being asked questions isn't a lot of help. What is much more useful is if you can just try things and experiment a little with thoughts that come to you as you try to puzzle out a way to make the current state make sense. If these experiments have consequences that are properly representative of the system under study, then, like as not, if you have the relevant understanding filed under an unrelated set of thoughts, you will invoke it in this new context and see connections. The net result is that a history of your private investigation of a system will reveal to the experienced teacher those concepts, if any, that are, by their absence, making it difficult or impossible for you to 'get the point'.

Here 'interaction' is the essential enabling factor. This is not the interaction of 'press the spacebar to continue' or mouse clicking on one of a number of textual alternatives, but a way of bringing a worthwhile representation of a real system into the classroom. Note that there is no element of 'diagnosis' here: no 'artificial intelligence' or 'knowledge based system' working out what it is that you don't know or attempting to put you right. All that is envisaged, and that is difficult enough, is to create an appropriate emulation of some aspect of the subject under study. Such systems are now entering the realm of the feasible.
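
The following Python fragment sketches, under invented assumptions, what such an emulation might look like in its very simplest form. The emulated system (a resistor obeying Ohm's law) and all the names are illustrative; what matters is that the student is free to try things, and that the history of those trials, not any machine 'diagnosis', is what reaches the teacher.

    class RecordedEmulation:
        """A toy emulation of a real system that records every
        experiment the student tries; no diagnosis, only history."""

        def __init__(self, resistance_ohms=100.0):
            self.resistance = resistance_ohms
            self.history = []                     # the student's trail

        def try_voltage(self, volts):
            """Student experiment: apply a voltage, observe the current."""
            current = volts / self.resistance     # the emulated law
            self.history.append((volts, current))
            return current

        def teacher_summary(self):
            """A concise record of the investigation, for the teacher."""
            return ['V=%g -> I=%g A' % (v, i) for v, i in self.history]

    lab = RecordedEmulation()
    lab.try_voltage(5.0)
    lab.try_voltage(10.0)
    print(lab.teacher_summary())   # ['V=5 -> I=0.05 A', 'V=10 -> I=0.1 A']

A student who only ever doubles the voltage, or who never varies it at all, reveals something to an experienced teacher that no multiple-choice score could.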

The way forward

Once the goal of interactivity, from a pedagogical point of view, is clear, then the delivery of that interactivity becomes a matter of hardware. Its purpose, as argued in this presentation, is to allow students to reveal to themselves and to their teacher what it is that they currently don't know and which, by that lack or misunderstanding, is hampering their progress along the teacher's narrative. The most powerful means by which such representations are presented is by images and sounds from the real world. However, it is insufficient that mere images be presented: it is essential that individual regions of these images can have 'meanings' predicated to them by the teacher, such that specific interactions produce particular results. For two dimensional imagery this is currently manageable; for example, it is possible to offer a medical student an image and pose the task "point to the region of necrotic kidney tissue". The value for the teacher lies in those responses which are incorrect, and the nature of those responses can be a good guide to the teacher, if not the student, as to what information is lacking. More difficult, but nevertheless currently feasible, are time dependent 2-dimensional images. For example: "with regard to the on-coming traffic and other relevant factors, indicate the events which make it safe and possible to move your articulated lorry into the overtaking lane". Where it gets really difficult is in the attribution of 'meaning' to points in the 2-D projection of 3-D objects which must be manipulated by the student. The problem here is not the detection of 'correct' responses, but the far more important task, from the point of view of the need to flag incomprehension, of associating meaning with inappropriate responses. I believe that it is not infeasible, but it will require all the resources of contemporary image computation to deliver.
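
For the two-dimensional case the machinery required is modest, as this hedged Python sketch suggests. The region shapes, labels and predicated meanings are invented for illustration; the essential move is that an incorrect response is not merely rejected but mapped to a meaning useful to the teacher.

    # Regions of a (hypothetical) kidney image, each with a 'meaning'
    # predicated to it by the teacher. Boxes are (x0, y0, x1, y1).
    REGIONS = [
        ('necrotic_tissue', (120, 40, 180, 90),
         'correct'),
        ('healthy_cortex',  (20, 20, 110, 90),
         'cannot distinguish necrosis from healthy cortex'),
        ('renal_pelvis',    (60, 100, 140, 160),
         'confuses an anatomical structure with a tissue state'),
    ]

    def interpret_click(x, y):
        """Map a student's pointing response to its predicated meaning."""
        for name, (x0, y0, x1, y1), meaning in REGIONS:
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name, meaning
        return None, 'response outside any annotated region'

    print(interpret_click(150, 60))   # ('necrotic_tissue', 'correct')
    print(interpret_click(50, 50))    # incorrect, but informative

Extending the same predication to time-dependent imagery means attaching time intervals as well as regions; extending it to manipulable 3-D objects is, as argued above, the genuinely hard case.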

Conclusion

The use of computing machinery in teaching has reached a crucial point. It is clear that for all significant teaching tasks, humans are irreplaceable; nevertheless, the needs of our society for trained people are so acute that ways must be found to increase the accessibility and efficiency of teachers, who are a scarce and valuable resource.

By devising mechanical means by which students can determine to a reasonable degree what it is that currently obstructs their comprehension, and by making this information easily accessible to the teacher, every minute of teacher time is used in the essentially human task of giving guidance and direction, rather than in diagnosing the problem.

It is appropriate to reconsider the strengths of the apprenticeship model of the acquisition of learning. Machines are now sufficiently powerful to deliver emulations of real world systems, and the narrative expression of the received way of understanding them can be expanded to take account of the different levels of knowledge, experience and understanding that individual students bring to the task.

The key enabling factor in this model of educational support is interaction, with the system specifically tailored to identify the individual actions so as to provide the teacher with an indication of the students' problem solving strategies.

There is more to this than adding a few 'buttons' to a graphic. The secret is to enable those with the gift of teaching to create narratives of their own which students can explore. The investment of time in the creation of such programs is large: larger than writing a conventional book, and it requires skills not yet common in the profession. The concentration of effort is, nevertheless, essential, because the conventional methods are about to fail us. Our civilisation cannot withstand that collapse.

References

Clark, D. R. (1991). The demise of multimedia. IEEE Computer Graphics and Applications, 11(4, July), 75-80.

Clark, D. R. (1987). Twenty-first Century Books: An assessment of the role of videodisc in the next 25 years. In D. Laurillard (ed), Interactive Media: Working methods and practical applications. Chichester: John Wiley, 60-73.

Clark, D. R. and Sandford N. (1986). Semantic Descriptors and Maps of Meaning for Videodisc Images. Programmed Learning and Educational Technology (PLET), 23(Feb), 84-90.

Nelson, T. H. (1987). Literary Machines. South Bend: The Distributors, 702 South Michigan, South Bend, IN 46618.

Please cite as: Clark, D. R. (1992). The future of interactivity - is it really a hardware issue? In Promaco Conventions (Ed.), Proceedings of the International Interactive Multimedia Symposium, 547-556. Perth, Western Australia, 27-31 January. Promaco Conventions. http://www.aset.org.au/confs/iims/1992/clark.html

