
International Journal of Educational Technology


Online Instruction Versus Face-to-Face Instruction at UNIMAS

- Bing Hiong Ngu, Universiti Malaysia Sarawak

Abstract

This article compared online instruction and face-to-face instruction at Universiti Malaysia Sarawak (UNIMAS). Participants were full-time undergraduate students enrolled in the Human Resource Development program. The learning material used was ‘How to write research proposals and reports’, a subtopic in the ‘Research Methods’ course. The design of the online learning environment emphasized four types of interaction: learner-content interaction (topic notes), learner-self interaction (multiple choice exercises), learner-learner interaction and instructor-learner interaction (online discussion on case studies and a group project).

The ‘QuickPlace’ software was customized to incorporate a component of multiple choice exercises. This component was written in HTML and linked to ‘QuickPlace’. The face-to-face group attended routine lectures and tutorials on the same topic. Test results indicate that the online discussion helped students learn the case studies slightly better than the face-to-face instruction did. This may be due to the above learning interactions, which placed greater emphasis on self-oriented and group-oriented learning compared with an instructor-oriented face-to-face learning experience. However, feedback from the students indicates a need to further improve the design of the online course.

Introduction

With the advancement of learning technologies, online instruction has become increasingly popular (Newton, Marcella, & Middleton, 1998; Liu, Walter, & Brooks, 1998). Learners who follow online instruction are expected to engage in a self-paced learning strategy (Fischer & Scharff, 1998); in addition, they are expected to engage in a variety of online communications such as asynchronous (or synchronous) interaction with other learners and the instructor, virtual field trips, email, and voice communication through internet audio streaming (Wang, Hinn, & Arvan, 2001; Kumari, 2001; Carr-Chellman & Duchastel, 2000; Fischer & Scharff, 1998). The collaborative nature of learning within an online course design normally encourages group participation to generate an online group project or ideas to solve an issue.

In contrast, face-to-face instruction requires academic staff to give lectures and students to attend face-to-face tutorials. This mode of instruction requires lecturers to engage in interpersonal contact, social contact, and non-verbal communication with students. It is possible that direct contact with the instructor in traditional face-to-face settings may still carry some stigma that prevents students from communicating freely with their instructor.

An interview with a few students at Universiti Malaysia Sarawak (UNIMAS) showed that they were dissatisfied with some online courses offered that were merely lecture notes placed online. This indicates that more effort is required in designing a learning environment, which should represent an integral part of the development of quality online course content. The next section discusses three online cases at UNIMAS.

Cullip (personal communication, 1999) used Microsoft FrontPage 2000 to design an online discussion forum to replace the normal tutorial session in learning English grammar. He attributed the low participation rate of on-campus students to the fact that they were too busy completing assignments for other courses. As Cullip did not evaluate how much students learned from the online discussion, he did not know whether the online tutorial was as good as, or could supersede, the face-to-face tutorial.

Another lecturer, Joblie (personal communication, 1999), used ‘QuickPlace’ to encourage on-campus students to participate in online discussion and complete an online group project in which the lecturer (i.e., Joblie herself) acted as a facilitator. In contrast to Cullip, Joblie reported satisfactory performance by students in the group project (in which students were required to write an essay), which represented part of the course assessment. According to Joblie, it was possible that all 24 students who actively participated in the online discussion were satisfied with her method of monitoring. During the two weeks in which the online discussion was conducted, she replied to students’ queries (i.e., technical matters, course content and so on) within two days. Apart from the performance outcome, Joblie did not conduct an evaluation of learners’ perceptions of the ease of using ‘QuickPlace’. In both the case of Cullip and that of Joblie, the online discussion forum was limited to learner-learner interaction (discussion among online participants) and learner-instructor interaction (the lecturer monitors the discussion) (Moore, 1989; Moore & Kearsley, 1996).

With the assistance of a technical staff member from the Centre for Applied Learning and Multimedia at UNIMAS, NoorShah (2001) used ‘QuickPlace’ to create a website for teacher trainees who underwent teaching practice at schools located in different geographical areas of East Malaysia. Email communication was available for the trainees as well as the supervising lecturers from UNIMAS. In the website, the Notice Board served as an avenue for various announcements. The inclusion of school maps gave directions (for the supervising lecturers from UNIMAS) to reach a school, and a class timetable provided information on when a teacher trainee would be in a classroom. Further, an online discussion forum in which teacher trainees could share and contribute ideas on issues related to teaching (i.e., how to prepare a lesson plan, how to manage discipline in class and so on) proved to be a meaningful learning experience for them. However, the author noted that trainees who were separated across wider geographical regions tended to exchange more messages online compared with trainees placed in closer geographical areas. Also, some trainees in remote schools did not have Internet access.

This research attempts to address the above issues related to designing and evaluating an online course. A research team was involved in designing and evaluating online instruction for the topic ‘How to write research proposals and reports’. This project represents a pilot study which would offer suggestions for lecturers and policy makers at UNIMAS who intend to offer online undergraduate courses.

Computer-Mediated Collaborative Learning

Phillips, Fairholme and Luca (1998) used an email listserver to successfully encourage students to participate in a group project. Marks were allocated for appropriate listserver use, and this served to encourage inter-team and inter-student communication. Each team followed the progress of other teams through the weekly progress report. In contrast, Pearson (1999) reported that the use of an electronic network by trainee teachers did not facilitate active discussions among participants. Fear of public comments and criticism might have deterred them from expressing educational concepts and issues over the network.

The use of asynchronous learning networks (ALN) for undergraduate on-campus students who took a summer course while they were off campus provided some insights into the learning outcomes and students’ satisfaction with the ALN (Wang, Hinn, & Arvan, 2001). The success of online learning relies on learners’ motivation, self-discipline, and time-management skills. Factors that contribute towards learning outcomes include a reliable internet connection, the availability of technical support and the course design.

The technology of collaborative learning enables cross-cultural interaction in which participants from different countries collaborate to produce a joint project. For example, students from the City University of Hong Kong (part-time accountancy students) and the Eindhoven University of Technology in the Netherlands (business engineering students) participated in a joint project related to Information Technology (Vogel, Genuchten, Lou, Van Eekhout, Verveen, & Adams, 2001). Both synchronous and asynchronous interactions were made available with the aid of e-mail, videoconferencing and GroupSystems. Although eight out of 10 teams (72 students) succeeded in producing a report related to software engineering, the participants had to overcome technical problems, differences in knowledge background, and cultural adaptation.

A Comparison Design

In developing an interactive computer programming learning environment, Catenazzi and Sommaruga (1999) provided students with lecture notes and exercises so that they could edit, compile and run programs, and evaluate their learning performance. In the final examination, the test group performed slightly better than students who did not participate in the program. The authors argued that measuring the effectiveness of the computer learning environment required a comparison group which did not participate in it (presumably meaning those students who attended the traditional face-to-face mode of instruction).

Closely related to this is the development of distance learning computer courses (e.g., Programming and Software Engineering, Database Design and Analysis) at the University of New South Wales, Australia (Lambert, Shepherd, Ngu, Ho, Whale & Geissinger, 1996). Results from this research indicate that the distance learning group performed as well as the control group who attended face-to-face lectures and tutorials. According to the authors, the comparison group was included to assess whether a distance computer learning course targeting full-time workers enrolled as part-time students could achieve performance as good as, or better than, that of the face-to-face group.

The study reported by Schutte (1996) indicated that the virtual class mode of delivery was superior to the traditional classroom in learning social statistics. Students were motivated to engage with the new technology of the virtual class, and they spent more time completing their class work in the virtual environment than the traditional group did. In contrast, Smith and Taylor (1995) reported no significant difference in the mastery of a physics course that was presented to students in either a web version or a lecture version.

Collins (2000) compared web, correspondence and lecture versions of a second-year Biology course over four different semesters. Analysis of the course evaluations revealed that students in the web version were satisfied with the web approach, though in the final test they did not score better than students in either the correspondence version or the lecture version. Because the author intended eventually to replace the correspondence version with the web version, he included the correspondence version as a comparison to assess whether there was any significant difference between the two groups in their final test.

There is, however, an opposing view concerning comparative studies involving a computer medium and a traditional mode of learning. Lockee, Moore and Burton (2001) argued against the use of media comparison studies. They identified a range of variables (such as learner characteristics, media attributes, instructional strategy choices, and psychological theories) in comparison studies which make the comparative design inadequate to justify the learning effects of the two instructions. That is, it is difficult to establish cause and effect in a comparative design because it is almost impossible to match the variables for the participants in a comparison study.

In sum, research evidence points to two different views on evaluating computer mediated learning environments. As noted above, several researchers adopted an evaluation strategy emphasizing learners’ satisfaction and the learning outcomes of the course design. Other researchers compared the effectiveness of an online course and a traditional face-to-face course, with the intention of replacing the traditional course with the online course should the latter supersede the former. This investigation focused on full-time UNIMAS undergraduate students’ responses towards online instruction. Similar to researchers who adopted a comparative design, the researcher in this study included face-to-face instruction as a control to test whether online instruction would be superior to face-to-face instruction in terms of learning outcomes. Further, data related to students’ views about the online instruction were collected to examine whether they were satisfied with the online instruction.

Method

Materials

The topic investigated was ‘How to write research proposals and reports’, part of a ‘Research Methods’ course taught to undergraduate students enrolled in the Human Resource Development program. The learning environment provided topic notes, multiple choice exercises, and structured group activities (see Appendix 1). Each topic had its learning aims and activities; the latter included multiple choice exercises and/or online discussion (case studies). Links were provided to relevant websites for additional support materials. Materials for evaluating the online instruction included a test comprising 7 multiple choice questions that resembled the practice multiple choice questions, and a case study (similar to the online group discussion questions) in which students were required to answer 8 short questions. In addition, there was a questionnaire with 12 questions to elicit students’ perceptions of the online class; a diary sheet to track the amount of time invested in completing the online class; and a usability evaluation form requiring students to rate the ease of use of the system as well as the usefulness of the topic notes. Apart from the online group project, the diary sheet, questionnaire, usability evaluation and test were paper-based. Lastly, the computer generated feedback that included: (1) the number of messages posted, and (2) the number of contributors and responses generated. This provided a means to evaluate whether students participated in the online discussion.

Figure 1:  Interaction learning model

 

Interaction Learning Model

The design emphasized a student-centered, activity-based learning environment (Carr-Chellman & Duchastel, 2000). As shown in Figure 1, the design of the online learning environment emphasized learner-content interaction (Moore, 1989), learner-self interaction (Soo & Bonk, 1998), learner-learner interaction (Benson & Rye, 1996; McGill, Volet, & Hobbs, 1997), and instructor-learner interaction (Braggett, Retallick, Tuovinen, & Wallace, 1995; Brown, 1996; Stephenson, 1997-98).

When students studied the topic notes, they were engaged in learner-content interaction. Since the topic notes were presented in text only, this represented one-way learner-content interaction (Tuovinen, 1999). That is, learners could not manipulate (for example, edit) the presentation mode of the content as they sought to understand it. A multiple choice question was provided after each subtopic. This required students to tick the correct statements, and the computer provided immediate feedback (see Appendix 1). This frequent intervention of multiple choice exercises (learner-self interaction) aimed to help students reflect on (try to understand or make sense of) the topic content (Soo & Bonk, 1998). Note that in the learner-self interaction, learners were required to decide which statement(s) in the multiple choice questions were correct, whereas in learner-content interaction, learners were not required to judge the accuracy of the topic materials; rather, all the topic materials were presumed to be correct. However, it is possible that learner-self interaction overlaps with learner-content interaction to a certain degree. In both cases, the learners were required to reflect on and comprehend the materials presented to them.

The online case studies (learner-learner interaction and instructor-learner interaction) and the group project (learner-learner interaction) represented structured group activities. In both the learner-learner interaction and the instructor-learner interaction, the interaction was two-way and asynchronous. This asynchronous communication did not depend on the participants being present together at a specific time to exchange messages. In other words, it allowed participants time to reflect upon responses to the questions posed, and this may help them to generate fruitful discussion (Berge, 1994).

For the case studies, students were presented with discussion questions (see Figure 3). This provided an opportunity for them (learner-learner interaction) to debate, ask questions, and contribute to a discussion question. A lecturer acted as a facilitator (instructor-learner interaction) whose task was to monitor students’ participation in the discussion, and to steer the course of the discussion in cases where students were side-tracked. Note that this instructor-learner interaction applied to the online case studies only; the lecturer did not participate in the online group project discussion (see Figure 4). It was anticipated that online group discussion would facilitate a constructivist learning approach (Fischer & Scharff, 1998; Papert, 1980; Resnick, 1996). Its emphasis was on collaborative learning in which all students were required to generate ideas, and to construct and build knowledge as they exchanged information with either their peers or the facilitator. In other words, this would assist students to integrate multiple perspectives through reflection on diverse views on a subject matter.

Traditional Classroom Interaction 

Students were provided with a handout of lecture notes (including multiple choice exercises) that matched the online content. Using overhead transparencies, the lecturer presented to students the main points of each subtopic and its multiple choice exercise. Students were free to ask questions concerning the lectures, and to indicate which statement(s) in the multiple choice exercises were correct. Presumably students would attempt to understand the content via interaction with the lecturer, the notes or the multiple choice exercises. The interaction process was presumed to be quite substantial due to the small number of students (i.e., 19). For the tutorial materials, which were the same as the online group discussion questions, students were divided into groups to discuss the various tutorial questions. This group discussion represented interaction among students, and the results of the discussion were shared among the groups.

Designing the Learning Environment

The research team comprised staff from the Faculty of Cognitive Sciences and Human Development and the Centre for Applied Learning and Multimedia (CALM) at UNIMAS. One academic staff member (the researcher), who had never taught the subject before, wrote the content; two others, who had three years’ experience in teaching the subject, reviewed the content. The content writer worked closely with a technical staff member from CALM to create the online learning environment.

Lotus QuickPlace Software

The Lotus ‘QuickPlace’ software was used. It was developed initially to assist business people to work on group projects in a web-based working environment. To customize ‘QuickPlace’ for this online instruction, the component of multiple choice exercises was written in HTML and incorporated into ‘QuickPlace’. As can be seen in Figure 2, a menu on the left-hand side contained a few folders. The Bulletin Board was a folder that contained announcements. The Course Guide folder provided information such as an introduction to the topic, the lecturers’ contact details, assessment requirements, and a study schedule. The Study Help folder provided study tips on the online learning strategy. The Project Room was for students to discuss the online group project, while the Discussion Room catered for the online case studies. Each online case study had its own folder(s), depending on the number of discussion questions. The Links folder enabled students to get support materials from useful websites. The security and customize folders could only be accessed by the lecturer, to make modifications to the course content.

Figure 2: Screen shot of themain page
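The article does not reproduce the HTML used for these exercises; the sketch below is only a hypothetical illustration of the kind of multiple choice page that could be written in HTML and linked into ‘QuickPlace’. The item wording follows the exercise in Appendix 1, but the answer key, feedback messages and script are illustrative assumptions, not the original UNIMAS code.

```html
<!-- Hypothetical sketch only: item text follows Appendix 1; the answer key,
     feedback wording and script are illustrative assumptions. -->
<!DOCTYPE html>
<html>
<head>
  <title>Exercise: Purpose of a research proposal</title>
</head>
<body>
  <p>Please tick the correct statements. The general purpose of any proposal is:</p>
  <form onsubmit="checkAnswers(); return false;">
    <label><input type="checkbox" id="optA"> To persuade a committee of scholars to fund a research project</label><br>
    <label><input type="checkbox" id="optB"> To implement a program that you would like to launch</label><br>
    <label><input type="checkbox" id="optC"> To convince your supervisor that the research project is feasible in terms of resources</label><br>
    <button type="submit">Check answers</button>
  </form>
  <p id="feedback"></p>
  <script>
    // Illustrative answer key: which boxes should be ticked (assumed, not from the article).
    var answerKey = { optA: true, optB: false, optC: true };
    function checkAnswers() {
      var allCorrect = true;
      for (var id in answerKey) {
        if (document.getElementById(id).checked !== answerKey[id]) {
          allCorrect = false;
        }
      }
      // Immediate feedback, in the spirit of the learner-self interaction described above.
      document.getElementById("feedback").textContent = allCorrect
        ? "Correct. Please proceed to the next topic note."
        : "Some statements are ticked incorrectly. Please review Topic Note 1 and try again.";
    }
  </script>
</body>
</html>
```

Because the check runs entirely in the browser, a page like this could simply be linked from a ‘QuickPlace’ folder without any server-side scripting, which fits the self-paced, learner-self interaction described above.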

 

Procedure

Participants were second-year full-time undergraduate students at UNIMAS enrolled in the Human Resource Development program. They were randomly assigned to two groups: 18 in the online group and 19 in the face-to-face group. As this research project was conducted during the students’ actual course timetable for that particular topic (i.e., 24/02/00 to 02/03/00), both groups were given the same amount of time to complete the topic.

Online group. Students in the online group were assembled in a computer laboratory at UNIMAS. The researcher briefed the students on the aim and procedure of the experiment. Each student was then assigned a password to access the course content on a computer. Students were told to browse through the Bulletin Board, Course Guide, and Study Help folders before they studied the topic notes. The programmer who customized ‘QuickPlace’ was available (during the first day) to assist students in case any computer problems arose. Students then followed the online course in a self-paced manner, and they were required to record all their activities in a diary (see Appendix 2). In the study schedule (Course Guide folder), students were informed of a group project assignment (how to write a research proposal) which needed to be submitted online at the end of the topic (i.e., 02/03/00). A team of 6 students worked together to write a research proposal that should include the different components of a research proposal. The research proposal was required to be 4-5 pages in length, in 12-point font, and double spaced.

At the end of the topic, students were given 20 minutes to complete a test comprising the multiple choice questions and a case study, and 15 minutes to fill in a questionnaire and the usability evaluation form. The allocation of marks for the test and the group project aimed at encouraging active participation by the students in the online learning.

Face-to-face group. Students attended a two-hour lecture and a one-hour tutorial given by a lecturer (the content writer). The lecture notes matched the online content. Lecture notes (including multiple choice exercises) were presented to students using overhead transparencies. The tutorial materials were the same as those given in the online case studies. Students were required to sit a test and complete a group project similar to those of the students who followed the online instruction.

Results and Discussion

Objective Evaluation

Data analyses were based on the 37 students who took part in the study. Students were required to answer 7 multiple choice questions and 8 short questions for a case study. One mark was awarded for each correct answer to a multiple choice question or a short question.

Table 1 presents the means and standard deviations of correct answers. A t-test indicates that the online group outperformed the face-to-face group on the case study, t(35) = 3.10, p = .004, but not on the multiple choice questions, t(35) = 1.31, p = .2. Concerning the group project, one mark was allocated for each correct component of the research proposal. The scores for the three online groups were 88%, 66%, and 68% respectively. For the three face-to-face groups, the respective scores were 74%, 76%, and 68%. Thus, with group means of roughly 74% (online) and 73% (face-to-face), the two groups did not show differential performance on the group project.

Table 1: Means (and standard deviations) of correct answers

                          Online instruction    Face-to-face instruction
                                n = 18                  n = 19
                                M (SD)                  M (SD)
Multiple choice              4.78 (1.44)             4.21 (1.18)
Case study                   4.11 (1.54)*            2.68 (1.25)

Note: * indicates t-test significant at the .05 level
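As a check (not part of the original analysis), the reported case-study statistic can be reproduced from the values in Table 1 using a pooled-variance independent-samples t-test, assuming equal group variances:

$$
s_p^2 = \frac{(18-1)(1.54)^2 + (19-1)(1.25)^2}{18 + 19 - 2} \approx 1.96,
\qquad
t = \frac{4.11 - 2.68}{\sqrt{s_p^2\left(\frac{1}{18} + \frac{1}{19}\right)}} \approx \frac{1.43}{0.46} \approx 3.1
$$

with df = 18 + 19 - 2 = 35, consistent with the reported t(35) = 3.10; the same calculation on the multiple choice means gives t ≈ 1.3, matching the reported non-significant difference.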

Figure 4: Online Group Project

Subjective Evaluations

Table 2 shows the usability evaluation, which covered the perceived usefulness of the content and the perceived ease of use of the system. In general, students were more satisfied with the course content than with the tools used. Students used a Likert scale, from Excellent (5) to Poor (1), to rate the content items, and they rated the content items favorably. Fewer than 10% of students rated the Topic Notes or the Quality of exercises as poor. A similar rating pattern emerged for the online case studies. In contrast, most students rated Navigation using buttons and Links as not useful (see also Table 3, comments on question 8), but they rated the Online study help and the Online discussion page as quite useful.

Comments from the questionnaire (see Table 3) indicate that students were frustrated by the frequent server failures (see questions 2, 4, 7, 11). Also, comments on question 8 indicate that some students found the design of the buttons not user friendly. For instance, when a student responded to a message, a separate screen would appear in which the original message was no longer in view. This represents a split-attention effect (Tarmizi & Sweller, 1988; Sweller, Chandler, Tierney, & Cooper, 1990) in that students had to switch backwards and forwards to keep track of the original message while answering it.

The diary sheets (see Appendix 2) were not analyzed. A number of students filled in the diary sheet incorrectly: they recorded time spent on certain activities during periods when the campus computer servers were down. Note that most students lived in university hostels and used computers on campus. Other students recorded the type of activities but forgot to include the time they took to complete those activities.

Table 2: Usability evaluation

Content items                     Percentage response on rating
                                  Excellent                  Poor
                                      5     4     3     2     1
Topic Notes                           0    56    44     0     0
Quality of exercises                  0    39    44    11     6

Online case studies
1. Conceptual framework               6    44    44     6     0
2. Design and research method         6    44    39     6     6
3. Results                            0    39    61     0     6
4. Discussion                         6    39    39    17     6

Tools                             Very useful            Not useful
                                      5     4     3     2     1
Navigation using buttons              6     1    44    22    11
Links                                 6     6    28    56     6
Online study help                     0    33    44    17     6
Online discussion page                6    33    44    11     6

 

Table 3: Questionnaire to elicit students’ perceptions of the online class

1. Did you find that the exercises prepared you for the online discussion and the group project? (Yes 66%, No 22%, No response 11%)
   Comments: Exercise helps understanding.

2. Were there sufficient learning activities to help you learn? (Yes 55%, No 38%, No response 5%)
   Comments: Not enough time to do those activities; server down; too many activities.

3. Do you think six students in a group project is a good choice? (Yes 83%, No 16%, No response 0%)
   Comments: We can compare ideas; prefer 4 instead of 6.

4. Did you discuss the online discussion with your friends (and the group project with your group members) before you posted a message? (Yes 61%, No 33%, No response 5%)
   Comments: No time; sometimes; discussed because of server down; discussion enables me to get feedback from friends.

5. Did you read all the messages posted? (Yes 77%, No 22%, No response 0%)
   Comments: I copy the messages and read them in the hostel; can get information from the messages.

6. Did putting the learning materials online motivate the discussion between you and your friends? (Yes 55%, No 44%, No response 0%)
   Comments: Their comments and contributions lead to more effective learning; it is not the learning materials, it’s the technology and facilities that did not motivate us; it is time consuming; adds knowledge when we exchange information.

7. Did you download and print out the learning materials? (Yes 72%, No 27%, No response 0%)
   Comments: To understand better; downloaded some materials; makes my life easier because of server down; easier for me to refer to.

8. Did you find the organization of the different parts of the system (topic notes, exercises, online discussion pages) clear? (Yes 72%, No 27%, No response 0%)
   Comments: I have to turn page to page; design button is not user friendly; but the online discussion needs to be improved.

9. Would you recommend these online materials to a friend? (Yes 44%, No 44%, No response 11%)
   Comments: It is ok if the computer and technology are working; unless UNIMAS has a good server supplier; takes a long time; online materials are not user friendly.

10. For you personally, what was the best thing about this online study?
    Comments: Online discussion; need not attend lectures; can discuss with others; do not need to write on paper; the notes and exercises; this new method of study attracts students’ interest in study; access anytime if the server is ok.

11. For you personally, what was the worst thing about online study?
    Comments: Power failure, bad server; server down, failure system.

12. Any other comments?
    Comments: Put a user-friendly interface, make the server better; online study is good, but the server problem makes us very stressed; need sufficient time; the online study is too near to the semester examination, feel very stressed.

Computer-Generated Feedback

Table 4 presents conferencing exchanges from 24/02/00 to 02/03/00. With respect to the online case studies, apart from the Conceptual Framework, the other three (Design and Research Method, Results, and Discussion) had slightly less than 50% of participants. (Note that ‘contributors’ in Table 4 appears to be counted per discussion question, which is why it can exceed the 18 online students; for example, 62 contributions across the four Conceptual Framework questions, out of a possible 72, corresponds to the reported 86% participation.) Most messages posted were original, and the facilitator contributed almost all the replies. The questionnaire (see Table 3, question 5) shows that 77% of the students read all the messages posted. Table 3 also indicates that power and server failures (Table 3, questions 2, 7, 9, and 11) caused the greatest dissatisfaction among the students. Indirectly, this suggests that students did not engage much in learner-learner interaction due to the difficulty in accessing computers. Nonetheless, students probably benefited from instructor-learner interaction in that the replies provided by the lecturer helped them to compare and correct their answers. As shown in Figure 3, students (indicated by their student numbers, 2651 and 2436) discussed the ISM survey form, and the lecturer (indicated by bhngu) supplied further information concerning the ISM survey form. Regarding the group project, students were active in posting and replying to messages (see Figure 4). If computer access had not been a problem (as mentioned above), this active learner-learner interaction among students might have resulted in a greater total number of messages posted, and better performance on the group project.

Table 4: Conferencing exchanges on online case studies and group project (24/02/00 to 02/03/00)

                              Online case studies                                        Group project
Conference                Conceptual      Design and         Results         Discussion       A      B      C
                          framework       research method    (2 questions)   (2 questions)
                          (4 questions)   (4 questions)

No. of messages posted        83              39                 23              19           17     18      4
Online contributors           62              31                 17              14            6      6      3
% of online participants      86              43                 47              39          100    100     50
Lecturer contribution         21               7                  6               5            0      0      0
Original messages             60              27                 16              14            2      2      1
Replies                       23              13                  7               5           15     16      3

Figure 3:  Online Discussion 

 

Conclusion

Test results indicate that the online group performed slightly better than the face-to-face group on the case study only. It seems that the online group was not disadvantaged despite frequent interruptions from power and server failures on campus. The positive online learning effect may be due to the different types of interaction involved (learner-content, learner-self, learner-learner and instructor-learner). These various types of learning interaction may foster a more self-oriented and group-oriented learning experience than face-to-face instruction (Maher, 1998). As a consequence, this self-oriented and group-oriented learning experience may promote learning (especially on the online case studies) more than face-to-face instruction does.

In contrast, the face-to-face class did not provide an opportunity for every student to interact with the lecturer while the lecture was conducted. Due to the nature of the traditional lecture, only some students were given the opportunity to respond to the multiple choice exercises incorporated in the lecture notes. Whereas all students in the online group were encouraged to participate in all the online discussion topics, students in the face-to-face group worked in separate groups, with each group discussing only one tutorial topic.

However, before UNIMAS embarks on offering online courses, more research needs to be done to ensure the quality of the online courses. First, a more user-friendly interface, in particular one that eliminates the split-attention effect, would enhance the overall design. Second, the use of students’ names (more personalized) instead of student numbers may lead to more active group discussion. Third, if students were given a training session on the online discussion forum at the beginning of the semester, this might better prepare them for serious study later.

Also, a pretest and posttest design would better measure the gain in transfer performance. In addition, an online diary sheet rather than a paper-based record sheet would help the instructor monitor students’ progress more systematically. In this design, only one lecturer attended to students’ online discussion; more staff would be required to cater for a larger group of students.

More importantly, technical problems such as server failures and power failures suggest an institutional implication for online course servers. There is little point in trying to provide online courses that rely on faulty delivery servers, which are critical to the process. Thus, a backup server is essential to ensure the smooth delivery of online materials.

Acknowledgement

The author thanks Elaine Guat Lien Khoo (glkhoo@fcs.unimas.my) and Roger Harris for helpful comments on an earlier version of this paper.

References

Benson, R., & Rye, O. (1996). Visual reports by video: An evaluation. Distance Education, 17(1), 117-131.

Berge, Z. L. (1994). Electronic discussion groups. Communication Education, 43(2), 102-111.

Braggett, E., Retallick, J., Tuovinen, J. E., & Wallace, A. (1995). Distance Education Project NATCAP: Report on the establishment of telematics delivery systems in one priority cluster area in NSW, 1993-94. Wagga Wagga: Charles Sturt University.

Brown, K. M. (1996). The role of internal and external factors in the discontinuation of off-campus students. Distance Education, 17(1), 44-71.

Carr-Chellman, A., & Duchastel, P. (2000). The ideal online course. British Journal of Educational Technology, 31(3), 229-241.

Catenazzi, N., & Sommaruga, L. (1999). The evaluation of the Hyper Apuntes interactive learning environment. Computers & Education, 32, 35-49.

Collins, M. (2000). Comparing web, correspondence and lecture versions of a second-year non-major Biology course. British Journal of Educational Technology, 31(1), 21-27.

Cullip, P. F. (1999). Learning pedagogical grammar. Centre for Language Studies, Universiti Malaysia Sarawak.

Fischer, G., & Scharff, E. (1998). Learning technologies in support of self-directed learning. Journal of Interactive Media in Education, 98(4). Retrieved February 10, 2000 from the World Wide Web: www-jime.open.ac.uk/98/4.

Joblie, F. (1999). Computer assisted language learning. Centre for Language Studies, Universiti Malaysia Sarawak.

Kumari, D. S. (2001). Connecting graduate students to virtual guests through asynchronous discussions: Analysis of an experience. Journal of Asynchronous Learning Networks, 5(2).

Lambert, T., Shepherd, J., Ngu, A., Ho, P., Whale, G., & Geissinger, H. (1996). Bridging the gap: Computer Science meets distance education at UNSW. School of Computer Science and Engineering, University of New South Wales.

Liu, D., Walter, L., & Brooks, D. (1998). Delivering a chemistry course over the Internet. Journal of Chemical Education, 75(1), 123-125.

Lockee, B. B., Moore, D. M., & Burton, J. K. (2001). Old concerns with new distance education research. Educause Quarterly, 24(2), 60-62.

Maher, E. (1998). Does the delivery media impact student learning? A case study. Open Learning, November, 27-32.

McGill, T. J., Volet, S. E., & Hobbs, V. J. (1997). Studying computer programming externally: Who succeeds? Distance Education, 18(2), 236-256.

Moore, M. G. (1989). Editorial: Three types of interaction. The American Journal of Distance Education, 3(2), 1-6.

Moore, M. G., & Kearsley, G. (1996). Distance education: A systems view. Belmont: Wadsworth.

Newton, R., Marcella, R., & Middleton, I. (1998). NetLearning: Creation of an online directory of Internet learning resources. British Journal of Educational Technology, 29(2), 173-176.

NoorShah, M. S. (2001). Overcoming communication issues (Learning from the experience of the PKPG Teaching Practice website in UNIMAS). The Internet and Higher Education, 4(3-4).

Papert, S. (1980). Mindstorms: Children, computers and powerful ideas. New York: Basic Books.

Pearson, J. (1999). Electronic networking in initial teacher education: Is a virtual faculty of education possible? Computers & Education, 32, 221-238.

Phillips, R., Fairholme, E., & Luca, J. (1998). Using email to improve the experience of students doing a group-based project unit. Retrieved February 21, 2000 from the World Wide Web: http://cowan.edu.au/eddev/98case/phillips.html

Resnick, M. (1996). Distributed constructivism. Proceedings of the International Conference of the Learning Sciences, Chicago, IL.

Schutte, J. G. Virtual teaching in higher education: The new intellectual superhighway or just another traffic jam? Retrieved February 2001 from http://www.csyb.edu/sociology/virexp.htm

Smith, R. C., & Taylor, E. F. (1995). Teaching physics online. American Journal of Physics, 63(2), 1090-1095.

Soo, S.-K., & Bonk, C. J. (1998). Interaction: What does it mean in online distance education? Paper presented at ED-MEDIA & ED-TELECOM 98, Freiburg, Germany.

Stephenson, S. D. (1997-98). Distance mentoring. Journal of Educational Technology Systems, 26(2), 181-186.

Sweller, J., Chandler, P., Tierney, P., & Cooper, M. (1990). Cognitive load and selective attention as factors in the structuring of technical materials. Journal of Experimental Psychology: General, 119, 176-192.

Tarmizi, R. A., & Sweller, J. (1988). Guidance during mathematical problem solving. Journal of Educational Psychology, 80, 424-436.

Tuovinen, J. E. (1999). Research framework and implications for online multimedia education practice based on cognition research. Paper presented at Communications and Networking in Education: Learning in a Networked Society, Aulanko, Hameenlinna, Finland.

Vogel, D., Genuchten, M., Lou, D., Van Eekhout, M., Verveen, S., & Adams, T. (2001). Exploratory research on the role of national and professional cultures in a distributed learning project. IEEE Transactions on Professional Communication, June, forthcoming.

Wang, X. C., Hinn, D. M., & Arvan, L. (2001). Stretching the boundaries: Using ALN to reach on-campus students during an off-campus summer session. Journal of Asynchronous Learning Networks, 5(1).


Appendix 1

How to write research proposals

After studying the purpose of writing a research proposal, you should know:

Why a research proposal is written and to whom the research proposal is targeted.

Learning Activities

Multiple choice exercises

Online discussion (all participate)

Topic Note 1: Purpose of a research proposal

A research proposal is a plan for a research investigation. The first challenge in writing a research proposal is to propose a study that will contribute to theory and research in a particular field. The second challenge is to demonstrate the feasibility of the research investigation. This depends on judgments about the sufficiency of available resources (time, money) and access to the population of interest.

Research proposals are written for various purposes. Research proposals are written for theses; this enables the supervisors of the theses to offer suggestions for improving the study. Conducting research needs money; thus, research proposals are written to obtain research grants from government agencies, university research funds, and private agencies. Occasionally, a research proposal describing a project is required to obtain permission to collect data.

 Exercise

Please tick the correct statements

The general purpose of any proposal is:

·       To persuade a committee of scholars to fund a research project

·       To implement a program that you would like to launch

·       To convince your supervisor that the research project is feasible in terms of resources (time, sample, etc.)

·       The research project can provide a new insight into a particular field of study


Appendix 2: Diary Entry

Name of the student: ______________________

How much time do you spend in:
                                                        Day 1   Day 2   Day 3   Day 4   Day 5   Day 6   Day 7
                                                        24/02   25/02   26/02   28/02   29/02   01/03   02/03

Reading topic notes and trying exercises from notes

Taking part in online discussion (e.g., posting,
reading, and contributing messages)

Taking part in the group project (e.g., posting,
reading, and contributing messages)

Searching the web for materials on writing
research proposals

Downloading relevant reading material

Looking at the computer screen

Any other activities (please specify)
1.
2.

Author Note

Correspondence concerning this paper should be sent to Dr. Bing H. Ngu, Faculty of Cognitive Sciences and Human Development, Universiti Malaysia Sarawak, Malaysia. Electronic mail may be sent via the Internet to bhngu@fcs.unimas.my

