International Journal of Educational Technology
Disseminating Learning Technologies Across the Faculty

Cheryl Bullock, University of Illinois at Urbana-Champaign
Steve Schomberg, University of Illinois at Urbana-Champaign

Abstract

The purpose of this article is to share information about an Inter-Institutional Faculty Summer Institute on Learning Technologies held at the University of Illinois at Urbana-Champaign (UIUC) for three consecutive years. The authors, the principal evaluator and an Associate Chancellor at UIUC, present data from a three-year collaborative evaluation study conducted for the administration of the Institute.

Important findings include the Institute's impact on attendees' understanding of the role, use, availability, and benefits of learning technologies in higher education for both students and instructors.

Introduction

Most of us would probably agree that Socrates was concerned with actively involving his students in their learning. He used the innovations available to him to engage his students: he would situate them in groups, then use sand and a stick to present lessons more actively (Grube, 1974). There would probably also be little argument that Thomas Edison was a forward thinker. He too understood that innovations have a place in the delivery of educational materials. Take his invention of the phonograph as an example. He envisioned a time when teachers would record lessons and students would listen to them in their own homes, at whatever time was most convenient. The point is that good teachers and other forward thinkers have thought about the most instructionally beneficial ways to use innovations throughout the history of higher education.

Today is no different in that respect; good teachers still want and need to know how best to use the common innovations of the day. But today is different in that current teachers face exponentially more sophisticated and diverse innovations such as learning technologies. While Socrates didn’t need much training to present his lessons with a stick, complicated innovations such as interactive web-conferencing and video streaming present a clear need for the diffusion of information to those using them in the higher education setting. Simply stated, today's teachers need training to use, let alone use well, today's technological innovations.

The Need for Faculty Training

DeVry and Hyde (1997) surveyed faculty from universities across the United States and found that they felt they lacked the training needed to integrate learning technologies optimally into their classrooms. Their study found that most were comfortable using technology only for basic purposes such as word processing and electronic communication. Beyond these routine uses, they began to feel incompetent and unprepared for more advanced uses such as shared conferencing and Web-based presentations. In short, they needed training to understand how, when, and what types of technology would most benefit their instructional situations. Studies on asynchronous learning, such as that undertaken for Illinois' Sloan Center for Asynchronous Learning Environments (SCALE), substantiate that faculty who want to adopt learning technologies in their classrooms are often hindered by a lack of training. Further, when information about technology is diffused to these faculty, it has to include the associated pedagogy (Bullock & Ory, 1998). Faculty need and want training, and it must be training that incorporates both the actual technologies and the pedagogical issues that best facilitate successful implementation.

Considering this, the Illinois Board of Higher Education and the Illinois Community College Board have made an investment to prepare their faculty to better understand the role, availability, and pedagogical implications of technology in higher education classrooms. The statement below was issued in a joint report on the state of higher education technology use in Illinois:

"Illinois and its educational institutions must make a significant investment to prepare its faculty and to take advantage of the technologies that are under development today and will be available in the next two to five years."

The Faculty Summer Institute (FSI)

As part of their efforts to address this need for information dissemination, Presidents and Chancellors of various Illinois public universities and the Illinois Board of Higher Education funded an Inter-Institutional Faculty Summer Institute on Learning Technologies. Held at the University of Illinois at Urbana-Champaign (UIUC) during the month of May in 1997, 1998, and 1999, the Institute included faculty and instructional support personnel from all of the four-year public universities in Illinois. The Institute served two separate functions. First, faculty were introduced to the types and purposes of various learning technologies. Second, they were given information about pedagogical issues associated with what is known about best practice in higher education. This was all done in an intensive learning atmosphere meant to encourage collaboration among participants. Participants stayed on-site and collaborated and learned closely together in teams during hands-on afternoon sessions. The Institute was designed not only to train faculty and instructional support personnel, but also to foster situated learning (Bandura, 1977) and to help continue the learning momentum outside the classroom (Dewey, 1902). Participants collaborated and had meaningful dialogue both inside and outside the classroom, thus maximizing the potential both for learning and for establishing productive relationships with peers (Basi, 1998; Forman & Cazden, 1986).

By combining intensive situated learning with work in teams, the FSI ultimately strove to be a change agent in the classroom teaching of attendees. With these goals in mind, the FSI was structured as four intensive learning days, with each afternoon spent in the same working teams. The context and content of the Institute were designed to provide overviews but to be primarily hands-on in nature.

Complete program descriptions for all three years of the FSI can be found at http://www.cet.uiuc.edu/fsi/about.html.

Purpose of the Article

The purpose of this article is to share information about the Institute and the impact it had on the faculty and support staff trained. The data on which the article is built come from an extensive three-year evaluation study conducted collaboratively by the authors. The authors present findings regarding the Institute's impact on attendees' understanding of the role, benefits, and availability of learning technologies for use in higher education, as well as the subsequent changes attendees have made in their teaching.

Methodology

Consistent with best practices in evaluation studies, a mixed-method approach to data collection was used (Worthen, Sanders, & Fitzpatrick, 1997). Surveys were administered to collect both comparative and descriptive data. Group and individual interviews were used to solicit qualitative reactions and responses. These methods are discussed in detail in the following sections.

Surveys

In order to gather a rich array of data and to gauge impact, multiple surveys were used during this evaluation. During year one, an exit survey was given to respondents: the 1997 exit survey was administered during the last afternoon of the Institute and was completed by 97 of the 106 attendees. Both a pre- and a post-survey were administered to participants during the second and third years (1998 and 1999) of the Institute to determine the impact on their learning as well as their satisfaction with the experience. Questions on perceived impact on learning and teaching were included in both the pre- and post-surveys. The post-surveys also asked respondents to rate key Institute components and to provide suggestions for future Institutes. During year two (1998), the pre-survey was administered during the first afternoon of the Institute and had a response rate of 91%. In year three, the pre-survey was administered as part of the pre-registration information collected from attendees and yielded a 100% response rate. Ninety-eight attendees, representing a return rate of 83%, completed the post-survey during 1998, and 75 (71%) completed the post-survey in 1999. Table 1 below shows the breakdown of pre- and post-survey responses for all three years of the data collection effort.

Table 1. Listing of All Surveys Given

Survey Type          1997 Post   1998 Pre   1998 Post   1999 Pre   1999 Post
Number Responding       97          107         98         106         75
Percentage              92%         91%         83%        100%        71%

Group Interviews

Group interviews were also conducted during all three years of the evaluation. During 1997, eight group interviews were held as part of the data collection effort. They were conducted during the final afternoon work-team session with participants from eight of the eleven working teams; because of time constraints, the eight teams were randomly selected from the eleven available. Each interview lasted approximately 20 minutes and was conducted by the evaluator with the aid of a notetaker. Work team facilitators left the room during each group interview session, and interviewees were asked to identify the most useful parts of the Institute and to make suggestions for improvements. On average there were 10 participants in each group.

During the second year of the Institute evaluation, the number of group interviews was reduced to four and the interviews were extended in length. This was done to allow participants more time, thereby enabling them to provide more in-depth responses. As in the previous year, the interviews were held during the final afternoon work-team session, and they drew participants from four of the nine working teams; four teams were selected to obtain responses as representative as possible given, again, the time constraints. During year two the evaluator led the group interviews and a graduate student served as notetaker. In year three, only one group interview was conducted; it was 90 minutes long and was held on the last evening of the Institute. Each interviewee was chosen as a representative by his or her home-institution group and solicited the group's opinions prior to the actual interview. Questions again focused on the usefulness of Institute components, but included additional questions about the Institute's facilitation of collaboration and sustained collegiality.

Follow-Up Interviews

Individual follow-up interviews were conducted with a sampling of seventeen first-year participants. This was done to validate the perceptions of on-site interviewees regarding the impact of the Institute. Interview questions focused on how attendance at the Institute had changed the way that they taught and their understanding of the role, availability, and benefits of learning technologies.

Two primary considerations guided the choice of attendees for the individual follow-up interviews: (1) to include interviewees from all twelve of the participating institutions and (2) to represent differing curricula. However, as the sample was drawn it became necessary to substitute other names for those who were not available for interviews. In the end, seventeen attendees from a diverse group of institutions and curricula were able to participate in this phase of the evaluation.

Follow-up interviews were conducted via telephone by the evaluator. Each interview lasted approximately one hour, was audiotaped, and tapes were transcribed for data analysis purposes.

Who were the attendees?

Although the term faculty is used consistently throughout this paper, it is only for the sake of simplicity. The overwhelming majority of participants in all three years were faculty; however, some instructional support personnel did attend each of the three years.

Attendees came from all of the four-year institutions of higher education in the State of Illinois. They represented a myriad of disciplines and were each given $1,000 to use as they deemed appropriate. Some reported uses included professional development, software, and equipment.

Year One Work Teams. Year one participants were pre-selected by FSI staff into eleven working teams. These groups (based solely on the discipline area of each respondent) were: Biology, Calculus, Composition, Economics, Education, Engineering, Geography, Instructional Support, Physics, Social Sciences, and Spanish. The purpose of each team was to learn and share discipline specific applications of various learning technologies.

Year Two Work Teams. One finding from the first-year evaluation was that some participants thought the work teams should be based on the types of learning technology to be demonstrated during the 1998 FSI. However, a significant number of first-year participants also reported that organizing the work teams based on disciplines had been very effective and useful. To try to meet the social learning needs of both camps, some work teams were developed based on discipline while others were developed based on the types of technologies taught. Participants were thus allowed to self-select during 1998 into the teams they felt would most foster their learning. The groups then became: Education, Business, Science, Composition, On-campus Applications, Off-campus Applications, and Collaborative Learning. The purpose of the discipline-based teams remained the same as in year one, while the two applications groups focused on the range of technologies available for either off-campus or on-campus use. The multi-disciplinary Collaborative Learning group had a more specific focus: using computer conferencing to encourage student collaboration.

Year Three Work Teams. During year three the work teams continued in the spirit of blending discipline and technology interests. They were: Applications I, Applications II, Art and Design, Business, Education (K-12), Instructional Support, Statistics for the Social Sciences, and Writing Across the Curriculum. This blend, which allowed participants to choose either a discipline-specific or a technology-related group, was reported by participants to work well. Descriptions of the content of these groups can be found at http://www.cet.uiuc.edu/fsi/default.htm.

What types of technologies were they using before they came to the Institute?

The types of technologies used by participants prior to attendance at the FSI were quite similar from year-to-year. Most were using e-mail and listservs, Web pages, and proprietary software packages. Many were also requiring that their students conduct research using the World Wide Web.

Respondents who reported using e-mail or listservs as their primary classroom learning technology did so mainly to share routine communications with students. Some did report, however, that they asked students to react substantively to assignments; in other words, they encouraged their students to use listservs to hold subject-matter discussions outside of the classroom. The most frequently reported use of Web sites was as a passive information center for storing class materials such as the course syllabus.

The most commonly used proprietary package for instructional purposes was Microsoft’s PowerPoint, which was used to present in-class lectures.

What were the impacts of the Institute on attendees?

Results from the evaluation show that there were three main impacts on attendees. First, they overwhelmingly believed that their week at the Institute would change the way they used technology in their classrooms; year-one follow-up interviews substantiated that attendees did, in fact, make the predicted changes. Second, they reported a broadened understanding of the types and purposes of technology in higher education: they walked away from the Institute better understanding the role of technology in higher education and its benefits for both instructors and students. Third, attendees had a newfound sense of collegiality with other faculty who used technology. This collegiality carried the promise of future collaborative efforts for many of the participants.

They left the Institute believing they would change the way they taught

Participants from all three years believed that they would change the way they used technology in their classrooms because of their attendance at the Institute. Table 2 below shows that roughly 95% of respondents each year believed it likely or very likely that attendance at this week-long Institute would change the way they teach.

Table 2. How likely is it that your experiences at the Summer Institute will change the way you teach?

Response Categories   1997 Post-Survey   1998 Post-Survey   1999 Post-Survey
Very likely                 42%                43%                48%
Likely                      53%                52%                46%
Unlikely                     5%                 4%                 4%
Very unlikely                0%                 1%                 2%

Respondents were also asked each year to report how they anticipated making these changes. Most planned on implementing specific technologies highlighted during the Institute. These included computer conferencing packages such as FirstClass Conferencing, and on-line homework and quiz facilitation packages.

While this isn't particularly surprising, it does place significant responsibility in the hands of those who organize institutes of this nature. Most likely, the technologies that are showcased are the ones attendees will implement when they return home.

There were long-term impacts and actual implementations. Year-one follow-up interviews validated the on-site participants' perceptions of gained knowledge and skills and the consequent impact on their teaching. While the interviews were primarily used to validate the on-site results, their findings are worth mentioning here.

The most common long-term benefit noted by interviewees was concepts they had learned and, in some cases, actual strategies for improving the ways they were already using learning technologies in their own classrooms or the classrooms they supported. They talked about how the Institute made them better at what they were already doing. When probed, most did report implementing some completely new type or use of technology as a direct result of attendance at the FSI. These ranged from specific Java applets to general overhauls of their existing course Web pages. The following quotes illustrate:

"I started right on a project when I got back which would allow my students to take quizzes and do homework on-line."

"I work with a faculty member who is using the Web to teach a poetry class. After some discussions, based on things I had learned at the Institute, he is using the Web to intertwine images of the poetry with comments from students and his lecture comments. He's really using the Web to create relationships between the authors, the poets at work themselves."

Another recurring theme was that interviewees had a continued, increased understanding of the potential of using technology in the classroom. Eight months later, they were still synthesizing the information and peer stories of technology use. This is consistent with current theory on the use of information: often, information transfer begins as conceptual, with no clear direction or understanding of use apparent to the individual. In time, however, the concepts solidify in the individual's thinking and actual implementation or use occurs (Weiss, 1991).

They better understood the place and potential of technology in higher education

Survey results from all three years of the evaluation show that respondents left the Institute better prepared to use learning technologies. They reported an increased understanding of the role of technology and of the specific ways it could be implemented in the higher education classroom. Additionally, they reported an increased understanding of the benefits to both instructors and students. Finally, they reported a broader and clearer understanding of the types of technologies available for their instructional use. These results are represented in Figure 1.

Figure 1. 1998 mean score ratings for increases in survey respondents' understanding.


Figure 2 shows the pre- and post-survey mean scores for year two of the Institute, and Figure 3 shows the mean scores for year three. Pre-survey and post-survey means were significantly different at the p < .05 level for each of the five areas in both years.

Figure 2. Pre- and post-survey mean score ratings for the second year of the Institute.


Figure 3. Pre- and post-survey mean score ratings for the third year of the Institute.

Group interview results during all three years substantiated that the format of the FSI had been conducive to gaining a broader understanding of the various types of learning technologies available. Participants reported that the FSI was effective because the general morning sessions were followed by afternoon sessions that allowed attendees to gain specifically transferable knowledge and skills. They said that the morning sessions gave them "the big picture" and the afternoon work teams allowed them to operationalize what they had learned.

Another clear message given in the group interviews was that project-based learning was more effective for the sections of the Institute aimed at actual skill transfer. As one participant said:

"I need to understand in a very concrete way how this applies to my classroom, maybe faculty should be told to bring a work in progress (such as a Web page) with them to the Institute. Then they could fine tune the conferencing features, or add some bells and whistles."

They gained a sense of collegiality and set the stage for future collaborations

As well as rating the content of the Institute highly, 27% of the 1997 survey respondents indicated that the collegial interaction and working in teams was a very useful part of attending the Institute. They wrote comments about collegiality and future collaborations such as:

"Having opportunities to talk with colleagues across the state and in other disciplines about learning objectives was particularly useful;"

"I had many opportunities to interact with faculty from other institutions; there was also the ability for faculty from same school opportunity to get together, meet, build relationships." and "Peer interactions were encouraged. I now have a better idea of what is happening in public universities around the state."

This was substantiated during years two and three, when 34% and 41% of post-survey respondents, respectively, specifically mentioned that the interaction with others afforded by the Institute, the networking, was a very useful part of attending.

Group interview participants also reported that they felt a sense of collaboration had been actively fostered by the format of the Institute. They talked of collaborating with both intra- and inter-campus faculty members. Many of them mentioned meeting other faculty, and in some cases instructional support personnel, from their own institutions as well as from others across the state. Several commented that meeting these individuals in the context of an off-campus, formalized Institute encouraged them to seek out intra-institutional collaboration that they probably would not have otherwise. Year-one group interviewees reported that a specific formalized meeting with their own campus group would have been useful. In response, the FSI held a specific Wednesday evening session during the second and third Institutes in which participants from the same campus discussed what they had learned and then reported to the general assembly on Thursday. Group interview participants from years two and three specifically mentioned this Wednesday evening session as being especially useful for goal setting and for establishing future intra-institutional working relationships.

Lessons Learned

Have general morning sessions followed by afternoon work teams

Survey, group-interview, and individual-interview responses consistently indicate that the format of the Institute was effective and should be continued. The morning sessions of more general information, followed by hands-on work teams, were well received and useful for the participants.

Along these lines, when asked for suggestions on both surveys and in group interviews, respondents returned to a consistent theme: the Institute could be even more hands-on in nature. Several noted that the optimal situation would be for participants to bring a project with them, such as a Web shell for a particular course, to be completed during the Institute.

Foster both inter- and intra-institutional collaboration

Participants from all three years of group interviews reported that meeting people from other universities, and faculty and support staff from their own campuses, was a highly useful aspect of the Institute. They reported "sharing business cards" and having informal discussions during lunches and in the evenings. These discussions often involved faculty seeking out other faculty from similar disciplines to discuss pedagogical issues of implementation. They additionally reported that the work team environment encouraged this type of inter-institutional collaboration.

Year two and three respondents reported that the Wednesday session specifically designed to bring intra-institutional attendees together was helpful for goal setting and establishing a basis for future collaborations.

Hold institutes away from most attendees' work environment

Participants indicated that they would not have gained as much if the training had occurred at their own institutions. They would have felt pressure to spend part of the day in their office, rather than give full concentration to learning and collaborating. Additionally, they reported being rejuvenated by the sense that other faculty from around the state were involved in the same efforts. They also spoke of being validated in the use of technology through attending the Institute. Their administration had chosen them to attend and supported their absence for this purpose. This brought a sense of security and renewed drive to their efforts to implement technology in their particular classrooms.

Focus on the pedagogy of implementation as well as the technology

The pedagogical emphasis of the Institute grew each year as a result of the evaluation findings (see http://www.cet.uiuc.edu/fsi/about.html).

For example, a year-one criticism from participants was that there needed to be more focus on the broader issue of pedagogy. They said things such as:

"As much as it is a learning technology conference, it is a little off balance. We dwelled on technology a great deal but did not spend as much time on the application of technology in the learning process."

The steering committee subsequently increased the emphasis on pedagogy in years two and three. They replaced year-one sessions such as "What is a National Learning Infrastructure?" with more pedagogically focused sessions such as "Collaborative Education: Realizing the Potential of Distributed Teaching and Learning" and "A Universe of Voices: Technology, Teaching and Faculty Roles in the Digital Age." In turn, survey respondents from both 1998 and 1999 rated the topics that dealt specifically with pedagogy as the most useful in the Institute.

Group interview participants from all three years also talked about the importance of understanding the pedagogical theories surrounding successful use of learning technologies in higher education classrooms. They mentioned issues such as the impact of learning technologies on student learning and motivation, and the use of technology to promote active learning. These faculty need to place learning technologies within the context of good teaching; they want to know more about effective implementation strategies so that the full instructional benefit is realized.

Summary

Faculty throughout higher education are choosing to integrate learning technologies into their classrooms. They are using listservs, Web pages, on-line homework, and proprietary software such as PowerPoint to help deliver instruction. But these faculty are often hindered by a lack of training in both the actual technologies and the associated pedagogy most appropriate for implementation. Institutes such as the one housed at the University of Illinois at Urbana-Champaign during the summers of 1997, 1998, and 1999 are helping to meet the needs of these faculty.

The results of the three-year evaluation conducted on the Faculty Summer Institute at UIUC suggest that institutes of this nature can serve as change agents in the way participants use technology in their classrooms. Participants gain a broadened understanding of the actual technologies available and the benefits they hold for both students and instructors.

Finally, these participants reported an esprit de corps: the FSI was a place to network with one another, gain new ideas, and be exposed to some of the issues raised when integrating technology into the higher education classroom. They learned and spoke with one another in a pleasant atmosphere that satisfied both the physical and intellectual needs of the participants (Bullock & Basi, 1999).

References

Bandura, A. (1977). Social learning theory. Canada: Prentice-Hall.

Bullock, C., & Ory, J. (in review). Evaluating a campus-wide effort to implement learning technologies. American Journal of Evaluation.

Bullock, C., & Basi, M. (1999). Evaluation report for the Faculty Summer Institute. Champaign, IL: University of Illinois.

DeVry, J., & Hyde, P. (1997). Supporting faculty exploration of teaching and technology. CAUSE/EFFECT, 20(3).

Dewey, John (1902). The Child and the Curriculum. Chicago: University of Chicago Press.

Forman, E.A., & Cazden, C.B. (1986). Exploring Vygotskian perspectives in education: The cognitive value of peer interaction. In J.V. Wertsch (Ed.), Culture, communication, and cognition: Vygotskian perspectives. New York: Cambridge University Press.

Grube, G.M.A., translator (1974). Plato’s Republic. Indianapolis, Indiana: Hackett Publishing Company.

Weiss, C. (1991). Reflections on 19th-century experience with knowledge diffusion: The Sixth Annual Howard Davis Memorial Lecture, April 11, 1991. Knowledge: Creation, Diffusion, Utilization, 13(1), 5-16.

Worthen, B.R., Sanders, J.R., & Fitzpatrick, J.L. (1997). Educational evaluation: Alternative approaches and practical guidelines. White Plains, NY: Longman.

