Concerns about the quality of student learning in first year Physics courses at UNSW, particularly in laboratory activities, led to a decision to introduce some new ideas about learning activities. A comprehensive evaluation of student learning was initiated to clarify student perceptions of the first year courses. An online diary, using surveys and a discussion board in WebCT, was used to gather regular feedback from students. Student volunteers responded to the surveys and answered a range of open ended questions at regular intervals over three semesters. The open ended text responses were analysed using NVIVO software to identify themes and to quantify the responses to each theme. This analysis led to the introduction of open ended group learning tasks. The paper details the process and outlines the enhancements to the courses and their effectiveness.
Learning in science courses can be seen as an active process of inquiry in which theoretical understanding is used to form ideas about the nature of the topic under investigation. Some approaches to science learning and teaching seek to apply the stimulus of inquiry and practical investigation to make science courses more engaging, and to lead to improved learning outcomes. Redish (1999) reports very significant improvements in physics learning by replacing lectures with a guided discovery workshop approach. The workshop physics approach combines brief lectures, small group experimentation and class discussion, using open ended problems as well as problems with specific numerical solutions (Redish, 2002). These approaches emphasise group work as an integral part of the learning process, with the aim of encouraging the development of communication and teamwork skills while leading to a deeper understanding of, and greater engagement with, the science. A problem solving model for Physics courses developed by Heller and Heller (1999) emphasises cooperative group work in solving context rich problems. The guided discovery approach described by Redish (2002) and the Technology Enabled Active Learning (TEAL) approach developed at MIT (Belcher, 2001) both emphasise small group learning activities. The TEAL approach uses technology to support small group experimentation and to communicate results to the whole class.
These approaches illustrate some of the problems that have emerged in science courses at other institutions, and the measures taken to enhance course outcomes. The following sections outline the UNSW context and the reasons for commencing a project to improve the Physics laboratory courses, with an online diary and surveys as evaluation tools.
At the beginning of 2002, the laboratory program consisted of a set of experiments, generally of low inquiry level, in which an experimental aim and method were given, and often the (expected) result was known in advance (for example the measurement of a constant, or the total force on a body in equilibrium). The students were given explicit instructions on how to perform the experiments, which took place weekly in a two hour laboratory session, during which students recorded their results by filling in blanks in their laboratory manual. Informal student feedback showed that students considered the labs easy marks, and generally neither challenging nor engaging. This impression was supported by discussions with the demonstrators.
All courses have standard multiple choice student satisfaction surveys administered at the end, and the first year Physics courses generally rated on a par with other science subjects. However, these simple student feedback surveys often fail to give any insight into students' learning experiences. It was therefore decided to carry out an in-depth study of the student learning experience in first year Physics; the diary project was developed to meet this need.
A detailed analysis of the text responses from the semester 1, 2002 laboratory experiments, carried out to identify emergent concerns, informed the experiment design for semester 1, 2003. Further analysis of the combined semester 1 and semester 2 feedback has built up a picture of student learning patterns in first year Physics as a whole. This provides more extensive and rigorous data by identifying which learning issues students mention most frequently; these issues can be built into future routine evaluations.
The students occasionally used the WebCT discussion board to suggest questions which they wanted to answer, and these were also added to the survey, as were suggestions made in the focus group meetings held two or three times per semester.
Approximately 15 students were selected each semester from a list of volunteers. A small payment was offered for participation, to compensate for the time required to complete the diary. The selection process aimed to achieve a group broadly representative of the first year cohort.
It is always a challenge to find a representative group of students, and it is likely that those selected were more opinionated, or more in need of financial assistance, than the average student. However, we believe that overall the students were neither more nor less able than average, since the range of learning styles and preferred learning activities they described indicates a varied sample. Discussions during focus group meetings indicated a range of backgrounds, from high achievement in secondary school physics to no prior experience of physics at high school, and a range of approaches to studying physics, from very shallow fact and formula memorising to much deeper learning approaches.
A second advantage was that the survey tool could be used to gather lengthy text responses to open questions, such as "What did you learn in the lab this week?" and "What helped you to learn?", in electronic form, so that text analysis could be carried out without the need for data entry. This is a major advantage over having students complete a pen and paper journal, which must subsequently be carefully transcribed by a researcher grappling with difficult to read handwriting.
A third advantage was that responses could be reviewed quickly each week, so that the next week's questions could be modified or added to in response to arising issues. In semester 1, 2003, students were asked if they could suggest any additional questions to add to the online diary, and typically one or two questions were added each week, either as one-off questions (for example, those relating to mid-semester exams) or as regular questions relating to ongoing course activities. This would not be possible with a paper based journal without regularly asking the students to submit their journals and then handing them back later, which would be inconvenient for both the students and the researchers.
A disadvantage is that students could only complete their diary entries when they had internet access; however, this did not seem to be an issue for the students who contributed to this study. Internet access is widely available on campus, and most of the students involved in the project had internet access at home. The flexibility of the online diary was particularly valuable to the researchers, as it enabled the student responses to be viewed at leisure, without having to collect and return paper based diaries.
These responses were analysed using the NVIVO text analysis software (Walsh, 2003). The text was coded manually on learning topics and learning issues - initially basing the codes on the students' own words and grouping these into themes as patterns began to emerge. The software allows exploration of the frequency of like comments - for example, the balance of positive and negative comments about the role of tutors, or the identification of the most common learning issue.
The researcher doing the coding analysis was from a central support unit and, although familiar with Physics education in general, had no knowledge of the design and operation of the experiments, nor of what the students were supposed to learn from them. Judgements were therefore made on the basis of the student responses only. Although the analysis process may have missed a few comments, the pattern of comments is likely to be reliable, especially for those issues where there were no specific question prompts.
The diary project ran for three consecutive academic sessions and, with each iteration, the questions became more focused on the concerns already identified, while still offering scope for open comment. One advantage of using the NVIVO software was that, increasingly, the question structure could be used to code these responses automatically.
With a large and continuing survey, using the NVIVO software to analyse the online diaries would reduce the manual analysis work. However, ongoing qualitative analysis at this depth may offer limited returns relative to the time and cost involved.
In this context the main advantage of using the software was in providing systematic and quantified evidence in a discipline where practitioners are unfamiliar with qualitative methods - without restricting the initial research to multiple choice questions. Future multiple choice questionnaires can now be focused on issues that students themselves have raised in this context, rather than on the initial assumptions of teachers.
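As a rough illustration of the quantification step that produces counts like those reported below, the following sketch (a minimal sketch in Python, not the NVIVO workflow used in the study) codes a handful of free-text comments against themes and tallies positive and negative mentions. The keyword lists and sample comments are hypothetical; in the study the coding itself was done manually, with the software supporting the grouping and counting.

```python
# Minimal sketch of coding free-text diary comments against themes and
# tallying positive/negative mentions. Keyword matching stands in for the
# manual coding described above; all keywords and comments are hypothetical.
from collections import defaultdict

# Hypothetical coding scheme: theme -> (positive keywords, negative keywords)
CODES = {
    "tutors/demonstrators": ({"tutor helped", "demonstrator explained"},
                             {"tutor was unhelpful"}),
    "equipment and facilities": ({"equipment worked well"},
                                 {"equipment was faulty", "not enough equipment"}),
}

# Hypothetical free-text diary responses
responses = [
    "The tutor helped me understand the error analysis this week.",
    "The equipment was faulty again, so we lost most of the session.",
]

def code_and_tally(texts):
    """Return {theme: {'positive': n, 'negative': n}} over all responses."""
    counts = defaultdict(lambda: {"positive": 0, "negative": 0})
    for text in texts:
        lowered = text.lower()
        for theme, (pos_keys, neg_keys) in CODES.items():
            if any(k in lowered for k in pos_keys):
                counts[theme]["positive"] += 1
            if any(k in lowered for k in neg_keys):
                counts[theme]["negative"] += 1
    return counts

for theme, c in code_and_tally(responses).items():
    print(f"{theme}: {c['positive']} positive, {c['negative']} negative")
```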
Table 1 classifies the diary comments into categories and tabulates the numbers of positive and negative comments. The diary evaluation revealed that students had a generally negative view of the laboratory courses. Table 1 shows significant numbers of negative comments on equipment, the level of interest generated, instructions, the level of challenge, relevance to the course, the time spent on tasks and the amount of hands on activity. Students made predominantly positive comments about the support from the tutors and from working with other students. This clarified staff perceptions of a lack of student engagement with the first year laboratory courses.
Table 1: Classification of diary comments, with numbers of positive and negative comments on each issue
Issue | Total no. of mentions | Positive | Negative |
equipment and facilities | 87 | 18 (appreciated using) | 63 (faulty, not enough, hard to use) |
tutors/demonstrators | 70 | 45 (helpful) | 24 (unhelpful) |
engagement | 53 | 17 (fun, interesting) | 34 (boring) |
instructions | 51 | 22 (helpful, easy to follow) | 26 (confusing, hard to follow) |
preparation | 46 | 23 (helped in lab) | 19 (unhelpful, not done, too hard) |
challenge | 25 | 2 (challenging) | 21 (easy) |
peer support | 25 | 17 (other students helped) | 7 (worked alone or others unhelpful) |
relevance to course | 22 | 11 (linked to lectures, etc.) | 10 (unrelated or badly timed) |
time taken | 17 | 1 (about right) | 15 (too short or too long) |
hands on | 10 | 4 (learnt from or enjoyed doing) | 6 (didn't do enough) |
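To make the balance of comments in Table 1 concrete, the sketch below uses the positive and negative counts from Table 1 (for most issues the total mentions exceed positive plus negative, presumably because some comments were neutral) to compute the share of negative comments among those classified for each issue.

```python
# Positive and negative comment counts per issue, taken from Table 1.
# The share of negative comments is computed over classified comments only.
table1 = {
    "equipment and facilities": (18, 63),
    "tutors/demonstrators": (45, 24),
    "engagement": (17, 34),
    "instructions": (22, 26),
    "preparation": (23, 19),
    "challenge": (2, 21),
    "peer support": (17, 7),
    "relevance to course": (11, 10),
    "time taken": (1, 15),
    "hands on": (4, 6),
}

# List issues from the most to the least negatively viewed.
for issue, (pos, neg) in sorted(table1.items(), key=lambda kv: -kv[1][1] / sum(kv[1])):
    print(f"{issue}: {neg / (pos + neg):.0%} of classified comments were negative")
```

On this measure, time taken, challenge and equipment attract the largest negative shares, while peer support, tutors and preparation are viewed mostly positively, consistent with the summary above.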
While the existing laboratory activities were seen as a standard and safe way to run laboratory courses, they were not really meeting the students' educational needs. The staff involved in teaching the laboratory courses decided to try an open ended project as a learning task. The students were asked to select an area of interest, design an experiment, carry it out, write a report and make a presentation to the class on their findings. This was done as a group activity to encourage the students to develop communication, teamwork and negotiation skills, as well as scientific research and reasoning capabilities.
To establish the level of challenge and engagement for the students, and the challenges for staff in introducing, facilitating and managing the process, projects were introduced into one course on a trial basis in Semester 2, following the initial diary evaluation. Students did a range of practicals in the first half of the semester and spent the second half on their group projects. The online diary was used again in second semester, with a different cross section of students, some of whom were in the class in which the projects were introduced.
Although the questions asked about what helped and hindered learning in these various course components, the students frequently mentioned, unprompted, other aspects of their study. These are listed in Table 2.
Table 2: Other aspects of study mentioned by students
Issue | Total no. of comments | Positive comments | Negative comments |
tutors and demonstrators | 179 | 106 | 73 |
integration of components | 101 | 34 | 67 |
prelab work | 101 | 58 | 43 |
textbook | 99 | 83 | 16 |
lecturer | 77 | 43 | 34 |
lecture notes | 68 | 49 | 18 |
lab manual | 65 | 34 | 31 |
problem practice | 58 | 45 | 13 |
web | 51 | 25 | 26 |
Over both semesters, there were a number of comments, prompted and unprompted, on the main programmed learning activities (lectures, tutorials, projects, lab experiments). These are listed in Table 3, with some typical examples.
Table 3 provides an interesting illustration of the effectiveness of the introduction of the projects in semester 2. While the comments on the 'normal' laboratory courses and tasks were split evenly between positive and negative, comments on the projects were predominantly positive. The online diaries revealed a major change in attitude among the participants in the diary project.
Table 3: Comments on the main programmed learning activities, with typical examples
Course component | Total no. of comments | Positive comments [typical example] | Negative comments [typical example] |
lectures | 125 | 35 [Richard is very good at explaining tricky concepts, he gives real life examples ... lots of demos (always good).] | 81 [I found the lectures extremely boring. They were difficult to understand and made learning physics difficult.] |
tutorials | 82 | 36 [... tutorial problems, I knew which questions I could do or not, so in tutorial I could ask the tutor about those questions.] | 49 [There needs to be way more teaching, because at the moment it is just some dude writing solutions on the board.] |
projects | 39 | 35 [I really liked the lab projects because it gave me the opportunity to pursue a topic of physics that I had particular interest in. Working in a group also helped in enhancing my organisational, group work and communication skills.] | 1 |
lab experiments | 26 | 13 [...felt good to know that I understood what was being taught by being able to apply it to the real world.] | 13 [I was doing experiments I had no idea about many weeks before we discussed them in lectures.] |
The generally negative response to tutorials is also being addressed. However, this is a more complex problem to tackle: the negative responses stem from a variety of problems, including poor preparation by both students and tutors, and a lack of agreed learning objectives, and hence of an agreed teaching strategy, among the lecturers and tutors.
The immediate outcome of the online diary feedback from 2002 was the decision to introduce the open ended group projects described above.
The diary project was valuable as a one-off exercise for evaluating the student experience of the existing courses and for investigating the efficacy of newly introduced activities, such as the open ended laboratory projects, which were designed to be more engaging for students. The resources and support required to run an online diary project and to carry out thematic analysis of the text responses are perhaps not justifiable for ongoing, routine evaluation. However, at a time when the curriculum is being reviewed, the project provided useful information for redesigning some aspects of the courses, and identified issues for ongoing evaluation.
Biggs, J. (1999). What the student does. Higher Education Research & Development, 18(1), 57-75.
Hanson, D. & Wolfskill, T. (2000). Process workshops - a new model for instruction. Journal of Chemical Education, 77(1), 120-130.
Heller, P. & Heller, K. (1999). Cooperative Group Problem Solving in Physics. Minnesota: University of Minnesota.
McInnis, C., James, R. & Hartley, R. (2000). Trends in the First Year Experience in Australian Universities. Canberra: Department of Education, Training and Youth Affairs.
Redish, E. F. (1999). Millikan Award Lecture (1998): Building a Science of Teaching Physics. American Journal of Physics, 67, 562-573. http://www.physics.umd.edu/perg/papers/redish/mlknpre.pdf [verified 21 Oct 2004].
Redish, E. F. (2002). Teaching Physics With the Physics Suite. Online Book. Department of Physics, University of Maryland. [Feb 2003, verified 21 Oct 2004] http://www2.physics.umd.edu/~redish/Book/
Walsh, M. (2003). Teaching qualitative analysis using QSR NVivo. The Qualitative Report, 8(2), 251-256. http://www.nova.edu/ssss/QR/QR8-2/walsh.pdf
Zuber-Skerritt, O. (1992). Action Research in Higher Education: Examples and Reflections. London: Kogan Page.
Please cite as: McAlpine, I., Wilson, K., Russell, C. & Cunningham, M. (2004). An online diary as a research and evaluation tool for first year physics. In R. Atkinson, C. McBeath, D. Jonas-Dwyer & R. Phillips (Eds), Beyond the comfort zone: Proceedings of the 21st ASCILITE Conference (pp. 616-622). Perth, 5-8 December. http://www.ascilite.org.au/conferences/perth04/procs/mcalpine.html
© 2004 Iain McAlpine, Kate Wilson, Carol Russell and Maria Cunningham
The authors assign to ASCILITE and educational non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The authors also grant a non-exclusive licence to ASCILITE to publish this document on the ASCILITE web site (including any mirror or archival sites that may be developed) and in printed form within the ASCILITE 2004 Conference Proceedings. Any other usage is prohibited without the express permission of the authors.