Patterns of use of a CAL assessment program used by overseas qualified pharmacists preparing for a registration exam

Arthur Pappas
Victorian College of Pharmacy
Branko Cesnik
Centre of Medical Informatics, Monash University
Julia Hoffman
Australian Pharmacy Examining Council Inc.

Two computer based educational modules were developed using the authoring package ToolBook Multimedia CBT Edition (Asymetrix Corporation), operating on a Windows® platform, and distributed on disk. They included tuition and assessment in the areas of pharmaceutical calculations (M1) and biopharmaceutics (M2) and were aimed at assisting overseas qualified pharmacists preparing for an Australian registration exam. Presented are the results of evaluations for the assessment programs' level of achievement of set educational objectives, frequency and patterns of use. Respondents (n=32) rated 'high-very high' the extent to which the following objectives were met: encouragement of testing of tutorial knowledge (M1: 90.3%; M2: 90.0%); provision of practice using the examination question format (M1: 90.0%; M2: 96.5%); encouragement of review of tutorials based on test performance (M1: 83.9%; M2: 86.7%). The analysis of data disks (n=58) on assessment program use indicated: total tests: (M1: 1,713; M2: 1,152); questions attempted: (M1: 20,900; M2: 15,921); timer option used: this was selected in 36.9% of total tests; period of testing with timer (118.4h) and without timer (598.2h) (maximum allowable question time recorded). Overall, the assessment programs were found to be very valuable components of each module. They were perceived to have met their set objectives and were used extensively.

Introduction

The use of computer assisted learning (CAL) as a mode of education in pharmacy has become increasingly popular overseas during the last decade.[1,2] This trend has also included its implementation as a means of continuing education.[3,4] Here in Australia there is also an emerging interest in exploring its value at both undergraduate and graduate levels.[5-7] Studies have shown that it is of comparable effectiveness to traditional modes of teaching such as lectures.[8,9] Some of the major benefits of this mode include the encouragement of self-directed and self-paced learning. Even though some of the drawbacks include substantial development costs and the necessity for appropriate computer facilities, it has been shown that, in the long term, CAL can be a cost-effective alternative to traditional teaching methods.[10]

The software described in this article was developed as a distance education course to assist overseas qualified pharmacists residing within Australia to prepare for an examination which forms part of the registration requirements.

In Australia, the recognition of qualifications of foreign graduates is overseen by the National Office of Overseas Skills Recognition (NOOSR). Ensuring that pharmacists seeking registration meet the relevant standards is a challenging task for Australian professional bodies. The Australian Pharmacy Examining Council Incorporated (APEC) is the equivalent of the British Adjudicating Committee[11] and the US Foreign Pharmacy Graduate Examination Commission[12] and is delegated to develop acceptable examination procedures. These countries have essentially similar protocols which consist of preliminary knowledge examinations followed by practical experience, prior to pre-registration competency examination(s). In Australia, candidates must first pass an Occupational English Test (OET) before being able to sit for the Stage I Examination, which consists of two MCQ examinations (Paper 1: Subjects: Pharmaceutical Chemistry, Pharmacology, Physiology; Paper 2: Subjects: Pharmaceutics, Therapeutics). Candidates passing Stage I are eligible to commence up to 12 months supervised practical training, followed by the Stage II Examination. This consists of examinations set by the registering authority of each State which essentially address the attainment of a range of Australian adopted competencies[13]. The problems (language, cultural and professional) experienced by candidates in Australia are similar to those overseas[14]. Many candidates are from non-English speaking countries; the period since registration varies greatly and so does the content of the pharmacy course originally undertaken. European and especially Middle Eastern nations have substantially different courses and the practice of pharmacy is also markedly different to that of Australia. APEC assists candidates by the provision of an Information Handbook[15] setting out the requirements, lists of study references, examination requirements and provides liaison with an academic advisor. 
Since there is no formal curriculum, APEC and NOOSR have been pro-active in exploring 'bridging courses' for tuition and assessment in areas of need, to make candidates aware of the expected standards of knowledge and practice.

Objectives

The aims of this study were to: (a) highlight potential subjects for the provision of bridging courses; (b) develop computer-based modules that included tuition and assessment reflective of the standard expected in the Stage I APEC Examination; (c) determine the extent to which set educational objectives were achieved (only the assessment component is described in this paper); and (d) analyse the frequency and patterns of use of the assessment mode.

Methods

A preliminary survey (available from the authors) of past and present APEC candidates highlighted several subject areas suitable for course development. The two chosen for this study were pharmaceutical calculations and biopharmaceutics.

Content development

The Tutorial Program was intended to be a concise summary of the most important points and contained sufficient background tuition to competently perform a particular type of calculation or understand a concept. Further information could be found in a list of references. It was recommended that candidates proceed through the tutorial topics sequentially and complete a topic before proceeding to the Test Menu for assessment on that topic. On completion of a tutorial, the option of undertaking any number of tests was available.

The Assessment Program in each module contained multiple choice questions (MCQs) related to the content of the specific tutorial. There was only one correct answer from the five answer options presented. There was also the facility to receive questions randomly from all test topics. The aim of the Assessment Program in each module was to provide the candidate with the opportunity to test their knowledge using a question format that is currently implemented for the APEC Stage I Exam. The number of questions within each topic varied, but was in the order of twenty.

Module 1: Pharmaceutical Calculations (M1)

The Tutorial Program contained tuition on 14 topics (Table 1). Each topic commenced with a brief introduction and a set of objectives. The specific aims for M1 tutorials were to provide: broad, but adequate, tuition in performing the common pharmaceutical calculations; worked examples for understanding the process of performing calculations; 'Quiz Questions' (with hidden pop-up answer) to test the candidate's own ability. The worked examples were intended to take the candidate step by step through a particular calculation. The aim was to solve a problem using the best method given the available information.

Table 1

Topics in Module 1
  1. Units and Conversions
  2. Percentages and Conversions
  3. Density and Specific Gravity
  4. Manipulating Pharmaceutical Calculations
  5. Weighing and Measuring
  6. Dilution of Liquid Formulations
  7. Dilution of Solid/Semi Solid Formulations
  8. Body Cavity Delivery Systems
  9. Millimoles, Milliequivalents and Milliosmoles
  10. Isosmotic and Isotonic Solutions
  11. Buffer Solutions
  12. Drug Stability
  13. Molecular Manipulations
  14. Posology

Topics in Module 2
  1. Rate-limiting steps in drug absorption
  2. Official disintegration and dissolution tests
  3. The passage of drugs across biological membranes
  4. Gastro-intestinal absorption of drugs

The Assessment Program contained at least 15 questions for each topic (total of 271 MCQs).

Module 2: Biopharmaceutics (M2)

The Tutorial Program comprised tuition on 4 topics (Table 1). The specific aim for M2 tutorials was to provide thorough coverage of an area of biopharmaceutics without assuming significant prior knowledge. The 'Quiz Questions' were intended to give the candidate experience in solving a problem after having observed the working out of a similar problem. Candidates were encouraged to solve the problem before revealing the worked answer. On selection of an answer option, feedback was immediately provided.

The Assessment Program contained at least 30 questions for each topic title (total of 126 MCQs).

Program design and evaluation

Two computer based modules were written using the authoring package ToolBook Multimedia CBT Edition® (Asymetrix Corporation), operating on a Windows® platform (Figs. 1 and 2). On entering each module a main menu page enabled selection of either the Tutorial or Assessment Program (Fig. 3).

Figure 1
Figure 1: Module 1

Figure 2
Figure 2: Module 2

System requirements: These were: Microsoft Windows® 3.1 or higher; a Windows®-compatible computer with a 20MHz 80386 SX processor or higher; a Windows®-compatible mouse; a 1.44MB (3.5 inch) disk drive; a hard disk drive with approximately 12MB of free disk space; at least 4MB of random-access memory (RAM); 8MB or more was recommended; VGA, superVGA or other Windows®-compatible monitor, running at 800x600 resolution in small font mode was recommended.

Tutorial Program: This consists of the tutorial menu, listing each tutorial topic. Selection of a particular topic transfers to the Topic Index page (Figs. 4 and 5). This page contains a list of hypertext sub-topics within a particular tutorial.

Figure 3
Figure 3: Main Menu Page

Figure 4
Figure 4: Tutorial Menu Page

Tool Bar Buttons: The function of each Tool Bar Button (Tutorial Program) located at the bottom of each page is described in the blue message bar (on-line help) whenever the mouse pointer is placed over that particular button. Buttons have a: navigational or directional role (eg: go to the next page, the topic index, the test menu); specific function (eg: display a calculator, print a page); pop-up function (eg: view explanation or additional material) (Figs. 6-8).

Figure 5
Figure 5: Topic Index Page

Figure 6
Figure 6: Worked Example

Sections visited: As a means of keeping a record of sections visited, the colour of the titles in the Topic Index is converted from blue to black. On leaving the module, the option exists for the settings to be saved or cancelled.

Figure 7
Figure 7: Quiz Question

Figure 8
Figure 8: Options at end of topic

Hypertext: This refers to text that when clicked transfers the user to relevant information. The presence of hypertext is identified by the conversion of the mouse pointer into a hand symbol whenever the pointer rests over that text. For example, in the Tutorial Program, text written in red indicated the presence of hypertext which may function as a pop-up window of information (eg: answer after a quiz question) or as a jump which transfers to another location within the program. Similarly the list of topics in the Tutorial Menu and Test Menu are hypertext.

Assessment Program: Fifteen questions are randomly selected each time a test is undertaken. Candidates are encouraged to perform several tests after completing a particular topic. After reviewing all tutorials and completing tests on each respective topic, it was recommended that candidates perform composite tests by selecting the 'questions from all topics' option.
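The original modules were authored in ToolBook's OpenScript rather than a modern language, but the per-test selection described above (up to 15 questions drawn at random, from one topic's bank or from all topics combined) can be sketched in Python. All names here are illustrative, not taken from the original program:

```python
import random

def select_test_questions(question_bank, topic=None, test_size=15):
    """Draw up to test_size questions at random, without repeats.

    question_bank maps topic names to lists of questions; passing
    topic=None mimics the 'questions from all topics' option.
    """
    if topic is None:
        pool = [q for qs in question_bank.values() for q in qs]
    else:
        pool = question_bank[topic]
    # A topic's bank may hold fewer than test_size questions, so cap the draw.
    return random.sample(pool, min(test_size, len(pool)))

# Hypothetical banks: M1 topics held at least 15 questions each.
bank = {
    "Units and Conversions": [f"U{i}" for i in range(20)],
    "Posology": [f"P{i}" for i in range(16)],
}
test = select_test_questions(bank, topic="Units and Conversions")
print(len(test))  # 15
```

Drawing with `random.sample` (rather than repeated `random.choice`) guarantees no question repeats within a single test.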

Starting a test: A test can be commenced by entering the Test Menu either from the Main Menu or directly from any page in a tutorial by pressing the Test Menu button (Fig. 9). Candidates were advised that test scores did not form any part of the Stage I Exam result and that there were no penalties for failing tests or for doing a large number of tests.

Figure 9
Figure 9: Test Menu Page

Figure 10
Figure 10: Timer Option

Test format: A test has the following features: up to 15 randomly selected questions per test; ability to stop a test at any time; one attempt allowed for answer selection; the option to time answering of questions (V1.1 70 secs; V1.2 120 secs); no deductions for incorrect answers; provision of an explanation (working out - Module 1 only)

Timing a test question: The option for timing a test question is provided at the start of each new test (Fig. 10). It was recommended that candidates initially proceed through a test without the timer, to get a feel for the requirements of the questions. In later stages, when candidates were fine-tuning their examination technique, it was thought beneficial for them to note their efficiency in solving a problem. If a question was not answered within the time allowed, the answer was automatically indicated on the screen and a zero mark given for that question.
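The marking rules above (one attempt per question, no deduction for errors, and a timed-out question revealing its answer and scoring zero) can be expressed as a small pure function. This is a sketch under our own naming, not the original ToolBook logic:

```python
def mark_question(selected, correct, elapsed_s, timer_on, limit_s=120):
    """Mark one question as the modules did: with the timer on, a
    question not answered within the limit (70 s in V1.1, 120 s in
    V1.2) reveals the answer and scores zero; otherwise the single
    attempt is marked right or wrong, with no deduction for errors.
    """
    if timer_on and elapsed_s > limit_s:
        return {"outcome": "timeout", "mark": 0, "reveal": correct}
    if selected == correct:
        return {"outcome": "correct", "mark": 1, "reveal": correct}
    return {"outcome": "incorrect", "mark": 0, "reveal": correct}

print(mark_question("B", "B", 45, timer_on=True))
# {'outcome': 'correct', 'mark': 1, 'reveal': 'B'}
```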

Test question page: A typical test question page contains the question and five answer options. The answer is selected by placing the mouse pointer over the desired option and clicking once. A 'hand pointer' or 'cross' indicates whether the selection was correct or incorrect, respectively. Candidates have the option of viewing the written explanation, before proceeding to the next question. The information presented at the top of each test question page is: the topic name, time remaining to answer the question (if the timer was selected); the number of the question in the test (maximum of 15 questions); the percentage of correct responses so far (Figs. 11 and 12).
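The continuously updated "percentage of correct responses so far" shown at the top of each question page is simply correct answers over questions attempted; a minimal sketch (function name is ours):

```python
def running_score(outcomes):
    """Percentage of correct responses so far; an exited or timed-out
    question counts as attempted with zero marks, as in the modules.
    """
    if not outcomes:
        return 0.0
    correct = sum(1 for o in outcomes if o == "correct")
    return round(100.0 * correct / len(outcomes), 1)

print(running_score(["correct", "correct", "incorrect"]))  # 66.7
```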

Figure 11
Figure 11: Question Page

Figure 12
Figure 12: Full working out

Tool Bar Buttons: The function of each Tool Bar Button (Assessment Program) located at the bottom of each page is described in the blue message bar (on-line help) whenever the mouse pointer is placed over that particular button. Buttons have a: navigational or directional role (eg: go to the next question, stop a test); specific function (eg: display a calculator); pop-up function (eg: view explanation or additional material).

Assessment Summary Page: On prematurely stopping a test, or after completion of the fifteen questions, a percentage mark is provided (Fig. 13). At this point the candidate has several options: print a copy of the results; view a history of previous performances (date, timer, duration, module number, topic name, total questions, percentage mark) (Fig. 14); undertake more revision of the current test topic; go to the Test Menu to select another test topic; or exit to the last tutorial page.

Figure 13
Figure 13: Assessment Summary Page

Figure 14
Figure 14: Past Results Page

Program installation: The package consists of 4 installation disks and one Results Disk. The installation procedure follows a Windows® method of: File, Run, Setup with progress indicated on screen.

Results Disk: The Assessment Program has the facility for recording the details of all tests undertaken onto a floppy disk. The information was gathered for course development purposes. The program cannot be operated without inserting the Results Disk into the disk drive. Candidates returned the disk as proof of module completion. When the disk was received, a password was issued for continued use of the program without the need for a disk. The Results Disk stores the following information for each test: candidate name (optional), test date, timer on/off (if the timer was off, the maximum permitted time was recorded for each question), duration of test, module number, topic number and name, number of questions, question number (eg: 2.15 was the fifteenth question for a test on topic 2), whether each question was answered correctly, incorrectly or exited before a response was made, and the final percentage score.
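The per-test record listed above can be pictured as one delimited row per test. The actual on-disk layout of the Results Disk is not documented here; the field names and encoding below are a hypothetical modern equivalent using Python's csv module:

```python
import csv
import io

# Field names paraphrase the per-test record described in the text;
# they are our labels, not the original file format.
FIELDS = ["name", "date", "timer", "duration_min", "module",
          "topic", "n_questions", "outcomes", "score_pct"]

def write_test_record(fh, record):
    """Append one test's details as a CSV row."""
    csv.DictWriter(fh, fieldnames=FIELDS).writerow(record)

buf = io.StringIO()
write_test_record(buf, {
    "name": "", "date": "1996-09-02", "timer": "off",
    "duration_min": 18, "module": 1,
    "topic": "2 Percentages and Conversions", "n_questions": 15,
    # Question ids follow the text's convention, eg 2.15 is the
    # fifteenth question of a test on topic 2.
    "outcomes": "2.1:correct|2.2:incorrect|2.3:exited",
    "score_pct": 73.3,
})
print("1996-09-02" in buf.getvalue())  # True
```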

Candidate's Guide: This was a 14 page booklet outlining system requirements, installation procedure, description of the Tutorial and Test Programs, explanation of navigation buttons and general features using screen images.

Program Release and Update: Version 1.1 was released for the September 1996 Examination (19 respondents) and Version 1.2 for the March 1997 Examination (13 respondents). In Version 1.2, the typographical errors of Version 1.1 were corrected and the following upgrades were made. Firstly, the Results Disk contained an additional database for recording the time spent within each tutorial. Secondly, a Past Results Page recorded details of a candidate's previous test performances on their hard drive (eg: date, timer on/off, duration of test in minutes, module number, topic name, total number of questions and percentage score).

Module evaluation: Thirty-two candidates who elected to use the package also completed an eight page mail questionnaire (available from the authors). Areas for evaluation were: installation procedure; self-rating of computer skills; usefulness of the Candidate's Guide; rating of achievement of set objectives; comparison with other study methods; usefulness as a study resource and preparation for the Stage I Examination; and evaluation of various program features: navigation, presentation and a range of features within the Assessment Program. Candidate feedback to most questions was via selection from a five point Likert scale.[16]

Results

Results - Evaluation survey

Sample characteristics: The thirty-two candidates in this sample were registered in the following periods: 1966-76 (16.2%), 1977-86 (41.9%), 1987-96 (41.9%).

Site of program use: home (23); computer centre (1); library (1) and 'other' (7). Program installation: of 21 candidates who self-installed the program, 18 found the procedure 'easy-very easy'.

Self-rating of computer skills: Most (20) personally rated their computer skills as moderate and eight rated them as 'high-very high'. Only four candidates reported 'technical' problems and these related to improper installation and 'mouse' clicking. The Candidate's Guide provided was found to be 'useful-very useful' by 24 candidates.

Achievement of objectives: A 'high-very high' rating was given by candidates for the extent to which the following objectives were achieved: encouragement of testing of tutorial knowledge (M1: 90.3%; M2: 90.0%); provision of practice using the examination question format (M1: 90.0%; M2: 96.5%); encouragement of review of tutorials based on test performance (M1: 83.9%; M2: 86.7%).

Compared to using other study methods: The modules overall were rated 'better-much better' in the following ways: enthusiasm to read particular topic (M1: 93.8%, M2: 93.8%); understanding of the material (M1: 93.7%, M2: 93.6%); testing one's knowledge (M1: 96.9%, M2: 100.0%) and as an overall study source (M1: 96.9%, M2: 93.7%). All respondents rated the module 'useful-very useful' in improving overall subject knowledge.

As a preparation for the APEC exam: The modules overall were rated 'useful-very useful' as a preparation for the exam (M1: 90.4%, M2: 93.5%). More than 70% of respondents using either module found the level of language expression 'appropriate'. The modules as a whole were viewed as: a 'complementary' study source (M1: 66.7%, M2: 73.3%); a 'sole' means of study (M1: 33.3%, M2: 26.7%).

Common program features: In terms of navigation, the tutorial and assessment programs were rated 'easy-very easy' by 93.8% and 93.5% of respondents, respectively. The presentation quality (including layout, design, graphics) of the tutorial and assessment programs were rated 'high-very high' by 90.6% and 93.5% of respondents, respectively.

Assessment Program features: The following features of the assessment program were rated 'useful-very useful': option to time a test (80.0%); continuous scoring (93.5%); stop test at any time (93.5%); display a calculator (45.1%); view the working of a question [M1 only] (96.8%); reasonable number of questions for each topic (100%); random selection of questions for each topic (100%) and for all topics (96.8%); a large data bank of questions (100%); a score and comment at the end of each test (90.0%); view details of previous test performances (90.0%) [Version 1.2].

Further module development: Almost all respondents (93.8%) stated that they would like further modules to be developed and popular suggested topics were: pharmaceutical/medicinal chemistry, pharmacology, drug interactions and therapeutics.

Results - Assessment program usage statistics

The data collected from the Results Disks of 58 candidates were analysed. Data were gathered on Assessment Program usage only, not on Tutorial Program usage, because candidates were provided with a hard copy of the tutorials as well as the software.

Tests
A total of 2,865 tests of various lengths were undertaken (M1: 1,713; M2: 1,152). There were periods of high frequency use of both Test Programs in the weeks leading up to each exam date (ie: March and September) (Figs. 15 and 16).

Questions
A total of 36,821 questions were attempted (M1: 20,900; M2: 15,921). Candidates' overall performance in the questions was as follows: M1: (79.7% correct, 17.6% incorrect, 2.7% exited before answer selection); M2: (86.1% correct, 13.2% incorrect, 0.7% exited before answer selection).

Timer option
The timer was used in 36.9% of total tests (M1: 24.3%; M2: 55.7%).

Period of testing
The period of testing with the timer 'on' was 118.4 hours, and with the timer 'off' (maximum allowable question time recorded) 598.2 hours. The total testing period for both modules was therefore 716.6 hours.

Average time attempting each question
With the timer 'on' the average period taken to answer a question was: (M1: 53.2s, M2: 19.1s). This was to be expected since solutions to calculations take longer to work out than answering biopharmaceutics questions that tested a candidate's interpretation of facts. When the timer was 'off', the average period for answering a question was similar for both modules (M1: 92.1s, M2: 96.4s).

Figure 15
Figure 15: Number of tests undertaken versus date (M1)
(arrow indicates exam date)

Figure 16
Figure 16: Number of tests undertaken versus date (M2)
(arrow indicates exam date)

Figure 17
Figure 17: Performance on individual questions

Analysis of question performance
The overall performance of candidates for each question, within each topic, was analysed. The following was graphed (Fig. 17): the percentage of times each question was answered correctly, incorrectly or exited (without making a selection). This provided useful information on how each question was handled by candidates. Any question on which overall performance was poor was checked for ambiguities and errors.
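The per-question analysis graphed in Fig. 17 amounts to tallying outcomes for each question across all recorded tests. A sketch of that aggregation, with made-up data and names of our own choosing:

```python
from collections import Counter

def question_performance(records):
    """For each question id, the percentage of attempts answered
    correctly, incorrectly, or exited without a selection.

    records is a list of (question_id, outcome) pairs, as might be
    read back from the Results Disks.
    """
    tallies = {}
    for qid, outcome in records:
        tallies.setdefault(qid, Counter())[outcome] += 1
    perf = {}
    for qid, c in tallies.items():
        total = sum(c.values())
        perf[qid] = {k: round(100.0 * c[k] / total, 1)
                     for k in ("correct", "incorrect", "exited")}
    return perf

# Illustrative data: four recorded attempts at question 2.1.
records = [("2.1", "correct"), ("2.1", "correct"),
           ("2.1", "incorrect"), ("2.1", "exited")]
print(question_performance(records)["2.1"])
# {'correct': 50.0, 'incorrect': 25.0, 'exited': 25.0}
```

A question whose bar in such a chart showed an unusually high incorrect or exited share would be the kind flagged for checking.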

Discussion

This was the first Australian development of a computer package aimed at overseas qualified pharmacists to prepare them for their registration exam. Even though this sample was heterogeneous in terms of year of registration, place of training and previous computer experience, the feedback on the educational value of the package was quite positive.

Since candidates resided in all corners of the country, it was important that this distance course was easy to self-install and operate. We succeeded in that regard, since most respondents felt comfortable doing this. Educationally, there were high ratings (>80% 'high-very high') for the accomplishment of the set objectives: testing of tutorial content, practice using the exam format and tutorial review based on test performance. When the modules were compared to other study methods, they consistently rated 'better-much better' in terms of enthusiasm to read a particular topic, understanding the material and testing one's knowledge. It was encouraging that more than 90% of respondents found each module 'useful-very useful' as a preparation for the exam. The package was seen by about two thirds of respondents as a complementary source of study for the Stage I Exam. This is understandable, since the package covered only about a tenth of the subjects covered in the exam.

The program design was deliberately kept simple and all features were described using on-line help. The navigational and presentational qualities of the modules were given high scores, and the program features were also consistently rated highly, except for the calculator option. We believe that candidates found it easier to use their own calculator rather than the 'pop-up' one, because the latter obstructed their view of the question.

Conclusions

The assessment programs were seen as a valuable and acceptable way of self-testing knowledge and facilitating review in the chosen subject areas. The data collected indicated a high level of usage by candidates. All program features were positively viewed, except for the calculator display option. Finally, the positive response to these modules was summarised by over 93% of respondents requesting the development of further programs.

References

  1. Mottram, DR, Armstrong, DJ. Computer-assisted learning at Liverpool School of Pharmacy. Pharm J, 1994; 253: 103.

  2. Verheul, J. Computer assisted learning (CAL) in pharmacy education and training. Pharm J, 1992; 249: 467-9.

  3. Daly, JE, Nicholls, DK, Brain, KR, Grassby, PF, Temple, DJ. Pharm J, 1990; 245 (suppl.): E1-3.

  4. Dhalla M. Approaching computer assisted learning from different angles. Pharm J, 1992; 248: 684-6.

  5. Gee P, Peterson G. Computer assisted learning. Does it have a place in continuing education? Aust Pharm, 1995; 14 (10): 614-18.

  6. Gee P, Peterson G. Pharmtutor, a computer-assisted learning package for community pharmacists. Aust Pharm 1994; 13: 123-8.

  7. Pappas, A, Papalazarou, P, Marty, S, Cesnik, B, Roller, L, Reed, BL et al. Computers for practice: How acceptable are they as a reference and CPE resource ? Aust Pharm, 1996; 15 (7): 403-6, 431.

  8. Kulik, CC, Kulik, JA, Shwalb, BJ. The effectiveness of computer-based adult education: a meta analysis. J Educ Comp Res, 1986; 2: 235-52.

  9. Clem, JR, Murry, DJ, Perry, PJ, Alexander, B, Holman, TL. Performance in a clinical pharmacy clerkship: computer aided instruction versus traditional lectures. Am J Pharm Educ, 1992; 56: 259-63.

  10. Gee, MA, Oszko, MA, White, SJ, Scott, BE. Evaluation of computer-assisted instruction versus traditional new-employee training. Am J Hosp Pharm, 1988; 45: 2107-12.

  11. Registration of overseas pharmacists: 25 years of the Adjudicating Committee. Pharm J, 1988; 241:50-1.

  12. Abood, RR. Aliens and foreign graduates in pharmacy. US Pharm, 1984; 9: 13-55.

  13. Competency Standards for Entry Level Pharmacists in Australia, 1994 PSA: Canberra, ACT, Australia.

  14. Brody, R. Plight of foreign RPhs who want to practice here. Am Drug, 1981; 183: 17-22.

  15. Pharmacy Candidates' Information Handbook, 1997: NOOSR, DEETYA, Canberra, ACT, Australia.

  16. Likert R. A technique for the measurement of attitudes. New York: Columbia University Press; 1932.

The authors acknowledge the financial support of NOOSR through DEETYA, Canberra, ACT, Australia.

Authors: Arthur Pappas, Lecturer in Pharmacy Practice (address correspondence)
Victorian College of Pharmacy
Monash University
381 Royal Parade
Parkville Vic 3052 Australia
Email: arthur.pappas@vcp.monash.edu.au

Branko Cesnik, Director, Centre of Medical Informatics
Peninsula Campus, Monash University
McMahons Road, Frankston Vic 3199 Australia
Email: branko.cesnik@med.monash.edu.au

Julia Hoffman, Registrar, Australian Pharmacy Examining Council Inc.
Level 6, 10 Mort Street, Canberra City, ACT 2601 Australia
Ph: (06) 240 7614

Please cite as: Pappas, A., Cesnik, B. and Hoffman, J. (1998). Patterns of use of a CAL assessment program used by overseas qualified pharmacists preparing for a registration exam. In C. McBeath and R. Atkinson (Eds), Planning for Progress, Partnership and Profit. Proceedings EdTech'98. Perth: Australian Society for Educational Technology. http://www.aset.org.au/confs/edtech98/pubs/articles/pappas.html


© 1998 The author and ASET.