Patterns of use of a CAL assessment program used by overseas qualified pharmacists preparing for a registration exam

Arthur Pappas, Victorian College of Pharmacy
Branko Cesnik, Centre of Medical Informatics, Monash University
Julia Hoffman, Australian Pharmacy Examining Council Inc.
Two computer based educational modules were developed using the authoring package ToolBook Multimedia CBT Edition (Asymetrix Corporation), operating on a Windows® platform, and distributed on disk. They included tuition and assessment in the areas of pharmaceutical calculations (M1) and biopharmaceutics (M2) and were aimed at assisting overseas qualified pharmacists preparing for an Australian registration exam. Presented are the results of evaluations of the assessment programs' level of achievement of set educational objectives, frequency and patterns of use. Respondents (n=32) rated 'high-very high' the extent to which the following objectives were met: encouragement of testing of tutorial knowledge (M1: 90.3%; M2: 90.0%); provision of practice using the examination question format (M1: 90.0%; M2: 96.5%); encouragement of review of tutorials based on test performance (M1: 83.9%; M2: 86.7%). The analysis of data disks (n=58) on assessment program use indicated: total tests: (M1: 1,713; M2: 1,152); questions attempted: (M1: 20,900; M2: 15,921); timer option used: this was selected in 36.9% of total tests; period of testing with timer (118.4h) and without timer (598.2h) (maximum allowable question time recorded). Overall, the assessment programs were found to be highly valuable components of each module. They were perceived to have met their set objectives and were used extensively.
The software described in this article was developed as a distance education course to assist overseas qualified pharmacists residing within Australia to prepare for an examination which forms part of the registration requirements.
In Australia, the recognition of qualifications of foreign graduates is overseen by the National Office of Overseas Skills Recognition (NOOSR). Ensuring that pharmacists seeking registration meet the relevant standards is a challenging task for Australian professional bodies. The Australian Pharmacy Examining Council Incorporated (APEC) is the equivalent of the British Adjudicating Committee[11] and the US Foreign Pharmacy Graduate Examination Commission[12] and is delegated to develop acceptable examination procedures. These countries have essentially similar protocols which consist of preliminary knowledge examinations followed by practical experience, prior to pre-registration competency examination(s). In Australia, candidates must first pass an Occupational English Test (OET) before being able to sit for the Stage I Examination, which consists of two MCQ examinations (Paper 1: Subjects: Pharmaceutical Chemistry, Pharmacology, Physiology; Paper 2: Subjects: Pharmaceutics, Therapeutics). Candidates passing Stage I are eligible to commence up to 12 months supervised practical training, followed by the Stage II Examination. This consists of examinations set by the registering authority of each State which essentially address the attainment of a range of Australian adopted competencies[13]. The problems (language, cultural and professional) experienced by candidates in Australia are similar to those overseas[14]. Many candidates are from non-English speaking countries; the period since registration varies greatly and so does the content of the pharmacy course originally undertaken. European and especially Middle Eastern nations have substantially different courses and the practice of pharmacy is also markedly different to that of Australia. APEC assists candidates by the provision of an Information Handbook[15] setting out the requirements, lists of study references, examination requirements and provides liaison with an academic advisor. 
Since there is no formal curriculum, APEC and NOOSR have been pro-active in exploring 'bridging courses' for tuition and assessment in areas of need, to make candidates aware of the expected standards of knowledge and practice.
The Assessment Program in each module contained multiple choice questions (MCQs) related to the content of the specific tutorial. There was only one correct answer among the five answer options presented. There was also the facility to receive questions randomly from all test topics. The aim of the Assessment Program in each module was to provide the candidate with the opportunity to test their knowledge using a question format that is currently implemented for the APEC Stage I Exam. The number of questions within each topic varied, but was typically about twenty.
Topics in Module 1 (table not reproduced)

Topics in Module 2 (table not reproduced)
The Module 1 Assessment Program contained at least 15 questions for each topic (total of 271 MCQs).
The Module 2 Assessment Program contained at least 30 questions for each topic title (total of 126 MCQs).
Figure 2: Module 2
System requirements: These were: Microsoft Windows® 3.1 or higher; a Windows®-compatible computer with a 20MHz 80386 SX processor or higher; a Windows®-compatible mouse; a 1.44MB (3.5 inch) disk drive; a hard disk drive with approximately 12MB of free disk space; at least 4MB of random-access memory (RAM); 8MB or more was recommended; VGA, superVGA or other Windows®-compatible monitor, running at 800x600 resolution in small font mode was recommended.
Tutorial Program: This consists of the tutorial menu, listing each tutorial topic. Selection of a particular topic transfers to the Topic Index page (Figs. 4 and 5). This page contains a list of hypertext sub-topics within a particular tutorial.
Figure 4: Tutorial Menu Page
Tool Bar Buttons: The function of each Tool Bar Button (Tutorial Program) located at the bottom of each page is described in the blue message bar (on-line help) whenever the mouse pointer is placed over that particular button. Buttons have a navigational or directional role (eg: go to the next page, the topic index, the test menu), a specific function (eg: display a calculator, print a page), or a pop-up function (eg: view explanation or additional material) (Figs. 6-8).
Figure 6: Worked Example
Sections visited: As a means of keeping a record of sections visited, the colour of the titles in the Topic Index is converted from blue to black. On leaving the module, the option exists for the settings to be saved or cancelled.
Figure 8: Options at end of topic
Hypertext: This refers to text that when clicked transfers the user to relevant information. The presence of hypertext is identified by the conversion of the mouse pointer into a hand symbol whenever the pointer rests over that text. For example, in the Tutorial Program, text written in red indicated the presence of hypertext which may function as a pop-up window of information (eg: answer after a quiz question) or as a jump which transfers to another location within the program. Similarly the list of topics in the Tutorial Menu and Test Menu are hypertext.
Assessment Program: Fifteen questions are randomly selected each time a test is undertaken. Candidates are encouraged to perform several tests after completing a particular topic. After reviewing all tutorials and completing tests on each respective topic, candidates were advised to perform composite tests by clicking the 'questions from all topics' option.
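The test-assembly rule just described can be sketched as follows. This is an illustrative reconstruction in Python (the original software was authored in ToolBook, whose internals are not described here); the function name, topic names and bank sizes are hypothetical.

```python
import random

# Sketch of the test-assembly rule: each test draws up to 15 questions at
# random, either from one topic's question bank or, for a composite test,
# from the pooled banks of all topics. Names here are illustrative only.
TEST_LENGTH = 15

def build_test(question_banks, topic=None):
    """question_banks: dict mapping topic name -> list of question ids.
    topic=None corresponds to the 'questions from all topics' option."""
    if topic is None:
        pool = [q for bank in question_banks.values() for q in bank]
    else:
        pool = list(question_banks[topic])
    # Draw without replacement; a bank smaller than 15 yields a shorter test.
    return random.sample(pool, min(TEST_LENGTH, len(pool)))

# Hypothetical banks for two topics.
banks = {"Dilutions": [f"D{i}" for i in range(20)],
         "Isotonicity": [f"I{i}" for i in range(18)]}
single_topic_test = build_test(banks, "Dilutions")
composite_test = build_test(banks)
```

Drawing without replacement means no question repeats within one test, while repeated tests on the same topic present different random subsets of the bank.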
Starting a test: A test can be commenced by entering the Test Menu either from the Main Menu or directly from any page in a tutorial by pressing the Test Menu button (Fig. 9). Candidates were advised that test scores did not form any part of the Stage I Exam result and that there were no penalties for failing tests or for doing a large number of tests.
Figure 10: Timer Option
Test format: A test has the following features: up to 15 randomly selected questions per test; ability to stop a test at any time; one attempt allowed for answer selection; the option to time answering of questions (V1.1 70 secs; V1.2 120 secs); no deductions for incorrect answers; provision of an explanation (working out - Module 1 only)
Timing a test question: The option for timing a test question is provided at the start of each new test (Fig. 10). It was recommended that candidates initially proceed through a test without the timer to get a feel for the requirements of the questions. In later stages, when candidates were fine-tuning their examination technique, it was thought beneficial to note their efficiency in solving a problem. If a question was not answered within the time allowed, the answer was automatically indicated on the screen and a zero mark given for that question.
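The timing rule above can be expressed compactly; this is a hedged sketch, not the original ToolBook logic, and the function name is hypothetical. The per-question limits are those stated for each version (V1.1: 70 s; V1.2: 120 s).

```python
# Sketch of the per-question timing rule: with the timer on, a question not
# answered within the limit scores zero and the correct answer is revealed.
# Limits differ by version (V1.1: 70 s; V1.2: 120 s). Illustrative only.
TIME_LIMIT_S = {"1.1": 70, "1.2": 120}

def score_question(selected, correct, elapsed_s, version="1.2", timer_on=True):
    """Return (mark, answer_revealed_by_timeout) for one attempt."""
    if timer_on and elapsed_s > TIME_LIMIT_S[version]:
        return 0, True   # timed out: zero mark, answer shown automatically
    return (1 if selected == correct else 0), False
```

With the timer off, the elapsed time plays no part in scoring; only the selected option matters.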
Test question page: A typical test question page contains the question and five answer options. The answer is selected by placing the mouse pointer over the desired option and clicking once. A 'hand pointer' or 'cross' indicates whether the selection was correct or incorrect, respectively. Candidates have the option of viewing the written explanation, before proceeding to the next question. The information presented at the top of each test question page is: the topic name, time remaining to answer the question (if the timer was selected); the number of the question in the test (maximum of 15 questions); the percentage of correct responses so far (Figs. 11 and 12).
Figure 12: Full working out
Tool Bar Buttons: The function of each Tool Bar Button (Assessment Program) located at the bottom of each page is described in the blue message bar (on-line help) whenever the mouse pointer is placed over that particular button. Buttons have a navigational or directional role (eg: go to the next question, stop a test), a specific function (eg: display a calculator), or a pop-up function (eg: view explanation or additional material).
Assessment Summary Page: On prematurely stopping a test or after completion of the fifteen questions, a percentage mark is provided (Fig. 13). At this point the candidate has several options: print a copy of the results, view a history of previous performances (date, timer, duration, module number, topic name, total questions, percentage mark) (Fig. 14), revise the current test topic further, go to the Test Menu to select another test topic or exit to the last tutorial page.
Figure 14: Past Results Page
Program installation: The package consists of 4 installation disks and one Results Disk. The installation procedure follows a Windows® method of: File, Run, Setup with progress indicated on screen.
Results Disk: The Assessment Program has the facility for recording the details of all tests undertaken onto floppy disk. The information was gathered for course development purposes. The program cannot be operated without the Results Disk inserted in the floppy disk drive. Candidates returned the disk as proof of module completion. When the disk was received, a password was issued for continued use of the program without the need for a disk. The Results Disk stores the following information for each test: candidate name (optional), test date, timer on/off (if the timer was off, the maximum permitted time was recorded for each question), duration of test, module number, topic number and name, number of questions, question number (eg: 2.15 was the fifteenth question for a test on topic 2), whether each question was correct, incorrect or exited before responding, and final percentage score.
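One Results Disk test record can be modelled as follows. The actual on-disk layout of the ToolBook database is not described in the article, so the field names and the record class below are an illustrative reconstruction, not the original format.

```python
from dataclasses import dataclass, field

# Hypothetical model of one Results Disk test record; field names are
# illustrative reconstructions of the items listed in the article.
@dataclass
class TestRecord:
    candidate: str          # optional; may be empty
    date: str               # test date
    timer_on: bool          # if off, max permitted time was logged per question
    duration_s: int         # duration of the test
    module: int             # 1 (calculations) or 2 (biopharmaceutics)
    topic: str              # topic name
    outcomes: list = field(default_factory=list)
    # each entry: 'correct', 'incorrect' or 'exited' (no answer selected)

    def percentage(self):
        """Final percentage score: correct answers over questions attempted."""
        if not self.outcomes:
            return 0.0
        return 100.0 * self.outcomes.count("correct") / len(self.outcomes)

# A hypothetical 15-question test: 12 correct, 2 incorrect, 1 exited.
rec = TestRecord("", "1996-08-12", True, 540, 1, "Dilutions",
                 ["correct"] * 12 + ["incorrect"] * 2 + ["exited"])
```

Aggregating such records across the 58 returned disks would yield the totals reported in the Results section (tests, questions attempted, timer usage, testing period).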
Candidate's Guide: This was a 14 page booklet outlining system requirements, installation procedure, description of the Tutorial and Test Programs, explanation of navigation buttons and general features using screen images.
Program Release and Update: Version 1.1 was released for the September 1996 Examination (19 respondents) and Version 1.2 for the March 1997 Examination (13 respondents). In Version 1.2, typographical errors in Version 1.1 were corrected and the following upgrades were made. Firstly, the Results Disk contained an additional database for recording the time spent within each tutorial. Secondly, a Past Results Page recorded details of a candidate's previous test performances on the hard drive (eg: date, timer (on/off), duration of test (minutes), module number, topic name, total number of questions and percentage score).
Module evaluation: Thirty-two candidates who elected to use the package also completed an eight page mail questionnaire (available from the authors). Areas for evaluation were: installation procedure, self-rating of computer skills, usefulness of the Candidate's Guide; rating of achievement of set objectives; comparison with other study methods; usefulness as a study resource and preparation for the Stage I Examination; evaluation of various program features: navigation, presentation and a range of features within the Assessment Program. Candidate feedback to most questions was via selection from a five point Likert scale[16].
Site of program use: home (23); computer centre (1); library (1) and 'other' (7). Program installation: of 21 candidates who self-installed the program, 18 found the procedure 'easy-very easy'.
Self-rating of computer skills: Most (20) personally rated their computer skills as moderate and eight rated them as 'high-very high'. Only four candidates reported 'technical' problems and these related to improper installation and 'mouse' clicking. The Candidate's Guide provided was found to be 'useful-very useful' by 24 candidates.
Achievement of objectives: A 'high-very high' rating was given by candidates for the extent to which the following objectives were achieved: encouragement of testing of tutorial knowledge (M1: 90.3%; M2: 90.0%); provision of practice using the examination question format (M1: 90.0%; M2: 96.5%); encouragement of review of tutorials based on test performance (M1: 83.9%; M2: 86.7%).
Compared to using other study methods: The modules overall were rated 'better-much better' in the following ways: enthusiasm to read particular topic (M1: 93.8%, M2: 93.8%); understanding of the material (M1: 93.7%, M2: 93.6%); testing one's knowledge (M1: 96.9%, M2: 100.0%) and as an overall study source (M1: 96.9%, M2: 93.7%). All respondents rated the module 'useful-very useful' in improving overall subject knowledge.
As a preparation for the APEC exam: The modules overall were rated 'useful-very useful' as a preparation for the exam (M1: 90.4%, M2: 93.5%). More than 70% of respondents using either module found the level of language expression 'appropriate'. The modules as a whole were viewed as: a 'complementary' study source (M1: 66.7%, M2: 73.3%); 'sole' means of study (M1: 33.3%, M2: 26.7%).
Common program features: In terms of navigation, the tutorial and assessment programs were rated 'easy-very easy' by 93.8% and 93.5% of respondents, respectively. The presentation quality (including layout, design, graphics) of the tutorial and assessment programs were rated 'high-very high' by 90.6% and 93.5% of respondents, respectively.
Assessment Program features: The following features of the assessment program were rated 'useful-very useful': option to time a test (80.0%); continuous scoring (93.5%); stop test at any time (93.5%); display a calculator (45.1%); view the working of a question [M1 only] (96.8%); reasonable number of questions for each topic (100%); random selection of questions for each topic (100%) and for all topics (96.8%); a large data bank of questions (100%); a score and comment at the end of each test (90.0%); view details of previous test performances (90.0%) [Version 1.2].
Further module development: Almost all respondents (93.8%) stated that they would like further modules to be developed and popular suggested topics were: pharmaceutical/medicinal chemistry, pharmacology, drug interactions and therapeutics.
Tests
A total of 2,865 tests of various lengths were undertaken (M1: 1,713; M2: 1,152). There were periods of high frequency use of both Test Programs in the weeks leading up to each exam date (ie: March and September) (Figs. 15 and 16).
Questions
A total of 36,821 questions were attempted (M1: 20,900; M2: 15,921).
Candidates' overall performance in the questions was as follows: M1: (79.7% correct, 17.6% incorrect, 2.7% exited before answer selection); M2: (86.1% correct, 13.2% incorrect, 0.7% exited before answer selection).
Timer option
The timer was used in 36.9% of total tests (M1: 24.3%; M2: 55.7%).
Period of testing
The period of testing with the timer 'on' was 118.4h; with the timer 'off' (maximum allowable question time recorded) it was 598.2h. The total testing period for both modules was therefore 716.6 hours.
Average time attempting each question
With the timer 'on' the average period taken to answer a question was: (M1: 53.2s, M2: 19.1s). This was to be expected since solutions to calculations take longer to work out than answering biopharmaceutics questions that tested a candidate's interpretation of facts. When the timer was 'off', the average period for answering a question was similar for both modules (M1: 92.1s, M2: 96.4s).
Figure 16: Number of tests undertaken versus date (M2)
(arrow indicates exam date)
Figure 17: Performance on individual questions
Analysis of question performance
The overall performance of candidates for each question, within each topic, was analysed. The following was graphed (Fig. 17): the percentage of times the question was answered correctly, incorrectly or exited (without making a selection).
This provided useful information in terms of how each question was handled by candidates. Any question whose overall performance was poor, was checked for ambiguities and errors.
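The per-question analysis described above can be sketched as a simple tally; this is an illustrative Python reconstruction (function name and review threshold are hypothetical, not from the original study).

```python
from collections import Counter

# Sketch of the per-question analysis: tally how often each question was
# answered correctly, incorrectly or exited, convert to percentages, and
# flag questions with poor performance for checking against ambiguities
# and errors. The 50% threshold is a hypothetical choice for illustration.
def question_stats(attempts, flag_below=50.0):
    """attempts: iterable of (question_id, outcome) pairs, with outcome in
    {'correct', 'incorrect', 'exited'}. Returns (percentages, flagged ids)."""
    tallies = {}
    for qid, outcome in attempts:
        tallies.setdefault(qid, Counter())[outcome] += 1
    pct = {qid: {k: 100.0 * c[k] / sum(c.values())
                 for k in ("correct", "incorrect", "exited")}
           for qid, c in tallies.items()}
    flagged = [qid for qid, p in pct.items() if p["correct"] < flag_below]
    return pct, flagged
```

Plotting the three percentages per question gives the kind of bar chart shown in Fig. 17, and the flagged list identifies the questions to review.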
Since candidates resided in all corners of the country, it was important that this distance course was easy to self-install and operate. We succeeded in that regard since most respondents felt comfortable doing this. Educationally, there were high ratings (>80% 'high-very high') for the accomplishment of the set objectives: testing of tutorial content, practice using the exam format and tutorial review based on test performance. When the modules were compared to other study methods, they consistently rated 'better-much better' in terms of enthusiasm to read a particular topic, understanding the material and testing one's knowledge. It was encouraging to know that more than 90% of respondents found each module 'useful-very useful' as a preparation for the exam. The package was seen by about two thirds of respondents as a complementary source of study for the Stage I Exam. This is understandable since the package addressed only about a tenth of the subjects covered in the exam.
The program design was deliberately kept simple and all features were described using on-line help. The navigational and presentational qualities of the modules were given high scores and the program features were also consistently rated highly, except for the calculator option. We believe that candidates found it easier to use their own calculator rather than the 'pop-up' one because it obstructed their view of the question.
The authors acknowledge the financial support of NOOSR through the DEETYA, Canberra, ACT, Australia.
Authors: Arthur Pappas, Lecturer in Pharmacy Practice (address correspondence) Victorian College of Pharmacy Monash University 381 Royal Parade Parkville Vic 3052 Australia Email: arthur.pappas@vcp.monash.edu.au
Branko Cesnik, Director, Centre of Medical Informatics
Julia Hoffman, Registrar, Australian Pharmacy Examining Council Inc.

Please cite as: Pappas, A., Cesnik, B. and Hoffman, J. (1998). Patterns of use of a CAL assessment program used by overseas qualified pharmacists preparing for a registration exam. In C. McBeath and R. Atkinson (Eds), Planning for Progress, Partnership and Profit. Proceedings EdTech'98. Perth: Australian Society for Educational Technology. http://www.aset.org.au/confs/edtech98/pubs/articles/pappas.html