Evaluation of One-Day Multiple-Choice Question Workshop for Anesthesiology Faculty Members

AUTHORS

Parissa Sezari ORCID # 1 , Ardeshir Tajbakhsh # 1 , Nilofar Massoudi 1 , Ali Arhami Dolatabadi ORCID 2 , Soodeh Tabashi 1 , Shahram Sayyadi ORCID 1 , Maryam Vosoughian ORCID 1 , Ali Dabbagh ORCID 3 , *

1 Anesthesiology Department, School of Medicine, Shahid Beheshti University of Medical Sciences, Tehran, Iran

2 EMS Department, School of Medicine, Shahid Beheshti University of Medical Sciences, Tehran, Iran

3 Anesthesiology Research Center, Shahid Beheshti University of Medical Sciences, Tehran, Iran

# These authors contributed equally as first authors.

How to Cite: Sezari P, Tajbakhsh A, Massoudi N, Arhami Dolatabadi A, Tabashi S, et al. Evaluation of One-Day Multiple-Choice Question Workshop for Anesthesiology Faculty Members. Anesth Pain Med. 2020;10(6):e111607. doi: 10.5812/aapm.111607.

ARTICLE INFORMATION

Anesthesiology and Pain Medicine: 10 (6); e111607
Published Online: December 13, 2020
Article Type: Brief Report
Received: November 25, 2020
Accepted: December 1, 2020

Abstract

Background: Multiple-choice questions (MCQs) are commonly used to evaluate medical and health sciences students. Most novice educators tend to create poor-quality, flawed, low-cognitive-level questions. Therefore, assessors need to be educated to maximize the quality of MCQs and evaluations.

Objectives: The current study aimed to evaluate the effect of a one-day MCQ workshop on anesthesiology faculty members.

Methods: Faculty members were invited to participate in a four-hour, one-day MCQ workshop. At the beginning of the workshop, the participants were asked about their knowledge of MCQ quality indexes and of general MCQ principles (pre-test). After the workshop, the participants answered the same questions as a post-test and were asked about their expectations and the perceived influence of the workshop.

Results: The participants declared that their expectations were fulfilled (9.4 ± 0.6 out of 10) and that the course was applicable (9.7 ± 0.7 out of 10). Before the workshop, only 12.5% of the participants knew MCQ indicators; this rate increased to 41% after the workshop (P < 0.05). They were also questioned about Millman’s checklist for the MCQ examination. Participants’ correct answers increased from 2.75 to 3.05 out of four (P < 0.05).

Conclusions: Although previous participation in MCQ training courses was not associated with greater knowledge and attitude, it could be theorized that short-term repetition would yield better results.

1. Background

Assessment of trainees is one of the main steps in the medical education and training process, both for undergraduate and postgraduate trainees (1-4). Clinical faculty members are the cornerstone of trainee assessment, and multiple-choice questions (MCQs) are one of the most common methods of assessment; however, MCQs are not always designed in a valid and reliable manner (5-9).

Multiple-choice questions are commonly used to evaluate medical and health sciences students. MCQs can assess recall, comprehension, and application of science (10-12). This format is chosen because it allows testing a large number of learners at the same time, covers a wide range of subjects, and offers better accuracy and consistency than subjective formats (10, 11, 13).

Other evaluation methods, such as OSCEs, patient management problems (PMPs), and role-playing, are proposed as alternative or complementary forms of evaluation (14, 15). On the other hand, instructors do not follow the general principles for improving the quality of MCQs (13, 16). It has been shown that a one-week training workshop could increase the quality of questions designed by instructors (17). Most novice educators tend to create poor-quality, flawed, low-cognitive-level questions. Therefore, there is a need for educating instructors to maximize the quality of MCQs and evaluations.

2. Objectives

This study was designed to assess the role of a faculty MCQ education workshop on the knowledge of faculty members involved in the Anesthesiology Departmental Evaluation Committee (ADEC), Department of Anesthesiology and Critical Care (DACC), Shahid Beheshti University of Medical Sciences (SBMU), Tehran, Iran. The primary goal of the current study was to evaluate the effect of a one-day MCQ workshop on anesthesiology faculty members. The participants’ reactions to the workshop, their knowledge, and their capability for evaluating questions were assessed. We hypothesized that a four-hour workshop would improve the attitude of instructors toward creating high-quality questions.

3. Methods

The study was approved by the Research Ethics Committee, Vice-Chancellor of Research Affairs of Shahid Beheshti University of Medical Sciences, Tehran, Iran (Ethics code: IR.SBMU.RETECH.REC.1399.780). Our educational department of anesthesiology conducts monthly exams containing 120 MCQs, created by more than 30 faculty members. In an interventional medical education study, all members of the ADEC, Anesthesiology Department, School of Medicine, Shahid Beheshti University of Medical Sciences, Tehran, Iran, who had designed these exams were invited to a one-day workshop.

A single lecturer guided the one-day workshop. Of the 64 members of the ADEC, 17 participated in the workshop. Two questionnaires were handed out to the participants: one before the workshop and one just after the discussions ended. These questionnaires were designed to assess the participants’ knowledge of MCQ tests.

The first questionnaire addressed their knowledge of MCQ quality indexes and of general MCQ principles (pre-test). In this questionnaire, participants were asked to name three indicators for grading MCQs, and four true/false questions probed their knowledge of common flaws. Each indicator and each question was given one point if answered correctly.

To achieve the workshop objectives, the instructor, who was also a faculty member of the Emergency Department of our university with experience in this field, applied different teaching methods. The first session consisted of a short, interactive lecture, followed by a discussion of examples of common flaws. After the break, participants were asked to correct the flaws to improve the quality of the questions. Finally, indicators for grading MCQs, such as the difficulty index (P), discrimination index (DI), non-functional distractors (NFDs), item-writing flaws (IWFs), and Bloom’s taxonomy level, were described and discussed.
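The two numeric indicators named above can be illustrated with a short sketch. This is not from the study itself: the data are hypothetical, and the 27% upper/lower split used for the discrimination index is a common convention rather than something stated in the text. The difficulty index P is simply the proportion of examinees answering an item correctly, while DI contrasts the item performance of high and low scorers on the whole exam.

```python
# Illustrative sketch (hypothetical data): the difficulty index (P)
# and discrimination index (DI) for a single MCQ item.
# responses: 1 = correct, 0 = incorrect, one entry per examinee.

def difficulty_index(responses):
    """P = proportion of examinees who answered the item correctly."""
    return sum(responses) / len(responses)

def discrimination_index(item_responses, total_scores, fraction=0.27):
    """DI = P(upper group) - P(lower group), using the conventional
    27% upper/lower split ranked by total exam score."""
    paired = sorted(zip(total_scores, item_responses), key=lambda t: t[0])
    k = max(1, round(fraction * len(paired)))
    lower = [r for _, r in paired[:k]]    # weakest k examinees
    upper = [r for _, r in paired[-k:]]   # strongest k examinees
    return sum(upper) / k - sum(lower) / k

item = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]      # answers to one item
totals = [9, 8, 3, 7, 2, 6, 8, 4, 7, 9]    # total exam scores
print(difficulty_index(item))              # 0.7 -> moderately easy item
print(discrimination_index(item, totals))  # 1.0 -> discriminates well
```

An item with P near 0.5 and a clearly positive DI is usually considered well-functioning; a DI near zero or negative flags an item for review.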

After the workshop, the participants answered the same questions as their post-test and were asked about their expectations and the influence of the workshop.

4. Results

Of the total of sixty-four faculty members, 17 attended the MCQ workshop. Six of them had not taken an MCQ course before this workshop. Members with more than a 20-year history of membership had participated in MCQ training courses at least once, whereas members with less than a 5-year history had not taken any course on designing MCQs (Table 1). The others had attended training courses once or twice.

Table 1. Characteristics of the Participants
Variables                              Values
Gender
  Male                                 10
  Female                               7
Educational rank
  Full professor                       1
  Associate professor                  10
  Assistant professor                  6
Years of experience in teaching
  > 15                                 2
  5 - 10                               11
  < 5                                  4
MCQ training courses attended
  0                                    5
  1                                    7
  > 1                                  5

The participants were asked about their motivation for attending this workshop; their responses are listed in Table 2. At the end of the workshop, the participants declared that their expectations were fulfilled (9.4 ± 0.6 out of 10) and that the course was practical (9.7 ± 0.7 out of 10).

Table 2. Participants’ Expectations From the MCQ Workshop a
Expectations From Workshop                              Values
How to design an MCQ?                                   5 (29.4)
What are the standards for the MCQ examination?         3 (17.6)
What are the principles of designing MCQs?              2 (11.7)
What does taxonomy mean?                                1 (5.8)

a Values are expressed as No. (%).

The participants were asked to write indicators for question assessment before and after the workshop. Before the workshop, only 12.5% of the participants knew these indicators. This number increased to 41% after the workshop (P < 0.05). Moreover, they were questioned about Millman’s criteria for the MCQ examination. Participants’ correct answers increased from 2.75 to 3.05 out of four (P < 0.05) (Table 3).

Table 3. Pre-Test and Post-Test Results
                                   Pre-Test     Post-Test    Significance
Knowledge about indicators, %      12.5         41.1         P < 0.05
Millman’s checklist (of 4)         2.75         3.05         P < 0.05

A linear regression analysis was performed to check whether a history of attending MCQ training workshops or years of experience correlated with the improvement from pre-test to post-test. The analysis revealed an R-squared of 30%, and the significance of the F test was above 5% (P > 0.05); thus, neither variable significantly predicted improvement.
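The kind of analysis reported above can be sketched as follows. This is a minimal illustration with entirely hypothetical data (the study's raw scores are not published here); it fits an ordinary least-squares line of score improvement on the number of prior MCQ courses and reports R-squared, the quantity the text cites.

```python
# Illustrative sketch (hypothetical data): simple linear regression of
# pre-/post-test improvement on the number of prior MCQ courses taken,
# with R-squared computed from the residual and total sums of squares.

def linear_fit(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot   # R^2 = explained fraction of variance

prior_courses = [0, 0, 1, 1, 1, 2, 2, 0, 1, 2]   # hypothetical predictor
improvement   = [1, 0, 1, 2, 0, 1, 2, 1, 0, 1]   # hypothetical outcome
a, b, r2 = linear_fit(prior_courses, improvement)
print(round(r2, 3))
```

A low R-squared combined with a non-significant F test, as reported in the study, means the fitted line explains little of the variation in improvement, which is consistent with the conclusion that prior course attendance did not predict gains.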

5. Discussion

In this study, the impact of a short MCQ workshop was assessed by pre-test and post-test examinations. The participants’ expectations were fulfilled, in concordance with previous studies of training programs and faculty development workshops. Although this satisfaction alone is not enough to change examinees’ behavior, it is a fundamental first step (13, 18, 19).

The participants’ knowledge and skills were also assessed and showed improvements in the short-term evaluation, which is promising for regarding short MCQ training as effective. Although previous participation in MCQ training courses was not associated with greater knowledge and attitude, it could be theorized that short-term repetition would yield better results (20). In other words, our results suggest that the one-day workshop is effective given the time limits of faculty members, a finding in concordance with similar experiences (21, 22).

The assessment of trainees is considered a main step in their development, which can be summative and/or formative; this process yielded improved results in previous experiences (4, 23, 24). Meanwhile, improving faculty members’ “assessment and feedback” capacities has been considered an essential task and an important method for improving the educational quality of the residency program (25, 26). If we are to decrease the distance between the desired curriculum and the experienced curriculum, we have to develop faculty members through methods that are both feasible and effective (27).

5.1. Conclusions

The one-day short MCQ workshop improved the faculty members’ capacities while remaining feasible for them. Although previous participation in MCQ training courses was not associated with greater knowledge and attitude, it could be theorized that short-term repetition would yield better results.

References

  • 1.

    Yang SC, Tsou MY, Chen ET, Chan KH, Chang KY. Statistical item analysis of the examination in anesthesiology for medical students using the Rasch model. J Chin Med Assoc. 2011;74(3):125-9. doi: 10.1016/j.jcma.2011.01.027. [PubMed: 21421207].

  • 2.

    Tetzlaff JE. Assessment of competence in anesthesiology. Curr Opin Anaesthesiol. 2009;22(6):809-13. doi: 10.1097/ACO.0b013e3283326958. [PubMed: 19773650].

  • 3.

    Newble DI, Jaeger K. The effect of assessments and examinations on the learning of medical students. Med Educ. 1983;17(3):165-71. doi: 10.1111/j.1365-2923.1983.tb00657.x. [PubMed: 6865814].

  • 4.

    Dabbagh A, Massoudi N, Vosoghian M, Mottaghi K, Mirkheshti A, Tajbakhsh A, et al. Improving the Training Process of Anesthesiology Residents Through the Mentorship-Based Approach. Anesth Pain Med. 2019;9(1). e88657. doi: 10.5812/aapm.88657. [PubMed: 30881915]. [PubMed Central: PMC6412912].

  • 5.

    Capan Melser M, Steiner-Hofbauer V, Lilaj B, Agis H, Knaus A, Holzinger A. Knowledge, application and how about competence? Qualitative assessment of multiple-choice questions for dental students. Med Educ Online. 2020;25(1):1714199. doi: 10.1080/10872981.2020.1714199. [PubMed: 31931687]. [PubMed Central: PMC7006711].

  • 6.

    McCoubrie P. Improving the fairness of multiple-choice questions: a literature review. Med Teach. 2004;26(8):709-12. doi: 10.1080/01421590400013495. [PubMed: 15763874].

  • 7.

    Lee AJ, Goodman SR, Banks SE, Lin M, Landau R. Development of a Multiple-Choice Test for Novice Anesthesia Residents to Evaluate Knowledge Related to Management of General Anesthesia for Urgent Cesarean Delivery. J Educ Perioper Med. 2018;20(2). E621. [PubMed: 30057932]. [PubMed Central: PMC6055537].

  • 8.

    Terry R, Hing W, Orr R, Milne N. Do coursework summative assessments predict clinical performance? A systematic review. BMC Med Educ. 2017;17(1):40. doi: 10.1186/s12909-017-0878-3. [PubMed: 28209159]. [PubMed Central: PMC5314623].

  • 9.

    Hift RJ. Should essays and other "open-ended"-type questions retain a place in written summative assessment in clinical medicine? BMC Med Educ. 2014;14:249. doi: 10.1186/s12909-014-0249-2. [PubMed: 25431359]. [PubMed Central: PMC4275935].

  • 10.

    Dellinges MA, Curtis DA. Will a Short Training Session Improve Multiple-Choice Item-Writing Quality by Dental School Faculty? A Pilot Study. J Dent Educ. 2017;81(8):948-55. doi: 10.21815/JDE.017.047. [PubMed: 28765439].

  • 11.

    Puthiaparampil T, Rahman MM. Very short answer questions: a viable alternative to multiple choice questions. BMC Med Educ. 2020;20(1):141. doi: 10.1186/s12909-020-02057-w. [PubMed: 32375739]. [PubMed Central: PMC7203787].

  • 12.

    Dabbagh A, Elyassi H, Sabouri AS, Vahidshahi K, Ziaee SAM, Anesthesiology D. The Role of Integrative Educational Intervention Package (Monthly ITE, Mentoring, Mocked OSCE) in Improving Successfulness for Anesthesiology Residents in the National Board Exam. Anesth Pain Med. 2020;10(2). e98566. doi: 10.5812/aapm.98566. [PubMed: 32547933]. [PubMed Central: PMC7260396].

  • 13.

    AlFaris E, Naeem N, Irfan F, Qureshi R, Saad H, Al Sadhan R, et al. A One-Day Dental Faculty Workshop in Writing Multiple-Choice Questions: An Impact Evaluation. J Dent Educ. 2015;79(11):1305-13. [PubMed: 26522635].

  • 14.

    Dabbagh A, Abtahi D, Aghamohammadi H, Ahmadizadeh SN, Ardehali SH. Relationship Between “Simulated Patient Scenarios and Role-Playing” Method and OSCE Performance in Senior Anesthesiology Residents: A Correlation Assessment Study. Anesth Pain Med. 2020;10(5). doi: 10.5812/aapm.106640.

  • 15.

    Heinke W, Rotzoll D, Hempel G, Zupanic M, Stumpp P, Kaisers UX, et al. Students benefit from developing their own emergency medicine OSCE stations: a comparative study using the matched-pair method. BMC Med Educ. 2013;13:138. doi: 10.1186/1472-6920-13-138. [PubMed: 24098996]. [PubMed Central: PMC3852440].

  • 16.

    Rajaei S, Dabbagh A. Interdisciplinary approach and anesthesiology: is there any role? J Cell Mol Anesth. 2016;1(3):129-33.

  • 17.

    Naeem N, van der Vleuten C, Alfaris EA. Faculty development on item writing substantially improves item quality. Adv Health Sci Educ Theory Pract. 2012;17(3):369-76. doi: 10.1007/s10459-011-9315-2. [PubMed: 21837548].

  • 18.

    Sezari P, Dabbagh A. Personalized medicine: the paradigm shift in medicine mandating lifelong learning. J Cell Mol Anesth. 2019;4(2):31-2.

  • 19.

    Fadaizadeh L, Vosoughian M, Shajareh E, Dabbagh A, Heydari G. Is tele-education a proper substitute for regular method to train anesthesiology residents? J Cell Mol Anesth. 2019;4(1):15-9.

  • 20.

    Azimaraghi O, Movafegh A. Cellular and Molecular Medicine; the Lost World in Postgraduate Medical Education. J Cell Mol Anesth. 2019;4(1):1-2.

  • 21.

    Khan AM, Gupta P, Singh N, Dhaliwal U, Singh S. Evaluation of a faculty development workshop aimed at development and implementation of a competency-based curriculum for medical undergraduates. J Family Med Prim Care. 2020;9(5):2226-31. doi: 10.4103/jfmpc.jfmpc_17_20. [PubMed: 32754478]. [PubMed Central: PMC7380743].

  • 22.

    Allen D, Abourbih J, Maar M, Boesch L, Goertzen J, Cervin C. Does a one-day workshop improve clinical faculty's comfort and behaviour in practising and teaching evidence-based medicine? A Canadian mixed methods study. BMJ Open. 2017;7(7). e015174. doi: 10.1136/bmjopen-2016-015174. [PubMed: 28710209]. [PubMed Central: PMC5734352].

  • 23.

Dabbagh A. Is the future of perioperative medicine created by “Cellular and Molecular Medicine”? J Cell Mol Anesth. 2017;2(1):1-2.

  • 24.

    Dabbagh A, Sezari P, Tabashi S, Tajbakhsh A, Massoudi N, Vosoghian M, et al. Attitudes of Anesthesiology Residents Toward a Small Group Blended Learning Class. Anesth Pain Med. 2020;10(3). e103148. doi: 10.5812/aapm.103148. [PubMed: 32944563]. [PubMed Central: PMC7472787].

  • 25.

    Rosenblatt MA, Schartel SA. Evaluation, feedback, and remediation in anesthesiology residency training: a survey of 124 United States programs. J Clin Anesth. 1999;11(6):519-27. doi: 10.1016/s0952-8180(99)00084-7. [PubMed: 10526833].

  • 26.

    Riveros-Perez E, Arthur ME, Jain A, Kumar V, Rocuts A. Multifaceted remediation program: experience of a residency program to rescue residents who failed the American Board of Anesthesiology basic examination. Adv Med Educ Pract. 2018;9:865-71. doi: 10.2147/AMEP.S180627. [PubMed: 30538598]. [PubMed Central: PMC6263215].

  • 27.

    Bould MD, Naik VN, Hamstra SJ. Review article: new directions in medical education related to anesthesiology and perioperative medicine. Can J Anaesth. 2012;59(2):136-50. doi: 10.1007/s12630-011-9633-0. [PubMed: 22161241].

  • Copyright © 2020, Author(s). This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/) which permits copy and redistribute the material just in noncommercial usages, provided the original work is properly cited.