Knowledge of dental faculty in Gulf Cooperation Council states of multiple-choice questions’ item writing flaws

Bibliographic Details
Main Authors: Mawlood Kowash, Hazza Alhobeira, Iyad Hussein, Manal Al Halabi, Saif Khan
Format: Article
Language: English
Published: Taylor & Francis Group 2020-01-01
Series: Medical Education Online
Online Access: http://dx.doi.org/10.1080/10872981.2020.1812224
Description
Summary: Multiple-choice questions provide an objective, cost- and time-effective form of assessment. Deviation from appropriate item-writing guidelines will most probably result in commonly ignored multiple-choice question writing flaws, impairing the ability of the assessment to measure students’ cognitive levels and thereby seriously affecting students’ academic performance outcomes. This study aimed to gauge knowledge of multiple-choice question item-writing flaws among dental faculty working at colleges in Gulf Cooperation Council (GCC) countries. A short cross-sectional online questionnaire (SurveyMonkey™), itself composed of multiple-choice questions, was disseminated to dental faculty working in GCC countries during the academic year 2018/2019. The questionnaire included five flawed test multiple-choice questions and one correctly written control question. Participants were asked to identify the flawed items with reference to the 14 known item-writing flaws. Of a total of 460 faculty, 216 respondents completed the questionnaire: 132 (61.1%) were from Saudi Arabia, while 59 (27.3%), 14 (6.5%) and 11 (5.1%) were from the United Arab Emirates, Kuwait and Oman, respectively. The majority of participants were male (n = 141, 65.9%), compared with 73 females (34.1%). Eighty percent of the participants had more than five years of teaching experience, and assistant professors made up the largest group of academic positions represented in this study (43.3%). The overall fail rate ranged from 76.3% to 98.1%, and almost two-thirds of the participants were unable to identify one or more of the flawed items. No significant association was observed between demographics (age, region, academic position and specialty) and knowledge, except for participant gender (p < 0.009). GCC dental faculty demonstrated below-average knowledge of multiple-choice question item-writing flaws. Training and workshops are needed to ensure substantial exposure to proper multiple-choice question construction standards.
ISSN: 1087-2981