Descriptive Analysis of the Psychometric Properties of Extended Matching Questions Conducted Among Anaesthesia Residents

Document Type: Original Article


Department of Anaesthesiology, Mahatma Gandhi Medical College and Research Institute, Pondicherry, India


Background: Clinical reasoning is one of the core features of clinical competency, and training and assessing it are vital in postgraduate education. Extended matching questions (EMQs) are effective in assessing problem-solving and clinical reasoning abilities but are not commonly used in postgraduate training. The COVID-19 pandemic, which prevented both patient encounters and regular academic activities, warranted the introduction of innovative teaching-learning methods to sustain clinical reasoning skills. Hence, we aimed to introduce EMQs in the formative assessment of anaesthesia residents and to analyse their psychometric properties.
Methods: The study was conducted at the Department of Anaesthesiology, MGMCRI, Pondicherry. Four modules of EMQs were administered to residents (n = 20) as part of a formative assessment, comprising a total of 40 clinical vignettes and 60 options. Post-validation of the EMQs was done by item analysis. Test reliability was estimated with the Kuder-Richardson 20 (KR-20) formula, and the difficulty index (DIF-I), discrimination index (DIS-I), and distractor functionality were analysed.
Results: The KR-20 reliability coefficient was 0.72. The mean DIF-I was 0.43 ± 0.17: 72.5% (29) of items were in the acceptable range, 20% (8) were difficult, and 7.5% (3) were easy. The mean DIS-I was 0.28 ± 0.24: 40% (16) of items showed acceptable, 27.5% (11) excellent, and 20% (8) poor discrimination. Ninety percent of distractors were functional. DIS-I showed a positive correlation with DIF-I (r = 0.2155, P = 0.0185).
Conclusions: The results of the present study indicate that EMQs have acceptable test reliability. The majority of the items (80%) followed the principles of well-constructed MCQs. We conclude that EMQs can be effectively used in postgraduate assessment to test higher-order knowledge and clinical competency.
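The item-analysis metrics reported above can be computed from a simple 0/1 score matrix. The following sketch, in Python, illustrates the standard formulas for the KR-20 reliability coefficient, difficulty index (proportion of examinees answering an item correctly), and discrimination index (difference in proportion correct between the top and bottom 27% of examinees ranked by total score). The score matrix shown is hypothetical, for illustration only; it does not reproduce the study's data.

```python
from statistics import variance

def difficulty_index(scores):
    """DIF-I per item: proportion of examinees answering correctly."""
    n = len(scores)
    k = len(scores[0])
    return [sum(row[i] for row in scores) / n for i in range(k)]

def kr20(scores):
    """Kuder-Richardson 20 reliability for a 0/1 score matrix."""
    k = len(scores[0])
    p = difficulty_index(scores)
    pq = sum(pi * (1 - pi) for pi in p)          # sum of p*q over items
    totals = [sum(row) for row in scores]        # total score per examinee
    return (k / (k - 1)) * (1 - pq / variance(totals))

def discrimination_index(scores, frac=0.27):
    """DIS-I per item: upper-group minus lower-group proportion correct,
    where the groups are the top and bottom `frac` of examinees by total
    score (27% is the conventional cut)."""
    n = len(scores)
    g = max(1, round(frac * n))
    ranked = sorted(scores, key=sum)
    lower, upper = ranked[:g], ranked[-g:]
    return [
        sum(row[i] for row in upper) / g - sum(row[i] for row in lower) / g
        for i in range(len(scores[0]))
    ]

# Hypothetical 0/1 marks: 4 examinees x 3 items (illustrative only)
scores = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0]]
print(difficulty_index(scores))      # [0.75, 0.5, 0.25]
print(round(kr20(scores), 4))        # 0.9375
print(discrimination_index(scores))  # [1.0, 1.0, 1.0]
```

In practice, items with DIF-I between roughly 0.3 and 0.7 and DIS-I above about 0.2 are usually considered acceptable, which is the framing the Results section uses.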

