A Comparative Study Evaluating the Knowledge and Clinical Practices of Midwifery Students in the Courses of Physiopathology, Infectious and Internal Diseases, and Surgery Based on the Objective Structured Clinical Examination and Traditional Assessment Methods

Document Type: Original Article

Authors

1 Department of Midwifery, Karaj Branch, Islamic Azad University, Karaj, Iran

2 Department of Nursing, Albourz University of Medical Sciences, Karaj, Iran

3 Rheumatic Diseases Research Center, Department of Rheumatology, Mashhad University of Medical Sciences, Mashhad, Iran

4 Medical Education Research Center, Department of Statistics and Epidemiology, Tabriz University of Medical Sciences, Tabriz, Iran

5 Research Management, Mashhad University of Medical Sciences, Mashhad, Iran

Abstract

Background: The objective structured clinical examination (OSCE) is one of the best methods for evaluating medical students' achievement of educational objectives in the cognitive, emotional, and psychomotor domains. According to the available evidence, no study has evaluated the OSCE in midwifery students. The main purpose of this study was to evaluate the knowledge and clinical practices of midwifery students in the courses of physiopathology, infectious and internal diseases, and surgery using the OSCE method, compared with the traditional assessment method and with the final clinical traineeship score.
Methods: This study was performed on 39 midwifery students at the end of the third year of their education at the Islamic Azad University of Karaj. The OSCE was conducted at 10 different stations based on five practical objectives. A week later, a traditional evaluation was performed using multiple-choice and essay questions. The final clinical evaluation was conducted in the following semester, after the students had completed their traineeship course. The scores from the three evaluation methods were compared, and the correlations between the scores were reported.
Results: The mean (standard deviation) scores of the OSCE, traditional, and clinical exams were 85.29 (8.90), 62.76 (10.80), and 81.50 (3.22), respectively. Based on the results of the Pearson correlation test, there was no significant correlation between the scores of the three evaluation methods in any pairwise comparison. There was a poor correlation between the scores obtained from the traditional and clinical (r=0.30, P Value=0.068), traditional and OSCE (r=0.19, P Value=0.123), and clinical and OSCE (r=0.06, P Value=0.728) evaluations. Mixed model analysis showed a significant difference in the average scores when the three evaluation methods of OSCE, traditional, and clinical were compared (P<0.001, F (2, 83) =91.8).
Conclusions: This study showed that OSCE is a good way to evaluate midwifery students' clinical skills.

Keywords


INTRODUCTION

One of the goals of education is for learners to develop through the necessary training and to be able to fulfill the professional duties that the community has entrusted to them. Because medical students often deal with patients and clients directly, it is essential that they apply their knowledge at the patient's bedside (1). Therefore, the assessment of clinical skills has always been one of the challenges facing medical authorities. Since traditional tests focused mainly on students' memorized knowledge, the evaluation of skills such as problem solving, critical thinking, and communication with the patient or client was neglected (2). Integrating theoretical and clinical courses in the same setting has been a real concern for those involved in education; therefore, medical universities have always been seeking new and efficient assessment methods (3, 4).

The objective structured clinical examination (OSCE) was designed and implemented for the first time by Professor Harden et al. in the 1970s as an instrument for teaching and evaluating medical practice. Until then, the professors of medical faculties used traditional methods, and according to Professor Harden, the role that chance could play in the results was considered the most common shortcoming of traditional tests (5). In this method, participants are examined at multiple stations within a specified time, and each station is designed to assess a particular aspect of their knowledge. A unique feature of the OSCE is the use of simulated patients, and in some cases real patients, which provides better access to the test objectives (6). In this method, knowledge and performance can be measured simultaneously, and a broad range of skills that are never tested in traditional clinical examinations can be assessed (7). Another advantage of this method over the traditional one is that the test conditions are fair, since all students face the same cases, the same stations, and the same evaluators at each station (8). The OSCE not only helps students recognize their weaknesses but also enables teachers to analyze them. In a study of nursing students in the United States, the students believed that the OSCE was good and should replace other methods of clinical evaluation (9). The OSCE has many benefits both during and after study; according to the study of Austin et al. (2006), students stated, three years after graduation, that the OSCE had provided them with a foundation for better understanding their roles and the psychological needs of patients (10). The OSCE can be combined with other assessments to enhance reliability, and combining clinical skills training with training on real patients improves OSCE performance (11, 12).

In the UK, clinical education is emphasized in nursing, midwifery, and other health professions, since most of the problems that arise in these groups stem from their clinical training. According to Watson (2002), the use of different methods to measure different aspects of students' knowledge and performance in these courses is necessary (13). Although OSCE assessment seems free from most of these faults, its use in training programs has not become widespread. Clinical evaluation has remained one of the unresolved problems in Iran, and the most important factor affecting the learning process is the lack of standardization of the clinical setting, goals, and schedules, which need to be specific, attainable, measurable, appropriate, and well timed. A great deal of research on the OSCE has been conducted inside and outside our country; however, according to the available evidence, no study has evaluated the OSCE in midwifery students. This need for research led to the present study.

The aim of this study was to evaluate the knowledge and clinical practices of midwifery students in the courses of physiopathology, infectious and internal diseases, and surgery, using the OSCE method compared with traditional assessment methods and with the final clinical traineeship scores, at the Islamic Azad University, Karaj Branch, in 2011-2012.

METHODS

This descriptive-analytic study was performed on 39 midwifery students at the end of the third year of their education at the Islamic Azad University, Karaj Branch, Iran, in 2011-2012. To determine the sample size, the primary information, including the means and standard deviations of the variables, was obtained from other studies (14). Considering a 95% confidence level and 80% test power, and using the formula n = [Z(1-α/2) + Z(1-β)]² × (σ1² + σ2²) / (μ1 - μ2)², the largest sample size obtained for the above data was at least 32 students (a worked sketch of this calculation is given after this paragraph). Allowing for sample loss, a total of 39 students were included using a convenience sampling method. The inclusion criteria were third-year midwifery students who had passed courses 1 and 2 of physiopathology, infectious and internal diseases, and surgery, were taking course 3 of these subjects, and had passed all prerequisites for these courses. Students who did not want to participate in the study or did not agree to sign the informed consent were excluded. The list of eligible midwifery students was prepared after obtaining the consent of the faculty authorities, ethical approval from the Ethics Committee, and preparing the practice rooms. The standardized OSCE was developed over three months in collaboration with experts in internal medicine, gynecology, and midwifery from Mashhad, Albourz, and Tehran Universities of Medical Sciences and the Islamic Azad University, Karaj Branch. After five practical goals were extracted and agreed upon, a number of faculty members with mastery of the curriculum were appointed as coordinators of the committee. The practical goals agreed on for the OSCE assessment were: the ability to take a history (stations 1 and 2), physical examination skills (stations 3 and 4), the ability to diagnose disease (stations 5, 6, and 9), interpretation of laboratory data (stations 7 and 8), and communication skills (station 10). These objectives were designed in the form of 10 stations. Simulated patients were used at stations 1, 2, and 9; moulage was used at stations 3 and 4; and slides containing laboratory data were used at stations 5, 6, 7, and 8. Working groups were formed in a meeting after the structure and content of the stations were determined, and the task of writing the scenarios was entrusted to a group of experts. Over the following weeks, guidelines were given to the standardized simulated patients, and the scenarios, checklists, and all the questions raised were reviewed by all the station designers. Four standardized patients were selected from among the students of the School of Nursing and Midwifery, Islamic Azad University, Karaj Branch, and were trained to play their roles. The stations' equipment, the students' guidelines, and the number of stations were determined; the guides were assigned among the stations, and each student's starting point was checked. At each station, one expert evaluated student performance independently on the basis of predefined checklists. Five minutes and 30 seconds were allotted for completing each station and moving to the next one; thus, each student completed the examination in approximately 60 minutes. Students who had not yet taken the exam were kept in a separate room so that they would not have the opportunity to meet the other groups.
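To make the two-mean sample size formula above concrete, the following is a minimal sketch of the calculation; the means and standard deviations used here are hypothetical placeholders, not the actual values drawn from reference (14).

```python
import math
from scipy.stats import norm

def two_mean_sample_size(mu1, mu2, sd1, sd2, alpha=0.05, power=0.80):
    """n per group = [Z(1-alpha/2) + Z(1-beta)]^2 * (sd1^2 + sd2^2) / (mu1 - mu2)^2."""
    z_alpha = norm.ppf(1 - alpha / 2)  # 1.96 for a 95% confidence level
    z_beta = norm.ppf(power)           # about 0.84 for 80% power
    n = (z_alpha + z_beta) ** 2 * (sd1 ** 2 + sd2 ** 2) / (mu1 - mu2) ** 2
    return math.ceil(n)

# Hypothetical means and standard deviations (NOT the values from reference 14):
print(two_mean_sample_size(mu1=75, mu2=68, sd1=10, sd2=10))  # prints 33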
After the exam, each student's performance record (the practical scores at each station and field, a report of the student's strengths and weaknesses, and the mean practical score across all stations) was provided to the students. Data collection for the OSCE was performed by observing skills on the basis of pre-prepared questionnaires and checklists. The validity of the questionnaires and checklists was evaluated through content validity, and their reliability was evaluated through the test-retest method, with correlation coefficients greater than 0.7. Scores for each domain and for the total of all stations were expressed on a 0-100 scale to make statistical comparison feasible.

One week after the OSCE, the students were evaluated by the traditional method. The traditional evaluation consisted of short-answer essay and multiple-choice questions; it took one hour, and a minimum score of 10 out of 20 was considered acceptable. The questions had been designed two months before the exam, and the educational goals were set by the faculty group. The clinical evaluation of the students was conducted in the next semester, at the end of their traineeship course, on the basis of clinical evaluation forms consistent with the previous evaluations. It should be noted that the traditional and traineeship scores were normalized to a 0-100 scale for statistical comparison with the OSCE scores; a brief sketch of this rescaling is given below.
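As a rough illustration of the rescaling mentioned above, the sketch below linearly converts raw marks on a 0-20 scale to the 0-100 scale used for comparison; the raw marks shown are hypothetical.

```python
def to_percent(raw_score, max_score=20.0):
    """Linearly rescale a raw mark from a 0..max_score scale to the 0-100 scale."""
    return raw_score / max_score * 100.0

# Hypothetical raw marks from a 20-point traditional exam:
print([to_percent(s) for s in (10.0, 12.5, 17.0)])  # [50.0, 62.5, 85.0]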

SPSS 17 software was used for data analysis. Quantitative and qualitative variables were reported as mean (standard deviation, SD) and frequency (percent), respectively. Data normality was assessed with the Kolmogorov-Smirnov (K-S) test. Mixed model analysis with the Sidak post hoc test was used for the repeated scores of the same students to compare the three evaluation methods of OSCE, traditional, and clinical. The Pearson correlation test was used to examine the correlations between the scores. P values less than 0.05 were considered statistically significant.
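The analyses above were run in SPSS 17; purely as an illustration, the sketch below reproduces the same kinds of steps (K-S normality check, pairwise Pearson correlations, and a mixed model over the three repeated assessments) in Python on synthetic placeholder data, not the study data.

```python
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 39
# Synthetic placeholder scores for illustration only
scores = pd.DataFrame({
    "student": np.arange(n),
    "osce": rng.normal(85, 9, n),
    "traditional": rng.normal(63, 10, n),
    "clinical": rng.normal(81, 3, n),
})

# Kolmogorov-Smirnov test of each standardized score distribution against a normal
for col in ("osce", "traditional", "clinical"):
    z = (scores[col] - scores[col].mean()) / scores[col].std()
    print(col, stats.kstest(z, "norm"))

# Pairwise Pearson correlations between the three evaluation methods
for a, b in (("traditional", "clinical"), ("traditional", "osce"), ("clinical", "osce")):
    r, p = stats.pearsonr(scores[a], scores[b])
    print(f"{a} vs {b}: r={r:.2f}, P={p:.3f}")

# Mixed model with a random intercept per student to compare the three methods
long = scores.melt(id_vars="student", var_name="method", value_name="score")
result = smf.mixedlm("score ~ method", data=long, groups=long["student"]).fit()
print(result.summary())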

RESULTS

All participants (100%) took part in all three evaluation methods. The mean (SD) age of the participants was 22.62 (1.90) years, their overall average score was 15.33 (1.14), and the average of their prerequisite courses was 76.07 (5.77). The mean (SD) OSCE total score (across all 10 stations) was 85.29 (8.90). Descriptive indicators of the overall score and the individual domains in the courses of physiopathology, infectious and internal diseases, and surgery with the OSCE method are shown in Table 1. The mean (SD) score of this course was 62.76 (10.08) with the traditional evaluation method. The mean (SD) score of the traineeship courses in physiopathology, infectious and internal diseases, and surgery was 81.05 (3.22) when the common clinical evaluation method was used. Mixed model analysis showed a significant difference in the average scores of these courses when the three evaluation methods of OSCE, traditional, and clinical were compared (P<0.001, F (2, 83) =91.8). The results of the paired comparisons of scores in the courses of physiopathology, infectious and internal diseases, and surgery across the three evaluation methods are summarized in Table 2. Also, based on the results of the Sidak post hoc test, there was a significant difference between the scores when each pair of evaluation methods was compared (P<0.05; Table 2).

Based on the results of the Pearson correlation test, there was no significant correlation between the scores of the three evaluation methods in any pairwise comparison. Given the correlation values (all less than 0.3), there was only a poor correlation between the scores obtained from the traditional and clinical, traditional and OSCE, and clinical and OSCE evaluations (Table 3). According to the Pearson correlation test results, there was a significant relationship between the traditional score and age, average score, and prerequisite course scores (P<0.05; Table 4).

Table 1: Descriptive indicators of the overall and individual scores of scopes in the courses of physiopathology, infectious and internal diseases and surgery based on OSCE method

OSCE's Scope of Objectives | N | Minimum | Maximum | Mean (SD)
Scope 1 (History taking) | 39 | 41.67 | 91.67 | 68.80 (13.10)
Scope 2 (Physical Examination) | 39 | 58.82 | 94.12 | 80.09 (8.60)
Scope 3 (Diagnosing of disease) | 39 | 0.00 | 70.83 | 35.26 (18.73)
Scope 4 (Interpretation of lab data) | 39 | 0.00 | 84.21 | 49.39 (21.45)
Scope 5 (Communication skills) | 39 | 0.00 | 90.00 | 69.23 (20.82)
Total stations | 39 | 40.43 | 74.47 | 85.29 (8.90)

Table 2: Results of paired comparisons of scores in the courses of physiopathology, infectious and internal diseases and surgery based on three evaluation methods of OSCE, traditional and clinical

Test name (I) | Test name (J) | Difference of means (I-J) | Standard error | Degrees of freedom | P Value
OSCE | Traditional | 4.464 | 1.661 | 74 | 0.026
OSCE | Common clinical | -22.779 | 1.799 | 112 | 0.001
Traditional | Common clinical | 4.464 | 1.661 | 74 | 0.026

Table 3: The pairwise relationships between the scores of the three evaluation methods (n=39); Pearson correlations (P Values) are reported.

Evaluation method | Traditional evaluation | Clinical evaluation
Clinical evaluation | 0.2947 (P Value=0.068) |
OSCE | 0.1848 (P Value=0.123) | 0.0576 (P Value=0.728)

Table 4: The relationships between demographic and academic variables and the scores from the three evaluation methods (n=39); Pearson correlations (P Values) are reported.

Evaluation methods/variables | Age | Average score | Prerequisite courses
OSCE | -0.232 (P Value=0.156) | 0.191 (P Value=0.245) | 0.128 (P Value=0.513)
Traditional | 0.37 (P Value=0.021) | 0.50 (P Value=0.001) | 0.516 (P Value<0.001)
Common clinical | 0.19 (P Value=0.380) | 0.14 (P Value=0.525) | 0.245 (P Value=0.119)

DISCUSSION

The results of this study showed that, in course 3 of physiopathology, infectious and internal diseases, and surgery, the highest scores were obtained with the OSCE method and the lowest with the traditional method. The mean score of the traineeship course, obtained with the clinical evaluation method, ranked second. The mixed model analysis comparing the mean scores of this course across the three evaluation methods (OSCE, traditional, and clinical) showed a significant difference between the scores of the three methods.

In the study by Dokoohaki et al., entitled "Evaluation of the knowledge and practice of the third year student nurses about drugs by OSCE method", the mean practical score based on the OSCE method was higher than the theoretical score (62.80±7.71 and 49.02±9.24, respectively) (15).

In a study by Mozaffari et al. on senior nursing students in the cardiac intensive care unit using the OSCE, the highest scores were in the domains of history taking and physical examination skills, and the lowest scores were in arrhythmia identification and control skills, which is somewhat similar to our results (16).

It is believed that educators should pay more attention to perfecting students' knowledge, which can be achieved by making greater use of clinical nursing practice environments (17). The results of a study at Golestan University of Medical Sciences showed that, overall, 56.2% of the students rated themselves as very good in communication skills (18). Although students rate themselves as good in communication skills, nursing educators should gradually increase their knowledge of active learning methods and use these methods more often to increase the quantity and quality of student learning (19).

Numerous studies indicate that the OSCE method can better reveal the clinical skills of doctors and paramedics (20-23). In the study by Saboori and colleagues, conducted on 87 specialized dental students at the Faculty of Dentistry, Shahid Beheshti University of Medical Sciences, the students' attitude toward the OSCE was generally positive, and they considered it an effective way to promote their practical knowledge (24). The results of another study by Imani and colleagues (2005) on fifth-year medical students in the pediatric department in Zahedan, as well as the study by Awaisu (2010) on pharmacy students in Malaysia, were similar to those of Saboori and colleagues (25, 26). However, the results of the study by Faryabi (2010), which examined the attitudes of dental students at Kerman University of Medical Sciences, were not consistent with these findings; the majority of students preferred the written method, and only 34.8% considered the OSCE a beneficial method (27). Bradley et al. (1999) conducted a study on 195 second-year medical students and measured their clinical and communication skills through 30 designed OSCE stations. They stated that the OSCE was a powerful learning tool and that critical analysis was an important approach that could equip students with clinical skills (28). Carraccio et al. (2002) reviewed the literature on the OSCE in pediatrics and noted that if an OSCE is well designed, acceptable reliability and validity can be achieved.

They concluded that the combination of the OSCE, standardized board examinations, and direct observation at the bedside would constitute a gold standard for assessing physicians' abilities (29). In the study by Mozaffari et al. on two groups of nursing students who had both passed the theoretical nursing care course, while only one had passed the traineeship course, the two groups were almost identical in clinical skills, with no significant difference between them in the OSCE assessment (except for one of the eight skills) (16). In the present study, the lowest scores were obtained with the traditional evaluation method and the highest average scores with the OSCE method, followed, with a significant difference, by the students' traineeship scores in the next semester. According to Mozaffari's study, passing the traineeship course at the hospital did not enhance the students' competence or make them learn more than theoretical training did. In the present study, the students were evaluated at the end of one semester with the OSCE and traditional methods and then in the following semester (12 weeks later) by the clinical evaluation method based on their traineeship course. Perhaps one reason for the differences between the present study and that of Mozaffari et al. lies in the study design, in that we followed the students into the next semester. The OSCE, as a reliable and valid test for assessing clinical skills, together with the immediate feedback it provides before the traineeship course and clinical setting, may be one of the reasons for the improvement in the students' scores in the next semester (30). In the present study, records of student performance (theory score, practical scores at each station and field, a report of the students' strengths and weaknesses, and the mean practical score across all stations) were given to the students at the end of the OSCE and traditional evaluations, after the scores were totaled. Nevertheless, one must consider that factors such as the lack of available practical training settings (facilities and patients), the preference of most teaching staff for theoretical aspects, and graduation and employment criteria based on theoretical knowledge tests can affect the learning, and the perceived learning, of skills during the traineeship. The results of several studies emphasize the inadequate skills of students and graduates of medical groups (31, 32). As stated, in the present study there was a poor correlation between the OSCE results and the theory and traineeship scores. We emphasize that there is a gap between theoretical and practical courses, which has always been a challenge for education authorities. Considering that the OSCE evaluates all three domains of knowledge, skill, and attitude, finding a single criterion that directly covers all three, particularly attitude, seems impossible.

Overall, this study provides evidence of a significant difference between the scores of the OSCE, traditional, and common clinical evaluation methods. Since the OSCE is a good way to evaluate students' clinical skills and has been emphasized in many articles in recent decades, additional studies in this area are recommended, given the conflicting results reported in different studies.

ACKNOWLEDGMENT

We would like to thank all our dear colleagues and the participants in this study. We would also like to thank the Vice Chancellor for Education and Research of Albourz University of Medical Sciences, which provided funding for this research.

1. Assadullahi P, Afshari P. A comparison of effectiveness of theoretical and clinical learning in increasing the student knowledge and performance. IJME. 2002; 2: 15. [Persian]
2. Selim AA, Ramadan FH, El-Gueneidy MM, Gaafer MM. Using Objective Structured Clinical Examination (OSCE) in undergraduate psychiatric nursing education: Is it reliable and valid? Nurse Educ Today. 2012; 32: 283-288.
3. Brosnan M, Evans W, Brosnan E, Brown G. Implementing objective structured clinical skills evaluation (OSCE) in nurse registration programmes in a centre in Ireland: A utilisation focused evaluation. Nurse Educ Today. 2006; 26: 115-122.
4. Adamo G. Simulated and standardized patients in OSCEs: achievements and challenges 1992-2003. Med Teach. 2003; 25(3): 262-70.
5. Rushforth HE. Objective structured clinical examination (OSCE): Review of literature and implications for nursing education. Nurse Educ Today. 2007; 27: 481-490.
6. Abrishamkar S, Sabouri M, Shayan Sh, Eshraghi N, Maleki L. Analyzing and comparing the results of Objective Structured Clinical Examination (OSCE), in-group evaluation and final improvement examination of neurosurgical assistants of Isfahan University of Medical Sciences in 2009-2010. IJME. 2011; 10(5): 634-642.
7. Griesser MJ, Beran MC, Flanigan DC, Quackenbush M, Van Hoff C, Bishop JY. Implementation of an Objective Structured Clinical Exam (OSCE) into Orthopedic Surgery Residency Training. Journal of Surgical Education. 2012; 69: 180-189.
8. Wass V, Roberts C, Hoogenboom R, Jones R, Vleuten CVD. Effect of ethnicity on performance in a final objective structured clinical examination: qualitative and quantitative study. BMJ. 2003; 326: 800-803.
9. Alinir G. Nursing students' and lecturers' perspectives of objective structured clinical examination incorporating simulation. Nurse Educ Today. 2003; 23: 419-26.
10. Austin Z, Gregory P, Tabak D. Simulated patients vs. standardized patients in objective structured clinical examinations. Am J Pharm Educ. 2006; 70: 119.
11. Newble D. Techniques for measuring clinical competence: objective structured clinical examination. Med Educ. 2004; 38: 199-203.
12. Jünger J, Schäfer S, Roth C, Schellberg D, Friedman Ben-David M, Nikendei C. Effects of basic clinical skills training on objective structured clinical examination performance. Med Educ. 2005; 39: 1015-20.
13. Watson R, Stimpson A, Topping A, Porock D. Clinical competence assessment in nursing: A systematic review of the literature. J Adv Nurs. 2002; 39: 421-31.
14. Rau Th, Fegert J, Liebhardt H. How high are the personnel costs for OSCE? A financial report on management aspects. GMS Z Med Ausbild. 2011; 28: Doc13.
15. Dokoohaki R, Sharafi N, Rahnema Sh, AzarHooshang P, Jahanbin I. Evaluation of the knowledge and practice of the third year student nurses about drugs by OSCI method. IJN. 2008; 21: 101-109. [Persian]
16. Mozafari M. Practice of senior nursing students in the cardiac intensive care unit using the OSCI exam. Journal of Ilam University of Medical Sciences. 2003; 12: 45-52. [Persian]
17. Haji Hosseini F, Sharifnia SH, Nazari R, Rezaee R, Saatsaz S, Oajian P. Nursing students' evaluation of their qualifications in clinical skills before starting training in field. FMEJ. 2011; 1(1): 16-20.
18. Shahini N, Sanagoo A, Mahasti Jouybari L. Communication skills and professionalism: The self-assessment of Golestan University of Medical Sciences' students. FMEJ. 2012; 2(3): 3-6.
19. Sadeghnejad M, Habibi R, Habibi Gh. Nursing students and teachers' knowledge, application, interest and views towards the use of new active teaching methods in nursing education. FMEJ. 2011; 1(1): 26-32.
20. Shaneyfelt T, Baum KD, Bell D, Feldstein D, Houston TK, Kaatz S, et al. Instruments for evaluating education in evidence-based practice: A systematic review. JAMA. 2006; 296(9): 1116-27.
21. Tudiver F, Rose D, Banks B, Pfortmiller D. Reliability and validity testing of an evidence-based medicine OSCE station. Fam Med. 2009; 41(2): 89-91.
22. Frohna JG, Gruppen LD, Fliegel JE, Mangrulkar RS. Development of an evaluation of medical student competence in evidence-based medicine using a computer-based OSCE station. Teach Learn Med. 2006; 18(3): 267-72.
23. Fernandes PT, Noronha AL, Sander JW, Bell GS, Li LM. Training the trainers and disseminating information: A strategy to educate health professionals on epilepsy. Arch Neuropsychid. 2007; 65: 14-22.
24. Saboury A, Vahid Dastjerdi E, Mahdian M, Kharazifard M. Dental residents' perceptions of Objective Structured Clinical Examination (OSCE) as a clinical evaluation method. J Dent Sch. 2010; 28: 88-94. [Persian]
25. Imani M, Hosseini Tabatabaei MT. Is OSCE successful in pediatrics? J Med Educ. 2005; 6: 153-158.
26. Awaisu A, Abd Rahman NS, Nik Mohamed MH, Bux Rahman Bux SH, Mohamed Nazar NI. Malaysian pharmacy students' assessment of an objective structured clinical examination (OSCE). Am J Pharm Educ. 2010; 74: 34.
27. Faryabi J, Farzad M, Sinaei N. Dental students' perspective on Objective Structured Clinical Examination in Kerman University of Medical Sciences. Journal of Medical Education Development Center. 2010; 6: 34-39.
28. Bradley P, Humphris G. Assessing the ability of medical students to apply evidence in practice. Med Educ. 1999; 33: 815-817.
29. Carraccio C, Englander R. The objective structured clinical examination. Arch Pediatr Adolesc Med. 2002; 154: 736-741.
30. Moattari M, Abdollah-Zargar Sh, Mousavinasab M, Zare N, Beygi-Marvdast P. Reliability and validity of OSCE in evaluating clinical skills of nursing students. Pejouhesh. 2007; 31: 55-59. [Persian]
31. Chamanzari H. Optimizing of intensive care in preventing complications of mechanical ventilation in patients with brain trauma in ICU. Journal of Gonabad University of Medical Sciences. 1996; 3: 13-24. [Persian]
32. Adib Hajbagheri M, Afazel MR, Mousavi Gh, Noorizad S. Evaluation of knowledge and skills of medical personnel of Kashan hospitals regarding cardiopulmonary resuscitation. KAUMS Journal (FEYZ). 2001; 5(3): 96-103. [Persian]