Medical Students' Perception of the OSCE at the Department of Internal Medicine, College of Medicine, King Khalid University


Omer A Elfaki
Suliman Al-Humayed



Conclusions

·  OSCE was well accepted by the students.

·  Some students were concerned about the station time, exam anxiety and the long waiting time in the exam venue.



Background

·  The OSCE was first introduced as part of the assessment of 4th year medical students in internal medicine in 2009.

·  The first and second OSCEs were trial runs composed of only 5 stations.

·  In 2011, a 12-station OSCE was conducted.

·  The aim of this study was to explore the students' acceptance of the OSCE as a method of assessment of clinical competence in internal medicine.




The Objective Structured Clinical Examination (OSCE) in internal medicine was first introduced as part of the assessment of 4th year medical students in 2009. The first and second OSCEs were trial runs composed of only 5 stations. The present one is a 12-station OSCE. The aim of this study was to explore students' acceptance of the OSCE as a method of assessment of clinical competence in internal medicine.


Summary of Work





·  A self-administered questionnaire was completed by 4th year medical students immediately after the OSCE.

·  The main outcome measures were students' perception of examination attributes, including the quality of instructions and organization, and the quality of performance.

Take-home Messages

·  OSCE was well accepted by the students.

·  The department is encouraged to replace the long and short cases for final-year students with the OSCE.




Acknowledgements

· The authors wish to thank all staff members of the Department of Internal Medicine for their efforts in preparing and executing the OSCE.

· We also express our gratitude to the participating students and patients.

Summary of Results


·  The students perceived the OSCE in internal medicine as fair (53%), comprehensive (56%) and as having clear instructions (54%).

·  However, some students felt that it was no less stressful than other methods of assessment (30%).

·  Although most students were satisfied with the station time (57%), a few expressed concerns about inadequate time at some stations (13%).

·  Some students expressed concerns about being kept for a long time in the exam venue.




References

1. Harden RM: How to assess clinical competence – an overview. Med Teach 1979, 1:289-296.

2. Fowell SL, Bligh JG: Recent developments in assessing medical students. Postgrad Med J 1998, 74:18-24.

3. Harden RM: What is an OSCE? Med Teach 1988, 10:19-22.

4. Harden RM, Stevenson M, Downie WW, Wilson GM: Assessment of clinical competence using objective structured examination. Br Med J 1975, 1:447-451.

5. Carraccio C, Englander R: The objective structured clinical examination, a step in the direction of competency-based evaluation. Arch Pediatr Adolesc Med 2000, 154:736-741.

6. Harden RM, Caincross RG: The assessment of practical skills: the Objective Structured Practical Examination (OSPE). Stud High Educ 1980, 5:187-196.

7. StataCorp: Stata Statistical Software: Release 7.0. College Station, TX: StataCorp LP; 2001.

8. Newble DI: Eight years experience with a structured clinical examination. Med Educ 1988, 22:200-204.

9. Duerson MC, Romrell LJ, Stevens CB: Impacting faculty teaching and student performance: nine years' experience with the objective structured clinical examination. Teach Learn Med 2000, 12:176-182.

10. Kowlowitz V, Hoole AJ, Sloane PD: Implementation of the Objective Structured Clinical Examination in a traditional medical school. Acad Med 1991, 66:345-347.

11. Woodburn J, Sutcliffe N: The reliability, validity and evaluation of the objective structured clinical examination in podiatry. Assessment Evaluation Higher Educ 1996, 21:131-147.

12. Allen R, Heard J, Savidge M, Bittengle J, Cantrell M, Huffmaster T: Surveying students' attitudes during the OSCE. Adv Health Sci Educ 1998, 3:197-206.

13. Pierre R, Wierenga A, Barton M, Branday JM, Christie C: Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica. BMC Med Educ 2004, 4:22.



Conclusion

Student feedback confirmed their acceptance of the OSCE. This encourages the department to consider implementing the OSCE for graduating students. It is also timely, since the University and the College are engaged in curriculum evaluation and reform.





Appropriate assessment of the clinical skills and competence of medical students is an essential and integral component of the medical curriculum. Several methods of assessment of performance are used in medical education (1,2). The Objective Structured Clinical Examination (OSCE) is an approach to student assessment in which aspects of clinical competence are evaluated in a comprehensive, consistent and structured manner, with close attention to the objectivity of the process (3). The OSCE was introduced by Harden in 1975 (4). Since its inception, the OSCE has been increasingly used to provide formative and summative assessment in various medical disciplines worldwide (5). In addition to assessing the competence and performance of the examinee, the OSCE has many advantages over traditional methods of evaluation such as conventional bedside long and short case examinations. These advantages are most apparent when one reviews the wide spectrum of clinical tasks that can be incorporated into an OSCE, including clinical data interpretation, review of radiographs, use of models, and examination of simulated or real patients. The breadth of data that can be included in this type of examination is limited only by the imagination of the examiners. As an evaluation tool, the OSCE eliminates subjectivity, reduces variations in marking standards from examiner to examiner, and can accurately reflect the real-life tasks of the doctor (6).

Evaluation of OSCE experience by students and faculty helps to enhance its acceptance as a relatively new assessment tool and refine some of the deficiencies observed in the preparation and conduct of the process. Many studies have been conducted on feedback from students on OSCE(7-10).

The College of Medicine, KKU, was established in 1980, and the Department of Internal Medicine has existed for the same period as one of the major clinical departments providing training for undergraduate medical students. However, the OSCE as a testing format was introduced in the department only in 2009, when five stations were used to partially cover history taking, physical examination and data interpretation as part of the assessment of 4th year medical students. The present OSCE was the first full one to be implemented by the department. It was composed of a circuit of 12 stations in which various tasks were set, including examination of organ systems such as the respiratory, cardiovascular and gastrointestinal systems, as well as history-taking skills. Laboratory data, X-rays, pictures and videos were also posted at some of the stations to assess the analytical capacity of students. The time allotted for each station was 5 minutes. A standardized, criterion-based scoring format was used for marking at each station. This descriptive cross-sectional study was conducted on medical students with the objective of evaluating students' perception of the fairness, objectivity, comprehensiveness and overall organization and administration of the OSCE in the Department of Internal Medicine.


Summary of Work


The survey was conducted in June 2011 on the Medicine 1 batch of medical students in the fourth year of their study in the College. An 18-item self-administered structured questionnaire was employed to gather data on students' perception of the quality of the OSCE, its fairness and its organization. A 5-point Likert scale, with responses ranging from "strongly agree" to "strongly disagree", was used. Students were asked follow-up questions on the positive and negative aspects of the OSCE and suggestions for improvement. Basic descriptive statistical analysis of the Likert items was conducted by calculating frequencies and regrouping the responses into similar categories. The survey was anonymous and participation was entirely voluntary.
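The descriptive analysis described above, tabulating item frequencies and collapsing the five Likert categories into broader groups, can be sketched as follows. This is a hypothetical illustration only: the study itself used Stata 7.0 (reference 7), and the sample responses below are invented, not the study's data.

```python
from collections import Counter

# Collapse the 5-point Likert scale into three broader categories,
# as is commonly done when regrouping similar responses.
REGROUP = {
    "strongly agree": "agree",
    "agree": "agree",
    "neutral": "neutral",
    "disagree": "disagree",
    "strongly disagree": "disagree",
}

def frequencies(responses):
    """Percentage frequency of each collapsed category for one questionnaire item."""
    counts = Counter(REGROUP[r] for r in responses)
    n = len(responses)
    return {cat: round(100 * counts[cat] / n)
            for cat in ("agree", "neutral", "disagree")}

# Invented responses to a single item (e.g. "The OSCE was fair"), 100 students.
sample = (["strongly agree"] * 20 + ["agree"] * 33 +
          ["neutral"] * 17 + ["disagree"] * 20 + ["strongly disagree"] * 10)
print(frequencies(sample))  # {'agree': 53, 'neutral': 17, 'disagree': 30}
```

Regrouping before reporting is a design choice: it trades the granularity of the 5-point scale for more robust percentages per item.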



