ePoster
Abstract Title | Discrepancy between faculty and simulated patient scores for communication and interpersonal skills

Authors

  1. Masami Tagawa
  2. Kazunori Ganjitsuda
  3. Kenichi Ikeda

Theme

Simulation and Simulated Patients

Category

Simulated Patients

Institution

Center for Innovation in Medical and Dental Education, Kagoshima University

Conclusion

In our study, using CI instruments with equivalent items, expressed as observable behaviors for faculty raters and as patient perceptions for SPs, paired scores for the same examinees showed discrepancies at the item level. Scores were rarely identical and rarely positively correlated, especially for items on listening to and understanding the patient, expressing empathy, and the global rating of an examinee as a prospective medical graduate.

Background

For the assessment of communication and interpersonal skills (CI) of medical students and residents, it has been debated whether checklist scores given by physicians and by SPs reflect competency and patients' perspectives (refs. 1-8). In our previous study, total CI scores from faculty raters (physicians) and SPs were only weakly correlated, but the differing perspectives of the two rater groups were not clearly identified.


Summary of Work

Fifteen SP cases in an OSCE taken by sixth-year medical students at Kagoshima University in 2010 and 2011 were analyzed. In the examination room, faculty raters scored CI as well as history taking (content), physical examination, and informing the SP of the diagnosis and plan. Twenty minutes later, SPs scored CI in the SP rating room.

Five items on listening (L), five on explaining and decision-making (ED), and four on attitude and the whole process (AW) were scored by a faculty rater and an SP for each examinee using equivalent instruments (see Details). Paired scores were analyzed using Pearson correlation coefficients and paired t-tests.
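A minimal sketch of this per-item analysis, assuming Python with NumPy and SciPy (not the authors' code); the paired score arrays are invented illustrative data, not study data:

import numpy as np
from scipy.stats import pearsonr, ttest_rel

# Invented example data: paired faculty and SP scores for one CI item,
# eight examinees, on the study's 3-point scale (0, 0.5, 1).
faculty = np.array([1.0, 0.5, 1.0, 0.0, 0.5, 1.0, 0.5, 1.0])
sp = np.array([0.5, 0.5, 1.0, 0.5, 0.0, 0.5, 0.5, 1.0])

# Pearson correlation: do the two raters rank examinees similarly?
r, p_corr = pearsonr(faculty, sp)

# Paired t-test: is there a systematic difference in score level?
t, p_diff = ttest_rel(faculty, sp)

print(f"Pearson r = {r:.2f} (p = {p_corr:.3f})")
print(f"paired t = {t:.2f} (p = {p_diff:.3f})")

Under this scheme, a significant positive r indicates that the two raters rank examinees similarly, while a significant paired t indicates a systematic difference in score level between the rater types.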


Take-home Messages

Faculty and SP raters have different perspectives on CI and do not complement each other.

Acknowledgement

The authors thank the sixth-year medical students, faculty members, Kagoshima SP group members, and staff who participated in the OSCE and this research.

Summary of Results

1. Of the 210 item-case combinations (14 items × 15 cases), 34 (16%) showed a significant positive correlation between faculty and SP scores (Pearson correlation coefficient, p<0.05).

2. We classified the items according to psychometric criteria (Figure 2), and each item's psychometric property in each case is shown in Figure 3.

Significant score differences were detected for items on listening behaviors and understanding the patient (Items 1, 2, and 5), expressing empathy (Item 11), and the examinee's attitude, such as the enthusiasm and confidence expected of a medical student about to graduate (Item 12).

References
  1. Stiles WB. Evaluating medical interview process components. Null correlations with outcomes may be misleading. Med Care 1989;27:212-20.
  2. Martin JA, Reznick RK, Rothman A, Tamblyn RM, Regehr G. Who should rate candidates in an objective structured clinical examination? Acad Med 1996;71:170-5.
  3. Zoppi K, Epstein RM. Is communication a skill? Communication behaviors and being in relation. Fam Med 2002;34:319-24.
  4. Thistlethwaite J. Simulated patient versus clinician marking of doctors' performance: which is more accurate? Med Educ 2004;38:456.
  5. Egener B, Cole-Kelly K. Satisfying the patient, but failing the test. Acad Med 2004;79:508-10.
  6. Mazor KM, Ockene JK, Jane Rogers H, Carlin KM, Quirk ME. The relationship between checklist scores on a communication OSCE and analogue patients' perceptions of communication. Adv Health Sci Educ 2005;10:37-51.
  7. Makoul G, Krupat E, Chang C-H. Measuring patient views of physician communication skills: development and testing of the Communication Assessment Tool. Patient Educ Couns 2007;67:333-42.
  8. Salmon P, Young B. Creativity in clinical communication: from communication skills to skilled communication. Med Educ 2011;45:217-26.
  9. Boon H, Stewart M. Patient-physician communication assessment instruments: 1986 to 1996 in review. Patient Educ Couns 1998;35:161-76.
  10. Duffy FD, Gordon GH, Whelan G, Cole-Kelly K, Frankel R. Assessing competence in communication and interpersonal skills: The Kalamazoo II Report. Acad Med 2004;79:495-507.
  11. Schirmer JM, Mauksch L, Lang F, Marvel K, Zoppi K, Epstein RM, Brock D, Pryzbylski M. Assessing communication competence: a review of current tools. Fam Med 2005;37:184-92.
  12. Rider EA, Keefer CH. Communication skills competencies: definitions and a teaching toolbox. Med Educ 2006;40:624-9.
  13. Iramaneerat C, Myford CM, Yudkowsky R, Lowenstein T. Evaluating the effectiveness of rating instruments for a communication skills assessment of medical residents. Adv Health Sci Educ 2009;14:575-94.
  14. Huntley CD, Salmon P, Fisher PL, Fletcher I, Young B. LUCAS: a theoretically informed instrument to assess clinical communication in objective structured clinical examinations. Med Educ 2012;46:267-76.

Details

Items for scoring CI were created based on the communication process required in this OSCE, the Common Achievement Tests Organization instrument, and previous publications (refs. 9-14). For faculty raters, items were written as the examinee's observable behaviors; for SPs, items were expressed as the patient's recognition, understanding, and feelings (Table 1).

Each item was rated on a 3-point Likert scale: completely agree (satisfactorily done, competent; 1 point), agree but inadequate (done but incomplete/unsatisfactory; 0.5 points), and disagree (not done or very bad; 0 points).
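As an illustration only (hypothetical code, not part of the study), this scale amounts to a simple label-to-score mapping:

# Hypothetical illustration: the 3-point scale as a label-to-score mapping.
SCALE = {
    "completely agree": 1.0,      # satisfactorily done, competent
    "agree but inadequate": 0.5,  # done but incomplete/unsatisfactory
    "disagree": 0.0,              # not done or very bad
}

# Converting one rater's responses for three items to numeric scores.
responses = ["completely agree", "disagree", "agree but inadequate"]
print([SCALE[label] for label in responses])  # [1.0, 0.0, 0.5]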

