The measured effect of delay in completing operative performance ratings on clarity and detail of ratings assigned

Reed G. Williams, Xiaodong Chen, Hilary Sanfey, Stephen J. Markwell, John D. Mellinger, Gary L. Dunnington

Research output: Contribution to journal › Article

31 Scopus citations

Abstract

Purpose: Operative performance ratings (OPRs) need adequate clarity and detail to support self-directed learning and valid progress decisions. This study was designed to determine (1) the elapsed time between observing operative performances and completing performance ratings under field conditions and (2) the effect of increased elapsed time on rating clarity and detail.

Methods: Overall, 895 OPRs by 19 faculty members for 37 general surgery residents were the focus of this study. The elapsed time between observing the performance and completing the evaluation was recorded. No-delay comparison data included 45 additional ratings of 8 performances collected under controlled conditions immediately following the performance by 17 surgeons whose sole responsibility was to observe and rate the performances. Item-to-item OPR variation and the presence and nature of comments were indicators of evaluation clarity, detail, and quality.

Results: The elapsed time between observing and evaluating performances under field conditions was as follows: 1 day or less, 116 performances (13%); 2 to 3 days, 178 performances (20%); 4 to 14 days, 377 performances (42%); and more than 14 days, 224 performances (25%). Overall, 87% of performances rated more than 14 days after observation had no item-to-item rating variation, compared with 62% rated with a delay of 4 to 14 days, 41% rated with a delay of 2 to 3 days, 42% rated within 1 day, and 2% rated immediately. In addition, 70% of ratings completed more than 14 days after observation had no written comments, compared with 49% for those completed with a delay of 4 to 14 days, 45% for those completed in 2 to 3 days, and 46% for those completed within 1 day. Moreover, 47% of comments submitted after more than 14 days were exclusively global comments (less instructionally useful), compared with 7% for those completed with a delay of 4 to 14 days and 5% for those completed in 1 to 3 days.

Conclusions: The elapsed time between observation and rating of operative performances should be recorded. Immediate ratings should be encouraged. Ratings completed more than 3 days after observation should be discouraged and discounted, as they lack clarity and detail about the performance.

Original language: English (US)
Pages (from-to): e132-e138
Journal: Journal of Surgical Education
Volume: 71
Issue number: 6
DOIs
State: Published - Nov 1 2014

Keywords

  • general surgery
  • operative performance evaluation
  • resident training
  • surgical education

ASJC Scopus subject areas

  • Surgery
  • Education
  • Medicine(all)
