Interrater reliability among primary care sports medicine fellowship application reviewers

Document Type

Conference Proceeding

Publication Date

3-2020

Publication Title

Clin J Sport Med

Abstract

Purpose: This study seeks to determine whether reviewers of fellowship applications agree on the different elements of the evaluation process. To our knowledge, no published data currently exist on interrater reliability in the assessment of Primary Care Sports Medicine fellowship applications. Methods: All fellowship candidate applications from a single cycle were reviewed by 4 Primary Care Sports Medicine faculty at a large, urban hospital accepting 3 Primary Care Sports Medicine fellows. Each reviewer scored all applications independently using a scoring manual developed by the group. The raters had completed no formal training. Scoring was completed for 14 unique domains. Results: During a single application cycle (1 year), 53 unique applications underwent review. Reviewers achieved excellent interrater reliability when scoring USMLE Step 1, 2, and 3 scores, including weighted scores; event and team coverage; medical school performance during the first 2 years; letters of recommendation; and the sports medicine rotation. Raters reached fair/good agreement in evaluating leadership experience; research experience; event and team coverage weighted scores; medical school clerkship performance; the Medical Student Performance Evaluation; and sports medicine rotation weighted scores. Evaluators had poor agreement when scoring special skills, the personal statement, and gestalt. Conclusions: Excellent or fair interrater reliability was achieved for most application elements. Weighted scores did not prove consistently reliable among different reviewers. Future research could investigate why certain application domains yield less than excellent interrater reliability and whether greater interrater reliability could be achieved by training raters on the use of an application scoring guide. Significance: Fellowship directors would benefit from efficient and reliable methods to review applications. Developing application review tools with high interrater reliability allows multiple stakeholders to participate in the application review process.
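
The abstract does not state which reliability statistic or software the authors used; the excellent/fair-good/poor labels are, however, consistent with conventional intraclass correlation cutoffs. The following is a minimal sketch, assuming a two-way random-effects, single-rater ICC (ICC(2,1)) computed per domain from a 53-application-by-4-reviewer score matrix. The function name, demo data, and cutoff labels are illustrative assumptions, not the authors' method.

```python
import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """Two-way random-effects, single-rater ICC, i.e. ICC(2,1).

    `scores` is an (n_targets x k_raters) matrix: one row per
    application, one column per reviewer, for a single domain.
    """
    n, k = scores.shape
    grand_mean = scores.mean()
    row_means = scores.mean(axis=1)   # per-application means
    col_means = scores.mean(axis=0)   # per-reviewer means

    # Two-way ANOVA sums of squares (no replication).
    ss_total = ((scores - grand_mean) ** 2).sum()
    ss_rows = k * ((row_means - grand_mean) ** 2).sum()   # between applications
    ss_cols = n * ((col_means - grand_mean) ** 2).sum()   # between reviewers
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    # Shrout & Fleiss ICC(2,1).
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Illustrative only: 53 applications x 4 reviewers with placeholder
# scores for a single domain (not the study's data).
rng = np.random.default_rng(0)
demo_scores = rng.integers(1, 6, size=(53, 4)).astype(float)

# Common qualitative cutoffs (e.g., Cicchetti): < 0.40 poor,
# 0.40-0.59 fair, 0.60-0.74 good, >= 0.75 excellent.
print(f"ICC(2,1) = {icc_2_1(demo_scores):.2f}")
```

In practice this would be run once per scoring domain, yielding one reliability estimate for each of the 14 domains described in the abstract.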

Volume

30

Issue

2

First Page

162

