Seven Q-Tracks Monitors of Laboratory Quality Drive General Performance Improvement: Experience from the College of American Pathologists Q-Tracks Program 1999-2011
Recommended Citation
Meier FA, Souers RJ, Howanitz PJ, Tworek JA, Perrotta PL, Nakhleh RE, Karcher DS, Bashleben C, Darcy TP, Schifman RB, and Jones BA. Seven Q-Tracks monitors of laboratory quality drive general performance improvement: experience from the College of American Pathologists Q-Tracks program 1999-2011. Arch Pathol Lab Med 2015; 139(6):762-775.
Document Type
Article
Publication Date
6-1-2015
Publication Title
Archives of Pathology & Laboratory Medicine
Abstract
CONTEXT: Many production systems employ standardized statistical monitors that measure defect rates and cycle times as indices of performance quality. Clinical laboratory testing, a system that produces test results, is amenable to such monitoring.
OBJECTIVE: To demonstrate patterns in clinical laboratory testing defect rates and cycle time using 7 College of American Pathologists Q-Tracks program monitors.
DESIGN: Subscribers measured monthly rates of outpatient order-entry errors, identification band defects, and specimen rejections; median troponin order-to-report cycle times and rates of STAT test receipt-to-report turnaround time outliers; and rates of critical value reporting event defects and corrected reports. From these submissions, Q-Tracks program staff produced quarterly and annual reports. These charted each subscriber's performance relative to other participating laboratories and aggregate and subgroup performance over time, dividing participants into best performers, median performers, and performers with the most room to improve. Each monitor's pattern of change is presented as percentile distributions of subscribers' performance in relation to monitoring duration and the number of participating subscribers. Changes over time in defect frequencies and in the cycle duration quantify the effect of monitor participation on performance.
RESULTS: Defect rates decreased significantly in all 7 monitors, which ran variously for 6, 6, 7, 11, 12, 13, and 13 years. The most striking decreases occurred among performers who initially had the most room to improve and among subscribers who participated the longest. Participation effects ranged from 0.85% to 5.1% improvement per quarter of participation.
CONCLUSIONS: Using statistical quality measures, collecting data monthly, and receiving reports quarterly and yearly, subscribers to a comparative monitoring program documented, over 6 to 13 years, significant decreases in defect rates and shortening of a cycle time in all 7 ongoing clinical laboratory quality monitors.
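The Design above describes subscribers submitting monthly rates and receiving quarterly reports that place each laboratory within the group's percentile distribution. The following minimal Python sketch, with entirely hypothetical laboratory names and rates (it is not the Q-Tracks program's software), shows one way such a percentile-based comparative summary could be computed.

```python
# Minimal illustrative sketch (assumed, not the Q-Tracks program's actual software):
# given hypothetical quarterly specimen-rejection rates submitted by subscribers,
# summarize the group's percentile distribution and rank each subscriber against
# its peers, in the spirit of the comparative reports described in the Design.
from statistics import median, quantiles

# Hypothetical submissions: laboratory identifier -> specimen rejection rate (%)
submissions = {
    "lab_A": 0.42, "lab_B": 0.17, "lab_C": 0.95,
    "lab_D": 0.28, "lab_E": 0.61, "lab_F": 0.09,
}

rates = sorted(submissions.values())
deciles = quantiles(rates, n=10)   # 9 cut points: 10th, 20th, ..., 90th percentiles
p10, p50, p90 = deciles[0], median(rates), deciles[-1]
print(f"10th / 50th / 90th percentile rejection rates: {p10:.2f}% / {p50:.2f}% / {p90:.2f}%")

# Each subscriber's standing relative to the other participating laboratories
# (lower rejection rates are better; the rank counts peers at or below this rate).
for lab, rate in sorted(submissions.items(), key=lambda kv: kv[1]):
    pct_rank = 100 * sum(r <= rate for r in rates) / len(rates)
    print(f"{lab}: {rate:.2f}% rejected, percentile rank {pct_rank:.0f}")
```

Tracking how each laboratory's percentile rank and the group's distribution shift from quarter to quarter would then yield the kind of participation-effect trends reported in the Results.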
Medical Subject Headings
Clinical Laboratory Techniques; Humans; Laboratory Proficiency Testing; Pathology, Clinical; Quality Assurance, Health Care; Reproducibility of Results; Societies, Medical; United States
PubMed ID
26030245
Volume
139
Issue
6
First Page
762
Last Page
775