Intraobserver and interobserver agreement of the interpretation of pediatric chest radiographs

Jeremiah Johnson, Jeffrey Kline

Research output: Contribution to journal › Article

41 Citations (Scopus)

Abstract

The objective of this study is to quantify the magnitude of intraobserver and interobserver agreement among physicians for the interpretation of pneumonia on pediatric chest radiographs. Chest radiographs that produced discordant interpretations between the emergency physician and the radiologist's final interpretation were identified for patients aged 1-4 years. From 24 such radiographs, eight were randomly selected as study radiographs and 16 served as diversion films. Study participants included two pediatric radiologists, two senior emergency medicine physicians, and two junior fellowship-trained pediatric emergency medicine physicians. Each test included 12 radiographs: the eight study radiographs and four randomly interspersed diversion radiographs; each radiograph was paired with a written clinical vignette. Testing was repeated on four occasions, separated by ≥2 weeks. The dependent variable was the interpretation of the presence or absence of pneumonia; the primary analysis used Cohen's kappa with 95% confidence intervals. Intraobserver agreement was good for both pediatric radiologists (kappa=0.87; 95% CI 0.60-0.99) but was lower for senior emergency physicians (mean kappa=0.68; 95% CI 0.40-0.95) and junior pediatric emergency physicians (mean kappa=0.62; 95% CI 0.35-0.98). Interobserver agreement was fair to moderate overall: between pediatric radiologists, kappa=0.51 (0.39-0.64); between senior emergency physicians, kappa=0.55 (0.41-0.69); and between junior pediatric emergency medicine physicians, kappa=0.37 (0.25-0.51). Practicing emergency clinicians demonstrate considerable intraobserver and interobserver variability in the interpretation of pneumonia on pediatric chest radiographs.
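
As a minimal, self-contained sketch of the primary statistic reported above, the following Python snippet computes Cohen's kappa for two raters' binary pneumonia reads, together with a percentile-bootstrap 95% confidence interval. The example ratings are hypothetical and the bootstrap interval is an assumed approach shown only for illustration; the paper does not publish its code or state how its confidence intervals were derived.

import random

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' binary pneumonia/no-pneumonia reads."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of radiographs where the two reads match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal frequencies per category.
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories)
    return 1.0 if p_e == 1.0 else (p_o - p_e) / (1 - p_e)

def bootstrap_ci(rater_a, rater_b, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for kappa (an assumed method, for illustration)."""
    rng = random.Random(seed)
    n = len(rater_a)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(cohen_kappa([rater_a[i] for i in idx], [rater_b[i] for i in idx]))
    stats.sort()
    return stats[int(alpha / 2 * n_boot)], stats[int((1 - alpha / 2) * n_boot) - 1]

# Hypothetical reads (1 = pneumonia, 0 = no pneumonia) of the same 12 radiographs.
reader_1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
reader_2 = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1]
low, high = bootstrap_ci(reader_1, reader_2)
print(f"kappa = {cohen_kappa(reader_1, reader_2):.2f}, 95% CI {low:.2f}-{high:.2f}")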

Original language: English (US)
Pages (from-to): 285-290
Number of pages: 6
Journal: Emergency Radiology
Volume: 17
Issue number: 4
DOIs: 10.1007/s10140-009-0854-2
State: Published - Jul 2010
Externally published: Yes


Keywords

  • Interobserver agreement
  • Intraobserver agreement
  • Medical malpractice
  • Pediatric
  • Pediatric chest radiograph
  • Pneumonia

ASJC Scopus subject areas

  • Radiology, Nuclear Medicine and Imaging
  • Emergency Medicine
  • Medicine (all)

Cite this

Intraobserver and interobserver agreement of the interpretation of pediatric chest radiographs. / Johnson, Jeremiah; Kline, Jeffrey.

In: Emergency Radiology, Vol. 17, No. 4, 07.2010, p. 285-290.

Research output: Contribution to journal › Article
