Interobserver consistency of digital rectal examination in clinical staging of localized prostatic carcinoma

Javier C. Angulo, James E. Montie, Timothy Bukowsky, Amit Chakrabarty, David Grignon, Wael Sakr, Falah H. Shamsa, J. Edson Pontes

Research output: Contribution to journal › Article

6 Citations (Scopus)

Abstract

A prospective study was undertaken to determine the reproducibility of clinical staging based on digital rectal examination (DRE) in prostate carcinoma. We evaluated 48 consecutive patients diagnosed with localized prostatic cancer. Four urologists performed DRE and sorted the patients according to the 1992 American Joint Committee on Cancer Classification for prostate cancer. Both the percentage of observed total agreement between each pair of observers and the interobserver variability (Kappa index) were analyzed. The percentage of observed total agreement among observers in distinguishing five clinical subcategories (T1c, T2a, T2b, T2c, and T3a) ranged between 38-60% (mean 49%), and the Kappa index showed interobserver agreement was poor (overall Kappa = 0.31). All four examiners agreed in assigning the same subcategory in only 21% of cases, and 90% of these were T1. If only categories are distinguished (T1, T2, or T3), the percentage of observed total agreement rises to 60-71% (mean 66%) and interexaminer agreement improves to good (overall Kappa = 0.41). Accurate pathologic staging was obtained in every patient, and the percentage of observed agreement between each examiner and the pathologist was calculated, excluding cases interpreted as T1c. Regarding subcategories, clinicopathologic agreement ranges between 17-46%. If only categories T2 and T3 are distinguished, agreement rises to 57-69%. In summary, the ability to reproduce clinical staging based on DRE among multiple examiners is disappointingly low and understandably correlates poorly with pathologic stage.
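The agreement statistics quoted above follow the standard kappa construction, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e the agreement expected by chance from each rater's marginal category frequencies. As a minimal sketch of the pairwise case only (the study's overall kappa across four examiners would need a multi-rater extension such as Fleiss' kappa), the Python snippet below computes Cohen's kappa for two examiners; the stage assignments are hypothetical, invented for illustration, and are not data from the study.

from collections import Counter

def cohen_kappa(rater_a, rater_b):
    # kappa = (p_o - p_e) / (1 - p_e): chance-corrected agreement
    # between two raters assigning categorical labels (here, T stages).
    n = len(rater_a)
    # p_o: fraction of cases where both raters assign the same category.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # p_e: agreement expected by chance, from the product of each
    # rater's marginal category frequencies, summed over categories.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical T-category calls by two examiners on six patients.
examiner_1 = ["T1c", "T2a", "T2b", "T2a", "T3a", "T2c"]
examiner_2 = ["T1c", "T2b", "T2b", "T2a", "T2c", "T2c"]
print(round(cohen_kappa(examiner_1, examiner_2), 2))  # ~0.59

Here the two examiners agree on 4 of 6 cases (p_o ≈ 0.67), but roughly a fifth of that agreement is expected by chance (p_e ≈ 0.19), so kappa lands near 0.59, higher than the raw percentage alone would suggest is meaningful.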

Original language: English (US)
Pages (from-to): 199-205
Number of pages: 7
Journal: Urologic Oncology: Seminars and Original Investigations
Volume: 1
Issue number: 5
DOI: 10.1016/1078-1439(95)00066-6
State: Published - 1995
Externally published: Yes

Keywords

  • Digital rectal examination
  • interindividual consistency
  • prostate cancer
  • staging

ASJC Scopus subject areas

  • Oncology
  • Urology

Cite this

Interobserver consistency of digital rectal examination in clinical staging of localized prostatic carcinoma. / Angulo, Javier C.; Montie, James E.; Bukowsky, Timothy; Chakrabarty, Amit; Grignon, David; Sakr, Wael; Shamsa, Falah H.; Edson Pontes, J.

In: Urologic Oncology: Seminars and Original Investigations, Vol. 1, No. 5, 1995, p. 199-205.

Research output: Contribution to journal › Article

Angulo, Javier C. ; Montie, James E. ; Bukowsky, Timothy ; Chakrabarty, Amit ; Grignon, David ; Sakr, Wael ; Shamsa, Falah H. ; Edson Pontes, J. / Interobserver consistency of digital rectal examination in clinical staging of localized prostatic carcinoma. In: Urologic Oncology: Seminars and Original Investigations. 1995 ; Vol. 1, No. 5. pp. 199-205.
@article{e185d1847ecb44ffa13f3a7934a6b98e,
title = "Interobserver consistency of digital rectal examination in clinical staging of localized prostatic carcinoma",
abstract = "A prospective study was undertaken to determine the reproducibility of clinical staging based on digital rectal examination (DRE) in prostate carcinoma. We evaluated 48 consecutive patients diagnosed with localized prostatic cancer. Four urologists performed DRE and sorted the patients according to the 1992 American Joint Committee on Cancer Classification for prostate cancer. Both the percentage of observed total agreement between each pair of observers and the interobserver variability (Kappa index) were analyzed. The percentage of observed total agreement among observers in distinguishing five clinical subcategories (T1c, T2a, T2b, T2c, and T3a) ranged between 38-60{\%} (mean 49{\%}), and the Kappa index showed interobserver agreement was poor (overall Kappa = 0.31). All four examiners agreed in assigning the same subcategory in only 21{\%} of cases, and 90{\%} of these were T1. If only categories are distinguished (T1, T2, or T3), the percentage of observed total agreement rises to 60-71{\%} (mean 66{\%}) and interexaminer agreement improves to good (overall Kappa = 0.41). Accurate pathologic staging was obtained in every patient, and the percentage of observed agreement between each examiner and the pathologist was calculated, excluding cases interpreted as T1c. Regarding subcategories, clinicopathologic agreement ranges between 17-46{\%}. If only categories T2 and T3 are distinguished, agreement rises to 57-69{\%}. In summary, the ability to reproduce clinical staging based on DRE among multiple examiners is disappointingly low and understandably correlates poorly with pathologic stage.",
keywords = "Digital rectal examination, interindividual consistency, prostate cancer, staging",
author = "Angulo, {Javier C.} and Montie, {James E.} and Timothy Bukowsky and Amit Chakrabarty and David Grignon and Wael Sakr and Shamsa, {Falah H.} and {Edson Pontes}, J.",
year = "1995",
doi = "10.1016/1078-1439(95)00066-6",
language = "English (US)",
volume = "1",
pages = "199--205",
journal = "Urologic Oncology",
issn = "1078-1439",
publisher = "Elsevier Inc.",
number = "5",

}

TY - JOUR

T1 - Interobserver consistency of digital rectal examination in clinical staging of localized prostatic carcinoma

AU - Angulo, Javier C.

AU - Montie, James E.

AU - Bukowsky, Timothy

AU - Chakrabarty, Amit

AU - Grignon, David

AU - Sakr, Wael

AU - Shamsa, Falah H.

AU - Edson Pontes, J.

PY - 1995

Y1 - 1995

N2 - A prospective study was undertaken to determine the reproducibility of clinical staging based on digital rectal examination (DRE) in prostate carcinoma. We evaluated 48 consecutive patients diagnosed with localized prostatic cancer. Four urologists performed DRE and sorted the patients according to the 1992 American Joint Committee on Cancer Classification for prostate cancer. Both the percentage of observed total agreement between each pair of observers and the interobserver variability (Kappa index) were analyzed. The percentage of observed total agreement among observers in distinguishing five clinical subcategories (T1c, T2a, T2b, T2c, and T3a) ranged between 38-60% (mean 49%), and the Kappa index showed interobserver agreement was poor (overall Kappa = 0.31). All four examiners agreed in assigning the same subcategory in only 21% of cases, and 90% of these were T1. If only categories are distinguished (T1, T2, or T3), the percentage of observed total agreement rises to 60-71% (mean 66%) and interexaminer agreement improves to good (overall Kappa = 0.41). Accurate pathologic staging was obtained in every patient, and the percentage of observed agreement between each examiner and the pathologist was calculated, excluding cases interpreted as T1c. Regarding subcategories, clinicopathologic agreement ranges between 17-46%. If only categories T2 and T3 are distinguished, agreement rises to 57-69%. In summary, the ability to reproduce clinical staging based on DRE among multiple examiners is disappointingly low and understandably correlates poorly with pathologic stage.

AB - A prospective study was undertaken to determine the reproducibility of clinical staging based on digital rectal examination (DRE) in prostate carcinoma. We evaluated 48 consecutive patients diagnosed with localized prostatic cancer. Four urologists performed DRE and sorted the patients according to the 1992 American Joint Committee on Cancer Classification for prostate cancer. Both the percentage of observed total agreement between each pair of observers and the interobserver variability (Kappa index) were analyzed. The percentage of observed total agreement among observers in distinguishing five clinical subcategories (T1c, T2a, T2b, T2c, and T3a) ranged between 38-60% (mean 49%), and the Kappa index showed interobserver agreement was poor (overall Kappa = 0.31). All four examiners agreed in assigning the same subcategory in only 21% of cases, and 90% of these were T1. If only categories are distinguished (T1, T2, or T3), the percentage of observed total agreement rises to 60-71% (mean 66%) and interexaminer agreement improves to good (overall Kappa = 0.41). Accurate pathologic staging was obtained in every patient, and the percentage of observed agreement between each examiner and the pathologist was calculated, excluding cases interpreted as T1c. Regarding subcategories, clinicopathologic agreement ranges between 17-46%. If only categories T2 and T3 are distinguished, agreement rises to 57-69%. In summary, the ability to reproduce clinical staging based on DRE among multiple examiners is disappointingly low and understandably correlates poorly with pathologic stage.

KW - Digital rectal examination

KW - interindividual consistency

KW - prostate cancer

KW - staging

UR - http://www.scopus.com/inward/record.url?scp=0343758241&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0343758241&partnerID=8YFLogxK

U2 - 10.1016/1078-1439(95)00066-6

DO - 10.1016/1078-1439(95)00066-6

M3 - Article

AN - SCOPUS:0343758241

VL - 1

SP - 199

EP - 205

JO - Urologic Oncology

JF - Urologic Oncology

SN - 1078-1439

IS - 5

ER -