Assessing rater agreement using marginal association models

Susan Perkins, Mark P. Becker

Research output: Contribution to journal › Article

18 Citations (Scopus)

Abstract

New models that are useful in the assessment of rater agreement, particularly when the rating scale is ordered or partially ordered, are presented. The models are parameterized to address two important aspects of rater agreement: (i) agreement in terms of the overall frequencies with which raters assign categories; and (ii) the extent to which raters agree on the category assigned to individual subjects or items. We present methodology for the simultaneous modelling of univariate marginal responses and bivariate marginal associations in the K-way contingency table representing the joint distribution of K rater responses. The univariate marginal responses provide information for evaluating agreement in terms of the overall frequency of responses, and the bivariate marginal associations provide information on category-wise agreement among pairs of raters. In addition, estimated scores within a generalized log non-linear model for bivariate associations facilitate the assessment of category distinguishability.
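As a rough orientation to the kind of bivariate structure described above, a log-multiplicative (RC-type) association model for the marginal table cross-classifying a pair of raters can be sketched as follows; the notation is illustrative only and need not match the exact parameterization of the paper's generalized log non-linear model:

  \log m_{ij} \;=\; \lambda + \lambda^{(1)}_{i} + \lambda^{(2)}_{j} + \phi\,\mu_{i}\,\nu_{j}

Here m_{ij} is the expected count for category i from the first rater and category j from the second, the \lambda terms determine the univariate marginal frequencies, \phi measures the strength of the pairwise association, and \mu_{i}, \nu_{j} are estimated category scores. Under a parameterization of this kind, adjacent categories whose estimated scores (nearly) coincide are (nearly) indistinguishable to the raters, which is the sense in which estimated scores bear on category distinguishability.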

Original language: English
Pages (from-to): 1743-1760
Number of pages: 18
Journal: Statistics in Medicine
ISSN: 0277-6715
Volume: 21
Issue number: 12
DOI: 10.1002/sim.1146
State: Published - Jun 30, 2002

Keywords

  • Category distinguishability
  • Log-linear model
  • Marginal association model
  • Ordinal data
  • Rater agreement

ASJC Scopus subject areas

  • Epidemiology

Cite this

Perkins, S., & Becker, M. P. (2002). Assessing rater agreement using marginal association models. Statistics in Medicine, 21(12), 1743-1760. https://doi.org/10.1002/sim.1146