Journal Article Details
Frontiers in Psychology
An Overview of Interrater Agreement on Likert Scales for Researchers and Practitioners
Thomas A. O'Neill
Keywords: interrater agreement; rwg; multilevel methods; data aggregation; within-group agreement; reliability
DOI: 10.3389/fpsyg.2017.00777
Subject Classification: Psychology (General)
Source: Frontiers
【 Abstract 】

Applications of interrater agreement (IRA) statistics for Likert scales are plentiful in research and practice. IRA may be implicated in job analysis, performance appraisal, panel interviews, and any other approach to gathering systematic observations. Any rating system involving subject-matter experts can also benefit from IRA as a measure of consensus. Further, IRA is fundamental to aggregation in multilevel research, which is becoming increasingly common in order to address nesting. Although several technical descriptions of a few specific IRA statistics exist, this paper aims to provide a tractable orientation to common IRA indices to support application. The introductory overview is written with the intent of facilitating contrasts among IRA statistics by critically reviewing equations, interpretations, strengths, and weaknesses. Statistics considered include rwg, rwg*, r′wg, rwg(p), average deviation (AD), awg, standard deviation (Swg), and the coefficient of variation (CVwg). Equations support quick calculation and contrasting of different agreement indices. The article also includes a "quick reference" table and three figures in order to help readers identify how IRA statistics differ and how interpretations of IRA will depend strongly on the statistic employed. A brief consideration of recommended practices involving statistical and practical cutoff standards is presented, and conclusions are offered in light of the current literature.
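For orientation, the most widely used of these indices, single-item rwg (James, Demaree, & Wolf, 1984), compares the observed variance of judges' ratings with the variance expected under a null distribution of random responding, commonly the uniform distribution, for which the expected variance on an A-point scale is (A² − 1)/12. The sketch below is not taken from the article's own materials; it is a minimal illustration of that standard formula, assuming the unbiased sample variance for the observed ratings (conventions vary) and illustrative function and variable names.

def rwg_single_item(ratings, num_options):
    """r_wg = 1 - (observed variance / expected variance under a uniform null)."""
    n = len(ratings)
    mean = sum(ratings) / n
    # Observed variance across judges (unbiased, n - 1 denominator; some authors use n)
    observed_var = sum((x - mean) ** 2 for x in ratings) / (n - 1)
    # Expected variance of a discrete uniform distribution over num_options categories
    null_var = (num_options ** 2 - 1) / 12.0
    return 1.0 - observed_var / null_var

# Example: five judges rating one target on a 5-point Likert scale
print(rwg_single_item([4, 4, 5, 4, 4], num_options=5))  # -> 0.9

Higher values indicate stronger within-group agreement; a value near 0 means the judges' ratings are about as dispersed as random responding, and negative values (possible when observed variance exceeds the null variance) are typically truncated to 0 in practice.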

【 License 】

CC BY   

【 Preview 】
Attachments
RO201901220890228ZK.pdf (1139 KB, PDF)