Journal article details
BMC Infectious Diseases
Inter-rater reliability of the infectious disease modeling reproducibility checklist (IDMRC) as applied to COVID-19 computational modeling research
Research
Harry Hochheiser1  Bruce Childers2  Meghan Matlack3  Alice E Arcury-Quandt4  Marquis S Hawkins4  Kharlya Carpio4  Darya Pokutnaya4  Willem G Van Panhuis5 
[1] Department of Biomedical Informatics, Intelligent Systems Program, and Clinical and Translational Science Institute, University of Pittsburgh, Pittsburgh, PA, USA
[2] Department of Computer Science, University of Pittsburgh, Pittsburgh, PA, USA
[3] Department of Environmental and Occupational Health, University of Pittsburgh, Pittsburgh, PA, USA
[4] Department of Epidemiology, University of Pittsburgh, Pittsburgh, PA, USA
[5] Office of Data Science and Emerging Technologies, National Institute of Allergy and Infectious Diseases, Rockville, MD, USA
Keywords: Reproducibility; Infectious Disease; Epidemiology; Modeling; COVID-19; Coronavirus Disease 2019
DOI: 10.1186/s12879-023-08729-4
Received: 2023-03-22; Accepted: 2023-10-19; Published: 2023
Source: Springer
【 Abstract 】

Background: Infectious disease computational modeling studies were published widely during the coronavirus disease 2019 (COVID-19) pandemic, yet their reproducibility is limited. Developed through an iterative testing process with multiple reviewers, the Infectious Disease Modeling Reproducibility Checklist (IDMRC) enumerates the minimal elements necessary to support reproducible infectious disease computational modeling publications. The primary objective of this study was to assess the reliability of the IDMRC and to identify which reproducibility elements went unreported in a sample of COVID-19 computational modeling publications.

Methods: Four reviewers used the IDMRC to assess 46 preprint and peer-reviewed COVID-19 modeling studies published between March 13th, 2020, and July 30th, 2020. Inter-rater reliability was evaluated by mean percent agreement and Fleiss' kappa coefficients (κ). Papers were ranked by the average number of reproducibility elements reported, and the average proportion of papers reporting each checklist item was tabulated.

Results: Questions related to the computational environment (mean κ = 0.90, range = 0.90–0.90), analytical software (mean κ = 0.74, range = 0.68–0.82), model description (mean κ = 0.71, range = 0.58–0.84), model implementation (mean κ = 0.68, range = 0.39–0.86), and experimental protocol (mean κ = 0.63, range = 0.58–0.69) showed moderate or greater (κ > 0.41) inter-rater reliability. Questions related to data had the lowest values (mean κ = 0.37, range = 0.23–0.59). Reviewers ranked similar papers in the upper and lower quartiles based on the proportion of reproducibility elements each paper reported. While over 70% of the publications provided the data used in their models, fewer than 30% provided the model implementation.

Conclusions: The IDMRC is the first comprehensive, quality-assessed tool for guiding researchers in reporting reproducible infectious disease computational modeling studies. The inter-rater reliability assessment found that most scores were characterized by moderate or greater agreement. These results suggest that the IDMRC may be used to provide reliable assessments of the potential reproducibility of infectious disease modeling publications. The evaluation also identified opportunities to improve the model implementation and data questions, which could further increase the reliability of the checklist.
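As an illustration of the inter-rater reliability statistics named in the Methods (this is not the authors' code), the sketch below computes Fleiss' kappa and mean pairwise percent agreement for a hypothetical matrix of binary checklist ratings; the ratings matrix, function names, and the reported/not-reported coding are all assumptions made for demonstration.

```python
# Minimal sketch of the two reliability statistics in the abstract:
# Fleiss' kappa and mean pairwise percent agreement. All data are hypothetical.
import numpy as np

def fleiss_kappa(ratings, categories=(0, 1)):
    """Fleiss' kappa for `ratings`, an (items x raters) array of category labels."""
    items, raters = ratings.shape
    # counts[i, j] = number of raters who assigned item i to category j
    counts = np.array([[np.sum(row == c) for c in categories] for row in ratings])
    p_j = counts.sum(axis=0) / (items * raters)              # overall category proportions
    P_i = (np.sum(counts ** 2, axis=1) - raters) / (raters * (raters - 1))
    P_bar, P_e = P_i.mean(), np.sum(p_j ** 2)                # observed vs. chance agreement
    return (P_bar - P_e) / (1 - P_e)

def percent_agreement(ratings):
    """Mean percent agreement over all pairs of raters."""
    items, raters = ratings.shape
    pairs = [(a, b) for a in range(raters) for b in range(a + 1, raters)]
    return np.mean([np.mean(ratings[:, a] == ratings[:, b]) for a, b in pairs])

# Hypothetical 0/1 ratings (1 = element reported) from 4 reviewers on 6 checklist items.
ratings = np.array([
    [1, 1, 1, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
])
print(f"Fleiss' kappa: {fleiss_kappa(ratings):.2f}")
print(f"Mean percent agreement: {percent_agreement(ratings):.2f}")
```

In the study's setup, each checklist question would yield one such matrix (46 papers by 4 reviewers), and the reported mean κ per category would average the kappas of that category's questions.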

【 License 】

CC BY   
© The Author(s) 2023

【 Preview 】
Attachment list
Files  Size  Format  View
RO202311105659994ZK.pdf  1689KB  PDF  download
Fig. 1  220KB  Image  download
Fig. 3  126KB  Image  download
【 Figures 】
Fig. 1
Fig. 3
