Journal Article Details
BMC Medical Research Methodology
Crowdsourcing citation-screening in a mixed-studies systematic review: a feasibility study
Anna H. Noel-Storr [1]  Guillaume Lamé [2]  Gordon Dooley [3]  Patrick Redmond [4]  Sarah Kelly [5]  Jenni Burt [5]  Elisa Liberati [5]  Andy Paterson [5]  Lucy Miller [6]
[1] Cochrane Dementia and Cognitive Improvement Group, Radcliffe Department of Medicine, University of Oxford, OX3 9DU, Oxford, UK
[2] Laboratoire Genie Industriel, CentraleSupélec, Université Paris-Saclay, 91190, Gif-sur-Yvette, France
[3] Metaxis Ltd, Elmbank Offices, Main Road Curbridge, Witney, OX29 7NT, Oxfordshire, UK
[4] NIHR ACL in General Practice, School of Population Health & Environmental Sciences, Kings College London, London, UK
[5] The Healthcare Improvement Studies Institute (THIS Institute), University of Cambridge, Cambridge Biomedical Campus, Cambridge, UK
[6] University Division of Anaesthesia at Addenbrooke’s, University of Cambridge, Cambridge, UK
Keywords: Crowdsourcing; Systematic review; Evidence synthesis; Citizen science; Information retrieval; Citations
DOI: 10.1186/s12874-021-01271-4
Source: Springer
【 Abstract 】

Background: Crowdsourcing engages the help of large numbers of people in tasks, activities or projects, usually via the internet. One application of crowdsourcing is the screening of citations for inclusion in a systematic review. There is evidence that a ‘crowd’ of non-specialists can reliably identify quantitative studies, such as randomized controlled trials, by assessing study titles and abstracts. In this feasibility study, we investigated crowd performance on an online, topic-based citation-screening task: assessing titles and abstracts for inclusion in a single mixed-studies systematic review.

Methods: This study was embedded within a mixed-studies systematic review of maternity care exploring the effects of training healthcare professionals in intrapartum cardiotocography. Citation screening was undertaken via Cochrane Crowd, an online citizen-science platform that enables volunteers to contribute to a range of tasks identifying evidence in health and healthcare. Contributors were recruited from users registered with Cochrane Crowd. After completing task-specific online training, the crowd and the review team independently screened 9546 titles and abstracts. The screening task was then repeated with a new crowd, after minor changes to the crowd agreement algorithm based on findings from the first screening task. We assessed the crowd’s decisions against the review team’s categorizations (the ‘gold standard’), measuring sensitivity, specificity, time and task engagement.

Results: Seventy-eight crowd contributors completed the first screening task. Sensitivity (the crowd’s ability to correctly identify studies included in the review) was 84% (N = 42/50), and specificity (the crowd’s ability to correctly identify excluded studies) was 99% (N = 9373/9493). Task completion took 33 h for the crowd and 410 h for the review team; the mean time to classify each record was 6.06 s per crowd participant and 3.96 s per review team member. Replicating the task with 85 new contributors and the altered agreement algorithm yielded 94% sensitivity (N = 48/50) and 98% specificity (N = 9348/9493). Contributors reported positive experiences of the task.

Conclusion: It might be feasible to recruit and train a crowd to accurately perform topic-based citation screening for mixed-studies systematic reviews, though the resources expended on the necessary customised training should be factored in. Given long review production times, crowd screening may enable more time-efficient conduct of reviews with minimal loss of citation-screening accuracy, but further research is needed.
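The sensitivity and specificity figures above follow directly from the reported confusion counts, with the review team’s categorizations as the gold standard. The snippet below is a minimal illustrative check using only the numbers given in the abstract; the function names are ours, not the study’s.

```python
# Illustrative check of the screening metrics reported in the abstract.
# The review team's categorizations are treated as the gold standard.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Proportion of truly included records the crowd correctly flagged."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Proportion of truly excluded records the crowd correctly rejected."""
    return true_neg / (true_neg + false_pos)

# First screening task: 42 of 50 includes found, 9373 of 9493 excludes rejected.
print(f"Task 1 sensitivity: {sensitivity(42, 50 - 42):.0%}")        # 84%
print(f"Task 1 specificity: {specificity(9373, 9493 - 9373):.1%}")  # 98.7%, ~99%

# The second task's figures follow the same arithmetic on its counts.
```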
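The abstract mentions altering the “crowd agreement algorithm” between the two tasks but does not specify it here. In Cochrane Crowd-style screening, a record is typically resolved once enough consecutive agreeing crowd classifications accumulate. The sketch below illustrates that general idea only; the threshold value, labels and function are hypothetical, not the study’s actual parameters.

```python
from typing import Iterable, Optional

def resolve_record(votes: Iterable[str], threshold: int = 3) -> Optional[str]:
    """Return a final decision ('include'/'exclude') once `threshold`
    consecutive agreeing crowd classifications accumulate; otherwise
    return None (e.g. the record is referred to the review team).

    `threshold` is a hypothetical parameter: lowering it resolves records
    on less agreement, which can raise sensitivity at some cost to
    specificity -- the kind of trade-off an altered algorithm might probe.
    """
    streak_label: Optional[str] = None
    streak_len = 0
    for vote in votes:
        if vote == streak_label:
            streak_len += 1
        else:
            streak_label, streak_len = vote, 1
        if streak_len >= threshold:
            return streak_label
    return None

# Two agreeing votes, one disagreement, then three agreeing votes:
print(resolve_record(["exclude", "exclude", "include",
                      "exclude", "exclude", "exclude"]))  # -> 'exclude'
```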

【 License 】

CC BY   

【 Preview 】
Attachments
File | Size | Format
RO202107031652909ZK.pdf | 754 KB | PDF