Journal Article Details
BMC Medical Imaging
A workstation-integrated peer review quality assurance program: pilot study
Research Article
Margaret M O’Keeffe [1], Kerry Siminoski [1], Todd M Davis [2]
[1] Department of Radiology and Diagnostic Imaging, University of Alberta, and Medical Imaging Consultants, 11010-101 Street, T5H 4B9, Edmonton, AB, Canada
[2] Intelerad, Montreal, QC, Canada; 295 Midpark Way SE, Suite 380, T2X 2A8, Calgary, AB, Canada
Keywords: Diagnostic errors; Diagnostic imaging; Peer review; Practice performance evaluation; Quality assurance; RADPEER
DOI: 10.1186/1471-2342-13-19
Received: 2012-08-20; Accepted: 2013-06-26; Published: 2013
Source: Springer
【 Abstract 】

Background
The surrogate indicator of radiological excellence that has become accepted is consistency of assessments between radiologists, and the technique that has become the standard for evaluating concordance is peer review. This study describes the results of a workstation-integrated peer review program in a busy outpatient radiology practice.

Methods
Workstation-based peer review was performed using the software program Intelerad Peer Review. Cases for review were randomly chosen from those being actively reported. If an appropriate prior study was available, and if the reviewing radiologist and the original interpreting radiologist had not exceeded review targets, the case was scored using the modified RADPEER system.

Results
There were 2,241 cases randomly assigned for peer review. Of the selected cases, 1,705 (76%) were interpreted. Reviewing radiologists agreed with prior reports in 99.1% of assessments. Positive feedback (score 0) was given in three cases (0.2%), and concordance (scores of 0 to 2) was assigned in 99.4%, similar to reported rates of 97.0% to 99.8%. Clinically significant discrepancies (scores of 3 or 4) were identified in 10 cases (0.6%). Eighty-eight percent of reviewed radiologists found the reviews worthwhile, 79% found the scores appropriate, and 65% felt the feedback was appropriate. Two-thirds of radiologists found case rounds discussing significant discrepancies to be valuable.

Conclusions
The workstation-based computerized peer review process used in this pilot project was seamlessly incorporated into the normal workday and met most criteria for an ideal peer review system. Clinically significant discrepancies were identified in 0.6% of cases, similar to published outcomes using the RADPEER system. Reviewed radiologists felt the process was worthwhile.
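The case-selection and scoring workflow described in the Methods can be sketched in code. The snippet below is a minimal illustration under stated assumptions, not the Intelerad Peer Review implementation: the names Case, eligible_for_review, summarize, and the review_target parameter are hypothetical, while the score bands follow the modified RADPEER convention used in the abstract (0 to 2 concordant, 3 or 4 clinically significant discrepancy).

from dataclasses import dataclass

# Modified RADPEER score bands as used in the abstract:
# 0 = positive feedback, 0-2 = concordant, 3-4 = clinically significant discrepancy.
CONCORDANT_SCORES = {0, 1, 2}
DISCREPANT_SCORES = {3, 4}

@dataclass
class Case:                      # hypothetical stand-in for a study record
    study_id: str
    original_radiologist: str
    has_comparable_prior: bool

def eligible_for_review(case: Case, reviewer: str,
                        reviews_done: dict, review_target: int) -> bool:
    """A randomly selected case is scored only if a suitable prior study exists
    and neither the reviewer nor the original interpreting radiologist has
    already reached the review target."""
    if not case.has_comparable_prior:
        return False
    if reviews_done.get(reviewer, 0) >= review_target:
        return False
    if reviews_done.get(case.original_radiologist, 0) >= review_target:
        return False
    return True

def summarize(scores: list) -> dict:
    """Aggregate assigned scores into the rates reported in the Results."""
    n = len(scores)
    concordant = sum(1 for s in scores if s in CONCORDANT_SCORES)
    discrepant = sum(1 for s in scores if s in DISCREPANT_SCORES)
    return {
        "cases_reviewed": n,
        "concordance_rate": concordant / n if n else 0.0,
        "significant_discrepancy_rate": discrepant / n if n else 0.0,
    }

As a usage note, passing summarize() the scores of the 1,705 reviewed cases from this study would reproduce the reported rates of roughly 99.4% concordance and 0.6% clinically significant discrepancy.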

【 License 】

© O’Keeffe et al.; licensee BioMed Central Ltd. 2013. This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
