Journal Article Details
Implementation Science
Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria
Ruben G. Martinez [6], Mimi Kim [3], Cameo Stanick [1], Bryan J. Weiner [2], Sarah Fischer [5], Cara C. Lewis [4]
[1] Department of Psychology, University of Montana, 32 Campus Dr., Skaggs Bldg. 202, Missoula, MT 59812, USA
[2] 1102-C McGavran-Greenberg Hall, University of North Carolina at Chapel Hill, 135 Dauer Drive, Chapel Hill, NC 27599-7411, USA
[3] Center for Biobehavioral Health Disparities Research, Duke University, Durham, NC 27708-0420, USA
[4] Department of Psychiatry and Behavioral Sciences, University of Washington School of Medicine, Harborview Medical Center, 325 9th Ave, Seattle, WA 98104, USA
[5] Department of Psychological and Brain Sciences, Indiana University, 1101 E. 10th St., Bloomington, IN 47405, USA
[6] Department of Psychology, Virginia Commonwealth University, 806 West Franklin St., Richmond, VA 23220, USA
Keywords: Psychometrics; Evidence-based assessment; Instruments; Dissemination; Implementation; Systematic review
DOI  :  10.1186/s13012-015-0342-x
Received: 2015-04-11; accepted: 2015-10-19; published: 2015
【 Abstract 】

Background

High-quality measurement is critical to advancing knowledge in any field. New fields such as implementation science are often beset with measurement gaps and poor-quality instruments, a weakness that can be more easily addressed in light of systematic review findings. Although several reviews of quantitative instruments used in implementation science have been published, no studies have focused on instruments that measure implementation outcomes. Proctor and colleagues established a core set of implementation outcomes: acceptability, adoption, appropriateness, cost, feasibility, fidelity, penetration, and sustainability (Adm Policy Ment Health Ment Health Serv Res 36:24–34, 2009). The Society for Implementation Research Collaboration (SIRC) Instrument Review Project employed an enhanced systematic review methodology (Implement Sci 10:2, 2015) to identify quantitative instruments of implementation outcomes relevant to mental or behavioral health settings.

Methods

Full details of the enhanced systematic review methodology are available (Implement Sci 10:2, 2015). To increase the feasibility of the review, and consistent with the scope of SIRC, only instruments applicable to mental or behavioral health were included. The review, synthesis, and evaluation comprised (1) a search protocol for the literature review of constructs; (2) the literature review of instruments using Web of Science and PsycINFO; and (3) data extraction and instrument quality ratings to inform knowledge synthesis. Our evidence-based assessment rating criteria quantified fundamental psychometric properties as well as a crude measure of usability. Two independent raters applied the evidence-based assessment rating criteria to each instrument to generate a quality profile.

Results

We identified 104 instruments across the eight constructs: nearly half (n = 50) assessed acceptability, 19 assessed adoption, and every other implementation outcome was represented by fewer than 10 instruments. Only one instrument demonstrated at least minimal evidence of psychometric strength on all six evidence-based assessment criteria. The majority of instruments had no information regarding responsiveness or predictive validity.

Conclusions

Implementation outcomes instrumentation is underdeveloped with respect to both the sheer number of available instruments and the psychometric quality of existing instruments. Until psychometric strength is established, the field will struggle to identify which implementation strategies work best, for which organizations, and under what conditions.

【 License 】

   
2015 Lewis et al.

Attachments
Files Size Format View
Fig. 12. 21KB Image download
Fig. 11. 49KB Image download
Fig. 10. 42KB Image download
Fig. 9. 41KB Image download
Fig. 8. 43KB Image download
Fig. 7. 41KB Image download
Fig. 6. 43KB Image download
Fig. 5. 40KB Image download
Fig. 4. 42KB Image download
Fig. 3. 47KB Image download
Fig. 2. 49KB Image download
Fig. 1. 49KB Image download
【 Figures 】

Fig. 1.

Fig. 2.

Fig. 3.

Fig. 4.

Fig. 5.

Fig. 6.

Fig. 7.

Fig. 8.

Fig. 9.

Fig. 10.

Fig. 11.

Fig. 12.

【 References 】
  • [1]Martinez RG, Lewis CC, Weiner BJ. Instrumentation issues in implementation science. Implement Sci. 2014; 9:118.
  • [2]Weiner B, Amick H, Lee S-Y. Conceptualization and measurement of organizational readiness for change: a review of the literature in health services research and other fields. Med Care Res Rev. 2008
  • [3]Chaudoir SR, Dugan AG, Barr CH. Measuring factors affecting implementation of health innovations: a systematic review of structural, organizational, provider, patient, and innovation level measures. Implement Sci. 2013; 8:22.
  • [4]Chor KHB, Wisdom JP, Olin S-CS, Hoagwood KE, Horwitz SM. Measures for Predictors of Innovation Adoption. Adm Policy Ment Health Ment Health Serv Res. 2015;42:545-73.
  • [5]Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health Ment Health Serv Res. 2009; 36:24-34.
  • [6]Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011; 38:65-76.
  • [7]Lewis CC, Stanick CF, Martinez RG, Weiner BJ, Kim M, Barwick M, Comtois KA. The society for implementation research collaboration instrument review project: a methodology to promote rigorous evaluation. Implement Sci. 2015; 10:2.
  • [8]Grimshaw J, Eccles M, Thomas R, MacLennan G, Ramsay C, Fraser C, Vale L. Toward evidence-based quality improvement. J Gen Intern Med. 2006; 21:S14-S20.
  • [9]Hunsley J, Mash EJ. Evidence-based assessment. Annu Rev Clin Psychol. 2007; 3:29-51.
  • [10]Terwee CB, Bot SD, de Boer MR, van der Windt DA, Knol DL, Dekker J, Bouter LM, de Vet HC. Quality criteria were proposed for measurement properties of health status questionnaires. J Clin Epidemiol. 2007; 60:34-42.
  • [11]Muse K, McManus F. A systematic review of methods for assessing competence in cognitive–behavioural therapy. Clin Psychol Rev. 2013; 33:484-499.
  • [12]Terwee CB, Mokkink LB, Knol DL, Ostelo RW, Bouter LM, de Vet HC. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist. Qual Life Res. 2012; 21:651-657.
  • [13]Goodman RM, McLeroy KR, Steckler AB, Hoyle RH. Development of level of institutionalization scales for health promotion programs. Health Educ Behav. 1993; 20:161-178.
  • [14]Yetter G. Assessing the acceptability of problem-solving procedures by school teams: preliminary development of the pre-referral intervention team inventory. J Educ Psychol Consult. 2010; 20:139-168.
  • [15]Addis ME, Krasnow AD. A national survey of practicing psychologists’ attitudes toward psychotherapy treatment manuals. J Consult Clin Psychol. 2000; 68:331.
  • [16]Moore GC, Benbasat I. Development of an instrument to measure the perceptions of adopting an information technology innovation. Inf Syst Res. 1991; 2:192-222.
  • [17]Champion VL, Leach A. Variables related to research utilization in nursing: an empirical investigation. J Adv Nurs. 1989; 14:705-710.
  • [18]Whittingham K, Sofronoff K, Sheffield JK. Stepping Stones Triple P: a pilot study to evaluate acceptability of the program by parents of a child diagnosed with an Autism Spectrum Disorder. Res Dev Disabil. 2006; 27:364-380.
  • [19]French MT, Bradley CJ, Calingaert B, Dennis ML, Karuntzos GT. Cost analysis of training and employment services in methadone treatment. Evol Program Plan. 1994; 17:107-120.
  • [20]Kashner TM, Rush AJ, Altshuler KZ. Measuring costs of guideline-driven mental health care: the Texas Medication Algorithm Project. J Ment Health Policy Econ. 1999; 2:111-121.
  • [21]Trent LR. Development of a Measure of Disseminability (MOD). University of Mississippi; 2010.
  • [22]McIntosh K, MacKay LD, Hume AE, Doolittle J, Vincent CG, Horner RH, Ervin RA. Development and initial validation of a measure to assess factors related to sustainability of school-wide positive behavior support. J Posit Behav Interv. 2011;13(4):208-18.
  • [23]Glasgow RE, Riley WT. Pragmatic measures: What they are and why we need them. Am J Prev Med. 2013; 45:237-243.
  • [24]Jensen-Doss A, Hawley KM. Understanding barriers to evidence-based assessment: clinician attitudes toward standardized assessment tools. J Clin Child Adolesc Psychol. 2010; 39:885-896.
  • [25]Rabin BA, Purcell P, Naveed S, Moser RP, Henton MD, Proctor EK, Brownson RC, Glasgow RE. Advancing the application, quality and harmonization of implementation science measures. Implement Sci. 2012; 7:119.
  • [26]Haynes SN, Mumma GH, Pinson C. Idiographic assessment: Conceptual and psychometric foundations of individualized behavioral assessment. Clin Psychol Rev. 2009; 29:179-191.
  • [27]Wisdom JP, Chor KHB, Hoagwood KE, Horwitz SM. Innovation adoption: a review of theories and constructs. Adm Policy Ment Health Ment Health Serv Res. 2014;41:480-502.
Article Metrics
Downloads: 227  Views: 25