Journal article details
BMC Health Services Research
Ethnographic process evaluation in primary care: explaining the complexity of implementation
Christine Nelson [1], MaryBeth Mercer [4], Victoria Jaworski [3], Carmit K McMullen [2], James V Davis [2], Rachel Gold [1], Arwen E Bunce [2]
[1] OCHIN, Inc., 1881 SW Naito Parkway, Portland, OR 97201, USA
[2] Kaiser Permanente Center for Health Research, 3800 N. Interstate Ave., Portland, OR 97227, USA
[3] Multnomah County Health Department, 426 SW Stark St., Portland, OR 97204, USA
[4] Virginia Garcia Memorial Health Center, Cornelius, OR 97113, USA
Keywords: Primary care; Clinical informatics; Mixed methods; Qualitative methods; Ethnography
DOI: 10.1186/s12913-014-0607-0
Received: 2014-05-28; Accepted: 2014-11-17; Published: 2014
【 Abstract 】

Background

The recent growth of implementation research in care delivery systems has led to a renewed interest in methodological approaches that deliver not only intervention outcome data but also deep understanding of the complex dynamics underlying the implementation process. We suggest that an ethnographic approach to process evaluation, when informed by and integrated with quantitative data, can provide this nuanced insight into intervention outcomes. The specific methods used in such ethnographic process evaluations are rarely presented in detail; our objective is to stimulate a conversation around the successes and challenges of specific data collection methods in health care settings. We use the example of a translational clinical trial among 11 community clinics in Portland, Oregon, that are implementing an evidence-based, health information technology (HIT)-based intervention focused on patients with diabetes.

Discussion

Our ethnographic process evaluation employed weekly diaries by clinic-based study employees, observation, informal and formal interviews, document review, surveys, and group discussions to identify barriers and facilitators to implementation success, provide insight into the quantitative study outcomes, and uncover lessons potentially transferable to other implementation projects. These methods captured the depth and breadth of factors contributing to intervention uptake, while minimizing disruption to clinic work and supporting mid-stream shifts in implementation strategies. A major challenge is the amount of dedicated researcher time required.
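
To make the idea of integrating coded qualitative data with quantitative outcomes concrete, the following is a minimal, hypothetical sketch in Python of how coded weekly diary entries could be tallied into barrier and facilitator counts per clinic for side-by-side review with quantitative uptake measures. The data structure, codes, and function names are illustrative assumptions only and are not part of the authors' methods.

# Hypothetical illustration: the paper does not prescribe any software or data
# structures. All names (DiaryEntry, tally_codes, the example codes) are invented.
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class DiaryEntry:
    clinic_id: str                                   # which of the 11 community clinics
    week: int                                        # study week the diary covers
    text: str                                        # free-text reflection from the clinic-based employee
    codes: list[str] = field(default_factory=list)   # analyst-assigned barrier/facilitator codes

def tally_codes(entries):
    """Count barrier/facilitator codes per clinic so qualitative themes can be
    reviewed alongside quantitative outcome data for the same clinic."""
    per_clinic = {}
    for entry in entries:
        per_clinic.setdefault(entry.clinic_id, Counter()).update(entry.codes)
    return per_clinic

entries = [
    DiaryEntry("clinic_A", 3, "Providers ignoring the new alert.", ["barrier:alert-fatigue"]),
    DiaryEntry("clinic_A", 4, "Champion re-trained staff; uptake improving.", ["facilitator:clinic-champion"]),
    DiaryEntry("clinic_B", 3, "New workflow conflicts with front-desk routine.", ["barrier:workflow-conflict"]),
]
for clinic, counts in tally_codes(entries).items():
    print(clinic, dict(counts))

A tally like this is only a starting point for triangulation; the ethnographic value lies in the narrative context around each code, which the counts alone cannot convey.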

Summary

The deep understanding of the ‘how’ and ‘why’ behind intervention outcomes that can be gained through an ethnographic approach improves the credibility and transferability of study findings. We encourage others to share their own experiences with ethnography in implementation evaluation and health services research, and to consider adapting the methods and tools described here for their own research.

【 License 】

2014 Bunce et al.; licensee BioMed Central Ltd.

【 Preview 】
Attachments
File: 20150128154743786.pdf (257KB, PDF)
Article metrics
Downloads: 7; Views: 12