Journal Article Details
BMC Medical Education
Programmatic assessment of competency-based workplace learning: when theory meets practice
Debbie ADC Jaarsma1, Cees PM van der Vleuten3, Peter van Beukelen6, Jan CM Haarhuis6, Harold Brommer2, Lars FH Theyse5, Nancy J Rietbroek2, Robert P Favier5, Pim W Teunissen4, Harold GJ Bok6
[1]Evidence-Based Education, Academic Medical Centre, University of Amsterdam, Amsterdam, The Netherlands
[2]Department of Equine Sciences, Faculty of Veterinary Medicine, Utrecht University, Utrecht, The Netherlands
[3]Department of Educational Development and Research, Faculty of Health, Medicine, and Life Sciences, Maastricht University, Maastricht, The Netherlands
[4]Department of Obstetrics and Gynaecology, VU University Medical Centre, Amsterdam, The Netherlands
[5]Department of Clinical Sciences of Companion Animals, Faculty of Veterinary Medicine, Utrecht University, Utrecht, The Netherlands
[6]Quality Improvement in Veterinary Education, Faculty of Veterinary Medicine, Utrecht University, Utrecht, The Netherlands
Keywords: Personal development; Mentoring; (Peer) Feedback; Undergraduate (veterinary) medical education; Workplace learning; Programmatic assessment
DOI: 10.1186/1472-6920-13-123
Received: 2013-05-23; Accepted: 2013-09-06; Published: 2013
【 Abstract 】

Background

In competency-based medical education, emphasis has shifted towards outcomes, capabilities, and learner-centeredness. Together with a focus on sustained evidence of professional competence, this calls for new methods of teaching and assessment. Recently, medical educators have advocated a holistic, programmatic approach to assessment. Besides maximising the facilitation of learning, such an approach should improve the validity and reliability of measurements and the documentation of competence development. We explored how, in a competency-based curriculum, current theories on programmatic assessment interacted with educational practice.

Methods

In a development study that included evaluation, we investigated the implementation of a theory-based programme of assessment. Between April 2011 and May 2012, quantitative evaluation data were collected and used to guide group interviews that explored the experiences of students and clinical supervisors with the assessment programme. We coded the transcripts, and emerging topics were organised into a list of lessons learned.

Results

The programme mainly focused on integrating learning and assessment by motivating and supporting students to seek and accumulate feedback. The assessment instruments were aligned to cover predefined competencies, enabling aggregation of information in a structured and meaningful way. Assessments that were designed as formative learning experiences were increasingly perceived as summative by students. Peer feedback was experienced as a valuable method for formative feedback. Social interaction and external guidance seemed to be of crucial importance to scaffold self-directed learning. Aggregating data from individual assessments into a holistic portfolio judgement required expertise and extensive training and supervision of judges.

Conclusions

A programme of assessment in which low-stakes assessments simultaneously provide formative feedback and input for summative decisions proved difficult to implement. Careful preparation and guidance of the implementation process were crucial. Assessment for learning requires meaningful feedback with each assessment, so special attention should be paid to the quality of feedback at individual assessment moments. Comprehensive attention to faculty development and to training for students is essential for the successful implementation of an assessment programme.

【 License 】

   
2013 Bok et al.; licensee BioMed Central Ltd.

