BMC Medical Education
Design and usability testing of an in-house developed performance feedback tool for medical students
Monika Domanska¹, Martin Dittmar¹, Harm Peters², Ylva Holzhausen², Hannah Tame³, Jan-Vincent Wyszynski³, Yadira Roa Romero³, Mohammed Alhassan-Altoaama³, Mandy Petzold³
[1] Business Division IT, Charité – Universitätsmedizin Berlin, 10117 Berlin, Germany
[2] Dieter Scheffner Center for Medical Education and Educational Research, Dean’s Office of Study Affairs, Charité – Universitätsmedizin Berlin, 10117 Berlin, Germany
[3] Quality Assurance in Education, Charité – Universitätsmedizin Berlin, 10117 Berlin, Germany
Keywords: Assessment feedback; Usability testing; Formative and summative assessment; Learning analytics; EPAs
DOI: 10.1186/s12909-021-02788-4
Source: Springer
【 Abstract 】
Background
Feedback is essential in a self-regulated learning environment such as medical education. When feedback channels are widely dispersed, a system is needed to integrate this information in a single platform. This article reports on the design and initial testing of a feedback tool for medical students at Charité – Universitätsmedizin Berlin, a large teaching hospital. Following a needs analysis, we designed and programmed a feedback tool using a user-centered approach. The resulting interface was evaluated with usability testing prior to release and again post-release using quantitative and qualitative questionnaires.

Results
The tool we created is a browser application for use on desktop or mobile devices. Students log in to see a dashboard of “cards” featuring summaries of assessment results, a portal for documenting acquired practical skills, and an overview of their progress through the course. Users see their cohort’s average for each assessment format. Learning analytics rank students’ strengths by subject. The interface is characterized by colourful, simple graphics. In its initial form, the tool has been rated positively overall by students. During testing, the high task completion rate (78%) and the low number of non-critical errors indicated good usability, while the quantitative data (System Usability Scale scoring) also indicated high ease of use. The tool’s source code is open source and can be adapted by other medical faculties.

Conclusions
The results suggest that the implemented tool, LevelUp, is well accepted by students. It therefore holds promise for improved, digitalized, integrated feedback on students’ learning progress. Our aim is for LevelUp to help medical students keep track of their study progress and reflect on their skills. Further development will integrate users’ recommendations for additional features and optimize data flow.
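The “System Usability Scale scoring” referred to above follows a standard, published formula (Brooke’s SUS): ten 1–5 Likert items, with odd-numbered (positive) items contributing (response − 1) and even-numbered (negative) items contributing (5 − response), the sum then scaled by 2.5 to a 0–100 range. A minimal sketch in Python; the example responses are illustrative, not data from the study:

```python
def sus_score(responses):
    """Compute a System Usability Scale (SUS) score from ten 1-5 Likert
    responses. Odd-numbered items (positively worded) contribute
    (response - 1); even-numbered items (negatively worded) contribute
    (5 - response). The summed contributions are scaled by 2.5,
    yielding a score between 0 and 100."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Illustrative example: one participant's questionnaire
print(sus_score([5, 1, 4, 2, 5, 1, 4, 2, 5, 1]))  # -> 90.0
```

A SUS score is computed per participant and the per-participant scores are then averaged across the sample; scores above roughly 68 are conventionally read as above-average usability.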
【 License 】
CC BY