ABSTRACT

Introduction and Significance:
Usability studies are useful for gathering feedback on web-based software applications. This work explores one usability study approach: unmoderated remote usability assessment by expert reviewers. The main goal of this work is to identify usability issues with a new web-based software application, YourGiftGives (YGG), designed to notify biospecimen sample donors of the outcomes of studies they participate in or follow. The objectives of this research are:

Objective 1: To explore remote unmoderated usability testing by expert reviewers.
Objective 2: To apply a remote user testing methodology involving expert reviewers to assess a new web application targeted to biospecimen sample donors.
Objective 3: To identify usability faults and to propose solutions.

Methods:
A literature review was conducted to understand prior research in the field of usability. The focus was on different types of usability studies, especially remote usability testing, and on the use of expert reviewers in usability labs for gathering feedback. Platforms for conducting unmoderated usability testing were also surveyed. As a result, the remote testing platform UserTesting.com was chosen, and expert usability reviewers were recruited from its panel of global participants. These expert reviewers completed the usability tasks and provided detailed feedback. After completing each task, participants were asked to rate how easy it was to find information (intuitive design), how easy it was to keep track of where they were in the application (memorability), and their ability to predict which section of the web application held the information they were looking for (ease of learning). Success or failure in completing a task (effectiveness) and the time taken to complete a task (efficiency) were also captured.
After the usability session, expert reviewer participants completed the 10-item System Usability Scale (SUS) survey on the overall usability of the YGG software; a score of 68 represents average usability. Summary statistics from the task-based analyses and the SUS survey were then compiled and reported. After the quantitative data was analyzed, the qualitative data was reviewed informally to identify potential usability issues, and solutions to those issues were proposed to help improve the user experience of YGG.

Results:
From June 1, 2018 to June 25, 2018, data was collected from 12 expert reviewers using UserTesting.com; 10 of the sessions were usable for analysis, as the other two participants did not complete the test. Participants successfully completed most of the tasks, except tasks 01, 11, and 14, where at least one user was unable to complete the task. When assessing memorability, users strongly agreed that it was easy to remember the location of each task and to find the information needed to complete it, with the exception of four tasks (04, 13, 17, and 18), for each of which one user agreed but did not strongly agree. When asked to rate ease of learning, nine of the ten users (90%) strongly agreed; one user did not strongly agree for tasks 03 and 13. The average time on task was longer for tasks 01, 06, and 08 than for the other tasks. The SUS survey yielded an average score of 85.8 for YGG, indicating above-average usability.
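For readers unfamiliar with how an SUS score such as 85.8 is derived, the following sketch applies Brooke's standard scoring rules to one respondent's ten 1-to-5 Likert answers (this helper is illustrative, not part of the study's analysis code): odd-numbered item contributions are the response minus 1, even-numbered item contributions are 5 minus the response, and the sum is scaled by 2.5 to give a 0-100 score.

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten
    1-5 Likert responses, using Brooke's standard scoring rules."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    # Odd-numbered items (1, 3, 5, 7, 9): contribution is response - 1.
    odd = sum(r - 1 for r in responses[0::2])
    # Even-numbered items (2, 4, 6, 8, 10): contribution is 5 - response.
    even = sum(5 - r for r in responses[1::2])
    return (odd + even) * 2.5

# A respondent answering 5 to every odd item and 1 to every even item
# yields the maximum score of 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0
```

A study-level SUS score, like the 85.8 reported here, is the mean of these per-respondent scores.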
Upon informal review of the qualitative data, I identified the following areas with the potential to improve the overall usability of YGG: (1) improving home page navigation so that users can find important information more efficiently and effectively; (2) using accurate text in the password section so that users know what to expect; (3) dividing categories clearly in the notification preferences section so that users can more easily get updates on the studies they are interested in; and (4) keeping educational material labels consistent so that users can find the material they need easily.

Conclusion/Discussion:
Overall, I found that UserTesting.com was an efficient way to recruit expert reviewers to provide usability feedback on the YGG software. In addition, the unmoderated remote usability testing methodology was effective, collecting usable data from 10 of the 12 recruited expert reviewers. Lastly, the task-based analyses highlighted tasks that may be more challenging for end users. The YGG web portal meets the objective for which it was designed: users were able to understand and perform the tasks. This study also identified areas where the YGG design could be improved.
UNMODERATED REMOTE USABILITY TESTING BY EXPERT REVIEWERS: AN ASSESSMENT OF A WEB-APPLICATION FOR SAMPLE DONORS