During Spring Quarter 2015, Professor Alfred Kobsa’s undergraduate Informatics 132 project course evaluated Canvas from the perspective of both instructors and students. The student researchers conducted a heuristic assessment of Canvas as well as a series of user interviews and usability tests on both Canvas and proposed user interface updates.

Below are excerpts from the four reports provided to the Office of Information Technology after their comprehensive research projects.

Group 1A: Instructors

Group 1A was tasked with assessing Canvas for instructors with a focus on the following features: Announcements, Discussions, People, Files, Outcomes, Modules, Collaborations, and Chat.

Below are the excerpted conclusions from the full report provided by Group 1A to OIT:

Implementation and Adoption:

Adopting a complex new Learning Management System such as Canvas poses the challenge of a steep learning curve. Adoption of the system in a fast-paced learning environment such as UCI will take time and effort from both parties: instructors have to be willing to try a new system and be open-minded about features they have never used before, while stakeholders and developers must consider the difficulty of implementing such a complex system and, in turn, design features that aid instructors and students in fully adopting it. As users familiarize themselves with the system and how it is designed, they are likely to complete tasks more quickly. Our usability tests strongly supported the existence of a learning curve: the mean time to complete each task was shorter for the experienced user in every single task we gave them.

Recurring issues

Throughout the cognitive walkthrough, user interviews, usability testing, and analysis, we encountered several recurring issues. These issues affected users across different sub-groups, so we paid particular attention to them. We compiled the following list of recurring problem areas:

  • Inconsistency in the wording of buttons
  • Inconsistency in the location of buttons
  • Inconsistent confirmation of successful actions (e.g., when the user submits an outcome, “Outcome successfully uploaded” flashes on screen, but other actions give no such feedback)
  • Lack of instructions in some much-needed areas
  • Lack of leads guiding users to the next step
  • Insufficient hover instructions to help users decide which buttons to click

Conclusion

Through our various exercises (usability testing, the cognitive walkthrough, interviews, and analysis), we were able to compile data to inform an extensive, efficient, and highly effective design solution. We gathered troves of valuable data, both quantitative and qualitative. The qualitative data was used to identify general areas of focus and aspects that needed further detailed testing, while most of our quantitative data came from usability testing. We made sure to follow the steps emphasized in our textbook when conducting the various tests, and we then analyzed the data thoroughly to identify pain points.

Our usability tests indicated that users were dissatisfied in broadly similar areas, such as inconsistent wording and a lack of instructions. Our recommended solutions address these pain points with relatively easy fixes. The Canvas Learning Management System is fairly intuitive to begin with but needs some tweaks. Many of our users also expressed satisfaction with the EEE system, which leads us to believe that most users would prefer add-ons and enhanced functionality rather than a complete overhaul. We would also note that some of the dissatisfaction users expressed may stem from their comfort with EEE and their reluctance to learn a new Learning Management System.

Group 1B: Instructors

Group 1B was tasked with assessing Canvas for instructors with a focus on the following features: Assignments, Grades, Pages, Syllabus, Quizzes, Conferences, Attendance, and Settings.

Below are the excerpted conclusions from the full report provided by Group 1B to OIT:

Summary of Major Problems

One major problem that recurs throughout the system is inconsistent buttons. For example, the “add new” button for the Assignments, Pages, and Quizzes features contains a “+” mark followed by the feature name (e.g., for Assignments, the button is “+Assignment”). For the Conferences feature, however, the “add new” button has no “+” mark, which confuses users.

Another major problem common throughout the system is that it lacks important introductions and instructions. When users encounter a new feature, they will probably need an introduction to what it is for and how it works, as well as instruction on how to use it correctly and effectively. However, the system lacks this guidance for new users. For instance, new users may be confused about the purpose of the “Pages” feature because they are already familiar with “Syllabus,” whose function overlaps with it. If the system introduced new users to these features’ goals, capabilities, and usage, new users would have a better experience with them. For example, the “Attendance” section has a “badge” function that lets users award badges to their students. Badges can have different names, such as “good,” “noisy,” and “participant,” and users can also create badges of their own. This is a very good function, but because there is no introduction or instruction for it, many users did not even know it existed.

Conclusion

For this project, our group tested the following features: Settings, Syllabus, Assignments, Quizzes, Conferences, Grades, and Attendance. The methods we used included cognitive walkthroughs, interviews, and usability testing. Through these methods, we gathered positive and negative impressions of the features, and we learned how important well-organized methods were to our testing. The usability testing gave us insight into the different problems in Canvas, and from these we made suggestions for improvements that could make the system better for professors and other users. Our suggestions included naming the two Settings pages differently, since having two identically named settings is confusing; maintaining consistency across the different features of Canvas; providing instant feedback so users know whether they made an error or successfully saved a document; and making grade weighting easier to use. Canvas includes impressive functions that users really enjoyed. In conclusion, Canvas has implemented various useful tools that the professors found impressive and would be willing to use in the future.

Group 2A: Students

Group 2A was tasked with assessing Canvas for students with a focus on the following features: Announcements, Discussions, People, Files, Outcomes, Modules, Collaborations, and Chat.

Below are the excerpted conclusions from the full report provided by Group 2A to OIT:

As a group, we worked together to find the problems in Canvas, a course management system with great potential for use by students at UCI. By first understanding the system ourselves through heuristic evaluations and cognitive walkthroughs, we were able to proceed to the next step of our task: conducting interviews and think-alouds with participants. Although volunteer turnout was lower than we expected, we still gained very insightful and satisfying data. After collecting our data, we created mockups based on the problems participants described and used them in our next round of interviews.

As we bring this project to a close, we also have, as a team, our final recommendations for Canvas. As a group, we propose a full switch from MyEEE to Canvas based on the data we gathered ourselves and from participants. Our recommendation is based on findings from participants who said that Canvas is very convenient, especially for taking quizzes and working in groups, because of features such as the calendar, the chat, and the Google Drive-like file interface. We would also recommend adding “help” documentation on the side as a reference for students who are not yet used to the system. These are our final thoughts as a group from working with Canvas this quarter. We hope our findings from the participants are helpful toward building a great course management system for future students.

Group 2B: Students

Group 2B was tasked with assessing Canvas for students with a focus on the following features: Assignments, Grades, Pages, Syllabus, Quizzes, Conferences, Attendance, and Settings.

Below are the excerpted conclusions from the full report provided by Group 2B to OIT:

To research the functionality of Canvas, we conducted a set of cognitive walkthroughs to gain some first impressions of the website. We then developed a user test containing instructions for students to follow. We recorded the user tests the students took, and we refined the instructions and methods of the test itself. We then tested students who had never used Canvas before and gained further insight into areas of the Canvas website that could benefit from improvement. Although the general response to Canvas was overwhelmingly positive, we did find some areas where small improvements could enhance the user’s experience. Once we had completed our rounds of user tests with students who were new to the website (but had experience using EEE), we reworked some of the pages on Canvas and made mockups to highlight the areas we found could use improvement, based on our analysis of the user tests and our discussions with the testers.

[…]

Although our user tests indicate an overall positive response from users, we have outlined a few suggestions for improving the website that relate more to aesthetics than to functionality. These suggestions are meant to improve the user experience and are not offered because the current Canvas learning management system is difficult to navigate. Even so, we strongly recommend these improvements, as we believe they would considerably improve the usability of the website. After analyzing our user test results, we constructed eight recommendations: changing the name of the Pages tab, sorting the assignments on the Grades page by class, making the Attendance page more easily accessible, adding a check-in function to the Attendance page, altering the layout for changing passwords, changing the layout of preferences on the Notifications page, changing the structure of the Home page, and maintaining more consistency throughout the pages. The visual changes are highlighted in the “Redesign” section above; here, we go into more depth about our reasoning behind these changes.

None of our recommendations are offered because the Canvas learning management system lacks functionality. Rather, we believe that implementing these changes will improve the user experience and contribute to the effectiveness of the website and the ease with which users navigate it. Through the user testing and interviews, we learned that there are features that can be improved even though they are usable and not especially complicated or hard to find.