Final Report
Participants: 5 Instructors, 6 Students
Testing Dates: October–November 2012
Methodology: 1-on-1 Usability Test Sessions
Preparation: 2 Task Sheets, 2 Test Plans
Locations: Instructor's Office (instructors); USER Lab, Walker Plaza, and students' homes (students)
David Brainer-Banker | Christina Dunbar | Steve Goforth | Preethi Srinivas
Fall 2012
Mike Wilson
I543 Usability and Evaluative Methods
TABLE OF CONTENTS
EXECUTIVE SUMMARY ................................................................................................................................... 3
MAJOR FINDINGS ............................................................................................................................................. 4
USER DEMOGRAPHIC & BACKGROUND INFORMATION ........................................................................ 6
DESCRIPTION OF SEVERITY RATINGS ....................................................................................................... 8
INSTRUCTOR ROLE ......................................................................................................................................... 9
QUANTITATIVE RESULTS: INSTRUCTORS ................................................................................................................... 9
QUALITATIVE FINDINGS: INSTRUCTORS ................................................................................................................... 10
STUDENT ROLE ............................................................................................................................................... 21
QUANTITATIVE RESULTS: STUDENTS ....................................................................................................................... 21
QUALITATIVE FINDINGS: STUDENTS ........................................................................................................................ 25
COMPARISON OF ISSUES FOUND BETWEEN EXPERT AND USER EVALUATION OF CANVAS ..... 34
EXPERT REVIEW ...................................................................................................................................................... 34
USER TESTING ......................................................................................................................................................... 34
OVERALL RECOMMENDATIONS FOR IMPROVEMENT ........................................................................ 36
APPENDIX ......................................................................................................................................................... 37
APPENDIX A ............................................................................................................................................................. 37
A.1 Methodology ................................................................................................................................. 37
A.1.1. Test Design ............................................................................................................................................ 37
A.2 Data Collection & Testing Environment ....................................................................................... 38
A.2.1 Instructors ............................................................................................................................................... 38
A.2.2 Students .................................................................................................................................................. 38
APPENDIX B ............................................................................................................................................................. 39
B.1 Informed consent (Student) .......................................................................................................................... 39
B.2 Informed Consent (Instructor) ...................................................................................................................... 41
B.3 Instructor Test Plan ...................................................................................................................................... 42
B.4 Student Test Plan .......................................................................................................................................... 50
B.5 System Usability Scale .................................................................................................................................. 58
APPENDIX C ............................................................................................................................................................. 59
C.1 Instructor quotes .......................................................................................................................................... 59
C.2 Student quotes .............................................................................................................................................. 61
Executive Summary
Canvas, a learning management system developed by Instructure, differentiates itself from its
competitors by integrating popular third-party web services, such as social media, and by
providing an alternative grading experience which attempts to expedite the grading process.
An initial expert review was conducted by a team of graduate students from Indiana University
School of Informatics and Computing on the Indianapolis campus during October-November of
2012. A usability test plan was designed based on the issues identified in the expert review. Two
groups of users, instructors and students, completed two different sets of tasks derived from the
core features of Canvas: Gradebook, Speedgrader, Calendar, Assignments, Quizzes, Inbox
(Conversations), Announcements, Web Services (i.e. Google Docs and Facebook). Users also
completed a pre-test and post-test, which collected demographic data and assessed the overall
usability of Canvas (SUS score).
Key findings from the user testing showed that students found the Web Services features (Google
Docs and Facebook) cumbersome and counterintuitive. Nevertheless, students rated Canvas with
a passing SUS score of 71.5. Instructors, on the other hand, rated Canvas with a failing score of
43.5, finding the Mute Assignment and Inbox labels nonsensical and difficult to find. Despite the
high scores students gave Canvas, they often struggled to complete tasks and were challenged by
various aspects of the system. The variation between instructors and students may be due in part
to each group's willingness to critique.
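The SUS scores reported above (71.5 for students, 43.5 for instructors) follow the standard System Usability Scale scoring procedure: for each of the 10 items, odd-numbered items contribute (rating - 1) and even-numbered items contribute (5 - rating), and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch of that calculation; the example ratings are illustrative, not the study's data:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert ratings.

    Odd-numbered items (positively worded) contribute (rating - 1);
    even-numbered items (negatively worded) contribute (5 - rating).
    The sum of contributions is scaled by 2.5 to a 0-100 range.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Hypothetical ratings for illustration only (not the study data):
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```

A score of 68 is commonly cited as the average SUS benchmark, which is why 71.5 reads as passing and 43.5 as failing.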
Major Findings
The user testing sessions from instructors and students revealed the following major usability
findings for Canvas.
Poor Location/Lack of Context
● Instructors could not find the Mute Assignment feature. All instructors failed to complete
the task for finding the Mute Assignment option. They expected to find the feature in Edit
Assignment Details.
● Students struggled to find the option to link their Facebook account to Canvas, expecting
it to be grouped with other notification-related options.
Unclear/Confusing Labels
● The redundant Assignments and Grades labels on the global navigation and within each
course navigation confused instructors. They found it very distracting.
● Instructors were unsure of the meaning and functionality for the “Save Settings” and
“Re-Publish Quiz” buttons in Quizzes.
System Feedback
● After inputting a grade in the SpeedGrader instructors did not receive any feedback. This
caused them to lack confidence in Canvas.
● A confirmation pop-up during the Google Doc assignment submission process confused
students and caused them to second-guess the action that they were taking.
Standard Conventions
● Neither students nor instructors were familiar with the “Inbox” label as a reference for
incoming and outgoing messages. They also were unfamiliar with emails for multiple
courses being managed in one mailbox and did not expect a messaging feature for a
course to reside outside of the course navigation.
● Instructors disliked Announcements having an option for commenting. They were only
familiar with Announcements being a one-way communication from the instructor to the
students.
● The lack of affordance (i.e. no checkbox, radio buttons, etc.) in the Google Documents
list made it difficult for students to recognize that they needed to click on a document in
order to associate it with an assignment they were submitting.
User Demographic & Background Information
A. Instructors
The following table provides a summary of the demographic profiles of the instructors.
Instructors Demographic & Background Information
Age: Average = 39.2; Standard Deviation = 4.32
Gender: Male = 2; Female = 3
Academic Title: Associate Professor = 1 (20%); Assistant Professor = 1 (20%); Career Services Specialist = 1 (20%); Instructional Technology Specialist = 1 (20%); Lecturer = 1 (20%)
Prior LMSs Used: OnCourse = 5 (100%); Adobe Connect = 1 (20%); Epsilen = 1 (20%); BlackBoard = 0 (0%)
Number of Years Using an LMS: 1 Year or Less = 0 (0%); 2 Years = 0 (0%); 3 Years = 0 (0%); 4 Years = 2 (40%); 5+ Years = 3 (60%)
Edit Assignments and/or Quizzes After They Have Been Sent to Students?: Frequently = 0 (0%); Sometimes = 0 (0%); Rarely = 5 (100%); Never = 0 (0%)
Use LMS to Send Communications to Students?: Frequently = 5 (100%); Sometimes = 0 (0%); Rarely = 0 (0%); Never = 0 (0%); No, I don't use an LMS = 0 (0%)
As the table shows, all instructors were familiar with using a learning management system,
specifically OnCourse. All 5 instructors had at least 4 years of experience using an LMS and
frequently used one to communicate with students.
B. Students
The following table provides a summary of the demographic profiles of the students. Note that
data collected from one (1) of the 6 students was discarded due to incorrect administration of a
task. Data for the remaining 5 students is shown below.
Students Demographic & Background Information
Age: Average = 24.4; Standard Deviation = 3.56
Gender: Male = 3; Female = 2
Academic Title: Graduate Student = 5 (100%)
Major: Health Informatics = 2; Bioinformatics = 1; Human Computer Interaction = 2
Prior LMSs Used: OnCourse = 5 (100%); Adobe Connect = 1 (20%); BlackBoard = 1 (20%); OurCMS = 1 (20%)
Number of Years Using an LMS: 1 Year or Less = 1 (20%); 2 Years = 3 (60%); 3+ Years = 1 (20%)
Do you see yourself subscribing to grades and assignment information through Facebook?: Yes = 0 (0%); No = 5 (100%)
Preference to combine academic and personal calendars?: Yes = 4 (80%); No = 1 (20%)
Preference for using an email account as opposed to a built-in messaging feature in the LMS?: Prefer using email = 1 (20%); Prefer using built-in messaging tool = 4 (80%); Not sure = 0 (0%)
Description of Severity Ratings
Three levels of severity represent the priority for issues found in Canvas: High, Medium, and
Low. The severity levels have been modified since the expert review based on validation from
user testing.
HIGH (Critical) - These issues are high priority. High severity issues are the most critical and
should be given the most attention. These issues will impede the user’s ability to complete a task.
MEDIUM (Cautionary) - These issues are medium priority. Medium severity issues are related
to tasks that were found to be cumbersome and/or not intuitive. These issues have moderate
impact and will cause the user to struggle when completing a task.
LOW (Minimal) - These issues are low priority. Low severity issues are related to user
inconveniences and/or require UI improvements. These issues have minimal impact on the
user's ability to complete a task.
Instructor Role
Below is an overview of the quantitative results for the tasks that were completed by instructors
during user testing.
Quantitative Results: Instructors
Instructor Tasks
The following is a list of the tasks that were given to instructors during the user testing sessions.
A full description of all the testing materials can be seen in the script (see Appendix).
Task 1: Finding an assignment
Task 2: Turning off Mute Assignment
Task 3: Grading an assignment (SpeedGrader/Gradebook)
Task 4: Editing the details for a quiz
Task 5: Sending a message in the Inbox (Conversations)
Task 6: Sending an Announcement
Task Completion
Out of the 6 tasks given to instructors during user testing, only one task, Task 4, was completed
successfully by all instructors. Task 2, turning off the Mute Assignment option, was the most
challenging task, as all instructors failed to complete it. Task 3, using SpeedGrader to grade an
assignment, and Task 5, responding to a message in Conversations (Inbox), required the most
assistance to complete. The chart below provides more insight into task completion by
participant.
Figure 1. Task completion by instructors
Qualitative Findings: Instructors
Below is an overview of the key findings for instructors from user testing including the finding,
severity rating, optional screenshot, and a recommendation.
1. Mute Assignment- Unable to find Mute Assignment
Severity: HIGH
Five (5) of five (5) instructors were unable to complete the task of finding the Mute Assignment
feature. All five (5) instructors went to Edit Assignment Details expecting to find it there. Users
noted that they would not have expected to find the feature in Gradebook or SpeedGrader.
Figure 2. Muting an assignment
User Quotes:
○ “The majority of control options are under the edit assignments, so it [mute assignment]
seems odd that it would not be there.”
○ “I don’t know definition of what they mean by locked. I do think some faculty need a
student view option. I feel like I am being stomped here I am a fairly intuitive person. I
expected the mute assignment type of function would be in Edit Assignments not in the
Gradebook area.”
Recommendation: Keep the Mute Assignment options available in Gradebook and Speed
Grader but add an additional Mute Assignment option to the Assignment Details area.
2. Mute Assignment- Confusion over “Mute Assignment’s” meaning
Severity: HIGH
None of the five (5) instructors comprehended the Mute Assignment label. Some instructors
attempted to find a definition in the help menu, while others simply overlooked the label, unsure
of what it meant. Users were only familiar with the term “mute” referring to audio functionality.
User Quotes:
○ “Mute to me means turn off the sound.”
○ “To me it would be more blind than mute, but to make that so it is not showing
up. I don't like that terminology very well. It is ambiguous to me. I would have
probably gone to assignment details since this is a detail of the assignment.
Maybe show/hide.”
Recommendation: Change the nomenclature to “Hold Grades” (Mute Assignment) / “Publish
Grades” (Unmute Assignment)
3. Inbox - Users unable to find messaging (Inbox)
Severity: HIGH
All five (5) instructors were unable to find the Inbox without assistance. They were unfamiliar
with the “Inbox” label as a reference for incoming and outgoing messages. They were also
unfamiliar with messages for multiple courses being managed in one mailbox and did not expect
a messaging feature for a course to reside outside of the course navigation.
Figure 3. Difficulty in finding Inbox
User Quotes:
○ “When I see Inbox, it’s like my personal Inbox, not a course inbox. I hated that
[the process].”
○ “Expected to find it on the side as "Messages".”
○ “I expected to see something labeled messages or emails. I thought Inbox was
messages that are coming to me. With email “inbox” is only incoming messages.
Not all messages...this is similar in Gmail and Outlook. Inbox is not intuitive.”
Recommendation: Change the nomenclature from “Inbox” to “Messaging” and relocate it to
the global navigation towards the top of the UI.
4. Quizzes - Ambiguous nomenclature found in quiz options
Severity: HIGH
Five (5) out of 5 instructors were confused by the “Save Settings” and “Re-Publish Quiz”
buttons in Quizzes. They were unsure how the functionality of the two buttons differed.
Instructors opted for Save Settings in case the Re-Publish Quiz button did not save; some chose
the “Re-Publish Quiz” button only because its blue color caught their eye.
Figure 4. Ambiguous nomenclature in Quizzes
User Quotes:
○ “Not sure if I have to re-publish the quiz. When you save settings does it reflect
automatically or do you have to republish to reflect the change? Does editing
unpublish it?”
○ “I don’t understand the difference between Save Settings and Re-Publish Quiz.
This is very distracting having [Assignments] in two places is weird to me.”
○ “I did not complete the task, yet I thought I completed the task. That is the worst
thing a learning management system can do. There is nothing worse.”
Recommendation: Replace “Save Settings” and “Re-Publish Quiz” with a single action button
(e.g. Save and Publish), since Save by itself is conventionally used only for written information
or complicated processes. In this case, the Quiz Options are very limited, making a separate
Save Settings button irrelevant.
5. SpeedGrader - Lack of confirmation regarding grade submission via SpeedGrader
Severity: HIGH
All 5 instructors were unsure whether the grade entered into SpeedGrader had been saved. They
expected the system to give them some feedback that their grade was successfully entered; when
none was given, they questioned whether it was actually saved. This lack of feedback contrasted
with the accompanying comment feature in SpeedGrader, which gave feedback after a comment
was entered.
Figure 5. Lack of confirmation on submitting a grade
User Quotes:
○ “I’m not sure if the grade has been submitted. Having assignments on left and on
the right was confusing.”
○ “What to do next was not intuitive after inputting the grade.”
Recommendation: Add a submit button for assignment grade submission.
6. SpeedGrader - Navigating to the next student in SpeedGrader
Severity: MEDIUM
Four (4) out of five (5) instructors initially struggled to find the next student’s assignment in
SpeedGrader. Instructors expected SpeedGrader to automatically advance to the next student
submission. As a result, many instructors perceived SpeedGrader as a regular gradebook and
found that the Gradebook actually did more to expedite the grading process.
Figure 6. Navigating to next student in SpeedGrader
User Quotes:
○ “I expected the speedgrader to automatically go to the next assignment once an
assignment was graded. I would like labels on the arrows that said next and
previous submission.”
Recommendation: Add additional context to indicate that the arrows switch between student
assignments.
7. Navigation - Global Navigation (top of the screen) vs. Course Navigation (left of the
screen)
Severity: LOW
Three out of 5 users reported being distracted and confused by the “Assignments” and “Grades”
labels appearing on both the global navigation and the course navigation. Users were unsure
which navigation to choose.
Figure 7. Differentiating between global and local navigation
User Quotes:
"This one is going to take me to all of the assignments, but this one is sorting under to
grade, but calling it just assignments and giving me a drop down menu with only one
drop down choice is very confusing to me."
Recommendation: Add additional context to the global category on the top bar (e.g. Ungraded
Assignments).
8. Gradebook vs. SpeedGrader
Severity: LOW
All 5 instructors were able to quickly locate the SpeedGrader but had difficulty finding the
Gradebook. Once the Gradebook was found, they all felt it was quicker to use than the
SpeedGrader.
User Quotes:
“I don’t feel as though Speed Grader is speedier. Gradebook allows me to see all of the
students at one time. I still have to advance to the next student.”
Recommendation: Add a “Submit Grade” button that advances to the next student after it is
clicked.
9. SpeedGrader - Incorrect number of students to be graded in SpeedGrader
Severity: LOW
One (1) instructor noticed that the number of “Graded” assignments in SpeedGrader did not
correspond to the number of assignments that had actually been submitted. “Graded” should
logically include only assignments that have been submitted, not the number of students in the
course.
Figure 8. Incorrect number of submissions to be graded
User Quotes:
○ “Only two of the students have submitted the assignment, yet Speed Grader says
that all four students have submitted. That is very confusing.”
Recommendation: Though it could be a glitch, only display the number of students who have
submitted an assignment.
10. Inbox - Inconsistent nomenclature (Inbox vs. Conversations)
Severity: LOW
Three out of 5 instructors thought that Conversations referred to the messages within the Inbox.
They felt that “Messages” was a more appropriate label than “Inbox”, and “Inbox” a more
appropriate label than “Conversations”.
Figure 9. Inconsistent nomenclature
User Quotes:
○ “The page says “Conversations”, but I clicked on Inbox. I don’t remember seeing
a Conversations option on the left-hand part of the screen (course navigation).”
Recommendation: Use consistent wording to refer to this section, which could mean changing
Inbox to Conversations in the top bar or renaming both to Messaging.
11. Announcements - Ability for students to reply to announcements
Severity: LOW
Five (5) out of 5 instructors commented that they disliked the ability to comment on
Announcements. They were only familiar with using Announcements for one-way
communication from instructor to students.
Figure 10. Ability for users to reply to announcements
User Quotes:
○ “This is more of a discussion thread. I don't know if I would use this.”
○ “Users must post before seeing replies? I don't get that. How do you get replies
unless you're not posting?”
○ “Announcements with dialogue is too messy.”
○ “I expect it [announcements] to be one-way. Not two-way. The only option is to save so I
am assuming it is going to save and post it? User must post before seeing replies. Does
not know what that means.”
Recommendation: Follow standard conventions for instructor announcements and do not allow
replies. Alternatively, replies could remain as an option but be disabled by default.
Student Role
Below is an overview of the quantitative results for the tasks that were completed by students
during user testing.
Quantitative Results: Students
Student Tasks
The following is a list of the tasks that were given to students during the user testing sessions. A
full description of all the testing materials can be seen in the script (see Appendix).
Task 1: Finding a list of all assignments due this week
Task 2: Submitting an assignment with Google Docs
Task 3: Checking a grade on a past assignment
Task 4: Sending a note to your instructor
Task 5: Connecting to Facebook
Time on Task
The figure below shows a comparison of student and expert time on task. The average time on
task and average number of clicks recorded by Team Instructure were used as the expert time
on task and number of clicks, respectively.
Figure 11. Comparison of student and expert time on task. Error bars in this and following charts indicate +/-
1 SE from corresponding mean values.
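The ±1 SE error bars in these charts are standard errors of the mean, i.e. the sample standard deviation divided by the square root of the sample size. A minimal sketch of the computation, using hypothetical times rather than the study's data:

```python
from math import sqrt

def standard_error(samples):
    """Standard error of the mean: sample standard deviation / sqrt(n)."""
    n = len(samples)
    mean = sum(samples) / n
    # Sample variance uses the n - 1 (Bessel-corrected) denominator.
    variance = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return sqrt(variance) / sqrt(n)

# Hypothetical times on task in seconds for five students (not the study data):
print(standard_error([62, 75, 58, 90, 70]))
```

With only five participants per group, as here, the SE bars are wide, which is worth keeping in mind when comparing task means.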
Levels of Success
All the students completed all the tasks. Hence, there were two possible levels of success:
● complete success
● partial success
Figure 12 shows student success rates for individual tasks and the percentage of students at each
level of success. Although students were able to turn in an assignment using Google Docs, they
took the most time to complete that task and had the highest rate of partial success.
Figure 12. Success rate per task.
Lostness
Average lostness was calculated for each task, as seen in Figure 13. The tasks involving finding
the assignments coming up for the week and connecting to Facebook have average lostness
values slightly over 0.5. However, these values need not be interpreted as representative of users
being lost in navigation, as is evident from the student comments discussed below. The tasks
requiring authorization of a Google Docs account and finding the grade for an assignment, along
with student feedback, do clearly indicate lostness in navigation.
Figure 13. Lostness per task.
Figure 14. Comparison of student and expert clicks required for task completion (used to calculate lostness).
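The report does not state the exact lostness formula it used; a common definition is Smith's (1996) measure, computed from the number of distinct screens visited (N), total screen visits (S), and the minimum screens required for the task (R), where 0 is a perfect path and values above roughly 0.5 suggest the user was lost. A minimal sketch with hypothetical counts, not the study's data:

```python
from math import sqrt

def lostness(unique_pages, total_pages, minimum_pages):
    """Smith's (1996) lostness measure.

    unique_pages (N): distinct pages/screens visited at least once
    total_pages (S): total page visits, counting revisits
    minimum_pages (R): minimum number of pages needed to complete the task
    Returns 0.0 for an optimal path; values above ~0.5 suggest lostness.
    """
    return sqrt((unique_pages / total_pages - 1) ** 2
                + (minimum_pages / unique_pages - 1) ** 2)

# Hypothetical counts (not the study data): a student visits 8 distinct
# screens over 12 total visits for a task an expert completes in 4 screens.
print(round(lostness(8, 12, 4), 2))  # -> 0.6
```

Using expert clicks as the minimum path, as Figure 14 suggests the team did, is a practical stand-in for R when the true optimal path is not enumerated.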
Qualitative Findings: Students
Below is an overview of the key findings for students from user testing including the finding,
severity rating, optional screenshot, and a recommendation.
1. Confirmation pop-up causes users to second-guess their action.
Severity: Medium
4 out of 5 students were misled by the pop-up message generated while authorizing a Google
Docs account. 2 of these 4 students stopped to think, while the remaining 2 proceeded to click
on the wrong button before returning to retry the same steps. This caused the average number of
clicks required to complete the task to be close to 3 times that of an expert.
Figure 15. Pop-up message while authorizing Google Docs account.
User quote:
“I was slightly confused with the pop-up dialog while authorizing Google Docs
account. The process of re-directing me back to Canvas after clicking on "Grant
access" was also confusing and unexpected.”
Recommendation: This dialog should not be displayed to users who are attempting to connect
to Google Docs. At present there is no data on the Google Docs tab that would be lost anyway;
but even if there were, the system should instead temporarily cache the data and restore it when
the user returns from Google, because the connection step is a mandatory part of the process.
2. Lack of affordances makes it difficult for users to know that they must click on a
particular document from Google Docs before submitting.
Severity: Medium
All 5 students were unable to identify the last step of the task, i.e. selecting the document before
submitting the assignment. A possible reason mentioned by the students was the preconceived
notion: “I thought the document was already attached to the submission!” In reality a user is
likely to have more than one document in his or her Google Docs account, but this is still
considered an issue given that all 5 students were unable to identify this step.
Figure 16. Selecting document before submitting the assignment
User quote:
“By looking at the test document, I was under the assumption that the file is
actually attached to my submission. May be some pop-up message indicating
which file I have to select in order to attach might be helpful.”
Recommendation: We recommend following standard conventions, where the user sees a
selectable list of items with a checkbox or radio button next to each item on the list, or the list of
items is presented within chrome that indicates that it is a selectable list.
3. Tabbed file upload interface obscures Google Docs integration.
Severity: Medium
4 out of 5 students were confused by the File Upload tab and proceeded to upload a file from the
local drive as opposed to using the Google Docs feature. The first tab, titled “File Upload”, is
visible by default for every user turning in an assignment online. Users saw the prominent
Choose File button (paired with an equally prominent “Submit Assignment” button) and
proceeded to click it rather than exploring the Google Doc tab.
Figure 17. Tabbed layout with multiple options to upload a file.
User quote:
“Submission process is confusing... Once you know what you have to do, its easy.”
Recommendation: We recommend moving the submit button and any comment fields out of the
“attachment” interface, clarifying the purpose of that interface as solely for selecting the type of
document to include. In addition, descriptive help text could be added before the interface to
further define its purpose, and the tabs and text within could be improved. For example, the tab
“File Upload” could be renamed “Upload from my computer”. Similarly, “Text Entry,” “Website
URL,” and “Google Doc” could undergo the same type of improvement. A custom file upload
control could also be implemented using a less demanding and more explanatory tone than
“Choose File.”
4. Users were misled by the “Notifications” link on the left panel of the Settings page or the
Facebook link placed in the footer of every page on Canvas.
Severity: Medium
2 out of 5 students were misled by the “Notifications” link on the left panel within the Settings
page. Further, one user ignored the top right part of the screen and clicked on the Facebook link
placed in the footer of the page, which led to the Facebook page of Canvas. These misled actions
caused the average number of clicks and the time required to complete the task to be
approximately three times the expert's.
Figure 18. Links on the left panel within Settings page.
All 5 students, however, had no issues with “Facebook” being listed under “Web Services” and
were able to find the Facebook button after scrolling down within the main Settings page.
User quote:
“Finding the path leading to the page where FB can be authorized was difficult for me. I
saw the word Facebook at bottom of page and clicked on it instead.”
Recommendation: Users may find the Notifications page unclear in its organization because it
provides no means to set up Facebook notifications from Canvas. We recommend moving
social networking options to the dedicated Notifications page for consistent grouping. A section
advertising the availability of options to connect to social media could be presented on the main
Settings page.
5. Users had issues finding the right path to the Inbox. Users recognized the term “Inbox”
as a place where messages arrive rather than a place where messages can be composed.
Severity: Medium
1 out of 5 students did not identify “Inbox” as a messaging tool. This student found a
workaround path through the “Users” link in the left panel to compose a message to the teacher.
2 out of 5 students failed to notice “Inbox” and proceeded to click on the “Settings” link placed
right next to it. In addition to doubling the average number of clicks compared to the expert, this
also increased the average time on task.
Figure 19. Difficulty in finding Inbox
User quote:
“If I hadn't explored the "Inbox" link, I wouldn't have been able to complete this
task. When I see the word Inbox, I would think of it as a place where there are
incoming messages, not a place where I can compose messages”
Recommendation: There is no apparent difference between the labels Inbox, Conversations, and
Discussions. We recommend that the system follow consistency and standards and provide a
concise description for each feature. Relabeling “Inbox” as “Messages” would go a long way
toward making this easier for users to find.
6. Coming Up list may be overlooked or perceived as incomplete.
Severity: Low
The Coming Up list is a dynamic list populated in the right panel of the home page for a student
Canvas user. It is a record of all assignments, quizzes, and discussions due in the upcoming week
for all the courses the student has registered for.
For the task, it was assumed that the student is registered for only 1 course. Hence, the student
was expected to find all the assignments due for the week. While completing the task, 2 out of 5
students clicked on the “View Calendar” link next to the Coming Up list and failed to notice the
list itself.
Figure 20. Coming up list and view calendar features.
User quote:
“I didn’t even notice the “coming up” list on the right panel. I expected to see all the
assignments listed under the Assignments menu item. If there are more assignments, then
I would probably view the calendar since it is not possible to list all the assignments in
the right panel.”
Recommendation: The information on the “To Do” and “Coming Up” lists is often
redundant. Given multiple courses and multiple deliverables within each course, the lists can
become long and hard for users to read. We recommend keeping only one of the two lists and
providing clear visual cues for standard calendar viewing options.
7. Users did not find the icon for locating the teacher while composing a message
intuitive.
Severity: Low
4 out of 5 students typed the first few letters of the teacher’s name into the “To” textbox while
composing the message. All 5 students failed to notice the icon to the right of the “To” textbox
that can be used to find the person of interest.
Figure 21. “To” textbox for composing message.
User quote:
“I did not notice the icon at the right end of the box. Why would I when all I
have to do is just type in the first few letters of the receiver’s name?”
Recommendation: We recommend following standard conventions by allowing the user to
browse an address book, or providing an affordance that lets the user know the button is clickable.
8. Trigger words like “grades” in announcements caused users to click on the
announcement instead of the Grades link
Severity: Low
2 out of 5 students, drawn by the trigger word “grade” in the announcement under “Recent
Activity” on the home page, clicked the announcement instead of the “Grades” link. This
sidetracked them to an announcement that contained no further information about grades.
Figure 22. Word “grade” in the announcement acts as trigger word
User quote:
“I didn’t even look at the menu to find grades. I saw from the announcement
that grades have been posted”
Recommendation: Provide a link from within the announcement which leads back to the
gradebook.
9. Summing the grades for all assignments and discussions into single totals titled
“Assignments” and “Total” was difficult for users to comprehend
Severity: Low
The total grade listed in the row titled “Assignments” was calculated as a percentage of the
total score for all assignments and discussions. This confused users, since the scores for
discussions were summed together with those for assignments. Further, the “Total” row showed a
value identical to the “Assignments” row. 2 out of 5 students had difficulty reading and calling
out the information listed on the grades page. In the figure below, the total score is above 100%.
This value is calculated as a percentage of the sum of the scores from Assignment 1 and Test
Discussion 1. It is essentially a bug, since the total score is calculated as 18 “out of” 10.
Figure 23. Grades page.
User quote:
“I am not sure about the naming conventions and how assignments and
discussions may be represented in the gradebook as different groups and still
be combined into a total value under Assignments. This is highly confusing! ”
Recommendation: We recommend aggregating assignments, discussions, and quizzes
separately instead of grouping them all into one row titled Assignments. This would help
differentiate the redundant information in the row titled Total. It may also be helpful to
rethink how scores are handled when no “out of” value has been entered, so as to prevent the
180% issue.
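The 180% issue can be reproduced with a minimal sketch. The function and the point values below are hypothetical illustrations, not actual Canvas code; the sketch only shows how summing earned points while treating a missing “out of” value as zero inflates the percentage past 100%.

```python
# Hypothetical reconstruction of the scoring bug: an item with no
# "out of" (possible points) value still contributes earned points,
# but contributes nothing to the denominator.
def total_percentage(items):
    """items: list of (earned, possible) tuples; possible may be None
    when no "out of" value was entered for the item."""
    earned = sum(e for e, _ in items)
    possible = sum(p or 0 for _, p in items)  # None treated as 0
    return 100 * earned / possible if possible else 0

# Assignment 1: 8 out of 10; Test Discussion 1: 10 points earned,
# no "out of" entered -- totals 18 "out of" 10, i.e. 180%.
print(total_percentage([(8, 10), (10, None)]))  # 180.0
```

Excluding ungraded-denominator items from the total (or refusing to compute a percentage until every item has an “out of” value) would avoid the inflated figure.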
Comparison of issues found between expert
and user evaluation of Canvas
Expert Review | User Testing
Hidden hover states in Gradebook cells impede workflow | Counterintuitive icons and features in the Gradebook
Inconsistent and confusing label (i.e. Inbox or Conversations or Messages) | Inbox label was confusing to users
Email is not intuitive and does not follow standard conventions | Course emails and personal emails mixed together was confusing
No obvious social media/web services until inside Settings | Difficulty locating social media icons
Allow for standard calendar viewing options | Mute assignment label and location was not intuitive
Provide visual cues for collapsed discussions | Redundant labels in global and course nav caused confusion
Use consistent language in Discussions for replies | Ambiguous terms in Settings for features (i.e. Locked, Save Settings, Re-Publish)
Place related information in relevant proximity within Settings | Lack of system feedback when submitting grades
Provide accurate labels for social media and web services in Settings | Lack of affordance made attaching files challenging
Allow for downloading standard file formats for Google Docs | (no corresponding finding)
Additional Notes
The expert evaluation failed to identify the pop-up dialog box for authorizing a Google Docs
account as a possible issue. However, all of the users had second thoughts about responding to
this pop-up dialog box.
Although users were less disturbed by the placement of the Facebook button under the “Web
Services” section than the expert review predicted, and completely ignored the right panel of the
Settings page that listed ways to add contact methods, they did initially expect to find Facebook
notification options under the Notifications page rather than on the main Settings page.
However, the visual presence of the Facebook icon was enough for users to recognize and
identify the means to connect Canvas to Facebook.
Overall recommendations for Improvement
Poor Location/Lack of Context
● Improve the placement of features and options within the interface to better reflect a
user’s mental model and expectations.
Unclear/Confusing Labels
● Rewrite labels to use the trigger words that users expect (avoid clever inventions),
and provide additional descriptive text as required to clarify purpose and intent.
System Feedback
● Provide clear feedback when actions which alter data are being performed, but avoid
extraneous alerts in other cases.
Standard Conventions
● Rather than reinventing the wheel and expecting users to learn a new set of conventions
for Canvas, make use of standard user interface conventions including terminology and
iconography when users are performing functions that they are familiar with in other
contexts (for example, sending and receiving messages).
Appendix
Appendix A
A.1 Methodology
A.1.1. Test Design
Due to the complex nature of testing two user groups for both qualitative and quantitative data,
our team developed and deployed different data collection strategies based on the user type.
As the instructors were proactive in providing ample feedback while completing tasks, we
conducted their sessions in a participatory manner with more focus on qualitative feedback.
For the students, a mostly quantitative approach was used, as the students were not as candid in
providing feedback that accurately reflected their ability to complete the tasks.
User testing sessions for instructors were conducted by two members of Team Instructure (Steve
Goforth & Christina Dunbar). One team member facilitated while the other collected observation
notes. Each session had a duration of 1 hour and was recorded using Silverback on a MacBook
Pro laptop. Each instructor received a pre-test at the beginning of the session and a post-test
(System Usability Scale) at the end.
The user testing sessions for students were conducted by two members of Team Instructure
(Preethi Srinivas & David Brainer-Banker). One team member facilitated while the other viewed
the recorded videos. Both team members collected observation notes. Each session had a
duration of approximately 20-30 minutes and was recorded using Camtasia on a Windows 7
laptop. Each student received an informed consent statement to sign, a pre-test questionnaire at
the beginning of the session, 2 verbal questions to answer after each task, and a post-test
questionnaire (that included the System Usability Scale) at the end of the session.
A.2 Data Collection & Testing environment
A.2.1 Instructors
The testing environment for in-person sessions
URL Tested: http://iu.instructure.com
Computer Platform: Mac OS X - 10.6.8 (Snow Leopard)
Web Browser: Google Chrome
Screen Resolution: 1440 x 900
Recording Software: Silverback
Information Recorded: Participant’s Screen and Face
Location: Each Instructor’s (Participant) Personal Office
Connection Speed: IUPUI Broadband Wifi (Speed Unknown)
A.2.2 Students
Data was collected both in a controlled environment (USER Lab, Walker Plaza) and in the
users’ natural working environments. The evaluator travelled to the participant’s location with a
laptop loaded with recording and capture tools. The participant was given a pre-test
questionnaire after the evaluator had set up and the participant had signed the informed consent
form. The pre-test questionnaire solicited demographic information along with LMS usage
information and preference questions specific to the tasks that followed. A task script, which
began by thanking the student for participating, was read aloud after the pre-test questionnaire
was filled out. Following this, the student was given an opportunity to view the home page for a
few minutes to understand its visual components.
Once the participant was familiar with the setup of the laptop, web browser, and website, the
evaluator began the test. A task sheet with one task was handed to the participant to read
aloud before beginning the task. Participants were encouraged to think aloud while completing
a task. On completion of the task, the participant answered post-task questions, which asked
them to rate the ease of completing the task on a scale from 1 (very easy) to 7 (very tough).
Following this, a new task sheet with information for the next task was handed to the
participant to read aloud.
The last section of the study involved participants filling out the post-test questionnaire (which
included the System Usability Scale) and a de-briefing session. Participants were given an
opportunity to ask questions and provide more feedback during the de-briefing. At the end of
the de-briefing session, participants were thanked again and given the evaluator’s contact
information for communicating any additional thoughts and concerns about the website or the
study in general.
The testing environment for in-person sessions
URL Tested: http://iu.instructure.com
Computer Platform: Windows 7
Web Browser: Google Chrome
Screen Resolution: 1280 x 720
Recording Software: Camtasia
Information Recorded: Participant’s Screen and Face
Location: USER Lab, Walker Plaza - 3 students; Student’s home - 2 students
Connection Speed: USER Lab, Walker Plaza - IUPUI Broadband Wifi (13.2 Mbps);
Student’s home - 12 Mbps
Appendix B
B.1 Informed consent (Student)
I state that I am over 18 years of age and agree to participate in a usability study focused on
evaluating Canvas by Instructure. Participation includes answering survey questions and
completing tasks using a computer. The questionnaire will involve filling out information related
to the questions asked. I understand that I will be observed while I am using Instructure and will
be interviewed. I understand that the purpose of this observation and interview is to gather data
about how people use Instructure. I also understand that I will be asked to provide demographic
information and that all information collected for the study will be kept confidential and the
results will be anonymous. Any photograph or video artifacts will be used solely for the purpose
of completing the class assignments. I understand that I may withdraw from the testing at any
time with no penalty.
Signature:
_____________________________________________________
Date:
____________
(Adapted from Interaction Design: Beyond Human-Computer Interaction, 2nd Edition)
B.2 Informed Consent (Instructor)
CONSENT FORM
Indiana University, School of Informatics – IUPUI
I543: Usability & Evaluative Methods
Project title: Instructor Evaluation of Canvas by Instructure
DESCRIPTION: This is a study to evaluate a new learning management system that may take the place
of Oncourse. The session will be videotaped for data collection and used solely for academic purposes.
RISKS: There are no known risks associated with participating in the study.
CONFIDENTIALITY: Subjects of this study will participate anonymously. No personal information will
be distributed or shared with anyone outside this research study, unless required by law.
COSTS: There are no costs to you to participate in the study.
VOLUNTARY: Your participation is voluntary; you may choose not to take part or may leave the study
at any time. Leaving the study will not result in any penalty, and your decision whether or not to
participate in this study will not affect your current or future relations with IUPUI.
CONTACTS: For questions about the study, contact the researchers: Steve Goforth
<[email protected]>, Christina Dunbar <[email protected]>.
SUBJECT'S CONSENT: Being part of this study and answering the questions is an acknowledgment that
you understand the nature of the study and have given your permission to participate.
Signature of the participant: _____________________ Date: ________
Thank you for participating in this study!
B.3 Instructor Test Plan
Canvas by Instructure (Instructor) Usability Test Plan
Testing Dates: October 29th-November 2nd
Equipment needed for Test:
Computer with Internet connection
Testing Location: In-person and Remote usability testing
Application to be Tested:
Site for Inspection: canvas.instructure.com
Test Participants:
5 college instructors.
Participants will be pre-selected according to the following criteria:
○ Currently teaching a college course
○ Has experience using a learning management system such as OnCourse or Blackboard
Testing Method:
A summative testing method will be used in this study. During 1:1 sessions, the facilitator will:
○ Open the Canvas by Instructure home page and set the context with the user that it is the first time
entering Canvas
○ Give the participant a few minutes to explore Canvas
○ Begin the first task scenario and observe where the user begins (observe the route they use to
begin the task)
Welcome. Please call 1 (877) ***-****, and enter Conference Code: 317 ***-**** to join the
WebEx session.
Introduction
“Hello, my name is ____. Let me start by saying thank you for agreeing to participate in this assessment
of a new learning management system, Canvas. My goal is not to judge you on the decisions you will
be making, but to see how usable this product is. As a participant, I will ask you to complete a series of
tasks involving the user interface of Canvas. If at any time you are struggling or have any questions
during the test, let me know and I will help as I can. Do you have any questions?”
“Before we start, I’d like you to complete a brief survey. This information will not be shared. This is just
for our internal analysis.”
**Present Pre-Session Survey**
“Excellent! The following digital prototype is still in the design phase. Are you ready to begin?”
**Jump into task one**
Task #1: Finding student work that has been turned in
Let’s say it is the middle of the semester. You have assigned your students to write a paper on bass
fishing. Your student Anthony has asked you to grade his paper early as he will be out of the country the
following day. Please find Anthony’s bass fishing paper and assign it a grade.
Steps:
1 Log in to Canvas
2 Click on Grades
3 Click under the assignment.
4 Type a number that would represent the student’s grade.
Post-Task Questions
1.) How was that process for you?
2.) Did you find any part of that task cumbersome?
Task #2: Muting an assignment that is turned in
One of your students tells you that one of the assignments (Assignment 8) that is six weeks out has
suddenly become available. Please find the assignment and turn student visibility off.
Steps:
1 Click on Christina Dunbar's Practice Course
2 Hover over Assignment 8
3 Click the drop down
4 Click “Mute Assignment”
Post-Task Questions
1.) How was that process for you?
2.) Did you find any part of that task cumbersome?
Task #3: Using SpeedGrader to grade an assignment
Your whole class has finished the assignment. You realize that it would take too much time to do each
student’s assignment one-by-one. Find a faster alternative grading method in Canvas and use it to grade
a student’s work.
Steps:
1 Click on Christina Dunbar's Practice Course
2 Hover over Assignment 2
3 Click “SpeedGrader” on the right side
4 Assign Grade
Post-Task Questions
1.) How was that process for you?
2.) Did you find any part of that task cumbersome?
Task #4: Revising a quiz
It is Sunday and you just realized that you forgot to add a time limit of 30 minutes to this week’s quiz
(Quiz #5). Please find quiz #5 and edit the time limit to equal 30 minutes.
Steps:
1 Hover over Courses & Groups from main navigation
2 Select “Christina Dunbar’s Practice Course” from the dropdown menu
3 Click “Quizzes” on the left side
4 Click on Quiz #5 (hyperlink)
5 Click on “Edit This Quiz” on right side
6 On the right side edit the Time Limit for the quiz
7 Re-Publish Quiz
Post-Task Questions
1.) How was that process for you?
2.) Did you find any part of that task cumbersome?
Task #5: Responding to a message in Conversations
(Inbox)
A student from your course has started an argument with another student on the course messaging page.
Please find the argument and reply to their conversation.
Steps:
1 Click on “Inbox” in the top right hand corner of the page.
2 Select an existing conversation.
3 Type a message in the Message field off to the right
4 Click Send
Post-Task Questions
1.) How was that process for you?
2.) Did you find any part of that task cumbersome?
Task #6: Sending an Announcement
Your students have an exam next week, but they still need to complete their discussion responses for
homework. Please send out a reminder to the class and let them know about their discussion responses.
Steps:
1 Hover over Courses & Groups from main navigation
2 Select “Christina Dunbar’s Practice Course” from the dropdown menu
3 Click “Announcements” on the left side
4 Click on Make an Announcement button
5 Type in a title
6 Type message in text editor
Post-Task Questions
1.) How was that process for you?
2.) Did you find any part of that task cumbersome?
Post-Test Questions
1.) From task 1 (Reviewing student work) through task 6 (sending out an announcement),
how was that overall process for you?
2.) Were there any areas of Canvas that you would change? If so, where?
3.) If you had the choice to use your current learning management system or Canvas, which
one would you pick, and why?
4.) *Take them to:* http://hcibib.org/perlman/question.cgi
5.) Do you have any other questions or suggestions?
B.4 Student Test Plan
Canvas by Instructure (Student) Usability Test Plan
Testing Dates: October 29th-November 2nd
Equipment needed for Test:
Computer with Internet connection
Testing Location: In-person and Remote usability testing
Application to be Tested:
Site for Inspection: iu.instructure.com
Test Participants: 5 university students.
Participants will be pre-selected according to the following criteria:
○ Currently enrolled in a university course
○ Has experience using a learning management system such as OnCourse or Blackboard
Testing Method:
Summative testing method will be used in this study. During 1:1 sessions, the facilitator will:
○ Open the Canvas by Instructure home page and set the context with the user that it is the first time
entering Canvas
○ Give the participant a moment to take in the home screen (but not to click around or start performing
any work)
○ Begin the first task scenario and observe where the user begins (observe the route they use to begin
the task)
Introduction:
“Hello, my name is ____. Let me start by saying thank you for agreeing to participate in this assessment
of a new learning management system, Canvas. My goal is not to judge you on the decisions you will
be making, but to see how usable this product is. As a participant, I will ask you to complete a series of
tasks involving the user interface of Canvas. If at any time you are struggling or have any questions
during the test, let me know and I will help as I can. We will capture video of the screen as you work and
audio of your survey responses and comments. These recordings will be used only for the purpose of
evaluating the application. Do you have any questions?”
“Before we start, I’d like you to complete a brief survey. This information will not be shared. This is just
for our internal analysis.”
**Present Pre-Session Survey**
“Excellent! The following digital prototype is still in the design phase. Are you ready to begin?”
**Jump into task one**
Task #1: Finding a list of all assignments due this week
You have a busy week and want to make sure you don’t let anything slip through the cracks. You want to
know what you have due this week so that you can plan ahead. Find a full list of all of the assignments
that you have coming up for the week.
Steps:
1 Locate and identify the list of assignments due on the right panel listed under To Do and Coming Up
headers
Post-Task Questions
1.) How was that process for you?
2.) Did you find any part of that task cumbersome?
Task #2: Submitting an assignment with Google Docs
You have an assignment due today which you haven’t started on yet. Locate the assignment which is due
today (there will be only one) and submit it using Google Docs. You may log into Google Docs with the
following credentials: (name) / (pass). A sample document is provided in that account to use when
completing these steps.
Steps:
1 Locate the assignment due listed on right panel of homepage under To Do header
2 Click on the link stating the assignment name
3 Click on Submit Assignment button on right panel
4 Select Google Doc tab
5 Select the required Google document by clicking on it
6 Click on Submit Assignment button
Alternative Steps:
1 Hover over Assignments menu item
2 Click on the assignment under the heading “To Turn in”
3 Remaining steps are similar to previous route
Post-Task Questions
1.) How was that process for you?
2.) Did you find any part of that task cumbersome?
Task #3: Checking a grade on a past assignment
You submitted assignment 1 in your class last week, and your professor just informed you that a grade is
already available. Locate and check your grade for that assignment.
Steps:
1 Go to the required course page
2 Click on Grades menu item on left panel
3 Locate and identify the grade for the assignment of interest
Post-Task Questions
1.) How was that process for you?
2.) Did you find any part of that task cumbersome?
Task #4: Sending a note to your instructor
You are going to be out of town on (date) and will miss class. You want to communicate this to your
professor. Use the messaging tools built into Canvas to contact your professor and let them know you will
be gone.
Steps:
1 Click on Inbox link at top right panel
2 Click on the icon at the right end of “To” textbox in New Message panel
3 Click on course of interest from the resulting list
4 Click on “Teacher” list item
5 Click on the required teacher name from the resulting list
6 Click inside the Message textbox and type in the message
7 Click on send button
Post-Task Questions
1.) How was that process for you?
2.) Did you find any part of that task cumbersome?
Task #5: Connecting to Facebook
You’ve been told by a friend that you can receive your grades, updates to assignments and quizzes, and
their corresponding due dates as notifications to your Facebook account. You want to give this a try to
see if it is useful. Register your Facebook account with Canvas to achieve this.
Steps:
1 Click on Settings link on top right page
2 Click on Facebook button listed under “Other Services”
3 Click on Register Your Facebook Account
4 Login to Facebook account
Post-Task Questions
1.) How was that process for you?
2.) Did you find any part of that task cumbersome?
Post-Test Questions
1.) From task 1 through task 5, how was that overall process for you?
2.) Were there any areas of Canvas that you would change? If so, where?
3.) If you had the choice to use your current learning management system or Canvas, which one would
you pick, and why?
4.) *Take them to:* http://hcibib.org/perlman/question.cgi (SUS)
5.) Do you have any other questions or suggestions?
B.5 System Usability Scale
Appendix C
C.1 Instructor quotes
Participant Quotes by Task
The quotes below provide candid feedback from instructors on the user experience for Canvas.
Task 1 -
“It is confusing having the double labels for Assignments. I do not like how far comment is from
the grading area. You could almost overlook it easily.”
Task 2 -
"See I've gone to bifocals in the last year or two. But still when there is a lot of data and I
am looking for something and can't really find it, I get really annoyed."
“Mute to me means turn off the sound.”
“The majority of control options are under the edit assignments, so it [mute assignment] seems
odd that it would not be there.”
“I don’t know definition of what they mean by locked. I do think some faculty need a student
view option. I feel like I am being stomped here I am a fairly intuitive person. I expected the
‘mute assignment’ type of function would be in Edit Assignments not in the Gradebook area.”
"But if it is a class where I am basing our progress on the order I'm trying to do things, I
don't want them to see what I have planned since surprise is a good portion of that class.
There is no way for me to do it other than manually entering it each week. I hated it.
Very difficult to find. Not intuitive at all."
Task 3 -
“ I am confused by the dashed line versus the other icon. You need the Gradebook and the
Speedgrader process to allow both methods for faculty to grade.”
"When I see a button, I am preconditioned to click on it. I want a submit button cause I
don't know how to end this." (voice recognition)
Task 4 -
“It was straightforward and easy.”
“Not sure if I have to re-publish the quiz. When you save settings does it reflect automatically or
do you have to republish to reflect the change? Does editing unpublish it?”
“I don’t understand the difference between Save Settings and Re-Publish Quiz. This is very
distracting having [Assignments] in two places is weird to me.”
"Save Settings = Not finished, not ready for the students to see it... and Re-Publish did
not even process in my head."
"I would replace these buttons with Save Draft and Publish Quiz. Save draft reminds me
that it is a draft and it is not finished."
"I did not complete the task, yet I thought I completed the task. That is the worst thing a
learning management system can do. There is nothing worse."
Task 5 -
"I was looking for something like a chat room, so I went to Discussions."
"So this is the inbox for just the course? Right? Something does seem kind of strange."
“When I see Inbox, it’s like my personal Inbox, not a course inbox. I hated that [the process].”
Task 6 -
"Users must post before seeing replies? I don't get that. How do you get replies unless
you're not posting?" (in regards to options below Announcement).
"This is more of a discussion thread. I don't know if I would use this."
“Announcements with dialogue is too messy.”
“I expect it [announcements] to be one-way. Not two-way. The only option is to save so I am
assuming it is going to save and post it? User must post before seeing replies. Does not know
what that means.”
“I was expecting to save or post. Save does not necessarily mean Post. It makes me feel
like I am missing something.”
"Announcements makes no sense to me in how I am understanding it at this second with
replies."
"At that point, it is a discussion. When I put an announcement out, there is no room for
discussion."
Overall Feedback
“I’d use OnCourse. There may be some degree of familiarity.”
“I think OnCourse should be replaced, but I don’t feel that this LMS makes sense.”
“Obviously I’ve spent years learning and using OnCourse... I don’t like some of it’s features, but
I know where they are. A lot more things were not intuitive to me than were. Would not have
thought Inbox would be for all courses, it is not what I am used to.”
“I think this is just a softer course management system, fuzzier, softer and more user-friendly. I
don’t see I gain a whole lot of advantages in using this compared to OnCourse. I think the biggest
difference is the grading tool.”
C.2 Student quotes
Following are some of the user quotes from student tasks and post-test questionnaire.
Coming up list
“I like the coming up list feature since with OnCourse, I will have to check the syllabus to verify
the assignments that are coming up. Canvas lists them on the homepage, which reduces my
effort. The assignments feature on OnCourse only lists the assignment that is due, but Canvas
seems to be listing the assignments that are due for the entire week.”
“I had difficulties navigating forward and backward from calendar to home page where the
assignments coming up for the week are listed.”
Submitting an assignment using Google Docs feature
“I didn’t have difficulties with this task. The only issue I had was with respect to selecting the
test document before clicking on submit button. I kept missing this small step while turning in
the file.”
Locating and identifying grade for an assignment
“OnCourse does not give overall grade, but Canvas seems to be listing it, which I think is very
good.”
Composing a message
“Although I went into inbox, I was unable to find the place where I can ‘compose’ a message. In
OnCourse, it is difficult for me to find the instructor from a list of users. Further, I am required to
ctrl+click for selecting multiple users. I think the feature that automatically populates the name
of the instructor as I start to type his/her name is interesting.”
“It will be better if they have a link named Message Tool or something similar on the left panel”
Connecting Canvas to Facebook
“I would add an item at top right corner named FaceBook since I feel many people use
Facebook. So there should be an easier way to setup the notification process.”
“ If I were a first-time user, finding the place to setup my Facebook account might have been
difficult. I was able to do this task since I noticed it while doing the previous task.”
“It is interesting there was no pop-up message that I am leaving Canvas while logging into my
FaceBook account.”
“Once I figured I should look into Settings, I was able to see Facebook as a option to link with
Canvas. I am a frequent user of Facebook and really like this idea since I can see my grades or
assignments due without logging into a LMS account.”
Quotes from post-test questionnaire
“I think a first-time user might spend more time exploring this interface.”
“I would prefer OnCourse since it is more evolved. You don't have to keep looking for things. I
do like the recent activity and connecting to Facebook concept. However, I feel the system needs
to be more simple and straightforward.”
“I am more familiar with OnCourse. May be if I am given more time, I would be able to clearly
say which one I would pick.”
“I feel things can get little tricky when there are multiple courses. I am not sure how things will
work then. I would love to see a simple and cleaner interface. Some text is sized differently and
it is difficult to read them from a distance. I have to get closer to the screen to read such text.”
“Canvas looks good. I can check assignments that are coming up, it gives me my overall grade, I
can get notifications on FB, and I can turn in a Google Doc instead of saving the file to a local
hard drive and then uploading it back while turning it in.”