Cane Success -
Usability Testing Report
A usability testing study evaluating the effectiveness of a new, resource-heavy student success website.
PROJECT SCOPE
- Team: Joaddan Reme, Arielle Asare, Tirsa Fernandez, Neon Cao, Justin Jacobson
- My Role: UX Researcher, Usability Testing Coordinator
- Timeline: 2 weeks
- Toolkit: Figma, Google Suite, Microsoft Teams, Zoom, Excel
- Methods: Generative Research, Usability Testing, Wireframing + Prototyping

Executive Summary
The purpose of this usability testing study was to evaluate the effectiveness, clarity, and navigability of the University of Miami Cane Success website. Our goal was to evaluate how well the site enables students to find resources, connect with support staff, and resolve academic and professional concerns independently.
Methodology
Participants: 5 current UM students.
Format: Moderated 1:1 usability sessions.
Protocol: 5 scenario-based tasks using the think-aloud method.
Post-task & post-session surveys included SEQ and SUS questionnaires.
Data Analysis: Completion rates, time-on-task, satisfaction scores, and qualitative feedback were aggregated and analyzed.
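The aggregation step above can be sketched as follows. This is a minimal illustration with hypothetical session records, not our actual analysis workbook:

```python
# Illustrative aggregation of per-session task records (hypothetical data).
# Each record: (participant, task, completed, time_on_task_seconds).
records = [
    ("P1", "Task 1", True, 95),
    ("P1", "Task 2", False, 180),
    ("P2", "Task 1", False, 120),
    ("P2", "Task 2", True, 60),
]

# Overall task success rate: completed attempts / total attempts.
completed = sum(1 for _, _, done, _ in records if done)
success_rate = completed / len(records)

# Average time on task across all attempts, in seconds.
avg_time = sum(t for *_, t in records) / len(records)

print(f"Success rate: {success_rate:.0%}, average time: {avg_time:.1f}s")
```

In our study, the same roll-up was done per task as well as overall, alongside the SEQ and SUS scores described later in this report.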
Key Findings
• 19 usability issues identified across the tested flows
• Average SUS Score: 46/100
• Task Success Rate: 48% overall
• Average Time on Task: 1 minute 43 seconds
• Common Issues:
  - Confusing layout on resource-heavy pages.
  - Inconsistent labeling and unclear pathways to support on key pages.
  - Low perceived usability overall.
This Report Includes:
- Usability issues prioritized by severity (based on the Albert & Tullis scale).
- Supporting metrics for each task.
- Actionable, evidence-based recommendations to improve usability, clarity, and student engagement.
Objectives
As we conducted our testing sessions, we aimed primarily to uncover pain points, but also to understand students' current mental models of a student success website to better inform our recommendations. In conjunction with the website's goals, we outlined the following key goals and objectives to guide our study:
Website Goals
- Improve students' information literacy.
- Facilitate independent problem-solving skills for students.
- Emphasize the teams' support roles for students and parents.
Our Goals
- Uncover core usability challenges that hinder website performance.
- Evaluate task flows and user behaviors to highlight patterns that affect overall satisfaction.
- Provide key, actionable, user-centered recommendations.
Methodology
Procedure
To conduct the usability test, our team developed five tasks intended to simulate common actions that continuing students might perform on the university website. Participants were recruited through convenience sampling, with inclusion criteria specifying that individuals must be continuing students at the University of Miami.
- A total of five participants were recruited for the study. Sessions were conducted in person and recorded via Zoom, and session notes were automatically transcribed for later analysis.
- At the task level, data collected included task completion, time spent on each task (measured in seconds), and a single-item measure assessing perceived ease of task completion.
- At the website level, data collection included the System Usability Scale (SUS) to assess overall usability, as well as a brief semi-structured interview to gather participants' subjective feedback on the website's strengths and areas for improvement.
Usability Testing Setup
- Each participant gave informed consent to participate in the study and for de-identified data to be collected, including a recording of the session, any verbal comments they made, and answers to survey questions.
- Each participant was then presented with realistic scenarios and instructed to use the website to complete the corresponding tasks. Participants were encouraged to think aloud as they worked through the tasks.
- Raw data from the sessions was then cleaned, aggregated, and analyzed to develop the insights in this report.
Participant Outreach and Screening
Five current University of Miami students were selected as participants. Our ideal candidates were undergraduate students encountering the new advising system and website for the first time in their academic journey at UM. Aside from basic technology experience, English proficiency, and student status, demographic factors such as age, gender, ethnicity, and device usage were collected but could vary.
Our ideal candidate would fit the following user profile:
- Age: 18-30
- Educational Level: Undergraduate continuing students
- Spoken English Proficiency: Basic
- Technology Experience: Basic
Participant Demographics
The following data was collected from our demographic questionnaire, administered before the usability testing portion.
Tasks
Task 1: Find your academic support person.
- Using the Cane Success website, determine the title of the support person responsible for answering a question about whether a certain class will count for needed credit.
Task 2: Find the Student Journey Map.
- You want to get an idea of who the Cane Success Advisors are and what they do to help students along their educational journey. Find the Student Journey Map.
Task 3: Schedule a virtual appointment.
- You want to speak with someone to verify progress toward graduation. Find where to schedule a virtual appointment with the appropriate advisor.
Task 4: Request academic accommodation.
- Find how you would get to the Office of Disability Services (ODS) website.
Task 5: Leave, Withdrawal, and Readmission procedures.
- Using the website, find information about the process for taking a leave of absence.
Equipment
- Laptops/Electronic Devices: To navigate the website and take notes.
- Zoom: To record participants' screens and generate transcripts.
- Digital Stopwatch: To record time-on-task for each participant.
- Google Drive: Served as our file and workbook repository.
- Excel: Used to collect and organize data.
- Moderator Packet: Used by the moderator to oversee the testing session.
- Observer Packet: Used to collect the same information as the moderator, excluding the scripts.
- Participant Packet: Used by the participant to complete necessary tasks and questionnaires.
Task Metrics
The following task metrics were collected and analyzed in this report:
- Number of participants who were able to complete each task (Effectiveness)
  - Used to understand how successfully participants completed tasks based on a predetermined optimal task-flow, informing overall website effectiveness.
- Time needed to complete each task (Efficiency)
  - Collected from when the participant verbally started a task until they ended it, informing the website's efficiency in navigation and layout.
- SEQ Scores
  - Used to measure a user's perception of a task's difficulty immediately after each task. Based on a 7-point Likert scale, with 1 = Very difficult and 7 = Very easy.
- SUS Scores
  - Used to measure a user's perception of the overall website's usability once the usability session is complete. Based on a 10-question questionnaire with alternating positive and negative statements, each rated on a 5-point scale.
- Participant comments, written on questionnaires and/or spoken aloud
  - Specifically noting usability problems.
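For reference, SUS scores such as the 46/100 reported earlier are computed with the standard SUS scoring formula. The sketch below illustrates that formula; the response values shown are hypothetical, not data from our sessions:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5
    Likert responses on the standard alternating-item questionnaire."""
    assert len(responses) == 10, "SUS requires exactly 10 responses"
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded: contribution = r - 1.
        # Even-numbered items are negatively worded: contribution = 5 - r.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to 0-100

# Hypothetical examples: all-neutral answers yield 50; the best possible
# pattern (5 on positive items, 1 on negative items) yields 100.
print(sus_score([3] * 10))
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))
```

Scores below 68 are commonly treated as below-average usability, which puts the site's 46/100 well into problem territory.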
To further supplement our qualitative insights and feedback, we conducted the following post-study interview at the end of each usability testing session to collect data on participants' experience with Cane Success:
- Overall, on a scale of 1 to 10, how would you describe your experience using the Cane Success website (1 being the worst, 10 being the best)?
- Were there any parts of the website that you think could be improved?
- Was there anything you found helpful or well-designed?
- Is there anything you expected to find but didn't?
- Do you have any suggestions for how the website could be improved for students like yourself?
In addition to the previous metrics, we rated the severity of each identified usability issue using the Albert and Tullis severity scale (Tullis & Albert, 2008). Severity ratings were determined along two dimensions:
- Impact on user experience: how significantly the issue interfered with a user's ability to complete a task.
- Frequency of occurrence: how often the issue was observed.
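One way to operationalize these two dimensions is a simple lookup that combines an impact level with the fraction of participants affected. The thresholds and labels below are illustrative assumptions for demonstration, not the exact cutoffs of the Tullis & Albert scale:

```python
def severity(impact, frequency):
    """Illustrative severity rating combining impact (1=low .. 3=high)
    with frequency (fraction of participants affected, 0.0-1.0).
    Thresholds are assumptions chosen for demonstration."""
    # Bucket the observed frequency into a 1-3 level.
    if frequency >= 0.5:
        freq_level = 3
    elif frequency >= 0.25:
        freq_level = 2
    else:
        freq_level = 1
    score = impact + freq_level  # combined score ranges 2-6
    if score >= 6:
        return "Critical"
    if score >= 5:
        return "High"
    if score >= 4:
        return "Medium"
    return "Low"

# Hypothetical issue seen by 4 of 5 participants with high impact:
print(severity(3, 0.8))
```

In practice we assigned ratings by judgment against the published scale rather than a formula, but a rubric like this keeps ratings consistent across raters.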
Results
Overall, the Cane Success website has critical usability problems, as evidenced by task success, overall perception, and time on task. The website excelled at key usability heuristics such as Aesthetic and Minimalist Design, but there are clear opportunities for improvement in navigation and functionality.
Task Completion Rates
Key Takeaways:
- Tasks based on connecting with the correct support person had low success rates.
- Tasks based on finding information on the site had higher success rates.
- Finding the Student Journey Map was the task with the lowest success rate.

Time on Task
Key Takeaways:
- Tasks based on connecting with a support person tended to take longer.
- Tasks based on finding resources were completed, on average, in less time.
- Finding the Student Journey Map took the longest on average.
Note: Maximum time allotted per task: 180 seconds (3 minutes).

Satisfaction Scores
Key Takeaways:
- Overall, participants subjectively rated most tasks as relatively easy.
- Finding the Student Journey Map was perceived as the most difficult task.
- There was some discrepancy with Task 1, which was rated the easiest but had a low success rate.
  - This may indicate that users thought they were completing the task but were mistaken about the title of the support person.
NOTE:
Single Ease Question (SEQ): "Overall, how difficult or easy did you find this task?"
1 = Very difficult and 7 = Very easy


Keepers
Despite the usability problems we identified, the Cane Success website performed well in visual design, student graphics, and primary navigation options. More specifically, the following elements stood out during testing:
- Visual Design: The site aligns with UM's aesthetic and brand identity while maintaining a professional look, making the interface visually familiar to students.
- Student Journey Map: Users appreciated the Student Journey Map as a tool for understanding support roles and how they fit into a student's personal academic journey.
- Student Categories: The navigation bar has labeled categories for different student types (e.g., Incoming Students, Transfer Students, and Continuing Students), reinforcing the site's goal of personalizing the student experience. This structure aligns well with user expectations.
- Make Appointment Button: Participants appreciated the Make Appointment button as a primary call to action, stating that connecting with an advisor is the most likely reason to use the website.
- Search Bar: Users naturally turned to the search bar first as a starting point for navigation. Its familiarity mirrors broader web usability norms, making it a key feature users can rely on.
Usability Problems Identified
We found a total of 19 problems (N=19) throughout our usability testing sessions, all of which are outlined in our full usability problem matrix. This matrix is meant to visualize the most critical issues based on severity and impact.
A brief overview of the top five problems is outlined here, highlighting the following overarching issues throughout the website:
- All users commented on the poor visual hierarchy across the entire website, suggesting a need for restructuring content blocks.
- The most critical issues are tied to navigation and labeling.
- These findings set the stage for our recommendations.

Voices
“I think maybe the website struggles with a little bit of like organization. So, even though there is a search bar. Some of the information was a little hard for me to find, because you would type in something into the search bar, and there would be a suggested on results. What you're looking for doesn't exactly pop up there, so it can be overwhelming sometimes.” -P2
“It's not really about what the portal tells you to do — it should focus more on helping you reach your actual goal. Like, if I need to talk to someone, just show me a few advisor names or how to find them. Don't just list deadlines — we don’t care about that. What we really need is something like a calendar. We're not here to dig through a bunch of text trying to find policies or deadlines.” -P5
"I feel like if we already have this bunch of buttons here, it would be better if we could just expand on them and include as much information as possible right on the homepage. Maybe organize it more like a chapter system or sections—it would make things a lot easier to find." -P4
Redesign
To aid the official Cane Success team in their improvements, we reworked the sitemap and redesigned key critical pages based on our test findings.
Recommendations
To make Cane Success more user-friendly, we recommend the following practical improvement plan:
- Reorganize the site's information architecture
  - Use methods such as open card sorting to better align with how students naturally think and search for support.
  - Uncover how users group and label content to inform clearer navigation structures and better page organization.
- Enhance navigation consistency
  - Ensure menus and links are predictable and easy to follow across the site.
- Simplify the homepage
  - Focus on task-oriented calls-to-action that guide students quickly to what they need.
- Refine search functionality
  - Make it easier for students to find specific resources or information.
- Use more intuitive labels for advisor roles
  - Help students easily understand who they should reach out to for support.
Appendices
For a closer look at our team files, click the links below:




