UT PROVOST

This image compares the functionalities provided by various portals across UT and the other two universities, and how those functionalities fare against each other in terms of the hierarchical organization of the tools within each portal and the number of clicks needed to perform the desired task. The tools examined in this study are those provided by UT Direct.

GOAL

A usability assessment of the process by which students at the University of Texas register for classes. The UT Provost (client) is interested specifically in which systems students use, how well those systems integrate, and how to improve the process.

CLIENT KICK-OFF

The UT Provost team addressed the class in a one-hour meeting, providing a briefing on the problem statement and the overall process. A list of relevant documents was provided, along with all the data collected from previous studies, for reference. Links to access all the registration tools in a sandbox environment were also provided.

REGISTRATION PROCESS 

Registration is the process by which students secure the classes they need each semester to complete their degrees. Selecting the wrong classes or not being able to enroll in the classes a student has chosen means the student takes on more loans and enters the workforce later, adding to their lifetime student debt. As a result, registering for classes is a very high-stress activity for many students.

  • Class selection

    • How do I know which classes are available?

    • Which classes do I want to take?

    • How do I know which I need to take?

  • Schedule design

    • How will these classes fit around my work schedule?

    • Can I juggle all the classes I need given time constraints and locations?

  • Can I take a class?

    • Am I barred from registration, for example for non-payment of a bill?

    • Am I allowed to take the class? Do I have the necessary prerequisites or do they only allow a certain type of student, like students with a certain major?

  • Registration time

    • When can I register?

    • Can I register fast enough to get the class I need before other people?

  • Registering

    • How do I register?

    • If I don’t get the class I need, can I get on the waitlist?

TOOLS / PORTALS

The following registration tools were chosen by the UT Provost Office for the usability study:

  1. Course Schedule

  2. Registration Information Sheet

  3. Register Courses 

  4. Waitlist

  5. Degree Audit

  6. UT Planner

  7. CV & Syllabi

TYPE

A semester-long group project for the Usability class, done in a group of 5.

TIMELINE
Gantt Chart.png
AFFINITY DIAGRAM
ADVISOR INTERVIEW

An "Advisor" is a person who works closely with students throughout their degree program at the University of Texas (and other universities), guiding and advising them on their short-term and long-term goals and plans. Each student is assigned an advisor upon admission, based on their department/program.

To give the class a taste of how to conduct interviews, the professor, Eric Nordquist, arranged an interview with an Advisor (who shall remain anonymous throughout the scope of this project) from one of the departments at the University of Texas.

The interview was conducted in class as a voluntary activity. It lasted an hour, with Eric serving as the moderator and each student acting as a note-taker. The session was not audio or video recorded.

Click here to access the Moderator Test Script and Interview Notes.

HEURISTIC EVALUATION

Each of the tools was evaluated against Jakob Nielsen's 10 usability principles (Nielsen Norman Group, 2019) as heuristics for a usability inspection. The 10 heuristics are:

  • Visibility of System Status (VSS)

  • Match Between System and the Real World (MSR)

  • User Control and Freedom (UCF)

  • Consistency and Standards (CAS)

  • Error Prevention (EPR)

  • Recognition Rather than Recall (RRR)

  • Flexibility and Efficiency of Use (FEU)

  • Aesthetic and Minimalist Design (AMD)

  • Help Users Recognize, Diagnose, and Recover from Errors (RDR)

  • Help and Documentation (HAD)

Each tool was inspected thoroughly against each of these heuristics and a detailed report was generated. Each problem encountered was assigned a severity rating, which helps prioritize the order in which problems are addressed. The severity scale used is given below:

  1. Very Low

  2. Low to Fair

  3. Average

  4. Severe

  5. Catastrophic

Additionally, each problem found was coded based on the tool and the heuristic type. Screenshots of the tool interface were attached in the report with the problem areas highlighted and the code provided. A table listing the number of problems per heuristic for each tool was also generated.
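The coding scheme above (each problem coded by tool, heuristic, and severity) lends itself to simple tabulation. A hypothetical sketch in Python — the tool names and heuristic codes follow the study's scheme, but the individual findings listed here are invented for illustration:

```python
from collections import Counter

# Hypothetical coded findings: (tool, heuristic code, severity 1-5).
findings = [
    ("Course Schedule", "VSS", 3),
    ("Course Schedule", "CAS", 2),
    ("Waitlist", "VSS", 4),
    ("Waitlist", "RRR", 2),
    ("Degree Audit", "HAD", 5),
]

# Number of problems per heuristic for each tool (the summary table).
problems_per_heuristic = Counter(
    (tool, heuristic) for tool, heuristic, _ in findings
)

# Severity prioritizes the order in which problems are addressed:
# highest-severity problems come first.
for tool, heuristic, severity in sorted(findings, key=lambda f: -f[2]):
    print(f"[severity {severity}] {tool} / {heuristic}")
```

Sorting by descending severity surfaces the "catastrophic" problems first, which matches how the report used the scale to prioritize fixes.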

Click here to access the Heuristic Evaluation Report.

COMPETITIVE ANALYSIS

UT Provost chose 7 tools for us to evaluate in the usability study. The reasons behind the choice were the popularity of each tool among students and teachers and the importance of the information it presents. The advisor interview and a few of the survey responses helped us identify other portals provided by UT Provost, as well as some third-party applications widely popular among students at the University. We also chose to look at the registration portals provided by other universities, namely Texas A&M and the University of Michigan. The choice of universities was based solely on ease of access to their portals, given our contacts with students at those universities. The findings from the analysis are summarised in the images below.


The image compares the registration portals offered by UT Provost to those offered by Texas A&M and the University of Michigan. The ease of navigating each portal, and of hopping from one functional area to another, is color-coded green, yellow, or red.

SURVEY

Each group in the class brainstormed a list of questions for the survey and posted it on a shared Google document. The professor then compiled a comprehensive survey that was sent out to various university social media groups and also forwarded by students. The response was slightly underwhelming and consisted mostly of graduate students, which was disappointing since UT Provost wanted the focus to be on undergraduate students.

Nonetheless, since all but one of the tools are available to both graduate and undergraduate students, the Provost Office gave us the green light to analyze and present the collected survey data. The following pictures show some analysis of the data collected from the survey.

Click here to access the Survey Report.

INTERVIEW

Each interview session lasted between 45 minutes and an hour. Each session was broken into parts:

  • Briefing: the goal of the interview was explained to the participant and the participant was made duly aware of their rights during and after the session.

  • Consent: the participant signed a consent form allowing us to audio and video record the session. The recordings were used to verify that our notes were accurate and that no important information was missed; they were destroyed after all the information was coded.

  • Ice breaker: the participant was asked general background questions to ease them into the process and gather some relevant background information.

  • User Journey: this part of the interview was a free-flowing conversation in which the participant helped us understand their journey through the registration process. The process was broken down into three phases: Prior to Registration, During Registration, and Post Registration. A whiteboard and marker were used to map out the user journey, and the goal was to get a more concrete, practical understanding of the planning, strategies, processes, tools, etc. that participants used to register for courses.

  • Usability Testing: this is where the participants got hands-on with the registration experience. The participant was given a scenario, a laptop, scratch paper, and a pen, and was asked to perform the task described in the scenario. A sandbox environment for all the tools was set up in advance so participants could do a dummy registration for courses. Some participants preferred demonstrating the process using their own student accounts, which worked just as well.
    Minimal help was given to the participants during the tasks, and they were asked to "think out loud" while performing them. Each task was evaluated as a "success" or "failure", with any help/hints provided or bottleneck situations recorded.

  • Between-Task Survey: each task was followed by a small survey designed to assess the participant's satisfaction with the tool.

  • System Usability Scale (SUS): participants were asked to rate the entire registration system/process on various features on a scale of 1 (Strongly Disagree) to 5 (Strongly Agree). This helped us gain a quick and more quantified understanding of how each participant felt about the system as a whole.
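The SUS ratings can be converted to the standard 0–100 score: odd-numbered (positively worded) items contribute (response − 1), even-numbered (negatively worded) items contribute (5 − response), and the sum is multiplied by 2.5. A minimal sketch in Python — the function name is ours, and the formula is the standard SUS scoring rather than code from this project:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The sum is multiplied by 2.5 to yield a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# Example: a fairly positive set of responses
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # 80.0
```

Note the resulting score is not a percentage; it is conventionally interpreted against published benchmarks (a score around 68 is often cited as average).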


Click here to access the complete Interview Packet.

CODING THE INTERVIEWS

All data collected from the interviews (including the recordings) was coded for analysis in Google Sheets. Participants' identities were not linked to the data; each participant was given a code/ID. Relevant behavioral data, such as body language and gestures, was also captured and used in the analysis.

Click here to access the Codebook.

ANALYSIS AND PRESENTATION

The data collected from the survey and the interviews, together with the previous data provided by the UT Provost office, were combined to generate our final analysis. The team met several times over the course of two weeks to go through each interview (each team member analyzed one participant's interview, and whoever finished first took on the remaining sessions). The survey analysis was done as a group, and in the final week we combined our observations and brainstormed how to present our analysis. The following images show how students feel throughout their registration journey.

Click here to access the Presentation.
