My Role & Responsibilities

UX Researcher

  • Conduct a heuristic evaluation and compile the findings into an actionable report.

  • Design and distribute a survey, and analyze the collected data along with other teams.

  • Moderate generative interviews and usability testing.

  • Create a codebook for the qualitative data collected.

  • Compile the findings across all methods into an actionable report and share it with the stakeholders.

  • Present the findings to the stakeholders in a simple and easily digestible format.

Project

  • 12 weeks

  • Group Project

  • 5 Team Members

  • Qualitative UX Research

  • Usability Testing

  • Constraints: Time, Money & Research Participants

The institution has the nation's seventh-largest single-campus enrollment, with over 50,000 undergraduate and graduate students and over 24,000 faculty and staff.

The Office of the Executive Vice President and Provost at the University of Texas at Austin works closely with campus leadership and stakeholders to support a broad portfolio of units that deliver a world-class educational experience and produce high-impact research and scholarship. The Provost's office is organized into seven portfolios; this project focuses on the Enrollment Management & Student Success division, which is responsible for admissions, financial aid, enrollment analytics, student success initiatives, and more. Most of the student-facing portals and websites used for these processes had not been updated or upgraded from a UX perspective, and there was little to no relevant qualitative research to inform a product redesign. Before the redesign of the suite of products could begin, the UT Provost office reached out to Dr. Eric Nordquist to help conduct user-centered research that could feed into the (outsourced) design and development process to follow.

Target Users

The university is one of the most diverse in the United States, hosting thousands of students from around the world each year. The figures below represent the diversity of UT Austin students; each student uses the suite of registration portals several times over the course of their degree program.

No. of Students: 52,000

No. of Countries: 123

No. of Schools: 18

No. of Degree Programs: 300

 

Research Statement & Goals

We wanted to better understand how users interact with the different portals and tools provided by the UT Provost office for registration and degree progress planning, and how that experience influences their decision-making. The aim was to unearth gaps and existing problems with the portals and the overall experience, and to identify and prioritize the portals that most need improvement.

 

From the client kickoff meeting and a discussion session between the team and the Professor, we narrowed down our research goals to the following:

  1. Discover users' current workflows and decision-making when using the registration and degree-planning tools, and how they feel about the overall experience.

  2. Learn about users' pain points, frustrations, and barriers to using the tools, and how they would improve them.

  3. Find out the problems users face that interfere with their graduation plans and how the tools can be used to address them.

  4. Uncover the other tools users rely on to achieve their goals around the registration process, and their experience with those tools. Understand what these tools do better and how they can serve as inspiration for the design of the university-provided tools.

  5. Evaluate how users currently interact with the existing tools and use them for academic success.

 

The client provided us with data from previous studies for reference (which cannot be disclosed here) and sandbox accounts for all the tools so that we could conduct usability testing.

Timeline

Project timeline (Gantt chart)

Research Methodology

Limited resources - time, money (none), and manpower - and the limited availability of users to participate in the study shaped the research methodology, among other factors.

Research methodology overview
 

Analysis & Hypotheses

To make sense of the data the client had previously collected on student enrollment, graduation track, and completion time (none of which I am at liberty to disclose here), we used the raw data and the accompanying analysis report to brainstorm a path for the UX research ahead. As a team, together with the Professor, we bounced ideas around using sticky notes and an easel pad to create an affinity diagram, which we then digitized in Mural.

Challenges: 

  • We wanted to go the regression analysis route to explore how successful graduation within 2 years for Master's students and 4 years for undergraduates depends on a number of different variables (a sketch of the intended analysis follows this list).

  • We were not allowed to conduct any analysis or manipulation on the provided data, so we instead worked from the quantitative data and the existing analysis report using the affinity diagram approach.
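Had we been permitted to analyze the raw records, the analysis we had in mind would have looked roughly like the sketch below: a logistic regression of on-time graduation on candidate predictors. This is only an illustration; the file and column names (enrollment_history.csv, on_time_grad, credits_per_semester, registration_holds, waitlist_count, transfer_credits) are hypothetical stand-ins, not the client's actual schema.

```python
# Hedged sketch: logistic regression of on-time graduation on candidate predictors.
# All file and column names are hypothetical stand-ins for the client's data.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("enrollment_history.csv")  # hypothetical per-student extract

# on_time_grad: 1 if the student graduated within 2 years (Master's) or 4 years (undergrad)
model = smf.logit(
    "on_time_grad ~ credits_per_semester + registration_holds + waitlist_count + transfer_credits",
    data=students,
).fit()

print(model.summary())  # coefficient signs hint at which variables move the odds of on-time graduation
```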


Analysis of the provided data, combined with the client requirements presentation from the kick-off, helped us frame hypotheses that we planned to test as part of our ensuing research. We took these hypotheses to the client and stakeholders to go over them and narrow down the ones we absolutely wanted to focus on for the course of this research:

  1. The process of registration is highly time-sensitive, and the tools provided are not competent enough to help users complete the desired tasks in a time-efficient fashion.

  2. The suite of registration portals ranks low on the user satisfaction scale.

  3. Users view the process of registration as a stressful and difficult task.

  4. Disruptions or failure in registering for classes have thrown students off their graduation timeline.

Heuristic Evaluation

For a more holistic analysis and evaluation of the suite of tools users interact with during each registration cycle, we conducted a heuristic evaluation. To keep things simple and consistent, we relied on the trusted Nielsen Norman 10 usability heuristics to evaluate the portals. Three team members (including myself) conducted the heuristic evaluation on the product suite and compiled an actionable evaluation report, ensuring the review was thorough.


Staying true to the client's requirement of prioritizing the tools for improvement, we rated the problems we found on a 5-point severity scale (1: very low severity; 5: catastrophic).

Link to the Heuristic Evaluation Report

Competitive Analysis

The other two team members (not involved in the heuristic evaluation) conducted a competitive analysis to identify areas for improvement and draw inspiration from the registration tools offered by other universities, as well as from the third-party tools that UT students use in place of some of the UT-offered tools (as identified in previously conducted research).

Challenges:

  • Accessing tools offered by other universities is not simple.

  • Most friends at other universities were apprehensive about sharing their credentials, so we took a contextual-inquiry approach instead: we observed them using these tools, took notes, and asked questions at the end.

Link to the Competitive Analysis report


Survey

The expert review and competitive analysis put the breadth and depth of the product suite into perspective. With limited time and resources on our hands, we designed and launched a survey to gather quantitative data that could be used to test some of the project's hypotheses. Compared to the last survey conducted by the UT Provost office, ours focused more on user-experience questions than on satisfaction ratings. The survey also helped us cover more ground in terms of user diversity, which we used to our advantage when making decisions later in the process. We distributed the survey on the University's social media platforms and asked professors to share it with students in their classes.

Link to the Survey report

 

76.9% of Junior & Senior students enrolled at the University of Texas at Austin agreed that not getting into a class they needed could affect their graduation plans and timeline.  
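Breakdowns like this came from simple cross-tabulations of the survey export. The sketch below shows the kind of computation involved, assuming a CSV export; the file and column names (registration_survey.csv, class_standing, affects_graduation) are hypothetical.

```python
# Hedged sketch: agreement rate by class standing from a survey export.
# File and column names are hypothetical, not the actual Qualtrics export schema.
import pandas as pd

responses = pd.read_csv("registration_survey.csv")

# Treat "Agree" / "Strongly agree" as agreement and compute the share per class standing.
responses["agrees"] = responses["affects_graduation"].isin(["Agree", "Strongly agree"])
agreement = responses.groupby("class_standing")["agrees"].mean().mul(100).round(1)

print(agreement)  # the Junior/Senior rows correspond to the 76.9% figure quoted above
```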

Recruitment

We recruited a mix of graduate and undergraduate students across genders, nationalities, ethnicities, first languages, and study programs. A screener was designed in Qualtrics, distributed across the University's social media platforms, and shared with friends, colleagues, and professors for recruitment. Each team member chose to serve as a moderator or notetaker for a session based on their skills and availability, and scheduling was managed through Google Sheets. Because of the limited resources (no money) at our disposal, we incentivized the students who came in for an interview by treating them to a spread of snacks, baked goods, and coffee.

 

We also interviewed two student advisors from two of the biggest departments in the University. Even though the client specifically asked us to approach the project from the students' (primary users') point of view, we interviewed the advisors because they are the primary contact and first responders for students whenever there is an issue, conflict, or confusion during any part of the registration process. The interviews helped us identify themes in the process and the more common problems associated with each.

Registration themes identified from the advisor interviews
 

One-on-One User Sessions

Each user session lasted about 1 to 1.5 hours and was broken into three parts.

Generative Interview

The first part of the session focused on general questions - the participants' educational and demographic background, and their familiarity and experience with the registration process and the tools/portals they use to accomplish its tasks. This part was designed to help users ease into the session and to give them background on our goals for the project and the one-on-one user sessions.

According to the survey, 75% of freshman and sophomore students disagreed with the statement that the registration process was easy.

User Journey

This section was designed after digesting the information gathered in the advisor interviews. Based on those findings, we divided the user journey into three broad phases -

  1. Pre Registration (Planning)

  2. During Registration (Process) 

  3. Post Registration (Aftermath)

and asked them to walk us through their journey and process during these phases.

As participants described their journey, we prompted them with questions that helped us understand things like: how they know when registration begins, how they prepare for the process, what they do and go through during the process, which tools they use to accomplish tasks in each phase, who the different people involved are, and what the common hurdles, pain points, and frustrations are around the different phases, tools, and people involved.

Usability Testing

The last part of the session was unmoderated usability testing. Each task was followed by an in-between-task survey, and the session concluded with a SUS survey. The tasks were designed to replicate the six major registration tasks that users accomplish during each registration cycle (pre, during, and post registration). Because registration is a time-sensitive, confidential process pre-scheduled for certain weeks of the year, we provided the students with sandbox environments for each tool. A contextual inquiry would have been the best course of action in this case and would have yielded better UX insights, but since that was not feasible, we tried to recreate the environment in which users interact with these tools as much as possible by:

  1. Ensuring that users performed the usability tasks on the same or a similar device as they usually would in real life.

  2. Conducting the usability tests across the university in libraries, lounges, cafeterias, and school premises depending upon the user’s preferences and general working conditions.

  3. Providing users with most of the tools they mentioned using during the user journey exercise - notebook, pen, paper, stickies, etc.

Link to the User Session script (Interview + User Journey + Usability Testing)


The in-between-task survey asked participants to provide a satisfaction rating for various aspects of the tool they used to accomplish the task.


Participants filled out a SUS survey at the end of each session.
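For reference, SUS responses are scored with the standard formula: odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response, and the sum is multiplied by 2.5 to land on a 0-100 scale. A minimal sketch with an illustrative input, not our actual data:

```python
# Hedged sketch: standard SUS scoring for one participant's 10 responses (each on a 1-5 scale).
def sus_score(responses):
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd (positively worded) items contribute r - 1; even (negatively worded) items contribute 5 - r.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Illustrative example, not a real participant's data
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```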

We created a codebook, which we used to log the notes taken after each session.

 

Analysis and Synthesis

We started by feeding the textual information collected during the interviews into a simple natural language processing pipeline (coded by myself), using tokenization and entity recognition to get a head start on identifying themes in the data. We used the results to help create an affinity diagram, which served as a baseline for the narrative of our presentation.
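The original script is not shared here; the sketch below shows the kind of pipeline described above, assuming spaCy with its small English model is installed. The transcripts list is a hypothetical stand-in for the interview text we fed in.

```python
# Hedged sketch: tokenization, entity recognition, and word-frequency counts over interview transcripts.
# Assumes spaCy and the en_core_web_sm model are installed; `transcripts` is a hypothetical input.
from collections import Counter
import spacy

nlp = spacy.load("en_core_web_sm")
transcripts = [
    "I woke up early in the morning on registration day and still got waitlisted for two classes.",
    # ... remaining interview transcripts ...
]

word_counts = Counter()
entity_counts = Counter()
for doc in nlp.pipe(transcripts):
    # Count content words (skipping stop words and punctuation) to surface recurring themes.
    word_counts.update(
        tok.lemma_.lower() for tok in doc if tok.is_alpha and not tok.is_stop
    )
    # Named entities (dates, organizations, tools) give another cut at recurring topics.
    entity_counts.update(ent.text for ent in doc.ents)

print(word_counts.most_common(20))
print(entity_counts.most_common(10))
```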


The word frequencies and the findings from the affinity diagram were largely consistent, and we could identify the common words, problems, and pain points users encounter through the registration cycle, as well as how they feel as they go through the process year after year, semester after semester.

Some of the words we identified as most frequent in the user interviews, appearing in contexts with a negative sentiment, were:

Tabs (Windows)

Early morning

Stressful

Wait-list

Confused (confusion)

 

All the moderators went back and watched the video recordings of their sessions to re-verify and update the data in the codebook. The notetakers did the same using the notes they took during each session.


A snapshot of the raw data in the codebook

Combining the survey data with the user journey exercise from the one-on-one sessions and the usability testing results, we designed user journey maps to present our findings in a simple and effective way for the client and stakeholders.


Revisiting the hypotheses we started with, at this stage we could present user-based evidence for or against each of them.

  • The suite of registration portals ranks low on the user satisfaction scale.

  • Users view the process of registration as a stressful and difficult task.

The data (qualitative and quantitative) supports the hypothesis that the registration process is time-critical and highly stressful. Satisfaction or dissatisfaction with the tools correlates with how far along a student is in their degree program. Students, like all users, get used to and learn to make do with the tools offered in order to succeed at their tasks, and because they interact with the tools in short time frames scattered over the course of a semester, the satisfaction ratings seemed to be higher among seniors than among freshmen.

  • The process of registration is highly time-sensitive, and the tools provided are not competent enough to help users complete the desired tasks in a time-efficient fashion.

  • Disruptions or failure in registering for classes have thrown students off their graduation timeline.

The time-sensitivity of the process was clear: all of the students interviewed mentioned they got up "early in the morning" on the day registration started and were often "waitlisted" for popular classes. Being waitlisted or not getting into a class mattered more to students due to graduate soon (within a year or two), and the survey data showed the same: 76.9% of junior and senior students agreed that not getting registered for a class in a semester could prevent them from graduating on time. Graduate students mentioned in the interviews that they only had a couple of required courses, that the smaller number of graduate students enrolled in their programs made it easier to register for required classes, and that professors and administration would ensure their graduation was not delayed just because a required class was full. Still, students across all years and programs (about 83% of undergraduate students and about 69% of graduate students) agreed to some extent that the registration process was stressful.


Another interesting and unexpected finding was that the ability to find course registration information using the tools varied by student seniority.


Because the client had repeatedly asked for a ranking of the tools for improvement as an outcome of the UX research, we used the data from the usability tests (in conjunction with the heuristic evaluation findings) to generate a prioritization for improvement. As part of the usability testing, the tools were evaluated on user experience, interaction, and satisfaction, in terms of how efficiently and independently users could complete the tasks on time. The scores and findings from the usability tests and heuristic evaluation can be summarized as follows (a sketch of one way to combine such scores appears after the list):

  1. Users were able to successfully complete all the usability tasks using the UT Provost registration tools without the "need" for a third-party tool or software.

  2. Interactive Degree Audit (used for registration and degree planning) and Course Schedule (used for finding course information for registration) ranked low overall in terms of design, functionality, independent use, and time efficiency.

  3. While the Course Registration tool ranked high in terms of time efficiency and functionality, it ranked lowest in the overall user experience of registering for a course, owing to inconsistent and missing feedback and its lack of integration with Course Schedule, among other reasons.

  4. RIS (used for checking registration eligibility) and Past CV & Syllabi (used for getting information on past classes and the instructors teaching them) were rated average across all attributes.

  5. The Waitlist Information tool (part of Course Registration) ranked the highest across all attributes.
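As noted above, the sketch below shows one way heuristic severity, task success, and satisfaction ratings can be folded into a single redesign-priority score. Only the tool names come from the study; the weights and metric values are purely illustrative, not our actual results.

```python
# Hedged sketch: combining heuristic severity (1-5), usability task success, and satisfaction (1-5)
# into a redesign-priority score. All numbers and weights below are illustrative placeholders.
tools = {
    "Interactive Degree Audit": {"severity": 3.8, "task_success": 0.70, "satisfaction": 2.4},
    "Course Schedule":          {"severity": 3.5, "task_success": 0.75, "satisfaction": 2.6},
    "Course Registration":      {"severity": 3.0, "task_success": 0.90, "satisfaction": 2.8},
    "RIS":                      {"severity": 2.5, "task_success": 0.85, "satisfaction": 3.2},
    "Past CV & Syllabi":        {"severity": 2.5, "task_success": 0.85, "satisfaction": 3.3},
    "Waitlist Information":     {"severity": 1.5, "task_success": 0.95, "satisfaction": 4.1},
}

def priority(m):
    # Higher severity, lower task success, and lower satisfaction all push a tool up the redesign list.
    return (
        0.4 * m["severity"] / 5              # normalized heuristic severity
        + 0.3 * (1 - m["task_success"])      # usability task failure rate
        + 0.3 * (1 - m["satisfaction"] / 5)  # dissatisfaction
    )

for name, metrics in sorted(tools.items(), key=lambda kv: priority(kv[1]), reverse=True):
    print(f"{name}: {priority(metrics):.2f}")
```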

As part of the one-on-one sessions, users were also asked to rate their overall experience with the suite of registration tools as a whole based on several factors, and this is how the tools fared.

Output and Deliverables

We compiled an actionable and thorough research report covering all methods and findings, addressing any irregularities or outliers, for the client and stakeholders to refer to later. To assist the design process to follow, the submitted report also weighed in on prioritizing the tools for redesign based on our research. We also created a presentation and, as a team advocating on behalf of the users, presented our findings to a group of people from the UT Provost office.

Link to the Stakeholder Presentation

I really liked how Team 4 (our team) presented the user journeys using emojis through the experience and the use of colors to depict the number of clicks in the portal (referring to Competitive Analysis) .... The use of emojis to show emotions as students go through registration is very interesting.

(Stakeholder Remarks)

Impact

Our recommendation to integrate the different tools into a one-stop portal was picked up by the team and implemented the following semester. The redesigned portal is now called Texas One-stop, where students can find all the information regarding financial aid, registration, degree planning, bill payments, etc. It integrates the Registrar and My UT tools and offers them through a single portal.


Next Steps

Based on our research, we recommend making some design changes and then conducting thorough university-wide usability testing or A/B testing to measure the impact of the changes and identify further scope for improvement.


Reflections

Working on this project, I acquired important UX research skills such as moderating user interviews, conducting usability tests, and creating user stories. Being part of a team where each member had a different background and skill set, and yet everyone was just a student, there were often disagreements, and more often than not one of us felt let down. But the experience brought a certain maturity to our demeanor, where each of us put the success of the project first. I personally learned to be more assertive while remaining mindful of what others had to say, and to work collectively to impartially weigh the pros and cons of any major decision we made.

Some of the things I learned as a UX Researcher are:

  • Take what your clients and stakeholders ask you to do with a grain of salt; they are not UX researchers. Understand where they are coming from, and design and direct your study in a way that helps both them and your users succeed. Understand why they are asking for what they are asking, and then, with the help of data and research, show them that you understand their concerns and what a better approach to the problem might be.

  • It is beneficial to work closely with stakeholders and clients, whether while designing the research study or while analyzing the data.

  • Instead of resolving conflicts by voting within the team, it is more beneficial to ask your client, stakeholders, and, if possible, users to weigh in.

  • Budgeting time and money for the research process is more important than I thought.

  • Planning ahead for no-shows, unsuccessful sessions, and technical glitches is a necessity.

  • A lot is said and hidden in a user's body language, so it is always wise to be fully and actively invested in the interview and have a notetaker help you out.