Design Process
April 10, 2024

Validating design solutions with usability testing for Stanford GSB’s course research & registration platform


The Stanford Graduate School of Business, known as GSB, is one of the world's top business schools, with a diverse community of students and faculty from around the globe. It offers a range of business degrees and a wide selection of subjects to choose from.

Before they enroll, students spend weeks researching courses to learn about the subjects, instructors, ratings & reviews from previous students, past curricula, credits, schedules, and much more.

Big decisions like choosing a college, selecting courses, or changing careers need planning, and informed students make smoother transitions. To help students navigate these choices, online resources should be clear and informative, guiding them toward well-informed life decisions.

About the course research & registration platform

Students faced significant challenges navigating the outdated Course Research & Registration platform. Here's what frustrated them:

  1. Inconsistent navigation: The platform lacked a clear and consistent layout, making it difficult for students to find the starting point for tasks like finding the right course. They often got lost and frustrated navigating between different sections.
  2. Unintuitive registration steps: The process of researching and registering for courses wasn't straightforward. Students were forced to jump back and forth between different sections to complete tasks, leading to confusion and wasted time.
  3. Ineffective search tools: Finding specific courses was a struggle because the search function lacked effective filtering options. Students couldn't easily narrow down results by department, course type, or other relevant criteria.
  4. Hidden past information: Information from previous years that could be crucial for course selection (e.g., professor reviews and course descriptions) was buried deep within the platform.
  5. Data overload: The data on the platform was poorly organized, resulting in a cluttered interface with numerous tabs and information overload. Finding specific details about a particular course or program was time-consuming and frustrating.
  6. Schedule blindspot: There was no way for students to see their selected courses at a glance. They had to manually add and remove courses from different sections, making it difficult to visualize their overall schedule and identify potential conflicts.
  7. Outdated interface and functionality: The platform's dated design felt clunky and difficult to use. Annoying pop-ups and unresponsive preview windows further hindered the user experience and made completing tasks a chore.

These challenges made the course research and registration process unnecessarily complex and time-consuming. To address these shortcomings and improve the user experience, we adopted a design approach focused on clarity, efficiency, and user needs.

A look at our design approach

Understanding student needs and the GSB platform

Our journey to improve the user experience began with focusing on student needs. We collaborated closely with the GSB team, gleaning valuable insights into how students typically research courses.

Their research, combined with our exploration of the platform, provided a comprehensive understanding. This ensured we weren't just working with theoretical data, but with the actual functionalities students encounter when researching and registering for courses.

Simplifying and enhancing the platform

Next, we focused on making improvements guided by user needs. Studying the user journeys and personas — detailed profiles of typical users — provided a roadmap to areas that needed enhancement.

We streamlined the platform by removing unnecessary features and making existing ones more intuitive. This reduced clutter and made the platform easier to navigate and use for students.

Based on our findings, we introduced new functionalities to better address student needs and simplify tasks such as course search, registration, and schedule building.

Iterative design and overcoming challenges

Throughout this process, agility was key. We employed a rapid design approach, creating low-fidelity prototypes that could be quickly tested with real users. This iterative cycle allowed us to incorporate their feedback and refine the design efficiently.

User feedback proved invaluable in identifying areas for improvement and ensuring the final design met student needs. While challenges inevitably arose, such as technical limitations or conflicting priorities, they pushed us to think creatively and find innovative solutions.

This resulted in a more user-friendly platform that streamlined the course research and registration process for students.


User testing for optimal results

Since students were the key users of the GSB Course Research and Registration platform, their feedback was essential. We designed tests to evaluate how easily students could complete key tasks on the platform and whether the experience met their expectations.

Through collaboration with the GSB team, we successfully recruited 11 students to participate in our testing sessions.  These sessions were recorded, allowing us to capture detailed user interactions and feedback.  The students' candid conversations provided invaluable insights, revealing perspectives we hadn't previously considered.

Testing real-life tasks

The platform had many different features and ways to get things done. We wanted to make sure each part worked well. So, instead of giving students specific instructions, we gave them tasks that mirrored what they might actually do on the platform.

For example, instead of saying "Find FINANCE 123," we might say "Find a specific course and see all the details about it." This way, students had to explore the platform on their own and use different features. We created several tasks like this to test all the different screens we had designed.

Some testing scenarios

  1. Planning Your Schedule: Here, we imagined a student meeting with their advisor to plan their winter and spring classes. We asked them to use the platform to build their schedule for those quarters.
  2. Comparing Courses: This scenario involved a friend recommending three courses. Students had to find these courses on the platform, compare them to courses they already saved, and decide which ones to register for.
  3. Registration Day: The final scenario focused on actually registering for classes. Students used the platform to register for the courses they chose in the previous step.

Observing user interactions

To get the most accurate feedback, we had clear rules for how we interacted with students during testing. We didn't tell them what to do or how to use the platform. Instead, we watched them and asked questions to understand their thought process.

For example, we might ask: "Why did you click there?", "What were you expecting to happen?", or "Do you find this section easy to use?" These planned questions and interaction rules helped us run successful user testing sessions.

Identifying patterns

The testing sessions provided a wealth of information to guide our revisions.  While we couldn't address every suggestion, we focused on identifying common themes that emerged from multiple users. This helped us prioritize the most impactful changes to the platform.

Insights & key findings from usability testing

Revising the interface design

The user testing sessions were an eye-opener. We saw how students interacted with the platform and discovered areas that needed improvement.  In addition, the GSB admin team provided valuable suggestions for functionalities that would enhance their ability to manage the platform efficiently.

We took all this feedback and moved on to phase 2 of the project. This phase focused on incorporating user insights and admin suggestions to refine the platform's design.

Improvements: a user-friendly platform with an improved interface design

Conclusion

Usability testing proved to be a powerful tool. By observing how students used the platform, we were able to identify and address their pain points. This data-driven approach led to improvements in information discoverability and overall user experience.

The collaboration with the GSB team was instrumental in achieving this success. Their input on platform management functionalities ensured that the revised platform would be not only user-friendly for students but also efficient for administrators.

Through this collaborative effort, we transformed the platform into a smooth and effective tool, enabling MBA students to navigate the course selection process with greater ease.

Download our discovery playbook with actionable templates to guide you through conducting effective user testing sessions

Dive deeper into our successful collaboration with the Stanford Graduate School of Business! Read the complete case study — Migration & Design System Implementation for Stanford Graduate School of Business

Written by
Editor
Priyanka Jeph
Content Design Lead