Cappex is a web platform that has helped 10 million students navigate college admissions. For years, Cappex has served prospective students with the information they need about colleges.
However, some features have become too outdated to meet students' needs. That's why we started redesigning every feature to make it better. Our goal is simple: college admissions are complicated, and we want to guide students through the journey and help them explore colleges to find their true fits.
We scheduled interviews and conducted surveys to drive the problem-solving process. Here are some key insights that shaped the final solutions.
From our user research, we discovered two types of search needs: exact search and exploratory search. When students do not have sufficient domain knowledge to conduct an exact search, they strongly prefer exploratory search. They want to quickly discover the colleges that match their preferences.
However, our search experience fell short when students wanted to explore colleges.
Once a user completes a college search, they land on college profiles to learn more about each college. From previous user interviews, a major pain point in consuming college information is locating the correct content and understanding what the information means.
We conducted various types of research and concluded that students couldn't always find the information on the pages where they expected it.
As an information hub, we can provide students with every piece of information about each college they are interested in. But the most effective approach is to explain how the information matters to each individual student.
Instead of overloading students with information, we could personalize the experience so that every student can consume the information easily and learn about colleges quickly. We mapped out the customer journey and focused on improving the stages where the business currently participates.
At the beginning of the project, we felt we hadn't identified the problems and potential opportunities well enough. Therefore, we decided to do exploratory user research first. Knowing we had only 6 months to deliver the project, we planned the research and design process as follows. The details of the research process are presented in the last section of this case study.
The word "fit" came up a lot during our conversations with students and parents, but their definitions of "fit" were all different. Most of our customers could not articulate exactly what colleges they were looking for. They often viewed the search experience as a way to educate themselves.
We then reviewed our current search interface and identified a few issues.
Most of our users come to the site to research colleges. Once they complete a college search, they land on college profiles that provide information on each college.
We have more than 300 data points on each college, and from the research, locating data points is one of the major pain points for users, along with several other issues:
Since the platform was first developed back in 2008, the framework it used was not mobile-friendly. Yet even with the poor mobile experience, over 60% of the traffic came from mobile devices.
To simplify the college research process, the new college search experience emphasized information re-design and personalization. Students can browse through college information that matters to them and quickly learn more about the colleges they are interested in.
The new 2.0 design also focused on a mobile-first experience so that students can access and learn about colleges no matter where they are.
College search should be simple and fun. Ideally, students should feel excited and motivated when browsing and learning about colleges. From interviewing them, we realized that "the college experience" means a lot to students. Therefore, when we designed the college search, we tried to bring a taste of campus and college life to students so that they don't get bored doing the research.
It shouldn't be hard to find any college information students want. We analyzed the information architecture by conducting card-sorting and tree-testing to make sure we put every piece of information in the right place. We also used different charts and diagrams to help students understand the information more quickly.
Looking back, Cappex had not kept up with user needs very well. Also, as the product team, we felt we hadn't developed enough empathy for users or understood their journey well enough.
To understand user pain points better, we decided to first do exploratory user research at a large scale. We focused on qualitative insights to understand the general journey steps and pain points, and then on quantifying those pain points with the jobs-to-be-done framework.
We started by interviewing our customers, students and parents, to understand the general steps they took on the admissions journey and the major pain points they faced. We conducted 3 rounds of interviews and interviewed over 60 students and parents in total.
Our interview process was iterative. For every round, we refined our script to include better questions.
As we gathered more information about the steps students take, we put post-it notes on the wall to form a user journey map.
After the 3rd round of interviews, we had gathered enough information to finalize the user journey map. The college admissions journey includes 8 big steps, and each step may contain several smaller steps. Although the journey does not follow a strictly linear order, we can present it in a relatively linear form for convenience as follows:
From the interviews, we also collected nearly a hundred potential problems to solve. However, we understood that we are a small company with limited resources and that solving many problems at the same time would not be realistic. Therefore, we decided to prioritize the problems.
First, we wrote down the potential problems as outcome statements students want to achieve. For example, one outcome statement could be "minimize the time it takes to search for admissions requirements". Then we worked with Qualtrics to send the outcome statements to over one thousand students and parents via a survey, asking them to rate each statement on 2 metrics:
Importance: how important it is to achieve the outcome
Satisfaction: how satisfied they are with existing market solutions for achieving the outcome
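In the jobs-to-be-done literature, these two ratings are often combined into a single "opportunity score" (importance plus the gap between importance and satisfaction), so that important-but-underserved outcomes rise to the top. The original case study does not state which scoring formula was used, so the sketch below assumes this common one, with made-up survey averages:

```python
# Hedged sketch: ranking outcome statements by an assumed opportunity score
# (importance + max(importance - satisfaction, 0), a common jobs-to-be-done
# formula). The statements are from the case study; the ratings are invented.

def opportunity_score(importance: float, satisfaction: float) -> float:
    """importance and satisfaction are mean survey ratings, e.g. on a 0-10 scale."""
    return importance + max(importance - satisfaction, 0)

# Hypothetical mean (importance, satisfaction) ratings per outcome statement.
outcomes = {
    "minimize the time it takes to search for admissions requirements": (8.2, 4.1),
    "understand how well a college fits my preferences": (7.5, 5.0),
}

# Rank statements from biggest to smallest opportunity.
ranked = sorted(
    ((opportunity_score(imp, sat), stmt) for stmt, (imp, sat) in outcomes.items()),
    reverse=True,
)
for score, stmt in ranked:
    print(f"{score:.1f}  {stmt}")
```

Outcomes that score as both important and poorly served would then be the first candidates for the redesign backlog.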
Before redesigning the search interface, we ran a survey to understand which filters students wished to have and planned to order them accordingly. We asked "What questions do you want answered?" and "What filters do you wish to have?" on the survey, which we sent to over 300 students.
To make sure that students can easily find the information they need, we decided to take a step back and do an information architecture study.
350 Cappex users participated in a hybrid card-sorting task. We gave each participant 30 cards, each containing one statement about a college. Participants were asked to sort each card into 5 pre-defined groups. If a participant found it hard to fit some cards into those groups, they could create and name a new group.
The card-sorting was done online via a survey tool as in the following image.
After analyzing the research results using percentage agreement and distance metrics, we created an information tree that all college data should live within. We were confident that this new architecture would help students find the college information they need more easily.
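A standard way to compute the percentage metric for a card sort is a pairwise similarity matrix: for every pair of cards, count the fraction of participants who placed both cards in the same group (the distance between two cards is then 1 minus that fraction, which can feed a clustering step). The exact analysis used here isn't specified, so this is an illustrative sketch with hypothetical cards and three imaginary participants:

```python
# Hedged sketch of card-sort analysis: a pairwise "grouped together" matrix.
# The card names and participant data below are invented for illustration.
from itertools import combinations

def similarity_matrix(sorts):
    """sorts: one dict per participant, mapping card -> chosen group label.
    Returns {(card_a, card_b): fraction of participants grouping them together}."""
    cards = sorted(sorts[0])            # all participants sorted the same cards
    pairs = list(combinations(cards, 2))
    counts = {pair: 0 for pair in pairs}
    for sort in sorts:
        for a, b in pairs:
            if sort[a] == sort[b]:      # both cards landed in the same group
                counts[(a, b)] += 1
    return {pair: n / len(sorts) for pair, n in counts.items()}

# Hypothetical data: 3 participants sorting 3 cards into named groups.
sorts = [
    {"tuition": "Cost", "scholarships": "Cost", "campus life": "Life"},
    {"tuition": "Cost", "scholarships": "Cost", "campus life": "Life"},
    {"tuition": "Cost", "scholarships": "Aid",  "campus life": "Life"},
]
sim = similarity_matrix(sorts)
# "scholarships" and "tuition" were grouped together by 2 of 3 participants,
# suggesting they belong under the same branch of the information tree.
```

Pairs with high similarity get placed in the same branch of the information tree; distance (1 − similarity) can also be fed into hierarchical clustering to derive the tree automatically.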
College search should be simple. Ideally, users should be able to narrow the results down to match their preferences in just a few clicks. After gaining a better understanding of our users, we started creating the design, from wireframes to detailed screen mockups.
As we were redesigning Cappex, we wanted to avoid any issues that would prevent users from completing their tasks. Before launching new features, we conducted a series of usability testing sessions to validate design assumptions and identify usability problems.
We track and prioritize the issues identified through usability testing on a 2-month feedback cycle. Every 2 months, we run another round of usability testing to identify new issues and validate whether the design changes solved the issues found previously.
This is a long-term, iterative project. In the long run, we will continue working to simplify the user journey, craft more user-friendly features, and make the product engaging and fun through gamification. Stay tuned for updates!