What we learned by going ‘shopping’ for prospective college student feedback


Many famous actors and models have a “big break” story—getting spotted by the right person in a bar, on the street or at that quintessential teenage watering hole, the shopping mall.

There’s a reason talent scouts frequent small-town malls looking for their next clients: Small-town malls are ideal places to encounter young people in their natural environment. So, when our web design team needed to evaluate potential updates to our partner schools’ online applications, it seemed appropriate that we, too, head to the mall to seek real-time reactions from our ultimate audience, high school students.

At EAB Enrollment Services (formerly Royall & Company), we are no strangers to in-person testing. Every year, we do interviews, exercises and tests with individual students to complement the quantitative data we gather by measuring responses to campaigns. But the shopping mall environment was new for us, and it proved to be a particularly rich source of input, transforming our experience into something more like “guerrilla user testing.”

We nabbed a spot at the entrance to the food court (prime real estate indeed), and our web designers set to work scouting students to serve as test subjects. Just a few hours later, we had walked dozens of high school and college students through our application prototypes and learned several valuable lessons along the way—about how to get useful feedback from students and also about the design of our online applications.

Testing the waters

For this particular testing session, we were focused on two enhancements to our online applications.

First, we were exploring a more intuitive validation experience on mobile devices. College applications are inherently long and complex, and the sign-in and password-setting process poses particular challenges for mobile users. Our goal in this test was to create a new mobile validation experience that could be scaled and adapted to each school's unique needs while still providing a seamless user experience.

Second, we were testing a new approach to how students entered an application and returned to complete it. This test was important to us because new security technology had become available, and we wanted to introduce those upgrades without disrupting the user experience or marketing performance.

Keeping tests short, sweet, and uninterrupted

We had only about five minutes of each student's time for our tests. These students came to the mall to shop and socialize, not to participate in usability testing, so keeping the tests as brief as possible was essential.

However, we also wanted to avoid giving students too much guidance during the testing experience. After all, we wanted to find the system's true "pain points," the ones that actually affected students' ability to use it, not the potential problems we had speculated about before testing.

So, to speed up the process, we dropped students into our application prototypes at specific testing steps rather than having them navigate through the entire application from start to finish. We also encouraged students to talk aloud to the test facilitator as they went through each step, voicing their likes, dislikes and any difficulties with the experience.

Meeting students where they are—digitally as well as literally

While we had already trekked to the mall, we knew we also had to get into students’ digital environment. As such, the prototypes were tested on a variety of devices, including iPhones, iPads and laptops.

Mobile experience is a particular area of focus for the Enrollment Services team. More than 20% of our member applications are submitted entirely on mobile devices, and an even larger share of students use a mobile device at some point in the process. Live user testing is a perfect opportunity to see firsthand the unique challenges these mobile interactions present.

Students (like all consumers) don’t like to read

Students (and users in general) don't reliably read copy, so they can easily miss important instructions or misinterpret messaging. For example, the word "expired" elicited a negative and unexpected reaction from some users, specifically those who hadn't read the copy thoroughly.

The remedy is careful attention to word choice, smart headlines and scannability.

What’s obvious to you is not always obvious to the user

We see this repeatedly: a student misuses a tool we thought was clearly explained, or overlooks important content we took care to highlight visually. Many factors shape how a user perceives and interacts with an interface, and those factors can be counterintuitive even to an experienced designer. (Not sure what we mean? Google "banner blindness" to see one of the most famous examples.)

Be aware of expectations

Users have well-defined expectations about the way websites and applications work, based on conventions that they’re used to seeing. It’s completely possible (and necessary) to break these conventions at times. However, it’s important to be aware that you’re doing so and to take care to give a little extra guidance, through copy or visual cues, to help the user learn and adapt to this new experience.

We are pleased to report that our findings from this testing translated into a significant upgrade to the mobile validation experience, and we are now analyzing results to assess whether these changes improved submission behavior for mobile users.

And now that we realize how productive a trip to the mall can be, you can be sure that we’ll go back again soon when we have a new batch of innovations to test.

Want to learn more about what soon-to-be graduates are looking for?

For a more complete picture of the communication preferences of today's high school students, read our white paper detailing the full results of our student survey.
