What measuring the student experience means to Institutional Research—and why it’s so hard

This spring, we conducted a survey to learn which issues most concern Institutional Research (IR) leaders, and one thing came through loud and clear: IR is laser-focused on gathering and analyzing data to help their institutions track and understand the student experience. In our survey, 65% of respondents rated “using data to understand and improve the student experience” as a top priority. In an increasingly competitive enrollment landscape, exacerbated by the COVID-19 crisis, creating an impactful and equitable student experience has become more critical than ever.

65%

of respondents rated “using data to understand and improve the student experience” as a top priority

To further understand IR’s work with student experience data, we conducted a series of interviews across the spring and summer about how IR leaders are tracking the student experience and what’s holding them back. Anecdotally, we know that things like extracurricular involvement lead to greater engagement on campus; when I reflect on my time in college, the connections I made participating in a Women in Economics group and playing intramural volleyball stand out in my memory as much as any of my classes. But how can you measure the impact of those experiences so you can create a meaningful student experience for all of your students? Here’s what we learned about how IR is grappling with those questions.

What does it mean to measure the student experience?

Measuring the student experience means identifying pivotal experiences on campus—like faculty-student mentorship, student employment, or residential communities—and quantifying how they impact learning outcomes, retention, engagement, and post-graduation outcomes. IR and other campus data teams can then use that data to inform a range of initiatives from student success to recruitment to alumni fundraising.

Here are some examples of student experience data initiatives from IR leaders we interviewed:

Large public master’s university: IR struggled with stakeholders across campus performing interventions to improve student outcomes without any way to track their results. Now, stakeholders are expected to log student interventions centrally in their CRM so that IR can analyze the data longitudinally. They are also considering a templatized system to streamline the tracking process. This ensures that IR can measure the efficacy of different interventions and judge which are most effective for students.

Midsize public master’s university: Advancement staff wanted to provide differentiated outreach to graduates to help them recall their experience at the institution. Staff began going through old yearbooks and Excel files to determine graduates’ club associations and logged this information in their CRM. This enabled them to target subpopulations for outreach and evoke the nostalgia of those experiences when asking graduates to support the institution financially.

Collecting this kind of data has long been a challenge. For example, IR offices often struggle to capture reliable alumni outcomes data through traditional methods like surveys and call campaigns. However, institutions are increasingly asked to track this data, especially equity-related metrics like debt at graduation and earning potential. Some institutions are even turning to third-party providers to sync their historical data on students and prospects with more granular alumni data, providing a fuller ‘life-cycle’ of student data.

What does it mean to measure the student experience during the COVID-19 pandemic?

Here are some examples of how the IR leaders we interviewed are using student experience data in response to the pandemic:

Large public master’s university: Working in tandem with Financial Aid, IR helped whittle down the pool of applicants for a COVID-19 need-based aid fund by verifying the information on students’ grant applications, submitted via a survey, against their SIS records. This ensured staff could validate the self-reported student data and micro-target students at the greatest risk of dropping out or with the greatest financial need when allocating funding.
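
As a purely illustrative sketch of what that kind of cross-check could look like, the snippet below joins a survey export to an SIS extract and flags disagreements. The file names, columns (student_id, enrollment_status, unmet_need, retention_risk_score), and prioritization criteria are all assumptions made for the example, not this institution’s actual schema or aid policy.

```python
# Illustrative only: cross-check self-reported aid applications against SIS records.
# File names, column names, and prioritization criteria are assumptions for this
# sketch, not the institution's actual schema or aid policy.
import pandas as pd

applications = pd.read_csv("covid_aid_survey.csv")   # self-reported grant applications
sis = pd.read_csv("sis_extract.csv")                 # system-of-record extract

# Join on the student ID; overlapping columns get _reported / _sis suffixes
merged = applications.merge(
    sis, on="student_id", how="left", suffixes=("_reported", "_sis")
)

# Flag applications whose self-reported enrollment status disagrees with the SIS
merged["status_mismatch"] = (
    merged["enrollment_status_reported"].str.lower()
    != merged["enrollment_status_sis"].str.lower()
)

# Rank verified applicants by unmet need and retention risk (both from the SIS)
verified = merged[~merged["status_mismatch"]]
review_queue = verified.sort_values(
    ["unmet_need", "retention_risk_score"], ascending=False
)

review_queue.to_csv("aid_review_queue.csv", index=False)
```

In practice, the matching rules and the order of the review queue would follow the aid office’s own criteria; the point is simply that self-reported survey data gets validated against the system of record before funds are allocated.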

Large public doctoral university: IR staff leveraged LMS usage data to identify faculty who had yet to publish their course resources online, prompting them to add or fix the materials in the LMS. By nudging online faculty, IR staff help ensure students have the course materials they need to succeed in remote classes, smoothing the transition to online learning.
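
Again purely as an illustration, a nudge list like that could be assembled from a flat export of course-level LMS data. The file name, the columns (delivery_mode, published_items, instructor_email), and the zero-items threshold below are assumptions for the sketch, not any particular LMS’s data model.

```python
# Illustrative only: flag online courses with no published materials in the LMS.
# Assumes a flat export of course-level LMS data; the file and column names
# (delivery_mode, published_items, instructor_email) are hypothetical.
import pandas as pd

courses = pd.read_csv("lms_course_export.csv")

# Online courses with zero published pages, files, or modules get flagged for follow-up
needs_nudge = courses[
    (courses["delivery_mode"] == "online") & (courses["published_items"] == 0)
]

# Group by instructor so outreach can be batched into one message per person
outreach = (
    needs_nudge.groupby(["instructor_name", "instructor_email"])["course_id"]
    .apply(list)
    .reset_index(name="courses_missing_materials")
)

outreach.to_csv("faculty_nudge_list.csv", index=False)
```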

Large public master’s university: Faced with difficult decisions to make about staffing in the fall, IR leaders are using faculty workload data, student satisfaction surveys, and LMS metrics (e.g., published resources, student usage of the course page) to assess the efficacy of adjunct professors teaching entirely online.

What makes it hard to gather and analyze student experience data?

The recent need for better student experience data illustrates a larger phenomenon of institutions increasingly relying on data to inform decisions at all levels. In this environment, IR leaders must simultaneously expand data access by creating self-service options for administrators and staff while also ensuring the data driving decisions is of high quality and protected from manipulation or misuse. University leadership needs accurate data more quickly than ever, yet often fails to recognize the difficulties IR offices and other campus data teams face when supporting data-driven decision-making.

Whether we were discussing the increasing need for data or new ways to use existing campus data, my conversations with IR leaders kept coming back to the same underlying data management issues:

  • Siloed data systems create barriers to access: Everyone on campus relies on IR’s ability to access complex source systems to get the data they need. In addition to limiting access, data silos also eat up IR’s time. For example, regarding the new reliance on LMS data as a result of COVID-19, one IR leader commented, “My office has become responsible for teaching ‘Canvas 101.’”
  • A lack of formal data governance creates widespread data quality concerns: Institutions need to ensure the integrity of the data being entered into source systems across campus. This relies on creating a culture of continuous improvement, as well as codified data governance protocols.

“Without a centralized data management framework, institutions cannot maximize the value of all the data they collect every day.”

Director of IR

Small Private Doctoral/Professional University

Improve the student experience with a sustainable digital strategy

Explore the mandate for student-centric innovation and access tools to build an adaptable and agile digital strategy.
