Improve your program review process with these 4 data tips

A Q&A with Dr. Louis Slimak, Associate Provost at West Virginia University


In today’s fast-changing environment, program review is an increasingly critical area of focus and the first step in delivering a portfolio that meets the institution’s goals and students’ needs. Yet many institutions lack a standardized, data-informed process for assessing program performance, leading to inconsistency and disagreement.

Dr. Louis Slimak, Associate Provost for Curriculum and Assessment at West Virginia University (WVU), has done extensive work to overhaul WVU’s program review process—and its culture around data more broadly—since his arrival in 2016. We profiled Dr. Slimak’s work in the first session of our ongoing Edify Data Strategy Series.

Here are some highlights from my conversation with Dr. Slimak, including his tips for other leaders embarking on program review and other large-scale data projects.

Watch the webinar

View the full webinar with Dr. Slimak and other panelists discussing how to use data for program review.

Q: Overhauling an entire data culture seems like an impossibly large task. Why did you decide to start that process with program review?

When I arrived, assessment of learning was part of a bigger systemic problem. Program review was the vehicle I chose to address a wide range of those issues, including assessment of learning, program viability, and connecting those two areas to ongoing program improvement.

Program review at that point was poorly defined. We had a form, but no standard data. The questions were ambiguous. People spent more time tracking things down than answering questions. And we weren’t making decisions that led to meaningful change, either by the programs or by the institution. We’d get massive, 200-page, IRS audit-like responses that didn’t make sense and were a waste of time, and then nothing happened. The biggest problem was that we didn’t have a standard data set—the files were always hiding in a corner on someone’s computer, which made them difficult to compare or act on.

Q: That’s a huge hurdle to jump. What did the first stages of the project look like?

It was a two-year process overhaul. I spent the first two years of my time at WVU working with the University Assessment Council, an advisory board I formed to help with this work. The group included representatives from many parts of campus, including Associate Deans, the Provost’s Office, the Teaching and Learning Commons, Faculty Senate Committee Chairs, the Registrar’s Office, faculty, and several others, which gave us a wide range of perspectives.

Our main area of focus—beyond improving the form and questions—was to agree on a standardized data set using Edify’s Academic Performance Solutions metrics. We said, “This is what everyone’s looking at, right? This is how we’re going to define these metrics, and this is why we care about those things.”

Q: It’s a massive challenge getting everyone up to speed on such a large change to a core process. How did you train and educate your staff?

Once we developed a form that was easy to use, crafted clear questions, and agreed on metrics, I realized almost immediately that I needed to increase the amount of support and professional development available at the institution. We provided annual workshops on the process, worked to increase data literacy, explained how decisions were made (and by whom), and tracked how decisions were communicated and followed up on.

We also built a website to make resources available to folks who wanted to see what data elements we included. This site included information on trainings with EAB, individual department-level consultations, guidance on self-studies, and the results of each year’s reviews. We had to build capacity and transparency while improving the process and data.

Q: What are some of the biggest wins you’ve seen since improving the program review process?

The Edify custom dashboard is revolutionary. I used to spend at least 60 hours each summer pulling data and compiling it in Excel spreadsheets to push out to people. Now, we just link to the dashboard and tell people how to set their filters.

[Dashboard screenshot 1]

Note: Screenshot is from a demonstration site, not West Virginia University’s site.

The Edify dashboard has broadened access to the dean’s offices, chairs, and—most excitingly from my perspective—to any interested faculty member. Want to know how your program is trending in enrollment, continuance, or completions? Go look. Want to see how you stack up against other programs in your college or at the institution? Go look. Want to connect your assessment of learning findings to other student performance data? It’s all there for you.

[Dashboard screenshot 2]

Note: Screenshot is from a demonstration site, not West Virginia University’s site.

Q: Thinking personally: What has been your biggest takeaway from this project?

There are no less-important pieces. Every component is crucial for building the institutional capacity to execute a meaningful process. The process has a significant impact on the academic portfolio, where we allocate resources, who engages in decisions, how we communicate results, and how the entire campus community understands the evidence that contributes to those decisions.

Now more than ever, institutions feel pressure to execute these kinds of processes. They know that every single component matters. Data definitions and quality, data access, meaningful questions, a sustainable process that leads to results, the organizational structures and relationships that carry those results out and give them life within the institution—all of it is essential to success.

4 tips from Dr. Slimak

#1: Be inclusive in defining and choosing metrics

We brought in a variety of stakeholders and perspectives for these conversations. When I do a refresh of this process this coming year, I will have the same group of stakeholders at the table. It’s amazing to see how differently people understand the same metric or process—it helps us see what we need to explain and contextualize. Also, choose meaningful metrics; don’t be blind to institutional mission, scope, and purpose.

#2: Carefully consider your data visualizations

Remember that you’re dealing with a campus community that has a wide range of data literacy. Some have little quantitative expertise, while others come from STEM backgrounds and need to be oriented toward the standards of action research. The data and analyses are not going to be perfect. You want to spend more time using the data than explaining it. I want people to look at data and think, “What should I be doing about this? What do I need to change?” not “What does this mean? Is this bad or good?”

#3: Be creative and think holistically

Think big-picture. What can a process do for the institution at a higher level? Start from the bottom up, at the metric level, then gather the metrics into a bundle. Go from that bundle to how it supports and relates to a process like program review, then from that single process to how it informs program creation (and vice versa). If I don’t have the same metrics across program creation and program review, then I’m standing up programs that won’t be measured the same way several years later. Those things connect, and then suddenly I realize that my faculty hiring process is an investment in programs that I’m going to stand up and review. All three of those processes need to connect, share metrics and perspectives, and have a team of decision-makers across those areas.

#4: Invest the necessary time and resources into the work, no matter the project

Program review happened to be the vehicle for the responsibilities I needed to address. But this advice applies to any project. Looking at the demographic cliff, there are going to be two types of projects: projects driven by exigency and immediate need, and those you want to sustain for strategic decisions and institutional or program improvement. All of these processes are built on relationships that help you make decisions about where to allocate resources. Invest in people and take the time to build these processes so they endure after the immediate work is completed. Support the people doing this work and communicate its importance regularly across the institution.

In conclusion

Q: You have obviously achieved a lot so far—I’m curious, what’s next?

This coming academic year we’re going to fold in two new metrics, one on instructional cost per program and one on program revenue generation, both aimed at better understanding program viability. We’re going to expand how we evaluate the external market for a program and the related occupational and industry needs and trends. We’re going to reframe the mission section to allow programs and colleges to articulate the strategic need (or lack thereof) for a particular program area. Historically, that’s been a hard decision because it’s not directly apparent in the metrics.

We’re also standing up a dedicated Edify dashboard for our faculty hiring process. We will pull that dashboard and its new metrics into the program creation and program review processes. I’m excited for those processes to inform one another and drive more strategic decisions.

Want to learn more?

To see how Edify can help your institution with program review, enrollment analytics, data warehousing, and more, request a demo using the form below.
