- Transparent
- Broadly scoped
- Intentional
- Responsive
1. Transparent
Perhaps more than any other characteristic, transparency is critical to an equitable model. Transparency applies to your institution’s modeling practices and the models themselves.
Institutions should develop and adhere to thoughtful policies surrounding data quality, governance, and privacy. Ideally, anyone impacted by the model should be made aware of the model’s use and goals. Key constituencies include the administrators who use the model’s outputs to strategize, the faculty and staff who put the model’s outputs into practice, and the students impacted by the ways the other stakeholders use those outputs. At a minimum, students and other affected stakeholders should be able to inquire and learn more about the model. For example, EAB provides users of Navigate’s Student Support Predictive Model a customized report detailing the model’s accuracy and listing the variables the model used. Each time the model is refreshed, the institution receives a new report, granting consistent insight into how the model is trained.
Additionally, your institution should have a firm grasp of which variables your predictive model includes, which requires transparency of the model itself. Predictive models can use any combination of variables and techniques to arrive at predictions, assigning more or less weight to specific variables. While it’s not mandatory to understand the intricacies of the math involved, you should have a functional understanding of how the model arrives at its predictions.
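As a hedged illustration of that functional understanding, the sketch below fits a generic scikit-learn logistic regression on made-up student data and prints each variable’s standardized weight. The column names, the “retained” outcome, and the choice of logistic regression are assumptions for illustration only, not a description of Navigate’s model or any institution’s actual pipeline.

```python
# Minimal sketch: surfacing which variables a model relies on and how heavily.
# All data and column names below are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "high_school_gpa":      [3.2, 2.8, 3.9, 3.5, 2.5, 3.7, 3.0, 2.9],
    "credits_attempted":    [15, 12, 16, 14, 9, 15, 12, 13],
    "unmet_financial_need": [1200, 4500, 0, 800, 6000, 300, 2500, 3900],
    "retained":             [1, 0, 1, 1, 0, 1, 1, 0],
})
X, y = df.drop(columns="retained"), df["retained"]

# Standardizing first makes the coefficients roughly comparable across variables.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

weights = pd.Series(model.named_steps["logisticregression"].coef_[0], index=X.columns)
print(weights.sort_values(key=abs, ascending=False))
```

Even a simple readout like this answers the questions transparency demands: which variables are in the model, and which ones carry the most weight.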
2. Broadly scoped
Models make predictions based on past data, so without careful evaluation of what goes into the model, it may treat past outcomes as objectively desirable benchmarks to be replicated in the future. Including the same set of variables every time you run a model also offers less opportunity for the model to perform one of its most valuable tasks: revealing unexpected connections between variables and outcomes.
Begin with a wide swath of variables, then pare the list down based on each variable’s significance and its relationship to the outcomes you care about. Including new variables and testing their significance can uncover connections between seemingly disparate data.
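The sketch below shows one hedged way to put this into practice: score a wide list of candidate variables with a simple univariate test and keep only the strongest. The synthetic data, column names, and the ANOVA F-test screen are assumptions for illustration; your model may call for a different selection method.

```python
# Minimal sketch: start with a broad variable list, then pare it down by significance.
# All data below is synthetic and the column names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)
n = 200
X = pd.DataFrame({
    "high_school_gpa":         rng.normal(3.2, 0.4, n),
    "credits_attempted":       rng.integers(9, 18, n),
    "unmet_financial_need":    rng.gamma(2.0, 1500, n),
    "campus_events_attended":  rng.poisson(3, n),
    "distance_from_campus_mi": rng.gamma(2.0, 10, n),
})
# Hypothetical outcome loosely tied to two of the candidate variables.
y = (X["high_school_gpa"] + 0.05 * X["campus_events_attended"]
     + rng.normal(0, 0.5, n) > 3.4).astype(int)

# Score every candidate, then keep the top three.
selector = SelectKBest(score_func=f_classif, k=3).fit(X, y)
scores = pd.Series(selector.scores_, index=X.columns).sort_values(ascending=False)
print(scores)
print("Kept:", list(X.columns[selector.get_support()]))
```

Rerunning a screen like this each time colleagues propose new variables is what surfaces the unexpected connections described above.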
To keep variable lists fresh, solicit ideas from colleagues inside and outside the department applying the model’s results. Staff inside the relevant department can offer insight drawn from direct experience, while staff outside the department can offer different perspectives and concerns that you may have overlooked. For example, financial aid counselors may have valuable insight into a model you build for the Vice Chancellor of Enrollment Management.
3. Intentional
Institutions should have a clear goal in mind before building and deploying models. That goal should reflect principles of equity and fairness. Many institutions use models to achieve increased enrollment or better retention, but equity is an equally important goal. For example, if a model flags students with financial needs as less likely to succeed in STEM majors, your institution can choose how to proceed: do you deprioritize those students for recruitment, or do you develop better support programs for students facing financial hardships? A study from New America suggests that the former path contributes to inequity while the latter works toward equity.
Additionally, be sure to establish and follow a review process before acting on your model’s results. Depending on how you intend to use the model, it can be helpful to include reviewers from the department that will apply the model, along with other stakeholders whose perspectives can inform the analysis.
4. Responsive
The COVID-19 pandemic reaffirmed that any data-related initiative must be ready to respond to rapid change, and that global events do not impact every group proportionally. Throughout the pandemic, the world learned more by the day about COVID-19’s health impacts, social disruptions, and demographic disparities, and some models operating on pre-2020 data struggled to keep pace with the rapid shifts in every area of life.
Still, institutions with access to flexible, responsive modeling tools found new applications that helped them continue to make better decisions. Rather than operating from now-dated information, institutions that took a responsive approach to modeling explored new data sources and sought outcomes that would help them deal with immediate crises and shifting circumstances. Live data from Learning Management Systems and virtual data from online interactions stood in for traditional data points in student success models and enrollment forecasts, allowing institutions to apply their data creatively toward solving problems that seemed insurmountable.
Beyond the context of COVID-19, predictive modeling is an iterative process. Track how your model performs across different demographic subsets of your population and measure how changes to the model improve its results for those subsets. You may not always identify the cause of a performance disparity, but it is vital to be aware of these differences and minimize gaps with each new iteration.
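Below is a minimal sketch of that tracking step, assuming you already have each student’s actual outcome, the model’s prediction, and a demographic grouping column; the names and values are hypothetical.

```python
# Minimal sketch: compare model performance across demographic subgroups.
# The group labels, outcomes, and predictions below are hypothetical.
import pandas as pd
from sklearn.metrics import accuracy_score, recall_score

results = pd.DataFrame({
    "group":     ["A", "A", "A", "A", "B", "B", "B", "B"],
    "actual":    [1, 0, 1, 1, 1, 0, 0, 1],
    "predicted": [1, 0, 0, 1, 1, 1, 0, 1],
})

# Report accuracy and recall per subgroup so gaps are visible and can be
# compared across model iterations.
for name, g in results.groupby("group"):
    print(name, "n =", len(g),
          "accuracy =", round(accuracy_score(g["actual"], g["predicted"]), 2),
          "recall =", round(recall_score(g["actual"], g["predicted"]), 2))
```

Logging numbers like these for each refresh turns “minimize gaps with each new iteration” from an aspiration into something you can verify.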
Be prepared to iterate as you go
As higher education learns more about serving its students equitably, it’s crucial to embed that greater understanding in predictive models themselves. Predictive modeling should always be an open conversation rather than an immutable initiative. View results with a critical eye and consider the potential downstream impacts of the actions you take based on a model’s results. Invite input and discussion, and always be open to changing procedures based on feedback. This collaborative and iterative approach will help you build and use equitable predictive models that inform operational decisions across campus and ultimately improve outcomes for your students.