What do “Freakonomics” and academic decision support have in common?

Freakonomics, by Steven Levitt and Stephen Dubner, has been on my reading list for years. While in the library the other day, I finally picked it up. Barely a few minutes after cracking open the spine, I was shocked at how much the book and EAB’s recent research on higher education decision support had in common.

In the introduction, Levitt and Dubner lay out several ideas that are fundamental to the book. Each one reminded me of stories I had heard from my research contacts:

Incentives are the cornerstone of modern life

When we spoke to one data steward, he said he knew the work was important, but there were no positive individual incentives for being a data steward. He was assessed against how well he did his traditional job, and working on data governance took time away from making progress against his goals. Also, he told us that he’s never gotten any positive recognition for his help with data governance—he’s only been blamed when things go wrong.

In our research, we found only one university—George Washington University—that is incorporating data governance responsibilities into job descriptions, a first step toward rewarding data stewards for their contributions to institutional data governance. There are many incentives not to contribute to data governance (convenience chief among them). How are you incentivizing contributions to data governance?

The conventional wisdom is often wrong

Conventional wisdom is usually conventional because relying on hunches and untested logic is expedient, not because it’s right. We spoke with one CIO who told us about her regional public university, which is located in a fairly poor area of the country. Her institution’s leadership team assumed that the majority of the student population would mirror the socioeconomics of the university’s location.

This assumption guided major resource allocation decisions, including large investments in support services. When someone actually pulled the data, though, everyone was shocked to learn that only about 40 percent of students came from this demographic, not the more than 60 percent everyone had assumed. Leadership quickly realized they were spending an inordinate amount on students who weren't ready for college (with poor results, because the resources offered didn't match those students' needs), and that they should instead shift their focus to the students our colleagues with the Student Success Collaborative call the "murky middle."

Dramatic effects often have distant, even subtle, causes

At one institution, business intelligence staff were getting odd results when analyzing section enrollment against section capacity: for some sections, the fill rate was impossible to calculate. After taking a deeper look at the data, analysts discovered that a minor faculty request had made the analysis impossible. Faculty members had asked for the ability to individually approve students for enrollment in some courses.

Rather than build new functionality to notify faculty members when students enrolled, the registrar's office saved time by simply changing the Maximum Capacity of those courses to zero. The registrar's office was automatically notified whenever a student registered, and staff could connect the student and professor directly. The faculty members' simple request ultimately broke section fill rate analyses. Unfortunately, data quality is often no one's job. Ensure that you have a system in place for campus members to report data quality problems, and give staff time to address the most egregious ones.
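The arithmetic behind the broken analysis is easy to see: fill rate is enrollment divided by capacity, so a capacity of zero makes the ratio undefined. A minimal sketch (the function name and section data are hypothetical, not from the institution in question) shows how an analyst might flag those sections instead of letting them poison the report:

```python
# Hypothetical sketch: computing section fill rates when some capacities
# were set to zero as a workaround. Data and names are illustrative.

def fill_rate(enrolled: int, capacity: int):
    """Return enrollment as a fraction of capacity, or None when the
    capacity is zero and the ratio is undefined."""
    if capacity <= 0:
        return None  # flag for manual review rather than dividing by zero
    return enrolled / capacity

sections = [
    {"id": "ENGR101-A", "enrolled": 28, "capacity": 30},
    {"id": "ENGR250-B", "enrolled": 12, "capacity": 0},  # faculty-approval workaround
]

for s in sections:
    rate = fill_rate(s["enrolled"], s["capacity"])
    label = f"{rate:.0%}" if rate is not None else "undefined (capacity is 0)"
    print(s["id"], label)
```

Surfacing the `None` cases as a data-quality queue, rather than silently dropping them, is what lets someone eventually discover the registrar's workaround.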

“Experts”—from criminologists to real-estate agents—use their informational advantage to serve their own agenda

In today’s competitive environment, data must be used as a strategic asset in higher education decision making. However, we have seen data provide a competitive edge within an institution as well, with deans hiring their own directors of analytics, outside of central business intelligence teams, to better understand their students.

One engineering dean with whom we spoke mentioned that, in a year when overall funding for academic units decreased, his college was able to get more money from central funds. This was because they used data to predict what their enrollment would look like over the next few years. Further, he could predict how growth within the engineering college would affect other departments (e.g., physics) as he knew what courses across campus his students typically take. Institutions with mature BI organizations often have leaders who encourage (or even require) data to inform major decisions.
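The dean's forecast could be as simple as extrapolating a trend from recent headcounts. The sketch below is purely illustrative (the numbers and the linear model are assumptions, not the dean's actual method): it fits a least-squares line to five years of enrollment and projects two years ahead.

```python
# Illustrative only: projecting enrollment with a simple linear trend.
# The headcounts below are made up for the example.

def linear_forecast(history, years_ahead):
    """Fit a least-squares line to annual headcounts and extrapolate."""
    n = len(history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history)) / \
            sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    return [intercept + slope * (n - 1 + k) for k in range(1, years_ahead + 1)]

# Five years of (hypothetical) engineering headcounts, projected two years out.
print(linear_forecast([1200, 1260, 1310, 1385, 1450], 2))
```

Even a crude projection like this, paired with knowledge of which service courses (physics, math) engineering students take, is enough to make a concrete budget argument.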

Knowing what to measure and how to measure it makes a complicated world much less so

The transition from intuition-based decision making to data-informed decision making isn’t easy. Expectations suddenly change, but skillsets require time to catch up. Often, BI teams rejoice at opening data to campus, but the fact is, campus members often have no clue how to approach the data or determine what data is useful to them. We found a few smart ways BI teams were helping direct people to the information relevant to them.

The University of Washington, for example, includes a section within its BI portal that recommends reports to individual users based on their typical report usage and the activity of other campus members in similar roles. This, along with crowd-sourced feedback on how the reports are used, helps guide campus members to the right data for them. Are your campus members fending for themselves, or are you helping them make the transition to data-driven decision making?
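The role-based approach described above can be sketched very simply. This is an assumed illustration, not the University of Washington's actual implementation: count which reports peers in the same role view most often, and suggest the top ones the user hasn't already seen.

```python
# A minimal sketch (assumed, not UW's implementation) of recommending
# reports based on what colleagues in the same role view most often.

from collections import Counter

# (user, role, report) view events -- illustrative data
views = [
    ("ana", "advisor", "retention_dashboard"),
    ("ben", "advisor", "retention_dashboard"),
    ("ben", "advisor", "course_demand"),
    ("cai", "dean", "budget_summary"),
]

def recommend(role, already_seen, top_n=2):
    """Suggest the reports most viewed by peers in the given role,
    excluding reports the user has already seen."""
    counts = Counter(
        report for _, view_role, report in views
        if view_role == role and report not in already_seen
    )
    return [report for report, _ in counts.most_common(top_n)]

print(recommend("advisor", already_seen={"course_demand"}))
```

A real portal would weight recency and incorporate the crowd-sourced feedback the post mentions, but even this role-filtered count captures the core idea: new users inherit the navigation habits of their peers.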

The rest of the book is full of insights pulled from data that encourage more skeptical thinking about everything from conventional wisdom to expert claims. You may want to start your next cross-campus meeting about BI with a little light reading, even just the introduction to this book, and see what kind of conversation arises.
