
How one university created a departmental dashboard with actionable metrics for student success

In 2009, UW Eau Claire engaged faculty in a campus-wide gap analysis exercise to identify reasons for delayed student completion. During the process, 60 participating faculty members analyzed more than 1,200 transcripts and 919 degree audits of students who had earned 90 or more credits but had not applied for graduation. Faculty received a stipend averaging $1,000 in return for their participation.

Student names were not redacted from the transcripts and degree audits, which allowed faculty to recognize familiar students and add context to their analysis. Faculty were asked to complete the degree audits for their students and identify key gaps in credit hour completion and progression. The transcripts did not identify demographic factors such as first-generation status, income level, or race.


120

Students dropped unnecessary courses during their 4th year

Common reasons for untimely completion

Faculty discovered that untimely degree completion primarily stemmed from three root causes: major changes, overcredentialing, and underenrollment. Non-academic reasons for not enrolling in certain courses included student employment, lack of awareness that a full course load entails enrollment in 15 rather than 12 credit hours, and poor planning around course availability.

400+

Students missed timely graduation by one course

UW Eau Claire also found that 25% of students changed their major at least once and that one-third of students over-credentialed by enrolling in unnecessary courses. The vast majority of students under-enrolled, failing to complete 30 credits during their first year and 60 credits by the end of their second year.

Resulting policy and curricular changes created one-time fixes for untimely completion

The degree audit and transcript analysis revealed a number of opportunities to address timely degree completion through quick policy and curricular changes. By allowing faculty to directly interact with the data and view the names of their students, contacts report that faculty were much more invested in the change management and reforms that resulted from the exercise.

Public Accountability Matrix (PAM) reports data on departmental performance

The state of Wisconsin mandates that all institutions publicize institutional performance on state-dictated metrics as part of a statewide initiative to improve graduation rates. Therefore, UW Eau Claire developed the Public Accountability Matrix (PAM), a spreadsheet that benchmarks performance at the department level.

Each department’s performance on student engagement, student success, faculty productivity, and economic development is publicly available for viewing. However, because PAM is reported publicly, often by third parties, UW Eau Claire has very little control over how metrics like retention and completion are measured.

Public Accountability Matrix metrics

  • Level of Academic Challenge
  • Active and Collaborative Learning
  • Student-Faculty Interaction
  • Supportive Campus Environment
  • Diverse Perspectives
  • Diverse Interactions
  • Understanding Diverse Backgrounds
  • 2nd Year Retention: First-time, full-time freshmen retained starting the subsequent fall.
  • 3rd Year Retention: First-time, full-time freshmen retained starting the second subsequent fall.
  • 4-Year Graduation: First-time, full-time freshmen who graduate in four years.
  • 6-Year Graduation: First-time, full-time freshmen who graduate in six years.
  • Time to Degree: Average number of semesters taken to attain degree for a given graduating class.
  • Placement: Percentage of graduates employed or continuing formal education for a given graduating class.
  • Undergraduate Student Credit Hours: Number of undergraduate student credit hours supplied by the department.
  • Undergraduate Degrees: Number of baccalaureate degrees produced by the department.

PAM raises faculty concerns regarding metric measurement and student connection

While PAM is effective in reporting department-level metrics, the metrics themselves do not directly speak to the student experience and do not provide an outlet for faculty and staff to improve those metrics. When PAM was first implemented, faculty voiced these concerns, claiming that PAM is not personalized and lacks a direct connection to the student experience.

In addition, faculty reacted negatively to the fact that PAM metrics are typically calculated by the state of Wisconsin and the Department of Education, leaving UW Eau Claire very little say in determining how metrics like retention and completion are measured. For example, the above metrics for retention and completion apply only to first-time, full-time freshmen, which leaves transfers and other non-traditional students out of the equation.

Finally, the metrics in PAM are not directly actionable: while most faculty buy into the notion that the institution must improve first-year retention, it is very difficult for faculty to understand how they can help move the dial on a day-to-day basis.

SAM developed to create personalized and actionable metrics for individual faculty

While the degree audit and transcript analysis led to curricular and policy reforms that facilitate timely degree completion, the exercise did not meet the need for more actionable metrics that deans, department chairs, individual faculty, and advisors could act upon daily. The implementation of PAM paved the way for assessing more granular data on student success at the department level, but the metrics were too broad to generate real behavioral changes.

To address faculty concerns and incent departments to directly participate in student success initiatives, the university implemented a more targeted version of PAM called the Strategic Accountability Matrix (SAM). The university began to roll out SAM in 2009 and fully implemented the matrix in 2011. SAM benchmarks departments on specific metrics that can be inflected by individuals within the department.

Instead of telling a department chair to improve first-year retention, SAM measures departments on metrics such as participation in mid-term grade reporting and student involvement in high-impact practices. The goal was to create metrics that are student-centric, shorter-term, and grassroots-level, while still focusing on initiatives that amalgamate to improve retention, progression, and completion. However, general cost, tuition, retention, and completion metrics were removed from SAM because the data are not actionable for individual faculty members.

Weighting of metrics signals departmental priorities and allows greater flexibility

While all departments are benchmarked on the same 18 metrics, each department weights the metrics differently depending on their strategic goals and student population. Department chairs, deans, and the provost negotiate to allocate a weighting of 0, 1, or 2 to each metric. These negotiations aim to ensure that each department allocates the weight that naturally fits the demographics, priorities, and specific characteristics of the program.

The weightings of metrics signal to faculty members which metrics are most important to the department during any given academic year. For example, lower-enrollment majors like Philosophy and Religious Studies can weight the metric for “New Freshmen Majors” at 0, while Nursing may weight the same metric at 2. The weightings display the priority level of each metric and identify areas where faculty can have a particularly high impact. Contacts affirm that allowing departments to weight each metric helped ease faculty skepticism about how much impact faculty and departments can have on any given metric.
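
To make the weighting scheme concrete, here is a minimal sketch of how the negotiated 0/1/2 weights might be recorded for two departments. The departments, metric selections, and weights are hypothetical illustrations, not UW Eau Claire’s actual assignments.

```python
# Hypothetical record of negotiated SAM weights (0, 1, or 2) for two departments.
# A weight of 0 signals that a metric is not a priority for the department this
# year; a weight of 2 flags an area where its faculty can have especially high impact.
sam_weights = {
    "Philosophy and Religious Studies": {
        "New Freshmen Majors": 0,                         # low-enrollment major de-emphasizes recruiting
        "Liberal Education Student Credit Hours": 2,      # heavy general-education service role
        "Collaborative Research or Creative Activity": 1,
    },
    "Nursing": {
        "New Freshmen Majors": 2,                         # high-demand program prioritizes new majors
        "Internships": 2,                                 # clinical experiences are central to the degree
        "Liberal Education Student Credit Hours": 0,
    },
}
```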

Strategic Accountability Matrix metrics

  • Collaborative Research or Creative Activity: Percentage of graduating majors that participated in at least one faculty/student collaborative research or creative exercise.
  • Internships: Percentage of graduating majors that participated in at least one “internship” experience including traditional internships, student teaching, and clinical experiences.
  • Intercultural Immersion: Percentage of graduating majors that participated in at least one intercultural immersion experience.
  • New Freshmen Degree Plans: Percentage of freshmen majors with four-year degree plans within the degree planning system.
  • Advisee Satisfaction: Percentage of majors that rated the overall quality of their academic advising as “good” or “excellent.”
  • Student of Color Majors: Percentage of current majors that are Native American, African American, Hispanic, Asian, Alaska Native, Native Hawaiian, Pacific Islander, or two or more races.
  • Transfer Student Majors: Number of new transfer students transferring into a major offered by the department.
  • High School Student Interest: Share of graduating high school students who indicated an interest in a major offered by a specific department on submitted ACT scores.
  • New Freshmen Majors: Share of first-time, full-time new freshmen who are majoring in the department.
  • Total Majors: Share of all UW-Eau Claire students majoring in the department.
  • Liberal Education Student Credit Hours: Total number of General Education student credit hours offered by the department.
  • Winterim Undergraduate Student Credit Hours: Total number of undergraduate Winter Session student credit hours offered by the department.
  • Summer Undergraduate Student Credit Hours: Total number of undergraduate Summer Session student credit hours offered by the department.
  • Student Credit Hours Lost due to Withdraw/Repeat/Fail: Share of all student credit hours lost because students withdrew, failed, or repeated courses offered by the department.
  • Freshman Mid-Term Grade Reports: Percentage of faculty teaching freshmen courses who submitted mid-term grade reports.
  • 30 Credits First Year: Percentage of majors earning 30 credits in their first academic year.
  • 60 Credits First Two Years: Percentage of majors earning 60 credits in their first two academic years.
  • Tuition from Mini-Session: Tuition generated by Winter Session and Summer Session enrollment in courses offered by the department.

Actual to expected performance ratio calculated to determine funding allocation

To calculate overall performance, UW Eau Claire created a ratio of actual to expected performance for each individual metric. Each ratio is multiplied by the weight that the department allocated to that metric, and the weighted ratios are summed into a “score” for overall performance across all 18 metrics. Each department then receives a portion of a $400,000 central fund; its share is determined by dividing the department’s score by the total score for all departments. Initially, the funding was reallocated from other areas within the division of Academic Affairs; it has since become a permanent bonus pool.
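
The arithmetic can be illustrated with a brief sketch. The metric values, expected baselines, and weights below are invented for illustration; the code simply mirrors the ratio-times-weight sum described above.

```python
# A minimal sketch of the SAM score calculation: each metric's actual performance
# is divided by its expected performance, multiplied by the department-assigned
# weight (0, 1, or 2), and the weighted ratios are summed. All figures are invented.
metrics = {
    # metric name: (actual, expected, weight)
    "30 Credits First Year": (0.62, 0.55, 2),
    "Freshman Mid-Term Grade Reports": (0.80, 0.70, 1),
    "New Freshmen Degree Plans": (0.45, 0.50, 1),
    "Internships": (0.30, 0.40, 0),  # weight 0: contributes nothing to the score
}

def sam_score(metrics):
    """Weighted sum of actual-to-expected performance ratios."""
    return sum(weight * (actual / expected)
               for actual, expected, weight in metrics.values())

print(f"Department score: {sam_score(metrics):.2f}")
```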

$400K

Allocated annually to departments based on their performance on departmental dashboards

The funding is allocated 50 percent based on the department’s FTE and 50 percent based on the department’s performance on SAM, to ensure that the funding does not disproportionately benefit smaller departments. On average, departments receive about $10,000 each year. The funding has no strings attached; department chairs and deans convene to determine the best way to invest the additional funding in order to continue to improve on the departmental metrics. In the wake of state budget cuts and very little flexible spending at the department level, department chairs welcome the additional funding to finance smaller projects and necessities.
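
A companion sketch, using invented department figures, shows how the 50/50 split between FTE and SAM performance would distribute the $400,000 pool.

```python
# A minimal sketch of the allocation rule described above: half of the pool is
# split in proportion to each department's FTE, the other half in proportion to
# its SAM score. Department names and figures are invented for illustration.
POOL = 400_000

departments = {
    # department: (fte, sam_score)
    "Biology": (45.0, 21.3),
    "Nursing": (38.0, 24.1),
    "Philosophy and Religious Studies": (12.0, 15.8),
}

total_fte = sum(fte for fte, _ in departments.values())
total_score = sum(score for _, score in departments.values())

for name, (fte, score) in departments.items():
    share = 0.5 * POOL * (fte / total_fte) + 0.5 * POOL * (score / total_score)
    print(f"{name}: ${share:,.0f}")
```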

Departments quick to react to visible performance gaps

SAM is updated annually and sent to department and dean’s offices as a PDF. In addition, faculty and staff can request to view their department’s performance at any point. The increased visibility that SAM provides to departments incents chairs to take the short-term steps and make the investments necessary to improve student academic and engagement outcomes.

Contacts report that SAM has increased collaboration between departments and heightened faculty awareness of student success outcomes. The Institutional Research department is currently developing an interactive dashboard version of SAM that can be updated each semester and is easily accessible online.

Local Curricular Reforms

  • Aligning pre-requisites with local CCs: Biology department adjusted introductory curriculum to better suit transfer students.
  • Revitalizing first-year instruction: Low-enrollment science programs shifted from “weeding freshmen out” to more engaged pedagogy.

Investments in Student Support

  • Increasing instructional support for at-risk groups: Psychology department added supplemental instruction to address noticeable achievement gap.
  • Requiring four-year degree plans: Share of all first-year students with complete degree plans grew 45 percent in the first two years of assessment.

Iterative development of SAM encourages reforms based on institutional priorities

Each year, deans, department chairs, and the provost review SAM to ensure that the metrics are still actionable for faculty and pertinent to institutional goals. The metrics have changed as flaws in the model have been discovered or as performance goals for the institution are prioritized.

For example, when the metric for compliance with midterm grade reporting was added to SAM, faculty participation rose from 20 percent to 80 percent across the institution. The addition of this metric signaled the importance of mid-term grade reporting as a retention strategy, thus encouraging faculty participation.

Departmental dashboards foster lasting cultural change at the university

In addition to the local curricular reforms and investments in student support, the implementation of SAM has successfully created an accountability mechanism for departments to invest in student outcomes. The departmental dashboards clarify each unit’s role in contributing to institutional performance goals, which creates awareness of how all of the individual actions within each department add up to ultimate improvement or decline in student success.

Deans and department chairs have begun to include goals for these metrics in their annual budgeting, planning, and reporting processes to add an additional level of accountability and ensure that the metrics stay top-of-mind for faculty.

Results: Departmental dashboards prepare institutions for a likely move to performance-based funding

As many states have recently moved to performance-based funding (PBF) models, institutions have scrambled to acclimate to the new budget reality. SAM has allowed UW Eau Claire to adequately prepare for a likely move to PBF at the state level. The departmental dashboards push faculty, staff, and unit leaders to adjust to a culture of evaluation and continuous improvement, preempting a potential top-down mandate from the system.