Was the impact of remote instruction as bad as we feared? Here’s what the data shows


It’s fair to say many colleges and universities struggled with the rapid transition to remote instruction in spring 2020. The semester was hard on students as well as faculty. At the time, we wondered: How would the upheaval affect student progress? And as the pandemic surges across the country, raising the possibility of more upheaval in Spring 2021, institutions are scrambling to answer that question.

Benchmark data is an invaluable source of insight during periods of rapid change, letting you compare your institution’s performance to that of your peers. To investigate the impact of remote instruction on student success, our Academic Performance Solutions team analyzed course completion data across a cohort of 49 partner institutions. In this post, I’ll outline key findings from that analysis and discuss what they mean for Spring 2021.

N=56 standardized departments. All analyses exclude individual instruction courses. Both upper and lower division courses included unless otherwise noted.

64%

of academic departments saw a decrease in course completion from Spring 2019 to Spring 2020

Course completion fell in more than half of departments

Data confirms the mountain of anecdotal evidence that students and faculty struggled with the mid-semester transition to remote instruction. From Spring 2019 to Spring 2020, 64% of academic departments saw a decrease in course completion.

But there is a flip side to this story: amidst unprecedented turmoil, more than a third of departments increased their course completion rates. And the biggest single drop was 6.15 percentage points, not what you’d hope for in a normal year but clearly better than some of the doomsday scenarios we imagined.

Takeaway: The fact that more than a third of departments increased course completion, and that the largest drop in any department was not catastrophic, is a testament to the herculean efforts of faculty, students, and administrators to make remote instruction work.

College-level analysis won’t tell the full story

One point of emphasis when considering overall departmental performance: the way colleges or schools are organized doesn’t necessarily align with where students or faculty struggled. We need to look to department- or course division-level data to fully understand these changes in course completion. For example, from Spring 2019 to Spring 2020, the largest increase in lower-division course completion and the second-largest decrease both occurred in engineering departments:

+3.19

percentage-point increase in course completion in mechanical and aerospace engineering from Spring 2019 to Spring 2020

-6.15

percentage-point decrease in course completion in chemical and biological engineering from Spring 2019 to Spring 2020

This illustrates why it’s critical to provide relevant data to deans and department chairs so that the appropriate follow-up can occur.
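For readers who want to run a similar department-level comparison on their own records, the calculation above can be sketched in a few lines of pandas. The column names and the enrollment figures below are illustrative placeholders, not EAB’s actual schema or data:

```python
import pandas as pd

# Hypothetical course-completion records; departments, terms, and counts
# are made up for illustration only.
records = pd.DataFrame({
    "department": ["Mech & Aero Eng", "Mech & Aero Eng",
                   "Chem & Bio Eng", "Chem & Bio Eng"],
    "term": ["Spring 2019", "Spring 2020"] * 2,
    "enrolled": [400, 410, 500, 480],
    "completed": [360, 382, 455, 407],
})

# Completion rate per department per term.
rates = (records
         .assign(rate=lambda d: d["completed"] / d["enrolled"])
         .pivot(index="department", columns="term", values="rate"))

# Percentage-point change, Spring 2019 -> Spring 2020.
rates["pp_change"] = (rates["Spring 2020"] - rates["Spring 2019"]) * 100

# Departments whose course completion fell year over year.
declined = rates[rates["pp_change"] < 0]
```

Expressing the change in percentage points (rather than percent change) keeps departments with very different baseline rates directly comparable, which is how the figures in this post are reported.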

Takeaway: Was a big increase rooted in new-found leniency or forgiveness? To answer that, we need to watch those same rates closely for 200- and 300-level courses to see whether students succeed in the next level of courses. Does a big decrease point toward a limit of online learning? If so, consider that when deciding which classes to prioritize for face-to-face instruction.

Benchmark data is just the start of the conversation

Beyond department- and course-level analysis, we also analyzed course completion by instructor type to see how faculty fared in adapting to changed circumstances. We found that tenured and tenure-track faculty saw larger decreases in course completion from Spring 2019 to Spring 2020 (-2.1 and -2.7 percentage points, respectively) than adjuncts, graduate assistants, or full-time non-tenurable faculty. This will mean something different depending on the institution and even the department. These questions may help you reflect on data about course completion by instructor type for Spring 2020:

  • Did you deploy tenurable instructors against your most challenging lower-division courses, where you might expect to see students struggle more because of the changed modality? Might that account for a decrease in course completion among these instructors?
  • Did your tenurable instructors struggle more than other instructors to adjust to the new modality, and should they therefore receive more focused training?

Surprisingly, the decrease in course completion among tenured and tenure-track faculty isn’t driven by general education courses. In fact, when you filter for courses in common general education departments:

  • Tenured faculty course completion rates rose (+0.8 percentage points compared to -2.1 in all courses)
  • Tenure-track faculty course completion rates decreased less (i.e., -1.8 percentage points compared to -2.7 in all courses)

This suggests general education instructors were better able to adapt to the switch to remote instruction.

Data on general education in this context refers to lower-division courses in the following six departments: Biology, Chemistry, English and English Literature, Mathematics and Statistics, Psychology, Sociology. Individual instruction courses were excluded from analysis.

Takeaway: Did your institution invest extra time and resources in training and supporting general education instructors? Continued support for general education instructors and students will be crucial as institutions focus on engaging and retaining freshmen who are beginning their higher ed experiences remotely.

Context for comparison is crucial to making data-informed decisions

COVID-19 has exacerbated a tension that was growing on college and university campuses even before the pandemic hit: amid declining enrollments, institutions need to contain costs while advancing student success. In this environment, data is critical to ensure efficient resource allocation. Benchmark data is a valuable tool to understand your institution’s performance and spur discussions across campus about what changes need to be made and how best to move forward.
