
ID Case File #2 - The Leaky Pipeline
July 28, 2025
The Dilemma
An email forward appears in your inbox:
--------- Forwarded message ---------
From: Dr. Evelyn Reed <ereed@northwood.edu>
Date: Mon, Jul 28, 2025 at 10:44 AM
Subject: Urgent Consultation Request
To: Skye Calloway <skye@id.inc>
Dear Skye,
Over the last four years, our introductory chemistry course, CHEM 101, has become a significant roadblock for our students. It's a required gateway course for nearly all of our STEM majors, but we're losing almost half of the students who take it; our DFW rate sits at an unacceptable 40%.
The prevailing sentiment among our chemistry faculty is that the problem is simply one of student preparedness. Their consistent recommendation has been to add more tutoring and supplemental instruction. We've invested heavily in these resources, but the needle hasn't moved.
I know the timing is not ideal. It's finals week, which means direct access to students for interviews is impossible, and the faculty are swamped. However, we can provide full access to all of our historical course data and past student evaluations, as well as the course itself.
The faculty will have dedicated time over the upcoming summer break to work with your team to make any necessary changes to the course. To make the most of their time, we need your team to find the root cause now so we can hit the ground running and have the course updated for the fall.
Dr. Evelyn Reed
Dean, College of Sciences
Northwood University
As you can see, it's a classic 'leaky pipeline' problem, but the real challenge is that it's the last week of the semester. We can't interview students or faculty, and, even more importantly, the students who have already failed or dropped the course (the people we really need to talk to) are no longer enrolled and effectively unreachable.
The Dean has given us full access to their systems, but we need to find the root cause without talking to anyone directly.
I've scheduled a follow-up meeting next week to review our initial findings, so you'll need to be strategic about where to focus your efforts.
The Decision
As I see it, you have two primary paths you can take for this initial analysis:
Course Design & Analytics:
Dedicate your week to a deep, forensic analysis of the existing course materials and historical student performance data. Dig into their LMS and review everything (syllabi, modules, assignments, and exams) to find patterns in the course design that might be causing students to fail.
Student Feedback & UX:
Dedicate your week to the learners' side of the story. Analyze past student course evaluations and audit the online learning environment to uncover experiential barriers, like confusing navigation or a lack of support, that might be driving students to fail or withdraw.
Select an option above or scroll down to view the debrief.
The Debrief
Both analytical paths led to a positive reaction from the Dean: there is no 'wrong' answer here. The path you chose didn't determine whether you found a problem; it determined what kind of problem you found.
Focusing on the course alignment uncovered a clear, data-backed instructional problem: an assessment misalignment. This is a tangible, solvable issue that the faculty can address. It's a very successful and valuable finding.
Analyzing the context and environment of the course uncovered a powerful, human-centered experiential problem: a confusing and unsupportive learning environment. This is a more systemic issue that speaks to the students' lived reality.
The real skill isn't just finding a problem; it's knowing how to prioritize your analysis to find the root cause. To understand that, we need to look at the full framework we use for any comprehensive Needs Assessment.
A Needs Assessment is the systematic process of identifying the gap between the current state and the desired state, and our design process is always grounded in one. In a project with no constraints, we would analyze all four layers described below; with such a short turnaround for this analysis, we have to prioritize.
Task Needs Assessment
A Task Needs Assessment focuses on understanding the specific tasks and skills required to perform a job or, in this case, succeed in a course. We deconstruct the work to find out what knowledge, skills, attitudes, and behaviors (KSAB) are required for effective performance.
This could involve:
Analyzing job descriptions and competency frameworks.
Breaking down complex tasks into smaller, more manageable steps.
Observing experts to deconstruct their intuitive skills.
Reviewing the course design and alignment is a classic Task Analysis. You would be reviewing the syllabus, assignments, and exams to map out every task a student must perform to pass. A thorough analysis here could reveal that an exam, for example, is testing a skill that was never actually taught, creating a clear instructional gap.
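To make that mapping concrete, here is a minimal sketch of an alignment check in Python. It assumes you have already tagged each lecture and each exam item with the learning objective it addresses; the objective codes and course artifacts below are hypothetical, not Northwood's actual data.

```python
# Minimal sketch: flag learning objectives that are assessed but never taught,
# or taught but never assessed. All objective codes and artifacts are hypothetical.
taught = {
    "LO1": ["Week 1 lecture", "Week 2 lab"],
    "LO2": ["Week 3 lecture"],
    "LO4": ["Week 5 lecture"],
}
assessed = {
    "LO1": ["Exam 1, Q1-4"],
    "LO2": ["Exam 1, Q5-8"],
    "LO3": ["Exam 2, Q1-6"],  # assessed, but nothing in `taught` covers LO3
}

assessed_not_taught = sorted(set(assessed) - set(taught))
taught_not_assessed = sorted(set(taught) - set(assessed))

print("Assessed but never taught:", assessed_not_taught)  # ['LO3']
print("Taught but never assessed:", taught_not_assessed)  # ['LO4']
```

The tagging is the real analytical work; once a course map exists, the comparison itself is trivial, which is part of why practitioners lean so heavily on course maps (see the Community Insights below).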
Organizational Needs Assessment
An Organizational Needs Assessment aims to align any potential solution with the broader business objectives and strategic goals of the client. It seeks to answer the question: How can our work support the organization's success?
This might involve analyzing:
Strategic goals and initiatives, like new product launches or market expansions.
Performance gaps, like low productivity or high safety incidents.
External factors, like changes in industry regulations or new market competition.
In this case, the Dean has given us a very clear top-level strategic goal: improve student progression and retention by reducing the 40% DFW rate in CHEM 101. However, a full organizational analysis also involves investigating how the current solution aligns with that goal. A key part of our analysis would be to determine if the course's stated objectives and curriculum are truly designed to support student success or if they are misaligned, perhaps focusing on "weeding out" students rather than building them up.
Learner Needs Assessment
A Learner Needs Assessment is all about understanding the learners themselves: their demographics, backgrounds, motivations, challenges, and learning preferences. Without this layer, we risk creating a solution that is technically correct but completely disconnected from the people who need to take it.
This assessment would analyze:
Demographics and cultural backgrounds.
Prior knowledge and existing skill levels.
Intrinsic and extrinsic motivations for learning.
Since we can't interview students directly, we would analyze the data they've left behind, like past course evaluations, to build a picture of their experience. We'd look for recurring themes in their feedback to uncover their specific pain points.
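If the evaluations can be exported from the LMS, even a small script can help surface those recurring themes before anyone has to read hundreds of comments cold. Below is a minimal sketch in Python; the file name, column name, and theme keywords are illustrative assumptions, not Northwood's actual data.

```python
# Minimal sketch: tally recurring themes in open-ended evaluation comments.
# Assumes a hypothetical CSV export with a free-text "comment" column.
import csv
from collections import Counter

THEMES = {
    "navigation": ["confusing", "hard to find", "buried"],
    "workload": ["too much work", "too many assignments", "overwhelming"],
    "instructor presence": ["no feedback", "never responded", "on my own"],
    "assessment mismatch": ["never covered", "not in the lectures", "trick question"],
}

counts = Counter()
with open("chem101_evaluations.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        comment = row["comment"].lower()
        for theme, phrases in THEMES.items():
            if any(phrase in comment for phrase in phrases):
                counts[theme] += 1

for theme, n in counts.most_common():
    print(f"{theme}: {n} comments")
```

A tally like this is only a starting point (you still have to read the comments behind each theme), but it gives the follow-up meeting a data-backed shortlist of pain points.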
Environmental Needs Assessment
An Environmental Needs Assessment evaluates the technological, logistical, and cultural factors that can support or hinder learning.
This might involve:
Analyzing the available technological infrastructure, like the LMS or internet connectivity.
Assessing the physical learning environment for on-site training.
Considering cultural and logistical factors, like organizational culture or time constraints.
For a hybrid course like CHEM 101, an environmental audit might reveal that the LMS is difficult to navigate or that critical resources are buried. These environmental barriers can cause students to fail, regardless of how well-prepared they are.
Deconstructing the Approaches
Now, let's look at the two approaches through that four-layer lens. Both are valid strategies a designer might take, and both have significant pros and cons in this specific situation.
Looking Inside-Out
Analyzing the course and historical data is an 'inside-out' approach. It starts from the perspective of the institution. A core part of this approach is conducting a Task Needs Assessment to ensure alignment: you would analyze whether the final exams are truly aligned with the course's learning objectives, and whether the instructional materials cover what's being tested. A thorough analysis here could reveal a critical flaw, for example that the exams cover content that was never actually taught in the online lectures. This path is excellent for finding these kinds of objective instructional gaps.
So, why isn't this the clear first choice? Because of the context the Dean gave us. The fact that the university has already invested heavily in tutoring and supplemental instruction, and it hasn't worked, is a massive clue. It suggests that the issue might not be a simple instructional gap that more 'help' can fix. While this path could uncover the problem, you risk spending your entire week analyzing the curriculum only to confirm what the failed tutoring already implies: that the problem lies elsewhere.
Looking Outside-In
On the other hand, analyzing student feedback and the user experience is an 'outside-in' approach, rooted in our Human-Centered Design philosophy. It starts from the perspective of the learner. By reviewing past course evaluations, you are conducting a Learner Needs Assessment. By auditing the online learning platform, you are conducting an Environmental Needs Assessment.
However, let's be realistic: this approach has its own serious flaws. We can't let our belief in empathy blind us to the data's limitations. Student evaluations are not a perfect source of truth. They are often skewed toward the extremes (the students who loved the course and the students who hated it), and they completely miss the voices of the students who withdrew before the end of the semester. So we know going in that this data is incomplete.
Making the Best Choice
So, why prioritize this approach? Because in a situation with limited time and a 'black box' problem, our goal isn't to find the definitive answer in one week. Our goal is to form the strongest possible hypothesis. The open-ended comments in course evaluations are a goldmine of qualitative data. They can provide clues about hidden frustrations, like a confusing LMS or a lack of instructor presence. Systemic issues like poor usability or inaccessible materials can create significant barriers. If students struggle to navigate the online environment, they may fail regardless of the content quality, making the environment itself a potential root cause worth investigating.
The Bottom Line
This "outside-in" approach, while imperfect, is a strategic bet that the students' lived experience will give us the clues we need to conduct a much more efficient and targeted Task Analysis later.
Ultimately, both paths require you to analyze data, but the real job of an instructional designer isn't just to analyze data; it's to find the story hidden within it. That story is what allows you to move beyond the surface-level symptoms and solve the right problem.
Community Insights
This section summarizes real-world feedback from instructional design practitioners polled on LinkedIn, Reddit, and other professional forums such as ONILP and Useful Stuff. We've highlighted their poll results, insightful comments, and alternative strategies to showcase diverse approaches to the dilemma.
Summary of Results
While the poll was closely split, a slight majority (58%) favored starting with Student Feedback & UX.
However, the comments reveal a stronger trend: practitioners don't see these as separate choices, advocating instead for an integrated approach where one informs the other, such as using a UX walkthrough to form a hypothesis that is then tested against student feedback data.
Featured Comments
It is difficult to choose just one option. Since I only have a week to review materials and information, I'd want to do more than just review student evaluations (although they provide good information, I think there are other valuable things to review).
In my experience, if a course isn’t working well, it is usually due to some or a mix of these factors: instructor communication and/or grading, writing mechanics, instructions, the course is not aligning (or not aligning well) to the course objectives, the materials/resources/assessments provided to students are too advanced or don’t address the course objectives, and so on. I’d want to review/update the course map, do a time on task analysis, and review the learning materials and resources. So there are quite a few things I could and would want to do before the meeting.
It can be challenging to 'rein in' faculty as they sometimes want to impart all their knowledge, but to support students, only materials and info that relate to the course objectives should be included. A course map is such a valuable tool to do this type of analysis. I use them almost every day in my work.
- Willow Aureala
LinkedIn Profile
Can I look at course design and past student feedback? I would actually likely mix and match the options you’ve given here. Let me explain how I would approach it:
I would put myself in the shoes of the learner in lieu of being able to talk to them, and I would take the course for a test drive. I would imagine that I’m back in my freshman year of chem and start going through the course. I would look for things like: logical flow and organization, clear explanations and definitions, clear language around assignments, number of assignments per week, that the assignments and assessments cover what was covered in the lessons, walls of text, lengthy video lectures, trick questions, and more.
I would create a hypothesis. I’m doing my own learner research, so to increase my odds of correctly identifying the problem, I’m going to try to anticipate what the feedback in the evaluations will be based on my assessment of the course design. If I saw bad flow or information overload, I would look for students complaining the course is “confusing.” If there were too many assignments, I might look for students to say there was “too much work.” Basically, I would try to think of how people without a learning background would describe these things so I can use the evaluations to test my hypothesis. I would also see if there are any direct metrics I can connect, for example, if there are lots of lengthy video lectures and there’s a question about the lectures, I would expect that to trend low.
I would use the student evaluations to test my hypothesis. If I’m correct and see a lot of student complaints and data alignment in the areas I predicted, I would focus on those aspects of the course design that led me to that hypothesis. If there was something in the student data that I did NOT anticipate (maybe a problem faculty member or something about the textbook not matching the lessons, etc.), I would go back and seek that out in the course design to see if it could be fixed. It’s important to remember that this could occur whether or not I am right about my hypothesis, so we have to remember to look at ALL the data to see what it’s telling us.
I would make updates based on my mini research project. If there were any specific words or phrases I found often that supported my hypothesis, I would ask to see course data after the changes to see if those words appear less frequently. This is a great way to show measurable change. The same with the numbers: remember that lecture example? If the long lecture videos got spliced into short daily ones, I would look for an increase in the rating on the lectures. Now, I can PROVE that the changes made an impact with data I was already using anyway.
- Dr. Heidi Kirby
Founder of Useful Stuff
Students are very forthcoming about their course experiences. They talk to each other. They talk to students who have already taken the course. They go to online reviews and engage with discussions there. They share their struggles and their opinions. And, curiously, they have this idea that authority figures, be they parents, or teachers, or employers, don't do any of these things. So they write truths that they might not be sharing IRL with people who aren't their peers.
Any person who takes an online course, or even leverages course technology, is a user *first*. Their world is digital. It's where they connect with their friends. It's how they access their news. It's how they pay for their purchases. And it's how they study. So when they approach an online course, they do so first as "something something online" and only then as "course." And they expect that something online to behave as anything else in their digital lives. With well designed (too well, in some cases) UX. If the UX is bad, they disengage.
So if there is an issue with students struggling in an online course, why would you start anywhere but with the students struggling in the online course?!?!?
- Kristin Neumayer
Learning Experience Designer
LinkedIn Profile