You know that feeling.
You've just pushed out a compliance module. Tight deadline, 40 slides, content handed to you in a Word doc the week before launch. You did what you could. Cleaned up the language, added a few interactions, made it look decent.
But you already know what's going to happen.
Learners will click through it. Some will leave it running in another tab. A few will email their manager to say it was "fine."
And you'll move on to the next one. Because there are six more in the queue.
This is the reality for most learning designers. When the request backlog never stops, "good enough to ship" becomes the standard. And engagement becomes something you'll optimise for next time.
Except next time never really comes.
So, when the conversation turns to "how do we make our eLearning more engaging," the usual suspects show up. More interactions. Better visuals. Maybe some gamification.
But here's the thing: none of those address the real problem.
Because engagement isn't a design style. It's a design outcome.
Learners don't disengage because your course isn't fun enough.
They disengage because it doesn't work with how their brain actually operates.
One of the largest studies in online learning, analysing 6.9 million video sessions across edX, found that most learners stop paying attention after about six minutes (Guo, Kim & Rubin, 2014).
And it doesn't matter how polished or well-produced the content is. Attention drops off anyway.
At the same time, research into multimedia learning consistently shows that when content is broken into smaller, learner-paced segments, people don't just stay longer. They actually learn more. A meta-analysis of the segmenting effect found improvements in both retention and transfer, alongside reduced cognitive load (Rey et al., 2019).
So, there's a tension at the heart of every course you build:
Learners need less content at once, but more cognitive effort while they're there.
That's where most courses fall apart: the structure works against the learner from the start.
Think about a typical course. A long explanation up front. Maybe a video. A few click-to-reveal interactions. A quiz at the end.
On the surface, it feels complete.
But from a learning perspective, it's doing two things wrong at the same time: overloading attention and under-activating thinking.
Learners sit through large chunks of information without being asked to do anything meaningful with it. And when thinking isn't required, engagement evaporates. Fast.
Here's what's interesting: the fix isn't complex interactivity. Research shows that simply embedding questions throughout a learning experience, even basic ones, significantly improves both engagement and retention.
Not extensive branching scenarios. Not custom-built simulations.
Just moments where learners have to stop and think.
Which leads to a much more useful definition of engagement:
Engagement isn't about holding attention. It's about creating effort that feels worth it.
Once you see engagement this way, the whole approach shifts.
Instead of asking "how do we make this more interactive?" you start asking a better question:
Where does the learner actually have to think?
That one question is where most of the gains come from. Because learning doesn't happen when people are exposed to information. It happens when they retrieve it, apply it, and make decisions with it. And crucially, when they get feedback on those decisions.
Here's a practical example. Say you're building a module on your company's returns policy. The instinct is to start with an explanation: here's the policy, here are the rules, here are the exceptions.
But what if you opened with this instead:
A customer is standing in front of you with a broken product and no receipt. They're frustrated. What do you do?
Now the learner is thinking before they've been told anything. They're drawing on what they already know, making a judgment call. And when you show them the outcome, they have a reason to care about the policy detail that follows.
Same content. Completely different cognitive experience.
The pattern is simple: pose a problem, let the learner commit to a decision, give feedback on that decision, then explain the detail.
Each cycle should be short, roughly aligned with that six-minute attention window, but complete enough to feel meaningful. Over time, those moments compound into real understanding.
Even if you nail the design, there's still a gap.
You can run a post-course survey. You can collect anecdotal feedback. Learners might tell you they "liked it" or found it "useful."
But you can't see where they actually lost interest. You can't see where they dropped off, which questions they struggled with, or whether the module you spent the most time on is the one they skipped entirely.
Without that visibility, you're designing in the dark. You might be getting better, but you have no way to know for sure.
Engagement isn't just something you design for. It's something you need to measure. Because the gap between what learners say about a course and what they do inside it is often the most important insight you'll get.
If you're trying to make your courses more engaging, you don't need more sophisticated tools. You need better constraints.
Shorter attention windows. Clearer structure. More thinking. Less noise.
And critically, a way to see whether it's actually working.
When you get that right, engagement isn't something you bolt on at the end. It's something that emerges naturally from the design.
Want to close the gap between what you design and what your learners actually experience? Our Practical Guide to Learning Analytics shows you how to measure what matters, from where learners engage to where they drop off. Download the free eBook →
Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of MOOC videos. Proceedings of the First ACM Conference on Learning @ Scale (L@S '14), 41–50.
Rey, G. D., Beege, M., Nebel, S., Wirzberger, M., Schmitt, T. H., & Schneider, S. (2019). A meta-analysis of the segmenting effect. Educational Psychology Review, 31, 389–419.