Measuring the impact of learning 

Jun 26, 2023

 

How do we prove that learning is effective? And how do we measure results? It’s a recurring question, and a challenge for L&D teams whenever they’re asked to build a business case or demonstrate return on investment. 

But the question is bigger than simply how we measure learning. Why do we want to measure the impact of learning? Should we always measure learning? And how do we quantify those intangible benefits of learning, things like job satisfaction, wellbeing, and personal fulfilment?  

Dan Tohill is CEO of learning consultancy Inspire Group. They specialise in learning design, leadership development, and helping organisations source top L&D talent. The business was founded in 2001, and their client list is a who’s who of blue-chip brands in New Zealand, Australia, and Asia. That’s a long-winded way of saying Dan has had one or two conversations about measuring learning over the last twenty years and has sage advice to share.  

 

Get clear on why you’re measuring learning 

Dan advises getting crystal clear on why you want to measure learning. “Everyone thinks measuring learning is a good idea. But get clear on why you want to assess impact. If a learning initiative’s really small, it’s probably not worth taking time to evaluate it. Measurement needs to be commensurate with the value of a learning programme, because you could spend as much time and money quantifying the impact of learning as you do creating it.” 

Dan outlines the four key reasons organisations measure the impact of learning. 

  1. Industry awards. Judging panels want proof and aren’t easily impressed. They say, ‘Yeah, that's cool, but did it make a difference?’ 
  2. Critical learning initiatives. Say, for example, people working in a high-risk organisation experience injuries or death. That’s an intolerable situation, so the organisation needs to prove its health and safety learning solution works.  
  3. Learning designed to deliver financial impact. The business case for this learning is often informed by hard business metrics. An example is when an organisation decides to revamp its sales model to meet business growth goals. Results are measured because organisations want to know their return on investment.  
  4. Improving learning. L&D teams measure engagement with learning through qualitative and quantitative measures and learning analytics. They want to know if learning is well received, and how to improve it to make it more engaging and effective.  

Once you’re clear on why you’re measuring results, get clear on who you’re measuring learning for.  

Dan says, “L&D folks will ask different questions from senior leaders. L&D teams will be interested in formative assessment that evaluates the efficacy of learning as they build and roll it out, because that data will help them shape learning to be more effective and plan for the future. Senior stakeholders, meanwhile, will be more interested in summative evaluation that summarises learners’ achievements and helps demonstrate ROI. What new skills and knowledge do people have? Did the learning have the desired results?”  

 

Learning doesn’t exist in a vacuum  

Dan explains that at the start of any learning project it’s important to ask: what are you trying to achieve? Can you measure that outcome? Are you measuring that outcome now?  

A pragmatic lens helps when it comes to measuring impact. Dan says, “Organisations aren't going to introduce a new raft of measurements just to track learning. And they shouldn't either, because collecting data is time-consuming. But if you think you've got a compliance issue because you've got certain data, or you've identified a sales opportunity from data, go back to those measures after learning has been rolled out and see what influence your learning had.”  

However, attributing positive improvements to learning initiatives isn’t clear-cut, because the closer you get to hard success metrics, the more other factors get in the way. 

Dan says, “Sales are often a good example. You create new sales training and you're looking for a 20% increase in sales. If you roll that programme out in a period where nothing else changes, and you get an increase in sales, you might pat yourself on the back. But the reality is the world isn’t static, and while you’re rolling out your programme, salespeople will move around, new product lines will arrive, and marketing will run a campaign. So, learning never exists in a vacuum, and it’s hard to draw straight lines between learning and positive outcomes.” 

“But most of the time, when stakeholders who control budgets spend large on an initiative, learning might only represent a quarter of that spend. And they’re interested in the big picture and overall results, rather than the efficacy of the various measures they took.” 

 

Don’t confuse assessment with evaluation  

Dan explains that people confuse assessment with evaluation. “Assessments measure whether or not someone has learned something. They don’t measure how effective learning outcomes are. You can have poorly designed learning that people love because it’s fun, and they do well in the assessment, but the learning doesn't work because it's focusing on things that don't matter.” 

“We had a client years ago who said they were going to measure the effectiveness of our learning solution based on the attainment of learners. I said, ‘Okay, how's that gonna work?’ They said, ‘If learners get over 85% in the assessment, we’ll rate your learning as effective.’ I said, ‘Well, that's cool. We’ll just make the assessment really easy.’”  

“Evaluating efficacy is different to an assessment. You evaluate efficacy to understand if learners have taken what they've learned and are applying it in the real world. You observe behaviours back on the job to see if people are doing things differently. Maybe managers report changes in behaviour. Maybe results change, sales or customer satisfaction ratings go up, or health and safety incidents go down. Efficacy is real world stuff.” 

However, your work environment influences whether new behaviours are practiced on the job. Dan says, “Take health and safety training. New recruits go through learning, come back, and start behaving in new ways. But the wider team laugh at them and say, ‘Don’t bother with that mate, we don’t do things like that here.’ When learners are trying to demonstrate new behaviours in the work environment, that environment needs to be conducive to change.”  

“As a result, leadership support for learning is critical, with leaders celebrating change when they see it and calling out old behaviours when they creep back in. Part of learning design is considering and communicating what’s expected of leaders. An example is encouraging leaders to check in with learners at their next team meeting and saying, ‘Okay, team, you did your health and safety training last week, what are you doing differently now?’ It could be that simple.” 

Dan explains that measuring learning starts at the learning design stage. “Design measures into your learning to influence its efficacy and make that efficacy easier to evaluate. But don’t pack too much in, because there's a balance to hit. People are maxing out their cognitive load, so the last thing you want is for your carefully crafted learning to be dismissed as just some more shit from HR. Get to the point. Make your learning, and your communications about that learning, immediately meaningful and relevant to get engagement.”  

 

Seek clarity on the change you want to see 

Dan notes that the common factor in learning projects that measure impact effectively and get good results is clarity about the change the organisation wants to see. “Successful learning initiatives are very clear about the measurable, demonstrable change they want to see as a result. But often learning project sponsors aren't clear on this point.”  

“Good learning designers will always ask, what are you trying to achieve here? How do you know what you’re doing now isn’t working? What things are you measuring currently? Without answers to those questions, we don’t know what levers to pull.” 

“Often we’re told, ‘We need X learning programme.’ But when we dig deeper we find people already have those skills, they’re simply not using them because they lack motivation. Often organisational challenges aren’t learning issues but leadership issues.”   

“We were approached by a business who wanted e-learning to show their people how to use a CRM. I asked, ‘Is the CRM new?’ ‘Oh no, we've had it for years, but people don’t put in the right data. They did before, but now they can't be bothered.’ In that situation, learning isn’t going to help. It’s a motivation issue. People don't believe entering the data is important.”  

“L&D teams and consultants are guilty of taking orders when we need to be brave enough to challenge the request, or walk away if the learning solution a business thinks it needs isn’t going to shift the dial and deliver the results they want. Equally, we need to have the confidence to guide senior leaders to clarity on the measurable change they want to see.” 

 

How to present measurements in a way that counts  

Dan shares tips for measuring learning and presenting those measurements in a powerful way. “Make the evaluation as small and focused as possible, because if it's overblown and unwieldy, measuring will be too hard and no one will do it. Don't try to build new measurements and put extra load on your organisation. Use measurements already in place, because if you think you've got a problem, you're probably already capturing data that shows that.”  

“Measure behaviour by interviewing stakeholders. But keep that manageable too. Focus on a couple of key questions: how has behaviour changed, and is that new behaviour leading to better outcomes for the organisation?” 

“When you start looking at human behaviour, you’ll find stories. Those stories are data, but they’re also what people respond to most. Statistics that prove positive change are important, but people remember good news stories better. If you want to gain support for a learning initiative, gather stories of the problems people are experiencing. And if you want to demonstrate the success of a learning initiative, gather stories of positive change.”  

 

Steal Inspire Group’s learning evaluation framework  

Inspire Group’s approach to evaluation design is grounded in the Kirkpatrick model, which evaluates learning at four levels: reaction, learning, behaviour, and results. They apply it with a healthy dose of pragmatism, because Inspire Group know from 20 years’ experience that, despite the best intentions, most learning initiatives aren’t evaluated past measuring engagement and performing learning assessments. 

Principles that guide Inspire Group’s approach include: 

  • Evaluation design starts when you design your learning solution. 
  • The influence of L&D wanes as other factors affect the learning solution’s effectiveness. Some (but not all) of these factors can be mitigated with more holistic design thinking. 
  • Clarify your stakeholders for evaluation, and tailor the evaluation to their needs, e.g., avoid overblown evaluations when the sponsor needs something simple. 
  • Use existing metrics to measure success, as they’re likely to have been the catalyst for the learning initiative. 

To determine what levels of evaluation are required, Inspire Group answer these six questions.  

  1. Why are you evaluating in the first place? What will you do with the results?  
  2. Who are the stakeholders for each level? What are their evaluation information and data needs? Answering this avoids overcomplicated evaluation.  
  3. What metrics will be used to evaluate? Use existing measurements, as evaluation often fails when the effort required to collect data outweighs the value of the insights gained. 
  4. Which evaluation measures are summative vs formative?  
    • Summative measures gather data to show whether a programme has worked.  
    • Formative measures gather data to help you evaluate your learning solution and modify it as you go, to provide maximum impact.  
  5. When will measurements be collected? 
  6. How will you collect and collate data? 

Amplify the impact of learning with Chameleon  

Chameleon is a fast, easy, beautiful content authoring tool that anyone can use, paired with a hosting platform where you can host, share, and report on your Chameleon learning content. Find out more about our hosting here or view a demo.  


 
