Traditions of measuring impact
Measuring learning has been a hot topic for decades, and the field has established traditional models for it. These are often connected to the well-known Kirkpatrick model of learning evaluation.
Kirkpatrick’s four levels of learning evaluation
Although it is old, the Kirkpatrick model is far from obsolete. It still works as a basic framework and is commonly used in companies to evaluate learning effectiveness. The Kirkpatrick model includes four levels:
Reaction: Learners' immediate reactions to the training.
Learning: The extent to which learners gain knowledge and skills.
Behavior: The degree to which learners apply what they learned.
Results: The overall impact on organizational goals.
As measurement progresses from the first level to the fourth, it becomes more complex but also provides more in-depth information.
Broadening the scope of learning measurement
In the webinar, we discussed how many organizations struggle to measure levels 3 and 4 due to constraints like time, staffing, and a lack of appropriate metrics and tools. On one hand, measuring the third and fourth levels is tricky, yet focusing only on levels 1 and 2 rarely gives a clear and comprehensive picture. On the other hand, sometimes the first two levels are enough and the third and fourth are not needed or practical. One of the key messages of the webinar was that it’s necessary to broaden this approach to include evaluating, among other things, business impact. In fact, business goals should always set the direction of any training.

Practical challenges and solutions
Although measuring impact is, at least in theory, an old and established practice, many companies admit that they are still taking their first steps on this path. While effective measurement comes with its challenges, participants discussed and shared practical examples of how they have overcome these hurdles. One organization noted that one of their biggest problems is the quality of their data. They have therefore started by improving data quality and measuring the coverage of learning solutions before focusing on impact measurement. Another organization has experimented with smaller-scale initiatives to test different methods of impact measurement. It can be easier to follow up with a smaller group to see the long-term effects of learning initiatives. As this participant pointed out, it often takes time for new practices to take hold in learners’ everyday lives. This approach also creates an opportunity to build up processes for impact measurement. Participants also discussed follow-up on trainings, including further training and coaching by supervisors to reinforce learning.
Making it stick
One of the key points of the webinar was that measurement should always start from, and be aligned with, strategic goals. The goals of any training should contribute, in small or large ways, towards achieving these strategic goals. Furthermore, as one participant pointed out, continuous learning is part of a broader cultural change, even though it may not be explicitly outlined in strategic goals. Continuous skill updates are essential in today’s fast-paced environment, and measuring learning effectiveness should reflect this ongoing need for development. Employees must internalize this attitude for learning to really stick.
Data and measurement
High-quality data is crucial for effective measurement. Currently, this data is often either hard to access or not collected in the first place. Organizations need robust systems for collecting and analyzing data to draw meaningful correlations between learning initiatives and business outcomes. Beyond the lack of data, it can be hard to establish a clear cause-and-effect relationship between a training and its impact on strategic goals. Some results are easier to identify, such as a drop in accidents after a safety training, but often the connection is not that clear. This is partly because learning initiatives are always part of a larger strategy: a DEI training is only one part of your company’s efforts towards a diverse and inclusive workplace, but it is an important part.
Conclusion
While there is room for improvement, as always, progress has been made. Professionals are passionate yet practical about measuring impact. We at Bitville are confident that solutions to these challenges can be found. AI in particular holds great potential, helping to process large volumes of data and offering valuable insights. The webinar concluded with a call to action for organizations to adopt a more comprehensive and strategic approach to measuring learning effectiveness. By aligning learning initiatives with business goals, improving data quality, and leveraging measurement models, organizations can better understand and enhance the impact of their learning solutions.