4 reasons to evaluate your online training

Many online trainings and e-learning courses are developed with great care. It’s important to maintain that focus after the training is launched: neglecting it can seriously harm both your image and the effectiveness of your training. In this blog, we give you the 4 most important reasons to evaluate your online training, so you can keep your finger on the pulse.

1. Accurate and current

Online trainings can’t be viewed in isolation. They’re always influenced by developments in the field and the zeitgeist. A current example is, of course, the corona crisis: e-learning courses that were due to launch in March 2020 have probably been rewritten. Other events, such as legal amendments, reorganisations or a new policy plan, also have a major influence on the accuracy and timeliness of your training. This matters on a smaller scale too. Broken links, an outdated calendar, videos that have been taken offline and grammatical errors can cause serious frustration. The training then comes across as unprofessional and unconvincing, and that’s the last thing you want.

2. Motivation and growth

Besides learning something, you want the users of a training to look back on it with a positive feeling about both its content and its form. When you enjoy a training, you remember the content better and can apply it in practice sooner. For these reasons, it’s important to keep an eye on your e-learning. When it loses relevance or is no longer fully up to date, the learners’ enjoyment will decline as well. That’s a shame, because an e-learning can motivate people to keep developing themselves. An opportunity you don’t want to miss.

3. Result and strategy

You didn’t create the online training for nothing; of course you expect results from it. Whether it’s about productivity, corporate culture or sales, you’re probably very curious whether your training, combined with the commitment of your staff, achieves your goals, or whether you need a different tactic. If you don’t evaluate your learning content, this will be hard to determine, and you won’t be able to optimise your e-learning.

4. Incomplete evaluations and wrong conclusions

It’s great when your training scores an 8.8 for satisfaction, right? The picture is more nuanced than that, so don’t draw conclusions too quickly. A satisfied user doesn’t automatically mean learning has taken place or behaviour has changed. So which conclusions can you draw from this result? To understand this better, we’ll look at the evaluation model of Donald Kirkpatrick (1924-2014). This educator distinguished four levels of training evaluation.

Kirkpatrick’s evaluation model

  • Level 1. Reaction: how did the users experience the training?

  • Level 2. Learning: what do they remember? Measure this with a knowledge or skills test, or with an application assignment.

  • Level 3. Behavioural change: this mostly shows itself in practice.

  • Level 4. Results: which outcomes follow from the changed behaviour?

An essential feature of this model is that every level builds on the one before it. Someone can be quite satisfied with the training yet have learned little, but when someone isn’t satisfied, it’s almost certain they haven’t learned much either. If you only evaluate a training at the fourth level and the outcome is negative, you can’t pinpoint the problem, because you haven’t evaluated the underlying levels. So that doesn’t help you. That’s why it’s important to cover all 4 levels in your evaluation, so you can address any dissatisfaction in a targeted way.

Let's get started

Hubper’s evaluation module collects user feedback on, for example, your e-learning or online trainings. You can design this evaluation yourself and include all 4 levels of the evaluation model. Moreover, you can add conditional logic. If someone gives a lower rating on a certain aspect, you can follow up with a question about possible points of improvement. If someone indicates they want to learn more about a certain subject, you can redirect them to an extra path with further questions on that topic. This way you make optimal use of your evaluation moment.
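To make the branching idea concrete, here is a minimal sketch of conditional follow-up logic in Python. This is not Hubper’s actual API; all names, thresholds and question texts below are invented purely for illustration.

```python
# Illustrative sketch of conditional survey branching (hypothetical, not Hubper's API).

def follow_up_questions(answers):
    """Return extra questions based on earlier answers (branching logic)."""
    extra = []
    # A low rating on an aspect triggers a question about improvements.
    for aspect, rating in answers.get("ratings", {}).items():
        if rating <= 6:  # example threshold, chosen arbitrarily
            extra.append(f"What could we improve about '{aspect}'?")
    # Expressed interest in a subject redirects to an extra question path.
    for subject in answers.get("interests", []):
        extra.append(f"What would you like to learn about '{subject}'?")
    return extra

# Example respondent: rates navigation low and wants to learn about giving feedback.
answers = {
    "ratings": {"structure": 8, "navigation": 5},
    "interests": ["giving feedback"],
}
print(follow_up_questions(answers))
```

The point of the sketch is simply that later questions depend on earlier answers, so satisfied respondents finish quickly while critical or curious respondents get targeted follow-ups.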



Now you know why evaluating your training is so important, but where do you start? Which points should you evaluate? The list below will help you get started. The points mainly target levels 1 and 2 of the evaluation model; the higher levels differ per organisation. Each point below can be worked out in more detail:

  • Structure – Were the length and structure of the training logical? 
  • Navigation and features of the learning platform – Were the buttons in logical places, and was the design attractive? 
  • Content – How would you rate the level, accuracy and scope of the content? 
  • Content formats and learning styles – How would you rate the use of media, the variation in assignments and the testing options? 
  • Interactivity and feedback – What did you think of the opportunities for interaction with other participants and the trainer(s)? 
  • Suggestions – What did you think of the tips for further study and inspiration?
  • Notifications – How do you rate the reminders and motivational messages in the training?
  • Support – How were you helped when you got stuck?
  • Overall rating – How would you rate the training as a whole? 

Or get more inspiration on measuring the effectiveness of your online training in this blog.