Evaluating Learning Activities


Despite the misinterpretations of the 70:20:10 learning model, quality training and coaching are central to fulfilling the potential of any business. Training serves as the foundation from which employees can grow and flourish, but it's important for it to be supported by coaching, which enables us to critique the 70% of learning that happens 'on the job', find out what's working, put the training into context and discuss more useful ways of operating. After all,

Practice doesn’t make perfect, it makes permanent…

Nevertheless, the key word above regarding any learning intervention was 'quality'. Any learning activity should serve a purpose and be informed by a learning needs analysis. It should have learning objectives and clearly defined outcomes so that we know what we're working towards and trying to achieve. Even more importantly, this enables us to evaluate the intervention afterwards to assess its impact and value to the business.

Evaluating learning activities is not always straightforward: it can be hard to prove that improvements in key areas were the result of interventions such as leadership development. What most businesses ultimately want to see is that a learning intervention will boost their bottom line, or at the very least pay for itself. We've discussed the financial impact of interventions such as leadership development programmes at length in previous blogs, but to keep it short: the quality of relationships between managers and staff correlates directly with employee engagement levels, and employee engagement levels will impact on:

 

– Productivity & Quality

– Staff Satisfaction

– Attendance & Retention

– Hiring & Training Expenditure

– Customer Satisfaction & Retention

– Reputation & New Business

 

Ultimately all of this impacts on profits, so in short it's well worth investing in quality learning & development programmes. But how do we assess whether our interventions are working? Here are a few tips:

 

  • Be specific at the outset

Take the time to assess where the business or proposed delegates are currently and where you would like them to be. What is it you're actually trying to achieve or improve? That's what you should be tracking. Identify clear objectives and learning outcomes, and ensure the intervention is designed for and capable of meeting them.

  • Make sure it’s in sync

Review competency frameworks to match any development plan against the actual requirements of the role and the needs of the business. If you don't have a competency framework, the first step should be a formal job analysis process. The information gleaned from the job analysis serves as the blueprint for everything else you'll do.

  • Remember that L&D (Learning & Development) is not just training

Training alone can only achieve so much (the Ebbinghaus Forgetting Curve is a useful reminder here, even if its exact figures shouldn't be taken literally and there are many anomalies). It's not until delegates leave the classroom and attempt to put the learning into practice that they encounter obstacles and think of questions. Supporting training with coaching, on the other hand, allows you to identify clear business and/or personal improvement areas and set business improvement projects, in line with company strategy, that can be worked on throughout the programme.

[Image: the Ebbinghaus Forgetting Curve]
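To make the point concrete, a common (simplified) formulation of the forgetting curve is R = e^(-t/S), where t is the time elapsed and S is the relative strength of the memory. The sketch below uses purely illustrative numbers, not empirical data, but it shows why untended classroom learning fades so quickly:

```python
import math

def retention(t_days: float, strength: float) -> float:
    """Simplified Ebbinghaus forgetting curve: R = e^(-t/S).

    t_days   -- time elapsed since the training
    strength -- relative strength of the memory (hypothetical value)
    """
    return math.exp(-t_days / strength)

# Illustrative only: retention falls steeply without reinforcement.
for day in (0, 1, 7, 30):
    print(f"Day {day:2}: ~{retention(day, strength=5.0):.0%} retained")
```

The steep drop-off is exactly why follow-up coaching conversations, which revisit and reinforce the learning, matter so much.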

  • Notice what you’re getting before you start

Too often interventions are deployed and only afterwards do we look at what the impact has been. If we haven't assessed and kept a record of how things were at the outset, what do we have to measure against?

Relevant data can be recorded at the outset, tracked throughout the duration of the programme and measured again at the end, so that you have a clear picture of its impact from which to ascertain its success. 360° feedback is one way of assessing the impact on, and change in, delegate behaviours.
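As a minimal sketch of that before-and-after comparison, imagine capturing a few baseline metrics before the programme and the same metrics afterwards (the metric names and figures here are entirely hypothetical):

```python
# Hypothetical baseline metrics captured before the programme starts,
# and the same metrics measured again after it ends.
baseline = {"staff_satisfaction": 6.2, "absence_days_per_head": 4.1}
post_programme = {"staff_satisfaction": 7.4, "absence_days_per_head": 3.2}

for metric, before in baseline.items():
    after = post_programme[metric]
    change = (after - before) / before * 100  # percentage change vs baseline
    print(f"{metric}: {before} -> {after} ({change:+.1f}%)")
```

Without the `baseline` figures, the `post_programme` numbers on their own tell you nothing about whether the intervention moved anything.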

 

  • Evaluate at multiple levels

As all the established and recommended evaluation models suggest, L&D should be evaluated at multiple stages rather than just with on-the-day 'happy sheets'.

[Image: the Kirkpatrick Evaluation model – Level 1: Reaction, Level 2: Learning, Level 3: Behaviour, Level 4: Results]

 

Jack Phillips' evaluation model, which builds on Kirkpatrick's well-known work (above), adds a fifth level that is, to some, the Holy Grail:

Level 5: Return on Investment

Did the impact pay off in monetary terms?
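Phillips' Level 5 is commonly expressed as ROI (%) = (net programme benefits ÷ programme costs) × 100. A quick worked example, with entirely hypothetical figures:

```python
def roi_percent(benefits: float, costs: float) -> float:
    """Phillips Level 5 ROI: (net benefits / costs) * 100.

    benefits -- monetised benefits attributed to the programme
    costs    -- fully loaded programme costs
    """
    return (benefits - costs) / costs * 100

# Hypothetical: a programme costing £60,000 that is credited with
# £150,000 of monetised benefit.
print(roi_percent(benefits=150_000, costs=60_000))  # 150.0
```

So in this made-up case, every pound spent returned the pound itself plus £1.50 of net benefit. The hard part, of course, is isolating and monetising the benefits that can fairly be attributed to the programme.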

 

  • Give it time

Last but not least, don't expect things to be totally transformed in a day. People have been trained, yes, but it takes time for them to absorb the learning, find opportunities to make improvements, overcome obstacles and make those changes. This is far more likely to happen when the learning is supported by coaching, with ongoing support and conversations taking place.

The attitudes of those around the newly trained, who return brimming with enthusiasm and ideas, will also need time to catch up. People tend to be resistant to change, often on a mission to find the ulterior motive even when something great is introduced. So there may be resistance, and sometimes we have to go through a bit of hardship to make things better than they've ever been.

In that respect, it's another factor to remember when measuring the impact of your learning and change programmes: the benefits may not be obvious immediately, but they may be more sustainable in the long term.

[Image: "Take all the training"]

Q – What are your top tips for evaluating L&D activities?

Vibrant Talent Development work with organisations to help create great places to work. We do that through tailored, engaging, outcomes driven learning interventions that make a difference.

Check out our website www.vibranttalent.eu for more information or drop us an e-mail at info@vibranttalent.eu.

You can also follow us on LinkedIn, Twitter or Facebook for our latest posts and updates.

