Measuring the success of your LMS

Written in partnership with Boost


Imagine the day when you get to turn to your boss, eyes open wide, and say with a smile: “It works! Success!”

But when we say success, what do we mean, and are we sometimes being premature? Yes, the alpha launch of the learning management system (LMS) functions, the single sign-on works, and the reports show lots of data, but is this truly a ‘success’?

You and others involved in your LMS launch definitely get to call it a ‘successful test’, but hold off on the champagne for now. Perhaps a Prosecco would be better suited, and let us tell you why...

As we explore the concept of measuring success using “levels”, it may sound a little familiar. That is because we are inspired by Kirkpatrick and his four levels, not forgetting Jack Phillips’ fifth level. For those not familiar with them, the levels are (in essence) reaction, learning, application, impact on business/society, and ROI. What follows is a similar, but sufficiently reworked, framework for measuring the overall success of your LMS.

Level 1 – functional success

Let’s look to your LMS implementation as inspiration for our first level. You have just gone live and as you look back over your implementation, are you able to say yes to the following questions?

  1. Was it on time?
  2. Was it on budget?
  3. Was it on quality (functionally)?

If you are able to answer yes to all of the above, then you have measured Level 1, and that equals a successful launch. However, do not feel disheartened if you find yourself saying ‘no’ to questions one and two; delivering on time and on budget is something we all strive for in a project, and only the chosen few succeed.

Level 2 – user acceptance

It’s all very well having an LMS, but is anyone logging on, and are they actually enjoying the experience? Level 2 is all about the users, their interactions with the new LMS, their reaction to it, and whether eventually it becomes part of their working life and ongoing development.

The Google HEART framework provides some good suggestions to help quantify the user experience over time; you can read more here. It suggests metrics relating to visits, satisfaction levels, repeat visits and more. These measures were developed for websites and e-commerce, but they work equally well when applied to an LMS evaluation.

Below is a sample of the types of data you may wish to measure and cross-compare with each other, or even with historical data from an old LMS (a code sketch follows the list):

  • The number of learners who visited in the first week, and again over the following four weeks. If you have moved from one LMS to another and communicated the introduction of the new LMS differently, use this as an opportunity to compare usage. You may want to repeat the measurement every quarter for the first year to prove the value
  • Consider looking at the number of return visitors vs. those who only visited once
  • Look at the length of visitors’ engagements; there is little value in a lot of visits if the interactions are only 30 seconds long
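
To make these measures concrete, here is a minimal Python sketch of how they might be computed from a raw session export. The log format and field names (user_id, login_time, session_minutes) are assumptions for illustration; substitute whatever your LMS reporting actually provides.

```python
from datetime import date, timedelta

# Hypothetical LMS session export: one row per visit.
# Field names are illustrative; adapt them to your own LMS report format.
sessions = [
    {"user_id": "u1", "login_time": date(2024, 1, 3), "session_minutes": 12},
    {"user_id": "u1", "login_time": date(2024, 1, 20), "session_minutes": 25},
    {"user_id": "u2", "login_time": date(2024, 1, 4), "session_minutes": 0.5},
]

go_live = date(2024, 1, 1)
week_one_end = go_live + timedelta(weeks=1)
follow_up_end = week_one_end + timedelta(weeks=4)

# Learners who visited in the first week, and again in the following four weeks.
week_one = {s["user_id"] for s in sessions
            if go_live <= s["login_time"] < week_one_end}
came_back = {s["user_id"] for s in sessions
             if week_one_end <= s["login_time"] < follow_up_end}
print("Visited in week one:", len(week_one))
print("Of those, returned in weeks 2-5:", len(week_one & came_back))

# Return visitors vs. one-off visitors.
visits = {}
for s in sessions:
    visits[s["user_id"]] = visits.get(s["user_id"], 0) + 1
repeats = sum(1 for n in visits.values() if n > 1)
print("Repeat visitors:", repeats, "| one-off visitors:", len(visits) - repeats)

# Engagement length: discard token visits of under a minute.
meaningful = [s for s in sessions if s["session_minutes"] >= 1]
print("Meaningful sessions:", len(meaningful), "of", len(sessions))
```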

There are many other data variables you may want to explore, and some of these variables will also be dictated by the role of the person asking for the data. For more information on making the best of your Totara data, take a look at our post here. 

In addition to the data analysis, you may also want to invite learners to complete a survey focusing on their LMS experience. Be clear that this is not a conventional learner satisfaction survey, as those focus on the learning content. You might include scored questions such as:

  • How easy do you find the system to use?
  • How useful are the various functions in the system?
  • Which statements best describe how you see the role of the LMS?
  • How much more or less learning have you undertaken since adopting the system (see Level 3 below)?

With all of these questions, you could ask about both the old system (or lack of one) and the new one. We recommend getting such a survey out within a few months of going live, while people’s perspective is still fresh; around four months is enough time to establish the change in mood as the system beds in, but not so long that people have forgotten how they felt at the beginning of the journey.
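
To make the old-versus-new comparison concrete, a small sketch like the one below could average the scored responses for each system. The 1–5 scale and question keys are assumptions for illustration only.

```python
# Hypothetical scored responses (1-5 scale); each pair is (old system, new LMS).
# Question keys are illustrative only.
responses = [
    {"ease_of_use": (2, 4), "usefulness": (3, 4)},
    {"ease_of_use": (3, 5), "usefulness": (2, 5)},
]

for question in ("ease_of_use", "usefulness"):
    old_avg = sum(r[question][0] for r in responses) / len(responses)
    new_avg = sum(r[question][1] for r in responses) / len(responses)
    print(f"{question}: old {old_avg:.1f} -> new {new_avg:.1f}")
```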

Level 3 – learning return

OK, on to the trickier stuff now. The previous two levels were very much about the ‘system’. This level needs to be about the learning.

Here, the burden of proof is to show that the process of learning has improved. Note that the key word is ‘process’. The measures need to relate to an increase in knowledge or skills acquired per employee for the same investment in time or money, or over the same period: a bit like an ROI for learning.

The LMS can churn out a lot of data to help with the maths here, but to do it correctly you need to factor in all learning within the business and show an overall increase in learning per unit of investment.

There is no need to get bogged down in equations. Take it from us: you can probably come up with a pretty good estimate here, showing a steady increase in learning as adoption increases (see the sketch below).
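
If you do want a single number, one hedged approach is to divide the learning delivered (say, completed learning hours across the business) by the investment behind it, tracked quarter by quarter. The figures in this sketch are invented purely to show the arithmetic:

```python
# Invented quarterly figures: completed learning hours across the business
# against total learning spend, purely to show the arithmetic.
quarters = [
    ("Q1", 1200, 50_000),  # (quarter, learning_hours, spend)
    ("Q2", 1800, 52_000),
    ("Q3", 2600, 53_000),
]

for name, hours, spend in quarters:
    # Learning return: hours of learning delivered per 1,000 units of spend.
    print(f"{name}: {hours / spend * 1000:.1f} learning hours per 1,000 spent")
```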

Learning returns can be broadened out to include the following:

1) Think about the decisions you have made thanks to better management intelligence, e.g. scrapping programmes with poor satisfaction scores, moving to a better supplier, or replacing unused content with fresh content.

2) If the LMS enables mobile learning, consider the value of enabling learning during otherwise less productive time outside the office; you are estimating the value of the useful time you are creating.

3) If your LMS enables collaborative learning, you can value this peer-to-peer learning by quantifying the time spent and pricing it at the equivalent cost of coaching.

4) If your LMS increases synchronous learning, monitor the amount by which this increases and the consequential savings on travel to meetings (a valuation sketch follows below).
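
Each of these broader returns can be given a rough monetary estimate. In the sketch below, every rate and hour count is a placeholder rather than a benchmark; plug in your own figures:

```python
# Placeholder rates and hours, not benchmarks; plug in your own figures.
mobile_hours = 400         # learning done in otherwise less productive time
hourly_value = 30          # notional value of one employee hour

peer_hours = 150           # time spent in collaborative, peer-to-peer learning
coaching_rate = 80         # equivalent hourly cost of a coach

trips_avoided = 25         # meetings replaced by synchronous online sessions
cost_per_trip = 120        # average travel cost per meeting

total = (mobile_hours * hourly_value
         + peer_hours * coaching_rate
         + trips_avoided * cost_per_trip)
print(f"Estimated annual learning return: {total:,}")
```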

Level 4 – impact on competence

Clearly this is going to be a good one.

Impact on (drum roll) competence!

A really well-designed LMS will link to a competency framework, and as people learn more, and apply it in the workplace, the number of competencies your employees have will increase.

Whether competence is measured directly by your LMS or indirectly via another system, or very indirectly in terms of course attendance, you should hopefully see competence levels improve as LMS usage increases.
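
A simple, hedged way to watch for this is to track average competencies per employee alongside LMS usage, quarter by quarter. The snapshot figures here are invented for illustration:

```python
# Invented quarterly snapshots: average competencies held per employee,
# alongside average LMS logins per employee in the same quarter.
snapshots = [
    ("Q1", 3.1, 2.0),  # (quarter, competencies_per_employee, logins_per_employee)
    ("Q2", 3.4, 3.5),
    ("Q3", 3.9, 5.1),
]

for quarter, competencies, logins in snapshots:
    print(f"{quarter}: {competencies:.1f} competencies per employee "
          f"at {logins:.1f} logins per employee")
```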

Why is competence a higher level than ‘learning’? It comes down to the fact that the more competencies your employees hold, the more value they carry as ‘your greatest assets’. Assets have a monetary value, and here we are getting tantalisingly close to an ROI!

Level 5 – business performance improvement

OK, this is the biggie: this is the level which justifies the spend.

Your first job is to identify which big picture metrics you believe you are impacting upon.

But how can more learning and more competence deliver tangible business performance improvement at this level? Well, in many cases you are looking at general indicators across the business and asking people to make a little leap of faith. 

People don’t like making leaps of faith, though, and will challenge you with comments like: “How on earth can you possibly take credit for [for example] improving employee engagement?”

You want to prove your LMS has improved employee engagement, or career progress, or compliance, or productivity, or the value of the business to a potential buyer.

The disciplines of correlation analysis and even econometric modelling exist to help establish cause and effect. Yes, you could invest in expertise here, but the good thing about the model we are outlining is that it takes a big leap of faith and breaks it down into manageable steps. So the leap is no longer a leap, but a manageable stride.
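
For a flavour of what that analysis looks like, the sketch below computes the Pearson correlation between monthly LMS adoption and an employee engagement score. The figures are invented, and a strong correlation supports the story rather than proving it:

```python
from statistics import correlation  # available from Python 3.10

# Invented monthly figures: % of staff active on the LMS versus an
# employee engagement score from a regular pulse survey.
adoption = [22, 31, 40, 48, 55, 63]          # % of staff active each month
engagement = [6.1, 6.3, 6.2, 6.6, 6.9, 7.0]  # engagement score out of 10

r = correlation(adoption, engagement)
print(f"Pearson r between LMS adoption and engagement: {r:.2f}")
# A strong r supports the story, but on its own it shows association,
# not proof of cause and effect.
```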

If you would like to remove any trace of a leap of faith and create that compelling graph showing the relationship between LMS adoption and any of the factors mentioned above, we would like to support this decision wholeheartedly. This blog post is too short to explain all the different ways of doing this, but please get in touch and let us help you make it happen, because the good news is…

…it is always possible.

So, to sum up: please do use this framework to help you and your stakeholders appreciate the impact you have made, using the levels listed above. Together, they provide manageable steps from low-level to high-level impacts, without any leaps of faith.

What now?

So now that we understand the different types of measurement, this raises the question: “When is the best time to measure?” The answer is always “now”, because there is never a “too soon” when it comes to evaluation.

So, whether you are thinking about a new LMS or reflecting on the success of an established one, now is the time to decide what to measure, how, and when, and to set measurable goals for how you hope these measures will improve.

The sooner we start measuring, the sooner we can show, beyond a shadow of doubt, that your LMS launch was a success: not just for those who made it work without crashing, but a success for the business, one which makes it better than it was before.

Now, put away that cheap bottle of sparkling wine and bring out the magnum of Moët. Let’s work together to prove success at a strategic level.


