
Can you really measure the impact of your learning?

Podcasts and Audio | 02.12.2021

This month, the Kineo team delve deep into the idea of what is truly required to measure your learning successfully.

Transcript

Andy Costello
Hello, everyone, welcome to Kineo's Stream of Thought. Today we are asking the question: can you measure learning? My name is Andy Costello, Head of Customer Solutions, and I am joined by...

Jez Anderson
Jez Anderson, Head of Consulting. 

Jacob Funnell
Jacob Funnell, Digital Marketing Consultant. 

James Cory-Wright
James Cory-Wright, Head of Learning Design. 

Andy Costello
Thank you. Welcome, everybody. So we're looking at the question, can you measure learning? How it's been evaluated in the past, what measures are in place now, what the future might look like and we'll be drilling down into the world's most valuable resource, no longer oil according to The Economist in 2017, but data. I had to get data in there as early as I could. Albert Einstein said "not everything that counts can be counted and not everything that can be counted counts". So, gentlemen, I'm going to start with you, James. Can we measure learning? 

James Cory-Wright
I don't really know, to be honest, whether you can measure learning or not. I think you probably can't. But what you can do, certainly in the context of the workplace, is measure what learning has led to or what it leads to. So in that sense, you can observe behavior, you can measure performance using all sorts of metrics, including appraisals and things like that, but can you measure it? I mean, it depends on what you want to measure it for. It's probably the 'for' that's more important than the measure. I think we'll probably come to this later. I'm not actually sure that measurement is necessarily the thing we should be talking about in relation to the data, but we can come to that later. That was a good rhyme, wasn't it?

Andy Costello
I was going to say, when you say we shouldn't be talking about it, I was worried about, you know, saying that so early on in this podcast, because that's kind of what we're talking about. But OK, so measurable impact then: can you measure the impact that learning has? Is that a more helpful phrase?

James Cory-Wright
Well, you can. I suspect you can measure the impact that learning has had, but whether that's actually what happens in practice is another question. To measure impact, the impact of learning, you have to do a lot of work upfront before you set out to deliver that learning. It has to be comparative, doesn't it? You can't measure anything, really, unless you have some sort of comparison. So you have to start off by agreeing what the things are that you're going to measure.

Andy Costello
What's the start point and what's the end point?

James Cory-Wright
Otherwise, you're not going to know whether you've achieved any impact. And impact is primarily going to be about business metrics. And so, again, it's not really about the learning, it's more about the business. And there is the challenge, I think, that no one has really grasped in the 30-odd years that I've been doing it, because in a way, L&D and the business tend to work rather as two separate entities.

Andy Costello
Yeah, sure. I mean, we obviously work in learning provision, we're learning providers, and most of our clients are corporate clients, large organizations that care about impact on the business. Yet we're often told that we are being commissioned for a piece of work to improve learning, improve performance. Jez, I'm going to bring you in here. Do you think that's fair? Do you think we should be focusing more on the measurement of the learner rather than anything broader in the first instance?

Jez Anderson
I agree with James, in reality, and unusual as that is, I do agree with James on this point, because I think that in some ways it's a little bit of a red herring to think about learning. I think we have to look at what impact the learning is having on people's behaviors, on what they say and what they do in the workplace. That's really what organizations are interested in. So clearly what we need to try and understand is what the learning has done to influence and change the behavior that people exhibit, and therefore the results that that change has led to in terms of business performance or whatever other elements of performance we are looking at. And I think, you know, we've been kicking the whole evaluation thing around as an L&D community for donkey's years. I mean, I've been in it 25 years and no one's really got it yet, and we still talk about Kirkpatrick and are still sort of at level one evaluation.

Andy Costello
That's 50 years old at least, isn't it?

Jez Anderson
1954 I think it was, something like that. I mean, you know, Kirkpatrick's is a donkey's-years-old framework and there's lots of other things out there. But ultimately, it's an interesting frame and it gives us something to talk about. The reality of it is that what we're probably good at doing, and what we're able to do, is understand the immediacy and the immediate response a learner has had to a piece of learning, a learning intervention, if you like, whatever that is, whether it's an outdoor management course up in the Lake District or a piece of e-learning on health and safety. There are ways that we can start to understand: has that been useful for that individual at that time? What impact has that had on their performance? You can't measure that immediately after a training program. You've got to look at how you test that, how you put it into the context of their real lives. How then do you understand what impact that has had on them as an individual? Has it had any impact, or are there other things at play which will possibly have an impact? So it's a really complex thing. I often talk about it as being a field of rabbit holes. When you start talking about evaluation, it is just a massive field of rabbit holes that you can fall down, and there are just so many of them that it's not really helpful. The key for me, and I'll shut up in a minute, the key for me is data. I think it's the shift and change in the availability of data and how we're using data which potentially opens a new, different window on our approach to how we look at impact. I don't think it's something that L&D can necessarily do in isolation. It has to work much more closely with the business to understand the impact that was being sought, and therefore what impact their interventions, and what they produce, deliver or support, will have on performance.

James Cory-Wright
I mean, I think that measurement's been bandied around really to make something sound more important or significant than it possibly is. The bottom line is, and I won't be embarrassed about this, that there's a recognition in the workplace that people need to know stuff. They have to know certain things in order to be able to do certain things. End of. You know, that doesn't need measurement, that doesn't need science, that's just common sense. And really, that is the reason why training and learning is provided. Do you need to measure that? I would argue no.

Andy Costello
Okay, but surely you need to measure that the investment you've made in that learning has worth and value, and whether it should be repeated or not. That what you've set out to achieve has demonstrably improved some level of performance.

James Cory-Wright
Well, I mean, you say you need to measure the investment in that. Arguably, no. But what you do need to know, or be reassured about, is that people are doing their jobs properly and to the right level. Which takes me back to the original point really. That's the only thing you need to know. Do you need to measure that? No. I don't know whether that's the right word.

Jez Anderson
I think I'll just dive in. Again, you know, it's getting to a point where we may well not, and if you put a purist hat on, you can argue that, no, we don't need to measure. The reality of it is, if you're an L&D manager and you've got a budget, you want to prove that your budget has actually delivered something. And that's really what we're talking about: how do we support the L&D community to ensure that what they're doing is adding value to the bottom line of their business? And this, for me, is where data really starts to come in, because we can now start to look at different sources of data, as long as we're asking the right questions of it early enough in the process. Because if you think about it in terms of a set of levers, if we don't know what lever it is that we're trying to pull in the first place, we don't know what force we need to put on that lever to pull it. So if we haven't done the needs analysis effectively, haven't looked at what we're trying to achieve as an objective and a goal for the training, and then haven't aligned that to some sort of performance business measure which you can quantify and create some metric around, it becomes very difficult. So, you know, this is where, for me, data starts to come into its own. We now have learner data, so we have evaluation, impact data if you like, or validation data as I prefer to call it. But that just says, well, that's great and that's good and that validates the program, and it says it's a well-designed program and the questions worked and all that sort of stuff. Brilliant. But it's then: what do people do with it? How do those behaviors change? And we only know that when we can start to look at those metrics and say, well, those metrics have changed as a result of doing X; the metrics that we're looking to measure, Y, have changed demonstrably, or not, as a result of doing that. And that's where we start to think about pulling in different data sets. So for me, it starts to require an understanding of how to analyze and use data well beyond what L&D has traditionally even thought about as being data.

Andy Costello
And yet we're hearing about data all the time. There are more platforms that specialize in capturing data. There are more people using the word data to sell these new platforms. And yet, is there a case, therefore, that, you know, while we have more access to data, we don't really know what to do with it?

Jez Anderson
Exactly. I was talking to one of my clients yesterday, and we were having this conversation about digital transformation. We've been working with them around what they're trying to achieve in terms of that digital transformation over the last few months. And one of the key things that's coming out is not the technology stack. Although that's where we started, what is coming out strong and clear is actually the capability set of the learning and development community and the way that that capability is structured. So it's not only the skills and knowledge of the people doing the jobs, but actually whether they are the right jobs, you know. We hear now about the shifting nature of jobs, jobs today which don't exist tomorrow. Well, maybe we're at a point where we start to think that being a curation manager, as opposed to being a development manager, will be data-informed, because it will be about understanding what people are accessing. It'll be looking for patterns, it'll be looking to understand what people like, what they don't like, what's having an impact, and then using that to support the curriculum of learning, or the curriculum of content, which is probably a better word than learning, that they put together.

Jacob Funnell
So I think there's a parallel in the field of marketing, in that we get very locked into the tools that we have available to us and what the platforms that we're using are telling us. But the actual skills to properly interpret this data are similarly lacking. And I think that's often because data analysis is something which is deceptively hard. There are so many things which can go wrong when you're looking at data. There are so many phenomena within it, even really basic things like regression to the mean, which many people aren't familiar with, and it's its own skill set. But one thing that can happen very profitably is if you have a background in an area and you understand, exactly as you were saying, James, 'OK, we understand that certain people need these skills in order to do their jobs', and you have this kind of background information, then you can profitably talk to someone who is more of a data specialist and say, 'oh, how might we be able to measure this? My model of the world, which I'm very confident in, is that improving soft skills is very important.' Okay, if you're very confident in that, that's great. But there will still be ways that you can measure that. And someone who is a specialist in data analysis can often bring approaches and techniques which will be closed to you if all you're used to doing is just drawing off reports. You know, in my case, in marketing it would be something like analytics; in the L&D world it might be from your LMS. If you're dealing with someone who's really used to pulling lots of spreadsheets and bringing lots of different data types together and coming up with really interesting analysis, that's where you can get really interesting stuff coming up.

Andy Costello
Sounds like we could learn a lot from marketing and marketers' use and collection of data.

James Cory-Wright
I do have a question for Jacob, though. I'm going right back to that first question about measurement. Are we using the wrong word? In the sense that is it actually about data capture, and then analyzing that data, and then improving the learning content on the basis of that analysis?

Jacob Funnell
So I think that's a really good point, because I think we often think that analyzing data and having good metrics are the same thing. But you have to keep in mind that often with metrics, you produce them for the function of showing 'I'm doing my job', 'I was able to bring this number up'. But there's a really fascinating law, it's called Goodhart's Law, and I really recommend anyone who is interested in this check out a podcast, Rationally Speaking with Julia Galef, and I think it's Dr. Manheim, and they discuss this law, which is basically: as soon as a measure becomes a target, it ceases to be a good measure. And there's a reason, because as soon as you set your measure up as your target, then people start thinking that that measure is what you're aiming for. So a good example, which again comes from that podcast, is if you're trying to lose weight, you might think, okay, well, my measure is just to get the scale to read a lower number. But then what people can do is go, 'all right, before I weigh myself this morning, I'm not going to drink a glass of water, because that would mean that the number on my scale will go up'. But that's totally not what you're trying to measure. You've mistaken the fact that you're trying to lose weight with trying to make that number change.

James Cory-Wright
Also, to continue that weight analogy, the main thing really is about keeping the weight off. That's the biggest challenge of all. And I can sort of mangle that all the way back to learning again and say that instead of learning content being a commodity, or thinking of it as a product, maybe it's more of a service, in the sense that, you know, learning in the workplace is ongoing. And maybe the use of the data is to improve the learning experience in an ongoing way; in that sense it's a service. It's not a one-off. It's not a product. It's not one trip to the scales to weigh yourself.

Andy Costello
Which a measurement would be, and it would become the all-important target.

James Cory-Wright
Yes. And that's measurement. So maybe it's got nothing to do with measurement, maybe the whole measurement thing is just a giant red herring.

Jez Anderson
But I do think that, unfortunately, whatever we'd like to say, we operate in a world where, you know, numbers are how people interpret performance. And we can argue, and to your point, Jacob, I totally agree, that it soon becomes a number chase. But the reality of it is that at some point you have to be able to communicate in the language of the people who are making the decisions, and the people who are making the decisions invariably are the people who are using numbers to be able to say, 'I want to see X percent improvement in the performance of X', 'I want to see X percent fewer accidents of type Y'. It's numbers and it's metrics. So for L&D, it's about being able to quantify that in a behavioral way. It's looking at that metric and going, well, okay, so behaviorally, what underpins that? What will change that? What will give us a shift in that? Which is a bit like the weight analogy: understanding that having a glass of water or not having a glass of water will have an impact on what the scales say, but in the longer term it's like saying, well, okay, if I change my drinking habits, what are the drinking habits that I need to adopt to make that change? And it might be that drinking plenty of water is good and drinking beer isn't, and therefore I need to drink less beer. And I'll measure that. I will measure how much less beer I drink and therefore see how much the scales drop.

Jacob Funnell
I think one thing within this, to take the example of drinking less beer, is that you could very easily go, 'all right then, well, I shall just start drinking cider instead'. But if you want to avoid a metric being gamed, then it's good to have several metrics of the same thing. So, for example, when I'm looking at the engagement that people have with content, I can have a look at something like how many pages they visited in a session. I can also have a look at how many repeat visitors we had to the site: are we building an audience? I can also have a look at how long they are spending on an article, and I can have a look at what their scroll depth was. And now, if I've got lots of these things together, then I've got my measures, but I can also use my judgment with it. And I think that's the really, really critical point: there is no way that you can have a metric which substitutes for that. It can inform your judgment. It's a very rare metric that you can use completely as a substitute, and a good example of this is, you might think, well, okay, every single business just wants to make profit, and the profit of a business goes up and that's fantastic, right? And there's no arguing with that. Well, if the profit of the business is going up and they're doing that by doing deals with the mafia or something, then that's really terrible. So it really doesn't matter what metric you pick. There are some side constraints on it. There are some things where your expertise is relevant in assessing that.
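
To make that multi-metric idea concrete, here is a minimal sketch in Python. All field names and numbers are hypothetical, not drawn from any Kineo or LMS system; it simply shows several engagement signals summarized side by side so that no single number is read, or gamed, in isolation:

```python
# Hypothetical sketch: combining several engagement signals so no single
# metric can be gamed on its own. Field names and values are invented.
from statistics import mean

# Each record is one visit/session, as it might come from an analytics export.
sessions = [
    {"visitor_id": "a1", "pages_viewed": 4, "seconds_on_article": 210, "scroll_depth_pct": 85},
    {"visitor_id": "a1", "pages_viewed": 2, "seconds_on_article": 95,  "scroll_depth_pct": 60},
    {"visitor_id": "b2", "pages_viewed": 1, "seconds_on_article": 15,  "scroll_depth_pct": 10},
]

def engagement_summary(sessions):
    visitors = [s["visitor_id"] for s in sessions]
    # Visitors who appear in more than one session count as repeat visitors.
    repeat_visitors = {v for v in visitors if visitors.count(v) > 1}
    return {
        "avg_pages_per_session": mean(s["pages_viewed"] for s in sessions),
        "repeat_visitor_share": len(repeat_visitors) / len(set(visitors)),
        "avg_seconds_on_article": mean(s["seconds_on_article"] for s in sessions),
        "avg_scroll_depth_pct": mean(s["scroll_depth_pct"] for s in sessions),
    }

print(engagement_summary(sessions))
```

The judgment Jacob describes still sits with whoever reads the summary; the code only lines the signals up next to each other.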

Andy Costello
So by relying too heavily on the data and on the measurements, we're losing the necessary human trait of actually doing the analysis and picking it apart. 

Jacob Funnell
Well, that's what we're incredibly good at. Machines are very bad at having an understanding of context. It's very, very difficult for a machine to understand the sort of causal relationships in the world: if someone goes on this course, then we expect it to change their behavior in these sorts of ways. I mean, they're extremely bad at doing that. They're very good at spotting correlations. But that background knowledge of the world has just been a massive problem in A.I. for a long time, and it's still a big problem at the moment. And so that's the sort of thing you can put into the system. And once you do that, you kind of get the best of both worlds. You get the rigour of the fact that you're measuring stuff that has nowhere to really hide, but you're also getting the human element too. And I think that if you go too far on either side, you run into real pitfalls.

James Cory-Wright
So the bottom line is you can't measure learning, but it doesn't really matter, because you can measure a lot of other stuff, essentially. And maybe the whole agenda has just moved on from measurement, from learning. You know, all these things are clapped-out phrases. It's all about the data, it's all about the analysis...

Andy Costello
It seems to me that what data is providing is a lot more things to measure, a lot more granular points in a process, or in a person's performance, or in a business.

James Cory-Wright
But not learning. And that's the red herring bit. Because it doesn't matter whether you learn or you don't learn; what does matter is how you behave and what you do, which we can track to a certain extent using the machines.

Andy Costello
Yeah. And so Jacob, just to bring you back in with that, the machines. I know this is a big, hot topic out there and lots of people are going on about A.I. and machine learning, and this is the new way for learning and development. And I read a nice little analogy the other day, that A.I. can win the hardest game of chess in the world and beat you at it, but it won't for a moment enjoy the fact, and that to me says it all. But is it all that, quite frankly, A.I. at the moment? Where has it got to get to, and how can it help us measure or not, or create more measures, or create the correct measures?

Jacob Funnell
I think that there are things which artificial intelligence remains exceptionally good at. When it comes to things like analyzing correlations between things, it's extraordinarily good at that, and finding trends and finding unexpected patterns in data and that sort of thing. But one of the things I find is that often it produces so much of this information, and it obviously really depends on your use case and what you're doing with it, but sometimes it helps if you have a question that you want to ask first. You have some conceptual clarity on that first and then you bring these tools to bear, so you can use A.I. in extremely impressive ways. And there are questions which would otherwise be very, very difficult to answer. For example, in marketing you have things like really complex attribution problems, where you go, 'hey, someone saw a Facebook ad, they came to us from Google search, and then, you know, they maybe clicked on an ad, so how do I apportion credit to these many different inputs?' This is just an incredibly difficult problem. But A.I. can really help with that. It can really help do lots of very clever statistical or mathematical things. And so that's the sort of thing where I think it's really, really good. But you still need that kind of underlying understanding of what that activity was. You know, what ads you are running and what campaigns you've been doing and how you feel your audience reacts to your content; you can't strip all of that out. So, yeah, do I think A.I. is all that? Well, at present, I don't think that it can substitute for some of the things we are very good at: context, understanding, particularly complex social relations. If you were to try and diagram out all of the things that happen when people just have a conversation, it's unbelievably complex. My beliefs about what you believe, my beliefs about what you believe about what I'm saying, my beliefs about what the listeners are doing and whether they're switching off or whatever. You know, that's really, really hard. And that's something we can't accurately model yet, to my understanding.
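
As an aside on the attribution example Jacob mentions, here is a deliberately naive sketch of the apportioning problem. The touchpoint names and values are hypothetical, and the data-driven models he alludes to are far more sophisticated than these two simple rules; the point is only to show what "apportioning credit" means in code:

```python
# Hypothetical sketch of multi-touch attribution: one conversion, several
# touchpoints, and two naive rules for sharing the credit between them.

touchpoints = ["facebook_ad", "google_search", "display_ad_click"]
conversion_value = 100.0  # e.g. a notional value assigned to one sign-up

def last_touch(touchpoints, value):
    # All credit goes to the final interaction before converting.
    return {t: (value if i == len(touchpoints) - 1 else 0.0)
            for i, t in enumerate(touchpoints)}

def linear(touchpoints, value):
    # Credit is shared equally across every interaction.
    share = value / len(touchpoints)
    return {t: share for t in touchpoints}

print(last_touch(touchpoints, conversion_value))
print(linear(touchpoints, conversion_value))
```

Last-touch gives everything to the final interaction, while the linear rule spreads credit evenly; choosing between rules like these, or something more data-driven, is exactly the judgment the numbers alone can't make.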

Jez Anderson
I think for me, one of the key aspects of this conversation, and Jacob's hit on it there, is that it's asking 'what question?' What question are we trying to ask and prove, and are we asking that question in the first place? So for L&D, it goes back a little bit to James's point: can you measure learning or not? I think you can measure learning if you ask the right question. So what is it about the learning that you want to actually measure? And have you got an understanding of what that learning will look like, you know? What is your starting point? What is your envisioned end point? And what are the steps in the middle that are going to help you get to that?

Andy Costello
Just before James comes back and says, 'no you can't', I just wanted to call out the Towards Maturity Report 2018, which you and I discussed a while ago. They asked the question, 'what is L&D's relationship with data and what does L&D need to do to take advantage of the opportunities that data provides?', presuming that we can, and that we can therefore use data to measure. So the argument is there that we can.

James Cory-Wright
The answer to that question is that you take more of a long-term view. You know, I said let's stop looking at learning as a commodity, as a one-off event. And picking up on what Jez said, of course you can measure some learning. Yes, you can. I would agree. But the answer to that is you don't measure it at one point; it's whether it's stuck, whether it's stuck with you, whether in six months' to a year's time you are still exhibiting all the benefits of that learning that you achieved. And that's a very different way of looking at things. In other words, it's learning over time, it's about using data to improve things over time. It's all over time, an ongoing thing. And that's a big shift from how we currently work.

Andy Costello
So what you're making me think of there is that our job in L&D perhaps isn't to measure. That's not our responsibility. Our job is perhaps to use the data to improve the things we are doing, to make that learning develop over time.

James Cory-Wright
Absolutely. 

Andy Costello
So we're talking about things like nudge theory and spaced practice, spaced rehearsal. These are the activities and the interventions we can create if we could just better understand data to inform what a longer program of learning might be.

James Cory-Wright
Absolutely. It's a huge opportunity in that sense. 

Jez Anderson
And, to be honest, the project that we've been working on for the last six months or so is looking at the sources of data that we get from learners' interaction with e-learning in particular, but also looking at other measures as well, with our colleagues from the Oxford Group. And it is very much about what are the tricks, if you like, that we can use to create different data streams and different data sources which might exist but which we're just not able to capture. So it is looking at learner behavior. That's one aspect to it, and then I think the other aspect is, as Jacob said, you've got this data, you've got this information, and we might shove it through a nice little data engine and it'll produce a graph at the end of it. Well, that's marvellous, and we've got some nice graphs and we've got a dashboard with numbers on it and pretty pictures. What does it mean? Again, it goes back to that question. It goes back to what question you are asking of that learning in the first place. And does the information that you're generating help you understand if you've done that or not? If it does, then that's great, because that helps you develop your practice and your input. If it doesn't, that's also great, because it means you've done something wrong.

James Cory-Wright
I think actually you've identified one of the great challenges around the data. Data's a huge opportunity, but it's also a big challenge, because the question is not whether you can capture it, not whether you can present it as a dashboard, but what are you going to do with it?

Andy Costello
Precisely. 

James Cory-Wright
That's the challenge. What are L&D for example, or the business, going to do with all that data that we can now get hold of? 

Andy Costello
Well, we try to engage very clever people like Jacob, and data analysts, who clearly have a skill set above and beyond.

James Cory-Wright
All right. Well, I'll just get my coat! 

Andy Costello
So I'd like to bring Jacob back in at this point. Jacob, what can L&D do? What should they be doing with this data? We've got all this stuff. What do we do with it? 

Jacob Funnell
So one of the key things is that when you're being given data and when you're being presented with a dashboard, ideally you should have been involved in the creation of that dashboard at the beginning. So the questions of, you know, 'what is it that you want from this? What understanding do you want?' I'm literally doing that today for Kineo. I'm asking, 'what do you want to know about Kineo's content, and how can I present you a dashboard with things which answer that question?' So, for example, one technique that you can use is, instead of just showing some data, you can frame that graph with a question. So if you were looking at engagement with your content and had some measure of that, instead of saying, 'oh, this is how long on average someone was spending with this piece of e-learning', you would say, 'how engaged are people with our digital learning?' And that would be your question, the thing that you're trying to answer, and you're explicitly framing it in that way rather than just dumping some data at someone's feet and making them ask that question in the first place, like, 'what do I do with this? What is this telling me?' And I also think that, ideally, if you're presenting information, you need to have a section where you give it some context. Because otherwise you're making this massive assumption that the data is self-explanatory, and very often it just isn't.

Andy Costello
So it's not just about, you know, what we are asking. It's also who we are asking for, I suppose, as well. Is it for the learner, for us, for the business? You know, what are the aims of asking these questions?

Jacob Funnell
I think often as well, when you've decided on a measurement, you can then just run sanity checks on it and go, 'okay, well look, we did this and we measured it over time, these are the facts'. And we can go back and say, 'well, do we think this measurement's any good? Is it showing signs that we're really just trying to game this measurement?' So a typical paradox that you might get in a sales team would be, 'hey, we want to make more calls', and then you measure how many calls people are making. They make twice as many calls and they're just really crappy calls. And so you go, all right, this isn't a great measure, we need to do some more things with that. And I think it's the same in any area. You just need to be interrogating the measurements. You can't just take them at face value. You need to have some feedback loops where you are actively interrogating them and making sure that they're still meaningful to you.

James Cory-Wright
Does that mean, then, that you can't create anything that's generic, that everything has to be custom? 

Jacob Funnell
I think there are often things which apply to a lot of use cases. So I feel like if you feel that the situations are sufficiently similar, then for sure, like you can have generic things. But I think it's important that you've at least asked those questions of what you're trying to answer. Do you think these measurements are good? Do you feel there's any risk of them being gamed? And just keep those in mind. 

Andy Costello
I suppose the pure granularity of the data you can get means that any correlation or pattern formed is likely to be unique and therefore custom. And the custom thing you actually need is the person, the human being, to do the analysis. That's going to be different for every case.

James Cory-Wright
Everything is pointing at this person - it could be an L&D person.

Andy Costello
Yes, it could be the new job and the new role. Marketing are already doing it. And this is the thing we often hear, and I know this may sound dated and trite, and apologies to those in marketing, but we constantly hear that we can learn a lot in L&D from marketing: through, you know, great communication of messages, through campaigns. And maybe this role that marketing have now created to huge success, the data analyst, is something that we should absolutely and obviously have been thinking about. But rather than just the new technology to capture and beautifully present data in a dashboard, we need more than that.

James Cory-Wright
Yes, only this week I was talking to one of our designers who's quite interested in that. 

Andy Costello
A penchant for data. 

James Cory-Wright
A penchant for data, indeed. And in doing so, we were kind of coming to the point of view of: who do we need to get the data out of the back end? Unfortunate phrase, but there we go. And on the one hand, you know, that's fine, but then they could just be a sort of number cruncher in that sense. So we were saying they need that kind of learning design sensibility, that sort of awareness of learner data, if you like. They have to have the ability to get the data, still have a view about the data (just picking up on what Jacob's saying) and interrogate it, but through the lens of the business, the learning, L&D, and then share that with whoever needs it, or act on it themselves. So I could see a role, for example, from the learning design perspective, very much of this curator/creator role, where that sort of person is doing curation and creating as it's required. But one of the key skill sets is the ability to analyze the data in order to inform their decisions about what to curate.

Andy Costello
And future learning programs. 

James Cory-Wright
So the 80/20 rule: how to improve your 80 percent, which is curated, and how to create from scratch a really good, targeted 20 percent. And you know, for me, that's a future role. In a company like Kineo I could see the learning designers as obvious candidates for moving into that curator/creator role. And then within L&D, you can see that they could have a counterpart, somebody in L&D who also has that kind of skill set. And then maybe the two work together to provide an ongoing curation/creation service. I come back to this idea of service, so that they're always tweaking, always changing, always modifying, refining, improving, keeping up to date, validating and socializing content. But it's not about measurement. I mean, I guess also that content would probably be informal content; they create a lot of it, create it or gather it or whatever. And arguably that's not particularly measurable either, but they'll know whether it is any good or not, because they'll be looking at the data. So they'll be borrowing the techniques from marketing to refine that data and to refine that content.

Andy Costello
Thank you. Yes. Closing thoughts then on that, Jacob? 

Jacob Funnell
I think in some ways you've given marketing quite a lot of credit, but I feel that marketing has the same broad challenges as L&D, in that you have organizations which are enormously sophisticated and often using data science techniques, and there are others who are predominantly using very face-value numbers in basic analytics programs. So I wouldn't want to present the world of marketing as the sort of wonderful region where we've all got it sorted; many of the same problems still apply. I would say, in terms of the overall question that you're trying to answer when you're looking at something like learning, or, and I feel it's a somewhat parallel thing in marketing, when you're trying to influence people's behavior further up the buying journey, when you're trying to improve their awareness of things, you're doing very behavioral, psychological things. These are incredibly hard things to measure. They're very hard to solve. And although we've made a lot of progress on it, there's a reason why, you know, the sciences of psychology and sociology and everything have a lot of difficulty measuring these things too, because there are so many confounding factors when you're dealing with people. We're very, very complicated. And so it's just generally a hard problem. So if it seems difficult, I think it's fair to say it's because it just is. And if you think about what L&D are trying to do, they're dealing with the entirety of the company, a company-wide thing; it's not just the problems of one department, it's the problems of every department, and influencing them positively. I mean, that's just amazing. So I think the other thing I would say is that if you're thinking about these things, it really helps to get a variety of perspectives. Some of the most profitable discussions I've had are with data scientists or with psychologists, with people who have to think about measurement and understanding things with their own particular box of tricks. And that can just really, really help. It can help prevent you from feeling like you're having to solve these quite specialist problems all by yourself. And so I guess if there was one takeaway I would give, it would be to just broaden the diversity of perspectives that you're getting on these problems.

Andy Costello
Thank you, Jacob. Jez, any closing thoughts?

Jez Anderson
There's no silver bullet. I think that's probably it. I think when we talk about measurement and about impact, there's no quick fix; there's no piece of technology or solution that's going to come out and revolutionize the world. I think the reality of it is that we do want to understand the impact that our support has on the communities that we work with. Anyone who works in L&D would want to do that, for whatever reason. There isn't necessarily a technological solution that will give you that answer in one hit. But I think it's about understanding that technology is a key part of our makeup now, of how we operate and how we work, and as a result it creates a lot of data. Let's not get too hooked on the process of collating and collecting data. Let's not get worried about that. Let's get back to some fundamental things, which is 'what are we trying to prove?' 'What are we trying to say, that by doing X, we get Y?' If we can do that, then we can start to use the data. But don't start off with the data; start off with the question.

Andy Costello
Thank you for joining us today, in which we learned that measurement is perhaps a red herring after all, but we may have created a brand new role for learning design and L&D, who knows. Update those CVs, folks. Thank you very much. If you would like to carry on the conversation, please do reach out to us. We are Kineo on LinkedIn and Twitter, or via our website at www.kineo.com. All the best and speak to you again soon.

Your speakers are


As Head of Customer Solutions, Andy leads the Kineo EMEA sales team and brings a 20-year industry track record of Learning Technology expertise. Andy is passionate about driving exceptional customer service and develops close partnerships with clients, ensuring they achieve success not only for standalone projects but long-term strategic goals. Andy also plays a key role in consulting on projects and account relationships across Kineo, is regularly featured on our podcasts, and is a sought after speaker at industry events.
Jacob works as a digital marketing consultant at Kineo. He has over 9 years' experience helping businesses in the learning and training industries understand their customers and bring in more leads. www.funnellmarketing.com
James has over 25 years' experience of instructional design and video scriptwriting. He previously headed up our team of learning designers and consultants, overseeing learning content design across all client projects.
Jez was Head of Consulting at Kineo until 2020.