I think we’re all agreed: microlearning has big business benefits. The ability to create bitesize multimedia content, with specific learning goals, tailored to your functional teams, customer segments or student groups? Brilliant. But it’s largely wasted effort unless you can measure audience interaction.
“If you can’t measure it, you can’t improve it” – Peter Drucker’s now immortal quote speaks to all areas of business, but it’s particularly valid here. Microlearning’s power relies not on the building and sharing of great content (though that’s certainly important) but on the ability to improve, refine and tailor that content over time; making incremental edits to messaging, media and meaning.
And so it follows that any microlearning platform worth its salt must include backend analytics to effectively measure and interpret key behavioural metrics. In addition to measuring attainment and completion stats, you need to understand which pieces of content users like, dislike and engage with. Only then can you begin to create content that truly resonates with your audience.
That’s not to say you shouldn’t strive for perfectly targeted content from the outset. You probably have a pretty good handle on your audience already, so go with your gut and build from there. And it’s important not to drown your audience in too much content. Remember that bitesize is best, especially to begin with. Once your audience is engaged, they may be receptive to higher volumes or more in-depth subject matter. Just tread carefully.
Want more tips? Check out: 6 tips to build brilliant bitesize microlearning content.
So, once your content has been delivered, what metrics should you track? This really depends on the capabilities of your microlearning platform but, as a minimum, you need to know which pieces of content are performing well and which are not. This could be achieved by tracking ‘likes’ or bookmarks – aggregating these results will quickly highlight underperforming content. You could also look at time spent consuming content – but be careful drawing snap conclusions: a long dwell time could signal deep engagement, or it could equally indicate a lack of clarity.
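To make the idea concrete, here’s a minimal sketch of the kind of aggregation a platform’s backend might run to flag underperforming content. The unit names, counts and threshold are illustrative assumptions, not a real platform’s data or API:

```python
# Illustrative sketch: flag underperforming content by aggregating likes.
# Unit names, view/like counts and the threshold are hypothetical examples.
engagement = {
    "intro-video": {"views": 120, "likes": 48},
    "quiz-basics": {"views": 110, "likes": 9},
    "deep-dive":   {"views": 40,  "likes": 22},
}

def underperforming(stats, min_like_rate=0.15):
    """Return the names of units whose like rate falls below the threshold."""
    return [name for name, s in stats.items()
            if s["likes"] / s["views"] < min_like_rate]

print(underperforming(engagement))  # -> ['quiz-basics']
```

Run against real engagement data, a simple threshold like this gives you a weekly shortlist of content to revisit, rather than a wall of raw numbers.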
“Microlearning’s power relies on the ability to improve content over time; making incremental edits to messaging, media and meaning.”
You should also track completion versus open rates. This will help you to identify any units that are being started but not finished, so you can quickly remove any roadblocks or objections. And if units form part of a bigger course, look at progression through that course, to identify which units are seeing strong engagement or, conversely, user drop-outs.
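The completion-versus-open comparison above can be sketched in a few lines. Again, the course structure and figures here are hypothetical, purely to show the arithmetic:

```python
# Illustrative sketch: completion rate per unit, and the unit with the
# biggest drop-off (opened but not finished). All figures are hypothetical.
course = [
    {"unit": "1. Welcome",  "opens": 200, "completions": 180},
    {"unit": "2. Basics",   "opens": 170, "completions": 120},
    {"unit": "3. Practice", "opens": 100, "completions": 95},
]

def completion_rates(units):
    """Completion rate per unit: completions divided by opens."""
    return {u["unit"]: u["completions"] / u["opens"] for u in units}

def biggest_drop(units):
    """The unit where the most users opened but failed to finish."""
    return max(units, key=lambda u: u["opens"] - u["completions"])["unit"]

print(completion_rates(course))
print(biggest_drop(course))  # -> 2. Basics
```

In this made-up course, unit 2 loses 50 of its 170 openers, so that’s where you’d look first for roadblocks.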
In addition to backend analytics, you should employ inbuilt measurement tools at the content level, like multiple-choice questions, feedback forms and free-form comments. It’s all about building a picture of your audience, to inform your business decisions and aid future content creation.
Meet Plasmo: built by Salpo
If you’re ready to implement microlearning, check out Plasmo – our new digital engagement platform. Plasmo enables businesses to build dynamic multimedia content, share it with staff and customers, then measure and analyse their engagement.
Become a Plasmo beta-tester!
We’re inviting a limited number of people to beta-test Plasmo and provide their honest feedback, helping us to shape the future of this platform. It’s totally free to participate, with zero commitment. Spots are limited and subject to an initial qualifying call.