Over the past few weeks, in a range of different situations, I have bumped into a few assumptions that I have had to challenge. Working as a consultant, I regard it as part of my job to challenge existing mindsets where necessary. I also have no qualms about doing so within my community of practice.
So let's take a look at some of the issues I've addressed... and how:
"We need to track learning"
First of all, you can't track learning. The only way you can tell whether somebody has taken something on board is to observe their behaviour in the workplace. If it changes to incorporate the new material/process/whatever... then they have learned something.

The best you can do is track access to learning materials, and that is no indication that learning has taken place. If person X simply clicks 'next' every few seconds and keeps going right to the end, your LMS will tell you that they have successfully completed the course.

It is true that certain levels of tracking will allow you to check how long a user spent on each page, from which you can draw realistic conclusions about whether or not they actually read the material. But whoa! Who is actually going to do this job? Whose time can you afford to allocate to this task when there is so much real work to be done? And once they have identified that person X failed to spend long enough on pages 12, 45 and 67, what then? Are you really going to go after them with a big stick and force them to go back and do those pages again?
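Just to make the point, here is roughly what that checking job would look like if you automated it. This is a minimal sketch, not any real LMS's API: it assumes a hypothetical CSV export with columns learner, page and seconds, and an arbitrary cut-off for "didn't really read it".

```python
import csv
from collections import defaultdict
from statistics import median

# Hypothetical LMS export format: learner,page,seconds
# Flag anyone whose median time per page suggests pure click-through.
THRESHOLD_SECONDS = 10  # arbitrary cut-off; a real figure would need piloting

def flag_click_throughs(path):
    times = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            times[row["learner"]].append(float(row["seconds"]))
    return sorted(
        learner for learner, secs in times.items()
        if median(secs) < THRESHOLD_SECONDS
    )

if __name__ == "__main__":
    for learner in flag_click_throughs("lms_page_times.csv"):
        print(f"{learner}: probable click-through")
```

And even then, all this tells you is whom to chase with the big stick. It still tells you nothing about whether anybody learned anything.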
"We must have an assessment"
Let's just make one thing totally clear: a series of multiple choice questions with options so transparent that even the average Joe off the street could select the correct answer is not an assessment. It's an attendance register. Okay?

If yours is a regulated industry and you are obliged to have some kind of butt-covering tick box, then fine. But let's not pretend to each other that it is anything other than that. If that's not the case, why exactly do you want an assessment? You could provide a few thought-provoking scenarios. I'm all in favour of that, but do you really need to record some kind of test score? Would something along these lines not suffice?
Once again, the best way to assess whether people have learned anything is in the form of observable behaviour change on the job.
"People need to know this"
Really? Why? Because they need to observe it? Ah. So what you're actually after is not that they should know something, but that they should do something, right? Can we agree that knowing is not necessarily linked to doing? How many people know what the speed limit is in any given area? How many people observe it? Knowing isn't the goal.
Besides, let's face it, most 'policies' are pretty much common sense recorded in formal language with too many commas. In cases like this, I refer people to Cathy Moore's action mapping post. I've lost count of the number of people with whom I've shared that post!
"We need a half-hour elearning course on xyz"
Mostly when L&D people get this sort of request, they just nod and get on with it. I'd like to encourage them to push back. C'mon people: add a little value, already! Ask these questions:
- Why?
- What is it for?
- What will people do differently afterwards?
- Which of the organisation's strategic goals are being addressed, here?
"How can we design this so that it fits with what we can do in Articulate/Packager/X-tool?"
I get really uncomfortable when people adopt this approach: when all they have is a hammer, they try to figure out ways to turn everything into a nail. Does it have to be shiny? Sometimes the answer is absolutely yes, but not as often as we are led to believe. Sometimes all you need is a simple roadmap diagram, or a list of procedural steps with links to user-generated screen capture videos or testimonial clips taken with web or Flip cameras.