Wednesday, May 28, 2008

Assessment must assess!

I am busy with a project at the moment, on which a bit of a sticking point has developed around the assessments. Some of the assessment questions being put forward are, well, inappropriate.

One could argue (and I often do) that the most valid form of assessment is the performance management process. Your team leader and colleagues will be best positioned to assess whether your learning has resulted in any changed behaviours, increased skill, etc.

But there are times when compliance requirements must override such intangible evidence, and quantifiable evidence must be recorded. This generates the need for assessment.

Matters are not helped by the fact that the word that has passed into common parlance (thanks to the terminology used in the various applications) is "Quiz". It's hard to place much credence in an assessment that is termed a quiz, but it is what it is, so we'll move past that point.

The expectations of SMEs and sponsoring clients (and now I'm not just referring to my current project) seem often to be for multiple choice questions that are lifted straight from the text of the learning resource.

If it is possible for a person with no prior knowledge of a concept to take the assessment and pass, then I'd question the validity of the assessment. If the question is followed by a reference to the page on which the answer may be found, even more so! At the very least, the questions must call for inference. Primary school children do comprehension tests on extracts from literature, where the answers may be found verbatim in the text. Once they get to secondary school, children are expected to be able to draw conclusions and to conjecture based on the given text. Why, when we're dealing with work-based adults, do we feel the need to revert to the verbatim approach?

We should be able to say with confidence that passing an assessment is indicative of something. Based on the fact that Joe Bloggs has passed this assessment, I should be able to assume that he has a good working knowledge of whatever-it-is. If this is not the case, then surely the assessment is a waste of time and resources.

When there is a compliance requirement for an assessment, this usually means that there is a regulatory body in the background. Let's imagine that a representative of this body is visiting our site. Proudly we tell him/her that 97% of our staff have passed the assessment on, say, first aid in the workplace. The rep asks to see the assessment.

Q1. Do you know where the first aid kit is kept?
(a) Yes
(b) No
(Correct answer: yes)

Q2. The position in which we place a person is known as:
(a) The pike position
(b) The first aid position
(c) The missionary position
(d) The recovery position
(Correct answer: d)

I can't imagine that this will fill the rep with any sense of confidence that, were he/she to take a tumble during that visit, any number of employees would be able to provide appropriate first aid. In fact, perhaps it is worth mentioning that I have a certificate that says I am a qualified first aider, yet I faint at the sight of blood - surely the assessment should have taken this small impediment into consideration?

I would also argue that a straightforward multiple choice assessment has limited validity. There are all manner of urban legends that tell of small children/monkeys/random generators that pass such assessments. All these aside, there are limits as to the challenge that such questions can present. Although it is worth bearing in mind that, with the technology at our disposal, there is much that we can do with multiple choice - instead of four text options from which to select, the user might be faced with images, video/audio clips or cartoon strips.

The trouble with assessment questions that go beyond "select the correct answer" is that they are likely to require some human intervention. Even if the answers are a couple of lines of text, someone will have to "mark" them, since the automated options are a bit thin and unreliable on this front. And this is where the rope hits the rudder. Quite often, elearning resources are requisitioned to obviate the need for human involvement in learning and development. I have lost count of the number of times I have suggested a project-based learning-cum-assessment to be submitted to the line manager within a specified timeframe, only to be assured that (a) the learners wouldn't do it and (b) the line managers would never agree to it.

Sigh.

Let's look at it this way:

  • Do you really need a formal assessment? If not, skip it!
  • Why? Your justification of the requirement will shape the assessment.
  • How will it benefit the learner? And yes, this is the point... unless of course, this is purely a box ticking exercise.
  • What do you want it to prove? What conclusions should I be able to draw from each person's results on the assessment?
Once you have drawn up your assessment, you need to ask yourself whether it serves these purposes. If not, I suggest you go back to the drawing board.

And above all, it must be possible to fail. If you have created an assessment that anybody can pass then what you have is not an assessment at all. It is an attendance register.

4 comments:

N Winton said...

I love your final point about attendance registers… especially in light of the annual %age increase in A-Level results (see http://tinyurl.com/6btb6p for details).

At least in Scotland we only get a pass rate in the 70-80% range. Yes... we actually fail pupils!

It's a little depressing to realise that many of the VLEs that schools will be rolling out over the next few years will have little value for learning but will go a long way to increase the number of 'assessments'… er quizzes… that the poor pupils have to take. If we're not careful, we'll be so busy assessing them that we won't have time to teach them…

Karyn Romeis said...

@neil I see what you mean! I have to say that getting an A no longer carries the kudos it once did. In our year group, there were 5 A aggregates out of a total of 57 girls - few schools could boast a close to 1 in 11 rate. Even getting an A for a single subject was a significant achievement. I got 4 Cs, a D and an E and I was considered pretty darn smart. Today, those results would be very mediocre.

Mind you, that said, one of the A aggs went to a girl I never thought particularly smart.

Go figure.

Jago said...

First Aid courses and pointless questions - don't get me started! Our first-aid course too suffers from this particular problem. It's a book on batteries with search-and-recall questions basically.

Where are the cases and self-monitoring (Am I capable of doing this?)? They're nowhere to be found, but of course there's no time and money. So there you go, end of story. It's the sad truth oftentimes.

Karyn Romeis said...

@jago "A book on batteries" - I like that! I often wonder why we don't just give them a book instead of the things we're called upon to create.