Archive for the ‘assessment’ category

3 hot resources for best practice multiple-choice quizzing

27 April 2011

In my previous post, 14 reasons why your multiple-choice quiz sucks, I listed typical clangers that render your assessments ineffective.


If they’re the bad and ugly aspects of MCQ design, what’s the good?

To answer that question I hit Google and a couple of academic databases, but mostly in vain.

It may be due to my poor researching skills, but I found very little empirical evidence of best practice multiple-choice quizzing. Plenty of unsubstantiated opinion (of course) but not much science.


You see, Google wasn’t much help because “best practice” is frequently confused with “common practice” – but they’re not the same thing.

The peer-reviewed literature wasn’t much better. Alarmingly, many of the studies were inconclusive, adopted a flawed experimental design, and/or didn’t compare the performances of the quiz-takers on the job under the different treatments – which is the whole point!

However, through a combination of persistence, serendipity and social networking, I finally uncovered 3 resources that I consider worth recommending: a journal article, a website and a blog.

1. A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment – In this article, Thomas Haladyna, Steven Downing & Michael Rodriguez validate a taxonomy of 31 multiple-choice item-writing guidelines by reviewing 27 textbooks on educational testing and 27 research studies. If you want insight into the myriad of MCQ variables, here it is.

2. QuestionMark – David Glow and Jayme Frey independently pointed me to the wealth of resources on this website. QuestionMark is a business, granted, but they know what they’re talking about. They have a psychometrician on the payroll (cheers David), and I heard Eric Shepherd speak at LearnX last year and was very impressed.

3. The eLearning Coach – Connie Malamed is a qualified and experienced e-learning designer whose blog provides advice to fellow practitioners. I value Connie’s expertise because it is practical and she has implemented it in the real world.

If you are aware of other good MCQ resources – preferably evidence based – please share them here…


14 reasons why your multiple-choice quiz sucks

19 April 2011

Unlike some of my peers, I’m not an opponent of the multiple-choice quiz. It’s a convenient and efficient means of assessing e-learners, if it’s done well.

The problem, unfortunately, is that it’s often done poorly. So poorly, in fact, it’s embarrassing.


At my workplace, I am regularly subjected to the multiple-choice quiz.

In fact, over the years in a range of jobs across several industries, not to mention my time in high school and university, I have been subjected to countless multiple-choice quizzes.

So I feel eminently qualified to tell you why yours sucks…


  1. The questions don’t represent the learning objectives, so why did I waste my time doing the course?

  2. There aren’t many questions, so I should be able to bluff my way through them.

  3. The number of attempts is unlimited, so I’ll just keep guessing until I pass.

  4. The pass mark is low, but my customers aren’t so forgiving.

  5. The questions and answers aren’t randomised at each attempt, so I’ll jot down my responses to prepare for next time: 1-A, 2-D, 3-B…

  6. I’ll also ask my mate what he got for Questions 3, 6 & 8.

  7. The questions follow the order of the topics in the course, but I’m unlikely to encounter them in that order on the job.

  8. You use “All of the above” only when it’s correct.

  9. The ridiculous answer is obviously incorrect.

  10. The longest answer is probably correct, otherwise you wouldn’t have bothered writing it.

  11. More than one of the answers is arguably correct, and I’m shocked you didn’t know that.

  12. Your use of the double negative can only mean one thing: you can’t write no good.

  13. You try to trick me rather than confirm my understanding of the central concepts, so remind me: what’s the purpose of your role in this organisation?

  14. Your questions test my recall of obscure facts rather than my behaviour in authentic situations, so this exercise has no bearing on my performance.
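Points 2, 3 and 4 above compound each other, and the effect is easy to quantify. Treating every guess as an independent 1-in-4 chance – a simple binomial model, offered purely as illustration rather than anything your quiz tool computes – the odds of bluffing through collapse once the quiz is longer and the pass mark higher:

```python
import math

def pass_by_guessing(num_questions, num_options, pass_mark):
    """Probability of reaching the pass mark on a one-best-answer quiz
    by guessing every question independently (binomial model)."""
    p = 1 / num_options  # chance of guessing any single question right
    min_correct = math.ceil(pass_mark * num_questions)
    return sum(
        math.comb(num_questions, k) * p**k * (1 - p)**(num_questions - k)
        for k in range(min_correct, num_questions + 1)
    )

# A short quiz with a low pass mark is easy to bluff:
short_easy = pass_by_guessing(num_questions=5, num_options=4, pass_mark=0.6)

# A longer quiz with a higher pass mark is far harder:
long_strict = pass_by_guessing(num_questions=20, num_options=4, pass_mark=0.8)
```

With five four-option questions and a 60% pass mark, pure guessing passes roughly one attempt in ten – and with unlimited attempts, that’s a near-certainty within a lunch break. With twenty questions and an 80% pass mark, the same strategy succeeds well under once in a million attempts.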

Clearly I’m being deliberately pointed, but in all honesty, have you ever been guilty of one of those clangers?

I know I have.

But that’s no reason to vilify the multiple-choice quiz. When combined with other forms of assessment, it’s a valuable piece of the pedagogical puzzle.

Let’s just design it better.
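As a small example of designing it better, point 5 on the list is cheap to fix: shuffle both the question order and the option order on every attempt, and track the correct answer by its text rather than its letter. A minimal sketch, with a hypothetical data structure rather than any particular authoring tool’s API:

```python
import random

def randomised_attempt(questions):
    """Return a fresh copy of the quiz with question order and
    answer-option order shuffled, so a jotted-down answer key
    (1-A, 2-D, 3-B...) is useless on the next attempt."""
    attempt = []
    for q in random.sample(questions, k=len(questions)):  # shuffled question order
        options = q["options"][:]          # copy, so the source quiz is untouched
        random.shuffle(options)            # shuffled option order
        attempt.append({"prompt": q["prompt"], "options": options,
                        "answer": q["answer"]})  # correct answer tracked by text, not letter
    return attempt

quiz = [
    {"prompt": "2 + 2 = ?", "options": ["3", "4", "5"], "answer": "4"},
    {"prompt": "Capital of France?", "options": ["Paris", "Lyon", "Nice"], "answer": "Paris"},
]
attempt = randomised_attempt(quiz)
```

Because the correct answer is stored by value rather than position, shuffling costs nothing in marking logic.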


Online courses must die!

7 July 2010

A touch dramatic, isn’t it?

Now that I have your attention, please bear with me.

There’s method in my madness…

The myth of rapid authoring

The proliferation of so-called rapid authoring tools over the last few years has coincided with an explosion in the number of online courses developed in-house.

In the bad old days, technically challenged L&D professionals had to pay exorbitant fees to development houses to produce simple modules. These days, however, everyone seems to be creating their own online courses and distributing them via an LMS.

In tandem with this trend, though, has been the increasingly familiar cry of “It’s not interactive!”. Critics rail against boring page turners – and rightly so.


But you know what? Even when L&D professionals consciously integrate interactivity into their online courseware, I usually don’t think it’s all that engaging anyway. Increasing the number of clicks required to view the content does not make it more interactive. It just makes it annoying, especially for time-poor employees in the corporate sector.

Yes, I know you can embed real interactivity into courseware via games, branched simulations, virtual worlds etc, but hardly anyone does that. It requires time – which you don’t have because you’re too busy building the online course – or dollars – which defeats the purpose of developing it in-house!

So what’s the alternative?

Frankly, there’s nothing most online courses do that a PDF can’t. Think about it: PDFs display structured text and pretty pictures. Just like a typical online course, without the fancy software or specialist skills.

Anyone (and I mean just about anyone) can create and update a PDF. Suddenly SMEs are back in the game…

Write up a Word doc and convert it? Easy.

Update the Word doc and re-convert it? Easy.

Now that’s what I call rapid.

The best of both worlds

If we dispense with online courses in favour of PDFs, how can we incorporate interactivity into the learning experience?

Enter the Informal Learning Environment (ILE).

Occupying a place on the continuum somewhere between a VLE and a PLE, an ILE is managed by a facilitator on behalf of a group of learners.

Essentially, an ILE is a space (like a website or intranet site) that centralises relevant learning resources in a particular domain. The site may host some of those resources and point to others that exist elsewhere.

So your PDFs can go in there, but so too can your audio clips, videos, puzzles, games, quizzes and simulations. Don’t forget podcasts, RSS feeds, slideshows, infographics, animations, articles and real-life case studies. Not to mention blogs, wikis, discussion forums and social bookmarks.


Unlike a VLE, an ILE is strictly informal. The learners can explore its resources at their own pace and at their own discretion. No forced navigation, no completion status. In this sense, the pedagogy is constructivist.

Unlike a PLE, an ILE is communal. It exists to support a community of practice, whose members can (or more accurately, should) incorporate it into their own respective PLEs. In this sense, the pedagogy is connectivist.

But that’s not to say the pedagogy of an ILE can’t also be instructivist. The facilitator should provide a learning plan for novice learners that defines a sequence of study, identifying specific resources among the potentially overwhelming array of options.

The sky’s the limit

An ILE is a scalable and flexible learning environment. If we view each resource within that environment as a learning object, we can appreciate how easy it is to add new content, update old content, and remove obsolete content.


It’s incredibly inefficient to use up the precious time of an L&D professional to build, publish, test and upload an online course, only to edit, re-publish, re-test and re-upload it later, just because a few words need to be changed and a graph replaced. Instead, the SME can create and update the object via Word.

If you are keen on creating interactive tutorials, games or virtual worlds, now you can go for it! You have more time, and new tools are emerging that make these kinds of things easier to do. The finished product can be added to the ILE as another learning object. Again, if it needs to be updated later, there’s no need to edit, re-publish, re-test and re-upload a whole course – just that object.

If you commission an external developer to build a smokin’ hot immersive scenario, guess what: you add it to the ILE as another learning object. When it needs to be updated, you pay the developer to work on that object and that object only.

In this age of iPhones and Flip cameras, why not encourage your learners to generate their own content too? It’s another rich source of objects to add to the mix.

All these examples illustrate my central premise: when content is managed in the form of independent learning objects, it remains open and flexible, which means you can keep it current, relevant and organic.
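To make that premise concrete, here is a minimal sketch of an ILE modelled as a flat registry of independent learning objects. The function names and object shapes are hypothetical – not any particular platform’s API – but they show how add, update and remove each touch one object only, never a monolithic course:

```python
# Hypothetical ILE: a flat registry of independent learning objects.
ile = {}

def add_object(obj_id, kind, location):
    """Register a new learning object (PDF, video, simulation, ...)."""
    ile[obj_id] = {"kind": kind, "location": location}

def update_object(obj_id, location):
    """Point an existing object at fresh content, e.g. a re-converted PDF."""
    ile[obj_id]["location"] = location

def remove_object(obj_id):
    """Retire obsolete content without republishing anything else."""
    del ile[obj_id]

add_object("widgets-101", "pdf", "/docs/widgets-101-v1.pdf")
add_object("widget-sim", "simulation", "https://example.com/widget-sim")
update_object("widgets-101", "/docs/widgets-101-v2.pdf")  # SME edits the Word doc, re-converts
remove_object("widget-sim")  # the rest of the ILE is untouched
```

The design point is the isolation: each operation is scoped to a single object, so a changed graph or a few edited words never trigger a full build-publish-test-upload cycle.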

Rockin’ role

Under the ILE model, the role of the L&D professional finally evolves.

The SME is empowered to produce content, which frees you up to apply your own expertise: instructional design. This may involve a greater focus on engagement and interactivity.

The responsibility of learning is assigned back to the learners, which frees you up to guide, scaffold, encourage, discuss, prompt, probe, challenge and clarify. In other words, facilitate learning.

Your value in the organisation goes through the roof!

Take the ass out of assessment

I claimed earlier that there’s nothing most online courses do that a PDF can’t. One glaring omission was assessment – and I left it out on purpose.

There are just some things that the company must know that you know. You get no argument from me on that.

However, how we assess that knowledge is bizarrely old fashioned.


While it’s convenient to wrap up some content and a quiz into a single package, I just don’t see the point from an instructional design perspective. Forcing someone to register into a course, just to pass a dinky quiz at the end, doesn’t make a lot of sense to me.

It is widely acknowledged that the vast majority of learning in the workplace is informal. From exploring an ILE to chatting around the water cooler, there are myriad ways that people learn stuff. Assessment should represent the sum of that learning.

This is where the LMS comes in. In my view it should manage assessment, not content. More specifically, it should deliver, track and record standalone tests that are linked to particular competencies.

When the LMS is used in this way, the L&D model aligns more closely with the learning process. The employees learn informally all over the place, using an ILE as their central support resource, then (if necessary) they record their competence. The focus of measurement shifts from activity to outcome.
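As a sketch of what that shift looks like in data terms – the record shape and function below are hypothetical, not any real LMS schema – the only thing stored is a demonstrated competency with its evidence, never a course-completion status:

```python
from datetime import date

# Hypothetical "assessment, not content" LMS store: one record per
# demonstrated competency, regardless of how the knowledge was gained.
competency_records = []

def record_test_result(employee, test_id, competency, score, pass_mark=0.8):
    """Record an outcome (competence demonstrated) rather than an
    activity (course completed). Returns True if the pass mark was met."""
    if score >= pass_mark:
        competency_records.append({
            "employee": employee,
            "competency": competency,
            "evidence": test_id,   # the standalone test that proved it
            "date": date.today().isoformat(),
        })
        return True
    return False

# Whether the knowledge came from the ILE, the water cooler or prior
# experience, the same standalone test records the same competency:
record_test_result("alice", "test-safety-01", "workplace-safety", score=0.9)
```

Note what is absent: there is no notion of enrolment, navigation or completion – the measurement is the outcome itself.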

This unorthodox approach makes many people nervous. Their primary concern is that someone can jump straight onto the test and pass it immediately, without ever “doing the course”. In response, I make these three points:

  1. You can jump straight to the assessment in most online courses anyway.

  2. If someone bluffs their way through the assessment and passes, clearly it wasn’t robust enough. That’s your fault.

  3. Conversely, if someone passes the assessment because they already have the knowledge, what’s the problem? You are recording competence, not making people’s lives difficult.

Of course, this kind of nervousness isn’t confined to the corporate sector nor to e-learning. For example, many universities have a minimum 80% attendance policy for face-to-face lectures. I don’t see the point of turning up just to fall asleep with my eyes open, but that’s another story!

The method in my madness

Online courses must die because they are unsustainable in the modern workplace. They aren’t rapid, flexible or scalable, and they usually don’t take full advantage of their medium anyway.

So unlock your content and manage it in the form of individual learning objects in an ILE.

Shift the bulk of the content to PDF. In the age of e-readers, no one will notice much difference.

By all means invest in authoring tools, but only in ones that will help you create interactive and engaging objects – easily.

Exploit Web 2.0.

Use standalone tests to record competence on your LMS. They cover all sources of knowledge.

Informalise learning. Formalise assessment.