
E-Learning Provocateur: Volume 2

17 September 2012

Following the modest success of my first book, I decided to fulfil the promise of its subtitle and publish E-Learning Provocateur: Volume 2.

The volume collates my articles from this blog. As in the first volume, my intent is to provoke deeper thinking across a range of e-learning-related themes in the workplace, including:

•   social business
•   informal learning
•   mobile learning
•   microblogging
•   data analysis
•   digital influence
•   customer service
•   augmented reality
•   the role of L&D
•   smartfailing
•   storytelling
•   critical theory
•   ecological psychology
•   online assessment
•   government 2.0
•   human nature

Order your copy now at Amazon.

3 hot resources for best practice multiple-choice quizzing

27 April 2011

In my previous post, 14 reasons why your multiple-choice quiz sucks, I listed typical clangers whose only purpose is to render your assessments ineffective.


If they’re the bad and ugly aspects of MCQ design, what’s the good?

To answer that question I hit Google and a couple of academic databases, but mostly in vain.

It may be due to my poor researching skills, but I found very little empirical evidence of best practice multiple-choice quizzing. Plenty of unsubstantiated opinion (of course) but not much science.


You see, Google wasn’t much help because “best practice” is frequently confused with “common practice” – but they’re not the same thing.

The peer-reviewed literature wasn’t much better. Alarmingly, many of the studies were inconclusive, adopted a flawed experimental design, and/or didn’t compare the performances of the quiz-takers on the job under the different treatments – which is the whole point!

However, through a combination of persistence, serendipity and social networking, I finally uncovered 3 resources that I consider worth recommending: a journal article, a website and a blog.

1. A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment – In this article, Thomas Haladyna, Steven Downing & Michael Rodriguez validate a taxonomy of 31 multiple-choice item-writing guidelines by reviewing 27 textbooks on educational testing and 27 research studies. If you want insight into the myriad of MCQ variables, here it is.

2. QuestionMark – David Glow and Jayme Frey independently pointed me to the wealth of resources on this website. QuestionMark is a business, granted, but they know what they’re talking about – a claim backed up by the fact that they have a psychometrician on the payroll (cheers David). I also heard Eric Shepherd speak at LearnX last year, and I was very impressed.

3. The eLearning Coach – Connie Malamed is a qualified and experienced e-learning designer whose blog provides advice to fellow practitioners. I value Connie’s expertise because it is practical and she has implemented it in the real world.

If you are aware of other good MCQ resources – preferably evidence based – please share them here…
 

14 reasons why your multiple-choice quiz sucks

19 April 2011

Unlike some of my peers, I’m not an opponent of the multiple-choice quiz. It’s a convenient and efficient means of assessing e-learners, if it’s done well.

The problem, unfortunately, is that it’s often done poorly. So poorly, in fact, it’s embarrassing.


At my workplace, I am regularly subjected to the multiple-choice quiz.

In fact, over the years in a range of jobs across several industries, not to mention my time in high school and university, I have been subjected to countless multiple-choice quizzes.

So I feel eminently qualified to tell you why yours sucks…


  1. The questions don’t represent the learning objectives, so why did I waste my time doing the course?

  2. There aren’t many questions, so I should be able to bluff my way through them.

  3. The number of attempts is unlimited, so I’ll just keep guessing until I pass.

  4. The pass mark is low, but my customers aren’t so forgiving.

  5. The questions and answers aren’t randomised at each attempt, so I’ll jot down my responses to prepare for next time: 1-A, 2-D, 3-B…

  6. I’ll also ask my mate what he got for Questions 3, 6 & 8.

  7. The questions follow the order of the topics in the course, but I’m unlikely to encounter them in that order on the job.

  8. You use “All of the above” only when it’s correct.

  9. The ridiculous answer is obviously incorrect.

  10. The longest answer is probably correct, otherwise you wouldn’t have bothered writing it.

  11. More than one of the answers is arguably correct, and I’m shocked you didn’t know that.

  12. Your use of the double negative can only mean one thing: you can’t write no good.

  13. You try to trick me rather than confirm my understanding of the central concepts, so remind me: what’s the purpose of your role in this organisation?

  14. Your questions test my recall of obscure facts rather than my behaviour in authentic situations, so this exercise has no bearing on my performance.

Clearly I’m being deliberately pointed, but in all honesty, have you ever been guilty of one of those clangers?

I know I have.

But that’s no reason to vilify the multiple-choice quiz. When combined with other forms of assessment, it’s a valuable piece of the pedagogical puzzle.

Let’s just design it better.
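To make a couple of those design fixes concrete – limited attempts (clanger 3) and per-attempt randomisation of questions and answers (clanger 5) – here is a minimal sketch in Python. The function name and the question data are hypothetical, purely for illustration; any real quiz engine or LMS will have its own machinery for this.

```python
import random

def randomised_attempt(questions, rng=random):
    """Return a fresh copy of a quiz with the question order and the
    answer options shuffled, so a cheat sheet like '1-A, 2-D, 3-B'
    is useless on the next attempt."""
    attempt = []
    for q in rng.sample(questions, len(questions)):  # shuffle question order
        options = list(q["options"])
        rng.shuffle(options)                         # shuffle option order
        attempt.append({
            "stem": q["stem"],
            "options": options,
            "answer": q["answer"],                   # track the answer by text, not by letter
        })
    return attempt

# Hypothetical question bank, for illustration only
questions = [
    {"stem": "2 + 2 = ?", "options": ["3", "4", "5"], "answer": "4"},
    {"stem": "Best practice equals common practice?",
     "options": ["True", "False"], "answer": "False"},
]

attempt = randomised_attempt(questions)
```

Because the correct answer is stored as text rather than as a position or letter, it survives the shuffle – which is the detail that defeats the jot-it-down strategy.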


