
The joy of UX

8 September 2015

One of the funniest tweets I have ever seen was brought to my attention by Vala Afshar…

Seeing this little animation was one of those serendipitous moments, as I had that very day experienced something eerily similar.

I’ve written previously about how I’ve been toying around with the augmented reality app Aurasma. In A way with the fairies I described how I used this app to replicate Disney’s fairy trail in my local botanic garden.

Impressed with what the app can do, I turned my attention to using it in the workplace. I decided to start small by using it to promote a new online course that my team was launching. I took a screenshot of the main characters in the course’s scenario and provided 3-step instructions for the target audience outlining how to: (1) Install the app onto their mobile device; (2) Visit the relevant URL in their browser; and (3) Point their device at the picture. When they did so, the names of the characters would magically appear above their heads.

This wasn’t just a gimmick; it was a proof of concept. By starting small, I wanted to test it cheaply and fail quickly. And fail I did.

[Image: Deflated sports mascot]

When I asked several of my tech-savvy colleagues to test it, every one of them reported back saying it didn’t work. Huh? It worked for me! So what could be the problem?

After much tinkering and re-testing in vain, I decided to ask a friend of mine to test it. Bang, it worked on her first go. As it turns out, my colleagues simply weren’t following the second instruction to go to the URL. In their excitement to scan the image, they did so immediately after installing the app – but of course without the link, the app had nothing to connect the image to my augmentation. So when I pointed out that they had skipped Step 2 and they tried again, voilà, it worked.

Despite this rough start, another colleague of mine cottoned on to my trial and was keen to use the idea to jazz up a desk-drop he was creating. Upon scanning the trigger image, he wanted a video to play. Aurasma can indeed do this, but I was trepidatious because my experiment had failed even with tech-savvy colleagues – let alone regular folks. But I decided to look on the bright side and consider this an opportunity to expand my sample size.

Learning from my mistakes, I re-worded the 3-step instructions to make them clearer, and this time I asked a colleague to test it in front of me. But again we ran into trouble. This fellow did follow Step 2, but when the URL opened the app, it immediately required him to scroll through a tutorial. Then it asked him to sign up. Argh… these steps were confusing… and I was oblivious to it because I had installed Aurasma ages ago and had long since done the tutorial and signed up.

But that wasn’t all. After I shepherded my colleague through to Step 3, he held out his smartphone and pointed it at the image like a lightsaber. WTF? He had read the instruction to “point” his device literally.

Another lesson learned.

[Image: Facepalm]

Steve Jobs famously obsessed over making his products insanely simple. Apple goodies don’t come with user manuals because they don’t need them.

My experience is certainly a testament to that philosophy.

Three steps were evidently too many for my target audience to handle. The first step appeared simple enough: millions of people go to the App Store or Google Play to install millions of apps. And indeed, no one in my test balked at that. (Although convincing IT to tolerate a 3rd-party app would have been my next challenge.)

Similarly, the third step was easy enough once re-worded to “point your device’s camera at the image”.

The second step was the logjam. Not only is it unintuitive to open your browser straight after installing a new app, but dutifully following this instruction mires you in yet more complexity. Sure, there is an alternative: search for the specific channel within the Aurasma app and then follow it – but that too is problematic, as the user has to click a tab to filter the channel-specific results, and it’s academic anyway if you don’t want the channel to be public.

I understand why Aurasma links images to augmentations via specific channels. Imagine how the public would augment certain corporate logos, for example; those corporations wouldn’t want anything derogatory propagated across the general Aurasmasphere. Yet they hold the rights over their IP, so I would’ve thought that cutting off Joe Public’s inappropriate augmentation would be a matter of sending a simple email request to the Aurasma folks. Not to mention it would be in the corporation’s best interest to augment its own logo.

Anyway, that’s all a bit over my head. All I know is that requiring the user to follow a particular channel complicates the UX.

So that has caused me to wind down my plans for augmented domination. I am still thinking of using Aurasma: we might use it in our corporate museum to bring our old photos and artefacts to life. But if we go down this road, I’ll recommend that we provide a loan device with everything already set up and ready to go – like MONA does.

In the meantime, I’ll investigate other AR apps.


3 hot resources for best practice multiple-choice quizzing

27 April 2011

In my previous post, 14 reasons why your multiple-choice quiz sucks, I listed typical clangers whose only purpose is to render your assessments ineffective.

[Image: Thumbs down]

If they’re the bad and ugly aspects of MCQ design, what’s the good?

To answer that question I hit Google and a couple of academic databases, but mostly in vain.

It may be due to my poor researching skills, but I found very little empirical evidence of best practice multiple-choice quizzing. Plenty of unsubstantiated opinion (of course) but not much science.


You see, Google wasn’t much help because “best practice” is frequently confused with “common practice” – but they’re not the same thing.

The peer-reviewed literature wasn’t much better. Alarmingly, many of the studies were inconclusive, adopted a flawed experimental design, and/or didn’t compare the performance of the quiz-takers on the job under the different treatments – which is the whole point!

However, through a combination of persistence, serendipity and social networking, I finally uncovered 3 resources that I consider worth recommending: a journal article, a website and a blog.

1. A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment – In this article, Thomas Haladyna, Steven Downing & Michael Rodriguez validate a taxonomy of 31 multiple-choice item-writing guidelines by reviewing 27 textbooks on educational testing and 27 research studies. If you want insight into the myriad of MCQ variables, here it is.

2. QuestionMark – David Glow and Jayme Frey independently pointed me to the wealth of resources on this website. QuestionMark is a business, granted, but they know what they’re talking about – a claim backed up by the fact that they have a psychometrician on the payroll (cheers David). I also heard Eric Shepherd speak at LearnX last year and was very impressed.

3. The eLearning Coach – Connie Malamed is a qualified and experienced e-learning designer whose blog provides advice to fellow practitioners. I value Connie’s expertise because it is practical and she has implemented it in the real world.

If you are aware of other good MCQ resources – preferably evidence based – please share them here…
 

14 reasons why your multiple-choice quiz sucks

19 April 2011

Unlike some of my peers, I’m not an opponent of the multiple-choice quiz. It’s a convenient and efficient means of assessing e-learners, if it’s done well.

The problem, unfortunately, is that it’s often done poorly. So poorly, in fact, it’s embarrassing.

[Image: oops!]

At my workplace, I am regularly subjected to the multiple-choice quiz.

In fact, over the years in a range of jobs across several industries, not to mention my time in high school and university, I have been subjected to countless multiple-choice quizzes.

So I feel eminently qualified to tell you why yours sucks…

[Image: Thumbs down]

  1. The questions don’t represent the learning objectives, so why did I waste my time doing the course?

  2. There aren’t many questions, so I should be able to bluff my way through them.

  3. The number of attempts is unlimited, so I’ll just keep guessing until I pass.

  4. The pass mark is low, but my customers aren’t so forgiving.

  5. The questions and answers aren’t randomised at each attempt, so I’ll jot down my responses to prepare for next time: 1-A, 2-D, 3-B…

  6. I’ll also ask my mate what he got for Questions 3, 6 & 8.

  7. The questions follow the order of the topics in the course, but I’m unlikely to encounter them in that order on the job.

  8. You use “All of the above” only when it’s correct.

  9. The ridiculous answer is obviously incorrect.

  10. The longest answer is probably correct, otherwise you wouldn’t have bothered writing it.

  11. More than one of the answers is arguably correct, and I’m shocked you didn’t know that.

  12. Your use of the double negative can only mean one thing: you can’t write no good.

  13. You try to trick me rather than confirm my understanding of the central concepts, so remind me: what’s the purpose of your role in this organisation?

  14. Your questions test my recall of obscure facts rather than my behaviour in authentic situations, so this exercise has no bearing on my performance.

Clearly I’m being deliberately pointed, but in all honesty, have you ever been guilty of one of those clangers?

I know I have.

But that’s no reason to vilify the multiple-choice quiz. When combined with other forms of assessment, it’s a valuable piece of the pedagogical puzzle.

Let’s just design it better.
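To make a few of those points concrete – particularly 3, 5 and 7 – here is a minimal sketch in Python of how a quiz engine might cap attempts and shuffle both the question order and the option order on every attempt, scoring each answer by its value rather than its position. The question data and names below are entirely hypothetical, not taken from any real course or tool.

```python
# A minimal sketch (hypothetical data and names) of a quiz engine that
# addresses points 3, 5 and 7: limit attempts, and shuffle question order
# and option order on every attempt.
import random

QUESTION_BANK = [
    # Each question stores its correct answer by value, not by position,
    # so shuffling the options never breaks the scoring.
    {"stem": "Which step links the trigger image to the augmentation?",
     "options": ["Installing the app", "Following the channel", "Pointing the camera"],
     "answer": "Following the channel"},
    {"stem": "What should a pass mark reflect?",
     "options": ["The LMS default", "What customers will tolerate", "A round number"],
     "answer": "What customers will tolerate"},
]

MAX_ATTEMPTS = 2   # point 3: don't allow unlimited guessing
PASS_MARK = 0.8    # point 4: set the bar where the job sets it


def build_attempt(bank):
    """Return a fresh attempt: questions and options in a new random order."""
    attempt = []
    for q in random.sample(bank, k=len(bank)):  # points 5 & 7: new question order
        options = random.sample(q["options"], k=len(q["options"]))  # new option order
        attempt.append({"stem": q["stem"], "options": options, "answer": q["answer"]})
    return attempt


def score(attempt, responses):
    """responses[i] is the text of the option chosen for attempt[i]."""
    correct = sum(1 for q, r in zip(attempt, responses) if r == q["answer"])
    return correct / len(attempt)


if __name__ == "__main__":
    for attempt_number in range(1, MAX_ATTEMPTS + 1):
        attempt = build_attempt(QUESTION_BANK)
        # Simulate a learner who picks the first option every time (the bluff in point 2).
        responses = [q["options"][0] for q in attempt]
        result = score(attempt, responses)
        print(f"Attempt {attempt_number}: {result:.0%} - "
              f"{'pass' if result >= PASS_MARK else 'fail'}")
        if result >= PASS_MARK:
            break
```

The key design choice is storing the correct answer as a value rather than an index, so re-ordering the options can never break the scoring.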

[Image: Chicken takes a multiple-choice quiz]