Categories: instructional design, motivation
Tags: engagement, formal learning, informal learning, instructional design, learning, learning styles, motivation, preferences, teaching, training, workplace
The L&D community may be divided into two camps: (1) Those for whom the mere mention of learning styles makes their blood boil; and (2) Those who are inexplicably unaware of the hullabaloo and are thus oblivious to the aforementioned boiling of blood.
Credit: Based on original artwork by Allie Brosh in This is Why I’ll Never be an Adult, Hyperbole and a Half.
The antagonism stems from the popularity of learning styles in the educational discourse – not to mention vocational curricula – despite a lack of empirical evidence supporting their effectiveness when incorporated into instructional design. The argument is that in the absence of such evidence, don’t waste time and money trying to match your teaching style to everyone’s learning styles; instead, divert that energy towards other, evidence-based pedagogy.
This is sound advice.
Nonetheless, I urge my peers not to throw the baby out with the bath water. By this I mean regardless of the existence or impact of learning styles, a phenomenon that enjoys universal recognition is that of learner preferences. And I fear it may be an unintended casualty of the war on learning styles.
For example, a deduction from the literature might be that a teacher need not tailor his or her delivery to meet the needs of the audience. Since learning styles are bunk, I can do what I like because it won't make a difference anyway. Such a view is conveniently teacher-centric, and it flies in the face of the thought leadership on learner-centredness that we have advanced so far. Sure, the deduction may be unreasonable, but extremists rarely listen to reason.
However, a more insidious factor is the dominance of the literature on formal learning. Studies of the impact of learning styles are typically based on teaching in a classroom setting, often in the K-12 sector. Furthermore, the statistics are based on scores achieved via formal assessment. Yet we know that in the workplace the vast majority of learning is informal.
Let me illustrate my concern here with a personal example. When I need to find out how to perform a particular task in a particular software program, I strongly prefer text-based instructions over video. I’m annoyed by having to play a clip, wait for it to load, and then wait for the presenter to get to the bit that is relevant to me. Instead, I prefer to scan the step-by-step instructions at my own speed and get on with it.
Now, if only video were available and I weren't such a diligent employee, I might postpone the task or forget about it altogether. Yet if you were to put me in a classroom, force me to watch the video, then test my ability to perform the task – sure, I'll ace it. But that's not the point.
The point is that the learner's preference hasn't been taken into account in the instructional design, and that can affect his or her performance in the real world.
If you don’t agree with me, perhaps because you happen to like video, suppose a manual was the only form of instruction available. Would you read it? Perhaps you would because you are a diligent employee.
In case your blood is beginning to boil, let me emphasise: (1) Learning styles appear to have no significant effect on learning outcomes; and (2) The nature of the content probably dictates its most effective mode of delivery.
If we assume that learning styles are highly correlated with learner preferences – indeed, for some they are synonymous – then we might be tempted to conclude that learner preferences have no significant effect on learning outcomes. I consider this a false conclusion.
Indeed, in a controlled environment, learner preferences don't really matter. The participants are forced to do it whether they like it or not, or they somehow feel obliged to comply.
Outside of the controlled environment, however, learner preferences do matter. We sometimes see this in formal settings (which is why universities enforce a minimum percentage of lecture attendance), but it appears most starkly in informal settings where the learner is empowered to do it or not. If they don’t like doing it, odds are they won’t.
So we need to be mindful of the interaction between pedagogical effectiveness and learner preference. An experience that your learners love but is ineffective is ultimately worthless. But so too is an experience that is effective but your learners loathe.
As a profession we need to aim for experiences that are both effective and liked by our audience – or at the very least, don’t turn them away.
Categories: content curation, instructional design
Tags: #wolweek, content, curation, framework, informal learning, instructional design, let me, model, motivation, show me, tell me
In conversation at EduTECH earlier this month, Harold Jarche invoked George E. P. Box's dictum that “all models are wrong, but some are useful”.
Of course, the purpose of a model is to simplify a complex system so that something purposeful can be done within it. By definition, then, the model can only ever be an approximation of reality; and given human error, it won't even be as close an approximation as it could be.
Nevertheless, if we accept the inherent variability in (and fallibility of) the model, we can achieve a much better outcome by using it than by not.
It is with this in mind that I have started thinking about a model – or perhaps more accurately, a framework – for content curation.
I have grown weary of the hotchpotch lists of resources that we L&D pros tend to cobble together. Sure, they may be thoughtfully filtered and informatively annotated, but a hotchpotch is a hotchpotch. I should know: I've used them as a student, I've seen my peers create them, and I've created them myself.
Surely we can put more design into our curation efforts so that the fruits of our labour are more efficient, meaningful, and effective…?
Consider the trusty instructional design heuristic of Tell Me, Show Me, Let Me, Test Me. As far as heuristics go, I’ve found this to be a good one. It reminds us that transmission is ineffective on its own; learners really need to see the concept in action and give it a go themselves. As the Chinese saying goes, “Tell me and I forget. Show me and I remember. Involve me and I understand.” *
* Truisms such as this one are typically met with suspicion from certain quarters of the L&D community, but in this case the research on the comparative efficacies of lectures, worked examples, PBL, etc. appears to add up.
As a framework for content curation, however, I feel the heuristic doesn’t go far enough. In an age in which learners in the workplace are expected to be more autodidactic than ever before, it needs refurbishment to remain relevant.
So I propose the following dimensions of a new-and-improved framework…
Attract me

An important piece of content curated for the target audience is one that attracts them to the curation in the first place, and promotes word-of-mouth marketing among their colleagues.
While related to the subject matter, this content need not be “educational” in the traditional sense. Instead, its role is to be funny, fascinating or otherwise engaging enough to pull the learners in.
Motivate me

As learning in the workplace inevitably informalises, the motivation of employees to drive their own development becomes increasingly pivotal to their performance.
Old-school extrinsic motivators (such as attendance rosters and exams) don’t exist in this space, so the curator needs to convince the audience to proceed. Essentially this means putting the topic into context for them, clarifying how it relates to their role, and explaining why they should bother learning it.
Tell me

This content is new knowledge. I recommend covering only one key concept (or a few at most) to reduce cognitive load. It's worth remembering that education is not the provision of information; it is sense-making.
It’s important for this content to actually teach something. I see far too much curation that waxes lyrical “about” the subject, yet offers nothing practical to be applied on the job. We’re beyond the sales pitch at this stage; give ’em something they can use.
Show me

This content demonstrates the “Tell me” content in action, so the employee can see what the right behaviour looks like, and through that make further sense of the concept.
Real-world scenarios are especially powerful.
Let me

By putting the content into practice, the learner puts his or her understanding to the test.
Interactive exercises and immersive simulations – with feedback – allow the learner to play, fail and succeed in a safe environment.
Help me

This content jumps the knowing-doing gap by helping the learner apply the concepts back on the job.
This is principally achieved via job aids, and perhaps a social forum to facilitate ad hoc Q&A.
Extend me

This content assists the employee who is keen to learn more by raising their awareness of other learning opportunities. These might explore the concepts in more depth, or introduce other concepts more broadly.
All that extra curation that we would have been tempted to shove under “Tell me” can live here instead.
Value me

Everyone is an SME in something, so everyone has an opportunity to participate in the curation effort. Whether the content they use is self-generated or found elsewhere, it is likely to be useful for their colleagues too.
Leverage this opportunity by providing a mechanism by which anyone can contribute better content.
As you have no doubt deduced by now, the overarching theme of my proposed framework is “less is more”. It values quality over quantity.
It may prove useful beyond curation too. For example, it may inform the sequence of an online course. (In such a circumstance, a “Test me” dimension might be inserted after “Let me” to add summative assessment to the formative.)
In any case, it is very much a work in progress. And given it is #wolweek, I ask you… What are your thoughts?
Categories: augmented reality
Tags: app, AR, augmented reality, Aurasma, Disney Fairies Trail, fauxmented reality, m-learning, mobile, mobile learning, transparency
One of the greatest hoaxes to be perpetrated last century was that of the Cottingley Fairies.
In 1917, 9-year-old Frances Griffiths and her 16-year-old cousin, Elsie Wright, borrowed Elsie’s father’s camera to take a photograph of the fairies they claimed lived down by the creek. Sure enough, when he developed the plate, Elsie’s father saw several fairies frolicking in front of Frances’s face.
A couple of months later the girls took another photograph, this time showing Elsie playing with a gnome. While her father immediately suspected a prank, her mother wasn’t so sure and she took the photos to a local spiritualist meet-up. From there the photos, and three others taken subsequently by the girls, eventually hit the press and became a worldwide sensation.
Although the photos were quite real, the fairies of course were fake. In 1983, the cousins (by now old ladies) finally confessed that the fairies were cardboard cut-outs.
Not everyone – it must be said – had been fooled. In fact, most probably weren't. However, one who took the bait hook, line and sinker was none other than Sir Arthur Conan Doyle, whose involvement helped catapult the photographs to public attention in the first place.
It must also be said that as a devout spiritualist, Sir Arthur wanted to believe.
I was reminded of the Cottingley affair when I stumbled upon the Disney Fairies Trail app.
Developed in association with the Botanic Gardens Trust, the app promised to bring Tinker Bell and her flighty friends to life, using our beautiful public spaces as the backdrop.
Despite not being a member of the app’s target audience, I am an augmented reality advocate, so I downloaded the app and installed it on my iPad.
Its instructions were simple:
- Choose a botanic garden.
- Start the trail.
- Use the map to help find all the fairy locations.
- Your device will vibrate when there is a fairy nearby.
- Use your device to find the fairy.
- Tap the fairy to reveal their [sic] secrets.
I wanted to believe, so I hotfooted my way to the Royal Botanic Garden in Sydney to give it a go. Unfortunately the experience was less than optimal.
From the get-go, the map was so high-level that it was effectively useless.
After wandering semi-randomly around the park, I stumbled upon a tiny sign with an arrow promising that fairies live over there. Yet after crisscrossing my way all over the vicinity, my device never vibrated.
So I moved on. After a while I stumbled upon another tiny sign promising that the fairy trail continued this way, but after a short distance the path split out into multiple alternatives, none of which were sign posted.
After some more rambling, I finally stumbled upon a nice big sign declaring that fairies live here. Alas, still no vibrating – but I had a thought… perhaps the app doesn’t like my iPad? So I whipped out my Android smartphone and downloaded the app to it. It wouldn’t be as fun on the smaller screen, but I was determined to see a friggin fairy.
But it didn’t work on my smartphone either.
I should have read the customer reviews on the App Store before going to so much effort. Clearly I wasn't the only one who had trouble with the app. And this bewilders me.
In retrospect, unfortunately, it was no surprise that the real-life aspect of the experience didn't work either. To put my opinion into context, the Botanic Gardens Trust is the organisation that allows hordes of boot campers to bully tourists and locals alike off the park's footpaths; and whose own staff choose the peak CBD lunch hour to drive their trucks and trailers along said paths.
But Disney! How could Disney fail to bring Tinker Bell & Co to life? The makers of masterpieces such as Toy Story and The Lion King evidently couldn’t engineer a dinky little augmented reality app that would work on my device of choice – or even on my device of second choice.
It just goes to show, if you want something done right, you have to do it yourself. So I did.
I discovered the Aurasma app a while ago, and I’ve been toying around with it to get a sense of how it works.
The app allows you to upload an image or a video, which appears or plays when you scan a real-world trigger with your device’s camera, hence augmenting reality.
I haven’t yet encountered a burning need to use it for an educational purpose at my workplace, but when I had trouble with the Fairies Trail app, I decided to see if I could replicate the intended experience.
So I downloaded Green Fairy 3 by TexelGirl, uploaded her to Aurasma, and associated her with a rose in the garden. When I scanned my iPad’s camera over the rose… Voila! She appeared.
As you can see via my screenshot, Aurasma supports the binary transparency of PNG files; with the background of the fairy image invisible, the real background shines through. The app also supports the partial transparency of PNG files; if I were to make the fairy 50% transparent, the real background would be partially visible through her.
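To make that concrete, here's a minimal sketch of the same compositing in Python with the Pillow imaging library (the file names and pixel offset are placeholders of my own, not anything Aurasma exposes). The fairy's alpha channel determines where, and how much of, the real background shows through:

```python
# A minimal sketch of PNG alpha compositing with Pillow (pip install Pillow).
# "rose_photo.jpg" and "green_fairy.png" are hypothetical placeholder files.
from PIL import Image

background = Image.open("rose_photo.jpg").convert("RGBA")  # the real scene
fairy = Image.open("green_fairy.png").convert("RGBA")      # PNG with alpha

# Binary transparency: where the fairy's alpha is 0, the background shows
# through untouched; where it is 255, the fairy is fully opaque.
overlaid = background.copy()
overlaid.alpha_composite(fairy, dest=(200, 150))           # arbitrary offset
overlaid.convert("RGB").save("fairy_on_rose.jpg")

# Partial transparency: halve the alpha channel, so the background remains
# half-visible through the fairy herself.
r, g, b, a = fairy.split()
ghost = Image.merge("RGBA", (r, g, b, a.point(lambda v: v // 2)))
ghosted = background.copy()
ghosted.alpha_composite(ghost, dest=(200, 150))
ghosted.convert("RGB").save("ghost_fairy_on_rose.jpg")
```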
My fairy was a static image, so she wasn't moving. While Aurasma doesn't seem to support animated GIFs, it does support video. However, there appears to be a conspicuous problem: my understanding is that the MP4 format does not support transparency, and while FLV does, it won't run on the iPad. I tweeted the Aurasma folks asking them to clarify this, but they are yet to respond.
Nevertheless, I discovered a work-around which is to duplicate the real background in the background of your video clip. That way when the video launches, it appears that its background is the real background. (I suspect this is how Dewars brought Robert Burns to life.)
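As a rough illustration of the work-around (again a Python/Pillow sketch, with placeholder file names and an assumed directory of transparent fairy frames), each frame is flattened onto a photograph of the real background before being encoded as an ordinary, alpha-free MP4:

```python
# A sketch of the work-around: bake the real background into every frame,
# then encode to MP4, which carries no alpha channel of its own.
# "rose_photo.jpg" and "fairy_frames/" are hypothetical placeholders.
import glob
import os

from PIL import Image

os.makedirs("baked", exist_ok=True)
background = Image.open("rose_photo.jpg").convert("RGBA")

for i, path in enumerate(sorted(glob.glob("fairy_frames/*.png"))):
    fairy = Image.open(path).convert("RGBA")
    frame = background.resize(fairy.size)  # match the clip's dimensions
    frame.alpha_composite(fairy)           # flatten the fairy onto the scene
    frame.convert("RGB").save(f"baked/frame_{i:04d}.jpg")

# Encode the flattened frames into a clip, for example:
#   ffmpeg -i baked/frame_%04d.jpg baked_fairy.mp4
# When the clip plays over the trigger, its baked-in background mimics the
# real one, producing the illusion described above.
```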
Of course, this means the experience is no longer augmented reality, but rather an illusion of it. Though I wouldn’t go so far as to call it a hoax. Fauxmented reality, perhaps?
Which won’t be a problem, if we want to believe.
Categories: game-based learning
Tags: cons, game-based learning, games, gamification, GBL, learning, Mission US, motivation, serious games, Sight, workplace
How well do you chop your cucumber?
It's a ridiculous question, I know, but in the short film Sight, the protagonist plays an augmented reality game that awards him points for the consistency of the thickness of his slices.
The scene irked me. The last thing I would want while preparing dinner is a computer judging me. Really, who cares how wide I cut the slices, and who judged that distance to be the perfect width anyway? It’s certainly not my idea of fun. And besides, it all tastes the same.
It's a clear case of gamification gone too far – and of course that was the film's message. The plot goes on to delve into much darker uses of the technology, raising the spectre of a surface utopia concealing a dystopia underneath.
In my previous post Game-based learning on a shoestring, I advocated the use of games to support learning in the workplace. I believe they have much to offer in terms of motivation, engagement and the development of capability.
However, I also recognise another side of games that can in fact impede learning. They may be downright inappropriate for several reasons…
1. Life is not a game.
Points, badges and leaderboards may be critical elements of game mechanics, but they have little bearing on real life. Firefighters don’t save people from burning buildings for 200 digital hats; soldiers can’t heal their shrapnel wounds with a beverage; and utility workers who die of asphyxiation in confined spaces don’t scrape into the Top 10.
So if you want your game to be authentic, dispense with the inauthentic.
2. Games can trivialise serious issues.
While serious games such as Darfur is Dying shine a light on worthy causes, sometimes even the best of intentions can backfire.
Take Mission US for instance. In one of the missions you play a slave girl in 19th-century Kentucky who tries to escape to the north. Prima facie, it sounds like a way of encouraging young folk to appreciate the horrors of slavery. In practice, however, it's gone over like a lead balloon.
3. Games may reinforce the wrong mindset.
The concerns that many people have over Grand Theft Auto are well documented.
What is less documented, however, is the undesirable influence that work-based games can have on your employees. Do you really want them to compete against one another?
4. Games can contaminate motivation.
Forcing a game upon those who don't want to play is a sure-fire way to demotivate them. If you're going to gamify my chopping of cucumbers, I'll chop as few cucumbers as possible, as infrequently as possible.
Even encouraging those who want to play the game might promote their extrinsic motivation over their intrinsic. This raises the question… How will they perform on the job without the prospect of external rewards?
5. Games will be gamed.
Regardless of the purpose of your game, or its sound pedagogical foundation, someone will always seek to game it. That means they’re focused on “winning” rather than on learning.
And what’s the point of that?
To conclude, I reiterate my belief that games have much to offer workplace L&D. But there’s a fine line between an engaging learning experience and an insidious waste of time. So before embarking on your gamely quest, take a moment to consider – and mitigate – the unintended consequences.
May the odds be ever in your favour.
Categories: game-based learning
Tags: blended learning, collaboration, Diner Dash, free, game, game-based learning, games, gamification, GBL, group development, learning, stages, team building, team dynamics, The Learning Assembly, Tuckman, Tuckman's model
Game-based learning doesn’t have to break the bank. That was the key point of my presentation at The Learning Assembly in Melbourne last week.
Sure, you can spend an obscene amount of money on gaming technology if you want to, but you don’t have to.
Take Diner Dash for instance. In this free online game, you play the role of a waitress in a busy restaurant. As the customers arrive you need to seat them, take their order, submit the order to the chef, serve their food, transact their payment, clean their table, and take the dirty dishes back to the kitchen.
Leave any of your customers unattended for too long and they’ll walk out in a huff, costing you a star. When you lose all your stars, your shift is over.
It's all very straightforward… until the customers start pouring in and you find yourself racing to do everything at the same time. Straightforward rapidly becomes complex!
While Diner Dash is just a simple little game, it can afford an engaging learning experience.
For example, suppose you incorporate the game into a team-building workshop. You could split the participants into teams of 3 or 4 members, place each team in front of a computer with Diner Dash pre-loaded, and instruct them to score as many points as possible within a given time period.
Of course the game isn’t meant to be played in this way. Controlling the waitress by committee is awkward and inefficient. The participants will panic; they’ll snap at one another; someone will commandeer the mouse and go it alone; someone else will butt in; and they’ll all start to talk over the top of each other.
But that’s by design. Because when the game is over, you introduce Tuckman’s model of team development and suddenly the penny drops.
What Diner Dash has done is provide the participants with a recent experience of team building. Sure, the premise of the game was fictitious, but the dynamics among the players were real. So when it comes time to reflect upon the theoretical principles of the model, they don’t need to imagine some vague hypothetical scenario because they’ve personally experienced a highly charged scenario that very morning. It’s fresh in their minds.
Other themes that could emerge via a game like Diner Dash include time management, priority management, customer service, problem solving, decision making, strategic thinking, adaptability and learning agility.
Another is collaboration. If you were to put a leaderboard at the front of the room, I could almost guarantee that each team would default to competition mode and battle it out for supremacy. But wasn’t the objective of the activity to score as many points as possible? So why wouldn’t you collaborate with your colleagues around you to do that – especially those who had played the game before! This observation never fails to enlighten.
So, getting back to my original proposition: game-based learning doesn’t have to break the bank. With resources such as Diner Dash available for free, you can do it on a shoestring.