
Indecent proposals

Another year has flown by, and once again I’m pleasantly surprised by the number of articles I managed to post in between the trials and tribulations of life.

In December I like to review each one with a view to identifying a common theme. This time around, I’ve noticed that I – perhaps more directly than usual – presented my ideas in the form of proposals.

As to their decency, I’ll let you be the judge…

[Image: US Liberty $1 coin – a woman with her hand to her mouth in a bashful manner.]

By the way, thank you to everyone who reached out to express your appreciation of my annual compilation of L&D conferences in Australia. You’ve given me the reason I needed to continue doing it, so stay tuned for January.

In the meantime, I wish you and your family a merry Christmas and a bonza start to the new year!

Tree climbers

I respect Malcolm Gladwell as a thinker, but I’m disappointed by the Grand Unified Theory for fixing higher education that he espouses in an episode of the Revisionist History podcast titled The Tortoise and the Hare.

I won’t spoil the surprise for those who haven’t yet heard it, but suffice it to say it’s born of his experience taking the hallowed Law School Admission Test (LSAT).

Malcolm argues the exam favours hares, not tortoises, even though tortoises make better lawyers. His animalian analogy reminds me of the cartoon Our Education System.

[Cartoon: A teacher saying to a range of different animals: “For a fair selection everybody has to take the same exam: please climb that tree.”]

While the cartoon makes a valid point about the diversity of intelligence, we must bear in mind that an assessment of your ability to climb a tree is a perfectly reasonable way to measure your mastery of tree climbing.

While Malcolm goes on to propose a solution to redress the bias he sees in the LSAT, he ends up abandoning it in favour of a catchphrase that treats the symptom rather than the cause.

I wish he had remained focused on the cure: authentic assessment.

If we need tree climbers, let’s test their ability to climb trees.

Supercharge your digital training

We’ve all been there.

The organisation invests an obscene amount of money in a course library, and after much fanfare and an initial spike of excitement, activity steadily dwindles until the platform resembles a ghost town vacated by all but the most enthusiastic of fans.

Similar problems with learner engagement beset other forms of digital training too, whether it’s the famously low completion rates of MOOCs or the constant chasing up of laggards who are yet to complete their compliance modules.

So when David Swaddle called out for tips to help fight what he described as “zombie digital learning”, I was all too willing to share the Top 3 pieces of advice that I’ve formulated on my quest to transform conventional digital training into blended learning experiences.

Here they are…

[Image: Rusty old car in front of a deserted shack.]

1. Make time

Everyone’s too busy to devote enough time to their own learning and development. This has been the case ever since I started my career in this field, and it will probably remain so long after I retire.

So make time.

Add reminders to your participants’ calendars; schedule learning blocks; benchmark progress by declaring where they should be up to by now; and host a complementary social networking group to keep the flame alive.
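If your tooling supports it, the scheduling piece can even be automated. Here’s a minimal sketch in Python – purely illustrative, and assuming your mail or calendar system accepts standard .ics invites – that generates a weekly one-hour learning block as an iCalendar event.

```python
# Illustrative sketch only: generates a weekly-recurring one-hour
# "learning block" as iCalendar (.ics) text, which most calendar
# systems can import. All names here are hypothetical.

from datetime import datetime, timedelta, timezone

def learning_block_ics(start: datetime, summary: str = "Learning block") -> str:
    """Return a weekly-recurring one-hour event in iCalendar format."""
    fmt = "%Y%m%dT%H%M%S"
    return "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//example//learning-blocks//EN",
        "BEGIN:VEVENT",
        f"UID:learning-block-{start.strftime(fmt)}@example.com",
        f"DTSTAMP:{datetime.now(timezone.utc).strftime(fmt)}Z",
        f"DTSTART:{start.strftime(fmt)}",
        f"DTEND:{(start + timedelta(hours=1)).strftime(fmt)}",
        "RRULE:FREQ=WEEKLY",
        f"SUMMARY:{summary}",
        "END:VEVENT",
        "END:VCALENDAR",
    ])

# Example: a learning block every Friday at 2pm.
print(learning_block_ics(datetime(2025, 1, 10, 14, 0)))
```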

2. Provide context

Digital content can be generic by design, because it’s intended to scale far and wide. However, our audience may struggle to join the dots between what they see on screen and what they do on the job.

By supplementing the generic content with customised content, we can explain the implications of the former in the organisational context.

And by facilitating live interactive sessions that explore that context further, we reinforce it.

3. Assess application

Whether the reputation is fair or not, digital training is notorious for being a tick & flick exercise that fails to change behaviour in the real world.

So we need to ensure that the knowledge and skills that are developed via the learning experience are transferred by the employee to their day-to-day role.

By weaving an application activity into the instructional design – and by assessing the evidence of that application – we make it happen.

[Image: Electric sports car recharging.]

These are by no means the only ways to evolve your digital training.

However, I hope that by implementing my three tips, you’ll supercharge it.

I don’t know

Despite its honesty, the humble phrase “I don’t know” is widely feared.

From the fake-it-till-you-make-it mindset of consultants to the face-saving responses of executives, we puny humans are psychologically conditioned to have all the answers – or at least to be seen to.

Of course, demanding all the answers is the premise of summative assessment, especially when it’s in the form of the much-maligned multiple-choice quiz. And our test takers respond in kind – whether it’s via “when in doubt, pick C” or by madly selecting the remaining options in a quasi zig-zag pattern as they run out of time.

But that’s precisely the kind of behaviour we don’t want to see on the job! Imagine your doctor wondering if a symptom pertains to the heart, kidney, liver or gall bladder, and feeling content to prescribe you medication for the third one. Or any random one in the 15th minute.

Of course my comparison is extreme for effect, and it may very well be inauthentic; after all, the learned doctor would almost certainly look it up. But I’d like to reiterate that in a typical organisational setting, having all the information we need at our fingertips is a myth.

Moreover, as Schema Theory maintains, an efficient and effective worker quickly retrieves the knowledge they need on a daily basis from the network they’ve embedded in their long-term memory. We can’t have our contact centre staff putting our customers on hold every 5 seconds while they ask their team leader yet another question, or our plumber shrugging his shoulders at every tap or toilet he claps his eyes on until he reads a manual. Of course, these recourses are totally acceptable… if they’re the exception rather than the rule.

And while a loan or a lavatory is a notch or two less serious than the life-and-death scenarios with which doctors deal, it wouldn’t be much fun if yours were the subject of a blind guess.

So yes, we humans can never know it all. And what we don’t know, we can find out. But the more we do know, the better we perform.

[Image: Two dice showing double sixes.]

Thus we don’t want our colleagues gaming their assessments. Randomly guessing a correct answer falsely indicates knowledge they don’t really have, and hence the gap won’t be remediated.

So I propose we normalise “I don’t know” as an answer option.

Particularly if a recursive feedback approach were to be adopted, a candid admission of ignorance motivated by a growth mindset would be much more meaningful than a lucky roll of the dice.
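To make the idea concrete, here’s a minimal sketch (in Python, with every name hypothetical – this isn’t a spec for any particular LMS) of a quiz engine that treats “I don’t know” as a first-class response: a correct answer confirms knowledge, an honest admission is logged as a gap to remediate, and a wrong answer is flagged as a possible misconception worth a closer look.

```python
# Sketch only: scoring logic that distinguishes an honest "I don't know"
# from a wrong guess. All names are hypothetical.

from dataclasses import dataclass, field

IDK = "I don't know"

@dataclass
class Item:
    question: str
    options: list        # includes IDK as the final option
    correct: str

@dataclass
class Profile:
    known: list = field(default_factory=list)           # answered correctly
    gaps: list = field(default_factory=list)            # candid admissions
    misconceptions: list = field(default_factory=list)  # confident but wrong

def score(items, answers):
    """answers maps each question to the option the candidate selected."""
    profile = Profile()
    for item in items:
        answer = answers.get(item.question, IDK)  # no answer counts as IDK
        if answer == item.correct:
            profile.known.append(item.question)
        elif answer == IDK:
            profile.gaps.append(item.question)            # remediate these
        else:
            profile.misconceptions.append(item.question)  # probe these
    return profile
```

The pay-off is that the gaps bucket is trustworthy: nobody guessed their way into it, so the remediation effort can be targeted with confidence.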

I don’t mean to underestimate the shift in culture that would be necessary to effect such a change, but I contend the benefits would be worth it – both to the organisation and to the individual.

In time, maybe identifying your own knowledge gaps with a view to continuously improving your performance will displace getting it right in the test and wrong on the job.

Approaching perfection

I’ve never understood the rationale of the 80% pass mark.

Which 20% of our work are we prepared to get wrong?

It might explain the universally poor state of customer experience (CX) that companies are evidently willing to wear, but it’s arguably more serious when we consider the acronym-laden topics that are typically rolled out via e-learning, such as occupational health and safety (OHS) and counter-terrorism financing (CTF). Which 20% of safety are we willing to risk? Which 20% of terrorism are we willing to fund?

There has to be a better way.

I’ve previously contended that an assessment-first philosophy renders the concept of a pass mark obsolete, but went on to state that such a radical idea was a story for another day. Well my friends, that day has arrived.

[Diagram: An arrow pointing from Diagnose to Remediate, then back to Diagnose.]

Recursive feedback

Back in 2016, the University of Illinois’ excellent MOOC e-Learning Ecologies: Innovative Approaches to Teaching and Learning for the Digital Age piqued my interest in the affordance of “recursive feedback” – defined by the instructor as rapid and repeatable cycles of feedback or formative assessment, designed to continually diagnose and remediate knowledge gaps.

I propose we adopt a similar approach in the corporate sector. Drop the arbitrary pass mark, while still recording the score and completion status in the LMS. But don’t stop there. Follow it up with cycles of targeted intervention to close the gaps, coupled with re-assessment to refresh the employee’s capability profile.
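In code-like terms – and this is only a sketch of the idea, with every name hypothetical rather than a reference to any real LMS – the approach might look like a loop rather than a one-off event:

```python
# Sketch only: recursive feedback as a diagnose -> record -> remediate loop.
# diagnose, remediate and record are hypothetical callbacks supplied by
# whatever assessment and LMS tooling you actually use.

def recursive_feedback(employee, all_items, diagnose, remediate, record,
                       max_cycles=5):
    """Cycle until no knowledge gaps remain (or we run out of cycles)."""
    gaps = diagnose(employee, all_items)        # initial full assessment
    for cycle in range(max_cycles):
        score = 1 - len(gaps) / len(all_items)  # capability, not pass/fail
        record(employee, score, cycle)          # log score & status as-is
        if not gaps:
            break                               # nothing left to close
        remediate(employee, gaps)               # targeted intervention only
        gaps = diagnose(employee, gaps)         # re-assess the gaps alone
```

Note there’s no pass mark anywhere in the loop: the score is simply recorded each cycle, and the exit condition is the absence of gaps rather than an arbitrary threshold.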

Depending on the domain, our people may never reach a score of 100%. Or if they do, they might not maintain it over time. After all, we’re human.

However, the recursive approach isn’t about achieving perfection. It’s about continuous improvement approaching perfection.

[Diagram: One arrow with a single red dot; another arrow with a wavy green line.]

Way of working

While the MOOC instructor’s notion of recursive feedback aligns to formative assessment, my proposal aligns it to summative assessment. And that’s OK. His primary focus is on learning. Mine is on performance. We occupy two sides of the same coin.

To push the contrarianism even further, I’m also comfortable with the large-scale distribution of an e-learning module. However, where such an approach has notoriously been treated as a tick & flick, I consider it a phase in a longer-term strategy.

After the remediation effort, I see no sense in retaking the entire e-learning module. Rather, a micro-assessment approach promotes operational efficiency – not to mention employee sanity – without sacrificing pedagogical effectiveness.
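As a rough illustration – the 90-day refresh interval and every name below are my own assumptions, not a prescription – a micro-assessment could simply select the items that are still gaps, plus the ones that haven’t been confirmed recently:

```python
# Sketch only: choose items for a micro-assessment instead of re-running
# the whole module. The refresh interval is an arbitrary assumption.

from datetime import datetime, timedelta

REFRESH_AFTER = timedelta(days=90)  # hypothetical "capability goes stale" window

def micro_assessment(profile, now):
    """profile maps item id -> datetime of last correct answer (None = gap)."""
    return [
        item for item, last_correct in profile.items()
        if last_correct is None or now - last_correct > REFRESH_AFTER
    ]

# Example: two items are due – the admitted gap and the stale one.
profile = {
    "fire wardens": datetime(2025, 11, 1),       # recently confirmed: skip
    "evacuation routes": None,                   # admitted gap: re-assess
    "extinguisher types": datetime(2025, 1, 5),  # stale: refresh
}
print(micro_assessment(profile, datetime(2025, 12, 1)))
```

A handful of targeted questions each quarter keeps the capability profile fresh without dragging anyone back through the big-bang module.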

In this way, recursive feedback becomes a way of working.

And the L&D department’s “big bang” initiatives can be saved for the needs that demand them.