
Higher Assessment

I find it strange when a blogger doesn’t approve my comment.

I consider comments the lifeblood of my own blog, and whether they be positive or negative, classy or rude, they all add to the diversity of the conversation. If your fragile ego can’t handle that, don’t blog.

I recently submitted a constructive comment to a particular blog post, twice, and it was never published. A later comment by someone else was.

Right, rather than waste my thought bubble, I’ve decided to reproduce the thrust of it here…


The OP was about the future of Higher Education being modular and flexible, which I agreed with. However, something that caught my eye was the author’s observation that assessing prior learning via an essay or exam defeats the point of documentary evidence of previous course content or work experience.

Yet I feel that assessment via an essay or exam or some other means is the point. We needn’t rely so much on the bureaucracy if we could simply demonstrate what we know – regardless of how we came to know it.

When accrediting prior learning, a university needn’t get bogged down with evaluating myriad external permutations that may be worthy of credit, because what matters is the outcome of those permutations.

Similarly from the student’s point of view, it wouldn’t matter if they’ve done a mooc but not paid for the certificate, or if they did a course many years ago and worked in the field thereafter. What matters is the knowledge they can demonstrate now.

As a bastion of education, the university is losing ground to external competitors. Yet it maintains a certain gravitas that I suggest can be channelled into more of an assessment-driven role for society, whereby it validates knowledge at a certain standard and awards its qualifications accordingly.

Its role in teaching and learning is retained, of course, to fill in the gaps, powered by research to keep it at the forefront of the science.

Scaling up

In Roses are red, I proposed definitions for oft-used yet ambiguous terms such as “competency” and “capability”.

Not only did I suggest that a competency be considered a task, but also that its measurement be binary: competent or not yet competent.

As a more general construct, a capability is not so readily measured in a binary fashion. For instance, the question is unlikely to be whether you can analyse data, but the degree to which you can do so. Hence capabilities are preferably measured via a proficiency scale.


Of course, numerous proficiency scales exist.

No doubt each of these scales aligns to the purpose for which it was defined. So I wonder if a scale for the purpose of organisational development might align to the Kirkpatrick Model of Evaluation:

Level  Label             Evidence
0      Not Yet Assessed  None
1      Self Rater        Self rated
2      Knower            Passes an assessment
3      Doer              Observed by others
4      Performer         Meets relevant KPIs
5      Collaborator      Teaches others

Table 1. Tracey Proficiency Scale (CC BY-NC-SA)

I contend that such a scale simplifies the measurement of proficiency for L&D professionals, and is presented in a language that is clear and self-evident for our target audience.

Hence it is (ahem) scalable across the organisation.
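To make the distinction concrete, here’s a minimal Python sketch (the class and field names are my own illustration, not part of the scale itself): a competency is recorded as a binary flag, while a capability carries a level on the 0–5 scale from Table 1.

```python
from dataclasses import dataclass
from enum import IntEnum

class Proficiency(IntEnum):
    """Levels of the Tracey Proficiency Scale (Table 1)."""
    NOT_YET_ASSESSED = 0  # evidence: none
    SELF_RATER = 1        # evidence: self rated
    KNOWER = 2            # evidence: passes an assessment
    DOER = 3              # evidence: observed by others
    PERFORMER = 4         # evidence: meets relevant KPIs
    COLLABORATOR = 5      # evidence: teaches others

@dataclass
class Competency:
    """A task, measured as binary: competent or not yet competent."""
    task: str
    competent: bool = False

@dataclass
class Capability:
    """A more general construct, measured by degree on the proficiency scale."""
    name: str
    level: Proficiency = Proficiency.NOT_YET_ASSESSED

analysing_data = Capability("Analyse data", Proficiency.KNOWER)
print(analysing_data.level.name, int(analysing_data.level))  # KNOWER 2
```

The binary question (“can you do this task?”) and the graded question (“to what degree can you analyse data?”) then map naturally onto two different field types.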

More than just a pretty face

I’ve blogged in favour of digital badges in the past, not because they’re colourful motivators – which arguably they are, at least for some people – but because they represent an achievement.

While the robustness of the criteria for earning a badge may be challenged, as may be the assessment of meeting said criteria, the concept holds true: a badge must be earned by demonstrating that you have done something.

What that something is remains a variable to be defined. Some badges, such as those popular among IT geeks, are earned by completing a training program or by passing an exam. I call these “certification badges”.

However I maintain a stronger implementation of the idea emerges when we earn the badge by successfully executing a task (or a suite of tasks). I call these “practitioner badges”.


For example, you might complete a 40-hour course and pass a massive multiple-choice quiz to earn an XYZ-issued “Project Management” badge. That’s quite an achievement.

But I’d be more impressed (and more confident as an employer) if you were to demonstrate how you’ve applied the XYZ-endorsed principles to a real project in the real world, thereby earning a “Project Manager” badge. To me, that’s a greater achievement because it shifts the focus of the exercise from the activity (learning) to its outcome (performance).

In an organisational context, I see opportunities to blend the tasks to enrich the experience. For example, one task may be to apply a principle to your current project, while the next task is to share your reflections on doing so on the enterprise social network, thereby facilitating not only metacognition and expert feedback, but also peer-to-peer knowledge sharing.

Celebrating the latest cohort of people who’ve earned badges in the same forum may also generate a bit of FOMO.

In any case, my point is a badge should be more than just a pretty face. I propose we distinguish between two types of badge – namely a certification badge and a practitioner badge – with the latter representing an achievement above and beyond the former.

Transformers

It seems like everyone’s spruiking the “new normal” of work.

The COVID-19 pandemic is keeping millions of previously office-bound employees at home, forcing L&D professionals to turn on a dime.

Under pressure to maintain business continuity, our profession has been widely congratulated for its herculean effort in adapting to change.

I’m not so generous.

Our typical response to the changing circumstances appears to have been to lift and shift our classroom sessions over to webinars.

In The next normal, which I published relatively early during lockdown, several of my peers and I recognised the knee-jerk nature of this response.

And that’s not really something that ought to be congratulated.

Who led the digital transformation of your company? The CEO (incorrect), The CTO (incorrect), COVID-19 (correct)

For starters, the virus exposed a shocking lack of risk management on our part. Digital technology is hardly novel, and our neglect in embracing it left us unprepared for when we suddenly needed it.

Look no further than the Higher Education sector for a prime example. They’re suffering a free-fall in income from international students, despite the consensus that people can access the Internet from other countries.

Beyond our misgivings with technology, the virus has also shone a light on our pedagogy. The broadcast approach that we deliver virtually today is largely a continuation of our practice pre-pandemic. It wasn’t quite right then, and it isn’t quite right now. In fact, isolation, digital distractions and Zoom fatigue probably make it worse.

I feel this is important to point out because the genie is out of the bottle. Employee surveys reveal that the majority of us either don’t want to return to the office or want to split our working week between home and the office. That means that while in-person classes can resume, remote learning will remain the staple.

So now is our moment of opportunity. In the midst of the crisis, we have the moral authority to mature our service offering. To innovate our way out of the underwhelming “new normal” and usher in the modern “next normal”.

In some cases that will mean pivoting away from training in favour of more progressive methodologies. While I advocate these, I also maintain that direct instruction is warranted under some circumstances. So instead of joining the rallying cry against training per se, I propose transforming it so that it becomes more efficient, engaging and effective in our brave new world.


Good things come in small packages

To begin, I suggest we go micro.

So-called “bite sized” pieces of content have the dual benefit of being not only easier to process from a cognitive load perspective, but also more responsive to the busy working week.

For example, if we were charged with upskilling our colleagues across the business in Design Thinking, we might kick off by sharing Chris Nodder’s 1.5-minute video clip in which he breaks the news that “you are not your users”.

This short but sweet piece of content piques the curiosity of the learner, while introducing the concept of Empathize in the d.school’s 5-stage model.

We’re all in this together

Next, I suggest we go social.

Posting the video clip to the enterprise social network seeds a discussion, by which anyone and everyone can share their experiences and insights, and thus learn from one another.

It’s important to note that facilitating the discussion demands a new skillset from the trainer, as they shift their role from “sage on the stage” to “guide on the side”.

It’s also important to note that the learning process shifts from synchronous to asynchronous – or perhaps more accurately, semi-synchronous – empowering the learner to consume the content at a time that is most convenient for them (rather than for the L&D department).

There is no try

Next, I suggest we go practical.

If the raison d’être of learning & development is to improve performance, then our newly acquired knowledge needs to be converted into action.

Follow-up posts on the social network shift from the “what” to the “how”, while a synchronous session in the virtual classroom enables the learner to practise the latter in a safe environment.

Returning to our Design Thinking example, we might post content such as sample questions to ask prospective users, active listening techniques, or an observation checklist. The point of the synchronous session then is to use these resources – to stumble and bumble, receive feedback, tweak and repeat; to push through the uncomfortable process we call “learning” towards mastery.

It’s important to recognise the class has been flipped. While time off the floor will indeed be required to attend it, it has become a shorter yet value-added activity focusing on the application of the knowledge rather than its transmission.

Again, it’s also important to note that facilitating the flipped class demands a new skillset from the trainer.

A journey of a thousand miles

Next, I suggest we go experiential.

Learning is redundant if it fails to transfer into the real world, so my suggestion is to set tasks or challenges for the learner to do back on the job.

Returning to our Design Thinking example, we might charge the learner with empathising with a certain number of end users in their current project, and report back their reflections via the social network.

In this way our return on investment begins immediately, prior to moving on to the next stage in the model.

Pics or it didn’t happen

Finally, I suggest we go evidential.

I have long argued in favour of informalising learning and formalising its assessment. Bums on seats misses the point of training, which, let’s remind ourselves again, is to improve performance.

How you learned something is far less interesting to me than whether you learned it – and the way to measure that is via assessment.

Returning to our Design Thinking example, we need a way to demonstrate the learner’s mastery of the methodology in a real-world context, and I maintain the past tense of open badges (they attest to what you have done) fits the bill.

In addition to the other benefits that badges offer corporates, the crux of the matter is that a badge must be earned.

Informalise learning. Formalise its assessment.

I am cognisant of the fact that my proposal may be considered heretical in certain quarters.

The consumption of content on the social network, for example, may be difficult to track and report. But my reply is “so what?” – we don’t really need to record activity, so why hide it behind the walls of an LMS?

If the openness of the training means that our colleagues outside of the cohort learn something too, great! Besides, they’ll have their own stories to tell and insights to share, thereby enriching the learning experience for everyone.

Instead it is the outcome we need to focus on, and that’s formalised by the assessment. Measure what matters, and record that in the LMS.

In other words, the disruptive force of the COVID-19 pandemic is an impetus for us to reflect on our habits. The way it has always been done is no substitute for the way it can be done better.

Our moment has arrived to transform our way out of mode lock.

Micro-learning’s unsung sibling

Micro-learning is so hot right now.

But I’m not going to deliberate over its definition. If you’re into that, check out Shannon Tipton’s Microlearning: The Misunderstood Buzzword and 7 Deadly Myths of Microlearning.

Nor am I going to try to convince you to jump on board, or abandon ship.

Instead, I’m going to consider how micro-learning might be used in a corporate training context, and then pivot towards something slightly different.

And if you were to find any value in these musings, I’d be delighted.

How micro-learning might be used

The nature of micro-learning lends itself to the campaign model.

Independent but related packets of content that are distributed over time can be woven into the working day of the target audience, and hence reduce time “off the floor”. In this context, the micro-learning is the training.

Similarly I see an opportunity for micro-learning to be deployed before the training. The content can prime the target audience for the experience to follow, perhaps in the form of a flipped class.

And of course I also see an opportunity for micro-learning to be deployed after the training: what one may call “reinforcement” to improve retention and increase the probability of knowledge transfer.

Sure, but does it work?

Well, cognitive science suggests it does. I recommend reading up on the forgetting curve, subsumption theory, Piaget, cognitive load, the spacing effect and interleaving. It’s worth it.
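As a rough illustration of why spaced packets beat a one-off session, the forgetting curve is often modelled as R = e^(−t/s), where retention R decays over t days and s is memory strength; the strengths and the per-review boost below are invented numbers purely for illustration.

```python
import math

def retention(t_days: float, strength: float) -> float:
    """Forgetting-curve retention, R = e^(-t/s)."""
    return math.exp(-t_days / strength)

# A single one-off session: memory strength stays low.
print(round(retention(7, strength=2.0), 2))  # 0.03 after a week

# A spaced campaign: suppose (illustratively) each review doubles strength.
strength = 2.0
for review_day in (1, 3, 7):
    strength *= 2
print(round(retention(7, strength=strength), 2))  # 0.65 after a week
```

The exact numbers don’t matter; the shape does – each spaced touchpoint flattens the decay, which is the spacing effect the campaign model exploits.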


The pivot

While I’m obviously an advocate of micro-learning, a less buzzy but perhaps just-as-important variant is micro-assessment.

This is similar to micro-learning except the content is in question format – preferably scenario based and feedback rich.

In one sense, the two approaches may be conflated. Formative assessment is nothing new, and a few daily questions over a set timespan could constitute training, or prompt critical thinking pre-training, or promote application post-training.

If you want more bedtime reading, I suggest looking up the testing effect or its synonyms, retrieval practice and active recall.

However I feel the untapped potential of micro-assessment lies in its summative power. As the bank of results builds up over time, the data can be used to diagnose the population’s understanding of the subject matter. If the questions are aligned to competencies, the knowledge gaps can be identified and closed with further interventions.

Hence, micro-assessment can be leveraged to execute an assessment-first strategy, thereby increasing the relevance of the L&D service offering to the business.
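A minimal sketch of that diagnostic step (the function, the sample data and the 70% cut-off are all my own invented illustration): pool the answers by competency and flag any competency whose pass rate falls below the threshold.

```python
from collections import defaultdict

def knowledge_gaps(results, threshold=0.7):
    """results: (competency, answered_correctly) pairs from micro-assessment
    questions. Returns the competencies whose pass rate is below threshold."""
    tally = defaultdict(lambda: [0, 0])  # competency -> [correct, attempts]
    for competency, correct in results:
        tally[competency][0] += int(correct)
        tally[competency][1] += 1
    return sorted(c for c, (right, n) in tally.items() if right / n < threshold)

sample = [
    ("Empathize", True), ("Empathize", True),
    ("Empathize", True), ("Empathize", False),  # 75%: above the cut-off
    ("Define", True), ("Define", False), ("Define", False),  # 33%: a gap
]
print(knowledge_gaps(sample))  # ['Define']
```

The flagged competencies then become the candidates for the further interventions mentioned above.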

And if you want yet more bedtime reading, I suggest exploring metacognition and its effect on motivation.

On that note, good night!