Posts tagged ‘assessment’

Transformers

1 September 2020

It seems like everyone’s spruiking the “new normal” of work.

The COVID-19 pandemic is keeping millions of previously office-bound employees at home, forcing L&D professionals to turn on a dime.

Under pressure to maintain business continuity, our profession has been widely congratulated for its herculean effort in adapting to change.

I’m not so generous.

Our typical response to the changing circumstances appears to have been to lift and shift our classroom sessions over to webinars.

In The next normal, which I published relatively early during lockdown, several of my peers and I recognised the knee-jerk nature of this response.

And that’s not really something that ought to be congratulated.

Who led the digital transformation of your company? The CEO (incorrect), The CTO (incorrect), COVID-19 (correct)

For starters, the virus exposed a shocking lack of risk management on our part. Digital technology is hardly novel, and our neglect of it left us unprepared when we suddenly needed it.

Look no further than the Higher Education sector for a prime example. It’s suffering a free-fall in income from international students, despite the fact that people can access the Internet from other countries.

Beyond our misgivings about technology, the virus has also shone a light on our pedagogy. The broadcast approach that we deliver virtually today is largely a continuation of our practice pre-pandemic. It wasn’t quite right then, and it isn’t quite right now. In fact, isolation, digital distractions and Zoom fatigue probably make it worse.

I feel this is important to point out because the genie is out of the bottle. Employee surveys reveal that the majority of us either don’t want to return to the office, or want to split our working week between home and the office. That means that while in-person classes can resume, remote learning will remain the staple.

So now is our moment of opportunity. In the midst of the crisis, we have the moral authority to mature our service offering. To innovate our way out of the underwhelming “new normal” and usher in the modern “next normal”.

In some cases that will mean pivoting away from training in favour of more progressive methodologies. While I advocate these, I also maintain that direct instruction is warranted under some circumstances. So instead of joining the rallying cry against training per se, I propose transforming it so that it becomes more efficient, engaging and effective in our brave new world.

Transformer-style toy robot

Good things come in small packages

To begin, I suggest we go micro.

So-called “bite sized” pieces of content have the dual benefit of not only being easier to process from a cognitive load perspective, but also more responsive to the busy working week.

For example, if we were charged with upskilling our colleagues across the business in Design Thinking, we might kick off by sharing Chris Nodder’s 1.5-minute video clip in which he breaks the news that “you are not your users”.

This short but sweet piece of content piques the curiosity of the learner, while introducing the concept of Empathize in the d.school’s 5-stage model.

We’re all in this together

Next, I suggest we go social.

Posting the video clip to the enterprise social network seeds a discussion, by which anyone and everyone can share their experiences and insights, and thus learn from one another.

It’s important to note that facilitating the discussion demands a new skillset from the trainer, as they shift their role from “sage on the stage” to “guide on the side”.

It’s also important to note that the learning process shifts from synchronous to asynchronous – or perhaps more accurately, semi-synchronous – empowering the learner to consume the content at a time that is most convenient for them (rather than for the L&D department).

There is no try

Next, I suggest we go practical.

If the raison d’être of learning & development is to improve performance, then our newly acquired knowledge needs to be converted into action.

Follow-up posts on the social network shift from the “what” to the “how”, while a synchronous session in the virtual classroom enables the learner to practise the latter in a safe environment.

Returning to our Design Thinking example, we might post content such as sample questions to ask prospective users, active listening techniques, or an observation checklist. The point of the synchronous session then is to use these resources – to stumble and bumble, receive feedback, tweak and repeat; to push through the uncomfortable process we call “learning” towards mastery.

It’s important to recognise the class has been flipped. While time off the floor will indeed be required to attend it, it has become a shorter yet value-added activity focusing on the application of the knowledge rather than its transmission.

Again, it’s also important to note that facilitating the flipped class demands a new skillset from the trainer.

A journey of a thousand miles

Next, I suggest we go experiential.

Learning is wasted if it fails to transfer into the real world, so my suggestion is to set tasks or challenges for the learner to complete back on the job.

Returning to our Design Thinking example, we might charge the learner with empathising with a certain number of end users in their current project, and reporting back their reflections via the social network.

In this way our return on investment begins immediately, prior to moving on to the next stage in the model.

Pics or it didn’t happen

Finally, I suggest we go evidential.

I have long argued in favour of informalising learning and formalising its assessment. Bums on seats misses the point of training which, let’s remind ourselves again, is to improve performance.

How you learned something is way less interesting to me than whether you learned it – and the way to measure that is via assessment.

Returning to our Design Thinking example, we need a way to demonstrate the learner’s mastery of the methodology in a real-world context, and I maintain the past tense of open badges fits the bill.

In addition to the other benefits that badges offer corporates, the crux of the matter is that a badge must be earned.

Informalise learning. Formalise its assessment.

I am cognisant of the fact that my proposal may be considered heretical in certain quarters.

The consumption of content on the social network, for example, may be difficult to track and report. But my reply is: so what? We don’t really need to record activity, so why hide it behind the walls of an LMS?

If the openness of the training means that our colleagues outside of the cohort learn something too, great! Besides, they’ll have their own stories to tell and insights to share, thereby enriching the learning experience for everyone.

Instead it is the outcome we need to focus on, and that’s formalised by the assessment. Measure what matters, and record that in the LMS.

In other words, the disruptive force of the COVID-19 pandemic is an impetus for us to reflect on our habits. The way it has always been done is no substitute for the way it can be done better.

Our moment has arrived to transform our way out of mode lock.

Micro-learning’s unsung sibling

9 April 2019

Micro-learning is so hot right now.

But I’m not going to deliberate over its definition. If you’re into that, check out Shannon Tipton’s Microlearning: The Misunderstood Buzzword and 7 Deadly Myths of Microlearning.

Nor am I going to try to convince you to jump on board, or abandon ship.

Instead, I’m going to consider firstly how micro-learning might be used in a corporate training context; and secondly, pivot towards something slightly different.

And if you were to find any value in these musings, I’d be delighted.

How micro-learning might be used

The nature of micro-learning lends itself to the campaign model.

Independent but related packets of content that are distributed over time can be woven into the working day of the target audience, and hence reduce time “off the floor”. In this context, the micro-learning is the training.

Similarly I see an opportunity for micro-learning to be deployed before the training. The content can prime the target audience for the experience to follow, perhaps in the form of a flipped class.

And of course I also see an opportunity for micro-learning to be deployed after the training: what one may call “reinforcement” to improve retention and increase the probability of knowledge transfer.

Sure, but does it work?

Well, cognitive science suggests it does. I recommend reading up on the forgetting curve, subsumption theory, Piaget, cognitive load, the spacing effect and interleaving. It’s worth it.

A hand holding a pen pointing to a chart.

The pivot

While I’m obviously an advocate of micro-learning, a less buzzy but perhaps just-as-important variant is micro-assessment.

This is similar to micro-learning except the content is in question format – preferably scenario based and feedback rich.

In one sense, the two approaches may be conflated. Formative assessment is nothing new, and a few daily questions over a set timespan could constitute training, or prompt critical thinking pre-training, or promote application post-training.

If you want more bedtime reading, I suggest looking up the testing effect or its synonyms, retrieval practice and active recall.

However, I feel the untapped potential of micro-assessment lies in its summative power. As the bank of results builds up over time, the data can be used to diagnose the population’s understanding of the subject matter. If the questions are aligned to competencies, the knowledge gaps can be identified and closed with further interventions.
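To make that concrete, here is a minimal sketch (in Python) of how a growing bank of micro-assessment results might be rolled up by competency to reveal where the population is weak. The field names and the 70% threshold are hypothetical, so treat it as an illustration rather than a prescription.

```python
# Illustrative sketch only: assumes each micro-assessment response is logged with
# the employee, the competency the question maps to, and whether it was answered
# correctly. Field names and the threshold are hypothetical.
from collections import defaultdict

responses = [
    {"employee": "alice", "competency": "empathise", "correct": True},
    {"employee": "bob",   "competency": "empathise", "correct": False},
    {"employee": "alice", "competency": "define",    "correct": False},
    {"employee": "bob",   "competency": "define",    "correct": False},
]

def competency_gaps(responses, threshold=0.7):
    """Return the competencies whose population-wide success rate falls below the threshold."""
    totals = defaultdict(lambda: [0, 0])  # competency -> [correct, attempted]
    for r in responses:
        totals[r["competency"]][0] += int(r["correct"])
        totals[r["competency"]][1] += 1
    return {comp: correct / attempted
            for comp, (correct, attempted) in totals.items()
            if correct / attempted < threshold}

print(competency_gaps(responses))  # e.g. {'empathise': 0.5, 'define': 0.0}
```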

Hence, micro-assessment can be leveraged to execute an assessment first strategy, thereby increasing the relevance of the L&D service offering to the business.

And if you want yet more bedtime reading, I suggest exploring metacognition and its effect on motivation.

On that note, good night!

The L&D maturity curve

4 March 2019

Over the course of my career, I’ve witnessed a slow but steady shift away from formal learning to informal learning.

Of course, remnants of the “formal first” philosophy still exist, whereby a training solution – typically in the form of a course – is thrown at every conceivable problem. Over time, the traditional classroom-based delivery of such courses has increasingly given way to online modules, but that’s merely a change in format – not strategy.

While courses certainly have their place in the L&D portfolio, the forgetting curve places a question mark over their long-term effectiveness on their own.

The informal first philosophy swings the pendulum back by empowering the employee to self-direct their learning in accordance with their personal needs.

While in some cases informal learning obviates the need for training, in other cases it will complement it. For example, I see the informalisation of learning as an opportunity to deliver the content (for example, via a wiki) which can be consumed at the discretion of the employee. The focus of the course then pivots to the application of the content, which is the point of learning it in the first place. Similarly, the assessment evaluates the learning in the context of real-world scenarios, which is what the learner will encounter post-course.

And since the content remains accessible, it can be used for ongoing reference long after the course has been completed.

A hand holding a pen pointing to a chart.

While I consider the informal first philosophy a giant leap in L&D maturity, it essentially pertains to instructional design. For a more holistic view of L&D, I propose an “assessment first” philosophy by which the capability of the target audience is analysed prior to any design work being undertaken.

The rationale for this philosophy is best appreciated in the context of an existing employee base (rather than greenhorn new starters). Such a group comprises adults who have a wide range of knowledge, skills and experiences. Not to mention they’ve probably been doing the job for a number of years.

Sheep dipping everyone in this group with the same training doesn’t make much sense. For a minority it might be a worthwhile learning experience, but for the majority it is likely to be redundant. This renders the training an ineffective waste of time, and an unnecessary burden on the L&D team.

By firstly assessing the target audience’s proficiency in the competencies that matter, a knowledge gap analysis can identify those in which the population is weak, and targeted training can be delivered in response. Individuals who are “not yet competent” in particular areas can be assigned personalised interventions.
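As a rough illustration of how those personalised interventions might be assigned – the names, scores, cut-off and intervention mappings below are all invented – each individual’s assessment results can be filtered against the competencies that matter:

```python
# Illustrative sketch only: per-person capability assessment scores and the
# intervention catalogue are invented; the 0.7 cut-off for "not yet competent"
# is likewise an assumption.
scores = {
    "alice": {"collaboration": 0.9, "digital_literacy": 0.4},
    "bob":   {"collaboration": 0.5, "digital_literacy": 0.8},
}

interventions = {
    "collaboration": "community of practice + coaching",
    "digital_literacy": "targeted workshop + on-the-job challenge",
}

CUT_OFF = 0.7  # below this score, the individual is deemed "not yet competent"

def personalised_plan(scores):
    """Assign an intervention for each competency in which the individual is not yet competent."""
    return {
        person: {comp: interventions[comp] for comp, score in comps.items() if score < CUT_OFF}
        for person, comps in scores.items()
    }

print(personalised_plan(scores))
# {'alice': {'digital_literacy': '...'}, 'bob': {'collaboration': '...'}}
```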

This approach avoids the solution first trap. By focusing the L&D team’s attention on the real needs of the business, not only does the volume of demand fall, but the work also becomes more relevant.

The assessment first philosophy may appear incongruent where new starters are concerned, who by definition are assumed to be weak in all competencies – after all, they’ve only just walked through the door! – but I counter that assumption on two fronts.

Firstly, not all new starters are doe-eyed college grads. Many have had previous jobs in the industry or in other industries, and so they arrive armed with transferable knowledge, skills and experiences.

And regardless, the informal first philosophy holds true. That is to say, the new starter can consume the content (or not) as they see fit, demonstrate their understanding in the scenario-oriented “course”, and formalise it via the assessment.

The results of the assessment dictate any further intervention that is necessary.

Of course, some topics such as the company’s own products or processes will necessitate significant front-end loading via content development and maybe even curricula, but these may be considered the exception rather than the rule. By looking through the lens of assessment first, the L&D team works backwards to focus that kind of energy on where it is warranted.

It is also worth noting the assessment first philosophy renders the traditional “pass mark” obsolete, but such a radical idea is a story for another day!

Laptop showing business metrics.

While the assessment first philosophy represents an exponential leap in the maturity of L&D, there is yet another leap to make: “performance first”.

The raison d’être of the L&D team is to improve performance, so it’s always been a mystery to me why our work is so often disconnected from the business results. I do appreciate the barriers in our way – such as the inexplicable difficulty of obtaining the stats – but still, we can and should be doing more.

Under the performance first paradigm, it is not knowledge gaps that are analysed, but rather performance gaps. A root cause analysis identifies whether the cause is a capability deficiency or not – in the case of the former, a capability analysis feeds into the assessment first approach; in the case of the latter, a solution other than training is pursued instead.

As with assessment first, performance first may appear incongruent where new starters are concerned. After all, their stats thus far are zero, and waiting to recognise poor performance may have unacceptable consequences.

So again we have another exception to the rule, whereby some folks may be scaffolded through an L&D intervention prior to their performance being analysed. However, the point is we needn’t force everyone down that road. It depends on the circumstances.

And again, by looking through the lens of performance first, the L&D team works backwards to focus its energy on where it is needed. But this time with results at the forefront of the team’s purpose, its relevance to the business goes through the roof.

The L&D Maturity Curve, featuring Formal First rising to Informal First rising to Assessment First rising to Performance First. The x-axis represents maturity of the L&D function and the y-axis represents its relevance to the business.

I realise my take on L&D maturity might freak some of my peers out. Concurrently, others will argue that we should leapfrog to performance first now and get on with it.

Personally I consider the maturity curve a journey. Yes, it is theoretically possible to skip stages, but I feel that would be a shock to the system. From a change management perspective, I believe an organisation at one stage of the curve would achieve more success by growing into the next stage of the curve, while ironing out the bugs and creating the new normal along the way.

Besides, it isn’t a race. Important journeys take time. What matters is the direction in which that journey is heading.

Louder than words

13 November 2017

My last couple of blog posts have argued in favour of extracting value out of organisational capabilities. Due to the nature of my role I have posited these arguments in terms of employee development.

However, I further advocate the use of organisational capabilities across all parts of the employee lifecycle.

Using the 4+4 Part Employee Lifecycle as my guide, I will share my thoughts on some of the ways in which your capability framework can add value to your organisation’s activities in terms of recruitment, onboarding, performance, and offboarding.

The 4+4 Part Employee Lifecycle: (1) Recruitment; (2) Onboarding; (3) Performance; and (4) Offboarding; plus (1) Performance Management; (2) Development; (3) Health & Wellbeing; and (4) Retention.

Recruitment

Everyone knows that change management is hard. Culture eats strategy for breakfast; an organisation’s culture doesn’t change overnight; something about herding cats; the change curve; etc. etc.

We’ve heard it all before, and yes it’s probably true.

But there’s a big elephant in the room: the power of recruitment to accelerate cultural change. That is to say, bring in from the outside the people whose capabilities you desperately need on the inside.

Which begs the question… what capabilities? Well, eagle-eyed organisations know precisely which capabilities to assess each candidate against, because they are the ones that align to their strategic imperatives.

If your organisation needs to become more collaborative, recruit collaborative people. If it needs to become more innovative, recruit innovative people. And if it needs to become more digitally literate, recruit digitally literate people.

This approach may seem too obvious to mention, yet I dare you to examine your organisation’s current recruitment practices.

Onboarding

Onboarding is one of those pies that everyone wants to stick their fingers into, but nobody wants to own. Yet it is crucial for setting up the new recruit for success.

From an organisational capability perspective, a golden opportunity arises during this phase of the employee lifecycle to draw their attention to the capability framework and the riches therein. The new recruit is motivated, keen to prove themselves, and hungry to learn.

Highlight the resources that are available to them to develop their capabilities now. This is important because the first few weeks of their experience in the organisation colour their remaining tenure.

Ensure they start their journey the way you’d like them to continue it: productively.

Performance

Capability powers performance, so the capability framework is a tool you can use to improve all four subparts of Performance in the 4+4 Part Employee Lifecycle.

Performance Management

Effective performance management complements development planning to provide the employee with guidance on improving said performance.

When seen through the lens of the capability framework, an employee’s performance appraisal can identify meaningful development opportunities. Performance weak spots may be (at least partly) attributable to gaps in specific capabilities; while a strengths-based approach might also be adopted, whereby an already strong capability is enhanced to drive higher performance.

To inform these decisions with data, I’d be keen to correlate capability assessments against individual performances and observe the relationship between the variables over time.
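As a simple sketch of what that might look like – the capability scores and the performance metric below are entirely hypothetical – even a quick Pearson coefficient gives a first indication of whether the two variables move together:

```python
# Illustrative sketch only: hypothetical capability assessment scores and a
# hypothetical performance metric for the same five employees. A single Pearson
# coefficient is a crude first look; a real analysis would track the relationship
# (and its lag) over time.
import numpy as np

capability_score = np.array([0.55, 0.62, 0.71, 0.80, 0.90])  # e.g. "collaboration" assessment
performance      = np.array([61,   64,   70,   78,   88])    # e.g. quarterly KPI

r = np.corrcoef(capability_score, performance)[0, 1]
print(f"Correlation between capability and performance: {r:.2f}")
```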

Development

It’s all very well to have a poetic capability framework, but if learning opportunities aren’t mapped to it, then its value is inherently limited.

If the framework’s capabilities align to leadership stages, I suggest the following question be put to the user: Do you want to excel in your current role or prepare for your next role?

Not only does this question focus the user’s development goal, it also identifies the relevant leadership stage so the capabilities can be presented in the right context.

A follow-up question may then be posed: Would you like to browse all the capabilities (useful for those who want to explore, or who already know which capability to develop), focus on our strategic imperatives (useful for those who are time poor), or assess your capabilities (useful for those who seek a personal diagnosis)?

The answers to these questions lead to a selection of capabilities which, beyond the provision of clear descriptions, outline the opportunities for development.

Resist the urge to dump masses of resources into their respective buckets. Instead, curate them.

One approach I suggest is KASAB, an esoteric extension of the KSA heuristic in teaching circles. I like it because it includes “B” for “Behaviour”.

For example, help your colleagues move beyond the consumption of teamwork videos, design thinking workshops, and MOOCs on digital business by encouraging them to contribute to communities of practice, submit ideas to the enterprise idea management system, and participate in the company’s social media campaign.

Health & Wellbeing

I see organisational capabilities applying to health & wellbeing in two ways.

The first way concerns the impact of employee development on mental health. Given the satisfaction and pride of building mastery drives engagement, the capability framework presents opportunities to improve mental health across the enterprise.

The second way concerns the composition of the capability framework. Given a healthy employee is a productive employee, why isn’t Wellness itself an organisational capability?

Retention

I’ve seen with my own eyes the impact of employee development (or lack thereof) on retention.

Given the sense of support and growth that the investment in people’s learning brings, the capability framework presents opportunities to retain talent across the enterprise.

Offboarding

Capabilities that align to leadership stages are useful for succession planning. Not only do they identify the capabilities that someone needs to succeed in their current role, but also the capabilities they need to succeed in their next role. Assessment of the latter informs the readiness of the employee for promotion.

Conversely, when the employee leaves the team (or exits the organisation) the capability framework can be used to assess the skills gap that remains.

Girl with home-made wings

In 7 tips for custodians of capability frameworks I declared a capability framework that remains unused is merely a bunch of words. But it’s worse than that. It is unrealised value across the employee lifecycle.

So use your capability framework to improve the organisation’s recruitment, onboarding, performance, and offboarding.

Actions speak louder than words.

Top 5 benefits of open badges for corporates

17 July 2013

I’ve been blogging a lot about open badges lately. That really means I’ve been thinking a lot about open badges lately, as I use my blog as a sense-making platform.

Through my blogging, combined with the insightful discussions following both Badges of honour and The past tense of open badges, I have been able to consolidate my thoughts somewhat.

This consolidation I now share with you in the form of my Top 5 benefits of open badges for corporates.

Carrot badge

1. Open badges can motivate employees to learn.

Badges are widely perceived as being childish, yet there is no denying that the game mechanics that underpin them can work. Some people are incredibly motivated by badges. Once they’ve earned one, they want to earn another.

You will note that I am using weasel words such as “can” and “some”. This is because badges don’t motivate everyone – just ask Foursquare! But my view is if they motivate a significant proportion of your target audience, then that makes them worthwhile.

I consider this an important point because as learning in the corporate sector becomes more informal, the employee’s motivation to drive their own development will become increasingly pivotal to their performance, and hence to the performance of the organisation as a whole.

Credential badge

2. Open badges can credential in-house training.

Yes, corporates can print off certificates of completion for employees who undertake their in-house training offerings, only for them to be pinned to a workstation or hidden in a drawer.

And yes, corporates typically track and record completion statuses in their LMS, but that lacks visibility for pretty much everyone but the employee him- or herself.

In contrast, open badges are the epitome of visibility. They’re shiny and colourful, the employee can collect them in their online backpack, and they can be shown off via a plugin on a website or blog – or intranet profile.

Badges therefore give corporates the opportunity to recognise the employees who have completed their in-house training, within an enterprise-wide framework.

Portable badge

3. Open badges are portable.

Currently, if you undertake training at one organisation and then leave to join another, you leave your completion records behind. However, if badges were earned through that training, their openness and centralisation in the cloud mean that you can continue to “wear” them when you move to your next employer.

This portability of open badges would be enhanced if third parties were also able to endorse the training. So an APRA-endorsed badge earned at Bank A, for example, would be meaningful to my next employer, Bank B, because this bank is also regulated by APRA.

Still, the concept holds without third-party endorsement; that is to say, much of the training provided by Bank A would probably still be meaningful to Bank B – because Bank A and Bank B do very similar things.
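For the technically curious, here is roughly what a hosted badge assertion looks like under the Open Badges specification of the day; the field names reflect my reading of the 1.x spec, and the URLs and values are invented, so treat it as illustrative rather than definitive. The point is that everything needed to verify the badge lives at public URLs in the cloud, independent of Bank A’s LMS.

```python
# Roughly an Open Badges (1.x-era) hosted assertion, expressed as a Python dict.
# The URLs, uid and identity hash are invented for illustration; verification
# works by fetching the assertion and badge class from their public URLs.
import json

assertion = {
    "uid": "abc123",
    "recipient": {
        "type": "email",
        "hashed": True,
        "identity": "sha256$<hash of the earner's email>",
    },
    "badge": "https://banka.example.com/badges/design-thinking-practitioner.json",
    "verify": {
        "type": "hosted",
        "url": "https://banka.example.com/assertions/abc123.json",
    },
    "issuedOn": "2013-07-17",
    "evidence": "https://banka.example.com/evidence/abc123",
}

print(json.dumps(assertion, indent=2))
```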

Task-oriented badge

4. Open badges are task oriented.

Despite my talk of “training” thus far, open badges are in fact task oriented. That means they recognise the execution of specific actions, and hence the mastery of skills.

I love this aspect of open badges because it means they don’t promise that you can do a particular task, but rather demonstrate that you have already done it.

That gives employers confidence in your capability to perform on the job.

Assessment badge

5. Open badges can formally recognise informal learning.

I have argued previously that in the modern workplace, we should informalise learning and formalise assessment.

My rationale is that the vast majority of learning in the workplace is informal anyway. Employees learn in all kinds of ways – from reading a newsfeed or watching a video clip, to playing with new software or chatting with colleagues over lunch.

The question is how to manage all of that learning. The answer is you don’t.

If a particular competency is important to the business, you assess it. Assessment represents the sum of all the learning that the employee has undertaken in relation to that competency, regardless of where, when or how it was done.

I see open badges as micro-assessments of specific tasks. If you execute a task according to the pre-defined criteria (whatever they may be), then you earn its badge. In this way, the badge represents the sum of all the learning that you have undertaken to perform the task successfully, regardless of where, when or how that learning was done.

Opinion badge

This is my blog, so of course all of the above assertions are the product of my own opinion. Naturally, I believe it to be an opinion informed by experience.

Other people have different opinions – some concordant, some contrary, as the comments under Badges of honour and The past tense of open badges will attest.

So, I’m curious… what’s your opinion?