
5 podcasts every e-learning professional should listen to

3 July 2019

…or should that be “to which every e-learning professional should listen”? Never mind, I can end a sentence with a preposition if I want to.

Arcane grammar jokes aside, I’m a late bloomer to podcasts. While everyone else was apparently obsessed with them, they never really appealed to me until I started taking long trips on the bus. Now I’m hooked.

As many of my peers will attest, there’s no shortage of podcasts aimed at the L&D practitioner. In fact, the sheer volume of options can be overwhelming.

If, like me, you’re just getting started with podcasts, or perhaps you’re looking for another one to add to your subscriptions, I hereby offer you 5 of my favourites.

A mobile phone with earphones

1. Learning Uncut

Produced by three of the best in the business – namely, Michelle Ockers, Karen Maloney and Amanda Ashby – Learning Uncut recently celebrated its first birthday.

Over the course of the past year, Michelle and Karen have interviewed an impressive cross-section of experts in my corner of the globe. The episode featuring Nic Barry is a standout.

2. The Learning & Development Podcast

A newcomer to the podcasting scene, The Learning & Development Podcast is hosted by David James.

David’s view of our profession largely mirrors my own (hence he is a genius) and I consider his interview with Simon Gibson a must-hear.

3. Learning is the New Working

Given his experience as Microsoft’s Chief Learning Officer, Chris Pirie’s Learning is the New Working is well worth a listen.

Chris reaches out to people around the world whom (to be perfectly honest) I hadn’t heard of before, which is welcome because they diversify my feed.

4. The eLearning Coach Podcast

No self-respecting e-learning professional would fail to devour Connie Malamed’s The eLearning Coach blog, which she complements admirably with The eLearning Coach Podcast.

What I love about Connie’s expertise is her focus on practicality. Thought leadership is great and all, but how do we apply it to our work?

5. Hardcore History

While educational, Hardcore History isn’t about education. I include it in my list of faves, however, because it flies in the face of contemporary notions of instructional design.

Each episode spans several hours and frankly I could listen to Dan Carlin talk all day. Despite the hoopla over micro-learning (which, for the record, I advocate), clearly one size does not fit all.

My point is it’s healthy for us professionals to continually reassess our own philosophies by appreciating contrarian approaches – especially those that are raging success stories!

Light bulb

If you’d like more ideas for what an e-learning professional should do, check out the following blog posts by yours truly:

And these by my friend Matt Guyan:


The L&D maturity curve

4 March 2019

Over the course of my career, I’ve witnessed a slow but steady shift away from formal learning to informal learning.

Of course, remnants of the “formal first” philosophy still exist, whereby a training solution, typically in the form of a course, is thrown at every conceivable problem. Over time, the traditional classroom-based delivery of such courses has increasingly given way to online modules, but that’s merely a change in format – not strategy.

While courses certainly have their place in the L&D portfolio, the forgetting curve places a question mark over their long-term effectiveness on their own.

The informal first philosophy balances the pendulum by empowering the employee to self-direct their learning in accordance with their personal needs.

While in some cases informal learning obviates the need for training, in other cases it will complement it. For example, I see the informalisation of learning as an opportunity to deliver the content (for example, via a wiki) which can be consumed at the discretion of the employee. The focus of the course then pivots to the application of the content, which is the point of learning it in the first place. Similarly, the assessment evaluates the learning in the context of real-world scenarios, which is what the learner will encounter post-course.

And since the content remains accessible, it can be used for ongoing reference long after the course has been completed.

A hand holding a pen pointing to a chart.

While I consider the informal first philosophy a giant leap in L&D maturity, it essentially pertains to instructional design. For a more holistic view of L&D, I propose an “assessment first” philosophy by which the capability of the target audience is analysed prior to any design work being undertaken.

The rationale for this philosophy is best appreciated in the context of an existing employee base (rather than greenhorn new starters). Such a group comprises adults who have a wide range of knowledge, skills and experiences. Not to mention they’ve probably been doing the job for a number of years.

Sheep-dipping everyone in this group with the same training doesn’t make much sense. For a minority it might be a worthwhile learning experience, but for the majority it is likely to be redundant. This renders the training an ineffective waste of time, and an unnecessary burden on the L&D team.

By firstly assessing the target audience’s proficiency in the competencies that matter, a knowledge gap analysis can identify those in which the population is weak, and targeted training can be delivered in response. Individuals who are “not yet competent” in particular areas can be assigned personalised interventions.
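To make the idea concrete, here’s a minimal sketch of such a knowledge gap analysis. The employees, competencies, scores and the competence threshold are all hypothetical; the point is simply that group-level weaknesses drive targeted group training, while individual “not yet competent” results drive personalised interventions.

```python
# Illustrative "assessment first" gap analysis.
# All names, scores and the threshold are hypothetical.
from statistics import mean

# Assessment scores (0-100) per employee, per competency
scores = {
    "Alice": {"Customer Focus": 85, "Innovation": 40, "Product Knowledge": 90},
    "Bob":   {"Customer Focus": 70, "Innovation": 55, "Product Knowledge": 45},
    "Carol": {"Customer Focus": 90, "Innovation": 35, "Product Knowledge": 80},
}

COMPETENT = 60  # hypothetical pass threshold

def weak_competencies(scores, threshold=COMPETENT):
    """Competencies where the population average falls below the
    threshold: candidates for targeted group training."""
    competencies = next(iter(scores.values())).keys()
    return [c for c in competencies
            if mean(person[c] for person in scores.values()) < threshold]

def personal_gaps(scores, threshold=COMPETENT):
    """Per-person 'not yet competent' areas: candidates for
    personalised interventions."""
    return {name: [c for c, s in person.items() if s < threshold]
            for name, person in scores.items()}

print(weak_competencies(scores))        # ['Innovation']
print(personal_gaps(scores)["Bob"])     # ['Innovation', 'Product Knowledge']
```

In this toy data, only Innovation warrants a group-level response, while Bob alone needs help with Product Knowledge – which is exactly the demand reduction the assessment first philosophy promises.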

This approach avoids the solution first trap. By focusing the L&D team’s attention on the real needs of the business, not only does the volume of demand reduce, but the work becomes more relevant.

The assessment first philosophy may appear incongruent where new starters are concerned, who by definition are assumed to be weak in all competencies – after all, they’ve only just walked through the door! – but I counter that assumption on two fronts.

Firstly, not all new starters are doe-eyed college grads. Many have had previous jobs in the industry or in other industries, and so they arrive armed with transferable knowledge, skills and experiences.

And regardless, the informal first philosophy holds true. That is to say, the new starter can consume the content (or not) as they see fit, demonstrate their understanding in the scenario-oriented “course”, and formalise it via the assessment.

The results of the assessment dictate any further intervention that is necessary.

Of course, some topics such as the company’s own products or processes will necessitate significant front-end loading via content development and maybe even curricula, but these may be considered the exception rather than the rule. By looking through the lens of assessment first, the L&D team works backwards to focus that kind of energy on where it is warranted.

It is also worth noting the assessment first philosophy renders the traditional “pass mark” obsolete, but such a radical idea is a story for another day!

Laptop showing business metrics.

While the assessment first philosophy represents an exponential leap in the maturity of L&D, there is yet another leap to make: “performance first”.

The raison d’être of the L&D team is to improve performance, so it’s always been a mystery to me why our work is so often disconnected from the business results. I do appreciate the barriers in our way – such as the inexplicable difficulty of obtaining the stats – but still, we can and should be doing more.

Under the performance first paradigm, it is not knowledge gaps that are analysed, but rather performance gaps. A root cause analysis identifies whether the cause is a capability deficiency or not – in the case of the former, a capability analysis feeds into the assessment first approach; in the case of the latter, a solution other than training is pursued instead.

As with assessment first, performance first may appear incongruent where new starters are concerned. After all, their stats thus far are zero, and waiting to recognise poor performance may have unacceptable consequences.

So again we have another exception to the rule, whereby some folks may be scaffolded through L&D intervention prior to their performance being analysed. However, the point is we needn’t force everyone down that road. It depends on the circumstances.

And again, by looking through the lens of performance first, the L&D team works backwards to focus its energy on where it is needed. But this time with results at the forefront of the team’s purpose, its relevance to the business goes through the roof.

The L&D Maturity Curve, featuring Formal First rising to Informal First rising to Assessment First rising to Performance First. The x-axis represents maturity of the L&D function and the y-axis represents its relevance to the business.

I realise my take on L&D maturity might freak some of my peers out. Concurrently, others will argue that we should leapfrog to performance first now and get on with it.

Personally I consider the maturity curve a journey. Yes, it is theoretically possible to skip stages, but I feel that would be a shock to the system. From a change management perspective, I believe an organisation at one stage of the curve would achieve more success by growing into the next stage of the curve, while ironing out the bugs and creating the new normal along the way.

Besides, it isn’t a race. Important journeys take time. What matters is the direction in which that journey is heading.

Gift horses

16 July 2018

“If I had asked people what they wanted, they would have said faster horses.”

I’m fascinated by this quote that Henry Ford may or may not have uttered.

In The best of both worlds I promoted Design Thinking as a means of using customer insights to inform strategic decision making. However, as the above quote suggests, customers don’t know what they don’t know. Sometimes it takes an expert to show them.

In an era in which the very existence of the L&D department is attracting ever more scrutiny, the role of the “expert” in our context is becoming increasingly pertinent. I have long been of the opinion that L&D professionals should dispense with being the SME of what is being trained, and instead be the SME of how it’s being trained.

Under this paradigm, we are the experts in the science and practice of learning and development, and we consult the business accordingly.

This resonates with me because beyond the education and research I invest in myself, I’ve been around the block a few times. I have a strong idea of what will work, not only because I’ve read up on it and thought deeply about it, but also because I’ve seen it play out with my own eyes.

I also get paid to focus on my portfolio every day. I consider it not only my mandate, but an ethical obligation, to originate and innovate.

A horse in a pasture

So I’m more than comfortable with L&D professionals pushing the envelope on the basis of knowledge, curiosity, creativity and experience – so long as these activities are put through the Design Thinking cycle too.

By this I mean be confident that your idea is a sound one, but not so arrogant as to instil it with blind faith. Put your one-man (in my case) fruit of ideation to your customers to check it will work for them. While you’re at it, confirm the problem statement is indeed one that needs to be solved.

So much for Design Thinking being linear!

Then proceed with prototyping and testing, prior to launching an MVP, and iterating and evolving it.

In this way, the promise of expertise is tempered by an agile approach. It hedges the bet not only by building confidence pre-launch, but also by minimising potential losses post-launch.

Ford Mustang emblem depicting a galloping horse

If Mr Ford had resigned himself to breeding faster horses, he never would have launched the Model T.

In our admirable quest to utilise our customers as a source of innovation, let’s balance that approach by empowering the experts whom we have hired to practise their expertise.

Lest the L&D department be put out to pasture.

7 tips for custodians of capability frameworks

18 September 2017

Wow, my previous blog post elicited some rich comments from my peers in the L&D profession.

Reframing the capability framework was my first foray into publishing my thoughts on the subject, in which I argued in favour of using the oft-ignored resource as a tool to be proactive and add value to the business.

To everyone who contributed a comment, not only via my blog but also on Twitter and LinkedIn… thank you. Your insights have helped me shape my subsequent thoughts about capability frameworks and their implementation in an organisation.

I will now articulate these thoughts in the tried and tested form of a listicle.

Metallic blue building blocks, two golden.

If you are building, launching or managing your organisation’s capabilities, I invite you to consider my 7 tips for custodians of capability frameworks…

1. Leverage like a banker.

At the organisational level, the capabilities that drive success are strikingly similar across companies, sectors and industries. Unless you have incredibly unique needs, you probably don’t need to build a bespoke capability framework from the ground up.

Instead, consider buying a box set of capabilities from the experts in this sort of thing, or draw inspiration *ahem* from someone else who has shared theirs. (Hint: Search for a “leadership” capability framework.)

2. Refine like a sculptor.

No framework will perfectly model your organisation’s needs from the get-go.

Tweak the capabilities to better match the nature of the business, its values and its goals.

3. Release the dove.

I’ve witnessed a capability framework go through literally years of wordsmithing prior to launch, in spite of rapidly diminishing returns.

Lexical squabbles are a poor substitute for action. So be agile: launch the not-yet-finished-but-still-quite-useful framework (MVP) now.

Then continuously improve it.

4. Evolve or die.

Consider your capability framework an organic document. It is never finished.

As the needs of the business change, so too must your people’s capabilities if they are to remain relevant.

5. Sing from the same song sheet.

Apply the same capabilities to everyone across the organisation.

While technical capabilities will necessarily be different for the myriad job roles throughout your business, the organisational capabilities should be representative of the whole organisation’s commitment to performance.

For example, while Customer Focus is obviously relevant to the contact centre operator, is it any less so for the CEO? Conversely, while Innovation is obviously relevant to the CEO, is it any less so for the contact centre operator?

Having said that, the nature of a capability will necessarily be different across levels or leadership stages. For example, while the Customer Focus I and Innovation I capabilities that apply to the contact centre operator will be thematically similar to Customer Focus V and Innovation V that apply to the CEO, their pitches will differ in relation to their respective contexts.

6. Focus like an eagle.

Frameworks that comprise dozens of capabilities are unwieldy, overwhelming, and ultimately useless.

Not only do I suggest your framework comprise fewer rather than more capabilities, but also that one or two be earmarked for special attention. These should align to the strategic imperatives of the business.

7. Use it or lose it.

A capability framework that remains unused is merely a bunch of words.

In my next blog post I will examine ways in which it can be used to add value at each stage of the employee lifecycle.

Reframing the capability framework

28 August 2017

There once was a time when I didn’t respect the capability framework. I saw it as yet another example of HR fluff.

You want me to be innovative? No kidding. And collaborative? What a great idea! And you want me to focus on our customers? Crikey, why didn’t I think of that?!

But that was then, and this is now.

Now I realise that I severely underestimated the level of support that my colleagues seek in relation to their learning and development. As a digitally savvy L&D professional, I’ve had the temperament to recognise the capabilities I need – nay, want – to develop, the knowledge of how and where to develop them, and crucially the motivation to go ahead and do it.

But our target audience is not like us. While we live and breathe learning, they don’t. Far too many imho wait to be trained, and our boring, time-guzzling and ultimately useless offerings haven’t helped change their minds.

Yet even those who are motivated to learn struggle to do so effectively.

A businessman thinking

Sure, we’ve read about those intrepid millennials who circumnavigate the languid L&D department to develop their own skills via YouTube, MOOCs, user forums, meet-ups and the like; but for every one wunderkind there are several hundred others scratching their heads once a year while they ponder what to put in their Individual Development Plan, before finally settling on “presentation skills”.

This is unacceptable!

While it’s admirable for L&D to be responsive to the business’s relentless requests for training, it’s time for us to break out of the cycle of reactivity. I put it to you that a capability framework can help us do that. It’s a tool we can use to be proactive.

If we inform the organisation of the capabilities that will improve our performance, enable individuals to assess these capabilities to identify those that are most relevant for their own development, and map meaningful learning opportunities against each one, we add value to the business.

In an era in which the ROI of the L&D department is being put under ever-increasing scrutiny, I suggest a value-added approach is long overdue.

The 70:20:10 lens

9 February 2016

In 70:20:10 for trainers I advocated the use of the 70:20:10 model by L&D professionals as a lens through which to view their instructional design.

The excellent comments on my post, and insightful blog posts by others – notably Mark Britz, Clark Quinn and Arun Pradhan – have prompted me to think deeper about my premise.

I continue to reject the notion that 70:20:10 is a formula or a goal, because it is not a model of what “should be”. For example, we needn’t assign 70% of our time, effort and money to on-the-job (OTJ) interventions, 20% to social learning, and 10% to formal training. Similarly, we shouldn’t mandate that our target audience aligns its learning activity according to these proportions. Both of these approaches miss the point.

The point is that 70:20:10 is a model of what “is”. Our target audience does undertake 70% of its learning on the job, 20% via interacting with others, and 10% off the job (or thereabouts). Mark Britz calls it a principle. It’s not right and it’s not wrong. It just is.

Our role then as L&D professionals is to support and facilitate this learning as best we can. One of the ways I propose we do this is by using 70:20:10 as a lens. By this I mean using it as a framework to structure our thinking and prompt us on what to consider. Less a recipe citing specific ingredients and amounts, more a shopping basket containing various ingredients that we can use in different combinations depending on the meal.

For this purpose I have created the following diagram. To avoid the formula trap, I decided against labelling each segment 70, 20 and 10, and instead chose their 3E equivalents of Experience, Exposure and Education. For the same reason, I sized each segment evenly rather than to scale.

The 3 E's: Education, Exposure, Experience

Using the framework at face value is straightforward. Given a learning objective, we consider whether a course or a resource may be suitable, whether a social forum might be of use, or whether matching mentees with mentors would be worthwhile. Perhaps it would be helpful to develop some reference content, or provide a job aid. When looking through the lens, we see alternatives and complements beyond the usual event-based intervention.

Yet we can see more. Consider not only the elements in the framework, but also the interactions between them. For example, in our course we could assign an on-the-job task to the learners, and ask them to share their experiences with it on the enterprise social network (ESN). In the language of the framework, we are connecting education to experience, which in turn we connect to exposure. Conversely we can ask workshop attendees to share their experiences in class (connecting experience to education) or encourage them to call out for project opportunities (connecting exposure to experience). The possibilities for integrating the elements are endless.

Those who see L&D as the arbiter of all learning in the workplace may find all this overwhelming. But I see L&D as a support function. To me, 70:20:10 is not about engineering the perfect solution. It’s about adding value to what already happens in our absence.

70:20:10 for trainers

12 January 2016

Learning & Development Professional has been running a poll on the following question:

Is the 70:20:10 model still relevant today?

And I’m shocked by the results. At the time of writing this blog, over half the respondents have chosen “No”. Assuming they are all L&D professionals, the extrapolation means most of us don’t think the 70:20:10 model is relevant to our work.

But what does this really mean?

In LDP’s article The 70:20:10 model – how fair dinkum is it in 2015? – by the way, “fair dinkum” is Australian slang for “real” or “genuine” – Emeritus Professor David Boud says he doesn’t think there is proper evidence available for the effectiveness of the model.

If this is a backlash against the numbers, I urge us all to let it go already. Others have explained umpteen times that 70:20:10 is not a formula. It just refers to the general observation that the majority of learning in the workplace is done on the job, a substantial chunk is done by interacting with others, while a much smaller proportion is done off the job (eg in a classroom).

Indeed this observation doesn’t boast a wealth of empirical evidence to support it, although there is some – see here, here and here.

Nonetheless, I wonder if the hoo-ha is really about the evidence. After all, plenty of research can be cited to support the efficacy of on-the-job learning, social learning and formal training. To quibble over their relative proportions seems a bit pointless.

Consequently, some point the finger at trainers. These people are relics of a bygone era, clinging to the old paradigm because “that’s how we’ve always done it”. And while this might sound a bit harsh, it may contain a seed of truth. Change is hard, and no one wants their livelihood threatened.

If you feel deep down that you are one of the folks who views 70:20:10 as an “us vs them” proposition, I have two important messages that I wish to convey to you…

1. Training will never die.

While I believe the overall amount of formal training in the workplace will continue to decrease, it will never disappear altogether – principally for the reasons I’ve outlined in Let’s get rid of the instructors!.

Ergo, trainers will remain necessary for the foreseeable future.

2. The 70:20:10 model will improve your effectiveness.

As the forgetting curve illustrates, no matter how brilliant your workshops are, they are likely to be ineffective on their own.

Ebbinghaus Forgetting Curve showing exponentially decreasing retention over time
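The forgetting curve is commonly modelled as exponential decay, R = e^(−t/S), where t is the time elapsed since the workshop and S is the relative strength of the memory (larger S means slower forgetting). A quick sketch – the strength values here are illustrative only – shows why a one-off workshop fades so fast, and why reinforcement that builds memory strength flattens the curve:

```python
# Ebbinghaus-style forgetting curve: R = e^(-t/S)
# The strength values below are illustrative, not empirical.
import math

def retention(days_elapsed, strength):
    """Estimated fraction of learned material retained after a delay."""
    return math.exp(-days_elapsed / strength)

# A one-off workshop (low memory strength): retention collapses within a week...
print(round(retention(7, strength=2), 2))   # 0.03
# ...while reinforcement that boosts strength flattens the curve.
print(round(retention(7, strength=20), 2))  # 0.7
```

That gap between 3% and 70% retention after a single week is the whole argument for wrapping the “10” in the “70” and the “20”.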

To overcome this problem, I suggest using the 70:20:10 model as a lens through which you view your instructional design.

For example, suppose you are charged with training the sales team on a new product. As a trainer, you will smash the “10” with an informative and engaging workshop filled with handouts, scenarios, role plays, activities etc.

Then your trainees return to their desks, put the handouts in a drawer, and try to remember all the important information for as long as humanly possible.

To help your audience remember, why not provide them with reference content in a central location, such as on the corporate intranet or in a wiki? Then they can look it up just in time when they need it; for example, in the waiting room while visiting a client.

Job aids would also be useful, especially for skills-based information; for example, the sequence of key messages to convey in a client conversation.

To improve the effectiveness of your workshop even further, consider doing the following:

  • Engage each trainee’s manager to act as their coach or mentor. Not only does this extend the learning experience, but it also bakes in accountability for the learning.

  • Encourage the manager to engineer opportunities for the trainee to put their learning into practice. These can form part of the assessment.

  • Set up a community of practice forum in which the trainee can ask questions in the moment. This fosters collaboration among the team and reduces the burden on the L&D department to respond to each and every request.

  • Partner each trainee with a buddy to accompany them on their sales calls. The buddy can act as a role model and provide feedback to the trainee.

In my humble opinion, it is counter-productive to rail against 70:20:10.

As an L&D professional, it is in your interest to embrace it.