
7 tips for custodians of capability frameworks

18 September 2017

Wow, my previous blog post elicited some rich comments from my peers in the L&D profession.

Reframing the capability framework was my first foray into publishing my thoughts on the subject, in which I argued in favour of using the oft-ignored resource as a tool to be proactive and add value to the business.

To everyone who contributed a comment, not only via my blog but also on Twitter and LinkedIn… thank you. Your insights have helped me shape my subsequent thoughts about capability frameworks and their implementation in an organisation.

I will now articulate these thoughts in the tried and tested form of a listicle.


If you are building, launching or managing your organisation’s capabilities, I invite you to consider my 7 tips for custodians of capability frameworks…

1. Leverage like a banker.

At the organisational level, the capabilities that drive success are strikingly similar across companies, sectors and industries. Unless your needs are truly unusual, you probably don’t need to build a bespoke capability framework from the ground up.

Instead, consider buying a box set of capabilities from the experts in this sort of thing, or draw inspiration *ahem* from someone else who has shared theirs. (Hint: Search for a “leadership” capability framework.)

2. Refine like a sculptor.

No framework will perfectly model your organisation’s needs from the get-go.

Tweak the capabilities to better match the nature of the business, its values and its goals.

3. Release the dove.

I’ve witnessed a capability framework go through literally years of wordsmithing prior to launch, in spite of rapidly diminishing returns.

Lexical squabbles are a poor substitute for action. So be agile: Launch the not-yet-finished-but-still-quite-useful framework (MVP) now.

Then continuously improve it.

4. Evolve or die.

Consider your capability framework an organic document. It is never finished.

As the needs of the business change, so too must your people’s capabilities if they are to remain relevant.

5. Sing from the same song sheet.

Apply the same capabilities to everyone across the organisation.

While technical capabilities will necessarily be different for the myriad job roles throughout your business, the organisational capabilities should be representative of the whole organisation’s commitment to performance.

For example, while Customer Focus is obviously relevant to the contact centre operator, is it any less so for the CEO? Conversely, while Innovation is obviously relevant to the CEO, is it any less so for the contact centre operator?

Having said that, the nature of a capability will necessarily be different across levels or leadership stages. For example, while the Customer Focus I and Innovation I capabilities that apply to the contact centre operator will be thematically similar to Customer Focus V and Innovation V that apply to the CEO, their pitches will differ in relation to their respective contexts.

6. Focus like an eagle.

Frameworks that comprise dozens of capabilities are unwieldy, overwhelming, and ultimately useless.

Not only do I suggest your framework comprise fewer rather than more capabilities, but also that one or two be earmarked for special attention. These should align with the strategic imperatives of the business.

7. Use it or lose it.

A capability framework that remains unused is merely a bunch of words.

In my next blog post I will examine ways in which it can be used to add value at each stage of the employee lifecycle.


Reframing the capability framework

28 August 2017

There once was a time when I didn’t respect the capability framework. I saw it as yet another example of HR fluff.

You want me to be innovative? No kidding. And collaborative? What a great idea! And you want me to focus on our customers? Crikey, why didn’t I think of that?!

But that was then, and this is now.

Now I realise that I severely underestimated the level of support that my colleagues seek in relation to their learning and development. As a digitally savvy L&D professional, I’ve had the temperament to recognise the capabilities I need – nay, want – to develop, the knowledge of how and where to develop them, and crucially the motivation to go ahead and do it.

But our target audience is not like us. While we live and breathe learning, they don’t. Far too many imho wait to be trained, and our boring, time-guzzling and ultimately useless offerings haven’t helped change their minds.

Yet even those who are motivated to learn struggle to do so effectively.


Sure, we’ve read about those intrepid millennials who circumnavigate the languid L&D department to develop their own skills via YouTube, MOOCs, user forums, meet-ups and the like; but for every one wunderkind there are several hundred others scratching their heads once a year while they ponder what to put in their Individual Development Plan, before finally settling on “presentation skills”.

This is unacceptable!

While it’s admirable for L&D to be responsive to the business’s relentless requests for training, it’s time for us to break out of the cycle of reactivity. I put it to you that a capability framework can help us do that. It’s a tool we can use to be proactive.

If we inform the organisation of the capabilities that will improve our performance, enable individuals to assess these capabilities to identify those that are most relevant for their own development, and map meaningful learning opportunities against each one, we add value to the business.

In an era in which the ROI of the L&D department is being put under ever-increasing scrutiny, I suggest a value-added approach is long overdue.

The 70:20:10 lens

9 February 2016

In 70:20:10 for trainers I advocated the use of the 70:20:10 model by L&D professionals as a lens through which to view their instructional design.

The excellent comments on my post, and insightful blog posts by others – notably Mark Britz, Clark Quinn and Arun Pradhan – have prompted me to think deeper about my premise.

I continue to reject the notion that 70:20:10 is a formula or a goal, because it is not a model of what “should be”. For example, we needn’t devote 70% of our time, effort and money to on-the-job interventions, 20% to social learning, and 10% to formal training. Similarly, we shouldn’t mandate that our target audience align its learning activity according to these proportions. Both of these approaches miss the point.

The point is that 70:20:10 is a model of what “is”. Our target audience does undertake 70% of its learning on the job, 20% via interacting with others, and 10% off the job (or thereabouts). Mark Britz calls it a principle. It’s not right and it’s not wrong. It just is.

Our role then as L&D professionals is to support and facilitate this learning as best we can. One of the ways I propose we do this is by using 70:20:10 as a lens. By this I mean using it as a framework to structure our thinking and prompt us on what to consider. Less a recipe citing specific ingredients and amounts, more a shopping basket containing various ingredients that we can use in different combinations depending on the meal.

For this purpose I have created the following diagram. To avoid the formula trap, I decided against labelling each segment 70, 20 and 10, and instead chose their 3E equivalents of Experience, Exposure and Education. For the same reason, I sized each segment evenly rather than to scale.

The 3 E's: Education, Exposure, Experience

Using the framework at face value is straightforward. Given a learning objective, we consider whether a course or a resource may be suitable, whether a social forum might be of use, or whether matching mentees with mentors would be worthwhile. Perhaps it would be helpful to develop some reference content, or to provide a job aid. When looking through the lens, we see alternatives and complements beyond the usual event-based intervention.

Yet we can see more. Consider not only the elements in the framework, but also the interactions between them. For example, in our course we could assign an on-the-job task to the learners, and ask them to share their experiences with it on the enterprise social network (ESN). In the language of the framework, we are connecting education to experience, which in turn we connect to exposure. Conversely, we can ask workshop attendees to share their experiences in class (connecting experience to education) or encourage them to call out for project opportunities (connecting exposure to experience). The possibilities for integrating the elements are endless.

Those who see L&D as the arbiter of all learning in the workplace may find all this overwhelming. But I see L&D as a support function. To me, 70:20:10 is not about engineering the perfect solution. It’s about adding value to what already happens in our absence.

70:20:10 for trainers

12 January 2016

Learning & Development Professional has been running a poll on the following question:

Is the 70:20:10 model still relevant today?

And I’m shocked by the results. At the time of writing this blog, over half the respondents have chosen “No”. Assuming they are all L&D professionals, the extrapolation means most of us don’t think the 70:20:10 model is relevant to our work.

But what does this really mean?

In LDP’s article The 70:20:10 model – how fair dinkum is it in 2015? – by the way, “fair dinkum” is Australian slang for “real” or “genuine” – Emeritus Professor David Boud says he doesn’t think there is proper evidence available for the effectiveness of the model.

If this is a backlash against the numbers, I urge us all to let it go already. Others have explained umpteen times that 70:20:10 is not a formula. It just refers to the general observation that the majority of learning in the workplace is done on the job, a substantial chunk is done by interacting with others, while a much smaller proportion is done off the job (eg in a classroom).

Indeed this observation doesn’t boast a wealth of empirical evidence to support it, although there is some – see here, here and here.

Nonetheless, I wonder if the hoo-ha is really about the evidence. After all, plenty of research can be cited to support the efficacy of on-the-job learning, social learning and formal training. To quibble over their relative proportions seems a bit pointless.

Consequently, some point the finger at trainers. These people are relics of a bygone era, clinging to the old paradigm because “that’s how we’ve always done it”. And while this might sound a bit harsh, it may contain a seed of truth. Change is hard, and no one wants their livelihood threatened.

If you feel deep down that you are one of the folks who view 70:20:10 as an “us vs them” proposition, I have two important messages for you…

1. Training will never die.

While I believe the overall amount of formal training in the workplace will continue to decrease, it will never disappear altogether – principally for the reasons I’ve outlined in Let’s get rid of the instructors!

Ergo, trainers will remain necessary for the foreseeable future.

2. The 70:20:10 model will improve your effectiveness.

As the forgetting curve illustrates, no matter how brilliant your workshops are, they are likely to be ineffective on their own.

Ebbinghaus Forgetting Curve showing exponentially decreasing retention over time
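
As a rough illustration of what the curve describes (a simplified sketch; the exact shape and parameters vary by study and by the material being learned), retention over time is commonly modelled as an exponential decay:

\[ R(t) = e^{-t/S} \]

where R(t) is the proportion of material retained after time t, and S reflects the strength of the memory; the larger S is, the more slowly we forget. Without reinforcement, retention falls away steeply in the hours and days after the workshop.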

To overcome this problem, I suggest using the 70:20:10 model as a lens through which you view your instructional design.

For example, suppose you are charged with training the sales team on a new product. As a trainer, you will smash the “10” with an informative and engaging workshop filled with handouts, scenarios, role plays, activities etc.

Then your trainees return to their desks, put the handouts in a drawer, and try to remember all the important information for as long as humanly possible.

To help your audience remember, why not provide them with reference content in a central location, such as on the corporate intranet or in a wiki? Then they can look it up just in time when they need it; for example, in the waiting room while visiting a client.

Job aids would also be useful, especially for skills-based information; for example, the sequence of key messages to convey in a client conversation.

To improve the effectiveness of your workshop even further, consider doing the following:

  • Engage each trainee’s manager to act as their coach or mentor. Not only does this extend the learning experience, but it also bakes in accountability for the learning.

  • Encourage the manager to engineer opportunities for the trainee to put their learning into practice. These can form part of the assessment.

  • Set up a community of practice forum in which the trainee can ask questions in the moment. This fosters collaboration among the team and reduces the burden on the L&D department to respond to each and every request.

  • Partner each trainee with a buddy to accompany them on their sales calls. The buddy can act as a role model and provide feedback to the trainee.

In my humble opinion, it is counter-productive to rail against 70:20:10.

As an L&D professional, it is in your interest to embrace it.

My blogging year in the rear-view mirror

8 December 2015

As the year draws to a close, I like to reflect on my blog posts.

I invite you to scan the list below and catch up on any that you may have missed. It’s never too late to comment!


Thank you everyone for your ongoing support.

I wish you a merry Christmas and a happy new year!

Where is L&D heading?

6 October 2015

Last week I was invited by David Swaddle to be a panellist at the Sydney eLearning and Instructional Design meetup.

The topic of the evening was Where is L&D Heading? and some questions were posted through by the attendees ahead of time, while others emerged through the discourse.

Here is an overview of my answers, plus elaborations and suggestions for further reading, for each of the questions that was (and was not) asked. Feel free to add your own views via the comments…


With Ernst & Young dropping their degree entry requirement, how do you see the future of universities? Is the race to the bottom on time and price for degrees affecting employers’ perceptions of universities? What respect do MOOC qualifications get?

I find EY’s move here interesting, but I don’t expect other companies to follow suit en masse – particularly enterprise-wide. Having said that, dropping the degree entry requirement could make sense for specific teams such as Innovation, who might be looking for someone with creative thinking skills rather than a Bachelor of Commerce degree.

I see the future of universities as service providers, plain and simple. Students are customers, and increasing competition, deregulation and even the emergence of MOOCs have shifted power into their hands. Yes, deregulation may prompt the $100,000 degree… but who will buy it?

If students are customers, by extension so are employers. I don’t think the time and price of a degree are such big issues for them; instead I think it’s the relevance of the degree. Whether or not we agree the role of the university is to prepare students for the workplace, I think it’s going that way due to market forces.

Regarding MOOC qualifications, I think many of us are still looking at them the wrong way. When we worry about the status of their credentials or lose sleep over their completion rates, we’re perpetuating an out-dated paradigm of education based on formal learning. I prefer to see MOOCs through the lens of informal learning which values the learning over its bureaucracy. If a job applicant lists some MOOCs on their CV, I think it demonstrates an aptitude to drive their own development.


How do you see the impact and importance of big data, adaptive learning, mobile learning and micro-learning?

While mobile learning gets a lot of hype – rightly or wrongly – my target audience is office-bound. Yes, I can push content to their devices (and there’s a solid argument for micro-learning in this instance), but the truth is no one will do their training on the bus. Outside of work hours, most people don’t want to do anything work related.

I see more scope in pull learning. For example, it’s important that your intranet is mobile optimised, so when someone is away from their desk, they can quickly look up the information they need and put it into action.

The real power of m-learning though is in creating an experience. By this I mean integrating the content with the environment in which the individual is situated, and I see a lot of potential in augmented reality and wearable technologies facilitating this.

And let’s not forget about blended learning. If we allow our attendees to bring their tablets into class, they can participate in online polling, consume content and play games together. While this isn’t actually mobile learning, it leverages the technology.

As for big data, there is clearly a lot of potential in using it to inform our practice – if we can access it. I also see a lot of potential for adaptive learning in personalising the learning experience – if we can work with the tools. My caveat for emerging technologies such as these is what I call the “Average Joe imperative” – if regular folks can’t do it, it won’t gain widespread adoption.


What about online social education and Communities of Practice? What are the challenges in using them properly in companies, schools or universities? Where are the success stories?

Beyond the technology, the success of social learning is predicated on the culture of the organisation. If your people aren’t the type who care and share, then a platform isn’t going to be much help. Having said that, I believe the managers in the organisation have a critical role to play in leading by example.

My go-to success stories for social learning are Coca-Cola Amatil, who have cultivated active communities of practice across state-based factory floors; and Deloitte, who are the poster child for enterprise social networking.


Will interactive videos replace e-learning modules?

I think lots of things will replace e-learning modules!

As we embrace informal learning, we will rely less on e-learning modules in favour of alternatives such as social forums, job aids, games, and indeed, interactive videos.

I see the LMS then being used more for the assessment of learning.


What tips does the panel have for coping with reduced training budgets?

My big tip here is that you can do a lot for free or on-the-cheap.

For example, if you want to film a training scenario, you could pay a production house many thousands of dollars to produce a slick, Academy Award worthy video clip. Alternatively, you could use your iPhone.

Sure, the quality won’t be nearly as good, but that doesn’t matter so long as it’s good enough. What really matters is the learning outcome.

Besides, I think in-house production adds authenticity to the scene.


Does L&D belong in HR?

I interpret this question as really asking “Should L&D be centralised or distributed?”.

My short answer is both. A centralised Organisational Development function can focus on enterprise-wide capability needs, while L&D professionals embedded in the business can address local capability needs.


How does the panel identify whether an L&D professional is good? Does Australia need improved quality benchmarking or qualifications for L&D professionals such as instructional designers?

I think the point of learning in the workplace is to improve performance, so my definition of a “good” L&D professional is one who improves the performance of his or her business.

There are certain attributes that I value in an L&D pro, including being proactive, consultative, creative, and willing to try new things.

If I were considering an applicant for an instructional design role, I’d ask them to demonstrate their track record, just as I’d ask a sales rep to do. A portfolio would be useful, as would their approach to a hypothetical project.

Furthermore, I think you can tell a lot about someone’s expertise through simple conversation; if they don’t really know what they’re talking about, it will become painfully obvious.

As for benchmarking and formal qualifications for L&D pros, I think they can help, but I wouldn’t put too much stock in them. As EY is seeing, acing the qual doesn’t necessarily translate into good practice.


What advice would you give to somebody interested in getting involved in ID?

I think getting involved is the key phrase in this question.

Attend meetups and events, get active on social media, participate in #lrnchat, work out loud, scan the academic research, and read blogs – learn from those at the coal face.

The caveat of the performance centre

10 February 2014

One of the more exciting ideas to emerge from the corporate learning space, which I hasten to add is yet to be realised, is to transform the Learning & Development department into a performance centre.

Rather than charging L&D Consultants with marketing the team’s lovingly crafted interventions, or reacting to random solution-first requests from the business – We need a team building workshop! – the Performance Consultant analyses the real needs of the business and identifies the relevant solutions.

This is not a novel idea. For example, I am aware of an Australian bank that established a performance centre over a decade ago, while Helen Blunden recently shared the following via an OzLearn chat:

On the face of it, this makes sense to me. I subscribe to the notion that the point of learning in the workplace is to improve performance, and the raison d’être of the performance centre is to shift our focus to its namesake.

However, I do have a caveat: If the performance centre is populated with L&D types, then the solutions they devise are probably going to be L&D oriented.

This won’t appear to pose a problem until you appreciate that not all performance ailments are due to an L&D deficiency. On the contrary, poor performance may be caused by myriad factors such as:

• A flawed process
• Obsolete technology
• Inadequate resourcing
• Noise or other disturbances
• Office politics
• Interpersonal conflict

…or any number of human conditions:

• Stress
• Sickness
• Demotivation
• Exhaustion
• Laziness

…not to mention one of my favourites, offered by Joyce Seitzinger in the aforementioned OzLearn chat:

Of course! Recruiting the right person for the role in the first place!

My point is, while poor performance may well be due to a lack of capability, it might not be either. An effective Performance Consultant must determine the root causes of the problems – whatever they may be – and respond accordingly. Do former L&D Consultants have that skillset?

If all you have is a hammer, everything looks like a nail.