
The L&D maturity curve

4 March 2019

Over the course of my career, I’ve witnessed a slow but steady shift away from formal learning to informal learning.

Of course, remnants of the “formal first” philosophy still exist, whereby a training solution – typically a course – is thrown at every conceivable problem. Over time, the traditional classroom-based delivery of such courses has increasingly given way to online modules, but that’s merely a change in format – not strategy.

While courses certainly have their place in the L&D portfolio, the forgetting curve places a question mark over their long-term effectiveness on their own.

The “informal first” philosophy balances the pendulum by empowering employees to self-direct their learning in accordance with their personal needs.

While in some cases informal learning obviates the need for training, in other cases it will complement it. For example, I see the informalisation of learning as an opportunity to deliver the content (for example, via a wiki) which can be consumed at the discretion of the employee. The focus of the course then pivots to the application of the content, which is the point of learning it in the first place. Similarly, the assessment evaluates the learning in the context of real-world scenarios, which is what the learner will encounter post-course.

And since the content remains accessible, it can be used for ongoing reference long after the course has been completed.

A hand holding a pen pointing to a chart.

While I consider the informal first philosophy a giant leap in L&D maturity, it essentially pertains to instructional design. For a more holistic view of L&D, I propose an “assessment first” philosophy by which the capability of the target audience is analysed prior to any design work being undertaken.

The rationale for this philosophy is best appreciated in the context of an existing employee base (rather than greenhorn new starters). Such a group comprises adults who have a wide range of knowledge, skills and experiences. Not to mention they’ve probably been doing the job for a number of years.

Sheep-dipping everyone in this group with the same training doesn’t make much sense. For a minority it might be a worthwhile learning experience, but for the majority it is likely to be redundant. This renders the training a waste of time and an unnecessary burden on the L&D team.

By firstly assessing the target audience’s proficiency in the competencies that matter, a knowledge gap analysis can identify those in which the population is weak, and targeted training can be delivered in response. Individuals who are “not yet competent” in particular areas can be assigned personalised interventions.
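To make the mechanics concrete, here is a minimal sketch of such a gap analysis. All names, scores and the 75% competence threshold are illustrative assumptions, not part of any standard:

```python
# Hypothetical sketch of a knowledge gap analysis.
# The 0.75 threshold and all names/scores are illustrative assumptions.

COMPETENT = 0.75  # assumed proficiency threshold for a competency

# Assessment results: each person's proficiency per competency (0.0 - 1.0)
results = {
    "Alice": {"analytics": 0.90, "negotiation": 0.60},
    "Bob":   {"analytics": 0.50, "negotiation": 0.80},
    "Carol": {"analytics": 0.95, "negotiation": 0.55},
}

def population_gaps(results, threshold=COMPETENT):
    """Competencies in which most of the population is weak -> targeted training."""
    competencies = {c for scores in results.values() for c in scores}
    gaps = []
    for c in sorted(competencies):
        weak = sum(1 for scores in results.values() if scores.get(c, 0) < threshold)
        if weak > len(results) / 2:
            gaps.append(c)
    return gaps

def personal_interventions(results, threshold=COMPETENT):
    """'Not yet competent' areas per individual -> personalised interventions."""
    return {person: sorted(c for c, s in scores.items() if s < threshold)
            for person, scores in results.items()}
```

With the sample data above, training would be targeted at the competency in which the majority falls short, while each individual receives interventions only for their own weak areas.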

This approach avoids the solution-first trap. By focusing the L&D team’s attention on the real needs of the business, not only does the volume of demand shrink, but the work becomes more relevant.

The assessment first philosophy may appear incongruent where new starters are concerned – after all, they’ve only just walked through the door, so by definition they’re assumed to be weak in all competencies – but I counter that assumption on two fronts.

Firstly, not all new starters are doe-eyed college grads. Many have had previous jobs in the industry or in other industries, and so they arrive armed with transferable knowledge, skills and experiences.

And regardless, the informal first philosophy holds true. That is to say, the new starter can consume the content (or not) as they see fit, demonstrate their understanding in the scenario-oriented “course”, and formalise it via the assessment.

The results of the assessment dictate any further intervention that is necessary.

Of course, some topics such as the company’s own products or processes will necessitate significant front-end loading via content development and maybe even curricula, but these may be considered the exception rather than the rule. By looking through the lens of assessment first, the L&D team works backwards to focus that kind of energy on where it is warranted.

It is also worth noting the assessment first philosophy renders the traditional “pass mark” obsolete, but such a radical idea is a story for another day!

Laptop showing business metrics.

While the assessment first philosophy represents an exponential leap in the maturity of L&D, there is yet another leap to make: “performance first”.

The raison d’être of the L&D team is to improve performance, so it’s always been a mystery to me why our work is so often disconnected from the business results. I do appreciate the barriers in our way – such as the inexplicable difficulty of obtaining the stats – but still, we can and should be doing more.

Under the performance first paradigm, it is not knowledge gaps that are analysed, but rather performance gaps. A root cause analysis identifies whether the cause is a capability deficiency or not – in the case of the former, a capability analysis feeds into the assessment first approach; in the case of the latter, a solution other than training is pursued instead.
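The triage described above can be sketched as a simple decision function. The field names and thresholds here are hypothetical; in practice they would come from real business metrics:

```python
# Illustrative sketch of performance-first triage (names are hypothetical).

def triage(metric_actual, metric_target, capability_deficient):
    """Route a performance gap.

    A capability deficiency feeds into the assessment-first pipeline;
    anything else calls for a non-training solution (process, tooling,
    incentives, and so on).
    """
    if metric_actual >= metric_target:
        return "no intervention"  # no performance gap to begin with
    if capability_deficient:      # outcome of the root cause analysis
        return "capability analysis -> assessment first"
    return "non-training solution"
```

The point of the sketch is the ordering: performance is checked first, and training is only ever one of the possible outcomes, not the default.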

As with assessment first, performance first may appear incongruent where new starters are concerned. After all, their stats thus far are zero, and waiting to recognise poor performance may have unacceptable consequences.

So again we have another exception to the rule, whereby some folks may be scaffolded through L&D intervention prior to their performance being analysed. However, the point is that we needn’t force everyone down that road. It depends on the circumstances.

And again, by looking through the lens of performance first, the L&D team works backwards to focus its energy on where it is needed. But this time with results at the forefront of the team’s purpose, its relevance to the business goes through the roof.

The L&D Maturity Curve, featuring Formal First rising to Informal First rising to Assessment First rising to Performance First. The x-axis represents maturity of the L&D function and the y-axis represents its relevance to the business.

I realise my take on L&D maturity might freak some of my peers out. Concurrently, others will argue that we should leapfrog to performance first now and get on with it.

Personally I consider the maturity curve a journey. Yes, it is theoretically possible to skip stages, but I feel that would be a shock to the system. From a change management perspective, I believe an organisation at one stage of the curve would achieve more success by growing into the next stage of the curve, while ironing out the bugs and creating the new normal along the way.

Besides, it isn’t a race. Important journeys take time. What matters is the direction in which that journey is heading.


Figure it out

5 November 2018

I can honestly say I’ve never suffered from imposter syndrome.

I’ve always been the type of person who likes to work out how to do something hands-on, so I can talk about it with confidence.

I suppose I’ve been lucky in the sense that, throughout my career, I’ve been able to align my curiosity and sense of direction with the needs of the business.

Having said that, I also suppose I’ve created some of my own luck by keeping a few steps ahead of the business.

Einstein bobblehead

It is in this light that I read this fascinating article about consultants. Initially I considered it an alarming exposé into the fake-it-til-you-make-it culture of professional services.

Upon continued reading, however, I increasingly sympathised with the consultants’ discomfort at not feeling on top of their game.

In the knowledge economy we can never know everything. For me, there is always another thing that I don’t just want to get my head around, but also deconstruct and reconstruct to understand deeply. When busy-ness gets in the way, the discomfort grows.

Over time I’ve learned to embrace the discomfort. It’s paradoxically liberating to recognise that I will always feel uncomfortable; that’s the nature of this kind of work.

It’s not like digging holes, where at the end of your shift you can forget about it until the next one. As a knowledge worker, you never clock off. Anywhere, anytime – or more accurately, everywhere, all the time – you’re thinking about it. It consumes you to the point that it becomes way more than just a job; it’s a lifestyle.

So yes, I sympathise with the consultants in the article. They’re dealing with multiple clients while under pressure to deliver at speed. To this the client will say “We pay you because you’re the expert.” And to a certain extent I agree, but I also appreciate the expert must adapt his or her expertise to the context of the client’s environment. This takes time and cognitive effort, especially when you need to lay the foundations and start building up to the maturity level the client thinks they are already at!

Friends and peers have been urging me for years to do my own thing – to become a consultant – and while it’s still on my radar, thus far I’ve resisted. The benefits of a steady paycheck aside, I haven’t so much feared not knowing everything as not knowing enough.

Client needs are so diverse, it puts the fake-it-til-you-make-it construct into perspective. Perhaps it’s nigh on impossible for an external agent to do anything else?

Besides, I love driving the agenda from within – executing thought leadership, getting hands on, experimenting, starting small and scaling up – to effect positive change.

Yet as day-to-day business for regular employees like me gets ever more insane, must we eventually adopt a similar construct?

I sincerely hope not, but I must temper this view with the realisation that on occasion, I’ve had cause to question my own capability. More often than not, that’s been due to unreasonable expectations or poor job fit; nonetheless, I’ve been proud of my readiness to call out the shortcomings of my own skillset whenever the need has arisen.

This apparent courage, I think, is largely due to my confidence in my ability to learn what needs to be learned. And so this leads me to propose an alternative construct for knowledge workers: figure it out.

Instead of being the expert who knows the solution, be the one who solves the problem. This subtle but powerful shift transforms the objective from a noun to a verb. Solving involves thinking, researching, designing, deploying and evaluating.

When we do this to build upon what we already know, any sense of imposture is lost.

Gift horses

16 July 2018

“If I had asked people what they wanted, they would have said faster horses.”

I’m fascinated by this quote that Henry Ford may or may not have uttered.

In The best of both worlds I promoted Design Thinking as a means of using customer insights to inform strategic decision making. However, as the above quote suggests, customers don’t know what they don’t know. Sometimes it takes an expert to show them.

In an era in which the very existence of the L&D department is attracting ever more scrutiny, the role of the “expert” in our context is becoming increasingly pertinent. I have long been of the opinion that L&D professionals should dispense with being the SME of what is being trained, and instead be the SME of how it’s being trained.

Under this paradigm, we are the experts in the science and practice of learning and development, and we consult the business accordingly.

This resonates with me because beyond the education and research I invest in myself, I’ve been around the block a few times. I have a strong idea of what will work, not only because I’ve read up on it and thought deeply about it, but also because I’ve seen it play out with my own eyes.

I also get paid to focus on my portfolio every day. I consider it not only my mandate, but an ethical obligation, to originate and innovate.

A horse in a pasture

So I’m more than comfortable with L&D professionals pushing the envelope on the basis of knowledge, curiosity, creativity and experience – so long as these activities are put through the Design Thinking cycle too.

By this I mean be confident that your idea is a sound one, but not so arrogant as to instil it with blind faith. Put the fruit of your one-man (in my case) ideation to your customers to check it will work for them. While you’re at it, confirm the problem statement is indeed one that needs to be solved.

So much for Design Thinking being linear!

Then proceed with prototyping and testing, prior to launching an MVP, and iterating and evolving it.

In this way, the promise of expertise is tempered by an agile approach. It hedges the bet not only by building confidence pre-launch, but also by minimising potential losses post-launch.

Ford Mustang emblem depicting a galloping horse

If Mr Ford had resigned himself to breeding faster horses, he never would have launched the Model T.

In our admirable quest to utilise our customers as a source of innovation, let’s balance that approach by empowering the experts whom we have hired to practise their expertise.

Lest the L&D department be put out to pasture.

The best of both worlds

11 June 2018

There’s no point landing the perfect plane at the wrong airport.

That’s an analogy someone shared with me several years ago to explain Design Thinking, and it has resonated with me ever since for two reasons. Firstly, it exposes the solution-first approach that pervades the corporate sector; and secondly, it challenges our obsession with perfection.

When I look across the business landscape, I’m continually surprised by the decisions that some companies make on behalf of their customers, without those decisions being informed by said customers. It’s more prevalent than you might think. We humans are beset by bias, prejudice, arrogance and self-importance. We make assumptions and just know what is best for others. So we launch blind. No wonder so many initiatives fail.

Likewise I am continually surprised by the great lengths to which some companies go to ensure their product is flawless. All that time spent prior to launch represents time out of the market. And all those eggs put into the one basket means if it fails, it fails hard.

Empathize, Define, Ideate, Prototype, Test

Design Thinking promises to overcome these problems by recasting the customer as the source of innovation rather than merely the recipient. Moreover, it’s agile – in the sense that it combines speed to market with continuous improvement.

Perhaps the most widely recognised variant of Design Thinking is the 5-stage framework espoused by Stanford University’s d.school. I won’t bother delving into its details when countless others have already done so. Suffice to say it involves empathising with your customers to find out what they really need; using those insights to define the problem you’ll solve for them; generating ideas for a potential solution; prototyping and testing (and modifying) the solution; prior to launching a minimum viable product (MVP).

Design Thinking is an iterative process, with an emphasis on cycles of learning: informing your decisions with intelligence; trying them out; failing fast; failing cheap; adapting; approaching ever closer to designing the right thing, and designing it right, to maximise its probability of success.

And it doesn’t end at launch. The MVP is a starting point, not an end point. In the heat of the market, the cycle of learning continues, and so the product evolves.

Design Thinking is at the intersection of evidence and delivery

Of course Design Thinking has no shortage of detractors. One commentator likens it to syphilis (!) while others are even more offensive, calling it linear.

Much of the disdain appears to stem from the evangelism practised by fanbois who worship the idol of Design Thinking, the healer of all ills (including, no doubt, syphilis).

I also find the language of the protagonists sometimes misleading; for example, IDEO – the proponent of Human Centered Design, Design Thinking’s alter ego – claims “you’ll know that your solution will be a success because you’ve kept the very people you’re looking to serve at the heart of the process”. I know what they’re getting at, and I agree with the sentiment, but anyone with a freshman’s appreciation of statistics understands you can’t possibly know an outcome based on a sample. The best you can do is infer; or in layman’s terms, increase your confidence.

Nonetheless, I’m prepared to see past the breathless zeal and call myself an advocate of Design Thinking. Why? Because I consider it the best of both worlds: it’s evidence based, and it delivers.

Do your homework to check you’ll add real value, but get on with it and start adding that value now.

Over time, the value will grow.

The foundations of innovation in L&D

14 May 2018

There are two sides of the innovation coin in corporate learning & development: technology and pedagogy.

The former is rather obvious and is often conflated with the term innovation. Futuristic hardware and magical software that educates everyone at the press of a button are tempting “solutions”. Some folks call this mindset Shiny New Toy Syndrome, and by golly, it’s a pandemic.

The latter is less obvious because it involves thinking, and I’m not being facetious when I say that thinking is hard. Traditional ways of learning in the workplace are, by definition, ingrained in the psyche of the vast majority of the workforce. Changing the concept of how we learn and redefining how we can help people do it better involve shifting the organisation’s culture, and that is a challenge greater than any IT implementation.

I see technology as an enabler of the pedagogical outcome, rather than it being the outcome per se. And just as we must learn to walk before we can run, so too must an organisation lay the foundations of innovation before it can reach for the stars. Though not as sexy as their more tweeted-about alternatives, these foundations are the building blocks of long-term efficiency, flexibility and creativity.

So what are the foundations of innovation in L&D?

I will hereby attempt to answer this question by looking through the lens of the 70:20:10 model. Whereas previously I have advocated this approach when designing a solution for a specific learning objective, this time I’m elevating the approach to the strategic level, with a view to designing a future-proofed solution for all the organisation’s learning objectives.

The Foundations of Innovation in L&D: content library, knowledge base, enterprise social network, and performance-oriented training

The 70

From the get-go, a false idol that must fall is the belief that the role of the L&D department is to create all the training to meet the organisation’s learning needs. These needs are so diverse within and across all the different job roles that the task is an almost comical impossibility.

Moreover, a large proportion of these needs is generic; despite what many organisations think, they’re not that special. Analytics is analytics. Decision making is decision making. Difficult conversations are difficult conversations. The nature of such content is universal.

So my first building block is a third-party content library. There are many players in this space, and sure it makes sense to pick one that matches your organisation’s profile, but their pedagogical purpose is the same: to provide your people with immediate access to an extensive suite of learning assets, covering a broad range of topics, on demand. Such a resource empowers self-directed learning which, in the language of 70:20:10, can be done on the job, just in time.

Another false idol to fall is the myth that all the information we need is at our fingertips. Clearly, not all our needs are generic. The organisation is special in the sense that it has its own products, processes, systems, policies, etc., which a third party will never cover.

So my second building block is an in-house knowledge base. Whether the underlying technology is an intranet, CMS or wiki, again the pedagogical purpose is the same: to provide your people with on-demand access to bespoke content that improves performance.

The 20

Despite the best intentions of a content library and a knowledge base, they will never meet every conceivable learning need. An enterprise social network covers the “in-betweens”, principally by empowering everyone to ask their own questions to the crowd, and to keep abreast of emergent knowledge in the moment.

The 10

The building blocks in the 70 and the 20 spearhead an informal first approach to learning and development which lifts a mountain of weight off the shoulders of the L&D team. Freed from the burden of training everything, we can now focus our attention on what should be trained.

Furthermore, these building blocks enable change in the nature of the training. With the bulk of the content hosted elsewhere, it doesn’t need to be shovelled into the course. The class can be flipped, the narrative pared back to its key messages, and a scenario-based design adopted to train not the content, but its application.

In this way, the training becomes performance oriented.

A man working on a house frame

By no means do these building blocks exhaust the 70:20:10 model, nor do they represent the extent of innovation in L&D. Rather, they form the bedrock of further innovation.

For example:

  • User-generated content has a home, not only where it can be housed, but also where it can be governed.

  • Blended learning goes beyond pre-work online modules by integrating social activity and ongoing performance support.

  • Corporate MOOCs have a delivery vehicle.

  • Micro-learning and micro-assessments have a rich source of reference content to which remedial feedback can link.

  • If the content library, knowledge base and ESN are mobile accessible, they support mobile learning.

  • Any reduction in training volume creates more space to explore emerging technologies such as AI, VR and AR.

  • An orderly, structured L&D service offering provides the basis for a proper consideration of the value that a next-generation learning management system may add (or not).

So while I remain an advocate of ad hoc innovation, I see it as a necessity in the absence of a plan. My preference is a much more strategic approach, bedding down what matters most to meet the immediate needs of the business, prior to building additional innovative initiatives that stand firmly on that foundation.

In this way, not only do we innovate now, but we have a platform for innovating into the future.

E-Learning conferences in Australia in 2018

2 January 2018

Well 2018 is shaping up to be a strong year of professional development opportunities for e-learning professionals down under.

A good spread of conferences is already scheduled along the Eastern Seaboard, while more will inevitably pop up (perhaps further west?) as the year progresses.

The following list is organic, so keep an eye on it!

Baywalk Bollards representing lifesavers in Geelong, Australia.

International Conference on E-Learning and Distance Learning
Sydney, 29-30 January 2018

International Conference on Distance Education and Virtual Learning
Melbourne, 1-2 February 2018

The Art of Knowledge Management, Learning & Communication
Melbourne, 22 February 2018

International Conference on Virtual and Augmented Reality Simulations
Brisbane, 24-26 February 2018

Chief Digital Officer Summit
Sydney, 26 February – 3 March 2018

iDESIGNX
Sydney, 27 February 2018

Learning Analytics and Knowledge Conference
Sydney, 5-9 March 2018

National FutureSchools Expo and Conferences
Melbourne, 21-22 March 2018

Digital Disruption
Sydney, 26-27 March 2018

L&D Leadership Innovation Summit
Sydney, 26-29 March 2018

Strategy & Innovation World Forum
Sydney, 9-10 May 2018

Online & e-Learning Summit
Melbourne, 15-16 May 2018

CeBIT Australia
Sydney, 15-17 May 2018

Totara User Conference – Asia Pacific
Sydney, 5-6 June 2018

AITD National Conference
Sydney, 7-8 June 2018

The Future of Learning Conference
Brisbane, 21-22 June 2018

Learning While Working
Virtual, 17-18 July 2018

Leading a Digital School Conference
Sunshine Coast, 16-18 August 2018

Blackboard Teaching & Learning Conference ANZ
Brisbane, 28-31 August 2018

K-12 Digital Classroom Practice Conference
Melbourne, 31 August 2018

FlipCon Australia
Melbourne, 14-15 September 2018

MoodleMoot Australia
Brisbane, 24–26 September 2018

Enterprise Learning Confluence
Sydney, 26 September 2018

Eportfolio Forum
Brisbane, 3-4 October 2018

Australian Council for Computers in Education Conference
Sydney, 2-5 October 2018

L&D Innovation & Tech Fest
Sydney, 29-30 October 2018

LearnX
Melbourne, 30 October 2018

National Future Work Summit
Sydney, 30 October 2018

ASCILITE
Geelong, 25-28 November 2018

Lanyard

If you are aware of another e-learning related conference down under this year, please let me know and I’ll add it to the list.

7 tips for custodians of capability frameworks

18 September 2017

Wow, my previous blog post elicited some rich comments from my peers in the L&D profession.

Reframing the capability framework was my first foray into publishing my thoughts on the subject, in which I argued in favour of using the oft-ignored resource as a tool to be proactive and add value to the business.

To everyone who contributed a comment, not only via my blog but also on Twitter and LinkedIn… thank you. Your insights have helped me shape my subsequent thoughts about capability frameworks and their implementation in an organisation.

I will now articulate these thoughts in the tried and tested form of a listicle.

Metallic blue building blocks, two golden.

If you are building, launching or managing your organisation’s capabilities, I invite you to consider my 7 tips for custodians of capability frameworks…

1. Leverage like a banker.

At the organisational level, the capabilities that drive success are strikingly similar across companies, sectors and industries. Unless you have incredibly unique needs, you probably don’t need to build a bespoke capability framework from the ground up.

Instead, consider buying a box set of capabilities from the experts in this sort of thing, or draw inspiration *ahem* from someone else who has shared theirs. (Hint: Search for a “leadership” capability framework.)

2. Refine like a sculptor.

No framework will perfectly model your organisation’s needs from the get-go.

Tweak the capabilities to better match the nature of the business, its values and its goals.

3. Release the dove.

I’ve witnessed a capability framework go through literally years of wordsmithing prior to launch, in spite of rapidly diminishing returns.

Lexical squabbles are a poor substitute for action. So be agile: launch the not-yet-finished-but-still-quite-useful framework (your MVP) now.

Then continuously improve it.

4. Evolve or die.

Consider your capability framework an organic document. It is never finished.

As the needs of the business change, so too must your people’s capabilities to remain relevant.

5. Sing from the same song sheet.

Apply the same capabilities to everyone across the organisation.

While technical capabilities will necessarily be different for the myriad job roles throughout your business, the organisational capabilities should be representative of the whole organisation’s commitment to performance.

For example, while Customer Focus is obviously relevant to the contact centre operator, is it any less so for the CEO? Conversely, while Innovation is obviously relevant to the CEO, is it any less so for the contact centre operator?

Having said that, the nature of a capability will necessarily be different across levels or leadership stages. For example, while the Customer Focus I and Innovation I capabilities that apply to the contact centre operator will be thematically similar to Customer Focus V and Innovation V that apply to the CEO, their pitches will differ in relation to their respective contexts.

6. Focus like an eagle.

Frameworks that comprise dozens of capabilities are unwieldy, overwhelming, and ultimately useless.

Not only do I suggest your framework comprise fewer rather than more capabilities, but also that one or two be earmarked for special attention. These should align to the strategic imperatives of the business.

7. Use it or lose it.

A capability framework that remains unused is merely a bunch of words.

In my next blog post I will examine ways in which it can be used to add value at each stage of the employee lifecycle.