In Roses are red, I proposed definitions for oft-used yet ambiguous terms such as “competency” and “capability”.
Not only did I suggest a competency be considered a task, but also that its measurement be binary: competent or not yet competent.
As a more general construct, a capability is not so readily measured in a binary fashion. For instance, the question is unlikely to be whether you can analyse data, but the degree to which you can do so. Hence capabilities are preferably measured via a proficiency scale.
Of course numerous proficiency scales exist. For example:
The NIH Proficiency Scale comprises Not Applicable, Fundamental Awareness (basic knowledge), Novice (limited experience), Intermediate (practical application), Advanced (applied theory) and Expert (recognized authority).
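The binary/scaled distinction can be sketched in code. This is an illustrative model only: the levels follow the NIH scale quoted above, and the function names are my own invention.

```python
from enum import IntEnum

# Proficiency levels modelled on the NIH scale quoted above.
class Proficiency(IntEnum):
    NOT_APPLICABLE = 0
    FUNDAMENTAL_AWARENESS = 1
    NOVICE = 2
    INTERMEDIATE = 3
    ADVANCED = 4
    EXPERT = 5

def assess_competency(performed_successfully: bool) -> str:
    """A competency is binary: competent or not yet competent."""
    return "competent" if performed_successfully else "not yet competent"

def meets_requirement(actual: Proficiency, required: Proficiency) -> bool:
    """A capability is measured by degree, so levels compare numerically."""
    return actual >= required
```

Because the scale is ordinal, a capability question becomes "at least Intermediate?", whereas the competency question stays a yes/no.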
It seems like overnight the L&D profession has started to struggle with the definition of terms such as “capability”, “competency” and “skill”.
Some of our peers consider them synonyms – and hence interchangeable – but I do not.
Indeed I recognise subtle but powerful distinctions among them, so here’s my two cents’ worth to try to cut through the confusion.
From the get-go, the terms may be most clearly distinguished when we consider a competency a task. It is something that is performed.
Our friends in vocational education have already figured this out. For example, if we refer to the Tap furnaces unit of competency documented by the Australian Department of Education, Skills and Employment, we see elements such as Plan and prepare for furnace tapping and Tap molten metal from furnace.
Importantly, we also see performance criteria, evidence and assessment conditions. Meeting a competency is therefore binary: either you can perform the task successfully (you are “competent”) or you cannot (in the positive parlance of educationalists, you are “not yet competent”).
Given a competency is a task, a capability is a personal attribute you draw upon to perform it.
An attribute may be knowledge (something you know, eg tax law), a skill (something you can do, eg speak Japanese), or a mindset (a state of being, eg agile).
I consider capability an umbrella term for all these attributes; they combine with one another to enable the behaviour that meets the competency.
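These definitions can be sketched as a simple data model. The class and attribute names below are mine, not drawn from any standard framework, and the furnace example reuses the unit of competency mentioned above.

```python
from dataclasses import dataclass, field

@dataclass
class Capability:
    """A personal attribute: knowledge, a skill, or a mindset."""
    name: str
    kind: str  # "knowledge", "skill" or "mindset"

@dataclass
class Competency:
    """A task that is performed, drawing upon capabilities."""
    task: str
    draws_upon: list = field(default_factory=list)

# Illustrative only: the capabilities listed are my own guesses.
tap_furnace = Competency(
    task="Tap molten metal from furnace",
    draws_upon=[
        Capability("Furnace safety procedures", "knowledge"),
        Capability("Operate tapping equipment", "skill"),
        Capability("Safety-conscious", "mindset"),
    ],
)
```

The point of the model is the direction of the relationship: capabilities combine to enable the behaviour that meets the competency, not the other way around.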
According to the definitions I’ve outlined above, we frequently see in the workplace that “capability frameworks” are mislabelled “competency frameworks” and vice versa.
Terms such as Decision Making and Data Analysis are capabilities – not competencies – and moreover they are skills. Hence, not only would I prefer they be referred to as such, but also that they adopt an active voice (Make Decisions, Analyse Data).
I also suggest they be complemented by knowledge and mindsets, otherwise the collection isn’t so much a capability framework as a “skills framework”; which is fine, but self-limiting.
I have previously argued in favour of the L&D team deploying a capability framework as a strategic imperative, but now the question that begs to be asked is: should we deploy a capability framework or a competency framework?
My typical answer to a false dichotomy like this is both.
Since capabilities represent a higher level of abstraction, they are scalable across the whole organisation and are transferable from role to role and gig to gig. They also tend to be generic, which means they can be procured in bulk from a third party, and their low volatility makes them sustainable. The value they offer is a no-brainer.
In contrast, competencies are granular. They’re bespoke creations specific to particular roles, which makes them laborious to build and demanding to maintain. Having said that, their level of personalised value is sky high, so I advise they be deployed where they are warranted – targeting popular roles and pivotal roles, for example.
A rose by any other name would smell as sweet.
Yet a rose is not a violet.
In a similar manner I maintain that capabilities and competencies are, by definition, different.
In any case, if we neglect them, the next term we’ll struggle to define is “service offering”.
This consolidation I now share with you in the form of my Top 5 benefits of open badges for corporates.
1. Open badges can motivate employees to learn.
Badges are widely perceived as being childish, yet there is no denying that the game mechanics that underpin them can work. Some people are incredibly motivated by badges. Once they’ve earned one, they want to earn another.
You will note that I am using weasel words such as “can” and “some”. This is because badges don’t motivate everyone – just ask Foursquare! But my view is if they motivate a significant proportion of your target audience, then that makes them worthwhile.
I consider this an important point because as learning in the corporate sector becomes more informal, the employee’s motivation to drive their own development will become increasingly pivotal to their performance, and hence to the performance of the organisation as a whole.
2. Open badges can credential in-house training.
Yes, corporates can print off certificates of completion for employees who undertake their in-house training offerings, only for them to be pinned to a workstation or hidden in a drawer.
And yes, corporates typically track and record completion statuses in their LMS, but that lacks visibility for pretty much everyone but the employee him- or herself.
In contrast, open badges are the epitome of visibility. They’re shiny and colourful, the employee can collect them in their online backpack, and they can be shown off via a plugin on a website or blog – or intranet profile.
Badges therefore give corporates the opportunity to recognise the employees who have completed their in-house training, within an enterprise-wide framework.
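For the technically curious, an open badge is essentially a piece of verifiable metadata. Here is a simplified assertion, loosely modelled on the Open Badges specification; field names vary between versions, and all identities and URLs below are hypothetical.

```python
import json

# A simplified badge assertion. The recipient, badge URL and
# verification URL are all hypothetical examples.
assertion = {
    "recipient": {
        "identity": "jane.citizen@bank-a.example",
        "type": "email",
        "hashed": False,
    },
    "badge": "https://bank-a.example/badges/aml-fundamentals.json",
    "issuedOn": "2024-01-15",
    "verify": {
        "type": "hosted",
        "url": "https://bank-a.example/assertions/12345.json",
    },
}

print(json.dumps(assertion, indent=2))
```

Because the assertion is hosted at a stable URL, anyone (a backpack, a blog plugin, an intranet profile) can fetch and verify it, which is what makes the visibility enterprise-wide rather than LMS-bound.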
3. Open badges are portable.
Currently, if you undertake training at one organisation and then leave to join another, you leave your completion records behind. However, if badges were earned through that training, their openness and centralisation in the cloud mean that you can continue to “wear” them when you move to your next employer.
This portability of open badges would be enhanced if third parties were also able to endorse the training. So an APRA-endorsed badge earned at Bank A, for example, would be meaningful to my next employer, Bank B, because this bank is also regulated by APRA.
Still, the concept holds without third-party endorsement; that is to say, much of the training provided by Bank A would probably still be meaningful to Bank B – because Bank A and Bank B do very similar things.
4. Open badges are task oriented.
Despite my talk of “training” thus far, open badges are in fact task oriented. That means they recognise the execution of specific actions, and hence the mastery of skills.
I love this aspect of open badges because it means they don’t promise that you can do a particular task, but rather demonstrate that you have already done it.
That gives employers confidence in your capability to perform on the job.
5. Open badges can formally recognise informal learning.
I have argued previously that in the modern workplace, we should informalise learning and formalise assessment.
My rationale is that the vast majority of learning in the workplace is informal anyway. Employees learn in all kinds of ways – from reading a newsfeed or watching a video clip, to playing with new software or chatting with colleagues over lunch.
The question is how to manage all of that learning. The answer is you don’t.
If a particular competency is important to the business, you assess it. Assessment represents the sum of all the learning that the employee has undertaken in relation to that competency, regardless of where, when or how it was done.
I see open badges as micro-assessments of specific tasks. If you execute a task according to the pre-defined criteria (whatever they may be), then you earn its badge. In this way, the badge represents the sum of all the learning that you have undertaken to perform the task successfully, regardless of where, when or how that learning was done.
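A badge as a micro-assessment could be sketched as follows; the criteria and the awarding rule are placeholders of my own invention, reusing the furnace-tapping example.

```python
def earn_badge(evidence: dict, criteria: list) -> bool:
    """Earn the badge only when every pre-defined criterion is evidenced.

    How the evidence was acquired (reading, video, conversation)
    is irrelevant; only the demonstrated performance counts.
    """
    return all(evidence.get(criterion, False) for criterion in criteria)

# Hypothetical criteria and evidence for the furnace-tapping example.
criteria = ["plan and prepare", "tap molten metal", "follow safety procedures"]
evidence = {criterion: True for criterion in criteria}
```

Note that the function never asks where the learning came from, which is exactly the "formalise assessment, informalise learning" principle in miniature.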
This is my blog, so of course all of the above assertions are the product of my own opinion. Naturally, I believe it to be an opinion informed by experience.
People familiar with my blog will know that I’m not a member of the anti-LMS brigade.
On the contrary, I think a Learning Management System is a valuable piece of educational technology – particularly in large organisations. It is indispensable for managing registrations, deploying e-learning, marking grades, recording completion statuses, centralising performance agreements and documenting performance appraisals.
In other words – and the name gives it away – an LMS is useful for managing learning.
Yet while LMSs are widely used in the corporate sector, I suspect they are not being used to their full potential. You see, when most people think of an LMS, they think of formal learning. I don’t.
I think of informal learning. I think of the vast majority of knowledge that is acquired outside of the classroom. I think of the plethora of skills that are developed away from the cubicle. I think of reading a newspaper and chatting around the water cooler, and the myriad of other ways that people learn stuff. Relevant stuff. Stuff that actually makes a difference to their performance.
And I wonder how we can acknowledge all of that learning. We can hardly stick the newspaper or the water cooler into the LMS, although many will try in vain.
No – the way we can acknowledge informal learning is via assessment. Assessment represents the sum of learning in relation to a domain, regardless of where, when or how that learning was done.
The assessment need not be a multiple-choice quiz (although I am not necessarily against such a device), nor need it be online. The LMS only needs to manage it. And by that I mean record the learner’s score, assign a pass or fail status, and impart a competency at a particular proficiency.
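The three management steps just listed could be sketched as a minimal record. The pass mark and the score-to-proficiency mapping are hypothetical thresholds of my own invention.

```python
from dataclasses import dataclass

PASS_MARK = 80  # hypothetical threshold

@dataclass
class AssessmentRecord:
    learner: str
    competency: str
    score: int  # the learner's recorded score

    def status(self) -> str:
        """Assign a pass or fail status from the score."""
        return "pass" if self.score >= PASS_MARK else "fail"

    def proficiency(self) -> str:
        """Impart a competency at a particular proficiency (illustrative mapping)."""
        if self.score >= 95:
            return "Expert"
        if self.score >= PASS_MARK:
            return "Advanced"
        return "Not yet competent"

record = AssessmentRecord("jane.citizen", "Analyse Data", 87)
```

The record says nothing about which course, book or conversation produced the score, which is the point: the outcome is managed, not the activity.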
In this way, the purpose of learning shifts from activity to outcome.
Having said that, the LMS suffers a big problem: portability.
I’m not referring to the content. We have SCORM to ensure our courses are compatible with different systems. Although, if you think migrating SCORM-compliant content from one LMS to another is problem free, I have an opera house to sell you. It has pointy white sails and a great view of the harbour.
No – I’m referring to the learner’s training records. That’s the whole point of the LMS, but they’re locked in there. Sure, if the organisation transfers from one LMS to another, it can migrate the data while spending a tonne of money and shedding blood, sweat and tears in the process.
But worse, if the learner leaves the organisation to join another, they also leave their training records behind. Haha… we don’t care if you complied with the same regulations at your last organisation. Or that you were working with the same types of products. Or that you were using the same computer system. We’re going to make you do your training all over again. Sucker.
It’s hardly learner-centered, and it sure as hell ain’t a smart way of doing business.
Enter Tin Can
According to my understanding, Tin Can is designed to overcome the problem of training record portability. I imagine everyone having a big tin can in the cloud, connected to the interwebs. When I complete a course at Organisation A, my record is stored in my tin can. When I leave Organisation A for a better job at Organisation B, no worries because I’ve still got my tin can. It’s mine, sitting in the sky, keeping all my training records accessible.
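Under the hood, Tin Can (now better known as the Experience API, or xAPI) records each learning experience as an actor-verb-object statement. A minimal example follows; the learner and course identifiers are hypothetical, though the verb URI follows the ADL convention.

```python
import json

# A minimal xAPI ("Tin Can") statement: actor - verb - object.
# The learner and course identifiers are hypothetical examples.
statement = {
    "actor": {
        "name": "Jane Citizen",
        "mbox": "mailto:jane.citizen@org-a.example",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://org-a.example/courses/induction",
        "definition": {"name": {"en-US": "Induction"}},
    },
}

print(json.dumps(statement))
```

Because the statement lives in the learner's own record store rather than inside Organisation A's LMS, it travels with the learner to Organisation B.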
This idea has taken the education world by storm, and some LMSs such as UpsideLMS have already integrated the API into their proprietary architecture.
Furthermore, I can update my tin can manually. For example, if I read a newspaper article or have an enlightening conversation with someone around the water cooler, I can log into my account and record it.
This sounds admirable prima facie, but for me it raises a couple of concerns. Firstly, the system relies on the learner’s honour (!), but more concerningly, its philosophy reverts back to activity over outcome. Recording reams and reams of minor learning interactions seems a bit pointless to me.
So where to from here?
Plurality is a brilliant short film watched by the participants in Week 2 of The University of Edinburgh’s E-learning and Digital Cultures course.
The film paints a dystopian vision of the future in which everyone’s personal details are stored in an online grid, controlled of course by the government. When you swipe your finger over a scanner, the computer reads your DNA and identifies you. This is convenient for automatically deducting the cost of a sandwich from your bank account, or unlocking your car, but not so convenient when you are on the run from the cops and they can track you through everything you touch.
Despite the Big Brother message pushed by the film, it prompted me to recognise an emerging opportunity for Tin Can if it were to re-align its focus on assessment and exploit the Internet of Things.
Suppose for example you are sitting in a jumbo jet waiting to take off to London or New York. If the cockpit had a scanner that required the pilot to swipe his finger, the computer could check his tin can to confirm he has acquired the relevant competencies at the required proficiencies before activating the engine.
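The check described here amounts to a simple lookup against the pilot's records. A hypothetical sketch, with competency names and required proficiencies invented for illustration:

```python
# Hypothetical records retrieved from the pilot's "tin can" in the cloud.
pilot_records = {
    "Fly commercial jet": "Expert",
    "Emergency procedures": "Advanced",
}

# Hypothetical requirements: each competency must be held at one of
# the listed proficiencies before the engine may be activated.
REQUIRED = {
    "Fly commercial jet": ("Advanced", "Expert"),
    "Emergency procedures": ("Advanced", "Expert"),
}

def may_activate_engine(records: dict) -> bool:
    """Activate only if every required competency is held at a required proficiency."""
    return all(
        records.get(competency) in levels
        for competency, levels in REQUIRED.items()
    )
```

The same lookup works for the financial advisor scenario below; only the record store and the required competencies change.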
Or suppose you are meeting a financial advisor. With a portable scanner, you could check that she has been keeping up with the continuing education points required by the relevant accreditation agency.
Competencies and assessment tend to cop a beating in the academic sphere, but in the real world you want to be reasonably confident that your pilot can fly a plane and your financial advisor knows what she’s talking about.
If the film’s portrayal of DNA is too far-fetched, it need not be the mechanism. For example, the pilot could key in his personal credentials, or you could key in the financial advisor’s agency code.
But maybe it’s not so far-fetched after all. The Consortium for the Barcode of Life – based at the Smithsonian Institution’s National Museum of Natural History, no less – is currently researching DNA barcoding.
And still, maybe Plurality is looking at it the wrong way around. We can already store digital information in synthetic DNA. Perhaps in the not-too-distant future our training records will be coded into our natural DNA and injected back into our bodies. Then instead of the scanner referring to your tin can in the cloud, it mines your data right there in your genes.