Posted tagged ‘assessment’

Top 5 benefits of open badges for corporates

17 July 2013

I’ve been blogging a lot about open badges lately. That really means I’ve been thinking a lot about open badges lately, as I use my blog as a sense-making platform.

Through my blogging, combined with the insightful discussions following both Badges of honour and The past tense of open badges, I have been able to consolidate my thoughts somewhat.

This consolidation I now share with you in the form of my Top 5 benefits of open badges for corporates.

Carrot badge

1. Open badges can motivate employees to learn.

Badges are widely perceived as being childish, yet there is no denying that the game mechanics that underpin them can work. Some people are incredibly motivated by badges. Once they’ve earned one, they want to earn another.

You will note that I am using weasel words such as “can” and “some”. This is because badges don’t motivate everyone – just ask Foursquare! But my view is if they motivate a significant proportion of your target audience, then that makes them worthwhile.

I consider this an important point because as learning in the corporate sector becomes more informal, the employee’s motivation to drive their own development will become increasingly pivotal to their performance, and hence to the performance of the organisation as a whole.

Credential badge

2. Open badges can credential in-house training.

Yes, corporates can print off certificates of completion for employees who undertake their in-house training offerings, only for them to be pinned to a workstation or hidden in a drawer.

And yes, corporates typically track and record completion statuses in their LMS, but that lacks visibility for pretty much everyone but the employee him- or herself.

In contrast, open badges are the epitome of visibility. They’re shiny and colourful, the employee can collect them in their online backpack, and they can be shown off via a plugin on a website or blog – or intranet profile.

Badges therefore give corporates the opportunity to recognise the employees who have completed their in-house training, within an enterprise-wide framework.
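Under the shininess, an open badge is just hosted data. As a rough sketch of what a hosted Open Badges (1.x) assertion looks like – all URLs and identifiers below are invented for illustration:

```python
import json

# Roughly the shape of an Open Badges (1.x) hosted assertion.
# Every URL and identifier here is made up for illustration only.
assertion = {
    "uid": "abc-123",
    "recipient": {"type": "email", "hashed": False,
                  "identity": "employee@bank-a.example.com"},
    # "badge" points at the BadgeClass describing what was earned
    "badge": "https://badges.bank-a.example.com/aml-essentials.json",
    "issuedOn": 1374019200,  # Unix timestamp
    "verify": {"type": "hosted",
               "url": "https://badges.bank-a.example.com/assertions/abc-123.json"},
}

blob = json.dumps(assertion)  # what a backpack or display plugin would consume
```

Because the assertion is simply hosted JSON, any site – a blog, a website or an intranet profile – can fetch, verify and render it.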

Portable badge

3. Open badges are portable.

Currently, if you undertake training at one organisation and then leave to join another, you leave your completion records behind. However, if badges were earned through that training, their openness and centralisation in the cloud means that you can continue to “wear” them when you move to your next employer.

This portability of open badges would be enhanced if third parties were also able to endorse the training. So an APRA-endorsed badge earned at Bank A, for example, would be meaningful to my next employer, Bank B, because this bank is also regulated by APRA.

Still, the concept holds without third-party endorsement; that is to say, much of the training provided by Bank A would probably still be meaningful to Bank B – because Bank A and Bank B do very similar things.

Task-oriented badge

4. Open badges are task oriented.

Despite my talk of “training” thus far, open badges are in fact task oriented. That means they recognise the execution of specific actions, and hence the mastery of skills.

I love this aspect of open badges because it means they don’t promise that you can do a particular task, but rather demonstrate that you have already done it.

That gives employers confidence in your capability to perform on the job.

Assessment badge

5. Open badges can formally recognise informal learning.

I have argued previously that in the modern workplace, we should informalise learning and formalise assessment.

My rationale is that the vast majority of learning in the workplace is informal anyway. Employees learn in all kinds of ways – from reading a newsfeed or watching a video clip, to playing with new software or chatting with colleagues over lunch.

The question is how to manage all of that learning. The answer is you don’t.

If a particular competency is important to the business, you assess it. Assessment represents the sum of all the learning that the employee has undertaken in relation to that competency, regardless of where, when or how it was done.

I see open badges as micro-assessments of specific tasks. If you execute a task according to the pre-defined criteria (whatever that may be), then you earn its badge. In this way, the badge represents the sum of all the learning that you have undertaken to perform the task successfully, regardless of where, when or how that learning was done.
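To make the micro-assessment idea concrete, here is a minimal sketch of criteria-based badge awarding. The function name, the task and the criteria are all invented for illustration; they come from no badge specification.

```python
# Hypothetical sketch of a badge as a micro-assessment: the badge is earned
# only when every pre-defined criterion for the task is satisfied.
# All names and criteria here are invented for illustration.

def award_badge(badge_name, criteria, evidence):
    """Return a badge record if every criterion passes, else None."""
    unmet = [name for name, check in criteria.items() if not check(evidence)]
    if unmet:
        return None  # the task was not executed to standard; no badge yet
    return {"badge": badge_name, "evidence": evidence}

# Example micro-assessment for a customer-call task
criteria = {
    "greeting_used": lambda e: e.get("greeting", False),
    "issue_resolved": lambda e: e.get("resolved", False),
    "call_under_10_min": lambda e: e.get("duration_min", 99) <= 10,
}

earned = award_badge("Customer Call Handling", criteria,
                     {"greeting": True, "resolved": True, "duration_min": 7})
```

The badge either attests that you did the task to standard, or it is not issued at all – there is no partial credit for "attending the course".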

Opinion badge

This is my blog, so of course all of the above assertions are the product of my own opinion. Naturally, I believe it to be an opinion informed by experience.

Other people have different opinions – some concordant, some contrary, as the comments under Badges of honour and The past tense of open badges will attest.

So, I’m curious… what’s your opinion?

The future of learning management

11 February 2013

People familiar with my blog will know that I’m not a member of the anti-LMS brigade.

On the contrary, I think a Learning Management System is a valuable piece of educational technology – particularly in large organisations. It is indispensable for managing registrations, deploying e-learning, marking grades, recording completion statuses, centralising performance agreements and documenting performance appraisals.

In other words – and the name gives it away – an LMS is useful for managing learning.

Yet while LMSs are widely used in the corporate sector, I suspect they are not being used to their full potential. You see, when most people think of an LMS, they think of formal learning. I don’t.

I think of informal learning. I think of the vast majority of knowledge that is acquired outside of the classroom. I think of the plethora of skills that are developed away from the cubicle. I think of reading a newspaper and chatting around the water cooler, and the myriad of other ways that people learn stuff. Relevant stuff. Stuff that actually makes a difference to their performance.

And I wonder how we can acknowledge all of that learning. We can hardly stick the newspaper or the water cooler into the LMS, although many will try in vain.

No – the way we can acknowledge informal learning is via assessment. Assessment represents the sum of learning in relation to a domain, regardless of where, when or how that learning was done.

The assessment need not be a multiple-choice quiz (although I am not necessarily against such a device), nor need it be online. The LMS only needs to manage it. And by that I mean record the learner’s score, assign a pass or fail status, and impart a competency at a particular proficiency.
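In data terms, "managing" the assessment is a small job. A hypothetical sketch – the pass mark, the proficiency bands and the function name are all invented, not drawn from any real LMS:

```python
# Illustrative sketch, not a real LMS API: record an assessment score and
# derive a pass/fail status and a competency at a proficiency level.
# The pass mark and proficiency bands below are assumed for illustration.

PASS_MARK = 80

def record_result(transcript, learner, competency, score):
    status = "pass" if score >= PASS_MARK else "fail"
    proficiency = ("advanced" if score >= 90 else
                   "proficient" if score >= PASS_MARK else
                   "developing")
    transcript.setdefault(learner, {})[competency] = {
        "score": score, "status": status, "proficiency": proficiency,
    }
    return transcript[learner][competency]

transcript = {}
result = record_result(transcript, "jsmith", "Privacy Compliance", 92)
```

Note that nothing here cares where or how the learning happened – only the demonstrated outcome is recorded.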

In this way, the purpose of learning shifts from activity to outcome.

Wheelbarrow

Having said that, the LMS suffers a big problem: portability.

I’m not referring to the content. We have SCORM to ensure our courses are compatible with different systems. Although, if you think migrating SCORM-compliant content from one LMS to another is problem free, I have an opera house to sell you. It has pointy white sails and a great view of the harbour.

No – I’m referring to the learner’s training records. That’s the whole point of the LMS, but they’re locked in there. Sure, if the organisation transfers from one LMS to another, it can migrate the data while spending a tonne of money and shedding blood, sweat and tears in the process.

But worse, if the learner leaves the organisation to join another, they also leave their training records behind. Haha… we don’t care if you complied with the same regulations at your last organisation. Or that you were working with the same types of products. Or that you were using the same computer system. We’re going to make you do your training all over again. Sucker.

It’s hardly learner-centered, and it sure as hell ain’t a smart way of doing business.

Enter Tin Can.

Tin can in the cloud

According to my understanding, Tin Can is designed to overcome the problem of training record portability. I imagine everyone having a big tin can in the cloud, connected to the interwebs. When I complete a course at Organisation A, my completion is recorded in my tin can. When I leave Organisation A for a better job at Organisation B, no worries because I’ve still got my tin can. It’s mine, sitting in the sky, keeping all my training records accessible.

This idea has taken the education world by storm, and some LMSs such as UpsideLMS have already integrated the API into their proprietary architecture.

Furthermore, I can update my tin can manually. For example, if I read a newspaper article or have an enlightening conversation with someone around the water cooler, I can log into my account and record it.
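Such a record is just an xAPI ("Tin Can") statement: an actor, a verb and an object. Sketched as a plain Python dict – the actor/verb/object shape follows the xAPI spec, but the email address, activity ID and the choice of verb here are invented for illustration:

```python
import json

# A minimal xAPI ("Tin Can") statement for logging an informal learning
# event. The actor/verb/object structure is per the xAPI spec; the email,
# activity URL and display names below are made up for illustration.
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "A. Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "id": "http://example.com/activities/newspaper-article-123",
        "definition": {"name": {"en-US": "Read a newspaper article"}},
    },
}

payload = json.dumps(statement)  # ready to POST to an LRS statements endpoint
```

The tin can (a Learning Record Store) simply accumulates these statements, one per learning interaction.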

This sounds admirable prima facie, but for me it raises a couple of concerns. Firstly, the system relies on the learner’s honour – ! – but more concerning, its philosophy reverts to activity over outcome. Recording reams and reams of minor learning interactions seems a bit pointless to me.

So where to from here?

Enter Plurality.

Plurality is a brilliant short film watched by the participants in Week 2 of The University of Edinburgh’s E-learning and Digital Cultures course.

The film paints a dystopian vision of the future whereby everyone’s personal details are stored in an online grid, which is controlled of course by the government. When you swipe your finger over a scanner, the computer reads your DNA and identifies you. This is convenient for automatically deducting the cost of a sandwich from your bank account, or unlocking your car, but not so convenient when you are on the run from the cops and they can track you through everything you touch.

Despite the Big Brother message pushed by the film, it prompted me to recognise an emerging opportunity for Tin Can if it were to re-align its focus on assessment and exploit the Internet of Things.

Suppose for example you are sitting in a jumbo jet waiting to take off to London or New York. If the cockpit had a scanner that required the pilot to swipe his finger, the computer could check his tin can to confirm he has acquired the relevant competencies at the required proficiencies before activating the engine.

Or suppose you are meeting a financial advisor. With a portable scanner, you could check that she has been keeping up with the continuing education points required by the relevant accreditation agency.

Competencies and assessment tend to cop a beating in the academic sphere, but in the real world you want to be reasonably confident that your pilot can fly a plane and your financial advisor knows what she’s talking about.

DNA strand

If the film’s portrayal of DNA is too far-fetched, it need not be the mechanism. For example, the pilot could key in his personal credentials, or you could key in the financial advisor’s agency code.

But maybe it’s not so far-fetched after all. The Consortium for the Barcode of Life – based at the Smithsonian Institution’s National Museum of Natural History, no less – is currently researching DNA barcoding.

And still, maybe Plurality is looking at it the wrong way around. We can already store digital information in synthetic DNA. Perhaps in the not-too-distant future our training records will be coded into our natural DNA and injected back into our bodies. Then instead of the scanner referring to your tin can in the cloud, it mines your data right there in your genes.

And you thought science fiction was scary!

The future of MOOCs

26 November 2012

MOOCs get a bad rap. They are dismissed as prescriptive, or teacher-centric, or unsocial, or something else, and it’s become a badge of honour to espouse why you dislike them.

Despite their pedagogical flaws, however, MOOCs provide unprecedented access to quality content for millions of learners.

It’s all very well for Apple-owning, organic-buying professionals to cast aspersions, but consider the girl in Pakistan who’s too scared to set foot in a classroom. Consider the teenager in central Australia whose school has only one teacher. Consider the young woman in Indonesia who can’t afford college. Consider the boy in San Francisco whose maths teacher simply doesn’t teach very well.

Don’t all these people deserve a better education? And isn’t content sourced from some of the world’s best providers a giant leap in that direction?

Sure, the pedagogy may not be perfect, but the alternative is much worse.

Child learning on a computer

MOOC proponent George Siemens distinguishes between two types of MOOC: the xMOOC and the cMOOC.

The former is the subject of such disdain. Involving little more than knowledge transmission and perhaps a quiz at the end, the xMOOC is widely seen as replicating old-fashioned lectures and exams.

In contrast, the latter leverages the connectedness of the participants. Seeded with content, the cMOOC empowers – read “expects” – the learner to discuss, debate, discover, share and co-create new knowledge with his or her fellow learners.

The cMOOC’s participant is active whereas the xMOOC’s participant is passive. As Siemens puts it, cMOOCs focus on knowledge creation and generation whereas xMOOCs focus on knowledge duplication.

Despite Siemens’ evangelism though, I don’t think the cMOOC is necessarily better than the xMOOC. (I’ll explain later.)

Ethernet cable

Love them or loathe them, xMOOC or cMOOC, the fact remains: MOOCs have arrived, and they are here to stay.

Moreover, I submit they are yet to wreak their full vengeance on the education industry. When I look into my crystal ball, I foresee that MOOCs will rock our world, and they will do so in 15 ways…

Fortune teller

1. Universities will finally accept they are service providers.

As the latest edition of Educause Review indicates, universities are fee-for-service businesses. That means they are subject to market forces such as competition.

MOOCs raise the question: If I can study at Stanford University for free, why would I pay tens of thousands of dollars to study at your dinky university and subject myself to your arcane rules?

2. The vast majority of students will be overseas.

Countries that currently attract foreign students to their shores will need to brace for the impact on their local economies, as an ever-increasing proportion of students choose to gain an international education without leaving their home country.

3. The pecking order will be reshuffled.

While the world’s most prestigious institutions will enjoy a windfall of new students, those that rely more on age than ability will ultimately fail as the target audience realises how pedestrian they are.

Conversely, some of the smaller, younger institutions will emerge from the shadows as the world sees how good they really are.

4. Research will become a competitive advantage.

There’s nowhere to hide on the global stage, and cutting-edge expertise will be one of the few aspects that a university will have to distinguish itself from the others.

No more lazy professors, no more specious journal articles. Faculty who don’t generate a flow of new knowledge for their students will have their tenure terminated.

5. Universities will flip their classrooms.

Bricks’n’mortar establishments will become expensive relics unless their owners redeploy them. One way to do that is to leverage MOOCs for content delivery and provide value-added instruction (discussion, Q&A, worked examples, role plays etc) to local students – who of course will pay a premium for the privilege.

Studying on campus will become a status symbol.

6. The role of the teacher will evolve.

There’s no point rehashing the same lectures when the world’s best authorities have already recorded them and offered them to the world as OERs. It’s how the teacher uses that content to support learning that will make the difference.

7. The pedagogy of MOOCs will be enriched.

While MOOCs typically comprise video clips and perhaps a quiz, they will inevitably include more instructional devices to assist distance learning (and remain competitive).

Over time, content providers will supplement their core offerings with live webinars, interactive exercises, discussion forums, wikis, social networks etc. Some may even organise real-life meetups at selected sites around the world.

8. Content providers will charge for assessment.

A certificate of completion is good; an official grade is better.

Assessment is one of the ways universities will monetise their MOOCs, and edX is already going one step further by offering proctored exams.

9. Universities will offer credits for MOOCs.

Again, this is already being considered by the American Council on Education.

Of course, a certificate of completion won’t suffice. Ka ching!

10. Online cheating will mushroom.

An ever-present thorn in the side of online education, cheating will be almost impossible to prevent in the MOOC space. But surely we can do better than onsite exams?

11. Academic inflation will skyrocket.

Every man and his dog will have a ream of courses listed on his CV. Employers will consider certificates of completion meaningless, while maintaining a reserved suspicion over assessment scores.

Outcomes-based activities that demonstrate the applicant’s knowledge and skills will become a component of best-practice recruitment.

12. Offshoring will become the rule, not the exception.

Deloitte’s global CLO, Nick van Dam, told me that American firms are using MOOCs to upskill accountants based in India on US accounting practices.

Dental, anyone?

13. MOOCs will target the corporate sector.

Current MOOCs are heavily geared towards school and college audiences. Over time, an increasing number of narrow, specific topics that link to corporate competencies will emerge.

Content providers will wag the long tail.

14. The corporate sector will embrace xMOOCs.

Learners in the workplace are time poor. They don’t have the luxury to explore, discover, and “make sense of the chaos”. They need the knowledge now and they are happy for the expert to transmit it to them.

15. An xcMOOC hybrid will emerge as the third variant.

Sooner or later, the powers that be will remember that an instructivist approach suits novices, while an increasingly constructivist and connectivist approach suits learners as they develop their expertise.

Hence, the MOOC of the future may resemble an xMOOC in its early stages, and morph into a cMOOC in its later stages.

3 hot resources for best practice multiple-choice quizzing

27 April 2011

In my previous post, 14 reasons why your multiple-choice quiz sucks, I listed typical clangers whose only purpose is to render your assessments ineffective.

Thumbs down

If they’re the bad and ugly aspects of MCQ design, what’s the good?

To answer that question I hit Google and a couple of academic databases, but mostly in vain.

It may be due to my poor researching skills, but I found very little empirical evidence of best practice multiple-choice quizzing. Plenty of unsubstantiated opinion (of course) but not much science.

Cartoon

You see, Google wasn’t much help because “best practice” is frequently confused with “common practice” – and the two are not the same thing.

The peer-reviewed literature wasn’t much better. Alarmingly, many of the studies were inconclusive, adopted a flawed experimental design, and/or didn’t compare the performances of the quiz-takers on the job under the different treatments – which is the whole point!

However, through a combination of persistence, serendipity and social networking, I finally uncovered 3 resources that I consider worth recommending: a journal article, a website and a blog.

1. A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment – In this article, Thomas Haladyna, Steven Downing & Michael Rodriguez validate a taxonomy of 31 multiple-choice item-writing guidelines by reviewing 27 textbooks on educational testing and 27 research studies. If you want insight into the myriad of MCQ variables, here it is.

2. QuestionMark – David Glow and Jayme Frey independently pointed me to the wealth of resources on this website. QuestionMark is a business, granted, but they know what they’re talking about – a claim backed up by the fact that they have a psychometrician on the payroll (cheers, David). I also heard Eric Shepherd speak at LearnX last year, and I was very impressed.

3. The eLearning Coach – Connie Malamed is a qualified and experienced e-learning designer whose blog provides advice to fellow practitioners. I value Connie’s expertise because it is practical and she has implemented it in the real world.

If you are aware of other good MCQ resources – preferably evidence based – please share them here…
 

14 reasons why your multiple-choice quiz sucks

19 April 2011

Unlike some of my peers, I’m not an opponent of the multiple-choice quiz. It’s a convenient and efficient means of assessing e-learners, if it’s done well.

The problem, unfortunately, is that it’s often done poorly. So poorly, in fact, it’s embarrassing.

oops!

At my workplace, I am regularly subjected to the multiple-choice quiz.

In fact, over the years in a range of jobs across several industries, not to mention my time in high school and university, I have been subjected to countless multiple-choice quizzes.

So I feel eminently qualified to tell you why yours sucks…

Thumbs down

  1. The questions don’t represent the learning objectives, so why did I waste my time doing the course?

  2. There aren’t many questions, so I should be able to bluff my way through them.

  3. The number of attempts is unlimited, so I’ll just keep guessing until I pass.

  4. The pass mark is low, but my customers aren’t so forgiving.

  5. The questions and answers aren’t randomised at each attempt, so I’ll jot down my responses to prepare for next time: 1-A, 2-D, 3-B…

  6. I’ll also ask my mate what he got for Questions 3, 6 & 8.

  7. The questions follow the order of the topics in the course, but I’m unlikely to encounter them in that order on the job.

  8. You use “All of the above” only when it’s correct.

  9. The ridiculous answer is obviously incorrect.

  10. The longest answer is probably correct, otherwise you wouldn’t have bothered writing it.

  11. More than one of the answers is arguably correct, and I’m shocked you didn’t know that.

  12. Your use of the double negative can only mean one thing: you can’t write no good.

  13. You try to trick me rather than confirm my understanding of the central concepts, so remind me: what’s the purpose of your role in this organisation?

  14. Your questions test my recall of obscure facts rather than my behaviour in authentic situations, so this exercise has no bearing on my performance.
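Some of these clangers – fixed question order, unrandomised options – are purely mechanical and cheap to fix. A sketch of shuffling both the questions and their options at each attempt, so that jotted-down patterns like 1-A, 2-D stop working (the question bank here is invented):

```python
import random

def shuffled_quiz(questions, rng=random):
    """questions: list of (stem, options, correct_index) tuples.
    Returns a new attempt with question order and option order shuffled,
    and each correct_index updated to the option's new position."""
    quiz = []
    for stem, options, correct in rng.sample(questions, len(questions)):
        order = list(range(len(options)))
        rng.shuffle(order)
        quiz.append((stem,
                     [options[i] for i in order],
                     order.index(correct)))  # new position of right answer
    return quiz

# Invented question bank for illustration
bank = [("2+2?", ["3", "4", "5"], 1),
        ("Capital of France?", ["Paris", "Rome", "Berlin"], 0)]
attempt = shuffled_quiz(bank)
```

Each attempt presents the same content in a fresh order, which neutralises clangers 3 and 5 at zero marginal cost.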

Clearly I’m being deliberately pointed, but in all honesty, have you ever been guilty of one of those clangers?

I know I have.

But that’s no reason to vilify the multiple-choice quiz. When combined with other forms of assessment, it’s a valuable piece of the pedagogical puzzle.

Let’s just design it better.

Chicken takes a multiple-choice quiz

Vive la evolution

15 February 2011

Last week, Laura Layton-James stumbled upon my post Online courses must die! and she left a wonderfully detailed comment.

I was so enamoured with what she wrote that I feel compelled to repeat it here…

An excellent post Ryan. When running courses on how to create engaging eLearning (concentrating on the self-study module) we concentrate on designing, as Cathy Moore puts it, experiences. To me it’s a total waste of time reposting the information that already exists somewhere on the intranet in a pdf. As you say, a waste of resources when L&D’s expertise as learning consultants (which is what we are) can be put to better use creating those skills based activities that will test application rather than regurgitate facts and figures.

Unfortunately, the blame (if we have to lay blame) lies with the increasing need to tick boxes. It seems that if we just point people in the right direction for the information we can’t be sure they’ve read it. So we think the solution is to type it all up again in shorter chunks and ask them a load of questions which only tests immediate recall.

Bored at the computer

If it’s eInformation / eReference that’s needed, that’s fine. We can make that more visually appealing and easier to read on screen. We can even make it more enjoyable to view in the form of videos or podcasts.

Where the real learning takes place is in the analysis of the material in relation to a specific work-based problem. A problem that the learner is likely to face in the workplace.

An example I use is the mandatory fire safety course. It tends to be boring when done in the classroom where facts upon facts about the fire triangle are poured into learners’ heads. If they’re listening carefully enough they might be able to answer some questions on the fire triangle and what constitutes fuel, heat and oxygen (I’m still not sure if I’ve got it right). Tell me, in a fire how many people will be standing there pondering on the fire triangle. Really what would do us more good is to either assess realistic risks, or evacuate safely in the event of a fire.

The fire drill

Encouraging more of a user-generated and peer-to-peer learning environment may not be to everyone’s taste so a VLE such as Moodle will give more control. But L&D’s real and untapped value will be in the nurturing of learners, working with SMEs to provide digestible chunks of information, designing bite-sized resources and providing study guides and recommended personal learning plans so learning becomes more individual and task based.

Definitely, why force individuals to go through the same mandatory content year after year when all they may need is a yearly, skills based assessment. If that assessment highlights skills gaps then a more flexible learning programme will make sure individuals learn only what they need not what they don’t.

It’s no longer about what we know but more about where to find the information and apply it to tasks.

I couldn’t agree more.

Thanks Laura!

How to revamp your learning model

7 September 2010

In my articles Online courses must die! and The ILE and the FLE in harmony, I advocate the development of a virtual Informal Learning Environment (ILE) to work in tandem with the Formal Learning Environment (FLE) to support both the learning process and its administration.

Heeding the advice of Bill Brandon, I will now flesh out that idea with an illustration of how it might be implemented in a real organisation.

Informal learning

I believe in the power of informal learning. In fact, I go so far as to say it should be the central philosophy of the organisation’s learning model.

In a practical sense, that means we need to provide our learners with tools and resources that they can use to drive their own development.

This is where the ILE fits in: It’s a space (like a website or intranet site) that centralises those tools and resources.

The ILE illustrated

There are a thousand and one possible combinations and permutations of an ILE.

However, if I were to consider (read “fantasise”) a greenfield opportunity (read “pipedream”), what would I design?

Essentially I would base my design on three core components, as illustrated in Figure 1.

Informal Learning Environment, consisting of a wiki, a discussion forum and personal profiles

Figure 1. Informal Learning Environment

Core component #1: Wiki

The primary component of my ILE is a comprehensive wiki.

In a big corporation like the one I work for, knowledge is distributed everywhere – on obscure intranet pages, in random folders, in people’s heads – which makes it really hard to find.

A wiki enables the organisation to centralise that collateral, whether directly (by inputting it) or indirectly (by linking to where it exists elsewhere), thereby functioning as the first port of call.

A wiki can contain – or point to – all manner of media, such as text, graphics, documents and multimedia. The learner can search and explore the content that’s relevant to them, just-in-time if need be.

The flexibility of a wiki also allows anyone to contribute content. This empowers the learner to share their knowledge with their colleagues, build on the knowledge that has already been contributed by others, and communally keep it up to date.

Core component #2: Discussion forum

The secondary component of my ILE is an open discussion forum. I say “secondary” because my rationale is that, if the learner can’t find the knowledge they need in the wiki, they can crowdsource it via the forum.

A discussion forum enables the learner to post a question to their peers, thereby leveraging the collective intelligence of the organisation. Of course the learner can also share their knowledge by answering someone else’s question, and they can learn incidentally by reading the questions and answers of others.

The questions posted to the forum may also serve to expose knowledge deficiencies in the organisation, which can be remedied by updating the wiki!

Core component #3: Personal profiles

The tertiary component of my ILE is a bank of personal profiles. I say “tertiary” because my rationale is that, if the learner can’t find the knowledge they need in the wiki nor via the discussion forum, they can target an SME directly.

For example, if the learner is struggling with a Java programming problem, they can look up a Java expert in the system and send them a direct message. The SME may be recognised as a “Java” SME because they have said so in their profile, or – if the technology is sophisticated enough – their contributions of Java-related content in the wiki and participation in Java-related conversations on the discussion forum flag them as such.
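A crude sketch of that flagging logic: count an author's tagged contributions across the wiki and forum, and surface anyone over a threshold. The data shape, the threshold and the names are all invented for illustration.

```python
from collections import Counter

# Hypothetical sketch: flag likely SMEs by counting their tagged
# contributions across the wiki and forum. The event format and the
# threshold below are invented for illustration.

def flag_smes(contributions, topic, threshold=5):
    """contributions: iterable of (author, topic_tag) events.
    Returns the set of authors with at least `threshold` contributions
    tagged with the given topic."""
    counts = Counter(author for author, tag in contributions if tag == topic)
    return {author for author, n in counts.items() if n >= threshold}

# Invented contribution log
events = [("mei", "java")] * 7 + [("raj", "java")] * 3 + [("mei", "sql")] * 2
java_smes = flag_smes(events, "java")
```

The point of the sketch is that expertise can be inferred from observed behaviour, rather than relying solely on self-declared profiles.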

I’m in two minds as to whether a full-blown social network is useful for internal learning purposes. Apart from profiling, I’m not convinced that friending, status updating and other Facebook-like activities add much value – especially when a discussion forum that accommodates groups is already in place.

Formal learning

Self-directed, informal learning is great. However, there are some things your employer must know that you know.

The most obvious example is compliance, eg privacy, trade practices and OH&S. If you breach the regulations, the company will be in hot water, so they’re not just going to take your word for it.

There are plenty of other examples, such as a certain level of product knowledge, that may be critical to the role.

In a practical sense, this means we should map required competencies to each role and assess the employee’s proficiency against each one. That probably leads to a development plan, which in turn forms a subset of the performance agreement and is subject to regular appraisals.

Then there are formal training events like courses and workshops that are important and require documentation, and some people want their informal learning (eg reading a book) recorded too.

The FLE is a space (like a database or platform) in which all this administration is done.

The FLE illustrated

Again, there are a thousand and one possible combinations and permutations of an FLE.

However I base my design on two core components, as illustrated in Figure 2.

Formal Learning Environment, consisting of an LMS and reports

Figure 2. Formal Learning Environment

Core component #1: Learning Management System

The primary component of my FLE is a Learning Management System (LMS).

The LMS is an oft-derided yet invaluable educational technology. I suspect the typical organisation underappreciates it because it uses it illogically.

My advice is to use the LMS for what it’s designed for: managing learning. Competency maps, auto-marked assessments, registrations, completion statuses, grades, transcripts, performance agreements and performance appraisals are what the LMS does well. Some even extend into talent management and other HR domains.

Conversely, my advice is to avoid using the LMS for what it is not really designed for: managing content. Leave that to the ILE, which is a much more open and flexible environment, and is purpose built to support “learning”.

Core component #2: Reports

The complementary component of my FLE is the range of reports that can be generated from various systems to provide useful data. Such data may include productivity statistics, quality scores, complaint volumes, engagement indices… whatever can be analysed to identify training needs and/or evaluate learning outcomes.

At the end of the day, learning must support performance.

Putting it all together

My revamped learning model, then, comprises two discrete but related virtual environments:

1. An ILE, and
2. An FLE.

The former supports the process of learning; the latter supports its management.

A revamped learning model, consisting of an ILE and an FLE

Figure 3. A revamped learning model

Separating the two environments like this helps keep them distinct in people’s minds.

Why bother?

Because learning should be a joy.

By definition, an ILE should be unforced, unscored, unthreatening. It should be a safe, open space where people are excited to go because they want to learn, without the burden of forced navigation and pass marks.

Simultaneously, an FLE should focus on what really matters. Too often when formal and informal learning are mixed, goals blur and we run the risk of formalising for formalising’s sake. We don’t need to monitor our colleagues like Big Brother; we just need to assess them when necessary.

How long is a piece of string?

Of course, many more components may be reasonably argued for inclusion in the learning model.

An onsite classroom, for example, is obviously a part of the formal learning environment. So too is a university campus on the other side of town.

In terms of informal learning, the water cooler, a cabinet of books – and even the pages in a book – may be considered components of the ILE.

How about a library of online courses? That might be considered a component of the ILE if the learner is free to explore it at their convenience, but it will suddenly revert to the FLE if the learner is instructed to complete a particular course.

Clearly then, the ILE and the FLE are elastic concepts, highly dependent on perspective and context. That’s why I have focused on the core components that I think can provide a universal framework for a revamped learning model.

The two virtual environments are constant; everything else around them is variable.
 

