
The L&D maturity curve

Over the course of my career, I’ve witnessed a slow but steady shift away from formal learning towards informal learning.

Of course, remnants of the “formal first” philosophy still exist, whereby every conceivable problem is met with a training solution, typically in the form of a course. Over time, the traditional classroom-based delivery of such courses has increasingly given way to online modules, but that’s merely a change in format – not strategy.

While courses certainly have their place in the L&D portfolio, the forgetting curve places a question mark over their long-term effectiveness on their own.
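As an aside, the forgetting curve is commonly approximated by a simple exponential decay – one formulation (a simplification, and not the only one) is:

R = e^(-t/S)

where R is the proportion of material retained, t is the time elapsed since learning, and S is the relative strength of the memory. The precise parameters are debatable, but the implication is not: without reinforcement, retention falls away within days of the course.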

The informal first philosophy redresses the balance by empowering the employee to self-direct their learning in accordance with their personal needs.

While in some cases informal learning obviates the need for training, in other cases it will complement it. For example, I see the informalisation of learning as an opportunity to deliver the content (say, via a wiki) which can be consumed at the discretion of the employee. The focus of the course then pivots to the application of the content, which is the point of learning it in the first place. Similarly, the assessment evaluates the learning in the context of real-world scenarios, which is what the learner will encounter post-course.

And since the content remains accessible, it can be used for ongoing reference long after the course has been completed.

A hand holding a pen pointing to a chart.

While I consider the informal first philosophy a giant leap in L&D maturity, it essentially pertains to instructional design. For a more holistic view of L&D, I propose an “assessment first” philosophy by which the capability of the target audience is analysed prior to any design work being undertaken.

The rationale for this philosophy is best appreciated in the context of an existing employee base (rather than greenhorn new starters). Such a group comprises adults who have a wide range of knowledge, skills and experiences. Not to mention they’ve probably been doing the job for a number of years.

Sheep-dipping everyone in this group with the same training doesn’t make much sense. For a minority it might be a worthwhile learning experience, but for the majority it is likely to be redundant. This renders the training a waste of time and an unnecessary burden on the L&D team.

By first assessing the target audience’s proficiency in the competencies that matter, a knowledge gap analysis can identify the competencies in which the population is weak, and targeted training can be delivered in response. Individuals who are “not yet competent” in particular areas can be assigned personalised interventions.
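To make the mechanics concrete, here’s a minimal sketch of such a gap analysis in Python. The competencies, scores and the 0.7 “competent” threshold are illustrative assumptions on my part, not a prescribed standard:

    # A minimal sketch of a knowledge gap analysis.
    # Assumes assessment results are recorded as scores from 0.0 to 1.0 per
    # competency; the names, data and 0.7 cut-off are illustrative only.

    COMPETENT = 0.7

    results = {
        "alice": {"product_knowledge": 0.9, "compliance": 0.6, "crm_usage": 0.8},
        "bob":   {"product_knowledge": 0.5, "compliance": 0.9, "crm_usage": 0.4},
        "carol": {"product_knowledge": 0.8, "compliance": 0.5, "crm_usage": 0.9},
    }

    # Population-level gaps: competencies in which the group's average score
    # is weak, and which are therefore candidates for targeted training.
    competencies = sorted({c for scores in results.values() for c in scores})
    for competency in competencies:
        average = sum(r[competency] for r in results.values()) / len(results)
        if average < COMPETENT:
            print(f"Group gap: {competency} (average {average:.2f})")

    # Individual-level gaps: people who are "not yet competent" in particular
    # areas, and who can be assigned personalised interventions.
    for person, scores in results.items():
        gaps = [c for c, score in scores.items() if score < COMPETENT]
        if gaps:
            print(f"{person}: personalised intervention for {', '.join(gaps)}")

The same two outputs – group gaps and individual gaps – are what drive the targeted training and the personalised interventions respectively.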

This approach avoids the solution first trap. By focusing the L&D team’s attention on the real needs of the business, not only does the volume of demand fall, but the work also becomes more relevant.

The assessment first philosophy may appear incongruent where new starters are concerned – after all, they’ve only just walked through the door, so by definition they’re assumed to be weak in all competencies! – but I counter that assumption on two fronts.

Firstly, not all new starters are doe-eyed college grads. Many have had previous jobs in the industry or in other industries, and so they arrive armed with transferable knowledge, skills and experiences.

And regardless, the informal first philosophy holds true. That is to say, the new starter can consume the content (or not) as they see fit, demonstrate their understanding in the scenario-oriented “course”, and formalise it via the assessment.

The results of the assessment dictate any further intervention that is necessary.

Of course, some topics such as the company’s own products or processes will necessitate significant front-end loading via content development and maybe even curricula, but these may be considered the exception rather than the rule. By looking through the lens of assessment first, the L&D team works backwards to focus that kind of energy on where it is warranted.

It is also worth noting that the assessment first philosophy renders the traditional “pass mark” obsolete, but such a radical idea is a story for another day!

Laptop showing business metrics.

While the assessment first philosophy represents an exponential leap in the maturity of L&D, there is yet another leap to make: “performance first”.

The raison d’être of the L&D team is to improve performance, so it has always been a mystery to me why our work is so often disconnected from the business results. I do appreciate the barriers that are in our way – such as the inexplicable difficulty of obtaining the stats – but still, we can and should be doing more.

Under the performance first paradigm, it is not knowledge gaps that are analysed, but rather performance gaps. A root cause analysis identifies whether the cause is a capability deficiency or not – in the case of the former, a capability analysis feeds into the assessment first approach; in the case of the latter, a solution other than training is pursued instead.
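The triage itself requires remarkably little machinery. Here’s a toy sketch, with labels and branching of my own invention rather than a formal model:

    # A toy sketch of the performance first triage (illustrative only).

    def triage(performance_gap: bool, capability_deficiency: bool) -> str:
        """Route a performance gap to the appropriate response."""
        if not performance_gap:
            return "No intervention required"
        if capability_deficiency:
            # The root cause is capability, so a capability analysis feeds
            # into the assessment first approach.
            return "Capability analysis -> assessment first"
        # The root cause lies elsewhere (process, tooling, incentives...),
        # so a solution other than training is pursued instead.
        return "Non-training solution"

    print(triage(performance_gap=True, capability_deficiency=False))
    # -> Non-training solution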

As with assessment first, performance first may appear incongruent where new starters are concerned. After all, their stats thus far are zero, and waiting to recognise poor performance may have unacceptable consequences.

So we have another exception to the rule, whereby some folks may be scaffolded through L&D intervention prior to their performance being analysed. However, the point is that we needn’t force everyone down that road. It depends on the circumstances.

And again, by looking through the lens of performance first, the L&D team works backwards to focus its energy on where it is needed. But this time with results at the forefront of the team’s purpose, its relevance to the business goes through the roof.

The L&D Maturity Curve, featuring Formal First rising to Informal First rising to Assessment First rising to Performance First. The x-axis represents maturity of the L&D function and the y-axis represents its relevance to the business.

I realise my take on L&D maturity might freak some of my peers out. Meanwhile, others will argue that we should leapfrog to performance first now and get on with it.

Personally I consider the maturity curve a journey. Yes, it is theoretically possible to skip stages, but I feel that would be a shock to the system. From a change management perspective, I believe an organisation at one stage of the curve would achieve more success by growing into the next stage of the curve, while ironing out the bugs and creating the new normal along the way.

Besides, it isn’t a race. Important journeys take time. What matters is the direction in which that journey is heading.

The grassroots of learning

Here’s a common scenario: I “quickly” look up something on Wikipedia, and hours later I have 47 tabs open as I delve into tangential aspects of the topic.

That’s the beauty of hypertext. A link takes you somewhere else, which contains other links that take you somewhere else yet again. The Internet is thus the perfect vehicle for explaining the concept of rhizomatic learning.

Rhizomatic learning is something that I have been superficially aware of for a while. I had read a few blog posts by Dave Cormier and I follow the intrepid Soozie Bea, but unfortunately I missed Dave’s #rhizo14 MOOC earlier in the year.

Since I’ve been blogging about the semantics of education lately, I thought it high time to dig a little deeper.

Bamboo with rhizome.

It seems to me that rhizomatic learning is the pedagogical antithesis of direct instruction. Direct instruction has pre-defined learning outcomes with pre-defined content to match. The content is typically delivered in a highly structured format.

In contrast, rhizomatic learning has neither pre-defined learning outcomes nor pre-defined content. The learner almost haphazardly follows his or her own line of inquiry from one aspect of the subject matter to the next, then the next, and so forth according to whatever piques his or her interest. Thus the learning journey cannot be predicted ahead of time.

Given my scientific background, I was already familiar with the rhizome. So is everyone else, incidentally, perhaps without realising it. A rhizome is the creeping rootstalk of a plant that explores the soil around it, sending out new roots and shoots as it goes along. A common example is bamboo, whose rhizome enables it to spread like wildfire.

In A Thousand Plateaus: Capitalism and Schizophrenia, Gilles Deleuze and Félix Guattari adopt the rhizome as a metaphor for the spread of culture throughout society. That’s a massive over-simplification, of course, and quite possibly wrong. The Outsider represents the extent of my French philosophy bookshelf!

Anyway, the point I’m bumbling towards is that Dave Cormier has picked up this philosophical metaphor and applied it to the wonderful world of learning. He explains in Trying to write Rhizomatic Learning in 300 words:

“Rhizomatic Learning developed as an approach for me as a response to my experiences working with online communities. Along with some colleagues we started meeting regularly online for live interactive webcasts starting in 2005 at Edtechtalk. We learned by working together, sharing our experiences and understanding. The outcomes of those discussions were more about participating and belonging than about specific items of content – the content was already everywhere around us on the web. Our challenge was in learning how to choose, how to deal with the uncertainty of abundance and choice presented by the Internet. In translating this experience to the classroom, I try to see the open web and the connections we create between people and ideas as the curriculum for learning. In a sense, participating in the community is the curriculum.”

I note that this explanation from 2012 is somewhat different from his paper in 2008, which of course reflects the evolution of the idea. In Rhizomatic Education: Community as Curriculum, Dave similarly mentioned the abundance of content on the Internet, and also the shrinking half-life of knowledge. He contrasted the context of traditional education – in which experts are the custodians of a canon of accepted thought, which is presumed to remain relatively stable – with today – in which knowledge changes so quickly as to make the traditional notion of education flawed.

Dave posited that educational technology is a prime example. Indeed when I studied this discipline at university, much of the learning theory (for instance) enjoyed a broad canon of knowledge to which students such as myself could refer. It was even documented in textbooks. Other aspects of the subject (for instance, the rapid advances in technology, and the pedagogical shifts towards social and informal learning) could not be compared against any such canon. The development of this knowledge was so rapid that we students relied as much on each other’s recent experiences and on sharing our personal learning journeys as we did on anything the professor could supply.

“In the rhizomatic model of learning, curriculum is not driven by predefined inputs from experts; it is constructed and negotiated in real time by the contributions of those engaged in the learning process. This community acts as the curriculum, spontaneously shaping, constructing, and reconstructing itself and the subject of its learning in the same way that the rhizome responds to changing environmental conditions.”

From 2008 to 2012, I see a shift in Dave’s language from Rhizomatic Education to Rhizomatic Learning. This I think is a better fit for the metaphor, as while it may be argued that the members of the community are “teaching” one another, the driving force behind the learning process is the active learner who uses the community as a resource and makes his or her own decisions along the way.

I also note the change from “the community is the curriculum” to “participating in the community is the curriculum”. Another semantic shift that I think is closer to the mark, but perhaps still not quite there. I suggest that the content created by the members of the community is the curriculum. In other words, the curriculum is the output that emerges from participating in the community. So “participating in the community produces the curriculum”.

As a philosophy for learning, then, rhizomatic learning is not so different from constructivism, connectivism, and more broadly, andragogy. The distinguishing feature is the botanical imagery.

However this is where my understanding clouds over…

Is it the abundance of content “out there” that is rhizomatic?

Or is it the construction of new knowledge that is rhizomatic?

Or is it the learning journey that is undertaken by the individual learner?

Perhaps such pedantic questions are inconsequential, but the scientist in me demands clarification. So I propose the following:

  1. The knowledge that is constructed by the community is the rhizome.
  2. The process of constructing the knowledge by the members of the community is rhizomatic education.
  3. The process of exploring, discovering and consuming the knowledge by the individual learner is rhizomatic learning.

If we return to my Wikipedia scenario, we can use it as a microcosm of the World Wide Web and the universe more broadly:

  1. The ever-expanding Wikipedia is the rhizome.
  2. The Wikipedians are conducting rhizomatic education.
  3. I, the Average Joe who looks it up and loses himself in it for hours on end, am experiencing rhizomatic learning.

In the age of Web 2.0, Average Joe may also be a Wikipedian. Hence we can all be rhizomatic educators and rhizomatic learners.

Barnstar

I also detect a certain level of defensiveness from Dave in his early paper. He prefaces his work with a quote from Henrik Ibsen’s An Enemy of the People which rejoices in the evolution of “truth” in the face of conventional resistance [my interpretation], while later on he addresses the responses of the “purveyors of traditional educational knowledge” – primarily in the realms of academic publishing and intellectual property.

I think Dave was right to be defensive. Despite the pervasive learnification of education that would theoretically promote rhizomatic learning as its poster boy, anything new that threatens the status quo is typically met with outrage from those who stand to lose out.

A case in point is MOOCs. Dave refers to Alec Couros’s graduate course in educational technology, which was a precursor to his enormously popular #ETMOOC. While a cMOOC such as this one may be the epitome of the rhizomatic philosophy, I contend that it also applies to the xMOOC.

You see, while the xMOOC is [partly] delivered instructivistly, those darn participants still learn rhizomatically! And so the traditionalists delight in the low completion rates of MOOCs, while the rest of us appreciate that learning (as opposed to education) simply doesn’t work that way – especially in the digital age.

Don’t get me wrong: I am no anti-educationalist. Regular readers of my blog will not find it surprising when I point out that sometimes the rhizomatic model is not appropriate. For example, when the learner is a novice in a particular field, they don’t know what they don’t know. As I was alluding to via my tweet to Urbie in lrnchat, sometimes there is a central and stable canon of knowledge and the appointed expert is best placed to teach it to you.

I also realise that while an abundance of knowledge is indeed freely available on the Internet, not all of it is. It may be hidden in walled gardens, or not on the web at all. Soozie makes the point that information sources go beyond what the web and other technologies can channel. “Information that is filtered, classified or cleansed, consolidated or verified may also come from formal, non-formal or informal connections including teachers, friends, relatives, professional colleagues and recognized experts in the field.” But I take her point that all this is enhanced by technology.

Finally, the prominence of rhizomatic learning will inevitably increase as knowledge continues to digitise and our lens on learning continues to informalise. In this context, I think the role of the instructor needs much more consideration. While Dave maintains that the role is to provide an introduction to an existing learning community in which the student may participate, there is obviously more that we L&D pros must do to fulfil our purpose into the future.

On that note I’ll rest my rhizomatic deliberation on rhizomatic learning. If you want to find out more about this philosophy, I suggest you look it up on Wikipedia.

Let’s get rid of the instructional designers!

That’s the view of some user-oriented design proponents.

It’s something I remembered while writing my last blog post about user-generated content. Whereas that post explored the role of the learner in the content development process, how about their role in the broader instructional design process?

I wrote a short (1000 word) assignment on the latter at uni several years ago – in the form of a review of a chapter written by Alison Carr-Chellman and Michael Savoy – and it’s a concept that has resonated with me ever since.

Here I shall share with you that review, unadulterated from its original form except for adding an image to represent the user empowerment continuum, replacing the phrase “preferred learning styles” with “learning preferences”, and hyperlinking the reference.

Whether or not the more “progressive” design philosophies resonate with you, at the very least I hope they provoke your thinking…

Colleagues collaborating around a table with sticky notes

Introduction

Carr-Chellman & Savoy (2004) provide a broad overview of user design. They define the term user design, compare it against other methodologies of user-oriented design, identify obstacles to its successful implementation, and finally make recommendations for the direction of further research.

Definition

According to Carr-Chellman & Savoy (2004), traditional instructional design methodologies disenfranchise the user from the design process. In a corporate organisation, for example, the leaders will typically initiate the instructional design project, an expert designer will then analyse the situation and create a design, and finally, the leaders will review the design and either approve it or reject it. The role of the user, then, is simply to use the system (or perhaps circumvent it).

In contrast to traditional instructional design methodologies, user design enables the users to participate in the design process. Instead of just using the system, they are involved in its design. Furthermore, their role is more than just providing input; they are active participants in the decision-making process.

Comparison against other methodologies

Carr-Chellman & Savoy (2004) carefully distinguish user design from other methodologies of user-oriented design, namely user-centered design and emancipatory design.

User empowerment continuum, featuring traditional instructional design at the lowest extremity, then user-centered design, then user design, then emancipatory design at the highest extremity.

User-centered design

According to Carr-Chellman & Savoy (2004), user-centered design methodologies consider the needs of the user during the design process. In educational situations, for example, the expert designer may analyse the target audience, identify their learning preferences, and perhaps run a pretest. In tool usage situations, he or she may distribute user surveys or conduct usability testing. The goal of these activities is to obtain extra information to assist the designer in creating a better system for the users.

The key difference between user-centered design and user design is the level of participation of the users in the design process. Under a user-centered design model, the designer considers the needs of the users, but ultimately makes the design decisions on their behalf.

Under a user design model, however, the needs of the users go beyond mere food for thought. The users are empowered to make their own design decisions and thereby assume an active role in the design process.

Emancipatory design

If traditional design occupies the lowest extremity of the user empowerment continuum, and user-centered design occupies a step up from that position, then emancipatory design occupies the opposite extremity.

Emancipatory design dispenses with the role of the expert designer and elevates the role of the users, so that in effect they are the designers. This methodology charges the users with full responsibility over all facets of the design process, from initiation, through analysis, design, review, to approval. Instead of having a system imposed on them, the users have truly designed it for themselves, according to their own, independent design decisions.

Emancipatory design is founded on issues of conflict and harmony in the disciplines of social economics and industrial relations. Carr-Chellman & Savoy (2004) recognise that the goal of emancipatory design is “more to create change and vest the users and frontline workers in organisational outcomes than it is actually to create a working instructional system”. Hence, emancipatory design may not be a universal instructional design methodology.

User design

User design fits between the extremes of the user empowerment continuum. Whereas traditional design and user-centered design remove the user from the active design process, and conversely, emancipatory design removes the expert designer from the process, user design merges the roles into the shared role of “co-designer”. It strikes a balance between the two perspectives by including contributions from both parties.

Arguably, user design is a universal instructional design methodology. Whereas traditional design and user-centered design devalue the role of the users in the active design process, emancipatory design devalues the role of the expert designer.

User design, however, values both roles. It recognises the necessity of the active involvement of users, because they are the experts in their domain and will be the ones operating the system. Yet users cannot be expected to understand the science of design. The active involvement of an expert designer is critical in guiding the design process and driving the work towards an efficient and effective outcome.

Obstacles

Carr-Chellman & Savoy (2004) identify numerous obstacles to the successful implementation of user design, including the reluctance of designers and leaders to share their decision-making powers with users, the inclusion of users too late in the design process, the tendency to categorise users into a homogeneous group, and the lack of user motivation to participate in design activities.

Further Research

Carr-Chellman & Savoy (2004) claim that research specific to user design within instruction systems is scarce, and much of the research into other user-oriented design methodologies lacks scientific rigour. Therefore, they recommend the following actions for the research community:

  1. To create a standardised language to define user design and to distinguish it from other user-oriented design methodologies,
  2. To study the implementation of user design across different variables, such as user profile, subject area and mode of delivery, and
  3. To communicate the success of user design in terms of “traditional measures of effectiveness” for the purpose of influencing policymakers.

Furthermore, Carr-Chellman & Savoy (2004) recommend that researchers adopt the participatory action research (PAR) method of inquiry. They argue that PAR democratises the research process and, consequently, is ideologically aligned with the principles of user design.

It can be argued, therefore, that Carr-Chellman & Savoy (2004) promote both user design and user research. Their vision for users is not only to assume the role of “co-designer”, but also of “co-researcher”.

Reference

Carr-Chellman, A. & Savoy, M. (2004). User-design research. In D. H. Jonassen (Ed.), Handbook of Research on Educational Communications and Technology (2nd ed., pp. 701–716). New Jersey, USA: Lawrence Erlbaum.

Human enough

It is with glee that the proponents of e-learning trumpet the results of studies such as the US Department of Education’s Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies, which found that, on average, online instruction is as effective as classroom instruction.

And who can blame them? It is only natural for evangelists to seize upon evidence that furthers their cause.

But these results mystified me. If humans are gregarious beings and learning is social, how can face-to-face instruction possibly fail to outperform its online equivalent?

That was until I watched Professor Steve Fuller’s Humanity 2.0 TEDxWarwick talk in Week 3 of The University of Edinburgh’s E-learning and Digital Cultures course.

The professor explains with wonderful articulation how difficult it is to define a human.

Sure, biologists will define humanity in terms of DNA, yet they can’t even agree on whether the Neanderthals were a subspecies of Homo sapiens or a separate species altogether.

If we remove our gaze from the electron microscope, we have our morphology. Perhaps a human is an organism that has five fingers on each hand? But does that mean someone who is born with four (or six) is not human?

Perhaps a human is an organism that uses tools? Well, vultures drop rocks onto eggs to break them open.

Perhaps then a human is an organism that uses language? Whales might have something to say about that.

It is an intriguing conundrum that has occupied our thoughts for as long as anyone can remember.

Title page of the first edition of René Descartes' Discourse on Method.

In the 17th Century, René Descartes made an intellectual breakthrough. He contended that “reason…is the only thing that makes us men, and distinguishes us from the beasts”. In other words, we are the only creatures on God’s earth capable of rational thought. I think, therefore I am.

Descartes pushed his point by arguing that while a robot might one day be developed to speak words, “it is not conceivable that such a machine should…give an appropriately meaningful answer in its presence”. And despite astonishing advances in artificial intelligence, the philosophical Frenchman remains right. Even Watson, who triumphed at Jeopardy! and today mines big data to help humans make better decisions, cannot reasonably be considered a human itself. It is simply a product of computer programming.

Speaking of machines, if a human were to progressively replace her body parts with robotics – hence becoming a cyborg – at what point does she cease to be a human? According to the humanist tradition of Descartes, the absolute difference between a human and a non-human is a property of the mind. So, arguably she will remain a “human” until her brain is replaced.

But that raises the question: if we flip the scenario around and place a person’s brain in a robot’s body, does that make it a human?

All this philosophy starts to do my head in after a while, and that’s before getting into Freud’s posthumanism.

Somehow I prefer Joseph Gliddon’s simpler definition of a human: something that drinks coffee.

Cup of coffee next to a laptop

It’s not as flippant as it sounds, for it is our artificial enhancements that paradoxically make us more human.

Riding a bicycle, for example, is a quintessentially human endeavour. No other creature does it. Yes, a monkey might do so in the circus, but the reason we find it funny (or at least unusual) is because it doesn’t normally do that. The poor thing is mimicking a human.

Similarly, digital technology is an extension of our notion of humanity. Humans are the only organisms that use computers, surf the Web, write text, film video, record audio, and engage with one another in online discussion forums.

So when we view online pedagogy through this lens, we recognise very little of it that is not human. Consequently the strong performance of online students becomes less mysterious. In fact, it becomes expected because, just as a bicycle enhances our capability for travel, digital technology enhances our capability for learning.

This expectation is supported by a further finding of the Department of Education’s research – namely, that “blends of online and face-to-face instruction, on average, had stronger learning outcomes than did face-to-face instruction alone”. In other words, students who had the technology via the blended design performed better than those who didn’t.

But it doesn’t work in reverse: “the majority of…studies that directly compared purely online and blended learning conditions found no significant differences in student learning”. In other words, those who had the face-to-face interaction via the blended design performed no better than those who didn’t. Apparently the online instruction was human enough.

OK, on that bombshell, I think I’ll ride my bike to the cafe and pick up a cup of joe…

The equation for change

Guns don’t kill people. People do.

It’s a well-worn saying that Americans in particular know only too well.

And of course it’s technically correct. I don’t fear a gun on the table, but I do fear someone might pick it up and pull the trigger. That’s why I don’t want a gun on the table.

It’s a subtle yet powerful distinction that occurred to me as I absorbed the core reading for Week 1 of The University of Edinburgh’s E-learning and Digital Cultures course; namely Daniel Chandler’s Technological or Media Determinism.

Stone relief of a group of conquistadors.

Technological determinism is a philosophy that has implications for e-learning professionals as we grapple with technologies such as smartphones, tablets, ebooks, gamification, QR codes, augmented reality, the cloud, telepresence, ADDIE, SAM, and of course, MOOCs.

Chandler explains that “hard” technological determinism holds technology as the driver of change in society. Certain consequences are seen as “inevitable” or at least “highly probable” when a technology is unleashed on the masses. It’s how a lot of people view Apple products for example, and it’s extremist.

Like most extremism, however, it’s an absurd construct. Any given technology – whether it be a tool, a gadget or a methodology – is merely a thing. It can not do anything until people use it. Otherwise it’s just a box of wires or a figment of someone’s imagination.

Taking this rationale a step further, people won’t use a particular technology unless a socio-historical force is driving their behaviour to do so. History is littered with inventions that failed to take off because no one had any need for them.

Consider the fall of the Aztec empire in the 16th Century. Sailing ships, armour, cannons, swords, horse bridles etc. didn’t cause the conquistadors to catastrophically impact an ancient society. In the socio-historical context of the times, their demand for gold and glory drove them to exploit the technologies that were available to them. In other words, technology enabled the outcome.

At the other end of the spectrum, technological denial is just as absurd. The view that technology does not drive social change is plainly wrong, as we can demonstrate by flipping the Aztec scenario: if sailing ships, armour etc. were not available to the conquistadors, the outcome would have been very different. They wouldn’t have been able to reach the New World, let alone destroy it.

Of course, the truth lies somewhere in between. Technology is a driver of change in society, but not always, and never by itself. In other words, technology can change society when combined with social demand. It is only one component of the equation for change:

Technology + Demand = Change

In terms of e-learning, this “softer” view of technological determinism is a timely theoretical lens through which to see the MOOC phenomenon. Video, the Internet and Web 2.0 didn’t conspire to spellbind people into undertaking massive open online courses. In the socio-historical context of our time, the demand that providers have for altruism? corporate citizenship? branding? profit? (not yet) drives them to leverage these technologies in the form of MOOCs. Concurrently, a thirst for knowledge, the need for quality content, and the yearning for collaboration drives millions of students worldwide to sign up.

MOOCs won’t revolutionise education; after all, they are just strings of code sitting on a server somewhere. But millions of people using MOOCs to learn? That will shake the tree.

So the practical message I draw from the theory of technological determinism is that to change your society – be it a classroom, an organisation, or even a country – there’s no point implementing a technology just for the sake of it. You first need to know your audience and understand the demands they have that drive their behaviour. Only then will you know which technology to deploy, if any at all.

As far as gun control in the US is concerned, that’s a matter for the Americans. I only hope they learn from their ineffective war on drugs: enforcement is vital, but it’s only half the equation. The other half is demand.