Posts tagged ‘semantics’

The leader’s new clothes

15 July 2020

From $7 billion to nearly $14 billion.

That’s how much American corporations’ spending on leadership training grew over the preceding 15 years, according to Kaiser and Curphy in their 2013 paper, Leadership development: The failure of an industry and the opportunity for consulting psychologists.

Over that same period we witnessed the bursting of the dot-com bubble, the implosion of Enron, and of course the Global Financial Crisis. While the causes of these unfortunate events are complicated, our leaders were evidently ill-equipped to prevent them.

Despite the billions of dollars’ worth of training invested in them.

Undressed mannequins in a shop window

For a long time I felt like the child who could see the emperor wasn’t wearing any clothes. Then Jeffrey Pfeffer visited Sydney.

Pfeffer is the Professor of Organizational Behavior at Stanford University’s Graduate School of Business. He was promoting a book he had published, Leadership BS: Fixing Workplaces and Careers One Truth at a Time, in which he states what I (and no doubt many others) had been thinking: leadership training is largely ineffective.

At a breakfast seminar I attended, the professor demonstrated that decades of development have had no positive impact on metrics such as employee engagement, job satisfaction, leader tenure, or leader performance. He posited numerous reasons for this, all of them compelling.

Today I’d humbly like to add one more to the mix: I believe managers get “leadership” training when what they really need is “management” training.

They’re exhorted to follow best practice before they even know what to do. It’s the classic case of putting the cart before the horse.

For example, the managers in an organisation might attend a workshop on providing effective feedback, leveraging myriad models and partaking in role plays, when what they really need to know is that they should be having an hour-long 1:1 conversation with each of their team members every fortnight.

Other examples include training in unconscious bias, emotional intelligence and strategic thinking; yet they don’t know how to hire new staff, process parental leave, or write a quarterly business plan. Worse still, many won’t realise they’re expected to do any of that until the horse has bolted.

I’m not suggesting leadership training is unimportant. On the contrary, it’s critical. What I am saying is that it’s illogical to buy our managers diamond cufflinks when they don’t yet own a shirt.

At this juncture I think semantics are important. I propose the following:

  • Management training is what to do and how to do it.
  • Leadership training is how to do it better.

In other words, management training is the nuts & bolts. The foundation. It’s what we expect of you in this role, and how to execute those expectations – timelines, processes, systems, etc. It focuses on minimum performance: ensuring the job gets done.

In contrast, leadership training drives high performance. Now that you’ve got the fundamentals under your belt, here’s how to broaden diversity when hiring new staff. Here’s how to motivate and engage your team. Here’s how to identify opportunities for innovation and growth.

$14 billion is a lot of money. Let’s invest it in a new wardrobe, starting with the underwear.

The sum of us

10 July 2017

What is the definition of the term “data scientist”…?

In my previous post, Painting by numbers, I offered a shorthand definition of data science based on what I could synthesise from the interwebs. Namely, it is the combination of statistics, computer programming, and domain expertise to generate insight. It follows, then, that a data scientist is someone who has those skill sets.

Fat chance!

In this post I intended to articulate my observation that in the real world, incredibly few people could be considered masters of all three disciplines. I was then going to suggest that rather than seeking out these unicorns, employers should build data science teams comprising experts with complementary talents. I say “was” because I subsequently read this CIO article by Thor Olavsrud in which he quotes Bob Rogers saying, well… that.

Given Thor and Bob have stolen my thunder (18 months ago, no less!), I think the only value I can add now is to draw a parallel with pop culture. I’ll do so via the geeky HBO sitcom Silicon Valley.

The cast of Silicon Valley: Dinesh, Gilfoyle, Richard, Jared and Erlich.

If you aren’t familiar with this series, the plot revolves around the trials and tribulations of a start-up called Pied Piper. Richard is the awkward brainiac behind a revolutionary data compression algorithm, and he employs a sardonic network engineer, Gilfoyle, and another nerdy coder, Dinesh, to help bring it to market. The other team members are the ostentatious Erlich – in whose incubator (house) the group can work rent-free in exchange for a 10% stake – and Jared, a mild-mannered economics graduate who could have been plucked from the set of Leave It to Beaver.

The three code monkeys are gifted computer scientists, but they have zero business acumen. They are entirely dependent on Jared to write up their budgets and forecasts and all the other tickets required to play in the big end of town. Gilfoyle and Dinesh’s one attempt at a SWOT analysis is self-serving and, to be generous, NSFW.

Conversely, Jared would struggle to spell HTML.

Erlich, ostensibly the court jester, is arguably the smartest guy in the room. Despite his OTT bravado and general buffoonery, he proves his programming ability when he rolls up his sleeves and smashes out code to rescue the start-up from imploding, and he repeatedly uses his savvy to shepherd the fledgling business through the corporate jungle.

Despite the problems and challenges the start-up encounters throughout the series, it succeeds not because it is a team of unicorns, but because it comprises specialists and a generalist who work together as a team.

Unicorn silhouette

And so the art of Silicon Valley shows us how unlikely we would be, in real life, to recruit an expert statistician / computer programmer / business strategist. Each is a career in its own right that demands years of education and practice to develop. A jack-of-all-trades will inevitably be a master of none.

That is not to say a statistician can’t code, or a programmer will be clueless about the business. My point is, a statistician will excel at statistics, a computer programmer will excel at coding, while a business strategist will excel at business strategy. And I’m not suggesting the jack-of-all-trades is useless; on the contrary, he or she will be the glue that holds the specialists together.

So that raises the question… which one is the data scientist?

Since each is using data to inform business decisions, I say they all are.

Painting by numbers

3 June 2017

A lifetime ago I graduated as an environmental biologist.

I was one of those kids who did well in school, but had no idea what their vocation was. As a pimply teenager with minimal life experience, how was I to know even half the jobs that existed?

After much dilly-dallying, I eventually drew upon my nerdy interest in science and my idealistic zeal for conservation and applied for a BSc. And while I ultimately left the science industry, I consider myself extremely fortunate to have studied the discipline, because it has been the backbone of my career.

Science taught me to think about the world in a logical, systematic manner. It’s a way of thinking that is founded on statistics, and I maintain it should inform the activities we undertake in other sectors of society such as Learning & Development.

The lectures I attended and the exams I crammed for faded into a distant memory, until the emergence of learning analytics rekindled the fire.

A series of realisations rapidly dawned on me: I love maths and stats; I’ve drifted away from them over time; the world is finally waking up to the importance of the scientific method; and it’s high time I refocused my attention on it.

So it is in this context that I have started to review the principles of statistics and its contemporary manifestation, analytics. My exploration has been accompanied by several niggling queries: what’s the difference between statistics and analytics? Is the latter just a fancy name for the former? If not, how not?

Overlaying the post-modern notion of data science, what are the differences among the three? Is a data scientist, as Sean Owen jokingly suggests, a statistician who lives in San Francisco?

The DIKW Pyramid

My journey of re-discovery started with the DIKW Pyramid. This beguilingly simple triangle models successive orders of epistemology, which is quite a complex concept. Here’s my take on it…

The DIKW Pyramid, with Data at the base, Information a step higher, Knowledge another step higher, and Wisdom at the peak.

At the base of the pyramid, Data is a set of values of qualitative or quantitative variables. In other words, it is the collection of facts or numbers at your disposal that somehow represent your subject of study. For example, your data may be the weights of 10,000 people. While this data may be important, if you were to flick through the reams of numbers you wouldn’t glean much from them.

The next step up in the pyramid is Information. This refers to data that has been processed to make it intelligible. For example, if you were to calculate the average of those ten thousand weights, you’d have a comprehensible number that is inherently meaningful. Now you can do something useful with it.
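To make that data-to-information step concrete, here’s a minimal sketch in Python. (The 10,000 weights are simulated for illustration; the point is the single meaningful number that falls out at the end.)

```python
import random

# Simulated raw data: 10,000 body weights in kilograms.
# (Illustrative values only: mean 85 kg, standard deviation 15 kg.)
random.seed(42)
weights = [random.gauss(85, 15) for _ in range(10_000)]

# Flicking through the reams of numbers tells us little...
print(weights[:5])

# ...but one summary statistic is inherently meaningful.
average_weight = sum(weights) / len(weights)
print(f"Average weight: {average_weight:.1f} kg")  # data -> information
```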

The next step up in the pyramid is Knowledge. To avoid getting lost in a philosophical labyrinth, I’ll just say that knowledge represents understanding. For example, if you were to compare the average weight against a medical standard, you might determine these people are overweight.

The highest step in the pyramid is Wisdom. I’ll offer an example of wisdom later in my deliberation, but suffice it to say here that wisdom represents higher-order thinking that synthesises knowledge from various sources to generate insight. For example, the wise man or woman will not only know these people are overweight, but will also recognise they are at risk of disease.

Some folks describe wisdom as future focused, and I like that because I see it being used to inform decisions.

Statistics

My shorthand definition of statistics is the analysis of numerical data.

In practice, this is done to describe a population or to compare populations – that is to say, to infer significant differences between them.

For example, by calculating the average weight of 10,000 people in Town A, we describe the population of that town. And if we were to compare the weights of those 10,000 people with the weights of 10,000 people in Town B, we might infer the people in Town A weigh significantly more than the people in Town B do.

Similarly, if we were to compare the household incomes of the 10,000 people in Town A with the household incomes of the 10,000 people in Town B, we might infer the people in Town A earn significantly less than the people in Town B do.

Then if we were to correlate all the weights against their respective household incomes, we might demonstrate that the two are inversely related.
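For the statistically inclined, here’s a sketch of how those inferences might be computed in Python. The data are simulated and the tests are merely conventional choices (a two-sample t-test for the comparison, Pearson’s r for the correlation); only the workflow matters.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated data: weights (kg) and household incomes ($'000) for two towns.
# The numbers are invented purely to illustrate the tests.
weights_a = rng.normal(92, 15, 10_000)  # Town A: heavier on average
weights_b = rng.normal(85, 15, 10_000)  # Town B
incomes_a = rng.normal(55, 12, 10_000)  # Town A: lower incomes
incomes_b = rng.normal(70, 12, 10_000)  # Town B

# Describe each population.
print(f"Town A mean weight: {weights_a.mean():.1f} kg")
print(f"Town B mean weight: {weights_b.mean():.1f} kg")

# Infer whether the difference is significant (two-sample t-test).
t_stat, p_value = stats.ttest_ind(weights_a, weights_b)
print(f"Weight difference: t = {t_stat:.1f}, p = {p_value:.3g}")

# Correlate weight against household income across both towns.
r, _ = stats.pearsonr(np.concatenate([weights_a, weights_b]),
                      np.concatenate([incomes_a, incomes_b]))
print(f"Weight vs income: r = {r:.2f}")  # negative r => inversely related
```

A significant t-statistic and a negative correlation coefficient are both information distilled from the raw data, ripe for interpretation.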

The DIKW Pyramid, showing statistics converting data into information.

Thus, our statistical tests have used mathematics to convert our data into information. We have climbed a step up the DIKW Pyramid.

Analytics

My shorthand definition of analytics is the analysis of data to identify meaningful patterns.

So while analytics is often conflated with statistics, it is indeed a broader expression – not only in terms of the nature of the data that may be analysed, but also in terms of what is done with the results.

For example, if we were to analyse the results of our weight-related statistical tests, we might recognise an obesity problem in poor neighbourhoods.
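As a toy illustration of that pattern-spotting step, we might bucket the (again simulated) records into income brackets and compare average weights – something along these lines with pandas:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Simulated per-person records: household income ($'000) and weight (kg),
# with weight loosely inversely related to income for illustration.
income = rng.normal(60, 20, 20_000).clip(10, 150)
weight = 110 - 0.4 * income + rng.normal(0, 12, 20_000)
df = pd.DataFrame({"income": income, "weight": weight})

# Bucket people into income brackets and compare average weights.
df["bracket"] = pd.cut(df["income"], bins=[0, 40, 80, 150],
                       labels=["low", "middle", "high"])
print(df.groupby("bracket", observed=True)["weight"].mean().round(1))

# A markedly higher average in the "low" bracket is the kind of pattern
# that turns information into knowledge: an obesity problem concentrated
# in poorer neighbourhoods.
```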

The DIKW Pyramid, showing analytics converting data into knowledge.

Thus, our application of analytics has used statistics to convert our data into information, which we have then translated into knowledge. We have climbed another step higher in the DIKW Pyramid.

Data science

My shorthand definition of data science is the combination of statistics, computer programming, and domain expertise to generate insight. Or so I’m led to believe.

Given the powerful statistical software packages currently available, I don’t see why anyone would need to resort to hand coding in R or Python. At this early stage of my re-discovery, I can only assume the software isn’t sophisticated enough to compute the specific processes that people need.

Nonetheless, if we return to our obesity problem, we can combine our new-found knowledge with existing knowledge to inform strategic decisions. For example, given we know a healthy diet and regular exercise promote weight loss, we might seek to improve the health of our fellow citizens in poor neighbourhoods (and thereby lessen the burden on public healthcare) by building sports facilities there, or by subsidising salad lunches and fruit in school canteens.

The DIKW Pyramid, showing data science converting data into wisdom.

Thus, not only has our application of data science used statistics and analytics to convert data into information and then into knowledge, it has also converted that knowledge into actionable intelligence.

In other words, data science has converted our data into wisdom. We have reached the top of the DIKW Pyramid.

The relationship between learning and performance support

18 November 2014

This post is the third in a series in which I deliberate over the semantics of education.

I dedicate this one to Jane Hart, whom I was delighted to meet in person in Sydney last month. Jane is a renowned advocate of performance support in the workplace, and I wonder what she’ll make of my latest musing.

While much of Jane’s work exposes the difference between training and performance support – and implores us to do less of the former in favour of the latter – my post here does not. The difference between training and performance support proxies (at least IMHO) the difference between formal and informal learning, and I do not intend to rehash that which others such as Jane have already documented so well.

Instead, I intend to explore the relationship between learning and performance support, with the former considered in its informal context.

I hasten to add that while much of Jane’s treatment of informal learning is in terms of social media, for the purposes of my post I will remain within the scope of broadcast content that is published by or on behalf of SMEs for consumption by the masses. The platform I have in mind is the corporate intranet.

Business woman typing on computer

A healthy corporate intranet comprises thoughtfully structured information and resources to facilitate learning by the organisation’s employees. While this content is typically delivered in an instructivist manner by the SME, it is probably consumed in a constructivist manner by the end user.

Much of the content – if not most of it – is designed to be consumed before it needs to be applied on the job. Hence I refer to it as “pre-learning”. It is undertaken just in case it will be needed later on, and is thus vulnerable to becoming “scrap learning”.

But of course not all pre-learning is a waste of time; some of it will indeed be applied later on. However, it may be quite a while before this happens, so it’s important that the learner can refer back to the content to refresh his or her memory as the need arises. This might be called “re-learning” and it’s done just in time.

To support the learner in applying their learning on the job, tools such as checklists and templates may be provided to them for their immediate use. These tools are called “job aids” and they’re used in the workflow.

However, job aids aren’t the only form of performance support. Content of the same ilk as pre-learning may similarly be looked up just in time, even though it was never learned in the first place. Such concepts may be so straightforward that they need not be processed ahead of time.

Business meeting

To illustrate, consider the topic of difficult feedback.

James is a proactive manager who reads up about this topic on the corporate intranet, watches some scenarios, and perhaps even tries his hand at some simulations. But it’s not until an incident occurs a couple of months later that he needs to have that special conversation with a problematic team member. So he refers back to the intranet to brush up on the topic before going into the meeting armed with the knowledge and skills he needs for success.

Jennifer also explores this topic on the intranet while she’s between projects. Some time later she finds that she too needs to have a conversation with one of her team members, but she feels she doesn’t need to re-learn anything. Instead, she’s comfortable following the step-by-step guide on her iPad during the meeting, which gives her sufficient scaffolding to ensure the conversation is effective.

George, on the other hand, has been so busy that he hasn’t gotten around to exploring this topic on the intranet. However, he too finds that he must provide difficult feedback to one of his team members. So he quickly looks it up now, draws out the key points, and engages in the conversation armed with that knowledge.

The point of these scenarios is not to say that someone was right and someone was wrong, but rather to highlight that everyone is subjected to different circumstances. Sure, one of the conversations will probably be more effective than the others, but the point is that each of the managers is able to perform the task better than they otherwise would have.

Venn diagram showing the intersection of learning and performance support at JIT

So when we return to the relationship between learning and performance support, we see a subtle but important difference.

Learning is about preparing for performance. This preparation may be done well ahead of time or just in time.

Performance support is about, umm… supporting performance. This support may be provided in the moment or – again – just in time.

Hence we see an intersection.

But the ultimate question is: so what? Well, I think an awareness of this relationship informs our approach as L&D professionals. And our approach depends on our driver.

If our driver is to improve capability, then we need to facilitate learning. If our driver is to improve execution, then we need to facilitate performance support.

Arguably these are two different ways of looking at the same thing, and as the intersection in the Venn diagram shows, at least in that sense they are the same thing. So here we can kill two birds with one stone.

The grassroots of learning

22 October 2014

Here’s a common scenario: I “quickly” look up something on Wikipedia, and hours later I have 47 tabs open as I delve into tangential aspects of the topic.

That’s the beauty of hypertext. A link takes you somewhere else, which contains other links that take you somewhere else yet again. The Internet is thus the perfect vehicle for explaining the concept of rhizomatic learning.

Rhizomatic learning is something that I have been superficially aware of for a while. I had read a few blog posts by Dave Cormier (the godfather of the philosophy) and I follow the intrepid Soozie Bea (a card-carrying disciple), but unfortunately I missed Dave’s #rhizo14 mooc earlier in the year.

Since I’ve been blogging about the semantics of education lately, I thought it high time to dig a little deeper.

Bamboo with rhizome

It seems to me that rhizomatic learning is the pedagogical antithesis of direct instruction. Direct instruction has pre-defined learning outcomes with pre-defined content to match. The content is typically delivered in a highly structured format.

In contrast, rhizomatic learning has neither pre-defined learning outcomes nor pre-defined content. The learner almost haphazardly follows his or her own line of inquiry from one aspect of the subject matter to the next, then the next, and so forth, according to whatever piques his or her interest. Thus the learning cannot be predicted ahead of time.

Given my scientific background, I was already familiar with the rhizome. So is everyone else, incidentally, perhaps without realising it. A rhizome is the creeping rootstalk of a plant that explores the soil around it, sending out new roots and shoots as it goes along. A common example is bamboo, whose rhizome enables it to spread like wildfire.

In A Thousand Plateaus: Capitalism and Schizophrenia, Gilles Deleuze and Félix Guattari adopt the rhizome as a metaphor for the spread of culture throughout society. That’s a massive over-simplification, of course, and quite possibly wrong. The Outsider represents the extent of my French philosophy bookshelf!

Anyway, the point I’m bumbling towards is that Dave Cormier has picked up this philosophical metaphor and applied it to the wonderful world of learning. He explains in Trying to write Rhizomatic Learning in 300 words:

“Rhizomatic Learning developed as an approach for me as a response to my experiences working with online communities. Along with some colleagues we started meeting regularly online for live interactive webcasts starting in 2005 at Edtechtalk. We learned by working together, sharing our experiences and understanding. The outcomes of those discussions were more about participating and belonging than about specific items of content – the content was already everywhere around us on the web. Our challenge was in learning how to choose, how to deal with the uncertainty of abundance and choice presented by the Internet. In translating this experience to the classroom, I try to see the open web and the connections we create between people and ideas as the curriculum for learning. In a sense, participating in the community is the curriculum.”

I note that this explanation from 2012 is somewhat different from his paper in 2008, which of course reflects the evolution of the idea. In Rhizomatic Education: Community as Curriculum, Dave similarly mentioned the abundance of content on the Internet, along with the shrinking half-life of knowledge. He contrasted the context of traditional education – in which experts are the custodians of a canon of accepted thought that is presumed to remain relatively stable – with today’s context, in which knowledge changes so quickly as to render the traditional notion of education flawed.

Dave posited that educational technology is a prime example. Indeed, when I studied this discipline at university, much of the learning theory (for instance) enjoyed a broad canon of knowledge to which students such as myself could refer. It was even documented in textbooks. Other aspects of the subject (for instance, the rapid advances in technology, and the pedagogical shifts towards social and informal learning) could not be compared against any such canon. The development of this knowledge was so rapid that we students relied as much on each other’s recent experiences and on sharing our personal learning journeys as we did on anything the professor could supply.

“In the rhizomatic model of learning, curriculum is not driven by predefined inputs from experts; it is constructed and negotiated in real time by the contributions of those engaged in the learning process. This community acts as the curriculum, spontaneously shaping, constructing, and reconstructing itself and the subject of its learning in the same way that the rhizome responds to changing environmental conditions.”

From 2008 to 2012, I see a shift in Dave’s language from Rhizomatic Education to Rhizomatic Learning. This I think is a better fit for the metaphor, as while it may be argued that the members of the community are “teaching” one another, the driving force behind the learning process is the active learner who uses the community as a resource and makes his or her own decisions along the way.

I also note the change from “the community is the curriculum” to “participating in the community is the curriculum”. Another semantic shift that I think is closer to the mark, but perhaps still not quite there. I suggest that the content created by the members of the community is the curriculum. In other words, the curriculum is the output that emerges from participating in the community. So “participating in the community produces the curriculum”.

As a philosophy for learning, then, rhizomatic learning is not so different from constructivism, connectivism, and more broadly, andragogy. The distinguishing feature is the botanical imagery.

However this is where my understanding clouds over…

Is it the abundance of content “out there” that is rhizomatic?

Or is it the construction of new knowledge that is rhizomatic?

Or is it the learning journey that is undertaken by the individual learner?

Perhaps such pedantic questions are inconsequential, but the scientist in me demands clarification. So I propose the following:

 

The knowledge that is constructed by the community is the rhizome.

The process of constructing the knowledge
by the members of the community is rhizomatic education.

The process of exploring, discovering and consuming the knowledge
by the individual learner is rhizomatic learning.

 

If we return to my Wikipedia scenario, we can use it as a microcosm of the World Wide Web and the universe more broadly:

The ever-expanding Wikipedia is the rhizome.

The Wikipedians are conducting rhizomatic education.

I, the Average Joe who looks it up and loses myself in it for hours on end,
am experiencing rhizomatic learning.

In the age of Web 2.0, Average Joe may also be a Wikipedian. Hence we can all be rhizomatic educators and rhizomatic learners.

Barnstar

I also detect a certain level of defensiveness from Dave in his early paper. He prefaces his work with a quote from Henrik Ibsen’s An Enemy of the People that rejoices in the evolution of “truth” in the face of conventional resistance [my interpretation], while later on he addresses the responses of the “purveyors of traditional educational knowledge” – primarily in the realms of academic publishing and intellectual property.

I think Dave was right to be defensive. Despite the pervasive learnification of education that would theoretically promote rhizomatic learning as its poster boy, anything new that threatens the status quo is typically met with outrage from those who stand to lose out.

A case in point is moocs. Dave refers to Alec Couros’s graduate course in educational technology, which was a precursor to his enormously popular #ETMOOC. While a cMOOC such as this one may be the epitome of the rhizomatic philosophy, I contend that it also applies to the xMOOC.

You see, while the xMOOC is [partly] delivered instructivistly, those darn participants still learn rhizomatically! And so the traditionalists delight in the low completion rates of moocs, while the rest of us appreciate that learning (as opposed to education) simply doesn’t work that way – especially in the digital age.

Don’t get me wrong: I am no anti-educationalist. Regular readers of my blog will not find it surprising when I point out that sometimes the rhizomatic model is not appropriate. For example, when the learner is a novice in a particular field, they don’t know what they don’t know. As I was alluding to via my tweet to Urbie in lrnchat, sometimes there is a central and stable canon of knowledge and the appointed expert is best placed to teach it to you.

I also realise that while an abundance of knowledge is indeed freely available on the Internet, not all of it is. It may be hidden in walled gardens, or not on the web at all. Soozie makes the point that information sources go beyond what the web and other technologies can channel. “Information that is filtered, classified or cleansed, consolidated or verified may also come from formal, non-formal or informal connections including teachers, friends, relatives, professional colleagues and recognized experts in the field.” But I take her point that all this is enhanced by technology.

Finally, the prominence of rhizomatic learning will inevitably increase as knowledge continues to digitise and our lens on learning continues to informalise. In this context, I think the role of the instructor needs much more consideration. While Dave maintains that the role is to provide an introduction to an existing learning community in which the student may participate, there is obviously more that we L&D pros must do to fulfil our purpose into the future.

On that note I’ll rest my rhizomatic deliberation on rhizomatic learning. If you want to find out more about this philosophy, I suggest you look it up on Wikipedia.