Tag: semantics

Great and small

English is a funny language.

Coloured by countless other languages over centuries of war, politics, colonialism, migration and globalisation, English has lost, appropriated or invented many words, while others have changed their meaning.

In Australian English for example, fair dinkum means “true” or “genuine”. Linguaphiles speculate the phrase originated in 19th Century Lincolnshire, where “dinkum” referred to a fair amount of work, probably in relation to a stint down the mines. Add a tautology and 10,000 miles, and you have yourself a new lingo.

Thousands of other English words have their origins in ancient Greek. One pertinent example for L&D practitioners is pedagogy (formerly paedagogie) which derives from the Hellenic words paidos for “child” and agogos for “leader”. This etymology underscores our use of the word when we mean the teaching of children.

And yet our language is nuanced. We may alternatively use pedagogy to mean the general approach to teaching and learning. Not necessarily teaching, not necessarily children. In this broader sense it’s an umbrella term that may also cover andragogy – the teaching of adults – and heutagogy – self-determined learning.

For example, when Tim Fawns, the Deputy Programme Director of the MSc in Clinical Education at the University of Edinburgh, blogged his thoughts about pedagogy and technology from a postdigital perspective, he defined pedagogy in the university setting as “the thoughtful combination of methods, technologies, social and physical designs and on-the-fly interactions to produce learning environments, student experiences, activities, outcomes or whatever your preferred way is of thinking about what we do in education”.

When Trevor Norris and Tara Silver examined positive aging as consumer pedagogy, they were interested in how informal learning in a commercial space influences the mindset of its adult patrons.

And when I use the word pedagogy in my capacity as an L&D professional in the corporate sector, I’m referring to the full gamut of training, coaching, peer-to-peer knowledge sharing, on-the-job experiences and performance support for my colleagues across 70:20:10.

So while I assume (rightly or wrongly) that the broader form of the term “pedagogy” is implicitly understood by my peers when it’s used in that context, I spot an opportunity for the narrower form to be clarified.

Evidently, modern usage of the word refers not only to the teaching of children but also to the teaching of adults. Whether they’re students, customers or colleagues, the attribute they have in common with kids is that they’re new to the subject matter. Hence I support the Oxford English Dictionary’s definition of pedagogy as the practice of teaching, regardless of the age of the target audience.

If pedagogy includes adults, then logic dictates we also review the exclusivity of the term andragogy. Sometimes children are experienced with the subject matter; in such cases, an andragogical approach that draws upon their existing knowledge, ideas and motivations would be applicable. Hence I dare to depart from the OED’s definition of andragogy as the practice of teaching adults, in favour of the facilitation of learning. Again, regardless of the age of the target audience.

With regard to heutagogy, I accept Hase & Kenyon’s coinage of the term as the study of self-determined learning; however in the context of our roles as practitioners, I suggest we think of it as the facilitation of self-determined learning. That makes heutagogy a subset of andragogy, but whereas the latter will have us lead the learners by pitching problems to them, hosting Socratic discussions with them and perhaps curating content for them, the former is more about providing them with the tools and capabilities that enable them to lead their own learning journeys.

[Figure: a tree flowing from Pedagogy down to Pedagogy, Andragogy and Heutagogy, aligned respectively with Instructivism, Constructivism and Connectivism, and with Novices, Intermediates and Experts.]

This reshaping of our pedagogical terminology complements another tri-categorisation of teaching and learning: instructivism, constructivism and connectivism.

As the most direct of the three, instructivism is arguably more appropriate for engaging novices. Thus it aligns to the teaching nature of pedagogy.

When the learner moves beyond noviceship, constructivism is arguably more appropriate for helping them “fill in the gaps” so to speak. Thus it aligns to the learning nature of andragogy.

And when the learner attains a certain level of expertise, a connectivist approach is arguably more appropriate for empowering them to source new knowledge for themselves. Thus it aligns to the self-determined nature of heutagogy.

Hence the principle remains the same: the approach to teaching and learning reflects prior knowledge. Just like instructivism, constructivism and connectivism – depending on the circumstances – pedagogy, andragogy and heutagogy apply to all learners, great and small.

Roses are red

Seemingly overnight, the L&D profession has started to struggle with the definition of terms such as “capability”, “competency” and “skill”.

Some of our peers consider them synonyms – and hence interchangeable – but I do not.

Indeed I recognise subtle but powerful distinctions among them, so here’s my two cents’ worth to try to cut through the confusion.

Competency

From the get-go, the terms are most clearly distinguished when we consider a competency a task. It is something that is performed.

Our friends in vocational education have already figured this out. For example, if we refer to the Tap furnaces unit of competency documented by the Australian Department of Education, Skills and Employment, we see elements such as Plan and prepare for furnace tapping and Tap molten metal from furnace.

Importantly, we also see performance criteria, evidence and assessment conditions. Meeting a competency is therefore binary: either you can perform the task successfully (you are “competent”) or you cannot (in the positive parlance of educationalists, you are “not yet competent”).

Capability

Given a competency is a task, a capability is a personal attribute you draw upon to perform it.

An attribute may be knowledge (something you know, eg tax law), a skill (something you can do, eg speak Japanese), or a mindset (a state of being, eg agile).

I consider capability an umbrella term for all these attributes; they combine with one another to enable the behaviour that meets the competency.

Frameworks

Going by the definitions I’ve outlined above, the “capability frameworks” we see in the workplace are frequently mislabelled “competency frameworks”, and vice versa.

Terms such as Decision Making and Data Analysis are capabilities – not competencies – and moreover they are skills. Hence, not only would I prefer they be referred to as such, but also that they adopt an active voice (Make Decisions, Analyse Data).

I also suggest they be complemented by knowledge and mindsets; otherwise the collection isn’t so much a capability framework as a “skills framework” – which is fine, but self-limiting.
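
To make the distinction concrete, here’s a minimal sketch of how these definitions might be modelled in code. All of the class names, attributes and the example task are my own hypothetical choices, not drawn from any published framework.

```python
# A hypothetical model of the definitions above: a capability is an umbrella
# for knowledge, skills and mindsets, while a competency is a task that is
# assessed with a binary outcome.
from dataclasses import dataclass, field
from enum import Enum

class AttributeKind(Enum):
    KNOWLEDGE = "knowledge"  # something you know, eg tax law
    SKILL = "skill"          # something you can do, eg speak Japanese
    MINDSET = "mindset"      # a state of being, eg agile

@dataclass
class Capability:
    name: str                # active voice, eg "Analyse Data"
    kind: AttributeKind

@dataclass
class Competency:
    task: str                # something that is performed
    draws_upon: list[Capability] = field(default_factory=list)

    def assess(self, criteria_met: bool) -> str:
        # Binary: competent, or (in the positive parlance) not yet competent
        return "competent" if criteria_met else "not yet competent"

tap_furnace = Competency(
    task="Tap molten metal from furnace",
    draws_upon=[Capability("Apply Safe Tapping Procedures", AttributeKind.SKILL)],
)
print(tap_furnace.assess(criteria_met=True))  # -> competent
```

The point of the sketch is the shape: the attributes combine under the capability umbrella, while the competency owns the task and its binary assessment.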

Deployment

I have previously argued in favour of the L&D team deploying a capability framework as a strategic imperative, but now the question that begs to be asked is: should we deploy a capability framework or a competency framework?

My typical answer to a false dichotomy like this is both.

Since capabilities represent a higher level of abstraction, they are scalable across the whole organisation and are transferable from role to role and gig to gig. They also tend to be generic, which means they can be procured in bulk from a third party, and their low volatility makes them sustainable. The value they offer is a no-brainer.

In contrast, competencies are granular. They’re bespoke creations specific to particular roles, which makes them laborious to build and demanding to maintain. Having said that, their level of personalised value is sky high, so I advise they be deployed where they are warranted – targeting popular roles and pivotal roles, for example.

Semantics

A rose by any other name would smell as sweet.

Yet a rose is not a violet.

In a similar manner I maintain that capabilities and competencies are, by definition, different.

In any case, if we neglect them, the next term we’ll struggle to define is “service offering”.

The leader’s new clothes

From $7 billion to nearly $14 billion.

That’s how much American corporations’ spending on leadership training grew over the 15 years preceding Kaiser and Curphy’s 2013 paper, Leadership development: The failure of an industry and the opportunity for consulting psychologists.

Over that same period we witnessed the bursting of the dot-com bubble, the implosion of Enron, and of course the Global Financial Crisis. While the causes of these unfortunate events are complicated, our leaders were evidently ill-equipped to prevent them.

Despite the billions of dollars’ worth of training invested in them.

For a long time I felt like the child who could see the emperor wasn’t wearing any clothes. Then Jeffrey Pfeffer visited Sydney.

Pfeffer is the Professor of Organizational Behavior at Stanford University’s Graduate School of Business. He was promoting a book he had published, Leadership BS: Fixing Workplaces and Careers One Truth at a Time, in which he states what I (and no doubt many others) had been thinking: leadership training is largely ineffective.

At a breakfast seminar I attended, the professor demonstrated how decades of development had no positive impact on metrics such as employee engagement, job satisfaction, leader tenure, or leader performance. He posited numerous reasons for this, all of them compelling.

Today I’d humbly like to add one more to the mix: I believe managers get “leadership” training when what they really need is “management” training.

They’re exhorted to adopt best practice before they even know what to do. It’s the classic case of putting the cart before the horse.

For example, the managers in an organisation might attend a workshop on providing effective feedback, leveraging myriad models and partaking in roleplays – when what they really need to know is that they should be having an hour-long 1:1 conversation with each of their team members every fortnight.

Other examples include training in unconscious bias, emotional intelligence and strategic thinking; yet they don’t know how to hire new staff, process parental leave, or write a quarterly business plan. Worse still, many won’t realise they’re expected to do any of that until the horse has bolted.

I’m not suggesting leadership training is unimportant. On the contrary it’s critical. What I am saying is that it’s illogical to buy our managers diamond cufflinks when they don’t yet own a shirt.

At this juncture I think semantics are important. I propose the following:

  • Management training is what to do and how to do it.
  • Leadership training is how to do it better.

In other words, management training is the nuts & bolts. The foundation. It’s what our expectations are of you in this role, and how to execute those expectations – timelines, processes, systems, etc. It focuses on minimum performance to ensure it gets done.

In contrast, leadership training drives high performance. Now you’ve got the fundamentals under your belt, here’s how to broaden diversity when hiring new staff. Here’s how to motivate and engage your team. Here’s how to identify opportunities for innovation and growth.

$14 billion is a lot of money. Let’s invest it in a new wardrobe, starting with the underwear.

The sum of us

What is the definition of the term “data scientist”…?

In my previous post, Painting by numbers, I offered a shorthand definition of data science based on what I could synthesise from the interwebs. Namely, it is the combination of statistics, computer programming, and domain expertise to generate insight. It follows, then, that the definition of data scientist is someone who has those skill sets.

Fat chance!

In this post I was going to articulate my observation that, in the real world, incredibly few people could be considered masters of all three disciplines. I was then going to suggest that rather than seeking out these unicorns, employers should build data science teams comprising experts with complementary talents. I say “was” because I subsequently read this CIO article by Thor Olavsrud in which he quotes Bob Rogers saying, well… that.

Given Thor and Bob have stolen my thunder (18 months ago!), I think the only value I can add now is to draw a parallel with pop culture – which I’ll do via the geeky HBO sitcom Silicon Valley.

[Image: the cast of Silicon Valley – Dinesh, Gilfoyle, Richard, Jared and Erlich.]

If you aren’t familiar with this series, the plot revolves around the trials and tribulations of a start-up called Pied Piper. Richard is the awkward brainiac behind a revolutionary data compression algorithm, and he employs a sardonic network engineer, Gilfoyle, and another nerdy coder, Dinesh, to help bring it to market. The other team members are the ostentatious Erlich – in whose incubator (house) the group can work rent-free in exchange for a 10% stake – and Jared, a mild-mannered economics graduate who could have been plucked from the set of Leave It to Beaver.

The three code monkeys are gifted computer scientists, but they have zero business acumen. They are entirely dependent on Jared to write up their budgets and forecasts and all the other tickets required to play in the big end of town. Gilfoyle and Dinesh’s one attempt at a SWOT analysis is self-serving and, to be generous, NSFW.

Conversely, Jared would struggle to spell HTML.

Erlich, ostensibly the court jester, is arguably the smartest guy in the room. Despite his OTT bravado and general buffoonery, he proves his programming ability when he rolls up his sleeves and smashes out code to rescue the start-up from imploding, and he repeatedly uses his savvy to shepherd the fledgling business through the corporate jungle.

Despite the problems and challenges the start-up encounters throughout the series, it succeeds not because it is a team of unicorns, but because it comprises specialists and a generalist who work together as a team.

And so the art of Silicon Valley shows us how unlikely we would be in real life to recruit an expert statistician / computer programmer / business strategist. Each is a career in its own right that demands years of education and practice to develop. A jack-of-all-trades will inevitably be a master of none.

That is not to say a statistician can’t code, or a programmer will be clueless about the business. My point is, a statistician will excel at statistics, a computer programmer will excel at coding, while a business strategist will excel at business strategy. And I’m not suggesting the jack-of-all-trades is useless; on the contrary, he or she will be the glue that holds the specialists together.

So that begs the question… which one is the data scientist?

Since each is using data to inform business decisions, I say they all are.

Painting by numbers

A lifetime ago I graduated as an environmental biologist.

I was one of those kids who did well in school, but had no idea what his vocation was. As a pimply teenager with minimal life experience, how was I to know even half the jobs that existed?

After much dilly-dallying, I drew upon my nerdy interest in science and my idealistic zeal for conservation and applied for a BSc. And while I eventually left the science industry, I consider myself extremely fortunate to have studied the discipline, because it has been the backbone of my career.

Science taught me to think about the world in a logical, systematic manner. It’s a way of thinking that is founded on statistics, and I maintain it should inform the activities we undertake in other sectors of society such as Learning & Development.

The lectures I attended and the exams I crammed for faded into a distant memory, until the emergence of learning analytics rekindled the fire.

A succession of realisations rapidly dawned on me: I love maths and stats; I’ve drifted away from them over time; the world is finally waking up to the importance of the scientific method; and it’s high time I refocused my attention on it.

So it is in this context that I have started to review the principles of statistics and its contemporary manifestation, analytics. My exploration has been accompanied by several niggling queries: what’s the difference between statistics and analytics? Is the latter just a fancy name for the former? If not, how not?

Overlaying the post-modern notion of data science, what are the differences among the three? Is a data scientist, as Sean Owen jokingly attests, a statistician who lives in San Francisco?

The DIKW Pyramid

My journey of re-discovery started with the DIKW Pyramid. This beguilingly simple triangle models successive orders of epistemology, which is quite a complex concept. Here’s my take on it…

[Figure: the DIKW Pyramid, with Data at the base, Information a step higher, Knowledge another step higher, and Wisdom at the peak.]

At the base of the pyramid, Data is a set of values of qualitative or quantitative variables. In other words, it is the collection of facts or numbers at your disposal that somehow represent your subject of study. For example, your data may be the weights of 10,000 people. While this data may be important, if you were to flick through the reams of numbers you wouldn’t glean much from them.

The next step up in the pyramid is Information. This refers to data that has been processed to make it intelligible. For example, if you were to calculate the average of those ten thousand weights, you’d have a comprehensible number that is inherently meaningful. Now you can do something useful with it.

The next step up in the pyramid is Knowledge. To avoid getting lost in a philosophical labyrinth, I’ll just say that knowledge represents understanding. For example, if you were to compare the average weight against a medical standard, you might determine these people are overweight.

The highest step in the pyramid is Wisdom. I’ll offer an example of wisdom later in my deliberation, but suffice it to say here that wisdom represents higher-order thinking that synthesises knowledge to generate insight. For example, the wise man or woman will not only know these people are overweight, but also recognise they are at risk of disease.

Some folks describe wisdom as future focused, and I like that because I see it being used to inform decisions.
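
To ground the four levels, here’s a minimal sketch of the climb in code, using the weight example. The simulated figures and the medical threshold are hypothetical.

```python
# A minimal sketch of climbing the DIKW Pyramid with the weight example.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(88, 15, 10_000)    # Data: 10,000 individual weights (kg)

information = data.mean()            # Information: one comprehensible number

OVERWEIGHT_THRESHOLD_KG = 85         # hypothetical medical standard
knowledge = information > OVERWEIGHT_THRESHOLD_KG  # Knowledge: they are overweight

if knowledge:                        # Wisdom: future-focused insight informing a decision
    print("This population is at elevated risk of disease.")
```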

Statistics

My shorthand definition of statistics is the analysis of numerical data.

In practice, this is done to describe a population or to compare populations – that is to say, infer significant differences between them.

For example, by calculating the average weight of 10,000 people in Town A, we describe the population of that town. And if we were to compare the weights of those 10,000 people with the weights of 10,000 people in Town B, we might infer the people in Town A weigh significantly more than the people in Town B do.

Similarly, if we were to compare the household incomes of the 10,000 people in Town A with the household incomes of the 10,000 people in Town B, we might infer the people in Town A earn significantly less than the people in Town B do.

Then if we were to correlate all the weights against their respective household incomes, we might demonstrate that they are inversely related to one another.

[Figure: the DIKW Pyramid, showing statistics converting data into information.]

Thus, our statistical tests have used mathematics to convert our data into information. We have climbed a step up the DIKW Pyramid.
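
As a sketch of those tests in code, with simulated data standing in for the two towns (the sample means and the weight–income relationship are invented purely for illustration):

```python
# Simulated weights and household incomes for two hypothetical towns.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
incomes_a = rng.normal(55_000, 12_000, 10_000)   # Town A earns less...
incomes_b = rng.normal(70_000, 12_000, 10_000)
weights_a = 110 - 0.0004 * incomes_a + rng.normal(0, 8, 10_000)   # ...and weighs more
weights_b = 110 - 0.0004 * incomes_b + rng.normal(0, 8, 10_000)

# Describe each population
print(f"Town A mean weight: {weights_a.mean():.1f} kg")
print(f"Town B mean weight: {weights_b.mean():.1f} kg")

# Infer: is the difference between the towns significant? (two-sample t-test)
t_stat, p_value = stats.ttest_ind(weights_a, weights_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")

# Correlate weight against household income (expecting a negative r)
r, p = stats.pearsonr(np.concatenate([weights_a, weights_b]),
                      np.concatenate([incomes_a, incomes_b]))
print(f"Pearson r = {r:.2f}")
```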

Analytics

My shorthand definition of analytics is the analysis of data to identify meaningful patterns.

So while analytics is often conflated with statistics, it is indeed a broader expression – not only in terms of the nature of the data that may be analysed, but also in terms of what is done with the results.

For example, if we were to analyse the results of our weight-related statistical tests, we might recognise an obesity problem in poor neighbourhoods.

[Figure: the DIKW Pyramid, showing analytics converting data into knowledge.]

Thus, our application of analytics has used statistics to convert our data into information, which we have then translated into knowledge. We have climbed another step higher in the DIKW Pyramid.
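
Continuing the sketch, the analytics step might slice the same kind of data by income bracket to surface the pattern. The column names and the overweight threshold are again hypothetical.

```python
# Group simulated townsfolk by income quartile and compare overweight rates.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
incomes = rng.normal(60_000, 15_000, 20_000)
weights = 110 - 0.0004 * incomes + rng.normal(0, 8, 20_000)

df = pd.DataFrame({"weight": weights, "income": incomes})
df["bracket"] = pd.qcut(df["income"], 4, labels=["lowest", "low", "high", "highest"])
df["overweight"] = df["weight"] > 85   # hypothetical medical standard

# Knowledge: the overweight rate is concentrated in the poorer brackets
print(df.groupby("bracket", observed=True)["overweight"].mean())
```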

Data science

My shorthand definition of data science is the combination of statistics, computer programming, and domain expertise to generate insight. Or so I’m led to believe.

Given the powerful statistical software packages currently available, I don’t see why anyone would need to resort to hand coding in R or Python. At this early stage of my re-discovery, I can only assume the software isn’t sophisticated enough to compute the specific processes that people need.

Nonetheless, if we return to our obesity problem, we can combine our new-found knowledge with existing knowledge to inform strategic decisions. For example, given we know a healthy diet and regular exercise promote weight loss, we might seek to improve the health of our fellow citizens in poor neighbourhoods (and thereby lessen the burden on public healthcare) by building sports facilities there, or by subsidising salad lunches and fruit in school canteens.

[Figure: the DIKW Pyramid, showing data science converting data into wisdom.]

Thus, not only has our application of data science used statistics and analytics to convert data into information and then into knowledge, it has also converted that knowledge into actionable intelligence.

In other words, data science has converted our data into wisdom. We have reached the top of the DIKW Pyramid.