Sunday, 3 January 2016

From Courses to Campaigns: Using the 70:20:10 Approach

One of the major strategic objectives for many HR and L&D departments in 2016 and beyond will be to extend their focus and services beyond courses and out into the workplace.

There are many reasons why this objective makes good sense.

Firstly, we know that learning is a powerful and continuous process that occurs daily at work and throughout life. Courses may help with the basics, or to refresh our knowledge, but courses alone won’t deliver high performance. Other activities in the workplace – such as challenging experiences, opportunities to practice in ‘real’ situations, support, advice and guidance from colleagues, and reflection – are all more important than courses in helping us do that. If we put all our effort and resources only into designing, developing and delivering courses, we may be helping people to some extent, but we’re only supporting one aspect of organisational learning and performance improvement.

Secondly, we also know that context is vital for effective learning. Learning is more powerful and more likely to result in behaviour change when the learning context and the working context are identical. In other words, results are improved when ‘work is learning and learning is the work’, as Harold Jarche has pointed out many times. We almost invariably learn best by ‘doing’ in the context of our work. The next best option is where the learning context very closely represents the work environment where new capabilities are to be applied. That’s why there is such huge investment in immersive simulators for training by the military, the aviation industry, the nuclear industry, space programmes, and an increasing number of other industries. It’s cheaper and more practical to learn how to drive a tank, land an aircraft or space vehicle, or manage a nuclear power plant safely in a simulator than risk the cost and damage of making errors in the real thing.

Thirdly, learning is invariably more impactful when we solve real problems and find real solutions ourselves. Business education has understood this fact for years, but rather than designing ways that allow experienced business school professors to support and mentor managers in solving their own real problems in their own context, most use a proxy called the case method. The Harvard case-study method was designed to give emerging leaders opportunities to develop through the analysis of real organisations’ real problems. A good idea, but they are not the students’ own organisations or their organisations’ own problems. The resulting point of failure with the case method is that it often leads to superficial analysis with little or no understanding of the deeper, personal context. Henry Mintzberg of McGill University, a renowned academic and author on business and management, has been challenging this ‘proxy’ learning via the case method for many years:

“The most obvious example, I think, of where it goes wrong is in the case-study method: give me 20 pages and an evening to think about it and I'll give you the decision tomorrow morning. It trains people to provide the most superficial response to problems, over and over again getting the data in a nice, neat, packaged form and then making decisions on that basis. It encourages managers to be disconnected from the people they're managing”1.

Looking across the entire landscape of organisational learning and development, we see similar proxies to the case method being used. Virtually all of them are wrapped up in an ‘event’ concept – often called the course, workshop, programme (or program), module etc. They are constructs based on the concept that experts are best placed to tell people what they need to learn, how they need to learn it and when they need to learn it.2

Jane Hart, in her recent article 2016: Rethinking workplace learning, points out that this approach is really ‘workplace training’, and that although it may help, it is only one (small) part of the larger process of workplace learning. Another term for ‘workplace training’ is adding learning to work. Adding learning to work is only one way learning and work can be integrated. It is still learning focused (which makes it an obvious first step for L&D professionals). Adding learning is certainly better than removing learning entirely from work, but it is only one step towards integrated learning and working.

Beyond the ‘adding’ step there are others: embedding learning in work (through approaches such as performance support, checklists, FAQs and many other methods); extracting learning from work (through reflection, learning logs, work narration, personal micro-blogging and many other methods); and sharing learning with work colleagues (through ‘working out loud’, ‘showing your work’ – see Jane Bozarth’s great book of the same name – storytelling, team reviews and many other methods).

extending learning

In her article Jane Hart also hits on one of the major change factors necessary to enable the objective of extending learning beyond the course and into daily workflow – the right mindset.

Beyond the Course Mindset

The ‘course mindset’ is sometimes a difficult one to cast off. The default solution to human performance problems (a course or programme) is deeply embedded in most HR and learning professionals’ psyches and in our own development experiences. We’ve all been through courses at school and college, on programmes at university and in our workplace. Why should there be better ways?

There often are better ways. But they require a different way of thinking in order to define the best solutions, and different approaches to implement them. This is why it is better to approach performance challenges with a campaign mindset than a course mindset.

  • In the course mindset, the output is seen as ‘learning’. In the campaign mindset, the output is improved performance – organisational performance, team performance, and individual performance.
  • In the course mindset, we start with an analysis of the training need. In the campaign mindset we start by understanding the business or organisational problem, the associated performance problems and the root causes of each.
  • In the course mindset we then undertake course design. In the campaign mindset we then analyse the problems, identify the desired changes and identify potential ‘70’, ‘20’ and ‘10’ solutions.
  • In the course mindset we develop our solution for individuals and, sometimes, for teams. In the campaign mindset we develop solutions with organisational performance in mind.
  • In the course mindset we focus on aligning learning with work. In the campaign mindset we work to embed learning in work, and enhance extracting and sharing learning from work as well.
  • In the course mindset, we’re principally input focused. In the campaign mindset, we’re absolutely output focused.

Finally, in the course mindset we tend to produce only ‘10’ solutions. These are structured learning solutions that sit within the ‘10’ part of the 70:20:10 model. In the campaign mindset, we produce ‘100’ solutions. These are solutions that draw on the ‘70’, ‘20’ and ‘10’ aspects of 70:20:10.

My previous article ‘Start with the 70. Plan for the 100’ explains why the ‘70’ and ‘20’ aspects are likely to provide the greatest value. That’s where HR and L&D departments need to be focusing if they’re to extend their focus and services beyond courses and out into the workplace and, in doing so, increase the impact of their work.

My friend Lars Hyland has also written about moving from courses to campaigns. An article by Lars in 2009, titled ‘Get Real: Mission Critical E-Learning’ and published in the UK Learning Technologies magazine, stressed the need for ‘joined-up’ working between the typically disconnected internal functions of Internal Communications, Training, and Performance Management. In that article Lars made the following point as part of his AGILE approach: “Thinking end to end means adopting ‘campaign’ rather than ‘course’ led programmes designed to effect real changes in attitudes, behaviour and performance”. This is very much in line with the approach I am recommending here.

Tools to Get There

The recent book by Arets, Jennings and Heijnen, ‘70:20:10 towards 100% performance’, explains in detail how organisations can make this move from courses to campaigns by using the 70:20:10 approach and architect effective solutions with the ‘100’ in mind.

In this book we’ve defined a new set of roles that need to be fulfilled and tasks that need to be completed to make the change. Each of the roles is focused on outputs – performance - and the tasks are, in many cases, very different to the tasks carried out in most L&D departments today. In fact, some of the roles and tasks are not specifically linked to L&D and may (or will) sit in other parts of the organisation.

We’ve also designed, and are launching, an Expert Programme to help organisations exploit the 70:20:10 approach more effectively. Details of the programme are here, together with a downloadable brochure containing further details and feedback from previous participants. The programme will be launched globally early in 2016.


Roles in the new world of 70:20:10

---------

1. The Economist: An interview with Henry Mintzberg http://www.economist.com/node/850703

2. Jane Hart 2016: Rethinking workplace learning http://www.c4lpt.co.uk/blog/2016/01/02/2016-rethinking-workplace-learning/

Sunday, 6 December 2015

Start with the 70. Plan for the 100.

This article draws on ideas and supporting material from a new book published for the first time in English last week.

702010 towards 100% performance
by Jos Arets, Charles Jennings & Vivian Heijnen

Copyright: Sutler Media
Language: English
Pages: 313
Size: 30.5 cm x 23.5 cm (12 x 9.25 inches)

It provides the first comprehensive and practical guidance for supporting the 70:20:10 model.

BUY THE BOOK: www.702010institute.com

The book is divided into 100 numbered sections across 313 pages in ‘coffee table’ format. Just eight of these sections are devoted to the problems. The other 92 provide solutions.

    • Full explanations of how the 70:20:10 approach can be used to help overcome the ‘training bubble’

    • Descriptions of five new performance-focused roles to support the use of 70:20:10

    • The detailed tasks that need to be executed in each of these roles, with task lists, models and guidelines.

    • Checklists to rate your own organisation’s ability to deliver the critical tasks supporting 70:20:10

    • Nine ‘cameos’ written by leading thinkers and practitioners including Dennis Mankin (Platinum Performance), Nigel Harrison (Performance Consulting), Clark Quinn (Quinnovation), Jane Hart (Centre for Learning & Performance Technologies), Bob Mosher (APPLY Synergies), Jack Tabak (Chief Learning Officer, Royal Dutch Shell), Jane Bozarth (US Government) and others.

    • A 12-page bibliography with a wealth of references to supporting papers, books, articles, case studies and other material.

 

Start with the 70. Plan for the 100.

Extending Learning into the Workflow

Many Learning & Development leaders are using the 70:20:10 model to help them re-position their focus for building and supporting performance across their organisations. They are finding it helps them extend the focus on learning out into the workflow.

The 70, 20 and 10 categories refer to different ways people learn and acquire the habits of high performance. ‘70’ activities are centred on experiential learning and learning through support in the workplace; ‘20’ solutions are centred on social learning and learning through others; and ‘10’ solutions are centred on structured or formal learning.

  • 10 solutions include training and development courses and programmes, eLearning modules and reading.
  • 20 solutions include sharing and collaboration, co-operation, feedback, coaching and mentoring.
  • 70 solutions include near real-time support, information sources, challenges and situational learning.

Traditionally, L&D has been responsible for services in the ‘10’, and sometimes for more structured elements in the ‘20’ (such as coaching and mentoring programmes). 

The ‘10’ has primarily involved designing, developing and implementing structured training and development interventions. When done well, these ‘10’ interventions can successfully help to build performance. However, learning which occurs closer to the time and place where it is to be used has a greater chance of being turned into action and resulting in performance improvement.

The closer learning is to work, usually the better.

In other words, 10 solutions are likely to have less business impact and provide less value than the 70 and 20 solutions in the long run.

Increasing the Value of Learning

That’s an important point worth repeating. As learning is highly contextual, and improved performance is the critical desired outcome, the closer learning occurs to the point of use, the greater its likely impact.

This point is illustrated in the diagram below, taken from the ‘70:20:10 towards 100% performance’ book. As you move from the 10 and closer to the workflow (where most of the 20 and 70 happen), the potential for impact and realised value increases.

Fig 2.3

This aligns with a model developed some years ago by IBM Consulting Services (see below) to explain the evolution of learning and the increased value of on-demand services aligned with current and future business needs.

IBM Core Model

The IBM model suggests three phases – access, integration, and on-demand. As learning moves from being separate from work, through enabling work, to being embedded in work the realised value potential increases.

De Grip (2015)[1], along with a number of other academic researchers, has also observed that informal learning – mostly 20 and 70 activities – is much more important than formal training when it comes to developing people in organisations.

Start With the 70

As learning is likely to be most effective when it occurs nearest the time and place of use, then it is best to always start with the 70 when developing solutions to address performance problems.

This may seem counter-intuitive to many L&D professionals.

In the past we’ve usually started with the ‘10’. We identified a performance challenge (often presented as a ‘training problem’) and then decided whether the solution should be face-to-face or digital. In other words, do we develop a class/workshop or an eLearning module?

This simple binary option approach will not deliver optimum value. Selection of the ‘channel’ is made only from ‘10’ options. ‘70’ and ‘20’ options tend to be ignored.

The 70:20:10 approach recommends that solution design should start with options that are most likely to produce fast and efficient results, and those that are most likely to realise the greatest value. These are the solutions that are integrated into the workflow – the 70 and 20 solutions.

This recommendation is supported by a number of findings including those recently reported in a paper titled ‘The Secret Learning Life of UK Managers’.

‘The Secret Learning Life of UK Managers’
GoodPractice and Comres
November 2015

This report found that, for managers at least, the two key factors that most influence how people in work choose to learn are:

[a] ease of access, and
[b] speed of result.

The research for this report was based on 500 interviews with managers carried out by Comres[2], a specialist polling and data gathering/analysis organisation.

The principal finding of this study was:

“How effective a learning option is perceived to be is much less important than how accessible it is and how quickly it produces a result. This applies across all approaches, whether online or offline.”

Plan for the 100

The key for effective 70:20:10 design is to plan for the 100.

What this means is that any solution is likely to comprise a variety of parts; some 70, some 20, and some 10.

It is important to avoid ‘solutioneering’ within the 10 at the outset, so design with both the result and the ‘100’ in mind. This immediately extends both thinking and practice beyond the 10.

In other words, it is critical to maintain a clear focus on the desired performance outputs and, at the same time, use the principle of designing a total solution – incorporating 70, 20, and 10 elements as needed (and in this order).

Starting with the 70 and designing for the 100 is a good mantra to adopt if we are looking to deliver effective learning solutions.

 

Visit the 70:20:10 Institute site at www.702010institute.com


[1] De Grip, A. (2015). The importance of informal learning at work. Available at: http://wol.iza.org/articles/importance-of-informal-learning-at-work-1.pdf

[2] http://www.comres.co.uk/

Friday, 13 November 2015

JAY CROSS – Pushing the Envelope to the End


“It all boils down to learning, but not the sort of learning you experienced at school. No, this is learning as a life skill. You’re learning all the time, taking in new information and making sense of it. You learn from experience, from conversations with peers, and from the school of hard knocks. You’re in charge of it, not a teacher or institution.”

(extract from the first draft of Real Learning, the book Jay was working on when he died on Friday 6 November 2015)

Jay Cross driving a 1904 Pope Tribune, Beaulieu Motor Museum, June 2010

Jay’s premature death last week at the age of 71 has prompted an enormous number of tributes from people whose lives he touched in so many ways.

David Kelly, amongst others, has done a wonderful job in gathering many of the remembrances of Jay that have been written over the past week.

Our Internet Time Alliance colleagues Jane Hart, Harold Jarche and Clark Quinn have each written poignant tributes. Jane has also done a great job curating Twitter condolences. For each of us, as for many, Jay’s loss is a deep personal and professional one.  He brought us together in 2009. He thought there were synergies (there were) and that we’d all get on well (we did).

Jay’s contribution to the field of organisational learning was huge. He made us think hard about the edges of our profession. When many were fretting about perfecting the irrelevant with better classroom courses, Jay was pulling us into the emerging world of eLearning. When most were still focused on integrating eLearning into courses and curricula, Jay was shouting that the real power wasn’t in structured learning at all but in workplace, informal and social learning approaches.

The analysis he carried out for his 2006 Informal Learning book says it all. He called the focus on formal learning ‘absurd’. He was like that, never backward in saying it exactly as he saw it.


Jay didn’t come to informal learning by happenstance. He had studied and admired Ivan Illich and Illich’s views on the straitjacket of schooling for years. He’d also absorbed the thinking and writing of many other important contributors to the field of learning (he was fond of quoting the work of Kurt Lewin and Bluma Zeigarnik amongst many others).


Jay aligned himself with Illich whenever the opportunity arose:

“Together we have come to realize that for most men the right to learn is curtailed by the obligation to attend school”

Jay was a renaissance man. Deeply knowledgeable and widely read on many fronts, he saw a better way for people to learn and achieve their potential. He wrote about the importance of happiness and helping people to use learning to lead fulfilled lives. He was passionate about making a difference.

A quick look at Jay’s contributions to the field over the years uncovers a deeply humanist view of the world.

“My calling is to make people happy. They deserve more fulfilling, satisfied lives” https://about.me/jaycross

“When I look out 10 years, I see businesses prospering by treating people like people. Trusting people changes EVERYTHING.” http://www.scoop.it/u/jay-cross

“The Real Learning Project aims to help millions of people learn to learn, increase their intelligence, and realize their life goals.” http://www.internettime.com

Jay could be direct and challenging, and didn’t take prisoners. He often rattled people and got under their skin, but never just for the sake of it. His underlying desire was to make the world a better place by improving the way we go about helping people learn.

Jay Cross was great fun to be around. Obsessive (I once introduced him to a song by the fine British musician Richard Thompson; he told me he played it on continual loop for four solid days) and restless, he was continually generating new ideas and throwing them out to whoever was within talking distance or on the end of an email. He could be enthralling, mischievous and frustrating at the same time. My wife described having Jay to stay as being “like having a clever, over-excited child in the house”. Ideas and action bounced off every wall. He was also never without a camera close at hand to act as his ‘external memory’. His Flickr stream contains thousands of photos. Jay was like that; he shared everything.

Jay and Clark, Guinness Brewery, Dublin, July 2010
Jay was fun to be around

Above all, Jay provided a beacon to light the way to new and better approaches for creating high performing organisations. It is terribly sad that he died at a time when only the beginnings of the transformation he championed are starting to appear. He also lived his credo that ‘conversation is the best learning technology ever invented’.

Jay’s premature death is a huge loss to his family and close friends, and also to the many people he and his ideas touched across the world. It has taken one of the true original thinkers from us at a time when we most need him.

I have no doubt that Jay would want others to continue to build on his ideas and work. He was like that, generous and sharing. It was an absolute privilege and pleasure to have been one among his many friends.


So, farewell, Jay Cross. You’ve left the world a much better place. Your ideas and work have helped push the boundaries. You showed the way and helped us ‘keep it on the road’.

 

 

11th November 2015

#itashare

Thursday, 27 August 2015

70:20:10 Primer

I have often been asked to explain the fundamentals of 70:20:10 as a strategic framework quickly and simply.

I wrote the one-page ‘primer’ below to serve that immediate purpose. Please feel free to use it for any non-commercial purposes. It is published here under the Creative Commons: Attribution – Non-Commercial – Share Alike Licence (CC BY-NC-SA 2.0 UK).

If you are looking for a more detailed resource, see below the Primer.

----------------------------

Primer
What is 70:20:10?

70:20:10 is a reference model or framework that helps organisations extend their focus on learning and development beyond the classroom and course-based eLearning to build more resilient workforces and create cultures of continuous learning.

70:20:10 isn’t a ‘rule’. The model simply describes learning as it naturally happens and then offers means to accelerate and support that learning:

  • as part of the daily workflow;
  • through working and sharing with colleagues and experts;
  • through structured development activities.

Why the Numbers?

Although the 70:20:10 model is primarily a change agent, the numbers serve as a useful reminder that most learning occurs in the workplace rather than in formal learning situations. They also stress that learning is highly context dependent. However, don’t make the mistake of quoting the numbers as a mantra or as fixed percentages. Research over the past 40 years has shown that informal and workplace learning is increasingly pervasive and central to learning in organisations. Studies have produced varying figures for the amount learned in these ways[1]. Each organisational culture will display its own profile of workplace, social and structured development opportunities, and the distinctions between the ways of learning often blur.

It is important not to put the three elements of the 70:20:10 model into separate ‘boxes’ in practice. They are interdependent. For instance, coaching, mentoring and courses work best when they support on-the-job development[2].

Re-thinking Learning

With its emphasis on learning through experience and with others, the 70:20:10 framework helps extend the understanding of what learning means within organisations. It also moves us from ‘know-what’ learning towards more effective ‘know-how’ learning.

In summary, 70:20:10 helps change mindsets and learning practices.


[1] 70% (Tough, 1971, 1979); 70% (Bruce, Aring, and Brand, 1998); 62% (Zemke, 1985 and Verespej, 1998); 70% (Vader, 1998); 85-90% (Raybold, 2000); 70% (Dobbs, 2000); 75% (Lloyd, 2000)

[2] McCall, 2010

----------------------------

For More Detail

For some time I have been working with two friends and co-authors on a practical book titled ‘70:20:10 towards 100% performance’. This book delves deep into implementing 70:20:10 as a strategic L&D tool and defines new L&D roles and tasks to support the process.

The book is large and comprehensive (300+ pages in ‘coffee table’ format and weighing in at around 4lb/1.8kg!). It provides detailed guidance, lots of checklists, tools and process support resources as well as insightful contributions from leading experts.  It was published in Dutch in June 2015. The English version will be available in November 2015.


#itashare

2015 Top Tools for Learning

Jane Hart’s 9th Annual ‘Top Tools for Learning’ survey closes on Friday 19th September 2015, and the results will be published the following Monday.

If you haven’t already voted, please visit here and do so if you’re reading this before the closing date. If it’s too late, make a note to contribute to next year’s survey.

My list of top tools for learning in 2015:

  1. Google Search: ‘Professor Google’, as he is known in my house. Like the brains of many professors, the information you’re looking for is usually in there somewhere, but sometimes difficult to pin down. However, Google has provided a public searchable store of enormous magnitude – larger than anything seen before in human history – and is without doubt the most-used learning tool for many, if not all, of us. We can barely imagine life without Google Search.
  2. Twitter: I have learned more in my professional life through Twitter in the past six-and-a-half years than in the previous 30 years. For me Twitter gives access to smart people who provide an enormous wealth of information and insight through their commentary and links to research, articles and other resources. Twitter is the first tool/resource I access every day – even before I turn to see the sports results!
  3. Evernote: Tools for managing the tidal wave of information we are all subjected to are an absolute necessity. If we’re not to drown under uncategorised information overload, and if we’re to reduce the time we spend trying to find the ‘right stuff’ at the right time, we need tools like Evernote. It’s certainly a tool that works for me. Evernote is the frontal lobe of my ‘external brain’ (where Google Search is probably the rest of my digital cerebral cortex). Evernote is on all my devices and I use it every time I want to save an interesting ‘snippet’ that I may want to access and use in future.
  4. Dropbox: This tool lowers my blood pressure better than tablets ever could. It has removed the fear of losing everything if a hard disk crashes or a backup fails. It also allows me to collaborate and share files and resources with others – whether they are colleagues and translators working on the English version of the new ‘70:20:10, Towards 100% Performance’ book we’re just finishing, or clients who want to co-ordinate materials for a masterclass. Dropbox has also removed the embarrassment of arriving somewhere far-flung from home to find the workshop materials or keynote slides are still sitting on the office computer or on a memory stick that’s fallen under the bed in my hotel room. As a learning tool, Dropbox is the school satchel – keeping everything organised, safe and dry.
  5. Skype: Skype offers the magic of essentially free multimedia global communication and learning. 25 years ago the Boston Consulting Group was predicting a future of virtually free telecommunications. That prediction came true much sooner than BCG thought it would. Of course Skype has glitches and issues, and its new owner Microsoft is trying to extract value from users, but with a decent broadband connection Skype offers the real-time and asynchronous communications we could have only dreamed about 25 years ago.  Together with my colleagues in the Internet Time Alliance, we use Skype as our principal digital ‘glue’ to share and learn in a continuous flow.
  6. YouTube: YouTube is the digital ‘master’ I turn to when I’m trying to do something new and need guidance. YouTube is my ‘20’ support to help me learn through the ‘70’. Whether it’s discovering how to use some function in Excel or how to fix a laser printer, YouTube invariably offers help and guidance. YouTube is also the wonderful conduit for sharing expertise in other ways, whether it’s through TED Talks or hundreds of other video resources.  It’s pretty helpful when I’m trying to improve my musical skills, too.
  7. Google Scholar: When I’m researching something, Google Scholar is the first port-of-call. A quick search for ‘workplace learning’ on Google Scholar returns almost one-and-a-half million results in less than a tenth of a second. Then that’s the day gone as I sift through relevant papers and find myself ordering books and diving off in all directions, reading and learning as I go. Google Scholar provides the library index and stack we could only dream about 40 years ago.
  8. LinkedIn: Gone are the days of trying to keep track of people in Microsoft Outlook or some other contacts list. LinkedIn does that job well. But LinkedIn is much more than just an online contacts list. It provides a stream of updates from people I know and groups I’ve joined. LinkedIn allows for deep discussions with other professionals – always a great learning opportunity. I’m a member of far too many LinkedIn groups. The Harvard Business Review group discussions alone could probably occupy all of my time.
  9. PowerPoint: Although PowerPoint’s linear nature is often restricting, I’ve never found Prezi or other tools as robust and flexible. PowerPoint is the useful ‘Swiss army knife’ in the toolbox for assembling presentations and creating simple flow diagrams and graphics. It handles the integration of video and other multimedia reasonably well and is ubiquitous. I’m sure there are better tools than PowerPoint, but it does the job for me.
  10. Flipboard: A tool I couldn’t do without on my iPad. I use Flipboard as my blog and news aggregator. Whether it’s my colleague Jane Hart’s C4LPT site, Harold Jarche’s Adapting to Perpetual Beta site, Jay Cross’s Internet Time blog, Clark Quinn’s Learnlets, Nick Shackleton-Jones’ fascinating Aconventional site, or Tom Stafford’s unbelievably extensive MindHacks neuroscience and psychology resource (I remember Tom when he was just a schoolboy – we can certainly learn a lot from the next generation), Flipboard brings them all together under a simple interface.

Of course there are other tools I use for learning – Wikipedia (naturally), Blogger (that’s where this blog sits), Audacity for audio work, Google Hangouts, Microsoft Live Writer and others. But the 10 above are my principal daily workhorses.

#itashare

Tuesday, 21 July 2015

The #Blimage Challenge

For a bit of fun this afternoon my colleague Jane Hart set a few of us a #Blimage challenge.

I hadn’t come across this particular game before, but having subjected myself to an iced-water dunking along with millions of others last year, I was reasonably pleased to see that this one only requires a stream of consciousness and a blog post rather than a stream of cold water.

Steve Wheeler (@timbuckteeth) explained the #Blimage Challenge this way:

“You send an image or photograph to a colleague with the challenge that they have to write a learning related blog post based on it. Just make sure the images aren’t too rude. The permutations are blimmin’ endless.”

This is the image Jane sent.

tree-710660_1280

My first thoughts were: am I looking at a tunnel or the sun? Is the tree heading down a rabbit-hole with its branches reaching into the long bright tunnel (or even along a yellow brick road), or are its branches holding the sun in its yellow sky?

So what sense and inspiration could I possibly draw from this psychedelic image and make meaningful on a learning blog post?

The three key messages that this little reflective exercise produced for me were:

  1. Context is king in life. When we look at a 2D rendering of our 3D world we often need additional information to make sense of it. Even when we see a ‘real’ 3D rendering we can be tricked. The street artist Slinkachu demonstrates perfectly the essential part context plays.
  2. Context is also king for learning. We learn best within the context where we are going to use that learning. Workplace learning is generally more effective than simulations which, in turn, are generally more effective than being provided with information using a traditional ‘knowledge transfer’ learning approach.
  3. If we don’t have all the information, we can easily draw false conclusions. The great Peter Drucker once said ‘the most important thing in communication is hearing what isn’t said’. If we don’t understand the unsaid, then we’re operating in a half-known world. If we are not inquisitive and do not explore the (half) learning from our classrooms or workshops in the 3D world of our workplace(s), we are likely to be only half-equipped to use that learning.

So if we don’t have a 3D view of the world (the full context) – and, correspondingly, if we try to learn without knowing the full context where our learning is to be used – then the learning will be providing the equivalent of a 2D picture for us.

Having the full, unambiguous, context-critical picture is essential for the effective learning which will lead to high performance. That’s what learning is all about. It’s only really useful when it can be put into action in the 3D world. Passing a written exam doesn’t necessarily mean that learning has taken place. It’s only when the learning can be applied that we know whether it has happened. Roger Schank explains that brilliantly in a recent blog post titled ‘Reading is no way to learn’.

Not having all the information can, and will, lead to critical mistakes. A 2D rendering, or learning without context, can easily lead down the rabbit hole. Watch out. Many people have fallen into that trap.


Wednesday, 6 May 2015

70:20:10 - Beyond the Blend

blue-143734_1280

The term ‘blended learning’ first appeared in the late 1990s when web-based learning solutions started to become more widely used and were integrated in one way or another with face-to-face methods.

Of course the ‘blending’ concept has been around for much longer than the past few years. Apprenticeship training has ‘blended’ for centuries, and the correspondence schools in Europe in the 1840s used blending. There are many other examples of blended learning stretching back into the past, too.

However, the incorporation of technology into learning or training delivery has given blended learning a boost.

Speed reading machines in the late 1950s and 1960s (I remember my own speed reading courses – sitting in front of a large scrolling text machine in the early 1960s), interactive video (where some of the best eLearning programmes were developed in the 1980s), CD-based support and, of course, the Web have all contributed to our relative comfort in accepting blended learning as the norm. Each of these, though, was used to design and deliver structured and directed learning based on some form of instructional design and, often, as part of a curriculum.

In terms of new delivery approaches, blending offers up new horizons. However, in terms of breaking the traditional ‘push’ learning model it offers up little.

Blended is invariably ‘Push’ Learning

There are many definitions of blended learning. In 2003 the UK Department of Education and Training defined it as “learning which combines online and face-to-face approaches”. Most people would recognise that definition in what we see as blended programmes today - the use of two or more channels to make learning more easily or widely available.

The diagram below also represents a common view of blended learning. It focuses on the ‘delivery channel’ - integrating technology with traditional face-to-face approaches and stretching the time available to spend learning.

Diagram from Heinze & Proctor, ‘Reflections on the Use of Blended Learning’ (2004), University of Salford, UK.

The current Wikipedia definition of blended learning reflects its structured nature:

Blended learning is a formal education program in which a student learns at least in part through delivery of content and instruction via digital and online media with some element of student control over time, place, path, or pace.

The key point about blended learning as it is generally understood is that it remains firmly based on a push model. The learning experience is designed by others and usually packaged into a coherent event or set of events by instructional experts and ‘delivered’.

Of course within the ‘push capsule’ of blended learning there may be increased flexibility for individual learning preferences and increased flexibility of access. Participants are not constrained in the same way that they might be if they need to show up to a class at a set time and location to complete their learning process.

Traditional ‘Blending’ is based on Dependent Learning Models 

More recently, blended learning solutions have been expanded to include combining simulations with structured courses, using instructional technology to link courses with on-the-job tasks, and integrating workplace coaching with formal programmes, amongst other approaches.

In other words, ‘blending’ is starting to mean more than simply mixing delivery channels.

It is still, however, focused on learning outcomes (rather than performance outcomes) and is still firmly based around the concepts of structured learning processes to achieve its objectives. This is what my colleague Jane Hart calls dependent learning (see diagram below).

Blending is about increasing the efficiency and effectiveness of dependent learning.

Although it may sometimes focus on extending learning into the ‘informal’ part of this diagram (and thus making it somewhat formal), the fact that the ‘blend’ is part of an overall designed programme, course or initiative makes ‘blended’ fall primarily into the formal/dependent category.

learning categories

As such, blended learning still essentially sits in the push paradigm. It consists of learning content (mainly) and possibly some learning experiences that are designed by L&D professionals for the use of others.

Adding Learning to Work by Blending

This expansion of blended learning into the workplace can be termed ‘adding learning to work’. We intentionally add learning-focused activities into the workflow.

A number of researchers and practitioners have categorised the process of extending learning into the workflow as ‘adding and embedding’ or ‘embedding and extracting’. The categorisation below brings together some of the ways in which this happens.

Adding learning to work occurs where intentional learning-focused actions are taken to extend formal away-from-work courses and programmes back into the workplace. Most leadership and management programmes use this with work-based assignments, linked action learning, and other techniques.

The key point is that ‘adding learning to work’ is achieved through intentionally designed activities that are linked with a formal learning intervention.

Blended learning almost always falls into the ‘adding’ category. It is learning-focused and based around dependent learning models.

extending learning

70:20:10 - Beyond the Blend

By contrast, the 70:20:10 model is based on the concept of utilising both push and pull learning to achieve greater impact, shorter time to performance, sustainability, increased innovation and cost constraint.

A 70:20:10 approach spans all four of the categories above – adding, embedding, extracting and sharing.

A 70:20:10 approach also encompasses Jane Hart’s interdependent and independent categories (above).

It is important to realise the 70:20:10 strategic model emerged from a view of modern adult learning that is wider than ‘blending’. 70:20:10 draws on the fundamental changes that have occurred, and are continuing to occur, in the workplace. Work is becoming more complex. We work more in teams and rely on others to get our work done more than ever before. Experiential and social learning are becoming more critical day-by-day as agents of development.

In response to this wider view of adult workplace learning, and to these changes, learning and work must, by necessity, merge.

Changing work

This evolving view of modern workplace learning includes:

  • A re-focusing away from Taylorist views of management as a scientific discipline and the need to standardise for efficiency towards approaches to support the need for agility, innovation and speed.
  • An acceptance that ‘best practice’ (i.e. one single best way to achieve optimum outcomes) is increasingly irrelevant in our complicated and complex working world. The focus is moving towards ‘good practice’ (i.e. practices that work well for our context but may not be appropriate in other contexts) and ‘emerging practices’ (i.e. practices that we develop retrospectively as we seek to improve).
  • An understanding that a ‘curriculum’ mindset – where plans for standardised learning pathways are defined for standardised job roles and standardised career progression – is increasingly irrelevant in a world where a culture of continuous and flexible development is required to keep ahead.
  • The knowledge that competencies (i.e. ‘satisfactory’) are what people should enter our organisations with, but that capabilities (i.e. ‘potential’) are what we need to help develop.
  • The realisation that, with the intangible value of organisations outstripping the tangible value, people (the largest intangible element) need to be seen and treated as co-creators of value. What is in the heads of workers has never been more important to organisations’ ability to survive and thrive.

We still have a long way to go to break the learning=schooling mindset, to increase the impact and efficiency of learning, and to build cultures of continuous development embedded in work. But we’ve made some good starting steps.

The 70:20:10 reference model can certainly help us expand our concepts and practices to support a better workforce development approach when it is used wisely and as an agent of change and not followed slavishly as some ‘rule’.

Blended Learning is Only the Beginning of the Story

Blended learning has been an important first step in this process as it has helped break the shackles of time and location imposed by the dominant face-to-face dependent learning approaches that have been in use for centuries. Technology has enabled that. The ‘richness-reach trade-off’ described by Evans and Wurster in 1999 has truly been broken.

But ‘blending’ is just a baby step.

Blended learning is still on the wrong side of the chasm between learning and the learning/work continuum – and it needs to jump. A lot more work is required beyond ‘blending’ to truly embed learning into work.

It is important to remember that blended learning is a sub-set of 70:20:10, and one way to support a 70:20:10 approach, but it is not a replacement for it. If you’ve implemented blending, you’re on the road but not at the end of the journey yet.

#itashare

Friday, 23 January 2015

Autonomy and Value in Social and Workplace Learning

My colleague Jane Hart recently shared the diagram below on her blog.

It shows the relationship between relative value and relative autonomy as they relate to different approaches for learning in the modern workplace.

Jane's Model

‘Learning in the Modern Workplace’ Model

Jane’s diagram shows the increasing value that can be released through exploiting learning opportunities beyond ‘the course’ and the curriculum: initially expanding from courses to resources, and then further out to the exploitation of social collaboration and personal learning (and personal knowledge mastery).

It struck me that Jane’s model closely aligned with others I’ve used to help explain the increase in realised value brought about by the use of experiential, social and workplace learning.


IBM Core Model

This model, produced by IBM Consulting Services in 2005, separates learning solutions into three phases:

  1. Access Phase: where learning is separate from work
  2. Integration Phase: where learning is ‘enabling’ work
  3. On Demand Phase: where learning is ‘embedded’ in work/tasks

This model shows that the maximum potential value that can be realised increases as learning becomes closer to, and more integrated with, work.

I have mapped the elements of the 70:20:10 model at the bottom to show the link with the next model.


70:20:10 Model

jennings 702010

The 70:20:10 model is a strategy and set of practices to extend learning into the workflow. The principle is that in the new working environment learning is the work. Harold Jarche has written extensively about the merging of work and learning.

I see the 70:20:10 model as reflecting, to some extent, IBM’s model: exploiting and extending learning opportunities from point solutions (learning events) to continuous development (learning as a process and part of the daily workflow) in order to increase value.

Organisations that are able to move in this direction, and have the HR and L&D teams to facilitate and support the move, will extract far greater value from workforce development than those that can’t.


The ‘Autonomy-Strategic Alignment’ C-Curve

C-curve- original

This model, the ‘Jennings & Reid-Dodick C-Curve’, was developed in the early stages of an L&D transformation for a Global FTSE100 company more than a decade ago.

It links to Jane’s diagram at the top of this post and maps autonomy against strategic alignment.

This model was developed to define the journey for the L&D transformation – firstly centralising standards and processes and building a consistent performance consulting approach, then strengthening governance, and finally ‘federalising’ to provide the autonomy needed for agility, responsiveness and sustainability.

The C-Curve is based on the principle that the end-point for an effective L&D department is where the various units (which may be regional or functional) are tightly aligned with organisational strategy, but also have the level of autonomy that encourages them to be agile and proactive.

Many organisations flip-flop between centralised L&D and distributed L&D. The cycle tends to have a frequency of about 5-8 years. Every 5-8 years an HRD or CEO decides to centralise, or to push L&D back into ‘the business’ – depending on the current operating model.  Then, 5-8 years after that change, L&D is de-centralised/centralised once again.

The C-Curve model addresses this ‘flipping’ problem.

The fundamental issue isn’t where the various L&D resources are sitting, but how they are aligned strategically and how responsive they are able to be. Simply flipping the organisational structure and reporting lines will do nothing to address the fundamental issue.

 


The ‘C’ Curve applied to Workplace Learning

Some years ago Harold Jarche and I talked about the ‘C’ Curve model. Harold then aligned it with a framework he had developed for supporting effective social learning (in the context of several models – including Snowden’s Cynefin and Ronfeldt’s TIMN).

Harold mapped the autonomy/strategic alignment axes of the C-Curve against knowledge acquisition models.

Just as John Reid-Dodick and I had back in 2004, Harold concluded that a jump straight from Stage 1 to Stage 4 is unlikely to succeed, and that it requires a journey through at least some of the other stages to reach the end-point.

Harold reported:

“I’ve combined the C-Curve [X=Autonomy, Y= Strategic Alignment] with the knowledge acquisition models from these three organizational types (simple, Complicated, Complex). The question that I ask here is whether it is necessary to follow the curve or if one can leap from Stage 1 to 4.  If not, that means that organizations need to understand and implement something like a human performance technology model for L&D before they can move on to social learning. Perhaps this is why social learning is being resisted or put into a formal training box in many organizations. They have not made the move to Stage 3 (Performance Support) yet. It’s too much of a leap for organizations in Stage 2. On the other hand, social learning is only a short leap for more tribal start-ups that have not developed any structure at all for L&D as they are quite comfortable with autonomy and messy networks. Stage 2 seems like the worst place to be.”

  1. L&D Autonomous = taking action as a Tribe of its own
  2. L&D Aligned with organization = coordinated with the Institution
  3. L&D with governance & guidelines = able to work in a collaborative Market
  4. L&D strategically aligned = a co-operative member of (a) Network(s)

C-curve-LD

Harold’s full article is well worth studying.

#itashare

Tuesday, 20 January 2015

70:20:10 – Above All Else It’s a Change Agent

“Progress is impossible without change; and those who cannot change their minds cannot change anything.”
George Bernard Shaw

Tom Spiglanin is a senior engineering specialist at the Aerospace Corporation in California and is a leader in the organisation’s technical training department. The people he works with carry out research for the US space programmes – both for the US Government and for civil agencies like NASA and NOAA. In other words, they’re rocket scientists. Tom is a rocket scientist who helps other rocket scientists learn their stuff.

Recently Tom wrote a series of blog posts titled ‘Ten Things I Believe About Workplace Learning’. His list included important issues and current areas of focus such as the new and emerging roles for L&D professionals; the value of sharing as a skill for learning and development; the importance of personal learning networks and personal knowledge mastery; and the inverse relationship between experience and the value of formal learning.

The first post on Tom’s list was I Believe in the 70:20:10 framework.

The messages he conveyed in this short post struck me as having been missed by lots of people when they talk about the 70:20:10 model as a framework for learning and development.

Tom wrote:

“The reason this framework works is that it more or less reflects what’s actually true for employees in the typical workplace. Formal education has its place in preparing people for the workplace. Once those people become employees, they have a job to get done. People aren’t hired to learn, they’re hired to increase productivity or capability. There are productivity expectations and organizational needs to be met.”

It’s Not the Numbers

It “more or less reflects what’s actually true for employees in a typical workplace”. That’s the key to the 70:20:10 model, and there’s an increasing body of data in support of this.

We all know instinctively that we learn most of what we need through observing, mimicking, discussing, trying things out, making mistakes, and trying again until we are adept. That’s the nature of human learning. We are learning animals, born to learn.

We learn through watching others who ‘know how to do’ (who of us hasn’t stood and looked over someone’s shoulder recently to see how they were operating a ticket machine or some other piece of technology?) and through conversations. We learn through navigating tough situations, and through practice. And we learn through taking time to reflect on challenges and how we might have handled them differently so we can do better next time (again, who of us hasn’t spent time recently mulling over a difficult work problem whilst lying in bed, showering, or out walking the dog, and then planned ways to address it?).

Double-Edged Sword

To make it easier to explain the skew favouring informal and workplace learning over formal learning (in terms of contribution to performance), we put a number on each of the three broad categories in the 70:20:10 model: [the ‘70’] learning through experience and practice; [the ‘20’] learning through, and with, others; and [the ‘10’] learning through courses, programmes and content structured by others.

However, the way these three broad categories are described in the model can lead to a focus on the ratios rather than the underlying principles and categorisation. As such ‘the numbers’ can serve as a double-edged sword.

It is important to understand that these numbers are simply markers and shouldn’t be taken literally. This is a reference model, not a recipe. Sometimes this presents a challenge for people who want or need clear and simple explanations. Unfortunately, life’s not often clear and simple!

I have written previously about some misguided researchers (possibly out on work experience) declaring that “50:26:24 is the average learning mix in most companies right now” (with the implication that it wasn’t “70:20:10”). The idea that companies could neatly slice the learning patterns of their people into three carefully-defined and carefully-analysed buckets like this beggars belief. This is where a focus on the numbers masks the general underlying principles of the framework.

The evidence, however, does point to the fact that most learning is experiential and social, with most of it carried out in a self-directed way. In other words, ‘informally’. It also points to some broad – rather than specific – ratios.

Research over the past 40 years has shown that informal and workplace learning is increasingly pervasive and central to learning in organisations. Of course studies have produced varying figures for the amount learned in these ways[1] (as one would expect). Each organisational culture will display its own profile of workplace, social and structured development opportunities, and each will vary depending on a number of factors.


[1] 70% (Tough, 1971, 1979); 70% (Bruce, Aring, and Brand, 1998); 62% (Zemke, 1985 and Verespej, 1998); 70% (Vader, 1998); 85-90% (Raybold, 2000); 70% (Dobbs, 2000); 75% (Lloyd, 2000)


Despite all the points made above about avoiding focus on the numbers, there is a general pattern here. As Jay Cross pointed out back in 2003:

“At work we learn more in the break room than in the classroom. We discover how to do our jobs through informal learning -- observing others, asking the person in the next cubicle, calling the help desk, trial-and-error, and simply working with people in the know. Formal learning - classes and workshops and online events - is the source of only 10% to 20% of what we learn at work.”

Jay’s last comment here should be a guide to our thinking – “Formal learning … is the source of only 10% to 20% of what we learn at work”. That’s a large variance, not an exact number, but it does suggest that we need to look beyond formal learning if we’re to help create a step-change in performance.

We can expect to see more research output and new individual ratios in the next few years. The fact that different studies reveal different numbers doesn’t make them invalid. Every study is contextual. However, the aggregated results and trends do build the evidence behind the principles of workplace and social learning, and behind the 70:20:10 model.

Why Use Numbers, Then?

Although the 70:20:10 model is primarily a change agent, the numbers do serve as a useful reminder that most learning occurs in the workplace rather than in formal learning situations. They also help stress that learning is highly context dependent.

Some mistakenly think 70:20:10 is some kind of golden ratio or edict that can be applied as a simple formula no matter what the context or situation.

The idea that we should be trying to align our learning and development efforts with some fixed ratio is mistaking the 70:20:10 model for something that it is not. What we should be doing is putting our effort into supporting and refining learning where it’s already happening, and this is predominantly as part of the daily workflow.

70:20:10 is not the L&D equivalent of the Ten Commandments or the Quran. The model is better likened to the guidance and advice a parent might provide to a child to help them make the most of their life: “work hard to get better at everything you do, put most of your effort into being kind to others, learn your lessons, and you’ll go far”.

L&D professionals need to have tattooed onto their brains that “70:20:10 is a reference model and not a 'rule'”.

A Change Agent

70:20:10 is primarily an agent of change for extending our thinking about learning beyond the classroom and other structured, event-based development activities.

Good use of 70:20:10 results in increased focus on supporting effective learning and development within the daily workflow, naturally and at the speed of business – or preferably faster than the speed of business.

That’s where the model can have its greatest impact.

Along with providing a strategy for supporting effective and efficient learning and releasing high performance, 70:20:10 thinking also helps to change and develop mindsets (and change practices). Of course, formal away-from-work learning is still necessary to build capability efficiently and effectively in certain situations – especially when people are new to a role or organisation. However, we need to think and act more widely than simply changing the delivery channel.

That’s where a 70:20:10 strategy can help.

Although many L&D departments are reaching out to new media and new approaches to support daily development activities – with social learning incorporated into courses, MOOCs, gamification, mobile and other communication and delivery channels in the vanguard – many of these initiatives are still being implemented within the traditional L&D structured learning framework. That framework and mindset is essentially about command and control: ‘we design and deliver the packages, the “learners” learn, we metricise and report’.

This traditional approach lacks flexibility and is based on assumptions that may have been valid in 18th-century Prussia, when the concept of a curriculum arose, but that are not fit for purpose in our fast-evolving 21st-century world. 70:20:10 thinking and action helps overcome this ‘course and curriculum mindset’. A 70:20:10 L&D strategy is a good starting point for this change process.

Outcomes

I view 70:20:10 as an opportunity for L&D departments to re-establish their working relationships with colleagues and stakeholders, and to move from ‘control’ mindsets to supporting, facilitating and enabling mindsets and practices, with a razor-like focus on organisational and stakeholder needs and priorities.

I’ve seen quite a few smart HR and L&D departments move rapidly along this road.

It's up to the wider L&D professional body as to whether it takes that opportunity or not.

At the core of 70:20:10 thinking is the fact that most of the learning that occurs in the workplace simply can't be 'managed' by anyone other than the person who is learning (and, sometimes, by their supervisor) so L&D professionals need to re-think their role if they're to help extend and improve the learning that's already happening outside their world. 70:20:10 helps them do just that.

---------------------------------------

Apart from Tom Spiglanin’s post, this article arose from various conversations and articles over the past months.  The 70:20:10 model is more a light pointing the way than a rulebook.

#itashare

Monday, 24 November 2014

The Only Person Who Behaves Sensibly Is My Tailor

 

“The only person who behaves sensibly is my tailor. He takes new measurements every time he sees me. All the rest go on with their old measurements.”
—George Bernard Shaw

I’ve always enjoyed George Bernard Shaw’s writing. He was a man who made a great deal of sense to me. I started reading his books in my early teenage years and many of the ideas in them have stuck.

Shaw was a true Renaissance man - an Irish playwright and author, a Nobel Prize and Academy Award winner (how many can claim that double?) and a co-founder of the London School of Economics.

Shaw had a particular interest in education; from the way the state educates its children, where he argued that the education of the child must not be in “the child prisons which we call schools, and which William Morris called boy farms”; to the way in which education could move from teachers “preventing pupils from thinking otherwise than as the Government dictates” to a world where teachers should “induce them to think a little for themselves”.

Shaw was also a lifelong learner. Despite, or possibly because of, his own irregular early education, he focused on learning as an important activity in life. He developed his thinking and ability through a discipline of reading and reflecting, through debating and exchanging ideas with others, and through lecturing. Apart from leaving a wonderful legacy of plays, political and social treatises, and other commentaries, Shaw won the 1925 Nobel Prize for Literature for “his work which is marked by both idealism and humanity, its stimulating satire often being infused with a singular poetic beauty”. And, in 1938, he won the Academy Award for his screenplay for Pygmalion (later turned into the musical and film My Fair Lady after Shaw’s death; he hated musicals – some would say sensibly – and forbade any of his plays becoming musicals in his lifetime).

At 91, Shaw joined the British Interplanetary Society, whose chairman at the time was Arthur C. Clarke (some interesting conversations there, I’m sure).

Shaw summed up his views on lifelong learning thus:

"What we call education and culture is for the most part nothing but the substitution of reading for experience, of literature for life, of the obsolete fictitious for the contemporary real."

Shaw’s Tailor

In the statement about his tailor Shaw was simply making the point that change is a continuous process and part of life, and that we constantly need to recalibrate if we’re to gain an understanding of what’s really happening. If we do this we are more likely to have a better grasp of things and make the adjustments and appropriate responses needed. It’s the sensible approach.

Shaw and Work-Based Learning

I recently came across Shaw’s quote about sensibility and his tailor again in Joseph Raelin’s book ‘Work-Based Learning: Bridging Knowledge and Action in the Workplace’. Raelin’s work is something every L&D professional should read.

The quote started me thinking about the ways we measure learning and development in our organisations.

Effective Metrics for Learning and Development

I wonder what Shaw would think if he saw the way learning and development is predominantly measured in organisations today.

The most widely used measures for ‘learning’ are based on activity, not on outcomes. We measure how many people have attended a class or completed an eLearning module, or read a document or engaged in a job swap or in a coaching relationship.

Sometimes we measure achievement rates in completing a test or certification examination and call these ‘learning measures’.

The activity measures determine input, not output. The ‘learning’ measures usually determine short-term memory retention, not learning.

I am sure that Shaw would have determined we need to do better.

Outcomes not Activity

Even with today’s interest in the xAPI/Tin Can protocol, the predominant focus is still on measuring activity. It may be helpful to know that ‘Charles did this’ (actor, verb, object), as an xAPI statement records it. However, extrapolating context and outcomes from this type of data requires a series of further steps before it can provide meaningful insight.
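
As a purely illustrative sketch of what such a statement captures – the learner name, email address, verb IRI and activity below are hypothetical examples, not taken from any real system – an xAPI-style record says little more than who did what to which activity:

```python
# A minimal, illustrative xAPI-style statement: actor, verb, object.
# All names, addresses and identifiers here are invented for illustration.
statement = {
    "actor": {"name": "Charles", "mbox": "mailto:charles@example.com"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/activities/compliance-module-1",
        "definition": {"name": {"en-US": "Compliance eLearning module"}},
    },
}

# The statement tells us that an activity occurred; by itself it says
# nothing about context, behaviour change or business outcomes.
print(statement["actor"]["name"],
      statement["verb"]["display"]["en-US"],
      statement["object"]["definition"]["name"]["en-US"])
```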

In many cases the activity measures simply serve to muddy the water rather than to reveal insights.

Attending a course or completing an eLearning module tells us little apart from the fact that some activity occurred. The same applies to taking part in a difficult workplace task or participating in a team activity.

Activity measurement does have some limited use: for instance, when a regulatory body has defined an activity as a legal or mandatory requirement and requires organisations to report on it, those reports may help to keep a CEO out of the courts, or out of jail. But this type of measurement is starting from the ‘wrong end’. A ‘learning activity is not necessarily an indicator of learning’ tag should be attached to every piece of this data.

There’s plenty of evidence beyond the anecdotal to support the fact that formal learning activity is not a good indicator of behaviour change (‘real learning’). For example, a study of 829 companies over 31 years showed diversity training had “no positive effects in the average workplace.” The study reported that optional training can sometimes have positive effects, but that mandatory training overall has negative effects:

“There are two caveats about training. First, it does show small positive effects in the largest of workplaces, although diversity councils, diversity managers, and mentoring programs are significantly more effective. Second, optional (not mandatory) training programs and those that focus on cultural awareness (not the threat of the law) can have positive effects. In firms where training is mandatory or emphasizes the threat of lawsuits, training actually has negative effects on management diversity”

Dobbin, Kalev, and Kelly, ‘Diversity Management in Corporate America’, Vol. 6, No. 4, 2007, American Sociological Association.

For further evidence that training activity does not necessarily lead to learning (changed behaviour) we need look no further than the financial services industry. Did global financial services companies carry out regulatory and compliance training prior to 2008? Of course they did – bucketsful of it. Did this training activity lead to compliant behaviour? Apparently not. It could be argued that without the training things could have been worse, but there’s no easy way to know that. The results of banking behaviour and lack of compliance were bad enough to suggest the training had little impact. I suppose we could analyse, for example, the amount of time and budget spent per employee on regulatory and compliance training by individual global banks and assess this against the fines levied against them. I doubt that there would be an inverse correlation.
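
To show what such an analysis might look like in the simplest possible terms – using entirely invented figures, since I haven’t carried out this study – a minimal sketch might be:

```python
# Illustrative only: hypothetical training hours and fines for five
# fictional banks, used to show how one might test for an inverse
# correlation. None of these numbers are real data.
training_hours_per_employee = [12, 18, 25, 30, 40]
fines_millions = [950, 1200, 800, 1100, 1300]

def pearson(xs, ys):
    """Pearson correlation coefficient for two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# An inverse relationship (more training, fewer fines) would show up as a
# coefficient close to -1; with these invented figures it does not.
print(round(pearson(training_hours_per_employee, fines_millions), 2))
```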

(What is our response to the global financial crisis and the apparent failure of regulatory and compliance training? More regulatory and compliance training, of course!)

The Activity Measurement ‘Industry’

The ATD’s ‘State of the Industry’ report, published annually around this time of year, is a case in point of the industry that has grown up around measuring ‘learning’ activity.

ATD has been producing this annual report for years (originally as the ASTD). The data presented in the ATD annual ‘State of the Industry’ report is essentially based around activity and input measurement – the annual spend on employee development, learning hours used per employee, expenditure on training as a percentage of payroll or profit or revenue, number of employees per L&D staff member and so on.

Some of these data points may be useful to help improve the efficient running of L&D departments and therefore of value to HR and L&D leaders, but many of the metrics and data are simply ‘noise’. They certainly should not be presented to senior executives as evidence of effectiveness of the L&D function.

To take an example from the ATD data, the annual report itemises ‘hours per year on learning’ (which really means ‘hours per year on training’). The implicit assumption is that the more hours provided, the better and more focused the organisation is on developing its workforce.

But is it better for employees in an organisation to be spending 49 hours per year on ‘learning’ than, say, 30 hours per year? These are figures from the 2014 ATD report.

Even if one puts aside the fact that, as a species, we are learning much of the time as part of our work and not just when we engage in organisationally designed activities that carry a specific ‘learning’ tag, this is a point worth considering.

It could be argued that organisations with the higher figure – 49 hours per year – are more focused on developing their people.  It could equally be argued that these organisations are less efficient at developing their people and simply take longer to achieve the same results. It could be further argued that the organisations spending more time training their people in trackable ‘learning’ events are simply worse at recruitment, hiring people who need more training than the ‘smart’ organisations that hire people with the skills and capabilities needed who don’t need much further training. We could dig further and ask whether spending 49 hours rather than 30 hours is indicative of poor selection of training ‘channel’ – that organisations with the higher number are simply using less efficient channels (classroom, workshop etc.) than others who may have integrated training activities more closely with the workflow (eLearning, ‘brown bag lunches’, on-the-job coaching etc.). Even further, is the organisation with the 49 hours per year simply stuck in the industrial age and using formal training as the only approach to attack the issue of building high performance – when it could (and should) be using an entire kitbag of informal, social, workplace and other approaches as well?

One could go on applying equally valid hypotheses to this data. The point is that activity data provides few if any insights into the effectiveness of learning, and only limited insight into the efficiency of learning activities.

So why is there an obsession to gather this data?

Maybe we gather it because it is relatively easy to do so.

Maybe we gather it because the ‘traditional’ measurement models – based on time-and-motion efficiency measures – are deeply embedded. These time-honoured metrics developed for an industrial age are not the answer.  We need to use new approaches based on outcomes, not inputs.

Learning is a ‘Messy’ Process

The real challenge for measuring learning and development is that performance improvement often comes about in ‘messy’ ways.

Sometimes we attend a structured course, learn something new and then apply it in our jobs. At other times we attend a structured course and meet another attendee whom we then add to our LinkedIn connections. At some later point we contact this LinkedIn connection to help solve a problem – because we remember they told an interesting story about overcoming a similar situation in their organisation, or in another part of our own.

This second case falls into the ‘messy’ basket. It is almost impossible to track and ‘formalise’ this type of learning through data models such as xAPI – unless we’re living with the unrealistic expectation that people will document everything they do at every moment in time or that we track every interaction and are able to draw meaningful inferences.  Even national security agencies struggle doing that.

More frequently than in structured events, we learn by facing challenges as part of our daily workflow, solving the problems in some way, and storing the knowledge about the successful solution for future use. We also increasingly learn and improve through our interactions with others – our peers, our team, our wider networks, or people we may not even know.

So how do we effectively measure this learning and development? Is it even worthwhile measuring?

I believe the answer to the second question is ‘Yes, when we can gain actionable insight’. It is worthwhile measuring individual, team and organisational learning and development to understand how we are adapting to change, innovating, improving our customer service, reducing our errors and so on.

This type of measurement needs to be part of designed performance improvement initiatives.

Furthermore, measuring learning frequently via performance improvement is better than measuring it infrequently.

One of the criticisms levelled at the annual performance review process recently is that the insights (and data) collected as part of the process come too infrequently. Companies like Adobe have already abolished annual performance reviews and replaced them with regular, informal manager–report check-ins to review performance progress and any corrections needed. Fishbowl, a Utah-based technology company, has gone a step further and not only abolished annual performance reviews but also abolished its managers. Companies such as W. L. Gore have been treading this path for some time. It is clear that the annual performance review – a metrics approach based on long cycle times and relative stability – will give way to new, more nuanced approaches. Learning metrics will need to travel a parallel path.

Outcome Measurement

One of the challenges for L&D is that the useful outcome metrics are not ‘owned’ by them. These are stakeholder metrics, not ‘learning metrics’.

If we want to determine the effectiveness of a leadership development programme, the metrics we should be using will be linked to leadership performance – customer satisfaction, employee engagement levels and organisational profitability, for instance.

If we want to measure the impact and effectiveness of a functional training course the metrics we should be using are whether productivity increases, first-time error rate decreases, customer satisfaction rises, quality improves and so on.

If we want to measure the benefits of establishing a community for a specific function or around a specific topic, the metrics we should be using will be linked to similar outputs – productivity increases, increases in customer satisfaction and so on. We should also be measuring improvements in collegiate problem-solving, cross-department collaboration and co-operation, and similar outputs in the ‘working smarter together’ dimension.

These metrics need to be agreed between the key stakeholders and the L&D leaders before any structured learning and development activities are started. Without knowing and aligning with stakeholder expectations, any structured development is just a ‘shot in the dark’.

L&D also needs to consult with its stakeholders on how to obtain these metrics.

Some data may be readily available. Customer-facing departments, for example, will regularly collect CSAT (Customer Satisfaction) data. There are a number of standard methodologies to do this. Sales teams will inevitably have various measures in place to collect and analyse sales data. Technical and Finance teams will have a wealth of performance data they use. Other data will be available from HR processes – annual performance reviews, 360 feedback surveys etc.
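
As one example, a common CSAT convention is to report the percentage of respondents who choose the top two points on a five-point satisfaction scale. A minimal sketch, using invented survey responses, might look like this:

```python
# Illustrative only: invented responses on a 1-5 satisfaction scale.
responses = [5, 4, 3, 5, 2, 4, 4, 5, 3, 5]

def csat_score(ratings, top_box=(4, 5)):
    """CSAT as the percentage of respondents giving a 'satisfied' rating."""
    satisfied = sum(1 for r in ratings if r in top_box)
    return 100 * satisfied / len(ratings)

# Tracked before and after a development initiative, movement in a figure
# like this is a stakeholder metric, not a 'learning activity' metric.
print(f"CSAT: {csat_score(responses):.0f}%")
```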

These are the metrics that will provide useful insights into the effectiveness and impact of development activities managed by the L&D department.

Obviously these data are more nuanced than the number of people who have completed an eLearning course or attended a classroom training course, but they are more useful. Sometimes the causal links between the learning intervention and the change in output are not clearly identifiable. This is where careful data analysis, together with the level of trust between L&D and its stakeholders, becomes important. The 10-year-old study by the (then) ASTD and IBM, ‘The Strategic Value of Learning’, found that:

“When looking at measuring learning's value contribution to the organization, both groups (C-level and CLOs) placed greater emphasis on perceptions as measures”

One C-suite interviewee in this study said: “We measure (the effectiveness of learning) based on the success of the business projects. Not qualitative metrics, but the perceptions of the business people that the learning function worked with.”

New Measurements Every Time

Returning to George Bernard Shaw, one of the challenges of effective measurement is the need to review the metrics needed for each specific instance. No two situations are identical, so no two approaches to measuring impact are likely to be identical. Or, at least, we need to check whether our metrics are appropriate for each measurement we undertake.

As Robert Brinkerhoff says, “There is no uniform set of metrics suitable for everyone”.

Brinkerhoff’s Success Case Method addresses systems impact rather than trying to isolate the impact of learning on its own, as the more simplistic Kirkpatrick approach attempts to do. Brinkerhoff’s approach moves us from input metrics to stakeholder metrics – certainly on the right road.

Defining and agreeing metrics that will be useful for each and every project also requires a process of stakeholder engagement and performance consulting by learning professionals.

These approaches require a new way of thinking about measurement and new skills for many L&D professionals but, like Shaw’s tailor, we need to ‘behave sensibly’ and stop wasting our time trying to ‘tweak’ the old methods of measurement.

Learning, and measurement, are both becoming indistinguishable from working.

-------------

Photographs:
Shaw: Nobel Foundation 1925. Public Domain
Tape Measure:
Creative Commons Attribution-Share Alike 3.0

#itashare