What to Do About MicroLearning?

Ah, we finally have a new buzzword. I got the standard email yesterday: “We’ve got to do something about <insert buzzword here>.” The buzzword of the day is “MicroLearning.” We’ve been talking about “chunking” content for years without getting much traction, but dressed up in a new, more grown-up word, it gets taken more seriously. That’s cool. It’s still a good concept. People don’t have time for epic courses. By breaking down content into smaller “micro” parts, it becomes easier to consume in a hurry and can be targeted to the right people, the right task, and the right delivery channel.

There’s a problem though. It was always lurking behind the chunking conversation: our current process for delivering learning content, the LMS via SCORM, is too heavy-handed for the scale we will be working at. Imagine launching a course taking longer than actually doing the course. Imagine loading many SCORM-based microlearnings into an LMS being more cumbersome than it is worth. How do we track these things in a reasonable manner?

Here are some options:

Self-Completions

In the LMS you can load the URL for the content and let users click complete when they are done. This is the simplest idea, and I always defer to the simplest, but it may not meet your stakeholders’ standards for data integrity.

xAPI

The Experience API (a.k.a. Tin Can, xAPI) has the advantage of sending data to a database when the learner takes an action rather than forcing the learner to launch the content from the LMS like SCORM does. This would simplify the process but you would have to build a process to insert xAPI calls into your content and figure out how to get the data back into the LMS.
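To make the idea concrete, here is a minimal sketch of what "sending data when the learner takes an action" looks like in xAPI: a statement is just actor, verb, and object, POSTed to a Learning Record Store (LRS). The endpoint URL, credentials, and activity IDs below are hypothetical placeholders; only the statement shape and the `X-Experience-API-Version` header follow the spec.

```python
import json
import urllib.request

# Hypothetical LRS endpoint and credentials -- substitute your own.
LRS_URL = "https://lrs.example.com/xapi/statements"
AUTH_HEADER = "Basic <base64-credentials>"

def build_statement(actor_email, verb_id, verb_name, activity_id, activity_name):
    """Assemble a minimal xAPI statement: actor, verb, object."""
    return {
        "actor": {"mbox": f"mailto:{actor_email}", "objectType": "Agent"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
            "objectType": "Activity",
        },
    }

def send_statement(statement):
    """POST the statement to the LRS (requires a live endpoint)."""
    req = urllib.request.Request(
        LRS_URL,
        data=json.dumps(statement).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": AUTH_HEADER,
            "X-Experience-API-Version": "1.0.3",
        },
    )
    return urllib.request.urlopen(req)

# A learner finishes a microlearning; the content itself fires this call,
# no LMS launch required.
stmt = build_statement(
    "pat@example.com",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "https://example.com/microlearning/safety-101", "Safety 101",
)
print(stmt["verb"]["display"]["en-US"])
```

The point of the sketch is that the tracking call lives inside the content, wherever it is hosted, which is exactly the process you would have to build into your microlearnings.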

Track the Assessment

Load only the final assessment for a group of microlearnings into the LMS. In this way you are only tracking the successful completion of the quiz as evidence of the learning achieved through the microlearnings. The microlearnings themselves then become supporting material that the learner can launch at will. This is probably the ideal solution but I do have one more trick up my sleeve.

Gamification

I bet you didn’t see that one coming. Think about a video game with rooms and levels. If you run through the rooms as fast as you can, you won’t beat the level. You need to take something from each room, a key of sorts, into the last room to win. How can we apply this to microlearning? Imagine that at the end of each microlearning you are given a key, a badge, a code, that you enter in the right place in the last module. Collecting all the keys gives you a passing score, and that is sent back to the LMS. This brings us closer to the idea of experiential learning.
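The scoring logic for "collect the keys" is simple enough to sketch. The module names and key codes below are invented for illustration, and the final score would be handed to the LMS by whatever mechanism (SCORM, xAPI) the last module uses.

```python
# Hypothetical key codes awarded at the end of each microlearning.
REQUIRED_KEYS = {
    "module-1": "EAGLE",
    "module-2": "RIVER",
    "module-3": "STONE",
}

def score_submission(entered_keys):
    """Return a percentage score: one point per correct key code
    typed into the final module."""
    correct = sum(
        1 for module, key in REQUIRED_KEYS.items()
        if entered_keys.get(module, "").strip().upper() == key
    )
    return round(100 * correct / len(REQUIRED_KEYS))

# Learner finished all three microlearnings and entered the codes:
score = score_submission(
    {"module-1": "eagle", "module-2": "RIVER", "module-3": "STONE"}
)
print(score)  # 100 -- a passing score the final module reports to the LMS
```

Only the last module talks to the LMS; the microlearnings themselves stay free-floating, which is what makes the scheme light-weight.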

What are your plans for MicroLearning?

Check out my friend Tom Spiglanin’s post on this topic.

May You Live in Interesting Times

I have been blessed to be a part of three very interesting times.

Right after high school, I moved to New York to go to art school. New York was just starting to have a resurgence, and there was a lot of excitement happening in the East Village Art scene, especially around Performance Art. I just so happened to have started a Performance Art Troupe in school, and we played theaters and clubs, bending the definitions of art and theater.

During the Tech bubble, I helped to start a dotcom and I got to experience that heartbreaking but exhilarating time. We really believed at the time that we were changing the world.

When I left my first eLearning Guild conference, I thought to myself “How fortunate am I to be able to do this a third time.”

When I expressed my enthusiasm to an old friend she chuckled and said “People have been predicting big changes in learning for years but I haven’t seen anything.” To be fair, she is right to some degree. There is a lot of inertia out there. Also, the change is not exclusive. Performance Art did not eradicate paintings. Dotcoms did not eradicate brick-and-mortars, and eLearning did not eradicate classrooms.

The forces at work on workplace learning, however, are relentless, and change is inevitable. This is not good news for those who want to maintain the status quo. Their ability to dismiss the changes will diminish quickly. This is good news for people who embrace the future and see the opportunities.

What are these unstoppable forces? First, there is the Internet, with open access to information written into its DNA, and the generation brought up with it, who think of learning as a self-determined right. There is the business environment, where the relationship between organization and individual has become transactional at the same time as it is being acknowledged that the value of a company is intertwined with the ability of its people to learn at the speed of change. Finally, there is technology that is creating the ease of connection and access that fundamentally changes the way we learn.

This confluence of learning and technology means that anyone with passion around these two things gets a front row seat to a great show.

“May you live in interesting times” is known as the Chinese curse but it is not Chinese and it is not a curse. It is a blessing to be involved in events that are shaping the way that humanity grows.

May you live in interesting times.

Deconstructing the Learning Management System

The Learning Management System is Undead. Everyone wants to kill it but no one does.

We are stuck in this no-win situation because it makes sense to have a system that tracks learning data and manages logistics; however, the cognitive load and resource drain created by its complex workflows lead us all to question whether it is worth it. We will remain in this limbo state until we resolve the contradictions. The time has come to face this conundrum, but in order to do so we need to understand what the core problem is. The workflow structure of the LMS is layered with artifacts from past learning practices, much like rock strata contain the fossils of creatures from eons ago. Unearthing these structures and examining their flawed assumptions can be a start toward a more useful learning architecture.

Let’s go back to the early days of the corporate LMS, when it was really a classroom management system. Unlike K-12, where classes occur daily, and academia, where classes are given on a semester basis, the scheduling of classroom-based training in the workplace is highly variable. For this reason, LMS design had to be hierarchical to accommodate all possible scenarios. The basic structure is this:
  • Course: the container of the content delivered to learners
  • Class: an instance of delivering the course
  • Session: an actual time and place where learners are presented with the content
Why is it necessary to track down to this level of granularity? LMS vendors are trying to sell more functionality around these objects: reporting on course completion, scheduling of people and resources for classes, attendance at sessions, etc. Without the hierarchy, the data structure necessary for this functionality would be hard to manage. The problem is that this level of complexity makes things easier for the LMS programmers but for the L&D professionals and learners it just adds to the strain of using the system.
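The three-level hierarchy described above can be sketched as a data model. The field names here are illustrative, not any particular vendor's schema; the point is how even a simple report has to walk all three levels.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Session:
    """An actual time and place where learners meet the content."""
    start: str          # e.g. "2016-03-01 09:00"
    location: str
    attendees: List[str] = field(default_factory=list)

@dataclass
class TrainingClass:
    """An instance of delivering the course."""
    instructor: str
    sessions: List[Session] = field(default_factory=list)

@dataclass
class Course:
    """The container of the content delivered to learners."""
    title: str
    classes: List[TrainingClass] = field(default_factory=list)

course = Course("Safety 101")
spring = TrainingClass(
    "R. Smith",
    [Session("2016-03-01 09:00", "Room 4", ["pat"])],
)
course.classes.append(spring)

# Reporting on attendance means traversing all three levels:
attended = [a for c in course.classes for s in c.sessions for a in s.attendees]
print(attended)
```

Every report, roster, and wait-list hangs off this nesting, which is convenient for the database and burdensome for everyone who has to navigate it.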

An important component of making this hierarchy work is the concept of the registration. On a practical level, registrations are used for printing rosters, capturing attendance, and managing wait-lists, class sizes, food orders, and penalties for no-shows. On a data level, a registration, which records an intention to participate in learning, creates the first record in the database that all subsequent transactions can be based on. For L&D and learners, though, it adds another layer of workflow that creates more confusion.

Now let’s move on to eLearning. Even though eLearning is nothing like classroom learning as far as workflow goes, no one wants to have a separate classroom system and eLearning system. So one structure needs to be contorted to accommodate both processes. This leads to some pretty obvious contradictions that we all live with. For starters, eLearning is a transactional activity, not a hierarchical one, yet to fit in an LMS we need to create separate eLearning courses and eLearning activities (the equivalent of classes). Also, registrations are not necessary for eLearning, but they are so integral to the LMS data strategy that they can’t be removed. This creates an extra step that learners do not understand.

Once we accept the flawed premise that eLearning must follow classroom training workflows, we now have to solve for problems that this model creates. This is like the problem people have when they try to fit unwieldy metaphors to complex real-life situations. Here’s an example: in classroom training, instructors are the arbiters of learning. If they feel that a student has learned, they give them a completion for the course. The completion becomes the coin-of-the-realm for learning data. With eLearning, there is no instructor to verify your completion. If the LMS simply links to the material, there is no way to prove that the learner “completed” the course. The proxy for the instructor is the end of course quiz. This is designed to prove a negative. Just like attendance of a class is not a guarantee of learning but not attending class is a reliable indicator of not learning, so too, the ability to answer questions about the content is not a guarantee of learning but the inability to answer those questions is a reliable indicator of not learning. Building a whole system to prove a negative seems a bit weak.

To create this dynamic in a foolproof (read: human-proof) way, various protocols like AICC and SCORM need to be followed to control the flow of data between the learner and the database. This tight control leads to much of the difficulty in using and supporting these courses. If you are going to reliably report on this data, you need to follow some logical rules that take all possible scenarios into account. For instance, it might seem intuitive that you can make changes to a course any time you want, but if you change the learning content itself, aren’t you invalidating the completions that have occurred so far? This kind of logic may be valid from a data-integrity standpoint, but it kills usability. The need for this complexity is created by the Frankenstein monster we put in place when we try to create a hierarchical system that is meant to solve for all situations.

Now add Social Media and Mobile, where control is neither practical nor desired, and you have a crisis. The structure of the LMS was designed to control participation data, but we don’t learn that way. We learn by interacting with people and ideas in countless ways. The Internet, which was designed to circumvent control, is allowing us to learn without restrictions. The LMS is not equipped to handle that. This is why we have to address these issues now. The latest successor to SCORM, called xAPI (the API formerly known as Tin Can), begins to address some of these problems. It tracks transactions rather than hierarchies, and it doesn’t require that you discover learning through the LMS, so it can capture data more freely. But this is just the tip of the iceberg.

This seemingly intractable problem was created when vendors attempted to give us everything we wanted in one package. The answer lies in the L&D industry doing some soul searching about what it is that the learner really needs. I could go on forever about the nuances of the LMS workflow, but the point is this: we need to make the LMS more usable for learners and for organizations. To do that, we need to continue this exploration.

 

The Flipped LMS

Hooboy, do people like to complain about their LMS (Learning Management System). As evidenced by this conversation on #chat2lrn that I was a part of, L&D folk would like to wish away this beast of a software program. The thing is, though, that no one ever really does get rid of their LMS. Why is that?

I believe that the root cause of this contradiction lies in two conflicting drivers: 1) It is important to the business to track training. 2) The way we track training now gets in the way of learning. Many people including myself have argued the first point so I’ll focus now on the second point. In the current process of using an LMS, in order to track training, you must control access to content. But we live in the Internet age now and we all know that content wants to be free. If we cannot resolve this disconnect, there is no hope for improving our relationship with our learning architecture.

The Flipped Classroom is a concept that breathed new life into the education world by addressing its disconnects. Perhaps the idea of a Flipped LMS can bring us to a solution. In the Flipped Classroom, instruction and experimentation (homework) were separated and switched: the instruction was provided by video at home, and the teacher supported the experimentation in the classroom. Imagine if we separated the content from the assessment. Put the assessment in the LMS, where it can track whether someone learned, and let the content exist outside the LMS, where it is always accessible to anyone. The assessment can have a landing page (most assessment tools can do this) that provides context for the information being assessed: why it’s important, how it is assessed, and where to learn what you need to improve your score. Here would be the link to the content. There could be three assessments per program, all orchestrated by the LMS: a pre-test, a post-assessment, and a follow-up assessment to reinforce the learning.

This way you are using the LMS for what it does best. By allowing multiple attempts and multiple sources of learning, you are letting the learner be more flexible and you are tracking improvement over time with less complexity.

But how do we know who accessed the content? This is the beauty of the idea. By splitting up access and assessment you also split up what you have to measure. For assessment, you must track individuals and so you need the LMS, but to track the reach of your content, you only need numbers of users and visits, not individuals. This can be done by any web analytics tool like Google Analytics.
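Measuring reach without tracking individuals can be as simple as tallying visits and distinct (anonymized) visitors. In practice a tool like Google Analytics does this for you; the log rows and content path below are made up to show the principle.

```python
import hashlib

# Made-up access-log rows: (client address, requested path).
log_rows = [
    ("10.0.0.1", "/content/safety-101"),
    ("10.0.0.2", "/content/safety-101"),
    ("10.0.0.1", "/content/safety-101"),
]

def reach(rows, path):
    """Return (total visits, unique visitors) for one piece of content."""
    visits = [ip for ip, p in rows if p == path]
    # Hash the address so we count distinct visitors without
    # storing anyone's identity.
    uniques = {hashlib.sha256(ip.encode()).hexdigest() for ip in visits}
    return len(visits), len(uniques)

visits, visitors = reach(log_rows, "/content/safety-101")
print(visits, visitors)  # 3 2
```

Counts like these answer "how far did the content travel?" while the LMS answers "who passed the assessment?", and neither system has to do the other's job.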

Hopefully the clarity produced by this split in efforts will help L&D folk move on to more important conversations than how much they hate their LMS. That is, until they get the next bill from the LMS vendor.

It isn’t Really About the Learning

The role of learning professionals is NOT to make people learn. That’s an unrealistic task. Learners are responsible for learning. Learning happens inside people’s heads, so it is impossible to measure directly. So then what do Learning Professionals do? Why do companies give them money?

The goal of a corporate learning function is to increase individuals’ capability to produce value. A company whose employees produce more value is more valued itself. To accomplish this goal, Learning Professionals need to create learning experiences and get the right people to participate in them. If the experiences are designed correctly, the participants should be able to do their work faster, better, smarter, or safer. The participation part is handled by a Learning Management System. An LMS, when set up correctly, facilitates participation in Learning Experiences for target audiences. You can’t grow your capability if you don’t participate. However, just because you participate doesn’t mean you grow your capability.

Most Learning Organizations stop here. Participation is easy to measure and report on. Stakeholders understand the results. The problem is that learning and Learning Management Systems are a big investment for only getting half of the desired results. The Learning industry has been scratching its collective head for years looking for a way to measure learning, but that is the wrong measure. We need to measure growth in capability.

Measuring growth is done by setting a baseline and measuring changes over time. The complication is that people are always learning and always adjusting their capability, so trying to measure the effect of a single learning experience is going to be hard. Another way to look at it is that Learning Experiences are transactions happening on a regular basis. Capability could be tested at timed intervals, and increases could be attributed to the transactions that occurred between the intervals. This kind of measurement is not easy for an LMS but could be done with the xAPI.

Using this model, Learning Experiences could be offered as smaller components of a larger practice. The key is in creating assessments that also provide value.
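The interval-based measurement described above can be sketched in a few lines. The scores, dates, and activity names are invented for illustration; in a real system both the assessments and the learning transactions would come from xAPI statements in an LRS.

```python
from datetime import date

assessments = [          # capability checks at timed intervals
    (date(2016, 1, 1), 60),
    (date(2016, 4, 1), 72),
    (date(2016, 7, 1), 75),
]
experiences = [          # learning transactions, as (date, activity)
    (date(2016, 2, 10), "microlearning: ladder safety"),
    (date(2016, 3, 5), "coaching session"),
    (date(2016, 5, 20), "community discussion"),
]

def attribute_growth(assessments, experiences):
    """For each interval between assessments, pair the score change
    with the learning experiences that occurred inside it."""
    report = []
    for (d0, s0), (d1, s1) in zip(assessments, assessments[1:]):
        during = [a for d, a in experiences if d0 < d <= d1]
        report.append((s1 - s0, during))
    return report

for delta, during in attribute_growth(assessments, experiences):
    print(f"+{delta} points after: {during}")
```

Notice that no single experience gets sole credit; the growth belongs to the whole set of transactions in the interval, which is a more honest claim than "this course caused that score."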

How Learning Creates Value

[Diagram: how learning creates corporate value]
Tim Cook, the CEO of Apple, wakes up in the middle of the night from a dream. He has had a premonition that somewhere in his company of 80,000 employees there are 100 people who will develop and market Apple’s next blockbuster product: iPet. Right now, though, these people don’t know enough about pets to make that happen. So Mr. Cook calls his Learning and Development team and tells them to deploy animal behavior training to everyone in the company at a cost of $10 million.

 

If Cook was the head of a division, he would count the $10 million as an expense and subtract that number, along with other expenses like salaries, from the projected revenues to show his boss that there would be profits. But Tim Cook doesn’t have a boss like you and I do. His bosses are the stockholders. They have bought Apple stock for a lot of money and they want the value of their stock to keep going up. The value of the stock is a share of the Market Value of the company. It is Cook’s job to make sure that value keeps increasing.

 

When Jobs and Wozniak were starting Apple Computer in their garage in the seventies, 80% of the value of companies depended on tangible assets like factories. Only 20% was created by people. So if you built another factory, you could increase the value of the company. Now the numbers are flipped: 80% of the value of companies is created by people. But people aren’t factories; how do you account for the value that they produce?

 

Not coincidentally, Apple’s book value of $120 billion is about 20% of its market value of $570 billion. Book value is calculated by accountants, and it lists employees as expenses and liabilities. Nowhere in the book value of Apple is there any mention of the legacy of Steve Jobs, but you can be sure that it is factored into the market value.

 

The 100 people in Tim Cook’s dream are already contributing to the market value of the company. If Cook wants to increase that value, he needs to increase the value that those 100 people are capable of producing. Since he doesn’t know which employees will produce that increased value, he needs to increase the capability of all of his employees. He will invest $10 million in a learning program that has the potential of raising the company valuation 10%, or $57 billion. That is the return on investment for learning.

 

Not every CEO wakes up in the middle of the night with visions of new products, but most good leaders instinctively know that they need to invest in their people as producers of value even though that calculation is not figured into their accounting. The reason companies set aside budgets for learning is to increase the value of the company.

Everyone has Expectations – a review of Charles Jennings’ 70-20-10 Framework Explained

Everyone has expectations. When we come to a story, we always bring our own point of view. This seems especially true of the 70-20-10 paradigm in learning. I’m guessing that this is part of the reason that the framework’s biggest proponent, Charles Jennings, wrote a guidebook for it. First, people get hung up on the numbers. Learning folk do get a little fixated on numbers, and the 70-20-10 moniker is a magnet for them. The truth is that the numbers are simply a shorthand for the idea, and they originate from anecdotal information collected in various studies. Successful people, when asked what contributed to their ability to do their jobs, relate that 70% came from doing something, 20% came from talking to people, and 10% came from absorbing information from courses, books, etc. The reason that the numbers became such a rallying cry for the Learning profession is that they pointed out the disparity between the value of formal learning and the spend on it. The numbers also lead people to believe that there is segmentation going on: 70 is good and 10 is bad. But this is not the case. The numbers are showing a spectrum of solutions that work together to form a whole.
 
I am not immune to having expectations myself. I have been an avid follower of Mr. Jennings, and I generally understood and accepted the point of the 70-20-10 concept long ago. I saw that there was a “guidebook” on Jennings’ 70-20-10 Forum website and ordered it right away, expecting it to be a recipe book. There are no recipes, though. It’s not that kind of idea. I have been tasked in my company with creating a technology infrastructure to support the 70-20-10 framework. This could turn into a journey worthy of Don Quixote. Learning Technologists are always wary of looking for solutions before they understand what the problem is. However, even Jennings encourages the idea of creating an environment conducive to 70-20-10 thinking. But what does this mean? We can’t create work experiences. If we created a Project Clearinghouse site, how would it be governed? Even the 20% is tricky. You can’t just plug an enterprise Social Media tool into the Learning infrastructure and expect people to learn from it. I was expecting answers to my personal questions.
 
At first I was frustrated, because a good portion of the book seemed to be dedicated to a sales job. There were testimonials of success based on implementing the framework at top companies. Although I’ve gotten a lot out of Mr. Jennings’ other writing, I was concerned that I would not get what I needed out of this book. But I’m not the typical reader, and what I was seeing as a sales job was actually a well-crafted case for the framework. When I went back to my notes, I saw that I had indeed gotten a lot of ideas. The structure of the book is based on the adoption and implementation of each component of the framework as it relates to addressing current challenges in the workplace. As I went through these scenarios, I began to get ideas for how technology can be used to facilitate this work.
 
The book identifies what needs to change in an organization in order to adapt the 70-20-10 principles and it tells you what kinds of changes you can expect as a result. It explains what tools you need to give your instructional designers, coaches and managers and it opens up new opportunities to rethink learning.
 
What is interesting is the flexibility. Given the confusion about the numbers, I was expecting rigid adherence to the principles, but there is a lot of room for interpretation. For instance, there is no better way to learn than by experience, but that is not always possible. Jennings explains that sometimes a story that elicits the same reactions can be effective. I think what is missing in much of the discussion of 70-20-10 is the importance of creating context for the learning that is taking place. The book briefly mentions the importance of establishing a cognitive framework for understanding what is learned.
 
I’m still working on my project of making our learning architecture “70-20-10 ready” but I think it may not be as complicated as I thought. As long as the systems are open to different types of learning, we should be fine. It is up to the Learning function to figure out how to make those learning opportunities available and to provide the right context.