Reach and Access

Salespeople have it easy. They are measured on something very concrete: sales. Marketers, on the other hand, are measured on reach: how many people they reached with their message. There is no way to know whether the message resulted in sales, but if they don’t reach people, those people don’t buy. Learning professionals are in the same boat. It’s hard to measure whether people learned, so they track who they reached with their learning content. People may or may not learn, but they won’t get the content at all if they can’t be reached.

This means that built into every learning experience must be a mechanism to capture who had the experience. The issue is that how people access the experience is tangled up in how it is tracked, and there is a danger of creating a tracking process that is more complex than the learning itself. This is unacceptable in the age of Google. Learners expect minimal friction in accessing learning.

This struggle follows the continuum of formality in delivering training. In a formal classroom environment, responsibility for determining completion is given to the instructor, who sends that information to the LMS Admin for tracking. In a formal eLearning course, the course developer determines the criteria for completion and uses the SCORM protocol to make sure that completion is passed accurately to the LMS. Once you start looking at informal learning or microlearning, where tracking activities might hinder access to the learning, it gets more complicated.

One solution is to separate usage and learner success. You can use web analytics to measure the usage of the content and then track a post-learning assessment to see if people learned. Or you can make the tracking more passive (not requiring action by the learner) by embedding it into the content. The xAPI protocol, with its light JavaScript programming, can be good for this.
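To make that concrete, here is a rough sketch of what an embedded xAPI call could look like. The statement structure follows the xAPI spec, but the LRS endpoint, credentials, and activity ID below are placeholders I made up for illustration; every LRS issues its own.

```javascript
// Build a minimal xAPI statement saying the learner "experienced" an activity.
function buildStatement(actorEmail, activityId, activityName) {
  return {
    actor: { objectType: "Agent", mbox: "mailto:" + actorEmail },
    verb: {
      id: "http://adlnet.gov/expapi/verbs/experienced",
      display: { "en-US": "experienced" }
    },
    object: {
      objectType: "Activity",
      id: activityId,
      definition: { name: { "en-US": activityName } }
    }
  };
}

// Post the statement to the LRS. The endpoint URL and the key:secret
// credentials are placeholders, not real values.
function sendStatement(statement) {
  return fetch("https://lrs.example.com/xapi/statements", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Experience-API-Version": "1.0.3",
      "Authorization": "Basic " + btoa("key:secret")
    },
    body: JSON.stringify(statement)
  });
}
```

Dropped into the bottom of a piece of content, a single call like `sendStatement(buildStatement(email, pageUrl, pageTitle))` records the visit without asking the learner to do anything.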

With social media learning you can extract data about people’s interactions around the learning experience. You can even reintroduce the human into the equation. For instance, certain MOOCs (Massive Open Online Courses) use software like Prosolo to give peers the opportunity to mark each other “complete” for certain tasks.

Counting participation, or “butts-in-seats,” is only a proxy for learning. It is like following footsteps on a beach and trying to figure out what the experience was. This is frustrating to most learning professionals because we want to know what our real impact was. Did people learn? But is that a realistic measure to hold ourselves to? We can’t get inside people’s heads. When we use an organization’s resources to create learning experiences, we have to be held accountable for something measurable. We have to show that, at the very least, we reached people with our content. Maybe if we can simplify that process, we can move on to the next questions: Are people able to change how they process new information and challenges? Are they able to do things differently after experiencing learning?


Shifting Perspective: Recap of Learning Solutions #LSCon and Learning Ecosystems #Ecocon

How do you capture the “vibe” of two co-located conferences with over 100 sessions? The Learning Solutions and Learning and Performance Ecosystems conferences, held in late March by the eLearning Guild in Orlando, Florida, each had their own vibe, but the overarching phrase I would use to describe them is “Shifting Perspectives.” We have all heard countless times that the learning industry, like so many others, is in the crosshairs of major upheavals fueled by technology and driven by intense economic forces. These two conferences went far in showing concrete examples of thinking and methodologies that are equipped to handle this level of change. The key to making a difference is in shifting our perspective, and these sessions made a strong case for doing just that.

Bob Mosher and Conrad Gottfredson, the gurus of Performance Support, held a Morning Buzz session on Wednesday. These sessions are usually informal chats over coffee about topics of interest, but Bob and Conrad led a full-scale invasion of their topic, piling on a wealth of information and insight. Their key message was that much of what people need to know is needed at the moment when the work is being done. Learning groups need to shift their perspective from pulling people out of the work to learn through training towards bringing learning into the workflow as Performance Support. “When we enter the classroom, we leave context behind. We then have to work hard to recreate the context,” Bob explained. “With Performance Support, the context is already there in the work.”

The keynote speaker Tom Wujec used the recent history of technological disruption to show the necessity for changing perspective in order to keep up. He challenged the crowd by saying “As educators we have an obligation to help people understand how to use technology.”

The audience at Learning Solutions is always lively, fun, and a bit irreverent, but over on the Learning Ecosystems side, things were more serious. Here were the people who have been tasked in their companies with creating this amorphous thing called an Ecosystem. When Marc Rosenberg and Steve Foreman, both outspoken proponents of the Ecosystems concept, gave their presentations, the attendees were hanging on every word. Marc explained that we all already have Ecosystems; the question is whether they are robust enough to serve our needs. Again we were being encouraged to shift our perspective from focusing on what we need to deliver as learning professionals to focusing on what associates need to know to do their jobs. This expands learning beyond just training and across a spectrum of resources: Talent Management, Knowledge Management, Social Media collaboration, access to experts, and Performance Support, as well as standard training.

The person who, for me, gave the best hands-on example of this kind of shift in perspective was my friend JD Dillon, who recounted his approach over the past five years transforming corporate learning at Kaplan. Instead of focusing on the content of courses, JD focused on the knowledge that people needed access to. If it wasn’t written down and available for everyone, then it wasn’t going to be provided as learning. To that end he built a Wiki of the entire body of knowledge of how work gets done at Kaplan. More importantly, he built it and maintained it by creating a culture of collaboration: as the work process evolves, the people doing the work continuously contribute to the Wiki. The next shift happens in moving needs analysis directly to the learners themselves. Every morning everyone plays games on the Gaming-Assessment engine provided by Axonify. When they struggle, they are sent to the exact place in the Wiki where the information exists. When they win, they get points that can be traded for swag or used to bid on things like a 25-minute meeting with the CEO. This twist in focus means that the daily life of an employee is tied in with learning and contributing to knowledge. This frees the L&D department to create targeted learning that covers deeper, more impactful topics.

The world around us is shifting rapidly, and shifting our perspectives is how we will adapt and better serve our constituents. Conferences like these are good places to be reminded and encouraged in this direction.

What to Do About MicroLearning?

Ah, we finally have a new buzzword. I got the standard email yesterday: “We’ve got to do something about <insert buzzword here>.” The buzzword of the day is “MicroLearning.” We’ve been talking about “chunking” content for years without getting much traction, but dressed up in a new, more grown-up word, it gets taken more seriously. That’s cool. It’s still a good concept. People don’t have time for epic courses. Content broken down into smaller “micro” parts is easier to consume in a hurry, and it can be targeted to the right people, the right task, and the right delivery channel.

There’s a problem, though. It was always lurking behind the chunking conversation. Our current process for delivering learning content, the LMS via SCORM, is too heavy-handed for the scale we will be working at. Imagine launching a course taking longer than actually doing the course. Imagine loading many SCORM-based microlearnings into an LMS being more cumbersome than it is worth. How do we track these things in a reasonable manner?

Here are some options:


Let the Learner Self-Report

In the LMS you can load the URL for the content and let the user click complete when they are done. This is the simplest idea, and I always defer to the simplest, but it may not meet your stakeholders’ standards for data integrity.


Send Data with xAPI

The Experience API (a.k.a. Tin Can, xAPI) has the advantage of sending data to a database when the learner takes an action, rather than forcing the learner to launch the content from the LMS like SCORM does. This would simplify the process, but you would have to build a process to insert xAPI calls into your content and figure out how to get the data back into the LMS.
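For the second half of that problem, getting the data back, one approach is a periodic job that pulls completion statements from the LRS and flattens them into records your LMS can import. A sketch, with a made-up LRS endpoint and credentials; the query parameters follow the xAPI statements resource:

```javascript
// Pull "completed" statements for one activity from the LRS, then reduce
// them to rows an LMS could import. The URL and key:secret are placeholders.
async function fetchCompletions(activityId) {
  const params = new URLSearchParams({
    verb: "http://adlnet.gov/expapi/verbs/completed",
    activity: activityId
  });
  const res = await fetch("https://lrs.example.com/xapi/statements?" + params, {
    headers: {
      "X-Experience-API-Version": "1.0.3",
      "Authorization": "Basic " + btoa("key:secret")
    }
  });
  const body = await res.json();
  return toLmsRecords(body.statements);
}

// Turn raw statements into { learner, completedOn } rows, keeping only the
// first statement seen per learner (the LRS returns newest first).
function toLmsRecords(statements) {
  const seen = new Set();
  const rows = [];
  for (const s of statements) {
    const learner = s.actor.mbox;
    if (!seen.has(learner)) {
      seen.add(learner);
      rows.push({ learner, completedOn: s.timestamp });
    }
  }
  return rows;
}
```

How those rows actually land in the LMS depends on what your LMS supports, typically a flat-file import or a proprietary API.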

Track the Assessment

Load only the final assessment for a group of microlearnings into the LMS. In this way you are tracking only the successful completion of the quiz as evidence of the learning achieved through the microlearnings. The microlearnings themselves then become supporting material that the learner can launch at will. This is probably the ideal solution, but I do have one more trick up my sleeve.


Make It a Game

I bet you didn’t see that one coming. Think about a video game with rooms and levels. If you run through the rooms as fast as you can, you won’t beat the level. You need to take something from each room, a key of sorts, into the last room to win. How can we apply this to microlearning? Imagine that at the end of each microlearning you are given a key, a badge, a code, that you enter in the right place in the last module. Collecting all the keys gives you a passing score, and that is sent back to the LMS. This brings us closer to the idea of experiential learning.
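Here is a minimal sketch of how the final module might score the collected keys. The key values and the all-or-nothing passing rule are my own invention for illustration; reporting the score back to the LMS (via SCORM or xAPI) is left out.

```javascript
// Keys the final module expects, one per microlearning. These values are
// invented; in practice each module would display its own key to the learner.
const REQUIRED_KEYS = ["ROOM-1-OWL", "ROOM-2-FOX", "ROOM-3-BEAR"];

// Score is the fraction of required keys the learner has entered,
// ignoring case and stray whitespace.
function scoreKeys(enteredKeys, requiredKeys) {
  const entered = new Set(enteredKeys.map(k => k.trim().toUpperCase()));
  const found = requiredKeys.filter(k => entered.has(k)).length;
  return found / requiredKeys.length;
}

// Only a full set of keys counts as passing; the final module would then
// report that score to the LMS (not shown here).
function hasPassed(enteredKeys, requiredKeys) {
  return scoreKeys(enteredKeys, requiredKeys) === 1;
}
```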

What are your plans for MicroLearning?

Check out my friend Tom Spiglanin’s post on this topic.

It isn’t Really About the Learning

The role of learning professionals is NOT to make people learn. That’s an unrealistic task. Learners are responsible for learning. Learning happens inside people’s heads, so it is impossible to measure directly. So then what do Learning Professionals do? Why do companies give them money?
The goal of a corporate learning function is to increase individuals’ capability to produce value. A company whose employees produce more value is more valuable itself. To accomplish this goal, Learning Professionals need to create learning experiences and get the right people to participate in them. If the experiences are designed correctly, the participants should be able to do their work faster, better, smarter, or safer. The participation part is handled by a Learning Management System. An LMS, when set up correctly, facilitates participation in Learning Experiences for target audiences. You can’t grow your capability if you don’t participate. However, just because you participate doesn’t mean you grow your capability.
Most Learning Organizations stop here. Participation is easy to measure and report on. Stakeholders understand the results. The problem is that learning and Learning Management Systems are a big investment for only half of the desired results. The Learning industry has been scratching its collective head for years looking for a way to measure learning, but that is the wrong measure. We need to measure growth in capability.
Measuring growth is done by setting a baseline and measuring changes over time. The complication is that people are always learning and always adjusting their capability. To try to measure the effect of a single learning experience is going to be hard. Another way to look at it is that Learning Experiences are transactions happening on a regular basis. Capability could be tested at timed intervals and increases could be attributed to the transactions that occurred between the intervals. This kind of measurement is not easy for an LMS but could be done with the xAPI.
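As a sketch of that interval model, assuming data shapes I invented for illustration: capability is sampled at timed checkpoints, and any change between two checkpoints is attributed to the learning transactions recorded between them.

```javascript
// assessments: [{ date, score }] sorted by date -- periodic capability checks.
// transactions: [{ date, activity }] -- learning experiences as they happen.
// Returns one row per interval: the score change plus the experiences that
// occurred during it, which is the attribution described above.
function attributeGrowth(assessments, transactions) {
  const intervals = [];
  for (let i = 1; i < assessments.length; i++) {
    const start = assessments[i - 1];
    const end = assessments[i];
    // ISO date strings compare correctly as plain strings.
    const during = transactions
      .filter(t => t.date > start.date && t.date <= end.date)
      .map(t => t.activity);
    intervals.push({
      from: start.date,
      to: end.date,
      growth: end.score - start.score,
      experiences: during
    });
  }
  return intervals;
}
```

The attribution is deliberately crude, everything in the window gets the credit, but that is the trade-off the paragraph above describes: transactions are cheap to record, and the expensive measurement happens only at the checkpoints.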
Using this model, Learning Experiences could be offered as smaller components of a larger practice. The key is in creating assessments that also provide value.

How Learning Creates Value

Tim Cook, the CEO of Apple, wakes up in the middle of the night from a dream. He has had a premonition that somewhere in his company of 80,000 employees there are 100 people who will develop and market Apple’s next blockbuster product: iPet. Right now, though, these people don’t know enough about pets to make that happen. So Mr. Cook calls his Learning and Development team and tells them to deploy animal behavior training to everyone in the company at a cost of $10 million.


If Cook was the head of a division, he would count the $10 million as an expense and subtract that number, along with other expenses like salaries, from the projected revenues to show his boss that there would be profits. But Tim Cook doesn’t have a boss like you and I do. His bosses are the stockholders. They have bought Apple stock for a lot of money and they want the value of their stock to keep going up. The value of the stock is a share of the Market Value of the company. It is Cook’s job to make sure that value keeps increasing.


When Jobs and Wozniak were starting Apple Computer in their garage in the seventies, 80% of the value of companies depended on tangible assets like factories. Only 20% was created by people. So if you built another factory, you could increase the value of the company. Now the numbers are flipped: 80% of the value of companies is created by people. But people aren’t factories; how do you account for the value that they produce?


Not coincidentally, Apple’s book value of $120 billion is about 20% of its market value of $570 billion. Book value is calculated by accountants, and it lists employees as expenses and liabilities. Nowhere in the book value of Apple is any mention of the legacy of Steve Jobs, but you can be sure that it is factored into the market value.


The 100 people in Tim Cook’s dream are already contributing to the market value of the company. If Cook wants to increase that value, he needs to increase the value that those 100 people are capable of producing. Since he doesn’t know which employees will produce that increased value, he needs to increase the capability of all of his employees. He will invest $10 million in a learning program that has the potential of raising the company’s valuation 10%, or $57 billion. That is the return on investment for learning.


Not every CEO wakes up in the middle of the night with visions of new products, but most good leaders instinctively know that they need to invest in their people as producers of value even though that calculation is not figured into their accounting. The reason companies set aside budgets for learning is to increase the value of the company.

Complexity Loves Itself

I’m looking at a program that no one knew existed, that has to be renewed by the end of the year. It has 100 hours of content, a manual account setup process, and class sessions that expire. Every part of this thing spawns more complexity.
This is the problem with learning: organizations marginalize it but won’t get rid of it, because it is important. It’s just that no one wants to deal with it. This leaves the learning folks with anxiety about their relevance, plus the time and funding to build systems. The complexity of those systems soon becomes a drug they can’t quit.
Time moves on and this system fades into the background. Later, when someone like me is asked to step in and clean up, the light of day sends people scurrying. Suddenly stakeholders appear saying that this system must be supported.
What is needed is to challenge every assumption that each component of the system is built on. Then you can cut down the unnecessary details until you are left with the functionality that is actually needed. From there you can start anew and ask the questions you would ask on a new project: What value is this system providing? What is the simplest way to produce that value?

The Wrong Conversation Necessarily Becomes the Right One

I facilitated a Morning Buzz session at DevLearn in October about Learning Technology Strategy. The conversation was very robust. It seems that the topic I picked had touched a nerve. But I noticed that a friend of mine was unusually quiet. When I asked him about it later he explained that he felt it was the wrong conversation. I understood what he meant. It is the ongoing concern about putting too much energy into strategizing technology for learning instead of talking about the learning first. What do people need to learn? How do they need to learn it? That is what the conversation should be about in an ideal world.
In our world though, the conversation about technology has already taken up everyone’s energy, and without a strategy it has been going around in circles. “We want eLearning to save on travel costs.” “We want rapid development tools to meet aggressive deadlines and tight budgets.” “We want an LMS with the most features so we can automate delivery and track compliance and make all our stakeholders happy.” We jump from one tech conversation to another without any sense of desired outcomes and priorities. This endless conversation is already taking time and resources away from talking about learning content. It’s easier to get lost down the rabbit hole of tech than to think about what learners really need.
This is why I wanted to start a dialog about developing a Learning Technology Strategy. I want to reduce the noise and commotion. Having a strategy means that you can start talking about outcomes before you talk about tools. For instance, many people in the group had spent a lot of time finding the right LMS, but almost none of them were happy with their decision. This is probably because they compared available feature sets to desired feature sets instead of looking at what they wanted to get out of an LMS. Say an organization decides it wants an LMS to hold people accountable for completing learning, with a secondary benefit of the savings from automating class roster management. Focusing on delivering that requirement will make the LMS selection process faster and more effective.
The same is happening with rapid development tools, virtual classroom tools, and content management tools. All of this technology is becoming commoditized anyway. It’s hard to find a truly bad product at this point, just as it is hard to find a perfect one. The real value is in deciding how technology is used. Instead of thinking about how to use the feature sets of all the tools in your toolkit, think about what these tools make possible for the learner to find and consume.
If we get this conversation going in the right direction, maybe we can get back to the right conversation.