Shifting Perspective: Recap of Learning Solutions #LSCon and Learning Ecosystems #Ecocon

How do you capture the “vibe” of two collocated conferences with over 100 sessions? The Learning Solutions and Learning and Performance Ecosystems conferences, held in late March by the eLearning Guild in Orlando, Florida, each had their own vibe, but the overarching phrase I would use to describe them would be “Shifting Perspectives.” We have all heard countless times that the learning industry, like so many others, is in the crosshairs of major upheavals fueled by technology and driven by intense economic forces. These two conferences went far in showing concrete examples of thinking and methodologies that are equipped to handle this level of change. The key to making a difference is in shifting our perspective, and these sessions made a strong case for doing just that.

Bob Mosher and Conrad Gottfredson, the gurus of Performance Support, held a Morning Buzz session on Wednesday. These sessions are usually informal chats over coffee about topics of interest, but Bob and Conrad led a full-scale invasion of their topic, piling on a wealth of information and insight. Their key message was that much of what people need to know is needed at the moment when the work is being done. Learning groups need to shift their perspective from pulling people out of the work to learn through training toward bringing learning into the workflow as Performance Support. “When we enter the classroom, we leave context behind. We then have to work hard to recreate the context,” Bob explained. “With Performance Support, the context is already there in the work.”

The keynote speaker Tom Wujec used the recent history of technological disruption to show the necessity for changing perspective in order to keep up. He challenged the crowd by saying “As educators we have an obligation to help people understand how to use technology.”

The audience at Learning Solutions is always lively, fun and a bit irreverent, but over on the Learning Ecosystems side, things were more serious. Here were the people who have been tasked in their companies with creating this amorphous thing called an Ecosystem. When Marc Rosenberg and Steve Foreman, both outspoken proponents of the Ecosystems concept, gave their presentations, the attendees were hanging on every word. Marc explained that we all already have Ecosystems. The question is whether they are robust enough to serve our needs. Again we were being encouraged to shift our perspectives: from focusing on what we need to deliver as learning professionals to focusing more on what associates need to know to do their jobs. This expands learning beyond just training and across a spectrum of resources: Talent Management, Knowledge Management, Social Media collaboration, access to experts and Performance Support as well as standard training.

The person who for me gave the best hands-on example of this kind of shift in perspective was my friend JD Dillon, who recounted his approach over the past five years transforming corporate learning at Kaplan. Instead of focusing on the content of courses, JD focused on the knowledge that people needed access to. If it wasn’t written down and available for everyone, then it wasn’t going to be provided as learning. To that end he built a Wiki of the entire body of knowledge of how work gets done at Kaplan. More importantly, he built it and maintained it by creating a culture of collaboration: as the work process evolves, the people doing the work continuously contribute to the Wiki. The next shift happens in moving needs analysis directly to the learners themselves. Every morning everyone plays games on the gaming-assessment engine provided by Axonify. When they struggle, they are sent to the exact place in the Wiki where the information exists. When they win, they get points that can be traded for swag or used to bid on things like a 25-minute meeting with the CEO. This twist in focus means that the daily life of an employee is tied in with learning and contributing to knowledge. This frees the L&D department to create targeted learning that covers deeper, more impactful topics.

The world around us is shifting rapidly, and shifting our perspectives is how we will adapt and better serve our constituents. Conferences like these are good places to be reminded and encouraged in this direction.

What to Do About MicroLearning?

Ah, we finally have a new buzzword. I got the standard email yesterday: “We’ve got to do something about <insert buzzword here>.” The buzzword of the day is “MicroLearning.” We’ve been talking about “chunking” content for years without getting much traction, but dressed up in a new, more grown-up word, it gets taken more seriously. That’s cool. It’s still a good concept. People don’t have time for epic courses. Broken down into smaller “micro” parts, content is easier to consume in a hurry and can be targeted to the right people, the right task and the right delivery channel.

There’s a problem though. It was always lurking behind the chunking conversation. Our current process for delivering learning content, the LMS via SCORM, is too heavy-handed for the scale we will be working at. Imagine that launching a course takes longer than actually doing the course. Imagine that loading many SCORM-based microlearnings into an LMS is more cumbersome than it is worth. How do we track these things in a reasonable manner?

Here are some options:

Self-Completions

In the LMS you can load the URL for the content and let the user click “complete” when they are done. This is the simplest idea, and I always defer to the simplest, but it may not meet your stakeholders’ standards for data integrity.

xAPI

The Experience API (a.k.a. Tin Can, xAPI) has the advantage of sending data to a Learning Record Store (LRS) when the learner takes an action, rather than forcing the learner to launch the content from the LMS like SCORM does. This would simplify the process, but you would have to build a process to insert xAPI calls into your content and figure out how to get the data back into the LMS.
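
To make that concrete, here is a minimal sketch of what one of those xAPI calls might look like when a learner finishes a microlearning. The LRS endpoint, credentials and activity ID are placeholders, and a real implementation would live inside whatever authoring or delivery tooling you use.

```typescript
// Minimal sketch: report that a learner completed one microlearning by posting
// a single xAPI statement to an LRS. Endpoint, credentials and IDs are placeholders.
async function reportMicrolearningCompleted(learnerEmail: string): Promise<void> {
  const statement = {
    actor: { mbox: `mailto:${learnerEmail}` },
    verb: {
      id: "http://adlnet.gov/expapi/verbs/completed",
      display: { "en-US": "completed" },
    },
    object: {
      id: "https://example.com/microlearning/widget-safety-01", // hypothetical activity ID
      definition: { name: { "en-US": "Widget Safety, Part 1" } },
    },
  };

  await fetch("https://lrs.example.com/xapi/statements", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Experience-API-Version": "1.0.3",
      Authorization: "Basic " + btoa("lrsUser:lrsSecret"), // placeholder credentials
    },
    body: JSON.stringify(statement),
  });
}
```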

Track the Assessment

Load only the final assessment for a group of microlearnings into the LMS. In this way you are only tracking the successful completion of the quiz as evidence of the learning achieved through the microlearnings. The microlearnings themselves then become supporting material that the learner can launch at will. This is probably the ideal solution but I do have one more trick up my sleeve.

Gamification

I bet you didn’t see that one coming. Think about a video game with rooms and levels. If you run through the rooms as fast as you can, you won’t beat the level. You need to take something from each room, a key of sorts, into the last room to win. How can we apply this to microlearning? Imagine that at the end of each microlearning you are given a key, a badge, a code, that you enter in the right place in the last module. Collecting all the keys gives you a passing score, and that is sent back to the LMS. This brings us closer to the idea of experiential learning.
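
As a rough sketch of how that “last room” might work, assuming the final module is a small SCORM 1.2 package and using made-up key codes: it checks the keys the learner collected and, only if all of them are present, reports a passing score back to the LMS.

```typescript
// Sketch: the final module checks the keys collected from each microlearning and,
// if they are all present, reports a passing score via the SCORM 1.2 runtime API.
// Assumes the LMS-provided API object has already been located (e.g., window.API).
declare const API: {
  LMSInitialize(arg: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: string): string;
  LMSFinish(arg: string): string;
};

const EXPECTED_KEYS = ["ROOM1-OWL", "ROOM2-FOX", "ROOM3-BEAR"]; // hypothetical key codes

function reportIfAllKeysCollected(enteredKeys: string[]): boolean {
  const allFound = EXPECTED_KEYS.every((key) => enteredKeys.includes(key));
  if (!allFound) return false;

  API.LMSInitialize("");
  API.LMSSetValue("cmi.core.score.raw", "100");
  API.LMSSetValue("cmi.core.lesson_status", "passed");
  API.LMSCommit("");
  API.LMSFinish("");
  return true;
}
```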

What are your plans for MicroLearning?

Check out my friend Tom Spiglanin’s post on this topic.

May You Live in Interesting Times

I have been blessed to be a part of three very interesting times.

Right after high school, I moved to New York to go to art school. New York was just starting to have a resurgence, and there was a lot of excitement happening in the East Village art scene, especially around Performance Art. I just so happened to have started a Performance Art troupe in school, and we played theaters and clubs, bending the definition of art and theatre.

During the Tech bubble, I helped to start a dotcom and I got to experience that heartbreaking but exhilarating time. We really believed at the time that we were changing the world.

When I left my first eLearning Guild conference, I thought to myself “How fortunate am I to be able to do this a third time.”

When I expressed my enthusiasm to an old friend, she chuckled and said, “People have been predicting big changes in learning for years but I haven’t seen anything.” To be fair, she is right to some degree. There is a lot of inertia out there. Also, the change is not exclusive. Performance Art did not eradicate paintings. Dotcoms did not eradicate brick-and-mortar stores, and eLearning did not eradicate classrooms.

The forces at work on workplace learning, however, are relentless, and change is inevitable. This is not good news for those who want to maintain the status quo. Their ability to dismiss the changes will diminish quickly. This is good news for people who embrace the future and see the opportunities.

What are these unstoppable forces? First, the Internet, with open access to information written into its DNA, and the generation brought up with it who think of learning as a self-determined right. Then there is the business environment, where the relationship between organization and individual has become transactional at the same time as it is being acknowledged that the value of a company is intertwined with the ability of its people to learn at the speed of change. Finally, there is technology that is creating ease of connection and access that fundamentally changes the way we learn.

This confluence of learning and technology means that anyone with passion around these two things gets a front row seat to a great show.

“May you live in interesting times” is known as the Chinese curse but it is not Chinese and it is not a curse. It is a blessing to be involved in events that are shaping the way that humanity grows.

May you live in interesting times.

Deconstructing the Learning Management System

The Learning Management System is Undead. Everyone wants to kill it but no one does.

We are stuck in this no-win situation because it makes sense to have a system that tracks learning data and manages logistics; however, the cognitive load and resource drain created by its complex workflows lead us all to question whether it is worth it. We will remain in this limbo state until we resolve the contradictions. The time has come to face this conundrum, but in order to do so we need to understand what the core problem is. The workflow structure of the LMS is layered with artifacts from past learning practices, much like rock strata contain the fossils of creatures from eons ago. Unearthing these structures and examining their flawed assumptions can be a start to working towards a more useful learning architecture.

Let’s go back to the early days of the corporate LMS, when it was really a classroom management system. Unlike K-12, where classes occur daily, and academia, where classes are given on a semester basis, the scheduling of classroom-based training in the workplace is highly variable. For this reason, LMS design had to be hierarchical to accommodate all possible scenarios. The basic structure is this:
  • Course: the container of the content delivered to learners
  • Class: an instance of delivering the course
  • Session: an actual time and place where learners are presented with the content
Why is it necessary to track down to this level of granularity? Because LMS vendors are trying to sell more functionality around these objects: reporting on course completion, scheduling of people and resources for classes, attendance at sessions, etc. Without the hierarchy, the data structure necessary for this functionality would be hard to manage. The problem is that this level of complexity makes things easier for the LMS programmers, but for L&D professionals and learners it just adds to the strain of using the system.

An important component to making this hierarchy work is the concept of the registration. On a practical level, registrations are used for printing rosters, capturing attendance, managing wait-lists, class sizes and food orders, and penalizing no-shows. On a data level, a registration, which records an intention to participate in learning, creates the first record in the database that all subsequent transactions can be based on. For L&D and learners, though, it adds another layer of workflow that creates more confusion.
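
As a rough sketch of the hierarchy and the registration record described above, assuming illustrative field names rather than any particular vendor’s schema:

```typescript
// Illustrative data model for the classroom-era LMS hierarchy and the
// registration record that everything else hangs off of. Field names are
// made up for the sketch, not taken from any specific LMS.
interface Course {
  courseId: string;   // the container of the content
  title: string;
}

interface TrainingClass {
  classId: string;    // an instance of delivering the course
  courseId: string;
}

interface Session {
  sessionId: string;  // an actual time and place
  classId: string;
  startsAt: Date;
  location: string;
}

interface Registration {
  learnerId: string;  // the intention to participate: the first record
  classId: string;    // that completions, attendance, etc. are built on
  status: "registered" | "waitlisted" | "attended" | "no-show";
}
```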

Now let’s move on to eLearning. Even though eLearning is nothing like classroom learning in terms of workflow, no one wants to have a separate classroom system and eLearning system. So one structure needs to be contorted to accommodate both processes. This leads to some pretty obvious contradictions that we all live with. For starters, eLearning is a transactional activity and not hierarchical, yet to fit in an LMS we need to create separate eLearning courses and eLearning activities (the equivalent of classes). Also, registrations are not necessary for eLearning, but they are so integral to the LMS data strategy that they can’t be removed. This creates an extra step that learners do not understand.

Once we accept the flawed premise that eLearning must follow classroom training workflows, we have to solve for the problems that this model creates. This is like the problem people have when they try to fit unwieldy metaphors to complex real-life situations. Here’s an example: in classroom training, instructors are the arbiters of learning. If they feel that a student has learned, they give them a completion for the course. The completion becomes the coin-of-the-realm for learning data. With eLearning, there is no instructor to verify your completion. If the LMS simply links to the material, there is no way to prove that the learner “completed” the course. The proxy for the instructor is the end-of-course quiz. This is designed to prove a negative. Just as attendance of a class is not a guarantee of learning but not attending is a reliable indicator of not learning, so too the ability to answer questions about the content is not a guarantee of learning but the inability to answer those questions is a reliable indicator of not learning. Building a whole system to prove a negative seems a bit weak.

To create this dynamic in a foolproof (read: human-proof) way, protocols like AICC and SCORM need to be followed to control the flow of data between the learner and the database. This tight control leads to much of the difficulty in using and supporting these courses. If you are going to reliably report on this data, you need to follow logical rules that take all possible scenarios into account. For instance, it might seem intuitive that you can make changes to a course any time you want, but if you change the learning content itself, aren’t you invalidating the completions that have already occurred? This kind of logic may be valid from a data integrity standpoint, but it kills usability. The need for this complexity is created by the Frankenstein monster we put in place when we try to create a hierarchical system that is meant to solve for all situations.
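
A small example of that tight coupling is the API-discovery routine every SCORM package has to run before it can report anything. The sketch below is a simplified version of that well-known pattern, not any vendor’s specific code.

```typescript
// Simplified sketch of the classic SCORM "find the API" routine: the course
// walks up the frame hierarchy looking for the runtime object the LMS exposes.
// Until it finds one, it cannot report a single byte of data.
function findScormApi(start: Window): unknown | null {
  let win: Window | null = start;
  let attempts = 0;
  while (win && attempts < 10) {
    const w = win as any;
    if (w.API) return w.API;                 // SCORM 1.2 runtime
    if (w.API_1484_11) return w.API_1484_11; // SCORM 2004 runtime
    if (win.parent === win) break;           // reached the top frame
    win = win.parent;
    attempts++;
  }
  return null; // no API found: the course cannot record anything
}
```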

Now add Social Media and Mobile, where control is neither practical nor desired, and you have a crisis. The structure of the LMS was designed to control participation data, but we don’t learn that way. We learn by interacting with people and ideas in countless ways. The Internet, which was designed to circumvent control, is allowing us to learn without restrictions. The LMS is not equipped to handle that. This is why we have to address these issues now. The latest successor to SCORM, called xAPI (the API formerly known as Tin Can), begins to address some of these problems. It tracks transactions rather than hierarchies, and it doesn’t require that you discover learning through the LMS, so it can capture data more freely. But this is just the tip of the iceberg.

This seemingly intractable problem was created when vendors attempted to give us everything we wanted in one package. The answer lies in the L&D industry doing some soul-searching about what it is that the learner really needs. I could go on forever about the nuances of the LMS workflow. We need to make the LMS more usable for learners and for organizations, and to do that we need to continue this exploration.


The Flipped LMS

Hooboy, do people like to complain about their LMS (Learning Management System). As evidenced by this conversation on #chat2lrn that I was a part of, L&D folk would like to wish away this beast of a software program. The thing is, though, that no one ever really does get rid of their LMS. Why is that?

I believe that the root cause of this contradiction lies in two conflicting drivers: 1) it is important to the business to track training; 2) the way we track training now gets in the way of learning. Many people, including myself, have argued the first point, so I’ll focus now on the second. In the current process of using an LMS, in order to track training, you must control access to content. But we live in the Internet age now, and we all know that content wants to be free. If we cannot resolve this disconnect, there is no hope for improving our relationship with our learning architecture.

The Flipped Classroom is a concept that breathed new life into the disconnects of the education world. Perhaps the idea of a Flipped LMS can bring us to a solution. In the Flipped Classroom, instruction and experimentation (homework) were separated and switched: the instruction was provided by video at home, and the teacher supported the experimentation in the classroom. Imagine if we separated the content from the assessment. Put the assessment in the LMS, where it can track whether someone learned, and let the content exist outside the LMS, where it is always accessible to anyone. The assessment can have a landing page (most assessment tools can do this) that provides context for the information being assessed: why it’s important, how it is assessed, and where to learn what you need to improve your score. Here would be the link to the content. There could be three assessments per program, all orchestrated by the LMS: a pre-test, the post-assessment and a follow-up assessment to reinforce the learning.
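
To make the split concrete, here is a hypothetical sketch of what a flipped program might look like as data: only the three assessments are LMS items, while the content is nothing more than open links on the landing page. All names and IDs are made up for illustration.

```typescript
// Hypothetical shape of a "flipped" program: assessments tracked in the LMS,
// content living on the open web. Everything here is illustrative.
interface FlippedProgram {
  programId: string;
  landingPage: {
    whyItMatters: string;
    howItIsAssessed: string;
    contentLinks: string[]; // freely accessible, outside the LMS
  };
  assessments: {
    preTest: string;        // LMS item IDs: the only things being tracked
    postAssessment: string;
    followUp: string;
  };
}

const dataPrivacyProgram: FlippedProgram = {
  programId: "data-privacy",
  landingPage: {
    whyItMatters: "Customer trust depends on how we handle personal data.",
    howItIsAssessed: "Three short quizzes spaced over six weeks.",
    contentLinks: ["https://intranet.example.com/privacy-basics"],
  },
  assessments: {
    preTest: "ASSESS-PRIV-PRE",
    postAssessment: "ASSESS-PRIV-POST",
    followUp: "ASSESS-PRIV-FOLLOWUP",
  },
};
```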

This way you are using the LMS for what it does best. By allowing multiple attempts and multiple sources of learning, you are letting the learner be more flexible and you are tracking improvement over time with less complexity.

But how do we know who accessed the content? This is the beauty of the idea. By splitting up access and assessment you also split up what you have to measure. For assessment, you must track individuals and so you need the LMS, but to track the reach of your content, you only need numbers of users and visits, not individuals. This can be done by any web analytics tool like Google Analytics.
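
For example, if the content pages already carry a standard analytics tag, counting reach can be as simple as firing an event when a piece of content is opened. The sketch below assumes Google Analytics’ gtag.js snippet is loaded on the page and uses a made-up event name and content ID.

```typescript
// Sketch: measure reach with aggregate web analytics instead of LMS registrations.
// Assumes the Google Analytics gtag.js snippet is already loaded on the page.
declare function gtag(
  command: "event",
  eventName: string,
  params?: Record<string, unknown>
): void;

// Fire an event when a learner opens a piece of content. Only aggregate counts
// of users and visits are needed here, not individual identities.
gtag("event", "content_opened", {
  content_id: "privacy-basics",   // hypothetical content identifier
  content_group: "data-privacy",  // hypothetical program grouping
});
```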

Hopefully the clarity produced by this split in efforts will help L&D folk move on to more important conversations than how much they hate their LMS. That is, until they get the next bill from the LMS vendor.

It isn’t Really About the Learning

The role of learning professionals is NOT to make people learn. That’s an unrealistic task. Learners are responsible for learning. Learning happens inside people’s heads and so it is impossible to measure. So then what do Learning Professionals do? Why do companies give them money?

The goal of a corporate learning function is to increase individuals’ capability to produce value. A company with employees who produce more value is more valuable itself. To accomplish this goal, Learning Professionals need to create learning experiences and get the right people to participate in them. If the experiences are designed correctly, the participants should be able to do their work faster, better, smarter or safer. The participation part is handled by a Learning Management System. An LMS, when set up correctly, facilitates participation in Learning Experiences for target audiences. You can’t grow your capability if you don’t participate. However, just because you participate doesn’t mean you grow your capability.

Most Learning Organizations stop here. Participation is easy to measure and report on, and stakeholders understand the results. The problem is that learning and Learning Management Systems are a big investment for getting only half of the desired results. The Learning industry has been scratching its collective head for years looking for a way to measure learning, but that is the wrong measure. We need to measure growth in capability.

Measuring growth is done by setting a baseline and measuring changes over time. The complication is that people are always learning and always adjusting their capability. Trying to measure the effect of a single learning experience is going to be hard. Another way to look at it is that Learning Experiences are transactions happening on a regular basis. Capability could be tested at timed intervals, and increases could be attributed to the transactions that occurred between the intervals. This kind of measurement is not easy for an LMS but could be done with the xAPI.

Using this model, Learning Experiences could be offered as smaller components of a larger practice. The key is in creating assessments that also provide value.
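
As a rough sketch of the interval idea, assuming an LRS that exposes the standard xAPI statements endpoint and that assessment results are recorded with scaled scores: pull the statements between two checkpoints and compare the average score against the previous interval’s baseline. The endpoint, credentials and verb choice are assumptions for illustration.

```typescript
// Sketch: estimate capability growth by comparing average assessment scores
// recorded in an LRS across timed intervals. Endpoint and credentials are placeholders.
async function averageScaledScore(sinceIso: string, untilIso: string): Promise<number> {
  const url = new URL("https://lrs.example.com/xapi/statements");
  url.searchParams.set("verb", "http://adlnet.gov/expapi/verbs/scored");
  url.searchParams.set("since", sinceIso); // ISO 8601 timestamps bounding the interval
  url.searchParams.set("until", untilIso);

  const res = await fetch(url.toString(), {
    headers: {
      "X-Experience-API-Version": "1.0.3",
      Authorization: "Basic " + btoa("lrsUser:lrsSecret"), // placeholder credentials
    },
  });
  const body = await res.json();

  const scores: number[] = body.statements
    .map((s: any) => s.result?.score?.scaled)
    .filter((n: any): n is number => typeof n === "number");
  return scores.length ? scores.reduce((a, b) => a + b, 0) / scores.length : 0;
}

// Growth for an interval = this interval's average minus the previous baseline average.
```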

How Learning Creates Value

[Image: diagram-corp-value]
Tim Cook, the CEO of Apple, wakes up in the middle of the night from a dream. He has had a premonition that somewhere in his company of 80,000 employees there are 100 people who will develop and market Apple’s next blockbuster product: iPet. Right now, though, these people don’t know enough about pets to make that happen. So Mr. Cook calls his Learning and Development team and tells them to deploy animal behavior training to everyone in the company at a cost of $10 million.


If Cook was the head of a division, he would count the $10 million as an expense and subtract that number, along with other expenses like salaries, from the projected revenues to show his boss that there would be profits. But Tim Cook doesn’t have a boss like you and I do. His bosses are the stockholders. They have bought Apple stock for a lot of money and they want the value of their stock to keep going up. The value of the stock is a share of the Market Value of the company. It is Cook’s job to make sure that value keeps increasing.


When Jobs and Wozniak were starting Apple Computer in their garage in the seventies, 80% of the value of companies depended on tangible assets like factories. Only 20% was created by people. So if you built another factory, you could increase the value of the company. Now the numbers are flipped: 80% of the value of companies is created by people. But people aren’t factories; how do you account for the value that they produce?


Not coincidentally, Apple’s book value of $120 billion is about 20% of its market value of $570 billion. Book value is calculated by accountants, and it lists employees as expenses and liabilities. Nowhere in the book value of Apple is any mention of the legacy of Steve Jobs, but you can be sure that it is factored into the market value.


The 100 people in Tim Cook’s dream are already contributing to the market value of the company. If Cook wants to increase that value, he needs to increase the value that those 100 people are capable of producing. Since he doesn’t know which employees will produce that increased value, he needs to increase the capability of all of his employees. He will invest $10 million in a learning program that has the potential of raising the company’s valuation by 10%, or $57 billion. That is the return on investment for learning.
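
The back-of-the-envelope arithmetic behind that claim, using only the illustrative figures from this story:

```typescript
// Back-of-the-envelope version of the numbers above. These are the story's
// illustrative figures, not a real financial model.
const marketValue = 570e9;        // Apple's market value in the example ($570B)
const learningInvestment = 10e6;  // cost of the hypothetical training program ($10M)

const potentialLift = 0.10 * marketValue; // a 10% increase in valuation
console.log(potentialLift);                      // 57,000,000,000 -> $57 billion
console.log(potentialLift / learningInvestment); // 5,700 -> potential value per dollar invested
```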


Not every CEO wakes up in the middle of the night with visions of new products, but most good leaders instinctively know that they need to invest in their people as producers of value even though that calculation is not figured into their accounting. The reason companies set aside budgets for learning is to increase the value of the company.