Reach and Access

Salespeople have it easy: they are measured on something very concrete, sales. Marketers, on the other hand, are measured on reach: how many people did they reach with their message? There is no way to know whether the message resulted in sales, but if they don't reach people, those people don't buy. Learning professionals are in the same boat. It's hard to measure whether people learned, but we can measure who was reached with the learning content. Learners may or may not learn, but they certainly won't if the content never reaches them.

This means that every learning experience must have a built-in mechanism to capture who had the experience. The problem is that how people access the experience becomes tangled up in how it is tracked, and there is a danger of creating a tracking process more complex than the learning itself. That is unacceptable in the age of Google: learners expect minimal friction in accessing learning.

This struggle follows the continuum of formality in training delivery. In a formal classroom environment, the instructor determines completion and sends that information to the LMS administrator for tracking. In a formal eLearning course, the course developer defines the completion criteria and uses the SCORM protocol to make sure completion is passed accurately to the LMS. Once you move into informal learning or microlearning, where tracking activities might hinder access to the learning itself, it gets more complicated.
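As a rough illustration of the formal-course end of that continuum, here is a sketch of how a SCORM 1.2 course might report completion to the LMS. In a real course the LMS-provided API object is discovered by walking parent frames; here it is passed in directly for clarity, and the 80% passing threshold is an arbitrary example, not anything from the standard:

```javascript
// Sketch of SCORM 1.2 completion reporting. The `api` parameter stands in
// for the LMS-provided API object; the score threshold (80) is an
// arbitrary example chosen by the course developer.
function reportCompletion(api, score) {
  api.LMSInitialize("");                                   // open the session
  api.LMSSetValue("cmi.core.score.raw", String(score));    // record the score
  api.LMSSetValue("cmi.core.lesson_status",
    score >= 80 ? "passed" : "failed");                    // record completion status
  api.LMSCommit("");                                       // persist the data
  api.LMSFinish("");                                       // close the session
}
```

The key point is that the course, not the learner, decides when and how completion is recorded, which is what makes SCORM tracking accurate but also what ties it to a formally structured course.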

One solution is to separate usage from learner success: use web analytics to measure usage of the content, then track a post-learning assessment to see whether people learned. Another is to make the tracking passive (requiring no action by the learner) by embedding it in the content. The xAPI protocol, with its lightweight JavaScript programming, is well suited to this.
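As a sketch of what that passive xAPI tracking might look like, the snippet below builds a minimal xAPI statement (actor, verb, object) and posts it to a Learning Record Store. The LRS endpoint URL, credentials, and activity ID are hypothetical placeholders; only the statement structure and headers follow the xAPI specification:

```javascript
// Build a minimal xAPI statement: who (actor) did what (verb) to what (object).
function buildStatement(email, name, activityId) {
  return {
    actor: { objectType: "Agent", mbox: "mailto:" + email, name: name },
    verb: {
      id: "http://adlnet.gov/expapi/verbs/experienced",
      display: { "en-US": "experienced" }
    },
    object: { objectType: "Activity", id: activityId }
  };
}

// Post the statement to the LRS. Called on page load, this tracks the
// learner without asking them to do anything. The endpoint and the
// Basic-auth credentials below are hypothetical placeholders.
function sendStatement(statement) {
  return fetch("https://lrs.example.com/xapi/statements", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Experience-API-Version": "1.0.3",
      "Authorization": "Basic " + btoa("lrs-user:lrs-secret")
    },
    body: JSON.stringify(statement)
  });
}
```

Because the statement fires from inside the content itself, access and tracking stop competing with each other: the learner just opens the page.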

With social media learning you can extract data about people's interactions around the learning experience. You can even reintroduce the human into the equation. For instance, some MOOCs (Massive Open Online Courses) use software like Prosolo to give peers the opportunity to mark each other "complete" for certain tasks.

Counting participation or "butts-in-seats" is only a proxy for learning. It is like following footsteps on a beach and trying to figure out what the experience was. This is frustrating to most learning professionals because we want to know what our real impact was. Did people learn? But is that a realistic measure to hold ourselves to? We can't get inside people's heads. When we use an organization's resources to create learning experiences, we have to be held accountable for something measurable. We have to show that, at the very least, we reached people with our content. Maybe if we can simplify that process, we can move on to the next questions: Are people able to change how they process new information and challenges? Are they able to do things differently after experiencing learning?


I'm a big believer in chunking learning so that the most effective modality can be applied to each component. Fact sharing definitely does not need to happen synchronously. I'm less sure about behavior change, because 1) people are not always clear about which behaviors they want to encourage, and 2) I'm not convinced that we can change behaviors directly. Continue reading

Lessons Learned from the Failure of the Jedi Talent Strategy

Note: I was inspired by the recent lrnchat Star Wars theme and decided to write this little parody:

Memo from Princess Leia to the provisional government on Coruscant, Committee for Post-Rebellion Leadership Development.

Dear committee members,

I am writing to give you an overview of the Jedi talent strategies for recruitment, performance management, learning and development, and succession planning, and of the key areas of weakness that led to the dissolution of the program.

  • Recruitment: The use of arcane credentials like the midi-chlorian blood count ignores personal characteristics, both positive and negative, as evidenced by Anakin Skywalker. It can also lead to nepotism, as in the case of Luke Skywalker, which discourages diversity. Continue reading

Learning Analytics Course #LAK11 – Final Thoughts

I have just completed my first Massive Open Online Course (MOOC), Learning and Knowledge Analytics, taught by George Siemens of Athabasca University. Participating in a course of this format is both challenging and rewarding. I did not participate at the level I had wanted to (2-3 hours per week versus the recommended 4-5), but I made sure that I stuck it out to the end. I had two goals for the course that I would like to discuss here. Continue reading

10 Ideas for Integrating Social Media with Formal Courses

There is a lot of excitement around using social media for learning, and with good reason. Social media tools support informal learning, which is recognized as the primary way that people learn (see an interview with Jay Cross, author of "Informal Learning"). However, some of this buzz reminds me of the early days of eLearning, when it was proclaimed that "classroom training is dead" (see this article from way back in 1996). Obviously, classroom training still predominates (Bersin). Continue reading

Joining a MOOC

I’ve just “registered” for “Learning and Knowledge Analytics 11” a Massive Open Online Course (MOOC) taught by George Siemens of Athabasca University.

There has been some discussion of MOOCs as a learning mode, both positive and negative, and even a post from George Siemens about his concerns. I am really excited about anything open and connected, so I am looking forward to participating in my first one. The MOOC will also let me sample new tools like Elluminate, Netvibes and Moodle.

Part of the program is to share information as the course progresses, so you will see postings on this blog and on my Twitter stream under the hashtag #lak11.

MOOCs, Moodle, SaaS! Oh, Dr. Seuss would be proud.