7. Make a plan for implementation
Evaluate for organizational insights
Establish a strategy for learning evaluation
Evaluate Katzell-Kirkpatrick Levels
Evaluate engagement & experience
Take first next steps
8. Great! We rolled it out on the LMS & the intranet.
How’s the course going?
YEAH! So are you seeing success?
People really like it!
We’ll want some changes.
… how do we measure success?
17. LEVEL 1
SATISFACTION
Do participants like it?
HOW TO
• Surveys from LMS
• Surveys
• Interviews
• Ask instructor or observer
• Online reviews
IT’S NOT
A measure
of effectiveness
MORE INFO
Will Thalheimer’s Performance-Focused Smile Sheets
19. LEVEL 2
KNOWLEDGE
Did they learn it?
HOW TO
• Knowledge checks in the course content
• Tests immediately after
• Tests delayed over time
• Adaptive testing
• Confidence-based testing
IT’S NOT
A measure of ability
to do anything other
than pass the test
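Confidence-based testing pairs each answer with the learner’s stated confidence, so a confident wrong answer costs more than an admitted guess. A minimal sketch of one common scoring scheme (the weights here are illustrative assumptions, not from the talk):

```python
# Confidence-based test scoring: a confident wrong answer is penalized
# more than an admitted guess. Weights below are illustrative assumptions.
SCORES = {
    ("correct", "high"): 2,   # knows it, and knows they know it
    ("correct", "low"): 1,    # right, but unsure
    ("wrong", "low"): 0,      # admitted guess: no penalty beyond zero
    ("wrong", "high"): -2,    # confidently wrong: the dangerous case
}

def score_test(responses):
    """responses: list of (correctness, confidence) tuples."""
    return sum(SCORES[r] for r in responses)

result = score_test([("correct", "high"), ("wrong", "high"), ("correct", "low")])
print(result)  # 2 - 2 + 1 = 1
```

The interesting output isn’t the total so much as the confidently-wrong cases, which flag where training left learners misinformed rather than merely untrained.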
20. LEVEL 3
BEHAVIORS
Do they do it (correctly) out in the real world?
Check out Brinkerhoff’s Success Case Method
21. LEVEL 3
BEHAVIORS
Do they do it (correctly) out in the real world?
HOW TO
• Surveys of learners
• Surveys of managers
• Business process performance data
MORE INFO
Making Sense of xAPI
Brinkerhoff Success Case Method
22. LEVEL 4
RESULTS
Does it matter?
Did it solve the problem?
Hint: you measure this with
business metrics,
not training ones!
23. LEVEL 4
RESULTS
Does it matter?
Did it solve the problem?
HOW TO
• Measure organizational
results
IT’S NOT
“clean” – get over it
MORE INFO
Ask your org leaders how
they measure their business
(hint: it’s probably not
training data)
27. LIKES, LOOKS & LAUNCHES
PARTICIPATION
Who? How many? Where? How long? Dropoff?
HOW TO
• LMS enrollment & completion stats
• Social engagement stats
• Google Analytics
• Whatever your tools will give you
IT’S NOT
useful without the qualitative data to understand the why
IT IS …
useful, and what you share while you collect other data
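Most of these participation stats reduce to simple arithmetic over enrollment records once you pull them from the LMS. A minimal sketch with made-up records (field names are assumptions; your LMS export will differ):

```python
# Compute completion and dropoff rates from hypothetical LMS records.
# The field names here are assumptions; adapt them to your LMS export.
enrollments = [
    {"learner": "a", "started": True,  "completed": True},
    {"learner": "b", "started": True,  "completed": False},
    {"learner": "c", "started": False, "completed": False},
    {"learner": "d", "started": True,  "completed": True},
]

enrolled = len(enrollments)
started = sum(1 for e in enrollments if e["started"])
completed = sum(1 for e in enrollments if e["completed"])

completion_rate = completed / enrolled          # share of everyone enrolled
dropoff_rate = (started - completed) / started  # started but never finished

print(f"{completion_rate:.0%} completed, {dropoff_rate:.0%} dropped off")
```

As the slide warns, numbers like these only tell you *what* happened; you still need qualitative follow-up to learn *why* learners dropped off.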
33. LESSONS
LEARNED
What did we learn?
HOW TO
• Retrospective
• Start / Stop / Continue
IT’S NOT
an evaluation of the training so much as an evaluation of our process for getting here
35. STAKES TO PUT IN THE GROUND NOW
Omigosh, we need a control!
Yes – I just suggested you do this before setting your strategy.
36. STAKES TO PUT IN THE GROUND NOW
Omigosh, we need a control!
You may realize you need to start measuring something now to get a baseline.
38. 35% of TD pros’ organizations evaluated the business results of learning programs to any extent.
Elaine Biech, ATD Foundations of Talent Development, 2016 research
39. What are you already measuring? (yeah you!)
What will make the biggest impact to start measuring?
What is the easiest to start measuring? (quick wins!)
Who else will you need to involve? (friendly partners!)
What barriers will you face?
40. For crying out loud, fill out your session evaluations.
Megan Torrance | mtorrance@torrancelearning.com
What will you start measuring in 30 days? 60 days? 90 days?
Opening discussion:
We’re all professionals here, so let’s jump right in. How do we know if training works?
Goal: conversation about ways to know if things are working – not specific examples.
We recognize there’s a gap.
I clearly don’t need to tell you why this is important.
So … what are we going to do about it?
Here’s our plan for tonight.
This is not a research-based model. This is based on what I’ve seen & done in practice & learned from people who do research.
Give overview of 8Ls, history & adaptation.
Qualitative data provides insights and understanding about a problem.
Methods of collecting qualitative data:
Focus groups
Observation
Interviews
Archival materials like newspapers
Quantitative data counts or measures values and can be expressed in numerical terms.
Methods of collecting quantitative data:
Surveys
Experiments
Observations and interviews
https://keydifferences.com/difference-between-qualitative-and-quantitative-data.html
Story: the SAP team where people loved it and people hated it … but the performance is the same.
Emerged from the work of Dr. Donald Kirkpatrick and Dr. Jack Phillips
Return on Investment or ROI is a metric applied across many business functions, and for good reason. It’s the fundamental measurement of performance in business.
It’s the statement of profitability over time in relation to the money invested in the effort. Because it is difficult to get a true ROI measurement, ROI is best used when deciding between two options: for example, a website vs. a tradeshow; an additional sales representative in the field vs. more direct mail pieces; and so on.
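The ROI arithmetic itself is trivial; the hard part is attributing the benefit to the training. As a sketch with hypothetical figures (not from the talk):

```python
# ROI = (benefit - cost) / cost, computed with made-up numbers.
benefit = 120_000  # estimated business benefit attributed to the program
cost = 80_000      # total cost of designing and delivering it
roi = (benefit - cost) / cost
print(f"ROI: {roi:.0%}")  # 50%
```

Getting a defensible value for `benefit` is exactly the Level 4 problem above: it comes from business metrics, not training metrics.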
It’s worth noting that evaluation may serve a number of key quality-control functions in addition to establishing training’s business value. Some of the other objectives of evaluation are to:
Improve the quality of learning programs
Determine if a program meets its objectives
Identify potential strengths and weaknesses in the learning program
Develop a cost/benefit analysis of training and Human Resources Development (HRD) investments
Support marketing of training or HRD programs
Determine a program’s appropriateness for the target audience
Assist in decision-making about program investments and establish funding priorities