Kirkpatrick
The Four Levels
Reaction
Learning
Behavior
Results
All about Kirkpatrick
In 1959, Kirkpatrick wrote four articles describing
the four levels for evaluating training programs.
He came up with the idea of defining evaluation
while working on his Ph.D. dissertation.
According to Kirkpatrick, evaluation has come to
mean different things to training and development
professionals: some equate it with a change in
behavior, others with the determination of final
results.
All about Kirkpatrick
(continued)
Kirkpatrick says they are all right, and
yet all wrong: all four levels are
important to understanding the basic
concepts of training evaluation, though
there are exceptions.
Kirkpatrick: Evaluating
Training Programs
“What is quality training?”
“How do you measure it?”
“How do you improve it?”
Evaluating
“The reason for evaluating is to
determine the effectiveness of a
training program.” (Kirkpatrick,
1994, pg. 3)
The Ten Factors of Developing
a Training Program
1. Determine needs
2. Set objectives
3. Determine subject content
4. Select qualified applicants
5. Determine the best schedule
The Ten Factors of Developing
a Training Program
6. Select appropriate facilities
7. Select qualified instructors
8. Select and prepare audiovisual
aids
9. Co-ordinate the program
10. Evaluate the program
Reasons for Evaluating
Kirkpatrick gives three reasons why
training needs to be evaluated:
1.“To justify the existence of the
training department by showing how
it contributes to the organizations’
objectives and goals.”
Reasons for Evaluating
2. “To decide whether to continue or
discontinue training programs.”
3. “To gain information on how to
improve future training programs.”
(Kirkpatrick, 1994, pg. 18)
The Four Levels
Reaction
Learning
Behavior
Results
“The Four Levels represent a
sequence of ways to evaluate
(training) programs….As you move
from one level to the next, the
process becomes more difficult and
time-consuming, but it also provides
more valuable information.”
(Kirkpatrick, 1994, pg. 21)
Reaction:
is a measure of how participants react
to the training program.
is “a measure of customer
satisfaction.” (Kirkpatrick, 1994,
pg. 21)
Learning:
is the change in participants’ attitudes,
the increase in their knowledge, or the
improvement in their skills that results
from participating in the program.
Learning
Measuring learning in any training
program means answering at least one
of these questions:
Did the attitudes change positively?
Is the knowledge acquired related and
helpful to the task?
Is the skill acquired related and helpful to
the task?
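The learning questions above can be approximated with simple before-and-after test scores. A minimal sketch, assuming hypothetical pre-test and post-test scores on a 0–100 scale (the numbers are illustrative, not from Kirkpatrick's text):

```python
# Hypothetical pre-test and post-test scores (0-100) for five participants.
pre = [55, 60, 48, 70, 62]
post = [72, 75, 66, 80, 70]

# Per-participant gain shows whether knowledge or skill increased.
gains = [after - before for before, after in zip(pre, post)]
average_gain = sum(gains) / len(gains)

print(f"Average gain: {average_gain:.1f} points")
```

A positive average gain suggests learning occurred, though attitude change would still need a separate instrument.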
Behavior
Level 3 attempts to evaluate how
much transfer of knowledge, skills,
and attitude occurs after the
training.
The four conditions Kirkpatrick
identifies for changes to occur:
Desire to change
Knowledge of what to do and
how to do it
Work in the right climate
Reward for (positive) change
When all conditions are met,
the employee must:
Recognize an opportunity to use the
behavioral changes.
Make the decision to use the
behavioral changes.
Decide whether or not to continue
using the behavioral changes.
When evaluating change in
behavior, decide:
When to evaluate
How often to evaluate
How to evaluate
Guidelines for evaluating
behavior:
Use a control group
Allow time for change to occur
Evaluate before and after
Survey/interview observers
Get a 100% response or use sampling
Repeat evaluation, as appropriate
Consider cost versus benefits
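The control-group and before-and-after guidelines combine naturally: the change attributable to training is the trained group's change minus the control group's change. A minimal sketch with hypothetical supervisor ratings (the function name and all numbers are illustrative, not from the source):

```python
from statistics import mean

def net_behavior_change(trained_before, trained_after,
                        control_before, control_after):
    """Change observed in the trained group minus the change in the
    control group, isolating the effect of the training itself."""
    trained_delta = mean(trained_after) - mean(trained_before)
    control_delta = mean(control_after) - mean(control_before)
    return trained_delta - control_delta

# Hypothetical 1-5 supervisor ratings, before and after training.
effect = net_behavior_change([2, 4], [4, 5], [3, 3], [3, 4])
```

Here the trained group improved by 1.5 points and the control group by 0.5, so 1.0 point of improvement is plausibly due to the training.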
Results
Level 4 is the most important and
difficult of all - determining final
results after training.
Evaluation Questions:
Increased production?
Improved quality?
Decreased costs?
Improved safety numbers?
Increased sales?
Reduced turnover?
Higher profits?
Guidelines for evaluating
results:
Use a control group.
Allow time for results to be achieved.
Measure before and after the program.
Repeat the measurements, as needed.
Consider cost versus benefits.
Be satisfied with evidence if proof is not
possible.
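The "cost versus benefits" guideline can be expressed as a simple return-on-investment calculation. A minimal sketch (the figures are hypothetical; Kirkpatrick's text does not prescribe a formula):

```python
def training_roi_percent(benefits, costs):
    """Net benefit of the program expressed as a percentage of its cost."""
    return (benefits - costs) / costs * 100

# Hypothetical figures: $120,000 in measured benefits, $80,000 in costs.
roi = training_roi_percent(120_000, 80_000)

print(f"ROI: {roi:.0f}%")
```

In practice the hard part is the numerator: Level 4 benefits (quality, turnover, safety) must first be converted to money, which is why Kirkpatrick advises being satisfied with evidence when proof is not possible.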
Case Study #1
INTEL CORPORATION
Intel’s Compromise
of the Kirkpatrick Model
Intel uses the four-level model both as
an analysis instrument, to determine
initial training needs and the design of
its training program, and as an
evaluation tool.
Intel’s Compromise
of the Kirkpatrick Model
Intel’s use of the model is unique in
that the designers of the training
program worked backwards through the
analysis, starting with Level Four.
The Model
This implementation of the
Kirkpatrick Model stands as vivid
testimony to the model’s versatility,
both as a training tool and as a way
to develop fledgling training
programs.
The Model
It also reflects the open-mindedness
of Intel’s senior executives in their
inventive use of the model and of
Kirkpatrick’s insights.
How Intel applies the analysis
to their training program
Level Four. “Determine the
organizations’ structure and future
needs.”
Level Three. Change the
environmental conditions and
employee conditions to improve
business indicators.
How Intel applies the analysis
to their training program
Level Two. “Design a training
program that would ensure a transfer
of deficient skills and knowledge.”
Level One. Use a questionnaire,
matched to participants’ skill level,
that would instruct and inspire training
participants.
How Intel applies evaluation
to their training program
Level One - Questionnaire.
Level Two - Demonstrate competency,
create action plans through group
simulations.
Level Three - Follow-up to determine if
action plans were met (specific steps to
implement concepts of what was learned).
Level Four - Ongoing process of tracking
business indicators.
Case Study #2
ST. LUKE’S HOSPITAL
St. Luke’s is unique -
Evaluated an outdoor-based training
program rather than a classroom one.
Results analyzed statistically to
determine the significance of any
change.
Evaluation led to recommendations
for future programs.
The New Questionnaire
Used before attendance in the program.
Used 3 months after completion of the
program.
Used again 6 months after completion of
the program.
(Communication showed statistically significant
improvement, and Group Effectiveness showed
statistically significant change.)
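A before/after questionnaire comparison like St. Luke's can be checked for statistical significance with a paired t-test on each participant's score change. A minimal stdlib-only sketch (the function and all scores are hypothetical, not the hospital's data):

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    """t statistic for paired before/after scores; a larger |t| means
    the mean change is less likely to be due to chance alone."""
    diffs = [a - b for b, a in zip(before, after)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Hypothetical communication scores, before and 3 months after the program.
t_stat = paired_t([3.0, 3.5, 2.8, 3.2], [3.6, 4.0, 3.1, 3.9])
```

The t statistic would then be compared against a critical value for n−1 degrees of freedom to declare the improvement statistically significant.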
Kirkpatrick’s 4 Levels of
Evaluation are:
Level 1 - Reaction: how participants reacted
to the program.
Level 2 - Learning: what participants
learned from the program.
Level 3 - Behavior: whether what was
learned is being applied on the job.
Level 4 - Results: whether that application
is achieving results.
Post-test Questions
(1) Name three ways evaluation results can
be measured.
(2) Do all 4 Levels have to be used?
(3) Do they have to be used in 1,2,3,4
order?
(4) Is Kirkpatrick’s method of evaluation
summative or formative?
(5) Which developmental “view” does
Kirkpatrick use? (discrepancy,
“IF YOU THINK TRAINING IS
EXPENSIVE, TRY IGNORANCE.”
and, remember, the definition of
ignorance is
repeating the same behavior, over and
over, and expecting different results!

Contenu connexe

Tendances

Simplistic approach of krik patrick
Simplistic approach of krik patrickSimplistic approach of krik patrick
Simplistic approach of krik patrickrhimycrajan
 
Kirkpatrick4 levels
Kirkpatrick4 levelsKirkpatrick4 levels
Kirkpatrick4 levelsNicole
 
Kirkpatrick's Four-Level Training Evaluation Model
Kirkpatrick's Four-Level Training Evaluation ModelKirkpatrick's Four-Level Training Evaluation Model
Kirkpatrick's Four-Level Training Evaluation ModelMaram Barqawi
 
Kirkpatricks Levels Presentation
Kirkpatricks Levels PresentationKirkpatricks Levels Presentation
Kirkpatricks Levels PresentationLarry Weas
 
Kirkpatrick's Levels of Training Evaluation - Training and Development
Kirkpatrick's Levels of Training Evaluation - Training and DevelopmentKirkpatrick's Levels of Training Evaluation - Training and Development
Kirkpatrick's Levels of Training Evaluation - Training and DevelopmentManu Melwin Joy
 
Interrogating evaluation 2015 inductionb
Interrogating evaluation 2015 inductionbInterrogating evaluation 2015 inductionb
Interrogating evaluation 2015 inductionbRita Ndagire Kizito
 
The Benefits Of Utilizing Kirkpatrick’S Four Levels Of
The Benefits Of Utilizing Kirkpatrick’S Four Levels OfThe Benefits Of Utilizing Kirkpatrick’S Four Levels Of
The Benefits Of Utilizing Kirkpatrick’S Four Levels Ofwendystein
 
Kirkpatrick's model
Kirkpatrick's modelKirkpatrick's model
Kirkpatrick's modelcindyyew
 
Kirkpatrick model
Kirkpatrick modelKirkpatrick model
Kirkpatrick modelSanK6
 
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder TulsianiFour Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder TulsianiRavinder Tulsiani
 
KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)
KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)
KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)vina serevina
 
Learner Experience (model for training evaluation)
Learner Experience (model for training evaluation)Learner Experience (model for training evaluation)
Learner Experience (model for training evaluation)Shahla Khan
 
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...Lambda Solutions
 
Kaufman’s five levels of evaluation of trainning
Kaufman’s five levels of evaluation of trainningKaufman’s five levels of evaluation of trainning
Kaufman’s five levels of evaluation of trainningSaranya Dhanesh Kumar
 
Module 8: Assessment and Evaluation
Module 8: Assessment and EvaluationModule 8: Assessment and Evaluation
Module 8: Assessment and EvaluationCardet1
 

Tendances (20)

Simplistic approach of krik patrick
Simplistic approach of krik patrickSimplistic approach of krik patrick
Simplistic approach of krik patrick
 
Kirkpatrick4 levels
Kirkpatrick4 levelsKirkpatrick4 levels
Kirkpatrick4 levels
 
Kirkpatrick's Four-Level Training Evaluation Model
Kirkpatrick's Four-Level Training Evaluation ModelKirkpatrick's Four-Level Training Evaluation Model
Kirkpatrick's Four-Level Training Evaluation Model
 
Levels 1-4 Evaluation
Levels 1-4 EvaluationLevels 1-4 Evaluation
Levels 1-4 Evaluation
 
Kirkpatricks Levels Presentation
Kirkpatricks Levels PresentationKirkpatricks Levels Presentation
Kirkpatricks Levels Presentation
 
Kirkpatrick's Levels of Training Evaluation - Training and Development
Kirkpatrick's Levels of Training Evaluation - Training and DevelopmentKirkpatrick's Levels of Training Evaluation - Training and Development
Kirkpatrick's Levels of Training Evaluation - Training and Development
 
Interrogating evaluation 2015 inductionb
Interrogating evaluation 2015 inductionbInterrogating evaluation 2015 inductionb
Interrogating evaluation 2015 inductionb
 
Kirkpatrick training module_2016
Kirkpatrick training module_2016Kirkpatrick training module_2016
Kirkpatrick training module_2016
 
The Benefits Of Utilizing Kirkpatrick’S Four Levels Of
The Benefits Of Utilizing Kirkpatrick’S Four Levels OfThe Benefits Of Utilizing Kirkpatrick’S Four Levels Of
The Benefits Of Utilizing Kirkpatrick’S Four Levels Of
 
Kirkpatrick's model
Kirkpatrick's modelKirkpatrick's model
Kirkpatrick's model
 
Kirkpatrick's model
Kirkpatrick's modelKirkpatrick's model
Kirkpatrick's model
 
Kirkpatrick model
Kirkpatrick modelKirkpatrick model
Kirkpatrick model
 
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder TulsianiFour Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
 
KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)
KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)
KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)
 
Learner Experience (model for training evaluation)
Learner Experience (model for training evaluation)Learner Experience (model for training evaluation)
Learner Experience (model for training evaluation)
 
Training evaluation
Training evaluationTraining evaluation
Training evaluation
 
4 Quadrant Approach
4 Quadrant Approach4 Quadrant Approach
4 Quadrant Approach
 
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...
 
Kaufman’s five levels of evaluation of trainning
Kaufman’s five levels of evaluation of trainningKaufman’s five levels of evaluation of trainning
Kaufman’s five levels of evaluation of trainning
 
Module 8: Assessment and Evaluation
Module 8: Assessment and EvaluationModule 8: Assessment and Evaluation
Module 8: Assessment and Evaluation
 

Similaire à Kirkpatrick's 4 Levels Training Evaluation Model

Krickpatrick basic level of evaluation
Krickpatrick basic level of evaluationKrickpatrick basic level of evaluation
Krickpatrick basic level of evaluationSajan Ks
 
Training evaluation models
Training evaluation modelsTraining evaluation models
Training evaluation modelsMegha Anilkumar
 
Review of literature
Review of  literatureReview of  literature
Review of literaturedhanarajnaik
 
Kirkpatricks Foul Levels Evaluation.pptx
Kirkpatricks Foul Levels Evaluation.pptxKirkpatricks Foul Levels Evaluation.pptx
Kirkpatricks Foul Levels Evaluation.pptxJohnnyGGalla
 
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptxPRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptxJOHNNYGALLA2
 
AkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdj
AkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdjAkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdj
AkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdjRagaviS16
 
Chapter 11 Training Evaluation.ppt
Chapter 11 Training Evaluation.pptChapter 11 Training Evaluation.ppt
Chapter 11 Training Evaluation.pptDr. Nazrul Islam
 
Measuring Learning Impact
Measuring Learning ImpactMeasuring Learning Impact
Measuring Learning ImpactIskandar Noor
 
G.training evaluation by jyoti k
G.training evaluation by jyoti kG.training evaluation by jyoti k
G.training evaluation by jyoti kjyoti karvande
 
Training evaluation
Training evaluationTraining evaluation
Training evaluationNancy Raj
 
Evaluation models
Evaluation modelsEvaluation models
Evaluation modelsMaarriyyaa
 
Unit 5- training evalutaion pptx
Unit 5- training evalutaion  pptxUnit 5- training evalutaion  pptx
Unit 5- training evalutaion pptxManoj Kumar
 
Training Evaluation Model.pptx
Training Evaluation Model.pptxTraining Evaluation Model.pptx
Training Evaluation Model.pptxHitkarshSethi2
 
Evaluation models by dr.shazia zamir by
Evaluation models by dr.shazia zamir by Evaluation models by dr.shazia zamir by
Evaluation models by dr.shazia zamir by Dr.Shazia Zamir
 
evaluation of program.pptx
evaluation of program.pptxevaluation of program.pptx
evaluation of program.pptxMahwishBukhari3
 
Level1trainingevaluation
Level1trainingevaluationLevel1trainingevaluation
Level1trainingevaluationSandy Clare
 
OBJEKTIF PPT MODEL.pptx
OBJEKTIF PPT MODEL.pptxOBJEKTIF PPT MODEL.pptx
OBJEKTIF PPT MODEL.pptxSitiHafidah1
 

Similaire à Kirkpatrick's 4 Levels Training Evaluation Model (20)

Krickpatrick basic level of evaluation
Krickpatrick basic level of evaluationKrickpatrick basic level of evaluation
Krickpatrick basic level of evaluation
 
Training evaluation models
Training evaluation modelsTraining evaluation models
Training evaluation models
 
Review of literature
Review of  literatureReview of  literature
Review of literature
 
Kirkpatricks Foul Levels Evaluation.pptx
Kirkpatricks Foul Levels Evaluation.pptxKirkpatricks Foul Levels Evaluation.pptx
Kirkpatricks Foul Levels Evaluation.pptx
 
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptxPRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
 
AkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdj
AkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdjAkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdj
AkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdj
 
Chapter 11 Training Evaluation.ppt
Chapter 11 Training Evaluation.pptChapter 11 Training Evaluation.ppt
Chapter 11 Training Evaluation.ppt
 
Measuring Learning Impact
Measuring Learning ImpactMeasuring Learning Impact
Measuring Learning Impact
 
MED07_joycepagkatipunan.pdf
MED07_joycepagkatipunan.pdfMED07_joycepagkatipunan.pdf
MED07_joycepagkatipunan.pdf
 
Training Evaluation
Training EvaluationTraining Evaluation
Training Evaluation
 
G.training evaluation by jyoti k
G.training evaluation by jyoti kG.training evaluation by jyoti k
G.training evaluation by jyoti k
 
Training evaluation
Training evaluationTraining evaluation
Training evaluation
 
Evaluation models
Evaluation modelsEvaluation models
Evaluation models
 
Unit 5- training evalutaion pptx
Unit 5- training evalutaion  pptxUnit 5- training evalutaion  pptx
Unit 5- training evalutaion pptx
 
Training Evaluation Model.pptx
Training Evaluation Model.pptxTraining Evaluation Model.pptx
Training Evaluation Model.pptx
 
Evaluation models by dr.shazia zamir by
Evaluation models by dr.shazia zamir by Evaluation models by dr.shazia zamir by
Evaluation models by dr.shazia zamir by
 
Kirkspatrick model
Kirkspatrick modelKirkspatrick model
Kirkspatrick model
 
evaluation of program.pptx
evaluation of program.pptxevaluation of program.pptx
evaluation of program.pptx
 
Level1trainingevaluation
Level1trainingevaluationLevel1trainingevaluation
Level1trainingevaluation
 
OBJEKTIF PPT MODEL.pptx
OBJEKTIF PPT MODEL.pptxOBJEKTIF PPT MODEL.pptx
OBJEKTIF PPT MODEL.pptx
 

Dernier

The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxThe Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxLoriGlavin3
 
What is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfWhat is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfMounikaPolabathina
 
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICESSALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICESmohitsingh558521
 
DSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningDSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningLars Bell
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsPixlogix Infotech
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024Stephanie Beckett
 
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024BookNet Canada
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brandgvaughan
 
Rise of the Machines: Known As Drones...
Rise of the Machines: Known As Drones...Rise of the Machines: Known As Drones...
Rise of the Machines: Known As Drones...Rick Flair
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxLoriGlavin3
 
unit 4 immunoblotting technique complete.pptx
unit 4 immunoblotting technique complete.pptxunit 4 immunoblotting technique complete.pptx
unit 4 immunoblotting technique complete.pptxBkGupta21
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii SoldatenkoFwdays
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxLoriGlavin3
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLScyllaDB
 
Anypoint Exchange: It’s Not Just a Repo!
Anypoint Exchange: It’s Not Just a Repo!Anypoint Exchange: It’s Not Just a Repo!
Anypoint Exchange: It’s Not Just a Repo!Manik S Magar
 
Time Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directionsTime Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directionsNathaniel Shimoni
 
Moving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfMoving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfLoriGlavin3
 
TrustArc Webinar - How to Build Consumer Trust Through Data Privacy
TrustArc Webinar - How to Build Consumer Trust Through Data PrivacyTrustArc Webinar - How to Build Consumer Trust Through Data Privacy
TrustArc Webinar - How to Build Consumer Trust Through Data PrivacyTrustArc
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr BaganFwdays
 

Dernier (20)

The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxThe Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
 
What is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfWhat is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdf
 
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICESSALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
 
DSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningDSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine Tuning
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and Cons
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024
 
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brand
 
Rise of the Machines: Known As Drones...
Rise of the Machines: Known As Drones...Rise of the Machines: Known As Drones...
Rise of the Machines: Known As Drones...
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
 
unit 4 immunoblotting technique complete.pptx
unit 4 immunoblotting technique complete.pptxunit 4 immunoblotting technique complete.pptx
unit 4 immunoblotting technique complete.pptx
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptx
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQL
 
Anypoint Exchange: It’s Not Just a Repo!
Anypoint Exchange: It’s Not Just a Repo!Anypoint Exchange: It’s Not Just a Repo!
Anypoint Exchange: It’s Not Just a Repo!
 
Time Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directionsTime Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directions
 
Moving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfMoving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdf
 
TrustArc Webinar - How to Build Consumer Trust Through Data Privacy
TrustArc Webinar - How to Build Consumer Trust Through Data PrivacyTrustArc Webinar - How to Build Consumer Trust Through Data Privacy
TrustArc Webinar - How to Build Consumer Trust Through Data Privacy
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan
 

Kirkpatrick's 4 Levels Training Evaluation Model

  • 3. All about Kirkpatrick In 1959, Kirkpatrick wrote four articles describing the four levels for evaluating training programs. He was working on his dissertation for a Ph.D. when he came up with the idea of defining evaluation. Evaluation, as according to Kirkpatrick, seems to have multiple meanings to training and developmental professionals. Some think evaluation is a change in behavior, or the determination of the final results.
  • 4. All about Kirkpatrick (continued) Kirkpatrick says they are all right, and yet all wrong. All four levels are important in understanding the basic concepts in training. There are exceptions, however.
  • 5. Kirkpatrick: Evaluating Training Programs “What is quality training?” “How do you measure it?” “How do you improve it?”
  • 6. Evaluating “The reason for evaluating is to determine the effectiveness of a training program.” (Kirkpatrick, 1994, pg. 3)
  • 7. The Ten Factors of Developing a Training Program 1. Determine needs 2. Set objectives 3. Determine subject content 4. Select qualified applicants 5. Determine the best schedule
  • 8. The Ten Factors of Developing a Training Program 6. Select appropriate facilities 7. Select qualified instructors 8. Select and prepare audiovisual aids 9. Co-ordinate the program 10. Evaluate the program
  • 9. Reasons for Evaluating Kirkpatrick gives three reasons ‘why’ there is a need to evaluate training: 1.“To justify the existence of the training department by showing how it contributes to the organizations’ objectives and goals.”
  • 10. Reasons for Evaluating 2. “To decide whether to continue or discontinue training programs.” 3. “To gain information on how to improve future training programs.” (Kirkpatrick, 1994, pg. 18)
  • 12. “The Four Levels represent a sequence of ways to evaluate (training) programs….As you move from one level to the next, the process becomes more difficult and time-consuming, but it also provides more valuable information.” (Kirkpatrick, 1994, pg. 21)
  • 13. Reaction: is the measuring of the reaction of the participants in the training program. is “a measure of customer satisfaction.” (Kirkpatrick, 1994, pg. 21)
  • 14. Learning: is the change in the participants’ attitudes, or an increase in knowledge, or greater skills received, as a result of the participation of the program.
  • 15. Learning The measuring of learning in any training program is the determination of at least one of these measuring parameters: Did the attitudes change positively? Is the knowledge acquired related and helpful to the task? Is the skill acquired related and helpful to the task?
  • 16. Behavior Level 3 attempts to evaluate how much transfer of knowledge, skills, and attitude occurs after the training.
  • 17. The four conditions Kirkpatrick identifies for changes to occur: Desire to change Knowledge of what to do and how to do it Work in the right climate Reward for (positive) change
  • 18. When all conditions are met, the employee must: Realize an opportunity to use the behavioral changes. Make the decision to use the behavioral changes. Decide whether or not to continue using the behavioral changes.
  • 19. When evaluating change in behavior, decide: When to evaluate How often to evaluate How to evaluate
  • 20. Guidelines for evaluating behavior: Use a control group Allow time for change to occur Evaluate before and after Survey/interview observers Get 100% response or sampling Repeat evaluation, as appropriate Consider cost versus benefits
  • 21. Results Level 4 is the most important and difficult of all - determining final results after training.
  • 22. Evaluation Questions: Increased production? Improved quality? Decreased costs? Improved safety numbers? Increased sales? Reduced turnover? Higher profits?
  • 23. Guidelines for evaluating results: Use a control group. Allow time for results to be achieved. Measure before and after the program. Repeat the measurements, as needed. Consider cost versus benefits. Be satisfied with evidence if proof is not possible.
  • 24. Case Study #1 INTEL CORPORATION
  • 25. Intel’s Compromise of the Kirkpatrick Model Intel uses the four-level model as an analysis instrument to determine the initial training needs and design of its training program; as well as using the model for evaluations.
  • 26. Intel’s Compromise of the Kirkpatrick Model Their uniqueness of using the model is in the fact that the designers of the training program worked backwards in the analysis of the training, starting with Level Four.
  • 27. The Model This implementation of the Kirkpatrick Model stands as vivid testimony to the versatility of the model as a training tool, and in developing fledgling training programs.
  • 28. The Model
    It also reflects the open-mindedness of Intel’s senior executives in their inventive use of the model and of Kirkpatrick’s genius and vision.
  • 29. How Intel applies the analysis to their training program
    Level Four - “Determine the organization’s structure and future needs.”
    Level Three - Change the environmental conditions and employee conditions to improve business indicators.
  • 30. How Intel applies the analysis to their training program
    Level Two - “Design a training program that would ensure a transfer of deficient skills and knowledge.”
    Level One - Use a questionnaire, matched to participants’ skill levels, that would instruct and inspire training participants.
  • 31. How Intel applies evaluation to their training program
    Level One - Questionnaire.
    Level Two - Demonstrate competency; create action plans through group simulations.
    Level Three - Follow up to determine whether action plans were met (specific steps to implement concepts of what was learned).
    Level Four - Ongoing process of tracking business indicators.
  • 32. Case Study #2 ST. LUKE’S HOSPITAL
  • 33. St. Luke’s is unique -
    Evaluation of an outdoor-based training program, not a classroom program.
    Results analyzed statistically to determine the significance of any change.
    Evaluation led to recommendations for future programs.
  • 34. The New Questionnaire
    Used before attendance in the program.
    Used 3 months after completion of the program.
    Used again 6 months after completion of the program.
    (Communication showed statistically significant improvement, and Group Effectiveness showed statistically significant change.)
  • 35. Kirkpatrick’s 4 Levels of Evaluation are:
    Level 1 - Reaction: how participants reacted to the program.
    Level 2 - Learning: what participants learned from the program.
    Level 3 - Behavior: whether what was learned is being applied on the job.
    Level 4 - Results: whether that application is achieving results.
  • 36. Post-test Questions
    (1) Name three ways evaluation results can be measured.
    (2) Do all 4 Levels have to be used?
    (3) Do they have to be used in 1, 2, 3, 4 order?
    (4) Is Kirkpatrick’s method of evaluation summative or formative?
    (5) Which developmental “view” does Kirkpatrick use? (discrepancy,
  • 37. “IF YOU THINK TRAINING IS EXPENSIVE, TRY IGNORANCE.” and, remember, the definition of ignorance is repeating the same behavior, over and over, and expecting different results!

Editor's Notes

  1. The reason Kirkpatrick wanted to develop his Four-Level Model was to clarify the meaning and process of ‘evaluation’ in a training program. If there is no change in behavior but there is a change in skills, knowledge, or attitudes, then using only part of the model (not all levels) is acceptable. If the purpose of the training program is to change behavior, then all four levels apply. Other authors on evaluating training programs have proposed various strategies, but Kirkpatrick is credited with developing and masterminding the Four-Level Model. Kirkpatrick aims the Model at executives and middle management; however, it works well in most other training areas.
  2. These are questions HRD coordinators ask about training performance, the starting criteria, and the expectations for the resulting training program. Business training operations need quantitative measures as well as qualitative measures; a happy medium between these two is the ideal position from which to fully understand the training needs and fulfill the program’s development. Quantitative - the research methodology in which the investigator’s “values, interpretations, feelings, and musings have no place in the positivist’s view of the scientific inquiry.” (Borg and Gall, 1989) cont.
  3. The end results of an evaluation are, one hopes, positive for both upper management and the program coordinators.
  4. 1. Ask participants or bosses, use testing, or ask others who are familiar with the needs or objectives. Some examples are surveys or interviews.
     2. a. What results are you trying to achieve?
        b. What behaviors do you want the participants to have at the end of the training program?
        c. What knowledge, skills, and/or attitudes do you want your pupils to demonstrate at the end of the training program?
     3. Determine subject content to meet the needs and objectives.
     4. Four decisions:
        a. Who is best suited to receive the training?
        b. Are the training programs required by law (affirmative action)?
        c. Voluntary or required?
        d. Should hourly and salaried employees be in the same class or be segregated?
     5. A solid week or intermittent days? How often should breaks be taken? Should lunch be brought in, or should participants be allowed to leave for an hour?
  5. 6. Facilities should be comfortable, convenient, and appropriate.
     7. a. In-house instructors or outside contractors?
        b. Do instructors need to be ‘tailored’ to the special needs of the training program?
     8. Two purposes: a. Maintain interest. b. Help communicate ideas and transfer skills. Both purposes can be accomplished by using single, special-interest videocassettes or some type of packaged program.
     9. Two scenarios: a. Frustration, and b. The needs of the instructor.
     10. The determinants of a training program’s effectiveness are planning and implementation.
  6. 1. If and when downsizing occurs, this statement will have more meaning than ever for some unlucky people. Upper management often regards HRD departments as overhead, not as contributing directly to production.
  7. 2. Pilot courses may be implemented to see whether the participants gain the necessary knowledge, skills, or behavioral changes to make the program work. 3. Kirkpatrick lists eight factors for improving the effectiveness of a training program. These eight factors closely follow the Ten Factors of Developing a Training Program and serve as a feedback loop on them.
  8. All of these levels are important. However, in later examples of this model, you will see how large corporations have taken the Kirkpatrick Model and used all of it or only part of it, and some have even reversed the order of the levels.
  9. The reactions of the participants must be positive for the program to survive, grow, and improve. Reactions reach back to bosses and subordinates alike. This word-of-mouth gossip reaction can either make the program or break it. Here ‘customer’ refers to the participants in the training program.
  10. A training program must improve at least one of these three learning traits (knowledge, skills, or attitudes) to be effective. The best-case scenario is an improvement in all three traits; however, according to Kirkpatrick, improving only one learning trait is all it takes for a training program to be effective.
  11. Guidelines for measuring Learning: 1. Use a control group along with an experimental group to provide a comparative analysis. 2. Give a pre-test and a post-test, then measure the difference. 3. Try to get an honest and true 100% response to any interviews, surveys, or tests. 4. Using a test to measure participant learning is an effective evaluation for participant and instructor alike. However, this is not conclusive; there may be other factors involved, and results must be measured across the spectrum of the Ten Factors of Development.
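  The control-group guideline in the note above can be illustrated with a small numerical sketch. Everything below is hypothetical (the group scores, the 0-100 scale, and the helper name average_gain are invented for illustration); the idea is simply to compare the average pre-to-post gain of the trained group against that of an untrained control group.

```python
from statistics import mean

def average_gain(pre, post):
    """Average pre-to-post score gain for one group."""
    return mean(b - a for a, b in zip(pre, post))

# Hypothetical test scores (0-100) before and after the program.
trained_pre, trained_post = [62, 58, 70, 65], [78, 74, 82, 80]
control_pre, control_post = [61, 60, 68, 66], [63, 62, 69, 68]

# The estimated training effect is the trained group's gain minus the
# control group's gain (a simple difference-in-differences comparison).
effect = average_gain(trained_pre, trained_post) - average_gain(control_pre, control_post)
print(f"estimated training effect: {effect:.1f} points")
```

  Subtracting the control group's gain removes improvement that would have happened anyway (practice effects, seasonal changes), which is exactly why the note recommends a control group alongside the experimental group.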
  12. Level 3 asks the question, “What changes in behavior occurred because people attended the training?” This Level is more difficult to evaluate than Levels 1 and 2.
  13. The employee must want to make the change. The training must provide the what and the how. The employee must return to a work environment that allows and/or encourages the change. And there should be rewards - intrinsic (inner feelings of pride and achievement) and extrinsic (such as pay increases or praise).
  14. The employee may: like the new behavior and continue using it; not like the new behavior and return to doing things the “old way”; or like the change but be prevented by outside forces from continuing to use it.
  15. With Reaction and Learning, evaluation should be immediate. But evaluating change in Behavior involves some decision-making.
  16. Use a control group only if applicable; be aware that this task can be very difficult, maybe even impossible.
      Allow time for behavioral changes. Change could be immediate, as in the case of diversity training, or it can take longer, as with training in administering performance appraisals. For some programs 2-3 months is appropriate; for others, 6 months is more realistic.
      Evaluate before and after, if time and budget allow.
      Conduct interviews and surveys. Decide who is qualified for questioning and, of those qualified, whose answers would be most reliable, who is available, and whether any should not be used.
      Attempt to get a 100% response.
      Repeat the evaluation; not all employees will make the changes at the same time.
      Consider cost vs. benefit. The cost can be internal staff time or an outside expert hired to do the evaluation; the greater the possible benefits, the greater the number of dollars that can be justified. If the program will be repeated, the evaluation can be used for future program improvements.
  17. Many of these questions do not get answered. Why? Trainers do not know how to measure results in comparison to the cost of the training. Second, the results may not clearly prove that the training caused them - unless there is a direct relationship between the training and the results (e.g., sales training and the resulting sales dollars).
  18. Use a control group, again if applicable, to show that the training caused the change.
      Allow time for results; the timing differs by program and by individual.
      Measure before and after. This is easier than measuring behavior because figures are usually available - hard data such as production numbers or absenteeism.
      Repeat the measurement; you must decide when and how often to evaluate.
      Consider cost vs. benefit. Here, the amount of money spent on evaluation should be determined by the cost of the training, the potential results to be achieved, and how often the training will be repeated.
      And last, be happy with evidence of training success, because you may not get proof!
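  The before-and-after measurement described in the note above can be sketched numerically. This is a minimal, hypothetical illustration, not data from the case studies: the scores, the 1-5 scale, and the function name paired_change are all invented for this example. It computes the mean change and a simple paired t statistic from pre- and post-training questionnaire scores.

```python
from math import sqrt
from statistics import mean, stdev

def paired_change(pre, post):
    """Mean change and paired t statistic for before/after scores.

    A t value well above roughly 2 (for moderate sample sizes) suggests
    the post-training improvement is unlikely to be chance alone.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    d_bar = mean(diffs)
    t = d_bar / (stdev(diffs) / sqrt(n))  # paired t statistic
    return d_bar, t

# Hypothetical questionnaire scores (1-5 scale) for eight participants,
# measured before training and again 3 months after.
pre = [2.8, 3.1, 2.5, 3.0, 2.9, 3.2, 2.7, 3.0]
post = [3.4, 3.6, 3.1, 3.3, 3.5, 3.6, 3.2, 3.4]

d_bar, t = paired_change(pre, post)
print(f"mean change = {d_bar:.2f}, paired t = {t:.1f}")
```

  This is the kind of calculation behind phrases such as "statistically significant improvement" in the St. Luke's questionnaire results, though their actual statistical procedure is not described in this deck.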
  19. This case study used Level 1, Reaction, and Level 3, Behavior: St. Luke’s needed to improve efficiency and cost control and was looking for ways to improve management training. Outdoor-based programs have been effective in improving interdepartmental communications, increasing employee trust, and reducing boundaries between departments, thereby empowering employees. How many of you have taken part in such a program? There is an entire course of “rope and ladder” activities in the woods, some at ground level and some at higher elevations. The goal of these activities is to build trust and encourage openness and sharing.
  20. St. Luke’s program consisted of three 1-day sessions on such a course. Phase I was directed at getting acquainted: in the morning “low rope” activities and in the afternoon, “high rope” elements. Phase II was focused on building trust within the group with harder, more challenging activities. Phase III focused on individual development and increased group support. The group traveled together and had team slogans and T-shirts. Previous participants were given a questionnaire to describe what they had personally gotten from the program and how it had changed their behavior. The results were used to design a new questionnaire for future participants.
  21. Evaluation of this program showed that some of the goals were achieved and were long-lasting. Also, it showed that participants had a positive reaction to the program, which can be linked to results on the job.
  22. (1) For ways results can be measured, refer to Slide 21. (2) All four Levels do not have to be used; the St. Luke’s Hospital case study used only Levels 1 and 3. (3) The Levels do not have to be used in 1, 2, 3, 4 order; Intel worked 4, 3, 2, 1 in designing its program. (4) What is your opinion on Kirkpatrick’s method being summative or formative? Is it a combination? (5) Which developmental view do you think Kirkpatrick uses? Defend your opinion.
  23. FOOD FOR THOUGHT!