
kirkpatrick's learning and training evaluation theory

Donald L Kirkpatrick's training evaluation model - the four levels of learning evaluation
Donald L Kirkpatrick, Professor Emeritus, University of Wisconsin (where he achieved his BBA, MBA and PhD), first published his ideas in 1959, in a series of articles in the Journal of the American Society of Training Directors. The articles were subsequently included in Kirkpatrick's book Evaluating Training Programs (originally published in 1994; now in its 3rd edition - Berrett-Koehler Publishers). Donald Kirkpatrick was president of the American Society for Training and Development (ASTD) in 1975. Kirkpatrick has written several other significant books about training and evaluation, more recently with his similarly inclined son James, and has consulted with some of the world's largest corporations.

Donald Kirkpatrick's 1994 book Evaluating Training Programs defined his originally published ideas of 1959, thereby further increasing awareness of them, so that his theory has now become arguably the most widely used and popular model for the evaluation of training and learning. Kirkpatrick's four-level model is now considered an industry standard across the HR and training communities. More recently Don Kirkpatrick formed his own company, Kirkpatrick Partners, whose website provides information about their services and methods, etc.

kirkpatrick's four levels of evaluation model


The four levels of Kirkpatrick's evaluation model essentially measure:

reaction of student - what they thought and felt about the training
learning - the resulting increase in knowledge or capability
behaviour - extent of behaviour and capability improvement and implementation/application
results - the effects on the business or environment resulting from the trainee's performance

All these measures are recommended for full and meaningful evaluation of learning in organizations, although their application broadly increases in complexity, and usually cost, through the levels from level 1 to level 4.

Quick Training Evaluation and Feedback Form, based on Kirkpatrick's Learning Evaluation Model - (Excel file)

kirkpatrick's four levels of training evaluation
This grid illustrates the basic Kirkpatrick structure at a glance. The second grid, beneath this one, is the same thing with more detail.
level 1 - Reaction
what is measured: Reaction evaluation is how the delegates felt about the training or learning experience.
examples of evaluation tools and methods: 'Happy sheets', feedback forms. Verbal reaction, post-training surveys or questionnaires.
relevance and practicability: Quick and very easy to obtain. Not expensive to gather or to analyse.

level 2 - Learning
what is measured: Learning evaluation is the measurement of the increase in knowledge - before and after.
examples of evaluation tools and methods: Typically assessments or tests before and after the training. Interview or observation can also be used.
relevance and practicability: Relatively simple to set up; clear-cut for quantifiable skills. Less easy for complex learning.

level 3 - Behaviour
what is measured: Behaviour evaluation is the extent of applied learning back on the job - implementation.
examples of evaluation tools and methods: Observation and interview over time are required to assess change, relevance of change, and sustainability of change.
relevance and practicability: Measurement of behaviour change typically requires cooperation and skill of line-managers.

level 4 - Results
what is measured: Results evaluation is the effect on the business or environment by the trainee.
examples of evaluation tools and methods: Measures are already in place via normal management systems and reporting - the challenge is to relate to the trainee.
relevance and practicability: Individually not difficult, unlike whole organisation. Process must attribute clear accountabilities.

kirkpatrick's four levels of training evaluation in detail


This grid illustrates the Kirkpatrick structure in detail, and particularly the modern-day interpretation of the Kirkpatrick learning evaluation model, its usage, implications, and examples of tools and methods. This diagram is the same format as the one above but with more detail and explanation:
1) Reaction
evaluation description and characteristics: Reaction evaluation is how the delegates felt, and their personal reactions to the training or learning experience, for example: Did the trainees like and enjoy the training? Did they consider the training relevant? Was it a good use of their time? Did they like the venue, the style, timing, domestics, etc? Level of participation. Ease and comfort of experience. Level of effort required to make the most of the learning. Perceived practicability and potential for applying the learning.
examples of evaluation tools and methods: Typically 'happy sheets'. Feedback forms based on subjective personal reaction to the training experience. Verbal reaction which can be noted and analysed. Post-training surveys or questionnaires. Online evaluation or grading by delegates. Subsequent verbal or written reports given by delegates to managers back at their jobs.
relevance and practicability: Can be done immediately the training ends. Very easy to obtain reaction feedback. Feedback is not expensive to gather or to analyse for groups. Important to know that people were not upset or disappointed. Important that people give a positive impression when relating their experience to others who might be deciding whether to experience the same.
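As an illustration of how level-1 'happy sheet' responses might be tallied for a group, here is a minimal sketch in Python. The question names and the 1-5 rating scale are invented for the example; Kirkpatrick's model does not prescribe any particular form design.

```python
from statistics import mean

def summarise_reactions(responses):
    """Average each reaction question (1 = poor, 5 = excellent) across delegates."""
    summary = {}
    for question in responses[0]:
        summary[question] = round(mean(r[question] for r in responses), 2)
    return summary

# Hypothetical feedback-form responses from three delegates.
responses = [
    {"relevance": 4, "enjoyment": 5, "venue": 3},
    {"relevance": 5, "enjoyment": 4, "venue": 4},
    {"relevance": 3, "enjoyment": 4, "venue": 2},
]
print(summarise_reactions(responses))  # average score per question
```

Averages like these are quick to gather and analyse, which is exactly the point of level 1: they show whether delegates were satisfied, not whether they learned anything.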

2) Learning
evaluation description and characteristics: Learning evaluation is the measurement of the increase in knowledge or intellectual capability from before to after the learning experience: Did the trainees learn what was intended to be taught? Did the trainees experience what was intended for them to experience? What is the extent of advancement or change in the trainees after the training, in the direction or area that was intended?
examples of evaluation tools and methods: Typically assessments or tests before and after the training. Interview or observation can be used before and after, although this is time-consuming and can be inconsistent. Methods of assessment need to be closely related to the aims of the learning. Measurement and analysis is possible and easy on a group scale. Reliable, clear scoring and measurements need to be established, so as to limit the risk of inconsistent assessment. Hard-copy, electronic, online or interview-style assessments are all possible.
relevance and practicability: Relatively simple to set up, but more investment and thought required than reaction evaluation. Highly relevant and clear-cut for certain training such as quantifiable or technical skills. Less easy for more complex learning such as attitudinal development, which is famously difficult to assess. Cost escalates if systems are poorly designed, which increases work required to measure and analyse.
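The before-and-after testing described above amounts to simple arithmetic on test scores. A minimal sketch, with invented trainee names and scores:

```python
def learning_gains(pre, post):
    """Return each trainee's score change and the group's average gain."""
    gains = {name: post[name] - pre[name] for name in pre}
    average = sum(gains.values()) / len(gains)
    return gains, average

# Hypothetical pre- and post-course test scores (out of 100).
pre  = {"Ana": 55, "Ben": 60, "Cal": 70}
post = {"Ana": 75, "Ben": 72, "Cal": 88}
gains, average = learning_gains(pre, post)
print(gains)    # per-trainee improvement
print(average)  # group average gain
```

In practice the hard part is not this arithmetic but making the two tests genuinely comparable and closely related to the aims of the learning, as the grid notes.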

3) Behaviour
evaluation description and characteristics: Behaviour evaluation is the extent to which the trainees applied the learning and changed their behaviour, and this can be immediately and several months after the training, depending on the situation: Did the trainees put their learning into effect when back on the job? Were the relevant skills and knowledge used? Was there noticeable and measurable change in the activity and performance of the trainees when back in their roles? Was the change in behaviour and new level of knowledge sustained? Would the trainee be able to transfer their learning to another person? Is the trainee aware of their change in behaviour, knowledge, skill level?
examples of evaluation tools and methods: Observation and interview over time are required to assess change, relevance of change, and sustainability of change. Arbitrary snapshot assessments are not reliable because people change in different ways at different times. Assessments need to be subtle and ongoing, and then transferred to a suitable analysis tool. Assessments need to be designed to reduce subjective judgement of the observer or interviewer, which is a variable factor that can affect reliability and consistency of measurements. The opinion of the trainee, which is a relevant indicator, is also subjective and unreliable, and so needs to be measured in a consistent, defined way. 360-degree feedback is a useful method and need not be used before training, because respondents can make a judgement as to change after training, and this can be analysed for groups of respondents and trainees. Assessments can be designed around relevant performance scenarios, and specific key performance indicators or criteria. Online and electronic assessments are more difficult to incorporate - assessments tend to be more successful when integrated within existing management and coaching protocols. Self-assessment can be useful, using carefully designed criteria and measurements.
relevance and practicability: Measurement of behaviour change is less easy to quantify and interpret than reaction and learning evaluation. Simple quick-response systems are unlikely to be adequate. Cooperation and skill of observers, typically line-managers, are important factors, and difficult to control. Management and analysis of ongoing subtle assessments are difficult, and virtually impossible without a well-designed system from the beginning. Evaluation of implementation and application is an extremely important assessment - there is little point in a good reaction and a good increase in capability if nothing changes back in the job, therefore evaluation in this area is vital, albeit challenging. Behaviour change evaluation is possible given good support and involvement from line managers or trainees, so it is helpful to involve them from the start, and to identify benefits for them, which links to the level 4 evaluation below.

4) Results
evaluation description and characteristics: Results evaluation is the effect on the business or environment resulting from the improved performance of the trainee - it is the acid test. Measures would typically be business or organisational key performance indicators, such as: volumes, values, percentages, timescales, return on investment, and other quantifiable aspects of organisational performance, for instance: numbers of complaints, staff turnover, attrition, failures, wastage, non-compliance, quality ratings, achievement of standards and accreditations, growth, retention, etc.
examples of evaluation tools and methods: It is possible that many of these measures are already in place via normal management systems and reporting. The challenge is to identify which measures relate to the trainee's input and influence, and how. Therefore it is important to identify and agree accountability and relevance with the trainee at the start of the training, so they understand what is to be measured. This process overlays normal good management practice - it simply needs linking to the training input. Failure to link to training input type and timing will greatly reduce the ease by which results can be attributed to the training. For senior people particularly, annual appraisals and ongoing agreement of key business objectives are integral to measuring business results derived from training.
relevance and practicability: Individually, results evaluation is not particularly difficult; across an entire organisation it becomes very much more challenging, not least because of the reliance on line-management, and the frequency and scale of changing structures, responsibilities and roles, which complicates the process of attributing clear accountability. Also, external factors greatly affect organisational and business performance, which can cloud the true cause of good or poor results.

Since Kirkpatrick established his original model, other theorists (for example Jack Phillips), and indeed Kirkpatrick himself, have referred to a possible fifth level, namely ROI (Return On Investment). In my view ROI can easily be included in Kirkpatrick's original fourth level, 'Results'. The inclusion of a fifth level is therefore arguably only relevant if the assessment of Return On Investment might otherwise be ignored or forgotten when referring simply to the 'Results' level.

Learning evaluation is a widely researched area. This is understandable, since the subject is fundamental to the existence and performance of education around the world, not least universities, which of course contain most of the researchers and writers. While Kirkpatrick's model is not the only one of its type, for most industrial and commercial applications it suffices; indeed, most organisations would be absolutely thrilled if their training and learning evaluation, and thereby their ongoing people-development, were planned and managed according to Kirkpatrick's model. For reference, should you be keen to look at more ideas, there are many to choose from...
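The fifth-level ROI figure is usually expressed, in Phillips-style evaluations, as net programme benefits divided by programme costs, as a percentage. A minimal sketch of that arithmetic, with invented figures:

```python
def roi_percent(benefits, costs):
    """ROI % = (net programme benefits / programme costs) * 100."""
    return (benefits - costs) / costs * 100

# Hypothetical example: a programme costing 60,000 that produced
# 150,000 of monetised benefit.
print(roi_percent(benefits=150_000, costs=60_000))  # 150.0
```

The arithmetic is trivial; the genuinely hard part, as the 'Results' row above explains, is isolating and monetising the benefits that can fairly be attributed to the training.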

Jack Phillips' Five Level ROI Model
Daniel Stufflebeam's CIPP Model (Context, Input, Process, Product)
Robert Stake's Responsive Evaluation Model
Robert Stake's Congruence-Contingency Model
Kaufman's Five Levels of Evaluation
CIRO (Context, Input, Reaction, Outcome)
PERT (Program Evaluation and Review Technique)
Alkins' UCLA Model
Michael Scriven's Goal-Free Evaluation Approach
Provus's Discrepancy Model
Eisner's Connoisseurship Evaluation Models
Illuminative Evaluation Model
Portraiture Model
and also the American Evaluation Association

Also look at Leslie Rae's excellent Training Evaluation and the tools available on this site, which, given Leslie's experience and knowledge, will save you the job of researching and designing your own tools.
