
Evaluating Professional Development

By

Thomas R. Guskey
University of Kentucky

(Article reprinted with permission from Dr. Guskey)



For many years educators have operated under the premise that professional development is good by definition and, therefore, more is always better. If you want to improve your professional development program, simply add a day or two. Today, however, we live in an age of accountability. Students are expected to meet higher standards, teachers are held accountable for student results, and professional developers are asked to show that what they do really matters. For many, this is a scary situation. They live in fear that a new superintendent or board member will come in who wants to know about the payoff from the district's investment in professional development. If the answers are not there, heads may roll and programs may get axed.

Now it may be that your professional development programs and activities are state-of-the-art efforts designed to turn teachers and school administrators into reflective, team-building, global-thinking, creative, ninja risk-takers. They also may be bringing a multitude of priceless benefits to students, parents, board members, and the community at large. If that is the case, you can stop reading now. But if you are not sure, and if there is a chance you will be asked to document those benefits to the satisfaction of skeptical parties, you may want to continue. Because in order to provide that evidence, you are going to have to give serious attention to the issues of evaluation.

Historically, professional developers haven't paid much attention to evaluation. Many consider it a costly, time-consuming process that diverts attention from important planning, implementation, and follow-up activities. Others believe they simply lack the skill and expertise to become involved in rigorous evaluations. The result is that they either neglect evaluation issues completely, or leave them to "evaluation experts" who are called in at the end and asked to determine if what was done made any difference. The results of such an after-the-fact process are seldom very useful.

Good evaluations are the product of thoughtful planning, the ability to ask good questions, and a basic understanding about how to find valid answers. In many ways they are simply the refinement of everyday thinking. Good evaluations provide information that is sound, meaningful, and sufficiently reliable to use in making thoughtful and responsible decisions about professional development processes and effects (Guskey & Sparks, 1991).

In this article we will consider four basic questions regarding professional development evaluation: (1) What is evaluation? (2) What are the purposes of evaluation? (3) What are the critical levels of professional development evaluation? and (4) What is the difference between evidence and proof? We conclude with a list of guidelines for evaluating the wide range of professional development programs and activities used in schools today.

What Is Evaluation?

Just as there are many forms of professional development, there are also many forms of evaluation. In fact, each of us engages in hundreds of evaluation acts every day. We evaluate the temperature of our shower in the morning, the taste of our breakfast, the chances of rain and the need for an umbrella when we go outdoors, and the likelihood we will accomplish what we set out to do on any particular day. These everyday acts require the examination of evidence and the application of judgment. As such, each represents a form of evaluation.

The kind of evaluation in which we are interested, however, goes beyond these informal evaluation acts. Our interest is in evaluations that are more formal and systematic. While not everyone agrees on the best definition of this kind of evaluation, for our purposes a useful operational definition is the following: Evaluation is the systematic investigation of merit or worth.*
* This definition is adapted from The Program Evaluation Standards (2nd ed.).

Let's take a careful look at this definition. By using the word "systematic," we are distinguishing this process from the multitude of informal evaluation acts in which we consciously or unconsciously engage. "Systematic" implies that evaluation in this context is a thoughtful, intentional, and purposeful process. It is done for clear reasons and with explicit intent. Although the specific purpose of evaluation may vary from one setting to another, all good evaluations are deliberate and systematic.

0%nvesti"ation0 refers to the collection and analysis of appropriate and pertinent information. 6hile no evaluation can #e completely o#+ective$ the process is not #ased on opinion or con+ecture. %t is$ instead$ #ased on the ac1uisition of specific$ relevant$ and valid evidence e'amined throu"h appropriate methods and techni1ues. The use of 0merit or worth0 in our definition implies appraisal and +ud"ment. >valuations are desi"ned to determine the value of somethin". They help answer 1uestions such as 0%s this pro"ram or activity leadin" to the results that were intended7 %s it #etter than what was done in the past7 %s it #etter than another$ competin" activity7 %s it worth the costs7 The answers to these 1uestions re1uire more than a statement of findin"s. They demand an appraisal of 1uality and +ud"ments of value$ #ased on the #est evidence availa#le.

What Are The Purposes Of Evaluation?

The purposes of evaluation are generally classified in three broad categories, from which stem the three major types of evaluation. Most evaluations are actually designed to fulfill all three of these purposes, although the emphasis on each changes during various stages of the evaluation process. Because of this inherent blending of purposes, distinctions between the different types of evaluation are sometimes blurred. Still, differentiating their intent helps in clarifying our understanding of evaluation procedures (Stevens, Lawrenz, & Sharp, 1995). The three major types of evaluation are planning, formative, and summative evaluation.

Planning Evaluation Alannin" evaluation takes place #efore a pro"ram or activity actually #e"ins$ althou"h certain aspects may #e continual and on"oin". it is desi"ned to "ive those involved in pro"ram development and implementation a precise understandin" of what is to #e accomplished$ what procedures will #e used$ and how success will #e determined. %n essence$ it lays the "roundwork for all other evaluation activities. Alannin" evaluation involves appraisal$ usually on the #asis of previously esta#lished standards$ of a pro"ram or activity(s critical attri#utes. These include the specified "oals$ the proposal or plan to achieve those "oals$ the concept or theory underlyin" the proposal$ the overall evaluation plan$ and the likelihood that plan can #e carried out with the time and resources availa#le. %t typically includes a determination of needs$ assessment of the characteristics of participants$ careful analysis of the conte't$ and the collection of pertinent #aseline information. >valuation for plannin" purposes is sometimes referred to as 0preformative evaluation0 (&criven$ 3443 and may #e thou"ht of as 0preventative evaluation.0 %t helps identify and remedy early on the difficulties that mi"ht pla"ue later evaluation efforts. Alannin" evaluation also helps ensure that other evaluation purposes can #e met in an efficient and timely manner.

Formative Evaluation

Formative evaluation occurs during the operation of a program or activity. Its purpose is to provide those responsible for the program with ongoing information about whether things are going as planned and whether expected progress is being made. If not, this same information can be used to guide necessary improvements (Scriven, 1967). The most useful formative evaluations focus on the conditions for success. They address issues such as: What conditions are necessary for success? Have they been met? Can they be improved?

In many cases, formative evaluation is a recurring process that takes place at multiple times throughout the life of the program or activity. Many program developers, in fact, are constantly engaged in the process of formative evaluation. The evidence they gather at each step of development and implementation usually stays in-house, but is used to make adjustments, modifications, or revisions (Worthen & Sanders, 1989). To keep formative evaluations efficient and avoid expectations that will be disappointed, Scriven (1991) recommends using them as "early warning" evaluations. In other words, use formative evaluations as an early version of the final, overall evaluation. As development and implementation proceed, formative evaluation can consider intermediate benchmarks of success to determine what is working as expected and what difficulties must be overcome. Flaws can be identified and weaknesses located in time to make the adaptations necessary for success.

Summative Evaluation

Summative evaluation is conducted at the completion of a program or activity. Its purpose is to provide program developers and decision-makers with judgments about the program's overall merit or worth. Summative evaluation describes what was accomplished, what the consequences were (positive and negative), what the final results were (intended and unintended), and, in some cases, whether the benefits justified the costs. Unlike formative evaluations, which are used to guide improvements, summative evaluations present decision-makers with the information they need to make crucial decisions about the life of a program or activity. Should it be continued? Continued with modifications? Expanded? Or discontinued? Ultimately, its focus is "the bottom line." Perhaps the best description of the distinction between formative and summative evaluation is one offered by Robert Stake: "When the cook tastes the soup, that's formative; when the guests taste the soup, that's summative" (quoted in Scriven, 1991, p. 169).

Unfortunately, many educators associate evaluation with its summative purposes only. Important information that could help guide planning, development, and implementation is often neglected, even though such information can be key in determining a program or activity's overall success. Summative evaluation, although necessary, often comes too late to be of much help. Thus, while the relative emphasis on planning, formative, and summative evaluation changes through the life of a program or activity, all three are essential to a meaningful evaluation.

What Are The Critical Levels Of Professional Development Evaluation?


Alannin"$ formative$ and summative evaluation all involve the collection and analysis of information. %n evaluatin" professional development$ there are five critical sta"es or levels of information to consider. These levels represent an adaptation of an evaluation model developed #y Kirkpatrick (34@4 for +ud"in" the value of supervisory trainin" pro"rams in #usiness and industry. Kirkpatrick(s model$ althou"h widely applied$ has seen limited use in education #ecause of inade1uate e'planatory power. %t is helpful in addressin" a #road ran"e of 0what0 1uestions$ #ut lackin" when it comes to e'plainin" 0why0 (Alli"er 2 ;anak$ 34E4F ,olton$ 344B . The model presented here is desi"ned to resolve that inade1uacy. The five levels in the model are hierarchically arran"ed from simple to more comple'.. 6ith each succeedin" level$ the process of "atherin" evaluation information is likely to re1uire more time and resources. -ore importantly$ each hi"her

level #uilds on the ones that came #efore. %n other words$ success at one level is necessary for success at the levels that follow. Below is a #rief description of each of the five levels and its importance in the evaluation process. %ncluded are the crucial 1uestions addressed at each level$ how that information can #e "athered$ what is #ein" measured$ and how that information will #e used. A summary of these issues is also presented in !i"ure 3.

evel !" Participants# $eactions The first level of professional development evaluation is participants( reactions to the e'perience. This is the most common form of professional development evaluation$ the simplest$ and the level at which educators have the most e'perience. %t is also the easiest type of information to "ather and analy?e. The 1uestions addressed at this level focus on whether or not participants liked it. 6hen they walked out$ did they feel their time was well spent7 Did the material make sense to them7 6ere the activities meanin"ful7 6as the instructor knowled"ea#le and helpful7 Do they #elieve what they learned will #e helpful7 Also important are 1uestions such as$ 6as the coffee hot and ready on time7 6ere the refreshments fresh and tasty7 6as the room the ri"ht temperature7 6ere the chairs comforta#le7 To some$ 1uestions such as these may seem silly and inconse1uential. But e'perienced professional developers know the importance of attendin" to these #asic human needs. %nformation on participants( reactions is "enerally "athered throu"h 1uestionnaires handed out at the end of a session or activity. These 1uestionnaires typically include a com#ination of ratin"*scale items and open*ended response 1uestions that allow participants to provide more personali?ed comments. Because of the "eneral nature of this information$ the same 1uestionnaire often is used for a #road ran"e of professional development e'periences. -any professional or"ani?ations$ for e'ample$ use the same 1uestionnaire for all their professional development activities. -easures of participants( reactions are sometimes referred to as 0happiness 1uotients0 #y those who insist they measure only the entertainment value of an activity$ not its 1uality or worth. But measurin" participants( initial satisfaction with the e'perience provides information that can help improve the desi"n and delivery of pro"rams or activities in valid ways. %n addition$ positive reactions from participants are usually a necessary prere1uisite to hi"her*level evaluation results.

evel %" Participants# earning %n addition to likin" it$ we would also hope that participants learned somethin" from their professional development e'perience. =evel 8 focuses on measurin" the knowled"e$ skills$ and perhaps attitudes participants "ained. Dependin" on the "oals of the pro"ram or activity$ this can involve anythin" from a pencil*and*paper assessment (Dan participants descri#e the critical attri#utes of mastery learnin" and "ive e'amples of how these mi"ht #e applied in common classroom situations7 to a simulation or full*scale skill demonstration (Aresented with a variety of classroom conflicts$ can participants dia"nose each situation$ and then prescri#e and carry out a fair and worka#le solution7 . .ral or written personal reflections$ or e'amination of the portfolios participants assem#le can also #e used to document their learnin". Althou"h evaluation information at =evel 8 sometimes can #e "athered at the completion of a session$ it seldom can #e accomplished with a standardi?ed form. -easures must #e #ased on the learnin" "oals prescri#ed for that particular pro"ram or activity. This means specific criteria and indicators of successful learnin" must #e outlined prior to the #e"innin" of the professional development e'perience. .penness to possi#le 0unintended learnin"s$0 either positive or ne"ative$ also should #e considered. %f there is concern that participants may already possess the re1uisite knowled"e and skills$ some form of pre and post*assessment may #e re1uired. Analysis of this information provides a #asis for improvin" the content$ format$ and or"ani?ation of the pro"ram or activities.

Figure 1. Five Levels of Professional Development Evaluation

Level 1: Participants' Reactions
• What questions are addressed? Did they like it? Was their time well spent? Did the material make sense? Will it be useful? Was the leader knowledgeable and helpful? Were the refreshments fresh and tasty? Was the room the right temperature? Were the chairs comfortable?
• How will information be gathered? Questionnaires administered at the end of the session.
• What is measured or assessed? Initial satisfaction with the experience.

Level 2: Participants' Learning
• What questions are addressed? Did participants acquire the intended knowledge and skills?
• How will information be gathered? Paper-and-pencil instruments; simulations; demonstrations; participant reflections (oral and/or written); participant portfolios.
• What is measured or assessed? New knowledge and skills of participants.

Level 3: Organization Support & Change
• What questions are addressed? What was the impact on the organization? Did it affect organizational climate and procedures? Was implementation advocated, facilitated, and supported? Was the support public and overt? Were problems addressed quickly and efficiently? Were sufficient resources made available? Were successes recognized and shared?
• How will information be gathered? District and school records; minutes from follow-up meetings; questionnaires; structured interviews with participants and district or school administrators; participant portfolios.
• What is measured or assessed? The organization's advocacy, support, accommodation, facilitation, and recognition.

Level 4: Participants' Use of New Knowledge and Skills
• What questions are addressed? Did participants effectively apply the new knowledge and skills? How are participants using what they learned? What challenges are participants encountering?
• How will information be gathered? Questionnaires; structured interviews with participants and their supervisors; participant reflections (oral and/or written); participant portfolios; direct observations; video or audio tapes.
• What is measured or assessed? Degree and quality of implementation.

Level 5: Student Learning Outcomes
• What questions are addressed? What was the impact on students? Did it affect student performance or achievement? Did it influence students' physical or emotional well-being? Are students more confident as learners? Is student attendance improving? Are dropouts decreasing? How does the new learning affect other aspects of the organization?
• How will information be gathered? Student records; school records; questionnaires; structured interviews with students, parents, teachers, and/or administrators; participant portfolios.
• What is measured or assessed? Student learning outcomes: cognitive (performance and achievement); affective (attitudes and dispositions); psychomotor (skills and behaviors); student work samples; state/local assessments; performance assessments.

evel -" Organi.ation Support an' Change At =evel 9 our focus shifts to the*or"ani?ation and$ specifically$ to information on or"ani?ation support and chan"e. .r"ani?ational varia#les can #e key to the success of any professional development effort. They also can hinder or prevent success$ even when the individual aspects of professional development are done ri"ht (&parks$ 344Ba . &uppose$ for e'ample$ a "roup of educators participate in a professional development pro"ram on cooperative teamin"$ "ain a thorou"h understandin" of the theory$ and or"ani?e a variety of classroom activities #ased on cooperative teamin" principles. !ollowin" their trainin" they try to implement these activities in schools where students are "enerally "raded 0on the curve$0 accordin" to their relative standin" amon" classmates$ and "reat importance is attached to selectin" the class valedictorian. .r"ani?ational policies and practices such as these make teamin" hi"hly competitive and will thwart the most valiant efforts to have students cooperate and help each other learn (Guskey$ 344B . The lack of positive results in this case is not due to poor trainin" or inade1uate teamin". Rather$ it is due to or"ani?ational policies that are incompati#le with implementation efforts. The "ains made at =evels 3 and 8 are essentially canceled #y pro#lems at =evel 9 (&parks 2 ,irsh$ 344C . That is why it is essential to "ather information on or"ani?ation support and chan"e. Iuestions at =evel 9 focus on the or"ani?ational characteristics and attri#utes necessary for success. 6as the advocated chan"e ali"ned with the mission of the or"ani?ation7 6as chan"e at the individual level encoura"ed and supported at all levels7 Did the pro"ram or activity affect or"ani?ational climate and procedures7 6as administrative support pu#lic and overt7 6ere pro#lems addressed 1uickly and efficiently7 6ere sufficient resources made availa#le$ includin" time for sharin" and reflection (=an"er 2 Dolton$ 344: 7 6ere successes reco"ni?ed and shared7 %ssues such as these can #e ma+or contri#utin" factors to the success of any professional development effort. Gatherin" information on or"ani?ation support and chan"e is "enerally more complicated than previous levels. Arocedures also differ dependin" on the "oals of the pro"ram or activity. They may involve analyses of district or school records$ or e'amination of the minutes from follow*up meetin"s. Iuestionnaires sometimes can #e used to tap issues such as the or"ani?ation(s advocacy$ support$ accommodation$ facilitation$ and reco"nition of chan"e efforts. &tructured interviews with participants and district or school administrators can #e helpful as well. This information is used not only to document and improve or"ani?ational support$ #ut also to inform future chan"e initiatives.

evel /" Participants# 0se of 1e) 2no)le'ge an' S3ills 6ith or"ani?ational varia#les set aside$ we turn our attention to whether participants are usin" their new knowled"e and skills on the +o#. At =evel : our central 1uestion is$ 0Did what participants( learn make a difference in their professional practice70 The key to "atherin" relevant information at this level rests in the clear specification of indicators that reveal #oth the de"ree and 1uality of implementation. %n other words$ how can you tell if what participants learned is #ein" used and #ein" used well7 Dependin" on the "oals of the pro"ram or activity$ this may involve 1uestionnaires or structured interviews with participants and their supervisors. .ral or written personal reflections$ or e'amination of participants( +ournals or portfolios also can #e considered. The most accurate information is likely to come from direct o#servations$ either with trained o#servers or #y reviewin" video or audio tapes. 6hen o#servations are used$ however$ they should #e kept as uno#trusive as possi#le (for e'amples$ see ,all 2 ,ord$ 34EC .

Unlike Levels 1 and 2, information at Level 4 cannot be gathered at the completion of a professional development session. Measures of use must be made after sufficient time has passed to allow participants to adapt the new ideas and practices to their setting. Because implementation is often a gradual and uneven process, measures also may be necessary at several time intervals. This is especially true if there is interest in continuing or ongoing use. Analysis of this information provides evidence on current levels of use and can help restructure future programs and activities to facilitate better and more consistent implementation.

Level 5: Student Learning Outcomes

At Level 5 we address what is typically "the bottom line" in education: What was the impact on students? Did the professional development program or activity benefit students in any way? The particular outcomes of interest will depend, of course, on the goals of that specific professional development effort. In addition to the stated goals, certain "unintended" outcomes may be important as well. For this reason, multiple measures of student learning are always essential at Level 5 (Joyce, 1993).

Consider the example of a group of elementary educators who devote their professional development time to finding ways to improve the quality of students' writing. In a study group they explore the research on writing instruction, analyze various approaches, and devise a series of strategies they believe will work for their students. In gathering Level 5 information, they find students' scores on measures of writing ability increased significantly over the course of the school year when compared to the progress of comparable students who were not involved in these strategies. On further analysis, however, they discover that over the same time period, their students' scores on measures of mathematics achievement declined. This "unintended" outcome apparently occurred because instructional time in mathematics was inadvertently sacrificed to provide more time for students to work on their writing. Had information at Level 5 been restricted to a single measure of students' writing, this important "unintended" result would not have been identified.

Measures of student learning typically include indicators of student performance and achievement, such as assessment results, portfolio evaluations, marks or grades, and scores from standardized examinations. But in addition to these cognitive indicators, affective outcomes (attitudes and dispositions) and psychomotor outcomes (skills and behaviors) may be considered as well. Examples include assessments of students' self-concepts, study habits, school attendance, homework completion rates, or classroom behaviors. Schoolwide indicators such as enrollment in advanced classes, memberships in honor societies, participation in school-related activities, disciplinary actions, and retention or drop-out rates might also be considered. The major source of such information is student and school records. Results from questionnaires and structured interviews with students, parents, teachers, and/or administrators could also be included. The summative purpose of this information is to document a program or activity's overall impact. But formatively, it can be used to inform improvements in all aspects of professional development, including program design, implementation, and follow-up.
In some cases information on student learning outcomes is used to estimate the cost effectiveness of professional development, or what is sometimes referred to as "return on investment" or "ROI evaluation" (Parry, 1996; Todnem & Warner, 1993); a minimal sketch of the basic arithmetic follows at the end of this section.

Evaluation at any of these five levels can be done well or poorly, convincingly or laughably. The information gathered at each level is important and can help improve professional development programs and activities. But as many have discovered, tracking efficiency at one level tells you nothing about effectiveness at the next. Although success at an early level may be necessary for positive results at the next higher one, it is clearly not sufficient. That is why each level is important. Sadly, the bulk of professional development today is evaluated only at Level 1, if at all. Of the rest, the majority are measured only at Level 2 (Cody & Guskey, 1997).
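The ROI sources cited above are not quoted here, so the following is only a generic sketch of the arithmetic such approaches rest on: monetize the benefits attributed to the program, subtract the program's costs, and express the net benefit as a percentage of cost. All figures are invented, and monetizing educational benefits is itself a contested estimation step.

```python
# Hypothetical figures for one professional development program.
program_cost = 40_000.0       # trainers, materials, release time, etc.
monetized_benefit = 52_000.0  # estimated dollar value of attributed outcomes

# Basic return-on-investment arithmetic: (benefit - cost) / cost * 100.
net_benefit = monetized_benefit - program_cost
roi_percent = 100.0 * net_benefit / program_cost

print(f"Net benefit: ${net_benefit:,.0f}")
print(f"ROI: {roi_percent:.1f}%")
```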

What Is The Difference Between Evidence And Proof?

Now that you know about planning, formative, and summative evaluation, and understand the five levels involved in evaluating professional development, are you ready to "prove" that your professional development programs make a difference? With this new knowledge, can you demonstrate that what was done in professional development, and nothing else, is solely responsible for that ten percent increase in student achievement scores? For the five percent decrease in dropout rate? For the 50 percent reduction in recommendations to the office for disciplinary action?

Are you tryin" to say the counselin" department had nothin" to do with it7 Do the principal and assistant principal "et no credit for their support and encoura"ement7 -i"ht not year*to*year fluctuations in students have somethin" to do with the results7 And consider the other side of the coin. %f achievement ever happens to drop followin" some hi"hly touted professional development initiative$ would you #e willin" to accept full #lame for the loss7 Ar"uments a#out whether you can a#solutely$ positively isolate the impact of professional development on improvements in student performance are "enerally irrelevant. %n most cases$ you simply cannot "et ironclad proof (Kirkpatrick$ 34CC . To do so you would need to eliminate or control for all other factors that could have caused the chan"e. This re1uires the random assi"nment of educators and students to e'perimental and control "roups. The e'perimental "roup would take part in the professional development activity while the control "roup would not. Dompara#le measures would then #e "athered from each and the differences tested. The pro#lem$ of course$ is that nearly all professional development takes place in real*world settin"s where such e'perimental conditions are impossi#le to meet. The relationship #etween professional development and improvements in student learnin" in these real*world settin"s is far too comple' and there are too many intervenin" varia#les to allow for simple causal inferences (Guskey$ 344CF Guskey 2 &parks$ 344B . 6hat(s more$ most schools are en"a"ed in systemic reform initiatives that involve the simultaneous implementation of multiple innovations (!ullan$ 3448 . %solatin" the effects of a sin"le pro"ram or activity under such conditions is usually impossi#le. But in the a#sence of proof$ you can collect awfully "ood 0evidence0 a#out whether or not professional development is contri#utin" to specific "ains in student learnin". &ettin" up meanin"ful comparison "roups and usin" appropriate pre* and post*measures provides e'tremely valua#le information. Time*series desi"ns that include multiple measures collected #efore and after implementation are another useful alternative. A#ove all$ you must #e sure to "ather evidence on measures that are meanin"ful to stakeholders in the evaluation process. >vidence is what most people want anyway. &uperintendents and #oard mem#ers rarely ask$ 0Dan you prove it70 6hat they ask for is evidence. Donsider$ for e'ample$ the use of anecdotes and testimonials. !rom a methodolo"ical perspective$ they are a poor source of data. They are typically #iased and hi"hly su#+ective. They may #e inconsistent and unrelia#le. )evertheless$ they are a personali?ed form of information that can #e powerful and convincin". And as any trial attorney will tell you$ they offer the kind of evidence that most people #elieve. Althou"h it would #e imprudent to #ase your entire evaluation on anecdotes and testimonials$ they are an important source of evidence that should never #e i"nored. Keep in mind$ too$ that "ood evidence is not that hard to come #y if you know what you(re lookin" for #efore you #e"in. %f you do a "ood +o# of clarifyin" your "oals up front$ most evaluation issues pretty much fall into line. The reason many educators think evaluation at =evels : and @ is so difficult$ e'pensive$ and time*consumin"$ is #ecause they are comin" in after the fact to search for results. 
It is as if they are saying, "We don't know what we are doing or why we are doing it, but let's find out if anything happened" (Gordon, 1991). If you don't know where you are going, it's very difficult to tell if you've arrived. So when it comes to evidence versus proof, the message is this: Always seek proof, but collect lots of evidence along the way. Because of the nature of most professional development efforts, your evidence may be more exploratory than confirmatory. Still, it can offer important indications about whether you are heading in the right direction or whether you need to go back to the drawing board. Remember, too, that knowing ahead of time what you are trying to accomplish will make it much easier to identify the kind of evidence you need.
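To make the comparison-group evidence described above concrete, here is a minimal sketch of one common analysis: compute each group's average pre-to-post gain and contrast the program group with a comparison group. The scores are invented, and a real evaluation would use far larger samples, check the reliability of the measures, and report the result as evidence rather than proof.

```python
from statistics import mean

# Hypothetical pre- and post-scores on the same measure of student learning.
program_pre = [61, 58, 70, 64, 66]
program_post = [72, 66, 78, 70, 75]
comparison_pre = [60, 63, 68, 59, 65]
comparison_post = [63, 65, 70, 62, 66]

def average_gain(pre, post):
    """Mean post-minus-pre gain across matched scores."""
    return mean(b - a for a, b in zip(pre, post))

program_gain = average_gain(program_pre, program_post)
comparison_gain = average_gain(comparison_pre, comparison_post)

print(f"Program group gain:    {program_gain:.1f}")
print(f"Comparison group gain: {comparison_gain:.1f}")
# The difference in gains is evidence (not proof) of a program effect.
print(f"Difference in gains:   {program_gain - comparison_gain:.1f}")
```

A time-series design would extend the same idea: collect the measure at several points before and after implementation and look for a shift in the trend, rather than relying on a single pre/post contrast.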

Evaluation Guidelines

It should be clear by now that good evaluations of professional development don't have to be costly. Nor do they demand sophisticated technical skills, although technical assistance can sometimes be helpful. What they do require is the ability to ask good questions and a basic understanding about how to find valid answers. Good evaluations provide sound, useful, and sufficiently reliable information that can be used to make thoughtful and responsible decisions about professional development processes and effects.

!ollowin" is a list of "uidelines desi"ned to help improve the 1uality of professional development evaluations. Althou"h strictly adherin" to these "uidelines won(t "uarantee your evaluation efforts will #e flawless$ it will "o a lon" way toward makin" them more meanin"ful$ more useful$ and far more effective.

Planning Guidelines

1. Clarify the intended goals. The first step in any evaluation is to make sure your professional development goals are clear, especially in terms of the results you hope to attain with students and the classroom or school practices you believe will lead to those results. Change experts refer to this as "beginning with the end in mind." It is also the premise of a "results-driven" approach to professional development (Sparks, 1995, 1996b).

2. Assess the value of the goals. Take steps to ensure the goals are sufficiently challenging, worthwhile, and considered important by all those involved in the professional development process. Broad-based involvement at this stage contributes greatly to a sense of shared purpose and mutual understanding. Clarifying the relationship between established goals and the school's mission is a good place to begin.

3. Analyze the context. Identify the critical elements of the context where change is to be implemented and assess how these might influence implementation. Such an analysis might include the examination of pertinent baseline information on students' and teachers' needs, their unique characteristics and background experiences, the resources available, the level of parent involvement and support, and the organizational climate.

4. Estimate the program's potential to meet the goals. Explore the research base of the program or activity, and the validity of the evidence supporting its implementation in contexts similar to yours. When exploring the literature on a particular program, be sure to distinguish facts from persuasively argued opinions. A thorough analysis of the costs of implementation, and what other services or activities must be sacrificed to meet those costs, should be included as well.

5. Determine how the goals can be assessed. Decide, up front, what evidence you would trust in determining if the goals are attained. Ensure that evidence is appropriate, relevant to the various stakeholders, and meets at least minimal requirements for reliability and validity. Keep in mind, too, that multiple indicators are likely to be necessary in order to tap both intended and possible unintended consequences.

6. Outline strategies for gathering evidence. Determine how that evidence will be gathered, who will gather it, and when it should be collected. Be mindful of the critical importance of intermediate or benchmark indicators that might be used to identify problems (formative) or forecast final results (summative). Select procedures that are thorough and systematic, but considerate of participants' time and energy. Thoughtful evaluations typically use a combination of quantitative and qualitative methods, based on the nature of the evidence sought. To document improvements you must also plan meaningful contrasts with appropriate comparison groups, pre- and post-measures, or longitudinal time-series measures.

Formative and Summative Guidelines

7. Gather and analyze evidence on participants' reactions. At the completion of both structured and informal professional development activities, collect information on how participants regard the experience. A combination of items or methods is usually required to assess perceptions of various aspects of the experience. In addition, keeping the information anonymous generally guarantees more honest responses.

8. Gather and analyze evidence on participants' learning. Develop specific indicators of successful learning, select or construct instruments or situations in which that learning can be demonstrated, and collect the information through appropriate methods. The methods used will depend, of course, on the nature of the learning sought. In most cases a combination of methods or procedures will be required.

9. Gather and analyze evidence on organization support and change. Determine the organizational characteristics and attributes necessary for success, and what evidence best illustrates those characteristics. Then collect and analyze that information to document and to improve organizational support.

10. Gather and analyze evidence on participants' use of new knowledge and skills. Develop specific indicators of both the degree and quality of implementation. Then determine the best methods to collect this information, when it should be collected, and how it can be used to offer participants constructive feedback to guide (formative) or judge (summative) their implementation efforts. If there is concern with the magnitude of change (Is this really different from what participants have been doing all along?), pre- and post-measures may need to be planned. The methods used to gather this evidence will depend, of course, on the specific characteristics of the change being implemented.

11. Gather and analyze evidence on student learning outcomes. Considering the procedures outlined in Step 6, collect the student information that most directly relates to the program or activity's goals. Be sure to include multiple indicators to tap the broad range of intended and possible unintended outcomes in the cognitive, affective, and psychomotor areas. Anecdotes and testimonials should be included to add richness and provide special insights. Analyses should be based on standards of desired levels of performance over all measures and should include contrasts with appropriate comparison groups, pre- and post-measures, or longitudinal time-series measures.

12. Prepare and present evaluation reports. Develop reports that are clear, meaningful, and comprehensible to those who will use the evaluation results. In other words, present the results in a form that can be understood by decision makers, stakeholders, program developers, and participants. Evaluation reports should be brief but thorough, and should offer practical recommendations for revision, modification, or further implementation. In some cases reports will include information comparing costs to benefits, or the "return on investment."

Conclusion

Over the years a lot of good things have been done in the name of professional development. So have a lot of rotten things. What professional developers haven't done is provide evidence to document the difference between the good and the rotten. Evaluation is the key, not only to making those distinctions, but also to explaining how and why they occurred. To do this we must recognize the important summative purposes that evaluation serves, and its vital planning and formative purposes as well. Just as we urge teachers to plan carefully and make ongoing assessments of student learning an integral part of the instructional process, we need to make evaluation an integral part of the professional development process. Systematically gathering and analyzing evidence to inform what we do must become a central component of professional development. Recognizing and using this component will tremendously enhance the success of professional development efforts everywhere.

$eferences Alli"er$ G. -.$ 2 ;anak$ >. A. (34E4 . Kirkpatrick(s levels of trainin" criteria5 Thirty years later. Personnel Psychology, %!(8 $ 993* 9:8. Dody$ D. B.$ 2 Guskey$ T. R. (344C . Arofessional development. %n ;. D. =indle$ ;. -. Aetrosko$ 2 R. &. Aankrat? (>ds. $ 344B -eview of research on the .entucky Education -eform Act (pp. 343*8<4 . !rankfort$ KL5 The Kentucky %nstitute for >ducation Research. !ullan$ -.G. (3448 . Kisions that #lind. Educational /eadership, :4(@ $ 34*8<. Gordon$ ;. (344 3 . -easurin" the ("oodness( of trainin". Training (Au"ust $ 34*8@. Guskey$ T. R. (344B . Reportin" on student learnin"5 =essons from the past * Arescriptions for the future. %n T. R. Guskey (>d. $ ommunicating Student /earning. 344B 0earbook of the Association for Supervision and urriculum 'evelopment (pp. 39*8: . Ale'andria$ KA5 Association for &upervision and Durriculum Development. Guskey$ T. R. (344C . Research needs to link professional development and student learnin". 1ournal of Staff 'evelopment, 3E(8 , 9B:<. Guskey$ T. R.$ 2 &parks$ D. (344 3 . 6hat to consider when evaluatin" staff development. Educational /eadership, :4(9 $ C9*CB. Guskey$ T. R.$ 2 &parks$ D. (344B . >'plorin" the relationship #etween staff development and improvements in student learnin". 1ournal of Staff 'evelopment, 3C(: $ 9:*9 E. ,all$ G. >.$ 2 ,ord$ &. -. (34EC . hange in schools2 3acilitating the process. Al#any$ )L5 &U)L Aress$ ,olton$ >. !. (344B . The flawed four*level evaluation model. 4uman -esources 'evelopment 5uarterly, C(l $ @*8 3. ;ohnson$ B. -. (344@ . 6hy conduct action research7 Teaching and hange, "6l7, 4<*3<: ;oint Dommittee on &tandards for >ducational >valuation (344: . The program evaluation standards (8nd ed. . Thousand .aks$ DA5 &a"e. ;oyce$ B. (3449 . The link is there$ #ut where do we "o from here7 1ournal of Staff 'evelopment, 3:(9 $ 3<*38. Kirkpatrick$ D. =. (34@4 . Techni1ues for evaluatin" trainin" pro"rams. A four*part series #e"innin" in the )ovem#er issue (Kol. 39$ )o. 117 of Training and 'evelopment 1ournal (then titled 1ournal for the American Society of Training 'irectors7. Kirkpatrick$ D. =. (34CC . >valuatin" trainin" pro"rams5 >vidence vs. proof. Training and 'evelopment 1ournal, 93(33 , 4*38. Aarry$ &. B. (344B . -easurin" trainin"(s R.%. Training 8 'evelopment, @<(@ $ C8*C@. &criven$ -. (34BC . The methodolo"y of evaluation. %n R. >. &take (>d. $ urriculum evaluation. American >ducational Research Association -ono"raph &eries on >valuation$ )o. % * Dhica"o5 Rand -c)ally. &criven$ -. (344 3 . Evaluation thesaurus (:th ed. . )ew#ury Aark$ DA5 &a"e.

&tevens$ !.$ =awren?$ !. 2 &harp$ =. (344@ . 9ser:friendly handbook for pro;ect evaluation2 Science, mathematics, engineering, and technology education. Arlin"ton$ KA5 )ational &cience !oundation. &parks$ D. (344@$ April . Be"innin" with the end in mind. School Team <nnovator, 1 (3 $ A M % &parks$ D. (344Ba$ !e#ruary . Kiewin" reform from a systems perspective. The 'eveloper, pp. 8$ B. &parks$ D. (344B#$ ;anuary . Results*driven staff development. The 'eveloper, p. 8. &parks$ D.$ 2 ,irsh$ &. (344C . A new vision for staff development. Ale'andria$ KA5 Association for &upervision and Durriculum Development. Todnem$ G.$ 2 6arner$ -. A. (3449 . Usin" R.% to assess staff development efforts. 1ournal of Staff 'evelopment, 3:(9 , 98*9:. 6orthen$ B. R.$ 2 &anders$ ;. R. (34E4 . Educational evaluation. )ew Lork5 =on"man.
