
What Makes a Difference in Inservice Teacher Education? A Meta-Analysis of Research

RUTH K. WADE

Keeping up with the rapidly growing research on inservice education is a nearly impossible task.¹ Even if one can keep up with it, the results reported are often speculative, contradictory, and confusing. The research techniques, types of measurements, and the groups studied vary greatly among researchers. Drawing conclusions from such a heterogeneous conglomeration can lead to frustration and further confusion. Yet most research reviews and integrative works continue to be largely a pattern of reviewers' personal judgments, individual creativity, and preferred styles (Jackson, 1978).

Even though many more people are writing about staff development, few accounts present concrete evidence of its effects on teachers and students. Loucks and Melle (1982) conclude that

most staff development reports are simply statements of participant satisfaction, which are then used to determine the success of a program. Baden (1982) suggests that inservice education should become a systematic effort to create behavior change in teachers and, eventually, in students. While participant satisfaction and local support are invaluable to inservice programs, there is a need to systematically determine their efficacy. Effectiveness should be measured not only at the level of the teacher-participant but also at the level of the students with whom teachers interact.

Staff developers need to take a second look at coaching and voluntary participation, among other things, when determining what really matters.

Ruth K. Wade is Staff Development Coordinator, Amherst-Pelham Regional Schools, Amherst, Massachusetts.

Author's Note: I am grateful to Professor Bill Wolf, University of Massachusetts, for his helpful critique of this article.

48 EDUCATIONAL LEADERSHIP


What is needed is a systematic method for integrating findings across independent studies by converting them to a common base. Integrative analysis, or what Glass (1976) termed meta-analysis, provides the necessary perspective for this systematic integration.

With meta-analysis, all quantitative studies on a particular topic can be compared by making them part of a larger class or category: in this case, effects of inservice teacher training programs. The technique has limitations in that the researcher can only examine the data that commonly exist in a particular body of literature. The result, however, is precise quantitative estimates of the size of effect of various practices, based on a large number of studies.

Effect sizes are computed from summary statistics reported by authors. More specifically, effect size is the difference between the means of the treated and control groups divided by the standard deviation of the control group (Glass, 1977). By standardizing the scores, characterizing each as the size of the effect in relation to the standard deviation, the various sets of scores can be put on the same scale. After effect size calculations are made, statistical analyses can be used to identify the independent variables that account for the changes. A positive effect indicates that the experimental treatment was more effective than the control procedure, while a negative effect indicates that the control treatment was more effective. An effect size of .5 means that the treatment group showed one-half a standard deviation greater change than the control group.
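The calculation just described can be sketched in a few lines of code (a minimal illustration of Glass's formula, not part of the original article; the function name and sample scores are hypothetical):

```python
import statistics

def glass_delta(treatment_scores, control_scores):
    """Effect size: (treatment mean - control mean) / control-group SD.

    Dividing by the control group's standard deviation puts outcomes
    measured on different scales onto one common scale.
    """
    mean_t = statistics.mean(treatment_scores)
    mean_c = statistics.mean(control_scores)
    sd_c = statistics.stdev(control_scores)  # sample SD of the control group
    return (mean_t - mean_c) / sd_c

# Positive: treatment outperformed control; negative: control did better.
print(glass_delta([12, 14, 16], [10, 12, 14]))  # hypothetical scores -> 1.0
```

Here the treatment group's mean exceeds the control group's by one full control-group standard deviation, so the effect size is 1.0; a value of .5 would mean half a standard deviation greater change, as described above.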

In this study, I used meta-analysis to answer the following questions:

1. What are the effects of inservice training programs on teachers in the typical study?

2. Does inservice training have different effects for different intended instructional outcomes? For example, does the effect size vary if the goal of the training is increasing participant learning, changing participant behavior, eliciting positive participant reactions to the training, or influencing students of participants?

3. Finally, do the effects of training vary as a function of duration of training, training group characteristics, location and scheduling, sponsorship, incentives for participation, training group structure, and instructional techniques?

Method

I began by making an exhaustive search of the literature to obtain studies that met the topical criteria as well as provided the necessary data from which to calculate effect size. I reviewed over 300 journal articles, dissertations, and Educational Resources Information Center (ERIC) documents published or presented between 1968 and 1983; of these, 91 were selected for inclusion in this meta-analysis. To be included, a study needed to meet the following criteria: (1) the study was quantitative rather than qualitative; (2) the data necessary for calculating effect size were presented; (3) the data related to one of the major questions in this study; and (4) subjects of the study were public school teachers or their students in grades K-12.

I initially read many of these selected studies to determine the variables that were frequently represented in this body of literature and to ferret out the most commonly used independent variables. I came up with a list of 28 variables, which I grouped into eight categories in order to describe the main features of the studies. These variables include: (1) effect levels or goals of the training, (2) duration, (3) training group characteristics, (4) location and scheduling, (5) sponsorship, (6) participant incentives, (7) structure, and (8) instructional techniques. It is important to note that while I examined each variable independently, I also looked at variables 2 through 8 in terms of variable 1 to determine whether the goal of the training influenced the effect size of each variable.

I treated each study as an individual unit of analysis and considered multiple dependent variables or multiple comparison groups as separate data sets. The 91 studies yielded 715 data sets.
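The aggregation step this paragraph describes, treating each comparison as its own data set and then averaging effect sizes overall and by group, can be sketched as follows (illustrative only; the study labels and effect sizes below are hypothetical, not values from the article):

```python
import statistics

# One record per data set: a study contributes several records if it
# reports multiple dependent variables or comparison groups.
data_sets = [
    {"study": "A", "level": "learning", "effect_size": 0.9},
    {"study": "A", "level": "behavior", "effect_size": 0.6},
    {"study": "B", "level": "reaction", "effect_size": 0.4},
]

# Overall mean effect size across all data sets.
overall = statistics.mean(d["effect_size"] for d in data_sets)

# Mean effect size grouped by the level at which evaluation was directed.
by_level = {}
for d in data_sets:
    by_level.setdefault(d["level"], []).append(d["effect_size"])
level_means = {lvl: statistics.mean(vals) for lvl, vals in by_level.items()}
```

The same grouping can be repeated for any of the eight variable categories (duration, sponsorship, incentives, and so on) to compare group means against the overall mean.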

DECEMBER 1984/JANUARY 1985 49


Results

1. Effect Level. For all studies, I separated effect levels into four categories that defined the level at which the evaluation was directed:

• Reaction assessed how the participants felt about the inservice training.

• Learning measured (usually through pretests and post-tests) the amount of learning that was achieved.

• Behavior measured whether participants changed their behavior as a result of a staff development intervention.

• Results determined whether there was an impact in the classroom, usually on students, as a result of teacher training.

The findings indicate that inservice teacher education programs reported in the literature are moderately effective. Inservice treatment of any kind, on the average, resulted in .52 of a standard deviation greater change than control groups. Such an outcome should provide reassurance to staff developers and other educators who wonder whether the time, money, and effort invested make a difference.

When the data are grouped by the level at which the evaluation was directed, the findings are conclusive. Attempts to increase participants' learning through inservice teacher training are highly effective (.90 mean effect size); attempts to change participants' behavior and to elicit positive reactions to the training are moderately effective (.60 and .42 mean effect size, respectively); while attempts to demonstrate results by looking at the students of participants are only mildly effective (.37 mean effect size).

2. Duration. The studies in this meta-analysis indicated that there was no significant effect of length of treatment. This includes studies with training ranging from a few hours to those lasting more than 30 hours. Similarly, there was no significant difference in the effect sizes of programs lasting six months or less versus those lasting more than six months.

3. Training Group Characteristics. Training groups involving both elementary and secondary teachers achieved higher effect sizes than groups enrolling only elementary or only secondary level educators. My findings corroborate the often-reported fact that greater effects are evident after training a group of elementary teachers versus a group of secondary teachers, but working with both groups together may suggest some new training possibilities.

"... the evidence is beginning to point to the fact that coaching, as an instructional technique, does not have the potential to alter teacher behavior ...."

Contrary to popular opinion, whether a participant voluntarily chooses to attend inservice training or is required to attend does not make a significant difference in training effect size. It is reassuring to note that over 85 percent of the 609 data sets in this category reported voluntary participation; in fact, they had .23 standard deviations greater effectiveness than mandatory participation studies, but the difference was not statistically significant.

The size of the training group (1-20, 21-40, 41-60, greater than 60) did not have a significant effect on training results, nor did composition of the group in terms of whether participants were a faculty unit or a group of unrelated individuals.


4. Location and Scheduling. No location or scheduling variables produced a statistically significant impact on effect size. Variables included on-site versus off-site training and scheduling during or outside of school hours.

5. Sponsorship. Training programs initiated, developed, or funded by the state or federal government or a university were significantly more effective than those initiated within the school, either by teachers, administrators, or supervisors. This finding does not support the popular belief held by many staff developers that teacher-initiated training programs are more effective. Perhaps education professionals who work outside the schools have more time and financial resources available to develop, test, and present training programs for teachers. In any case, these results do not support the limited federal role in school improvement suggested by Berman and McLaughlin (1978) but do support the School Improvement Study (Loucks, 1983), which found high success rates in federal- as well as state-funded programs.

6. Participant Incentives. When a participant was selected to take part in training, either by being designated as a representative of a particular group or through a competitive selection process, the effect size was significantly greater than for all other incentives studied. This effect may take place because the "best" people are chosen to participate, or perhaps these individuals work harder when they have been "chosen" to participate.

The incentive of college credit, followed by released time, produced moderately positive effect sizes. Pay, certificate renewal, and no incentive resulted in small positive effect sizes that were well below the mean effect size for all incentives studied.

7. Structure. Independent study produced the highest effect size of the structures examined. An explanation for this large effect may be that the independent study structure appeals to the most highly motivated people. There do not appear to be important differences in the effect sizes among workshops, courses, mini-courses, or institutes, all of which produced effect sizes near the mean and are moderately effective.

"Contrary to popular opinion, whether a participant voluntarily chooses to attend inservice training or is required to attend does not make a significant difference...."

8. Instructional Techniques. The results of analyzing data by the types of instruction that yield the highest effect sizes indicate that four types of instruction are significantly more effective than others. The most effective instructional methods are observation of actual classroom practices, micro teaching, video/audio feedback, and practice.

Instructional methods associated with significantly lower effect sizes are discussion, lecture, games/simulations, and guided field trips.

In spite of many speculative claims that coaching greatly enhances instructional effectiveness, I could find no evidence for this in the 225 cases using coaching. Coaching, modeling, mutual assistance, printed material, production of instructional materials, programmed study, and film were all moderately effective but did not produce significantly higher effect sizes than the mean of all instructional methods examined.

The next logical question to examine is whether some combination of instructional methods produces significantly higher effect sizes than the mean of all instructional methods examined. I grouped together many different combinations of types of instruction to see if some particularly effective combination existed. These combinations included: (1) lecture, modeling, practice, coaching; (2) lecture, discussion, modeling, practice, coaching; (3) lecture, discussion, modeling, video/audio feedback, coaching; and many others. The result was that any combination resulted in an effect size somewhere near the mean attained for all instructional methods grouped together. Thus it appears that there is no "magical" combination of methods for effective instruction.



Practical rather than theoretical instruction, with the instructor taking almost exclusive responsibility for the design and teaching of the class, results in significantly higher effect sizes. In classes where participants are encouraged to teach each other through classroom presentations, group work, and discussion sessions, a lower effect size results.

Although instructors represented many job categories (such as teacher, support staff, administrator, consultant, college personnel), only self-instruction produced a large positive effect size. This may say something about the effectiveness of self-instruction as a technique, but it more likely indicates that any individual who is motivated enough to complete a program of self-instruction is likely to achieve successful results. Support staff and college personnel as instructors are moderately effective, while teachers and state department of education representatives produced only small positive effects. Administrators and consultants did not have enough cases for making significant conclusions.

Conclusions

Meta-analysis provides an objective technique for research synthesis. Its emphasis on quantification ensures against projecting personal bias onto a vast field of educational research. Most of my findings are consistent with earlier meta-analyses of inservice training by Joslin (1980) and Lawrence and Harrison (1980). However, I examined several new categories that had not been previously examined through meta-analysis. My findings contradict some commonly held assumptions regarding effective training practices.

There is no "magic formula" for effective inservice programs. Figure 1 provides a summary of practices associated with statistically significant, above-average effectiveness. Staff developers who wish to plan programs for maximum effectiveness should contemplate the following suggestions, which are based on the outcomes of this meta-analysis:

1. Plan programs in which elementary and secondary teachers can participate in training together whenever appropriate.

2. Encourage teachers to become involved in state-, federal-, or university-initiated programs.

3. Offer incentives for participation, such as enhanced status or college credit, whenever possible.

4. Encourage independent study and self-instruction as alternatives to the traditional workshop format.

5. Suggest that instructors set clear goals and take major responsibility for the design and teaching of the class rather than encouraging participants to assume these roles.

6. Use instructional techniques such as observation, micro teaching, video/audio feedback, and practice as alternatives to lecture, discussion, games/simulations, and guided field trips.

Throughout the staff development literature, coaching has been cited as an effective technique for achieving "transfer of training." As Griffin (1982) noted, "this inferential model is a potentially powerful one for staff developers if it proves to be accurate." However, the evidence is beginning to point to the fact that coaching, as an instructional technique, does not have the potential to alter teacher behavior as proposed by Joyce and Showers (1981). Not only has this study found that coaching to achieve transfer of training was only moderately effective; other evidence concludes that coaching as an instructional technique may not always be effective. Sparks (1983) found that workshops plus trainer-provided coaching were not superior to workshops alone or to workshops plus peer observation.


Figure 1. Inservice Practices Associated with Above-Average Effectiveness

• Attempts to increase participants' learning
• Federal, state, and university initiated programs
• Training groups with elementary and secondary teachers combined
• Participants who were selected or chosen to participate
• Self-instruction
• Independent study
• Strong leadership roles by instructors
• Instructional techniques: observation, micro teaching, video/audio feedback, practice


Levinson (1962) suggested that the "psychology of coaching" is an important, yet often neglected, consideration, which results in the failure of coaching in the field of management. Levinson cited four reasons that coaching might fall short of its potential as a development technique:

• The coach and the trainee rarely have the "psychological time" to develop the kind of relationship based on mutual respect that is necessary for effective coaching.

• Because there is usually not much tolerance for giving people time to grow, the coaching relationship is often impaired by pressure from superiors to get information from coaches that might be used against the trainee.

• Coaches often do not know how to foster independence; therefore, quick solutions are sometimes proposed that do not fit the complexity of the problem.

• The coaching situation is in danger of being blocked by the universal feeling of rivalry and its accompanying fears.

These potential impediments may be why coaching was only moderately effective in the 225 cases in this meta-analysis. Under some specific circumstances, coaching may be an effective technique, but it does not seem to provide a panacea for staff development programs.

Although these suggestions should guide staff developers, they are only a part of what is needed to provide effective inservice education. Many factors are involved in determining the effectiveness of any given inservice activity. Context issues, such as understanding the school climate, principal support, and adequate resources, including time and an understanding of the needs, should not be underestimated. Nor should process issues such as governance and teacher investment be overlooked.

Nevertheless, the odds for successful staff development can be increased if staff developers consider the factors that have been clearly demonstrated to be related to effectiveness.

¹Inservice education and staff development have been used interchangeably to mean any training activity designed to increase the competencies needed by teachers in the performance of their assigned responsibilities.

References

Baden, D. "A User Guide to the Evaluation of Staff Development." In Assessing the Impact of Staff Development Programs. Syracuse, N.Y.: National Council of States on Inservice Education, 1982.

Berman, P., and McLaughlin, M. Federal Programs Supporting Educational Change, Vol. VIII: Implementing and Sustaining Innovations. Santa Monica, Calif.: The Rand Corporation, 1978.

Glass, G. V. "Integrating Findings: The Meta-Analysis of Research." In Review of Research in Education. Edited by L. S. Shulman. Itasca, Ill.: Peacock Publishing Company, 1977.

Glass, G. V. "Primary, Secondary, and Meta-Analysis of Research." Educational Researcher 5 (1976): 3-8.

Griffin, G. A. "Staff Development." Paper prepared for the NIE Teaching Synthesis Conference, Airlie House, Virginia, 1982.

Jackson, G. B. "Methods for Reviewing and Integrating Research in the Social Sciences." Final report to the National Science Foundation. Washington, D.C.: Social Research Group, George Washington University, 1978.

Joslin, P. A. "Inservice Teacher Education: A Meta-Analysis of the Research." Doctoral dissertation, University of Minnesota, 1980.

Joyce, B. R., and Showers, B. "Transfer of Training: The Contribution of Coaching." Journal of Education 163 (1981): 163-172.

Lawrence, G., and Harrison, D. "Policy Implications of the Research on the Professional Development of Education Personnel: An Analysis of Fifty-Nine Studies." In 1981 Report on Educational Personnel Development. Washington, D.C.: Feistritzer Publications, 1980.

Levinson, H. "A Psychologist Looks at Executive Development." Harvard Business Review 40 (1962): 69-75.

Loucks, S. F. "At Last: Some Good News from a Study of School Improvement." Educational Leadership 41 (1983).

Loucks, S. F., and Melle, M. "Evaluation of Staff Development: How Do You Know It Took?" The Journal of Staff Development 3 (1982): 102-117.

Sparks, G. M. "Inservice Education: Training Activities, Teacher Attitude, and Behavior Change." Doctoral dissertation, Stanford University, 1983.

Highlights of Research on Inservice Teacher Education

A meta-analysis of 91 well-documented studies shows that:

• Inservice training that includes both elementary and secondary teachers is often more effective than inservice for either group separately.

• Inservice is most successful when participants are given special recognition for their involvement, are selected on a competitive basis, or are chosen to participate.

• Regardless of who conducts inservice sessions (trainers come from a wide range of job classifications), teachers are more likely to learn on their own. Similarly, of all the different inservice structures, independent study is the most effective.

• There seems to be no magical combination of methods for successful inservice. However, inservice programs that use observation, micro teaching, video/audio feedback, and practice (either individually or in combination) are more effective than programs that do not.

• There is no evidence that "coaching" greatly enhances instructional effectiveness. At best, it is moderately effective.

• Inservice is less successful when participants are regarded as major providers of information. Programs are more effective when the leader assumes the role of "giver of information" and the participants are "receivers of information."



Copyright © 1984 by the Association for Supervision and Curriculum Development. All rights reserved.