SPDG Directors' Webinar: Professional Development Series #1 - Evaluation
Julie Q. Morrison, Ph.D., SPDG Evaluator (Ohio)
Julie.Morrison@uc.edu

TRANSCRIPT

Page 1:

SPDG Directors' Webinar:

Professional Development Series #1

Evaluation

Julie Q. Morrison, Ph.D.
SPDG Evaluator (Ohio)
Julie.Morrison@uc.edu

Page 2:

Explore how the National Implementation Research Network’s Implementation Drivers Framework has prioritized staff competence as essential for effective programs and practices

Examine how Guskey’s Five Critical Levels for Evaluating Professional Development can be used as a framework for designing effective professional development

Purpose of the Session

Page 3:

Core Implementation Components

[Diagram labels: Systems, Interventions]

Page 4:

Page 5:

SPDG’s Focus on Competent Use

Implementation of evidence-based practices requires behavior change at the practitioner, supervisory, and administrative support levels.

Training and Coaching are the principal ways in which behavior change is brought about for carefully selected staff in the beginning stages of implementation and throughout the life of evidence-based practices and programs. (Fixsen et al., 2005, p. 29)

Page 6:

Job or role descriptions should be explicit about expectations and accountability for all positions (e.g., teachers, coaches, staff, administrators)

Readiness measures for selection at the school-building or school-district level

Interactive interview process

Best Practices in Selection

(Blase, Van Dyke, & Fixsen, 2010)

Page 7:

Training must be …
◦ Timely
◦ Theory grounded (adult learning)
◦ Skill-based

Information from Training feeds back to Selection and feeds forward to Coaching

Best Practices in Training

[Diagram: Selection → Training → Coaching]

(Blase, Van Dyke, & Fixsen, 2010)

Page 8:

Design a Coaching Service Delivery Plan

Develop accountability structures for Coaching – Coach the Coach!

Identify on-going professional development for coaches

Best Practices in Coaching

[Diagram: Training → Coaching → Performance Assessment]

(Blase, Van Dyke, & Fixsen, 2010)

Page 9:

Must be a transparent process

Use of multiple data sources

Fidelity of implementation should be assessed at the local, regional, and state levels

Tied to positive recognition

Information from this driver feeds back to Selection, Training, and Coaching and feeds forward to the Organization Drivers

Best Practices in Performance Assessment (Fidelity)
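
To make the fidelity-assessment ideas above concrete, here is a minimal sketch, not drawn from the webinar, of how checklist-based fidelity data might be summarized at the building level and rolled up to the district and state levels. The schools, observation scores, and the 80% follow-up criterion are all hypothetical assumptions.

```python
# Illustrative sketch only: summarizing fidelity-of-implementation checklist data
# and rolling it up from school to district to state. All names, scores, and the
# 80% criterion are hypothetical, not prescribed by the webinar.
from statistics import mean

# Each observation is a list of 0/1 scores, one per checklist item (critical indicator).
observations = {
    ("State A", "District 1", "Lincoln Elementary"): [[1, 1, 0, 1], [1, 1, 1, 1]],
    ("State A", "District 1", "Washington Middle"): [[1, 0, 0, 1], [1, 1, 0, 1]],
    ("State A", "District 2", "Roosevelt High"): [[1, 1, 1, 0], [1, 1, 1, 1]],
}

def fidelity(obs):
    """Percent of checklist items in place, averaged across observations."""
    return 100 * mean(mean(items) for items in obs)

# Local (building) level
school_scores = {key: fidelity(obs) for key, obs in observations.items()}

# Regional (district) and state levels: average the building-level scores
district_scores, state_scores = {}, {}
for (state, district, school), score in school_scores.items():
    district_scores.setdefault((state, district), []).append(score)
    state_scores.setdefault(state, []).append(score)

for unit, scores in list(district_scores.items()) + list(state_scores.items()):
    print(unit, "-", round(mean(scores), 1), "% of items in place")

# A transparent, hypothetical decision rule: flag any school below 80% for coaching follow-up.
flagged = [school for (_, _, school), score in school_scores.items() if score < 80]
print("Flagged for coaching follow-up:", flagged)
```

Keeping the scoring rule and roll-up logic this simple is one way to keep the process transparent to the staff being assessed and to tie results back to Selection, Training, and Coaching.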

Page 10:

Assess fidelity of implementation at all levels and respond accordingly

Identify outcome measures that are …
◦ Intermediate and longer-term
◦ Socially valid
◦ Technically adequate: reliable and valid
◦ Relevant data that are feasible to gather, useful for decision making, widely shared, and reported frequently

Best Practices in Decision Support Data Systems
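
As a purely illustrative sketch, not part of the webinar, an evaluator might keep a small inventory of candidate outcome measures and screen it against the criteria just listed; every measure, attribute, and decision rule below is a hypothetical assumption.

```python
# Illustrative sketch only: screening candidate outcome measures against the
# criteria listed above. All measures and attribute values are hypothetical.
from dataclasses import dataclass

@dataclass
class Measure:
    name: str
    timeframe: str            # "intermediate" or "longer-term"
    socially_valid: bool      # valued by teachers, families, administrators
    reliable_and_valid: bool  # technically adequate
    feasible_to_gather: bool
    reporting_frequency: str

candidates = [
    Measure("Curriculum-based measurement (reading)", "intermediate", True, True, True, "monthly"),
    Measure("State achievement test", "longer-term", True, True, True, "annually"),
    Measure("Locally built exit survey", "intermediate", False, False, True, "quarterly"),
]

def meets_criteria(m: Measure) -> bool:
    return m.socially_valid and m.reliable_and_valid and m.feasible_to_gather

for m in candidates:
    status = "keep" if meets_criteria(m) else "revise or drop"
    print(f"{m.name} ({m.timeframe}, reported {m.reporting_frequency}): {status}")
```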

Page 11:

The description of the Implementation Drivers Framework and its implications for best practices represents the work of the members of the National Implementation Research Network.

My professional experiences with the Implementation Drivers Framework have been informed through discussions with other SPDG Evaluators, most notably Pat Mueller (NH & MS), Amy Gaumer Erickson (KS & MO), and Pattie Noonan (KS).

Acknowledgements

Page 12:

Guskey’s Five Critical Levels for Evaluating Professional Development

Level 1: Participants’ Reactions

Level 2: Participants’ Learning

Level 3: Organizational Support and Change

Level 4: Participants' Use of New Knowledge and Skills

Level 5: Student Learning Outcomes

Page 13:

Measuring participants’ initial satisfaction with the experience provides information that can help improve the design and delivery of programs or activities.

Positive reactions from participants are usually a necessary prerequisite to higher-level evaluation results (e.g., fidelity of implementation, impact on student achievement).

Level 1: Participants’ Reactions
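
As a minimal sketch of what a Level 1 summary might look like, the snippet below tallies end-of-session reaction ratings; the survey items, ratings, and the 4.0 / 80% benchmarks are hypothetical assumptions, not part of the webinar.

```python
# Illustrative sketch only: summarizing end-of-session reaction survey data (Level 1).
# Items, responses, and benchmarks are hypothetical.
from statistics import mean

# Responses on a 1-5 scale (1 = strongly disagree, 5 = strongly agree).
responses = {
    "The content was relevant to my day-to-day work": [5, 4, 4, 3, 5, 4],
    "The session was well organized": [4, 4, 5, 5, 4, 3],
    "I intend to apply what I learned": [5, 5, 4, 4, 3, 4],
}

for item, ratings in responses.items():
    pct_agree = 100 * sum(r >= 4 for r in ratings) / len(ratings)
    print(f"{item}: mean = {mean(ratings):.2f}, % agree = {pct_agree:.0f}%")

# Items falling below a chosen benchmark (e.g., mean < 4.0) would feed back into
# redesigning the content, format, or delivery of the next session.
```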

Page 14:

Why Professional Development Fails:
◦ Poor planning and organization
◦ Lack of relevance to the day-to-day issues of the participants
◦ Failure to differentiate the needs of individual schools and teachers
(Wood & Thompson, 1980)

Planning professional development to meet participants' needs will increase the likelihood that they will have positive perceptions of the experience and acquire the intended knowledge and skills.

Implications for Increasing Participants’ Positive Reactions

Page 15:

Evidence of participants' learning validates the relationship between what was intended and what was achieved.

Level 2: Participants’ Learning
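
One common way to document Level 2 evidence is a pre/post assessment keyed to the stated learning objectives. The sketch below is purely illustrative; the objectives and scores are hypothetical assumptions.

```python
# Illustrative sketch only: comparing pre- and post-training assessment scores
# by learning objective (Level 2). Objectives and scores are hypothetical.
from statistics import mean

pre = {"Define the implementation drivers": [2, 3, 2, 1],   # scores out of 5
       "Score a fidelity checklist": [1, 2, 2, 2]}
post = {"Define the implementation drivers": [4, 5, 4, 4],
        "Score a fidelity checklist": [3, 4, 4, 3]}

for objective in pre:
    gain = mean(post[objective]) - mean(pre[objective])
    print(f"{objective}: mean gain = {gain:+.2f} points")
```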

Page 16:

◦ A clear understanding of the learning objectives targeted by the professional development is needed to promote learning.

Bloom’s Taxonomy of Educational Objectives (Bloom, 1956)

The Instructional Hierarchy (Haring, Lovitt, Eaton, & Hansen, 1978).

Implications for Maximizing Participants’ Learning

Page 17:

Research on Effective Professional Development also Supports:

◦ Opportunities to practice the skill or concept under simulated conditions

◦ Timely, specific, constructive feedback

◦ Coaching to refine implementation

(Loucks-Horsley, Harding, Arbuckle, Murray, Dubea, & Williams, 1987; Showers, 1996; Showers, Joyce, & Bennett, 1987)

Implications for Maximizing Participants’ Learning

Page 18:

Organizational variables can be key to the success of any professional development effort. They also can hinder or prevent success, even when the individual aspects of professional development are done right (Sparks, 1996).

Some of the best and most promising improvement strategies have been seriously stifled or halted completely because of seemingly immutable factors in the organization’s culture (Fullan, 1993)

Level 3: Organizational Support and Change

Page 19:

Organizational Policies

Resources

Protections from Intrusions

Implications for Facilitating Organizational Support and Change

Page 20:

Openness to Experimentation and Alleviation of Fears

Collegial Support Among Teachers

Principal’s Leadership and Support

Implications for Facilitating Organizational Support and Change

Page 21:

Higher-Level Administrators’ Leadership and Support

Recognition of Success

Provision of Time

Implications for Facilitating Organizational Support and Change

Page 22:

Fidelity of Implementation
◦ Are participants using the new knowledge and skills to implement the practice as it was intended to be implemented?

Critical Indicators
◦ What would you expect to see if effective implementation were taking place?

Level 4: Participants’ Use of New Knowledge and Skills

Page 23:

Allow sufficient time for participants to adapt the new practices to their setting.

How much fidelity? (replication vs. mutual adaptation)

Anticipate that implementation is often a gradual and uneven process

Attend to depth of implementation (Coburn, 2003)

Implications for Increasing Participants' Use of New Knowledge and Skills

Page 24:

Teacher professional development must be explicitly linked to positive student outcomes

In many cases, changes in teacher practices and attitudes are sustained only when professional development and implementation are combined with evidence of improved student learning (Guskey, 1982, 1984).

Level 5: Student Learning Outcomes
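
One simple way to quantify the link to student outcomes is a standardized mean difference between students of participating and comparison teachers. The sketch below is illustrative only; the webinar does not prescribe this analysis, and all scores are hypothetical.

```python
# Illustrative sketch only: a standardized mean difference (Cohen's d) comparing
# student outcomes for participating vs. comparison classrooms. Scores are hypothetical.
from statistics import mean, stdev
from math import sqrt

pd_group = [78, 85, 90, 72, 88, 81, 79]    # students of participating teachers
comparison = [74, 80, 77, 70, 82, 76, 73]  # students of comparison teachers

# Pooled standard deviation across the two groups
pooled_sd = sqrt(((len(pd_group) - 1) * stdev(pd_group) ** 2 +
                  (len(comparison) - 1) * stdev(comparison) ** 2) /
                 (len(pd_group) + len(comparison) - 2))

d = (mean(pd_group) - mean(comparison)) / pooled_sd
print(f"Standardized mean difference (Cohen's d) = {d:.2f}")
```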

Page 25:

Identify outcome measures that are …
◦ Intermediate (formative assessment) and longer-term (summative assessment)
◦ Socially valid
◦ Technically adequate: reliable and valid
◦ Relevant data that are feasible to gather, useful for decision making, widely shared, and reported frequently

Implications for Increasing the Impact of Professional Development on Student Learning Outcomes

Page 26:

The description of the five critical levels for evaluating professional development for teachers represents the work of Tom Guskey.

My professional experiences applying Guskey's framework have been informed through discussions with other evaluators, most notably:

Stacey Farber (Cincinnati Children's Hospital)
Kelly Hannum (Center for Creative Leadership)
Vanessa Moss-Summers (Xerox)

Acknowledgements

Page 27:

References

Bloom, B. S. (1956). Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. New York: David McKay Co Inc.

Blase, K. A., Van Dyke, M. K., & Fixsen, D. L. (2010). Implementation Drivers – Best Practices. Chapel Hill, NC: National Implementation Research Network.

Coburn, C. E. (2003). Rethinking scale: Moving beyond numbers to deep and lasting change. Educational Researcher, 32(6), 3-12.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231).

Page 28:

Fullan, M. G. (1993). Change forces: Probing the depths of educational reform. Bristol, PA: Falmer.

Guskey, T. R. (1982). The effects of change in instructional effectiveness upon the relationship of teacher expectations and student achievement. Journal of Educational Research, 75(6), 345-349.

Guskey, T. R. (1984b). The influence of change in instructional effectiveness upon the affective characteristics of teachers. American Educational Research Journal, 21(2), 245-259.

Guskey, T. R. (2000). Evaluating Professional Development. Thousand Oaks, CA: Corwin Press, Inc.

Page 29:

Haring, N. S., Lovitt, T. C., Eaton, M. D., & Hansen, C. L. (1978). The fourth R: Research in the classroom. Columbus, OH: Charles E. Merrill Publishing Co.

Loucks-Horsley, S., Harding, C. K., Arbuckle, M. A., Murray, L. B., Dubea, C., & Williams, M. K. (1987). Continuing to learn: A guidebook for teacher development. Andover, MA: Regional Laboratory for Educational Improvement of the Northeast & Islands.

Showers, B. (1996). The evolution of peer coaching. Educational Leadership, 53(6), 12-16.

Showers, B., Joyce, B., & Bennett, B. (1987). Synthesis of research on staff development: A framework for future study and a state of the art analysis. Educational Leadership, 45(3), 77-87.

Page 30:

Sparks, D. (1996, February). Viewing reform from a systems perspective. The Developer, pp. 2, 6.

Wood, F. H., & Thompson, S. R. (1980). Guidelines for better staff development. Educational Leadership, 37(5), 374-378.

Page 31:

Julie Q. Morrison, Ph.D.
Assistant Professor
University of Cincinnati
College of Education, Criminal Justice, & Human Services
School of Human Services, School Psychology Program
E-mail: Julie.Morrison@uc.edu

Contact Information