
We are using GoToWebinar for our Distance Learning sessions this year. Please be sure that you are using a headset with microphone and muting all other speakers OR you may call the conference number located on your message center screen. If you need any other technical assistance, please call Stephanie at 217-732-6462 ext. 32 or email me at sbenedict@adi.org.

Distance Learning – Cohort #2
“Building a Statewide System of Support with Evaluation in Mind”

Agenda

• Greetings & who’s here

• Special report from Tom Kerins, CII and Steven Ross, Johns Hopkins University

• Questions, comments, what’s next

Who’s Here

• Cohort 2 State Teams and RCC Liaisons: West Virginia, Montana, Vermont, Nevada, Wisconsin & BIE

• Cohort 1 visitors

• Presenting: Tom Kerins, CII & Steven Ross, Johns Hopkins University

• CII Staff: Stephanie Benedict, Marilyn Murphy & Tom Kerins

What is one thing you had hoped to learn or hear more about when you signed up for this session?

Topic

“Building a Statewide System of Support with Evaluation in Mind”

Rubrics-Based Evaluation of a Statewide System of Support

A Tool to Enhance Statewide Systems of Support

Purpose

• To present a framework for how a State Education Agency (SEA) can evaluate the capacity, operational efficiency, and effectiveness of its Statewide System of Support (SSOS).

• To guide an SEA’s internal evaluation of its SSOS or its development of specifications for an external evaluation.

• To establish ongoing monitoring, reporting, and formative evaluation processes for an SEA’s SSOS.

Development of the SSOS Evaluation Rubrics

• Basis: A Framework for Effective Statewide Systems of Support, developed by Rhim, Hassel, and Redding

• Research on roles of states in school improvement, including case studies of five State Education Agencies and surveys of all 50 states, Washington DC and Puerto Rico.

• Intensive work with a pacesetting group of 9 states.

Conclusions from the Research

• Successful systemic reform requires incentives, capacity, and opportunities.

• Each SEA needs an organizational framework to document its strengths and weaknesses and to plan SSOS improvement.

• There is a need for a strong, continuous, state-designed and district-directed improvement process to assist schools at all levels of performance.

Components of the Rubric-Based Evaluation

Part A: SSOS Plan and Design

1. Specified comprehensive plan for SSOS

2. Defined evidence-based programs/interventions for all students and subgroups

3. Plan for formative evaluation

Components of the Rubric-Based Evaluation

Part B: Resources

4. Staff
5. Funding
6. Data analysis and storage
7. Distinguished educators, consultants, experts, etc.
8. External providers

Components of the Rubric-Based Evaluation

Part C: Implementation

9. Removal of barriers
10. Incentives for change
11. Communications
12. Technical assistance
13. Dissemination of knowledge
14. Formative evaluation and monitoring (audits)

Components of the Rubric-Based Evaluation

Part D: Outcomes

Student achievement

Student attendance

Graduation rate

Essential Indicators

• Within these 4 Parts are 42 Essential Indicators that define the critical components of a State’s SSOS

• Four-point rubrics with cells individualized to each of the 42 indicators help explain and define the different stages a State will go through as it successfully meets each indicator

Rubric Decisions

Next to each indicator there are 4 columns describing the possible continuum of progress:

• Little or No Development or Implementation
• Limited Development or Partial Implementation
• Mostly Functional Level of Development and Implementation
• Full Level of Implementation and Evidence of Impact

Sample Essential Indicator 5.1

5.1 Coordination among state and federal programs

Little or No Development or Implementation: There is no apparent plan to efficiently coordinate programs with different funding sources that are aimed at improving schools receiving SSOS services.

Limited Development or Partial Implementation: The state has a written plan and has made some preliminary attempts to integrate multiple state and federal programs aimed at school improvement.

Mostly Functional Level of Development and Implementation: The state has begun to integrate multiple programs with common goals but different funding streams in areas such as planning, resource allocation, training, reporting, and compliance monitoring.

Full Level of Implementation and Evidence of Impact: The state has fully implemented its program integration plan, and there is evidence of greater efficiency in planning, resource allocation, and compliance monitoring.

Cumulative Scoring

To receive a rating of III, “Mostly Functional Level of Development and Implementation,” the SSOS must also fulfill the requirements for a rating of II, “Limited Development or Partial Implementation.”
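For teams that track their self-ratings electronically, the cumulative rule can be expressed as a short check. The sketch below is purely illustrative and is not part of the rubric materials; the level names follow the rubric, but the function name `cumulative_rating` and the data layout are our own assumptions.

```python
# Illustrative sketch of the cumulative scoring rule (not from the rubric report).
# An SSOS is credited with the highest level whose requirements, and the
# requirements of every lower level, are fulfilled.

RATING_LEVELS = {
    1: "Little or No Development or Implementation",
    2: "Limited Development or Partial Implementation",
    3: "Mostly Functional Level of Development and Implementation",
    4: "Full Level of Implementation and Evidence of Impact",
}


def cumulative_rating(meets_ii: bool, meets_iii: bool, meets_iv: bool) -> int:
    """Return the rating (1-4) under the cumulative rule."""
    rating = 1  # Level I is the floor
    for met in (meets_ii, meets_iii, meets_iv):
        if not met:
            break  # a gap at a lower level caps the rating
        rating += 1
    return rating


# Requirements for II and III are met, but not IV -> rating is III.
print(RATING_LEVELS[cumulative_rating(True, True, False)])

# Requirements for III appear met, but II does not -> rating stays at I.
print(RATING_LEVELS[cumulative_rating(False, True, True)])
```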

Explanatory Materials Provided in the Evaluation Rubric Report

• Evaluation rubric with 42 Essential Indicators

• Sample ratings for each indicator, along with examples of evidence, to help each SEA Team rate its own SSOS

• Examples from states that help explain the Indicator statements

• A template for SEA Team self-scoring

• Essential components of an evaluation plan

Determining the Rating

Essential Indicator 7.2:

Training for distinguished educators and support teams

What the SEA said it had accomplished

As required by the state plan, all Distinguished Educators (DEs) must participate in three levels of training/professional development: (a) a one-week summer session, (b) a two-day refresher in early fall, and (c) ongoing coaching and mentoring during the DE’s first year. The “DE Academy,” which delivers the training, conducts regular formative evaluations of the activities and services, using the data to make refinements as needed.

Determining the rating

The reviewers rated the state as operating at Level IV on this indicator. The training process for DEs was formally defined, comprehensive, fully implemented, and subjected to continuing review, evaluation and improvement.

State Examples Related to the Indicators*

Indicator 2.2—Coordination of services across SEA departments

The example shows how Ohio worked with the Department of Education, its own Regional Programs, and across its internal departments to model how cooperation can be accomplished so that funds and requirements can be integrated.

* See the Evaluation Rubric Report for state examples for each indicator

Rubric-Based Evaluation Activities

• The rubrics illustrate, for each Indicator, the continuum that States move along as they develop their SSOS.

• Each State Team (using evidence) should develop a profile of how its SSOS lines up with all 42 indicators by using the Rubric’s template to note the present stage of development.

• Comments should be included to note what needs to be done to improve the initial results of the self-rating.

• Each State Team should choose at least six indicators for immediate action after this self-review process

Role of CII in this process

• Each State Team should develop a plan of action including tasks, timelines and the responsibilities of each team member as they begin to turn the indicator statements into objectives.

• Staff from CII will be available by webinar and for on-site work to assist State Teams as they use the Rubric’s template to document the status of their SSOS.

Evaluation

• Each SEA Team should use the initial results from this rubric as baseline information

• Periodically (and certainly annually) each SEA Team should check for progress on the entire rubric and specifically on those sections of the Rubric that generated recommendations.

• CII staff are available to assist in any of these evaluations of SEA progress

The Evaluation Rubric & Indistar

• The Indistar system can be used to choose indicators and document planning.

• Using Indistar procedures, a team can select indicators through the needs assessment, create plans, assign tasks to team members and other staff, and monitor the progress of the work as a whole.

• To view the sample Indistar site, go to www.centerii.org and click on the Indistar login in the bottom left corner of the page. Use the following login information: Login: ssos, Password: ssos

• Each state that is interested in using the online version of this tool will be given its own unique login and password.

The Evaluation Rubric & Indistar (cont.)

• Before you are given your unique login and password, we ask that you participate in an additional webinar, the “SSoS Online Tool Orientation Training.”

• The webinar is scheduled for May 13th at 1:00 pm CST. If you are interested in joining that webinar, please send an email to pacesetters@adi.org and we will send you the registration link.

• An alternative date of May 27th at 1:00 pm CST will also be available if you cannot make the first webinar.

Assess….Plan….Monitor

If your State team is interested in using the Indistar Tool and would like to get an individual state login/password, please contact Stephanie Benedict, sbenedict@adi.org. For all CII support for SSoS, please contact Tom Kerins, tkerins@adi.org.

Questions

Comments….

Evaluating the Outcomes of SSOS: Qualities of an Effective Evaluation
Steven M. Ross, Ph.D., Johns Hopkins University

Steven M. Ross, Ph.D.

Steven M. Ross is a senior research scientist and professor at the Center for Research and Reform in Education at Johns Hopkins University. Dr. Ross’ expertise is in educational research and evaluation, school reform and improvement, at-risk learners, and technology integration.

Questions to Ponder

Is evaluation substantively and routinely embedded in your SSOS?

Are we as states going beyond completing the rubrics and looking at root causes and data?

Do evaluation results help to scale up successful activities and discontinue others (e.g., certain providers)?

Are external evaluators being used for support? Should they be?

The Evaluation Rubric: A Quick Review

Part A: SSOS Plan and Design

1. Specified comprehensive plan
2. Defined evidence-based programs
3. Plan for evaluation

The Evaluation Rubric: A Quick Review

Part B: Resources

4. Staff
5. Funding
6. Data analysis and storage
7. Distinguished educators, consultants, & experts
8. External providers

Part C: Implementation

9. Removal of barriers for change and innovation

10. Incentives for change
11. Communications
12. Technical assistance
13. Dissemination of knowledge
14. Monitoring and program audits

The Evaluation Rubric: A Quick Review

Part D: Outcomes for Schools Served by SSOS

15. Student achievement
16. Student attendance
17. Graduation rate

The Evaluation Rubric: A Quick Review

42 Rubrics and Their Rating Scales

I. Little or No Development or Implementation

II. Limited Development or Partial Implementation

III. Mostly Functional Level of Development and Implementation

IV. Full Level of Implementation and Evidence of Impact

SSOS Evaluation Rubric 2.5

From Evaluating the Statewide System of Support, by S. Hanes, T. Kerins, C. Perlman, S. Redding, & S. Ross, 2009, p. 24, Table 2. Copyright 2009 by Academic Development Institute. Reprinted with permission.

2. Defined evidence-based programs/interventions for all students & subgroups

SSOS Evaluation Rubric 15.1

From Evaluating the Statewide System of Support, by S. Hanes, T. Kerins, C. Perlman, S. Redding, & S. Ross, 2009, p. 37, Table 15. Copyright 2009 by Academic Development Institute. Reprinted with permission.

Why Evaluate SSOS: Isn’t There Enough to Do Already?  

To make reliable and valid judgments of the status of the services provided

How fully are services being implemented? To what extent are expected outcomes being achieved?

To provide accountability information for SDE and external organizations

To demonstrate accountability to consumers (districts, schools, educators, parents)

To develop a process for continuous program improvement

Properties of an Effective Evaluation

Validity (rigorous, reliable)

Evidence-Based
– Documented plan
– Meeting agenda
– Survey responses
– Performance standards
– Outcome measures

Strong Evidence: “80% of principals surveyed rated the Distinguished Educators’ support as very helpful and provided a specific example of how the DE helped their school.”

Weak Evidence: “The Governor touted the state’s progress in assisting low-performing schools during his tenure.”

Properties of an Effective Evaluation

Evaluative rather than descriptive

Properties of an Effective Evaluation

From Evaluating the Statewide System of Support, by S. Hanes, T. Kerins, C. Perlman, S. Redding, & S. Ross, 2009, p. 91, Table 1. Copyright 2009 by Academic Development Institute. Reprinted with permission.

Evaluating Educational Outcomes

Part D, Section 15 of Rubric

• SSOS is ultimately about improving student achievement and educational success

• These are “distal” or “culminating” outcomes that may not show immediate change

• Nonetheless, it is educationally and politically important to monitor these indicators

Evaluating Educational Outcomes: Recommendation 1

1. Treat the essential indicators (Part D, Section 15) as a starting point only

• Given the 42 rubric indicators, which services appear to be the most essential to improve?

• Prioritize these improvement needs

Evaluating Educational Outcomes: Recommendation 2

2. Supplement the essential indicators with follow-up analyses of “root causes” and data

• Potentially successful turnaround strategies that may be scalable

• Unsuccessful strategies that need replacement

• Explanation of the outcomes relative to SSOS services provided

Example: Although we train distinguished educators and revise the training each year, what evidence is there that the training is effective?

Analyses of “Root Causes”

Example A: Schools showing the strongest increases in mathematics are visited by SDE and found to be using highly interactive teaching strategies and expanded learning opportunities

Implication for SSOS?

Analyses of “Root Causes”

Example B: School B increased its reading scores significantly over the past year. Follow-up study of student enrollment patterns reveals that student rezoning decreased the number of disadvantaged students by 50%.

Implication for SSOS?

Analyses of “Root Causes”

Example C: School C had several student subgroups fail to attain AYP in reading. Follow-up interviews with the principal and literacy coaches reveal that the new R/LA curriculum was poorly supported by the provider.

Implication for SSOS?

Evaluating Educational Outcomes: Recommendation 3

3. Supplement the beginning evaluation (Recommendation 1) and follow-up analyses (Recommendation 2) with rigorous evaluations of selected interventions 

• RFPs for external studies
• Assistance to school districts interested in evaluation research
• Rigorous data analyses by SDE to study achievement patterns associated with SSOS interventions

Accurate and Relevant Evidence

Strong, Suggestive, or Weak ?

“Teachers liked the professional development activities.”

Accurate and Relevant Evidence

Strong, Suggestive, or Weak ?

“Systematic observation by independent observers shows significant increases in student-centered instruction.”

Accurate and Relevant Evidence

Strong, Suggestive, or Weak ?

“Teachers indicate that they use more student-centered instruction than in the past.”

Accurate and Relevant Evidence

Strong, Suggestive, or Weak ?

“Principals and grade-level leaders indicate observing more frequent cooperative learning than last year.”

Accurate and Relevant Evidence

Strong, Suggestive, or Weak ?

“The providers of the professional development believed the offerings to be successful.”

Accurate and Relevant Evidence

Strong, Suggestive, or Weak ?

“Reading scores increased by 15% for the schools receiving SSOS in literacy.”

Working with External Evaluators

Question: Is it more or less “costly” than using SDE staff?

Answer: It depends on the expertise and availability of the latter.

Working with External Evaluators

What types of evaluation tasks most need external evaluators?

• The Basic Rubric (“Study I”) and the essential indicators (“Study II”) might best be performed in-house

• The external evaluator (at low cost) would be helpful to corroborate the Study I and II findings

• Rigorous studies of specific interventions (“Study III”) are most appropriate for external evaluators

Working with External Evaluators

Advantages of External Evaluators

• Expertise in research design/data analysis

• School/district staff likely to be more disclosive

• Independence/credibility

Working with External Evaluators

Key Steps

• Use a systematic process to select the evaluator (careful review of prior work and client satisfaction)

• Establish a clear plan of work and budget

• Clearly define evaluation/research questions

• Monitor the study via regular meetings/reports, etc.

• Work with the evaluator to disseminate results to improve policies and practices

Concluding Comments

• Unless SSOS is evaluated, it is unlikely to improve

• The benefits of the evaluation depend on its rigor and quality

• There is little to gain by painting rosy pictures of mediocre outcomes
  – The message is that all is well and should be left alone
  – A truthful negative evaluation is a stimulus for change

• There is much to gain by presenting objective results to sustain services that work and improve those that are ineffective

Questions

Comments

What’s Next

Center on Innovation & Improvement Staff

• Tom Kerins, Programs Director, tkerins@adi.org

• Marilyn Murphy, Communication Director, mmurphy@centerii.org

• Lisa Kinnaman, Director of Improvement Support to States, lkinnaman@adi.org

• Stephanie Benedict, Client Relations Coordinator, sbenedict@adi.org
