NYSCOSS Conference Superintendents Training on Assessment 9/14

From a Superintendent's Perspective: Using Data Wisely. Andy Hegedus, Ed.D., September 2014.

Upload: nwea

Posted on 30-Nov-2014





DESCRIPTION

Training for 2nd and 3rd year NY state superintendents on assessments and data use - 9-21-14

TRANSCRIPT

Page 1: NYSCOSS Conference Superintendents Training on Assessment 9 14

Andy Hegedus, Ed.D.

September 2014

From a Superintendent's Perspective: Using data wisely

Page 2

• How many of you think your literacy with assessments is “Good” or better?

• How many of you have a fine-tuned assessment program?

• How many of you think your practical knowledge about using data for systemic improvement is “Good” or better?

Trying to gauge my audience and adjust my speed . . .

Page 3

• An adequate depth of knowledge to ask really good questions and the tenacity to do so

• A person willing to “Speak the Truth, With Love, To Power”

• A big lever – "What gets measured (and attended to), gets done"

The measurement and reinforcement system is your responsibility

The Bottom Line First: Here's what you must have

Page 4

• Increase your understanding about various urgent assessment-related topics
  – Ask better questions
  – Useful for making all types of decisions with data

My Purpose

Page 5

• Assessment basics +
• Improving your assessment program
• Data culture

Three main topics

Page 6

• What we've known to be true is now being shown to be true
  – Using data thoughtfully improves student achievement and growth rates
  – 12% mathematics, 13% reading

• There are dangers present however– Unintended Consequences

Go forth thoughtfully, with care

Slotnik, W. J., & Smith, M. D. (February 2013). It's more than money. Retrieved from http://www.ctacusa.com/PDFs/MoreThanMoney-report.pdf

Page 7

“What gets measured (and attended to), gets done”

Remember the old adage?

Page 8

• NCLB
  – Cast light on inequities
  – Improved performance of "Bubble Kids"
  – Narrowed the taught curriculum

The same dynamic happens inside your schools

An infamous example

Page 9

It’s what we do that counts

A patient’s health doesn’t change because we know their blood pressure

It’s our response that makes all the difference

Page 10

Be considerate of the continuum of stakes involved

Support
Compensate
Terminate

Increasing levels of required rigor
Increasing risk

Page 11

Assessment basics (in a teacher evaluation frame)

Page 12

1. Alignment between the content assessed and the content to be taught
2. Selection of an appropriate assessment
   • Used for the purpose for which it was designed (proficiency vs. growth)
   • Can accurately measure the knowledge of all students
   • Adequate sensitivity to growth
3. Adjust for context/control for factors outside a teacher's direct control (value-added)

Three primary conditions

Page 13

1. Assessment results used wisely as part of a dialogue to help teachers set and meet challenging goals

2. Use of tests as a “yellow light” to identify teachers who may be in need of additional support or are ready for more

Two approaches we like

Page 14

Is the progress produced by this teacher dramatically different from that of teaching peers who deliver instruction to comparable students in comparable situations?

What question is being answered in support of using data in evaluating teachers?

Page 15

[Chart: Marcus' growth compared with normal growth and the growth needed to reach the college readiness standard]

Page 16

The Test

The Growth Metric

The Evaluation

The Rating

There are four key steps required to answer this question

Top-Down Model

Page 17

The Test

The Growth Metric

The Evaluation

The Rating

Let’s begin at the beginning

Page 18

What is measured should be aligned to what is to be taught

3rd Grade ELA Standards:
1. Answer questions to demonstrate understanding of text….
2. Determine the main idea of a text….
3. Determine the meaning of general academic and domain-specific words…

Would you use a general reading assessment in the evaluation of a….
• 3rd Grade ELA Teacher?
• 3rd Grade Social Studies Teacher?
• Elem. Art Teacher?

~30% of teachers teach in tested subjects and grades. The Other 69 Percent: Fairly Rewarding the Performance of Teachers of Nontested Subjects and Grades, http://www.cecr.ed.gov/guides/other69Percent.pdf

Page 19

• Assessments should align with the teacher's instructional responsibility
  – Specific advanced content
    • HS teachers teaching discipline-specific content, especially 11th and 12th grade
    • MS teachers teaching HS content to advanced students
  – Non-tested subjects
    • School-wide results are more likely "professional responsibility" rather than reflecting competence
  – HS teachers providing remedial services

What is measured should be aligned to what is to be taught

Page 20

• Many assessments are not designed to measure growth

• Others do not measure growth equally well for all students

The purpose and design of the instrument is significant

Page 21

Let’s ensure we have similar meaning

[Diagram: a vertical scale from Beginning Literacy to Adult Reading; a 5th grade student's status at Time 1 and Time 2, with the change between them as growth]

Two assumptions:
1. Measurement accuracy, and
2. A vertical interval scale

Page 22

Accurately measuring growth depends on accurately measuring achievement

Page 23

Questions surrounding the student's achievement level

The more questions the merrier

What does it take to accurately measure achievement?
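The "more questions the merrier" point has a simple statistical core: under standard item response theory assumptions, the information each item provides adds up, and the standard error of the achievement estimate is the inverse square root of the total information. A minimal sketch (the per-item information value of 0.25 is hypothetical, not from the slides):

```python
import math

def standard_error(per_item_information, n_items):
    """SEM = 1 / sqrt(total test information); item information adds across items."""
    return 1.0 / math.sqrt(per_item_information * n_items)

# Hypothetical: each well-targeted question contributes 0.25 units of information.
short_test = standard_error(0.25, 20)  # fewer questions, larger error
long_test = standard_error(0.25, 40)   # doubling the items shrinks error by ~sqrt(2)
```

This is why asking more questions near the student's level, as adaptive tests do, tightens the achievement estimate.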

Page 24

Teachers encounter a distribution of student performance

[Diagram: 5th grade students scattered along the scale from Beginning Literacy to Adult Reading, above and below grade-level performance]

Page 25

Adaptive testing works differently

Item bank can span full range of achievement

Page 26

How about accurately measuring height?

What if the yardstick stopped in the middle of his back?

Page 27

Items available need to match student ability

California STAR vs. NWEA MAP

Page 28

How about accurately measuring height?

What if we could only mark within a pre-defined six inch range?

Page 29

These differences impact measurement error

[Chart: test information functions across scale scores 190–260 for 5th grade level items; a fully adaptive test yields significantly different (lower) error than a constrained adaptive or paper/pencil test]

Page 30

To determine growth, achievement measurements must be related through a scale

Page 31

If I was measured as 5' 9", and a year later I was 1.82 m, did I grow?
Yes, ~2.5". How do you know?

Let’s measure height again
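The height example works because both measurements are converted to one interval scale before subtracting; a quick check of the arithmetic (the conversion helper is illustrative, not from the slides):

```python
def to_inches(feet=0, inches=0.0, meters=0.0):
    """Convert a height given in feet/inches or in meters to inches."""
    return feet * 12 + inches + meters / 0.0254  # 1 inch = 0.0254 m exactly

start = to_inches(feet=5, inches=9)   # 5' 9" = 69.0 inches
end = to_inches(meters=1.82)          # 1.82 m is about 71.65 inches
growth = end - start                  # about 2.65 inches, i.e. roughly 2.5"
```

Test scores work the same way: two scores can only be subtracted to yield growth once they sit on a common vertical scale.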

Page 32

Traditional assessment uses items reflecting the grade level standards

[Diagram: a scale from Beginning Literacy to Adult Reading; the traditional assessment item bank for each of grades 4, 5, and 6 covers only that grade's standards]

Page 33

Traditional assessment uses items reflecting the grade level standards

[Diagram: the same scale, with adjacent grades' standards overlapping; the overlap allows linking and scale construction]

Page 34

• Think of a high-stakes test – State Summative
  – Designed mainly to identify if a student is proficient or not
• Do they do that well?
  • 93% correct on Proficiency determination
• Does it go off design well?
  • 75% correct on Performance Levels determination

Error can change your life!

*Testing: Not an Exact Science, Education Policy Brief, Delaware Education Research & Development Center, May 2004, http://dspace.udel.edu:8080/dspace/handle/19716/244

Page 35

• Tests specifically designed to inform classroom instruction and school improvement in formative ways

No incentive in the system for inaccurate data

Using tests in high-stakes ways creates a new dynamic

Page 36

[Chart: mean value-added growth by school, comparing students who took 10+ minutes longer in spring than in fall against all other students]

New phenomenon when used as part of a compensation program

Page 37

When teachers are evaluated on growth using a once per year assessment, one teacher who cheats disadvantages the next teacher

Other consequence

Page 38

• What were some things you learned?
• What practices do you want to reinforce?
• What do you need to do differently?

• Think – Pair: 2 min to make some notes, 3 min to share with a neighbor

Lessons?

Page 39

Testing is complete . . . What is useful to answer our question?

The Test

The Growth Metric

The Evaluation

The Rating

Page 40

The problem with spring-spring testing

[Timeline: April 2014 through April 2015; spring–spring growth spans Teacher 1, the summer, and Teacher 2]

Page 41

• When possible use a spring – fall – spring approach

• Measure summer loss and incentivize schools and teachers to minimize it

• Measure teacher performance fall to spring, giving as much instructional time as possible between assessments

• Monitor testing conditions to minimize gaming of fall or spring results

A better approach
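The spring – fall – spring approach above separates summer change from in-year growth with simple subtraction; a minimal sketch using hypothetical RIT-style scores (the numbers are illustrative, not from the slides):

```python
def split_growth(spring_prior, fall, spring_current):
    """Separate summer change from fall-to-spring (in-year) growth."""
    return {
        "summer_change": fall - spring_prior,     # negative values indicate summer loss
        "in_year_growth": spring_current - fall,  # window attributable to this year's teacher
    }

# Hypothetical scores: prior spring 210, fall 206, current spring 218
result = split_growth(spring_prior=210, fall=206, spring_current=218)
```

A spring-only design would report 8 points and silently charge the 4-point summer loss to the current teacher.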

Page 42

The metric matters – let's go underneath "Proficiency"

[Chart: difficulty of the New York proficient cut score for reading and math, grades 2–8, expressed as national percentiles, relative to college readiness]

New York Linking Study: A Study of the Alignment of the NWEA RIT Scale with the New York State (NYS) Testing Program, November 2013

Page 43

The metric matters - Let’s go underneath “Proficiency”

Dahlin, M. and Durant, S., The State of Proficiency, Kingsbury Center at NWEA, July 2011

Page 44

What actually happened?

[Chart: estimated proficiency rates for six NY districts, 4th grade mathematics, with proficiency cut scores changed: 2012 = 87, 2013 = 55, reported change = -31]

Page 45

What actually happened?

[Chart 1: estimated proficiency rates for six NY districts, 4th grade mathematics, with proficiency cut scores changed: 2012 = 87, 2013 = 55, reported change = -31]

[Chart 2: the same districts with the 2013 proficiency cut scores applied to both years: 2012 = 46, 2013 = 55, actual change = +10]
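The gap between the "reported" and "actual" change is a cut-score artifact: the reported figure compares rates computed under two different cut scores. This sketch reproduces the logic with hypothetical scores and cut scores (not the actual district data):

```python
def proficiency_rate(scores, cut):
    """Percent of students at or above the proficiency cut score."""
    return 100.0 * sum(s >= cut for s in scores) / len(scores)

# Hypothetical cohorts: 2013 students score one point higher than 2012 students.
scores_2012 = [195, 205, 210, 215, 220, 230, 240, 250, 260, 270]
scores_2013 = [196, 206, 211, 216, 221, 231, 241, 251, 261, 271]

# "Reported" change mixes two different cut scores (hypothetical: 205 then 235).
reported = proficiency_rate(scores_2013, cut=235) - proficiency_rate(scores_2012, cut=205)

# "Actual" change applies the later cut score to both years.
actual = proficiency_rate(scores_2013, cut=235) - proficiency_rate(scores_2012, cut=235)
```

With these numbers the reported change is a 50-point collapse while the actual change is zero; the students barely moved, only the cut score did.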

Page 46

Mathematics

[Chart: one district's change in 5th grade mathematics performance relative to the KY proficiency cut scores: number of students by fall RIT, coded No Change / Down / Up, with the proficiency and college readiness cuts marked]

What gets measured and attended to really does matter

Page 47

Mathematics

[Chart: number of 5th grade students in the same district meeting projected mathematics growth, by fall score: below projected growth vs. met or above projected growth]

Changing from Proficiency to Growth means all kids matter

Page 48

• What were some things you learned?
• What practices do you want to reinforce?
• What do you need to do differently?

• Think – Pair – Share: 2 min to make some notes, 3 min to share with a neighbor

Lessons?

Page 49

How can we make it fair?

The Test

The Growth Metric

The Evaluation

The Rating

Page 50

Without context, what is "Good"?

[Chart: one scale from Beginning Reading to Adult Literacy aligned to several reference points: national percentiles from the norms study, ACT college readiness benchmarks, state test performance levels ("Meets" proficiency), and Common Core performance levels ("Proficient")]

Page 51

Normative data for growth is a bit different

[Chart: typical growth by fall score – e.g., 7 points for a 5th grade reading student at a given starting score]

Basic factors: starting achievement and instructional weeks
Outside of a teacher's direct control: FRL vs. non-FRL? IEP vs. non-IEP? ESL vs. non-ESL?

Page 52

A Visual Representation of Value Added

Fall 5th Grade Test: Student A's fall score is 200
Spring 5th Grade Test: Student A's spring score is 209; the average spring score for similar students is 207
Value added: +2
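The visual above reduces to simple arithmetic: a student's value added is the spring score minus the average spring score of students who started from a comparable fall score. A sketch using the slide's numbers (the comparison-group scores are hypothetical, chosen to average 207):

```python
def value_added(student_spring, similar_students_spring):
    """Value added = the student's spring score minus the average spring
    score of students who started from a comparable fall score."""
    expected = sum(similar_students_spring) / len(similar_students_spring)
    return student_spring - expected

# Student A scored 209 in spring; the hypothetical comparison group
# averages 207, matching the slide's +2 result.
va = value_added(209, [205, 207, 209])
```

Real value-added models build the expected score from many control variables, but the final comparison is exactly this subtraction.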

Page 53

• What if I skip this step?
  – Comparison is likely against normative data, so the comparison is to "typical kids in typical settings"
• How fair is it to disregard context?
  – Good teacher, bad school
  – Good teacher, challenging kids

Consider . . .

Page 54

• Value-added models can control for a variety of classroom, school-level, and other conditions
  – Proven statistical methods
  – All attempt to minimize error
  – Variables outside the controls are assumed to be random

Value-added is science

Page 55

• Control for measurement error
  – All models attempt to address this issue
    • Population size
    • Multiple data points
  – Error is compounded when combining two test events
  – Many teachers' value-added scores will fall within the range of statistical error

A variety of errors means more stability only at the extremes
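The stability point can be made concrete: a teacher's growth index comes with a standard error, and the estimate is only distinguishable from average when the confidence interval around it excludes zero. A sketch with hypothetical growth-index numbers and a conventional 95% z-interval:

```python
def distinguishable_from_average(growth_index, std_error, z=1.96):
    """True only when the 95% interval around the estimate excludes zero."""
    lower = growth_index - z * std_error
    upper = growth_index + z * std_error
    return lower > 0 or upper < 0

# Hypothetical teachers: same standard error, different estimates.
mid_teacher = distinguishable_from_average(1.0, 1.2)   # interval spans zero
tail_teacher = distinguishable_from_average(4.0, 1.2)  # clearly above zero
```

Only estimates at the extremes clear the error band, which is why stability holds only at the tails of the distribution.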

Page 56

Mathematics Growth Index Distribution by Teacher – Validity Filtered

[Chart: each teacher's average growth index score and its error range, ordered and grouped into quintiles Q1–Q5]

Each line in this display represents a single teacher. The graphic shows the average growth index score for each teacher (green line), plus or minus the standard error of the growth index estimate (blue line). We removed students who had tests of questionable validity and teachers with fewer than 20 students.

Range of teacher value-added estimates

Page 57

With one teacher, error means a lot

Page 58

• Value-added models assume that variation is caused by randomness if not controlled for explicitly
  – Young teachers are assigned disproportionate numbers of students with poor discipline records
  – Parent requests for the "best" teachers are honored
• Sound educational reasons for placement are likely to be defensible

Assumption of randomness can have risk implications

Page 59

“The findings indicate that these modeling choices can significantly influence outcomes for individual teachers, particularly those in the tails of the performance distribution who are most likely to be targeted by high-stakes policies.”

Ballou, D., Mokher, C. and Cavalluzzo, L. (2012) Using Value-Added Assessment for Personnel Decisions: How Omitted Variables and Model Specification Influence Teachers’ Outcomes.

Instability at the tails of the distribution

LA Times Teacher #1
LA Times Teacher #2

Page 60

How tests are used to evaluate teachers

The Test

The Growth Metric

The Evaluation

The Rating

Page 61

• How would you translate a rank order to a rating?
  – Data can be provided
• A value judgment is ultimately the basis for setting cut scores for points or a rating

Translation into ratings can be difficult to inform with data

Page 62

• What is far below a district's expectation is subjective
• What about:
  • The obligation to help teachers improve?
  • The quality of replacement teachers?

Decisions are value based, not empirical

Page 63

• The system for combining elements and producing a rating is also a value-based decision
  – Multiple measures and principal judgment must be included
  – Evaluate the extremes to make sure the result makes sense

Even multiple measures need to be used well

Page 64

Leadership Courage Is A Key

[Chart: ratings on a 0–5 scale for three teachers, comparing each teacher's observation rating with their assessment-based rating]

Ratings can be driven by the assessment

Real or Noise?

Page 65

If evaluators do not differentiate their ratings, then all differentiation comes from the test

Big Message

Page 66

• What were some things you learned?
• What practices do you want to reinforce?
• What do you need to do differently?

• Think – Pair – Share: 2 min to make some notes, 3 min to share with a neighbor, 2 min for two report outs on anything so far

Lessons?

Page 67

Improving your assessment program

Page 68

• Read the sheet and highlight anything interesting to you

Let's Define Types of Assessments

Page 69

The pursuit of compliance is exhausting because it is always a moving target. Governors move on, the party in power gets replaced, a new president is elected, and all want to put their own stamp on education.

It is saner and less exhausting to define your own course and align compliance requirements to that.

Page 70

Seven standards that define the purpose driven assessment system

Page 71

1. The purposes of all assessments are defined and the assessments are valid and useful for their purposes

Page 72

2. Teachers are educated in the proper administration and application of the assessments used in their classrooms

Page 73

3. Redundant, misaligned, or unused assessments are eliminated

Page 74

4. Assessment results are aligned to the needs of their audiences

Page 75

5. Assessment results are delivered in a timely and useful manner

Page 76

6. The metrics and incentives used encourage a focus on all learners

Page 77

7. The assessment program contributes to a climate of transparency and objectivity with a long-term focus

Page 78

1. Typical assessment purposes

• Identify student learning needs
• Identify groupings of students for instruction
• Guide instruction
• Course placement
• Determine eligibility for programs
• Award credits and/or assign grades
• Evaluate proficiency
• Monitor student progress
• Predict proficiency
• Project achievement of a goal
• Formative and summative evaluation of programs
• Formative evaluation to support school and teacher improvement
• Report student achievement, growth, and progress to the community and stakeholders
• Summative evaluation of schools and teachers

Page 79

1. Assessment Purpose Survey

To increase value, identify gaps between:
1. How critical is this data to your work?
2. How do you actually use this data?

Take 10 min to fill this out and 5 min to pair and discuss the areas of biggest gap

Page 80

Compare assessments and their purposes to find unnecessary overlaps

Take 10 min to fill this out and 5 min to pair and discuss areas of redundancy

3. Eliminate waste

Page 81

• What were some things you learned?
• What practices do you want to reinforce?
• What do you need to do differently?

• Think – Pair – Share: 2 min to make some notes, 3 min to share with a neighbor, 2 min for two report outs on this section

Lessons?

Page 82

Data Culture

Page 83

Education Organizations Mature

Barber, M., Chijioke, C., & Mourshed, M. (2011). How the world's most improved school systems keep getting better. McKinsey & Company.

Poor to Fair: Achieving the basics of literacy and numeracy
Fair to Good: Getting the foundations in place
Good to Great: Shaping the professional
Great to Excellent: Improving through peers and innovation

Data use does too

Page 84

Data Use Continuum

Poor to Fair: One on One
Fair to Good: Within Teams
Good to Great: Within the Walls
Great to Excellent: Across the Walls

Requires a shift in the culture

Page 85

What would you do with this?

Page 86

• Where are your pockets of most maturity?
• Least maturity?
• What is causing the differences?

Think - 2 min

Reflection Time

Page 87

• Education problems are "Wicked"
  – Problem boundaries are ill-defined
  – No definitive solutions
  – Highly resistant to change
  – Problem and solutions depend on perspective
  – Changes are consequential

Data can only take you so far

Research on data use in school improvement

Page 88

• Use data as a platform for deeper conversations
• Define your problem well
  – Problem title and description
  – Magnitude
  – Location
  – Duration

Research on data use in school improvement

Page 89

• Part of a continuous improvement process
  – Data conversations
    • Collaborative
    • Embedded in culture
    • Structured process

Research on data use in school improvement

• Love, Nancy – Using Data to Improve Learning for All
• Lipton & Wellman – Got Data? Now What?
• NWEA Data Coaching

Page 90

Final Question

Page 91

• Think – 2 min
• Pair – 3 min
• Share – 2 min, two people

What are your biggest take-aways?

Page 92

• Materials in your Conference App

Presentation available on Slideshare.net

Last thing

Page 93

More information

Contacting me: e-mail [email protected]
Contacting us: Exhibit Booth or Liz Kaplan