Christina Kasprzak, Austin, Texas, November 2010


Christina Kasprzak
Austin, Texas

November 2010

1

Analyzing and Interpreting

Child Outcomes Data

The National Early Childhood Technical Assistance Center

Objective for the day

To share with you ideas and resources for use in training and TA that will help districts to analyze and use COSF data.

2

Agenda

• Looking at data: generally; national; state; regional
• Follow-up discussion about assessment tools
• Communicating data results
• Public reporting requirements
• Framework for a quality outcomes system

3

Recap from March

4

• Assessment (more debrief on this after lunch)
• No assessment created for this outcomes process
• Best practices on assessment = multiple data sources
• Types of assessment, including pros and cons
• Benefits of limiting assessments for COSF
• Selecting tools for the COSF process
• Activity: reviewing assessment tools and identifying strengths, weaknesses, and how each fits with the COSF process

Recap from March

• Promoting data quality: ECO training materials and activities
• COSF refresher training
• Quality review of COSF team discussion
• Involving families in the outcomes process
• Written child example
• Reviewing a COSF for quality

5

Why do a good job with COSF data?

It’s hard to change attitudes!

What motivates people? Altruism? Fear? Logic? Money?

6

Why do a good job with COSF data?

Altruism: Because you believe child and family outcomes are why you do your job!
Fear: Because you can look bad (to the state; to the public via public reporting)!
Logic: Because a program should be accountable for the results of its services!
Money: Because OMB is using the data to make decisions; federal dollars are at stake!

7

Why do a good job with COSF data?

• Today’s focus on ‘looking at data’ will give you more tools and resources for changing attitudes!

8

Looking at Data

9

Continuous Program Improvement

Plan (vision): program characteristics; child and family outcomes
Implement
Check: collect and analyze data
Reflect: Are we where we want to be?

10

Evidence

Inference

Action

Using data for program improvement = EIA

11

Evidence

• Evidence refers to the numbers, such as “45% of children in category b”
• The numbers are not debatable

12

• How do you interpret the numbers?
• What can you conclude from the numbers?
• Does the evidence mean good news? Bad news? News we can’t interpret?
• To reach an inference, we sometimes analyze data in other ways (ask for more evidence)

Inference

13

• Inference is debatable -- even reasonable people can reach different conclusions

• Stakeholders can help with putting meaning on the numbers

• Early on, the inference may be more a question of the quality of the data

Inference

14

Action

• Given the inference from the numbers, what should be done?
• Recommendations or action steps
• Action can be debatable, and often is
• Another role for stakeholders
• Again, early on the action might have to do with improving the quality of the data

15

Promoting quality data through data analysis

16

• Examine the data for inconsistencies

• If/when you find something strange, look for other data that might help explain it.

• Is the variation caused by something other than bad data?

Promoting quality data through data analysis

17

The validity of your data is questionable if…

The overall pattern in the data looks “strange”:
– Compared to what you expect
– Compared to other data
– Compared to similar states/regions/school districts

18

Let’s look at some data …

19

Remember: Part C & 619 Child Outcomes (see cheat sheet)

1. Positive social-emotional skills (including social relationships);

2. Acquisition and use of knowledge and skills (including early language/communication [and early literacy]); and

3. Use of appropriate behaviors to meet their needs

20

21

Remember: COSF 7-point scale

• 7 (Completely): Age-appropriate functioning in all or almost all everyday situations; no concerns
• 6: Age-appropriate functioning, some significant concerns
• 5 (Somewhat): Age-appropriate functioning some of the time and/or in some settings and situations
• 4: Occasional age-appropriate functioning across settings and situations; more functioning is not age-appropriate than age-appropriate
• 3 (Nearly): Not yet age-appropriate functioning; immediate foundational skills most or all of the time
• 2: Occasional use of immediate foundational skills
• 1 (Not yet): Not yet age-appropriate functioning or immediate foundational skills

COSF Ratings – Outcome 1 Entry Data (fake data)

Rating   Statewide #   Statewide %
1        300           15%
2        421           21%
3        516           25%
4        604           29%
5        101           5%
6        109           5%
7        0             0%

22

[Bar chart: Frequency on Outcome 1 – Statewide Entry Data; proportion of children (0.00–0.35) at each rating 1–7]

23

24

COSF Ratings – Outcome 1 Entry Data (fake data)

Rating   Group 1 #   Group 2 #   Group 3 #   Group 4 #
1        30          11          10          12
2        40          10          42          42
3        50          20          23          23
4        64          31          32          34
5        10          40          45          44
6        10          52          50          40
7        0           4           2           2

25

COSF Ratings – Outcome 1 Entry Data (fake data)

Rating   Group 1 %   Group 2 %   Group 3 %   Group 4 %
1        15          7           5           6
2        20          6           21          21
3        25          12          11          12
4        31          18          16          17
5        5           24          22          22
6        5           31          25          20
7        0           2           1           1

26

[Paired bar charts: Comparison of two groups – percent of children (0%–35%) at each rating 1–7]

27

Average Entry Scores on Outcomes

Group   Social-Emotional   Knowledge and Skills   Action to Meet Needs
1       4.5                4.6                    4.7
2       5.3                5.2                    4.7
3       4.9                4.9                    4.9
4       6.4                5.9                    6.6
5       5.3                4.3                    4.9
6       3.8                2.9                    3.9
Total   5.03               4.63                   4.95

Outcome 3: Appropriate Action (fake data)
Rows: entry rating; columns: exit rating.

Entry   1   2    3    4    5     6     7     Total
1       1   4    2    –    –     –     –     7
2       1   1    5    6    9     3     1     26
3       –   2    15   14   27    19    6     83
4       –   4    4    21   39    28    12    108
5       –   1    12   14   71    86    48    232
6       –   1    –    3    21    48    63    136
7       –   –    –    2    18    23    56    99
Total   2   13   38   60   185   207   186   691

Remember: Reporting Categories

Percentage of children who:

a. Did not improve functioning
b. Improved functioning, but not sufficient to move nearer to functioning comparable to same-aged peers
c. Improved functioning to a level nearer to same-aged peers but did not reach it
d. Improved functioning to reach a level comparable to same-aged peers
e. Maintained functioning at a level comparable to same-aged peers

3 outcomes x 5 “measures” = 15 numbers
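The five reporting categories can be illustrated with a small sketch. This is a hedged illustration, not the official OSEP decision rules: it assumes COSF ratings of 6 or 7 count as age-appropriate, and it uses a hypothetical `made_progress` flag to capture whether the child acquired any new skills.

```python
# Hedged sketch: one way to derive the five reporting categories (a-e)
# from entry/exit COSF ratings. Assumptions (not from the slides):
# ratings of 6-7 count as age-appropriate, and `made_progress` is a
# hypothetical flag for "acquired any new skills".

AGE_APPROPRIATE = 6

def reporting_category(entry_rating, exit_rating, made_progress):
    """Return reporting category a-e for one child on one outcome."""
    if entry_rating >= AGE_APPROPRIATE and exit_rating >= AGE_APPROPRIATE:
        return "e"  # maintained age-appropriate functioning
    if exit_rating >= AGE_APPROPRIATE:
        return "d"  # reached a level comparable to same-aged peers
    if exit_rating > entry_rating:
        return "c"  # moved nearer to same-aged peers
    if made_progress:
        return "b"  # improved, but no movement on the scale
    return "a"      # did not improve functioning

print(reporting_category(3, 6, True))  # d: entered behind, caught up
```

A child who enters at 2 and exits at 4 lands in category c; a child who enters and exits at 4 but gained new skills lands in category b.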

30

Progress Data – Outcome 2 (fake data)

OSEP Category                                       Children
e. Maintained age-appropriate trajectory            23%
d. Changed trajectory – age appropriate             15%
c. Changed trajectory – closer to age appropriate   32%
b. Same trajectory – progress                       28%
a. Flat trajectory – no progress                    2%

31

Progress Data – Outcome 2 (fake data)

OSEP Category                                       Group 1 (%)   Group 2 (%)   Group 3 (%)
e. Maintained age-appropriate trajectory            23            16            24
d. Changed trajectory – age appropriate             15            23            13
c. Changed trajectory – closer to age appropriate   32            34            37
b. Same trajectory – progress                       28            21            25
a. Flat trajectory – no progress                    2             6             1

OSEP Progress Categories for Outcome 1 (counts)

Program             a   b    c    d    e    Row total
Children’s Corner   1   1    3    1    8    14
Elite Care          1   6    2    2    6    17
Ms Mary’s           1   3    3    11   13   31
New Horizons        0   1    4    2    3    10
Oglethorpe          0   2    3    2    10   17
Column total        3   13   15   18   40   89

Progress Categories – Outcome 1 (column percents)

Program                a      b      c      d      e      Row percent of total
Children’s Corner      33%    8%     20%    6%     20%    16%
Elite Care             33%    46%    13%    11%    15%    19%
Ms Mary’s              33%    23%    20%    61%    33%    35%
New Horizons           0%     8%     27%    11%    8%     11%
Oglethorpe             0%     15%    20%    11%    25%    19%
Column percent total   100%   100%   100%   100%   100%   100%

Progress Categories – Outcome 1 (row percents)

Program                a     b     c     d     e     Row total
Children’s Corner      7%    7%    21%   7%    57%   100%
Elite Care             6%    35%   12%   12%   35%   100%
Ms Mary’s              3%    10%   10%   35%   42%   100%
New Horizons           0%    10%   40%   20%   30%   100%
Oglethorpe             0%    12%   18%   12%   59%   100%
Column percent total   3%    15%   17%   20%   45%   100%

Final results

• Using the row percents, we know that 35% of children in Ms Mary’s program closed the gap in Outcome 1.
• As a reference, we can compare this to the 20% of children across all programs who closed the gap in Outcome 1.
• Why? Is this an important difference? To answer that question we would conduct additional analysis.

36

• Do the data make sense? Am I surprised? Do I believe the data? Some of the data? All of the data?
• If the data are reasonable (or when they become reasonable), what might they tell us?

Questions to ask

37

Examining COSF data at one time point

• One group: frequency distribution (tables, graphs)
• Comparing groups: graphs, averages

38

What we’ve looked at:

Do outcomes vary by:
• Unit/district/program?
• Rating at entry?
• Amount of movement on the scale?
• Percent in the various progress categories?

39

What else might you want to look at?

Do outcomes vary by child/family variables or by service variables, e.g.:
• Services received?
• Age at entry to service?
• Type of services received?
• Family outcomes?
• Education level of parent?

40

Activity 1: Reviewing sample data

41

Small Groups

• Break into small groups of ~5
• Walk through the state example, answering questions as you go
• Whole group: share highlights of your conversations

Application

42

How could you use this type of data discussion in your training and TA?

What experiences or resources do you have with discussing outcomes data in your training and TA?

43

Summary Statements

Origin of the Summary Statements

• States reported on the OSEP Progress Categories for a few years

• States knew they would be asked to set targets

• Using the progress categories would require setting 15 targets…

44

45

• ECO prepared papers with options
• Convened stakeholders
• Extensive discussion about pros and cons of various summary statements
• See “Options and ECO Recommendations for Summary Statements for Target Setting” on the ECO web site: http://www.fpg.unc.edu/~eco/assets/pdfs/summary_of_target_setting-2.pdf

Origin of the Summary Statements

46

1. Of those children who entered the program below age expectations in each Outcome, the percent who substantially increased their rate of growth by the time they turned 6 years of age or exited the program.

2. The percent of children who were functioning within age expectations in each Outcome by the time they turned 6 years of age or exited the program.

Summary Statements

47

1. Of those children who entered the program below age expectations in each Outcome, the percent who substantially increased their rate of growth by the time they exited the program.

(c + d) / (a + b + c + d)

Summary Statements

48

• How many children changed growth trajectories during their time in the program?
• The percent of children who entered the program below age expectations who made greater than expected gains, i.e., substantially increased their rates of growth (changed their growth trajectories)

Other Ways to Think about Summary Statement 1

Summary Statements

2. The percent of children who were functioning within age expectations in each Outcome by the time they exited the program.

(d + e) / (a + b + c + d + e)
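Both summary statements reduce to simple ratios of the reporting-category counts. A minimal sketch, plugging in the fake Outcome 2 percentages from the earlier progress-data table (a=2, b=28, c=32, d=15, e=23):

```python
# Summary statements computed from reporting-category counts a-e.
# Values are the fake Outcome 2 percentages from the earlier table.

a, b, c, d, e = 2, 28, 32, 15, 23

# SS1: of children who entered below age expectations (a-d),
# the percent who substantially increased their rate of growth (c+d).
ss1 = 100 * (c + d) / (a + b + c + d)

# SS2: percent of all children exiting within age expectations (d+e).
ss2 = 100 * (d + e) / (a + b + c + d + e)

print(f"Summary Statement 1: {ss1:.0f}%")  # 61%
print(f"Summary Statement 2: {ss2:.0f}%")  # 38%
```

Note the denominators differ: SS1 excludes category e (children already at age expectations at entry), while SS2 includes all five categories.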

49

50

Other Ways to Think about Summary Statement 2

• How many children were functioning like same-aged peers when they left the program?
• The percent of children who were functioning at age expectations in this outcome area when they exited the program, including those who started out behind and caught up, and those who entered and exited at age level

51

The connection:

COSF ratings → OSEP categories → Summary Statements

52

National and Texas Data

State Approaches to Measuring Child Outcomes

Among 619 programs:
• Child Outcomes Summary Form (COSF): 36 (61%)
• Single assessment statewide: 9 (15%)
• Publishers’ online assessment systems: 6 (10%)
• Other approaches: 7 (12%)

*One state preschool program still unknown

[U.S. map, states and territories: State Approaches to Child Outcomes Measurement – 619 Programs. Early Childhood Outcomes Center, August 2010. Legend: COSF; publishers’ on-line systems; one tool statewide; other]

National Progress Data Feb 2010

55

National Summary Statement Data

56

Texas 619 Progress Data Feb 2010

57

Texas 619 Summary Statement Data

58

59

Activity 2: Texas statewide and regional data

60

Small Group Instructions:

1. Review Texas statewide data
2. Review regional data (comparing regions to one another and to the state)
3. Discuss:
   – What surprises you about the data?
   – What questions do the data raise?
   – What additional data collection or analysis would you do to dig deeper?
4. “Gallery Walk”: record your small group’s best ideas on a sheet to be posted and shared with the whole group

61

Application

How could you use this type of activity in your training and TA?

What experiences or resources do you have with discussing outcomes data in your training and TA?

Assessment Tools and COSF

62

Recap from March - Assessment

• Assessment (more debrief on this after lunch)
• No assessment created for this outcomes process
• Best practices on assessment = multiple data sources
• Types of assessment, including pros and cons
• Benefits of limiting assessments for COSF
• Selecting tools for the COSF process
• Activity: reviewing assessment tools and identifying strengths, weaknesses, and how each fits with the COSF process

63

64

Assessment considerations in reporting child outcomes data

a. No assessment developed for this purpose
b. No ‘perfect’ assessment
c. Formal assessment is one piece of information
d. Formal assessment can provide consistency across teachers/providers, programs, and the state
e. Formal assessment can ground teachers/providers in age expectations

Selecting and implementing good formal assessments is an essential component of good child outcomes measurement.

DEC recommended practices on early childhood assessment

1. Professionals and families collaborate in planning and implementing assessment.

2. Assessment is individualized and appropriate for the child and family.

3. Assessment provides useful information for intervention.

4. Professionals share information in respectful and useful ways.

5. Professionals meet legal and procedural requirements and meet recommended practice guidelines.

65

66

Types of Assessment

• Norm-referenced instrument
• Criterion-referenced instrument
• Curriculum-based instrument
• Direct observation
• Progress monitoring
• Parent or professional report

(and any combination of the above)

PROS and CONS of norm-referenced instruments

PROS
• Provides information on development in relation to others
• Already used for eligibility
• Diagnosis of developmental delay
• Standardized procedures

CONS
• Does not inform intervention
• Information removed from context of child’s routines
• Usually not developed or validated with children with disabilities
• Does not meet many recommended practice standards
• May be difficult to administer or require specialized training

67

68

PROS and CONS of criterion-referenced instruments

PROS
• Measures child’s performance of specific objectives
• Direct link between assessment and intervention
• Provides information on child’s strengths and emerging skills
• Helps teams plan and meet individual child’s needs
• Meets recommended assessment practice standards
• Measures child progress
• May be used to measure program effectiveness

CONS
• Requires agreement on criteria and standards
• Criteria must be clear and appropriate
• Usually does not show performance compared to other children
• Does not have standard administration procedures
• May not move child toward important goals
• Scores may not reflect increasing proficiency toward outcomes

69

PROS and CONS of curriculum-based instruments

PROS
• Provides link between assessment and curriculum
• Expectations based upon the curriculum and instruction
• Can be used to plan intervention
• Measures child’s current status on the curriculum
• Evaluates program effects
• Often team based
• Meets DEC and NAEYC recommended standards
• Represents a picture of the child’s performance

CONS
• May not have established reliability and validity
• May not have procedures for comparing child to a normal distribution
• Generally linked to a specific curriculum
• Sometimes comprised of milestones that may not be in order of importance

70

Benefits of limiting assessment tools used for COSF

• Ensure use of quality assessments as foundation for COSF

• Increase the consistency across individuals and programs (ensure the quality of the data)

• Reduce Cost/Resources it takes to train and support many tools

• Other benefits?

71

What types of criteria to consider in selecting tools for use with COSF

• How well does it cover the 3 outcome areas?
• How functional is the information collected about the child?
• Does the instrument allow a child to show their skills and behaviors in natural settings and situations?
• Does the instrument incorporate observation, parent input, or other sources?
• Is the instrument limited to an ideal testing situation?

72

How’s it going?

Successes? Challenges? Next steps?

73

Activity 3: Reviewing data on assessments used with COSFs

74

Small Group Instructions:

1. Review data on assessments used with COSFs
2. Discuss:
   – What do the data say? What stands out for you?
   – What might these data mean?
   – What questions do they raise?
   – What next steps might be taken?
3. Share back with the whole group

75

Application

How could you use an activity like this in your training and TA?

What experiences or resources do you have about assessment that you already use in your training and TA?

Communicating Effectively with the Media and Public about Child Outcomes Data

76

Being prepared…

• How will we talk about the child outcomes data with:
  – The media
  – State legislators
  – State agency heads
  – Families
  – Early intervention and 619 providers
  – State advisory councils
  – Other key stakeholders in your state

77

78

Being prepared means…

• Thinking ahead about how to talk about the data.
• Writing out the specific messages you want to make (an internal ‘talking points’ memo).
• Developing a 1-2 page fact sheet that summarizes the findings and your messages.
• Using public dissemination opportunities to get out key messages that will educate the public about your programs and their benefits.

79

Being prepared means thinking about…

• What audiences?
• What you want each audience to know about your program (including any recent changes in eligibility, the system, etc.)
• What you want each audience to know about the data

Being prepared means…

• Identifying key spokespersons.

• Being thoroughly familiar with your state’s data.

• Practicing your talking points with individuals who are not familiar with the program.

80

81

Crafting the messages: Set the context

• Provide the context (federal reporting).
• Use the ECO Center Q&A document to explain:
  – What the child outcomes are
  – Why we are measuring and reporting outcomes
  – The ultimate goal: to enable young children to be active and successful participants during the early childhood years and in the future, in a variety of settings: in their homes with their families, in child care, preschool or school programs, and in the community

82

Crafting the messages: Summary Statement #1

• Of those children who entered the program below age expectations in Outcome __, the percent who substantially increased their rate of growth by the time they turned 3/6 years of age or exited the program.
• Share the numbers; describe them in simple ways:
  – “Nearly two-thirds of the children made greater than expected progress while in the program.”

83

Crafting the messages: Summary Statement #2

• The percent of children who were functioning within age expectations in Outcome __ by the time they turned 3/6 years of age or exited the program.
• Share the numbers; describe them in simple ways:
  – “About half of the children were functioning like same-age peers when they left the program.”

84

Key issues in messaging the data….

• How do we look ahead and become thoroughly prepared to present and explain the child outcomes data?

85

Anticipate Questions

• What are 3 questions that different audiences may ask you about the child outcomes data? – Families– Legislators– Agency heads– State or local councils/boards– The media

86

Making the message understandable…

How do you make the message easily understandable for the public?

• Use “plain speak”
• Don’t be repetitive
• Explain how your data relate to the average person in your state
• What are you saying about how the children are doing?
• Discuss in terms of what is important to all families

87

Describe the numbers in simple ways ….

– “Nearly half the children made greater than expected progress while they were in the program.”
– “About two-thirds of the children were performing like same-age peers when they left the program.”

You can talk about more than the two Summary Statements.

88

Give YOUR interpretation of the numbers…

• “We see these data as good news…”
• “We are pleased that the data show that children in these programs are making progress between the time they enter and leave these programs…”
• “Many children are catching up with peers in the same age group…”

89

Share other key messages to educate your audiences…

• “These programs serve many different children…”
• “Some children have mild delays or problems in one area only. These are children who can ‘catch up’.”
• “Other children have more significant disabilities; some make substantial progress and others make less progress.”

90

Link messages to broader EC issues…

• Point out how the program is helping get children ready for school.

• Note that there is lots of policy attention and research about the cost effectiveness of early learning programs.

91

Think ahead about messages that might work or not work…

• What are some messages that have worked for you in the past?

• What are some messages that didn’t work so well, or were misinterpreted by the media or public or other key audiences?

92

If the data show possible problems….

• Get out in front of the data, and note the problem areas:
  – “We see large differences in the data in different regions…”
• Then offer interpretations and note that you are trying to understand such differences:
  – “We are trying to understand these variations. They may have to do with differences in the children being served or in the ways the data are being collected…”

93

Preparing a response…

• Find the main message you want to communicate
• Translate the main message into a simple statement about the data
• Use quotes to explain the meaning of the data; give an interpretation:
  – Include a quote from a state official
  – Include a quote from a program or provider
  – Include quote(s) from parent(s)

94

End any messaging by returning to the big-picture message…

“The goal of these programs is for children to be active and successful participants now and in the future”.

95

Activity 4: Prepare to answer questions from different audiences

Small Group Instructions:

1. Identify 3 key questions that different audiences may ask about child outcomes data

2. Choose one key question to focus on for creating a response.

3. Discuss how you might use data to respond to the question. What are the messages you want to send?

4. Share back with whole group

96

97

Application

How could you use the messaging materials in your training and TA?

What similar experiences or resources do you have that you already use in your training and TA?

Public Reporting

98

99

Public Reporting

• Requirements

• Timelines

• Expectations

Wrap Up Day 1

100

ECO Framework and Self Assessment

101

Purpose of the ECO Framework

• Designed to identify key components that make up a quality outcomes measurement system.
• Designed to be used by state agencies to assess progress toward full implementation of a child outcomes measurement system.

102

103

Components measured

• Purpose

• Data collection and transmission

• Analysis

• Reporting

• Using data

• Evaluation

• Cross-system coordination

104

Self-Assessment Scale

1 = No or minimal implementation
3 = Some implementation
5 = Substantial implementation
7 = Full implementation (effective, efficient)

105

Activity 5: ECO Framework and Self-Assessment

106

Small Group Instructions:

1. Break into 6 groups, each assigned a focus:
   – Data collection and transmission
   – Analysis
   – Reporting AND using data
2. Discuss and complete the self-assessment area assigned to your group.
3. Share back with the whole group:
   – How is Texas doing in this area?
   – How are regions/districts doing in this area?

107

Application

How could you use the framework and/or self assessment in your training and TA?

What similar experiences or resources do you have that you already use in your training and TA?

Needs Assessment

108

109

Keeping our eye on the prize:

High-quality services for children and families that will lead to good outcomes.

110

Find more resources at: http://www.the-eco-center.org
