LISD Data Camp June and August, 2011 LISD TECH Center


Page 1: LISD Data Camp June and August, 2011 LISD TECH Center

LISD Data Camp

June and August, 2011

LISD TECH Center

Page 2

Welcome

• Necessary Forms
  – SB-CEUs
  – SHU Credit

Page 3

Session 1 Essential Questions

• Many Meanings of Multiple Measures
  – Why use multiple measures of data?
  – Which ways will we use multiple measures in our school improvement process?
  – What data sources will we use to make decisions about student achievement?

Page 4

Session 1 Outcomes

• Identify multiple measures of data

• Align data with their school improvement goals

• Develop an action plan for engaging staff in analyzing multiple measures

Page 5

School Improvement Process

Page 6

WE MUST UTILIZE AN INQUIRY APPROACH TO DATA ANALYSIS

WE MUST USE MULTIPLE SOURCES OF DATA

We need a data warehouse for our 21st century schools

WE MUST FOCUS ON DATA TO INCREASE STUDENT ACHIEVEMENT

Talking Points for the Purpose of Implementing a Data Warehouse in Lenawee Schools

Page 7

Norms for Our Work

• Participate actively

• Actively listen

• Seek application

• Press for clarification

• Honor time agreements and confidentiality

• Keep ‘side bars’ to a minimum and on topic

• Take care of adult learning needs

Page 8

FERPA/HIPAA Pre-Test

To be considered an “education record,” information must be maintained in the student’s cumulative or permanent folder.

• False, because any record that contains a student’s name is an education record.

Page 9

FERPA/HIPAA Pre-Test

FERPA grants parents the right to have a copy of any education record.

• True

Page 10

FERPA/HIPAA Pre-Test

You are in charge of a staff meeting to study student achievement on school improvement goals. As part of the meeting, you display to the entire staff a report of student scores on a common local assessment. The report shows student names, and you have also given the staff a paper copy of the report.

It is a violation of FERPA to display the results of the assessment to the entire staff.

The exception would be a group of teachers working on specific student strategies, as they are a specific population that has a “legitimate educational interest” in the information.

Page 11

Data Roles

• What roles will each member of your team play in today’s work?
  – Identify roles
  – Describe responsibilities
  – Hold each other accountable

Page 12

The Many Meanings of “Multiple Measures”

Susan Brookhart

Educational Leadership, Volume 67, Number 3

ASCD, November 2009, pp. 6-12

Page 13

Would you choose a house using one measure alone?

Page 14

Guiding Principle for Multiple Measures

• Know your purpose!

–What do you need to know?

–Why do you need to know it?

Page 15

Purposes of Assessments

• Assessment for learning
  – formative (monitors student progress during instruction)
  – placement (given before instruction to gather information on where to start)
  – diagnostic (helps find the underlying causes for learning problems)
  – interim (monitors student proficiency on learning targets)

• Assessment of learning
  – summative (the final task at the end of a unit, a course, or a semester)

Sources: Stiggins, Richard J., Arter, Judith A., Chappuis, Jan, and Chappuis, Stephen. Classroom Assessment for Student Learning. Assessment Training Institute, Inc., Portland, Oregon, 2004. Bravmann, S. L., “P-I Focus: One test doesn’t fit all”, Seattle Post-Intelligencer, May 2, 2004. Marshall, K. (2006). “Interim Assessments: Keys to Successful Implementation”. New York: New Leaders for New Schools.

Page 16

Why use multiple measures for decisions in education?

• Construct validity – the degree to which a score can convey meaningful information about an attribute it measures

• Decision validity – the degree to which several relevant types of information can inform decision-making

Page 17

Multiple Measures

• Measures of different constructs

• Different measures of the same construct

• Multiple opportunities to pass the same test

Page 18

Using Multiple Measures for Educational Decisions

Conjunctive Approach (all measures count)

Compensatory Approach (high performance on one measure can compensate for lower performance on another measure)

Complementary Approach (high performance on any measure counts)

Page 19

Examples

• NCLB accountability is conjunctive (i.e., the aggregate and all subgroups must reach the threshold to make AYP)

• Most classroom grading policies are compensatory (i.e., averages, percentages)

• Getting a driver’s license is complementary (i.e., passing one of the requirements when you want)
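The three approaches can be sketched as simple decision rules. The cut score, measure names, and student scores below are invented for illustration; they are not taken from the slides.

```python
# Sketch of the three multiple-measures decision approaches, using a
# hypothetical cut score of 70 and made-up measure names.

def conjunctive(scores, cut):
    # Conjunctive: every measure must meet the cut score.
    return all(s >= cut for s in scores.values())

def compensatory(scores, cut):
    # Compensatory: a high score can offset a low one, so decide on the average.
    return sum(scores.values()) / len(scores) >= cut

def complementary(scores, cut):
    # Complementary: meeting the cut on any one measure is enough.
    return any(s >= cut for s in scores.values())

student = {"state_test": 82, "common_assessment": 64, "unit_test": 75}

print(conjunctive(student, 70))    # False: one measure is below 70
print(compensatory(student, 70))   # True: the average is about 73.7
print(complementary(student, 70))  # True: at least one measure is 70 or higher
```

The same scores can pass or fail depending on the approach chosen, which is why the slides stress knowing your purpose before picking a decision rule.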

Page 20

Using Multiple Measures for Educational Decisions

Conjunctive Approach (all measures count)

Compensatory Approach (high performance on one measure can compensate for lower performance on another measure)

Complementary Approach (high performance on any measure counts)

Measures of different constructs

Different measures of the same construct

Multiple opportunities to pass the same test

Page 21

Examples

• MEAP measures different constructs in mathematics (i.e., measurement, numbers and operations, geometry, algebra, probability)

• Retelling, constructed responses, and cloze tasks are different measures of the same construct (comprehension)

• Some students utilize multiple opportunities to take the ACT (i.e., scholarships, NCAA eligibility)

Page 22

Using Multiple Measures for Educational Decisions

• Measures of different constructs
  – Conjunctive (all measures count): school accreditation ratings based upon student achievement meeting identified targets in Reading, Math, Science, and Social Studies
  – Compensatory (high performance on one measure can compensate for lower performance on another): an outside agency identifies the “best schools” by computing an index of weighted scores
  – Complementary (high performance on any measure counts): AYP “Safe Harbor,” reached when the percentage of students who scored below proficiency decreases by ten percentage points from the previous year

• Different measures of the same construct
  – Conjunctive: students have to pass a reading comprehension test on two stories at the same reading level before being allowed to read stories at the next higher reading level
  – Compensatory: teachers determine standards-based grades in a course using scores on multiple assessments measuring the same GLCE or HSCE
  – Complementary: teachers allow student choice on assessment tasks to demonstrate their understanding of the learning targets for a unit

• Multiple opportunities to pass the same test
  – Conjunctive: students meeting all requirements will graduate after passing an exit exam, no matter how many opportunities it takes
  – Compensatory: teachers allow students to retake a unit test to demonstrate mastery of the unit’s outcomes
  – Complementary: students must pass one mathematics test in order to graduate; students can choose the state test or an end-of-course exam in either Algebra I or Geometry

Page 23

Suggestions for Using Multiple Measures for Decision Making

• Classroom assessments linked to the same construct to determine mastery

• Granting credit for graduation requirements

• Teacher evaluations

Page 24

Questions?

Stan Masters
Coordinator of Instructional Data Services
Lenawee Intermediate School District
4107 N. Adrian Highway
Adrian, Michigan 49921

Phone: 517-265-1606
Email: [email protected]
ID: stan.masters

Data Warehouse webpage: www.lisd.us/links/data

Page 25

LISD Data Camp

June and August, 2011

LISD TECH Center

Page 26

Session 2 Essential Questions

• DataDirector Functions
  – What are the important aspects of various DataDirector functions?
  – What decisions must be made in sharing DataDirector products with others?
  – How will we build our 2011-2012 assessment calendar?

Page 27

Session 2 Outcomes

• Describe the functions of DataDirector tabs

• Identify the permissions in sharing a DataDirector product

• Create an assessment calendar for the 2011-2012 school year

Page 28

School Improvement Process

Page 29

Norms for Our Work

• Participate actively

• Actively listen

• Seek application

• Press for clarification

• Honor time agreements and confidentiality

• Keep ‘side bars’ to a minimum and on topic

• Take care of adult learning needs

Page 30

Data Roles

• What roles will each member of your team play in today’s work?
  – Identify roles
  – Describe responsibilities
  – Hold each other accountable

Page 31

How do you develop a monitoring plan?

• Identify specific learning indicators

• Create data collection templates

• Schedule the assessment calendar
  – collaborative collection and analysis

Source: “Developing a Monitoring Plan”. Maryland Department of Education. Accessed May 25, 2010 from http://mdk12.org/data/progress/developing.html

Video Source: Reeves, D. (2009). “Planning for the New Year”. Accessed May 25, 2010 from http://www.leadandlearn.com/webinars

Page 32

Assessment Calendars

Page 33

Time Elements of an Assessment Calendar

Source: White, S. H. (2005). “Beyond the Numbers: Making Data Work for Teachers and School Leaders”. Lead and Learn Press: Englewood, CO

• When will we administer the assessment?
• When will we collect the data?
• When will we disaggregate the data?
• When will we analyze the data?
• When will we reflect upon the data?
• When will we make recommendations?
• When will we make the decisions about the recommendations?
• When will we provide written documentation about the decisions?
• When will we share the data with other stakeholders?
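One way to capture the “when” questions above is a single calendar row per assessment. This is a minimal sketch; the assessment name and dates are invented examples, not from the slides.

```python
# A minimal sketch of one assessment-calendar row answering the "when"
# questions above. The assessment name and all dates are hypothetical.
from datetime import date

calendar_row = {
    "assessment": "Fall common assessment",  # invented example name
    "administer": date(2011, 10, 3),
    "collect": date(2011, 10, 7),
    "disaggregate": date(2011, 10, 10),
    "analyze": date(2011, 10, 14),
    "reflect": date(2011, 10, 17),
    "recommend": date(2011, 10, 21),
    "decide": date(2011, 10, 28),
    "document": date(2011, 11, 4),
    "share": date(2011, 11, 11),
}

# Sanity check: the steps should occur in chronological order.
dates = [v for v in calendar_row.values() if isinstance(v, date)]
assert dates == sorted(dates)
```

A team would build one such row for each assessment on the calendar, which makes the collaborative collection and analysis schedule explicit up front.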

Page 34

Components of an Assessment Calendar

Source: White, S. H. (2005). “Beyond the Numbers: Making Data Work for Teachers and School Leaders”. Lead and Learn Press: Englewood, CO

• Norm-referenced tests
• State assessments
• Criterion-referenced tests
• Writing assessments
• End-of-course assessments
• Common assessments
• Performance assessments
• Unit tests
• Other

Page 35

Questions?

Stan Masters
Coordinator of Instructional Data Services
Lenawee Intermediate School District
4107 N. Adrian Highway
Adrian, Michigan 49921

Phone: 517-265-1606
Email: [email protected]
ID: stan.masters

Data Warehouse webpage: www.lisd.us/links/data