Building Common Assessments APRIL 2, 2013



Session Targets

This session will address:

◦The key factors to consider when developing common assessments
◦Strategies and tools for building common assessments
◦Understanding of the rationale for and process of using common assessments

Teacher Effectiveness System in Act 82 of 2012

Weightings: Observation/Evidence 50%; School Building Data 15%; Teacher Specific Data 15%; Elective Data 20%

Observation/Evidence (Effective 2013-2014 SY)
Danielson Framework Domains: Planning and Preparation; Classroom Environment; Instruction; Professional Responsibilities

School Building Data (Effective 2013-2014 SY)
Indicators of Academic Achievement; Indicators of Closing the Achievement Gap, All Students; Indicators of Closing the Achievement Gap, Subgroups; Academic Growth PVAAS; Other Academic Indicators; Credit for Advanced Achievement

Teacher Specific Data (Effective 2016-2017 SY)
PVAAS / Growth 3-Year Rolling Average (2013-2014 SY, 2014-2015 SY, 2015-2016 SY); Other data as provided in Act 82

Elective Data/SLOs (Optional 2013-2014 SY; Effective 2014-2015 SY)
District Designed Measures and Examinations; Nationally Recognized Standardized Tests; Industry Certification Examinations; Student Projects Pursuant to Local Requirements; Student Portfolios Pursuant to Local Requirements
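Each component above feeds a single weighted rating. A minimal arithmetic sketch of how such a composite works, assuming component scores on a common 0-3 scale; the names, the scale, and the example numbers are illustrative, not the official Act 82 calculation:

```python
# Hypothetical sketch of a weighted composite, NOT the official Act 82 formula.
# Weights mirror the chart: 50/15/15/20. Scores assume a common 0-3 scale.

WEIGHTS = {
    "observation_evidence": 0.50,
    "school_building_data": 0.15,
    "teacher_specific_data": 0.15,
    "elective_data": 0.20,
}

def composite_rating(scores: dict[str, float]) -> float:
    """Weighted sum of the four component scores."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must cover 100%
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

# Illustrative component scores only.
example = {
    "observation_evidence": 2.4,
    "school_building_data": 2.0,
    "teacher_specific_data": 1.8,
    "elective_data": 2.5,
}
print(round(composite_rating(example), 2))  # 2.27
```

Because observation/evidence carries half the weight, a change there moves the composite more than the same change in any other component.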

Weightings: Observation/Evidence 50%; Building Level Data 15%; Elective Data 35%

Observation/Evidence (Effective 2013-2014 SY)
Danielson Framework Domains: Planning and Preparation; Classroom Environment; Instruction; Professional Responsibilities

Building Level Data (Effective 2013-2014 SY)
Indicators of Academic Achievement; Indicators of Closing the Achievement Gap, All Students; Indicators of Closing the Achievement Gap, Subgroups; Academic Growth PVAAS; Other Academic Indicators; Credit for Advanced Achievement

Elective Data/SLOs (Piloting 2013-2014 SY; Effective 2014-2015 SY)
District Designed Measures and Examinations; Nationally Recognized Standardized Tests; Industry Certification Examinations; Student Projects Pursuant to Local Requirements; Student Portfolios Pursuant to Local Requirements

Non-Teaching Professional Employee Effectiveness System in Act 82 of 2012 (Effective 2014-2015 SY)

Weightings: Observation/Evidence 80%; Student Performance 20%

Observation/Evidence
Danielson Framework Domains: 1. Planning and Preparation; 2. Educational Environment; 3. Delivery of Service; 4. Professional Development

Student Performance (of all students in the school building in which the non-teaching professional employee is employed)
District Designed Measures and Examinations; Nationally Recognized Standardized Tests; Industry Certification Examinations; Student Projects Pursuant to Local Requirements; Student Portfolios Pursuant to Local Requirements

Principal Effectiveness System in Act 82 of 2012 (Effective 2014-2015 SY)

Weightings: Observation/Evidence 50%; Building Level Data 15%; Correlation between Teacher PVAAS scores and Teacher Danielson rating 15%; Elective Data/SLOs 20%

Observation/Evidence
Domains: 1. Strategic/Cultural Leadership; 2. Systems Leadership; 3. Leadership for Learning; 4. Professional and Community Leadership

Building Level Data
Indicators of Academic Achievement; Indicators of Closing the Achievement Gap, All Students; Indicators of Closing the Achievement Gap, Subgroups; Academic Growth PVAAS; Other Academic Indicators; Credit for Advanced Achievement

Correlation Data
Based on Teacher Level Measures (PVAAS)

Elective Data/SLOs
District Designed Measures and Examinations; Nationally Recognized Standardized Tests; Industry Certification Examinations; Student Projects Pursuant to Local Requirements; Student Portfolios Pursuant to Local Requirements

Common Assessments in the Context of PLCs (DuFour, DuFour, Eaker & Many, 2006)

◦What is it we expect students to learn?
◦How will we know when students have learned it?
◦How will we respond when students don’t?
◦How will we respond when students do?

What Do We Mean by “Common Assessments”?

Any assessment given by two or more instructors with the intention of collaboratively examining the results for:

◦Shared learning
◦Instructional planning for individual students, and/or
◦Curriculum, instruction, and/or assessment modifications

What Constitutes Effective Classroom Assessment?

Key Points:
Assessment is information, not scores. (Scores are accountability.)
Assessment is best done early and often. (Assessment at the end is accountability.)

What Constitutes Effective Classroom Assessment?

Assessment that:
◦Provides evidence of student performance relative to content and performance standards
◦Provides teachers and students with insight into student errors and misunderstanding
◦Helps lead the teacher and/or team directly to action

What is the Difference between Multiple Choice and Constructed Response Items?

The Anatomy of the Multiple Choice Item

Stem: Why are we writing items?

A. to populate a state item bank
B. to build common assessments
C. to work with other teachers
D. to gain experience writing

The item consists of the stem, the correct answer, and three distractors.

Constructed Response Items

Constructed response items require students to provide a written response. These questions typically ask students to describe, explain, critique, or evaluate the scenario provided in the stimulus.

•Constructed response items have multiple correct responses. However, there may be specific content that is required in the response to receive full credit.
•Constructed response items can assess one or more benchmarks and range from low to high complexity.
•Constructed response items are scored using a rubric.

Algebra Keystone

                              Module 1                                       Module 2
Assessment Anchors Covered    Operations & Linear Equations & Inequalities   Linear Functions & Data Organizations
# of Eligible Content         18                                             15
# of Multiple Choice          18                                             18
# of Constructed Response     3                                              3
Multiple Choice Points        18                                             18
Constructed Response Points   12                                             12

Share of points per module: Multiple Choice 60%, Constructed Response 40%

Biology Keystone

                              Module 1                     Module 2
Assessment Anchors Covered    Cells and Cell Processes     Continuity and Unity of Life
# of Eligible Content         16                           22
# of Multiple Choice          24                           24
# of Constructed Response     3                            3
Multiple Choice Points        24                           24
Constructed Response Points   9                            9

Share of points per module: Multiple Choice 73%, Constructed Response 27%

Literature Keystone

                              Module 1              Module 2
Assessment Anchors Covered    Fiction Literature    Nonfiction Literature
# of Eligible Content         25                    31
# of Multiple Choice          17                    17
# of Constructed Response     3                     3
Multiple Choice Points        17                    17
Constructed Response Points   9                     9

Share of points per module: Multiple Choice 65%, Constructed Response 35%
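The percentage splits shown with each blueprint follow directly from the point counts; a quick sketch recomputing them from the tables above (rounded to whole percents):

```python
# Recompute each Keystone module's multiple-choice vs. constructed-response
# point split from the blueprint point counts above.
blueprints = {
    "Algebra":    {"mc_points": 18, "cr_points": 12},
    "Biology":    {"mc_points": 24, "cr_points": 9},
    "Literature": {"mc_points": 17, "cr_points": 9},
}

for exam, pts in blueprints.items():
    total = pts["mc_points"] + pts["cr_points"]
    mc_pct = round(100 * pts["mc_points"] / total)
    print(f"{exam}: MC {mc_pct}%, CR {100 - mc_pct}%")
# Algebra: MC 60%, CR 40%
# Biology: MC 73%, CR 27%
# Literature: MC 65%, CR 35%
```

Note that the split is driven by points, not item counts: Literature has far fewer multiple-choice items than Biology, so its constructed-response share is larger.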

Keystone Exam Difficulty

The Keystone Exam test questions will almost all be at Level 2 or 3 of Webb’s Depth of Knowledge, which leads us to the next question…

What Is Webb’s Depth of Knowledge?

Norman Webb

Webb’s Depth of Knowledge

Norman Webb developed a process and criteria for systematically analyzing the alignment between standards and assessments. Webb’s Depth of Knowledge Model can be used to analyze the cognitive complexity required for the student to master a standard or complete an assessment task.

Webb’s Depth of Knowledge

Cognitive complexity refers to the cognitive demand associated with a test item. The Depth of Knowledge level of the item is determined by the complexity of the mental processing that the student must use to answer the item.

Webb’s Four Levels of Complexity

Level 1 - Recall: Recall of a fact, information, or procedure.
Level 2 - Basic Application of Skill/Concept: Use of information, conceptual knowledge, procedures, two or more steps, etc.
Level 3 - Strategic Thinking: Requires reasoning, developing a plan or sequence of steps; has some complexity; more than one possible answer; generally takes less than 10 minutes to do.
Level 4 - Extended Thinking: Requires an investigation; time to think and process multiple conditions of the problem or task; and more than 10 minutes to do non-routine manipulations.

Cognitive Complexity/Verbs

Low Cognitive Complexity (Level 1): Remember, Recall, Memorize, Recognize, Translate, Rephrase, Describe, Explain, Repeat

Moderate Cognitive Complexity (Level 2): Apply, Execute, Solve, Connect, Classify, Break Down, Distinguish, Compare, Contrast

High Cognitive Complexity (Levels 3 & 4): Integrate, Extend, Combine, Design, Create, Judge, Perform, Value, Assess

Same Verb—Three Different DOK Levels

DOK 1: Describe three characteristics of metamorphic rocks. (Requires simple recall.)

DOK 2: Describe the difference between metamorphic and igneous rocks. (Requires cognitive processing to determine the differences in the two rock types.)

DOK 3: Describe a model that you might use to represent the relationships that exist within the rock cycle. (Requires deep understanding of the rock cycle and a determination of how best to represent it.)

Does Depth of Knowledge Mean Difficulty?

DOK is NOT about difficulty. Difficulty is a reference to how many students answer an item correctly.

•How many of you know the definition of exaggerate? (DOK: Low = Recall.) If all or most of you know the definition, this item is an easy one.
•How many of you know the definition of illeist? (DOK: Low = Recall.) If most of you do not know the definition, this item is a difficult one.
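The distinction above can be made concrete with the classical item difficulty index, the fraction of students answering an item correctly; a small sketch, where the response counts are made up for illustration:

```python
# Difficulty here is the classical item difficulty index (p-value):
# the proportion of students who answered the item correctly.
# Both items below are DOK Level 1 (recall); only their difficulty differs.

def item_difficulty(responses: list[bool]) -> float:
    """Fraction of correct responses; a higher value means an easier item."""
    return sum(responses) / len(responses)

# 'exaggerate' item: nearly everyone knows it -> easy item (illustrative counts)
easy_item = [True] * 27 + [False] * 3
# 'illeist' item: few know it -> difficult item, same recall-level DOK
hard_item = [True] * 4 + [False] * 26

print(item_difficulty(easy_item))  # 0.9
print(item_difficulty(hard_item))  # ~0.13
```

Both items sit at the same DOK level, yet their difficulty indices differ sharply, which is exactly the point of the slide.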

Complexity vs. Difficulty

The Depth of Knowledge levels are based on the complexity of the mental processes the student must use to find the correct answer, not the difficulty of the item!

How Does Webb’s DOK Relate to Bloom’s Taxonomy?

COGNITIVE LEVEL COMPARISON MATRIX: BLOOM AND WEBB

BLOOM Remember: define, identify, name, select, state, order (involves a one-step process)
WEBB 1.0: define, identify, name, select, state, order (involves a one-step process)

BLOOM Understand: convert, estimate, explain, express, factor, generalize, give example, identify, indicate, locate, picture graphically (involves a two-step process)
WEBB 2.0: apply, choose, compute, employ, interpret, graph, modify, operate, plot, practice, solve, use (involves a two-step process)

BLOOM Apply: apply, choose, compute, employ, interpret, graph, modify, operate, plot, practice, solve, use (involves a three-or-more-step process)

BLOOM Analyze: compare, contrast, correlate, differentiate, discriminate, examine, infer, maximize, minimize, prioritize, subdivide, test
WEBB 3.0: compare, contrast, correlate, differentiate, discriminate, examine, infer, maximize, minimize, prioritize, subdivide, test

BLOOM Evaluate: arrange, collect, construct, design, develop, formulate, organize, set up, prepare, plan, propose, create, experiment and record data
WEBB 4.0: arrange, collect, construct, design, develop, formulate, organize, set up, prepare, plan, propose, create, experiment and record data

BLOOM Create: appraise, assess, defend, estimate, evaluate, judge, predict, rate, validate, verify

Do You Have Item Writing Rules?

Item Writing Rules

www.bhspd.weebly.com

Is This Worth the Effort?

Yes, It’s Worth the Effort!

The bar is being raised!

The Common Core changes the game.

The Keystone Exams change the game.

The world our students will enter is FAR MORE COMPLEX.

We need to raise the level of our game!

Groups

GROUP A: English, Social Studies, World Languages, Physical Education/Health
Session 1: Library; Session 2: LGI

GROUP B: Mathematics, Business, Science, Fine & Practical Arts
Session 1: LGI; Session 2: Library

Your Mission

• Due on May 6, 2013
• Updated Common Final Exams
• Common Adapted ELL Final Exams
• Common Adapted Final Exams for students with special needs
• Cover Page
  • Identify course and teachers who will be administering the assessments.
  • Describe changes that were made.
  • Identify percentages of multiple choice and constructed response questions.
  • Identify percentages for each of Webb’s cognitive levels of knowledge.

Test Your Depth of Knowledge

www.bhspd.weebly.com