Developing Assessments for and of Deeper Learning [Day 2b-afternoon session] Santa Clara County Office of Education June 25, 2014 Karin K. Hess, Ed.D. [email protected] or [email protected]


Page 1:

Developing Assessments for and of Deeper Learning

[Day 2b-afternoon session]

Santa Clara County Office of Education June 25, 2014

Karin K. Hess, Ed.D.
[email protected] or [email protected]

Page 2:

Presentation Overview
• Clarify understandings & common misconceptions about rigor/DOK and deeper learning
• Use the Hess Validation Tools to examine sample performance tasks
• Give rubrics the chocolate chip cookie “taste test”
• Be inspired by Karin’s performance assessment coaching tips
• Plan & get feedback on future assessment activities and/or support to teachers

Page 3:

What we know (from research) about High-Quality Assessment:
• Is defined by agreed-upon standards/expectations
• Measures the individual’s learning & can take different forms/formats
• Measures the effectiveness of instruction and appropriateness of curriculum
• Is transparent:
  – Students know what is expected of them and how they will be assessed
  – Assessment criteria are clear and training is provided to educators and reviewers/raters
• Communicates information effectively to students, teachers, parents, administration, and the public at large

Page 4:

Simply put, HQ assessments have…
• Clarity of expectations
• Alignment to the intended content expectations (skills & concepts)
• Reliability of scoring and interpretation of results
• Attention to the intended rigor (tasks & scoring guides)
• Opportunities for student engagement & decision making
• Opportunities to make the assessment “fair” & unbiased for all
• Links to instruction (opportunity to learn)

Page 5:

2. The DOK Matrix & Instructional Paths

Each standard has an assigned Depth of Knowledge. The DOK determines the cognitive level of instruction.

DOK 1 – Recall and Reproduction
• (Remember) Recall, locate basic facts, definitions, details, events
• (Understand) Select appropriate words for use when intended meaning is clearly evident

DOK 2 – Skills and Concepts
• (Understand) Explain relationships; summarize; state central idea
• (Apply) Use context for word meanings; use information using text features

DOK 3 – Strategic Thinking and Reasoning
• (Understand) Explain, generalize, or connect ideas using supporting evidence (quote, text evidence)
• (Apply) Use concepts to solve non-routine problems and justify
• (Analyze) Analyze or interpret author’s craft (e.g., literary devices, viewpoint, or potential bias) to critique a text
• (Evaluate) Cite evidence and develop a logical argument for conjectures based on one text or problem

DOK 4 – Extended Thinking
• (Understand) Explain how concepts or ideas specifically relate to other content domains
• (Apply) Devise an approach among many alternatives to research a novel problem
• (Analyze) Analyze multiple sources or multiple texts; analyze complex abstract themes
• (Evaluate) Evaluate relevancy, accuracy, and completeness of information across texts or sources
• (Create) Synthesize across multiple sources/texts; articulate a new voice, theme, or perspective; develop a complex model or approach for a given situation; develop an alternative solution

Instruction & Assessment Decisions… assessment formats range from Selected Response to Constructed Response to Performance Tasks.

Page 6:

First we consider alignment…

• It’s really about validity – making decisions about the degree to which there is a “strong match” between grade-level content standards + performance and the assessment/test questions/tasks

• And about making valid inferences about learning from an assessment score

Page 7:

Alignment (validity) Questions:

• Is there a strong content match between assessment/test questions/tasks and grade level standards?

• Are the test questions/tasks (and the assessment as a whole) more rigorous, less rigorous, or of comparable rigor (DOK) to grade level performance standards?

Page 8:

Some Common Misconceptions about DOK
1. All kids can’t think deeply; or, kids don’t need scaffolding to get there.
2. Webb’s DOK model is a taxonomy (4 vs. 1).
3. Bloom verbs & levels = Webb DOK.
4. DOK is about difficulty.
5. All DOK levels can be assessed with a multiple-choice question (that’s just dumb!).
6. “Higher order” thinking = deeper learning.
7. Multi-step tasks, multiple texts, or complex texts always mean deeper thinking.

Page 9:

Basic Task Validation Protocol – Handout #2a (K. Hess, Linking Research with Practice, Module 3, 2013)

• Table groups review the technical criteria and descriptions of the Basic Validation Protocol
• Select a sample assessment task to review (Handout 2b – Writing CRM)
• Discuss what you see in terms of these criteria:
  • Purpose & use – how might you use results?
  • Clarity – is it clear what is expected?
  • Alignment – are task content + rigor/DOK appropriate for grade level and use of data?
  • Engagement – is there opportunity for student decision making?