Developing a Formative Assessment Protocol to Measure Implementation

Christine J. Lyon, Meghan W. Brenneman, Leslie Nabors Oláh (Educational Testing Service)
Cassandra Brown (Florida State University)

CCSSO, National Harbor, MD, Friday, June 21, 2013
Copyright © 2013 by Educational Testing Service.


Cognitively Based Assessment of, for, and as Learning

• The goal is to create a future comprehensive system of assessment that:
  – documents what students have achieved ("of learning"),
  – helps identify how to plan and adjust instruction ("for learning"), and
  – is considered by students and teachers to be a worthwhile educational experience in and of itself ("as learning").
CBAL Components
Formative Assessment
Formative assessment is an ongoing process in which teachers and students use evidence gathered through formal and informal means to make inferences about student competency and, based on those inferences, take actions intended to achieve learning goals.
Types of Formative Assessment

• Informal, unplanned: used when a teachable moment unexpectedly occurs and the teacher pursues it to learn more about student thinking and understanding
• Deliberate: used at predictable points in a lesson (e.g., planned questions and tasks; planned options for reacting to an anticipated range of evidence)
• Formal, planned (embedded in the curriculum): used at specific junctures during instruction to further student learning and adjust instruction
Romeo and Juliet
Research Questions
• Do the formative assessment tasks and associated materials provide sufficient support to engage teachers in the formative assessment process?
• How can we measure engagement with and implementation of the CBAL assessment materials in a formative context?
High-Impact Classroom Assessment Practices (HI-CAP)
• A subset of the broader teacher and student practices constructs
• Focuses on the classroom assessment practices of students and teachers
• Includes assessment of, for, and as learning
Developing an Observation Protocol
Draft Protocol
Organization
Existing Narratives
Domains, Dimensions, & Indicators
Rubrics and Narratives

Narrative for a low level of implementation of the Effective Questioning dimension:

At the low level of implementation (score 1 or 2), teachers pose questions to only a subset of interested students, answer their own questions, do not provide adequate wait time, and/or use only one type of questioning method (e.g., closed-ended questions that require a one-word response). The teacher often misses opportunities to gain valuable insights into student thinking. For example, when the teacher is asked the same question repeatedly by a number of different students, the teacher does not take the opportunity to collect evidence from the entire class to determine whether a whole-class intervention is necessary. Rather, the teacher attends only to students who explicitly ask the question. Questions asked are primarily lower-order, surface-level prompts such as "Is the water going in or out of the sink?" or "Can you have negative minutes?" There is no exploration of ideas, no deep discourse, and no follow-up or higher-order probes. The teacher does not encourage student questions or elicit multiple solution strategies. Similarly, a teacher may sit quietly at her desk while students complete independent work, resulting in limited student or teacher questions. When questions are posed, the majority are closed-ended, with the teacher focused solely on hearing the right answer from a limited number of students.
A Validity Argument

• Needed to consider the technical quality of the final pilot form of HI-CAP.
• A multiple-step process:
  1. State the intended use of the instrument.
  2. Explain the assumptions and inferences: scoring, generalizability, extrapolation, implications, and decisions (Bell et al., 2012; Hill, Charalambous, & Kraft, 2012).
  3. Evaluate each assumption.
Assumptions for Scoring

• Observer Reliability
  – Do different observers apply the scoring rules in the same ways?
  – Do observers use the protocol in ways that are in line with the developers' intention?
  – Do observers apply the scoring rules consistently across teachers, content areas, and grade levels?
• Construct Validity
  – Is the protocol representative of the full construct being measured?
  – Do the scoring rubrics include levels that are representative of the full range of the construct?
Pilot Study

• 7 teachers from a public high school in the Northeast (3 math, 4 ELA)
• 65 lessons observed by two trained researchers:
  – 30 observed by two developers
  – 35 observed by one developer and one trainee
• Each researcher independently scored the HI-CAP.
Developer Reliability

Dimension            Exact   Adjacent   Exact + Adjacent
Technology           53%     40%        93%
Use of Evidence      70%     27%        97%
Student Involvement  40%     47%        87%
Student Engagement   47%     53%        100%

Trainee Reliability

Dimension            Exact   Adjacent   Exact + Adjacent
Technology           34%     43%        77%
Use of Evidence      17%     43%        60%
Student Involvement  37%     43%        80%
Student Engagement   29%     54%        83%
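The agreement figures above can be reproduced from paired lesson scores. A minimal sketch in Python, assuming (as the column sums suggest) that "adjacent" means the two scores differ by exactly one rubric point, so the third column is agreement within one point; the ratings below are hypothetical, not the study data:

```python
def agreement_rates(rater_a, rater_b):
    """Exact, adjacent, and within-one-point agreement proportions
    for two raters scoring the same lessons on an integer rubric."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must score the same set of lessons")
    n = len(rater_a)
    exact = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    adjacent = sum(abs(a - b) == 1 for a, b in zip(rater_a, rater_b)) / n
    within_one = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b)) / n
    return exact, adjacent, within_one

# Hypothetical ratings for 10 lessons on a 1-6 rubric (not the study data)
dev = [3, 4, 2, 5, 3, 4, 1, 6, 3, 2]
obs = [3, 5, 2, 5, 2, 4, 2, 6, 5, 2]
exact, adjacent, within_one = agreement_rates(dev, obs)
# exact = 0.6, adjacent = 0.3, within one point = 0.9
```

Note that exact and adjacent proportions sum to the within-one-point figure, which is why the first two columns in each table add up to the third.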
Construct Validity

Dimension               Min    Max    Mean   Standard Deviation
Preparation & Planning  1.00   6.00   3.81   1.15
Technology              N/A    6.00   2.92   1.83
Learning Targets        1.00   5.00   1.68   0.90
Content Development     1.00   6.00   3.66   1.38
Effective Questioning   1.50   5.50   3.15   1.08
Use of Evidence         1.50   5.50   3.13   1.10
Student Involvement     1.00   5.50   2.50   1.18
Student Engagement      1.50   6.00   3.50   1.00
Discussion

• The protocol developed encourages systematic examination of the implementation of formative assessment and the CBAL materials.
• Preliminary data suggest that the technical quality of the instrument is promising.
• Pilot data suggest that, even with support, there is limited implementation of some aspects of formative assessment.
Next Steps

• … based upon observation scores
• Quasi-experimental study
• Identification of differential professional development needs
Framing Questions

• How do you (in your state, district, etc.) support teachers' formative assessment of their own practice in the current evaluation context?
• How do you support teachers to improve their formative assessment practice?
• How can the work presented thus far support your efforts?