Assessment: Because It Counts
Chip Fissel, Matt Meakin, Eric Naylor, Lori Stollar
February 28, 2010
TRANSCRIPT
Assessment
Fair Assessment is a process used by teachers and students before, during, and after instruction to provide feedback and adjust ongoing teaching and learning to improve student achievement. (PDE SAS)
Formative: classroom strategies
Summative: PSSA
Benchmark: 4Sight
Diagnostic: DRA, running records
Classroom Grading in a Standards-Based Educational System
The current use of letter grades
Tradition: percentage grades were introduced around 1900, and the A-F scale followed around 1918. (Guskey, 1994)
Record of student learning. Prerequisite for course sequences. Rank students according to performance. (O’Connor, 2007)
Understood by students, parents, and institutions of higher education. (Bailey & Guskey, 2001)
Motivate / threaten. (Guskey, 2004; Reeves, 2004; Carifio & Carey, 2009)
Activity
Discuss the handout with a partner. What is a fair grade for these students?
The Problem: One Letter = Many Factors
ACADEMIC
Summative assessments (tests, quizzes, performances, portfolios, etc.)
Formative assessments (including homework)
NON-ACADEMIC
Effort, attitude, attendance, behavior, responsibility, timeliness, compliance, participation, extra credit
“Letter or number grades inherently undermine learning.”
Kohn, 2009.
“Grades are often measures of how well a student lives up to the teacher's expectation of what a good student is rather than measuring the student's academic achievement in the subject matter objectives.”
Allen, 2005.
Suggestions from the research
Use a mathematical argument to improve letter grades based on percentages and averages. (Guskey, 2004; Reeves, 2004; O’Connor, 2007; Carifio & Carey, 2009)
Eliminate the zero on assignments and quarter grades; replace it with 50%. (Guskey, 2004; Reeves, 2004)
Use a 4-point scale, not a 100-point scale. (Reeves, 2004)
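The arithmetic behind these suggestions can be made concrete with a short sketch. The scores and letter bands below are illustrative assumptions, not figures from the presentation: on a 100-point scale the F band spans 0-59, so a single zero drags an average down far more than one failing grade should, while a 50% floor or a 4-point scale keeps each letter grade an equal-sized step.

```python
# Illustrative sketch of the grading-scale argument (Guskey, 2004;
# Reeves, 2004). All scores here are hypothetical examples.

def mean(scores):
    return sum(scores) / len(scores)

# Three A-level papers (90%) plus one missing assignment scored as zero.
with_zero = [90, 90, 90, 0]

# The same record with the zero replaced by 50%, the floor of the F band,
# so that each letter grade spans an equal 10-point interval.
with_floor = [90, 90, 90, 50]

# The same record on a 4-point scale (A = 4 ... F = 0), where an F sits
# one step below a D instead of as many as 59 points below.
four_point = [4, 4, 4, 0]

print(mean(with_zero))   # 67.5 -> reads as a failing-range average
print(mean(with_floor))  # 80.0 -> closer to the student's overall record
print(mean(four_point))  # 3.0  -> a B on the 4-point scale
```

The point of the sketch: one zero among three A papers pulls a 100-point average down by more than two full letter grades, while the same record averaged on the 4-point scale lands at a B.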
OR… standards-based grading practices: competency over compliance
Separate non-academic traits from academic performance. (Marzano, 2000; Tomlinson, 2001; Allen, 2005; Winger, 2009)
Consistent use of accurate, skill-based standards. (Marzano, 2000; Bailey & Guskey, 2001; McTighe & O’Connor, 2005; Stiggins, 2005; Stiggins et al., 2006)
Criterion-referenced feedback in addition to the letter grade. (O’Connor, 2007; Lalley & Gentile, 2009)
Allow for self-adjustment without penalty. (McTighe & O’Connor, 2005; Scriffiny, 2008)
No grade for formative work or extra credit. (O’Connor, 2007; Scriffiny, 2008)
Authentic Assessment
Authentic assessments present the student with a full array of tasks that mirror the priorities and challenges found in the best instructional activities: conducting research; writing, revising, and discussing; collaborating; debating. (Wiggins, 1990)
Components of Authentic Assessments
Involve “real-world” application.
Higher-order thinking skills: open-ended inquiry and extended thinking (analysis, investigation, problem solving, experimental inquiry, invention, induction, deduction, error analysis).
Engage students in social learning.
Student choice and control over their own learning.
Tasks that allow students to see relationships and compare/contrast.
“It’s the journey, not the destination”: process and product.
Empower students with choices.
(Davidson, 2008)
Authentic Assessment
Examples:
Performance: writing samples, journals, projects, experiments
Portfolio: process or product
Service learning
Senior/capstone projects
Oral exams
Interviews
Performances
Research studies
Reading and interpreting literature
Math problems to solve in a “real-world” context
Artwork, dance, performances
Authentic Assessment
Backward Design
Identify the outcome or results; standards should be linked to outcomes.
Determine what the evidence will entail and look like.
Plan the performance learning tasks.
Wiggins and McTighe
Performance Assessment Design Considerations
Students really understand the idea, issue, theory, or event being addressed when they can…
Provide credible theories, models, or interpretations to explain…
Critique…
Explain the value or importance of…
Critically question the commonly held view that…
Recognize the prejudice within that…
Accurately self-assess…
Make fine, subtle distinctions as…
(Wiggins, 1997)
Grading Authentic Assessments
Rubrics – How successful was the student in meeting the standards?
Start with the outcome and clearly define what students should be able to demonstrate.
Specific outcomes: knowledge and skills. Applying knowledge and skills = performance.
Examples
NASA will launch the first manned mission to Mars next month. The astronauts will have one week on the planet to conduct research. Develop an outline and prioritization for the research that the astronauts will conduct while they're on Mars. What areas do you want them to study? Why should they study those areas?
Data-Driven Assessment: classroom… school… district
Assessment Considerations
Interim (Marshall, 2008)
Common (Ainsworth, 2007; DuFour, 2005; Reeves, 2007; Schmoker, 2003)
Formative (Wiliam, 2007)
Professional Learning Communities
Collaborative inquiry (Huff, 2008; Reeves, 2007)
Reflective dialogue (Huff, 2008; Piercy, 2006)
Instructional goals (Schmoker, 2003)
(DuFour, 2005)
“The powerful collaboration that characterizes professional learning communities is a systematic process in which teachers work together to analyze and improve their classroom practice…This process, in turn, leads to higher levels of student achievement.” --DuFour, 2005
Analyzing Multiple Measures of Data
Demographic
Perceptions
Student Learning
School Processes
(Bernhardt, 1998)
Multiple Measures—Four Types of Data
Learning
Teaching
Leadership
Persuasive Data
(White, 2007)
Utilizing Data Effectively for School Improvement
1. Determine what you want to know.
2. Collect data.
3. Analyze results.
4. Set priorities and goals.
5. Develop strategies.
(Heritage & Chen, 2005)
1. Where are we now?
2. Where do we want to go?
3. How will we get there?
4. How will we know we are (getting) there?
5. How can we keep it going?
(Holcomb, 2009)
High School Exit Exams
History
Minimum competency exams, 1970s; normally pass/fail.
Reform aimed at raising academic performance. (Conroy, 2005)
Evaluate school performance. (Conroy, 2005)
Reassure the public that graduates have the basic literacy, numeracy, and other minimal skills required for employment and effective citizenship. (Grodsky, Warren, & Kalogrides, 2009)
Assessment v. Accountability
What’s the difference?
Assessment informs and drives instruction; good assessments provide immediate feedback.
Accountability is about making parties responsible for a certain outcome; it is a political term used to support spending and initiatives.
Standardized tests are often used as policy instruments, designed for political ends… a mechanism for guaranteeing the input (financial input) is being used wisely to increase output (student readiness). (Ullucci & Spencer, 2009)
Pros / Cons
Good / Bad
Increased accountability / Decreased achievement
Increased college preparedness / Increased drop-out rate
Focused curriculum / Decreased curriculum choices
Increased achievement / Reduced quality of instruction
Increased test scores / Decreased SAT/ACT scores
Boosted public confidence / Diverted resources
Bolstered value of the diploma / Diplomas denied to some
Exit exams are just challenging enough to reduce graduation rate, but not challenging enough to have measurable consequences for how much students learn or for how prepared they are for life after high school. (Warren & Grodsky, 2009)
Rationale for these tests is that they ensure that the high school diploma means something. Do they? The evidence is scant…states of course are not actively looking for evidence…political risks are too great…what would an education commissioner or governor say to the public if a study showed millions of dollars had been spent to no effect...states are not looking for “collateral damage.” Gerald Bracey, 2009
Policymakers would be well advised to reexamine the goals, structures, and outcomes of high school graduation tests. Brian Jacobs, 2001
So what is next…
Next Steps
Texas: 2009 legislation to shift from one exam to a series of end-of-course exams with two sets of standards, Passing and College Readiness. (Shen, 2009)
Many of the states that employ exit exams have developed or are developing systems that rely on multiple measures. (Perkins-Gough, 2005)
PA Keystone Exams
Keystone Exams
On January 8, 2010, following approvals by the state’s Independent Regulatory Review Commission and the Attorney General, the State Board of Education published changes to Chapter 4 that set stronger, more consistent high school graduation requirements for students, beginning with the class of 2015.
1. Successful course completion, with Keystone final exam
2. Rigorous, independently-validated local assessments
3. AP or International Baccalaureate exams
…or any combination of the above.
Six characteristics of exit exams that generate powerful incentives (Brian Jacobs, 2001):
Real consequences
Define achievement relative to an external standard
Organized by discipline and specific to course sequences
Multiple levels of achievement
Cover almost all students
Assess a major portion of what students are studying and are expected to know