Response to Intervention, Problem Solving, and the 3 Tier Model Universal Data Collection and Assessment Ruth Poage-Gaines, IASPIRE Regional Coordinator 11-16-09 Presentation Materials from Mary Miller- IASPIRE

TRANSCRIPT

  • Slide 1
  • Response to Intervention, Problem Solving, and the 3 Tier Model Universal Data Collection and Assessment Ruth Poage-Gaines, IASPIRE Regional Coordinator 11-16-09 Presentation Materials from Mary Miller- IASPIRE
  • Slide 2
  • Acknowledgements Mark Shinn and the IASPIRE North Region Coordinators (Barb Curl, Christine Martin, Madi Phillips, Ben Ditkowsky, Pam Radford, Janice Miller, Christine Malecki) D300- Carpentersville RtI Team- Mary Miller, Coordinator
  • Slide 3
  • Expected Outcomes
      – Familiarity with general assessment principles
      – An understanding of how summative assessment differs from formative assessment
      – An understanding of mastery measurement vs. general outcome measures
      – Problem identification through the referral system vs. universal screening data
      – Norm- vs. standards-based approaches to defining at-risk populations
      – Understanding Curriculum-Based Measurement
      – How to use CBMs for program accountability
  • Slide 4
  • Shift in approach from: Assessment OF Learning to Assessment FOR Learning
  • Slide 5
  • General Assessments: Types of Assessments
      – Screening: screening tests identify at-risk students according to a designated cut score
      – Formative: formative assessment is ongoing
      – Summative: summative assessment is often used at the end of major units of instruction and at year's end
      – Diagnostic: diagnostic assessments can be used for screening, or for formative or summative assessment
  • Slide 6
  • General Assessment Principles
      – All assessment should be planful: tests should be given to answer a specific question about a child's performance
      – Use summative and formative evaluation
      – Shift from what has been learned to what is being learned
      – Move the focus away from unalterable variables to alterable variables that educators can do something about
  • Slide 7
  • Variables Related to Student Achievement
      – Alterable, within student: desire to learn, motivation, strategies for learning, skills, prior content knowledge, self-efficacy
      – Alterable, external to student: quality of curriculum, quality of instruction, pedagogical knowledge, content knowledge, quality of evaluation, quality of learning environment, quality of time/content
      – Unalterable (hard to change), within student: race, genetic potential, gender/sex, birth order, disposition, health, physical differences, IQ, disability category, personal history
      – Unalterable (hard to change), external to student: family income/resources, family housing, parent education level, mobility, family members, family values, peer socioeconomic status, family history
  • Slide 8
  • Slide 9
  • Diagnostic Tests
      – Give information on specific skills that need to be taught
      – Take longer to administer and score
      – Work best when tied to the curriculum and/or targeting important skills
      – Standardized diagnostic tests are often used for determining eligibility for programming
  • Slide 10
  • General Assessment Principles: Mastery Measurement vs. General Outcome Measurement
      – Mastery measurement (i.e., summative) measures a child's mastery of a concept or curriculum presented
      – General outcome measures (i.e., CBM) are not tied to a specific curriculum and measure progress on long-term goals
  • Slide 11
  • Academic Systems and Behavioral Systems: Students
      – Tier 1: Core instructional interventions; all students, all settings; preventive, proactive (80% of students)
      – Tier 2: Targeted group interventions; some students (at-risk); high efficiency, rapid response (15% of students)
      – Tier 3: Intensive, individual interventions; individual students; assessment-based; intense, durable procedures of high intensity and longer duration (5% of students)
  • Slide 12
  • Successful 3-Tier Models Have:
      – A continuum of services and/or programs across the tiers that are scientifically based
      – Methods of identifying students at risk for academic failure and for evaluating/monitoring progress across the tiers, ideally methods considered scientifically based
      – Efficient, COMMON methods of communicating student performance for all disciplines (i.e., progress monitoring)
  • Slide 13
  • "If I had 1 hour to save the world, I would use 50 minutes to define the problem." (Albert Einstein)
  • Slide 14
  • A Problem Defined
      – At Tier 3: the difference between an individual student's performance and a criterion of success in a curriculum area.
      – At Tier 2: the difference between at-risk students' performance and a criterion of success in a curriculum area.
      – At Tier 1: the difference between how many students are proficient on their accountability assessments and 100%. The desired state is for all students to be proficient. (NASDSE, 2006)
  • Slide 15
  • Identifying Student Need 1: Universal Screening
      – Advantages: prevention- and intervention-focused; doesn't place sole reliance on teachers to refer a student; all students are placed into programs based on educational need at the beginning of the year
      – Disadvantages: requires proactive programmatic planning; universal screening data may not be accurate; requires a systems commitment to universal screening and progress monitoring
  • Slide 16
  • Identifying Student Need 2: Referral-Driven
      – Advantages: consistent with a long history of educational practice; capitalizes on teachers seeing the whole child; doesn't require a systems commitment to universal screening and progress monitoring
      – Disadvantages: some teachers refer and some don't, some under-refer and some over-refer; potential biases
  • Slide 17
  • Schools Use Specific Tools for Specific Assessment Purposes
      – Screening: reliable, valid, low cost, accurate, production-type responses, sensitive to between-persons differences (e.g., CBM family members, MAP)
      – Diagnostic: lots of items, production-type responses (e.g., Curriculum-Based Evaluation, informal tests, MAP, DRA-2)
      – Progress monitoring: reliable, valid, low cost, accurate, production-type responses, REPEATABLE, sensitive to within-persons differences (e.g., CBM family members)
      – Program evaluation: linked to important outcomes (e.g., MAP, ISAT)
  • Slide 18
  • Universal Screening
      – The basic question in a screening measure is whether or not the student should be judged as at risk.
      – For a screening measure to be useful, it should satisfy three criteria: accurately identify students who require further assessment; be practical; make efficient use of resources.
  • Slide 19
  • Universal Screening Practices: Universal Screening and Benchmarking
      – Data are collected at the beginning of a school year.
      – The school leadership team decides whether to use a norm- or standards-based discrepancy for identifying problems.
      – Teams use the data to make decisions about potential problems.
      – Programs and resources are allocated to each of the 3 tiers based on the data.
  • Slide 20
  • Use Benchmark for Universal Screening: 2 Approaches to Identifying Students
      – 1. Norm-based approaches to identify the most needy students
      – 2. Standards-based approaches to identify intensity of programs and progress-monitoring frequency
  • Slide 21
  • Methods of Measuring Performance Discrepancies
      – Norm-based approaches: percentile-rank cut scores; discrepancy ratios (Tiers 2 and 3)
      – Standards-based approaches: Illinois AIMSweb standards (cut scores for the ISAT and Minnesota state test); Oregon DIBELS standards (cut scores for the Oregon state test)
  • Slide 22
  • Examples of Percentile Rank Norms
  • Slide 23
  • Discrepancy Ratio
      – Compute by dividing the peer median by the target student's median: 90 ÷ 30 = a discrepancy of 3x ("will need problem solving").
      – Quantifies how many times the student's current level of performance varies from that of his/her peers.
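The discrepancy-ratio computation above can be sketched in a few lines of Python. This is a minimal illustration; the function name and the score lists are made up for the example, not taken from the presentation:

```python
from statistics import median

def discrepancy_ratio(peer_scores, student_scores):
    """Quantify how many times the student's median performance
    differs from the peer median (peer median / student median)."""
    return median(peer_scores) / median(student_scores)

# Example matching the slide: peer median 90 WRC, student median 30 WRC.
ratio = discrepancy_ratio([88, 90, 92], [28, 30, 31])
print(f"Discrepancy of {ratio:.0f}x")  # Discrepancy of 3x
```

Using the median of several probes, rather than a single score, keeps one unusually good or bad sample from distorting the ratio.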
  • Slide 24
  • Norm-Based Criteria: 2nd Grade Discrepancy, Tier 1. At Tier 1, 62% of 2nd-grade students have met the expected criterion (55 WRC), compared to 80% nationally.
  • Slide 25
  • Standards-Based Approaches
      – Illinois AIMSweb standards, tied to the ISAT and Minnesota state test
      – Oregon DIBELS standards
  • Slide 26
  • General Outcome Measures from Other Fields
      – Medicine measures height, weight, temperature, and/or blood pressure
      – The Bureau of Labor Statistics measures the Consumer Price Index
      – Wall Street measures the Dow Jones Industrial Average
      – McDonald's measures how many hamburgers it sells
  • Slide 27
  • Understanding General Outcome Measures (GOMs), from Mark Shinn, Ph.D. & Michelle Shinn, Ph.D.
      – Measure important outcomes
      – Measure general skill rather than individual subskills
      – Contain a large pool of items
      – Are measurable and observable
      – Are sensitive to growth over relatively short periods of time
      – Are valid and reliable
  • Slide 28
  • What is Curriculum-Based Measurement? Education has its own set of indicators of general basic skill success (general outcome measures). Curriculum-Based Measurement allows us to make important statements about our students' reading, spelling, written expression, and mathematics computation skills.
  • Slide 29
  • AIMSweb
      – Web-based data management system
      – Organizes data
      – Informs the teaching and learning process by providing continuous student performance data
      – Reports improvements to students, parents, teachers, and administrators
      – Assessment data and interventions are closely linked
  • Slide 30
  • AIMSweb CBM Assessments
      – Oral Reading Fluency (R-CBM): a standardized 1-minute sample of oral reading where the number of words read correctly is counted. (Grades 1-8)
      – Reading (Maze-CBM): a multiple-choice cloze task that students complete while reading silently. The first sentence of a 150-400 word passage is left intact; thereafter, every 7th word is replaced with three words inside parentheses. (Grades 1-8)
      – Phonics and Phonological Awareness (Early Literacy Measures): a standardized sample of fluency in initial sound identification, letter naming, and phonemic segmentation. (Grades K-1)
      – Math Computation (M-CBM): a standardized 2-4 minute completion of computational problems where the number of correct digits is counted. (Grades 1-8)
      – **May be used at the high school level to identify at-risk students and for progress monitoring
  • Slide 31
  • AIMSweb CBM Assessments (continued)
      – Early Numeracy (EN-CBM): a standardized sample of skills in oral counting, identifying missing numbers, number identification, and quantity discrimination. (Grades K-1)
      – Spelling (S-CBM): a standardized 2-minute spelling word dictation where the number of words spelled correctly or the number of correct letter sequences is counted. (Grades 1-8)
      – Written Expression (WE-CBM): a standardized 2-4 minutes of writing after being provided a story starter, where the total number of words written or the number of correct word sequences is counted. (Grades 1-8)
      – MIDE (Spanish Early Literacy): a standardized sample of letter-naming fluency, letter-sound fluency, syllable segmentation, syllable-reading fluency, syllable and word spelling, and oral reading fluency. These measures require students to produce information in one minute, except syllable and word spelling, in which prompts are given every 20 seconds for two minutes.
  • Slide 32
  • What Does R-CBM Measure? Phonemic awareness, alphabetic understanding, fluency, vocabulary, and comprehension: all of these skills, i.e., general reading ability.
  • Slide 33
  • Evaluating Core Reading Programs (http://www.nationalreadingpanel.org/): a diagram mapping the five National Reading Panel components (phonemic awareness, phonics, fluency, vocabulary, comprehension) to reading assessments such as R-CBM, DIBELS/ISEL, running records, ITBS, IRI, and Gates.
  • Slide 34
  • Reading Comprehension (we refer to it as general reading skills) draws on:
      – Fluency*: prosody, automaticity/rate, accuracy, decoding, phonemic awareness
      – Language: oral language skills, knowledge of language structures, vocabulary
      – Knowledge: cultural influences, life experience, content knowledge, activation of prior knowledge, knowledge about texts
      – Metacognition: motivation & engagement, active reading strategies, monitoring strategies, fix-up strategies
      – *Modified slightly from presentations by Joe Torgeson, Ph.D., Co-Director, Florida Center for Reading Research; www.fcrr.org
  • Slide 35
  • Student Scores: Correct Words per Minute. A box plot draws a box around the range of student scores (here 43-169), marking the 10th, 25th, 50th, 75th, and 90th percentiles; individual scores above the 90th or below the 10th percentile fall outside the whiskers.
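The percentile cut points behind such a box plot can be computed directly. A minimal sketch with hypothetical words-read-correctly scores (the data are invented for illustration):

```python
from statistics import quantiles

# Hypothetical winter benchmark scores (words read correctly per minute).
scores = [43, 55, 61, 70, 78, 85, 92, 101, 110, 124, 139, 150, 169]

# quantiles(..., n=100) returns the 1st..99th percentile cut points;
# the "inclusive" method treats the data as the whole population.
pct = quantiles(scores, n=100, method="inclusive")
box = {p: pct[p - 1] for p in (10, 25, 50, 75, 90)}
print(box)  # the five lines drawn on the box plot
```

Scores falling above `box[90]` or below `box[10]` would be plotted as individual points beyond the whiskers.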
  • Slide 36
  • Progress Monitoring General Education Benchmark Assessment
  • Slide 37
  • Schools Use CBM in Universal Screening Instead of Referral-Driven Practices: students scoring below the 25th percentile are Tier 2 candidates.
  • Slide 38
  • Strategic Monitoring of At Risk
  • Slide 39
  • Frequent Monitoring toward Individualized Goals
  • Slide 40
  • Local Assessments Correlated with Accountability Assessments
      – Collect a large sample of scores from local assessments (e.g., R-CBM) and correlate them with passing scores on accountability tests (e.g., ISAT) over time.
      – You need AIMSweb or a statistician to calculate the correlations.
      – Correlating the test scores determines what minimum score is needed on the local assessment to pass the state accountability measures.
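One simple way to derive such a minimum score is to pick the local-assessment cut that best separates students who passed the accountability test from those who did not. This is only an illustrative sketch with invented data; AIMSweb's actual linking method is more sophisticated:

```python
# Hypothetical pairs: (spring R-CBM score, passed ISAT?).
data = [(45, False), (60, False), (72, False), (88, True), (95, True),
        (104, True), (81, False), (110, True), (99, True), (70, False)]

def best_cut_score(pairs):
    """Pick the local-assessment cut (from the observed scores) that
    maximizes classification accuracy against the state-test outcome."""
    cut, _hits = max(
        ((c, sum((score >= c) == passed for score, passed in pairs))
         for c, _ in pairs),
        key=lambda t: t[1],
    )
    return cut

print(best_cut_score(data))  # minimum R-CBM score predicting a pass
```

With a large multi-year sample, the resulting cut score lets a school flag, from a 1-minute R-CBM probe, which students are unlikely to pass the state test in time to intervene.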
  • Slide 41
  • Advantages of Using CBM for Accountability Assessments
      – Measures are simple and easy to administer
      – Measures are reliable and valid
      – Training is quick
      – The entire student body can be measured efficiently and frequently
      – Routine testing allows schools to track progress during the school year
  • Slide 42
  • Slide 43
  • What Assessment Systems Does Your School Use for Each Purpose? For each of Reading, Math, and Behavior, identify the tools used for the essential components: Screening (Problem Identification), Diagnostic (Problem Analysis), Progress Monitoring (Plan Development and Implementation), and Outcome/Accountability.
  • Slide 44
  • Let's Review: General Assessment Principles
      – Summative vs. formative assessment
      – Mastery measurement vs. general outcome measures
      – Problem identification through the referral system vs. universal screening data
      – Norm- vs. standards-based approaches
      – Understanding Curriculum-Based Measurement
      – How to read a box plot
      – CBMs for program accountability
  • Slide 45
  • "It is better to know some of the questions than all of the answers." (James Thurber)
  • Slide 46
  • Thank you! Questions? Comments? For further information, contact: [email protected]. Have a great Thanksgiving!