TRANSCRIPT
Curriculum-Based Measurement and General Outcome Measurement (GOM) and Mastery Monitoring (MM)
Mark R. Shinn, Ph.D.
Professor and School Psychology Program
National Louis University, Skokie, IL
http://markshinn.org
November 29th, 2012
1 of 6 Members of the Technical Review Panel, National Center for Student Progress Monitoring, USDE/OSEP, 2003-2007
Editor and Contributor to 2 Major Texts on CBM
Author of More than 75 Refereed Journal Articles and Book Chapters on the Topic of CBM, Progress Monitoring, and Screening
My Area of Expertise
Mark R. Shinn, Ph.D. serves as a paid consultant for Pearson Assessment for their AIMSweb product, which provides CBM assessment materials and organizes and reports the information from 3 tiers, including RTI. He provides technical support and training.
Mark R. Shinn, Ph.D. Serves as a Consultant for Cambium/Voyager/Sopris for their Vmath product, a remedial mathematics intervention but has no financial interests. He helped them develop their progress monitoring system.
Mark R. Shinn, Ph.D. serves as a consultant for McGraw-Hill Publishing for their Jamestown Reading Navigator (JRN) product and receives royalties. He helped them develop their progress monitoring system.
Mark R. Shinn, Ph.D. Serves as a Member of the National Advisory Board for the CORE (Consortium on Reaching Excellence) and receives a stipend for participation. He provides training and product development advice.
Disclosure
Background Reading on CBM and Decision Making In Multi-Tiered Model/RtI
Espin, C.A., McMaster, K., Rose, S., & Wayman, M. (Eds.). (2012). A measure of success: The influence of Curriculum-Based Measurement on education. Minneapolis, MN: University of Minnesota Press.
Available in
•pdf format
•iBook format
Presentation is Based on the Following White Paper
A “glossy” and official Pearson version will be finished soon and sent to you.
Shinn, M.R. (2012). Measuring general outcomes: A critical component in scientific and practical progress monitoring practices. Minneapolis, MN: Pearson Assessment.
References on CBM, GOM, and MM
Deno, S.L. (1986). Formative evaluation of individual student programs: A new role for school psychologists. School Psychology Review, 15, 358-374.
Espin, C.A., McMaster, K., Rose, S., & Wayman, M. (Eds.). (2012). A measure of success: The influence of Curriculum-Based Measurement on education. Minneapolis, MN: University of Minnesota Press.
Fuchs, L.S., & Deno, S.L. (1991). Paradigmatic distinctions between instructionally relevant measurement models. Exceptional Children, 57, 488-500.
Fuchs, L.S., & Fuchs, D. (1999). Monitoring student progress toward the development of reading competence: A review of three forms of classroom-based assessment. School Psychology Review, 28, 659-671.
Jenkins, J.R., & Fuchs, L.S. (2012). Curriculum-Based Measurement: The paradigm, history, and legacy. In C. A. Espin, K. McMaster, S. Rose & M. Wayman (Eds.), A measure of success: The influence of Curriculum-Based Measurement on education (pp. 7-23). Minneapolis, MN: University of Minnesota Press.
Shinn, M.R. (2012). Measuring general outcomes: A critical component in scientific and practical progress monitoring practices. Minneapolis, MN: Pearson Assessment.
Accessing Reading Materials
markshinn.org
1. Click on the Downloads for Professionals Icon
2. Click on the Presentations and Handouts Folder
3. Click on the AIMSweb GOM and MM Webinar (Sponsored by Pearson) Folder
A Personal Story: Approaching 60, I Needed to Get Healthier
What Could I Measure to Gauge the Effects of My Efforts?
I wanted to measure something important.
I wanted it to be easy to do and not take a lot of time and $$.
I wanted it to be easy for me to understand, as well as for my wife and kids.
The Answer Was Obvious
This is General Outcome Measurement:
Testing something “small” to make statements about something “big” (important)!
There Were Other Things I Could Measure
• Daily Calorie Targets
• Calories per Item Consumed
• Minutes of Daily Exercise
• Estimated Calories Burned from Exercise
• Inches Around Waist
• Miles per Day and Per Week of Bike Riding
• Average Biking MPH
• Average Cadence While Riding
• Energy Watts Generated
These Things ALSO Were Important, But More Difficult to Measure, to Compare, and “Put Together” for a Picture of Progress
This is Mastery Monitoring
Big Ideas
1. Educators typically have lots of opinions about assessment, and progress monitoring is no exception. However, few of us have sufficient training in assessment in general and progress monitoring in particular.
2. (Yet) Frequent progress monitoring is one of the most powerful tools in educators’ intervention toolbox and the single most powerful teaching variable that they can control!
3. There are two “families” of Progress Monitoring tools,
1. General Outcome Measurement (GOM) and
2. Mastery Monitoring (MM)
4. GOM assesses progress on a standard and equivalent measure the same way over time. It answers the question “Is the student becoming a better reader?” or “Is the student better at mathematics computation?” It is associated with gains in “important” outcomes or “big things.”
5. MM assesses progress on ever-changing and different tests aligned with short-term instructional objectives or units. It answers the question “Did the student learn what I taught today (or this week)?” It is associated with instructional validity.
6. Most Curriculum-Based Measurement (CBM) tests are associated with GOM.
7. The ideal progress monitoring system is a combination of GOM and MM.
My Assessment Training
Schools Are Looking for Swiss Army Knife of Tests
Tests that Can...
• Do EVERYTHING
• With Little to No Teacher Time
• Little Hassle
The Emphasis is On Program Evaluation, Accountability,
Perhaps Screening, But Quality PM is Not Their Strength!
Frequent Progress Monitoring (of a Particular Type) is One of Our Most Powerful Intervention Tools
• ...effective across student age, treatment duration, frequency of measurement, and special needs status
• Major message is for teachers to pay attention to the formative effects of their teaching as it is this attribute of seeking (my emphasis) formative evaluation...that makes for excellence in teaching (p. 181)
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York, NY: Routledge.
Frequent (Formative) Progress Monitoring
And the Number 1 Most Powerful TEACHING Variable
Some Basic Vocabulary to Support Understanding
General Outcome Measurement
GOM assesses progress on a standard and equivalent measure the same way over time.
Think: Testing “small” to make statements about something “big” (i.e., very important)!
Also often referred to as Long-Term Measurement (LTM)
Other Professions Are Highly Dependent on GOM
Medicine
Blood Pressure
Blood Glucose Levels
Business
Earnings per Share
Economy
Consumer Price Index
Unemployment Rate
The Key Concept is An Empirically Validated “Indicator”
Curriculum-Based Measurement
Short, standardized basic skills measures validated as general outcomes measures (GOM).
General reading skill or ability:
R-CBM: Oral reading
Maze: Silent reading
General mathematics skill or ability:
M-COMP: General mathematics computation skills
M-CAP: General math concepts and application skills
General writing skill or ability:
WE-CBM: General written expression skills
General spelling skill or ability:
S-CBM: General spelling skills
A Reading General Outcome: A “Rich Task” Consistent with CCSS
It was a pretty good composition. I felt proud knowing it was the best one at my school. After I’d read it five times, I was impatient to start reading it out loud. I followed the book’s directions again. First I read the composition out loud without trying to sound impressive, just to hear what the words sounded like.
Billy, 4th Grader
Questions I Can Answer
At a Single Point in Time:
Is This Student a Good or Poor Reader, gauged normatively or with standards?
Questions I Can Answer
Over Time:
Is This Student Improving in His General Reading Skill?
The Judgment is Empirical
Shinn, M.R., Good, R.H., Knutson, N., Tilly, W.D., & Collins, V. (1992). Curriculum-Based reading fluency: A confirmatory analysis of its relation to reading. School Psychology Review, 21(3), 458-478.
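The “over time” judgment behind GOM is typically made by fitting a trend line to repeated scores on the same measure and comparing the observed rate of improvement to an expected rate. A minimal sketch of that arithmetic is below; the weekly R-CBM scores (words read correctly per minute) and the goal rate of 1.5 words per week are hypothetical illustration values, not norms.

```python
# Sketch: gauging growth on a General Outcome Measure by fitting an
# ordinary least-squares trend line to repeated scores on equivalent
# probes. Scores and the goal rate are hypothetical.

def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

weeks = [1, 2, 3, 4, 5, 6, 7, 8]
wrc = [42, 45, 44, 49, 51, 50, 55, 57]   # words read correctly per minute

rate = slope(weeks, wrc)                  # observed rate of improvement
goal_rate = 1.5                           # hypothetical expected gain/week

print(f"Rate of improvement: {rate:.2f} words/week")
print("On track" if rate >= goal_rate else "Consider a program change")
```

Because every probe is an equivalent form of the same “big” task, the slope is directly interpretable as growth in general reading skill, which is what makes the judgment empirical rather than impressionistic.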
Points of Confusion with GOM
1. Short Tests can’t tell you anything.
Tell that to your physician. They are short to reduce the amount of instructional time lost to testing.
2. Because something “little” is tested, this “thing” becomes the specific instructional target.
It is not “oral reading fluency.”
To Move the R-CBM “Dial,” We Need to Target a Variety of Reading Skills, Not Just Reading Speed
CBM doesn’t include all the things we teach.
It does not measure everything in reading, math, writing, etc.
Less important for instructional planning and program evaluation and accountability
A Mathematics Computation General Outcome: A “Rich Task” Consistent with CCSS
All Problems of DIFFERENT TYPES
2 Digit Addition w/o Regrouping
2 Digit Subtraction w/o Regrouping
Row Addition
Column Addition
Mastery Monitoring
MM assesses progress on constantly changing and different tests that are closely tied to specific instructional content.
Think: Testing “small” to make statements about something “small”!
Also often referred to as Short-Term Measurement (STM)
Examples: End-of-Unit Tests, Specific Skills Tests, Quizzes
Single Skill Mathematics Computation Probe
Basic Addition Facts 0-12
Mathematics Computation Mastery Monitoring:
All Problems of The SAME TYPE
Questions I Can Answer
In the Short Term:
Is This Student Learning Multi-Digit Addition Skills?
Questions That Are More Difficult
In the Long Term:
Is This Student Improving in Mathematics Computation?
Why Is This Question More Difficult?
1. It presumes the student has retained addition skills.
2. It assumes that addition skills must be taught before subtraction skills.
3. It assumes that the addition and subtraction skills tests are reliable and valid.
4. It assumes that the criterion for mastery (in this case 80%) has been validated.
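In code form, an MM decision reduces to scoring each unit test as percent correct and comparing it to the mastery criterion (80% here, as in the example above). The unit names and scores below are hypothetical; the sketch only illustrates the pass/fail logic, not any validated criterion.

```python
# Sketch: a Mastery Monitoring decision. Each unit test is scored as
# percent correct and compared against a mastery criterion (80% here,
# matching the example in the text). Unit names/scores are hypothetical.

MASTERY_CRITERION = 0.80

def mastered(correct, total, criterion=MASTERY_CRITERION):
    """True if the percent correct meets or exceeds the criterion."""
    return correct / total >= criterion

units = [
    ("Addition facts 0-12", 18, 20),
    ("2-digit addition w/o regrouping", 14, 20),
]

for name, correct, total in units:
    decision = "advance to next unit" if mastered(correct, total) else "reteach"
    print(f"{name}: {correct}/{total} -> {decision}")
```

Note that the pass/fail decision is only as sound as the criterion itself, which is exactly assumption 4 above: unless 80% has been empirically validated, “advance” versus “reteach” is an arbitrary cut.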
GOM Assumptions, Advantages, and Disadvantages
Assumptions:
• An “Indicator” Has Been Established Empirically
• Reliable and Valid Tests Have Been Created
Advantages:
• Curriculum Independent
• Progress Monitoring is Relatively Easy to Do--Logistically Feasible
• Assessment for Retention and Generalization Built In
• Confident Decisions About Progress
Disadvantages:
• Not Everything Students Need to Know Has a Validated Indicator; Currently Constrained to the Basic Skills
• NOT Consistent with How Teachers “Think” about PM
• Lacks Exhaustive Information for Diagnosis and Instructional Planning
MM Assumptions, Advantages, and Disadvantages
Assumptions:
• Validated Instructional Hierarchy
• Reliable and Valid Tests Are Available for Each Unit, Objective, and Skill
• Mastery Criteria Are Empirically Established
Advantages:
• High Instructional Validity--Lets Teachers Know if What They’ve Been Teaching Has Been Learned (at Least Initially)
• Consistent with How Teachers “Think” about PM
• Tests Can Often Be Used Diagnostically
Disadvantages:
• Curriculum Dependent--Different Curricula Value Different Things, Teach Them in Different Orders, Etc.; Comparing Progress Within and Across Different Curricula is Difficult
• Doesn’t Routinely Test for Retention and Generalization--Therefore Students May Not Be Taught to Mastery
• Logistically Complex, Even if Reliable and Valid Tests Have Been Created; Testing is Always Changing and, If Students Are Taught to Criterion, Can Be Overwhelming
• Reliable Decisions About Progress Are Sorta Iffy
Standards for Evaluating General Outcome Measures
Standards for Evaluating Mastery Monitoring Measures
Comparison of Progress Monitoring Standards
GOM Standards:
• Alternate Forms
• Sensitive to Student Improvement
• Reliability of the Performance Level Score
• Reliability of the Slope
• Validity of the Performance Level Score
• Predictive Validity of the Slope of Improvement
• End-of-Year Benchmarks
• Rates of Improvement Specified
• Disaggregated Reliability and Validity Data
• Norms Disaggregated for Diverse Populations
MM Standards:
• Skill Sequence Specified
• Sensitive to Improvement
• Reliability
• Validity
• Pass/Fail Criterion
• Disaggregated Reliability and Validity Data
Mark’s Bottom Line Suggestions
THINK PROGRESS
Progress Monitoring is Vital and We Have the Capacity to do This Efficiently and Effectively--In the Basic Skills
Frequent GOM Using CBM is the “Best” Way to Do This--Let’s Get It Done, Especially for At Risk Students and Those with Severe Achievement Discrepancies
THINK PERFORMANCE
MM is Important--But Less So For Progress
Performance is About What I am Teaching and If Students Don’t Perform What I’m Teaching, then No Learning Occurred
Bottom Line
So...
Build Basic Skills PM Using CBM at Tier 1 As Long As You Need To
Use More Frequent PM Using CBM at Tiers 2 and 3 As Long As You Have Students with Basic Skills Discrepancies--And In Most Schools, That’s Through Grade 12
Use Your Existing Assessments WITHIN THE CURRICULUM as Performance Assessment, Instructional Planning, and Supporting Evidence (Not Primary) of Progress
Questions?