Meeting NCLB Act: Students with Disabilities Who Are Caught in the Gap
Martha Thurlow, Ross Moen, Jane Minnema
National Center on Educational Outcomes (http://education.umn.edu/nceo)
Sue Rigney, U.S. Department of Education (http://www.ed.gov)
CCSSO, 2004, Boston, MA
Session Plan
• Background – NCLB and the “Gap”
• Out-of-level testing realities
• Current psychometric thinking
• What IS a state to do?!
• Q & A Session
Purpose
“…to ensure that all children have a fair, equal, and significant opportunity to obtain a high-quality education and reach, at a minimum, proficiency on challenging State academic achievement standards and state academic assessments”
AYP Combines
• Test data: % Proficient + Advanced AND 95% tested
• Progress over time
• Safe Harbor
Progress over Time
• 12 years to 100% proficient
• Intermediate goals
• Annual measurable objectives
• Within a content area
• Other data
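The AYP pieces named above (the 95% participation requirement, the percent proficient-plus-advanced measure, annual measurable objectives rising to 100% over 12 years, and safe harbor) can be sketched as a simplified decision rule. This is a minimal illustration, not the statutory computation: the single-subgroup check and the linear AMO trajectory are simplifying assumptions, and states actually set intermediate goals as step functions.

```python
def meets_ayp(pct_proficient, pct_tested, amo, prior_pct_proficient=None):
    """Simplified AYP check for one subgroup in one content area.

    pct_proficient: % scoring proficient + advanced this year
    pct_tested:     % of enrolled students who took the test
    amo:            the state's annual measurable objective
    prior_pct_proficient: last year's % proficient (for safe harbor)
    """
    # NCLB requires at least 95% participation regardless of scores.
    if pct_tested < 95.0:
        return False
    # Primary test: meet or exceed the annual measurable objective.
    if pct_proficient >= amo:
        return True
    # Safe harbor: the percentage of non-proficient students
    # dropped by at least 10% from the prior year.
    if prior_pct_proficient is not None:
        prior_nonprof = 100.0 - prior_pct_proficient
        current_nonprof = 100.0 - pct_proficient
        if current_nonprof <= 0.9 * prior_nonprof:
            return True
    return False


def amo_trajectory(baseline, start_year=2002, end_year=2014):
    """Illustrative linear intermediate goals reaching 100% proficient
    by 2013-14. A straight line is a simplification of states' actual
    stepped trajectories."""
    years = end_year - start_year
    return {start_year + i: baseline + (100.0 - baseline) * i / years
            for i in range(years + 1)}
```

For example, a subgroup at 50% proficient with 96% tested meets a 48% AMO, while the same scores with 94% tested fail on participation alone.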
States’ “Gap” Problem??
• General assessment without accommodations
• General assessment with accommodations
<<<< Something in between? >>>>
• Alternate assessment (≤ 1% counted as proficient)
States’ Responses Have Been Forms of Out-of-Level Testing
• Off-Level
• Modified Assessment
• Performance Level
• Challenge Down
• Levels Testing
• Alternate Assessment
• Adaptive Testing
• Instructional Level
• Functional Level
• Alternative Assessment
Definition Issues
• The administration of a test at a level that is above or below the student's grade level in school (ASES SCASS, 1999)
• Typically, only students with disabilities are tested below the grade in which their same-age peers are enrolled
• States continue to use varying definitions!
National Status of Out-of-Level Testing
• For 2002–2003, 18 states tested out of level in large-scale assessment programs
• New pattern! Currently, 8 states have discontinued out-of-level testing or are doing so
One State's Prevalence Data
Prevalence data available for both enrollment grade and grade at which tested.
o Approximately 30% of special education students were tested out of level in reading and math, and approximately 20% in writing
o The spread of how far below grade students were tested widened as enrolled grade increased:
Enrolled grade 8: 44% tested gr 6; 40% tested gr 4; 16% tested gr 2
Enrolled grade 4: 68% tested gr 4; 32% tested gr 2
One State’s Data Interpretation
Performance data showed that from 5% to 35% of students performed at goal level on the below-grade-level test, suggesting that they probably should have taken a higher grade-level test.
For example:
35% of grade 8 students tested on the grade 2 reading test performed at goal level
5% of grade 8 students tested on the grade 6 math test performed at goal level
One State’s Out-of-Level Test Data Use
• Interpreted as TOO MANY students tested at TOO LOW a test level
• Developed training with specific out-of-level testing focus
• Challenged teachers to raise assessment expectations!
NRT View at 50%
[Figure: Performance Standards (grades 3–8) plotted against Content Standards (grades 3–8)]
NRT View at 25%
[Figure: Performance Standards (grades 3–8) plotted against Content Standards (grades 3–8)]
Vertical Equating
[Figure: Performance Standards (grades 3–8) plotted against Content Standards (grades 3–8)]
Vertical Equating Challenges
• Statistics: error variance. A score on one test equates to a range of scores on another test.
• Content: construct differences. One could end up predicting mathematics scores from reading scores.
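The error-variance challenge can be illustrated with a toy mean/sigma linear link between two adjacent grade-level forms. All score distributions, linking constants, and the standard error of equating below are invented for illustration; operational vertical scales use far more elaborate IRT-based linking.

```python
# Toy mean/sigma linear linking of a grade-6 form onto the grade-8 scale.
# Every number here is invented for illustration only.
G6_MEAN, G6_SD = 200.0, 20.0   # assumed grade-6 form score distribution
G8_MEAN, G8_SD = 230.0, 25.0   # assumed grade-8 form score distribution
SEE = 4.0                      # assumed standard error of equating

def link_g6_to_g8(score_g6):
    """Map a grade-6 score onto the grade-8 scale via mean/sigma linking."""
    z = (score_g6 - G6_MEAN) / G6_SD
    return G8_MEAN + z * G8_SD

def linked_score_band(score_g6, n_se=2):
    """Once equating error is acknowledged, a single grade-6 score
    corresponds to a *range* of grade-8 scores, not a point."""
    center = link_g6_to_g8(score_g6)
    return (center - n_se * SEE, center + n_se * SEE)
```

A grade-6 score one standard deviation above its mean maps to a point on the grade-8 scale, but with a plausible equating error it really corresponds to a band roughly 16 points wide, which is the statistical challenge the slide names.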
OOLT Alignment Study
• Nine states interviewed in Spring 2003
• Only two relate out-of-level test results to enrolled-grade performance:
  – One: psychometric support for a single grade difference
  – One: human judgment with rubrics
Testing Without Aligning to Enrolled Grade
• Cannot help meet AYP
• Is it a legitimate, humane way to meet 95% participation requirements or a violation of NCLB intent?
• What can be done psychometrically?
Standards Based Assessment
[Figure: Performance Standards (grades 3–8) plotted against Content Standards (grades 3–8)]
Universal or Accommodated Standards Based Assessment
[Figure: Performance Standards (grades 3–8) plotted against Content Standards (grades 3–8)]
Psychometric Limitations
• Psychometric improvements can only remove barriers to seeing which students are proficient
• Changing actual proficiency requires something else
Research Findings Regarding Out-of-Level Testing
• A number of research studies have been conducted at NCEO on out-of-level testing
• The most recent:
  – Prevalence Study
  – Reporting Study
  – Alignment Study
  – Case Studies
NCLB and Out-of-Level Testing
• “In order to improve instruction and achievement for all students with disabilities, the Department expects States to assess as many students as possible with academic assessments aligned to regular achievement standards.”
• Out-of-level assessments aligned with alternate achievement standards for students with the most significant cognitive disabilities may be considered alternate assessments (these fall within the 1.0 percent cap for proficient and advanced scores)
(Federal Register, 2003)
To Reduce the Use of Out-of-Level Tests:
• Provide students access to the general curriculum
• Develop universally designed assessments
• Ensure that all teachers set high expectations for all students and understand the State’s academic content standards
(Federal Register, 2003)
How do NCEO recommendations align with NCLB regulations?
• Introduce more appropriate assessments (i.e., universal design)
• Advocate for grade-level instruction and high expectations for ALL students
• Report all test results clearly and thoroughly, and include them in accountability programs
• Call for reduction in out-of-level testing through policy and implementation change
Easy to say …
… but what can we REALLY DO??!!
NCEO’s Responses at End of Out-of-Level Testing Project
#1 – Identify students in “gap”
#2 – Restructure current thinking
#3 – Understand large-scale assessments
#4 – Use of accommodations
#5 – Raise teacher expectations
#6 – Provide high-quality training
#1 - States’ “Gap” Conundrum
• Who are these students?
• Can states identify these students?
• Similar subgroup of students across states?
• Change from year to year?
• Different students by grade, disability category, general vs. special education?
#2 - Re-think at State, District, and School Levels
Not a child issue
More than an assessment issue
An INSTRUCTIONAL issue!
Need to think critically about how to augment instruction!
#3 - Understanding Purpose of Large-Scale Assessments
A commonly used rationale for out-of-level testing is that it provides more precise and accurate information for instructional decision making (Thurlow & Minnema, 2001).
But …
Time for a Test!
NCLB does NOT require student accountability (e.g., graduation exams to get diploma).
NCLB does require SYSTEM level accountability to ensure all students learn to high levels.
#4 - Accommodations Use
Anecdotal evidence of teacher confusion about accommodations
Little use of states’ accommodations policies
Now … real data!
Most Frequently Used in Testing
• Special education teachers said:
  – Extended (extra) time
  – Small group or individual administration
  – Test items read aloud
  – Directions read aloud
  – Alternate setting
However …
25% of teachers (n ≈ 750) said they did not know which accommodations were considered nonstandard in their state.
% Not Knowing Standard vs. Nonstandard, by Accommodation
• Read aloud (67%)
• Calculator (67%)
• Spell check or dictionary (46%)
• Scribe (32%)
• Visual cues on test (20%)
• Extended/extra time (17%)
#5 - Raise Instruction AND Assessment Expectations!
Counter inconsistencies in practice:
• Grade of enrollment drove the out-of-level testing decision
• Language arts and math teachers used out-of-level test criteria differently
Think in terms of the entire school system: administrators, ALL school staff, students, family
#6 - Training, training, training!
Content focused on basic, important information
“Our large-scale assessment program is a ‘blip’ on my professional radar screen.” (Administrator, large urban school district, April 2004)
Extend participant pool
Don’t rely on “train the trainer” models only
Experiment with new technologies!
Combine general and special education + practitioners + administrators
~ Thought shift happens! ~
SEAs think critically about who these students really are!
Work across SEA divisions (e.g., assessment, special education, English language learning)
Update state policy in practitioner-friendly format