Are Your Schools Ready for the Next Generation Assessments? What You Need to Know from All Six Multi-State Consortia Wes Bruce – PARCC Brandt Redd – Smarter Balanced Scott Elliot – ELPA 21 ASSETS – Carsten Wilmes Neal Kingston – DLM Chris Domaleski - NCSC Philip Olsen – Wisconsin Department of Public Instruction Andy Middlestead – Michigan Department of Education



TRANSCRIPT

Page 1: Wes  Bruce – PARCC Brandt Redd – Smarter Balanced Scott  Elliot – ELPA 21 ASSETS – Carsten Wilmes

Are Your Schools Ready for the Next Generation Assessments?
What You Need to Know from All Six Multi-State Consortia

Wes Bruce – PARCC
Brandt Redd – Smarter Balanced
Scott Elliot – ELPA 21
ASSETS – Carsten Wilmes
Neal Kingston – DLM
Chris Domaleski – NCSC
Philip Olsen – Wisconsin Department of Public Instruction
Andy Middlestead – Michigan Department of Education

Page 2:

Assessment Consortia

Comprehensive
• PARCC
• Smarter Balanced

English Language Proficiency
• ELPA21
• ASSETS

Alternate
• Dynamic Learning Maps (DLM)
• NCSC

Page 3:

SETDA Guide to Technology Readiness
http://gtr.setda.org

Page 4:

“Get ready cause here we come!”*
NCSA – New Orleans

Wes BruceJune 27, 2014

* With apologies to The Temptations


Page 5:

PARCC Technology Specifications
Desktops, Laptops, Tablets, Thin Client/VDI

Operating System
  Minimum: Windows XP SP3 (with caveats); Mac OS 10.6; Linux*: Ubuntu 9-10, Fedora 6; iOS 6; Android 4.0*; Chrome OS
  Recommended: Windows 7 or newer; Mac OS 10.7 or newer; Linux*: Ubuntu 11.10, Fedora 16; iOS 6 or newer; Android 4.0 or newer*; Chrome OS

Memory
  Minimum: by operating system
  Recommended: 1 GB RAM

Processor
  Minimum: by operating system
  Recommended: 1 GHz

Screen Size
  Minimum: 9.5"
  Recommended: 9.5" or larger

Screen Resolution
  Minimum: 1024 x 768
  Recommended: 1024 x 768 or better

Bandwidth
  Minimum: 5 kbps/student using local caching
  Recommended: 100 kbps/student to support instruction and assessment
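The per-student bandwidth figures above translate directly into a sizing estimate for a school's internet connection. A minimal sketch, assuming the concurrency count and headroom multiplier are illustrative values chosen by the school, not PARCC guidance:

```python
def required_bandwidth_mbps(students_testing, kbps_per_student, headroom=1.25):
    """Estimate the connection bandwidth (in Mbps) needed for concurrent testing.

    kbps_per_student: 5 for locally cached delivery, 100 for fully online
    (the minimum/recommended figures above).
    headroom: illustrative safety multiplier for other school traffic.
    """
    total_kbps = students_testing * kbps_per_student * headroom
    return total_kbps / 1000  # convert kbps to Mbps

# 120 students testing at once, no local caching (100 kbps each):
print(required_bandwidth_mbps(120, 100))   # 15.0 Mbps
# The same group with local caching (5 kbps each):
print(required_bandwidth_mbps(120, 5))     # 0.75 Mbps
```

The twenty-fold gap between the two results is why local caching matters so much for schools on constrained connections.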

Page 6:

Readiness “Has Only Just Begun” with the Technical Specifications

• Counting devices and checking bandwidth is the tip of the iceberg…
  – Can schools actually deliver the tests?
  – Are test administrators ready for the logistics?
  – Can you provide the data to take advantage of the new opportunities?
  – Do teachers and students know what is expected of them?
  – Are parents and key publics aware of the tests and the possible results?
  – Have you addressed Opportunity to Learn?

Page 7:

Schools, Districts and Systems

• Determine if schools/districts are ready
  – What is your “due diligence” to validate that schools can deliver?
• Test administrators
  – Even if you have been online, these systems are different and place different expectations on those administering the test
• Data systems
  – Can you produce the kind of data needed to take advantage of these new systems?
    • PNP (Personal Needs Profile)?

Page 8:


Page 9:

Teachers, Students and Parents

• Teacher Readiness
  – Have they taught the content, and are they aware of the tasks that will be used?
• Student Readiness
  – Item types
  – Test interface
• Parent Readiness
  – The assessment itself
  – Results, both item-level and CCR (college and career readiness)
    • Citing evidence vs. plagiarism

Page 10:

PARCC Readiness Resources

• Model Content Frameworks
  – www.parcconline.org/parcc-model-content-frameworks
• Test Specifications and Blueprints
  – http://www.parcconline.org/assessment-blueprints-test-specs
• Sample items, tutorials and practice tests (all grades and subjects)
  – http://www.parcconline.org/practice-tests
• Technology Specifications
  – http://parcconline.org/technology
• Technology Resources
  – http://parcc.pearson.com/support

Page 11:

Smarter Balanced

Brandt Redd, CTO

National Conference on Student Assessment27 June 2014

Page 12:

A National Consortium of States

• 22 member states and territories representing 39% of K-12 students
• 20 Governing States, 1 Advisory State, 1 Affiliate Member
• Washington state is the fiscal agent
• The UCLA Graduate School of Education will be the permanent home

Page 13:

A Balanced Assessment System

Common Core State Standards specify K-12 expectations for college and career readiness.

• Summative assessments – benchmarked to college and career readiness
• Interim assessments – flexible, open, used for actionable feedback
• Teacher resources for formative assessment practices to improve instruction

Outcomes: all students leave high school college and career ready; teachers and schools have the information and tools they need to improve teaching and learning.

Page 14:

SmarterApp.org

• Smarter Balanced: A consortium of states developing common assessments for ELA and Mathematics that are aligned to the Common Core State Standards.

• SmarterApp: A community of organizations devoted to collaborating on an openly licensed software suite that supports educational assessment.


Page 15:

Smarter Balanced Assessment Delivery Architecture

Consortium-hosted components:
• Item Authoring (author, approve, versions, etc.)
• Test Item Bank (test items, test authoring, test packager, etc.) – supplies the test package to test delivery and operational items to item scoring
• Data Warehouse and Reporting – receives student responses, item scores, and test scores; produces aggregate and student-level‡ reports for parents and educators

Assessment Delivery System* components:
• Test Administration and Registration (student registration, test scheduling) – receives eligible students and scheduled tests
• Test Delivery – presents items to students and collects their responses; run by a Test Administrator (Proctor)
• Adaptive Engine – determines the next set of items
• Item Scoring (deterministic, AI, hand) – test integration and test scoring

Data sources:
• State Student Data System – provides student ID, school, grade, ethnicity, etc., and receives an extract file with student-level results
• District SIS – districts will need to register students if no state system is available

Notes:
* Operated by a State or a Smarter Balanced-certified vendor.
‡ Individual Student Reports will be generated by the Consortium for states that allow student identification data to be stored by the Consortium. Other states will host instances of the Data Warehouse and Reporting components.
Test registration will Q/A registration information against previous years’ data in the Data Warehouse.
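The registration Q/A step noted above (checking incoming registrations against previous years' warehouse data) can be sketched as a small data model. The record fields come from the data flow on this slide, but the specific check and all names are illustrative assumptions, not SmarterApp's actual rules or API:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class StudentRecord:
    """Illustrative registration record; fields mirror the state data
    listed above (student ID, school, grade, ethnicity) plus a PNP."""
    student_id: str
    school: str
    grade: int
    ethnicity: str
    pnp: dict = field(default_factory=dict)  # accessibility settings

def qa_against_prior_year(current: StudentRecord,
                          prior: Optional[StudentRecord]) -> list:
    """Return human-readable flags for registration anomalies.
    The rules here are hypothetical examples of a Q/A pass."""
    flags = []
    if prior is None:
        return flags  # genuinely new student: nothing to compare against
    if current.grade != prior.grade + 1:
        flags.append(f"grade {current.grade} does not follow prior grade {prior.grade}")
    if current.school != prior.school:
        flags.append(f"school changed from {prior.school} to {current.school}")
    return flags
```

Flagged records would go back to the district or state for correction before test scheduling, rather than blocking registration outright.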

Page 16:

Spring 2014 Field Test

Spring 2014 Field Test

• 4.2 million students
• 16.5 thousand schools
• 12.2 million tests completed
  – 4.5 million with accessibility features
• Up to 4 tests per student (average: 2.8)
  – ELA
  – ELA Performance Task
  – Math
  – Math Performance Task

Page 17:

A New Generation of Standards & Assessments for ELLs

• 11 states funded in September 2012 by the U.S. Department of Education
• Partners
  – Lead State: Oregon Department of Education
  – Project Management: Council of Chief State School Officers (CCSSO)
  – Understanding Language Initiative (Stanford University), CRESST at the University of California, Los Angeles, and NCEO at the University of Minnesota
• Timeline
  – Item bank development (ongoing)
  – Field Test: SY 2014-2015
  – First Operational Summative: SY 2015-2016
  – Operational Screener: SY 2016-2017
  – Platform, Technical Requirements, and Reporting: SY 2014-2015

Page 18:

ELPA21 Consortium States

Arkansas, Florida, Iowa, Kansas, Louisiana, Nebraska, Ohio, Oregon, South Carolina, Washington, and West Virginia

Page 19:

ELPA21 Structure

• New English Language Proficiency Standards
  – With guidance from states, WestEd, the Understanding Language Initiative of Stanford University, and CCSSO developed completely new standards
  – College and career readiness focus
• Screener for each of 6 grade bands
• Summative

Page 20:

Assessment System Features

• We are in the unique position of integrating new standards, assessments, technology, and teacher and administrator supports, all leading toward better systems of support and learning for ELLs
• Comprehensive web-based delivery
  – Innovative technology-enhanced items
  – Includes teacher-developed items
• Cohesive system
• High-quality communications and outreach within states
• Sustainability

Page 21:

Features (cont.)

• Reports
  – Screener & Summative
    • Individual Student
    • Parent/Guardian
    • Aggregate (e.g., classes, schools, districts, and states)
  – Administrative and Technical (e.g., registration, Q/A, analyses)
• Potential reporting information
  – Scores for Listening, Reading, Writing, and Speaking, plus comprehension
  – Student Proficiency Level
  – Performance Level Descriptors
• Interpretive Guide
• Professional Development

Page 22:

Technical Challenges

• Demands of Speaking, Listening, and Writing
• Hardware requirements for Listening and Speaking
• Assessment of the earliest learners (K-2)

Page 23:

© 2012 Board of Regents of the University of Wisconsin System, on behalf of the WIDA Consortium www.wida.us

ASSETS Project Overview

Page 24:

WIDA Consortium with ASSETS

35 member states – WIDA
35 member states – ASSETS

Page 25:


Development Timeline

2015-16: Fully Operational

Page 26:


ASSETS System

Page 27:


Updates

• 2014 Field Test wrapping up this month
• Analyzing results of student/LEA/SEA surveys to better deliver the Field Test in 2015
• Selecting a technology vendor via an RFP process (completed by August)
• Assisting SEAs/LEAs with technology readiness preparation

Page 28:


For more information

• ASSETS Project Website: http://www.assetsproject.org/

• ACCESS for ELLs 2.0 Field Test resources: http://assetsproject.org/implementation/fieldtest.aspx#overview

• ACCESS for ELLs 2.0 Operational Test resources: http://assetsproject.org/implementation/operational.aspx

Page 29:

School Readiness: Lessons Learned during DLM Field Testing

Neal Kingston
Meagan Karvonen

Nicholas Studt

June 27, 2014

Page 30:


Page 31:

Overview of the Dynamic Learning Maps Alternate Assessment

• Fine-grained learning maps
• A subset of particularly important nodes that serve as content standards – Essential Elements
• Instructionally embedded and year-end assessments
• Instructionally relevant testlets
• Accessibility and alternate pathways
• Dynamic assessment
• Status and growth reporting that is readily actionable
• Professional development
• A technology platform to tie it all together

Page 32:

Lessons Learned about Providing Resources

• How you organize information makes a difference
  – Quick checklists
  – Comprehensive documents
• District people need role-specific information
• State capacity is critically important to district staff

Page 33:

Lessons Learned About Training

• District people need a training structure, not just good self-directed training materials
• Different teachers learn best with different approaches
• There is confusion between required and optional resources
• Educators need time and experience before a new system becomes routine

Page 34:

Lessons Learned about Help Desk Support

• Educators are an immensely flexible group
• Educators initiate contact via email more often than phone
• Minor changes to the resources and training provided are visibly amplified at the help desk
  – A single sentence can cause a noticeable increase in calls and emails
• Smaller testing populations with more educators require a larger than expected support staff
  – The economies of scale work against DLM educators

Page 35:

THANK YOU!

For more information, please contact:

[email protected] or

Go to: www.dynamiclearningmaps.org

For Professional Development, contact:
[email protected]

The present publication was developed under grant 84.373X100001 from the U.S. Department of Education, Office of Special Education Programs. The views expressed herein are solely those of the author(s), and no official endorsement by the U.S. Department should be inferred.

Page 36:

National Conference on Student Assessment, June 2014

Chris Domaleski

National Center and State Collaborative

Page 37:

Overview

• Five partner organizations
  – National Center on Educational Outcomes
  – edCount, LLC
  – National Center for the Improvement of Educational Assessment
  – University of North Carolina at Charlotte
  – The University of Kentucky
• 13 partner states and 11 Tier 2 states
• Long-term goal: ensure that students with significant cognitive disabilities achieve increasingly higher academic outcomes and leave high school ready for post-secondary options
• Theory of Action: a well-designed summative assessment alone is insufficient. To achieve this goal, an AA-AAS system also requires:
  – Curricular & instructional frameworks
  – Teacher resources and professional development

Page 38:

NCSC Technology Framework

Page 39:

NCSC Technology: Customized Open Source

• Compliant with commonly used AT/AAC devices
• Paper & pencil alternative delivery
• Verify student profile LCI/PNP data
• Hand scoring/interaction for teachers
• Keyboard-only navigation
• Adaptive testing features
• Accessibility features (e.g., text-to-speech, magnification, high contrast)
• Upload evidence for an item
• PD training, survey, practice tests
• Federally funded, open-source system/content available to all schools and states without licensing fees

Page 40:

Assessment Development Timeline

• Spring 2014 – Pilot Test 1: Item Tryouts
• Summer 2014 – Item Data Review; Finalize Test Specs
• Fall 2014 – Pilot Test 2: Form Tryouts
• Spring 2015 – Census/Operational Test
• Summer/Fall 2015 – Standard Setting; Finalize Technical Docs

Page 41:

Additional Resources

• www.ncscpartners.org
• Curriculum and instruction resources
• Technology architecture and specifications
• Presentations, papers, handouts and more for various audiences…

Page 42:

Q&A

Consortia
• PARCC: http://www.parcconline.org
• Smarter Balanced: http://www.smarterbalanced.org
• ELPA21: http://www.elpa21.org
• ASSETS: http://www.wida.us
• DLM: http://dynamiclearningmaps.org
• NCSC: http://www.ncscpartners.org

Guides
• SETDA – Guide to Technology Readiness: http://gtr.setda.org
• CoSN – Becoming Assessment Ready: http://www.cosn.org/focus-areas/it-management/becoming-assessment-ready
• ETS – Coming Together to Raise Achievement: http://www.k12center.org/publications/raise_achievement.html