ACT Aspire® Periodic Technical Manual - …ocs.archchicago.org/Portals/23/ACT Aspire Periodic...



discoveractaspire.org

PERIODIC TECHNICAL MANUAL

2017 VERSION 1


Preface

The ACT Aspire Periodic Technical Manual contains detailed technical information about the ACT Aspire® Periodic assessments. Its principal purpose is to document the technical characteristics of the ACT Aspire assessments in light of their intended purposes. The manual documents the collection of validity evidence that supports appropriate interpretations of test scores and describes various content and psychometric aspects of ACT Aspire. It articulates multiple test design and development processes, documenting how ACT builds the assessments in line with the validity argument and how concepts such as construct validity, fairness, and accessibility are attended to throughout the process. Also described are routine analyses designed to support ongoing, continuous improvement and research intended to ensure the program remains psychometrically sound.

ACT endorses and is committed to industry standards and criteria. In particular, ACT is committed to complying with the Standards for Educational and Psychological Testing (AERA, APA, & NCME, 2014). ACT also endorses the Code of Fair Testing Practices in Education (Joint Committee on Testing Practices, 2004), a statement of the obligations to test takers of those who develop, administer, or use educational tests and test data in four areas: developing and selecting appropriate tests, administering and scoring tests, reporting and interpreting test results, and informing test takers. ACT likewise endorses and is committed to complying with the Code of Professional Responsibilities in Educational Measurement (NCME Ad Hoc Committee on the Development of a Code of Ethics, 1995), a statement of professional responsibilities for those involved with various aspects of assessment, including development, marketing, interpretation, and use.

We encourage individuals who want more detailed information on a topic discussed in this manual, or on a related topic, to contact ACT.

Please direct comments or inquiries to the address below:

Research Services
ACT, Inc.
500 ACT Drive
Iowa City, Iowa 52243-0168

© 2017 by ACT, Inc. All rights reserved.


Contents

Preface
Figures
Tables

1 General Description of ACT Aspire Assessments and Standards
1.1 Overview
1.2 Purposes, Claims, Interpretations, and Uses of ACT Aspire Periodic Assessments
1.3 ACT Aspire: College and Career Readiness Standards
1.4 ACT Aspire and Standards Alignment

2 ACT Aspire Test Development
2.1 Assessment Design Elements and Construct Coherence
2.1.1 Cognitive Complexity and Depth of Knowledge (DOK)
2.2 ACT Aspire Test Development Processes
2.2.1 Selecting and Training Item Writers
2.2.2 Designing Items that Elicit Student Evidence
2.2.3 Item Review

3 Assessment Specifications
3.1 Overview
3.2 ACT Aspire Assessment Support Materials
3.2.1 ACT Knowledge and Skills Map
3.2.2 ACT Aspire Exemplar Items
3.3 Interim Assessments
3.3.1 English Language Arts Overview
3.3.1.1 English Test
3.3.1.2 English Framework
3.3.1.3 English Reporting Categories
3.3.1.4 English Item Types
3.3.1.5 English Test Blueprints
3.3.2 Mathematics Test
3.3.2.1 Mathematics Reporting Categories
3.3.2.2 Calculator Policy
3.3.2.3 Mathematics Item Types, Tasks, Stimulus
3.3.2.4 Mathematics Test Blueprints
3.3.3 Reading Test
3.3.3.1 Reading Framework
3.3.3.2 Reading Reporting Categories
3.3.3.3 Reading Test Blueprints
3.3.4 Science Test
3.3.4.1 Science Reporting Categories
3.3.4.2 Continuum of Complexity
3.3.4.3 Science Item Types
3.3.4.4 Science Test Blueprints
3.4 Classroom Assessments
3.4.1 English Classroom Assessments
3.4.1.1 Specifications for English Language Arts Classroom Assessments
3.4.2 Mathematics Classroom Assessments
3.4.3 Reading Classroom Assessments
3.4.4 Science Classroom Assessments

4 Accessibility
4.1 Development of the ACT Aspire Accessibility Support System
4.2 Test Administration and Accessibility Levels of Support
4.2.1 Understanding Levels of Accessibility Support
4.2.2 Support Level 1: Default Embedded System Tools
4.2.3 Support Level 2: Open Access Tools
4.2.4 Support Level 3: Accommodations
4.2.5 Support Level 4: Modifications
4.3 Accommodations, Open Access and Embedded Tools

5 Test Administration
5.1 Policies and Procedures
5.2 Standardized Procedures
5.3 Selecting and Training Testing Staff
5.3.1 Room Supervisors
5.3.1.1 Room Supervisor Qualifications
5.3.1.2 Room Supervisor Responsibilities
5.3.2 Responsibilities of Other Testing Staff
5.3.3 Staff Training Sessions

6 Test Security
6.1 Data Security
6.1.1 Personally Identifiable Information
6.1.2 Security Features of Data Systems
6.1.3 Data Breaches and Remedies
6.1.4 Data Privacy and Use of Student Data
6.1.4.1 Data Privacy Assurances
6.1.4.2 Security Processes and Procedures
6.1.4.3 Transparency Requirements
6.1.4.4 Student Privacy Pledge
6.2 Test Security and Administration Irregularities

7 Reporting and Research Services
7.1 Interim Reporting
7.1.1 Student Report
7.1.2 Educator Reports
7.1.3 School Reports
7.1.4 District Reports
7.2 Classroom Reporting
7.2.1 Student Report
7.2.2 Educator Reports

8 ACT Aspire Interim Assessment Scores and Reliabilities
8.1 Descriptive Statistics of Subject Raw Scores
8.2 Subject Raw Score Reliability

9 Concordance of ACT Aspire Interim Scores to ACT Aspire Summative Scores
9.1 Method
9.2 Results

Coming Soon
Score Scale
Norms
ACT Aspire Interim Linking
Fairness

References


Figures

Figure 1.1. The Full Picture: Evidence and Validity
Figure 3.1. ACT Aspire English Test Reporting with Example Skill Targets for One Reporting Category
Figure 3.2. Mathematics Domains and Primary Currents in the Flow across Grades
Figure 3.3. Foundation and Grade Level Progress (illustrated for grade 7)
Figure 3.4. ACT Aspire Reading Test Reporting Categories with Example Skill Targets for One Reporting Category
Figure 3.5. ACT Aspire Interim Reading Test Passage Types
Figure 3.6. ACT and ACT Aspire Interim Science Content Complexity Continuum
Figure 4.1. Accessibility Feature Mapping Process
Figure 4.2. ACT Aspire Levels of Accessibility
Figure 4.3. Default Embedded System Tools
Figure 4.4. Open Access Tools
Figure 4.5. Accommodations
Figure 4.6. Modifications


Tables

Table 3.1. Specification Ranges by Reporting Category
Table 3.2. Percentage of Points by DOK for ACT Aspire Interim English Tests
Table 3.3. Specification Ranges by Reporting Category for Grades 3–5
Table 3.4. Specification Ranges by Reporting Category for Grades 6–8
Table 3.5. Specification Ranges by Reporting Category for EHS
Table 3.6. Percentage of Points by DOK for ACT Aspire Interim Mathematics Tests
Table 3.7. Passage Coverage by Text Complexity for the ACT Aspire Reading Test Grade Level
Table 3.8. Specification Ranges by Reporting Category and Grade Level
Table 3.9. Percentage of Points by DOK for the ACT Aspire Interim Reading Tests
Table 3.10. Stimulus Modes Used on the ACT Aspire Interim Science Test
Table 3.11. ACT Aspire Interim Science College and Career Readiness Knowledge and Skill Domain
Table 3.12. Specification Ranges by Reporting Category and Grade Level
Table 3.13. Percentage of Points by DOK for the ACT Aspire Interim Science Tests
Table 4.1. Interim Online Testing Presentation Supports
Table 4.2. Interim Online Testing Interaction and Navigation Supports
Table 4.3. Interim Online Testing Response Supports
Table 4.4. Interim Online Testing General Test Condition Supports
Table 8.1. Descriptive Statistics of ACT Interim Raw Scores in English
Table 8.2. Descriptive Statistics of ACT Interim Raw Scores in Mathematics
Table 8.3. Descriptive Statistics of ACT Interim Raw Scores in Reading
Table 8.4. Descriptive Statistics of ACT Interim Raw Scores in Science
Table 8.5. Raw Score Reliability Coefficient and Standard Error of Measurement by Grade, Subject, and Form
Table 9.1. Interim Raw Scores to Summative Readiness Benchmarks Concordance in English
Table 9.2. Interim Raw Scores to Summative Readiness Benchmarks Concordance in Mathematics
Table 9.3. Interim Raw Scores to Summative Readiness Benchmarks Concordance in Reading
Table 9.4. Interim Raw Scores to Summative Readiness Benchmarks Concordance in Science


CHAPTER 1

General Description of ACT Aspire Assessments and Standards

1.1 Overview

The ACT Aspire® Periodic assessments are an integral part of the ACT Aspire program. The assessments comprise two components that together support a longitudinal assessment system for grade 3 through Early High School (EHS) in English, mathematics, reading, and science. These two components, Interim assessments and Classroom assessments, allow educators to receive data at multiple points throughout the school year, supplying timely, actionable, and instructionally valuable information. These data describe students’ progress toward learning goals and objectives and can be used to adapt instruction, as well as for evaluation and monitoring, allowing time for intervention with struggling students or enrichment for students who are excelling. Like the summative assessments, these two components are aligned with ACT’s College and Career Readiness Standards (CCRS), as well as with concepts and skills outlined in the Common Core State Standards (CCSS) for English language arts and mathematics and the Next Generation Science Standards (NGSS) for science.

The ACT Aspire Interim assessments are administered periodically throughout the course of an academic year to provide information about students’ progress toward end-of-year learning goals and objectives. They are designed to simulate the ACT Aspire Summative experience and in this way offer a general sense of prediction for summative performance. All Interim assessments within a grade and subject measure the same content, providing a “point-in-time” picture for comparison throughout the year.

The classroom-based portion of the ACT Aspire Periodic assessment program consists of short mini-assessments available for grades 3–8. They are based on specific learning targets, typically addressing one or two standards in each of the four subjects, and identify students’ strengths and areas of need. The ACT Aspire Classroom assessments are designed to fit naturally into teachers’ instructional sequences before, during, or after a given lesson or unit of study. They serve an instructional purpose by giving teachers information to make immediate instructional decisions and by giving students information about how to improve their academic performance.

Both Interim and Classroom assessments use a selected-response format and are delivered by computer, which allows for nearly immediate analysis and reporting. Reporting includes data usable for item-level analysis (including released items, the percent of students choosing each distractor, and timely score reports showing strengths and weaknesses based on the percent of items correct within each reporting category). To enhance score interpretation, reporting categories for all ACT Aspire assessments use the same terminology as the ACT College and Career Readiness Standards (ACT CCRS) and other standards that target college and career readiness, including the standards of many states and the Common Core State Standards (CCSS).
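The item-level reporting described above can be sketched in a few lines. The following is a minimal illustration with hypothetical data and function names, not ACT's reporting code: it computes the percent of students choosing each option for an item, and one student's percent of items correct within each reporting category.

```python
from collections import Counter, defaultdict

def distractor_analysis(responses, key):
    """Percent of students choosing each option for one item.

    responses: list of selected options, e.g. ["A", "C", "A", ...]
    key: the correct option (kept for context; the key's percent is the
    item's difficulty, the rest are distractor rates).
    """
    counts = Counter(responses)
    n = len(responses)
    return {opt: round(100 * c / n, 1) for opt, c in sorted(counts.items())}

def percent_correct_by_category(item_scores, item_categories):
    """Percent of items answered correctly within each reporting category.

    item_scores: dict item_id -> 0/1 score for one student.
    item_categories: dict item_id -> reporting-category name.
    """
    totals, correct = defaultdict(int), defaultdict(int)
    for item, score in item_scores.items():
        cat = item_categories[item]
        totals[cat] += 1
        correct[cat] += score
    return {cat: round(100 * correct[cat] / totals[cat], 1) for cat in totals}

# Hypothetical example: one item answered by five students
print(distractor_analysis(["A", "C", "A", "B", "A"], key="A"))
# → {'A': 60.0, 'B': 20.0, 'C': 20.0}
```

Operational reporting of course aggregates such figures across students, classrooms, and schools; the point here is only how distractor rates and reporting-category percentages relate to the raw response data.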

1.2 Purposes, Claims, Interpretations, and Uses of ACT Aspire Periodic Assessments

The purpose of the ACT Aspire Periodic assessments is to help better prepare students for the ACT Aspire Summative assessments. The Periodic assessments are one component of the ACT Aspire program and are built around the same framework used to design the summative tests. As such, the following paragraphs describe the ACT Aspire system as a whole, not just the Periodic assessments.

To identify college and career readiness constructs, ACT uses empirical and performance data to define requisite constructs in the content areas of English, mathematics, reading, and science. Every three years, ACT administers the ACT National Curriculum Survey to identify what kindergarten through postsecondary teachers, including instructors of entry-level college and workforce-training courses, expect of their entering students: the most current knowledge and skills students need to demonstrate to be ready for entry-level postsecondary courses and jobs. ACT also collects data about what is actually being taught in elementary, middle, and high school classrooms. Taken together, these results support ACT Aspire’s curriculum base and ACT’s ability to identify the key skill targets and knowledge most important for students to be ready for the next step. ACT uses these and other research results to design assessments and inform test blueprints so that each test targets the most important college and career readiness skills across skill progressions.

Additional validation of the constructs occurs through ACT’s longitudinal research (see Figure 1.1).

Figure 1.1. The Full Picture: Evidence and Validity.

Figure 1.1 shows how ACT uses research including empirical feedback loops to help inform continuous improvement. The ACT Aspire development starts by exhaustively understanding the most important requirements and most desired evidence needed to support inferences about college and career readiness, and tests are designed specifically to elicit the identified evidence. Subject matter experts (SMEs), item writers, and other educators work collaboratively to write items and review them for accuracy, appropriateness, and fairness. Scores from assessments are calculated, reported, and sent out on various reports designed for each audience. Instructionally relevant reporting categories provide immediately actionable information about how students perform on key skills at a more granular level. Additional psychometric work continues, including analyzing student performance along the Kindergarten to Early High School (K–EHS) continuum. Students who meet readiness score benchmarks at one grade level are compared to students who succeed in the next grade. This information is used to ensure the benchmarks are accurate and up-to-date. Through research and feedback loops, ACT ensures that key targets are identified and measured appropriately across the continuum.
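The benchmark feedback loop described above amounts to a conditional success-rate comparison: among students who met the benchmark at one grade, how many succeeded at the next grade, versus those who did not meet it? A minimal sketch with entirely hypothetical data and function names (ACT's operational analyses are far more elaborate):

```python
def benchmark_hit_rates(records):
    """records: list of (met_benchmark: bool, succeeded_next_grade: bool).

    Returns next-grade success rates conditional on meeting / not meeting
    the benchmark. A shrinking gap between the two rates would suggest the
    benchmark needs revisiting.
    """
    met = [s for m, s in records if m]
    not_met = [s for m, s in records if not m]
    rate = lambda xs: sum(xs) / len(xs) if xs else float("nan")
    return {"met": rate(met), "not_met": rate(not_met)}

# Hypothetical cohort: 4 students met the benchmark (3 succeeded at the
# next grade), 4 did not (1 succeeded)
data = [(True, True), (True, True), (True, True), (True, False),
        (False, False), (False, True), (False, False), (False, False)]
print(benchmark_hit_rates(data))  # → {'met': 0.75, 'not_met': 0.25}
```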

ACT Aspire was created by employing a theory of action (TOA) approach that fuses content validity (academic research) with predictive validity (empirical data), following methodologies similar to those used to build the ACT. The TOA begins by answering fundamental questions about the purpose of the assessment: Who are the intended users? What are the intended uses of the assessment results? What consequences may result from using the assessment? What are the intended interpretations or claims based on the assessment? What are the measurable outcomes of using the assessment? The answers to these questions emerge from rigorous research and data collection that allow for the identification of high-value skill targets in each subject area, resulting in focal points for the development of tasks and test forms. The procedure set forth by the TOA further gives rise to hypothesized mechanisms or processes for bringing about the intended goals of the assessment results. For example, cognitive labs, piloting, and field testing are used to validate results and iteratively improve the specifications and design of the assessment. Operational results are used to continuously improve the components of the assessment.

Artifacts of the assessment architecture emerge from the research and data collection process to ensure that items and test forms elicit the intended evidence to support the claims made by the assessment. For example, content and item specifications, test blueprints, benchmarks, and performance level descriptors (PLDs) influence the technical quality and output of test items and forms. These artifacts are informed by several factors, including:

• Academic research on skill targets, sequencing of skills, and grade-level placement
• Data and evidence of student understanding collected from the assessments
• The ACT National Curriculum Survey
• Survey of standards frameworks, including but not limited to the ACT CCRS, CCSS, and Next Generation Science Standards
• Subject matter experts (SMEs)

The principal claims, interpretations, and uses of ACT Aspire are the following:

1. To measure student readiness on an empirically derived college readiness trajectory. (Note that students taking the ACT Aspire summative battery will receive scores that can be compared to ACT Readiness Benchmarks that are linked to the ACT.)

2. To measure student readiness on a career readiness trajectory.

The secondary claims, interpretations, and uses of ACT Aspire are the following:

1. To provide instructionally actionable information to educators. Data from the ACT Aspire Periodic assessments can be used to identify areas of student strength and weakness in content areas at the student, classroom, and school levels. Data can inform instruction and facilitate the identification of interventions.

2. To provide empirical data for inferences related to accountability. ACT Aspire data can be one of multiple measures for making inferences about student progress and growth with respect to college and career readiness in reading, language, writing, mathematics, and science, as reported in terms of the ACT CCRS. ACT Aspire can also be used for accountability reporting where college and career readiness standards (such as the CCSS and others) have been adopted.

1.3 ACT Aspire: College and Career Readiness Standards

ACT Aspire assessments in grade 8 and Early High School are aligned with the ACT College and Career Readiness Standards (CCRS). The ACT CCRS, developed for each content test, are descriptions of the skills and knowledge that ACT has empirically linked to readiness in postsecondary education and the world of work. Groups of SMEs developed the ACT CCRS statements by synthesizing the domain-specific knowledge and skills demonstrated by students in particular score bands across thousands of students’ scores. Within each content area, the CCRS are organized by strand, which mirrors the reporting categories featured in ACT Aspire, and by score band. The ACT CCRS are organized into six achievement levels; each level is defined by a 3-point score band. The CCRS also indicate the ACT College Readiness Benchmarks in each content area, which were derived using test performance and college placement data. The ACT College Readiness Benchmark scores are clearly delineated in the CCRS tables (see the ACT Technical Manual for more information on the derivation of the ACT CCRS and the ACT College Readiness Benchmarks).
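The level-and-band structure can be expressed as a simple lookup. The band boundaries below are invented purely for illustration; the actual ACT CCRS bands are published in the CCRS tables.

```python
# Hypothetical 3-point score bands for six achievement levels (illustrative
# only; consult the ACT CCRS tables for the real band boundaries).
BANDS = [(1, 3), (4, 6), (7, 9), (10, 12), (13, 15), (16, 18)]

def achievement_level(score):
    """Return the 1-based achievement level whose band contains `score`."""
    for level, (lo, hi) in enumerate(BANDS, start=1):
        if lo <= score <= hi:
            return level
    raise ValueError(f"score {score} falls outside all bands")

print(achievement_level(8))  # → 3 (8 lies in the third band, 7–9)
```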

Although expectations in ELA and mathematics are broadly similar across states and organizations, science learning standards currently vary dramatically. ACT Aspire science assessments for grades 3–7 are associated with the draft ACT Aspire Science College and Career Readiness Knowledge and Skill Domain (see Table 3.11). SMEs developed these statements by back-mapping the ACT CCRS down to grade 3. Content and measurement specialists then consulted several sources, including ACT’s evidence, research in science education, the National Science Education Standards, NAEP findings, and input from nationally known SMEs in elementary-level science education.

1.4 ACT Aspire and Standards Alignment

Many sets of standards that target college and career readiness exist, including the ACT CCRS, the CCSS, and unique standards from states across the United States. The purpose of these standards is to articulate the requisite knowledge and skills needed to prepare students for postsecondary and career success. As previously described, ACT Aspire assessments are designed to measure progress and provide evidence to back up the claim, in this case, that students are on target for college and career readiness.


To demonstrate alignment, evidence must be provided that assessment scores support inferences about student achievement and content understanding identified by a given set of expectations (often articulated by a state’s standards). Some assessments are designed to measure a single set of content standards; alignment is sometimes assumed for those cases. Sometimes alignment is demonstrated by asking SMEs to examine several content test forms and standards and judge whether content standards are covered adequately by the test forms. ACT Aspire has been involved in several SME-based alignment studies with each study demonstrating slightly different results depending on the standards examined and on the SMEs who participated.

Strong empirical evidence supports that ACT Aspire scores can be interpreted as measures of college and career readiness. ACT score scales for English, mathematics, reading, and science have been empirically linked to ACT Aspire scores and the ACT College Readiness Benchmarks, which relate scores on the ACT assessment directly to performance in college courses. ACT Aspire Readiness Benchmark scores have been back-mapped from student performance data that support inferences of a student being on target for readiness. An empirical link has also been established between the ACT Aspire Composite score and the ACT National Career Readiness Certificate (ACT NCRC), which provides information about student achievement and employable skills in reading, math, and locating information. These research-based connections to real-world college work and real-world career performance contribute direct evidence of alignment from ACT Aspire assessments to college and career readiness skills and knowledge. Therefore, ACT Aspire assessments exhibit strong alignment with standards that target college and career readiness.

ACT has a historical connection to the CCSS, a set of academic standards adopted by many states. In 2008, the National Governors Association (NGA) and the Council of Chief State School Officers (CCSSO) invited ACT and other partners to participate in the development of a set of research-supported college and career readiness standards. ACT's role included providing data, empirical research, and staff expertise about what constitutes college and career readiness. Because the CCSS drew on some of the same research, significant overlap exists between the CCSS and the college and career readiness constructs used to guide the development of ACT Aspire.

ACT closely monitors and has contributed to the development of the Next Generation Science Standards (NGSS), which are being implemented in some states. ACT science content experts shared ACT research on science curricula and college and career readiness with NGSS developers early in the NGSS drafting process. Considerable alignment between ACT Aspire and the NGSS exists; however, the ACT Aspire assessments are not designed specifically to assess the NGSS. As previously described, the ACT Aspire science assessments are based on ACT research on current curricula at the elementary, middle, and high school levels as well as ACT research on college and career readiness.

Some areas of achievement are not measured by ACT Aspire Periodic assessments. Specifically, any standards that require constructed response, extended time, advanced uses of technology, active research, peer collaboration, producing evidence of a practice over time, or speaking and listening are not currently assessed on ACT Aspire Periodic assessments.

Further information on the alignment of ACT assessments can be found in the document "How ACT Assessments Align with State College and Career Readiness Standards," available at: http://discoveractaspire.org/pdf/Alignment-White-Paper.pdf.

CHAPTER 2

ACT Aspire Test Development

2.1 Assessment Design Elements and Construct Coherence

ACT Aspire tests are designed to measure student achievement and progress toward college and career readiness. A principal philosophical basis for a longitudinal system of tests such as ACT Aspire is that readiness is best assessed by measuring, as directly and authentically as possible, the academic knowledge and skills that students will need in order to be successful. Coherence across the learning trajectory of college and career readiness constructs is required to support inferences about individual and aggregate growth. Skills and understanding develop over time, which means that students must learn new material and must also apply what they have previously learned in more sophisticated ways. ACT Aspire assessments are designed to measure key college and career readiness constructs in a way that recognizes that knowledge and skills are not isolated to specific grades but rather progress across grades. Items and tasks are designed to elicit evidence about constructs learned primarily in a particular grade and also to collect evidence about the skill progression across grades. From a measurement perspective, this design is an important component of supporting a vertical scale, thus reinforcing inferences about growth. This design is also important for developing actionable reports for students across all achievement levels.

Although sampling constructs across grades occurs in all the subject tests, it can be illustrated most clearly by examining the design of the mathematics test. In each form, certain items contribute to Foundation scores. Foundation items are explicitly designed to elicit evidence about a concept that was introduced in a previous year but that the student must now apply in a more advanced context. An illustration of this for the mathematics test is provided in Chapter 3. These items are important for understanding where a student in the current grade actually falls on the learning trajectory.

ACT Aspire assessments are designed to be developmentally and conceptually linked across grade levels. To clearly reflect and help interpret that linkage, the reporting categories are associated across all grade levels. Higher grade levels of ACT Aspire assessments are taken by students who are closest in age and skill development to students taking the ACT and ACT NCRC, and, therefore, these scores have an even stronger empirical bridge to inferring levels of college and career readiness.

2.1.1 Cognitive Complexity and Depth of Knowledge (DOK)

The cognitive complexity level of written passages and the cognitive demands of an assessment item are important characteristics to consider when measuring a student's academic achievement. ACT Aspire assessments reflect expectations that students will need to think, reason, and analyze at high levels of cognitive complexity in order to be college and career ready; items and tasks require the sampling of different levels of cognitive complexity, with most items targeted at upper levels. Due to the wide use of Norman Webb's depth-of-knowledge terminology, this document describes the cognitive complexity of items using language such as depth of knowledge (DOK). Given the various definitions of DOK levels, understanding ACT's interpretation of them is important to understanding the test design.

Similar to Webb’s definition, DOK levels are assigned to reflect the complexity of the cognitive process required, not the psychometric “difficulty” of the item. Unlike other DOK interpretations, ACT only assigns a DOK level 4 value to describe multiday, potentially collaborative classroom activities and assessments designed for learning purposes. By this definition, DOK assignments on any periodic assessment (including ACT Aspire) are limited to values of 1 to 3.

ACT's DOK level 1 corresponds to Webb's level 1, where students are primarily using knowledge and skills with limited extended processing. ACT's DOK level 2 extends beyond level 1 and involves applying these cognitive processes to many situations, including real-world situations. Therefore, ACT's DOK level 2 aligns with Webb's DOK level 2 and some of Webb's DOK level 3. ACT's DOK level 3 involves situations where the student must apply high-level, strategic thinking skills to short- and long-term situations. Some of these situations are novel and some require generating something like a graph, but all involve higher-level thinking skills. Given this interpretation, ACT's DOK level 3 aligns with Webb's DOK level 3 and DOK level 4.
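The correspondence described above can be captured as a small lookup table. The sketch below is an illustrative paraphrase of the paragraph, not an official ACT mapping; the function and constant names are invented for this example.

```python
# Illustrative ACT-to-Webb DOK correspondence, paraphrasing the text above.
# ACT assigns only DOK 1-3 to periodic assessment items; DOK 4 is reserved
# for multiday classroom activities.
ACT_TO_WEBB_DOK = {
    1: {1},        # ACT DOK 1 ~ Webb level 1
    2: {2, 3},     # ACT DOK 2 ~ Webb level 2 plus some of Webb level 3
    3: {3, 4},     # ACT DOK 3 ~ Webb levels 3 and 4
}

def webb_levels(act_dok):
    """Return the Webb levels an ACT DOK assignment spans."""
    if act_dok not in ACT_TO_WEBB_DOK:
        raise ValueError("ACT Aspire assigns only DOK 1-3 to test items")
    return sorted(ACT_TO_WEBB_DOK[act_dok])

print(webb_levels(3))  # [3, 4]
```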

2.2 ACT Aspire Test Development Processes

2.2.1 Selecting and Training Item Writers

Item writers are chosen from applicants who demonstrate adequate qualifications (e.g., practicing teachers, subject specialists, curriculum coordinators, department chairs). Item writers have extensive content and pedagogical knowledge and teach at the grade levels covered by ACT Aspire. These educators are actively engaged in teaching at various levels, at a variety of institutions, from small private schools to large public schools.

ACT recruits item writers who represent the diversity found in the United States with respect to ethnic background, gender, English-language proficiency, and geographic location.

2.2.2 Designing Items that Elicit Student Evidence

Item writers are instructed to consider the entire construct when crafting assessment tasks and items. Items are designed to elicit evidence about students' knowledge and skills across the full construct, which in many cases extends beyond the language of any specific achievement standard. Item writers use templates that frame the knowledge, skills, and abilities of greatest interest in measuring the construct while identifying ancillary knowledge, skills, and abilities that should not be measured. Items must fulfill task template requirements (e.g., content, DOK, word count, accessibility), reflect diversity, and meet fairness standards.

The goal of crafting high quality items or tasks is to design situations to collect relevant evidence from the student in a manner that is as authentic as possible while sampling enough of the construct to support the inferences based on the student’s responses.

2.2.3 Item Review

All items undergo rigorous content reviews by internal and external content experts to ensure that they elicit sufficient student evidence, are developmentally appropriate, and are free of errors in content and context. The process includes internal peer reviews, internal senior-level reviews, and external reviews. The content reviews also ensure that each item or task measures what is intended and functions at the intended DOK. External experts participate in fairness reviews to ensure that items and tasks are fair and not biased toward any student demographic group.

Initial forms are constructed to conform to test specifications. Once constructed, forms are reviewed by ACT staff and external content panelists. These panelists evaluate items for content accuracy and evaluate the test form for context appropriateness and representation. They also confirm that the form will collect the necessary evidence to support the intended inferences from scores.

After forms are finalized, they are administered operationally. Before students receive scores, data is carefully checked to ensure items and forms are working as intended.

CHAPTER 3

Assessment Specifications

3.1 Overview

The design of each ACT Aspire test includes a specified range of item difficulties at targeted depths of knowledge, organized into content-specific reporting categories. The amount and nature of evidence needed to support an inference about a student's achievement on a given construct are taken into consideration when selecting the number of items developed for each reporting category. These requirements are balanced against maintaining manageable administration conditions. ACT Aspire assessments cover topic progressions from foundational concepts to sophisticated applications.

Selected-response items require students to select the correct answer from several alternatives. Each correct selected-response item is worth 1 point. Incorrect responses, missing responses (items that a student did not answer), and multiple responses are worth 0 points.

Non-operational items are included but do not contribute toward raw score points. The tables in this chapter refer only to operational items.
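The scoring rule above can be sketched in a few lines. This is a hypothetical illustration of the stated rule (1 point for a correct response; 0 for incorrect, missing, or multiple responses; only operational items count), not ACT's actual scoring code; all names and data are invented.

```python
# Hypothetical raw-score sketch for selected-response items, per the rule
# described in the text. Not ACT's implementation.

def raw_score(responses, key, operational):
    """responses: item_id -> answer (None if omitted; a list/set if the
    student marked multiple responses). key: item_id -> correct answer.
    operational: item_ids that contribute to the raw score."""
    score = 0
    for item_id in operational:
        answer = responses.get(item_id)
        # Multiple marked responses score 0, like an incorrect response.
        if isinstance(answer, (set, list)):
            continue
        if answer is not None and answer == key[item_id]:
            score += 1
    return score

key = {"q1": "B", "q2": "D", "q3": "A", "q4": "C"}
operational = ["q1", "q2", "q3"]          # q4 is non-operational
responses = {"q1": "B", "q2": "A", "q3": None, "q4": "C"}
print(raw_score(responses, key, operational))  # 1
```

Note that the correct response to the non-operational item q4 adds nothing, matching the statement that non-operational items do not contribute raw score points.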

3.2 ACT Aspire Assessment Support Materials

3.2.1 ACT Knowledge and Skills Map

The ACT Knowledge and Skills Map is an interactive, online tool that can be used to learn more about ACT's college and career readiness domain across grades. Each subject area provides a grade-by-grade overview of knowledge and skills domains that research suggests a student should achieve on the path to college and career readiness. In addition to providing a framework for understanding the path to college and career readiness, the ACT Knowledge and Skills Map provides sample items and research citations.

http://skillsmap.actlabs.org/map

• username: actlabs
• password: actlabs

3.2.2 ACT Aspire Exemplar Items

ACT Aspire has developed two resources designed to help students, parents, educators, and policymakers become familiar with ACT Aspire test presentation and content. These resources, the Student Sandbox and the Exemplar Test Question Booklets, illustrate the different types of test questions and formats found in both paper-based and computer-based testing modes.

http://actaspire.pearson.com/exemplars.html

3.3 Interim Assessments

3.3.1 English Language Arts Overview

ELA knowledge and skills include overlapping constructs related to English and reading. ACT Aspire ELA assessments are designed to elicit sufficient evidence to support general inferences about student achievement in ELA as well as student achievement in each of these content areas separately. Inferences about student achievement in reading and English language skills can be made by analyzing ACT Aspire reading and English test scores, respectively. ACT Aspire reporting uses language consistent with college and career readiness standards in ELA and literacy so that students, parents, and educators can understand performance on the test in terms that clearly relate to instruction.

3.3.1.1 English Test

The English test puts the student in the position of a writer who makes decisions to revise and edit a text. Short texts and essays in different genres provide a variety of rhetorical situations. Students must use the rich context of the passage to make editorial choices, demonstrating their understanding of grammar, usage, and mechanics conventions. Students must also apply understanding of rhetorical purposes and strategies, structure and organize sentences and paragraphs for effective communication, and maintain a consistent style and tone. Figure 3.1 shows an example of the construct hierarchy for the English test.

3.3.1.2 English Framework

The English Framework articulates the constructs measured across the grade levels. Each grade is broken out, with descriptive statements of the knowledge and skills measured within each reporting category. Additional information on the English Framework may be found in PDF form at https://www.discoveractaspire.org/act-aspire-technical-manual.

3.3.1.3 English Reporting Categories

The English test measures student knowledge and skill in the following reporting categories.

Production of Writing

The items in this category require students to apply their understanding of the rhetorical purpose and focus of a piece of writing to develop a topic effectively and to use various strategies to achieve logical organization, topical unity, and general cohesion.

• Topic Development: These items require students to demonstrate an understanding of, and control over, the rhetorical aspects of texts by identifying the purposes of parts of texts, determining whether a text or part of a text has met its intended goal, and evaluating the relevance of material in terms of a text’s focus.

• Organization, Unity, and Cohesion: These items require students to use various strategies to ensure that a text is logically organized, flows smoothly, and has an effective introduction and conclusion.

Knowledge of Language

The items in this category require students to demonstrate effective language use through ensuring precision and concision in word choice and maintaining consistency in style and tone.

Knowledge of Language is not included as a reporting category on the Interim assessments but is included on the Classroom assessments.

Conventions of Standard English

The items in this category require students to apply an understanding of the conventions of Standard English grammar, usage, and mechanics to revise and edit text.

• Punctuation, Usage, and Capitalization Conventions: These items require students to edit text to conform to Standard English punctuation, usage, and capitalization.

• Sentence Structure and Formation: These items test understanding of relationships between and among clauses, placement of modifiers, and shifts in sentence construction.

Figure 3.1. ACT Aspire English Test Reporting with Example Skill Targets for One Reporting Category.

3.3.1.4 English Item Types

The ACT Aspire English test puts the student in the position of a writer who makes decisions to revise and edit a text. Different passage types provide a variety of rhetorical situations, each accompanied by selected-response items.

3.3.1.5 English Test Blueprints

The test consists of texts (such as essays, sentences, and paragraphs), each accompanied by selected-response items. Different essay genres are employed to provide a variety of rhetorical situations. Texts vary in length depending on the grade level. The stimuli are chosen not only for their appropriateness in assessing writing skills but also to reflect students' interests and experiences. Some questions refer to underlined or highlighted portions of the essay and offer several alternatives to the underlined portion. These items include "NO CHANGE" to the underlined or highlighted portion as one of the possible responses. Some questions ask about a section of a paragraph or an essay, or about the paragraph or essay as a whole. The student must decide which choice best answers the question posed.

Tables 3.1 and 3.2 show the makeup of the ACT Aspire Interim English tests for each grade level. Non-operational items (7–12 items plus an associated passage, depending on grade), which do not contribute to raw score points, are included on every form but are not included in the counts that follow.

Table 3.1. Specification Ranges by Reporting Category and Grade Level

                              Grade Level
         3       4       5       6       7       8       EHS
POW      7–9     7–9     7–9     11–13   11–13   11–13   17–19
CSE      9–11    9–11    9–11    11–13   11–13   11–13   17–19
Total    18      18      18      24      24      24      36

Notes: POW = Production of Writing; CSE = Conventions of Standard English.

Table 3.2. Percentage of Points by DOK for ACT Aspire Interim English Tests

                              Grade Level
         3        4        5        6        7        8        EHS
DOK1     39–44%   39–44%   39–44%   33–38%   33–38%   33–38%   33–38%
DOK2     22–28%   22–28%   22–28%   21–29%   21–29%   21–29%   21–29%
DOK3     28–33%   28–33%   28–33%   38–42%   38–42%   38–42%   38–42%

ACT Aspire Interim English Tests, Passage Counts by Grade Level

The grades 3–8 English tests contain two passage units. In EHS, the English test contains three passages.
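A form-construction tool might verify that a draft form's operational item counts fall within the specification ranges in Table 3.1. The sketch below is hypothetical (ACT's actual tooling is not described in this manual) and uses the grade 3 ranges: POW 7–9, CSE 9–11, total 18.

```python
# Hypothetical blueprint check against the grade 3 English ranges from
# Table 3.1. Illustrative only; names and structure are invented.

GRADE3_ENGLISH_SPEC = {
    "POW": (7, 9),    # Production of Writing
    "CSE": (9, 11),   # Conventions of Standard English
}
GRADE3_TOTAL = 18

def meets_spec(counts, spec, total):
    """True if every category count is inside its range and the counts
    sum to the required operational total."""
    if sum(counts.values()) != total:
        return False
    return all(lo <= counts.get(cat, 0) <= hi
               for cat, (lo, hi) in spec.items())

print(meets_spec({"POW": 8, "CSE": 10}, GRADE3_ENGLISH_SPEC, GRADE3_TOTAL))  # True
print(meets_spec({"POW": 6, "CSE": 12}, GRADE3_ENGLISH_SPEC, GRADE3_TOTAL))  # False
```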

3.3.2 Mathematics Test

The mathematics test measures the whole of a student's mathematical development, distinguishing progress on topics new to the grade from progress on lasting topics from previous grades. The mathematics test provides nine reporting categories, facilitating actionable insights.

The ACT Aspire mathematics assessments emphasize quantitative reasoning frequently applied to real-world contexts. The mathematics construct requires making sense of the problem, context, and notation; accessing appropriate mathematical knowledge from memory; incorporating the given information; doing mathematical calculations and manipulations; interpreting; applying thinking skills; providing justification; making decisions based on the mathematics; and communicating results. Knowledge of basic formulas and computational skills are assumed as background for the problems, but memorization of complex formulas and extensive computation are not required.

The mathematical content assessed at a given grade is based on mathematical learning expectations up through that grade that are important to future college and career readiness. Test tasks focus on what students can do with the mathematics they have learned, which encompasses not only content but also mathematical practices.

3.3.2.1 Mathematics Reporting Categories

Sometimes tasks cut across these domains; in those cases, they are assigned to the most appropriate single domain for reporting purposes. All tasks involve some level of mathematical practices. Reporting categories are specific to the grade bands indicated in parentheses following each category name. EHS indicates early high school.

Figure 3.2 shows the general flow of the mathematical domains across the grade levels.

Figure 3.2. Mathematics Domains and Primary Currents in the Flow across Grades.

Number & Operations in Base 10 (3–5)

Starting from an understanding of the base 10 system involving 1s and 10s, students extend to 100s, 1,000s, and further, working toward seeing place-value connections between each place and its neighboring places. Students use these place-value connections to extend their number system to include decimals. Students can compute using place-value understanding, flexibly combining and decomposing groups. This thinking undergirds mental mathematics for a lifetime, and being able to explain and critique strategies helps cement these important skills. Concurrently, students see the need for efficient computation, and they understand and can explain the procedures they use. By the end of this grade band, students use place-value strategies fluently for mental arithmetic and explain their reasoning, and they compute fluently with multidigit whole numbers and with decimals to hundredths using efficient computation strategies.
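The place-value computation strategies described above can be made concrete with a small worked example; the sketch below is illustrative (not ACT material) and shows the distributive-property decomposition 6 × 347 = 6 × 300 + 6 × 40 + 6 × 7.

```python
# Illustrative sketch of place-value decomposition, the kind of strategy
# students in this grade band are expected to use and explain.

def place_value_parts(n):
    """Decompose a whole number into place-value parts, e.g. 347 -> [300, 40, 7]."""
    parts, place = [], 1
    while n > 0:
        n, digit = divmod(n, 10)
        if digit:
            parts.append(digit * place)
        place *= 10
    return parts[::-1]

parts = place_value_parts(347)
print(parts)                      # [300, 40, 7]
# 6 x 347 computed as 6x300 + 6x40 + 6x7
print(sum(6 * p for p in parts))  # 2082
```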

Number & Operations—Fractions (3–5)

Students develop an understanding of several useful interpretations of fractions. There are connections from unit fractions to equal sharing, and representations of a whole broken into equal-size pieces. But fractions are also numbers, each with a place on the number line and a role in computation. Students compute with fractions (up through division with a unit fraction and a whole number) and understand how fraction computation is built on the properties of operations with whole numbers. They can explain why a step in a procedure is necessary.

The Number System (6–8)

Students coming into this grade band have nearly finished extending whole-number operations to work with fractions, and they complete the task, understanding division of one fraction by another. Then the cycle continues: students extend the four operations to work for all rational numbers. Students understand the order of rational numbers, which now allows them to plot points in all four quadrants of the coordinate plane, and they understand absolute value in terms of distance. Students solve problems and use properties of the rational numbers to aid in computation. They clearly tie their computations into their arguments and draw conclusions from conditional statements. Students can convert rational numbers to decimal and to fraction form. But not all numbers are rational, and by the end of this grade band students understand that a number like √2 is not rational, and they can work with approximations.

Number & Quantity (EHS)

Coming into high school, students have knowledge of the real number system and they have strong understanding and fluency with rational numbers and the four basic operations so they can work with irrational numbers by working with rational numbers that are close. Students are ready to move from integer exponents to rational exponents. And they are ready to probe a little deeper into properties of the real number system. Later, students extend to complex numbers, which offer the solutions to some simple equations that have no real-number solutions, and students learn to compute in this system. Students are invited to go further, exploring properties of complex numbers and therefore learning more about real numbers. Students are invited to explore vectors and matrices and to view them as number systems with properties, operations, and applications.
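The idea of working with irrational numbers "by working with rational numbers that are close" can be illustrated with successive rational approximations of √2. The sketch below uses the Babylonian (Newton) iteration, a standard method chosen here purely for illustration, keeping every approximation in exact fraction form.

```python
# Illustrative sketch: rational approximations of sqrt(2) via the
# Babylonian iteration x <- (x + 2/x) / 2, kept as exact fractions.

from fractions import Fraction

def sqrt2_approximations(steps):
    x = Fraction(1)
    approximations = []
    for _ in range(steps):
        x = (x + 2 / x) / 2   # each step roughly doubles the correct digits
        approximations.append(x)
    return approximations

for a in sqrt2_approximations(3):
    print(a, "~", float(a))
# 3/2, 17/12, 577/408 -- each a rational number closer to sqrt(2) ~ 1.41421356...
```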

Throughout high school, students are maturing in their understanding of quantity and its connections to measurement. Attending to the types of quantities and units can guide solution strategies and help avoid errors. Students work with derived quantities, and when modeling they choose appropriate quantities to model.

Operations & Algebraic Thinking (3–5)

Students bring an understanding of addition and subtraction as well as counting by 2s and 5s to this grade band and use it to develop an understanding of multiplication in terms of combining equal-size groups and of division as forming equal-size groups for sharing. Arrays and rectangular area provide models for multiplication and division, and from those connections, students learn about properties of operations, using those properties to aid in computation. Solving problems, working with factors and multiples, and learning in other domains all contribute to fluency with all four operations and the ability to explain reasoning. Students use conditional statements (when that is true, then this is true) mainly with specific numbers but working toward general statements. Numerical expressions and equations capture calculations and relationships, and working with number patterns develops operational sense as well as moving toward the concept of function.

Expressions & Equations (6–8)

This category is closely related to Ratio and Proportional Reasoning/Functions, building a foundation of algebraic understanding. Prior to this grade band, students routinely write numerical expressions. Now students routinely write expressions using a letter to represent a number and are working to attach meaning to expressions and to understand how the properties of numbers are useful for working with expressions. Integer exponents come into play for representing very large and very small quantities, and later, students extend to use rational exponents. Students need to understand what it means for a number to be a solution to an equation, and that solving an equation involves a process of reasoning about solutions, justified by the properties of numbers and operations. Linear equations are the focus, and students use graphs and equations to express relationships. Lines (excepting vertical lines) have a well-defined slope, and the concurrent geometric progress with similar triangles allows students to prove this. Students understand the connections between proportional relationships, lines, and linear equations. Reasoning about the solution of linear equations is applied to solve linear inequalities and to find simultaneous solutions of a pair of linear equations.

Ratios & Proportional Reasoning (6–7)

Proportional relationships are multiplicative relationships between quantities and are the next step toward general linear relationships. Students coming into this grade band start with strong multiplication skills and expand the ways of working with "how many times as much?" comparisons, including ratios, rates, and percentages. They look at rate of change and connect unit rates with the slope of a graph, a pattern in a table, and a multiplier in an equation, using all forms of rational numbers. This builds to a general understanding of function as input and output pairs and an understanding of proportional functions and then linear functions. Functions are not tied to a specific representation: tables can show representative values, graphs can show behavior in a representative region, and equations can represent some types of functions. Linear equations y = mx + b represent all linear functions. The Expressions and Equations domain provides ways of working with these equations. Linear functions are useful in modeling.
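The connection between a unit rate and the slope of a proportional graph, described above, can be shown with a short worked example (the quantities below are invented for illustration):

```python
# Illustrative sketch: for a proportional relationship y = kx, the unit
# rate k equals the slope between any two points on its graph.

def slope(p, q):
    (x1, y1), (x2, y2) = p, q
    return (y2 - y1) / (x2 - x1)

# Example data: 3 notebooks cost $7.50 and 8 notebooks cost $20.00,
# so the unit rate is $2.50 per notebook.
points = [(3, 7.50), (8, 20.00)]
unit_rate = slope(points[0], points[1])
print(unit_rate)                 # 2.5
print(slope((0, 0), points[1]))  # 2.5 -- same slope through the origin
```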

Functions (8–EHS)

Functions have been with students since their early years: consider the counting function that takes an input of "seven" and gives "eight," or takes "twelve" and gives "thirteen." In grade 8, the concept of function was named and became an object of study. Understanding some general properties of functions will equip students for problem solving with new functions they create over their continued studies and careers. Functions provide a framework for modeling real-world phenomena, and students become adept at interpreting the characteristics of functions in the context of a problem and become attuned to differences between a model and reality. Some functions accept all numbers as inputs, but many accept only some numbers. Function notation gives another way to express functions that highlights general properties and behaviors. Students work with functions that have no equation, functions that follow the pattern of an equation, and functions based on sequences, which can even be recursive. Students investigate particular families of functions, such as linear, quadratic, and exponential functions, in terms of the general function framework: looking at rates of change, algebraic properties, and connections to graphs and tables, and applying these functions in modeling situations. Students also examine a range of functions like those defined in terms of square roots, cube roots, polynomials, and exponentials, as well as piecewise-defined functions.

Algebra (EHS)

Students coming into high school build on their understanding of linear equations to make sense of other kinds of equations and inequalities: what their graphs look like, how to solve them, and what kinds of applications they have for modeling. Students develop fluency with algebra. They continue to make sense of expressions in terms of their parts in order to apply their fluency strategically when solving problems. Through repeated reasoning, students develop a general understanding of solving equations as a process that provides justification that all the solutions will be found, and they extend this understanding to quadratic equations, polynomial equations, rational equations, radical equations, and systems, integrating an understanding of solutions in terms of graphs. Families of equations have properties that make them useful for modeling. Solutions of polynomial equations are related to factors of the polynomial. Students recognize relationships in applications and create expressions, equations, and inequalities to represent problems and constraints.
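The relationship noted above between solutions of a polynomial equation and factors of the polynomial can be checked numerically with a short Python sketch (the polynomial is our own illustrative example):

```python
# An example polynomial and its factorization:
# p(x) = x^2 - 5x + 6 = (x - 2)(x - 3)

def p(x):
    """Expanded form of the polynomial."""
    return x * x - 5 * x + 6

def factored(x):
    """Factored form of the same polynomial."""
    return (x - 2) * (x - 3)

# The expanded and factored forms agree at every sampled input...
assert all(p(x) == factored(x) for x in range(-10, 11))
# ...and the solutions of p(x) = 0 are exactly the zeros of the factors.
assert p(2) == 0 and p(3) == 0
```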


Geometry (3–EHS)

For grades 3–5, students start this grade band recognizing shapes in terms of attributes, which now make for interrelated categories of shapes that share certain attributes (e.g., the category of parallelograms is made up of all shapes in which each of the four sides is parallel to the opposite side). Viewed as categories, squares make up a subcategory of parallelograms, and so all the properties of parallelograms apply to squares. In general terms, the attributes of a category apply to all subcategories. Shapes can be made from points, lines or line segments, angles, and other shapes. Shapes can have lines of symmetry. Shapes can be decomposed, which has connections to fractions. All of these concepts can be used to solve problems. Students can find examples and counterexamples. Toward the end of this grade band, students extend number-line concepts to the coordinate plane and can plot points in the first quadrant and make interpretations in the context of solving real-world problems.

For grades 6–8, this grade band starts with a focus on area, surface area, and volume. The additive property of area means the area of a polygon can be found by decomposing it into simpler figures the student already can find the area of. Through repeated reasoning, and in conjunction with the Expressions and Equations domain, students bring meaning to formulas for perimeter/circumference, area, surface area, and volume, and solve problems involving triangles, special quadrilaterals, and polygons, with attention to symmetry, working up to circles, cubes, right prisms, cylinders, cones, and spheres. Students develop the spatial reasoning to describe the cross-section of three-dimensional figures sliced by a plane. Students address geometric distance. In the coordinate plane, they start plotting polygons and finding the length of horizontal and vertical sides. Later they reach the Pythagorean Theorem and its converse, justifying these relations and using them to solve problems in two and three dimensions and in a coordinate plane. They understand how to use scale. Students delve into creating figures that meet given conditions based on angle measures and lengths and symmetry, noting when the conditions lead to a unique figure. And this leads to the notion of congruence and then to similarity in terms of dilations, translations, rotations, and reflections. Students learn about these transformations and establish facts about, for example, angle sums in a triangle, and use these facts to write equations and solve problems.
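The use of the Pythagorean Theorem for distance in two and three dimensions, described above, can be sketched in a few lines of Python (an illustrative sketch, not ACT material):

```python
import math

def distance_2d(p, q):
    """Distance in the coordinate plane via the Pythagorean Theorem:
    sqrt(dx^2 + dy^2)."""
    return math.hypot(q[0] - p[0], q[1] - p[1])

def distance_3d(p, q):
    """Applying the theorem again extends the same idea to three dimensions."""
    return math.sqrt((q[0] - p[0]) ** 2
                     + (q[1] - p[1]) ** 2
                     + (q[2] - p[2]) ** 2)

# A 3-4-5 right triangle placed on the coordinate plane:
assert distance_2d((0, 0), (3, 4)) == 5.0
```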

For EHS, building on their knowledge from grade 8, students add depth to what they know about translations, reflections, rotations, and dilations and add precision to their understanding of congruence, similarity, and symmetry. Justification is the glue that holds structure together and lets students answer “Why?” questions, and students justify by using definitions and theorems, tying in calculations and diagrams, considering cases, understanding general versus specific statements, applying counterexamples, and putting statements together into coherent arguments. Students make constructions, solve problems, and model with geometric objects. Informal arguments give a chain of reasoning that leads to formulas for the area of a circle and then on to the volume of cylinders, pyramids, and cones. Students solve right-triangle problems. All of these results transfer to the coordinate plane, where analytic treatment of distance allows students to derive conditions for parallel and perpendicular lines, to split a line segment into pieces with a given ratio of lengths, to find areas, and to develop equations.

Measurement & Data (3–5)

Measurement and Data topics in this grade band provide mutual reinforcement with topics from other domains: unit conversions related to scaling and multiplication, whole numbers and fractions related to measurement and operations. Students solve problems that involve units and conversions, starting with time. They have a framework for measurement and can work with measurements as data for line plots and scaled picture graphs, and they can interpret these displays during problem solving. Students develop the concepts of perimeter and area through measurement and relate these to multiplication and addition, moving on to the concept of volume with its relationships to measurement, multiplication, and addition. Students understand angle measure and connect this to fractions of a circle.

Statistics & Probability (6–EHS)

For grades 6–8, statistics is about distributions of quantities—like the amount of electricity used by each home in Anchorage, Alaska, last month—and how to tell how likely something is—given what is known about the shape of the distribution and being sure to take into account the context of the data and the data-gathering process. Students have already graphed distributions, but now they pay more attention to the shape of the distribution, particularly where it seems to be centered and how spread out it is. Box plots capture some of the differences between distributions in a way that makes comparisons more reliable. Students make informal inferences and learn about randomness in sampling. Probability is a language for conveying likelihood, and students develop uniform probability models and probability models from empirical data. They can use lists, tables, tree diagrams, and simulation results to represent sample spaces and estimate the probability of compound events. For EHS, students add to their understanding of distributions of a single quantity, describing center and spread with statistics and interpreting these in the context of the data.
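The description above of using sample spaces (lists, tables, tree diagrams) and simulation results to estimate probabilities of compound events can be illustrated with a short Python sketch (the dice scenario is our own illustrative example):

```python
import itertools
import random

# Sample space for rolling two fair six-sided dice, as a list of pairs
# (equivalent to a table or tree diagram of outcomes).
sample_space = list(itertools.product(range(1, 7), repeat=2))

# Exact probability of the compound event "the two rolls sum to 7",
# from the uniform probability model:
exact = sum(1 for a, b in sample_space if a + b == 7) / len(sample_space)

# Estimate of the same probability from simulation results:
random.seed(0)
trials = 100_000
hits = sum(1 for _ in range(trials)
           if random.randint(1, 6) + random.randint(1, 6) == 7)
estimate = hits / trials

assert abs(exact - 1 / 6) < 1e-12   # exact value is 6/36 = 1/6
assert abs(estimate - exact) < 0.01  # the simulation estimate is close
```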

Before high school, students have used two-way tables and scatter plots to look at relationships between different quantities, and have used linear functions to model relationships that look linear. Now students pay more attention to informal model fit and use other functions to model relationships; they use models for prediction and they interpret characteristics of the model in the context of the data. From two-way tables, students interpret relative frequencies (including joint, marginal, and conditional relative frequencies, though not tied to these terms) and relate these to probabilities. Students look for association and distinguish correlation from causation.


Randomness unlocks the power of statistics to estimate likelihood, and students learn about the role of randomness in sample surveys, experiments, and observational studies. Students use data to estimate population mean or proportion and make informal inferences based on their maturing judgment of likelihood. They can compare qualities of research reports based on data and can use simulation data to make estimates and inform judgment. Before high school, students have tacitly used independence, but now the idea is developed with a precise definition. Students relate the sample space to events defined in terms of “and,” “or,” and “not,” and calculate probabilities, first using empirical results or independence assumptions.
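The precise definition of independence mentioned above—that two events A and B are independent when P(A and B) = P(A) · P(B)—can be checked on a small uniform sample space with a Python sketch (the coin-flip example and names are ours, purely illustrative):

```python
import itertools

# Uniform sample space for two coin flips: HH, HT, TH, TT.
sample_space = list(itertools.product("HT", repeat=2))

def prob(event):
    """P(event) on a uniform sample space: favorable outcomes / all outcomes."""
    return sum(1 for outcome in sample_space if event(outcome)) / len(sample_space)

A = lambda o: o[0] == "H"          # event: first flip is heads
B = lambda o: o[1] == "H"          # event: second flip is heads
A_and_B = lambda o: A(o) and B(o)  # compound event defined with "and"

# A and B are independent: P(A and B) = P(A) * P(B).
assert prob(A_and_B) == prob(A) * prob(B) == 0.25
```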

For each grade, assessment of student progress within the five domains of topics new to the grade is complemented by assessment of student progress with lasting topics from previous grades. Not only should students retain what they learned in previous grades, they should strengthen and integrate that learning and extend what they can do with those topics.

Progress with topics new to a grade is reflected in the Grade Level Progress reporting category. Progress with lasting topics from previous grades is reflected in the Foundation reporting category. Figure 3.3 shows how these components fit together. The sampling within the Grade Level Progress component is done in such a way as to create a reporting category around each of the five domains of topics new to the grade. This additional layer of detail contributes greater precision to support insights about students.

Figure 3.3. Foundation and Grade Level Progress (illustrated for grade 7).


Mathematical Practices

Mathematical Practices highlight cross-cutting mathematical skills and understandings and the complex and vital ways they integrate with content. Test items focus on important mathematics and engage mathematical practices at various levels. In addition to assessing mathematical practices as a general part of scores, ACT Aspire Interim reports individually on Modeling.

Modeling

Modeling involves two objects: the actual and the model. A model often helps one predict or understand the actual. The Modeling reporting category represents items that involve producing, interpreting, understanding constraints of, evaluating, and improving models.

3.3.2.2 Calculator Policy

Students at grade 6 or above are allowed, but not required, to use a calculator on Interim mathematics tests, as all problems can be solved without one. The testing platform includes a calculator tool for mathematics tests. Schools may also provide calculators, or students may bring their own. Most four-function, scientific, or graphing calculators are permitted. The ACT calculator policy for ACT Aspire is available at www.act.org.

3.3.2.3 Mathematics Item Types, Tasks, Stimulus

Item Sets

Some test items are part of a set: a common stimulus is presented first, followed by test items whose solutions require information from that stimulus. Test items in the set are independent of one another, meaning that getting the correct answer to one item does not hinge on getting the correct answer to another item.

Each form has approximately two item sets, with two to five items in each set.

3.3.2.4 Mathematics Test Blueprints

The mathematics test is composed of selected-response items. The number of items is provided in Tables 3.3–3.6. Non-operational items (5–6 plus associated passage, depending on grade), which do not contribute to raw score points, are included on every form but are not included in the counts that follow.


Table 3.3. Specification Ranges by Reporting Category for Grades 3–5

Reporting Categories                      Number of Items
Grade Level Progress                      17
  Numbers & Operations in Base 10         3–4
  Numbers & Operations—Fractions          3–5
  Operations & Algebraic Thinking         3–4
  Geometry                                3–5
  Measurement & Data                      3–4
Foundation                                8
Modeling                                  ≥8
TOTAL                                     25
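The blueprint arithmetic in Table 3.3 can be sanity-checked with a short Python sketch (data transcribed from the table; the check itself is our own illustration, not an ACT procedure): the five domain ranges must be able to realize the fixed Grade Level Progress count, and the two components must account for the full test length.

```python
# Grade 3-5 blueprint ranges transcribed from Table 3.3.
grade_level_progress_ranges = {
    "Numbers & Operations in Base 10": (3, 4),
    "Numbers & Operations—Fractions": (3, 5),
    "Operations & Algebraic Thinking": (3, 4),
    "Geometry": (3, 5),
    "Measurement & Data": (3, 4),
}
GRADE_LEVEL_PROGRESS, FOUNDATION, TOTAL = 17, 8, 25

# Smallest and largest item counts the domain ranges can produce together.
lo = sum(r[0] for r in grade_level_progress_ranges.values())
hi = sum(r[1] for r in grade_level_progress_ranges.values())

# The fixed Grade Level Progress count is achievable from the ranges,
# and the two components sum to the total test length.
assert lo <= GRADE_LEVEL_PROGRESS <= hi
assert GRADE_LEVEL_PROGRESS + FOUNDATION == TOTAL
```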

Table 3.4. Specification Ranges by Reporting Category for Grades 6–8

                                       Number of Items
Reporting Categories                 Grade 6   Grade 7   Grade 8
Grade Level Progress                 17        20        20
  The Number System                  3–5       3–5       2–3
  Expressions & Equations            3–4       3–5       5–7
  Ratios & Proportional Reasoning    3–4       3–5       –
  Functions                          –         –         3–5
  Geometry                           3–5       3–5       4–6
  Statistics & Probability           3–4       3–5       3–4
Foundation                           8         10        10
Modeling                             ≥10       ≥14       ≥13
TOTAL                                25        30        30

Table 3.5. Specification Ranges by Reporting Category for EHS

Reporting Categories             Number of Items
Grade Level Progress             20
  Number & Quantity              3–4
  Algebra                        4–6
  Functions                      3–5
  Geometry                       3–5
  Statistics & Probability       3–5
Foundation                       10
Modeling                         ≥7
TOTAL                            30


Table 3.6. Percentage of Points by DOK for ACT Aspire Interim Mathematics Tests

                           Grade Level
        3        4        5        6        7        8        EHS
DOK1    12–20%   12–20%   12–20%   12–20%   13–21%   13–21%   13–21%
DOK2    44–52%   44–52%   44–52%   44–52%   53–61%   43–51%   43–51%
DOK3    32–40%   32–40%   32–40%   32–40%   23–31%   33–41%   33–41%

Note: EHS = Early High School (grades 9 and 10)

3.3.3 Reading Test

The reading test measures a student’s ability to read closely, reason logically about texts using evidence, and integrate information from multiple sources. Passages in the reading test include both literary narratives, such as prose fiction, memoirs, and personal essays, and informational texts from the natural sciences and social sciences. Across the grades, these texts span a range of complexity levels that lead up to the reading level of challenging first-year college courses. On the test at each grade, passages at different levels of complexity within a range appropriate for the grade present students with different opportunities to demonstrate understanding. In addition, reading items assess the student’s ability to complete reading-related tasks at various depth-of-knowledge (DOK) levels, and, as a whole, the tests reflect a range of task difficulty appropriate for the grade level. Figure 3.4 shows an example of the construct hierarchy for the Reading test.

3.3.3.1 Reading Framework

The Reading Framework articulates the constructs measured across the grade levels. Each grade is broken out, with descriptive statements of the knowledge and skills measured within each reporting category. Additional information on the Reading Framework may be found in PDF format at https://www.discoveractaspire.org/act-aspire-technical-manual.

3.3.3.2 Reading Reporting Categories

The reading test assesses skills in the following reporting categories:

Key Ideas and Details

The items in this category require students to read texts closely; to determine central ideas and themes and summarize information and ideas accurately; and to understand relationships and draw logical inferences and conclusions, including understanding sequential, comparative, and cause-effect relationships.

Craft and Structure

The items in this category require students to determine word and phrase meanings, analyze an author’s word choice rhetorically, analyze text structure, and understand authorial purpose and characters’ points of view. They interpret authorial decisions rhetorically and differentiate between various perspectives and sources of information.

Integration of Knowledge and Ideas

The items in this category require students to understand authors’ claims, differentiate between facts and opinions, and use evidence to make connections between different texts that are related by topic. Students read a range of informational and literary texts critically and comparatively, making connections to prior knowledge and integrating information across texts. They analyze how authors construct arguments, evaluating reasoning and evidence from various sources.

Integration of Knowledge and Ideas is not included as a reporting category on the Interim assessments but is included on the Classroom assessments.

Figure 3.4. ACT Aspire Reading Test Reporting Categories with Example Skill Targets for One Reporting Category.

Reading Test Complexity and Types of Texts

During the development of the reading test, ACT evaluates reading passages using both quantitative and qualitative analyses. This information is used to determine the levels of text complexity of passages and to build assessments with passages at a range of text complexity appropriate for each grade level.

Table 3.7 shows ACT Aspire’s five levels of text complexity. These levels were established through ACT’s research and validated through a review process that included ELA teachers. Together with tasks that target important reading skills at each grade, these complexity levels ensure a coherent progression across the assessment continuum. At each grade level, the reading test includes a range of text complexities as specified in the table. Note that a sixth level of complexity, Highly Complex, is not included on the table because those passages are only administered on the ACT.


Table 3.7. Passage Coverage by Text Complexity for the ACT Aspire Reading Test

Grade Level: 3 4 5 6 7 8 EHS
Basic: X X X
Straightforward: X X X X X
Somewhat challenging: X X X X X
More challenging: X X
Complex: X

ACT research staff has analyzed the effect of text complexity on reading comprehension. Drawing on this research, ACT designed a measure that reports student performance on specific reading tasks that require comprehension and integration of information across texts. In addition to the overall reading score, each student receives this Text Complexity Progress Measure. While the overall reading score provides a measure of important comprehension skills, this score offers a focused measure of the skills that a reader uses to draw inferences, make connections, and integrate meaning across the challenging texts found in educational and workplace settings.

Performance on the Text Complexity Progress Measure is compared to a readiness level empirically derived from ACT College Readiness Benchmarks. Students who perform above the benchmark level will receive an indication that they are making sufficient progress toward reading the complex texts they will encounter in college and career. Students who perform below the benchmark level will receive recommendations for improvement such as practicing reading appropriately complex texts from a variety of genres, monitoring understanding, and using other strategies to comprehend literary and informational texts.

ACT Aspire reading passages are drawn from the following range of text types:

• Literary narrative: Literary passages from short stories, novels, memoirs, and personal essays

• Social science: Informational passages on topics such as anthropology, archaeology, biography, business, economics, education, environmentalism, geography, history, political science, psychology, and sociology

• Natural science: Informational passages on topics such as anatomy, astronomy, biology, botany, chemistry, ecology, geology, medicine, meteorology, microbiology, natural history, physiology, physics, technology, and zoology


Figure 3.5. ACT Aspire Interim Reading Test Passage Types.

3.3.3.3 Reading Test Blueprints

The reading test comprises four sections in grades 3 through 5 and three sections in grades 6 through EHS. Each section contains one prose passage that is representative of the level and kinds of text commonly encountered in the school curriculum at that grade. Each passage is accompanied by a set of selected-response test items, which focus on complementary and mutually supportive skills that readers use to understand texts written for different purposes across a range of subject areas. Items do not test the rote recall of facts from outside the passage or rules of formal logic, nor do they assess vocabulary in isolation from context. These items require the student to synthesize information from across the texts.

Tables 3.8 and 3.9 show the makeup of the ACT Aspire Interim reading tests for each grade level. Non-operational items (6 plus associated passage), which do not contribute to raw score points, are included on every form but not included in the counts that follow.

Table 3.8. Specification Ranges by Reporting Category and Grade Level

                    Grade Level
         3      4      5      6      7      8      EHS
KID      6–9    6–9    6–9    6–9    6–9    6–9    6–9
CS       5–8    5–8    5–8    5–8    5–8    6–9    6–9
Total    14     14     14     14     14     15     15

Notes: KID = Key Ideas and Details; CS = Craft and Structure


Table 3.9. Percentage of Points by DOK for the ACT Aspire Interim Reading Tests

                           Grade Level
        3        4        5        6        7        8        EHS
DOK1    20–35%   20–35%   20–35%   20–35%   20–35%   20–35%   20–35%
DOK2    40–70%   40–70%   40–70%   40–70%   40–70%   40–70%   40–70%
DOK3    40–70%   40–70%   40–70%   40–70%   40–70%   40–70%   40–70%

ACT Aspire Interim Reading Test, Passage Types by Grade Level

The reading tests for grades 3–7 contain two passage units: one literary narrative and one informational passage. Those for grade 8 and EHS contain three passage units: one literary narrative, one social science passage, and one natural science passage.

Measurement of Text Complexity Progress

Text complexity is a foundation of the Common Core Reading Standards, which state that students must “be able to comprehend texts of increasing complexity as they progress through school” (CCSS Appendix A, p. 2, and Reading Standard 10). At least six items per test are used to determine whether students are making sufficient progress toward reading complex texts, based on ACT’s college and career readiness benchmarks.

3.3.4 Science Test

The ACT Aspire Interim science assessments are designed based on research evidence about what students need to know and be able to do in order to be on a trajectory where they will be successful in college and career.

Only the ACT Aspire Interim science test questions contribute to students’ science scores—questions on the ACT Aspire English, mathematics, and reading tests are not repurposed to contribute to the science score. Analyses of correlations among the ACT Aspire tests support that the science test score represents unique achievement information. ACT National Curriculum Survey (NCS) results also show that educators at all levels, from elementary through postsecondary, report that science is better assessed using a science test rather than a reading, writing, or math test. Reading in science is important and aligns with most state college and career readiness standards related to literacy; for that reason, reading in science is assessed on the ACT Aspire reading test.

3.3.4.1 Science Reporting Categories

The science test assesses and reports on science knowledge, skills, and practices across three domains:

• Interpretation of Data
• Scientific Investigation
• Evaluation of Models, Inferences, and Experimental Results


The knowledge, skills, and practices contained in these domains comprise the ACT College and Career Readiness Standards for Science, which link specific skills and knowledge with quantitatively determined score ranges for the ACT Science test and a benchmark science score that is predictive of success in science at the post-secondary level. The science test is built on these same skills—skills that students need to learn early, and then continually apply and refine, in order to be on a path to college and career readiness in science. All items on the science test are based on authentic scientific scenarios that are built around important scientific concepts and are designed to mirror the experiences of students and working scientists engaging in real science. Some of the items require that the students have discipline-specific content knowledge (e.g., for the Early High School test, knowledge specific to an introductory middle school biology course), but science content is always assessed in concert with science skills and practices. ACT’s research on science curricula and instruction at the high school and post-secondary levels shows that while science content is important, science skills and practices are more strongly tied to college and career readiness in science.

The content of the science test includes biology (life sciences at the earlier grades), chemistry and physics (physical science at the earlier grades), and Earth/space sciences (e.g., geology, astronomy, and meteorology). Advanced knowledge in these areas is not required, but background knowledge acquired in general, introductory science courses may be needed to correctly respond to some of the items in the upper-grade assessments. The assessments do not, however, sample specific content knowledge with enough regularity to make inferences about a student’s attainment of content knowledge in any broad area, or specific part, of the science content domain. The science test stresses science practices over recall of scientific content, complex mathematics skills, and reading ability.

3.3.4.2 Continuum of Complexity

There are a wide variety of scenarios in which students may need to apply their scientific knowledge and reasoning skills. Scenarios are categorized along two dimensions on ACT Aspire: Mode and Complexity. Different modes focus on different aspects of science (e.g., scientific data/results, experimentation, and debate). Complexity is determined by several factors that contribute to how challenging a scientific scenario is likely to be for a student. ACT Aspire focuses in particular on the topic (e.g., weather, photosynthesis rates), the tone (i.e., formality), the nature of the presented data (i.e., the density, content, and structure of graphs, tables, charts, and diagrams), and the difficulty of the embedded concepts (i.e., the introduced and assumed concepts, symbols, terminology, units of measure, and experimental methods). These elements combine in various proportions to result in scenarios of different complexity, and this complexity affects suitability for different grade levels. This complexity also impacts the difficulty of questions and tasks students are asked to perform in a given context (e.g., an easy question can become much more difficult when posed in a more challenging context).

The science test development process and item design are informed by science education research on learning progressions and misconceptions. This research informs the item task designs and test designs to ensure that science knowledge, skills, and practices are tested at developmentally appropriate times, in appropriate ways, and at increasing levels of sophistication from grade 3 through EHS. Core skills and practices are similar across the grade continuum, as our research shows that these skills and practices should be taught early, and continually refined, as a student progresses through school. What differs most from grade to grade in the science test is the complexity of the contexts in which science knowledge, skills, and practices are assessed. The contexts used on the tests place students in rich and authentic scientific scenarios that require them to apply their knowledge, skills, and practices in science to both familiar and unfamiliar situations. With increasing grade level, the scenarios have increasingly complex scientific graphics and experimental designs and are based on increasingly complex concepts. Students must then apply their knowledge, skills, and practices with increasing levels of sophistication. The amount of assumed discipline-specific science content knowledge a student needs to answer questions on the tests also increases. Figure 3.6 illustrates this continuum.


Figure 3.6. ACT and ACT Aspire Interim Science Content Complexity Continuum.

3.3.4.3 Science Item Types

ACT Aspire science items and tasks require students to apply what they have learned in their science classes to novel, richly contextualized problems. The test is structured to collect evidence about students’ ability to, for example, evaluate the validity of competing scientific arguments, scrutinize the designs of experiments, and find relationships in the results of those experiments. To fully cover this complexity continuum, six different stimulus (passage) modes are employed across the science test, varying from grade 3 through Early High School. These include modes focused on data interpretation, scientific experimentation, and scientific debate.


Conflicting Viewpoints (CV)

A passage format used in grades 8 and EHS in which two or more differing scientific explanations/arguments on the same topic are given, sometimes with one or more graphs, tables, and/or diagrams. Items focus on understanding the explicit and implicit points of each explanation, comparing and contrasting the explanations, drawing conclusions from the arguments, and evaluating the validity of the arguments.

Research Summaries (RS)

A passage format used for grades 7–EHS involving a detailed experimental procedure along with one or more graphs, tables, and/or diagrams for experimental setups and results. Items focus on aspects of experimental design, accompanying procedures, obtaining and analyzing data, and drawing conclusions from the data.

Data Representation (DR)

A passage format used for grades 6–EHS in which a brief description is given for one or more graphs, tables, and/or diagrams. Items focus on obtaining and analyzing data and drawing conclusions from the data.

Student Viewpoints (SV)

A passage format used for grades 6 and 7 that is analogous to conflicting viewpoints but with one to three brief and simple explanations/arguments that support an observation (e.g., students may have differing interpretations of a data set or teacher demonstration). Items focus on understanding the explicit and implicit points of the argument(s), drawing conclusions from an argument, and evaluating the validity of an argument.

Science Investigations (SI)

A passage format used for grades 3–7 that is analogous to research summaries but involves shorter, simpler, more student-centered experimental procedures along with one to two graphs, tables, and/or diagrams for experimental setups and results. Items focus on aspects of experimental design, accompanying procedures, obtaining and analyzing data, and drawing conclusions from the data.

Data Presentation (DP)

A passage format used for grades 3–5 that is analogous to data representation, but data sources are simpler (and rarely more than two per unit) and more related to everyday life, with briefer descriptions.


Table 3.10 shows how many passages of each stimulus mode are used at each grade level.

Table 3.10. Stimulus Modes Used on the ACT Aspire Interim Science Test

                    Grade Level
Mode    3     4     5     6     7     8     EHS
CV      –     –     –     –     –     1     1
DP      2     2     2     –     –     –     –
DR      –     –     –     1     1     1     1
RS      –     –     –     –     1     1     1
SI      1     1     1     1     –     –     –
SV      –     –     –     1     1     –     –
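Read together with the mode descriptions above, the grade-to-mode assignments in Table 3.10 can be captured in a small lookup structure. The sketch below is a hypothetical encoding (the names and layout are illustrative, not part of any ACT system) that also lets one check the per-grade passage counts:

```python
# Hypothetical encoding of Table 3.10: passages per stimulus mode per grade.
# Keys are grade levels ("EHS" = Early High School); values map mode -> count.
STIMULUS_MODES = {
    "3":   {"DP": 2, "SI": 1},
    "4":   {"DP": 2, "SI": 1},
    "5":   {"DP": 2, "SI": 1},
    "6":   {"DR": 1, "SI": 1, "SV": 1},
    "7":   {"DR": 1, "RS": 1, "SV": 1},
    "8":   {"CV": 1, "DR": 1, "RS": 1},
    "EHS": {"CV": 1, "DR": 1, "RS": 1},
}

def passages_per_grade(grade: str) -> int:
    """Total number of science passages on the form for a grade level."""
    return sum(STIMULUS_MODES[grade].values())
```

Under this reconstruction, every grade level receives three science passages in total.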


Table 3.11. ACT Aspire Interim Science College and Career Readiness Knowledge and Skill Domain

Skill Code  Skill Statement  (College & Career Readiness Standards), grouped by Skill Area

Interpretation of Data (IOD)

Locating and Understanding
  IOD-LU-01  Select one piece of data from a data presentation  (IOD 201, 401)
  IOD-LU-02  Find information in text that describes a data presentation  (IOD 203, 303)
  IOD-LU-03  Select two or more pieces of data from a data presentation  (IOD 301, 401)
  IOD-LU-04  Identify features of a table, graph, or diagram (e.g., axis labels, units of measure)  (IOD 202)
  IOD-LU-05  Understand common scientific terminology, symbols, and units of measure  (IOD 302)

Inferring and Translating
  IOD-IT-01  Translate information into a table, graph, or diagram  (IOD 403)
  IOD-IT-02  Determine how the value of a variable changes as the value of another variable changes in a data presentation  (IOD 304, 503)
  IOD-IT-03  Compare data from a data presentation (e.g., find the highest/lowest value; order data from a table)  (IOD 402, 502)
  IOD-IT-04  Combine data from a data presentation (e.g., sum data from a table)  (IOD 501, 601, 701)
  IOD-IT-05  Compare data from two or more data presentations (e.g., compare a value in a table to a value in a graph)  (IOD 501, 601, 701)
  IOD-IT-06  Combine data from two or more data presentations (e.g., categorize data from a table using a scale from another table)  (IOD 501, 601, 701)
  IOD-IT-07  Determine and/or use a mathematical relationship that exists between data (e.g., averaging data, unit conversions)  (IOD 504, 602)

Extending and Reevaluating
  IOD-ER-01  Perform an interpolation using data in a table or graph  (IOD 404, 603)
  IOD-ER-02  Perform an extrapolation using data in a table or graph  (IOD 404, 603)
  IOD-ER-03  Analyze presented data when given new information (e.g., reinterpret a graph when new findings are provided)  (IOD 505, 702)


Table 3.11. ACT Aspire Interim Science College and Career Readiness Knowledge and Skill Domain—continued

Skill Code  Skill Statement  (College & Career Readiness Standards), grouped by Skill Area

Scientific Investigation (SIN)

Locating and Comparing
  SIN-LC-01  Find information in text that describes an experiment  (SIN 201, 303)
  SIN-LC-02  Identify similarities and differences between experiments  (SIN 404)
  SIN-LC-03  Determine which experiments utilized a given tool, method, or aspect of design  (SIN 405)

Designing and Implementing
  SIN-DI-01  Understand the methods, tools, and functions of tools used in an experiment  (SIN 202, 301, 302, 402)
  SIN-DI-02  Understand an experimental design  (SIN 401, 403, 501)
  SIN-DI-03  Determine the scientific question that is the basis for an experiment (e.g., the hypothesis)  (SIN 601)
  SIN-DI-04  Evaluate the design or methods of an experiment (e.g., possible flaws or inconsistencies; precision and accuracy issues)  (SIN 701)

Extending and Improving
  SIN-EI-01  Predict the results of an additional trial or measurement in an experiment  (SIN 502)
  SIN-EI-02  Determine the experimental conditions that would produce specified results  (SIN 503)
  SIN-EI-03  Determine an alternate method for testing a hypothesis  (SIN 602)
  SIN-EI-04  Predict the effects of modifying the design or methods of an experiment  (SIN 702)
  SIN-EI-05  Determine which additional trial or experiment could be performed to enhance or evaluate experimental results  (SIN 703)


Table 3.11. ACT Aspire Interim Science College and Career Readiness Knowledge and Skill Domain—continued

Skill Code  Skill Statement  (College & Career Readiness Standards), grouped by Skill Area

Evaluation of Models, Inferences, and Experimental Results (EMI)

Inferences and Results: Evaluating and Extending
  EMI-IE-01  Determine which hypothesis, prediction, or conclusion is, or is not, consistent with a data presentation or piece of information in text  (EMI 401, 601)
  EMI-IE-02  Determine which experimental results support or contradict a hypothesis, prediction, or conclusion  (EMI 505)
  EMI-IE-03  Determine which hypothesis, prediction, or conclusion is, or is not, consistent with two or more data presentations and/or pieces of information in text  (EMI 501, 701)
  EMI-IE-04  Make a prediction and explain why it is consistent with a data presentation or piece of information in text  (EMI 401, 601)
  EMI-IE-05  Explain why presented information, or new information, supports or contradicts a hypothesis or conclusion  (EMI 502, 702)
  EMI-IE-06  Make a prediction and explain why it is consistent with two or more data presentations and/or pieces of information in text  (EMI 501, 701)

Models: Understanding and Comparing (Grade 6 and higher)
  EMI-MU-01  Find information in a theoretical model (a viewpoint proposed to explain scientific observations)  (EMI 201)
  EMI-MU-02  Identify implications and assumptions in a theoretical model  (EMI 301, 402)
  EMI-MU-03  Determine which theoretical models present or imply certain information  (EMI 302, 403)
  EMI-MU-04  Identify similarities and differences between theoretical models  (EMI 404)

Models: Evaluating and Extending (Grade 6 and higher)
  EMI-ME-01  Determine which hypothesis, prediction, or conclusion is, or is not, consistent with a theoretical model  (EMI 401, 601)
  EMI-ME-02  Determine which hypothesis, prediction, or conclusion is, or is not, consistent with two or more theoretical models  (EMI 501, 701)
  EMI-ME-03  Identify the strengths and weaknesses of theoretical models  (EMI 503)
  EMI-ME-04  Determine which theoretical models are supported or weakened by new information  (EMI 504)
  EMI-ME-05  Determine which theoretical models support or contradict a hypothesis, prediction, or conclusion  (EMI 505)
  EMI-ME-06  Use new information to make a prediction based on a theoretical model  (EMI 603)
  EMI-ME-07  Explain why presented information, or new information, supports or weakens a theoretical model  (EMI 602)
  EMI-ME-08  Make a prediction and explain why it is consistent with a theoretical model  (EMI 401, 601)
  EMI-ME-09  Make a prediction and explain why it is consistent with two or more theoretical models  (EMI 501, 701)


3.3.4.4 Science Test Blueprints

The test presents several sets of scientific information, each followed by a number of test items. The scientific information is conveyed in one of three different formats: data representation (graphs, tables, and other schematic forms), research summaries (descriptions of several related experiments), or conflicting viewpoints (expressions of several related hypotheses or views that are inconsistent with one another).

Tables 3.12–3.13 show the makeup of the ACT Aspire Interim science tests for each grade level. Non-operational items (5–6 plus an associated passage, depending on grade), which do not contribute to raw score points, are included on every form but are not included in the counts that follow.

Table 3.12. Specifications Ranges by Reporting Category and Grade Level

                          Grade Level
Category   3      4      5      6      7      8      EHS
IOD        8–11   8–11   8–11   8–11   8–11   6–8    6–8
SIN        5–7    5–7    5–7    5–7    5–7    6–8    6–8
EMI        5–7    5–7    5–7    5–7    5–7    6–8    6–8
Total      21     21     21     21     21     21     21

Notes: IOD = Interpretation of Data; SIN = Scientific Investigation; EMI = Evaluation of Models, Inferences, and Experimental Results.

Table 3.13. Percentage of Points by DOK for the ACT Aspire Interim Science Tests

                          Grade Level
DOK     3        4        5        6        7        8        EHS
DOK1    10–19%   10–19%   10–19%   5–14%    5–14%    5–14%    5–14%
DOK2    43–62%   43–62%   43–62%   43–62%   43–62%   43–62%   43–62%
DOK3    19–29%   19–29%   19–29%   24–33%   24–33%   24–33%   24–33%
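The blueprint constraints in Table 3.12 lend themselves to a simple automated check. The sketch below is illustrative only (not an ACT tool); it validates a hypothetical grades 3–7 form against the reporting-category ranges and the 21-item operational total:

```python
# Hypothetical blueprint check for a grades 3-7 Interim science form.
# Table 3.12 ranges; grade 8/EHS forms would instead use 6-8 per category.
BLUEPRINT_G3_G7 = {"IOD": (8, 11), "SIN": (5, 7), "EMI": (5, 7)}
TOTAL_OPERATIONAL_ITEMS = 21

def meets_blueprint(counts: dict) -> bool:
    """counts maps reporting category -> operational item count on the form."""
    in_range = all(lo <= counts[cat] <= hi
                   for cat, (lo, hi) in BLUEPRINT_G3_G7.items())
    return in_range and sum(counts.values()) == TOTAL_OPERATIONAL_ITEMS
```

For example, a form with 9 IOD, 6 SIN, and 6 EMI items satisfies both the per-category ranges and the 21-item total.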

3.4 Classroom Assessments

ACT Aspire Classroom assessments are designed for implementation between their Interim counterparts described above. The flexible nature of these assessments allows teachers to choose the best time for administration. Each assessment provides short-term, guiding insight into student progress. There are ten online, selected-response assessments in each grade from 3 through 8 and in each content area: English, reading, mathematics, and science. Each five-item assessment is typically mapped to one or two standards and is intended to be completed in 15 to 20 minutes.

Standards coverage for each of the Classroom assessments may be found at http://actaspire.avocet.pearson.com/actaspire/home, under Periodic, Periodic Classroom Reporting.


3.4.1 English Classroom Assessments

The English Classroom assessments for grades 3–8 include a range of written modes such as narrative, informative, and persuasive or argumentative. In some instances, the essays have an authentic voice because they were adapted from the work of student writers. Some of the essays mirror the passage lengths (word counts) used on the Interim and Summative assessments, while others may be longer. Our goal for the Classroom assessments is to engage students with essays that reflect their interests and experiences; in other words, essays that are interesting and informative to a diverse audience.

Sarah Brown Wessling, English language arts teacher and author of Supporting Students in a Time of Core Standards, wrote that “Writing is thinking. To this end, cultivating writers means cultivating thinkers” (2011, p. 76). The ACT Aspire English Classroom assessments require students to think—to make a variety of decisions involving given pieces of writing. Students are asked to improve pieces of writing that contain various incorrect, awkward, or confusing elements. For example, students are asked to make decisions regarding conventions of standard written English in terms of punctuation, usage, and sentence structure. There are instances in which capitalization and spelling are assessed. In addition, students are asked to make decisions concerning rhetorical aspects of writing, such as deciding whether a writer has fulfilled a specified goal, whether an essay is logically organized, or whether the language is precise. The English Classroom assessments provide students with an opportunity to enhance their skills by targeting specific aspects of the writing process, such as revising and editing.

Each essay contains underlined and highlighted portions. The student must decide which change would be best to make; “NO CHANGE” is always one of the choices. Students need to determine whether a suggested change is best in terms of the context of the essay, the conventions of standard written English, or a particular writing problem defined by the question posed. Some items are identified by numbers or letters in brackets or by a blank line; these items pose questions about a portion of the text. Other items pose questions about the essay as a whole. Students must decide which choice is most appropriate in terms of the defined rhetorical situation.

3.4.1.1 Specifications for English Language Arts Classroom Assessments

English

• Each English assessment includes five selected-response items and one essay

• Each English assessment measures the CCSS for Language


English Essays

• The length (word count) for the English passages for the Aspire Classroom assessments is approximately:
  • Grades 3–4: 150 words
  • Grade 5: 250 words
  • Grades 6–8: 300 words

Assessment Items

• Each assessment includes a range of DOK levels, from DOK 1 to DOK 3.
• Each assessment uses the following DOK specifications: 1–2 DOK 1 items; 2–3 DOK 2 items; 1–2 DOK 3 items.
• Each assessment includes items that focus on 1 CCSS standard or items that focus on 2 CCSS standards.

3.4.2 Mathematics Classroom Assessments

The mathematics Classroom assessments include relevant word problems and address concepts that are often difficult for students. CCSSM standards were identified at each grade level to generate a theme of assessments that measure students’ learning as they progress through increasingly complex fraction concepts, skills, and applications.

In grades 3–5, standards from the Number and Operations—Fractions domain of the CCSSM were selected. The grade 3 assessments include questions that ask students to demonstrate their understanding of what fractions are and how to write them, their ability to identify and write equivalent fractions, and their skills related to comparing fractions with the same numerator or denominator. In grade 4, the assessments address more complex ideas by requiring students to compare fractions with different numerators and denominators, to solve word problems based on adding and subtracting fractions with common denominators, and to multiply fractions and whole numbers within relevant contexts. The grade 5 assessments include relevant word problems that involve adding and subtracting fractions with unlike denominators, multiplying fractions, and finding quotients of whole numbers and unit fractions.

The theme of fractions continues in the grades 6 and 7 assessments. The understanding and skills developed during grades 3–5 are applied through solving problems involving ratios and proportions. Questions in the grades 6–7 assessments address standards found in the Ratios and Proportional Relationships domain of the CCSSM. The concept of a ratio is initially introduced in grade 6, so the assessments are geared toward assessing students’ understanding of applying ratios in situations involving unit rates and equivalent ratios. The grade 7 assessments continue to assess students’ ability to apply ratios and proportional reasoning in real-world contexts; they include problem-solving scenarios that involve multiple steps and ratios of fractions.


In grade 8, students begin to transition from applying mathematical concepts in concrete settings to applying them in more generalized, abstract ways. The grade 8 assessments, therefore, transition to considering fractions as part of a larger number system, a domain in the CCSSM. Students are asked to think about fractions as rational numbers, which allows them to think about the characteristics of rational numbers and begin identifying rational and irrational numbers. The grade 8 assessments assess the ability to differentiate between rational and irrational numbers and to approximate the values of irrational numbers using their prior understanding of rational numbers.

Although mathematical concepts are important, the application of the CCSSM Mathematical Practices is also essential. The Classroom assessments at all grade levels promote the Mathematical Practices as part of the problem-solving process. Unique item types that call for reasoning in different ways, questions that ask about mathematical processes, relevant problem-solving scenarios that involve multiple steps and problems that require students to analyze, evaluate, and critique can be found throughout the grades 3–8 Classroom assessments.

Students taking the Classroom mathematics tests in all grades are encouraged to bring a calculator they are familiar with and can use fluently. A calculator tool is available online. Students are permitted to use almost any four-function, scientific, or graphing calculator. The ACT calculator policy for ACT Aspire is available at www.act.org.

3.4.3 Reading Classroom Assessments

The reading Classroom assessments for grades 3–8 include both literary narratives and informational texts of varying levels of complexity, from basic to straightforward to challenging. Some of the assessments mirror the passage lengths (word counts) used on the Interim and Summative assessments, while others will be longer. The goal for the Classroom assessments is to engage students with passages selected from both contemporary and classical works; passages that will encourage students to discuss their thoughts with their teacher and peers, to read the entire text, or to find out more about topics or characters of interest. Taken as a whole, the passages for each grade level represent a range of authors, topics, points of view, and characters.

Many of the passages have more than one assessment, which allows teachers to measure the same and/or different grade-level standards and enables students to practice and demonstrate their developing reading skills. In some cases the assessments go beyond what is explicitly stated in the grade-level standards, capturing additional skills that are needed for college and career readiness as identified in the CCSS anchor standards. Timothy Shanahan, distinguished professor emeritus, wrote in his literacy blog (2012) that “a first reading is about figuring out what a passage says.” So, the reading Classroom assessments require students to demonstrate their understanding of each passage based on a first reading; for the grades 3–7 assessments in particular, students determine the passages’ central ideas or themes.

Subsequent assessments that use the same passage work to deepen students’ understanding by focusing on how or why the passage works, in other words, requiring students to conduct a close reading. Shanahan defines close reading as “an intensive analysis of a text in order to come to terms with what it says, how it says it, and what it means” (2012). Douglas Fisher (2014), professor of language and literacy education at San Diego State University, describes close reading as “a careful and purposeful rereading of a text . . . where students really focus on what the author had to say, what the author’s purpose was, what the words mean, and what the structure of the text tells us.”

• Each reading assessment includes five selected-response items and one passage, unless it aligns to CCSS standards 7, 8, or 9, in which case it includes two or three passages or two types of stimulus material, such as quoted text, graphics, or illustrations.

• Each reading assessment typically measures one or two of the CCSS for English Language Arts, either the Standards for Reading Literature or the Standards for Reading Informational Texts.

• The primary focus of the reading assessments is on the grade-level standard, though the anchor standards will be addressed periodically.

Reading Passages/Text Complexity per Grade

• Each grade level contains both Literature and Informational Texts:
  • Grades 3–5: 50% Literature and 50% Informational Texts
  • Grades 6–8: 40% Literature and 60% Informational Texts

Literature passages may include the genres of prose fiction, poetry, drama, personal essay, and memoir, while informational texts may include literary nonfiction and historical, scientific, and technical texts. Texts will cross a variety of topics and themes, consider points of view, and include diverse authors and characters.

• The length (word count) for the single-text reading passages for the ACT Aspire Classroom assessments is:
  • Grade 3: 250 (minimum) to 600 words (maximum)
  • Grade 4: 300 (minimum) to 700 words (maximum)
  • Grade 5: 375 (minimum) to 775 words (maximum)
  • Grade 6: 500 (minimum) to 825 words (maximum)
  • Grade 7: 500 (minimum) to 900 words (maximum)
  • Grade 8: 500 (minimum) to 950 words (maximum)
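For illustration, the word-count ranges above can be expressed as a small validation helper. The function and table names here are hypothetical, not part of any ACT system:

```python
# Hypothetical word-count limits for single-text reading passages
# (Classroom assessments), transcribed from the ranges above.
READING_WORD_LIMITS = {
    3: (250, 600),
    4: (300, 700),
    5: (375, 775),
    6: (500, 825),
    7: (500, 900),
    8: (500, 950),
}

def passage_length_ok(grade: int, word_count: int) -> bool:
    """Check whether a single-text passage fits its grade's word-count range."""
    lo, hi = READING_WORD_LIMITS[grade]
    return lo <= word_count <= hi
```

For example, a 450-word passage fits the grade 3 range but falls below the grade 6 minimum.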


3.4.4 Science Classroom Assessments

The science Classroom assessments for grades 3–8 are designed to assess student understanding of fundamental scientific concepts, as well as student understanding of the key processes of science. To ensure a link to critical conceptual ideas in science, the standards addressed in the Classroom assessments are aligned to the Disciplinary Core Ideas (DCIs) from the Next Generation Science Standards (NGSS). The DCIs are statements “about the most essential ideas in the major science disciplines that all students should understand during 13 years of school” (NGSS & NSTA, 2014, p. 4). They are grouped into three main disciplines: Physical Science, Life Science, and Earth and Space Science. Each discipline is further subdivided into three to four main core ideas.

The first two science assessments at grades 3–8 focus on the Life Science discipline. Three of the four Life Science core ideas are addressed in the initial offering of assessments: Biological Evolution: Unity and Diversity (grades 3 and 8 assessments), From Molecules to Organisms: Structures and Processes (grades 4 and 7 assessments), and Ecosystems: Interactions, Energy, and Dynamics (grades 5 and 6 assessments). These concepts not only represent key ideas that set the stage for further instruction in life science, but they also encompass areas in which students, even at the postsecondary level, typically hold naïve conceptions or misconceptions that teachers would likely want to address (D’Avanzo, 2003; Nehm & Reilly, 2007).

A majority of the items in the science Classroom assessments are part of stimulus-based passages. Many of these passages include tables, graphs, and figures. The use of these passages and their associated tables, graphs, and figures allows for the assessment of student understanding of science process skills, such as interpreting data, evaluating models, and analyzing scientific investigations, in combination with the assessment of conceptual understanding. In addition, many of these stimulus-based item sets mirror the types of questions students will encounter on the ACT Aspire Interim and Summative assessments.

Ideally, as students engage with the passages and items, they will have an opportunity to demonstrate their understanding of critical conceptual ideas, as well as their ability to employ the process skills associated with true scientific literacy; these process skills are emphasized both in the NGSS and the ACT College and Career Readiness Standards. Finally, many of the passage-based items ask students to determine which claims are supported with evidence from the passage. As Yore (2012) points out, effective instructional practices in science should promote an understanding of “science as inquiry, argument, and constructing knowledge claims and explanations of patterns in nature and naturally occurring events” (p. 3). The infusion of claims and evidence in the science test questions can provide one more valuable outcome from the use of the science Classroom assessments.


CHAPTER 4

Accessibility

ACT Aspire uses a variety of levels of accessibility support, including default embedded tools, open access tools, and full accommodations, to allow students with disabilities to participate in testing. For more specific information about ACT Aspire Periodic accommodations, see the Accessibility User’s Guide: Interim Test Form. Accessibility supports for Classroom ACT Aspire tests are not addressed here, as they are under full local control, meaning they are all determined and provided locally. With regard to Classroom tests, this guide can help inform educators about what types of supports (if locally provided) would honor the intended test constructs and would support valid test performance under Interim or Summative test conditions.

ACT Aspire Interim and Classroom assessments are exclusively delivered online. Hard copy braille and tactile graphics are available for Interim testing.

4.1 Development of the ACT Aspire Accessibility Support System

The Standards for Educational and Psychological Testing (AERA, APA, & NCME, 2014) address fairness in testing as the central concern posed by the threat to validity known as measurement bias. The Standards specify two major concepts that have emerged in the literature for minimizing such bias: accessibility and Universal Design. Accessibility is defined as “the notion that all test takers should have an unobstructed opportunity to demonstrate their standing on the construct(s) being measured” (p. 49). The second major concept, Universal Design, is defined as “an approach to test design that seeks to maximize accessibility for all intended examinees” (p. 50).

The development of all ACT Aspire accessibility supports followed a theory of action known as Access by Design (Fedorchak, 2013), which incorporates into its conceptual structure elements of Universal Design for Learning (UDL) described by the Center for Applied Special Technologies (CAST, 2011) and of Evidence-Centered Design (Mislevy, Almond, & Lukas, 2004; Mislevy & Haertel, 2006).

Universal Design is necessary, but it is not by itself sufficient to meet the needs of every learner (UDL Guidelines, Version 2.0, Principle III, 2011). Multiple means of engagement are required to meet the needs of all users. To build a system of accessibility supports that meets those needs and provides a fair performance pathway for all learners, a set of accessibility levels of support was identified and created using a structured data collection procedure designed at ACT called Accessibility Feature Mapping. The development team conducted a detailed feature analysis of every passage and item to determine which accessibility supports could be permitted or provided for at least seven diverse learner populations while fully honoring the constructs being measured. The data collected from this feature mapping process served to inform the accessibility policy and support system for ACT Aspire. (The ACT Aspire accessibility levels of support resulting from this procedure are described more fully in section 4.2.)

The Accessibility Feature Mapping process involved creating a data chart for every audio-scripted ACT Aspire item in every form. Every feature within each item (passage, stem, equations, graphics, interaction type, etc.) was cross-inventoried by (1) the constructs being measured (original content metadata targets) and (2) the cognitive performance demands of the item (presentation demands, interaction and navigation demands, response demands, and general test condition demands). Then, for each of seven targeted learner populations (see the schematic below), valid access pathways to the item (either already available or shown to be needed by the analysis) were identified and compared with the targets to ensure that all task performance pathways led to valid measurement outcomes and did not violate the construct intended for the item. The schematic components of this development procedure are shown below.

Components of the Accessibility Feature Mapping Process used to determine valid communication performance pathways (accessibility support options) to be allowed during ACT Aspire Testing.


Figure 4.1. Accessibility Feature Mapping Process.

4.2 Test Administration and Accessibility Levels of Support

All accessibility supports permitted during testing and described in the Accessibility User’s Guide: Interim Test Form are designed to remove unnecessary barriers to student performance on the assessments. Based on the feature mapping development process described above, which incorporated the targeted construct definitions, all permitted supports fully honor the content, knowledge, and skills the tests measure.

What claim(s) does performance on this item support?
Task/Item Content Metadata Claims (as available):
• Claim level 1: Content Area
• Claim level 2: Broad subarea within content
• Claim level 3: Domain
• Claim level 4: Primary Cluster
• Claim level 5: Secondary Cluster, as applicable

What performance does this task require?
Communication Demands of the Task Based on Claims (boundaries of acceptable performance evidence):
1. Task Presentation demands
2. Task Interaction & Navigation demands
3. Task Response demands
4. General Test Condition demands

Who has a valid pathway to demonstrating the required performance?
Known Communication Access Pathways Usable/Needed by learners with:
1. Default/Typical communication access needs
2. Blind/Low Vision communication access needs
3. Deaf/Hard of Hearing communication access needs
4. Limited Motor Control communication access needs
5. English Language Learner communication access needs
6. Reading or Language Impaired communication access needs
7. Attention, Focus, or Endurance communication access needs
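One way to picture the "data chart" that the feature-mapping procedure builds for each item is as a record keyed by the schematic components above. This is a speculative sketch; the field names and population labels are paraphrased from the text and are not ACT's actual schema:

```python
# Speculative record structure for one item's Accessibility Feature Mapping
# data chart (field names are illustrative, not ACT's actual schema).
from dataclasses import dataclass, field

# The seven targeted learner populations described in the text.
LEARNER_POPULATIONS = [
    "default", "blind_low_vision", "deaf_hard_of_hearing",
    "limited_motor_control", "english_learner",
    "reading_language_impaired", "attention_focus_endurance",
]

@dataclass
class ItemFeatureMap:
    item_id: str
    constructs: list              # original content metadata targets (claims)
    presentation_demands: list    # how the task is presented
    interaction_demands: list     # interaction & navigation demands
    response_demands: list        # how the student must respond
    condition_demands: list       # general test condition demands
    # population -> accessibility supports that preserve the construct
    valid_pathways: dict = field(default_factory=dict)

    def pathway_for(self, population: str) -> list:
        """Supports identified for a population (empty if default access suffices)."""
        return self.valid_pathways.get(population, [])
```

A development team could then compare each population's pathway against the item's demands to flag items lacking a valid access route.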


4.2.1 Understanding Levels of Accessibility Support

Accessibility is a universal concept that is not restricted to any one group of students. It describes needs we all have, regardless of whether or not we have an official diagnostic label. The term “accessibility” also fully addresses the needs of students with disabilities as well as those who are English learners. The older and more familiar term “accommodations” describes only one intensive level of support that few students actually need.

Over the last decade, educational research and practice have come to recognize that all students have tools they need and use every day to engage in the classroom and communicate effectively what they have learned and can do (see the References section later in this manual). There are different levels of support that students may need in order to demonstrate what they know and can do on academic tests. ACT Aspire assessments make several possible levels of support available. Taken together, these levels of support are called accessibility supports. These accessibility supports:

• allow all students to gain access to effective means of communication that in turn allow them to demonstrate what they know without providing an advantage over any other student

• enable effective and appropriate engagement, interaction, and communication of student knowledge and skills

• honor and measure academic content as the test developers originally intended

• remove unnecessary barriers to students’ demonstrating the content, knowledge, and skills being measured on ACT Aspire assessments.


Figure 4.2. ACT Aspire Levels of Accessibility.

In short, accessibility supports do nothing for the student academically that he or she should be doing independently; they just make interaction and communication possible and fair for each student.

The ACT Aspire accessibility system defines four levels of support that range from minor support (default embedded system tools) to extreme support (modifications). Figure 4.2 shows the architectural structure of ACT Aspire accessibility supports.

ACT Aspire permits the use of those accessibility supports that honor and validly preserve the skills and knowledge that our tests claim to measure, while removing needless, construct-irrelevant barriers to student performance. The four levels of support in the ACT Aspire accessibility system represent a continuum of supports, from least intensive to most intensive, and assume that all users have communication needs that fall somewhere on this continuum. The unique combination of supports needed by a single test taker is called the Personal Needs Profile (PNP). A PNP tells the system which supports to provide for a specific test taker. Many students will not need a documented PNP. When a student’s communication needs are not documented in a PNP, the system treats the student as a default user whose accessibility needs are sufficiently met through the default test administration represented by the base of the


pyramid—that is, without accessibility features other than the basic set already embedded for all test takers. (See support level 1, “Default Embedded System Tools,” at the base of the pyramid in Figure 4.2; these supports are also described in the next section.) This systematic continuum of supports, schematically illustrated in Figure 4.2, results in a personalized performance opportunity for all.
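The continuum and the PNP resolution rule can be sketched schematically. The class layout, tool names, and resolution logic below are illustrative assumptions; the actual ACT Aspire PNP portal and data model are not documented here.

```python
from dataclasses import dataclass, field
from enum import IntEnum


class SupportLevel(IntEnum):
    """The four levels of the ACT Aspire accessibility continuum."""
    DEFAULT_EMBEDDED = 1  # available to everyone automatically
    OPEN_ACCESS = 2       # must be identified in advance in the PNP
    ACCOMMODATION = 3     # must be formally requested and qualified
    MODIFICATION = 4      # not permitted during testing


@dataclass
class PersonalNeedsProfile:
    """One student's documented supports (hypothetical record layout)."""
    student_id: str
    supports: set = field(default_factory=set)


# A few example level-1 tools, drawn from the next section.
DEFAULT_TOOLS = {"highlighter", "answer eliminator", "browser zoom magnification"}


def active_supports(pnp=None):
    """With no PNP, the student is treated as a default user and receives
    only the embedded level-1 tools; a PNP adds its documented supports."""
    if pnp is None:
        return set(DEFAULT_TOOLS)
    return DEFAULT_TOOLS | pnp.supports
```

In this sketch the default user needs no record at all, which mirrors the statement above that many students will not need a documented PNP.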

4.2.2 Support Level 1: Default Embedded System Tools

The first level of supports is called the default embedded system tools (see Figure 4.3). They are automatically available to a default user whose accessibility needs are sufficiently met through the basic test administration experience.

Default embedded system tools meet the common, routine accessibility needs of the most typical test takers. All students are provided these tools, as appropriate, even students who have no documented PNP. Default embedded system tools include but are not limited to the following examples in online tests:

• computer keyboard
• computer screen display
• mouse
• browser zoom magnification
• answer eliminator
• highlighter
• scratch paper
• personal calculators for mathematics tests
• mark items for review

These tools are either embedded in the computer test delivery platform or provided at the local level automatically. They are the accessibility tools that nearly everyone uses routinely and assumes will be made available, although we seldom think of them in this way. These tools serve a basic accessibility function for all.

As seen in Figure 4.3, default embedded system tools are common supports made available to all users upon launch of the test. These tools are either embedded in the basic computer test delivery platform or may be locally provided as needed. No advance request is needed for these supports. Students whose needs are met by default embedded tools do not need a PNP.


Figure 4.3. Default Embedded System Tools.

4.2.3 Support Level 2: Open Access Tools

Open Access tools (see Figure 4.4) are available to all users but must be identified in advance in the PNP, planned for, and then selected from the pull-down menu inside the test to be activated (online), or else provided locally.

Many students’ unique sensory and communication accessibility needs are predictable and can be met through a set of accessibility features designed into the underlying structure and delivery format of test items. Rather than overwhelm the user with all the possible tools, Open Access tools provide just the tools needed by individual users.

Open Access tools are slightly more intensive than default embedded system tools but can be delivered in a fully standardized manner that is valid, appropriate, and personalized to the specific access needs identified within an individual student’s PNP. Some of these require the use of tool-specific administration procedures. In ACT Aspire, Open Access tools include but are not limited to the following examples:

• color overlay
• respond on separate paper (transcribe)
• line reader
• magnifier tool
• answer masking
• dictate responses (scribe)
• keyboard or augmentative or assistive communication (AAC) + local print (transcribe)
• breaks: supervised within each day
• special seating/grouping


• location for movement
• individual administration
• home administration
• other setting
• audio environment
• visual environment
• physical/motor equipment

Open Access tools should be chosen carefully and specifically to prevent overwhelming or distracting the student during testing. Remember: routine annual documentation of successful (and unsuccessful) use of accessibility tools throughout the student’s educational experience helps to inform and improve future choices.

As seen in Figure 4.4, Open Access tools may be used by anyone, but to be activated they must be identified in advance in the PNP, planned for, and selected from the pull-down menu inside the test (online), or else provided locally. Room supervisors must follow the required procedures. Users should be practiced with and comfortable using these types of tools, both on their own and in combination with any other tools.

Figure 4.4. Open Access Tools.
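The activation rule for this level can be restated as a small check: an Open Access tool is usable only when it was identified in advance in the student's PNP. The tool names and function below are illustrative assumptions, not the platform's API.

```python
# Example level-2 tools from the list above.
OPEN_ACCESS_TOOLS = {"color overlay", "line reader", "magnifier tool",
                     "answer masking", "dictate responses"}


def can_activate(tool, pnp_supports):
    """An Open Access tool appears in the in-test pull-down menu (or is
    provided locally) only if it was identified in advance in the PNP."""
    return tool in OPEN_ACCESS_TOOLS and tool in pnp_supports
```

Note that a tool outside this level (for example, extra time, which is an accommodation) would fail this check even if listed in a PNP, since it follows the separate request process described in the next section.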

4.2.4 Support Level 3: Accommodations

Accommodations are high-level accessibility tools needed by relatively few students (see Figure 4.5). The ACT Aspire system requires accommodation-level supports to be requested by educational personnel on behalf of a student


through the online PNP process. This will allow any needed resources to be assigned and documented for the student.1

Typically, students who receive this high level of support have a formally documented need for resources or equipment that requires expertise, special training, and/or extensive monitoring to select, administer, and even to use the support effectively and securely. These can include but are not limited to the following examples:

• text-to-speech English audio
• text-to-speech English audio + orienting description for blind/low vision
• word-to-word dictionary
• human reader, English audio
• translated test directions
• braille + tactile graphics
• sign language interpretation
• abacus, locally provided
• extra time
• breaks: securely extend session over multiple days

Decisions about accommodation-level supports are typically made by an educational team on behalf of and including the student. Accommodation decisions are normally based on a formal, documented evaluation of specialized need. Accommodation supports require substantial additional local resources or highly specialized, expert knowledge to deliver successfully and securely.

As seen in Figure 4.5, accommodations are available to users who have been qualified by their school or district to use them. ACT Aspire recommends that students who use accommodation-level supports have a formally documented need as well as relevant knowledge and familiarity with these tools. Accommodations must be requested through the online PNP process. Any formal qualifying procedure that is required by the responsible educational authority must be completed prior to completing the PNP request process.

1 Qualifying procedures or formal documentation required to request and receive accommodation-level support during ACT Aspire testing should be set by schools.
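The request flow described above can be sketched as a short function: the local qualifying procedure must be completed before a request is recorded through the PNP process. The function and its arguments are illustrative assumptions, not an actual ACT Aspire interface.

```python
def request_accommodation(support, locally_qualified, pnp_requests):
    """Accommodation-level supports are requested through the online PNP
    process, but only after any qualifying procedure required by the
    responsible educational authority has been completed."""
    if not locally_qualified:
        return False  # complete the local qualifying procedure first
    pnp_requests.append(support)  # recorded so resources can be assigned
    return True
```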


Figure 4.5. Accommodations.

4.2.5 Support Level 4: Modifications

Modifications are supports that are sometimes used during instruction, but they alter what the test is attempting to measure and thereby prevent meaningful measurement of the construct being tested (see Figure 4.6). Because modifications violate the construct being tested, they invalidate performance results and communicate low expectations of student achievement. Modifications are not permitted during ACT Aspire testing. (Modifications are further discussed in the Accessibility User’s Guide in the section titled “When Instruction and Assessment Supports Differ.”)

As seen in Figure 4.6, modifications are supports that alter what the test is attempting to measure and therefore are not permitted in ACT Aspire tests.

Figure 4.6. Modifications.
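The rule that modifications are never permitted can be expressed as a simple guard over the numeric levels used in this chapter; this sketch assumes levels 1 through 4 as defined above.

```python
MODIFICATION_LEVEL = 4  # level 4 of the four-level continuum


def validate_support_level(level):
    """Reject modification-level supports: they alter what the test measures
    and would invalidate the resulting scores."""
    if level == MODIFICATION_LEVEL:
        raise ValueError("Modifications are not permitted during ACT Aspire testing.")
    if level not in (1, 2, 3):
        raise ValueError("Unknown support level: %r" % (level,))
```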


4.3 Accommodations, Open Access, and Embedded Tools

In our commitment to provide a level playing field and fair performance opportunities for all students, ACT Aspire provides an integrated system of accessibility supports that includes accommodations as well as other, less intensive levels of accessibility support. At times, supports provided for those who test in the online format are combined with other types of locally provided supports.

All students who use accessibility features should have this use documented in the online Personal Needs Profile (PNP) portal by appropriate school personnel. For this reason, we have provided the general description of ACT Aspire accessibility supports here in one section. Full procedural requirements and instructions for using permitted supports during test administration are provided in the ACT Aspire Accessibility User’s Guide: Interim Test Form. In addition, the following procedural appendices and test administration resources are provided with the ACT Aspire Accessibility User’s Guide: Interim Test Form:

• Appendix A: Personal Needs Profile (PNP) Worksheet—Interim Testing
• Appendix B: General Response Dictation and Scribing Procedures—Interim Testing
• Appendix C: Guidelines for Sign Language Interpretation—Interim Testing
• Appendix D: Approved Bilingual Word-to-Word Dictionaries—Interim Testing
• Appendix E: Procedures for Local Delivery of Read-Aloud Support—Interim Testing

2016 ACT Aspire Interim tests differ from Summative tests in the following ways:

• ACT Aspire Interim tests are exclusively delivered online.
• To be scored, all student responses must be returned through the online platform.
• Students must use browser zoom magnification and the magnifier tool, as needed, instead of large-print paper tests.
• Hard-copy (paper) braille and tactile graphics are available for ACT Aspire Interim testing and must be ordered in advance, but there is no paper answer document or paper companion test proctor booklet for the braille test. All student responses—even the responses of blind users—must be provided through the online system. Unlike the Summative test, hard-copy braille materials for the Interim test should be kept for future use. Both Unified English Braille (UEB) with Nemeth Math/Science and English Braille American Edition (EBAE) are available.
• Spanish translation of Interim test items is not available.


• Translations of test directions may be provided at the local level in the language needed by the student. No previously recorded online translated directions are available for this test.

• Interim tests are untimed. Timing of these tests is determined and controlled locally.

It is strongly recommended that use of all accommodation-level accessibility supports (whether provided locally or by test provider) be chosen by the appropriate educational team (as defined by the responsible educational authority) to meet individual student need, and then planned, practiced, and documented prior to the test.

Tables 4.1–4.4 on the following pages identify the accessibility supports available in the Interim ACT Aspire online test delivery format.

Table 4.1. Interim Online Testing Presentation Supports

Text-to-Speech (English Audio). Support level: Accommodation*
Reading: Directions only; English: Directions only; Math: Yes; Science: Yes
• Intended for users with the ability to see graphics.

Text-to-Speech (English Audio + Orienting Description). Support level: Accommodation*
Reading: Directions only (then must use Braille + Tactile Graphics); English: Directions only (then must use Braille + Tactile Graphics); Math: Yes (with Braille + Tactile Graphics); Science: Yes (with Braille + Tactile Graphics)
• Intended for users with blindness or low vision.
• Requires: Braille + Tactile Graphics companion; response support to record responses; time for shipment of braille materials (if none are available locally from a prior interim administration).
• Recommended: Extra time.

Translated Test Directions. Support level: Accommodation*
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Allowed for all grades.
• Requires: locally provided.

Word-to-Word Dictionary, ACT-Approved. Support level: Accommodation*
Reading: —; English: —; Math: Yes; Science: Yes
• Requires: locally provided.

Braille, Contracted, American Edition (EBAE), Includes Tactile Graphics (TTS Audio). Support level: Accommodation*
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Requires: Response support to record responses; time for shipment of materials.
• Recommended: Extra time.

* Qualification for use of permitted accessibility supports must follow the policies of your local educational authority.


Table 4.1. Interim Online Testing Presentation Supports—continued

Braille, Uncontracted, American Edition (EBAE), Includes Tactile Graphics (TTS Audio). Support level: Accommodation*
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Requires: Response support to record responses; time for shipment of materials.
• Recommended: Extra time.

Braille, Contracted, Unified English Braille (UEB), Includes Tactile Graphics (TTS Audio). Support level: Accommodation*
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Requires: Response support to record responses; time for shipment of materials.
• Recommended: Extra time.

Braille, Uncontracted, Unified English Braille (UEB), Includes Tactile Graphics (TTS Audio). Support level: Accommodation*
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Requires: Response support to record responses; time for shipment of materials.
• Recommended: Extra time.

American Sign Language (ASL): Directions Only (English Text). Support level: Accommodation*
Reading: Directions only; English: Directions only; Math: Directions only; Science: Directions only
• Requires: locally provided.

American Sign Language (ASL): Test Items (English Text). Support level: Accommodation*
Reading: —; English: —; Math: Yes; Science: Yes
• Requires: locally provided 1:1 administration.
• Recommended: Extra time.

Signed Exact English (SEE): Directions Only (English Text). Support level: Accommodation*
Reading: Directions only; English: Directions only; Math: Directions only; Science: Directions only
• Requires: locally provided.

Signed Exact English (SEE): Test Items (English Text). Support level: Accommodation*
Reading: —; English: —; Math: Yes; Science: Yes
• Requires: locally provided 1:1 administration.
• Recommended: Extra time.

Cued Speech. Support level: Accommodation*
Reading: —; English: —; Math: Yes; Science: Yes
• Requires: locally provided.

* Qualification for use of permitted accessibility supports must follow the policies of your local educational authority.


Table 4.1. Interim Online Testing Presentation Supports—continued

Color Overlay. Support level: Open Access
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Platform tool unavailable; a color overlay may be locally provided using an acetate overlay taped to the screen.

Line Reader Mask. Support level: Open Access
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Online platform tool; may be locally provided.

Magnifier Tool. Support level: Open Access
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Online platform tool; may be locally provided.

Browser Zoom Magnification. Support level: Embedded
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Online only.

* Qualification for use of permitted accessibility supports must follow the policies of your local educational authority.

Table 4.2. Interim Online Testing Interaction and Navigation Supports

Abacus. Support level: Accommodation*
Reading: —; English: —; Math: Yes; Science: —
• Requires: locally provided.

Answer Masking. Support level: Open Access
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Online platform tool.

Answer Eliminator. Support level: Embedded
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Online platform tool.

Highlighter Tool. Support level: Embedded
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Online platform tool available in all non-audio forms. Not available with audio text-to-speech.

Scratch Paper. Support level: Embedded
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Requires: locally provided.

Calculator. Support level: Embedded
Reading: —; English: —; Math: Yes; Science: —
• Calculators not permitted at grades 3–5 for Interim tests only.
• Follow the ACT Aspire Calculator Policy; accessible calculators may be used.

* Qualification for use of permitted accessibility supports must follow the policies of your local educational authority.


Table 4.3. Interim Online Testing Response Supports

Electronic Spell Checker. Support level: Accommodation*
Reading: —; English: —; Math: Yes; Science: Yes
• Requires: a locally provided separate device, which must meet the specifications given in the Procedures for Administration in the Guide.

Respond on Separate Paper. Support level: Open Access
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Requires: locally provided; response transcription.

Dictate Responses. Support level: Open Access
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Requires: Follow the procedure in the Accessibility Guide.
• Recommended: Extra time.

Keyboard or AAC + Local Print. Support level: Open Access
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Requires: Response transcription; original work must be returned.
• Recommended: Extra time.

Mark Item for Review. Support level: Embedded
Reading: Yes; English: Yes; Math: Yes; Science: Yes

* Qualification for use of permitted accessibility supports must follow the policies of your local educational authority.

Table 4.4. Interim Online Testing General Test Condition Supports

Extra Time†. Support level: Accommodation*
Reading: Yes; English: Yes; Math: Yes; Science: Yes
• Interim test timing is locally decided, not online-controlled.

The following Open Access supports are available for all four content areas (Reading, English, Math, and Science):
• Breaks: supervised within each day
• Special seating/grouping
• Location for movement
• Individual administration
• Home administration
• Other setting
• Audio environment
• Visual environment
• Physical/motor equipment

* Qualification for use of permitted accessibility supports must follow the policies of your local educational authority.

† Extra time represents the maximum allowed within a same-day test. A session may end earlier if the time is not needed. Automatic assignment of extra time occurs only with TTS audio formats. If using only a paper-form braille test without TTS, extra time must be manually selected in the PNP.
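The extra-time rule in the note to Table 4.4 can be restated as a small decision function. The function name and return strings are illustrative assumptions.

```python
def extra_time_assignment(uses_tts_audio, uses_paper_braille_without_tts):
    """Per the note to Table 4.4: extra time is assigned automatically only
    with TTS audio formats; paper-form braille without TTS requires a manual
    selection in the PNP."""
    if uses_tts_audio:
        return "automatic"
    if uses_paper_braille_without_tts:
        return "manual PNP selection"
    return "not assigned"
```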


CHAPTER 5

Test Administration

The ACT Aspire Periodic Assessment Guide contains the instructions for administering the ACT Aspire Interim assessments (grades 3 through Early High School in English, mathematics, reading, and science) and the ACT Aspire Classroom assessments (grades 3–8 in English, mathematics, reading, and science). The test coordinator is the main ACT Aspire contact at a school and the person who makes arrangements for the test administration. The room supervisor is responsible for administration of the tests to the students in a testing room.

All training and test administration resources are available online on the websites associated with ACT Aspire: actaspire.tms.pearson.com houses training videos, actaspire.avocet.pearson.com includes links to other training materials, and actaspire.pearson.com provides access to many more resources.

5.1 Policies and Procedures

The ACT Aspire Periodic Assessment Guide provides guidelines for administering ACT Aspire Periodic tests. It is important to follow these guidelines to successfully measure students’ academic skills. Prohibited behavior is outlined in the Room Supervisor Manual.

5.2 Standardized Procedures

Throughout the ACT Aspire Periodic Assessment Guide, there are detailed directions for securing test materials and administering tests in a standardized manner.

Relatives of students taking ACT Aspire should not serve in the role of room supervisor in the same testing room as the student relative. It is permissible for a relative to serve as a room supervisor in the same school/district as a related


student, provided that student is not testing in the room being supervised by the related testing staff person.

Use of Calculators on ACT Aspire

Students in grades 3–5 are not permitted to use calculators on the Interim mathematics tests. Calculators are allowed on all levels of the Classroom mathematics test.

Students in grade 6 and above are allowed, but not required, to use an approved calculator on Interim mathematics tests. However, all problems can be solved without a calculator.

The TestNav 8 testing platform includes a calculator tool for all mathematics tests except Interim assessments for grades 3–5. Schools may also provide calculators or each student may bring his or her calculator to the test.
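The calculator policy above reduces to a small rule, sketched here; the function name and the string arguments are illustrative, not part of any ACT Aspire system.

```python
def calculator_permitted(grade, test_type):
    """Interim mathematics tests: no calculators at grades 3-5, optional from
    grade 6 up. Classroom mathematics tests: allowed at all levels."""
    if test_type == "classroom":
        return True
    if test_type == "interim":
        return grade >= 6
    raise ValueError("unknown test type: %r" % (test_type,))
```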

Scratch Paper

Each student should be given one sheet of scratch paper for use at the start of each testing session. Students may request and receive additional blank sheets for scratch paper during testing. Students are allowed to use pencils or pens with the scratch paper.

5.3 Selecting and Training Testing Staff

The test coordinator is responsible for selecting and training all room supervisors and other testing staff.

5.3.1 Room Supervisors

Typically, teachers will administer the tests to students during regular class periods.

The test coordinator should be sure that everyone involved in test administration has access to the ACT Aspire Periodic Assessment Guide and is familiar with its contents. All manuals are periodically updated, so it is important to check the Avocet website for updated versions before each new test administration. A room supervisor is needed in each testing room to read directions and monitor students.

Before the test day, all testing personnel should read all of the testing instructions carefully, particularly the verbal instructions, which will be read aloud to students on the test day. It is important that testing personnel are familiar with these instructions.


5.3.1.1 Room Supervisor Qualifications

The ACT Aspire test coordinator should confirm that the room supervisor(s) meet all of the following criteria. Each room supervisor should be

• proficient in English
• experienced in testing and classroom management
• a staff member of the institution or district where the test administration will take place

To protect both students and the room supervisor from questions of possible conflict of interest, the following conditions should also be met. The room supervisor should

• not be a relative or guardian of a student in the assigned room
• not be a private consultant or individual tutor whose fees are paid by a student or student’s family

5.3.1.2 Room Supervisor Responsibilities

Specific responsibilities are:

• Read and thoroughly understand the policies, procedures, and instructions in the ACT Aspire Periodic Assessment Guide and other materials provided.

• Supervise a test room.
• Start a test session.
• Help students sign in to the online testing system.
• Read test instructions.
• Walk around the testing room during testing to be sure students are working on the correct test and to observe student behavior.
• Monitor the online testing system as needed.
• Pay careful attention to monitoring students’ behavior during the entire testing session.
• Collect and account for all scratch paper and authorization tickets before dismissing students.
• Ensure students have stopped testing and have correctly signed out of the online testing system.
• Complete documentation of any testing irregularities.

5.3.2 Responsibilities of Other Testing Staff

Other school staff can assist the room supervisor with an administration to a group, according to the policies and procedures in the ACT Aspire Periodic Assessment Guide. If used, these staff members should meet the same qualifications as a room supervisor (see above).


5.3.3 Staff Training Sessions

ACT Aspire recommends that a training session be conducted prior to testing for all testing staff to discuss the testing guidelines and organizational details, including:

1. Security and Materials
   a. Describe how to use the online testing system—see the ACT Aspire Portal User Guide posted at actaspire.avocet.pearson.com for step-by-step instructions.
   b. Emphasize that room supervisors must collect used and unused scratch paper and login credentials after testing.
   c. Emphasize that staff members should never leave a test room unattended.
   d. Emphasize that test sessions must be started in the Portal before students can sign in to the test.
2. Activities Before the Test
   a. Determine which set of verbal instructions room supervisors are to follow. Room supervisors should clearly mark those instructions in their manuals.
3. Test Day
   a. Discuss when and where staff members are to report on the test day.
   b. Determine how to handle late arrivals.
   c. Stress that verbal instructions for the tests must be read verbatim.
   d. Stress that login credentials should not be distributed prior to admitting students.
   e. Emphasize that staff members should not read (other than the ACT Aspire Periodic Assessment Guide), correct papers, or do anything not related to administering the test. Their attention should be focused on the students.
   f. Emphasize that conversations among staff must be quiet and kept to a minimum. Even whispered conversations can be distracting to students while testing.
   g. Note that during the test, staff members should walk quietly around the room, be available to respond to students’ questions, assist in the case of illness, and check that students are working on the correct test. Staff should also ensure students have signed in to the correct test and assist them with technical or system navigation issues.
   h. Discuss procedures for a student leaving during the test to go to the bathroom.
   i. Discuss what actions to take in the case of a group irregularity (e.g., a power outage) or an emergency.
   j. Discuss potential individual irregularities and actions to take.
4. After the Test
   a. Emphasize that room supervisors must collect all used and unused scratch paper and login credentials and return them to the test coordinator.
   b. Emphasize that all test sessions must be closed in the Portal after testing has completed.


CHAPTER 6

Test Security

In order to ensure the validity of ACT Aspire test scores, test takers, individuals who have a role in administering the tests, and those who are otherwise involved in facilitating the testing process must strictly observe ACT Aspire’s standardized testing policies, including the Test Security Principles and test security requirements. Those requirements are set forth in the instructions for the ACT Aspire test and the ACT Aspire Test Supervisor's Manual and may be supplemented by ACT from time to time with additional communications to test takers and testing staff.

ACT Aspire’s test security requirements are designed to ensure that examinees have an equal opportunity to demonstrate their academic achievement and skills, that examinees who do their own work are not unfairly disadvantaged by examinees who do not, and that the scores reported for each examinee are valid. Strict observance of the test security requirements is necessary to safeguard the validity of the results.

Testing staff must protect the confidentiality of the ACT Aspire test items and responses. Testing staff should be aware of and competent in their roles, including understanding ACT Aspire's test administration policies and procedures and acknowledging and avoiding conflicts of interest in their roles as test administrators for ACT Aspire.

Testing staff must be alert to activities that can compromise the fairness of the test and the validity of the scores. Such activities include, but are not limited to: cheating and questionable test-taking behavior (such as copying answers or using prohibited electronic devices during testing); taking photos or making copies of test questions or test materials; and test proctor or test administrator misconduct (such as providing answers or questions to test takers or permitting test takers to engage in prohibited conduct during testing).


6.1 Data Security

ACT systems hardening baselines are built from Center for Internet Security (CIS) standards (see http://www.cisecurity.org) and modified to meet the risk profile of the applications and systems they support, per CIS guidance.

6.1.1 Personally Identifiable Information

ACT Aspire, LLC's general policy on the release of personally identifiable information (PII) collected as part of the ACT Aspire program is that we will not provide such information to a third party without the consent of the individual (or, in the case of an individual under age 18, the consent of a parent or legal guardian). However, ACT Aspire may share PII as follows, as allowed by applicable laws:

• ACT Aspire will share PII with its member companies, NCS Pearson, Inc., and ACT, Inc. Both member companies provide services to ACT Aspire that are essential for ACT Aspire to provide testing services, and both members will use PII only in the same ways ACT Aspire can, in accordance with this policy. In the event of dissolution of ACT Aspire, LLC, its members will assume ownership of its assets, including routine business records that may include PII.

• ACT Aspire may provide PII to the state once it has been determined that the state will pay for the individual to take the assessment. This PII may include linking information relating to ACT Aspire assessments with information relating to ACT programs such as the ACT assessment.

• If the individual is under 18 years of age, ACT Aspire may provide PII to the individual’s parent or guardian, who may direct us to release the information to third parties. If the individual is 18 years of age or older, ACT Aspire may provide PII to the individual, who may direct us to release the information to third parties.

• ACT Aspire may enter into contracts with third parties to perform functions on its behalf, which may include assisting it in processing PII. These parties are contractually required to maintain the confidentiality of the information and are restricted from using the information for any purposes other than performance of those functions.

• ACT Aspire may disclose PII to facilitate the testing process, for example, to test proctors.

• ACT Aspire may provide PII as required or allowed by operation of law, or to protect the health and safety of its customers or others.

ACT Aspire may authorize its members and other subcontractors to use de-identified and aggregate data for other purposes, such as research. PII will not be included in any de-identified or aggregate data sets.


6.1.2 Security Features of Data Systems

ACT employs the Principle of Least Privilege when providing access to users, and access entitlement reviews are conducted periodically. All student information is secured and password-controlled. Access to student information during various testing processes is limited to personnel who need that access in order to fulfill the requirements of the contract. Access to confidential student information is reviewed on a periodic basis. ACT is modernizing its infrastructure to meet established policy, and that project is ongoing; ACT takes a risk-based approach to prioritizing the order in which systems are modernized. ACT does not permit any customer access to data systems in its environment, as this would create exposure for other ACT customers.

The overall approach to security and cyber security standards by Pearson, ACT Aspire’s subcontractor, includes information about how the systems prevent infiltration and protect student information. Pearson’s overall approach involves the following facets:

Security and Audit Management. Pearson undergoes regular audits from its internal corporate audit function as well as third-party audits. Identified gaps are reported to the Pearson Chief Information Security Officer, the responsible Business Information Security Officers, and the specific business function's management, where a remediation plan is identified, responsibilities are assigned, milestones are created, and follow-up documentation is stored. Summary reports from the independent audits can be made available to the OIT Office of Cyber Security upon request.

Update Processes. The change management process manages changes so as to optimize risk mitigation, minimize the severity of any known impact, and succeed on the first attempt. This includes additions, modifications, and removals of authorized components of services and their associated documentation. The process includes acknowledging and recording changes; assessing the impact, cost, benefit, and risk of proposed changes; obtaining approval(s); managing and coordinating implementation of changes; and reviewing and closing requests for change. Critical steps in the change process require management approval before proceeding to the next step.

System Behavior Monitoring. Both during test taking and at other times, Pearson monitors for anomalous activity throughout the entire system, not just at the application layer. Should student testing activity trigger a behavior alert, the test is halted and the proctor is alerted. System monitors may also be triggered, which generate an alert to the appropriate team for further investigation.

Features That Prevent Infiltration. Several methods mitigate the risk of uninvited access to systems. All external traffic is encrypted. All item content payloads are encrypted end-to-end using AES-128. All student responses are encrypted using AES-256, with a different key than the content payloads. Students receive a one-time password to take a test. Account and password controls meet accepted standards for length, complexity, lockout, reuse, and expiration.

Systems and components use the following appropriate protocols and design features to neutralize known threats:

Component-to-Component. All sensitive communication between application components that crosses the Internet is encrypted using Secure Sockets Layer (SSL)/Transport Layer Security (TLS). The strongest version of TLS is provided to the user's browser depending on that browser's capabilities. This does not apply to assessment item content delivery, because item payloads are encrypted as described below. A different method enables proctor caching, which schools require to reduce their overall bandwidth requirements.

User Authentication and Authorization. Users are created only when there is a business need and are granted a user role in the system based on job requirements. A user requests access with the specific privileges (a role) needed to perform their required activities. User accounts are authorized to access only the data specific to their role and organization. The request for access is reviewed by the appropriate program manager and either approved or denied. Users are removed as part of an overall termination process in accordance with Human Resources policy. Network access is removed immediately, and accountable managers are notified of specific applications from which the users must have access removed. All applications, devices, databases, and network access require authentication of the user.

Assessment Item-Level Security. When a test is marked "secure" during publishing, each individual item is encrypted using the Advanced Encryption Standard (AES) with a 128-bit key. Test items remain encrypted until they are downloaded to the TestNav client, decrypted, and presented to the student during testing.

Student Response and Score Data Security. Student responses are encrypted using an AES 256-bit key. Score data are communicated between components using SSL/TLS.

Data Storage. Database backups are AES encrypted using Amazon's S3 Server-Side Encryption functionality.
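The scheme described above (separate AES keys for item content and student responses, with authenticated encryption applied before payloads leave the server) can be illustrated with a minimal sketch. This uses the third-party Python cryptography package; the key handling, nonce framing, and payload formats below are illustrative assumptions, not ACT's or Pearson's actual implementation.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_payload(plaintext: bytes, key: bytes) -> bytes:
    """AES-GCM encrypt; the 12-byte nonce is prepended to the ciphertext."""
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_payload(blob: bytes, key: bytes) -> bytes:
    """Split off the nonce, then authenticate and decrypt the remainder."""
    return AESGCM(key).decrypt(blob[:12], blob[12:], None)

# Separate keys, as described above: 128-bit for item content,
# 256-bit for student responses.
item_key = AESGCM.generate_key(bit_length=128)
response_key = AESGCM.generate_key(bit_length=256)

item_blob = encrypt_payload(b"<item id='1'>Which value ...</item>", item_key)
response_blob = encrypt_payload(b'{"item": 1, "answer": "C"}', response_key)
```

Prepending the nonce is one common framing convention; an authenticated mode such as GCM also detects any tampering with the ciphertext at decryption time.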

Internal corporate security policies are aligned with ISO 27001 and are updated regularly to reflect changes to the industry landscape. Application security practices are based on a variety of accepted good practices, including ISO 27001, Open Web Application Security Project, Web Application Security Consortium, and internally developed standards. Pearson security analysts update the standards and practices on a regular basis.


Specific security protocols for device types, operating systems, browsers, plugins and settings, enterprise management, and virtual desktop practices vary over time as new software versions are released by vendors, such as a new version of Windows or Java, and as new devices come on the market or are rendered obsolete because of age. These protocols are communicated regularly and clearly, early enough for schools and technology managers to confirm they are prepared well in advance of test day. In general, protocols include locking down web browsers or devices so that nothing but the test may be accessed during testing and verifying that supported versions of software include the latest security releases from the original vendor.

6.1.3 Data Breaches and Remedies

ACT's Information Security Policy governs the handling of student data classified as confidential restricted. The policy states that confidential restricted information must meet the following guidelines:

• Electronic information assets must only be stored on ACT-approved systems/media with appropriate access controls.

• Only limited authorized users may have access to this information.

• Physical records must be locked in drawers or cabinets while not being used.

ACT also has a Clear Desk and Clean Screen Standard, Access Management Standard, Malware Protection Standard, Network Security Management Standard, Secure System Configuration Standard, Security Event Logging and Monitoring Standard, and System Vulnerability and Patch Management Standard, which together form a system of controls to protect student data.

ACT has an Information Security Incident Response Plan (ISIRP) that brings needed resources together in an organized manner to deal with an incident: an adverse event related to the safety and security of ACT networks, computer systems, and data resources. The adverse event could come in a variety of forms: technical attacks (e.g., a denial of service attack, malicious code attack, or exploitation of a vulnerability), unauthorized behavior (e.g., unauthorized access to ACT systems, inappropriate usage of ACT data, or loss of physical assets containing Confidential or Confidential Restricted data), or a combination of activities. The purpose of the plan is to outline specific steps to take in the event of any information security incident.

The Information Security Incident Response Plan charters an ACT Information Security Incident Response Team (ISIRT) to provide an around-the-clock (i.e., 24x7), coordinated security incident response throughout ACT. The Information Security Officer is the overall ACT representative and has the responsibility and authority to manage the Information Security Incident Response Team and implement necessary ISIRP actions and decisions during an incident.


The ACT Aspire program, as part of the National Services group of NCS Pearson, Inc. (Pearson), responds to circumstances that relate to a security or privacy issue involving loss of, or access to, student data processed by or on behalf of ACT Aspire.

Pearson has a formally documented Incident Response Policy and Process so that necessary stakeholders are notified and engaged in the process to investigate, remediate, and follow up on any suspected or actual breach. If a breach is suspected, Pearson staff will communicate with our Chief Information Security Officer and Security team to initiate an investigation. If a breach is confirmed during the initial investigation, we execute the Incident Response Plan. Key stakeholders in the security, technical, business, and legal groups will be notified. Steps are promptly taken to restrict access to the compromised host or application while maintaining the integrity of the environment, to preserve any potential evidence for the investigation.

Pearson analysts will perform an investigation using a combination of tools and manual analysis in an attempt to identify the root cause and the extent of the compromise.

At the same time, the business and legal teams will determine what external notification requirements Pearson is subject to, based on legislation and contractual obligations. Required notifications and remedies will be performed.

Once the root cause is identified, a remediation plan will be created to address the vulnerability, not only on the host(s) or application(s) in question but across systems that may face the same vulnerability even if they have not yet been compromised. Systems or applications are patched promptly and must pass a security assessment prior to being released for general use. The investigator will submit a report with the details of the incident, and a post-mortem meeting will be held to review that report and discuss any lessons learned from the incident.

6.1.4 Data Privacy and Use of Student Data

6.1.4.1 Data Privacy Assurances

ACT and our subcontractors agree not to use student PII in any way or for any purpose other than those expressly granted by the contract.

ACT does not currently employ technology to encrypt its storage frames at rest; however, we have the following compensating solutions in this space:

• Storage frames supporting ACT are dedicated to ACT for our in-house systems.

• Storage frames supporting ACT are housed at secure data centers that maintain SOC 1 controls and provide reports to ACT.

• E-Business encrypts PII data within the database structures.


• ACT does not use any external tape solutions; all backups are replicated to a secondary solution at our non-production data center for disaster recovery purposes.

• Office 365 (email, SharePoint) is encrypted as a cloud SaaS solution

6.1.4.2 Security Processes and Procedures

ACT conducts new-hire training in information security practices and the handling of sensitive data, with annual refreshers conducted on an ongoing basis. ACT employs the Principle of Least Privilege, so that only individuals who have a legitimate need to access the data are able to access it, and employs role-based security, so that users have access only to the pieces of data necessary to conduct their work on behalf of our customers. Accounts are terminated immediately when a user departs the organization, and periodic entitlement reviews are conducted to ensure that those who have access still require that level of access to perform their duties.
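The least-privilege and role-based model described above, including immediate termination and periodic entitlement reviews, can be sketched as follows. The role names and data scopes are hypothetical, for illustration only.

```python
from dataclasses import dataclass

# Hypothetical roles and the data scopes each grants; these names are
# illustrative, not taken from ACT's or Pearson's actual systems.
ROLE_SCOPES = {
    "scorer": {"responses"},
    "program_manager": {"responses", "scores", "rosters"},
    "dba": {"schema"},
}

@dataclass
class User:
    name: str
    role: str
    active: bool = True

def can_access(user: User, scope: str) -> bool:
    """Least privilege: access requires an active account whose role
    explicitly grants the requested data scope."""
    return user.active and scope in ROLE_SCOPES.get(user.role, set())

def terminate(user: User) -> None:
    """Accounts are disabled immediately when a user departs."""
    user.active = False

def entitlement_review(users, scope: str):
    """Periodic entitlement review: who currently holds access to a scope."""
    return [u.name for u in users if can_access(u, scope)]
```

Under this model a scorer can read responses but not rosters, and loses all access the moment the account is terminated.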

To manage access to identifiable student data for ACT Aspire, Pearson authorizes access to networks, systems, and applications based on business need to know, or the least privilege necessary to perform stated job duties. Any elevated rights not granted by virtue of job responsibilities at time of hire must be approved by management, and in certain instances, the Information Security Group (such as administrative rights).

Regarding de-identifying data during scoring, the digital scoring platform routes student responses to scorers randomly. Scorers cannot see any student information such as name or other demographics, unless a student self-identifies in his or her response.

The security processes include the following aspects:

Security Training of Staff. The Security Awareness Program is an ongoing effort to provide guidance to every employee so they understand security policies, their individual responsibilities for compliance, and how their behaviors affect the ability to protect systems and data. The core of these efforts is built on the well-known principles of Confidentiality, Integrity, and Availability (CIA). Security awareness begins immediately with orientation for new hires. Training covers acceptable use of systems and fundamental best practices. Web-based training modules are available, and all employees are required to complete the course when hired and thereafter are routinely asked to refresh their knowledge. Specific online courses covering Payment Card Industry (PCI) compliance and PII have also been developed. An annual refresher course including components of all of these control areas is also provided.

Access to Identifiable Student Data. Pearson authorizes access to networks, systems, and applications based on business need to know, or the least privilege necessary to perform stated job duties. Any elevated rights not granted by virtue of job responsibilities at time of hire must be approved by management and, in certain instances (such as administrative rights), the Information Security Group.

Internal Network Access Controls. Policy requires system access to be granted with a unique user ID. Every employee is provided with a unique domain account and a secure one-time password. To safeguard confidentiality, passwords must be changed upon first logon. Pearson stipulates a minimum password length as well as complexity requirements. Accounts are locked out after five failed login attempts. Passwords must be changed every 60 days and cannot be reused within one year. A screen lock is also enabled by Domain Security Policy to activate after 15 minutes of inactivity. All accounts are locked at time of termination. To verify compliance, domain controls identify and lock all accounts that have been idle for more than 30 days. If an account is still unused after 90 days, it is deleted.

Application Access Controls. For externally facing systems and applications, access to specific data is determined based on user roles and permission requirements established by our customers. At a minimum, applications hosted on Pearson systems are required to provide the same level of security as network access would require.

System Access Controls. Separation of duties between system administrators, application developers, application DBAs, and system DBAs provides clear definition of responsibility and prevents an individual from attaining the access rights that would allow for easy compromise.

De-identifying Data during Scoring. The digital scoring platform routes student responses to scorers randomly. Scorers cannot see any student information such as name or other demographics, unless a student self-identifies in his or her response.
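The account-lifecycle thresholds described under Internal Network Access Controls (five-attempt lockout, 60-day password age, one-year reuse window, 30-day idle lock, 90-day deletion) can be sketched as simple policy checks. The minimum password length and exact complexity rules are not stated here, so the values in `meets_complexity` are assumptions.

```python
import re
from datetime import date, timedelta

# Thresholds taken from the policy described above.
MAX_FAILED_LOGINS = 5
PASSWORD_MAX_AGE = timedelta(days=60)
REUSE_WINDOW = timedelta(days=365)
IDLE_LOCK = timedelta(days=30)
IDLE_DELETE = timedelta(days=90)

def meets_complexity(pw: str, min_length: int = 12) -> bool:
    """Assumed complexity rule: minimum length plus mixed character classes."""
    return (len(pw) >= min_length
            and bool(re.search(r"[a-z]", pw))
            and bool(re.search(r"[A-Z]", pw))
            and bool(re.search(r"\d", pw)))

def locked_out(failed_attempts: int) -> bool:
    """Accounts are locked out after five failed login attempts."""
    return failed_attempts >= MAX_FAILED_LOGINS

def must_change(last_set: date, today: date) -> bool:
    """Passwords must be changed every 60 days."""
    return today - last_set > PASSWORD_MAX_AGE

def may_reuse(last_used: date, today: date) -> bool:
    """Passwords cannot be reused within one year."""
    return today - last_used > REUSE_WINDOW

def account_state(last_login: date, today: date) -> str:
    """Idle accounts are locked after 30 days and deleted after 90."""
    idle = today - last_login
    if idle > IDLE_DELETE:
        return "deleted"
    if idle > IDLE_LOCK:
        return "locked"
    return "active"
```

Keeping the thresholds as named constants makes the policy auditable in one place, which matters when compliance is verified by automated domain controls as described above.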

6.1.4.3 Transparency Requirements

For paper testing, ACT will store all scored answer documents and used test books at a secure facility until they are securely destroyed based on ACT's retention schedule. This includes all materials used to capture student responses for scoring. ACT's standard retention policy is to store all scanned answer documents and secure test materials for two years. ACT is also committed to retaining electronic files, including but not limited to scanned images, scored files, file exchanges, and contractual documentation, for the ACT standard retention period.

6.1.4.4 Student Privacy Pledge

ACT does not place limits on data retention, because doing so would conflict with its mission to issue college-reportable scores, conduct research to improve the ACT, and provide data trends in education. Pearson, the subcontractor for ACT Aspire, LLC, has not signed the Student Privacy Pledge. Pearson nonetheless remains committed to protecting student privacy and to working with state and local school officials to be good stewards of the student personally identifiable information in its care.

6.2 Test Security and Administration Irregularities

The irregularity reporting tool in the ACT Aspire Portal is intended primarily for school personnel to record any test administration irregularities that may affect student scores or the analysis of ACT Aspire results. Recording an irregularity for a student is not the same as voiding his or her test and dismissing him or her for prohibited behavior. Instructions for using this tool can be found in the related section of the ACT Aspire Portal User Guide. Testing personnel should use the tool to report any irregularities occurring within the room.

Room supervisors should enter test irregularities directly into the reporting tool in the ACT Aspire Portal, if possible. If room supervisors do not have access to the Portal, they should use the Testing Irregularity Report located at the end of this manual. All Testing Irregularity Reports should be forwarded to the local test coordinator after testing and be entered into the Portal.

For either format of the test, room supervisors should document any of the following occurrences during administration:

• A student engages in any instance of prohibited behavior as outlined above.

• A student becomes ill or leaves the room during testing.

• A student fails to follow instructions (responds to questions randomly, obviously does not read questions prior to responding, or refuses to respond to questions).

• A general disturbance or distraction occurs which could affect one or more students' results.

• A student questions the accuracy or validity of an item.

• A student has a technical issue that interrupts online testing.

For any instances where students can resume testing after illness, a technical issue, or a general disturbance, room supervisors should follow the instructions about how to resume a test session from the “Resume a Test” section of the Avocet website.

For the latest update of irregularity categories and codes used in the ACT Aspire Portal, see the “Irregularities” section of the Avocet website.

The irregularities in the Environment/Materials category include external factors that may affect student testing. These include things like outside noises or hot/cold room temperatures; damaged, missing, or stolen test materials; and occurrences like power outages, severe weather, or emergency evacuations.


The Examinee category of irregularities includes student behaviors that may affect their performance or the performance of other students. These include the exhibition of prohibited behaviors described previously, student complaints about testing conditions, or challenges of test items.

The Staff category includes actions by testing staff that affect testing. These include failures to follow testing procedures, such as mistiming a test or not reading the verbal instructions from the Room Supervisor Manual, and other inappropriate behavior, such as engaging in personal communication with other staff or by telephone or text during testing.

The Technical category pertains to the performance of online testing and includes system failure, slowness, or freezing; difficulties launching the test or with students using the testing platform; and other system issues like problems with using a keyboard, mouse, monitor, or other related hardware.


CHAPTER 7

Reporting and Research Services

7.1 Interim Reporting

ACT Aspire Interim Reports provide valuable information to help students during the academic year. ACT Aspire Interim school and district reports provide information that principals, teachers, counselors, and district superintendents can use to monitor and evaluate the academic achievement of their students.

After completing a research linking study and a norming study in spring 2017, ACT will revise and enhance the Interim reports to include scale scores and normative information. Report development is currently underway; the enhanced reports will track student growth within an academic year.

7.1.1 Student Report

The ACT Aspire Student Report includes the following information:

a. Test scores (percent correct and total raw score) for the four academic tests (English, mathematics, reading, and science)

b. Detailed Skills Results, i.e., performance on the reporting categories of the four academic tests

c. Percent correct and the number of items answered correctly for Text Complexity

7.1.2 Educator Reports

Subject Proficiency by Student Report

The ACT Aspire Interim Subject Proficiency by Student Report provides a group average percent correct in each subject. Percent correct scores for each student are also provided. For group comparisons, students are grouped into two categories: At or Above Group Average and Below Group Average in 2+ Subjects.

Subject Proficiency by Group Report

The Subject Proficiency by Group Report provides average proficiency for the group (average percent correct, average raw score, and the number of students in the group). The average percent correct for each of the subject skills allows for group performance comparisons.

Student Performance Report

Student Performance Reports provide an average score and average percent correct for the group selected, as well as the raw score and the percent correct for selected students. This report can be sorted by name or score.

Skill Proficiency Report

The Skill Proficiency Report provides a group average percent correct for each subject skill and percent correct for selected students for a given subject. The Skills are sorted in descending order, from higher performing skill to lower performing skill.

Student data are provided in two groups: At or Above Group Average and Below Group Average in 2+ Subjects. Student scores below the group average in a skill are flagged. The Skill Focus section in the lower-left corner provides ideas to help students in the selected group improve a lower-performing skill.
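The grouping logic behind these reports, a per-subject group average with students flagged when they fall below it in two or more subjects, can be sketched as follows. This is a minimal sketch that assumes every student has a percent-correct score in every subject; the data are invented.

```python
def group_summary(scores):
    """scores maps student name -> {subject: percent correct}.
    Returns (per-subject group averages,
             students below the group average in 2+ subjects)."""
    subjects = {subj for per_student in scores.values() for subj in per_student}
    # Group average percent correct for each subject.
    avg = {subj: sum(per[subj] for per in scores.values()) / len(scores)
           for subj in subjects}
    # Flag students below the group average in two or more subjects.
    below_2plus = [name for name, per in scores.items()
                   if sum(per[subj] < avg[subj] for subj in subjects) >= 2]
    return avg, below_2plus

avg, flagged = group_summary({
    "Ana": {"math": 80.0, "english": 90.0},
    "Ben": {"math": 60.0, "english": 50.0},
})
# avg is {"math": 70.0, "english": 70.0}; Ben is below average in both subjects.
```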

Response and Content Analysis Reports

The Response and Content Analysis Reports allow educators to review group performance at the test question level. The percent correct for the selected students in each of the subject skills is shown, with performance data for each of the test questions within those skills. For each subject, response data is provided for each question. The correct response is highlighted and the counts of students who selected each response are provided. Standards assessed for each question are listed. For science, the Skills are listed.

For math, English, and science, educators can click on the question number to view the actual test question, the response distribution, and, if applicable, the names of the students who selected each response. The ability to view actual questions, the response distribution, and student names provides educators with an in-depth analysis of student and group responses. This expanded review is not available in the reading assessment because of copyright restrictions on passage-based items.


7.1.3 School Reports

Subject Proficiency by Grade Level Report

This Subject Proficiency Report provides summary data by subject for the school, with an analysis for each grade tested. All groups are compared with the school average in the subject. The Subject Focus provides tips for the subject with the lowest overall scores.

Skill Proficiency by Group Report

The Skill Proficiency by Group Report provides summary data for each of the skills within a given subject. The average for the subject is provided, with comparisons for each educator's group for a given grade in the school. A Skill Focus for the lowest-performing skill is provided.

7.1.4 District Reports

Subject Proficiency by School Report

The Subject Proficiency by School Report provides a district average percent correct for all students in each subject, as well as an average percent correct for each school within the district. All schools are compared with the district average in the subject. The Subject Focus provides tips for the subject with the lowest overall scores.

Student Performance File (SPF)

The Student Performance File is a student data file that includes student data, demographic data, and Interim test scores (raw score and percent correct for both subject and skills scores). The SPF layout includes specific field names, valid values, and maximum field length for each field.
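A consumer of such a file would typically validate each record against the published layout. The sketch below shows the idea; the field names, maximum lengths, and valid values are hypothetical, since the actual SPF layout is documented separately.

```python
# Hypothetical SPF field definitions; the real layout (field names, valid
# values, maximum lengths) is published in the actual SPF layout document.
SPF_LAYOUT = {
    "student_id": {"max_len": 10, "valid": None},
    "grade":      {"max_len": 2,  "valid": {"03", "04", "05", "06",
                                            "07", "08", "09", "10"}},
    "subject":    {"max_len": 11, "valid": {"english", "mathematics",
                                            "reading", "science"}},
    "raw_score":  {"max_len": 3,  "valid": None},
}

def validate_record(record: dict) -> list:
    """Return a list of (field, problem) tuples; an empty list means the
    record conforms to the layout."""
    problems = []
    for field_name, spec in SPF_LAYOUT.items():
        value = record.get(field_name)
        if value is None:
            problems.append((field_name, "missing"))
            continue
        if len(value) > spec["max_len"]:
            problems.append((field_name, "too long"))
        if spec["valid"] is not None and value not in spec["valid"]:
            problems.append((field_name, "invalid value"))
    return problems
```

Validating against a declared layout keeps file producers and consumers in sync when fields are added or value sets change.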

7.2 Classroom Reporting

ACT Aspire Classroom assessments provide valuable information to help address specific standards for students. Educators can assign ACT Aspire Classroom assessments based on the individual needs of students using the Classroom Standard Coverage documents listed in Avocet. Classroom assessments are five-question quizzes, typically covering one to two standards.

7.2.1 Student Report

Assessment Results Report

The Assessment Results Report provides feedback to students. Raw score and percent correct are reported, along with the standards assessed. Students can review the questions they answered incorrectly and the standards those questions assessed.


7.2.2 Educator Reports

Student Performance Report

The Student Performance Report provides an average score for the group selected, as well as the raw score and percent correct for selected students. This report can be sorted by name or score.

Response and Content Analysis Reports

The Response and Content Analysis Reports mirror the Interim report. They allow educators to review group performance at the test question level. The percent correct for the selected students in each of the subject skills is shown along with performance data for each of the test questions within those skills. For each subject, response data is provided for each question. The correct response is highlighted and the counts of students who selected each response are provided. Standards assessed for each question are listed. For science, the Skills are listed.

For math, English, and science, educators can click on the question number to view the actual test question, the response distribution, and, if applicable, the names of the students who selected each response. The ability to view actual questions, the response distribution, and student names provides educators with an in-depth analysis of student and group responses. This expanded review is not available for the reading assessment because of copyright restrictions on passage-based items.


CHAPTER 8

ACT Aspire Interim Assessment Scores and Reliabilities

This chapter provides the ACT Aspire Interim subject scores and reliabilities. ACT is in the process of linking the raw scores of the four Interim forms within each subject and grade so that they can be placed on a common score scale. Until those score scales are ready, subject raw scores are presented in this technical manual.

8.1 Descriptive Statistics of Subject Raw Scores

ACT Aspire Interim raw scores are reported to students, teachers, and administrators from grade 3 through grade 10 in English, mathematics, reading, and science. Students may take the ACT Aspire Interim assessments many times during an academic year; however, this chapter reports summary descriptive statistics only for students who took the ACT Aspire Interim assessments two months before taking the spring 2016 ACT Aspire Summative assessment. It was assumed that teachers had covered most of the academic year's curriculum by two months before the ACT Aspire Summative administration. Tables 8.1–8.4 list the raw score mean and standard deviation (SD) for English, mathematics, reading, and science for each form, grade 3 through grade 10.


Table 8.1. Descriptive Statistics of ACT Aspire Interim Raw Scores in English

Grade  Form    Scored Items       N   Mean    SD
3      Form 1            18  12,491   9.12  3.65
3      Form 2            18   7,580   9.45  3.41
3      Form 3            18   6,158   9.47  3.52
3      Form 4            18   1,101  10.03  3.61
4      Form 1            18  11,372   9.62  3.85
4      Form 2            18   8,170   9.54  3.68
4      Form 3            18   6,660   9.86  3.85
4      Form 4            18   1,156   9.74  3.52
5      Form 1            18  11,261  11.36  3.63
5      Form 2            18   7,839  11.37  3.55
5      Form 3            18   7,071  10.60  3.80
5      Form 4            18   1,475  10.75  3.41
6      Form 1            24  11,383  12.51  4.57
6      Form 2            24   8,516  11.97  4.17
6      Form 3            24   7,305  13.41  4.72
6      Form 4            24   1,099  13.56  4.31
7      Form 1            24  11,383  12.78  4.28
7      Form 2            24   8,677  12.41  4.20
7      Form 3            24   7,359  13.98  4.64
7      Form 4            24     719  10.18  3.90
8      Form 1            24  10,830  12.64  4.60
8      Form 2            24   8,766  12.15  4.42
8      Form 3            24   7,229  11.94  4.33
8      Form 4            24     784  14.38  4.77
9      Form 1            36   9,540  20.61  6.99
9      Form 2            36   6,937  20.72  7.10
9      Form 3            36   4,620  17.62  6.55
9      Form 4            36     421  19.23  6.39
10     Form 1            36  10,077  21.79  7.09
10     Form 2            36   7,540  21.50  7.37
10     Form 3            36   6,248  18.31  6.86
10     Form 4            36     798  20.29  6.58


Table 8.2. Descriptive Statistics of ACT Aspire Interim Raw Scores in Mathematics

Grade  Form    Scored Items       N   Mean    SD
3      Form 1            25  13,489  10.87  4.52
3      Form 2            25   9,558   9.27  4.15
3      Form 3            25   9,160  11.81  5.00
3      Form 4            25   1,547  12.28  4.87
4      Form 1            25  12,268   8.64  4.38
4      Form 2            25  10,368  10.85  4.53
4      Form 3            25   9,670  10.16  4.61
4      Form 4            25   1,495  10.68  4.51
5      Form 1            25  10,746   9.13  4.18
5      Form 2            25   9,781   7.78  3.95
5      Form 3            25   9,497  10.75  4.60
5      Form 4            25   1,610  11.94  4.67
6      Form 1            25  12,480   8.52  4.04
6      Form 2            25   8,618   8.41  3.67
6      Form 3            25   8,390   8.62  4.09
6      Form 4            25   1,243   8.93  4.09
7      Form 1            30  12,133  10.12  5.08
7      Form 2            30   8,420  10.58  4.65
7      Form 3            30   7,622  10.06  5.01
7      Form 4            30     922   9.83  4.29
8      Form 1            30  11,317  11.20  4.77
8      Form 2            30   8,908  11.69  5.19
8      Form 3            30   7,639  11.01  4.75
8      Form 4            30     991   8.39  3.50
9      Form 1            30   9,469  12.86  4.76
9      Form 2            30   7,347  10.50  4.08
9      Form 3            30   5,407   7.91  3.70
9      Form 4            30     636   8.17  2.82
10     Form 1            30  10,212  13.16  4.99
10     Form 2            30   6,255  10.91  4.48
10     Form 3            30   6,723   8.47  4.14
10     Form 4            30     790   8.32  2.96


Table 8.3. Descriptive Statistics of ACT Aspire Interim Raw Scores in Reading

Grade  Form    Scored Items       N   Mean    SD
3      Form 1            14  14,360   7.61  3.47
3      Form 2            14   9,620   8.20  3.10
3      Form 3            14   9,108   7.49  3.11
3      Form 4            14   1,551   8.04  3.41
4      Form 1            14  13,614   8.98  3.44
4      Form 2            14  11,009   8.89  3.18
4      Form 3            14   9,793   8.56  3.17
4      Form 4            14   1,710   8.21  3.53
5      Form 1            14  12,466   8.28  3.02
5      Form 2            14   9,987   9.02  3.45
5      Form 3            14   9,892   7.70  3.09
5      Form 4            14   1,928   9.39  3.25
6      Form 1            14  13,169   8.42  3.41
6      Form 2            14   9,582   8.42  3.06
6      Form 3            14   8,935   8.11  3.45
6      Form 4            14   1,141   7.27  3.00
7      Form 1            14  12,779   8.28  3.53
7      Form 2            14   8,456   9.06  3.29
7      Form 3            14   7,661   8.80  3.35
7      Form 4            14   1,165   7.28  3.37
8      Form 1            15  12,631   8.71  3.44
8      Form 2            15   8,183   8.46  3.46
8      Form 3            15   8,305   8.22  3.77
8      Form 4            15   1,137   8.03  3.47
9      Form 1            15   9,130   7.17  3.37
9      Form 2            15   6,302   7.42  3.57
9      Form 3            15   5,133   6.50  2.86
9      Form 4            15     227   8.32  3.50
10     Form 1            15   9,635   7.66  3.50
10     Form 2            15   6,457   7.52  3.62
10     Form 3            15   6,282   6.87  3.01
10     Form 4            15     491   8.48  3.42


Table 8.4. Descriptive Statistics of ACT Aspire Interim Raw Scores in Science

Grade  Form    Scored Items       N   Mean    SD
3      Form 1            21  10,541  11.51  4.84
3      Form 2            21   6,519  11.14  4.41
3      Form 3            21   6,028  12.17  3.95
3      Form 4            21     805  12.87  5.02
4      Form 1            21  10,351  11.55  4.33
4      Form 2            21   7,122  13.41  4.27
4      Form 3            21   6,344  12.02  4.18
4      Form 4            21   1,039  13.82  4.84
5      Form 1            21   9,808  12.55  4.14
5      Form 2            21   7,746  14.79  4.08
5      Form 3            21   7,885  14.44  4.07
5      Form 4            21   1,500  13.71  4.17
6      Form 1            21   9,913  11.46  4.47
6      Form 2            21   7,358  13.11  4.26
6      Form 3            21   6,482  12.16  3.75
6      Form 4            21     878  12.78  4.25
7      Form 1            21  11,374  12.46  4.40
7      Form 2            21   8,741  14.37  4.74
7      Form 3            21   7,819  14.81  4.23
7      Form 4            21   1,328  11.60  4.21
8      Form 1            21   9,625  11.32  4.67
8      Form 2            21   7,638  10.35  4.28
8      Form 3            21   7,161  12.47  4.63
8      Form 4            21     603   9.92  3.81
9      Form 1            21   8,829  10.95  4.58
9      Form 2            21   6,846  11.76  3.93
9      Form 3            21   5,070  11.25  4.18
9      Form 4            21     641  10.42  4.22
10     Form 1            21   7,866  11.14  4.64
10     Form 2            21   7,097  11.98  4.10
10     Form 3            21   5,740  11.64  4.22
10     Form 4            21     765   9.81  4.18

8.2 Subject Raw Score Reliability

Some degree of inconsistency, or error, is inherent in the measurement of cognitive characteristics. A student taking one form of a test on one occasion and a parallel form on another occasion would likely earn somewhat different scores on the two administrations. These differences might be due to the student or to the testing situation, such as differences in motivation, different levels of distraction across occasions, or student growth between testing events.


Differences across testing occasions might also be due to the particular set of test items or prompts included on each test form. While procedures such as standardizing the testing environment are in place to reduce differences across testing occasions, such differences cannot be eliminated entirely.

Reliability coefficients are estimates of the consistency of test scores. The coefficients typically range from zero to one, with values near one indicating greater consistency and those near zero indicating little or no consistency. As the Standards for Educational and Psychological Testing states, “For each total score, subscore, or combination of scores that is to be interpreted, estimates of relevant indices of reliability/precision should be reported” (AERA, APA, & NCME, 2014, p. 43).

Reliability coefficients are usually estimated based on a single test administration by calculating the inter-item variance-covariance matrix. These coefficients are referred to as internal consistency reliability. Cronbach’s coefficient alpha (Cronbach, 1951) is one of the most widely used estimates of test reliability and was computed for all of the ACT Aspire Interim tests. Coefficient alpha can be computed using the following formula:

\[
\hat{\alpha} = \left(\frac{k}{k-1}\right)\left(1 - \frac{\sum_{i=1}^{k} s_i^2}{s_x^2}\right)
             = \left(\frac{k}{k-1}\right)\left(\frac{\sum_{i=1}^{k}\sum_{j \neq i} s_{ij}}{s_x^2}\right),
\]

where k is the number of test items, \(s_i^2\) is the sample variance of the ith item, \(s_{ij}\) is the sample covariance between item i and item j, and \(s_x^2\) is the sample variance of the observed total raw score. Table 8.5 lists the raw score reliability of English, mathematics, reading, and science based on data from students who took the ACT Aspire Interim assessments two months before taking the spring 2016 ACT Aspire Summative assessment. Note that some of the low reliability coefficients could be due to small sample sizes.
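As an illustration, coefficient alpha can be computed directly from an examinee-by-item score matrix. The sketch below applies the formula above to a small hypothetical set of dichotomously scored responses; it is not ACT Aspire code, just the formula in executable form.

```python
import numpy as np

def cronbach_alpha(scores):
    """Coefficient alpha for an (examinees x items) matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of the total raw score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 examinees, 4 dichotomously scored items.
x = [[1, 1, 1, 0],
     [1, 0, 1, 1],
     [0, 0, 1, 0],
     [1, 1, 1, 1],
     [0, 0, 0, 0]]
print(round(cronbach_alpha(x), 2))  # 0.79
```

The second form of the formula (covariance version) gives the same value, since the total-score variance equals the sum of the item variances plus the sum of the inter-item covariances.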

The standard error of measurement (SEM) is closely related to test reliability. The SEM summarizes the amount of error or inconsistency in scores on a test. The SEM (σE) is computed using the following formula:

\[
\sigma_E = \sigma_x \sqrt{1 - \text{reliability}},
\]

where σx is the observed score standard deviation. Table 8.5 also lists the SEM of subject raw scores for each Interim form.
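The SEM formula can be checked against the tabled values. For example, the grade 3 English Form 1 figures (SD = 3.65 from Table 8.1, reliability = 0.73 from Table 8.5) reproduce the reported SEM of 1.90:

```python
import math

def sem(sd, reliability):
    """Standard error of measurement: SD times sqrt(1 - reliability)."""
    return sd * math.sqrt(1 - reliability)

# Grade 3 English Form 1: SD = 3.65 (Table 8.1), reliability = 0.73 (Table 8.5).
print(f"{sem(3.65, 0.73):.2f}")  # 1.90, matching the SEM reported in Table 8.5
```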


Table 8.5. Raw Score Reliability Coefficient and Standard Error of Measurement by Grade, Subject, and Form

               English       Mathematics   Reading       Science
Grade  Form    Rel.   SEM    Rel.   SEM    Rel.   SEM    Rel.   SEM
3      Form 1  0.73   1.90   0.77   2.17   0.78   1.63   0.84   1.94
3      Form 2  0.72   1.80   0.73   2.16   0.74   1.58   0.81   1.92
3      Form 3  0.70   1.93   0.82   2.12   0.72   1.65   0.78   1.85
3      Form 4  0.75   1.81   0.80   2.18   0.77   1.64   0.85   1.94
4      Form 1  0.76   1.89   0.77   2.10   0.80   1.54   0.79   1.98
4      Form 2  0.74   1.88   0.77   2.17   0.77   1.53   0.81   1.86
4      Form 3  0.77   1.85   0.78   2.16   0.75   1.59   0.79   1.92
4      Form 4  0.73   1.83   0.77   2.16   0.79   1.62   0.85   1.87
5      Form 1  0.77   1.74   0.72   2.21   0.74   1.54   0.79   1.90
5      Form 2  0.74   1.81   0.72   2.09   0.80   1.54   0.81   1.78
5      Form 3  0.76   1.86   0.78   2.16   0.74   1.58   0.81   1.77
5      Form 4  0.70   1.87   0.80   2.09   0.80   1.45   0.82   1.77
6      Form 1  0.77   2.19   0.71   2.18   0.79   1.56   0.81   1.95
6      Form 2  0.73   2.17   0.63   2.23   0.72   1.62   0.80   1.91
6      Form 3  0.80   2.11   0.72   2.16   0.78   1.62   0.75   1.88
6      Form 4  0.76   2.11   0.72   2.16   0.70   1.64   0.79   1.95
7      Form 1  0.74   2.18   0.79   2.33   0.80   1.58   0.80   1.97
7      Form 2  0.74   2.14   0.75   2.33   0.79   1.51   0.85   1.84
7      Form 3  0.80   2.08   0.78   2.35   0.78   1.57   0.83   1.74
7      Form 4  0.67   2.24   0.71   2.31   0.76   1.65   0.78   1.97
8      Form 1  0.77   2.21   0.75   2.39   0.75   1.72   0.82   1.98
8      Form 2  0.77   2.12   0.79   2.38   0.77   1.66   0.78   2.01
8      Form 3  0.74   2.21   0.75   2.38   0.81   1.64   0.82   1.96
8      Form 4  0.81   2.08   0.56   2.32   0.75   1.74   0.71   2.05
9      Form 1  0.86   2.62   0.75   2.38   0.74   1.72   0.81   2.00
9      Form 2  0.86   2.66   0.67   2.34   0.77   1.71   0.75   1.97
9      Form 3  0.83   2.70   0.62   2.28   0.65   1.69   0.77   2.00
9      Form 4  0.84   2.56   0.34   2.29   0.78   1.64   0.77   2.02
10     Form 1  0.87   2.56   0.77   2.39   0.76   1.71   0.81   2.02
10     Form 2  0.88   2.55   0.73   2.33   0.78   1.70   0.78   1.92
10     Form 3  0.85   2.66   0.69   2.31   0.69   1.68   0.78   1.98
10     Form 4  0.85   2.55   0.41   2.27   0.76   1.68   0.76   2.05


CHAPTER 9

Concordance of ACT Aspire Interim Scores to ACT Aspire Summative Scores

This chapter summarizes a concordance study between the ACT Aspire Interim assessments and the ACT Aspire Summative assessments. The study established a direct link between ACT Aspire Interim scores and the ACT Aspire Summative Readiness Benchmarks.

9.1 Method

To understand the relationship between ACT Aspire Interim and ACT Aspire Summative test scores, equipercentile concordances were conducted. An equipercentile concordance relates scores on the ACT Aspire Interim assessments to scores on the ACT Aspire Summative assessments using percentile ranks: the concorded Summative score for a given Interim raw score is the Summative scale score with the same percentage of students at or below it, within the group of students used in the study.

The concordance between ACT Aspire Interim raw scores and Summative scale scores was based on students who took the ACT Aspire Interim assessments two months before taking the spring 2016 ACT Aspire Summative assessment. To reduce possible practice effects from taking the same form repeatedly, only a student's first record for a given form was used. Via the concordance, the Interim score a student needs to achieve to be "likely on track" (i.e., at or above the benchmark) on the Summative assessment can be estimated.

Note that the concordance results are based on students taking an Interim assessment close to the Summative assessment administration. These results do not take into account instructional interventions, such as a teacher providing remedial instruction for a student who scores low on an Interim assessment. A student who scores low on an Interim assessment because certain material has not yet been presented will likely score better later on in the Summative assessment than the concordance results suggest.
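The equipercentile idea can be sketched in a few lines: map each observed Interim raw score to the Summative scale score with the same proportion of students at or below it. This is a simplified illustration with hypothetical score distributions; the operational procedure may involve smoothing and interpolation not shown here.

```python
import numpy as np

def equipercentile_concordance(interim_raw, summative_scale):
    """Map each observed Interim raw score to the Summative scale score with
    the same proportion of students at or below it (equipercentile idea)."""
    interim = np.sort(np.asarray(interim_raw))
    summative = np.sort(np.asarray(summative_scale))
    n, m = len(interim), len(summative)
    table = {}
    for x in np.unique(interim):
        p = np.searchsorted(interim, x, side="right") / n  # proportion at or below x
        idx = min(int(np.ceil(p * m)) - 1, m - 1)          # matching position in Summative
        table[int(x)] = int(summative[max(idx, 0)])
    return table

# Hypothetical score distributions for one form and grade.
interim = [12, 15, 15, 18, 20]
summative = [408, 410, 413, 415, 420]
print(equipercentile_concordance(interim, summative))
# {12: 408, 15: 413, 18: 415, 20: 420}
```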

9.2 Results

Tables 9.1 to 9.4 list the Interim raw scores that correspond to the Summative Readiness Benchmarks for each form in each grade in English, mathematics, reading, and science, respectively. The tables also list the average concordance standard error (SE) of the benchmark-equivalent raw scores across the four Interim forms. The number of students used in the 128 concordances ranged from a low of 227 to a high of 14,360.

Some of the concordance procedures did not produce the exact benchmark Summative score point; in those cases, the next highest concorded Summative score available was used to find the benchmark-equivalent Interim raw score. For example, for grade 3 English Form 4, the concorded Summative scores corresponding to Interim raw scores of 7 and 8 were 412 and 414, respectively. Since the benchmark score of 413 was not available, the next highest concorded Summative score, 414, was used, giving a benchmark-equivalent Interim raw score of 8.

ACT is in the process of linking the raw scores of the four Interim forms within each subject and grade so that they can be placed on a common score scale. When the Interim scale scores are available, the Interim scale scores that correspond to the Summative Readiness Benchmarks will be provided.
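The "next highest" rule amounts to a simple lookup: scan the concorded scores in increasing raw-score order and return the first raw score whose concorded Summative score meets or exceeds the benchmark. The sketch below uses the grade 3 English Form 4 values quoted in this section (raw 7 corresponds to 412, raw 8 to 414); the raw-score 6 entry is hypothetical padding for illustration.

```python
def benchmark_equivalent(concordance, benchmark):
    """Smallest Interim raw score whose concorded Summative score meets or
    exceeds the benchmark -- the 'next highest' rule of Section 9.2."""
    for raw in sorted(concordance):
        if concordance[raw] >= benchmark:
            return raw
    return None  # benchmark lies above every concorded score

# Grade 3 English Form 4 values quoted above: raw 7 -> 412, raw 8 -> 414.
# The raw-score 6 entry is hypothetical.
conc = {6: 410, 7: 412, 8: 414}
print(benchmark_equivalent(conc, 413))  # 8, matching Table 9.1
```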


Table 9.1. Interim Raw Scores to Summative Readiness Benchmarks Concordance in English

Grade  Benchmark  Form 1  Form 2  Form 3  Form 4    SE
3            413       7       7       7       8  0.49
4            417       8       7       8       8  0.12
5            419       9       9       8       9  0.11
6            420      10      10      10      11  0.80
7            421       9       9      10       8  0.21
8            422      10      10       9      11  0.25
9            426      20      19      16      18  0.29
10           428      21      21      17      19  0.25

Table 9.2. Interim Raw Scores to Summative Readiness Benchmarks Concordance in Mathematics

Grade  Benchmark  Form 1  Form 2  Form 3  Form 4    SE
3            413      10       8      10      10  0.06
4            416       8       9       9       9  0.08
5            418       9       7      10      12  0.09
6            420       8       7       7       8  0.10
7            422      10      10      10      10  0.12
8            425      12      12      12       9  0.23
9            428      15      12      10      10  0.25
10           432      18      15      12      11  0.33

Table 9.3. Interim Raw Scores to Summative Readiness Benchmarks Concordance in Reading

Grade  Benchmark  Form 1  Form 2  Form 3  Form 4    SE
3            415      10      10       9      10  0.15
4            417      11      10      10      10  0.13
5            420      10      12       9      11  0.11
6            421      10       9       9       8  0.14
7            423      11      11      10       9  0.12
8            424      10       9       9       9  0.13
9            425       8       9       8      10  0.27
10           428      10      10       9      11  0.19


Table 9.4. Interim Raw Scores to Summative Readiness Benchmarks Concordance in Science

Grade  Benchmark  Form 1  Form 2  Form 3  Form 4    SE
3            418      13      12      14      15  0.18
4            420      13      14      13      15  0.17
5            422      14      16      16      15  0.08
6            423      11      13      13      12  0.12
7            425      14      16      17      13  0.11
8            427      13      11      14      10  0.12
9            430      14      13      14      12  0.23
10           432      14      14      15      12  0.25


References

American Educational Research Association (AERA), American Psychological Association (APA), & National Council on Measurement in Education (NCME). (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16(3), 297–334.

D’Avanzo, C. (2003). Application of Research on Learning to College Teaching: Ecological Examples. Bioscience, 53(11), 1121–1128.

Fedorchak, G. (2012). Access by Design—Implications for equity and excellence in education. Internal whitepaper prepared on behalf of the NH Department of Education for the Smarter Balanced Assessment Consortium. (Internal Document – no public link)

Fisher, D., & Frey, N. (2014). Addressing CCSS Anchor Standard 10: Text complexity. Language Arts, 91(4), 236–250.

Mislevy, R. J., Almond, R. G., & Lukas, J. (2004). A brief introduction to evidence-centered design (CSE Report 632). Los Angeles, CA: University of California-Los Angeles, National Center for Research on Evaluation, Standards, and Student Testing. Retrieved from files.eric.ed.gov/fulltext/ED483399.pdf

Mislevy, R. J., & Haertel, G. (2006). Implications for evidence-centered design for educational assessment. Educational Measurement: Issues and Practice, 25(4), 6–20.

National Governors Association Center for Best Practices & Council of Chief State School Officers. (2010). Common Core State Standards for English language arts and literacy in history/social studies, science, and technical subjects. Washington, DC: Authors.


NGSS Lead States. (2013). Next Generation Science Standards: For states, by states. Washington, DC: The National Academies Press.

Nehm, R. H., & Reilly, L. (2007). Biology majors' knowledge and misconceptions of natural selection. Bioscience, 57(3), 263–272.

Shanahan, T. (2012, June 18). What is Close Reading? [Web log post]. Retrieved from http://www.shanahanonliteracy.com/2012/06/what-is-close-reading.html.

Wessling, S. B. (2011). Supporting Students in a Time of Core Standards: English Language Arts, Grades 9–12. Urbana, IL: National Council of Teachers of English.

Yore, L., & Hammond-Todd, M. (2012). The role of public policy in K–12 science education. International Journal of Environmental & Science Education, 7(4), 651–657.