
Final Report of the Spring 2012 MEPA Online Test Administration

Prepared for the Massachusetts Department of Elementary and Secondary Education
by Measured Progress

June 2012


TABLE OF CONTENTS

EXECUTIVE SUMMARY
FULL REPORT
APPENDIX A: Participating Schools and Districts
APPENDIX B: Participation of Schools and Districts 2010–2012
APPENDIX C: 2012 MEPA-R/W Online Testing: Technical Requirements for School-based Technology
APPENDIX D: Summary of School Observations during Spring 2012 MEPA Online Testing
APPENDIX E: Overview of Survey Results


EXECUTIVE SUMMARY

The purpose of this executive summary is to briefly report on the spring 2012 Massachusetts English Proficiency Assessment (MEPA) online test administration, including the preparation for testing, technical support, and activities completed after testing. This was the third year in which the spring operational tests were available online and the final year of the MEPA testing program. The testing contractor, Measured Progress (MP), provided the online testing system, and the Massachusetts Department of Elementary and Secondary Education (the Department) implemented test administration policies and procedures.

The MEPA online tests in reading and writing (MEPA-R/W) were available for Massachusetts schools to administer to English language learner (ELL) students in grades 3–12 during the spring administration, February 27–March 16, 2012. ELL students in grades K–2 took paper-and-pencil tests exclusively. A total of 6,216 students from 173 schools in 81 districts participated in the spring 2012 MEPA online tests. These students represented 14 percent of the total number of ELL students tested in grades 3–12. For the purpose of this report, a school is counted as having participated if at least one student in the school responded to at least one operational test question online during any test session.
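As a point of reference, the 14 percent figure is consistent with the statewide total reported in the full report (44,114 ELL students tested in grades 3–12):

\[ \frac{6{,}216}{44{,}114} \approx 0.141 \approx 14\% \]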

Preparation for Testing

Preparations for online testing included recruiting eligible schools, verifying that those schools met the technology requirements, training principals or designees from each school, and deploying the online testing system. For the spring 2012 administration, the Department expanded eligibility for online testing to include all schools with ELL students in grades 3–12, except adult diploma programs, out-of-state schools, approved private special education schools, institutional programs, and most test sites and educational collaboratives. Exceptions were made for the SAFE test sites in Springfield and the Hampshire Educational Collaborative, which were included in the invitation because they expressed interest in participation. In total, 1,691 schools were invited to participate. The invitation was disseminated as a memo from the Commissioner of Elementary and Secondary Education in the fall of 2011. Schools were asked to affirm their interest in participating in online testing by completing a survey of interest and a technical readiness certification. Measured Progress analyzed the survey and readiness certification results and contacted some schools to clarify responses and to resolve potential issues. A total of 185 schools were accepted for participation for spring 2012. Written notification was sent to each of these schools confirming that the Department had accepted them for participation in training and subsequent online testing. Training materials included PowerPoint presentations and the Administration Manual for Spring 2012 MEPA Online Testing. Three types of training sessions were offered.

• First-Time Participants: Eight half-day face-to-face training sessions for first-time participants were conducted in six locations (Bridgewater State University in Bridgewater, Barnstable High School in Hyannis, A.C. Whelan Elementary School in Revere, Dedham Middle School in Dedham, Springfield High School of Science and Technology in Springfield, and Merrimack College in North Andover). These training sessions included policy-related information and instructions for using each component of the online testing system. One additional training session was conducted via WebEx to accommodate first-time participants who could not attend a face-to-face training session. The WebEx session contained the same information as the face-to-face sessions and was recorded and posted to the online system so that schools could access it for review and to train staff.

• Returning Participants: Three training sessions were conducted via WebEx for schools that had participated previously in the MEPA online tests. One of these sessions was also recorded and posted to the online system. These sessions for returning participants recapped the most critical information and provided updates on changes to the system.

• Technology Coordinators: One pre-recorded training session video for district and/or school technology coordinators was available via the online system.

The online testing system was deployed on January 10, 2012, preloaded with student data for each school based on the October 2011 Student Information Management System (SIMS) submission. Following training, principals/designees prepared for testing by training staff, adding and removing students from their rosters, scheduling students to view a video tutorial, and conducting practice and locator tests, as appropriate. Principals/designees could enter data to allow participating staff access to the testing system, assign tests to each test administrator, assign students to appropriate test sessions depending on their level of English proficiency (i.e., sessions 1 & 2 or sessions 2 & 3), and conduct practice testing. Schools that tested online were allowed a longer testing window for students testing online (but not for grade spans participating in paper-and-pencil testing) in order to provide sufficient time for those students to access the schools’ computers. The operational tests were available online February 27–March 16. (The paper-and-pencil testing window was March 5–16.)

Technical Support

The MEPA Technical Service Center was available to provide technical support by telephone and email to participating schools. Other support resources were provided during face-to-face training sessions and in the online testing system itself, including the Administration Manual for Spring 2012 MEPA Online Testing. This manual contained detailed information on the online testing components, procedures to follow, and recommendations for troubleshooting. Other resources available within the online system included the following:

• frequently asked questions
• a checklist of tasks to be completed
• an online help system
• “Show Me” videos which demonstrated commonly used processes in the online system
• recorded versions of each of the three types of training sessions and PowerPoint presentations for the non-technical training sessions
• tips and strategies collected from schools that had previously participated in online testing
• a reference sheet to assist in training test administrators
• instructions for maintaining test security
• technical requirements for online testing
• links and instructions for deployment of the Student Test Workstation (STW) to individual computers

Test Security and Test Administration

Both students and school staff required permissions and passwords to access the online system. Test administrators could administer only those tests that were assigned to them. Students could only access a test if it was assigned to them and if a school staff member opened the test and provided a test access code. Additionally, MP monitored access to the STW and the Test Administrator’s System (TAS) on the weekends during the operational testing window to determine whether unexpected activity took place and to flag it for Department follow-up.¹

The test security requirements and guidelines were described in a document which was presented at training sessions and posted in the online system. Suggestions contained in this document included testing students in small groups, using partitions or cardboard dividers to prevent students from viewing each other’s screens, and having two or more test administrators present during testing. Each school was responsible for developing a security plan tailored to the resources and computers in that school’s classrooms or labs. The Department made commercially manufactured cardboard “security carrels” available for those schools that had not ordered carrels in 2011. The carrels could be positioned around three sides of each computer to prevent students from viewing the screens of other students adjacent to or in front of them.

Principals/designees coordinated with test administrators to ensure all students viewed a tutorial video, which explained to students how to navigate through the test and use each of the buttons and functions they would see in the testing system. In addition to the operational tests, the online system included two practice tests and a locator test for each grade span. Schools were instructed to give the online version of the practice test to students testing online. The locator tests were optional, but teachers who chose to administer one were encouraged to use the online version so that the students would have more exposure to the online system before operational testing began.

¹ One school was open on Saturday and was observed to be actively engaged in testing. This activity was not viewed as a security breach, as it had been pre-arranged with the Department. In addition, students from two schools attempted to log into the STW on a weekend. As no tests had been opened in the TAS and the students did not have a test access code, they were unable to access any material other than the login screen.

Observation Visits to Schools

As in 2011, staff from the Department and Measured Progress visited a sample of schools to observe the online test administration and to hear feedback from staff on the benefits and challenges of online testing. The six schools visited in 2012 were Sarah Greenwood and John P. Holland Schools in Boston, International High School in Lawrence, Lowell Community Charter School, Greater Lowell Vocational and Technical High School, and Collins Middle School in Salem.

Activities after Testing

The Principal’s Administration System (PAS) remained open through March 19 to allow principals/designees additional time after test administration to update student data (e.g., accommodations used during testing) and upload students’ MELA-O scores. The student survey was discontinued this year and the school staff survey was updated. This survey was available online (via a link to SurveyMonkey in the PAS) to collect anonymous and voluntary feedback from participating schools. Responses were received from a total of 109 principals/designees, test administrators, and technology coordinators. Over ninety percent of respondents expressed overall satisfaction with the MEPA online testing system, an increase of almost ten percent from 2011.


FULL REPORT

Purpose

The purpose of this document is to provide a summary of the spring 2012 MEPA reading and writing online test administration, including preparation for testing, technical support, and activities completed after testing.

Background of the MEPA Program

Federal and state laws require that students reported as limited English proficient (LEP) be assessed annually to measure their English proficiency in reading, writing, listening, and speaking, as well as their progress in learning English. ELL students in Massachusetts were required to participate in the two components of MEPA in spring 2012:

• Massachusetts English Proficiency Assessment reading and writing tests (MEPA-R/W)
• Massachusetts English Language Assessment-Oral (MELA-O), an observational assessment of listening (comprehension) and speaking (production)

MEPA tests were administered each spring to ELL students in grades K–12 and in the fall to ELL students in grades 1–12 who did not participate the previous spring.

Participation in Spring 2012 Online Testing

Nearly all schools with ELL students (1,691 schools) were invited to participate in online testing in one or more of the tested grade spans (3–4, 5–6, 7–8, and 9–12). Students in grade span K–2 did not participate in online testing and took paper-and-pencil tests exclusively. The state’s initial goal for spring 2012 online test participation was 60 percent of the tested ELL population in grades 3–12. Based on participation in the previous two years, this goal was adjusted to 40 percent for 2012.

Final participation data indicate that 6,216 ELL students in 173 schools and 81 districts participated in spring 2012 MEPA online testing. The online participation was approximately 14 percent of the statewide total of 44,114 ELL students tested in grades 3–12. The criterion for being identified as an online test participant was completion of at least one test question online during any test session. Based on this criterion, students were counted as online participants if they took part of the test online and part of the test on paper. Refer to Appendices A and B for additional details about the schools and grade spans that participated.

Thirty-four schools that opted not to participate in the online tests provided feedback in the survey of interest indicating the reasons for this decision. The primary issues cited were:

• concerns about students’ computer skills;
• inadequate computers or Internet connectivity at schools;
• lack of technology support within schools;
• insufficient numbers of computers in schools;
• the perception that online testing required too much time and effort.


Preparation for Online Testing

Preparation for online testing included the following steps:

• recruitment of schools with ELL students
• training of representatives from each school via face-to-face meetings or WebEx sessions
• a pre-recorded training session for technology coordinators posted in the PAS
• final selection by MP and the Department of schools with appropriate technology systems and a willingness to participate
• onsite preparations by participating schools, including conducting practice testing with students and training of test administrators by school representatives who had participated in a Department-led training session
• deployment of the online testing system with school-specific data (principal account, student records, and grade-span tests)

Recruitment

Schools were invited to participate if they had ELL students in grades 3–12 (based on March 2011 data from the Department’s SIMS). Adult diploma programs, out-of-state schools, approved private special education schools, institutional programs, test sites, and collaborative schools were generally not included in the invitation, but the Department allowed exceptions for the SAFE test sites in Springfield and for the Hampshire Educational Collaborative because these sites expressed interest in participation. All schools that were willing to participate were asked to complete a survey of interest and a technical readiness certification. Measured Progress contacted schools, when necessary, to clarify concerns that arose during analysis of the survey and/or the readiness certification responses. A total of 185 schools were accepted to participate in online testing. The table below indicates the number of schools at each recruitment stage.

Stage 1 – Invitation to participate
  1,691 schools: Schools that had ELL students in grades 3–12 (using March 2011 SIMS data) received a memo from the Commissioner inviting them to participate in online testing.

Stage 2 – School responses to invitation (schools that did not respond are not included in these counts)
  176 schools: Schools that completed both the survey of interest and the readiness certification indicating willingness to participate.
  112 schools: Schools that indicated they did not wish to participate, either through their responses on the survey of interest or by notifying MP or the Department.

Stage 3 – Notification of acceptance/rejection
  185 schools: Schools that received notification of acceptance to participate based on survey of interest and readiness certification responses and/or information gathered in follow-up telephone calls from MP or the Department.
  22 schools: Schools that received notification that they were not accepted for online testing due to concerns about the student-to-computer ratio, concerns about technological issues, lack of ELL students, or because they did not register for a training session.


The following table shows the participation counts of grade spans, districts, and schools.

Students in grades 3–4: 1,588
Students in grades 5–6: 1,496
Students in grades 7–8: 1,288
Students in grades 9–12: 1,840
Total students: 6,216
Total districts: 81
Total schools: 173

A student was counted as a participant in the MEPA online test if he or she responded to at least one test question online during any test session. A school/district was counted as having participated in the MEPA online test if at least one student in the school/district responded to at least one test question online during any test session.

Refer to Appendix A for a list of the schools that participated and the number of students in each grade span who tested online at those schools. See Appendix C for the technical requirements.

Training

MP and the Department worked together to formulate the training plan and schedule. The training schedule included seven days which were reserved for make-up sessions if inclement weather forced any session cancellations. Although the weather did necessitate the use of make-up training dates in 2011, none of the reserved days were used in 2012.

After experimenting in 2011 with holding some face-to-face training sessions in a centrally located hotel using rented laptop computers, all of the 2012 face-to-face sessions were conducted in public school or college settings. This resulted in lower costs and the ability to better target prospective training sites in the geographical regions with the largest numbers of participating schools. Schools indicated in the survey of interest whether they were interested in hosting face-to-face training sessions. In addition, the Department investigated the possibility of holding training sessions at readiness centers, situated on college campuses. Four schools and two colleges were ultimately selected as training sites, based on location, the number of available computers, technological capacity, and availability during the training window (January 10–31, 2012). Conducting training sessions in schools meant that students did not have access to the computer lab while the session was taking place. The two training sessions at the college locations were scheduled when the college students were on break, so the computer labs were not being used by students and parking issues were minimal. Otherwise, the resources at the colleges were comparable to those in the public schools.


Additional information about the training sites and the number of participants who attended each training session is shown in the following table.

Location and Region: Date of Training Session and Number of Participants*

Bridgewater State University, Bridgewater (readiness center), Southeast:
  January 10, morning: 14
  January 10, afternoon: 12
Barnstable High School, Barnstable, South Coast:
  January 11, morning: 14
A.C. Whelan Elementary School, Revere, Boston Area:
  January 12, morning: 5
Dedham Middle School, Dedham, Greater Boston:
  January 18, morning: 21
  January 18, afternoon: 4
Springfield High School of Science and Technology, Springfield, West:
  January 19, morning: 29
Merrimack College, North Andover (readiness center), Northeast:
  January 20, morning: 26
WebEx for first-time participants (no region):
  January 25, morning: 37
WebEx for returning participants (no region):
  January 25, afternoon: 45
  January 26, morning: 43
  January 26, afternoon: 37
Total: 287

* Participation counts for face-to-face sessions are based on sign-in sheets at the training sessions. For the WebEx sessions, participation could not be verified, so the participation count is based upon the number of people who registered for the session.
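As a quick arithmetic check, the per-session counts listed above sum to the reported total:

\[ 14 + 12 + 14 + 5 + 21 + 4 + 29 + 26 + 37 + 45 + 43 + 37 = 287 \]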

One representative from each school (principal/designee) was requested to register for a training session conducted by MP and the Department. Several training options were provided, and participants selected the option which best suited their needs and level of experience. Eight half-day face-to-face training sessions were conducted for first-time participants. In addition, one WebEx session covering the same content was conducted for first-time participants who were unable to attend a face-to-face session. Three condensed WebEx sessions, held for returning participants, focused on highlights and changes to the system. Recorded versions of the WebEx sessions for both first-time and returning participants were posted on the home page of the PAS. A pre-recorded training session for technology coordinators was also available through the PAS, although no live training sessions were offered.

A number of changes to the training content and delivery were implemented this year as a result of the increased number of schools invited to participate, feedback from the 2011 post-testing staff survey, and input from focus groups held in summer 2011. Major changes included the following:

• The online help system was demonstrated during all training sessions.
• Differences between the online testing process and the paper-and-pencil testing process were identified in the first-time participant sessions.
• The training session presentation for first-time participants was restructured so that hands-on exercises were integrated throughout the sessions rather than occurring only at the end.
• Expanded resources for participants to use when training other staff at their school included a revised Quick Reference Guide for Conducting Test Administration Training and specific scheduling/planning information.
• Because afternoon face-to-face sessions have not historically been well attended, only two were scheduled this year, in the locations with the highest numbers of participating schools.
• Rather than requiring returning participants to attend face-to-face training sessions, condensed versions were available for them to access via WebEx.
• First-time participants were encouraged to participate in a face-to-face session, but they had the option of registering for a WebEx session.
• Recorded versions of the WebEx sessions for both first-time and returning users were posted on the home page of the PAS. A pre-recorded training session for technology coordinators was also available through the PAS, although no live sessions were offered.

Each training session for first-time participants was about three hours in duration and consisted of the following components:

• introduction and overview of the online testing system and the support documentation
• data administration and management using the Principal’s Administration System
• administering a test using the Test Administrator’s System and the Student Test Workstation
• test security requirements
• review of next steps and key dates

Face-to-face training participants were able to use the online testing components (PAS, TAS, and STW) at the appropriate times during the training session. Each WebEx session for returning participants was approximately 90 minutes in duration and emphasized critical elements and/or changes from the previous year (e.g., the introduction of the technology coordinator role in the PAS and the renaming of training student records). WebEx training participants did not have the opportunity for hands-on training, but were encouraged to practice using the system on their own after the training session.

All training sessions (except the pre-recorded webinar for technology coordinators) followed a “train-the-trainer” model in which participants were requested to take the training information back to their schools and train other staff how to use the PAS, the TAS, and the STW. Portions of a video demonstrating the interaction of the TAS and STW were shown to participants in all training sessions. This and other “Show Me” videos were posted to the PAS for reference when trainees demonstrated the system to school staff. The PAS also included preloaded sample student records for training and practice.

The Administration Manual for Spring 2012 MEPA Online Testing was the primary resource for schools, describing the online system, security requirements, and administration information. It included screen shots and step-by-step instructions for principals, test administrators, and technology coordinators. The manual also included a troubleshooting section to help test administrators and technology coordinators with best practices and connectivity issues. The manual was distributed at the face-to-face training sessions, shipped to schools participating in WebEx sessions, posted on the home page of the PAS, and made available through the online help systems in the PAS and the TAS.

The Quick Reference Guide for Conducting Test Administration Training was revised in 2012 to provide more succinct instructions for principals/designees to use when training staff at their schools. This guide included recommendations on how to prepare for training and what critical information should be covered. Other training materials included:

• printed versions of the training session slides;
• a document about maintaining test security during online testing;
• a checklist of tasks to be completed;
• a document titled “Tips and Strategies for MEPA Online Test Administration: Practical Suggestions from Previous Online Test Administrators.”

All of the training materials were also posted in the online system so that school staff could access them after training and use them when training other staff in their schools. Although no record was kept of the number of questions asked during training sessions, MP and Department staff observed that the questions seemed more focused than in previous years. This may have been a result of having separate training sessions for new users and returning users so the trainees in any given session had a comparable level of experience with the online testing system.

Deployment

The online testing system was deployed for schools on January 10, 2012 (the scheduled date of the first training session). Files containing data for eligible students, based on the October 2011 SIMS submission, were preloaded in the system. The operational tests were not accessible at this time, but the rest of the system was open so that principals/designees could complete the following tasks:

• train staff
• add and remove students from their preloaded student data (as appropriate)
• view the video tutorial with students
• administer online practice tests
• administer online locator tests (as appropriate)
• create accounts to provide system access for staff members who would serve as test administrators, school administrators, and technology coordinators
• assign tests to each test administrator
• verify and update student data
• assign test sessions (sessions 1 & 2 or sessions 2 & 3) to students

In recognition of the logistical issues of providing computer access to each student testing online, participating schools were able to start online testing one week earlier than schools that were testing only with paper. The online operational tests became available on February 27 and were closed at the end of the day on March 16. The Principal’s Administration System remained open through March 19 to allow adequate time to update student data, including MELA-O scores, and complete the staff survey.

Overview of the Spring 2012 MEPA Online Testing System

The three components of the online testing system were the Principal’s Administration System, the Test Administrator’s System, and the Student Test Workstation. Staff members used the same user name and password to access both the PAS and the TAS. The Principal’s Administration System had numerous functions and four levels of users. The following table describes these functions and which users were able to access them.

Technology Coordinator: Technology coordinators could log into the PAS to access technical documentation and also the STW software which they were to deploy to the student computers being used for testing. Users designated as technology coordinators did not have access to the TAS or STW.

Test Administrator: A user designated as a test administrator could access the PAS to perform the following functions:
• view student records
• edit IEP and 504 plan data
• edit assigned sessions, participation status, and accommodations
• print student rosters
• create student groups
• edit MELA-O scores
• print student login tickets
• print test progress reports
• print roster of student results from the locator tests

School Administrator: A user designated as a school administrator had access to all of the functions that were available to test administrators. In addition, a school administrator could perform the following functions:
• add and remove student records
• edit change-of-enrollment status
• add staff accounts, and assign tests and student groups to staff accounts
• edit staff passwords
• de-activate staff accounts

Principal/Designee: The school principal/designee had access to all of the functions described above. In addition, the principal or designee was the only user who had the ability to end access to the TAS and STW after all testing in the school had been completed.
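The cumulative account levels described above lend themselves to a simple permission-mapping structure. The sketch below is purely illustrative (it is not Measured Progress’s implementation, and all identifiers are hypothetical); it only models the access levels listed in the table:

```python
# Illustrative only: a role-to-permission mapping mirroring the four PAS
# account types described above. Identifiers are hypothetical.

TEST_ADMINISTRATOR = {
    "view student records",
    "edit IEP and 504 plan data",
    "edit assigned sessions, participation status, and accommodations",
    "print student rosters",
    "create student groups",
    "edit MELA-O scores",
    "print student login tickets",
    "print test progress reports",
    "print roster of locator test results",
}

# Each successive account type inherits the previous type's functions.
SCHOOL_ADMINISTRATOR = TEST_ADMINISTRATOR | {
    "add and remove student records",
    "edit change-of-enrollment status",
    "add staff accounts and assign tests/student groups",
    "edit staff passwords",
    "de-activate staff accounts",
}

PRINCIPAL_DESIGNEE = SCHOOL_ADMINISTRATOR | {"end access to the TAS and STW"}

TECHNOLOGY_COORDINATOR = {"access technical documentation", "download STW software"}

PERMISSIONS = {
    "technology coordinator": TECHNOLOGY_COORDINATOR,
    "test administrator": TEST_ADMINISTRATOR,
    "school administrator": SCHOOL_ADMINISTRATOR,
    "principal/designee": PRINCIPAL_DESIGNEE,
}


def allowed(account_type: str, function: str) -> bool:
    """Return True if the given PAS account type may perform the function."""
    return function in PERMISSIONS.get(account_type, set())


# Only the principal/designee could end access to the TAS and STW.
assert allowed("principal/designee", "end access to the TAS and STW")
assert not allowed("school administrator", "end access to the TAS and STW")
```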


Online resources posted on the home page of the PAS included the following:

• the Administration Manual for Spring 2012 MEPA Online Testing (posted as a PDF)
• an online help system
• “Show Me” videos that demonstrated common tasks
• frequently asked questions
• a checklist of tasks to be completed
• recorded versions of each of the three types of training sessions and PowerPoint presentations for the first-time and returning participant sessions
• tips and strategies collected from schools that had previously participated in online testing
• release notes for MEPA online testing
• a reference sheet to assist in training test administrators
• instructions for maintaining test security
• technical requirements for online testing
• announcements
• links and instructions for technology coordinators to deploy the STW to individual computers

In the Test Administrator’s System, staff could open the test session(s) they were administering and provide the test access code to the students. As students moved through the tests, the administrators were able to monitor their progress. After students had completed testing, the administrators used the TAS to close the test session and submit students for scoring. If necessary, the administrator could also remove a student from the test.

Students entered their user name and password to log into the Student Test Workstation and used the test access code provided by their administrator in order to begin a test. The students moved through the test by clicking “Next,” or by using a test map at the bottom of their screens. Features available in the Student Test Workstation included flags that could be used to mark items for later review, the ability to change screen and font colors and font size, a highlighter tool, and an eraser. Editing tools and a character counter were provided for constructed-response items.
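The test-access flow described above (an administrator opens an assigned session in the TAS, the STW accepts a student only with the matching access code, and the administrator later closes the session and submits students for scoring) can be summarized in a short sketch. This is an illustrative model only, not the actual TAS/STW code; class and method names are hypothetical:

```python
# Illustrative only: the session/access-code flow described above.
import secrets


class TestSession:
    def __init__(self, test_name: str, assigned_administrator: str):
        self.test_name = test_name
        self.assigned_administrator = assigned_administrator
        self.access_code = None        # set when the session is opened in the TAS
        self.submitted_for_scoring = set()

    def open_in_tas(self, staff_member: str) -> str:
        # Test administrators could administer only tests assigned to them.
        if staff_member != self.assigned_administrator:
            raise PermissionError("test not assigned to this administrator")
        self.access_code = secrets.token_hex(3)   # code displayed in the TAS
        return self.access_code

    def visible_in_stw(self, entered_code: str) -> bool:
        # The test displayed in the STW only once the session was open and the
        # student entered the code provided by the administrator.
        return self.access_code is not None and entered_code == self.access_code

    def close_and_submit(self, student_ids: set) -> None:
        # After testing, the administrator closed the session and submitted
        # students for scoring.
        self.access_code = None
        self.submitted_for_scoring |= student_ids


# Example flow
session = TestSession("grades 3-4, sessions 1 & 2", assigned_administrator="staff_01")
code = session.open_in_tas("staff_01")
assert session.visible_in_stw(code)
session.close_and_submit({"student_0042"})
assert not session.visible_in_stw(code)   # a closed session is no longer accessible
```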

Tasks Schools Completed Prior to Testing

Following training, principals/designees identified staff who would administer online tests, created staff accounts, assigned tests to administrators, and trained school staff on how to use the system. As the system was preloaded with data from October SIMS, schools were also responsible for verifying student data and making updates as needed. The schools assigned students to operational test sessions (sessions 1 & 2 or sessions 2 & 3) and uploaded MELA-O scores. In addition to data management and staff training, the schools prepared their technology in advance of testing. These preparations included load testing, restricting high-bandwidth usage, configuring networks and software to allow access to the tests, and deploying the STW to student computers. Schools using laptops were also advised to make sure the computers were charged prior to each test session.


Student Preparation

Participating schools were allowed to select one or more grade spans to test online. All students in the selected grade spans were required to participate in the online test, except the few students who lacked sufficient computer skills or required an accommodation that was not available for online testing. Prior to testing, all students were required to view the Student Tutorial video, which explained to students how to navigate through the test and use each of the buttons and functions they would see on the operational test. In response to feedback from schools, the tutorial was shortened considerably (from 17 to 8 minutes) in 2012 by reducing the time spent on each element and by de-emphasizing the editing tools. In addition, narration was added to the video. Schools were able to access the tutorial through the PAS for group viewing, or they could download a link to student computers so that the video could be viewed individually.

For each grade span, the online system included a practice test and a locator test, which were open for use prior to the beginning of the testing window. Schools were instructed to give all students testing online the online version of the practice test. Students were allowed to take the practice tests as many times as necessary in order to feel comfortable using the STW. In addition, students took an online locator test if teachers were uncertain which test sessions (1 & 2 or 2 & 3) to administer. The online practice and locator tests were the same as the paper-and-pencil versions.

Test Administration

The online testing window began on February 27. Schools had a three-week window in which to complete testing for all grade spans they had designated for online testing. Staff from the Department and from Measured Progress visited a sampling of schools to observe online test administration and gather insights from school staff on the benefits and challenges of online testing. These observational visits were coordinated by Paulette Watson, the Department’s MEPA Specialist. Her summary of the school observations is included as Appendix D of this report.

Tasks Schools Completed After Testing

Test administrators were responsible for marking students as complete within the TAS after students finished the locator and/or operational tests. When appropriate, test administrators notified principals of any students who required make-up test sessions. After testing, principals/designees updated students’ accommodations data and enrollment status, as necessary, in the PAS. Principals also had the option of closing testing for their school before the end of the official testing window if all students in their schools were finished.

Surveys for school staff were available online (via SurveyMonkey). Participation was voluntary and anonymous. Each respondent could choose one of three roles (principal/designee, test administrator, or technology coordinator) that best described his or her role in online testing. The survey questions presented to each respondent varied depending upon the respondent’s role. In survey feedback, 90.7 percent of respondents (98 of 108) expressed overall satisfaction with the MEPA online testing system. For additional details of survey responses and changes suggested by users, please see Appendix E.
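The reported satisfaction rate follows directly from the counts given:

\[ \frac{98}{108} \approx 0.907 = 90.7\% \]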

Test Security

Access to the three online testing system components (PAS, TAS, and STW) required a user name and password. Letters were mailed to principals with login information and instructions for accessing the PAS. The system automatically emailed login information to other staff as the principal/designee created accounts for them. Student login information was generated in the PAS as student data was added to the system. To maintain test security, test administrators were only allowed to administer tests that had been assigned to them by the principal/designee or school administrator. A unique test access code was displayed in the TAS when the administrator opened a test session. The test would not display in the STW until the student had logged in and entered the appropriate test access code. During the testing window, including on weekends, MP monitored access to the STW and TAS to watch for unexpected activity that might require Department action.

Schools were responsible for developing a security plan and ensuring that all test administrators and technology coordinators complied with test security requirements and instructions provided by the Department. To assist in the security of the online tests, the Department made commercially manufactured cardboard “security carrels” available to surround students’ computers. These carrels were also available in the 2011 online administration, and schools reported that they were an effective deterrent to students viewing each other’s monitors during testing. A total of 42 schools ordered carrels in 2012. Of these, three schools had received carrels the previous year, but the Department allowed them to order again because they needed a different size, or because they were testing larger numbers of students. Other guidelines that were suggested to schools to prevent students from viewing other students’ computer screens included:

• seating students at every other computer within a computer lab or classroom;
• seating students using laptops in a semicircle;
• frequent monitoring by at least two test administrators in each test session.

Technical Support

A MEPA Technical Service Center was provided by MP to give telephone and email support and technical assistance to participating schools. The Technical Service Center was available from 7:00 a.m. to 5:00 p.m., Monday through Friday, and received a total of 275 calls before and during the MEPA online test administration. The Technical Service Center received fewer calls in 2012 than in previous years, possibly because schools had greater experience.


The Technical Service Center staff maintained a log of all calls received and issued reports to the Department regarding call categories, issues, and resolution. The table below lists the most frequent categories of calls and n counts for each category.

1. TAS > Client Instruction: 53
2. PAS > Account Create, Delete, Modify: 37
3. PAS > Client Instruction: 34
4. PAS > Login Issue: 25
5. STW > Login Issue: 23
6. Documentation/Technical Requirements: 18
7. PCPA: 12
8. PAS > Test Assignment: 11
9. STW > Client Instruction: 9
10. Network Issues > ISP/Local/Configuration: 9
11. Referred to MCAS Service Center: 9
12. Carrel Orders: 8
13. TAS > Login Issue: 6
14. STW > Network Error: 6
15. PAS > Reports: 5
16. PAS > Un-submitting Students for Scoring: 5
17. PAS > Test Administration Window/Ending Testing: 4
18. STW > Assistive Technology: 1

Total: 275


APPENDIX A: Participating Schools and Districts

District / School Name / Number of Students Participating Online in Each Grade Span (3–4, 5–6, 7–8, 9–12) / Total Number of Participants

Acushnet Albert F Ford Middle School 2 2
Agawam

Clifford M Granger 4 4Benjamin J Phelps 6 6Robinson Park 10 10James Clark School 6 6Roberta G. Doering School 13 13Agawam High 14 14

Amesbury Amesbury Middle 2 2Ashland Ashland Middle 4 5 9Ashland Ashland High 6 6Attleboro

Hyman Fine Elementary School 10 10Robert J. Coelho Middle School 10 8 18Cyril K. Brennan Middle School 21 15 36Wamsutta Middle School 27 16 43

Avon Ralph D Butler 2 2Barnstable

Barnstable Intermediate School 9 17 26Barnstable High 11 31 42

Bellingham Bellingham High School 6 6Belmont Belmont High 28 28Beverly

Ayers/Ryal Side School 13 6 19Briscoe Middle 4 5 9Beverly High 16 16

Billerica

Thomas Ditson 1 3 4Hajjar Elementary 5 3 8Billerica Memorial High School 4 4

Boston

John P Holland 113 55 168Orchard Gardens 108 108Sarah Greenwood 58 34 11 103Boston Middle School Academy 3 4 7Harbor School 14 16 30Boston Latin Academy 12 3 15

Brewster Eddy Elementary 3 1 4Brockton Hancock 30 7 37Brockton West Middle School 25 25 50Cambridge Cambridge Rindge and Latin 87 87

Canton

Lt Peter M Hansen 3 2 5Dean S Luce 1 1 2Wm H Galvin Middle 3 3 6Canton High 11 11

Chelsea

William A Berkowitz Elementary 20 20George F. Kelly Elementary 24 24Eugene Wright Science and Technology Academy 22 15 37Clark Avenue School 25 6 31Joseph A. Browne School 90 99 189

Chicopee

Barry 24 16 40Gen John J Stefanik 18 6 24

Concord Alcott 3 1 4Dartmouth Dartmouth Middle 1 2 3Dedham Dedham Middle School 14 13 27Dracut Dracut Senior High 3 3Eastham Eastham Elementary 1 1Everett Madeline English School 12 14 6 32Fall River

James Tansey 1 2 3Samuel Watson 10 7 17Matthew J Kuss Middle 1 1B M C Durfee High 105 105

Framingham Barbieri Elementary 90 21 111Greenfield Greenfield Middle 3 3 6 12Holliston Miller School 5 5 10Holyoke

Morgan Elementary 39 39 31 109William R. Peck School 56 73 52 181E N White Elementary 18 17 35Maurice A Donahue Elementary 26 23 49Wm J Dean Vocational Technical High 178 178

Hudson Hudson High 6 20 26

Lawrence

South Lawrence East Elementary School 62 62Alexander B Bruce 33 33Arlington Middle School 56 61 117Robert Frost 35 35Guilmette Middle School 40 41 81Parthum Middle School 46 29 75Francis M Leahy 45 8 53James F Leonard 21 52 73Henry K Oliver 42 41 32 115Edward F. Parthum 46 46Emily G Wetherbee 37 26 16 79Frost Middle School 15 21 36Business Management & Finance High School 48 48Health & Human Services High School 54 54Humanities & Leadership Development High School 38 38Math, Science & Technology High School 42 42International High School 413 413Performing & Fine Arts High School 40 40

Leominster

Fall Brook 25 9 34Southeast School 14 8 22Sky View Middle School 24 22 46Leominster Senior High 45 45Leominster High School & Center for Technical Education 12 12

Ludlow Ludlow Senior High 6 6Lynn

A Drewicz Elementary 23 9 32Cobbet Elementary 68 29 97Lincoln-Thomson 12 1 13

Malden

Forestdale 12 5 2 19Salemwood 74 67 141

Marblehead Marblehead Veterans Middle School 3 3

Mashpee Mashpee Middle School 1 1Methuen

Marsh Grammar School 9 10 5 24Comprehensive Grammar School 32 26 22 80Donald P Timony Grammar 10 12 13 35

Milford Milford Middle East 7 7
Natick J F Kennedy Middle School 7 4 11
New Bedford

John B Devalles 12 6 18Ellen R Hathaway 4 4Hayden/McFadden 23 28 51

Newburyport Rupert A Nock Middle 2 2Newton

Charles E Brown Middle 13 22 35Oak Hill Middle 15 32 47

Norwell

Grace Farrar Cole 1 1William G Vinal 1 1Norwell High 3 3

Peabody Peabody Veterans Memorial High 44 44

Provincetown Provincetown Schools 2 2Quincy

Broad Meadows Middle 4 5 9Reay E Sterling Middle 10 12 22Point Webster Middle 7 8 15North Quincy High 57 57

Revere

A. C. Whelan Elementary School 24 6 30Beachmont Veterans Memorial School 17 5 22Paul Revere 24 6 30

Salem Collins Middle 21 36 57Scituate Scituate High School 5 5Seekonk George R Martin 7 5 12Southampton William E Norris 3 2 5Springfield

Samuel Bowles 11 12 23William N Deberry 23 11 34Mary M Lynch 12 8 20Alice B Beal Elementary 4 2 6John F Kennedy Middle 44 81 125High School/Science-Tech 177 177

Stoughton

South Elementary 2 2 4Joseph H Gibbons 6 2 8

Uxbridge Uxbridge High 4 4Walpole

Bird Middle 5 3 8Eleanor N Johnson Middle 1 6 7

Waltham

Northeast Elementary School 25 20 45Henry Whittemore Elementary School 18 1 19

Wareham

John William Decas 1 1Wareham Middle 1 1

Watertown James Russell Lowell 10 5 15
Westford Stony Brook School 4 2 6
Weymouth

Maria Weston Chapman Middle School 8 8Abigail Adams Middle School 16 16Weymouth High School 24 24

Winchendon Toy Town Elementary 3 2 5Woburn Daniel L Joyce Middle School 4 9 13Worcester

Lincoln Street 29 29May Street 28 15 43Claremont Academy 51 72 123Forest Grove Middle 145 145

Northampton-Smith Vocational Agricultural

Smith Vocational and Agricultural High 9 9

Academy of the Pacific Rim Charter

Academy of the Pacific Rim Charter Public School 3 3

Community Charter School of Cambridge

Community Charter School of Cambridge 6 5 11

Lowell Community Charter Public

Lowell Community Charter Public School 108 39 147

Salem Academy Charter (District) Salem Academy Charter School 3 7 4 14Hampden Charter School of Science

Hampden Charter School of Science 2 1 7 10

Dennis-Yarmouth

Ezra H Baker 4 4Marguerite E Small Elementary 13 12 25Laurence C MacArthur Elementary 9 9Station Avenue Elementary 7 7Mattacheese Middle School 10 12 22N H Wixon Middle 6 9 7 22Dennis-Yarmouth Regional High 22 22

Dover-Sherborn

Dover-Sherborn Regional Middle 2 1 3

Nauset Nauset Regional Middle 2 4 6

Nauset Regional High 11 11
Hampshire Hampshire Regional High 2 1 3
Mount Greylock Mt Greylock Regional High 2 2
Northboro-Southboro Algonquin Regional High 6 6
Whitman-Hanson Louise A Conley 2 2
Blackstone Valley Regional Voc Blackstone Valley 7 7
Blue Hills Regional Vocational

Blue Hills Regional Vocational Technical 8 8

Greater Lowell Regional Vocational Technical

Gr Lowell Regional Vocational Technical 127 127

South Middlesex Regional Vocational Technical

Joseph P Keefe Technical High School 35 35

Total for All Schools and All Districts 1589 1496 1288 1843 6216


APPENDIX B: Participation of Schools and Districts 2010–2012

District / School Name / Participated in 2010 / Participated in 2011 / Participated in 2012

Academy of the Pacific Rim Charter

Academy of the Pacific Rim Charter

Acushnet Albert F Ford Middle School

Agawam Clifford M Granger √ √

Benjamin J Phelps √ √

James Clark School √

Robinson Park √ √

Roberta G. Doering School √ √

Agawam Junior High √

Agawam High √ √

Amesbury Amesbury Middle √

Arlington Hardy √

Ashland Ashland Middle √

Ashland High √

Attleboro Hyman Fine Elementary School

√ √

Robert J. Coelho Middle School

√ √

Cyril K. Brennan Middle School

Wamsutta Middle School √

Avon Ralph D Butler √

Barnstable Centerville Elementary √

Barnstable Intermediate School

√ √

Barnstable High √ √

Bellingham Bellingham High School √


Belmont Belmont High √ √

Beverly Ayers/Ryal Side School √

Briscoe Middle √

Beverly High √

Billerica Thomas Ditson √ √

Hajjar Elementary √ √

Marshall Middle School √

Billerica Memorial High School

√ √

Blackstone Valley Regional Voc

Blackstone Valley √

Blue Hills Regional Vocational

Blue Hills Regional Vocational

Boston Blackstone √

Fenway High School √

Hugh Roe O’Donnell √

Jackson Mann √ √

James P Timilty Middle √

John F Kennedy √

John P Holland √ √

Nathan Hale √

Odyssey High School √

Oliver Hazard Perry √ √

Orchard Gardens √

Dr. William Henderson √

Samuel W Mason √

Sarah Greenwood √ √ √

Winship Elementary √ √


Boston Middle School Academy

√ √

Harbor School √ √

Monument High School √

Boston Latin Academy √ √

Boston Arts Academy √

O'Bryant School Math/Science

Brewster Eddy Elementary √

Brockton East Middle School √

Manthala George Jr School √

Hancock √ √ √ West Middle School √ √ √

Cambridge Maria L. Baldwin √

Amigos School √ √

King Open √ √

Cambridge Rindge and Latin

√ √ √

Canton Lt Peter M Hansen √

Dean S Luce √

Wm H Galvin Middle √

Canton High √

Chelsea William A Berkowitz Elem √ √

George F. Kelly Elem √ √

Eugene Wright School √ √

Clark Avenue School √ √

Joseph A. Browne School √ √ √


Chicopee

Barry √ √ √

Bowe √ √

Selser √ √

Fairview Middle √ √

Chicopee High √

Chicopee Comprehensive HS √

Gen John J Stefanik √

Community Charter School of Cambridge

Community Charter School of Cambridge

Concord Alcott √

Dartmouth Dartmouth Middle √

Dedham Dedham Middle School √

Dennis-Yarmouth

Ezra H Baker √ √

Marguerite E Small Elem √ √

Laurence C MacArthur Elem √ √

Station Avenue Elem √ √

Mattacheese Middle School √ √

N H Wixon Middle √ √

Dennis-Yarmouth Reg High √ √

Dover-Sherborn Dover-Sherborn Regional Middle

Dracut Dracut Senior √

Eastham Eastham Elementary √


Everett George Keverian School √ √

Lafayette School √

Everett High √ √

Madeline English School √

Palin School √

Fall River Letourneau Elementary School √

James Tansey √

Samuel Watson √

B M C Durfee High √ √

Matthew J Kuss Middle √ √

Framingham Barbieri Elementary √ √

Fuller Middle √

Framingham High School √

Greater Lowell Regional Vocational

Greater Lowell Regional Vocational

Greenfield Greenfield Middle √

Hampden Charter School of Science

Hampden Charter School of Science

Hampshire Hampshire Regional High √

Haverhill Consentino √

Dr Paul Nettle √

Haverhill High √

John G. Whittier √

Tilton √

Holliston Miller School √


Holyoke Morgan Elementary √ √

William R. Peck School √ √ √

Maurice A Donahue Elem √ √

Holyoke High √ √

Center for Excellence √ √

Wm J Dean Voc Tech High √ √ √

E N White Elementary √

Hudson Hudson High √ √

Lawrence South Lawrence East Elementary

√ √ √

Arlington Elementary School √ √

Alexander B Bruce √ √ √

South Lawrence East Middle School

Arlington Middle School √ √ √

Robert Frost √ √ √

Parthum Middle School √ √ √

Francis M Leahy √ √ √

James F Leonard √ √ √

Henry K Oliver √ √ √

Edward F. Parthum √ √ √

Emily G Wetherbee √ √ √

Frost Middle School √ √ √

Business Management & Finance

√ √

Health & Human Services High √ √ √

Humanities & Leadership Development

√ √ √

Math Science & Technology High

√ √ √


International High School √ √ √

Performing & Fine Arts High School

√ √ √

High School Learning Center √ √

School for Exceptional Studies √ √

Guilmette Middle School √ √

John K Tarbox √

Leominster Fall Brook √ √ √

Southeast School √ √ √

Samoset School √ √

Sky View Middle School √ √ √

Leominster Senior High √ √

Leominster High Schools & Center for Technical Education

Northwest √

Leominster High School & Center

Lowell Bartlett Community Partnership √

Joseph McAvinnue √

Pyne Arts √

Lowell Community Charter Public

Lowell Community Charter Public

√ √

Ludlow Ludlow Senior High √

Lynn Cobbet Elementary √ √

A Drewicz Elementary √ √

Capt William G. Shoemaker √

Lincoln-Thomson √ √ √

Lynn Woods √

Malden Forestdale √

Salemwood √ √ √


Marblehead Marblehead Veterans Middle School

Mashpee Mashpee Middle School √

Methuen Donald P Timony Grammar √ √

Tenney Grammar School √

Marsh Grammar School √ √ √

Comprehensive Grammar School √ √ √

Milford Milford Middle East √ √

Montachusett Montachusett Reg Voc Tech √

Mount Greylock Mt Greylock Regional High √

Natick J F Kennedy Middle School √

Nauset Nauset Regional Middle √

Nauset Regional High √

New Bedford John B Devalles √ √ √

Ellen R Hathaway √ √ √

Betsey B. Winslow √

Hayden/McFadden √ √ √

Newburyport Rupert A Nock Middle √

Newton Charles E Brown Middle √ √ √

Oak Hill Middle √ √

Northampton-Smith Vocational Agricultural

Smith Vocational & Agricultural

Northboro-Southboro

Algonquin Regional High √

Norwell Grace Farrar Cole √

William G Vinal √

Norwell High √

Norwood Dr. Philip O. Coakley Middle School


Peabody Peabody Veterans Memorial High √ √

Provincetown Provincetown Schools √

Quincy Broad Meadows Middle √

Reay E Sterling Middle √ √

Point Webster Middle √

North Quincy High √

Revere Abraham Lincoln √

A. C. Whelan Elementary School √ √ √

Beachmont Veterans Memorial School √ √

Paul Revere √ √ √

Susan B. Anthony Middle School √ √

Revere High √ √

Salem Bates √

Saltonstall School √

Collins Middle √ √

Salem Academy Charter (District)

Salem Academy Charter School √

Scituate Scituate High School √

Seekonk George R Martin √

South Middlesex Regional Voc

Joseph P Keefe Technical High √

Southbridge Mary E Wells Jr High √

Southampton William E Norris √


Springfield Samuel Bowles √

William N Deberry √ √

Lincoln √ √

Dryden Memorial √ √

Mary M Lynch √

Mary M Walsh √ √

Sumner Avenue √

Alice B Beal Elem √ √ √

John F Kennedy Middle √ √ √

High School/Science-Tech √

Stoughton Joseph R Dawe Jr Elem √

South Elementary √ √

Joseph H Gibbons √

West Elementary √

Taunton Taunton High √

Uxbridge Uxbridge High √

Walpole Bird Middle √

Eleanor N Johnson Middle √

Waltham Northeast Elementary School √ √ √

Henry Whittemore Elementary School √ √ √

Wareham John William Decas √

Wareham Middle √

Watertown Hosmer √

James Russell Lowell √ √

Westford Stony Brook School √


Weymouth Maria Weston Chapman Middle School √ √

William Seach √

Thomas W. Hamilton Primary School

Abigail Adams Middle School √ √

Weymouth High School √ √

Whitman-Hanson Louise A Conley √

Winchendon Toy Town Elementary √

Woburn Mary D Altavesta √

Daniel L Joyce Middle School √ √

Worcester Lincoln Street √

May Street √ √

Claremont Academy √ √ √

Forest Grove Middle √ √ √

Burncoat Senior High √

Goddard School/Science Tech √

Jacob Hiatt Magnet √

Roosevelt √


APPENDIX C: 2012 MEPA-R/W Online Testing: Technical Requirements for School-based Technology

Local Network

Network Connection Specifications: Wired (required for each test administrator’s computer): 100 Mbps Fast Ethernet TCP/IP (minimum). Wireless: minimum 802.11g; recommended 802.11n.
Internet Bandwidth: Minimum 1 Mbps, with at least 8 Kbps per concurrent user.
Firewall / Proxy / Internet Content Filtering: Set to allow connections to *.measuredprogress.org.
Uniform Resource Locators (URLs): Set to allow connections to https://mepaonline.measuredprogress.org.
Internet Protocol (IP) Addresses: Set to allow connections to 64.140.199.0 through 64.140.199.24, inclusive.
Ports: Set to allow connections to ports 80 and 443.
Email: Allow emails from @measuredprogress.org, as the system sends account information via email to users on behalf of the person creating/updating the account.

Students’ Computers (one for each student testing concurrently)

Operating System (see note 1): Windows XP SP3 (32-bit only), Windows Vista SP2 (32-bit only), or Windows 7 Home Premium or greater; Macintosh OS X 10.4.11, 10.5.8, or 10.6.8. Note: 10.6 Snow Leopard requires that the optional component Rosetta be installed from the OS 10.6 installation disk on Intel-based computers.
RAM: Windows XP SP3: 512 MB or greater; Windows Vista SP2: 1 GB or greater; Windows 7: 1 GB or greater (32-bit), 2 GB or greater (64-bit). OS X 10.4 Tiger or 10.5 Leopard: 512 MB or greater; 10.6 Snow Leopard: 1 GB or greater.
Internet Browser: None required. Firefox Portable Kiosk (FPK) is installed with the Student Test Workstation (STW) software.
Processor: Windows: Pentium III 1.33 GHz or greater (32-bit) or 1 GHz x86-64 or greater (64-bit); Macintosh: G4 867 MHz or greater.
Flash Player: Version 10, installed as part of the FPK.
Keyboard/Mouse: Standard.
Monitor: 32-bit color or greater; 1024 x 768 resolution or greater.
Fonts: Times New Roman, Helvetica, and Verdana.

Principals’ and Test Administrators’ Computers

Operating System (see note 1): Windows XP SP3 (32-bit only), Windows Vista SP2 (32-bit only), or Windows 7 Home Premium or greater; Macintosh OS X 10.4.11, 10.5.8, or 10.6.8.
RAM: Windows XP SP3: 512 MB or greater (32-bit); Windows Vista SP2: 1 GB or greater (32-bit); Windows 7: 1 GB or greater (32-bit), 2 GB or greater (64-bit). OS X 10.4 Tiger or 10.5 Leopard: 512 MB or greater; 10.6 Snow Leopard: 1 GB or greater.
Processor: Windows: Pentium III 1.33 GHz or greater (32-bit) or 1 GHz x86-64 processor or greater (64-bit); Macintosh: G4 867 MHz or greater.
Internet Browser: Windows: Internet Explorer 7.x (32-bit or 64-bit), 8.x, or 9.x (see note 2); Firefox 3.6.20 or Firefox 5.x. Macintosh: Safari 3.2.3 or Safari 4.0.4; Firefox 3.6.20 or Firefox 5.x.
Flash Player: Version 10.
Pop-Up Blocking Software: Must be configured to allow pop-ups from *.measuredprogress.org.
Keyboard/Mouse: Standard.
Monitor: 32-bit color or greater; 1024 x 768 resolution or greater.

Note 1: It is recommended that auto-updates for operating systems and browsers be turned off on all computers used for testing.

Note 2: Internet Explorer 9 is supported for the Test Administrator’s System, but not the Principal’s Administration System.
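
The settings above reduce to a small set of host, port, and bandwidth checks that a technology coordinator could verify before testing. The sketch below is illustrative only and is not part of the MEPA software: it assumes nothing beyond the hostname, ports, and 8 Kbps-per-student guideline listed in this appendix, and the function names are hypothetical.

    import socket

    # Values taken from the requirements above; everything else in this
    # sketch (names, timeout) is illustrative.
    TEST_HOST = "mepaonline.measuredprogress.org"
    REQUIRED_PORTS = (80, 443)
    SITE_MINIMUM_KBPS = 1000          # 1 Mbps site minimum
    KBPS_PER_CONCURRENT_STUDENT = 8   # at least 8 Kbps per concurrent user

    def port_reachable(host, port, timeout=5.0):
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    def required_bandwidth_kbps(concurrent_students):
        """Bandwidth a lab needs, per the per-student guideline and site minimum."""
        return max(SITE_MINIMUM_KBPS,
                   concurrent_students * KBPS_PER_CONCURRENT_STUDENT)

    if __name__ == "__main__":
        for port in REQUIRED_PORTS:
            status = "open" if port_reachable(TEST_HOST, port) else "blocked"
            print(f"{TEST_HOST}:{port} -> {status}")
        # A 25-seat lab needs 25 x 8 = 200 Kbps, so the 1 Mbps site minimum governs.
        print("25 concurrent students:", required_bandwidth_kbps(25), "Kbps required")

If either port reports as blocked, the firewall, proxy, or content filter would need to be adjusted to allow *.measuredprogress.org and the IP range listed above.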


APPENDIX D: Summary of School Observations during Spring 2012 MEPA Online Testing

Staff from the Department and Measured Progress visited six schools to observe students, to meet with school leaders, and to learn about their experiences with online MEPA testing. The table below shows details about the visits.

School Name | District Name | Date of Visit | ESE Staff | MP Staff | No. of Students Testing Online | Grade Levels Observed
Sarah Greenwood | Boston | 3/1/12 | Paulette Watson | (none listed) | 100 | 3–4
Lowell Community Charter | Lowell Community Charter | 3/7/12 | Paulette Watson | Mary Beth Myers, Stanley Constant | 152 | 3–4
John P. Holland | Boston | 3/9/12 | Paulette Watson, Dan Wiener | (none listed) | 200 | 3–4
Collins Middle | Salem | 3/12/12 | Paulette Watson | (none listed) | 100 | 6–8
Greater Lowell Voc. Tech. | Gr. Lowell Voc. Tech. | 3/14/12 | Paulette Watson | (none listed) | 150 | 9–12
International High School | Lawrence | 3/8/12 | Paulette Watson | Renee Tavarez, Samantha Phelps | 428 | 9–12

The main reasons for the visits were to observe and discuss the following:

the facility/skill of students in the lower grades in using computers for testing

scheduling and logistics in schools testing large numbers of students

security measures used to maintain the integrity of the test

Facility with Computers

Approximately 40 students in the 3–4 grade span were observed in each of three elementary schools. The majority of both the third and fourth graders displayed excellent keyboarding skills and command of the computer, indicating thorough facility with the technology. In all three elementary schools, test administrators reported that the students had been receiving weekly computer classes since kindergarten. In all six schools visited (grades 3–12), students used computers regularly for ordinary classwork, using programs such as Rosetta Stone, Lexia, Envision Math, and Study Island. In two schools, students took online tests three times per year for MAP testing. Although students in all six schools were able to complete the test without difficulty using the computer, a few students were observed typing more slowly and using one or two fingers. It was explained that these students had arrived in the school only this year from other countries.

Scheduling and Logistics

Time Needed for Testing

Five of the six schools visited used two weeks or fewer to complete testing. Only the International High School in Lawrence needed all three weeks of the test administration window to complete testing, because the school had had an influx of 200 additional ELL students this year. All schools except Greater Lowell Vocational Technical had participated previously in MEPA online testing and had no difficulty scheduling testing. Schools reported that a very useful way to economize on time and space was to administer sessions with read-aloud sections first and then, on subsequent days, to administer several tests without read-aloud sections concurrently.

Number of Computer Labs Used

Collins Middle School and Sarah Greenwood School each used one computer lab with about 25 computers, while the others used two or more labs of the same size. Greater Lowell used four labs, and International High used three. Only the two high schools reported that scheduling was difficult: International because of the large number of students, and Greater Lowell because this was the school’s first experience with online testing.

Organizing the Use of Test Administrators

One strategy that helped to reduce scheduling difficulties and avoid disruption of the regular school schedule was to have ELL teachers in the labs during the times they would normally have ELL students in class. Several schools also indicated the use of substitute teachers in the regular classroom when the classroom teacher had to administer the test.

Bandwidth Restrictions

Restricting the use of the Internet, especially streaming video in other classes during MEPA testing, resulted in a technically problem-free administration for five schools. At one school, there were major disruptions on the day we visited because of bandwidth problems: six schools in the same building shared the same network, which made it difficult to control Internet use and video streaming during MEPA testing. However, the principal was able to resolve the problem and complete testing on time by more strictly enforcing the ban on Internet use in his school.

Security

The use of two test administrators to monitor between 10 and 20 students was the general practice in all the schools visited. This, in addition to the use of carrels, resulted in secure testing environments. The carrels were generally in very good condition, even where test coordinators had used them for other testing during the school year. In one school visited, the technology coordinator had not installed the secure Firefox Portable Kiosk (FPK) on all the students’ computers. This kiosk was provided by Measured Progress to lock students into the test and prevent access to other material on their computers or the Internet.


That school had instead used an internal security blocking system; it was the same school that experienced the bandwidth issues that made it difficult for students to log on. Once the FPK was installed, students were able to log on, although the bandwidth issues persisted.


APPENDIX E: Overview of Survey Results

An online survey was developed and administered via SurveyMonkey to participating schools. The survey instructions and link were posted on the home page of the PAS, and all school staff who took part in online testing were encouraged to complete the survey. Participation in the survey was voluntary and anonymous. Respondents had the opportunity to skip questions or end their participation in the survey at any time; thus, response counts varied by survey question. The survey included multiple-choice and open-response questions. In some cases, the total number of responses is greater than the total number of respondents because survey participants were allowed to select more than one response. In other cases, questions were displayed only if the respondent selected a particular answer to a previous question.
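
Because respondents could skip questions and select more than one answer, the "(X of Y)" figures throughout this appendix are computed against the number of respondents to each individual question rather than the 109 total. The short sketch below illustrates that tally; the category labels and counts are hypothetical, not actual survey data.

    from collections import Counter

    # Hypothetical multi-select answers from three respondents.
    responses = [
        ["seating configuration", "carrels/dividers"],
        ["seating configuration"],
        ["seating configuration", "privacy screens"],
    ]

    respondents = len(responses)                       # people who answered this question
    counts = Counter(c for r in responses for c in r)  # per-category response totals

    for category, count in counts.items():
        # Percentages use the per-question respondent count, so the category
        # totals (5 here) can exceed the number of respondents (3).
        print(f"{category}: {count} of {respondents} ({count / respondents:.1%})")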

Participation in Surveys

Fewer school staff responded to the 2012 survey than responded to the 2011 survey. Of the 109 people who responded to at least one question in the survey:

38.5% (42 of 109) were principals/designees;

56.0% (61 of 109) were test administrators; and

5.5% (6 of 109) were technology coordinators.

Ninety-six principals/designees indicated the number of English language learner students who participated in MEPA online testing in each grade span. Unlike prior years, the greatest number of students participated at grade span 7–8. The table below includes the online testing participation count for all grade spans based on survey feedback.

MEPA Online Testing Participation by Grade Span (response average, response total, response count)
Grade span 3–4: average 17.60, total 704, count 40
Grade span 5–6: average 14.90, total 924, count 62
Grade span 7–8: average 24.31, total 1,264, count 52
Grade span 9–12: average 28.97, total 1,072, count 37
Answered question: 96
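
For reference, the response average in the table above is simply the response total divided by the response count for that grade span, i.e., the average number of online testers reported per responding principal/designee. A quick check using only the figures in the table:

    # Figures from the table above: grade span -> (response total, response count).
    grade_spans = {
        "3-4": (704, 40),
        "5-6": (924, 62),
        "7-8": (1264, 52),
        "9-12": (1072, 37),
    }

    for span, (total, count) in grade_spans.items():
        # e.g., 704 / 40 = 17.60 students per responding school for grades 3-4
        print(f"Grade span {span}: {total / count:.2f} average")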

Overall Satisfaction

School staff continued to be satisfied with the MEPA online testing system: 90.7 percent of survey respondents (98 of 108) indicated satisfaction with the overall system, an increase of about 10 percent between 2011 and 2012.

Student Tutorial Video

Between 2011 and 2012, a slight increase was seen in the percentage of respondents who indicated the Student Tutorial video was shown to students in their schools. The percentage of respondents who felt the tutorial was helpful in preparing students for online testing increased by almost 15 percent.

94.0% of principals/designees and test administrators (94 of 100) indicated that all students viewed the Student Tutorial video at least once.


83.0% of principals/designees and test administrators (83 of 100) felt the Student Tutorial video was helpful in preparing students for online testing.

Thirty-two principals/designees and test administrators responded to the question about how the Student Tutorial video could be improved. The suggestions offered in 2012 differed from those offered in 2011, perhaps because the 2012 tutorial video was updated to address the top two suggestions from 2011. The top five categories of 2012 responses are summarized in the table below.

Top 5 Categories of Comments and Suggestions for Improving the Student Tutorial Video (response percent and count)
No change – good as is: 21.9% (7)
Practice test was more helpful to students – the video was not necessary: 18.8% (6)
Make it interactive: 15.6% (5)
Content changes (e.g., narration in another language, change the pace, etc.): 15.6% (5)
Increase the time the tutorial and practice tests are available to students: 12.5% (4)
Answered question: 32

Practice Tests and Locator Tests

Use of the online practice tests as reported by survey respondents in 2012 was about the same as in 2011, whereas use of the online locator tests decreased by about 30.6 percent between 2011 and 2012.

97.9% of principals/designees and test administrators (95 of 97) who administered the practice tests used the online testing system as opposed to downloading the tests from the Department’s website.

94.9% of principals/designees and test administrators (94 of 99) indicated that all students took the online practice test at least once.

88.9% of principals/designees and test administrators (88 of 99) reported that their schools reviewed the online practice test with the students to make sure they understood how to respond to each type of question.

74.7% of principals/designees and test administrators (74 of 99) reported that their schools did not administer the practice test more than once to any students.

The majority of principals/designees and administrators indicated that their schools did not administer locator tests to students.

o 35.4% of principals/designees and test administrators (35 of 99) said their schools administered online locator tests to students.

o 9.1% of principals/designees and test administrators (9 of 99) said that their schools downloaded the locator tests from the Department’s website.

Twenty-four principals/designees and test administrators responded to the question about improving the online practice tests. The percentage of respondents who thought the practice tests required no changes increased by more than 10 percent between 2011 and 2012. The following table includes the top five categories of responses.


Suggestions for Improving the Online Practice Test (response percent and count)
None – good as is: 37.5% (9)
Change length (shorter or longer): 16.7% (4)
More practice questions and/or tests: 16.7% (4)
Include more information in TAMs about practice tests (e.g., script, answers): 8.3% (2)
Increase the time the tutorial and practice tests are available to students: 8.3% (2)
Answered question: 24

Test-Taking Experience

Principals/designees and test administrators continued to think that students understood the online instructions and used test-taking strategies similar to those they use when taking paper tests. Between 2011 and 2012, there was only a slight increase in the percentage who thought students understood the online instructions; the percentage who thought students used similar test-taking strategies, however, increased by about 10 percent.

97.0% of principals/designees and test administrators (98 of 101) thought that students generally understood the onscreen instructions for taking the MEPA online tests. The table below includes reasons offered by eight principals and test administrators.

Reasons that Students Generally Understood the Onscreen Directions for Taking the MEPA Online Tests (response percent and count)
Familiarity with computers: 50.0% (4)
Teacher answered questions and provided guidance: 37.5% (3)
Clear, concise directions: 12.5% (1)
Answered question: 8

58.4% of principals/designees and test administrators (59 of 101) thought that students used the same test-taking strategies they used on the paper test. The table below includes the top five categories of comments offered about student strategies.

Top 5 Strategies Used by Students and/or School Staff Comments About Student Strategies (response percent and count)
Students highlighted key words/phrases: 65.6% (21)
Students used more test-taking strategies for computer-based testing than for paper-and-pencil tests: 18.8% (6)
Students used the font tools (e.g., bold, color change): 15.6% (5)
Students used fewer test-taking strategies for computer-based testing than for paper-and-pencil tests: 12.5% (4)
Students reviewed their answers and/or passages: 12.5% (4)
Answered question: 32
Note: Some responses fell into more than one category; thus, the total count is greater than the number of respondents.


Principal’s Administration System

Eighty-one principals/designees and test administrators rated how easy it was to use seven aspects of the PAS on a 5-point rating scale, where 1 was very difficult and 5 was very easy.

All components received an average rating of 4.08 or higher. Twenty-eight staff members responded to the question about improving the PAS. The table below includes the top five categories of responses.

Top 5 Categories of Suggested Changes to the Principal’s Administration System (response percent and count)
System is easy to use/fine as it is: 28.6% (8)
Streamline the test assignment/monitoring progress processes: 17.9% (5)
Improve content of resource materials (e.g., how to assign tests, clarification on roles, etc.): 17.9% (5)
Reduce number of steps to update student data: 14.3% (4)
Training and service center support was helpful: 10.7% (3)
Answered question: 28
Note: Some responses fell into more than one category; thus, the total count is greater than the number of respondents.

Direct data entry was the method preferred by the majority of the seventy-three principals/designees and test administrators who submitted MELA-O scores through the PAS, just as it was the preferred method of survey respondents in 2011.

o 68.5% of principals/designees and test administrators (50 of 73) preferred the direct data entry method for submitting MELA-O scores.

o 13.7% of principals/designees and test administrators (10 of 73) preferred the upload method.

o 17.8% of principals/designees and administrators (13 of 73) had no preference.

Test Administrator’s System

Similar to the 2011 ratings of the TAS, all aspects of the TAS were rated as easy/very easy to use in 2012. Ninety-one principals/designees and test administrators rated the TAS segments on a 5-point rating scale, where 1 was very difficult and 5 was very easy. All components received an average rating of 4.27 or higher. Twenty-eight principals/designees and test administrators responded to the question about improving the TAS. The following table includes the top five categories of responses.


Top 5 Categories of Comments about and Suggested Changes to the Test Administrator’s System (response percent and count)
System is easy to use/fine as it is: 32.1% (9)
Simplify processes/screens: 25.0% (7)
Content of and access to resource materials (e.g., how and when to submit students for scoring, online access to TAMs): 14.3% (4)
Logging into the TAS: 10.7% (3)
None: 10.7% (3)
Answered question: 28

Training

Similar to the 2011 survey results, the majority of school staff thought that the Department-sponsored training prepared them appropriately to participate in online testing. More than three-quarters of test administrators (49 of 61) attended Department-sponsored training sessions; the remainder may have been trained by someone who attended a Department-sponsored session.

Only one technology coordinator (1 of 6) responded to any questions about training. This respondent indicated that he/she attended a Department-sponsored session and that the recorded WebEx session was sufficient to prepare the school’s technology for online testing.

97.3% of principals/designees (36 of 37) thought they received sufficient information to prepare their school’s technology for online testing.

94.4% of principals/designees (34 of 36) indicated the training adequately prepared them to train test administrators in their schools.

87.8% of test administrators (43 of 49) indicated they received sufficient information during training to administer online tests effectively.

Eight principals/designees suggested changes to the Department-sponsored training sessions. The responses are summarized in the table below.

Principal’s/Designee’s Comments and Suggested Changes for Training (response percent and count)
No change – good as is: 37.5% (3)
More hands-on time using the system: 25.0% (2)
Simplify the content in training/manual: 25.0% (2)
Require more than one person from school to participate in a training session: 12.5% (1)
Answered question: 8


Twenty-four test administrators responded to the question about the Department-sponsored training sessions. The responses are summarized in the table below.

Top 5 Categories of Test Administrator’s Comments and Suggested Changes for Training (response percent and count)
No change – good as is: 20.8% (5)
More time simulating test administration: 16.7% (4)
Prefer face-to-face session: 16.7% (4)
Changes to level of detail provided (two respondents wanted more detail, one wanted less): 12.5% (3)
Change timing and/or location of training: 12.5% (3)
Answered question: 24

Support Documentation

The Administration Manual for Spring 2012 MEPA Online Testing was the main reference document. It included detailed information and procedures for using the online testing components and provided troubleshooting information for school and district technology coordinators. Additional online resources were available through the home page of the PAS, and the online help system was also available through the TAS. Similar to 2011, the hard copy of the manual distributed at training sessions continued to be the most frequently used format, and the online help system was a distant second choice of respondents. One hundred six school staff members indicated which resources they used, including which format of the Administration Manual for Spring 2012 MEPA Online Testing.

88.7% (94 of 106) used the hard copy of the Administration Manual for Spring 2012 MEPA Online Testing distributed at the training sessions.

18.9% (20 of 106) used the Online Help System from the PAS and/or the TAS.

17.0% (16 of 106) used the PDF version of the Administration Manual for Spring 2012 MEPA Online Testing provided on the home page of the PAS.

5.7% (6 of 106) did not use the manual.

83.3% (60 of 72) were satisfied with the Online Help System.

Similar to 2011, the Student Tutorial video continued to be the most frequently used PAS home page resource. However, a resource which was new for 2012, the Checklist of Tasks to Be Completed for Spring 2012 MEPA Online Testing, was the second most frequently used resource. The following table shows response data for all PAS resources.


Online PAS Home Page Resources Used (response percent and count)
Student Tutorial Video: 81.1% (73)
Checklist of Tasks to Be Completed for Spring 2012 MEPA Online Testing: 56.7% (51)
Show Me Videos: 55.6% (50)
Frequently Asked Questions (FAQs): 38.9% (35)
Maintaining Test Security During MEPA Online Testing: 37.8% (34)
Quick Reference Sheet for Test Administration Training: 36.7% (33)
Tips and Strategies for MEPA Online Testing: 31.1% (28)
Technology Information: 25.6% (23)
Recorded Webinars: 22.2% (20)
Announcements: 22.2% (20)
Answered question: 90
Note: Respondents were asked to check all that apply; thus, the total response count is greater than the number of respondents.

Ninety-eight school staff members rated how helpful the Administration Manual for Spring 2012 MEPA Online Testing was in providing information on eight topics using a 5-point rating scale, where 1 was not very helpful and 5 was very helpful.

All topics received a rating of 4 or higher, even the Troubleshooting section, which had received a rating of 3.88 in 2011.

Thirteen school staff responded to the question about improvements to the online manual. The responses are summarized in the table below.

Comments and Suggested Changes to Online Manual (response percent and count)
None: 38.5% (5)
Change organization and/or language: 23.1% (3)
Useful as is: 15.4% (2)
Off-topic/not related to manual: 15.4% (2)
Explain that the student survey no longer exists: 7.7% (1)
Answered question: 13

Technical Assistance

Between 2011 and 2012, the percentage of principals and test administrators who reported that their schools experienced technical issues decreased by about 10 percent. As expected, a technology coordinator’s perception of a technical issue varied from a principal’s or test administrator’s perception.

None of the technology coordinators (0 of 6) indicated that their school/district experienced technical issues during online testing.

48.0% of principals/designees and test administrators (48 of 100) indicated that their school experienced technical issues during online testing.


With two exceptions, all technical issues experienced by schools/districts were resolved. The majority of issues were resolved locally (either by the technology staff, by following manual instructions, or by the local internet service provider).

73.5% of principals/designees and test administrators (36 of 49) resolved the issues themselves.

53.1% of principals/designees and test administrators (26 of 49) received help from a technology staff person in the school/district.

53.1% of principals/designees and test administrators (26 of 49) contacted the MEPA technical service center.

4.1% of principals/designees and test administrators (2 of 49) indicated that they were unable to resolve the technical issues.

4.1% of principals/designees and test administrators (2 of 49) specified other responses. One school indicated that the issue was insufficient bandwidth; the other school did not specify what the issue was.

When school staff contacted the MEPA Technical Service Center, representatives were helpful and able to resolve the issue within one business day.

100% (47 of 103) indicated the MEPA Technical Service Center was able to resolve their issue within one business day.

93.5% (43 of 46) indicated the MEPA Technical Service Center representative was helpful.

45.6% (26 of 48) contacted the MEPA Technical Service Center for technical support.

Test Security

Schools were required to develop a school-based test security plan detailing the measures they would use during online testing. Additionally, the Department made cardboard dividers available to schools that participated in online testing and that did not receive these dividers in 2011. Survey questions about test security were revised for the 2012 survey in accordance with the updated test security requirements, so comparisons with the 2011 survey results are limited. Seating configuration continued to be a frequently used measure for ensuring test security.

92.7% of school staff (97 of 105) indicated that seating configurations were used to ensure test security.

60.0% of school staff (63 of 105) indicated that carrels or dividers between student computers were used to ensure test security.

7.6% of school staff (8 of 105) indicated that privacy screens were used to ensure test security.