Miami-Dade County Public Schools
Office of Evaluation and Research

1500 Biscayne Boulevard
Miami, Florida 33132

EVALUATION OF THE EDISON PROJECT SCHOOL

Final Report: 1999-00 School Year

April 2001

Principal Evaluators/Authors:
Joseph J. Gomez, Ph.D.
Sally A. Shay, Ph.D.


THE SCHOOL BOARD OF MIAMI-DADE COUNTY, FLORIDA

Ms. Perla Tabares Hantman, Chair
Dr. Michael M. Krop, Vice-Chair
Dr. Robert B. Ingram
Ms. Betsy H. Kaplan
Mrs. Manty Sabatés Morse
Ms. Jacqueline V. Pepper
Mr. Demetrio Pérez, Jr., M.S.
Dr. Marta Pérez
Dr. Solomon C. Stinson

Ms. Marylynne K. Hunt-Dorta, Student Adviser

Mr. Roger C. Cuevas, Superintendent of Schools

Ms. Carol Cortes, Deputy Superintendent, Management and Accountability


PREFACE

The Miami-Dade County Public Schools has a five-year contract with Edison Schools Inc. (formerly The Edison Project) to manage Henry E.S. Reeves Elementary School through the 2000-01 school year. Under the provisions of the contract, an evaluation of the project was conducted by the district's Office of Evaluation and Research in conjunction with Edison Schools Inc. The evaluation, which adhered to strategies stipulated in Appendix C of the contract, spanned the initial four years of the project. At the conclusion of each year, a report detailing the cumulative findings to date was published. Accordingly, a total of four reports have been generated by the evaluation: three interim reports and a final, summary report. The following document is the latter, the Final Report: 1999-00 School Year.


TABLE OF CONTENTS

EXECUTIVE SUMMARY
DESCRIPTION OF THE PROJECT
    Privatization in Miami-Dade County
    The Edison Model
DESIGN OF THE EVALUATION
    Evaluation Questions
    Implementation of the Edison Model
    Students' Academic Achievement: Stanford Achievement Test
    Students' Academic Achievement: Florida Writing Assessment
    Students' Academic Achievement: Edison Curriculum Standards
    Satisfaction of the Parents
    Involvement of the Parents
    School Climate
    Summary of Evaluation Activities
RESULTS OF THE EVALUATION
    Implementation of the Edison Model
    Impact of the Model on the Students' Academic Achievement
        Stanford Achievement Test
        Stanford Achievement Test: Project Students not in the Quasi-Experiment
        Florida Writing Assessment
        Edison Curriculum Standards
    Parents' Satisfaction and Involvement with the Project School
    Climate of the Project School
CONCLUSIONS
REFERENCES
APPENDIX A: Survey of Teachers
APPENDIX B: Classroom Observation Form
APPENDIX C: School Climate Survey, Parent Form
APPENDIX D: Parent Questionnaire
APPENDIX E: School Climate Survey, Staff Form
COMMENTS ON THE REPORT


LIST OF TABLES

Table 1. Grade Levels of the Edison Academies
Table 2. School Year: Edison Model vs. Miami-Dade County
Table 3. School Day: Edison Model vs. Miami-Dade County
Table 4. Edison Support Staff
Table 5. Basic Elements of the Edison Model
Table 6. SAT 8 Subtests Administered in Miami-Dade County in 1995-96, Grades 1-5
Table 7. SAT Pre and Posttests of the Quasi-Experiment
Table 8. Predictors in the Regression Equations
Table 9. Basic Steps in the Analysis of the Students' SAT Performance
Table 10. Sources of Data for the Evaluation Questions
Table 11. Survey of Teachers: Implementation of the Edison Model, Year 4
Table 12. All Sources of Data: Implementation of the Edison Model, Year 4
Table 13. Comparison of the Groups' Reading Performance on the Pretest
Table 14. Comparison of the Groups' Mathematics Performance on the Pretest
Table 15. Regression Weights of the Predictors, Year 4
Table 16. Residual Scores in Reading, Years 1, 2, 3, and 4
Table 17. Residual Scores in Mathematics, Years 1, 2, 3, and 4
Table 18. Comparison of the Groups' Reading Performance on the Posttest, Years 1, 2, 3, and 4
Table 19. Comparison of the Groups' Mathematics Performance on the Posttest, Years 1, 2, 3, and 4
Table 20. SAT Performance of Project Students not in the Quasi-Experiment
Table 21. Comparison of the Groups' Writing Performance in Grade 4, Years 1, 2, and 3
Table 22. Project Students' Attainment of the Edison Curriculum Standards, Year 4
Table 23. Return Rates for the School Climate Survey, Parent Form, Year 4
Table 24. School Climate Survey, Parent Form: Comparison of General Satisfaction with the School, Years 2, 3, and 4
Table 25. Parent Questionnaire: General Satisfaction with the School, Years 1, 2, 3, and 4
Table 26. School Climate Survey, Parent Form: Comparison of Involvement in the School, Years 1, 2, 3, and 4
Table 27. Return Rates for the School Climate Survey, Staff Form, Year 4
Table 28. School Climate Survey, Staff Form: Comparison of General Satisfaction with the School, Years 2, 3, and 4
Table 29. Comparison of Factors that Reflect School Climate, Year 4


LIST OF FIGURES

Figure 1. Residual scores in reading, year 1
Figure 2. Residual scores in mathematics, year 1
Figure 3. Residual scores in reading, year 2
Figure 4. Residual scores in mathematics, year 2
Figure 5. Residual scores in reading, year 3
Figure 6. Residual scores in mathematics, year 3
Figure 7. Residual scores in reading, year 4
Figure 8. Residual scores in mathematics, year 4


EXECUTIVE SUMMARY

On December 13, 1995, the School Board of Miami-Dade County approved a five-year contract with Edison Schools Inc. (formerly The Edison Project) to manage Henry E.S. Reeves Elementary School. Edison Schools Inc. is a for-profit management company involved in the privatization of public schools. The company markets a unique model of education and supplementary services. The model consists of an eclectic mixture of such elements as an extended school year, an interdisciplinary curriculum, and the use of modern technology. The contract calls for the company to employ this model in managing the project school from August 1996 until June 2001. The contract also calls for an evaluation of the project.

The evaluation was conducted by the Office of Evaluation and Research of the Miami-Dade County Public Schools (MDCPS) in conjunction with Edison Schools Inc. The intent of the evaluation was to gauge the impact of the Edison model on the project school. Four general areas were addressed by the evaluation. The first was the actual implementation of the Edison model in the school. The remaining three areas were the stated objectives of the project, which are:

1. To raise the academic achievement of all students to the highest level possible.

2. To increase parent involvement and satisfaction to levels consistent with educational excellence.

3. To improve school climate in the many ways necessary to foster greater learning.

The evaluation encompassed the project school's initial four years of operation: 1996-97 (year 1), 1997-98 (year 2), 1998-99 (year 3), and 1999-00 (year 4). With regard to the project's implementation, the data revealed that the model on the whole was implemented in year 4. This has been the basic outcome in every year of the evaluation since year 2. Data on the project students' academic achievement were drawn primarily from controlled comparisons of their performance on standardized tests. The results revealed that, although the project students overcame a disappointing performance in year 1, their performance in years 2 through 4 remained relatively unchanged. At best, their test scores in both reading and mathematics were only comparable to those of their counterparts in the regular MDCPS program. Thus, the project failed to comparatively improve the students' academic achievement, which is the first and most important of its stated objectives. The project parents' involvement and satisfaction with their children's education were assessed primarily through controlled comparisons of survey responses. The results in year 4 revealed that, as in previous years, the project parents were comparatively more involved and satisfied with their children's education. Accordingly, the project successfully attained its second stated objective. Finally, controlled comparisons of both survey responses and archival data were used to assess the climate of the project school. The results revealed that the project school's climate in year 4 did not compare favorably with that of comparable MDCPS schools. Indeed, the climate of the project school appears to have waned after year 2. Therefore, the project failed to attain its third stated objective.

In summary, the outcome of the evaluation was not encouraging. While the parents clearly retained their enthusiasm for the school, the same could not be said for the teachers. More importantly, the project students ultimately failed to capitalize on the academic gains that they made between year 1 and year 2. Despite the lofty academic standards of the Edison model, the project students never once exhibited an academic advantage over the students in the regular MDCPS program. Consequently, the evaluation failed to produce any evidence that the Edison model represents a superior educational program.

The implications of this conclusion, however, are circumscribed by the evaluation's limited generalizability. It involved a single elementary school in an economically depressed neighborhood attended almost exclusively by students of a single race. Given these constraints, it seems clear that this evaluation cannot provide a definitive answer on the efficacy of the Edison model. Such an answer can only result from additional controlled studies. Ideally, these studies should be conducted by independent third parties in a variety of school settings. In this manner, it might be possible to either verify or refute the results of the current evaluation. If the outcome is the latter, perhaps the accumulated research can eventually identify the school setting where the Edison model is most effective. But until this is known, it is recommended that the MDCPS give careful consideration before committing additional resources to the Edison model.


DESCRIPTION OF THE PROJECT

In recent years, the idea of reforming public schools through privatization has received considerable attention. Privatization refers to contracting with a for-profit company to manage some or all of the functions of a school. The idea is not new. Public schools have often contracted for the management of such ancillary functions as food services and transportation. What is new is the idea of privatizing the school's primary function of education. The proponents of privatization tout the cost benefits of this arrangement. They view privatization as "a magic bullet for the nation's ailing, bureaucratically entrenched public schools" (Gleick, 1995, November 13). The appeal of privatization has convinced a number of school boards to authorize its implementation. This usually involves a limited number of schools, since the idea is generally regarded as experimental. However, there are districts which have made much greater commitments to privatization. For example, Minneapolis, Minnesota, for several years had a contract with Public Strategies Group to manage all of its schools; and Hartford, Connecticut, had a similar contract with Education Alternatives Inc. (EAI). The latter contract, which was regarded at the time as the largest privatization effort in the nation, was dissolved in January 1996 at the request of the district.

Privatization in Miami-Dade County

The Miami-Dade County Public Schools (MDCPS) has been receptive to the idea of privatization. On December 13, 1995, the School Board approved a five-year contract with Edison Schools Inc. (formerly The Edison Project). This company, which was founded in 1991, markets a unique model of education and supplementary services for schools. The model was developed by a group of nationally known educators. The group included Benno C. Schmidt, Jr., a former president of Yale University and the current Chairperson of the Board for Edison. The contract calls for Edison Schools Inc. to manage Henry E.S. Reeves Elementary School from August 1996 until June 2001. The school will draw students from the attendance area established by the district. However, the contract stipulates that "no student shall be assigned to [the project school] over the objections of his or her parent/s or guardian/s" (p. 2). The project school will receive funding comparable to other schools in the district, but adjusted to accommodate the unique aspects of its operation (e.g., an extended school year). Edison Schools Inc. will "retain any excess of revenues over expenditures as its compensation for the services provided" (p. 21).

The Edison contract is the district's second experiment with privatization. The first was undertaken in June 1990, when the School Board approved a five-year contract with EAI to manage South Pointe Elementary School. At the time, EAI was the largest of a handful of management companies involved in privatization. The contract called for EAI to use the school's existing funding and other resources developed by the company to implement its Tesseract educational program. This program, which is named after a magical pathway in a children's book, stresses small classes, individualized lesson plans, and parental involvement. An evaluation of the Tesseract program at South Pointe was conducted by the district. It revealed that the program was well received and that the students' academic performance improved, but not to the degree anticipated. The students in a control group, who did not benefit from either the Tesseract program or the additional resources provided by EAI, performed comparably well (Abella, 1994). Consequently, the district did not renew its contract with EAI.

The Edison contract held greater promise for success. The EAI experiment was based essentially on the implementation of a new instructional method. The Edison experiment, however, called for global changes in the project school's operation. The implementation of the Edison model required fundamental changes in the school's organization, schedule, curriculum, staffing, and technology, as well as its instructional method.

The Edison Model

The Edison model is described in the company's book, Partnership School Design. According to this document, the model is founded on ten basic principles, which are known as "fundamentals." The fundamentals, furthermore, are derived from a "powerful philosophy" and "volumes of research." The document contends that the application of any one of the fundamentals would result in the improvement of a school's program, but that a strategy encompassing all ten would achieve dramatic results. The fundamentals are:

1. Schools organized for every student's success
2. A better use of time
3. A rich and challenging curriculum
4. Teaching methods that motivate
5. Assessment that provides accountability
6. Educators who are true professionals
7. Technology for an Information Age
8. A partnership with families
9. Schools tailored to the community
10. The advantages of system and scale (p. 10)

The implications of each fundamental for the operation of an Edison project school are detailed in the following sections.

Organization. The organization of an Edison project school is quite distinct from that of a typical school in Miami-Dade County. The project school is organized into "schools-within-a-school," which are known as "academies." As Table 1 illustrates, each academy spans two to three ages/grade levels. This arrangement allows a teacher to remain with the same group of students for two to three years. The teacher can become well acquainted with the students and establish mutually supportive ties with their families.


Table 1

Grade Levels of the Edison Academies

Academy       Age/Grade Level   School Level
Readiness     Ages 3-4          Pre-school
Primary       Grades K-2        Elementary
Elementary    Grades 3-5        Elementary
Junior        Grades 6-8        Middle
Senior        Grades 9-10       Senior High
Collegiate    Grades 11-12      Senior High

Within each academy, students are organized into "houses." Each house includes 90 to 120 students, depending on the level of the academy and the enrollment of the school. The Primary, Elementary, and Junior Academies usually contain three houses, while the Senior and Collegiate Academies usually contain two. Each house includes an equal number of students from each age/grade level in the academy. For example, a house in the Elementary Academy might include 35 students in grade 3, 35 in grade 4, and 35 in grade 5. This mixture of ages/grades facilitates the forming of ability groups for instruction and related activities. Flexible, ad hoc ability groups are a key element in the instructional strategy of the Edison model. Such groups are distinct from tracking, which the model shuns. In a project school, every student receives the same curriculum.

Four teachers are assigned to each house. They work as a team in instructing every student in the house. The team is responsible for teaching the core subjects of the curriculum: mathematics, science, history, geography, civics, economics, and language arts. Additional teachers supplement the house teams by teaching such enrichment subjects as art, music, and physical fitness.

The teacher-student ratio of a project school is approximately 1:18, but the Edison model's emphasis is not necessarily on small classes. According to Partnership School Design, "the key to effective class organization is not the staffing ratio or the average class size, but matching the class and staffing structure to instructional purpose" (p. 19). Accordingly, a typical scenario in the instructional activities of a house might consist of a large number of students attending a lecture by a single teacher, while other teachers work with a small number of students on a hands-on lesson. The time devoted to these activities, furthermore, may vary, since the Edison model emphasizes flexible scheduling. This is achieved through block scheduling, which is the designation of one time period of instruction to two or more subjects.


Time. Perhaps the most critical element in an educational program is "time on task," i.e., the time actually devoted to instruction. Yet the United States has one of the shortest school years in the industrialized world. To remedy this, the Edison model is based on an extended school year. In a project school, the school year typically begins August 15 and ends July 1. It consists of 210 school days, the same as schools in Japan. In contrast, as Table 2 illustrates, the school year in Miami-Dade County consists of only 180 days.

Table 2

School Year: Edison Model vs. Miami-Dade County

School Year                    Edison Model    Miami-Dade County(a)
Number of terms                4               4
Length of term                 10½ weeks       9 weeks
Start of year                  August 15       August 28
End of year                    July 1          June 12
Total number of school days    210             180

(a) The information for Miami-Dade County is based on the district's 1995-96 calendar.

The school day is also longer in an Edison project school than in the typical Miami-Dade County school. The specific differences in time are detailed in Table 3. A review of the table reveals that, depending on the grade level, the school day in a project school is from 30 to 90 minutes longer. Such differences may seem minor; but, when combined with the extended school year, they produce a remarkable increase in the amount of time a student spends in school during the 13 years between kindergarten and grade 12. A student who remains in project schools for the duration will have a total of 21,210 hours in school. In contrast, a student who attends typical Miami-Dade County schools (with grades 6 through 8 in middle school) will have only 15,570 hours. The difference, when adjusted to a 180-day school year with a seven-hour day, represents almost 4½ years of additional schooling. This is a sizable increase in time on task.
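These totals can be reconstructed from the hours in Tables 2 and 3 (a worked check, assuming a 210-day year at every Edison grade and a 180-day year at every MDCPS grade):

```latex
\begin{align*}
\text{Edison, K--12:}\; & 3(210)(7) + 10(210)(8) = 4{,}410 + 16{,}800 = 21{,}210 \text{ hours}\\
\text{MDCPS, K--12:}\; & 2(180)(5\tfrac{1}{2}) + 4(180)(6\tfrac{1}{2}) + 3(180)(7\tfrac{1}{6}) + 4(180)(7)\\
& = 1{,}980 + 4{,}680 + 3{,}870 + 5{,}040 = 15{,}570 \text{ hours}\\
\text{Difference:}\; & 21{,}210 - 15{,}570 = 5{,}640 \text{ hours} \approx \frac{5{,}640}{180 \times 7} \approx 4.5 \text{ school years}
\end{align*}
```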


Table 3

School Day: Edison Model vs. Miami-Dade County

         School Day: Start - End (Number of Hours)
Grade    Edison Model                          Miami-Dade County(a)
Pre-K    8:00-12:00 (4) or 8:00-4:00 (8)(b)    8:30-2:00 (5½)
K        8:00-3:00 (7)                         8:30-2:00 (5½)
1        8:00-3:00 (7)                         8:30-2:00 (5½)
2        8:00-3:00 (7)                         8:30-3:00 (6½)
3        8:00-4:00 (8)                         8:30-3:00 (6½)
4        8:00-4:00 (8)                         8:30-3:00 (6½)
5        8:00-4:00 (8)                         8:30-3:00 (6½)
6        8:00-4:00 (8)                         8:30-3:00 (6½)(c) or 8:30-3:40 (7⅙)(d)
7        8:00-4:00 (8)                         8:30-3:40 (7⅙)
8        8:00-4:00 (8)                         8:30-3:40 (7⅙)
9        8:00-4:00 (8)                         8:30-3:40 (7⅙)(e) or 7:30-2:30 (7)(f)
10       8:00-4:00 (8)                         7:30-2:30 (7)
11       8:00-4:00 (8)                         7:30-2:30 (7)
12       8:00-4:00 (8)                         7:30-2:30 (7)

(a) The information for Miami-Dade County is based on the regular schedule of district schools during the 1995-96 school year.
(b) Under the Edison model, prekindergarten students can attend either a four-hour session or an eight-hour session.
(c) Grade 6 in an elementary school.
(d) Grade 6 in a middle school.
(e) Grade 9 in a middle school.
(f) Grade 9 in a senior high school.


An opportunity for additional schooling is provided to Edison students during the six-week summer hiatus. Students have the option of taking the six weeks as vacation or attending the Edison summer program. This program consists of three two-week sessions, which are usually taught by regular faculty members. All students are encouraged to attend these sessions, and students who are behind academically may be required to do so.

Curriculum. The Edison curriculum is holistic in nature. The curriculum is divided not so much into subjects as into domains. There are five major domains: (a) humanities and the arts, (b) mathematics and science, (c) character and ethics, (d) health and physical fitness, and (e) practical arts and skills. The domains serve to integrate the curriculum, so teachers can employ an interdisciplinary approach in the delivery of instruction. Lessons are organized around projects or real-life problems, which require the students to delve into different disciplines. For example, in the Primary Academy, the garden is used to teach the students about science, mathematics, geography, literature, and history. This approach helps the students to see the connections between different subjects, and between abstract ideas and practical applications.

The Edison curriculum is also results-oriented. Each academy has a set of over 100 explicit standards which define the level of educational development that students are expected to achieve. The students must demonstrate the attainment of these standards before they can be promoted to the next academy.

The standards, furthermore, establish very high academic expectations for the students. According to Partnership School Design, by the end of the Senior Academy (i.e., grade 10), all students who have attended project schools for "a substantial period of time [will] have the equivalent of a first-rate high school education." By graduation, students will be "well prepared for Advanced Placement (AP) exams in core academic subjects." In general, the academic expectations of "Edison's standards equal or surpass the ambitious standards currently being established by educators in every academic field, both in this country and around the world" (p. 30). The expectations, for example, include the following:

•  All students will have "strong reading skills" by the end of the Primary Academy (i.e., grade 2).
•  All students will learn to read music.
•  All students will successfully complete the first year of algebra before the end of the Junior Academy (i.e., grade 8).
•  By graduation, all students will successfully complete either AP Calculus or Introductory College-Level Probability and Statistics.
•  All students will learn two foreign languages.
•  By graduation, all students will be able to converse in a foreign language with a native speaker and pass the AP examination in that language.


Teaching Methods. The instructional philosophy of the Edison model is founded on the premise that students are naturally curious. They learn by exploring and interacting with their environment. The Edison instructional program is designed to harness this inherent curiosity. A project school provides the students with a rich and varied environment to stimulate their interests and encourage learning.

This educational environment is achieved in part through elements of the Edison model that were previously discussed: flexible scheduling, an interdisciplinary curriculum, and lessons based on projects or real-life problems. These elements form the conceptual framework of the teaching strategy. The strategy, however, does not include a prescribed teaching method. The developers of the Edison instructional program recognize that students learn in different ways, so the program employs a variety of teaching methods. They range from traditional lectures to such innovative methods as cooperative learning. However, all the methods, according to Edison Schools Inc., have one thing in common: their effectiveness is well documented.

Assessment. For the assessment methods of a school to be effective, they must be aligned with the objectives of its educational program. This is a potential problem for an Edison project school, because most traditional, standardized tests are not well suited to its program. Such tests, which generally gauge only the acquisition of knowledge, do not address things like creative thought and interdisciplinary problem solving. Consequently, a unique assessment system was developed for the Edison model. The system is used to monitor the progress of the students in attaining the standards of the educational program. The system also provides information on the overall success of the program. In this manner, it serves to hold Edison Schools Inc. accountable.

The assessment system of the Edison model, like the instructional strategy, is derived from the curriculum of the program. Under the Edison model, curriculum, instruction, and assessment are regarded as a coherent whole. The specific components of the assessment system are: (a) the Quarterly Learning Contract, (b) the student portfolio, (c) the embedded assessments, and (d) the Academy Promotion Review.

The Quarterly Learning Contract is the most important component of the assessment system. A contract is drawn up for each Edison student, and it is updated at the end of each grading period. The contract is a formal statement of the expectations and objectives agreed upon by the project school, the student, and the student's parents. A quarterly progress report, which details the progress in fulfilling the terms of the contract, is prepared by the student's adviser and teachers. The overall process is intended to be diagnostic and developmental in nature; it is not judgmental. Its purpose is to help the student attain the standards established for the academy.


The student portfolio is the second component of the assessment system. For each student, the project school maintains a cumulative portfolio, which contains samples of the student's work. These work samples are direct indicators of the student's progress in attaining the standards of the academy.

Embedded within the work samples of the student portfolio are specially designed assessments. These embedded assessments, which represent the third component of the assessment system, are not simply tests. According to Partnership School Design, they are "carefully constructed learning experiences, ... that provide reliable indications of student progress." Additionally, they provide "standardized measures of student performance ... [that help] calibrate other work in the students' portfolios" (p. 69).

The final component of the assessment system is the Academy Promotion Review. This is the formal review of a student's progress to determine if promotion to the next academy is warranted. To be promoted, a student must attain three-fourths of the academy's standards in each discipline. The review is based on the student portfolio, the Quarterly Learning Contract and, at some junctures, formal examinations. The review process is the responsibility of the project school, but it is closely monitored by Edison Schools Inc.
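For illustration only, the promotion rule reduces to a per-discipline threshold check. The sketch below uses hypothetical names and a hypothetical data layout, not Edison's actual records:

```python
# Hypothetical sketch of the Academy Promotion Review threshold: a student
# is promoted only if, in every discipline, at least three-fourths of the
# academy's standards for that discipline have been attained.

def eligible_for_promotion(standards_met: dict[str, tuple[int, int]]) -> bool:
    """standards_met maps each discipline to (standards attained, total standards)."""
    return all(attained >= 0.75 * total
               for attained, total in standards_met.values())

record = {"mathematics": (18, 22), "language arts": (25, 30), "science": (14, 20)}
print(eligible_for_promotion(record))  # False: science is 14/20 = 70%, below 75%
```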

Beyond the assessment measures of the Edison model, students in a project school also take all the standardized tests required by the district and the state. Additionally, students who have been in the program for a substantial period of time take the AP examinations in the Collegiate Academy (i.e., grades 11 and 12). The results of these tests enable the project school to gauge the academic progress of the students against local and national norms.

Educators. The best teachers are often promoted right out of the classroom. To curtail this loss, the Edison model employs a career ladder for teachers. The ladder is designed to provide teachers with both professional fulfillment and financial rewards. There are four levels in the ladder: (a) master teacher, (b) senior teacher, (c) teacher, and (d) resident. The master teacher is the highest in rank. The master teacher has at least seven years of experience, a master's degree in the field, publications in the Edison curriculum, and a general mastery of teaching. The master teacher is the professional and organizational leader of a team of teachers, which includes teachers from the various levels of the career ladder. In this capacity, the master teacher serves as a mentor for less experienced teachers; he/she guides them in the development of teaching strategies and curriculum units. The master teacher also assists the principal in the evaluation of the team's performance and the hiring of new teachers. The senior teacher, the next level in the career ladder, is a veteran teacher with experience in developing curriculum units and assisting less experienced teachers. The teacher, the next level, is an individual with complete teaching credentials but limited experience. Finally, the resident, the lowest level in the ladder, is an individual with provisional teaching credentials. The resident spends two years in an orientation program under the direct supervision of a master teacher.


Professional development is an integral part of the Edison model. The professional development program is carefully aligned with the model's elements of curriculum, instruction, and assessment. The program has two distinct components: the introductory program and the on-going program. The introductory program is offered prior to the opening of a project school. During several months of evening and weekend workshops, as well as a six-week summer session, the teachers are introduced to the Edison model's philosophy, curriculum, instructional methods, and technology. By the end of the summer session, the teaching teams have been formed and the courses of study for the year have been prepared. The on-going professional development program continues during the course of the year. Edison staff assist the faculty of the project school in designing a professional development program for each team. This involves establishing individual goals for each teacher and identifying appropriate activities. The latter may involve such things as on-site workshops and coaching sessions, which are provided by Edison Schools Inc.

Closely associated with professional development is the assessment of job performance. The job performance of the teachers in a project school is assessed by a system which was developed specifically for the Edison model. The system is essentially based on the outcomes of the teacher's job performance. The specific focus of the assessment includes: (a) the students' academic progress, (b) the quality of the curriculum units developed, (c) the fluency with various instructional methods, and (d) the relationship with colleagues, students, and their families. Evidence of the teacher's accomplishments in these areas is maintained in a portfolio. The portfolio serves to both document the teacher's job performance and highlight areas in need of improvement. The principal is primarily responsible for assessing the job performance of teachers in a project school. However, Edison Schools Inc. provides the principal with assistance in this area.

The principal of a project school receives assistance in various areas of the school's operation. Edison Schools Inc. assists with the computer system, social services, and special programs, as well as professional development. For each project school, a support staff is assembled to ensure the success of the program. The members of the support staff and their various jobs are detailed in Table 4.

Technology. The use of sophisticated technology in the classroom is not a new idea. However, an Edison project school is unique in the extent to which it uses such technology. Within three to five years of its opening, a project school attempts to emulate the real world in its use of computers and computer networks. A computer is made available to every student's family, every teacher, and the principal; all these computers are linked by a network. In this manner, communication among these parties is facilitated. For example, parents can contact a teacher through e-mail; students can submit homework electronically; and educational ideas can be exchanged on the Common, a network forum for such discussions. Additionally, the computers make available a wealth of information.


Table 4

Edison Support Staff

Position                               Job Description
Technology Support Team                Maintains the computer system and provides assistance in its use
Media Specialist                       Maintains a full-service media center and works in conjunction with the technology support team
Tutors (three full-time employees)     Assist the Primary Academy teams in tutoring the students
Director of Social Services            Coordinates the social services provided by community agencies and professionals
Director of Special Programs           Organizes and supervises a variety of ancillary programs and services (e.g., before- and after-school care)
Professional Development Specialist    Coordinates and provides much of the professional development services for a number of project schools
Partnership Director                   Functions as a conduit between the project school and the Central Services division of Edison Schools Inc.

A number of databases can be accessed, including the Edison Library of Ideas, which is created by the students themselves. Finally, the computer programs represent powerful tools for both the educational program and the management of the school's operation. Through such extensive use of computers and other sophisticated technology like video cameras and recorders, the project school prepares its students for life in the Information Age.

Partnership. It is a basic tenet of the Edison model that "it takes an entire village to raise a child." As such, a project school attempts to engage the students' families in the educational process. Teachers meet quarterly with family members to discuss their children's accomplishments and needs. Communication is also maintained by telephone and through the computer network. The principal has an electronic bulletin board for announcements and a suggestion box which is open to the entire school community.


Additionally, if the space is available, a project school maintains a Parent Center, which serves as a base for family members who visit the school. The center distributes information on the curriculum, parenting, and school activities; it also provides the facilities for conducting meetings and workshops. Finally, a Parent Advisory Council is convened by the principal. The council provides a forum for family members to express their opinions and ideas regarding the school's programs and policies.

Community. An Edison project school is tailored to the community it serves. About one-fourth of the school's curriculum is determined locally. This ensures that the educational program is in accord with the priorities and concerns of the community. Furthermore, the project school is designed to be a hub of activity for the community. Programs and services are offered in the afternoons and evenings, and during the summer. Finally, the project school enlists the cooperation of community agencies and professionals in the delivery of social services to the students. The social services director of the Edison support staff organizes a consortium of services designed to meet the specific needs of the students in the project school.

System and Scale. Project schools are linked both in their common purpose and literally through the computer network. The network facilitates the exchange of materials and ideas for improving the educational program. In addition, the Central Services division of Edison Schools Inc. provides support, guidance, and resources to the project schools. According to Partnership School Design, "the chief responsibility of Central Services is to give each school what it needs to achieve high results without stifling its creative fervor" (p. 99). Accordingly, resources are committed to: (a) curriculum research and development, (b) the assessment of students' academic performance and the staff's job performance, (c) the recruitment of staff, (d) professional development, and (e) the improvement and upkeep of the facility. In this manner, an individual project school benefits from the system and scale of the entire Edison operation.

In summary, the educational strategy of Edison Schools Inc. is detailed in the company's book, Partnership School Design. According to this document, the project's educational strategy is founded on ten basic principles, which are known as "fundamentals." A review of the implications of these fundamentals for the operation of a typical project school has rendered a picture of the Edison model. The model consists of an eclectic mixture of elements, which are not necessarily new or unusual. They are based on established educational practices and sound research. However, the merger of these elements into a cohesive educational strategy is unique to the Edison model. And, while some aspects of this strategy have been refined since the publication of Partnership School Design in 1994, the basic elements of the model remain unchanged. They are summarized in Table 5.


Table 5

Basic Elements of the Edison Model

Fundamentals(a)     Elements
Organization        1. "Schools-within-a-school" consisting of academies that combine several ages/grade levels
                    2. Ability grouping, but no tracking
                    3. Teachers organized into teams
                    4. Flexible scheduling (e.g., block scheduling)
Time                5. Extended school year
                    6. Extended school day
Curriculum          7. Interdisciplinary curriculum
                    8. Lessons organized around projects or real-life problems
                    9. Results-oriented standards for promotion to the next academy
Teaching Methods    10. Variety of teaching methods
Assessment          11. Unique assessment system for monitoring students' progress toward standards
Educators           12. Career ladder for teachers
                    13. Emphasis on professional development
                    14. Results-oriented system for assessing job performance of teachers
Technology          15. Emphasis on technology (e.g., computer provided for each student's home)
                    16. Home and school are linked by computer network
Partnership         17. Emphasis on parental involvement
Community           18. One-fourth of curriculum determined locally
                    19. School is conduit for social services
System and Scale    20. School is supported by Central Services of Edison Schools Inc.
                    21. School and Central Services are linked by computer network

(a) For the full text of the fundamentals, see the list in "The Edison Model" section above.


DESIGN OF THE EVALUATION

The Miami-Dade County Public Schools (MDCPS) embarked on its second experiment with privatization on December 13, 1995. On that date, the School Board approved a five-year contract with Edison Schools Inc. (formerly The Edison Project) to manage Henry E.S. Reeves Elementary School. Edison Schools Inc. is a for-profit company, which was founded in 1991. The company markets a unique model of education and supplementary services, which was described in the previous section (see Table 5). The contract with the MDCPS calls for the company to employ this model in managing the project school from August 1996 until June 2001. The contract also calls for an evaluation of the project. The evaluation was undertaken by the Office of Evaluation and Research of the MDCPS in conjunction with Edison Schools Inc. The evaluation, which spanned four school years, has now been completed. This document is the final, summary report. It details the cumulative findings through year 4 (1999-00).

The intent of the evaluation was to gauge the impact of the Edison model on the project school. The evaluation was designed to address four general areas. The first was the actual implementation of the Edison model in the school. The remaining three areas were the stated objectives of the project, specifically:

1. To raise the academic achievement of all students to the highest level possible.

2. To increase parent involvement and satisfaction to levels consistent with educational excellence.

3. To improve school climate in the many ways necessary to foster greater learning.

The results of the evaluation serve as a means of holding Edison Schools Inc. accountable for attaining these objectives.

Evaluation Questions

To define the specific focus of the evaluation, a series of questions was formulated. One or more questions were derived from each of the four general areas addressed by the evaluation. The questions are:

1. Have the basic elements of the Edison model been implemented in the project school?

2. Are the students in the project school performing better on the Stanford Achievement Test than would be expected if these students were attending other MDCPS schools (Objective 1)?

3. Are the students in the project school performing better on the Florida Writing Assessment than would be expected if these students were attending other MDCPS schools (Objective 1)?

4. Are the students in the project school making good progress in meeting the curriculum standards of the Edison model (Objective 1)?


5. Are the parents of students in the project school more satisfied with their children's education than the parents of students attending comparable MDCPS schools (Objective 2)?

6. Are the parents of students in the project school more involved in their children's education than the parents of students attending comparable MDCPS schools (Objective 2)?

7. Is the school climate of the project school superior to that of comparable MDCPS schools (Objective 3)?

In order to answer these evaluation questions, data were drawn from a number of sources. The specific sources for each question are detailed in the subsequent sections. In addition, for each question the following two topics are addressed: the methodology employed and the analysis of the data.

Implementation of the Edison Model

It was necessary to establish that the Edison model was fully operational before conclusions could be drawn about its impact on the project school. Otherwise, there may have been confusion regarding what factors contributed to the results. Consequently, the evaluation's initial focus was on the implementation of the model (Question 1). Data were drawn from the following sources: (a) interviews with the principal of the project school, (b) a survey of the teachers, and (c) classroom observations.

Methodology. During the years of the evaluation, a number of unstructured interviews were conducted with the principal. The interviews dealt primarily with the school's progress in implementing the basic elements of the Edison model. The principal was asked, whenever possible, to provide supporting documentation. Basically, the interviews served to provide an overview of the status of the model.

In the latter part of each year, a survey was conducted of all the teachers in the project school. Many of the items on the survey dealt with the teachers' perceptions of the Edison model. The teachers were asked to rate the degree of implementation of the basic elements of the model. They were also asked to compare, from their experiences, the differences between the educational program of the project school and that of other schools in the district. To ensure confidentiality, the survey instruments were coded and distributed directly to the teachers. The names of the recipients appeared on the forwarding envelopes, but not on the return envelopes or the instruments themselves. A copy of the Survey of Teachers for year 4 appears in Appendix A.

The final source of data on the implementation of the model was a set of classroom observations. Generally, at the beginning of the third grading period and again at the conclusion of the fourth grading period, several houses within the academies of the project school were randomly selected for visitations. The visitations, which were unannounced, were conducted by members of the evaluation team. Each selected house was observed for approximately half of the school day. The observations were recorded on an instrument designed to document the operational status of the model's basic elements. A copy of the Classroom Observation Form for year 4 appears in Appendix B.

Data Analysis. To ascertain the status of the model's implementation, an analysis was done of all the relevant data derived from the interviews of the principal, the survey of the teachers, and the classroom observations. For the most part, the data derived from these various sources are qualitative in nature, and the analysis was conducted accordingly. It basically involved determining whether the various sources of data generally concurred with regard to the status of the model's implementation.

Students' Academic Achievement: Stanford Achievement Test

The students' performance on the Stanford Achievement Test (SAT) was the primary source of data on their academic achievement (Question 2). The SAT is a norm-referenced instrument, which is published by The Psychological Corporation. The test is designed "to measure the important outcomes of the school curriculum" (Kramer & Conoley, 1992). It consists of a battery of subtests that assess a number of the "content domains" in the typical curriculum. The domains include, among others, reading, mathematics, science, and social science (The Psychological Corporation, 1989). The MDCPS routinely administers the SAT to its students in the spring. Table 6 lists the subtests which were administered in the 1995-96 school year to students at the elementary school level. The district's testing program, however, is occasionally modified. During the period of the evaluation, both the grade levels tested and the forms of the test varied. At the elementary level, students in grades 1 through 5 were tested in the school years 1995-96 and 1996-97 (i.e., year 1 of the evaluation); but in subsequent years, only grades 2 through 5 were tested. With regard to the test form, the eighth edition of the SAT (i.e., SAT 8) was administered in the 1995-96 school year and in every year of the evaluation period except year 4 (1999-00). In that year, it was replaced by a form of the SAT 9, which had been incorporated into the state's Florida Comprehensive Assessment Test.

Methodology. To gauge the students' performance on the SAT, a quasi-experiment was conducted as part of the evaluation. A quasi-experiment is a technically acceptable alternative to a true experiment. Quasi-experiments are often used in "natural social settings" where a true experiment is not feasible (Campbell & Stanley, 1963). The specific quasi-experimental design that was used in the evaluation is a form of the "nonequivalent control group design." This design essentially involves using pre- and posttest scores to compare the performance of a group of subjects who are exposed to an experimental treatment (i.e., the experimental group) with that of a group who are not (i.e., the control group). The two groups are considered "nonequivalent" because the subjects were not randomly distributed between them (as would be the case with a true experimental design). Consequently, the pretest scores are used as evidence of the comparability of the two groups prior to the treatment.


Table 6

SAT 8 Subtests Administered in Miami-Dade County in 1995-96, Grades 1-5

Grade    Subtests, by Content Domain
1        Reading: Word Reading, Word Study Skills, Reading Comprehension.
         Mathematics: Concepts of Number, Mathematics Computation, Mathematics Applications.
2        Reading: Word Study Skills, Reading Vocabulary, Reading Comprehension.
         Mathematics: Concepts of Number, Mathematics Computation, Mathematics Applications.
3        Reading: Word Study Skills, Reading Vocabulary, Reading Comprehension.
         Mathematics: Concepts of Number, Mathematics Computation, Mathematics Applications.
         Science. Social Science.
4        Reading: Reading Vocabulary, Reading Comprehension.
         Mathematics: Concepts of Number, Mathematics Computation, Mathematics Applications.
         Language: Language Mechanics.
5        Reading: Reading Vocabulary, Reading Comprehension.
         Mathematics: Concepts of Number, Mathematics Computation, Mathematics Applications.
         Science. Social Science. Language: Language Mechanics.

Note. SAT 8 = Stanford Achievement Test, edition 8. Content domains not listed for a grade were not tested at that grade.

In applying the nonequivalent control group design to the assessment of the project students' academic achievement, the Edison model represented the experimental treatment. All the students who attended the project school were part of the experimental group. The control group actually consisted of three groups of students; this departure from the basic design served to bolster the control for extraneous variables. Each control group had approximately the same number of students as the experimental group. The students in the control groups were drawn in year 1 from a pool of all MDCPS students who were not attending the Edison project school. Stratified random sampling was used in the selection of each control group to ensure that it corresponded proportionally with the experimental group on the following variables: grade level, ethnicity, participation in the free/reduced lunch program, and performance on the pretest.
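A minimal sketch of such a stratified draw appears below. It is illustrative only; the field names and data layout are hypothetical, since the report does not publish its sampling procedure as code:

```python
import random
from collections import defaultdict

# Hypothetical student records keyed by the report's stratification
# variables: grade level, ethnicity, free/reduced lunch participation,
# and pretest performance (reduced here to a coarse score band).
def stratum(student):
    return (student["grade"], student["ethnicity"],
            student["lunch_program"], student["pretest_band"])

def draw_control_group(experimental, pool, seed=0):
    """Draw a control group from `pool` that matches the experimental
    group stratum-for-stratum, so the two groups correspond
    proportionally on every stratification variable."""
    rng = random.Random(seed)
    by_stratum = defaultdict(list)
    for student in pool:
        by_stratum[stratum(student)].append(student)
    needed = defaultdict(int)
    for student in experimental:
        needed[stratum(student)] += 1
    control = []
    for key, count in needed.items():
        # Raises ValueError if a stratum in the pool is too small.
        control.extend(rng.sample(by_stratum[key], count))
    return control
```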

The pretest was the SAT 8 administered in the 1995-96 school year. This was the last SAT administered prior to the students' entry into the project school in year 1. The posttest, however, was not a single measure. It consisted of the series of SAT tests administered during the students' passage through the grade levels of the project school. An analysis of their academic achievement versus that of the control groups was done annually as the results of the posttest became available.


The analysis was limited to the students who enrolled in year 1. These students represented the first cohort of students to pass through the project school. The quasi-experiment focused exclusively on them for several reasons. First, it simplified the data analysis, which, even with this constraint, was quite involved. Secondly, the first cohort of students was much larger numerically and had greater exposure to the Edison model than subsequent groups. Finally, the subsequent cohort groups yielded few qualified subjects. After the initial year, the students entering the project school were those who transferred from other schools or who entered as part of the kindergarten class. The latter students, who constituted the vast majority, lacked pretest scores. Moreover, their first SAT scores in the project school could not be used as a pretest, because the students would already have been exposed to the model. Consequently, most of the students enrolling after year 1 were precluded from participating in the quasi-experiment. As such, the first cohort of students represented the best source of subjects.

Thus, the subjects of the quasi-experiment consisted of the students in grades 2, 3, 4 and 5 in year 1. The inclusion of students in kindergarten and grade 1 was, of course, precluded by their lack of pretest scores. An overview of the subjects' passage through the grade levels of the project school is depicted in Table 7. This table identifies the specific SAT pre and posttests which were used in the quasi-experiment.

Table 7

SAT Pre and Posttests of the Quasi-Experiment

            Pretest     Posttests
            SAT 8       SAT 8       SAT 8       SAT 8       SAT 9
            1995-96     Year 1      Year 2      Year 3      Year 4
                        1996-97     1997-98     1998-99     1999-00

Grades:a    –           Kb          –           –           –
            –           1b          –           –           –
            1           2           3           4           5
            2           3           4           5           –
            3           4           5           –           –
            4           5           –           –           –

Note. While the project spans five school years from 1996-97 to 2000-01, the evaluation was limited to the first four years. SAT 8/9 = Stanford Achievement Test, edition 8/9.
a The grade levels depicted are those of the first cohort of students who enrolled in the project school in year 1 (1996-97).
b The inclusion of the kindergarten and grade 1 classes in the quasi-experiment was precluded by their lack of pretest scores.


Data Analysis. The analysis of the data from the quasi-experiment required three separate statistical procedures. They were: (a) analyses of variance (ANOVAs) to compare the pretest scores of the experimental and control groups; (b) multiple regression analyses to predict the groups' posttest scores; and (c) ANOVAs to assess the differences in the groups' predicted and actual scores. The first procedure involved verifying the comparability of the experimental and control groups on their pretest performance. Simple ANOVAs were done on the pretest scaled scores of the experimental group and the control groups to identify any statistically significant differences. A lack of such differences is evidence that the groups were comparable prior to the experimental group's exposure to the Edison model. This statistical procedure was repeated each year of the quasi-experiment to verify that the groups remained comparable in their pretest performance despite the attrition of subjects. Once the comparability of the pretest results is established, the results of the posttest can be examined. If the quasi-experiment has properly controlled for extraneous variables, the differences in the posttest performance of the groups can be attributed to the programs they received in the interim.
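
The comparability check lends itself to a compact sketch. The group score arrays are hypothetical, and scipy's f_oneway stands in for the simple ANOVA described here.

from scipy.stats import f_oneway

def pretest_comparable(project, control_1, control_2, control_3,
                       alpha=0.05):
    # A non-significant F-value is the evidence of comparability:
    # no detectable difference among the groups' pretest means.
    f_value, p_value = f_oneway(project, control_1, control_2, control_3)
    return p_value >= alpha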

The multiple regression analyses were the second statistical procedure of the data analysis. They were used to predict the students' posttest scores. These predicted scores served as a means of gauging the academic achievement of the students. The predictions were based on the pretest scores and other archival data which are associated with academic achievement. The archival data, which were considered for use in the analyses, were numerous. They included, among others, the student's: scores on various combinations of SAT subtests, age, gender, ethnicity, home language, number of absences, number of suspensions, time in the current school, participation in the free/reduced lunch program, enrollment in the English for Speakers of Other Languages (ESOL) program, and enrollment in the Exceptional Student Education (ESE) program. However, not all of these variables proved to be viable predictors of the posttest scores. Preliminary computations of the regression equations identified the following variables as the most promising predictors: the total score on an SAT content domain, ethnicity, time in the current school, participation in the free/reduced lunch program, ESOL enrollment, and ESE enrollment.1 Depending on the grade level, the content domain of the subtests and the time lapse since the pretest, these variables collectively accounted for approximately 70.0% of the variance in the posttest scores. Accordingly, these are the variables that were used in the regression equations of the quasi-experiment.2 A complete definition of each variable appears in Table 8.

1 Time in the current school actually proved to be a negligible predictor. However, it was used in the regression equations, because Edison Schools Inc. places great stock in this variable.
2 The MDCPS and Edison Schools Inc. subsequently agreed to also include as a predictor an estimate of the socio-economic status of the students' school.
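
One of the prediction equations might be fitted as in the sketch below, assuming a hypothetical sample DataFrame with columns named after the predictors above. Statsmodels' ordinary least squares is used here as a generic stand-in for the report's multiple regression procedure, not as its actual implementation.

import pandas as pd
import statsmodels.api as sm

def fit_prediction_equation(sample: pd.DataFrame):
    # Categorical predictors (ethnicity, here) must be dummy-coded before
    # they can enter the equation; continuous predictors enter directly.
    X = pd.get_dummies(
        sample[["pretest_total", "ethnicity", "days_in_school",
                "free_reduced_lunch", "esol_level", "ese_status",
                "school_ses"]],
        columns=["ethnicity"], drop_first=True)
    # Days squared allows a curvilinear relationship with achievement.
    X["days_in_school_sq"] = X["days_in_school"] ** 2
    X = sm.add_constant(X.astype(float))
    return sm.OLS(sample["posttest_score"], X).fit()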


The computation of the regression equations for the quasi-experiment began with the selection of several samples of MDCPS students. From a pool of all students who were not involved in the quasi-experiment, a sample of 1000 was randomly selected in each of the grades that corresponded with those of the students in the quasi-experiment at the time. As Table 7 illustrates, this included grades 2, 3, 4 and 5 in year 1; grades 3, 4 and 5 in year 2; grades 4 and 5 in year 3; and grade 5 in year 4. Each sample of 1000 students was used to compute the regression equations for their particular grade level.

For each grade level, two regression equations were computed using different pretest scores as predictors. One equation was based on the total scaled score in the reading subtests of the pretest, and the other on the total scaled score in the mathematics subtests. The specific subtests in each of these two content domains are identified in Table 6. This table, which is based on the SAT 8 test used as the pretest, lists all the subtests administered in the 1995-96 school year in the grade levels of the project school. A review of the table reveals that nearly all the subtests are from the reading and the mathematics domains. The subtests in the remaining content domains had proven in preliminary computations to be too few in number and/or too low in predictive value to warrant their use in the regression equations of the quasi-experiment.

Table 8

Predictors in the Regression Equations

Pretest: Total score of the subtests comprising a content domain of the SAT 8 test
Ethnicity: Ethnic classification: White, Black, or Hispanic
Time in Current School: Consecutive number of days in the current school, and the number of days squareda
Free/Reduced Lunch: Participation in the free/reduced lunch program
ESOL: Enrollment in ESOL program, and classification by English proficiency
ESE: Enrollment in ESE program, and classification as either gifted or not
School's Socio-Economic Statusb: Percentage of students not participating in the school's free/reduced lunch program

Note. SAT 8 = Stanford Achievement Test, edition 8; ESOL = English for Speakers of Other Languages; ESE = Exceptional Student Education.
a Separate regression weights were computed for the number of days and the number of days squared to accommodate for either a linear or a curvilinear relationship between time in the current school and academic achievement.
b The district and Edison Schools Inc. agreed to add this predictor, which was not part of the preliminary computations of the regression equations.


Having computed the regression equations, the regression weights of the variables were then used to predict the students' posttest scores. For each grade level, the students' scores in reading and mathematics were predicted separately. In year 1, the predicted scores consisted of the total scaled scores in the SAT content domains of reading and mathematics. As such, the predicted scores and the predictors were based on the same set of subtests. In year 2, however, the MDCPS modified its testing program. Starting in that year, the district no longer administered all the subtests which comprise the reading and mathematics content domains. The predictions were thus limited to the scaled scores in Reading Comprehension and in Mathematics Applications, which represent the key subtests within these two content domains.1 The same two subtests were used in the predictions for year 3. However, with the advent of the SAT 9 test in year 4, a minor change occurred. The Mathematics Applications subtest was renamed Mathematics Problem Solving, although its content remained relatively unchanged.

1 Preliminary computations revealed that this substitution only slightly diminished the accuracy of the predictions.

The predicted scores, which are referred to as "par," were computed individually for all the students in the experimental and control groups. Par represents the posttest scores that the students would be expected to attain, if they were attending typical MDCPS schools. As such, the actual posttest scores of the students in the control groups should be close to par, since their situation is not unlike that of the students from whom the regression equations were derived. However, if the Edison model represents a superior educational program, the actual posttest scores of the students in the experimental group should exceed par.

Determining if this was so was the objective of the third and final statistical procedure in the data analysis. It involved comparing the differences between the actual posttest scaled scores and the par scores for all the groups in the quasi-experiment. These differences, which are known as "residual scores," were tested using simple ANOVAs. The intent was to determine if the residual scores of the experimental group exhibited a statistically significant advantage over those of the control groups.
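
The third procedure reduces to two short steps, sketched here with hypothetical numpy arrays of actual and predicted (par) scores.

import numpy as np
from scipy.stats import f_oneway

def residual_scores(actual: np.ndarray, par: np.ndarray) -> np.ndarray:
    # Positive residual: par exceeded; zero: par attained; negative: not attained.
    return actual - par

# The ANOVA then asks whether the experimental group's residuals show a
# statistically significant advantage over the control groups' residuals:
# f_value, p_value = f_oneway(res_project, res_control_1, res_control_2, res_control_3)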

In each of the four years spanned by the evaluation, the process of computing the regression equations was repeated. A new sample of 1000 students was selected annually for each of the appropriate grade levels (see Table 7). The predictors in the regression equations remained the same, but the regression weights that they yielded varied. These weights were used to compute par for the students remaining in the experimental and control groups. An overview of the basic steps in the analysis of the data is presented in Table 9.

Incidentally, since the inclusion of the kindergarten and grade 1 classes in the quasi-experiment was precluded by their lack of pretest scores (see Table 7), an alternative method of monitoring their SAT performance was agreed upon by the MDCPS and Edison Schools Inc. This method involved using descriptive statistics to track the longitudinal changes in the SAT scaled scores of these two groups. Beginning in year 2 of the evaluation, these changes were gauged by comparing them to those of the other class groups at the project school.

Table 9

Basic Steps in the Analysis of the Students' SAT Performance

ANALYSIS OF VARIANCE
  Pretest scores: Experimental group vs. control groups

MULTIPLE REGRESSION ANALYSIS
  Compute regression equations based on: pretest, ethnicity, time in school, free/reduced lunch, ESOL, ESE, and school's socio-economic status
  Use regression weights to predict posttest scores (i.e., par)
  Compute residual scores: Actual posttest score - Par = Residual score
    + value: Par exceeded; zero: Par attained; - value: Par not attained

ANALYSIS OF VARIANCE
  Residual scores: Experimental group vs. control groups

Note. SAT = Stanford Achievement Test; ESOL = English for Speakers of Other Languages; ESE = Exceptional Student Education.

Students’ Academic Achievement: Florida Writing Assessment

In addition to the students’ performance on the SAT test, their performance on the FloridaWriting Assessment (FWA) was used as a source of data. It provided a different perspectiveof the students’ academic achievement (Question 3). The FWA is “designed to measurestudents’ proficiency in writing responses to assigned topics within a designated testingperiod” (Florida Department of Education, 1995). This assessment, which is mandated bystate law, is conducted in grades 4, 8 and 10 each spring. Every student who is assessedreceives a folder which consists of one writing prompt (i.e., topic) and two pages of linedpaper. The student has 45 minutes to read the prompt, formulate a response and write it inthe folder.


The FWA generally includes two categories of prompts per grade level. For example, in the 1994-95 administration they were: narrative and expository in grade 4, and persuasive and expository in grades 8 and 10. The two categories are alternated from one student to the next; however, for scoring purposes, the categories are considered parallel (i.e., equivalent). The scoring is done by trained readers using a holistic approach. Four elements are considered: (a) focus, (b) organization, (c) support (i.e., providing examples and illustrations), and (d) conventions (e.g., punctuation, spelling, grammar, etc.). The score, which ranges from 0 to 6, is based on the overall response and not on any single element.

Methodology. Assessing the project students' performance on the FWA was less involved than assessing their performance on the SAT test. Since the FWA is only administered in one of the grade levels of the project school (i.e., grade 4), the use of multiple regression analyses like those used in the quasi-experiment was not practical. Nevertheless, a component of the quasi-experiment was used to assess the project students' performance on the FWA. The control groups' scores on the FWA served as a criterion for assessing performance. It should be recalled that the control groups were selected to proportionally correspond with the experimental group in grade level, ethnicity, participation in the free/reduced lunch program, and performance on the SAT pretest. It would not be unreasonable to assume that the writing proficiency of the groups also corresponded prior to the experimental group's exposure to the Edison model. Therefore, the FWA scores of the fourth grade students in the experimental group were gauged by comparing them to those of their counterparts in the control groups.

Data Analysis. The comparison of the FWA scores of the experimental group and the control groups required a single analysis. A simple ANOVA was computed on the scores to identify any statistically significant differences in the groups' performances. This analysis was repeated each year that the grade 4 class was part of the quasi-experiment. As Table 7 illustrates, this included years 1, 2 and 3 of the evaluation. By year 4, the cohort of students consisted of only the grade 5 class. Incidentally, in that year the FWA was renamed and incorporated into the state's Florida Comprehensive Assessment Test (FCAT). The FWA is currently known as FCAT Writing.

Students’ Academic Achievement: Edison Curriculum Standards

The final source of data on the students' academic achievement was their progress in attaining the curriculum standards of the Edison model (Question 4). As noted in the Description of the Project, the Edison curriculum is results-oriented. For every academy, there is a set of over 100 explicit curriculum standards. These standards define the level of educational development that students must attain before they can be promoted to the next academy.


Methodology. The students’ progress in attaining the curriculum standards was measuredby assessments which are embedded in their school work. These embedded assessmentsare not simply tests. According to Edison Schools Inc., they are “carefully constructed learningexperiences, ... that provide reliable indications of student progress.” Unlike a standardizedtest, embedded assessments are fully aligned with the curriculum of a project school.Additionally, embedded assessments can be used to test students on complicated tasks likesolving open-ended problems (The Edison Project, 1994). The results of the embeddedassessments become part of the structured portfolios which are maintained on all the studentsin a project school.

Data Analysis. Edison Schools Inc. has developed a standardized system of assessing the students' academic achievement based on the contents of their portfolios. This system involves: (a) performance tasks, like embedded assessments, that are common across all the Edison project schools in the nation; (b) multiple reviews of the students' work by teachers other than their own; and (c) the use of rubrics and scoring guidelines by the reviewers. The assessment system was implemented in stages at the project school to allow the teachers to be trained in its use. Edison Schools Inc. summarizes the results of these assessments in periodic reports to the district. These summaries provided the evaluation with an additional perspective of the students' academic achievement. These assessment results, however, could not be gauged, because comparative data were not available from either the control groups or the Edison students prior to their exposure to the model.

Satisfaction of the Parents

To ascertain the degree of the parents' satisfaction with the education that their children received at the Edison project school, data were drawn from two separate surveys (Question 5). The first is the School Climate Survey, which is administered annually by the MDCPS to comply with state statutes. There are three forms of the survey: the Staff Form, the Student Form, and the Parent Form. The Staff Form is forwarded to all teachers in the district. The remaining two forms, however, are forwarded to only samples of students and parents. These samples are randomly selected and represent approximately 25.0% of the students in each school in the district. The purpose of the survey is "to gather information regarding what these groups think about the school" (Office of Educational Accountability, 1995). The respondents are encouraged to be candid. While the survey is coded to identify the school being rated, the respondents remain anonymous.

The School Climate Survey, which was administered in year 1 of the evaluation, was developed by School Improvement Specialists. This survey instrument is based on the research of factors that correlate with effective schools. Seven of these factors are addressed by the instrument. They are: (a) clear school mission, (b) safe and orderly environment, (c) instructional leadership, (d) high expectations, (e) monitoring student progress, (f) opportunity to learn (i.e., time on task), and (g) home-school relations (i.e., parental involvement). Under each factor, the survey instrument includes a number of items. The format of each item consists of a statement detailing a typical situation in an effective school. The respondents use five-point scales to rate both the perceived importance of the situation and its frequency in the school. High ratings on the frequency scale indicate that the school is similar to the hypothetical, effective school described in the items. Low ratings indicate that it is not.

In year 2 of the evaluation, the district opted to replace the survey instrument developed by School Improvement Specialists with one developed internally. The new instrument retains the title of the old, uses the same sampling procedure, and targets the same populations. However, the new School Climate Survey is clearly distinct from its predecessor. It contains approximately two-thirds fewer items. The items, furthermore, generally adhere to a Likert format. In other words, they consist of statements about the school. The respondents indicate their degree of agreement with them by means of a single, five-point scale. The scale ranges from "Strongly Agree" to "Strongly Disagree," and it includes "Undecided/Unknown" as its mid-point. Finally, unlike its predecessor, the items in the new instrument function independently, so it is not possible to aggregate the responses to obtain a total score. However, an overall rating can be obtained from the final, summary item of the instrument. In brief, the new survey instrument is so distinct from the old that a comparison of the results from years 1 and 2 would yield little viable data. Nevertheless, since the district continued to use the new instrument in years 3 and 4, comparisons of the results from years 2, 3 and 4 were feasible.

The second survey, which was used as a source of data, is part of the CSMpact(sm) for Schools. This is an instrument designed to measure satisfaction with a school. It was developed in 1993 by the Gordon S. Black Corporation. The company was subsequently renamed Harris Interactive Inc., but the instrument was unaffected by this change. It is currently in use nationwide. The instrument, like the School Climate Survey, consists of three survey forms which target separately the school staff, the students, and their parents. Each form includes items which address both the respondents' familiarity and level of satisfaction with different school experiences. The responses to the latter are provided by means of a letter grade scale which ranges from "A" to "F." The analysis of the responses includes a multiple regression analysis to determine the relative importance of each experience to the general level of satisfaction. The results are used to determine a score dubbed the Impact Index. This index arithmetically combines the regression weight and the frequency of the experience into a single score that ranges from a high of 10 to a low of zero. The purpose of the index is to facilitate the prioritization of efforts to improve the satisfaction with the school. Harris Interactive Inc. has been retained by Edison Schools Inc. to administer the CSMpact(sm) for Schools annually in all the project schools.
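
The report gives the Impact Index's ingredients (a regression weight and an experience's frequency) and its 0-to-10 range, but not its formula. The rescaled product below is therefore only one plausible illustration, not Harris Interactive Inc.'s actual computation.

def impact_indices(weights, frequencies):
    # Combine each experience's regression weight with its frequency ...
    products = [w * f for w, f in zip(weights, frequencies)]
    top = max(products) or 1.0
    # ... and rescale so the highest-leverage experience scores 10,
    # making the priorities for improvement easy to rank.
    return [round(10.0 * p / top, 1) for p in products]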

Methodology. The new School Climate Survey, like its predecessor, is essentially a norm-referenced instrument. To gauge the parents' responses, it was necessary to compare them to those of their counterparts in similar schools. Four MDCPS schools that are comparable to the project school were selected during year 1 of the evaluation. The selection was based on the same basic variables used to select the control groups in the quasi-experiment. These variables include: the school level, the ethnic composition of the student body, the proportion of the students receiving free/reduced lunch, and the students' performance on the 1995-96 SAT 8 test. The following four elementary schools were selected: Benjamin Franklin, Lakeview, Natural Bridge, and North Miami. These schools served as the control schools in gauging the results of the survey.

Each year of the evaluation, the survey responses of the parents from the project school were compared to those from the control schools. The scope of the comparison, however, was limited by the fact that the responses were anonymous. This situation precluded the tracking of specific parents' responses across time. Consequently, the identification of longitudinal trends in the responses must take into account both the change in respondents from one year to the next and the introduction of a new survey instrument in year 2. A copy of the Parent Form of the new School Climate Survey appears in Appendix C.

Unlike the School Climate Survey, the CSMpact(sm) for Schools is not a norm-referenced instrument. It was designed to be a diagnostic instrument, which is used to improve satisfaction with a school. Consequently, the results of administering it at the project school could not be readily gauged through comparisons. The analysis of the results was also limited by the fact that the responses were anonymous. Nevertheless, the responses to the parent form of the instrument, which is titled the Parent Questionnaire, were used as a source of data on the parents' satisfaction with the project school. A copy of the Parent Questionnaire appears in Appendix D.

Data Analysis. Since the responses to the new School Climate Survey cannot be aggregated to obtain total scores, the statistical analysis conducted in year 1 could not be replicated. Thus, the overall outcomes of the parents' survey responses since year 2 were based on the final, summary item of the Parent Form. This item asks: "Students get grades A, B, C, D or F for the quality of their school work. What overall grade would you give to your child's school?" The responses of the project parents were compared to those of their counterparts in the control schools. The results of the comparison are presented using descriptive statistics. Likewise, descriptive statistics are used to present the results of the Parent Questionnaire.

Involvement of the Parents

Data on the extent of the parents' involvement in their children's education were drawn from the School Climate Survey (Question 6). Additionally, the evaluation included data obtained from project records that detail the parents' participation in school-related activities.


Methodology. The Parent Form of the School Climate Survey contains a single item that specifically addresses the extent of parent involvement in school-related activities. Item 3 asks: "How many school-related activities have you attended this year? Include PTA meetings, Open House, parent-teacher conferences, meetings, theatrical performances, etc." Unlike nearly all the other items in the survey, the response options of item 3 are not based on the previously described scale. The options consist of the number of activities. To gauge the numbers reported by the parents from the project school, they were compared to those from the control schools. Additional data on the parents' participation in school-related activities were obtained from reports provided by Edison Schools Inc. However, it was not possible to properly gauge these data. They were regarded as qualitative evidence of parental involvement.

Data Analysis. The analysis of the data on parental involvement consisted primarily of descriptive statistics.

School Climate

The final question addressed by the evaluation concerned school climate (Question 7). Data on school climate were drawn primarily from the teachers' responses to the Staff Form of the School Climate Survey. These responses were supplemented by archival data on certain factors that reflect the climate of a school. These factors include: the students' attendance rate, their index of mobility, the number of indoor and outdoor suspensions, and the teacher-student ratio.

Methodology. To gauge the school climate of the Edison project school, the teachers' responses to the School Climate Survey were compared to those of the teachers from the control schools. The comparison also encompassed the aforementioned factors that reflect the climate of a school. A copy of the Staff Form of the School Climate Survey appears in Appendix E.

Data Analysis. As previously mentioned, the responses to the new version of the School Climate Survey cannot be aggregated to obtain a total score. Therefore, the analysis of the teachers' responses, like that of the parents', was based exclusively on the final, summary item of the survey. The project teachers' responses were compared to those of their counterparts in the control schools. The results of the comparison are presented using descriptive statistics. Likewise, descriptive statistics are used to present the results of the comparison of the project school and the control schools on the factors that reflect the climate of a school.

Summary of Evaluation Activities

In summary, the evaluation of the Edison project school in Miami-Dade County focused on a series of seven questions. These questions, for the most part, were derived from the three basic objectives of the project. Question 1, which deals with the implementation of the Edison model, was addressed primarily through qualitative data. Questions 2, 3 and 4 deal with the academic performance of the students. These questions, which are based on Objective 1, were addressed primarily by making controlled comparisons of the students' performance on standardized tests. Parental satisfaction and involvement in their children's education are the subjects of Questions 5 and 6 respectively, which are based on Objective 2. These two questions were addressed by making controlled comparisons of survey responses. The same basic strategy was employed in addressing Question 7. This question, which is based on Objective 3, deals with school climate. An overview of the sources of data that were used to address each evaluation question is displayed in Table 10.

Table 10

Sources of Data for the Evaluation Questions

Objectivea  Evaluation Questionb                                          Primary Source/s of Data (Responsibility)
–           1. Implementation of Edison model                            Interviews of principal (District); Survey of Teachers (District); Classroom observations (District)
1           2. Students' performance on Stanford Achievement Test (SAT)  SAT (District)
1           3. Students' performance on Florida Writing Assessment (FWA) FWA (District)
1           4. Students' attainment of Edison curriculum standards       Curriculum standards (Edison)
2           5. Parents' satisfaction with children's education           School Climate Survey, Parent Form (District); Parent Questionnaire (Edison)
2           6. Parents' involvement in children's education              School Climate Survey, Parent Form (District); Project records (Edison)
3           7. School climate                                            School Climate Survey, Staff Form (District); Archival data (District)

a For a full text of the project objectives, see page 13.
b For a full text of the evaluation questions, see pages 13 and 14.


RESULTS OF THE EVALUATION

Edison Schools Inc. (formerly The Edison Project) has a contract with the Miami-Dade County Public Schools (MDCPS) to manage Henry E.S. Reeves Elementary School for a period of five years. Edison Schools Inc. is a for-profit management company involved in the privatization of public schools. The company markets a unique model of education and supplementary services. It consists of an eclectic mixture of such elements as an extended school year, an interdisciplinary curriculum, and the use of modern technology. An overview of all the basic elements of the model appears in Table 5. The contract calls for the company to employ this model in managing the project school from August 1996 until June 2001. The contract also calls for an evaluation of the project.

The evaluation, which spanned four school years, has been completed. The evaluation's cumulative findings through year 4 (1999-00) are detailed in this final report. The evaluation examined both the implementation and the impact of the Edison model. Several sources of data were used. They included project documents, interviews of the school principal, classroom observations, surveys of the teachers and parents, and the students' performance on standardized tests. The data obtained from these various sources were used to address the issues raised in the evaluation questions listed in the Design of the Evaluation. These issues include: (a) the implementation of the Edison model at the project school; (b) the impact of the model on the students' academic achievement; (c) the parents' satisfaction with the model, and their involvement in the project school; and (d) the impact of the model on the climate of the project school. Each of these issues warrants careful scrutiny.

Implementation of the Edison Model

To ascertain the extent of the Edison model's implementation at the project school, the evaluation drew data from three sources. They included: (a) a survey of the full-time teachers at the project school, (b) interviews of the principal, and (c) a set of classroom observations. For the most part, the data derived from these sources are qualitative in nature. Accordingly, the analysis simply involved determining whether the various sources of data concur with regard to the extent of the model's implementation.

The initial data on the model’s implementation during year 4 of the evaluation were drawn fromthe survey of the teachers. The survey instruments were forwarded on May 9, 2000 to all 61full-time teachers at the project school. The deadline for their responses was May 19;however, completed instruments were accepted until June 15, when they were forwarded forkey punching. At that time, a total of 57 teachers had responded, which yielded a return rateof 93.4%. This high percentage allows for the generalization of the results to the entirepopulation of full-time teachers at the school.


The survey instrument, which is titled the Survey of Teachers, was developed originally for year 1 of the evaluation. Since its initial administration, minor revisions have been made to enhance the clarity of the items. The instrument in its year 4 form consists of 38 items. The majority of them deal with the implementation of the basic elements of the Edison model. These items, which include 2 through 28, essentially adhere to a Likert format. Each item consists of a statement describing the situation that would exist at the project school if the element in question were properly implemented. The teachers respond by using a four-point scale to indicate the degree to which each statement is true about their school. A summary of the teachers' responses to the implementation items appears in Table 11. The balance of the survey items generally deal with the teachers' opinions of the model's impact. These include items 29 through 37, which also adhere to a Likert format. The response scale, however, differs from the one previously described. It calls for the teachers to indicate their degree of agreement with each statement. When warranted, the teachers' responses to these impact items will be addressed in the subsequent sections of this report. A copy of the Survey of Teachers appears in Appendix A.

As previously noted, Table 11 depicts the teachers' responses to the implementation items in year 4. The table lists the 10 categories of the model's fundamentals and the corresponding elements, which were originally presented in Table 5. There is a total of 21 elements in the model. Every element is addressed by at least one of survey items 2 through 28; the full text of the items is available in Appendix A. The ratings of the items are based on the following response scale: 4 = "True," 3 = "Mostly true," 2 = "Mostly false," and 1 = "False." The scale also includes the option "Not applicable/Unable to respond"; however, such a response was regarded as a non-response, and it was not included in the n of an item. The final column in the table uses symbols to summarize whether, in the opinions of the teachers, an element was implemented at the project school. A plus appears in the row of an element if the rating of each corresponding item exceeds 2.5. A rating above this mid-point in the scale indicates that the teachers tended to view the statement as true. A minus appears in the row of an element if the rating of each corresponding item is below 2.5. Such a rating indicates that the teachers tended to view the statement as false. Finally, if the ratings of the corresponding items are mixed (i.e., either at 2.5, or some above and some below), both symbols appear in the row to indicate that the element may be partially implemented. Neither of the latter two symbols, however, was needed to describe the teachers' responses to the survey in year 4. As Table 11 illustrates, all the elements of the model were rated above the 2.5 mid-point of the response scale. Thus, in the opinions of the teachers, every element was implemented in the project school. This is reflected in the uniform series of plus symbols in the final column of the table.
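
The symbol rule described above amounts to a three-way decision. The helper below is a hypothetical restatement of that rule, not part of the evaluation's actual tooling.

def implementation_symbol(item_means):
    # item_means: mean ratings (1-4 scale) of the items addressing one element.
    if all(mean > 2.5 for mean in item_means):
        return "+"     # teachers tended to view every statement as true
    if all(mean < 2.5 for mean in item_means):
        return "-"     # teachers tended to view every statement as false
    return "+/-"       # mixed ratings: possibly partial implementation

# In year 4, every element's ratings exceeded 2.5, so every row earned "+".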


Table 11

Survey of Teachers: Implementation of the Edison Model, Year 4

Fundamental: Element/s                                                          Implementation of Elementc

Organization
1. "Schools-within-a-school" consisting of academies that combine several ages/grade levels   +
2. Ability grouping, but no tracking                                            +
3. Teachers organized into teams                                                +
4. Flexible scheduling (e.g., block scheduling)                                 +
Time
5. Extended school year                                                         +
6. Extended school day                                                          +
Curriculum
7. Interdisciplinary curriculum                                                 +
8. Lessons organized around projects or real life problems                      +
9. Results-oriented standards for promotion to the next academy                 +
Teaching Methods
10. Variety of teaching methods                                                 +
Assessment
11. Unique assessment system for monitoring student's progress toward standards +
Educators
12. Career ladder for teachers                                                  +
13. Emphasis on professional development                                        +
14. Results-oriented system for assessing job performance of teachers          +
Technology
15. Emphasis on technology (e.g., computer provided for each student's home)    +
16. Home and school linked by computer network                                  +
Partnership
17. Emphasis on parental involvement                                            +
Community
18. One-fourth of curriculum determined locally                                 +
19. School is conduit for social services                                       +
System and Scale
20. School is supported by Central Services of Edison Schools Inc.              +
21. School and Central Services are linked by computer network                  +

Note. The elements were addressed by survey itemsa 2 through 28; the mean ratingsb of the items ranged from 2.9 to 4.0 (with ns from 35 to 57), all above the 2.5 mid-point of the response scale.
a For the full text of the survey items, see Appendix A.
b The ratings are based on the following scale: 4 = True, 3 = Mostly true, 2 = Mostly false, and 1 = False.
c According to the teachers' responses, this element of the model is: + = Implemented, +/– = Partially implemented, or – = Not implemented.


This outcome differs from that of year 1. In the initial year of the evaluation, the teachers generally held the opinion that while a vast majority of the model's elements were implemented, not all of them were. This outcome had been unexpected. Consequently, it prompted an adjustment in the interviews of the principal in year 1. Originally, the interviews were intended to be unstructured in format; and, for the most part, they were. However, due to the outcome of the teachers' survey, a portion of one interview with the principal was structured around the survey instrument. In this manner, the principal was specifically questioned about the implementation of each element in the model. This strategy was repeated in years 2, 3 and 4. As in year 1, the intent was to ascertain whether there was concurrence between the teachers and the principal regarding the degree of the model's implementation. The results of the interview of the principal in year 4 are depicted in Table 12. The table also includes the results of the classroom observations and, for comparison purposes, the results of the teachers' survey (i.e., the data in the final column of Table 11). A review of Table 12 reveals that the principal was nearly in complete agreement with the teachers. According to the principal, every element of the model was implemented at the project school in year 4 with one exception: Element 18 (i.e., one-fourth of curriculum determined locally) was not implemented.

The classroom observations were more limited than the other sources of data in terms of time and focus. The observations were conducted at the beginning of the third grading period of year 4 and again at the conclusion of the fourth grading period. During each session, unannounced visits were made to randomly selected houses by members of the evaluation team. The observations made during these visits were recorded on the Classroom Observation Form, a copy of which appears in Appendix B. This instrument, which was developed expressly for this evaluation, was designed to document the operational status of the model's basic elements. However, the focus of the data collection was limited to observable evidence in the selected classrooms. Consequently, the status of some elements of the model could not be determined. In Table 12, these elements are labeled "ND" (i.e., "not determined"). They include elements which could only have been discerned through extended contact with various aspects of the project school's operation. This would have been the situation for both the teachers and the principal, but not for an evaluator making a short-term visit to the classrooms. Yet, despite this constraint, the classroom observations yielded evidence that nearly all the elements of the model were implemented. Specifically, evidence of implementation was noted for all but three of the elements.

Table 12

All Sources of Data: Implementation of the Edison Model, Year 4

Fundamentals: Element/s (symbols denote, in order, the Survey of Teachers, the Interviews of Principal, and the Classroom Observationsb)a

Organization
1. "Schools-within-a-school" consisting of academies that combine ages/grade levels:  + / + / +
2. Ability grouping, but no tracking:  + / + / +
3. Teachers organized into teams:  + / + / +
4. Flexible scheduling (e.g., block scheduling):  + / + / +
Time
5. Extended school year:  + / + / +
6. Extended school day:  + / + / +
Curriculum
7. Interdisciplinary curriculum:  + / + / +
8. Lessons organized around projects or real life problems:  + / + / +
9. Results-oriented standards for promotion to the next academy:  + / + / +
Teaching Methods
10. Variety of teaching methods:  + / + / +
Assessment
11. Unique assessment system for monitoring students' progress toward standards:  + / + / +
Educators
12. Career ladder for teachers:  + / + / ND
13. Emphasis on professional development:  + / + / +
14. Results-oriented system for assessing job performance of teachers:  + / + / ND
Technology
15. Emphasis on technology (e.g., computer provided for each student's home):  + / + / +
16. Home and school linked by computer network:  + / + / +
Partnership
17. Emphasis on parental involvement:  + / + / +
Community
18. One-fourth of curriculum determined locally:  + / – / +
19. School is conduit for social services:  + / + / +
System and Scale
20. School is supported by Central Services of Edison Schools Inc.:  + / + / ND
21. School and Central Services are linked by computer network:  + / + / +

a According to the source of data, this element of the model is: + = Implemented, +/– = Partially implemented, or – = Not implemented.
b If the implementation of an element could not be determined during the classroom observations, the cell is labeled "ND".

In summary, the three sources of data failed to fully concur on the extent of the Edison model's implementation in year 4 of the evaluation. The data derived from both the survey of teachers and the classroom observations generally indicated that all the basic elements of the model were implemented. However, the principal questioned the implementation of one element. This is a reversal of the results in year 3. In that year, the principal felt that all the elements were implemented, but the teachers questioned the implementation of at least one element.

Nevertheless, it must be acknowledged that the difference between the implementation of all the elements and nearly all may be a moot point. Although there may be questions regarding the status of one of the elements in year 4, the model on the whole was clearly implemented. This essentially has been the outcome in every year of the evaluation, with the exception of year 1. In that year, the interim evaluation report concluded that the model was "not quite fully implemented." However, it is understandable that, given the scope and complexity of the model, its initial implementation required a considerable amount of time. By year 2 the model was fully implemented, and it has essentially remained so in the subsequent years of the evaluation.

Impact of the Model on the Students’ Academic Achievement

The first, and clearly the most important, of the project's three stated objectives is: "To raise the academic achievement of all students to the highest level possible." To ascertain whether progress was made in attaining this objective, data were drawn from the students' performance on two standardized tests of academic achievement. These were the Stanford Achievement Test (SAT) and the Florida Writing Assessment. In addition, data were drawn from reports on the students' attainment of the curriculum standards of the Edison model. Each of these sources of data will be examined separately.

Stanford Achievement Test

Public school students in Miami-Dade County are routinely administered the SAT test each spring. At the elementary level, all grades except kindergarten and grade 1 are tested. To gauge the project students' performance on the test, a quasi-experiment was conducted as part of the evaluation. The design of the quasi-experiment is a variation of the nonequivalent control group design. It essentially involves using pre and posttest scores to compare the academic achievement of students who attended the project school (i.e., the experimental group) with that of three groups of students who did not (i.e., the control groups). Each control group, when initially created, had the same number of students as the experimental group. The three control groups were drawn in year 1 (1996-97) of the evaluation from the pool of all MDCPS students who were not attending the project school. Stratified random sampling was used to ensure that each control group corresponded proportionally with the experimental group on the following variables: grade level, ethnicity, participation in the free/reduced lunch program, and performance on the pretest.

The pretest was the SAT 8 test which was administered in the 1995-96 school year. This was the last SAT test administered to the students in the experimental group prior to their enrollment in the project school. The posttest, unlike the pretest, was not a single measure. It consisted of the series of SAT tests administered during the students' passage through the grade levels of the project school. An analysis of the project students' performance on the test versus that of the students in the control groups was conducted each year of the evaluation period. In year 4, the analysis encompassed only the students enrolled in grade 5. An overview of the grade levels involved in each annual analysis is depicted in Table 7.

The analysis of the data from the quasi-experiment required three separate statistical procedures. They consisted of: (a) analyses of variance (ANOVAs) to compare the pretest scores of the project and control groups; (b) multiple regression analyses to predict the groups' posttest scores; and (c) ANOVAs to assess the differences in the groups' predicted and actual scores. The first procedure served to verify that the project and control groups performed comparably on the pretest. Simple ANOVAs were done on the total scaled scores of the reading and the mathematics subtests to identify statistically significant differences in the performance of the groups. The results of the ANOVAs on the reading subtests appear in Table 13, and the ANOVAs on the mathematics subtests in Table 14.

The third column in Table 13 lists the means of the total scaled scores of the project and control groups on the SAT reading subtests. A review of this column reveals that the mean scores within each grade level are very similar. It would appear that the selection process yielded comparable groups of students. This perception was tested by ANOVAs, which were done on the mean scores of each grade level. The results are illustrated in the last column of the table. A review of the F-values reveals that none of them are statistically significant. Thus, there were no statistically significant differences in the mean scores detected by ANOVAs at any grade level. The groups were indeed comparable in reading performance prior to the start of the quasi-experiment.

Table 14 adheres to the same format as Table 13, and a review of it yields similar results. The project and control groups were also comparable in mathematics performance prior to the start of the quasi-experiment. Incidentally, the review of both tables also reveals that the number of students in each grade level of the project group nearly always differs from that of the control groups (i.e., the n value in the fourth column). This, however, is misleading, since the number of students selected for each grade level of the control groups was dictated by the number in the corresponding level of the project group. The discrepancies in both tables are the result of project students who lacked certain pretest scores. This, of course, was not a problem with the students in the control groups, since their selection was contingent on having these scores.

Table 13

Comparison of the Groups' Reading Performance on the Pretest

Grade   Group       Meana   n     df   MS      Fb
2       Project     510.7   111   3    35.54   .02
        Control 1   509.5   114
        Control 2   509.5   114
        Control 3   509.9   114
3       Project     544.2   148   3    2.12    .00
        Control 1   544.2   159
        Control 2   544.1   159
        Control 3   544.0   159
4       Project     571.0   150   3    60.58   .07
        Control 1   569.6   161
        Control 2   570.7   161
        Control 3   570.7   161
5       Project     588.6   115   3    39.66   .05
        Control 1   589.2   121
        Control 2   589.7   121
        Control 3   589.9   121

a The analyses are based on the total scaled scores in the reading subtests of the Stanford Achievement Test, which was administered during the 1995-96 school year.
b None of the F-values are statistically significant (p < .05).

In year 4, simple ANOVAs were again used to compare the pretest scores of the project and the control groups. The intent was to verify that, despite the loss of subjects, the groups remained comparable in their pretest performance. The results of the analysis mirrored those of the initial comparison. In both reading and mathematics, none of the differences in the total scaled scores of the groups proved to be statistically significant. This is evidence that the initial comparability of the groups had not been compromised by the subsequent attrition of subjects.

Having established the comparability of the project and control groups in both their reading and mathematics performance on the pretest, the next step in the data analysis involved predicting the groups' performance on the posttest. Multiple regression analyses were used to generate the predicted scores. A total of seven variables were used as predictors in the regression equations. Six of the predictors are directly associated with the student; they are: (a) the pretest scores, (b) ethnicity, (c) time in the current school, (d) participation in the free/reduced lunch program, (e) enrollment in the English for Speakers of Other Languages (ESOL) program, and (f) enrollment in the Exceptional Student Education (ESE) program. The seventh predictor is indirectly associated with the student; it is the socio-economic status of the school where the student is enrolled. The specific definitions of these predictors appear in Table 8.


Table 14

Comparison of the Groups' Mathematics Performance on the Pretest

Grade   Group       Meana   n     df   MS      Fb
2       Project     522.7   114   3    62.14   .04
        Control 1   521.1   114
        Control 2   521.5   114
        Control 3   522.4   114
3       Project     547.9   157   3    53.01   .03
        Control 1   547.7   159
        Control 2   547.4   159
        Control 3   546.6   159
4       Project     577.9   159   3    43.05   .03
        Control 1   578.3   161
        Control 2   578.5   161
        Control 3   579.2   161
5       Project     611.4   119   3    8.43    .01
        Control 1   611.8   121
        Control 2   611.5   121
        Control 3   612.0   121

a The analyses are based on the total scaled scores in the mathematics subtests of the Stanford Achievement Test, which was administered during the 1995-96 school year.
b None of the F-values are statistically significant (p < .05).

The computation of the regression equations began with the selection of a sample of MDCPS students. From the pool of all year 4 students who were not enrolled in the project school, a sample of 1000 grade 5 students was randomly selected. This grade level corresponds with that of the students in the project and control groups in year 4. The sample of students was then used to generate the regression equations.

Two regression equations were computed using different pretest scores as a predictor. One equation was based on the total scaled score in the reading subtests of the pretest, and the other on the total scaled score in the mathematics subtests. The regression weights which were generated by these equations are listed in Table 15. The first column in the table lists the predictors, and the second identifies their various values or ranges. The specific regression weights for each predictor appear in the final column of the table. The weights used to predict the students' reading scores are on the top half of the table, and those used to predict the mathematics scores are on the bottom half.

Table 15

Regression Weights of the Predictors, Year 4

READING
Predictor                         Values                               Grade 5 Weight
Pretest                           406 to 686                           .297
Ethnicity: Whitea                 0 or 1 (1 = White)                   3.404
Ethnicity: Blacka                 0 or 1 (1 = Black)                   -13.993
Ethnicity: Hispanica              0 or 1 (1 = Hispanic)                0c
Time in Current School            20 to 721b                           -.014
Time in Current School Squared    400 to 519,841b                      .000d
Free/Reduced Luncha               0 or 1 (1 = not receiving)           3.638
ESOL                              1, 2, 3, 4, 5, or 6 (6 = not ESOL)   -1.134
ESE                               -1, 0, or 1 (1 = gifted)             29.786
School's SES                      1.0 to 91.4%                         .164
Intercept                         Constant                             490.084

MATHEMATICS
Predictor                         Values                               Grade 5 Weight
Pretest                           407 to 662                           .361
Ethnicity: Whitea                 0 or 1 (1 = White)                   .392
Ethnicity: Blacka                 0 or 1 (1 = Black)                   -8.924
Ethnicity: Hispanica              0 or 1 (1 = Hispanic)                0c
Time in Current School            20 to 721b                           .003
Time in Current School Squared    400 to 519,841b                      .000d
Free/Reduced Luncha               0 or 1 (1 = not receiving)           4.948
ESOL                              1, 2, 3, 4, 5, or 6 (6 = not ESOL)   -5.833
ESE                               -1, 0, or 1 (1 = gifted)             25.649
School's SES                      1.0 to 91.4%                         .189
Intercept                         Constant                             484.788

Note: The regression weights are rounded off at their third decimal place. ESOL = English for Speakers of Other Languages; ESE = Exceptional Student Education; SES = Socio-Economic Status.
a This is a categorical variable with coded values.
b The number of days in the extended school year of the project school was prorated, so it could be accommodated within the 180-day range of the regular school year.
c The zero weight of the Hispanic students is a function of the set of categorical variables involved in the coding of ethnicity.
d This weight value is too low to be reflected within three decimal places.

A review of the values in the second column of Table 15 reveals that some of the predictors are continuous variables (e.g., the score on the pretest), while others are categorical variables (e.g., ethnicity). Since a categorical variable has no inherent numerical value, it is necessary to give it a coded value to allow for its inclusion in the computation of the regression equation. It is this coding that prompted the listing of ethnicity three times in the first column. It also led to the zero values of the regression weights for Hispanic students, which can be misleading. The portion of the Hispanic students' predicted scores which is linked to ethnicity is controlled by the relative values in the regression weights of the non-Hispanic students. Another item in the second column that should be mentioned is the range of the "time in current school." To adjust for the project school, the number of days in its extended school year was prorated, so it could be accommodated within the 180-day range of the regular school year. Finally, it should be noted that the regression weights are rounded at the third decimal place and not the customary first or second. This was necessary to partially accommodate some very low values.

Using the regression weights, predictions were made of the students' posttest scores for year 4 of the evaluation. The students' scaled scores in Reading Comprehension and Mathematics Problem Solving subtests were predicted separately. These predicted scores, which are referred to as "par", were computed individually for all the students in the project and control groups. Par represents the posttest scores that the students would be expected to attain, if they were attending typical MDCPS schools. As such, the actual posttest scores of the students in the control groups should be close to par, since their situation was not unlike that of the students from whom the regression equations were derived. However, if the Edison model represents a superior educational program, the actual posttest scores of the students in the project group should exceed par.
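
As an illustration of the par computation, the sketch below applies the grade 5 reading weights and intercept from Table 15 to a student profile; the profile values are invented for the example:

# Par = intercept + sum of (regression weight x predictor value), using
# the grade 5 reading weights from Table 15. The student is hypothetical.
READING_WEIGHTS = {
    "pretest": 0.297,          # reading scaled score on the pretest
    "white": 3.404,            # ethnicity indicators, coded 0 or 1
    "black": -13.993,
    "hispanic": 0.0,           # zero by construction of the ethnicity coding
    "days_in_school": -0.014,  # prorated time in current school
    "days_squared": 0.0,       # rounds to zero at three decimal places
    "not_free_lunch": 3.638,   # 1 = not receiving free/reduced lunch
    "esol_level": -1.134,      # 1 through 6, where 6 = not an ESOL student
    "ese_code": 29.786,        # -1, 0 or 1, where 1 = gifted
    "school_ses": 0.164,       # school's SES, as a percentage
}
INTERCEPT = 490.084

def par_score(student: dict) -> float:
    return INTERCEPT + sum(READING_WEIGHTS[k] * v for k, v in student.items())

student = {
    "pretest": 590.0, "white": 0, "black": 1, "hispanic": 0,
    "days_in_school": 360, "days_squared": 360 ** 2,
    "not_free_lunch": 0, "esol_level": 6, "ese_code": 0, "school_ses": 45.0,
}
print(round(par_score(student), 1))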

Accordingly, the initial step in gauging the students' performance on the posttest involved determining the differences between their actual scores and the par scores. These differences, which are known as "residual scores", can range from negative to positive values. A negative value indicates that the par score was not attained; a zero value indicates that it was; and a positive value indicates that it was exceeded. Table 16 displays the residual scores in reading for all the groups of students in the quasi-experiment. The table displays the following data for year 4: the number of students (i.e., the n), the mean posttest score (i.e., the actual score), and the mean par score. In addition, the mean residual scores for years 1, 2, 3 and 4 are displayed. The number of scores differs by year, due to the students' advancement through the grade levels.

Finally, the table includes the results of t-tests on the statistical significance of the residual scores. Scores that proved to be significant are labeled with an asterisk. Since residual scores represent the difference between the actual score and the par score, any that proved not to be statistically significant can in effect be regarded as a zero. In other words, the actual score and the par score are essentially the same. With this in mind, a review of the residual scores from year 4 reveals that the project group attained the par score in reading. This is a somewhat favorable outcome, since in previous years the project group often failed to attain this standard. In fact, of the nine separate analyses conducted in years 1 through 3, four resulted in the attainment of the par score and five did not. In contrast, the control groups over the same time period rarely failed to either attain or exceed the par score. Their performance in year 4 was particularly notable. All three control groups exceeded the par score in reading. This outcome tends to eclipse the project group's accomplishment.
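
The residual-score test just described can be sketched in a few lines of Python; the score arrays below are hypothetical, and scipy is assumed to be available:

# Residuals are actual posttest scores minus par scores; a one-sample
# t-test checks whether their mean differs from zero.
import numpy as np
from scipy import stats

actual = np.array([628.0, 641.5, 633.2, 637.9])  # posttest scaled scores
par = np.array([630.1, 636.8, 634.0, 635.5])     # regression-predicted scores

residuals = actual - par
t_stat, p_value = stats.ttest_1samp(residuals, 0.0)

# A mean residual whose p-value exceeds .05 is treated as zero: the group
# attained, but did not exceed, the par score.
print(residuals.mean(), p_value)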

Table 17 displays the groups' residual scores in mathematics. A review of this table, which is identical in format to Table 16, reveals that in year 4 the project group failed to attain the par score in mathematics. This outcome, unfortunately, is in keeping with the performance of the project group in previous years. In the nine separate analyses conducted in years 1 through 3, only one resulted in the attainment of the par score; eight did not. In contrast, the control groups over the same time period consistently either attained or exceeded the par score. And, their performance in year 4 was no exception. One control group exceeded the par score and the other two attained it. Thus, once again, the performance of the control groups eclipsed that of the project group.

Consequently, the analyses of the groups' academic performance based on the residual scores revealed that the project group had exhibited some improvement in reading since year 1. However, the group had failed to keep pace with the reading accomplishments of the control groups. By year 4, the control groups had a distinct advantage in reading performance. Moreover, the control groups maintained a similar advantage in mathematics performance for the duration of the evaluation. The differences in the groups' year 4 performance in reading and mathematics are graphically depicted in Figures 7 and 8 respectively. Additionally, the differences in years 1, 2 and 3 are depicted in Figures 1 and 2, 3 and 4, and 5 and 6 respectively. Each figure utilizes a bar graph to illustrate the residual scores of the project and the control groups by grade level.

The final analysis of the data from the quasi-experiment consisted of direct comparisons of the relative performances of the groups on the posttest. This analysis, like the one based on the residual scores, was conducted as a means of gauging the performance of the project group. The analysis involved the use of simple ANOVAs to test the statistical significance of the differences in the mean residual scores of the project group and each of the control groups. This direct comparison renders a clearer picture of each group's relative performance than the analysis based solely on the residual scores.


Table 16

Residual Scores in Reading, Years 1, 2, 3 and 4

                            Mean SAT Reading Score
                         --------- Year 4 ----------   Year 3     Year 2     Year 1
Group      Grade    n    Actual   Par(a)  Residual(b)  Residual   Residual   Residual
Project      2      —       —        —        —           —          —         -4.7*
             3      —       —        —        —           —        -1.7        -3.8*
             4      —       —        —        —         -2.4       -4.5        -4.7*
             5     69     635.3    634.5      .8        -6.2*      -3.2*       -1.9
Control 1    2      —       —        —        —           —          —          -.4
             3      —       —        —        —           —         7.1*       -2.9*
             4      —       —        —        —          4.4       -1.6         2.4
             5     88     637.7    629.1     8.6*       -1.5        -.1         -.6
Control 2    2      —       —        —        —           —          —          4.8
             3      —       —        —        —           —         7.6*        -.9
             4      —       —        —        —          4.9       -1.5          .2
             5     95     639.5    630.4     9.1*       -2.8        -.8         1.8
Control 3    2      —       —        —        —           —          —          4.0
             3      —       —        —        —           —         5.5        -1.4
             4      —       —        —        —          6.8*       -.0         1.2
             5     99     637.5    630.5     7.0*       -1.2        1.2          .9

Note. The analyses are based on the scaled scores in the reading subtests of the Stanford Achievement Tests (SAT), which were administered during the school years 1996-97 (year 1), 1997-98 (year 2), 1998-99 (year 3), and 1999-00 (year 4).
(a) The par scores were generated through multiple regression analyses.
(b) The residual score is the difference between the actual score and the par score.
* A t-test revealed that this residual score is statistically significant (p < .05).

Consequently, the direct comparison of the groups' performance constitutes the primary analysis used in interpreting the data generated by the quasi-experiment. The results of the analysis which were based on the reading scores appear in Table 18, and those based on the mathematics scores appear in Table 19. These two tables adhere to the same basic format as Tables 13 and 14, which present the results of the ANOVAs on the pretest. Tables 13 and 14, however, conclude with a column containing the F-values. Since none of those F-values is statistically significant (i.e., all the groups within each grade level have comparable mean scores), the testing of the differences in the mean scores of individual groups was precluded.


Table 17

Residual Scores in Mathematics, Years 1, 2, 3 and 4

                           Mean SAT Mathematics Score
                         --------- Year 4 ----------   Year 3     Year 2     Year 1
Group      Grade    n    Actual   Par(a)  Residual(b)  Residual   Residual   Residual
Project      2      —       —        —        —           —          —        -16.7*
             3      —       —        —        —           —        -4.9        -8.6*
             4      —       —        —        —        -19.1*     -16.6*      -10.8*
             5     70     640.8    648.0    -7.2*       -6.7*      -6.2*       -8.7*
Control 1    2      —       —        —        —           —          —          3.0
             3      —       —        —        —           —         2.4         1.4
             4      —       —        —        —          -.5       -1.9        -1.6
             5     91     647.4    641.1     6.3*       -2.4       -5.0*        -.8
Control 2    2      —       —        —        —           —          —          6.9*
             3      —       —        —        —           —         6.6         2.8
             4      —       —        —        —          4.4        6.3*        -.7
             5     97     643.8    642.0     1.8         2.2       -3.3        -1.1
Control 3    2      —       —        —        —           —          —          -.5
             3      —       —        —        —           —        -2.0         1.4
             4      —       —        —        —         -1.7        2.6        -1.3
             5     99     643.0    643.1     -.1         1.8       -4.8*         .0

Note. The analyses are based on the scaled scores in the mathematics subtests of the Stanford Achievement Tests (SAT), which were administered during the school years 1996-97 (year 1), 1997-98 (year 2), 1998-99 (year 3), and 1999-00 (year 4).
(a) The par scores were generated through multiple regression analyses.
(b) The residual score is the difference between the actual score and the par score.
* A t-test revealed that this residual score is statistically significant (p < .05).

In contrast, the ANOVAs of the posttests in years 1, 2, 3 and 4 yielded several F-values which are statistically significant. This necessitated the inclusion of Group Difference columns in both Tables 18 and 19. Each of these columns depicts the outcome of a single year of the quasi-experiment. Notations consisting of abbreviations and symbols are used to illustrate whether a statistically significant difference exists between the mean residual score of the project group and that of a specific control group.
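
A minimal sketch of this primary analysis, again in Python with scipy assumed, runs a simple one-way ANOVA on the four groups' residual scores; the data below are randomly generated placeholders whose means and sizes merely echo the grade 5 row of Table 18:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
project = rng.normal(0.8, 25, 69)    # residual scores, project group
control_1 = rng.normal(8.6, 25, 88)  # residual scores, control groups
control_2 = rng.normal(9.1, 25, 95)
control_3 = rng.normal(7.0, 25, 99)

# Simple (one-way) ANOVA across the four groups; between-groups df = 3.
f_stat, p_value = stats.f_oneway(project, control_1, control_2, control_3)
if p_value < 0.05:
    print(f"F = {f_stat:.2f}: at least one pair of groups differs")
else:
    print(f"F = {f_stat:.2f}: the groups' mean residuals are comparable")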


[Figure 1. Residual scores in reading, year 1 (a graphic depiction by grade level of the data in Table 16, column 9).]

[Figure 2. Residual scores in mathematics, year 1 (a graphic depiction by grade level of the data in Table 17, column 9).]


[Figure 3. Residual scores in reading, year 2 (a graphic depiction by grade level of the data in Table 16, column 8).]

[Figure 4. Residual scores in mathematics, year 2 (a graphic depiction by grade level of the data in Table 17, column 8).]


[Figure 5. Residual scores in reading, year 3 (a graphic depiction by grade level of the data in Table 16, column 7).]

[Figure 6. Residual scores in mathematics, year 3 (a graphic depiction by grade level of the data in Table 17, column 7).]


[Figure 7. Residual scores in reading, year 4 (a graphic depiction by grade level of the data in Table 16, column 6).]

[Figure 8. Residual scores in mathematics, year 4 (a graphic depiction by grade level of the data in Table 17, column 6).]


Table 18

Comparison of the Groups' Reading Performance on the Posttest, Years 1, 2, 3 and 4

                    --- Year 4: Posttest ANOVA ---   ----------- Group Difference(a) -----------
Grade  Group        Mean    n     df      MS      F    Year 4     Year 3     Year 2     Year 1
2      Project        —     —      —       —      —
       Control 1      —     —                            —          —          —       P = C1
       Control 2      —     —                            —          —          —       P < C2*
       Control 3      —     —                            —          —          —       P < C3*
3      Project        —     —      —       —      —
       Control 1      —     —                            —          —       P = C1     P = C1
       Control 2      —     —                            —          —       P = C2     P = C2
       Control 3      —     —                            —          —       P = C3     P = C3
4      Project        —     —      —       —      —
       Control 1      —     —                            —       P = C1     P = C1     P < C1*
       Control 2      —     —                            —       P = C2     P = C2     P < C2*
       Control 3      —     —                            —       P = C3     P = C3     P < C3*
5      Project        .8   69      3   1,097.5  1.34
       Control 1     8.6   88                         P = C1     P = C1     P = C1     P = C1
       Control 2     9.1   95                         P = C2     P = C2     P = C2     P = C2
       Control 3     7.0   99                         P = C3     P = C3     P = C3     P = C3

Note. The analyses are based on the residual scores in the reading subtests of the Stanford Achievement Tests (SAT), which were administered during the school years 1996-97 (year 1), 1997-98 (year 2), 1998-99 (year 3), and 1999-00 (year 4).
(a) P = Project group; C = Control group.
* This difference is statistically significant (p < .05).

A review of the notations in the Group Difference columns of Table 18 reveals that the statistically significant differences in the students' reading performance are confined to year 1. In that year, two of the four grade levels exhibited significant differences. Specifically, within grade level 2, control groups 2 and 3 performed significantly better than the project group; and, within grade level 4, all three control groups did likewise. By year 2, however, none of the remaining grade levels exhibited statistically significant differences. And, this situation persisted through year 4. Thus, the data revealed that the reading performance of the project group had risen since year 1, but only to a level comparable to that of the control groups. The result of the groups' mathematics performance is somewhat similar. As Table 19 illustrates, in year 1 the project group was significantly outperformed by all the control groups in each of the four grade levels. By year 2, however, this outcome persisted in only one of the three remaining grade levels (i.e., grade 4). The situation remained relatively unchanged in year 3, although the cohort of students was reduced to only two grade levels.


Finally, with only grade level 5 remaining in year 4, the project group was outperformed by one of the control groups (i.e., control group 1). Therefore, the data revealed that the mathematics performance of the project group had risen since year 1, but not quite to the level of the control groups.

In summary, the quasi-experiment was designed to use the results of the SAT tests to gauge the academic achievement of the project students. Prior to the opening of the project school, the results of the pretest revealed that the project students and the students in the control groups exhibited comparable performance in both reading and mathematics. At the conclusion of each year of the project, the results of the posttest were analyzed. The analysis involved a comparison of the students' posttest scores and the par scores, which are performance standards established through a multiple regression model. In year 1, it revealed that while nearly every grade level of the control groups met the performance standards, the project group generally did not. In year 2, however, the control groups' advantage in meeting these standards was reduced considerably. In that year the reading performance of the project group was nearly comparable to that of the control groups; and, the difference in mathematics performance had narrowed. But, the project group failed to capitalize on this improvement. By year 3, the control groups regained the advantage in meeting the standards. And, in year 4 they bolstered this position. The control groups exhibited a distinct advantage in both reading and mathematics performance.

A second analysis of the posttest results consisted of a direct comparison of the relative performances of the groups. This comparison, which is regarded as the primary analysis, rendered a more favorable picture of the project group's performance. In year 1, the analysis revealed that in reading, the control groups performed either comparably or better than the project group. And, in mathematics, the control groups performed invariably better. However, once again by year 2, the control groups' advantage over the project group was reduced considerably. In that year, the reading performance of the control groups was comparable to that of the project group. And, in mathematics, the control groups' advantage was diminished. This situation remained relatively unchanged in the subsequent two years. The project group and the control groups remained comparable in reading performance, and the control groups retained their advantage in mathematics performance. Consequently, the quasi-experiment revealed that the project group was able to overcome its disappointing performance in year 1. But, by year 4 its reading performance had only risen to a level comparable to that of the control groups, and its mathematics performance had yet to attain this level. More tellingly, during the four-year span of the evaluation, the project group never outperformed any of the control groups in either reading or mathematics.


Table 19

Comparison of the Groups' Mathematics Performance on the Posttest, Years 1, 2, 3 and 4

                    --- Year 4: Posttest ANOVA ---   ----------- Group Difference(a) -----------
Grade  Group        Mean    n     df      MS      F    Year 4     Year 3     Year 2     Year 1
2      Project        —     —      —       —      —
       Control 1      —     —                            —          —          —       P < C1*
       Control 2      —     —                            —          —          —       P < C2*
       Control 3      —     —                            —          —          —       P < C3*
3      Project        —     —      —       —      —
       Control 1      —     —                            —          —       P = C1     P < C1*
       Control 2      —     —                            —          —       P = C2     P < C2*
       Control 3      —     —                            —          —       P = C3     P < C3*
4      Project        —     —      —       —      —
       Control 1      —     —                            —       P < C1*    P < C1*    P < C1*
       Control 2      —     —                            —       P < C2*    P < C2*    P < C2*
       Control 3      —     —                            —       P < C3*    P < C3*    P < C3*
5      Project      -7.2   70      3   2,461.0  3.72*
       Control 1     6.3   91                         P < C1*    P = C1     P = C1     P < C1*
       Control 2     1.8   97                         P = C2     P = C2     P = C2     P < C2*
       Control 3     -.1   99                         P = C3     P = C3     P = C3     P < C3*

Note. The analyses are based on the residual scores in the mathematics subtests of the Stanford Achievement Tests (SAT), which were administered during the school years 1996-97 (year 1), 1997-98 (year 2), 1998-99 (year 3), and 1999-00 (year 4).
(a) P = Project group; C = Control group.
* This value/difference is statistically significant (p < .05).

Stanford Achievement Test: Project Students not in the Quasi-Experiment

The MDCPS administered the SAT test in year 1 of the evaluation to the project students in every grade level except kindergarten. As such, the group of project students involved in the quasi-experiment was limited to those in grades 2, 3, 4 and 5 in year 1. As Table 7 illustrates, the inclusion of students in kindergarten and grade 1 was precluded by their lack of pretest scores. However, these two grade levels represent a large number of students, who were exposed to the Edison model at an earlier age than the other project students. The SAT performance of these younger students could yield valuable data on the impact of the model. Consequently, an alternative method of monitoring their performance was agreed upon by the district and Edison Schools Inc. This method involved using descriptive statistics to track the longitudinal changes in the SAT scores of these two class groups. The changes were gauged by comparing them to those of another class group at the project school.


Such a comparison first became possible in year 2 of the evaluation. However, in that year and in every subsequent year of the evaluation, the SAT results which were available for comparison were very limited. This is due in part to the district's decision in year 2 to discontinue its practice of testing grade level 1. Also contributing to the situation is the replacement of the SAT 8 test with the SAT 9 in year 4. While these two tests are highly correlated, it would be inappropriate and potentially misleading to make a direct comparison of scaled scores derived from one test with those derived from the other. Consequently, the comparison of the students' performance is confined to the first three years of the evaluation, when the SAT 8 test was administered.

As previously noted, the comparison involved two "non-quasi-experimental" groups of students. In year 1 they were enrolled respectively in kindergarten and grade 1 of the project school. Unfortunately, as Table 20 illustrates, the first group lacks SAT scores in grade 1, since grade 1 was not tested in year 2. And, its SAT scores in grade 3 are not comparable, since they are based on the SAT 9 test. This leaves only the group's grade 2 scores in year 3, which renders a longitudinal comparison impossible. The second non-quasi-experimental group, however, has a complete set of SAT 8 scores spanning from grade 1 in year 1 to grade 3 in year 3. Its performance can be gauged by comparing it to that of the youngest group of students in the quasi-experiment, who were in grade 1 in the year prior to the advent of the project. To facilitate the comparison on the table, the cells containing the quasi-experimental group's scores are marked.

A review of the Reading Comprehension subtests in Table 20 reveals that the mean scaled scores of the quasi-experimental group rose steadily from grade 1 to grade 3. The group's scores were respectively 511.1, 560.3 and 592.4. Such a rise in SAT scaled scores across grade levels is typical. Indeed, the non-quasi-experimental group exhibited similar results. Its scores were respectively 488.4, 559.0 and 591.3. A comparison of these two sets of scores reveals that with the exception of grade 1, the scores are nearly equivalent. The disparity in grade 1, however, is understandable. It most likely stems from differences in the educational settings of the groups at the time. Specifically, the non-quasi-experimental group was enrolled in the project school in grade 1, but the quasi-experimental group was not. It had completed grade 1 the year prior to the advent of the project. Nevertheless, by grade 2 both groups were enrolled in the project school, and their test results in that grade and subsequently in grade 3 were quite similar. In fact, their scores differed by less than two points.

The results of the Mathematics Applications subtests display a similar pattern, although there is a greater difference in the scores in grade 3. The quasi-experimental group has nearly a 12-point advantage in that grade. Still, the prevailing pattern in the test results seems to indicate that the academic performance of the non-quasi-experimental group is essentially paralleling that of the quasi-experimental group. Despite the fact that the non-quasi-experimental group enrolled in the project school at an earlier age than the quasi-experimental group (i.e., grade 1 versus grade 2), this did not translate into an advantage in academic performance.


Table 20

SAT Performance of Project Students not in the Quasi-Experiment

                                                                        Grade Level
Subtest (SAT Edition)         School Year (Project Year)   1              2              3              4
                                                           Mean (n)       Mean (n)       Mean (n)       Mean (n)
Read. Comp. (SAT 8)           1995-96                      511.1 (90)(a)     —              —              —
Read. Comp. (SAT 8)           1996-97 (year 1)             488.4 (92)     560.3 (90)(a)     —              —
Read. Comp. (SAT 8)           1997-98 (year 2)               —(b)         559.0 (92)     592.4 (90)(a)     —
Read. Comp. (SAT 8)           1998-99 (year 3)               —            540.8 (124)    591.3 (73)     608.2 (80)(a)
Read. Comp. (SAT 9)(c)        1999-00 (year 4)               —              —            590.5 (99)     621.6 (61)
Math. Appl. (SAT 8)           1995-96                      520.0 (94)(a)     —              —              —
Math. Appl. (SAT 8)           1996-97 (year 1)             493.0 (91)     549.9 (94)(a)     —              —
Math. Appl. (SAT 8)           1997-98 (year 2)               —(b)         549.9 (91)     590.4 (94)(a)     —
Math. Appl. (SAT 8)           1998-99 (year 3)               —            543.6 (108)    578.5 (74)     614.5 (82)(a)
Math. Prob. Solv. (SAT 9)(c)  1999-00 (year 4)               —              —            583.1 (84)     620.1 (60)

Note. The data are based on the project students' scaled scores on the eighth and ninth editions of the Stanford Achievement Test (i.e., respectively SAT 8 and SAT 9). Two non-quasi-experimental groups of project students were involved in the comparison: the kindergarten class in year 1, and the grade 1 class in year 1. Read. Comp. = Reading Comprehension; Math. Appl. = Mathematics Applications; Math. Prob. Solv. = Mathematics Problem Solving.
(a) The youngest project students in the quasi-experiment were in grade 1 in the 1995-96 school year. The cells displaying this group's test results are marked with (a).
(b) The SAT test was not administered in grade level 1 after year 1.
(c) In year 4 the district replaced the SAT 8 test with the SAT 9. While these two tests are highly correlated, it would be inappropriate and potentially misleading to make a direct comparison of scaled scores derived from one test with those derived from the other.

Florida Writing Assessment

A different perspective of the project students' academic achievement was obtained from their performance on the Florida Writing Assessment (FWA). The FWA is an instrument designed to assess the students' writing proficiency. The scores on the assessment, which range from 0 to 6, are provided by trained readers. In accordance with state law, the FWA is administered each spring. At the elementary level, only grade 4 is tested. Thus, the analyses of the FWA scores encompassed only the years when the grade 4 class was part of the cohort of students. As Table 7 illustrates, these were years 1, 2 and 3 only. By year 4 (1999-00), the cohort consisted of only the grade 5 class. Incidentally, in that year the FWA was renamed and incorporated into the state's Florida Comprehensive Assessment Test (FCAT). The FWA is currently known as FCAT Writing.


Since only grade 4 in the project school was administered the FWA, the data generated by this assessment are limited. For this reason, the analysis of the results was not nearly as involved as that of the SAT test in the quasi-experiment. Still, the analysis of the FWA results did include a component of the quasi-experiment. This component was the set of control groups, which had been selected to proportionally correspond with the project group in grade level, ethnicity, participation in the free/reduced lunch program, and performance on the SAT pretest. Given these common elements, it would not be unreasonable to assume that the groups also corresponded in writing proficiency prior to the project group's exposure to the Edison model. Therefore, the mean score on the FWA of the project group's fourth grade students was gauged by comparing it to those of their counterparts in the control groups. The differences were tested for statistical significance using a simple ANOVA. The results appear in Table 21.

The second, third and fourth columns in Table 21 list the mean scores of the project and control groups on the FWA in years 1, 2 and 3 respectively. A review of these columns reveals that the scores of the project group increased steadily during this three-year period. Specifically, the group's mean score rose from 2.33 in year 1, to 2.68 in year 2, and finally to 3.08 in year 3. However, similar scores were obtained by each of the control groups. And, the ANOVAs, which were computed each year, failed to identify any statistically significant difference between the scores of the project group and the control groups. This outcome is illustrated in the Group Difference columns of the table. Accordingly, all four groups performed comparably on the FWA. The analyses did not yield any evidence that the Edison model produced an advantage in the writing proficiency of the project students.

Table 21

Comparison of the Groups' Writing Performance in Grade 4, Years 1, 2 and 3

                        FWA Mean (n)                           Group Difference(a)(b)
Group        Year 1        Year 2        Year 3        Year 1      Year 2      Year 3
Project      2.33 (156)    2.68 (132)    3.08 (82)
Control 1    2.27 (154)    2.64 (148)    3.14 (100)    P = C1      P = C1      P = C1
Control 2    2.27 (153)    2.76 (145)    3.08 (98)     P = C2      P = C2      P = C2
Control 3    2.27 (151)    2.82 (142)    3.02 (102)    P = C3      P = C3      P = C3

Note. The analyses are based on the scores in the Florida Writing Assessments (FWA), which were administered during the school years 1996-97 (year 1), 1997-98 (year 2), and 1998-99 (year 3). At the elementary level, only grade 4 was administered the FWA during those years.
(a) P = Project group; C = Control group.
(b) None of the group differences proved to be statistically significant (p < .05).


Edison Curriculum Standards

The Edison curriculum was designed to be results-oriented. Every academy in the project school has a set of over 100 explicit curriculum standards that the students must attain before they can be promoted to the next academy. The students' progress in attaining these standards represents the final source of data on their academic achievement.

To assess the students' attainment of the standards, Edison Schools Inc. has developed a standardized system. It is based on the content of the structured portfolios which are maintained on all the students enrolled in the project school. The system was implemented in stages to allow for the teachers to receive training in its use. In year 4 of the evaluation, the teachers utilized the system to assess the students' progress in a number of subjects. Table 22 displays the results of these assessments for three key subjects: language arts, reading and mathematics.

A review of Table 22 reveals that it is partitioned into the three aforementioned subjects. Under each, two columns display the results of the assessments conducted respectively in the first term and the third term of year 4. Incidentally, in the initial three years of the evaluation, the results of the fourth (and final) term were used in this analysis. However, in year 4 the project school was only able to provide the results through the third term. Table 22 depicts these results as the percentage and number (i.e., the n) of students who either met or exceeded the curriculum standards in the designated subject. By comparing the percentages in the first term with those in the third term, a partial picture of the students' academic achievement during the year is rendered. Specifically, of the 18 possible comparisons on the table (i.e., three subjects by six grade levels), 10 exhibit an increase in percentages. Some of the increases are modest, while others are considerable. An example of the latter is the percentage of grade 2 students attaining the curriculum standards in reading; it more than doubled, rising from 32.0% to 66.0%. Unfortunately, it was not possible to gauge these results, since comparative data were not available from either the control groups or the Edison students prior to their enrollment in the project school.

Only subjective judgements can be drawn from these results. The number and magnitude of the increases in the percentages may appear to be adequate for the three terms under scrutiny. However, the teachers who were in the best position to judge this did not fully concur. Their responses to item 33 in the Survey of Teachers reveal that they were not completely certain that "the students in the school [made] good progress in attaining the Edison curriculum standards." Moreover, the teachers expressed a general disappointment in the students' academic achievement during year 4. This is revealed in their responses to item 32 of the survey, which states: "The students in the school are exhibiting greater academic progress than would normally be expected." The teachers' responses yielded a mean rating of 2.3 (n = 54), which indicates that they tended to disagree with this statement. This opinion is telling, because it reflects the disappointing outcomes of both the analysis of the SAT test results and the analysis of the FWA results.


Table 22

Project Students' Attainment of the Edison Curriculum Standards, Year 4

            Percentage (n) of Students Meeting/Exceeding Standards
            ---- Language Arts ----    ------- Reading -------    ---- Mathematics ----
Grade       First Term  Third Term(a)  First Term  Third Term(a)  First Term  Third Term(a)
K           70 (93)     77 (105)       70 (90)     80 (109)       67 (104)    79 (124)
1           81 (168)    76 (158)       84 (176)    89 (183)       83 (174)    79 (166)
2           75 (146)    75 (141)       32 (60)     66 (121)       88 (171)    85 (161)
3           74 (127)    86 (164)       53 (102)    61 (115)       72 (137)    89 (176)
4           84 (173)    80 (165)       58 (122)    69 (139)       87 (187)    83 (174)
5           84 (147)    77 (105)       61 (107)    67 (122)       88 (156)    79 (124)

Note. The table is based on data provided by Edison Schools Inc.
(a) In the initial three years of the evaluation, the results of the final (i.e., fourth) term were used in this analysis; however, the results of the 1999-00 school year (year 4) were only available through the third term.

Indeed, the outcomes of these two analyses have remained essentially the same throughout the evaluation. They were not altered even by the project students' considerable progress between year 1 and year 2. Consequently, the project failed to attain the first and most important of its stated objectives: "To raise the academic achievement of all students to the highest level possible." Despite the lofty academic standards of the Edison model, the project students never once exhibited an academic advantage over their counterparts in the regular MDCPS program.

Parents’ Satisfaction and Involvement with the Project School

The second of the project's three stated objectives is: "To increase parent involvement and satisfaction to levels consistent with educational excellence." To ascertain whether progress was made in attaining this objective, data were drawn primarily from two surveys. The first was the School Climate Survey, which is administered annually by the MDCPS. This survey has three forms; they are: the Staff Form, the Student Form, and the Parent Form. The Staff Form is forwarded to all teachers in the district. The remaining two forms, however, are forwarded to only samples of students and parents. These samples are randomly selected and represent approximately 25.0% of the students in each school in the district.

In year 1 of the evaluation, the district administered the School Climate Survey developed by School Improvement Specialists. However, starting in year 2 the district administered its own version of the instrument. The district's instrument generally adheres to a Likert format. In other words, nearly all the items consist of statements about the school. The respondents indicate their degree of agreement with them by means of a five-point scale. The items, furthermore, function independently, so it is not possible to aggregate the responses to obtain a total score. The instrument, however, does have a final item that yields an overall, summary rating of the school.

The Parent Form of the School Climate Survey was used to assess the parents' satisfaction with the project school. A copy of this instrument appears in Appendix C. To gauge the parents' overall rating of the project school, their responses to the final, summary item of the Parent Form were compared to those from the four control schools. This group included the following elementary schools: Benjamin Franklin, Lakeview, Natural Bridge, and North Miami. The return rates for the Parent Form from these schools, as well as the project school, are displayed in Table 23. A review of the table reveals that the project parents' return rate in year 4 is distinctly higher than that of all but one control school. In fact, at 64.9%, it exceeds the control schools' composite return rate of 59.6% by more than 5 percentage points. In general, this level of response was constant throughout the evaluation period. Indeed, with the exception of year 1, the project parents' return rate exceeded the composite return rate of the control schools in every year.

Table 23

Return Rates for the School Climate Survey, Parent Form, Year 4

                                          Surveys
School                         Forwarded    Returned    % Returned
Edison project school             291          189         64.9
Control elementary schools:
  Benjamin Franklin               223          120         53.8
  Lakeview                        134           82         61.2
  Natural Bridge                  255          120         47.1
  North Miami                     331          240         72.5
All control schools               943          562         59.6

The parents' responses to the summary item of the Parent Form in years 2, 3 and 4 are depicted in Table 24. The responses are based on a letter grade scale which ranges from "A" to "F." The table displays the mean ratings, which were computed by assigning the following values to the letter grades: A = 4, B = 3, C = 2, D = 1, and F = 0. The ratings from year 4 are listed in the final column of the table. A review of this column reveals that the project school's rating of 3.4 is notably higher than the control schools' composite rating of 3.1. The difference, however, was not tested for statistical significance. Since it is based on a single item, such a test was deemed inappropriate. Nevertheless, given the degree of the difference, it is reasonable to assume that it accurately reflects a higher level of satisfaction on the part of the project parents. This outcome, furthermore, mirrors those of years 2 and 3.
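
The mean-rating computation itself is a direct mapping; a minimal sketch in Python, with a hypothetical set of sample responses:

# Letter grades on the summary item are mapped to numeric values and averaged.
GRADE_VALUES = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def mean_rating(responses: list) -> float:
    """responses: list of letter grades, e.g. ["A", "B", "C"]."""
    return sum(GRADE_VALUES[g] for g in responses) / len(responses)

print(round(mean_rating(["A", "A", "B", "B", "C"]), 1))  # 3.2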

The second survey, which was used to assess the parents' satisfaction with the project school, was the CSMpactSM for Schools. This survey is administered annually in every Edison project school in the nation by Harris Interactive Inc. (formerly the Gordon S. Black Corporation). Like the School Climate Survey, the CSMpactSM for Schools consists of three forms which target separately the school staff, the students and their parents. Each form includes items which address both the respondents' familiarity and level of satisfaction with different school experiences. The analysis of the responses includes a multiple regression analysis to determine the relative importance of each experience to the general level of satisfaction. The results are used to determine a score dubbed the Impact Index, which ranges from a high of 10 to a low of zero. A copy of the parent form of the survey, the Parent Questionnaire, appears in Appendix D.
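
Harris Interactive's Impact Index method is proprietary and not detailed in this report; as a generic stand-in, the sketch below fits an ordinary least-squares model and treats the coefficients as relative-importance weights. All data, variable names and magnitudes are hypothetical:

import numpy as np

rng = np.random.default_rng(1)
n_parents = 200

# Ratings (0-10) of three hypothetical school experiences per respondent.
experiences = rng.uniform(0, 10, size=(n_parents, 3))
# Overall satisfaction, loosely driven by the experiences plus noise.
overall = (0.5 * experiences[:, 0] + 0.3 * experiences[:, 1]
           + 0.2 * experiences[:, 2] + rng.normal(0, 1, n_parents))

# Least-squares fit with an intercept column; the fitted coefficients
# indicate each experience's relative importance to overall satisfaction.
design = np.column_stack([np.ones(n_parents), experiences])
coef, *_ = np.linalg.lstsq(design, overall, rcond=None)
print("relative importance:", np.round(coef[1:], 2))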

Table 24

School Climate Survey, Parent Form: Comparison of General Satisfaction with the School, Years 2, 3 and 4

Item 35. Students get grades A, B, C, D, or F for the quality of their school work. What overall grade would you give to your child's school?

                                     Mean Rating (n)(a)
School                       Year 2         Year 3         Year 4
Edison project school        3.4 (151)      3.4 (175)      3.4 (174)
Control elementary schools:
  Benjamin Franklin          3.0 (120)      3.2 (80)       3.1 (111)
  Lakeview                   3.4 (52)       3.3 (40)       3.2 (72)
  Natural Bridge             3.2 (118)      3.3 (78)       3.1 (100)
  North Miami                3.0 (182)      3.2 (160)      3.1 (195)
All control schools          3.1 (472)      3.2 (358)      3.1 (478)

Note. In year 1 (1996-97) the parents' satisfaction with the school was assessed with a different version of the School Climate Survey, which did not include the above item.
(a) The mean rating was computed by assigning the following values to the response options: A = 4, B = 3, C = 2, D = 1, and F = 0.


Table 25 displays the mean ratings of the project school on the summary items of the Parent Questionnaire. The table includes, in order, the ratings for years 1, 2, 3 and 4. The survey's return rate in year 4 was 32.6%. A review of the ratings in that year reveals that they are quite similar to those in the previous years. Specifically, the ratings appear to be quite high, since they tend to cluster near the top of the scale's range. Furthermore, as was noted in the report by Harris Interactive Inc., the ratings of the project school in year 4 exceeded the composite ratings of all the Edison project schools in every category (Harris Interactive Inc., 2000).

Such a comparison may not be entirely appropriate, given the diagnostic nature of this survey instrument. Nevertheless, the results of the comparison are consistent with the results of other efforts to gauge the survey ratings. Given this consistency, it would appear that the ratings accurately reflect a high level of satisfaction on the part of the project parents. Consequently, the outcome of the Parent Questionnaire readily concurs with that of the Parent Form. In both surveys, the parents' responses rendered a very favorable picture of the project school in year 4.

With regard to the parents' involvement in the project school, the primary source of data was the School Climate Survey. Specifically, item 3 of the Parent Form asks: "How many school-related activities have you attended this year? Include PTA meetings, Open House, parent-teacher conferences, meetings, theatrical performances, etc." Unlike nearly all the other items in the Parent Form, item 3 has been part of both versions of the survey used to date. So, the parents' responses are available for all four years of the evaluation. Also unlike the other items, the response options of item 3 are not based on the five-point scale, which was previously described. The options consist of the number of activities. To gauge the numbers reported by the parents from the project school, they were compared to those of the parents from the control schools. The comparisons for years 1, 2, 3 and 4 are illustrated in Table 26. It should be noted that despite the inclusion of year 1 in the table, that year's survey responses were deemed inconclusive due to the low return rate. Table 26 displays the percentage and number of respondents (i.e., the n) for each of the response options. A review of the table reveals that there was a slight decline in the number of school activities attended by the project parents. This can be readily perceived by comparing the changes in the percentages reporting attendance at six or more activities versus the percentages reporting none. Over the years, the former has declined while the latter has increased. Still, the outcome of the comparison in year 4 did not differ from that of previous years. The project parents reported a greater degree of involvement in their school than did their counterparts in the control schools. Specifically, 87.0% of the project parents reported attending one or more school activities, while only 79.2% of their counterparts reported likewise. The difference in these values, however, was not tested for statistical significance, since they were drawn from a single survey item.


Table 25

Parent Questionnaire: General Satisfaction with the School, Years 1, 2, 3 and 4

                                                           Mean Rating (n)(a)
Questionnaire Item: Overall rating of . . .     Year 1       Year 2        Year 3       Year 4
The school's equipment and facilities           8.6 (81)     9.0 (526)     8.5 (252)    8.5 (346)
Your child's school bus                         5.0 (7)      7.9 (84)      7.1 (32)     7.1 (58)
The school's communications/involvement         8.1 (80)     8.7 (538)     8.4 (258)    8.3 (343)
Your child's teacher                            8.9 (78)     9.1 (528)     8.8 (253)    9.0 (342)
The Board of Education                          7.3 (–)(b)   –(c)          –(c)         –(c)
The superintendent/central office               7.2 (–)(b)   –(c)          –(c)         –(c)
The principal/school administration             8.1 (67)     8.8 (513)     8.8 (242)    8.8 (319)
The school's curriculum/training                7.2 (63)     8.3 (470)     8.0 (229)    8.0 (298)
The feedback on your child's performance        8.5 (74)     8.7 (502)     8.4 (240)    8.5 (320)
Your satisfaction with your child's school      8.3 (79)     8.9 (–)(b)    8.5 (241)    8.5 (326)

Note. The Parent Questionnaire is the parent form of the CSMpactSM for Schools. The table is based on data from the Gordon S. Black Corporation's reports on school years 1996-97 (year 1), 1997-98 (year 2), 1998-99 (year 3), and 1999-00 (year 4).
(a) The ratings are based on the Impact Index, which ranges from a high of 10 to a low of zero.
(b) This n value was not available.
(c) This item was not addressed by the survey after year 1.

Nevertheless, the extent of the project parents' involvement is corroborated by project records, which document such activities as attendance at meetings and participation in workshops. Consequently, the analyses of the data yielded evidence that attested to both the parents' comparatively greater involvement and satisfaction with the project school. This outcome was the same in every year of the evaluation period with the exception of year 1, when the available data was deemed inconclusive. Thus, it would appear that the project successfully attained its objective of increasing "parent involvement and satisfaction to levels consistent with educational excellence."

Climate of the Project School

The third and final stated objective of the project is: "To improve school climate in the many ways necessary to foster greater learning." To ascertain whether progress was made in attaining this objective, data were drawn from both archival sources and from the teachers' responses to the Staff Form of the School Climate Survey. A copy of this instrument appears in Appendix E.


Table 26

School Climate Survey, Parent Form: Comparison of Involvement in the School, Years 1, 2, 3 and 4

Item: 3. How many school-related activities have you attended this year? Include PTA meetings, Open House, parent-teacher conferences, meetings, theatrical performances, etc.

                                           Percentage of Respondents (n)
Year  School                      6 plus       4-5          2-3           1            0
1     Edison project school       10.6 (5)     12.8 (6)     51.1 (24)     14.9 (7)     10.6 (5)
      All control schools(a)       9.1 (44)    18.4 (89)    43.1 (208)    12.6 (61)    16.8 (81)
2     Edison project school       10.6 (15)    22.0 (31)    42.6 (60)     13.5 (19)    11.3 (16)
      All control schools(a)       8.1 (37)    17.3 (79)    40.7 (186)    16.4 (75)    17.5 (80)
3     Edison project school       10.4 (17)    22.6 (37)    45.7 (75)      9.1 (15)    12.2 (20)
      All control schools(a)       8.1 (30)    17.3 (64)    42.3 (156)    16.0 (59)    16.3 (60)
4     Edison project school        8.0 (13)    21.0 (34)    43.2 (70)     14.8 (24)    13.0 (21)
      All control schools(a)       8.8 (40)    13.1 (59)    40.9 (185)    16.4 (74)    20.8 (94)

Note. Item 3 has remained unchanged, despite the use of a different version of the School Climate Survey in year 1 (1996-97).
(a) The elementary schools used as a control were: Benjamin Franklin, Lakeview, Natural Bridge and North Miami.

The Staff Form, as previously mentioned, was forwarded to all teachers in the district. The return rates in year 4 for the project teachers and the teachers in the four control schools are displayed in Table 27. A review of this table reveals that the project teachers' return rate of 64.7% is lower than that of all but one control school. In fact, the rate is nearly 12 percentage points below the control schools' composite return rate of 76.5%. Only in year 1 was the project teachers' return rate lower.

The teachers' responses to the summary item of the Staff Form are depicted in Table 28. This table is identical in format to Table 24, which depicts the parents' responses to the same item. A review of Table 28 reveals that the teachers' responses in year 4 yielded a mean rating of 2.3 for the project school, which is clearly lower than the control schools' composite rating of 2.8. This outcome is surprising, since it differs from the results of the parents' ratings of the school.


Table 27

Return Rates for the School Climate Survey, Staff Form, Year 4

                                          Surveys
School                         Forwarded    Returned    % Returned
Edison project school              68           44         64.7
Control elementary schools:
  Benjamin Franklin                54           43         79.6
  Lakeview                         41           32         78.0
  Natural Bridge                   71           45         63.4
  North Miami                      81           69         85.2
All control schools               247          189         76.5

It does, however, concur with the teachers' previous ratings of the school in year 3. Accordingly, it would appear that in the final years of the evaluation, the teachers were somewhat disenchanted with the project school. Once again, the difference in the ratings in year 4 was not tested for statistical significance, since they were derived from a single survey item. Nevertheless, given the magnitude of the difference, it is reasonable to assume that it accurately reflects an advantage by the control schools in school climate.

Table 28

School Climate Survey, Staff Form: Comparison of General Satisfaction with the School, Years 2, 3 and 4

Item 35. Students get grades A, B, C, D, or F for the quality of their school work. What overall grade would you give to this school?

                                     Mean Rating (n)(a)
School                       Year 2         Year 3         Year 4
Edison project school        3.1 (90)       2.2 (55)       2.3 (41)
Control elementary schools:
  Benjamin Franklin          3.0 (72)       3.1 (27)       3.0 (35)
  Lakeview                   3.1 (19)       2.4 (26)       2.8 (31)
  Natural Bridge             2.8 (127)      3.0 (43)       2.8 (44)
  North Miami                2.7 (159)      2.8 (66)       2.8 (66)
All control schools          2.8 (377)      2.8 (162)      2.8 (176)

Note. In year 1 (1996-97) the staff's satisfaction with the school was assessed with a different version of the School Climate Survey, which did not include the above item.
(a) The mean rating was computed by assigning the following values to the response options: A = 4, B = 3, C = 2, D = 1, and F = 0.

The project teachers' responses to the summary item of the Staff Form were supplemented by archival data on certain factors that reflect the climate of a school. These factors include the students' attendance rate, their index of mobility, the number of indoor and outdoor suspensions, and the teacher-student ratio. Table 29 presents the values of each factor for the project school and the four control schools during year 4 of the evaluation. The table also includes the composite value of each factor for the four control schools. A review of the table reveals that the project school compares favorably in three of the five factors. The school had a lower mobility index, and lower numbers of both indoor and outdoor suspensions. But, it also had a lower percentage of attendance, and a higher teacher-student ratio. Consequently, the comparison yielded mixed results on the school climate. This outcome is somewhat at odds with the project teachers' survey responses. However, it should be emphasized that the aforementioned factors are only indirect indicators of school climate. The most direct evidence of the project school's climate is represented by the teachers' responses. And, they indicate that the climate of the project school waned after year 2. Thus, it appears that the project failed to attain its objective of improving "school climate in the many ways necessary to foster greater learning."

Table 29

Comparison of Factors that Reflect School Climate, Year 4

                               % of           Mobility      Suspensions       Teacher-Student
School                         Attendance(a)  Index(b)     Indoor   Outdoor   Ratio
Edison project school             93.6           17           0        31        1:17
Control elementary schools:
  Benjamin Franklin               96.4           30           1        10        1:17
  Lakeview                        95.7           31           0        12        1:14
  Natural Bridge                  95.5           36           0        84        1:16
  North Miami                     96.4           31          93        79        1:17
Mean of control schools           96.0           32(c)       24(c)     46(c)     1:16

Note. The table is based on data from the 1999-00 school year (i.e., year 4 of the evaluation).
(a) The percentage of attendance for the school is based on the sum total of the days attended by all the students during the year divided by the sum total of days they were enrolled.
(b) The mobility index is actually a percentage based on the number of students who entered or withdrew from the school during the year (without regard to how many times an individual student did either) divided by the cumulative number of students who were enrolled during the same period.
(c) This figure is rounded to the nearest whole number.
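
The two rate definitions in the note to Table 29 translate directly into code; the figures in the usage lines below are hypothetical:

# Attendance percentage and mobility index, as defined in the table note.
def attendance_percentage(total_days_attended: int, total_days_enrolled: int) -> float:
    """Sum of days attended by all students over sum of days enrolled."""
    return 100.0 * total_days_attended / total_days_enrolled

def mobility_index(movers: int, cumulative_enrollment: int) -> float:
    """Students who entered or withdrew (each counted once) over the
    cumulative enrollment for the same period, as a percentage."""
    return 100.0 * movers / cumulative_enrollment

print(round(attendance_percentage(140_000, 149_600), 1))  # 93.6
print(round(mobility_index(150, 880), 1))                 # 17.0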


CONCLUSIONS

The Miami-Dade County Public Schools (MDCPS) is currently experimenting with privatization. The district has contracted with Edison Schools Inc. (formerly The Edison Project) to manage Henry E.S. Reeves Elementary School for a period of five years. Edison Schools Inc. is a for-profit management company involved in the privatization of public schools. The company markets a unique model of education and supplementary services. It consists of an eclectic mixture of such elements as an extended school year, an interdisciplinary curriculum, and the use of modern technology. The contract calls for the company to employ this model in managing the project school from August, 1996 until June, 2001. The contract also calls for an evaluation of the project.

The evaluation was conducted by the Office of Evaluation and Research of the MDCPS in conjunction with Edison Schools Inc. The evaluation spanned four school years, from 1996-97 through 1999-00. The intent of the evaluation was to gauge the impact of the Edison model on the project school. Four general areas were addressed by the evaluation. The first was the actual implementation of the Edison model in the school. The remaining three areas were the stated objectives of the project. They are:

1. To raise the academic achievement of all students to the highest level possible.
2. To increase parent involvement and satisfaction to levels consistent with educational excellence.
3. To improve school climate in the many ways necessary to foster greater learning.

The specific focus of the evaluation was defined by a series of questions, which were primarily derived from the stated objectives. These questions can now be addressed.

1. Have the basic elements of the Edison model been implemented in the project school?

The Edison model is described in the company's book, Partnership School Design. A review of this document resulted in the identification of 21 elements which can be regarded as basic to the model. To ascertain the extent to which these elements were implemented in the project school, data were drawn each year from: (a) a survey of the teachers, (b) interviews of the principal, and (c) classroom observations. Since the data derived from these sources are qualitative in nature, the analysis was confined to determining the degree of their concurrence on the extent of the model's implementation. The results revealed that the sources failed to fully concur on the extent of the model's implementation in year 4 of the project. The data derived from both the survey of teachers and the classroom observations generally indicated that all the basic elements of the model were implemented. The principal, however, questioned the implementation of one element. Nevertheless, it must be acknowledged that the difference between the implementation of all the elements and nearly all may be a moot point. Although there may be questions regarding the status of one of the elements in year 4, the model on the whole was clearly implemented. And, this has been the basic outcome in every year of the evaluation, with the exception of year 1. In that year, the interim evaluation report concluded that the model was "not quite fully implemented." But by year 2 it was, and it has essentially remained so in the subsequent years of the evaluation.

2. Are the students in the project school performing better on the Stanford Achievement Test (SAT) than would be expected, if these students were attending other MDCPS schools (Objective 1)?

A quasi-experiment was conducted to gauge the project students' performance on the SAT test. Its design is a variation of the nonequivalent control group design. It essentially involved using the pre and posttest scores to compare the performance of students who were attending the project school (i.e., the experimental group) with that of three groups of students who were not (i.e., the control groups). The results of the pretest revealed that the project group and the control groups exhibited comparable performance in both reading and mathematics in the year prior to the inception of the project. At the conclusion of each year of the project, the results of the posttest were analyzed. The primary analysis consisted of a direct comparison of the relative performances of the groups. In year 1, the analysis revealed that in reading, the control groups performed either comparably or better than the project group. And, in mathematics, the control groups performed invariably better. However, by year 2, the control groups' advantage over the project group was reduced considerably. In that year, the reading performance of the control groups was comparable to that of the project group; and, in mathematics, the control groups' advantage was diminished. Unfortunately, the project group failed to capitalize on this improvement. And, the situation remained relatively unchanged in the subsequent two years. Consequently, the analysis of the SAT test results revealed that the project students made substantial gains since their disappointing performance in year 1. But, by year 4, they did not perform better on the SAT test than they would have, if they had been attending other MDCPS schools. Their reading performance was only comparable, and their mathematics performance had yet to attain this level. Indeed, the project students never once outperformed their counterparts in the control groups on the SAT test.

3. Are the students in the project school performing better on the Florida Writing Assessment (FWA) than would be expected, if these students were attending other MDCPS schools (Objective 1)?

To assess the students' writing proficiency, the MDCPS administers the FWA each spring to students in grades 4, 8 and 10. Since only grade 4 in the project school is administered this assessment, the data generated by it are limited. For this reason, the analysis of the results was not nearly as involved as that of the SAT test in the quasi-experiment. Still, the analysis of the FWA results did include a component of the quasi-experiment. The three control groups from the quasi-experiment were used to gauge the performance of the project students on the FWA. The analyses for years 1, 2 and 3 of the evaluation revealed no differences between the performance of the fourth grade students in the project school and their counterparts in the control groups. This analysis was not possible in year 4, since the project group by then consisted solely of the grade 5 class. Thus, the analyses based on the FWA results were concluded in year 3. And, they revealed that the project students did not perform better on the FWA than they would have, if they had been attending other MDCPS schools. Their performance was only comparable.

4. Are the students in the project school making good progress in meeting the curriculum standards of the Edison model (Objective 1)?

The Edison curriculum was designed to be results-oriented. Every academy in the project school has a set of over 100 explicit curriculum standards that the students must attain before they can be promoted to the next academy. To assess the students’ attainment of these standards, Edison Schools Inc. has developed a standardized system. The teachers utilize this system to assess the students’ progress in various subjects. The evaluation focused on the students’ attainment of the standards in three key subjects: language arts, reading and mathematics. A review of the results in the initial three terms of year 4 reveals that the students generally made progress in attaining the standards of these three subjects. However, it is not possible to gauge the degree of their progress, since comparative data were not available from either a control group or the Edison students prior to their enrollment in the project school. Thus, the results can only be judged subjectively. The degree of progress appears to be adequate, but the project teachers did not fully concur. Their survey responses reveal that, as a group, they were not completely certain the students made good progress in year 4 in attaining the curriculum standards of the Edison model.
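As a rough illustration of how attainment of the standards might be tallied term by term, consider the sketch below. The subjects follow those named above, but the counts of standards and the term structure are hypothetical placeholders; the report does not reproduce the underlying assessment records.

```python
# Hypothetical sketch of summarizing curriculum-standard attainment
# by term. Counts are illustrative placeholders only.

# (term, subject) -> (standards attained, standards assessed)
attainment = {
    (1, "language arts"): (42, 110),
    (1, "reading"):       (38, 105),
    (1, "mathematics"):   (40, 100),
    (2, "language arts"): (61, 110),
    (2, "reading"):       (55, 105),
    (2, "mathematics"):   (57, 100),
    (3, "language arts"): (78, 110),
    (3, "reading"):       (71, 105),
    (3, "mathematics"):   (74, 100),
}

# Report the percentage of standards attained in each subject, term
# by term, to show whether progress is being made.
for (term, subject), (attained, total) in sorted(attainment.items()):
    pct = 100.0 * attained / total
    print(f"term {term}, {subject}: {attained}/{total} ({pct:.0f}%) attained")
```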

5. Are the parents of students in the project school more satisfied with their children’s education than parents of students attending comparable MDCPS schools (Objective 2)?

To assess the project parents’ satisfaction with their children’s education, data were drawn primarily from two surveys. The first was the School Climate Survey, which is administered annually by the MDCPS. This survey has three forms: the Staff Form, the Student Form, and the Parent Form. The latter was used to assess the project parents’ satisfaction. Their responses were gauged by comparing them to those of the parents from four control schools. The comparison revealed that in year 4, the project parents expressed a higher level of satisfaction than their counterparts from the control schools. The second survey, which was used to assess the parents’ satisfaction, is the CSMpactSM for Schools. The parent form of this survey, the Parent Questionnaire, is administered annually in every Edison project school in the nation. In this district, the results of the Parent Questionnaire in year 4 concur with those of the Parent Form. In both surveys, the parents’ responses render a very favorable picture of the project school. Thus, the evidence indicates that in year 4 of the evaluation, the parents of students in the project school were more satisfied with their children’s education than were the parents of students attending comparable MDCPS schools. This favorable outcome was the same in every year of the evaluation period with the exception of year 1, when the available data were inconclusive.
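A minimal sketch of this kind of controlled survey comparison follows, assuming Likert-scale satisfaction ratings pooled across the four control schools. The ratings and the choice of Welch’s t-test are illustrative assumptions, not the surveys’ actual data or scoring.

```python
# Hypothetical sketch of comparing parent satisfaction ratings at the
# project school with those pooled from four control schools.
# All ratings are illustrative placeholders.
from scipy import stats

project_ratings = [4, 5, 4, 5, 5, 4, 5, 4, 5, 5]        # project school parents
control_ratings = [3, 4, 3, 4, 4, 3, 5, 3, 4, 3, 4, 4]  # control schools, pooled

# Welch's t-test does not assume equal variances across the groups.
t_stat, p_value = stats.ttest_ind(project_ratings, control_ratings,
                                  equal_var=False)
print(f"project mean = {sum(project_ratings) / len(project_ratings):.2f}")
print(f"control mean = {sum(control_ratings) / len(control_ratings):.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```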

6. Are the parents of students in the project school more involved in their children’s education than the parents of students attending comparable MDCPS schools (Objective 2)?

Data on the extent of the project parents’ involvement in their children’s education were drawn from the responses to the Parent Form of the School Climate Survey. Additionally, the evaluation included data obtained from project records that detail the parents’ participation in school-related activities. The project parents’ survey responses were gauged by comparing them to those of the parents from the four control schools. The comparison revealed that in year 4, the project parents reported a greater degree of involvement in school-related activities than their counterparts from the control schools. The project parents’ involvement, furthermore, was corroborated by project records. Thus, the evidence indicates that in year 4, the parents of students in the project school were more involved in their children’s education than were the parents of students attending comparable MDCPS schools. Once again, this favorable outcome was the same in every year of the evaluation period with the exception of year 1, when the available data were inconclusive.
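The corroboration of survey responses with project records might proceed along the lines of the sketch below, which sets self-reported involvement beside counts of activities attended. All identifiers and values are hypothetical; the project’s actual records are not reproduced in this report.

```python
# Hypothetical sketch of corroborating survey-reported involvement
# with participation records. Field names and values are illustrative.
import pandas as pd

# Survey: each parent's self-reported involvement (1 = low, 5 = high).
survey = pd.DataFrame({
    "parent_id":   [101, 102, 103, 104],
    "self_report": [5, 4, 5, 3],
})

# Records: number of school-related activities each parent attended.
records = pd.DataFrame({
    "parent_id":           [101, 102, 103, 104],
    "activities_attended": [12, 8, 10, 2],
})

# Merge the two sources and check whether they tell the same story.
merged = survey.merge(records, on="parent_id")
correlation = merged["self_report"].corr(merged["activities_attended"])
print(merged)
print(f"correlation between self-report and records: {correlation:.2f}")
```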

7. Is the school climate of the project school superior to that of comparable MDCPS schools (Objective 3)?

Data on the climate of the project school were drawn from the teachers’ responses to the Staff Form of the School Climate Survey, and from archival files on several relevant factors. The latter included the students’ attendance rate, their index of mobility, the number of indoor and outdoor suspensions, and the teacher-student ratio. To gauge both the teachers’ responses and the values of the aforementioned factors from the project school, they were compared to those from the four control schools. The comparison of the factors yielded mixed results. However, the teachers’ responses, which represent the more direct evidence, indicated that the climate of the project school in year 4 was wanting. It did not compare favorably with the control schools. This outcome is surprising, since it differs from the perceptions of the parents. But, it does concur with the teachers’ previous ratings of the school in year 3. Not since year 2 have the results of this comparison favored the project school. Thus, it appears that the climate of the project school waned in the final years of the evaluation period. It was not superior to that of comparable MDCPS schools.
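The archival-factor comparison could be organized as in the following sketch, which sets each climate factor from the project school against the mean of the four control schools. Every name and value is a hypothetical placeholder rather than the evaluation’s actual archival data.

```python
# Hypothetical sketch of the archival-factor comparison for school
# climate. All values are illustrative placeholders.

# factor name -> (project school value, values for the four control schools)
factors = {
    "attendance rate (%)":  (94.1, [95.0, 94.5, 93.8, 95.2]),
    "mobility index":       (28.0, [24.0, 26.0, 25.0, 27.0]),
    "outdoor suspensions":  (35.0, [22.0, 30.0, 25.0, 28.0]),
    "students per teacher": (18.0, [20.0, 19.0, 21.0, 20.0]),
}

# Print each factor side by side; whether a difference favors the
# project school depends on the factor's direction (e.g., higher
# attendance is favorable, more suspensions is not).
for name, (project_value, control_values) in factors.items():
    control_mean = sum(control_values) / len(control_values)
    print(f"{name}: project = {project_value:.1f}, "
          f"control mean = {control_mean:.1f}")
```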

In summary, the evaluation encompassed the project school’s initial four years of operation: 1996-97 (year 1), 1997-98 (year 2), 1998-99 (year 3), and 1999-00 (year 4). The evaluation focused primarily on four areas. The first was the implementation of the Edison model at the project school. The remaining areas were the attainment of the project’s three stated objectives. With regard to the project’s implementation, the data revealed that the model on the whole was implemented in year 4. This has been the basic outcome in every year of the evaluation since year 2.

Data on the project students’ academic achievement were drawn primarily from controlled comparisons of their performance on standardized tests. The results revealed that although the project students overcame a disappointing performance in year 1, their performance in years 2 through 4 remained relatively unchanged. At best, their test scores in both reading and mathematics were only comparable to those of their counterparts in the regular MDCPS program. The project thus failed to attain the first and most important of its stated objectives: “To raise the academic achievement of all students to the highest level possible.”

The project parents’ involvement and satisfaction with their children’s education were assessed primarily through controlled comparisons of survey responses. The results in year 4 revealed that, as in previous years, the project parents were comparatively more involved and satisfied with their children’s education. Accordingly, the project successfully attained its second stated objective: “To increase parent involvement and satisfaction to levels consistent with educational excellence.”

Finally, controlled comparisons of both survey responses and archival data were used to assess the climate of the project school. The results revealed that the project school’s climate in year 4 did not compare favorably with that of comparable MDCPS schools. Indeed, the climate of the project school appears to have waned after year 2. Therefore, the project failed to attain its third stated objective: “To improve school climate in the many ways necessary to foster greater learning.”

On the whole, the outcome of the evaluation was not encouraging. While the parents clearly retained their enthusiasm for the school, the same could not be said for the teachers. More importantly, the project students ultimately failed to capitalize on the academic gains that they made between year 1 and year 2. At that time, it was speculated that if their progress continued at the same rate, they would surpass the academic performance of their counterparts in the regular MDCPS program. Unfortunately, this did not come to pass. Despite the lofty academic standards of the Edison model, the project students never once exhibited an academic advantage over the students in the regular MDCPS program. Consequently, the evaluation failed to produce any evidence that the Edison model represents a superior educational program.


The implications of this conclusion, however, are rather circumscribed. Despite the sophistication of the evaluation’s design, it was admittedly attenuated by the inclusion of only a single project school. Originally, the intent was to base the evaluation on two schools. Unfortunately, only one was subsequently established in the district. Had more than one project school been available, it would have facilitated the control of certain factors that may have contributed to the unfavorable results of the evaluation. These factors included the turnover of personnel at the project school and conflicts among its staff. And, while such disruptions to a school’s operation may not be uncommon, they seemed to be particularly pronounced at the project school during the evaluation period. As such, they cannot be disregarded when considering the generalizability of the evaluation’s results. And, to be sure, this evaluation has limited generalizability. It involved a single elementary school in an economically depressed neighborhood attended almost exclusively by students of a single race. Given these constraints, it seems clear that this evaluation cannot provide a definitive answer on the efficacy of the Edison model. Such an answer can only result from additional controlled studies. Ideally, these studies should be conducted by independent third parties in a variety of school settings. In this manner, it might be possible to either verify or refute the results of the current evaluation. If the outcome is the latter, perhaps the accumulated research can eventually identify the school setting where the Edison model is most effective. But until this is known, it is recommended that the MDCPS give careful consideration before committing additional resources to the Edison model.

