boe packet

Upload: fnoschese

Post on 29-May-2018

  • 8/9/2019 BOE Packet

    1/27

    REGENTS PHYSICS

Will the rigor of the course be compromised?

No. Studies show that students need time to move from their naïve conceptions about science to more expert-like views. In addition, students who study fewer topics in greater depth in high school have higher success in college science than their peers who had a greater breadth of coverage. Instead of being asked to demonstrate their knowledge about a wide range of topics at a superficial level (like the Regents exam), students must show they have a deep understanding of the essentials.

[BLUE Attached: Depth vs. Breadth study]

    What is essential to prepare students for college?

The College Board recently identified the knowledge and skills students need to be successful in college science. These standards adhere to the "more depth, less breadth" approach. I can attest that the physics standards are pedagogically sound: they draw from a large volume of research on how students learn physics.

[PINK Attached: College Board Standards for College Success]

How will students learn physics to obtain a deep understanding of the essentials?

We currently teach physics using Modeling Instruction whenever possible. Frank was trained in Modeling Instruction at a 2.5-week summer institute at Buffalo State in 2004. Jim attended 2.5 weeks of summer training at Arizona State in 2005. Modeling Instruction embraced Tony Wagner's 7 essential skills long before he wrote the book! Instead of relying on lectures and textbooks, Modeling Instruction has students design experiments to determine physics concepts and relationships themselves. Students must reconcile the results of their experiments with their own (often naïve) views of how the world works. They then present their results to the class so we can come to a consensus. This entire process takes much more time than a traditional lecture-based approach, but the depth of student understanding and retention is much greater.

    [GREEN Attached: Modeling Instruction article]

    What types of alternate assessments might students do?

There are a variety of ways students can be assessed besides a paper-and-pencil exam. I've included a list of possible projects/assessments that can be used as a culminating activity in a modeling cycle or even given at the start of a unit as motivation to drive instruction forward. Again, implementing experiences like these for students takes more time than traditional instruction.

    [YELLOW Attached: Sample assessments]

How will you know students are performing at the same level without the exam?

I am implementing a criterion-based grading system. Learning goals for each topic are clearly laid out, and students are graded using a 1-4 rubric for each goal. Every HW, class discussion, quiz, project, etc. can be used as evidence to see how well students are reaching these learning goals. Students are evaluated on eventual mastery, and any initial missteps while exploring the content are not counted. This system ensures I know (and students know) whether students have the essential skills and knowledge needed for success in college science.

[WHITE Attached: Sample criterion-based grading and rubrics used by students]
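The mechanics of the criterion-based system described above can be sketched in code. This is a hypothetical illustration only (the class name, goal names, and the "most recent score" aggregation rule are assumptions, not the packet's actual gradebook), but it shows the key idea: each learning goal accumulates 1-4 rubric evidence, and the goal's grade reflects eventual mastery, so early missteps are ignored.

```python
# Hypothetical sketch of a criterion-based gradebook (names and the
# aggregation rule are illustrative assumptions, not the actual system).
from collections import defaultdict

class Gradebook:
    def __init__(self):
        # learning goal -> chronological list of rubric scores (evidence
        # from HW, class discussions, quizzes, projects, etc.)
        self.evidence = defaultdict(list)

    def record(self, goal, score):
        # All evidence is scored on the same 1-4 rubric.
        if not 1 <= score <= 4:
            raise ValueError("rubric scores run from 1 to 4")
        self.evidence[goal].append(score)

    def mastery(self, goal):
        # Eventual mastery: only the most recent evidence counts, so
        # initial missteps while exploring the content are not penalized.
        scores = self.evidence[goal]
        return scores[-1] if scores else None

gb = Gradebook()
gb.record("constant velocity", 2)   # early quiz: partial understanding
gb.record("constant velocity", 4)   # later project: full mastery
print(gb.mastery("constant velocity"))  # -> 4
```

Other aggregation rules (e.g., best score, or a decaying average) fit the same structure; the essential design choice is that the grade tracks where the student ends up, not where they started.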


Depth Versus Breadth: How Content Coverage in High School Science Courses Relates to Later Success in College Science Coursework

    MARC S. SCHWARTZ

    University of Texas, Arlington, TX 76019, USA

    PHILIP M. SADLER, GERHARD SONNERT

    Science Education Department, Harvard-Smithsonian Center for Astrophysics,

    Cambridge, MA 02138, USA

    ROBERT H. TAI

    Curriculum, Instruction, and Special Education Department, Curry School of Education,

    University of Virginia, Charlottesville, VA 22904, USA

    Received 5 March 2008; revised 1 September 2008, 10 October 2008;

    accepted 17 October 2008

    DOI 10.1002/sce.20328

    Published online in Wiley InterScience (www.interscience.wiley.com).

ABSTRACT: This study relates the performance of college students in introductory science courses to the amount of content covered in their high school science courses. The sample includes 8310 students in introductory biology, chemistry, or physics courses in 55 randomly chosen U.S. colleges and universities. Students who reported covering at least 1 major topic in depth, for a month or longer, in high school were found to earn higher grades in college science than did students who reported no coverage in depth. Students reporting breadth in their high school course, covering all major topics, did not appear to have any advantage in chemistry or physics and a significant disadvantage in biology. Care was taken to account for significant covariates: socioeconomic variables, English and mathematics proficiency, and rigor of their preparatory high school science course. Alternative operationalizations of depth and breadth variables result in very similar findings. We conclude that teachers should use their judgment to reduce coverage in high school science courses and aim for mastery by extending at least 1 topic in depth over an extended period of time. © 2008 Wiley Periodicals, Inc. Sci Ed 1-29, 2008

Correspondence to: Marc S. Schwartz; e-mail: [email protected]

Contract grant sponsor: Interagency Educational Research Initiative. Contract grant number: NSF-REC 0115649.

© 2008 Wiley Periodicals, Inc.



academically weaker students? We systematically investigated the interactions between performance variables and the depth and breadth variables and found only two interactions significant below the .05 level. Contrary to the main effects analysis results, the significance of the interactions was not consistent and did not provide a clear picture of the connections between the outcome and predictors. The results suggest that the associations for high- and low-achieving students are similar with respect to depth and breadth of content coverage.4

Students of highly rated teachers. Our survey asked students to rate several teacher characteristics. If depth and breadth were to a considerable extent correlated with those teacher characteristics, this would make the results harder to interpret because of the conundrum of collinearity: that particular teacher characteristics (rather than depth or breadth of coverage) might be the actual determinants of student outcomes, and that depth or breadth themselves might merely reflect those teacher characteristics by association. Teacher characteristics, however, were found to be only minimally correlated with breadth and depth. Among the teacher characteristics, only very weak correlations are found between breadth and the teacher's rated ability to explain problems in several different ways (r = .04). Depth correlated weakly with the ratings of the teacher's subject knowledge (r = .05). It appears that highly rated teachers are no more likely than others to teach with depth or breadth.

    DISCUSSION

    The baseline model (Table 2) shows the association between breadth and depth of high

    school science curricula and first-course introductory-level college science performance.

    The most striking analytical characteristic is the consistency of the results across the three

    baseline models, revealing the same trends across the disciplines of biology, chemistry,

and physics. In addition, the contrast of positive associations between depth and performance and negative associations between breadth and performance is striking, though not significant in all cases. This replication of results across differing disciplines suggests a consistency to the findings. However, we caution that the baseline model offers only outcomes apart from the consideration of additional factors that may have some influence on

    these associations. The robustness of these associations was put to the test, as additional

    factors were included in an extended model with additional analyses.

The extended model, shown in Table 3, included variables that control for variations in student achievement as measured by test scores, high school course level (regular, honors, AP), and high school grades, as well as background variables such as parents' educational level, community socioeconomic level, public high school attendance, and year in college. With the addition of these variables in the extended model, we find that the magnitude of the coefficients decreased in the majority of instances, but the trends and their significance remained. Keeping in mind the generalized nature of the analyses we have undertaken, the most notable feature of these results stems not simply from the magnitude of the coefficients, but rather from the trends and their significance. Here, the positive associations between depth of study and performance remain significant, whereas the negative association between breadth of study and performance retains its significance in the biology analysis. Although this association is not significant in the chemistry and physics analyses, the negative trend is consistent with our result for biology.

4 A significant interaction (p = .03) was found between the math SAT score and depth. In addition, a second significant interaction (p = .02) was found between breadth and most advanced mathematics grade. Apart from these two significant interactions, all others were nonsignificant.

    Science Education



What these outcomes reveal is that although the choice to pursue depth of study has a significant and positive association with performance, the choice to pursue breadth of study appears to have implications as well. These appear to be that students whose teachers choose broad coverage of content, on average, experience no benefit. In the extended model, we arrive at these results while accounting for important differences in students' backgrounds and academic performance, which attests to the robustness of these findings. The findings run counter to philosophical positions that favor breadth or those that advocate a balance between depth and breadth (Murtagh, 2001; Wright, 2000).

Finally, a particularly surprising and intriguing finding from our analysis indicated that the depth and breadth variables as defined in this analysis appear to be uncorrelated; additionally, the interactions between both variables are not significant in either model. Figure 6 offers an analysis of the outcomes treating breadth and depth as independent characteristics of high school science curriculum. The first panel reveals that over 40% of the students in our survey fall within the "depth present-breadth absent" grouping, whereas the other three varied between 14% and 23%. The distribution across these four groupings shows that the teachers of students in our survey are making choices about which topics to leave out and which to emphasize. These choices have consequences. The second panel indicates that those students reporting high school science experiences associated with the group "depth present-breadth absent" have an advantage equal to two thirds of a year of instruction over their peers who had the opposite high school experience ("depth absent-breadth present"). This outcome is particularly important given that high school science teachers are often faced with choices that require balancing available class time and course content. It appears that even teachers who attempt to follow the "best of both worlds" approach by mixing depth and breadth of content coverage do not, on average, provide an advantage to their students who continue to study science in college over their peers whose teachers focused on depth of content coverage.

Rather than relying solely on one definition of depth and breadth, we examined alternative operationalizations of the breadth and depth variables and replicated the analysis. The findings remained robust when these alternative operationalizations were applied.

However, a limitation in our definition of depth and breadth stems from the smallest unit of time we used in the study to define depth of study, which was on the order of weeks. Our analysis cannot discern a grain size smaller than weeks. Teachers may opt to study a single concept in a class period or many within that period. Likewise, teachers might decide that a single laboratory experiment should go through several iterations in as many days, while others will cover three different, unrelated laboratories in the same time span. These differences are beyond the capacity of our analysis to resolve. In the end, our work is but one approach to the analysis of the concepts of depth and breadth generated from students' self-reports of their high school science classroom experiences.

Findings in Light of Research From Cognitive Science

Most students hold conceptions that are at odds with those of their teachers or those of scientists. In general, students cling tenaciously to these ideas, even in the face of concerted efforts (Sadler, 1998). These misconceptions or alternative conceptions have been uncovered using several research techniques, but qualitative interviews, traceable to Piaget, have proven highly productive (Duckworth, 1987; Osborne & Gilbert, 1980). A vast number of papers have been published revealing student misconceptions in a variety of domains (Pfundt & Duit, 1994).

Students do not quickly or easily change their naïve scientific conceptions, even when confronted with physical phenomena. However, given enough time and the proper impetus,




students can revisit and rethink their ideas. This change can often be painstakingly slow, taking much longer than many teachers allow in the rapid conveyance of content that is the hallmark of a curriculum that focuses on broad coverage. In the view of Eylon and Linn (1988), in-depth coverage can "elaborate incomplete ideas, provide enough cues to encourage selection of different views of a phenomenon, or establish a well-understood alternative" (p. 263). Focused class time is required for teachers to probe for the levels of understanding attained by students and for students to elucidate their preconceptions. It takes focused time for students to test their ideas and find them wanting, motivating them to reconstruct their knowledge. Eylon and Linn note, "Furthermore, students may hold onto these ideas because they are well established and reasonably effective, not because they are concrete. For example, naïve notions of mechanics are generally appropriate for dealing with a friction-filled world. In addition, students appear quite able to think abstractly as long as they have appropriate science topic knowledge, even if they are young, and even if their ideas are incomplete. Thus, deep coverage of a topic may elicit abstract reasoning" (p. 290).

The difficulty of promoting conceptual growth is well documented in neo-Piagetian models (Case, 1998; Fischer & Bidell, 2006; Parziale & Fischer, 1998; Schwartz & Sadler, 2007; Siegler, 1998). These researchers argue that more complex representations, such as

    2007; Siegler, 1998). These researchers argue that more complex representations, such as

    those prized in the sciences, require multiple opportunities for construction in addition to

    multiple experiences in which those ideas and observations may be coordinated into richer,

    more complex understandings of their world. This process is necessarily intensive and time

    consuming because the growth of understanding is a nonlinear process that is sensitive to

    changes in context, emotions, and degree of practice. The experience of developing richer

    understandings is similar to learning to juggle (Schwartz & Fischer, 2004). Learning to

    juggle three balls at home does not guarantee you can juggle the same three balls in front of

    an audience. Alternatively, learning to juggle three balls does not mean that you can juggle

    three raw eggs. Changes in context and variables lead to differences in the relationships

that students have with ideas (both new and old) and their ability to maintain them over time (Fischer & Pipp, 1984; Fischer, Bullock, Rotenberg, & Raya, 1993). The process

    of integrating many variables that scientists consider in scientific models is a foreign and

    difficult assignment for anyone outside the immediate field of research.

    As just noted, the realization in the cognitive sciences that the learning process is not

    linear is an important pedagogical insight. Because learning is a dynamic operation that

depends on numerous variables, leaving a topic too soon deprives students and teachers of the time to experience situations that allow students to confront personal understandings

    and connections and to evaluate the usefulness of scientific models within their personal

    models.

    Implications in Light of High-Stakes Testing

We are currently in the Age of Accountability in U.S. education. The hallmark of this era is high-stakes standardized testing. The nature and consequences of these examinations influence the pedagogy that teachers use in the classroom. As Li et al. (2006) observed:

    Academic and policy debates seem somewhat remote to practitioners on the frontline of

    science education. Science teachers, department heads, and instructional specialists need to

    survive and thrive in a teaching environment increasingly driven by standards and measured

    by accountability tests. They are the ones who must, here and now, find solutions to the

    pressing problems of standards-based reform (p. 5).

Our concern stems from the connection between the extent to which high-stakes state examinations focus on the facts of science and the extent to which secondary school science teachers will focus




their pedagogy on ensuring that students can recite these facts. Clearly, high-stakes examinations that require recall of unrelated bits of scientific knowledge in the form of facts and isolated constructs will increase the likelihood that teachers will adjust their teaching methodologies to address these objectives. The more often discrete facts appear on high-stakes examinations, the more often we imagine that teachers will feel pressured to focus on breadth of knowledge. Conversely, we feel that the adoption of a different approach in high-stakes testing that is less focused on the recall of wide-ranging facts but is, instead, focused on a few widely accepted key conceptual understandings will result in a shift of pedagogy. In such situations, we envision teachers' instructional practice and curricula designed to offer students greater opportunity to explore the various contexts from which conceptual understandings emerge. We suspect that the additional focused time will allow students to recognize (and teachers to challenge) students' naïve conceptions of nature.

These concerns fall in line with Anderson's (2004, p. 1) three conclusions about standards-based reform in the United States:

• The reform agenda is more ambitious than our current resources and infrastructure will support.

• The standards advocate strategies that may not reduce achievement gaps among different groups of students.

• There are too many standards, more than students can learn with understanding in the time we have to teach science.

Anderson's (2004) final conclusion strongly resonates with that of the NRC (2007) and more specifically with findings, at the international level, by Schmidt et al. (1997, 2005). In a comparison of 46 countries, Schmidt et al. (2005) noted that in top-achieving countries, the science frameworks cover far fewer topics than in the United States, and that students from these countries perform significantly better than students in the United States. They conclude that U.S. standards are not likely to create a framework that develops and supports understanding or coherence, a strategy that encourages the development of a deeper understanding of the structure of the discipline. By international standards, the U.S. science framework is "unfocused, repetitive, and undemanding" (p. 532).

From the perspective of our study, both Schmidt et al.'s (2005) and Anderson's (2004) conclusions are particularly relevant. Clearly, increasing the quantity of information required for examination preparation will lead to an increased focus on broader coverage of course content, an approach that our findings indicate is negatively (or certainly not positively) associated with performance in subsequent science courses. If teachers know that they and their students are accountable for more material, then the pressure to focus on breadth would seem like the natural pedagogical choice to make. Hence, teachers must decide whether they choose to maximize students' test scores or maximize preparation for success in future study. Although they have considerable feedback concerning the test scores of their students (from state tests, SATs, ACTs, and AP examinations), they have almost no knowledge of how the bulk of their students do in college science, save the few who keep in touch. Moreover, the rare college student who reports back to their high school science teacher is often the most successful and simply reinforces a teacher's confidence in his or her current methods.

    Caveats

    There are several concerns we wish to clearly elucidate. A potential criticism of this

    study stems from a central element of this analysis, the length of time spent on a topic




and how this variable might be influenced by the ability level of students to grasp the material. One might imagine classrooms containing many struggling students actually spending more time on particular topics, and classes with generally high-achieving students requiring less time to grasp the concepts and quickly moving on to other topics. In this scenario, depth of content coverage would be an indicator for remedial assistance by the teacher at the classroom level. However, we assume that students in our sample (those who went on to take introductory science courses) are typically well above average in their science abilities. Thus we would expect our depth measure, if it really signified a remedial scenario, to have a negative impact, if any, on student performance in introductory college science courses. As it was, we observed the opposite (i.e., depth was associated with higher performance, even when controlled for student achievement in high school). This argument suggests that the remedial depth scenario is unlikely.

Another issue in this analysis is the degree to which we might expect other variables in the FICSS survey to correlate with depth and breadth as defined. We identified one such item in our survey: "How would you best describe learning the material required in your [high school science] course?" The respondents were provided with a 5-point rating scale ranging from "A lot of memorization of facts" to "A full understanding of topics." This question is constructed in a manner that clearly alludes (at one end of the scale) to a type of memorization that presumes a cursory or superficial level of understanding. This was certainly our intention when we wrote this question. However, Ramsden (2003) notes that the act of memorization may help learners deconstruct the structure and connections in the body of information they are attempting to commit to memory. Thus, is memorization a feature of both depth and breadth in learning? On the basis of this more carefully considered characterization of memorization, and taking into account the tone and clear intention of this questionnaire item, we hypothesized that the responses to this item would have a weak positive correlation, if any, with depth, and a weak negative correlation, if any, with breadth. In the analysis, we found the correlation with depth was r = .15 (very weak) and the correlation with breadth was r = -.09, a result commonly considered not correlated. The results in this case suggest that memorization (as a new variable) is orthogonal to our definition of breadth and depth and is not subsumed by depth or breadth. However, it remains to be seen how well our definitions hold up as new variables are identified.
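For readers interpreting correlation values like r = .15 and r = -.09 in the passage above, a small sketch may help. This is an illustration with made-up numbers (it does not use the study's data): the Pearson coefficient measures linear association, with |r| near 1 for strongly related variables and |r| near 0 for essentially unrelated ones, which is why magnitudes around .1 are treated as "not correlated."

```python
# Illustrative Pearson correlation (sample data is made up, not FICSS data).
import math

def pearson_r(xs, ys):
    # r = cov(x, y) / (std(x) * std(y)), computed from deviations from means
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly linearly related variables give r = 1.0 ...
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 2))   # -> 1.0
# ... while a perfect inverse relationship gives r = -1.0.
print(round(pearson_r([1, 2, 3, 4], [8, 6, 4, 2]), 2))   # -> -1.0
```

Values like .15 or -.09 sit near the middle of this [-1, 1] scale, supporting the authors' reading that the memorization item is essentially orthogonal to their depth and breadth variables.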

    Further Research: Looking at Individual Subject Areas

    A key issue for further research in the analysis of the constructs of breadth versus depth is

    whether there are specific topics that high school teachers should focus on or whether high

    school teachers might choose to focus on any topic and still potentially reap an advantage

    in terms of students enhanced future performance. In other words, do students perform

    better in introductory college science courses if they were exposed to at least one science

    topic in depth, regardless of what it was, or does it help them if they specifically studied

topic A in depth, but not if they studied topic B in depth? An analogous argument can be made regarding breadth. It therefore appears useful to examine the potential effects of

    depth in individual subject areas, and of the omission of individual subject areas, in future

    research.

    CONCLUSION

    The baseline model reveals a direct and compelling outcome: teaching for depth is

    associated with improvements in later performance. Of course, there is much to consider

    in evaluating the implications of such an analysis. There are a number of questions about




    this simple conclusion that naturally emerge. For example, how much depth works best?

What is the optimal manner to operationalize the impact of depth-based learning? Do specific contexts (such as type of student, teacher, or school) moderate the impact of depth?

    The answers to these questions certainly suggest that a more nuanced view should be

    sought. Nonetheless, this analysis appears to indicate that a robust positive association

    exists between high school science teaching that provides depth in at least one topic and

    better performances in introductory postsecondary science courses.

    Our results also clearly suggest that breadth-based learning, as commonly applied in

    high school classrooms, does not appear to offer students any advantage when they enroll

    in introductory college science courses, although it may contribute to higher scores on

    standardized tests. However, the intuitive appeal of broadly surveying a discipline in an

    introductory high school course cannot be overlooked. There might be benefits to such a

    pedagogy that become apparent when using measures that we did not explore. The results

    regarding breadth were less compelling because in only one of the three disciplines were

the results significant in our full model. On the other hand, we observed no positive effects at all. As it stands, our findings at least suggest that aiming for breadth in content coverage

    should be avoided, as we found no evidence to support such an approach.

    The authors thank the people who made this large research project possible: Janice M. Earle, Finbarr

    C. Sloane, and Larry E. Suter of the National Science Foundation for their insight and support; James

    H. Wandersee, Joel J. Mintzes, Lillian C. McDermott, Eric Mazur, Dudley R. Herschbach, Brian

Alters, and Jason Wiles of the FICSS Advisory Board for their guidance; and Nancy Cianchetta,

    Susan Matthews, Dan Record, and Tim Reed of our High School Advisory Board for their time

    and wisdom. This research has resulted from the tireless efforts of many on our research team:

    Michael Filisky, Hal Coyle, Cynthia Crockett, Bruce Ward, Judith Peritz, Annette Trenga, Freeman

    Deutsch, Nancy Cook, Zahra Hazari, and Jamie Miller. Matthew H. Schneps, Nancy Finkelstein,

    Alex Griswold, Tobias McElheny, Yael Bowman, and Alexia Prichard of our Science Media Group

constructed our dissemination website (www.ficss.org). We also appreciate advice and interest from several colleagues in the field: Michael Neuschatz of the American Institute of Physics, William

    Lichten of Yale University, Trevor Packer of the College Board, Saul Geiser of the University of

    California, Paul Hickman of Northeastern University, William Fitzsimmons, Marlyn McGrath Lewis,

    Georgene Herschbach, and Rory Browne of Harvard University, and Kristen Klopfenstein of Texas

    Christian University. We are indebted to the professors at universities and colleges nationwide who

felt that this project was worth contributing a piece of their valuable class time to administer our surveys, and to their students for their willingness to answer our questions. Any opinions, findings, and conclusions or

    recommendations expressed in this material are those of the authors and do not necessarily reflect

    the views of the National Science Foundation, the U.S. Department of Education, or the National

    Institutes of Health.

    REFERENCES

    American Association for the Advancement of Science. (1989). Project 2061: Science for all Americans.

    Washington, DC: Author.

    American Association for the Advancement of Science. (1993). Benchmarks for science literacy. New York:

    Oxford University Press.

Anaya, G. (1999). Accuracy of self-reported test scores. College and University, 75(2), 13-19.

    Anderson, C. W. (2004). Science education research, environmental literacy, and our collective future. National

Association for Research in Science Teaching. NARST News, 47(2), 1-5.

Anderson, R. D. (1995). Curriculum reform. Phi Delta Kappan, 77(1), 33-36.

    Baird, L. (1976). Using self-reports to predict student performance. Research Monograph No. 7. New York:

    College Entrance Examination Board.




College Board Standards for College Success

About the College Board Standards for College Success (CBSCS)

    The College Board Standards for College Success (CBSCS) define the knowledge and skills students need to

    develop and master in English language arts, mathematics and statistics, and science in order to be college and

    career ready. The CBSCS outline a clear and coherent pathway to Advanced Placement (AP) and college

    readiness with the goal of increasing the number and diversity of students who are prepared not only to enroll in

    college, but to succeed in college and 21st-century careers. The College Board has published these standards freely

    to provide a national model of rigorous academic content standards that states, districts, schools and teachers may

    use to vertically align curriculum, instruction, assessment and professional development to AP and college

    readiness. These rigorous standards:

- provide a model set of comprehensive standards for middle school and high school courses that lead to college and workplace readiness;
- reflect 21st-century skills such as problem solving, critical and creative thinking, collaboration, and media and technological literacy;
- articulate clear standards and objectives with supporting, in-depth performance expectations to guide instruction and curriculum development;
- provide teachers, districts and states with tools for increasing the rigor and alignment of courses across grades 6-12 to college and workplace readiness; and
- assist teachers in designing lessons and classroom assessments.

    Standards Development Process

    To guide the standards development process, the College Board convened national committees of middle school

and high school teachers, college faculty, subject matter experts, assessment specialists, teacher education faculty, and curriculum experts who had experience in developing content standards for states and national professional

    organizations.

    The standards advisory committees relied on college-readiness evidence gathered from a wide array of sources to

    design and develop the CBSCS. These sources include national and international frameworks such as National

    Assessment of Educational Progress (NAEP), Programme for International Student Assessment (PISA), and Trends

    in International Mathematics and Science Study (TIMSS); results of surveys and course content analyses from

    college faculty regarding what is most important for college readiness; assessment frameworks from relevant AP

    exams, the SAT, the PSAT/NMSQT, and College Level Examination Program (CLEP) exams, and selected

    university placement programs.

    Beginning with the end goal in mind, the committees first defined the academic demands students will face in AP or

first-year college courses in English, mathematics and statistics, and science. After identifying these demands, the committees then backmapped to the start of middle school to outline a vertical progression, or road map, of critical

    thinking skills and knowledge students need to be prepared for college-level work.

    Standards and Curriculum Alignment Services

    States and districts are encouraged to use the CBSCS as a guiding framework to strengthen the alignment of their

    standards, curriculum and assessments to college readiness. The College Board offers standards and curriculum

    alignment services to states and districts, providing independent reviews of standards, curriculum and assessments.

    College Board content specialists use established alignment methods and review criteria to analyze the alignment

and offer meaningful findings and recommendations tailored to a state's or district's needs.

College Board Standards for College Success, Page 1 of 2

10/21/2009 http://professionals.collegeboard.com/portal/site/Professionals/menuitem.b6b1a9bc0c561 ...


    The College Board also uses the CBSCS to align our own curriculum and assessment programs, including

    SpringBoard, to college readiness.

    Learn More

    Please contact Standards and Curriculum Alignment Services at [email protected] or your

    regional College Board representative for more information on the CBSCS and our alignment services.

Science (.pdf/4.4M)

English Language Arts (.pdf/1.9M)

Mathematics & Statistics (.pdf/535K)

Mathematics & Statistics Adapted for Integrated Curricula (.pdf/368K)

Mathematics & Statistics Three-Year Alternative for Middle School (.pdf/121K)



Science educator

Modeling Instruction: An Effective Model for Science Education

The authors describe a Modeling Instruction program that places an emphasis on the construction and application of conceptual models of physical phenomena as a central aspect of learning and doing science.

    Jane Jackson, Larry Dukerich, David Hestenes

Introduction

Modeling Instruction is an evolving, research-based program for high school science education reform that was supported by the National Science Foundation (NSF) from 1989 to 2005. The name Modeling

    Instruction expresses an emphasis

    on the construction and application

    of conceptual models of physical

    phenomena as a central aspect of

learning and doing science (Hestenes, 1987; Wells et al., 1995; Hestenes, 1997). Both the National Science Education Standards (NRC, 1996)

    and National Council of Teachers

    of Mathematics Standards (NCTM),

    as well as Benchmarks for Science

Literacy (AAAS, 1993) recommend

    models and modeling as a unifying

    theme for science and mathematics

    education. To our knowledge, no other

    program has implemented this theme

    so thoroughly.

From 1995 to 1999, 200 high school physics teachers participated in two four-week Leadership Modeling

    Workshops with NSF support. Since

    that time, 2500 additional teachers from

    48 states and a few other nations have

    taken summer Modeling Workshops at

    universities in many states, supported

    largely by state funds. Participants

    include teachers from public and

    private schools in urban, suburban,

    and rural areas. Modeling Workshops

    at Arizona State University (ASU) are

the cornerstone of a graduate program for teachers of the physical sciences.

    Recently, Modeling has expanded to

    embrace the entire middle/high school

    physical science curriculum. The

    program has an extensive web site at

    http://modeling.asu.edu.

    Product: Students Who Can

    Think

    Modeling Instruction meets or

    exceeds NSES teaching standards,

    professional development standards,

    assessment standards, and content and

    inquiry standards.

    Modeling Instruction produces

    students who engage intelligently in

    public discourse and debate about

matters of scientific and technical

    concern. Betsy Barnard, a modeler

    in Madison, Wisconsin, noticed

    a significant change in this area

    after Modeling was implemented

in her school: "I teach a course in biotechnology, mostly to seniors, nearly all of whom had physics the previous year. When asked to formally present information to the class about controversial topics such as cloning or genetically modified organisms, it is delightfully clear how much more articulate and confident they are."

    Students in modeling classrooms

experience first-hand the richness and

    excitement of learning about the natural

    world. One example comes from

    Phoenix modeler Robert McDowell.

He wrote that, under traditional instruction, when asked a question about some science application in a movie, "I might get a few students who would cite 1-2 errors, but usually with uncertainty. Since I started Modeling, the students now bring up their own topics, not just from movies, but their everyday experiences." One of his students wrote, "Mr. McDowell, I was at a Diamondback baseball game recently, and all I could think of was all the physics problems involved."

A former student of another modeler, Gail Seemueller of Rhode Island, described it as follows: "She wanted us to truly LEARN and more importantly UNDERSTAND the material. I was engaged. We did many hands-on experiments of which I can still vividly remember, three years later."

    Kelli Gamez Warble of rural

    Arizona, who has taught physics and



    Spring 2008 Vol. 17, no. 1

calculus for a decade using Modeling Instruction, has had numerous students whose career choices were influenced by Modeling Instruction. She wrote about discovering several former students when visiting an ASU Engineering Day, and all but one were females. She wrote, "As a former female engineering student myself, I was gratified but not surprised. Modeling encourages cooperation and discourse about complicated ideas in a non-threatening, supportive environment. Females who view science, engineering, and technology as fields encouraging cooperation and supportiveness will, I believe, become much more attracted to these non-traditional areas."

    Excellence

    Many modeling teachers have

    been recognized nationally. For

    example, three users of Modeling

    Instruction have received the National

    Science Teachers Association (NSTA)

    Shell Science Teaching Award.

    Numerous modelers have received

    the Presidential Award for Excellence

    in Math and Science Teaching

    (PAEMST). At Modeling Workshops

    and related graduate courses for

    science teachers at ASU, as well as

    at Modeling Workshops across the

    United States, teachers learn to impact

    students of various backgrounds and

    learning styles. Modeling Workshops

    and classrooms are thriving centers of

    interactive engagement.

    The Essence of Modeling

    Instruction

    The Modeling method of instruction

    corrects many weaknesses of the

    traditional lecture-demonstration

    method, including the fragmentation

    of knowledge, student passivity, and

the persistence of naïve beliefs about

    the physical world. From its inception,

    the Modeling Instruction program has

been concerned with reforming high school physics teaching to make it

    more coherent and student-centered,

    and to incorporate the computer as an

essential scientific tool.

    In a series of intensive workshops

    over two years, high school teachers

    learn to be leaders in science teaching

    reform and technology infusion in

    their schools. They are equipped

    with a robust teaching methodology

    for developing student abilities to

make sense of physical experience, to understand scientific claims, to

    articulate coherent opinions of their

    own and defend them with cogent

    arguments, and to evaluate evidence

    in support of justied belief.

More specifically, teachers learn to ground their teaching in a well-defined pedagogical framework (modeling theory; Hestenes, 1987),

    rather than following rules of thumb;

to organize course content around scientific models as coherent units

    of structured knowledge; to engage

    students collaboratively in making

    and using models to describe, explain,

    predict, design, and control physical

    phenomena; to involve students in

using computers as scientific tools

    for collecting, organizing, analyzing,

    visualizing, and modeling real data; to

    assess student understanding in more

    meaningful ways and experiment with

    more authentic means of assessment;

to continuously improve and update instruction with new software,

    curriculum materials, and insights

    from educational research; and

    to work collaboratively in action

    research teams to mutually improve

    their teaching practice. Altogether,

    Modeling Workshops provide detailed

    implementation of the National

    Science Education Standards.

The modeling cycle, student conceptions, discourse

Instruction is organized into

    modeling cycles rather than traditional

    content units. This promotes an

    integrated understanding of modeling

    processes and the acquisition of

    coordinated modeling skills. The

    two main stages of this process

    are model development and model

    deployment.

The first stage, model development, typically begins with a demonstration and class discussion. This establishes a common

    understanding of a question to be

    asked of nature. Then, in small groups,

    students collaborate in planning and

    conducting experiments to answer or

    clarify the question. Students present

    and justify their conclusions in oral and

    written form, including the formulation

    of a model for the phenomena in

    question and an evaluation of the

    model by comparison with data.

    Technical terms and representational

    tools are introduced by the teacher as

    they are needed to sharpen models,

    facilitate modeling activities, and

    improve the quality of discourse. The

teacher is prepared with a definite

    agenda for student progress and guides

    student inquiry and discussion in that

    direction with Socratic questioning



    and remarks. The teacher is equipped

    with a taxonomy of typical student

    misconceptions to be addressed as

students are induced to articulate, analyze, and justify their personal

beliefs (Halloun and Hestenes, 1985b; Hestenes et al., 1992).

During the second stage, model deployment, students apply their newly discovered model to new situations to refine and deepen their

    understanding. Students work on

    challenging worksheet problems in

small groups, and then present and defend their results to the class. This

    stage also includes quizzes, tests, and

    lab practicums. An example of the

    entire modeling cycle is outlined in

Table 1.


Table 1. Modeling Cycle Example: The Constant Velocity Model

I. Model Development (Paradigm Lab)

A. Pre-lab discussion
Students observe battery-powered vehicles moving across the floor and describe their observations. The teacher guides them toward a laboratory investigation to determine whether the vehicle moves at constant speed, and to determine a mathematical model of the vehicle's motion.

B. Lab investigation
Students collect position and time data for the vehicles and analyze the data to develop a mathematical model. (In this case, the graph of position vs. time is linear, so they do a linear regression to determine the model.) Students then display their results on small whiteboards and prepare presentations.

C. Post-lab discussion
Students present the results of their lab investigations to the rest of the class and interpret what their model means in terms of the motion of the vehicle. After all lab groups have presented, the teacher leads a discussion of the models to develop a general mathematical model that describes constant-velocity motion.

II. Model Deployment

A. Worksheets
Working in small groups, students complete worksheets that ask them to apply the constant-velocity model to various new situations. They are asked to prepare whiteboard presentations of their problem solutions and present them to the class. The teacher's role at this stage is continual questioning of the students to encourage them to articulate what they know and how they know it, thereby correcting any lingering misconceptions.

B. Quizzes
In order to do mid-course progress checks for student understanding, the modeling materials include several short quizzes. Students are asked to complete these quizzes individually to demonstrate their understanding of the model and its application. Students are asked not only to solve problems, but also to provide brief explanations of their problem-solving strategies.

C. Lab Practicum
To further check for understanding, students are asked to complete a lab practicum in which they need to use the constant-velocity model to solve a real-world problem. Working in groups, they come to agreement on a solution and then test their solution with the battery-powered vehicles.

D. Unit Test
As a final check for understanding, students take a unit test. (The constant-velocity unit is the first unit of the curriculum. In later unit tests, students are required to incorporate models developed earlier in the course into their problem solving; this is an example of the spiral nature of the modeling curriculum.)


    That models and modeling should

    be central to an inquiry-based approach

    to the study of physics is no surprise.

The NSES state, "Student inquiries should culminate in formulating an explanation or model ... In the process of answering the questions, the students should engage in discussions and arguments that result in the revision of their explanations."

Traditional instruction often overlooks the crucial influence of students' personal beliefs on what they learn. Force Concept Inventory (FCI) data (Hestenes et al., 1992) show that students are not easily induced to discard their misconceptions in favor of Newtonian concepts. Some educational researchers have expended considerable effort in designing and testing teaching methods to deal with specific misconceptions. Although their outcomes have been decidedly better than the traditional ones, success has been limited. Many have concluded that student beliefs are so deep-seated that heavy instructional costs to unseat them are unavoidable. However, documented success with the Modeling method suggests that an indirect treatment of misconceptions is likely to be most efficient and effective (Wells et al., 1995).

Cognitive scientists have identified metaphors as a fundamental tool of human thought; we use metaphors so frequently and automatically that we seldom notice them unless they are called to our attention. Metaphors are used to structure our experience and thereby make it meaningful. A major objective of teaching should therefore be to help students straighten out their metaphors. In Modeling, instead of designing the course to address specific naïve conceptions, the instructor focuses on helping students construct appropriate models to account for the phenomena they study. When students learn to correctly identify a physical system, represent it diagrammatically, and then apply the model to the situation they are studying, their misconceptions tend to fall away (Wells et al., 1995).

A key component of this approach is that it moves the teacher from the role of authority figure who provides the knowledge to that of a coach/facilitator who helps the students construct their own understanding. Since students systematically misunderstand most of what we tell them (because what they hear is filtered through their existing mental structures), the emphasis is placed on student articulation of the concepts.

Laboratory experiences are centered on experiments that isolate one concept, with equipment that enables students to generate good data reliably. Students are given no pre-printed list of instructions for doing these experiments. Rather, the instructor introduces the class to the physical system to be investigated and engages students in describing the system until a consensus is achieved. The instructor elicits from students the appropriate dependent and independent variables to characterize the system. After obtaining reasoned defenses from the students for selection of these variables, the instructor asks the students to help design the experiment. A general procedure is negotiated so that students have a sense of why they are doing a particular procedure. Frequently, multiple procedures arise as students have access to equipment that allows them to explore the relationship in different ways. Lab teams are then allowed to begin collecting data.

Students have to make sense of the experiment themselves. The instructor must be prepared to allow them to fail. The apparatus should be available for several days, should they need it. Students use spreadsheet and graphing software to help them organize and analyze their data. After allowing time to prepare whiteboards to summarize their findings, the instructor selects certain groups to present an oral account of the group's experimental procedure and interpretation. Students use multiple representations to present their findings, including concise English sentences, graphs, diagrams, and algebraic expressions.

After some initial discomfort with making these oral presentations, students generally become more at ease and gradually develop the skills of making an effective presentation. They learn how to defend their views clearly and concisely. These


    communication skills are valuable

    beyond their immediate application

    in the physics classroom.

    Assessment

    Clearly, one role of assessment

    is to ascertain student mastery of

    the skills and understanding of the

    concepts in the unit. An equally

    important role is the feedback it

    provides the instructor about his or her

    curriculum and teaching methods. The

    Modeling method stresses developing

    a sound conceptual understanding

    through graphical and diagrammatic

representations before moving on to an algebraic treatment of problem solving. To be consistent with this end, the assessment instruments test students' ability to interpret graphs and

    draw conclusions, as well as to solve

    quantitative problems.

    In addition to lab write-ups, in which

students report their findings using a

    format outlined in the curriculum

    materials, the lab practicum is used as a

    means to check student understanding.

    In the practicum, an application

    problem is posed to the class as a

whole. The class has a fixed period of time to figure out what model is

    appropriate to describe the situation,

    decide what measurements to make,

    collect and analyze the data, and then

    prepare a solution to the problem. This

    is used as the culminating activity in

    the unit and usually helps the students

    review key principles for the unit

test.

Whiteboarding is another major

    component of assessment. In

    whiteboarding, small groups of

    students write up their results from

    a lab or solutions to a worksheet

    problem on a small whiteboard. One

    student from the group presents the

    whiteboard to the class, responding

    to questions from the class and the

    instructor. Other group members

    are allowed to help if needed. This

    is an important assessment tool that

    helps the instructor determine how

well students have mastered the concepts. When one hears fuzzy or

    incoherent explanations, one has the

    opportunity to help students deal with

    their incomplete conceptions before

    moving on to the next topic. The two

    questions teachers ask most frequently

are, "Why do you say that?" and "How do you know that?" Students

    have to account for everything they

    do in solving a problem, ultimately

    appealing to models developed on the

basis of experiments done in class.

Instructors trained in the Modeling

    method do not take correct statements

    for granted; they always press for

    explicit articulation of understanding.

    This probing by the teacher often

    reveals that the student presenter does

    not fully understand the concept (even

    when answers are correct). The teacher

    then has the opportunity to help the

    student correct his or her conceptions

through Socratic questioning.

Whiteboarding has other valuable

    uses as well. First, it gives students a

    chance to reinforce their understanding

of concepts; students often don't know exactly what they think until they've

    heard themselves express the idea.

    Second, students are highly motivated

    to understand the question they are

    assigned to present. No one enjoys

    getting up before a group of peers

    with nothing to say. Additionally,

    during preparation, the instructor has

the opportunity to help students if no one in the group knows how to do the

    problem.

    Professional Development

    Modeling Instruction places a strong

    emphasis on professional development,

    both during the workshops and

    afterward. The workshops are the

    beginning of a process of lifetime

    learning. Teachers are encouraged

    to network amongst themselves and

to become leaders of reform in their schools and districts. The ASU web

    site and list-serve provide a forum

    for communication among Modelers

    across the nation and the world.

    Since teachers teach as they have

    been taught, the workshops include

    extensive practice in implementing the

    curriculum as intended for high school

    classes. Participants rotate through

    roles of student and instructor as they

    practice techniques of guided inquiry

and cooperative learning. Plans and techniques for raising the level of

    discourse in classroom discussions and

    student presentations are emphasized.

    The workshops immerse teachers

    in the physics content of the entire

    semester, thereby providing in-depth

    remediation for under-prepared

    teachers.

    The great success and popularity

    of the Modeling Workshops has

    generated an overwhelming demand

    for additional courses. In response to

this demand, in 2001 ASU approved

    a new graduate program to support

    sustained professional development of

    high school physics teachers. Modeling

    Workshops are the foundation of the

program, which can lead to a Master

    of Natural Science (MNS) degree.

    The National Science Education



    Standards (NSES) emphasize that

    coherent and integrated programs

    supporting lifelong professional

development of science teachers are essential for significant reform. They

state that "The conventional view of professional development for teachers needs to shift from technical training for specific skills to opportunities for intellectual professional growth." The

    MNS program at ASU is designed to

    meet that need.

    Evidence of Effectiveness

Evaluation of Physics Instruction. The Force Concept Inventory (FCI)

    was developed to compare the

    effectiveness of alternative methods

of physics instruction (Halloun et al., 1985a; Hestenes et al., 1992). It has

    become the most widely used and

influential instrument for assessing

    the effectiveness of introductory

    physics instruction, and has been cited

    as producing the most convincing

    hard evidence of the need to reform

traditional physics instruction (Hake, 1998).

The FCI assesses students'

    conceptual understanding of the

    force concept, the key concept in

    mechanics. It consists of 30 multiple

    choice questions, but there is one

    crucial difference between the FCI

    questions and traditional multiple-

    choice items: distracters are designed

    to elicit misconceptions known from

the research base. A student must have a clear understanding of one of six

    fundamental aspects of the Newtonian

    force concept in order to select the

    correct response. The FCI reveals

    misconceptions that students bring

    as prior knowledge to a class, and it

    measures the conceptual gains of a

    class as a whole. The FCI is research-

    grounded, normed with thousands

    of students at diverse institutions.

    It is the product of many hours of

    interviews that validated distracters,

and it has been subjected to intense peer review.

    Including the survey by Hake

    (998), we have FCI data on some

    30,000 students of 000 physics

    teachers in high schools, colleges

    and universities, throughout the

    world. This large data base presents

    a highly consistent picture, showing

    that the FCI provides statistically

    reliable measures of student concept

    understanding in mechanics. Results

    strongly support the following general

    conclusions:

- Before physics instruction, students hold naïve beliefs about motion and force that are incompatible with Newtonian concepts.
- Such beliefs are a major determinant of student performance in introductory physics.
- Traditional (lecture-demonstration) physics instruction induces only a small change in student beliefs. This result is largely independent of the instructor's knowledge, experience, and teaching style.
- Much greater changes in student beliefs can be induced with instructional methods derived from educational research in, for example, cognition, alternative conceptions, classroom discourse, and cooperative learning.

These conclusions can be quantified. The FCI is best used as a pre/post diagnostic. One way to quantify pre/post gains is to calculate the normalized gain (Hake, 1998). This is the actual gain (in percentage) divided by the total possible gain (also in percentage). Thus, normalized gain = (%post - %pre)/(100 - %pre). Hence, the normalized gain can range from zero (no gain) to 1 (greatest possible gain). This method of calculating the gains normalizes the index, so that gains of courses at different levels can be compared, even if their pretest scores differ widely.
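The normalized-gain arithmetic is easy to check. A minimal sketch in Python (the function is just Hake's formula; the 26% and 42% averages are the figures quoted later in this article):

```python
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Hake's normalized gain: actual gain divided by maximum possible gain."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Average pre/post FCI percentages for traditional high school instruction,
# as reported in this article: 26% pretest, 42% posttest.
g_traditional = normalized_gain(26.0, 42.0)
print(round(g_traditional, 2))  # prints 0.22, close to Hake's 23% mean for traditional courses
```

Because the index divides by the room left for improvement, a course starting at 26% and one starting at 60% can be compared on the same 0-to-1 scale.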


    From pre/post course FCI scores of

14 traditional courses in high schools

    and colleges, Hake found a mean

    normalized gain of 23%. In contrast,

    for 48 courses using interactive

    engagement teaching methods (minds-

    on always, and hands-on usually),

    he found a mean normalized gain of

    48%. The difference is much greater

than a standard deviation, a highly

significant result. This would indicate

    that traditional instruction fails badly,

    and moreover, that this failure cannot

    be attributed to inadequacies of the

    students because some alternative

    methods of instruction can do

much better. A challenge in physics education research for more than a

    decade has been to identify essential

    conditions for learning Newtonian

    physics and thereby devise more

    effective teaching methods. Modeling

    Instruction resulted from pursuing this

    challenge. Results from using the FCI

    to assess Modeling Instruction are

    given below.


Science Educator

    How effective is modeling instruction?

Figure 1 summarizes data from a

nationwide sample of 7500 high school

physics students who participated in Leadership Modeling Workshops from

1995 to 1998. Teachers gave the FCI

    to their classes as a baseline posttest

    when they were teaching traditionally

(before their first Modeling

    Workshop), and as a pretest

    and posttest thereafter.

    The average FCI pretest

    score was about 26%, slightly

    above the random guessing

level of 20%. Figure 1

    shows that traditional high

    school instruction (lecture,

    demonstration, and standard

    laboratory activities) has little

    impact on student beliefs,

    with an average FCI posttest

    score of 42%, well below the 60%

    score which, for empirical reasons,

    can be regarded as a threshold in

    understanding Newtonian mechanics.

    This corresponds to a normalized gain

of 22%, in agreement with Hake's results.

After their first year of teaching

    with the Modeling method, posttest

    scores for 3394 students of 66 novice

modelers were about 10 percentage

points higher, as shown in Fig. 1.

Students of expert modelers do much

    better. For teachers identified

    as expert modelers after two years

    in the program, posttest scores for

    647 students averaged 69%. This

corresponds to a normalized gain of 56%, considerably more than double

    the gain under traditional instruction.

    After two years in the program, gains

    for under-prepared teachers were

    comparable to gains in one year for

    well-prepared teachers. Subsequent

data have confirmed all these results

    for 20,000 students (Hestenes, 2000).

    Thus, student gains in understanding

    under expert modeling instruction

    are more than doubled compared to

    traditional instruction.
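Applying the normalized-gain formula to the Figure 1 means reproduces the gains quoted here; a quick check in Python (the data layout is ours):

```python
# Pre/post FCI mean scores (%) as reported in Figure 1.
figure_1 = {
    "Traditional":     (26, 42),
    "Novice Modelers": (26, 52),
    "Expert Modelers": (29, 69),
}

for group, (pre, post) in figure_1.items():
    gain = (post - pre) / (100 - pre)   # normalized gain
    print(f"{group}: {gain:.0%}")
# Prints 22% for Traditional and 56% for Expert Modelers,
# matching the gains quoted in the text.
```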

Student FCI gains for 100 Arizona teachers who took Modeling

    Workshops were almost as high as

    those for leading teachers nationwide,

    even though three-fourths of the

    participating Arizona teachers do not

    have a degree in physics. Teachers who

    implement the Modeling method most

    fully have the highest student posttest

    FCI mean scores and gains.

External evaluation of Modeling Instruction. The Modeling Instruction

    Program was assessed by Prof. Frances

    Lawrenz, an independent external

    evaluator for the National Science

    Foundation. We quote at length from

the four-page report on her July 22,

1998 site visit, because it describes

    the character of the workshops very

    well.

    This site visit was one of several

over the past four years to sites in the modeling project. I had

    the opportunity to observe the

    participants working in their concept

    groups and presenting to the full

    group. I also interviewed most of

    the participants, either as part of

    a small group or individually, and

    interviewed two coordinators of the

workshop. The workshop appears

    to me to be a resounding success.

    My observations revealed a very

conscientious, dedicated, and well-informed

group of teachers actively involved in discussions about the

    most effective ways to present physics

    concepts to students. They were doing

    a marvelous job of combining what

    they knew about how children learn

    with what they knew about physics

    and the modeling approach. The

    discussions were very rich.

    The interviews confirmed my

    observations about the nature of

    the group. All the participants were

    articulate about physics instruction

    and the modeling approach. The

    participants reported being pleased

    with everything that was happening

    at the workshop and spoke in glowing

    terms about the facilitators. The

    participants felt that the workshop

    was well run and that the facilitators

    were extremely knowledgeable about

    how best to teach physics. They also

    commented that the group itself was

an excellent resource. They all could help each other.

    I have almost never seen such

    overwhelming and consistent

    support for a teaching approach. It

    is especially surprising within the

    physics community, which is known

    for its critical analysis and slow

    acceptance of innovation. In short,

    the modeling approach presented by

    the project is sound and deserves to

    be spread nationally.

    Recognition by U.S. Department of

    Education. In September 2000, the

    U.S. Department of Education

    announced that the Modeling

    Instruction Program at Arizona State

University is one of seven K-12

    educational technology programs

    designated as exemplary or promising,

out of 134 programs submitted to the

[Figure 1. FCI mean scores (%), pretest and posttest, for students of traditional teachers (pre 26, post 42), novice modelers (pre 26, post 52), and expert modelers (pre 29, post 69).]


Science Educator, Spring 2008, Vol. 17, No. 1

agency's Expert Panel. Selections

    were based on the following criteria:

    (l) Quality of Program, (2) Educational

    Significance, (3) Evidence of

    Effectiveness, and (4) Usefulness

to Others. In January 2001, a U.S.

    Department of Education Expert Panel

    in Science recognized the Modeling

Instruction Program as one of only two exemplary K-12 science programs

    out of 27 programs evaluated (U.S.

Department of Education, 2001).

    Long-term implementation. In a

    follow-up survey of Leadership

    Modeling Workshop graduates,

    between one and three years after

    they had completed the program,

    75% of them responded immediately

    and enthusiastically. More than 90%

reported that the Workshops had a highly significant influence on the

    way they teach. 45% reported that

    their use of Modeling Instruction has

    continued at the same level, while

    another 50% reported an increase.

    (Hestenes, 2000).

    Final Thoughts

    Instead of relying on lectures and

    textbooks, the Modeling Instruction

    program emphasizes active student

construction of conceptual and mathematical models in an interactive

    learning community. Students are

    engaged with simple scenarios to

    learn to model the physical world.

    Modeling cultivates physics teachers

    as school experts on the use of

    technology in science teaching, and

    encourages teacher-to-teacher training

    in science teaching methods, thereby

    providing schools and school districts

    with a valuable resource for broader

    reform.

Data on some 20,000 students show that those who have been through the

    Modeling program typically achieve

    twice the gains on a standard test of

    conceptual understanding as students

    who are taught conventionally.

    Further, the Modeling method is

    successful with students who have

    not traditionally done well in physics.

    Experienced modelers report increased

    enrollments in physics classes,

    parental satisfaction, and enhanced

    achievement in college courses across

    the curriculum.

    Carmela Minaya of Honolulu,

    NSTA Shell Science Teaching

    Awardee, wrote: The beauty of

    Modeling Instruction is that it creates

    an effective framework for teachers

    to incorporate many of the national

    standards into their teaching without

    having to consciously do so, because

    the method innately addresses many

of the various standards. In a certain course, different students may learn

    the same material, except they never

    learn it in exactly the same way. This

    method appeals to all learning styles.

    Students cling to whatever works for

    them. Although the content is identical

    yearly, no two students ever are. The

    classroom becomes a dynamic center

for student-owned learning.

    References

    [All references by David Hestenes are

    online in pdf format at http://modeling.

    asu.edu/R&E/Research.html]

U.S. Department of Education (2001). 2001 Exemplary and Promising Science Programs. Retrieved Dec. 10, 2007, from http://www.ed.gov/offices/OERI/ORAD/KAD/expert_panel/math-science.html

Hake, R. (1998). Interactive-engagement vs. traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics 66, 64-74.

Halloun, I., & Hestenes, D. (1985a). Initial knowledge state of college physics students. American Journal of Physics 53, 1043-1055.

Halloun, I., & Hestenes, D. (1985b). Common sense concepts about motion. American Journal of Physics 53, 1056-1065.

Hestenes, D. (1987). Toward a modeling theory of physics instruction. American Journal of Physics 55, 440-454.

Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics Teacher 30, 141-158.

Hestenes, D. (1997). Modeling methodology for physics teachers. In E. Redish

    & J. Rigden (Eds.) The changing role

    of the physics department in modern

    universities. American Institute of

    Physics. Part II, 935-957.

Hestenes, D. (2000). Findings of the modeling workshop project, 1994-2000. One section of an NSF final report. Online in PDF format at http://modeling.asu.edu/R&E/Research.html.

Wells, M., Hestenes, D., & Swackhamer, G. (1995). A modeling method for high school physics instruction. American Journal of Physics 63, 606-619.

    Jane Jackson is co-director, Modeling

    Instruction Program, Department of Physics,

Arizona State University, PO Box 871504,

    Tempe, AZ 85287. Correspondence pertaining

    to this article may be sent to Jane.Jackson@

    asu.edu.

    Larry Dukerich is director of Professional

    Development in Science, Center for Research

    in Education in Science, Mathematics,

    Engineering, and Technology (CRESMET),

    Arizona State University, Tempe, AZ.

    David Hestenes is distinguished research

    professor, Department of Physics, Arizona

    State University, Tempe, AZ

    The Modeling method has

a proven track record of improving student learning.


    ALTERNATE ASSESSMENTS

Students become sports broadcasters for PBS. They must compose a voice-over dub for a sports video of their choice. Their video must convey the excitement of the

sport and the physics principles observed. To gain knowledge and understanding of physics principles necessary to meet this challenge, students work collaboratively on

activities in which they apply concepts of Newton's laws, forces, friction, and momentum to sporting events.

Students consider themselves members of an engineering team that is developing a system that will communicate from one room in the school to another. The challenge

is to send and receive, then measure the speed of the transmission. The final report must describe both the design and the physics of the system and include a discussion of how

    the system is better than the methods explored in the chapter. To gain

understanding of science principles necessary to meet this challenge, students work collaboratively on activities with codes, electricity and magnetism, sound waves, and

light rays. They also use the iterative process of engineering design, refining designs

    based on effectiveness and physics concepts.

    Students are challenged to design or build a safety device, or system, for protecting

    automobile, airplane, bicycle, motorcycle, or train passengers. New laws, increased

awareness, and improved safety systems are explored as students work on this challenge. They are also encouraged to design improvements to existing systems

and to find ways to minimize harm caused by accidents. To meet this challenge, students engage in collaborative activities that explore motions and forces and the

    principles of design technology.

Students are presented with a myth or story about some physical situation (for example, "The winner of a tug-of-war is the strongest team"). They will work in

teams to design, build, run, and analyze experiments that will test the physical ideas

central to the myth. Students will then extend their experiments, exploring the physical limits of the experiment and applying it to real-world situations. Teams will

    then present their findings to the class (and to the world!) in the form of a

Mythbusters video and make conclusions as to whether the situation described by the myth is physically plausible or even possible.

Students will design and build their own musical instrument from available household items. Students must analyze the tonal quality for the notes played and make a comparison to the tonal quality of several musical instruments. To gain

    understanding of science principles necessary to meet this challenge, students work

collaboratively on activities to learn about wave motion, standing waves, sound waves, resonance, timbre, and hearing. They also learn to use the iterative process

    of engineering design, refining designs based on the physics they learn.


    Dear Physics Students and Parents,

    During this year, you will be learning information related to many topics in physics. In this course,

a non-traditional, criterion-based assessment system is used. Rather than using a point system to record scores on various assignments, quizzes, and tests, you must demonstrate your level of

proficiency on various learning goals. For example, instead of getting one grade for a worksheet that may cover many topics, you will be scored on individual learning goals such as, "I can

interpret/draw position vs. time graphs for an object moving with constant velocity." Rubrics that list the learning goals and define achievement levels of each will be provided to you and used for

    assessment.

    Grades will be assigned based on descriptors of achievement only. Grades will not be affected by

issues such as lateness of work, effort, attitude, participation, and attendance. Those factors will be reported separately. Even though there will be many opportunities for cooperative learning, you

    will never be assigned group grades.

    New information showing additional learning will replace old information. Grades will reflect the

trend in most recent learning. You may re-attempt assessments provided that you have documented an effort to engage in additional learning (e.g., tutoring, additional practice, test

    corrections, etc.). In addition, an alternative assessment may be accepted if the work is pre-contracted between you (the student) and me (the teacher). There is no deadline for

reassessments. Any grade changes due to reassessments made after the quarter ends will be reflected in the final year-end grade.

    For each learning goal and unit of study, you will be scored using the following scale:

    4 = Advanced. Indicators include:

- I understand the content/skills completely and can explain them in detail.
- I can explain/teach the skills to another student.
- I have high confidence on how to do the skills.
- I can have a conversation about the skills.
- I can independently demonstrate extensions of my knowledge.
- I can create analogies and/or find connections between different areas within the sciences or between science and other areas of study.
- My responses demonstrate in-depth understanding of main ideas and of related details.

3 = Proficient. Indicators include:

- I understand the important things about the content/skills.
- I have confidence on how to do the skills on my own most of the time, but I need to continue practicing some parts that still give me problems.
- I need my handouts and notes once in a while.
- I am proficient at describing terms and independently connecting them with concepts.
- I understand not just the what, but can correctly explain the how and why of scientific processes.
- My responses demonstrate in-depth understanding of main ideas.

2 = Developing. Indicators include:

- I have a general understanding of the content/skills, but I'm also confused about some important parts.
- I need some help from my teacher (one-on-one or small group) to do the skills correctly.
- I do not feel confident enough to do the skills on my own.
- I need my handouts and notebook most of the time.
- I can correctly identify concepts and/or define vocabulary; however, I cannot make connections among ideas and/or independently extend my own learning.
- My responses demonstrate basic understanding of some main ideas, but significant information is missing.


    THE BUGGY LAB

I. PROBLEM

Design an experiment to determine the relationship between the position of a toy buggy and time. You have access to rulers, metersticks, and stopwatches.

II. DESIGN

___ Describe your experimental design with a labeled diagram and a verbal description.
___ State what the independent and dependent variables are.
___ Include how you will vary and measure each variable.

III. DATA

___ Perform the experiment and record your data in an appropriate table.

IV. ANALYSIS

___ Graph and determine the numerical model for your data.

V. CONCLUSION

___ What pattern did you find? Give verbal and mathematical descriptions.
___ What is the physical significance of the slope?
___ What is the physical significance of the y-intercept?
___ Turn your numerical model into a general mathematical model.

You will be assessed using the lab skills rubric.
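The analysis step of this lab amounts to putting a least-squares line through position-vs.-time data. A sketch of that computation in Python (the data values are hypothetical, for illustration only):

```python
# Hypothetical buggy-lab data: clock readings (s) and positions (m).
times = [0.0, 1.0, 2.0, 3.0, 4.0]
positions = [0.10, 0.45, 0.80, 1.15, 1.50]

# Least-squares fit to x = v*t + x0.
n = len(times)
t_mean = sum(times) / n
x_mean = sum(positions) / n
slope = (sum((t - t_mean) * (x - x_mean) for t, x in zip(times, positions))
         / sum((t - t_mean) ** 2 for t in times))
intercept = x_mean - slope * t_mean

# Slope = the buggy's velocity; y-intercept = its starting position.
print(f"v = {slope:.2f} m/s, x0 = {intercept:.2f} m")
```

For this data the fit gives a velocity of 0.35 m/s and a starting position of 0.10 m, which is the kind of physical reading of slope and intercept the conclusion asks for.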


    1ST QUARTER LAB SKILLS

    LEARNING GOALS

    ASSIGNMENT/ ASSESSMENT & DATE:

INTERIM PROGRESS

QUARTER FINAL

LAB.1 I can identify the hypothesis to be

tested, the phenomenon to be investigated, or the problem to be solved.

LAB.2 I can design a reliable experiment that tests the hypothesis, investigates the

    phenomenon, or solves the problem.

    LAB.3 I can communicate the details of an

experimental procedure clearly and completely.

LAB.4 I can revise the hypothesis when necessary.

    LAB.5 I can decide what parameters are to

be measured and can identify independent and dependent variables.

LAB.6 I can record and represent data in a meaningful way.

    LAB.7 I can analyze data appropriately.

    LAB.8 I can represent data graphically and

    use the graph to make predictions.

    LAB.9 I can identify a pattern in the data.

LAB.10 I can represent a pattern mathematically and provide physical meaning

to the slope and y-intercept.

    UNIT SCORE


    MOMENTUM CONSERVATION AND TRANSFER

    LEARNING GOALS

    ASSIGNMENT/ ASSESSMENT & DATE:

INTERIM PROGRESS

QUARTER FINAL

    MOM.1 I can determine whether

interactions are present for a given situation by considering the motion of objects.

MOM.2 I can calculate the momentum of an object/system with direction and proper

    units.

MOM.3 I can draw an interaction diagram and specify the system and the surroundings.

MOM.4 I can draw and analyze momentum bar charts.

    MOM.5 I can use momentum conservation

    to solve different problems.

    MOM.6 I can apply the property of

reciprocity to the interactions between objects.

    MOM.7 I can compute the momentum

transfer into/out of an object/system (impulse) with direction and proper units.

MOM.8 I know the relationship between average force and the rate of momentum

    transfer.

MOM.9 I know the relationship between average force and time for a given

    momentum transfer (impulse).

    MOM.10 I can use momentum transfer

    (impulse) to solve different problems.

    UNIT SCORE
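Several of these goals (MOM.2, MOM.5, MOM.7) reduce to short calculations. A Python sketch of a one-dimensional stick-together collision (the masses and velocities are invented for illustration):

```python
# Two carts on a track; momentum p = m*v, in kg*m/s.
m1, v1 = 2.0, 3.0   # cart 1: 2.0 kg moving at 3.0 m/s
m2, v2 = 1.0, 0.0   # cart 2: 1.0 kg at rest

p_before = m1 * v1 + m2 * v2       # total system momentum (MOM.2)

# Perfectly inelastic collision: the carts stick together.
v_final = p_before / (m1 + m2)     # momentum conservation (MOM.5)
p_after = (m1 + m2) * v_final

# Impulse on cart 2 equals its change in momentum (MOM.7).
impulse_on_2 = m2 * (v_final - v2)

print(v_final, p_before, p_after, impulse_on_2)  # 2.0 6.0 6.0 2.0
```

By reciprocity (MOM.6), cart 1 receives an equal and opposite impulse, which is why the system total is unchanged.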


    1ST 2ND 3RD 4TH INTERIM PROGRESS

    LEARNING GOALS

    UNIT SCORE

    INTERIM GRADE: %

    The learning goals I need to improve most are: _________________________________

    ______________________________________________________________________

    What I can do to improve: _________________________________________________

    ______________________________________________________________________

    The learning goals I excel in are: ____________________________________________

    ______________________________________________________________________

    Reasons why I excel in these areas: _________________________________________

    ______________________________________________________________________

    Student Name:

    Student Signature:

    Parent Signature:

    _________________________________

    _________________________________

    _________________________________

    Date: ___________

    Date: ___________

    Questions/ Comments:


    1ST 2ND 3RD 4TH QUARTER FINAL

    LEARNING GOALS

    UNIT SCORE

    QUARTER GRADE: %

    The learning goals I need to improve most are: _________________________________

    ______________________________________________________________________

    What I can do to improve: _________________________________________________

    ______________________________________________________________________

    The learning goals I excel in are: ____________________________________________

    ______________________________________________________________________

    Reasons why I excel in these areas: _________________________________________

    ______________________________________________________________________

    Student Name:

    Student Signature:

    Parent Signature:

    _________________________________

    _________________________________

    _________________________________

    Date: ___________

    Date: ___________

    Questions/ Comments: