
LIVE INTERACTIVE LEARNING @ YOUR DESKTOP

9

October 9, 2012, 6:30 p.m. – 8:00 p.m. Eastern time

Preparing for NGSS: Planning and Carrying Out Investigations

Presented by: Rick Duschl

• 9,500+ resources – 3,200+ free!

– Add to “My Library” to access later

• Community forums

• Online advisors to assist you

• Tools to plan and document your learning

• http://learningcenter.nsta.org

NSTA Learning Center

10

Developing the Standards

11

Instruction

Curricula

Assessments

Teacher Development

Developing the Standards

12

2011-2013

July 2011

IT’S NOT OUT YET!

NGSS Development Process

In addition to a number of reviews by state teams and critical stakeholders, the process includes two public reviews.

1st Public Draft was in May 2012

2nd Public Draft will take place in the Fall of 2012

Final Release is expected in the Spring of 2013

13

A Framework for K-12 Science Education

Released in July 2011
Developed by the National Research Council at the National Academies of Science
Prepared by a committee of scientists (including Nobel Laureates) and science educators

Three Dimensions:
• Scientific and Engineering Practices
• Crosscutting Concepts
• Disciplinary Core Ideas

14

Free PDF available from The National Academies Press (www.nap.edu)
Print copies available from NSTA Press (www.nsta.org/store)

1. Asking questions (for science) and defining problems (for engineering)

2. Developing and using models

3. Planning and carrying out investigations

4. Analyzing and interpreting data

5. Using mathematics and computational thinking

6. Constructing explanations (for science) and designing solutions (for engineering)

7. Engaging in argument from evidence

8. Obtaining, evaluating, and communicating information

Scientific and Engineering Practices

15

Crosscutting Concepts

1. Patterns

2. Cause and effect: Mechanism and explanation

3. Scale, proportion, and quantity

4. Systems and system models

5. Energy and matter: Flows, cycles, and conservation

6. Structure and function

7. Stability and change

16

Life Science
LS1: From Molecules to Organisms: Structures and Processes
LS2: Ecosystems: Interactions, Energy, and Dynamics
LS3: Heredity: Inheritance and Variation of Traits
LS4: Biological Evolution: Unity and Diversity

Physical Science
PS1: Matter and Its Interactions
PS2: Motion and Stability: Forces and Interactions
PS3: Energy
PS4: Waves and Their Applications in Technologies for Information Transfer

Earth & Space Science
ESS1: Earth’s Place in the Universe
ESS2: Earth’s Systems
ESS3: Earth and Human Activity

Engineering & Technology
ETS1: Engineering Design
ETS2: Links Among Engineering, Technology, Science, and Society

Disciplinary Core Ideas

17

Performance expectations combine practices, core ideas, and crosscutting concepts into a single statement.

18

Construct and use models to explain that atoms combine to form new substances of varying complexity in terms of the number of atoms and repeating subunits. [Clarification Statement: Examples of atoms combining can include hydrogen (H2) and oxygen (O2) combining to form hydrogen peroxide (H2O2) or water (H2O).] [Assessment Boundary: Restricted to macroscopic interactions.]

Closer Look at a Performance Expectation
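As a side note for readers checking the chemistry in the clarification statement, the two combinations it names can be written as balanced equations. This is a worked illustration added here, not part of the performance expectation itself:

\[ 2\,\mathrm{H_2} + \mathrm{O_2} \rightarrow 2\,\mathrm{H_2O} \quad \text{(water)} \]
\[ \mathrm{H_2} + \mathrm{O_2} \rightarrow \mathrm{H_2O_2} \quad \text{(hydrogen peroxide)} \]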


19

Taking Science to School

20

For States, By States

21

A Framework to guide changes in K-12 science

Instruction

Curricula

Assessments

Teacher Development

22

4 Strands of Science Proficiency

• Understanding Scientific Explanations – understand central concepts and use them to build and critique explanations.

• Generating Scientific Evidence – generating and evaluating evidence as part of building and refining models and explanations of the natural world.

• Reflecting on Scientific Knowledge – understand that doing science entails searching for core explanations and the connections between them.

• Participating Productively in Science – understand the norms for presenting scientific arguments and evidence and practice productive social interactions with peers around classroom science investigations.

NRC (2008), Ready, Set, Science!

23

Science & Engineering Practices

• 1. Asking questions (for science) and defining problems (for engineering)

• 2. Developing and using models

• 3. Planning and carrying out investigations

• 4. Analyzing and interpreting data

• 5. Using mathematics and computational thinking

• 6. Constructing explanations (for science) and designing solutions (for engineering)

• 7. Engaging in argument from evidence

• 8. Obtaining, evaluating, and communicating information

• Planning and Carrying Out Investigations

• Scientists and engineers plan and carry out investigations in the field or laboratory, working collaboratively as well as individually. Their investigations are systematic and require clarifying what counts as data and identifying variables or parameters.

• Engineering investigations identify the effectiveness, efficiency, and durability of designs under different conditions.

• Planning and carrying out investigations may include elements of all of the other practices.

24

Webinar Outline

• Generating Evidence

• Designing Experiments

• Evaluating Evidence

• Two Broad Themes:
– The role of prior knowledge in scientific thinking at all ages
– The importance of experience and instruction

25

Chapter 5: Generating and Evaluating Evidence and Explanations (Taking Science to School)

Major Findings in the Chapter:

• Children are far more competent in their scientific reasoning than first suspected and adults are less so. Furthermore, there is great variation in the sophistication of reasoning strategies across individuals of the same age.

• In general, children are less sophisticated than adults in their scientific reasoning. However, experience plays a critical role in facilitating the development of many aspects of reasoning, often trumping age.

• Scientific reasoning is intimately intertwined with conceptual knowledge of the natural phenomena under investigation. This conceptual knowledge sometimes acts as an obstacle to reasoning, but often facilitates it.

• Many aspects of scientific reasoning require experience and instruction to develop. For example, distinguishing between theory and evidence and many aspects of modeling do not emerge without explicit instruction and opportunities for practice.

26

Poll 1

Familiarity with the NRC Reports Taking Science to School and Ready, Set, Science!

A. I have read both reports and understand the main messages and recommendations.

B. I have only read Ready, Set, Science! and understand the main messages and recommendations.

C. I have read Ready, Set, Science! and I am familiar with the main messages.

D. I have heard about Ready, Set, Science! but have not examined the report.

E. I have not heard about Ready, Set, Science!

27

Two Major Shifts from Current Curriculum/Instruction:

– A shift in the image of science from the ‘lone’ scientist in an isolated laboratory to science as both an individual and a deeply social enterprise. (Talk & Argument) (Critique & Communication) (Models and Representations)

– A shift from treating scientific reasoning as a highly developed form of logical thinking that cuts across scientific domains to studying scientific thinking as the interplay of general reasoning strategies, knowledge of the natural phenomena being studied, and a sense of how scientific evidence and explanations are generated. (Building & Refining Models, Mechanisms, and Theories) (Problematize the Evidence)

28

Chapter 5: Generating and Evaluating Evidence and Explanations (Taking Science to School)

Poll 2 – Generating Evidence

The evidence-gathering phase of inquiry includes planning and designing the investigation as well as carrying out the steps required to collect the data.

Which of the statements below do you think is NOT a part of Generating Evidence?

A. asking questions

B. deciding what to measure

C. developing measures

D. collecting data from the measures

E. structuring the data

29

Generating Evidence

Generating evidence entails all of the following:
– asking questions,

– deciding what to measure,

– developing measures,

– collecting data from the measures,

– structuring the data,

– systematically documenting outcomes of the investigations,

– interpreting and evaluating the data, and

– using the empirical results to develop and refine arguments, models, and theories.

30

Asking Questions and Formulating Hypotheses

• An iterative cycle – not a one-time event

• Begin with exploratory study of natural world with structured observations that lead to specific questions and hypotheses

• Collection of data could lead to new questions and revision of hypotheses and perhaps another round of data collection

• Asking questions is also about formulating the goals of the activity and generating predictions

31

Submit your questions via the chat.

Ted Willard, Brynn Slate, Rick Duschl

REMINDERS

• To turn off notifications of other participants arriving, go to: Edit -> Preferences -> General -> Visual notifications

• You can minimize OR detach and expand the chat panel

• Continue the discussion in the Community Forums: http://learningcenter.nsta.org/discuss

32

Collecting and Structuring Data

Exercise for Healthy Heart

1. Intro Unit and Lab 1 (Day 1)
– Conduct a prelab including a demonstration of the STEP test and taking a pulse. Students collect data for Lab 1 – Resting Heart Rate at 6, 10, 15, & 60 seconds (see the conversion sketch after this list).

2. Data Collection Labs 2 & 3 (Days 2 & 3)
– Lab 2 – Activity Level (slow/fast stepping) and Heart Rate
– Lab 3 – Weight (with/without hand weights) and Heart Rate

3. Data Analysis for Labs 2 & 3 (Days 4 & 5)
– Knowledge Forum Activity “What Matters in Getting Good Data”
– Determining Trends and Patterns in the Data
– Developing and Evaluating Explanations for the Patterns in the Data
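To make the unit conversion behind Lab 1 explicit, here is a minimal Python sketch; the counts are made-up illustrations, not data from the class.

# Convert a pulse count taken over a short interval to beats per minute.
def beats_per_minute(count, interval_seconds):
    return count * 60 / interval_seconds

# Hypothetical counts for one student at the four timing intervals used in Lab 1.
counts = {6: 7, 10: 12, 15: 18, 60: 71}
for seconds, count in counts.items():
    print(f"{count} beats in {seconds} s -> {beats_per_minute(count, seconds):.0f} beats/min")

Counting for only 6 seconds multiplies any miscount by 10, which is one reason the length of the counting interval “matters” in Poll 3 below.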

33

Agree/Disagree with the following statements.

✔ = Agree, ✖ = Disagree

1. It matters where you take a pulse
Wrist, neck, thigh

2. It matters how long you take a resting pulse
6-10-15-60 seconds

3. It matters how long you take an exercising pulse
6-10-15-60 seconds

4. It matters who takes a pulse

Poll 3 - Exercise for a Healthy Heart

34

[Chart: Resting Heart Rates at 6, 10, and 60 seconds – one bar per student (35 students), heart rate per minute on the horizontal axis (0–100); the 60-second values range from 36 to 92 beats/min.]

35

Designing Experiments

Experimentation can be designed to:
– Generate observations/measurements that induce a hypothesis to account for a pattern (Discovery Context)
– Test an existing hypothesis under consideration (Confirmation/Verification Context)
– Isolate variables – control of variables is a basic strategy that allows for valid inferences and constrains the number of possible experiments to consider.

36

• At all ages, prior knowledge of the domain under investigation plays an important role in the formulation of questions and hypotheses.

• Time engaging with the phenomena is very important; in some domains students have this experience, in others it must be built into the classroom events.

37

Prior Knowledge

Prior Knowledge & Benchmark Activities

• Tasks that are given to students at the beginning of a unit prior to any instruction

• Students can choose how to respond – Drawing, Labeled Drawing, Storyboard, Symbols, Writing

• Used by teachers to target instruction and identify learners’
– Commonsense understandings (misconceptions)

– Productive intuitions

38

What does the child seem to understand? What does the child appear to confuse? What is the student ready to learn?

Drawing 1 Drawing 2

39

What differences did you see?

• Use of arrows – S1 as lines of force; S2 as pointers

• Force concept – S1 uses word ‘force’; S2 does not

• Confusions
– S1 has ‘weight of air’ acting as a downward force, a frequent commonsense idea; gravity arrows sideways
– S2 has buoyancy > gravity to explain sinking

• Guiding conception
– S1 uses density to explain floating/sinking
– S2 uses gravity = buoyancy to explain floating/sinking

• Productive intuitions
– S1 uses buoyancy arrows to show water pressure acting in all directions

40

Designing Experiments

Domain-general – minimize the role of prior knowledge (knowledge lean)

– Example – Law of the Pendulum – isolate the 3 variables (length of string, size of weight, height from which the weight is released) to determine which variable influences the period/time of swing (see the sketch below). One Lesson.

Domain-specific – infuses the role of prior knowledge (knowledge rich)

– Example – Build a 1-second timer using the data set gathered from class investigations examining varying lengths of string; find out whether the 1-second length works with wooden sticks and/or metal pipes, i.e., whether it gives the same results for a 1-second timer. A Sequence of Lessons.

Sequence matters! Sustained engagement with the phenomena is essential! “Get a grip on nature!”
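For the domain-general pendulum task, the small-angle formula T = 2π√(L/g) makes the point that only string length matters. A rough Python sketch, assuming the “1-second timer” means a full period of one second:

import math

g = 9.81  # gravitational acceleration, m/s^2

def period(length_m):
    # Small-angle approximation: weight size and release height do not appear.
    return 2 * math.pi * math.sqrt(length_m / g)

def length_for_period(t_seconds):
    # Invert T = 2*pi*sqrt(L/g) to get the string length for a target period.
    return g * (t_seconds / (2 * math.pi)) ** 2

print(f"String length for a 1-second period: {length_for_period(1.0) * 100:.1f} cm")
print(f"Period of a 1.0 m pendulum: {period(1.0):.2f} s")

Whether the same length still works with wooden sticks or metal pipes is exactly the kind of question the formula alone does not settle, which is the point of the domain-specific sequence.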

41

[Chart: 60-second resting heart rates replotted – one bar per student (35 students), heart rate per minute on the horizontal axis (0–100); values range from 36 to 92 beats/min.]

What’s the range for a normal heart rate?
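Answering the range question is a small exercise in structuring the data. A minimal Python sketch using an illustrative set of values like those shown in the chart (not the actual class data):

# Illustrative 60-second resting heart rates (beats/min), similar to those in the chart.
rates = [36, 49, 50, 51, 56, 57, 59, 60, 62, 64, 66, 67, 70, 72, 75, 79, 81, 85, 86, 92]

rates.sort()
low, high = rates[0], rates[-1]
median = rates[len(rates) // 2]
print(f"range: {low}-{high} beats/min, median: {median} beats/min")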

42

43

Growth: First Grade

44

Growth: Third Grade

45

Growth: Fifth Grade

Shifts in Distribution Signal Transitions in Growth Processes

46

47

Epistemic (What Counts?) Discourse & Data Texts

Data Texts
– Selecting/Obtaining Raw Data
– Selecting Data for Evidence
– Patterns & Models of Evidence
– Explanations of Patterns & Models

Data Transformations for Epistemic Dialog
– T1: what data count, are worth using
– T2: what patterns & models to use
– T3: what explanations account for patterns & models

48

49

Evaluating Evidence that Contradicts Prior Beliefs

Chinn and Brewer propose that there are eight possible responses to anomalous data. Individuals can:

(1) ignore the data;

(2) reject the data (e.g., because of methodological error, measurement error, or bias);

(3) acknowledge uncertainty about the validity of the data;

(4) exclude the data as being irrelevant to the current theory;

(5) hold the data in abeyance (i.e., withhold a judgment about the relation of the data to the initial theory);

(6) reinterpret the data as consistent with the initial theory;

(7) accept the data and make a peripheral change or minor modification to the theory;

(8) accept the data and change the theory.

50

Submit your questions via the chat.

Ted Willard, Brynn Slate, Rick Duschl

REMINDERS

• To turn off notifications of other participants arriving, go to: Edit -> Preferences -> General -> Visual notifications

• You can minimize OR detach and expand the chat panel

• Continue the discussion in the Community Forums: http://learningcenter.nsta.org/discuss

51

PRACCIS
Promoting Reasoning and Conceptual Change in Science

Clark A. Chinn, Richard A. Duschl, Ravit G. Duncan
Principal Investigators

52

Learning Targets: The scientific strategies

1. Reasoning about methodological strengths and weaknesses of studies

• E.g., sample size; reliability and accuracy of measures; alternative interpretations of data; the adequacy of controls.

2. Interpreting data

3. Constructing models or explanations that fit complex patterns of data from multiple studies

4. Resolving conflicts among studies with seemingly incompatible results

5. Deciding the extent to which one can generalize

53

Lesson 2: Modeling Cellular Transport

Overview: In this lesson students develop several models for how materials cross cellular membranes. Each of these models will be explored in more detail over the week. Students view the results of the iodine experiment, which demonstrates the viability of the ‘Squeeze model’ of cellular transport (i.e., simple diffusion into the cell). Students set up the egg experiment, which will test the Squeeze model in more detail in Lesson 3 – it is essential that the egg experiment be set up on this day; students must at the very least complete Rows A and B (from which they can easily calculate C, time permitting; see the sketch after the materials list). Finally, time permitting, students discuss criteria for evaluating models (this can be moved to the next day if necessary).

Driving Question: How could things get inside cells?

Learning Objectives: Students will learn that the very basic ‘Squeeze model’ (i.e. simple diffusion into the cell) is a viable model of cellular transport. Students will learn more about how models work and how to build and justify them.

Materials:

• Handouts: Egg experiment directions, Egg experiment data sheet.

• Overheads: Student models (drawn by the teacher from discussion); ‘3 Kinds of Models’

• Egg Experiment: Per group – 2 deshelled eggs, 4 cups, balance, 100 ml syrup, 100 ml water, plastic wrap, soap, paper towel, 2 plastic spoons.
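The data sheet rows are not reproduced here, so as an assumption take Row A as an egg’s starting mass, Row B as its mass after soaking, and Row C as the change the lesson says students can calculate. A small Python sketch of that bookkeeping, with hypothetical masses:

# Hypothetical measurements in grams: (Row A = starting mass, Row B = mass after soaking).
eggs = {
    "egg in syrup": (82.0, 71.5),
    "egg in water": (80.5, 88.0),
}

for label, (start_g, end_g) in eggs.items():
    change_g = end_g - start_g              # Row C: change in mass
    percent = 100 * change_g / start_g
    print(f"{label}: {change_g:+.1f} g ({percent:+.1f}%)")

If the measurements follow the usual osmosis pattern (mass loss in syrup, gain in water), that is the kind of result the Squeeze model is then asked to explain in Lesson 3.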

54

Data Table

55

56

Data Table

High

57

Medium/High

58

Medium

I think the less the density of the substance the easier for smaller things to get into / through something small like the cell membrane. Something else is water goes from when there is more molecules to where theres less molecules

59

Low

I know that lead can get into people’s blood stream. I don’t think it can do anything besides eat the cell so that is why I think that. Then I think it takes over the cell so that it is dead.

60

61

“Setting up a model of the world to study the world does not come easy to children”

Leona Schauble, Vanderbilt University

• Prolonged experience with phenomena

• Posing and revising questions – working over time to make explicit and refine criteria for good questions

• Parsing objects and events into attributes that bear on the question

• Considering/debating means of measuring attributes in ways that support an initial model of the phenomenon (considering the measure properties of those attributes)

• Generating/creating data (observing its measure qualities, reliability, etc.)

62

Continued . . .

• Structuring data (patterns are made, not found)

• Interpreting data as evidence – model construction

• Model testing against the original phenomenon & new cases

• Generation/entertainment of alternative models

• Evaluation of model fit

• Model selection/revision . . . which usually results in theoretically deeper questions

Lehrer, R., Schauble, L., & Lucas, D. (2008). Supporting development of the epistemology of inquiry. Cognitive Development, 23, 512-529.

63

Submit your questions via the chat.

Ted Willard, Brynn Slate, Rick Duschl

REMINDERS

• To turn off notifications of other participants arriving, go to: Edit -> Preferences -> General -> Visual notifications

• You can minimize OR detach and expand the chat panel

• Continue the discussion in the Community Forums: http://learningcenter.nsta.org/discuss

64

65

NSTA Website (nsta.org/ngss)

Upcoming Web Seminars on Practices

1. 9/11 – Asking Questions and Defining Problems – Brian Reiser
2. 9/25 – Developing and Using Models – Christina Schwarz and Cindy Passmore
3. 10/9 – Planning and Carrying Out Investigations – Rick Duschl
4. 10/23 – Analyzing and Interpreting Data – Ann Rivet
5. 11/6 – Using Mathematics and Computational Thinking – Robert Mayes and Bryan Shader
6. 11/20 – Constructing Explanations and Designing Solutions – Katherine McNeill and Leema Berland
7. 12/4 – Engaging in Argument from Evidence – Joe Krajcik
8. 12/18 – Obtaining, Evaluating, and Communicating Information – Philip Bell, Leah Bricker, and Katie Van Horne

All take place on Tuesdays from 6:30–8:00 p.m. ET

66

Next Web Seminar
October 23 (two weeks from today)

Analyzing and Interpreting Data

Teachers will learn more about:
– scientific investigations that produce data;
– the range of tools scientists use for scientific investigations – including tabulation, graphical interpretation, visualization, and statistical analysis – to identify the significant features and patterns in the data;
– how modern technology makes the collection of large data sets much easier, providing secondary sources for analysis;
– engineering investigations that include analysis of data collected in tests of designs; and
– the range of tools engineers use to identify patterns within data and interpret the results.

67

Presenter: Ann Rivet

Graduate Credit Available

Shippensburg University will offer one (1) graduate credit to individuals who attend or view all eight webinars.

Participants must either:
– attend the live presentation, complete the survey at the end of the webinar, and obtain the certificate of participation from NSTA; or
– view the archived recording and complete the reflection question for that particular webinar.

In addition, all participants must complete a 500-word reflection essay.

The total cost is $165.

For information on the course requirements, as well as registration and payment information, visit www.ship.edu/extended/NSTA

68

69

Community Forums

NSTA Area Conferences

These three conferences will include a number of sessions about the K–12 Framework and the highly anticipated Next Generation Science Standards.

Among the sessions will be an NSTA-sponsored session focusing on the Scientific and Engineering Practices.

70

NSTA Print Resources

NSTA Reader’s Guide to the Framework

71

NSTA Journal Articles about the Framework and the Standards

Thank you to the sponsor of tonight’s web seminar:

This web seminar contains information about programs, products, and services offered by third parties, as well as links to third-party websites. The presence of a listing or such information does not constitute an endorsement by NSTA of a particular company or organization, or its programs, products, or services.

72

National Science Teachers Association
Gerry Wheeler, Interim Executive Director

Zipporah Miller, Associate Executive Director, Conferences and Programs

Al Byers, Ph.D., Assistant Executive Director, e-Learning and Government Partnerships

Flavio Mendez, Senior Director, NSTA Learning Center

NSTA Web Seminars
Brynn Slate, Manager

Jeff Layman, Technical Coordinator

73