TRANSCRIPT
APS March Meeting
14 March 2017, New Orleans, LA
Best Practices in Physics Program Assessment: Should APS Provide Accreditation Standards for Physics?
Theodore Hodapp, Director of Project Development and Senior Advisor to Education and Diversity
www.aps.org ©2017, American Physical Society; Email: [email protected]
Physics / STEM Bachelor Degrees

[Chart: physics bachelor's degrees (left axis, 0 to 16,000) and all STEM bachelor's degrees (right axis, 0 to 400,000), 1965 to 2015. Source: IPEDS]
Percentage of Women in Physics

[Chart: percentage of women among physics bachelor's and doctoral degree recipients, 1965 to 2015 (vertical axis 0% to 25%). Source: IPEDS]
Physics Bachelor Degrees Awarded to Underrepresented Minorities

[Chart: percentage of physics bachelor's degrees awarded to Hispanic and African American students, 1995 to 2015 (vertical axis 0% to 9%). Source: IPEDS]
High School Classes Taught by a Teacher with a Degree in the Field

[Chart: percentage of classes taught by an in-field teacher, by subject: Social Studies, English, Biology, Math, Physics, Chemistry (vertical axis 0% to 90%). Source: Schools and Staffing Survey]
Change in Public Funding of Higher Education in the US
Section 1: Addressing Current Financial Challenges
Public Higher Education
Higher education has long been described as the “balance wheel” of state budgets. Public colleges and universities are different from other state agencies: they have their own revenue streams, they can adjust their program offerings, and they have some control over employee salaries. As this report will make clear, their budgets are not infinitely flexible, but they are more flexible than the budgets of most state institutions. Accordingly, the states tend to increase their contributions to public higher education when the economy is strong, and cut their contributions when the economy is weak.
Figure 1: Percent Change in State Support for Public Higher Education (All Colleges and Universities) per Full-Time Equivalent Student, in Constant 2014 $, since 2000

[Chart: percent change by year, 2000 to 2014 (vertical axis -40% to +5%)]
Despite modest increases in 2013 and 2014, state support for public higher education per full-time equivalent student remains nearly 30 percent below spending in 2000, after adjusting for inflation using the State Higher Education Finance cost adjustment. Source: State Higher Education Executive Officers (SHEEO) Association, SHEF: FY 2014—State Higher Education Finance (Boulder, Colo.: State Higher Education Executive Officers Association, 2015).
Source: The Lincoln Project: Public Research Universities - Recommitting to Lincoln’s Vision: An Educational Compact for the 21st Century (Amer. Acad. of Arts & Sciences)
What do these institutions have in common?
• Cleveland State University
• Elizabeth City State University
• Long Island University
• Minnesota State University Moorhead
• Prairie View A&M University
• Southern Oregon University
• Tennessee State University
• Texas Southern University
• University of Northern Iowa
• University of Southern Maine
Physics Bachelor Degrees
www.aps.org/programs/education/statistics/compare.cfm
BPUPP: Brief Timeline
STB*: Requests to APS to do what ACS does: program certification
2012: APS leadership asks to investigate
2013: Working group formed to investigate
2014: Survey of physics chairs, report written
2015: Committee on Education (COE) discusses and makes recommendation to APS Council; ABET announces intention to accredit all science fields
2015: APS Council charges COE to form task force
2016: COE begins process, drafts preliminary documents, recruits task force
2016: Task force starts meeting
2017: Applied for funding, beginning drafts and discussions on underlying issues

*Since Time Began
Guide for Undergraduate Physics Program Assessment, Review, and Improvement
1. Develop a guide for self-assessment of undergraduate physics programs founded on documented best practices linked to measurable outcomes
The guide should provide a physics-community-based resource to assist programs in developing a culture of continuous self-improvement, in keeping with their individual mission, context, and institutional type. The guide should include considerations of curricula, pedagogy, advising, mentoring, recruitment and retention, research and internship opportunities, diversity, scientific skill development, career/workforce preparation, staffing, resources, and faculty professional development.
2. Recommend a plan for ongoing review and improvement of this guide under the oversight of the APS Committee on Education
APS BPUPP Task Force Members
Co-Chair: David Craig, Le Moyne College
Co-Chair: Michael Jackson, Millersville University of Pennsylvania
• Noah Finkelstein, University of Colorado Boulder
• Courtney Lannert, Smith College and UMass Amherst
• Ramon Lopez, University of Texas at Arlington
• Willie Rockward, Morehouse College
• Gay Stewart, West Virginia University
• Gubbi Sudhakaran, University of Wisconsin-La Crosse
• Kathryn Svinarich, Kettering University
• Carl Wieman, Stanford University
• Lawrence Woolf, General Atomics Aeronautical Systems, Inc.

Project Manager: Sam McKagan
Staff Liaison: Ted Hodapp; Task Force Support: Miranda Bard
AAPT Liaison: Bob Hilborn
www.aps.org/bpupp
What BPUPP is Doing
Designing a process to help department chairs with:
• Periodic program assessment (departmental review)
• Improving the usefulness of assessment
• Bringing together the known literature on each topic, and best practices where evidence-based knowledge is insufficient
• Encouraging departmental discussions on continuous improvement of the physics major using evidence-based methods
• Providing a leverage point for departments to advocate for resources to improve the major
• Engaging the PER community on departmental needs
Main Sections
• Recruitment and retention (enrollment strategies, degree tracks, partnerships, student interactions)
• Student learning (curriculum, education research, undergraduate research)
• Career preparation (teacher preparation, graduate school, diverse careers)
• Assessment (program and student learning)
• Diversity and equity
• Department climate and faculty professional development
• Department leadership
• Program review (for reviewers and departments)
1-Minute Exercise
• Take a piece of paper
• Write one or two things you would like addressed in this guide
• Pass the paper to the aisles
Interactive Education: Read this Article

Active learning increases student performance in science, engineering, and mathematics
Scott Freeman, Sarah L. Eddy, Miles McDonough, Michelle K. Smith, Nnadozie Okoroafor, Hannah Jordt, and Mary Pat Wenderoth
Department of Biology, University of Washington, Seattle, WA 98195; School of Biology and Ecology, University of Maine, Orono, ME 04469
Edited by Bruce Alberts, University of California, San Francisco, CA, and approved April 15, 2014 (received for review October 8, 2013)

To test the hypothesis that lecturing maximizes learning and course performance, we metaanalyzed 225 studies that reported data on examination scores or failure rates when comparing student performance in undergraduate science, technology, engineering, and mathematics (STEM) courses under traditional lecturing versus active learning. The effect sizes indicate that on average, student performance on examinations and concept inventories increased by 0.47 SDs under active learning (n = 158 studies), and that the odds ratio for failing was 1.95 under traditional lecturing (n = 67 studies). These results indicate that average examination scores improved by about 6% in active learning sections, and that students in classes with traditional lecturing were 1.5 times more likely to fail than were students in classes with active learning. Heterogeneity analyses indicated that both results hold across the STEM disciplines, that active learning increases scores on concept inventories more than on course examinations, and that active learning appears effective across all class sizes, although the greatest effects are in small (n ≤ 50) classes. Trim and fill analyses and fail-safe n calculations suggest that the results are not due to publication bias. The results also appear robust to variation in the methodological rigor of the included studies, based on the quality of controls over student quality and instructor identity. This is the largest and most comprehensive metaanalysis of undergraduate STEM education published to date. The results raise questions about the continued use of traditional lecturing as a control in research studies, and support active learning as the preferred, empirically validated teaching practice in regular classrooms.

constructivism | undergraduate education | evidence-based teaching | scientific teaching

Lecturing has been the predominant mode of instruction since universities were founded in Western Europe over 900 y ago (1). Although theories of learning that emphasize the need for students to construct their own understanding have challenged the theoretical underpinnings of the traditional, instructor-focused, “teaching by telling” approach (2, 3), to date there has been no quantitative analysis of how constructivist versus exposition-centered methods impact student performance in undergraduate courses across the science, technology, engineering, and mathematics (STEM) disciplines. In the STEM classroom, should we ask or should we tell?

Addressing this question is essential if scientists are committed to teaching based on evidence rather than tradition (4). The answer could also be part of a solution to the “pipeline problem” that some countries are experiencing in STEM education: for example, the observation that less than 40% of US students who enter university with an interest in STEM, and just 20% of STEM-interested underrepresented minority students, finish with a STEM degree (5).

To test the efficacy of constructivist versus exposition-centered course designs, we focused on the design of class sessions, as opposed to laboratories, homework assignments, or other exercises. More specifically, we compared the results of experiments that documented student performance in courses with at least some active learning versus traditional lecturing, by metaanalyzing 225 studies in the published and unpublished literature. The active learning interventions varied widely in intensity and implementation, and included approaches as diverse as occasional group problem-solving, worksheets or tutorials completed during class, use of personal response systems with or without peer instruction, and studio or workshop course designs. We followed guidelines for best practice in quantitative reviews (SI Materials and Methods), and evaluated student performance using two outcome variables: (i) scores on identical or formally equivalent examinations, concept inventories, or other assessments; or (ii) failure rates, usually measured as the percentage of students receiving a D or F grade or withdrawing from the course in question (DFW rate).

The analysis, then, focused on two related questions. Does active learning boost examination scores? Does it lower failure rates?

Results
The overall mean effect size for performance on identical or equivalent examinations, concept inventories, and other assessments was a weighted standardized mean difference of 0.47 (Z = 9.781, P << 0.001), meaning that on average, student performance increased by just under half a SD with active learning compared with lecturing. The overall mean effect size for failure rate was an odds ratio of 1.95 (Z = 10.4, P << 0.001). This odds ratio is equivalent to a risk ratio of 1.5, meaning that on average, students in traditional lecture courses are 1.5 times more likely to fail than students in courses with active learning. Average failure rates were 21.8% under active learning but 33.8% under traditional lecturing, a difference that represents a 55% increase (Fig. 1 and Fig. S1).

Significance
The President’s Council of Advisors on Science and Technology has called for a 33% increase in the number of science, technology, engineering, and mathematics (STEM) bachelor’s degrees completed per year and recommended adoption of empirically validated teaching practices as critical to achieving that goal. The studies analyzed here document that active learning leads to increases in examination performance that would raise average grades by half a letter, and that failure rates under traditional lecturing increase by 55% over the rates observed under active learning. The analysis supports theory claiming that calls to increase the number of students receiving STEM degrees could be answered, at least in part, by abandoning traditional lecturing in favor of active learning.

PNAS Early Edition; www.pnas.org/cgi/doi/10.1073/pnas.1319030111
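The headline statistics above can be checked against one another directly from the two average failure rates. A minimal sketch in Python (note: the exam SD of 13 percentage points used to convert the 0.47 SD gain into exam points is an illustrative assumption, not a figure from the paper; the paper's pooled odds ratio of 1.95 is a weighted meta-analytic estimate, so the value recomputed from the average rates comes out somewhat lower):

```python
# Sanity-check the arithmetic behind the Freeman et al. headline numbers.

fail_active = 0.218   # average failure rate under active learning
fail_lecture = 0.338  # average failure rate under traditional lecturing

# Risk ratio: how much more likely a lecture student is to fail.
risk_ratio = fail_lecture / fail_active
print(f"risk ratio: {risk_ratio:.2f}")  # 1.55 -> the "1.5 times more likely"

# Odds ratio: ratio of the odds of failing under each condition.
# (Lower than the pooled 1.95 because that is a weighted estimate.)
odds_lecture = fail_lecture / (1 - fail_lecture)
odds_active = fail_active / (1 - fail_active)
print(f"odds ratio: {odds_lecture / odds_active:.2f}")  # 1.83

# The "55% increase" is the relative increase in failure rate.
increase = (fail_lecture - fail_active) / fail_active
print(f"increase:   {increase:.0%}")  # 55%

# Illustrative only: converting 0.47 SDs into exam points assumes a
# typical exam SD; 13 percentage points is a hypothetical value.
exam_sd = 13.0
print(f"score gain: {0.47 * exam_sd:.1f} points")  # ~6 points, i.e. "about 6%"
```

The useful observation is that a risk ratio compares failure probabilities directly, while an odds ratio compares odds (p / (1 - p)); the two diverge as failure rates grow, which is why the paper reports both.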
Backward Design Goals
I want you to:
• Get a sense of the motivation driving this effort
• Understand the BPUPP Task Force vision and goals
• Agree that outcomes are being designed to help you, not hinder you
• Know who to contact with questions and concerns
Fundamentals of Writing Across the Curriculum (WAC)
Writing to Learn
• Using writing (and other forms of communication) to enhance learning

Learning to Write
• Improving writing skills within the disciplinary context
Implementation: Learning to Write
Learning to Write
• Class size is important (to provide timely feedback)
• Multiple opportunities to write (and receive feedback); some programs require at least 7,500 words (15 pages)
• Multiple revisions
• Assessed (graded); some programs require an assessment fraction
• Approached from different perspectives (e.g., concept outlines, drafts, document structure, editing, proofreading, ethics), and graded in these contexts
• Explicitly taught, not just required
• Support services
Implementation: Writing to Learn
Writing to Learn
• Informal (lab notebooks, whiteboards, 1-minute interactions, at the chalkboard, online posts, etc.), not intended for public consumption
• Many opportunities (throughout the entire course)
• Classroom expectation
• Graded on participation rather than writing
• Instructor acknowledgment
• Includes words, drawings, numbers, equations, etc.
Why the 1-Minute Writing? A Metacognitive Analysis
• Active engagement
• Sent a message of my interest
• Got (anonymous) feedback on your interests
• Example of writing to learn
• Physically engaged (can’t sleep)
Guide Timeline
2017: Initial drafts; external input on drafting sections; external section reviews by invited department chairs and experts in the field
2018: Limited release of first sections; feedback from community; continued development and feedback
2018: Begin workshops on use of guide; training of departmental reviewers
2019: Release of the entire guide (1st edition); continued workshops
2020: First review by COE to update/improve content, updating of review procedures to ensure fidelity of design principles
References
SPIN-UP 2002 (enrollment): aps.org/programs/education/undergrad/faculty/spinup/
T-TEP 2012 (teacher education): phystec.org/webdocs/TaskForce.cfm
Phys21 2016 (careers): compadre.org/phys21/
Vision and Change 2011 (biology): visionandchange.org
Writing across the curriculum: John Bean, “Engaging Ideas: The Professor’s Guide to Integrating Writing, Critical Thinking, and Active Learning in the Classroom”
Active learning: Scott Freeman, et al., “Active learning increases student performance in science, engineering, and mathematics,” PNAS 111 (23), 8410-8415 (2014).
More information?
Co-chairs:• Mike Jackson ([email protected])• David Craig ([email protected])
Project Manager: Sarah “Sam” McKagan ([email protected])
APS Contact: Ted Hodapp ([email protected])
Web Site: www.aps.org/bpupp