
Page 1

Using Technology for Content Delivery, Formative Assessment, and Reflection

Michael S. Kirkpatrick, JMU Computer Science

Bridgewater College Annual Pedagogy Project 2015

Page 2

Workshop Objectives

At the completion of this session, participants will demonstrate progress toward the following objectives:
• Summarize relevant literature for active learning
• Explain how to use videos for effective content delivery
• Describe how to incorporate formative assessment into a flipped classroom
• Identify practical strategies to encourage students’ day-to-day class preparation and metacognition

Page 3

Agenda

• Introductions and welcome
• Active learning preassessment
• Camtasia lessons learned
• Formative assessment and PI
• Metacognition and reflection
• Discussion and exploration

Page 4

“Adopting instructional practices that engage students in the learning process is the defining feature of active learning.”
– Michael Prince

Page 5

Benefits Illustrated

R. R. Hake, "Interactive-engagement vs traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses," Am. J. Phys. 66, 64–74 (1998). http://www.physics.indiana.edu/~sdi/ajpv3i.pdf

Measure of performance gain (defined below)
• Mechanics Diagnostic (MD) or Force Concept Inventory (FCI)
• 62 courses (14 traditional) at multiple institutions
• 6542 students (2084 in the traditional courses)
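For reference, the gain measure Hake plots is the average normalized gain, defined in the cited paper as the actual average gain divided by the maximum possible gain:

$$\langle g \rangle = \frac{\langle \%\,\mathrm{post} \rangle - \langle \%\,\mathrm{pre} \rangle}{100 - \langle \%\,\mathrm{pre} \rangle}$$

In Hake’s data, interactive-engagement courses averaged roughly ⟨g⟩ ≈ 0.48, about twice the ⟨g⟩ ≈ 0.23 of the traditional courses.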

Page 6

Collaborative and Cooperative Learning

…attention span during lecture is roughly fifteen minutes. After that, Hartley and Davies [36] found that the number of students paying attention begins to drop dramatically with a resulting loss in retention of lecture material. The same authors found that immediately after the lecture students remembered 70 percent of information presented in the first ten minutes of the lecture and 20 percent of information presented in the last ten minutes. Breaking up the lecture might work because students’ minds start to wander and activities provide the opportunity to start fresh again, keeping students engaged.

2) Promoting Student Engagement: Simply introducing activity into the classroom fails to capture an important component of active learning. The type of activity, for example, influences how much classroom material is retained [34]. In “Understanding by Design” [37], the authors emphasize that good activities develop deep understanding of the important ideas to be learned. To do this, the activities must be designed around important learning outcomes and promote thoughtful engagement on the part of the student. The activity used by Ruhl, for example, encourages students to think about what they are learning. Adopting instructional practices that engage students in the learning process is the defining feature of active learning.

The importance of student engagement is widely accepted and there is considerable evidence to support the effectiveness of student engagement on a broad range of learning outcomes. Astin [38] reports that student involvement is one of the most important predictors of success in college. Hake [39] examined pre- and post-test data for over 6,000 students in introductory physics courses and found significantly improved performance for students in classes with substantial use of interactive-engagement methods. Test scores measuring conceptual understanding were roughly twice as high in classes promoting engagement than in traditional courses. Statistically, this was an improvement of two standard deviations above that of traditional courses. Other results supporting the effectiveness of active-engagement methods are reported by Redish et al. [40] and Laws et al. [41]. Redish et al. show that the improved learning gains are due to the nature of active engagement and not to extra time spent on a given topic. Figure 1, taken from Laws et al., shows that active engagement methods surpass traditional instruction for improving conceptual understanding of basic physics concepts. The differences are quite significant. Taken together, the studies of Hake et al., Redish et al. and Laws et al. provide considerable support for active engagement methods, particularly for addressing students’ fundamental misconceptions. The importance of addressing student misconceptions has recently been recognized as an essential element of effective teaching [42].

In summary, considerable support exists for the core elements of active learning. Introducing activity into lectures can significantly improve recall of information while extensive evidence supports the benefits of student engagement.

B. Collaborative Learning

The central element of collaborative learning is collaborative vs. individual work and the analysis therefore focuses on how collaboration influences learning outcomes. The results of existing meta-studies on this question are consistent. In a review of 90 years of research, Johnson, Johnson and Smith found that cooperation improved learning outcomes relative to individual work across the board [12]. Similar results were found in an updated study by the same authors [13] that looked at 168 studies between 1924 and 1997. Springer et al. [43] found similar results looking at 37 studies of students in science, mathematics, engineering and technology. Reported results for each of these studies are shown in Table 1, using effect sizes to show the impact of collaboration on a range of learning outcomes.

[Figure 1. Active-engagement vs. traditional instruction for improving students’ conceptual understanding of basic physics concepts (taken from Laws et al., 1999)]

[Table 1. Collaborative vs. individualistic learning: Reported effect size of the improvement in different learning outcomes.]

What do these results mean in real terms instead of effect sizes, which are sometimes difficult to interpret? With respect to academic achievement, the lowest of the three studies cited would move a student from the 50th to the 70th percentile on an exam. In absolute terms, this change is consistent with raising a student’s grade from 75 to 81, given classical assumptions about grade distributions.* With respect to retention, the results suggest that collaboration reduces attrition in technical programs by 22 percent, a significant finding when technical programs are struggling to attract and retain students. Furthermore, some evidence suggests that collaboration is particularly effective for improving retention of traditionally underrepresented groups [44, 45].
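A quick sanity check of that percentile and grade arithmetic, under the footnoted assumptions (effect size 0.5, exam mean 75, normally distributed scores with the top 10 percent at 90 or higher). This is an illustrative sketch; the helper function is ours, not the paper’s:

```python
# Sketch of the effect-size arithmetic, assuming normally distributed exam
# scores with mean 75 and the top 10 percent of students scoring 90 or higher.
from scipy.stats import norm

def shifted_score(effect_size, mean=75.0, a_cutoff=90.0, top_frac=0.10):
    # The A cutoff pins down the spread: a_cutoff = mean + z_(1 - top_frac) * sigma
    sigma = (a_cutoff - mean) / norm.ppf(1 - top_frac)  # ~11.7 points
    percentile = norm.cdf(effect_size)                  # where the old median student lands
    return mean + effect_size * sigma, percentile

score, pct = shifted_score(0.5)
print(f"d = 0.50: grade 75 -> {score:.0f}, 50th -> {pct:.0%} percentile")  # 81, ~69%

# The same arithmetic with the 0.88 effect size reported below for high-quality
# cooperative-learning studies reproduces the 75 -> 85 jump cited there.
score_hq, _ = shifted_score(0.88)
print(f"d = 0.88: grade 75 -> {score_hq:.0f}")  # 85
```

The CDF puts the shifted student at about the 69th percentile, consistent with the rounded “50th to the 70th percentile” claim in the text.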

A related question of practical interest is whether the benefits of group work improve with frequency. Springer et al. looked specifically at the effect of incorporating small, medium and large amounts of group work on achievement and found the positive effect sizes associated with low, medium and high amount of time in groups to be 0.52, 0.73 and 0.53, respectively. That is, the highest benefit was found for medium time in groups. In contrast, more time spent in groups did produce the highest effect on promoting positive student attitudes, with low, medium and high amount of time in groups having effect sizes of 0.37, 0.26, and 0.77, respectively. Springer et al. note that the attitudinal results were based on a relatively small number of studies.

In summary, a number of meta-analyses support the premise that collaboration “works” for promoting a broad range of student learning outcomes. In particular, collaboration enhances academic achievement, student attitudes, and student retention. The magnitude, consistency and relevance of these results strongly suggest that engineering faculty promote student collaboration in their courses.

C. Cooperative Learning

At its core, cooperative learning is based on the premise that cooperation is more effective than competition among students for producing positive learning outcomes. This is examined in Table 2.

The reported results are consistently positive. Indeed, looking at high quality studies with good internal validity, the already large effect size of 0.67 shown in Table 2 for academic achievement increases to 0.88. In real terms, this would increase a student’s exam score from 75 to 85 in the “classic” example cited previously, though of course this specific result is dependent on the assumed grade distribution. As seen in Table 2, cooperation also promotes interpersonal relationships, improves social support and fosters self-esteem.

Another issue of interest to engineering faculty is that cooperative learning provides a natural environment in which to promote effective teamwork and interpersonal skills. For engineering faculty, the need to develop these skills in their students is reflected by the ABET engineering criteria. Employers frequently identify team skills as a critical gap in the preparation of engineering students. Since practice is a precondition of learning any skill, it is difficult to argue that individual work in traditional classes does anything to develop team skills.

Whether cooperative learning effectively develops interpersonal skills is another question. Part of the difficulty in answering that question stems from how one defines and measures team skills. Still, there is reason to think that cooperative learning is effective in this area. Johnson et al. [12, 13] recommend explicitly training students in the skills needed to be effective team members when using cooperative learning groups. It is reasonable to assume that the opportunity to practice interpersonal skills coupled with explicit instruction in these skills is more effective than traditional instruction that emphasizes individual learning and generally has no explicit instruction in teamwork. There is also empirical evidence to support this conclusion. Johnson and Johnson report that social skills tend to increase more within cooperative rather than competitive or individual situations [46]. Terenzini et al. [47] show that students report increased team skills as a result of cooperative learning. In addition, Panitz [48] cites a number of benefits of cooperative learning for developing the interpersonal skills required for effective teamwork.

In summary, there is broad empirical support for the central premise of cooperative learning, that cooperation is more effective than competition for promoting a range of positive learning outcomes. These results include enhanced academic achievement and a number of attitudinal outcomes. In addition, cooperative learning provides a natural environment in which to enhance interpersonal skills and there are rational arguments and evidence to show the effectiveness of cooperation in this regard.

D. Problem-Based Learning

As mentioned in Section II of this paper, the first step of determining whether an educational approach works is clarifying exactly what the approach is. Unfortunately, while there is agreement on the general definition of PBL, implementation varies widely. Woods et al. [16], for example, discuss several variations of PBL.

“Once a problem has been posed, different instructional methods may be used to facilitate the subsequent learning process: lecturing, instructor-facilitated discussion, guided decision making, or cooperative learning. As part of the problem-solving process, student groups can be assigned to

*Calculated using an effect size of 0.5, a mean of 75 and a normalized grade distribution where the top 10 percent of students receive a 90 or higher (an A) and the bottom 10 percent receive a 60 or lower (an F).

[Table 2. Collaborative vs. competitive learning: Reported effect size of the improvement in different learning outcomes.]

M. Prince, “Does active learning work? A review of the research,” J. Eng. Education 93(3), 223–241, 2004. http://www4.ncsu.edu/unity/lockers/users/f/felder/public/Papers/Prince_AL.pdf

Page 7

Problem-Based Learning

complete any of the learning tasks listed above, either in or out of class. In the latter case, three approaches may be adopted to help the groups stay on track and to monitor their progress: (1) give the groups written feedback after each task; (2) assign a tutor or teaching assistant to each group, or (3) create fully autonomous, self-assessed “tutorless” groups.”

The large variation in PBL practices makes the analysis of its effectiveness more complex. Many studies comparing PBL to traditional programs are simply not talking about the same thing. For meta-studies of PBL to show any significant effect compared to traditional programs, the signal from the common elements of PBL would have to be greater than the noise produced by differences in the implementation of both PBL and the traditional curricula. Given the huge variation in PBL practices, not to mention differences in traditional programs, readers should not be surprised if no consistent results emerge from meta-studies that group together different PBL methods.

Despite this, there is at least one generally accepted finding that emerges from the literature, which is that PBL produces positive student attitudes. Vernon and Blake [19], looking at 35 studies from 1970 to 1992 for medical programs, found that PBL produced a significant effect size (0.55) for improved student attitudes and opinions about their programs. Albanese and Mitchell [20] similarly found that students and faculty generally prefer the PBL approach. Norman and Schmidt [18] argue “PBL does provide a more challenging, motivating and enjoyable approach to education. That may be a sufficient raison d’etre, providing the cost of the implementation is not too great.” Note that these and most of the results reported in this section come from studies of medical students, for whom PBL has been widely used. While PBL has been used in undergraduate engineering programs [49, 50] there is very little data available for its effectiveness with this population of students.

Beyond producing positive student attitudes, the effects of PBL are less generally accepted, though other supporting data do exist. Vernon and Blake [19], for example, present evidence that there is a statistically significant improvement of PBL on students’ clinical performance with an effect size of 0.28. However, Colliver [22] points out that this is influenced strongly by one outlying study with a positive effect size of 2.11, which skews the data. There is also evidence that PBL improves the long-term retention of knowledge compared to traditional instruction [51–53]. Evidence also suggests that PBL promotes better study habits among students. As one might expect from an approach that requires more independence from students, PBL has frequently been shown to increase library use, textbook reading, class attendance and studying for meaning rather than simple recall [19, 20, 53, 54].

We have already discussed the problems with meta-studies that compare non-uniform and inconsistently defined educational interventions. Such studies are easily prone to factors that obscure results. The approach for handling this difficulty with active, collaborative and cooperative learning was to identify the central element of the approach and to focus on this rather than on implementation methods. That is more difficult to do with PBL since it is not clear that one or two core elements exist. PBL is active, engages students and is generally collaborative, all of which are supported by our previous analysis. It is also inductive, generally self-directed, and often includes explicit training in necessary skills. Can one or two elements be identified as common or decisive?

Norman and Schmidt [18] provide one way around the difficulty by identifying several components of PBL in order to show how they impact learning outcomes. Their results are shown in Table 3, taken directly from Norman and Schmidt using the summary of meta-studies provided by Lipsey and Wilson [17]. The measured learning outcome for all educational studies cited by Lipsey and Wilson was academic achievement.

Norman and Schmidt present this table to illustrate how different elements of PBL have different effects on learning outcomes. However, the substantive findings of Table 3 are also worth highlighting for faculty interested in adopting PBL because there seems to be considerable agreement on what works and does not work in PBL.

Looking first at the negative effects, there is a significant negative effect size using PBL with non-expert tutors. This finding is consistent with some of the literature on helping students make the transition from novice to expert problem solvers. Research comparing experts to novices in a given field has demonstrated that becoming an expert is not just a matter of “good thinking” [42]. Instead, research has demonstrated the necessity for experts to have both a deep and broad foundation of factual knowledge in their fields. The same appears to be true for tutors in PBL.

There is also a small negative effect associated with both self-paced and self-directed learning. This result is consistent with the findings of Albanese and Mitchell [20] on the effect of PBL on test results. In seven out of ten cases they found that students in PBL programs scored lower than students in traditional programs on tests of basic science. However, in three out of ten cases, PBL students actually scored higher. Albanese and Mitchell note that these three PBL programs were more “directive” than others…

[Table 3. Effect sizes associated with various aspects of problem-based learning.]

M. Prince, “Does active learning work? A review of the research,” J. Eng. Education 93(3), 223–241, 2004. http://www4.ncsu.edu/unity/lockers/users/f/felder/public/Papers/Prince_AL.pdf

Page 8

Active Learning in STEM

Heterogeneity analyses indicated no statistically significant variation among experiments based on the STEM discipline of the course in question, with respect to either examination scores (Fig. 2A; Q = 10.537, df = 7, P = 0.160) or failure rates (Fig. 2B; Q = 11.73, df = 6, P = 0.068). In every discipline with more than 10 experiments that met the admission criteria for the meta-analysis, average effect sizes were statistically significant for either examination scores or failure rates or both (Fig. 2, Figs. S2 and S3, and Tables S1A and S2A). Thus, the data indicate that active learning increases student performance across the STEM disciplines.

For the data on examinations and other assessments, a heterogeneity analysis indicated that average effect sizes were lower when the outcome variable was an instructor-written course examination as opposed to performance on a concept inventory (Fig. 3A and Table S1B; Q = 10.731, df = 1, P << 0.001). Although student achievement was higher under active learning for both types of assessments, we hypothesize that the difference in gains for examinations versus concept inventories may be due to the two types of assessments testing qualitatively different cognitive skills. This explanation is consistent with previous research indicating that active learning has a greater impact on student mastery of higher- versus lower-level cognitive skills (6–9), and the recognition that most concept inventories are designed to diagnose known misconceptions, in contrast to course examinations that emphasize content mastery or the ability to solve quantitative problems (10). Most concept inventories also undergo testing for validity, reliability, and readability.

Heterogeneity analyses indicated significant variation in terms of course size, with active learning having the highest impact on courses with 50 or fewer students (Fig. 3B and Table S1C; Q = 6.726, df = 2, P = 0.035; Fig. S4). Effect sizes were statistically significant for all three categories of class size, however, indicating that active learning benefitted students in medium (51–110 students) or large (>110 students) class sizes as well.

When we meta-analyzed the data by course type and course level, we found no statistically significant difference in active learning’s effect size when comparing (i) courses for majors versus nonmajors (Q = 0.045, df = 1, P = 0.883; Table S1D), or (ii) introductory versus upper-division courses (Q = 0.046, df = 1, P = 0.829; Tables S1E and S2D).

[Fig. 1. Changes in failure rate. (A) Data plotted as percent change in failure rate in the same course, under active learning versus lecturing. The mean change (12%) is indicated by the dashed vertical line. (B) Kernel density plots of failure rates under active learning and under lecturing. The mean failure rates under each classroom type (21.8% and 33.8%) are shown by dashed vertical lines.]

[Fig. 2. Effect sizes by discipline. (A) Data on examination scores, concept inventories, or other assessments. (B) Data on failure rates. Numbers below data points indicate the number of independent studies; horizontal lines are 95% confidence intervals.]


Meta-analysis of 225 studies
• 158 studies: average 0.47 SDs better on CIs/exams
• 67 studies: average failure rate dropped from 33.8% to 21.8% with active learning

S. Freeman et al., “Active learning increases student performance in science, engineering, and mathematics,” Proceedings of the National Academy of Sciences 111(23), 8410–8415, 2014. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4060654/pdf/pnas.201319030.pdf
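As a rough consistency check on those summary numbers (simple arithmetic on the reported means, assuming normally distributed scores; this is a sketch, not a re-analysis of the 225 studies):

```python
# Back-of-the-envelope checks on the Freeman et al. summary statistics above.
from scipy.stats import norm

lecture_fail, active_fail = 0.338, 0.218                       # mean failure rates (Fig. 1)
print(f"risk difference: {lecture_fail - active_fail:.1%}")    # 12.0% (the Fig. 1A mean change)
print(f"relative risk: {lecture_fail / active_fail:.2f}")      # ~1.55: lecture students ~1.5x more likely to fail

# A 0.47 SD improvement moves a median student to roughly the 68th percentile,
# assuming normally distributed exam/concept-inventory scores.
print(f"d = 0.47 percentile shift: {norm.cdf(0.47):.0%}")      # ~68%
```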

Page 9

From Content Delivery to Formative Assessment

Pages 10–13: [image-only slides]

Page 14

Lessons Learned

Be mindful of accessibility
• Never use red and green contrasts
• Create subtitles or transcripts as needed

Write a script first
• Doing so will save you time later
• Makes for easy transcript creation

Align videos with learning objectives
• Use a discussion forum for students to ask questions
• Incorporate an ungraded Moodle quiz for practice

Page 15

Formative Assessment and Peer Instruction

Page 16

“Feedback is value-neutral help on worthy tasks. It describes what the learner did and did not do in relation to her goals. It is actionable information, and it empowers the student to make intelligent adjustments when she applies it to her next attempt to perform.”
– Grant Wiggins

Page 17

Peer Instruction

• Created by Eric Mazur (Harvard)
• Augment class with ConcepTests
• Expose common misconceptions
• Think-vote-pair-revote pattern
• E. Mazur, Peer Instruction: A User’s Manual, 1996.
• http://mazur.harvard.edu/research/detailspage.php?rowid=8

Page 18

Physics Question 1

Page 19

Closing the Gender Gap

PI can eliminate the gender gap in physics
• T: traditional lectures
• IE: interactive lectures
• IE+: interactive assignments, lectures, tutorials

Cooperative learning closes the gender gap
• Pretest scores were 10 percentage points higher for men
• Gap persisted with lecture alone
• Posttest results for cooperative classes were almost equal

E. Mazur, “The scientific approach to teaching: Research as a basis for course design,” keynote/plenary talk at the International Computing Education Research Conference (ICER), 2011. http://mazur.harvard.edu/search-talks.php?function=display&rowid=1712

Page 20

Closing the Gender Gap

[Two scatter plots of gain (%) vs. pretest score (%), plotted separately for men and women.]

Traditional lectures leave women behind
• Women tend to have smaller performance gains

Cooperative learning improves gains for women
• ...but men improve as well

E. Mazur, “The scientific approach to teaching: Research as a basis for course design,” keynote/plenary talk at the International Computing Education Research Conference (ICER), 2011. http://mazur.harvard.edu/search-talks.php?function=display&rowid=1712

Page 21

Physics Question 2

Page 22

Demos and Engagement

Performance and understanding increase with engagement
• Those who only observe sometimes learn it wrong
• Those who discuss show clearer reasoning and provide partially correct answers

mode     | correct | balances | no clear
no demo  | 31%     | 53%      | 42%
observe  | 42%     | 55%      | 42%
predict  | 41%     | 65%      | 32%
discuss  | 46%     | 85%      | 15%

Page 23

Physics Question 3

Page 24

Confusion and Understanding

“Please tell us briefly what points of the reading you found most difficult or confusing.”
• “Nothing was difficult or confusing.”
• “I found the explanation inadequate. I don’t understand the reasoning that led to the conclusion.”

Page 25

Best Practices for Formative Assessment

Identify the learning gap
• Space between what students know and need to know

Bidirectional feedback
• Identify student progress and suggest corrections

Actively engage students
• Students need to assess their own understanding

Create learning progressions
• Break larger goal into subgoals

Page 26

Best Practices for Formative Assessment

Ask clear questions
• Avoid ambiguity, use only one verb

Psychological safety
• Positive reinforcement, accept imperfection

Sequencing and balance
• Consider the order and type of questions, activities

Wait time
• Come to terms with silence

Avoid pimping questions
• Do not try to establish intellectual superiority

Page 27: [image-only slide]

Page 28

Metacognition Space Race

Page 29

Metacognitive Activities

Preassessments
• What do I already know about this topic?

Muddiest points
• What am I still confused about?

Exam corrections
• Why did I miss this question?

Documented problem solving
• What were the steps I used to solve this problem?

Learning progress journals
• How did my understanding of this concept change?

Page 30

RSQC2

Recall the most important points
• Requires students to analyze relative importance

Summarize the most important points
• Provides practice with comprehension

Construct a question you would like answered
• Encourages reflection and evaluation

Connect this material to other concepts
• Establishes scaffolding from previously learned material

Comment on your learning progress this week
• Uses metacognitive reflection to instill study habits

Page 31

Tools for Metacognitive Activities

Socrative
• Good use: quick in-class polling

Qualtrics
• Good use: offline, anonymous surveys

Moodle
• Good use: graded and offline assessments

Social media
• Good use: blogs as learning journals

Piazza
• Good use: collaborative student discussions and answers

Page 32

Best Practices for Metacognition

Develop metacognitive culture
• Give students freedom to be confused
• Integrate reflection into credited course work
• Model metacognitive behavior

Teach the concept and language of metacognition
• Explicit instruction over time expands the skill set

Reflect the specific learning context
• Metacognition is NOT generic

Externalize mental events
• Increase accurate awareness of strengths and weaknesses

Page 33

Bloom’s Taxonomy

Page 34

Discussion and Exploration

Page 35

Other Technology Tools

Twitter
• Good use: announcements, highlight news stories

CATME
• Good use: team formation, peer evaluation

Top Hat Monocle
• Alternative to Socrative (requires paid subscription)

Asynchronous MOOC videos
• edX, Khan Academy, MIT OpenCourseWare

Page 36

Resources

Peer Instruction (PI)
• E. Mazur, Peer Instruction: A User’s Manual, 1996.
• http://mazur.harvard.edu/research/detailspage.php?rowid=8

Formative Assessment
• T. A. Angelo and K. P. Cross, Classroom Assessment Techniques: A Handbook for College Teachers, 1993.
• http://wvde.state.wv.us/teach21/ExamplesofFormativeAssessment.html
• http://cft.vanderbilt.edu/guides-sub-pages/cats/

Metacognition
• E. Cook, E. Kennedy, and S. Y. McGuire, “Effect of Teaching Metacognitive Learning Strategies on Performance in General Chemistry Courses,” J. Chem. Educ., 2013, 90(8), pp. 961–967.
• http://www.lmu.edu/Assets/Centers+$!2b+Institutes/Center+for+Teaching+Excellence/Teach+STEM+Students+How+to+Learn--Metacognition+is+the+Key!+Slides.pdf

Page 37

Resources

Others
• J. D. Bransford, A. L. Brown, and R. R. Cocking, How People Learn: Brain, Mind, Experience, and School, 2000.
• M. Weimer, Learner-Centered Teaching: Five Key Changes to Practice, 2002.
• G. Wiggins and J. McTighe, Understanding by Design, 2005.
• N. Pinchok and W. C. Brandt, “Connecting Formative Assessment Research to Practice: An Introductory Guide for Educators,” 2009. http://www.learningpt.org/pdfs/FormativeAssessment.pdf
• N. Chick, Metacognition. http://cft.vanderbilt.edu/guides-sub-pages/metacognition/
• T. Tofade, J. Elsner, and S. T. Haines, “Best Practice Strategies for Effective Use of Questions as a Teaching Tool,” Am. J. Pharm. Educ. 2013 Sep 12; 77(7): 155.
