
2013-2014 Biology Scholars Program Capstone Event

Thursday, May 15th, 2014, 10:00am – 3:00pm

DoubleTree by Hilton Boston North Shore Hotel Danvers, Massachusetts

Pre-Capstone Assignment

Pre-Capstone Assignment Instructions

Assessment Scholars

Research Scholars



2013-2014 Biology Scholars Capstone Pre-Assignment Instructions

1. Read the publication, “Creating a Faculty Culture of Student Success”: http://www.aspeninstitute.org/sites/default/files/content/docs/pubs/Creating%20A%20Faculty%20Culture%20of%20Student%20Success.pdf

2. Click the "Attach File or Image" button at the bottom of the Wiki Capstone page (you must be logged in to the Wiki). Upload the following information as one Word Document:

a. An abstract of your journey since leaving the training Institute last summer (up to 300 words). Include what has worked well, what has been a challenge, and what you still hope to accomplish as a Biology Scholar Alum;

b. A table or a figure that helps to illustrate your past year's work; and

c. 1-3 references that have informed your past year's work.

3. Familiarize yourself with your institution’s mission statement and/or goals. Bring 20 hardcopies of this information to share with other Scholars at the Capstone.


ASSESSMENT SCHOLARS

Sarah Ades, Penn State University, University Park, PA – 2013-2014 Assessment

After reading this abstract, you will understand what shaped my approach to teaching in the past year and the challenges and hopes I have for the future. In short, my evolving understanding of the concept of learning objectives had an enormous impact on how I taught this year. The BSP residency, emphasizing the links between objectives, classroom activities, and assessment, helped me realize how valuable objectives are. In planning for classes early in the year, I would start with my default of having lists in my head of what to cover, but I would catch myself and ask, "What should the students know at the end of this class or unit, and how are you going to get them there?" Intellectually this process made sense last summer, but putting it into practice made everything come together. In addition to this teaching approach epiphany, I've used more active learning techniques. In my seminar course, we changed from trying to lead unstructured group discussions to using small group discussions, peer editing, role-playing, etc. This approach worked amazingly well, and students noted it on their semester evaluations. The lesson plans that I developed last summer for my lab course helped immensely. The castle diagrams were incredibly useful in organizing what to do when. I had also developed a problem set related to the last lab module based on data from the primary literature. The problem set made the students think about what the experiment meant before they did it on their own. I plan to develop similar problem sets for other lab modules. My major challenge is time management: how to balance ideas for teaching and for building more courses that incorporate research and active learning with sustaining my research program. My solution? Don't stop! I'm joining the 2014-15 BSP Research Residency.

My approach to teaching has become more and more like my approach to research. I expect them to meld even more as I start to do research on teaching and learning.

References that informed my work:

1. Handelsman, J., Miller, S., and Pfund, C. Scientific Teaching. New York: W. H. Freeman and Co., 2007.

2. Branchaw, J., Pfund, C., and Rediske, R. Entering Research: A Facilitator's Manual. New York: W. H. Freeman and Co., 2010.

3. Ambrose, S.A., Bridges, M.W., DiPietro, M., Lovett, M.C., and Norman, M.K. How Learning Works. San Francisco: Jossey-Bass, 2010.


Diane Hartman, Baylor University, Waco, TX – 2012-2013 Assessment

I attended the Summer 2012 Assessment Residency. Upon returning home I was very discouraged, frustrated and disappointed. This was not what I had anticipated. There were no truly tangible assessments that I felt confident about using in my large introductory biology and microbiology classes of 100+ students.

My position as lecturer does not come with teaching assistants or undergraduate student worker funding. Many of the activities and rubrics work best with more one-on-one interaction. Colleagues evaluated my classes as part of my 2013 performance review. The overall summary was positive. One professor said that watching me “run around the room” to help the groups exhausted him. Another said that this was “exactly what we are supposed to be doing for active learning”. A third noted that several students were checking email and shopping during the group work time. It is very challenging to incorporate active learning in the context of large classes with fixed seating.

Grades for the larger classes are based primarily (80%) on summative assessments (multiple choice exams), with i>clicker quizzes, Mastering homework, and four active learning projects comprising the remaining 20%. The active learning assessments were easy to distribute but extremely time consuming to read and evaluate. Meaningful, timely feedback was a major problem for me.

This past year a colleague and I have developed our version of the Small World Initiative at Baylor. We had 13 students working in groups of 3-4. These students worked in a BSL-2 lab, developed oral presentations on ESKAPE pathogens, and oral/poster presentations about their specific lab research. All groups presented their posters at one or more scientific meetings this spring. This has provided a unique opportunity to “engage” students in life-long learning. At this level, development of learning goals and learning objectives was greatly simplified. Student engagement was outstanding.

Date and activity:

Summer 2013: Returned to the Mountain West Summit as the "alum" from the Baylor University 2012 team

Summer 2013: Attended training program at Yale as a Pilot Partner for the Small World Initiative

Spring 2014: Initiated the first cohort of students in the SWI at BU

Fall 2013-Spring 2014: ASMCUE Biosafety Microbrew, "SWI - Crowdsourcing Antibiotic Discovery"; ASM poster, "SWI - Crowdsourcing Antibiotic Discovery"

Baylor University's Academy for Teaching and Learning (ATL) has a two-fold mission: globally, to support and inspire a flourishing community of learning; locally, to promote the integration of teaching, scholarship, collegiality, and service in a Christian environment. This program includes FIG (faculty interest groups) luncheon presentations, SET (seminars for excellence in teaching) lectures, and a Lecturer Mentoring Program. I have attempted to attend as many of these sessions as possible. FIG topics included:

Compelling Scholarship: review and generate scholarship of teaching and learning (SoTL). The primary aim is to strengthen the connections between teaching and learning on Baylor's campus.

Transformational Education: the purpose of this FIG is to promote active, engaged, and innovative learning across all schools and disciplines.

Judicious Stewardship: the purpose of this FIG is to learn and utilize peer and course assessment techniques to increase the value and quality of a Baylor education. Low-stakes assessments, teaching observations and feedback, student evaluations, and learning goals and objectives will be among the topics discussed.

SET: Alexander Beaujean, "Evidence-Based Classroom Assessment"


Testing improves long-term retention of material. Testing has a cumulative effect, meaning that the more tests a professor administers throughout the semester, the better the learning outcomes. One great way of administering repeated tests is to use a practice format, such as short quizzes. As a result of repeated assessment, students are more likely to remember the material and they become aware of their own knowledge of the subject. Professors are also more aware of the students' knowledge. In general, testing helps students organize their knowledge conceptually and can contribute to their ability to apply knowledge in more complex contexts. Finally, testing promotes great study habits. Ultimately, tests are cheap, effective, and undeniably beneficial for learning.

LM: Dr. Marty Harvill, "Incorporation of Student Accountability into the Planning of a Course"

LM: Dr. Nicole McAninch, "Making Student Course Evaluations Work for You"

Presenter LM: I presented the April 2014 Lecturer Mentoring Topic: “Strategies for Pursuing Undergraduate Research as a Lecturer”.

References:

1. Handelsman, J., C. Pfund, S. Lauffer, C. Pribbenow. 2008. Entering Mentoring. University of Wisconsin Press, Madison, Wisconsin.

2. Phillips, A., Robertson, A., Batzli, J., Harris, and Miller, S. 2008. Aligning Goals, Assessments, and Activities: An Approach to Teaching PCR and Gel Electrophoresis. CBE-Life Sciences Education 7: 96-106.

Nathalia Holtzman, Queens College, City University of New York, Queens, NY – 2013-2014 Assessment

The two most important things I took away from the summer workshop were 1) allowing class time for the development of non-content-based learning goals and 2) truly considering the time my students spend on my class outside of face time and how to transition between in-class and home learning (castle-top design). The course I focused on is called Science, Technology and the City of New York. To include more time in my class to pursue learning goals such as "caring about the city we live in," I included time within many lectures for students to explore how the disease of the week directly impacted New York City. These in-class discussions were paired with homework assignments that included self-reflections (something I had never tried). Overall, the class was better able to see the link between the course content and NYC; however, I was not sure how much more they actually cared for the city. I also introduced online pre-class assignments that were heavily used to direct the first 15 minutes of class time. I found the insight from the pre-class activities invaluable, and it greatly improved the focused time within the lecture. Since this is the first time I taught this class, I don't have any way to determine whether this approach improved student learning, but by the end of the semester a number of students indicated that they could clearly see that the level of complexity of the course and the expectations of the students were high yet very reachable.

The capstone project for this class was a community health fair organized entirely by the students. They prepared the material, brought in outside guests, and hosted the event. Over 600 people attended; more than 15 received HIV screening, 17 people joined the bone marrow registry, and over 30 people were screened for blood glucose and high blood pressure and had dental screenings. Here are some photos of the event.


References that informed my work:

1. Grajek, Susan (2013) Understanding what higher education needs from E-Textbooks: An EDUCAUSE/Internet2 Pilot (Research Report), Louisville, CO: EDUCAUSE Center for Analysis and Research. Available from http://www.educause.edu/ecar.

2. Krathwohl, David R. (2002) A Revision of Bloom's Taxonomy: An Overview. Theory Into Practice, 41:4, 212-218. DOI: 10.1207/s15430421tip4104_2. Available from http://dx.doi.org/10.1207/s15430421tip4104_2.

Megan Howard, University of Alaska - Anchorage, Anchorage, AK – 2013-2014 Assessment

Participating in the institute this last year has been eye-opening in several ways, but there were two main takeaways from the Assessment residency for my ~100-person Allied Health Microbiology course: (1) aligning my learning objectives and (2) instituting objective-driven assessment.

I have written several learning objectives for each of my lectures (example: Fig. A) and have used these to structure my lectures, to choose in-class formative activities, and to post for my students to target their studying. These have also helped me implement backward design for my lectures, keeping me 'honest' when it came to what material I should cover and did cover. While these are not fully developed, I plan to finish them for Fall. Using backward design, I feel that my lectures were more organized, and my students were better able to follow the material and were more prepared in class.

I also implemented online homework for formative assessment and redesigned lab homework assignments this year. I chose assignments carefully to match class content, and used 'coaching'-style questions and higher-level Bloom's questions (analyze, discuss, etc.) to stimulate critical thinking. When I compared homework grades with exam/final grades for the semester (Fig. B), the graphs suggest that while lab homework scores varied together with final lab exam grades, online lecture homework fluctuated more loosely relative to exam grades. While neither fit suggests a statistically significant correlation, anecdotal evidence suggests that these assessments are helpful and should be evaluated further and more specifically (for example, by matching specific homework items to exam questions).
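
As an illustration of the kind of homework-to-exam comparison described above, here is a minimal sketch in Python, assuming per-student scores are available as paired lists; the numbers and variable names are hypothetical placeholders, not data from this course.

```python
# Hypothetical sketch: correlate homework scores with exam scores for the same students.
# The score lists below are placeholders, not actual course data.
from scipy.stats import pearsonr

lab_homework = [78, 85, 92, 60, 88, 74, 95, 81]    # one averaged homework score per student
final_lab_exam = [72, 80, 90, 58, 85, 70, 93, 77]  # final lab exam score for the same students

r, p_value = pearsonr(lab_homework, final_lab_exam)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")

# A sizable r with a small p-value would support the impression that homework
# performance tracks exam performance; matching specific homework items to
# specific exam questions would require item-level data rather than totals.
```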

I have found this year that aligning my assignments and learning objectives has made me realize how critical objectives are when organizing my lectures and being clear with my students regarding which objectives are emphasized in each assessment and on the exams.


A: [example lecture learning objectives; figure not reproduced]

B: [homework versus exam grade graphs; figure not reproduced]

References that informed my work:

1. Fink, L. Dee. (2013). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. San Francisco: Jossey-Bass.

Jerry Kavouras, Lewis University, Romeoville, IL – 2013-2014 Assessment

Two insights that I gained from the summer training institute were (1) I was planning my courses inefficiently and (2) I needed to better incorporate Bloom's taxonomy into my course assignments and assessments. I realized during the summer institute that I did not implement, integrate, and align assessments and assignments with course learning outcomes effectively. Before the institute, I would determine the content and plan assignments and assessments throughout the semester at regular intervals. After the institute, I started planning my courses using the backward design model to align lectures with assignments and assessments on a weekly basis. I believe that revising my courses and organizing assessments and assignments on a weekly or biweekly basis really helped my students better achieve the course learning outcomes. I also realized that this approach made my lectures more effective. With regard to assignments and assessments, I realized that I was not using Bloom's taxonomy effectively. I needed to be clearer with my students about which higher-order thinking skills were emphasized in the assignments. I believe that this helped my students create better final products and improved my assignment designs. Future goals include redesigning all courses using this approach, assessing the effectiveness of my designs, and staying current in pedagogical best practices.

In my first assignment for the Assessment Residency, I constructed an assignment grid to demonstrate the implementation of the backward design model in my ecology course. The grid demonstrates how I changed my approach to designing courses.

Unit 1: Biodiversity
Learning outcomes: Students should be able to:
define components of biodiversity and provide examples for each component
list aquatic and terrestrial biomes and their general characteristics
define disturbance and describe its significance to biodiversity
define succession and its significance to biodiversity
Taxonomy level/category: Knowledge, apply, evaluate
Formative assessments: 1. Knowledge grid; 2. SimBio Virtual Lab: Intermediate Disturbance Hypothesis; 3. Class discussion of reading assignments (Easter's End, Tragedy of the Commons)
Summative assessments: 1. Exam; 2. Weekly quiz; 3. Group project: Biomes posters

Unit 2: The Environment and Organisms
Learning outcomes: Students should be able to:
distinguish between microclimates and macroclimates
describe the significance of temperature in relation to the performance of organisms
describe strategies that organisms use to regulate body temperature
describe strategies that organisms use to balance water loss/gain
describe methods organisms use to obtain energy and the optimal foraging theory
describe biogeochemical cycles and factors that influence them
Taxonomy level/category: Knowledge, apply, evaluate
Formative assessments: 1. Knowledge grid; 2. SimBio Virtual Lab: Liebig's Barrel and Limiting Nutrients
Summative assessments: 1. Exam; 2. Weekly quiz

Unit 3: Population Ecology
Learning outcomes: Students should be able to:
define niche
describe factors that influence the distribution and abundance of populations
distinguish among geometric, exponential, and logistic population growth
interpret survivorship curves
describe approaches used to organize life histories
Taxonomy level/category: Knowledge, apply, evaluate
Formative assessments: 1. Knowledge grid; 2. SimBio Virtual Lab: Isle Royale
Summative assessments: 1. Exam; 2. Weekly quiz

Unit 4: Species Interactions
Learning outcomes: Students should be able to:
define social, exploitative, competitive, and mutualistic relationships
describe the evolution of sociality and cooperation
describe the Lotka-Volterra models for competition and exploitation
define keystone species
define trophic cascade
Taxonomy level/category: Knowledge, apply, evaluate
Formative assessments: 1. Knowledge grid; 2. SimBio Virtual Lab: The Barnacle Zone; 3. SimBio Virtual Lab: Keystone Predators; 4. SimBio Virtual Lab: Top Down Control
Summative assessments: 1. Exam; 2. Weekly quiz

References that informed my work:

1. Fink, L. Dee. (2013). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. (2nd ed.). San Francisco: Jossey-Bass.

Joan Kiely, Stony Brook University, Stony Brook, NY – 2013-2014 Assessment

Participation in the Biology Scholars Assessment Residency catalyzed a dramatic change in my teaching and scholarship at Stony Brook University. I believe that key factors in this change were immersion in content specific pedagogy, participation in focused and relevant workshops as well as valuable feedback from other fellows and the program leaders. This program made me aware of new ways to envision my role as teacher and better ways to set learning goals for my students. The students in my target course were the first to benefit from my experience as a scholar. I rewrote the syllabus and laboratory manuals with explicit learning goals. Castle-top diagrams, which I strongly disliked during the residency program, turned out to be really useful when designing specific assignments – yup you were right. Better discussion in class and more relevant exams followed from these clearer objectives. The changes spread beyond my target course. Since participating in the Biology Scholars Program, I have been able to share my experience and knowledge with colleagues engendering changes in other courses at Stony Brook and I am now part of a team redesigning a course for medical students. A clearer view of learning objectives allowed me to better assess the outreach program that I oversee. I have presented this data at two international meetings: The Association of Science Teacher Education and The National Association for Research in Science Teaching. I currently have one paper under review and another in preparation. The Biology Scholars Program engendered a change in my career view as well and I am considering pursuing a Ph.D. in Science Education. In the next year, I plan to continue redesign of syllabi and content for two other courses.


References that informed my work:

1. Ingraham, J.L. March of the Microbes. Harvard University Press, 2010. ISBN 978-0-674-06409-6.

2. Bell, P., Lewenstein, B., Shouse, A.W., and Feder, M.A. Learning Science in Informal Environments: People, Places and Pursuits. The National Academies Press, 2009.

3. Gess-Newsome, J., Southerland, S.A., Johnston, A., and Woodbury, S. Educational reform, personal practice theories and dissatisfaction: The anatomy of change in college science teaching. American Educational Research Journal, 2003, 40(3): 731-767.

Sally Molloy, University of Maine, Orono, ME – 2013-2014 Assessment

I am in my third year of teaching a Phage Genomics course to first-year University of Maine Honors students majoring in STEM. The design of the Phage Genomics course is built around the curriculum structure provided by the Howard Hughes Medical Institute-Science Education Alliance's program providing first-year students the opportunity to actively learn current techniques in the field of microbiology and genomics while carrying out their own novel research project. I expanded the basic active research component of the course to include group activities and reflective assignments that teach scientific content while simultaneously teaching students a variety of personal, interpersonal, and critical thinking skills essential to good science. I defined and focused on the following skills and attributes: the capacity for interdependent thinking, the ability to recognize past knowledge that becomes relevant in a new situation, the ability to develop a specific strategy and to create a visual tool to solve a complex problem, the ability to persist in the face of challenging problems knowing that this kind of courage pays off, the ability to communicate ideas/results clearly, and the capacity to develop some degree of self-awareness, openness, and the kind of interpersonal effectiveness that enhances the development of individual learners who work well within a research team. The success of this course has inspired the faculty of my department to begin designing an additional integrative, research-based laboratory class experience, based on the design of Phage Genomics, to replace several existing laboratory course components. Additionally, the faculty is willing to participate in summer workshops in which we will carry out the course design strategies of the BioScholars Assessment residency. Despite the success of this course with students and faculty, we are struggling to win the support of the University administration, whose focus is on courses with large student numbers.

References that informed my work:

1. T.A. Angelo and K.P. Cross. 1993. Classroom Assessment Techniques: A Handbook for College Teachers. 2nd Ed. Jossey-Bass, Inc. Publishers, San Francisco.

2. L.D. Fink. 2003. Creating Significant Learning Experiences: An Integrated Approach To Designing College Courses. Jossey-Bass, Inc. Publishers, San Francisco.

3. A.L. Costa and B. Kallick. 2008. Learning and Leading with Habits of Mind: 16 Essential Characteristics for Success. ASCD, Alexandria, VA.

Stephanie Richards and Lee Abrahamsen, Bates College, Lewiston, ME – 2013-2014 Assessment

We now consistently make and use learning goals for all of our courses and individual class times. This has provided us (and the colleagues we teach with) a framework on which to focus material. It also provides students with a way to organize their study efforts. We developed a protein structure/function activity after the initial lecture on this topic that included an in-class activity that was very successful. The second part of the activity involved an online student group project (a Google Doc) that asked students to generate an information sheet on various members of a single protein family. This was meant to segue to a laboratory activity in which students explore the activity of three amylases under different environmental conditions. The online group activity was less successful than the in-class component. The biggest challenge in this 100-student course is balancing time for formative assessments (etc.) with content delivery. There is so much content that subsequent courses depend on that using class time for multiple assessments or activities is still challenging. We still hope to share more of what we've learned with other faculty in the natural sciences. Our department and division are leading the movement toward best practices in the STEM disciplines, but these efforts are in their infancy. The sciences division hopes to have focus groups to talk more about these issues.

a) [Figure not reproduced.]

b) Grades on summative assessments related to protein structure/function:

2011 exam: mean 72.81 +/- 11.9, median 73, high 99, low 32.5
2012 exam: mean 74.76 +/- 12.7, median 76, high 96, low 39
2013 exam: mean 74.03 +/- 13.8, median 74.5, high 98.5, low 45
2011 lab report: mean 79.39 +/- 4.7, median 81, high 89.3, low 69.5
2012 lab report: mean 80.38 +/- 5.1, median 82, high 92.2, low 64.8
2013 lab report: mean 82.15 +/- 5.9, median 82.5, high 92.2, low 75.1

*2011: no activity
*2012: pipe cleaner structure activity introduced in lecture as a small group activity
*2013: pipe cleaner activity + group writing of protein family information sheet activity

References that informed our work:

1. Barbara Oakley, Richard M. Felder, Rebecca Brent, Imad Elhajj; Turning Student Groups into Effective Teams. Journal of Student Centered Learning, Volume 2, No. 1, 2004.

2. Kimberly D. Tanner; Moving Theory into Practice: A Reflection on Teaching a Large, Introductory Biology course for Majors. CBE - Life Sciences Education, Volume 10, 2011.

3. Jamie Lee Jensen, Anton Lawson; Effects of Collaborative Group Composition and Inquiry Instruction on Reasoning Gains and Achievement in Undergraduate Biology. CBE - Life Sciences Education, Volume 10, 2011.

Anne Rosenwald, Georgetown University, Washington, DC – 2013-2014 Assessment

The course I worked on was our spring Biochemistry course (~100 students – biology majors, pre-meds, and post-baccalaureate students). I taught this class for 12 years, but then took a 4-year break and was anxious to make the course more learner-centered. However, several events made it difficult to incorporate many changes. First, our lab director quit after the fall semester, so the time between semesters was devoted to getting our new director ready (but she's been great and we're very lucky to have hired her). Second, although previously I was solely responsible for the course content, this time I partnered with another faculty member with his own opinions about course structure. We worked through the fall to discuss our different visions, but by compromising with him, some of my ideas for flipping the classroom, bringing more problem-based learning to classroom time, etc. fell away. Most of the content was delivered as straight-up lectures, something I hope to change for next year.

Despite issues with the lectures, we did restructure the laboratory part, incorporating some of the ideas from last summer’s workshop. We freshened up some of the older labs with new information and activities (we did Joan Kiely’s Toober exercise with great success!). We also substituted some of the wet lab exercises with cases (some from Erin Dolan, University of Georgia [personal communication] and some from Kathleen Cornely, Providence College*). We also wrote some of our own cases and had the students read key papers and reviews on those topics to prepare them. We plan to continue this practice next year in lab, as well as incorporate some of this material into the lectures.

Finally, we tried to incorporate a little bit of "just-in-time" teaching by having the students take low-stakes knowledge quizzes over the weekend on upcoming material. This was reasonably successful – most students made an effort to take the quizzes and from anecdotal evidence, the students said it made them prepare differently and it made us better able to deal with areas of misunderstanding. We also pointed the students to YouTube videos, etc. about various topics and an Online Learning Initiative course in biochemistry from Carnegie-Mellon University to use as they wished. Again, anecdotal evidence was positive. We're in the process of collecting survey data now (no results yet, so no graph to show).

We built a case around this paper: Preston et al. (1992) Appearance of Water Channels in Xenopus Oocytes Expressing Red Cell CHIP28. Science 256, 385. The figure shown is the key figure demonstrating that CHIP28, now known as aquaporin, is a water channel.

References that informed my work:

1. Cornely, K. (1999) Cases in Biochemistry, John Wiley and Sons.

Updates/corrections of some of the cases are available on her website: http://www.providence.edu/chemistry/kcornely/Pages/casebook.aspx. She is willing to provide answers to these ([email protected]). Many of these are also now part of Essential Biochemistry (2010 – 2nd ed, 2013 – 3rd ed) Voet, Voet, and Pratt, John Wiley and Sons.

Mary Shaw, New Mexico Highlands University, Las Vegas, NM – 2013-2014 Assessment

My goal was to write achievable objectives for the first-semester general biology class, especially the evolution unit. I have been trying to focus the class on a true understanding of, and ability to apply, the most significant concepts that form a foundation for biology. Natural selection is surely one of those. The students have a preconceived idea that natural selection involves characters appearing when individuals need them to allow the organisms to adapt to a changing environment. It has been very difficult to disabuse them of this idea. My main means of assessing the students' ability to demonstrate understanding has been clicker questions such as "When exposed to antibiotics, bacteria will try hard to adapt. True or false." I have also asked them to describe specifically how ducks got webbed feet, how giraffes got long necks, or another example of an adaptation. The number of students who cling to their incorrect preconception has disappointed me.

My colleagues and I have been discussing whether small classes (25 students or fewer) would lead to more student success than our usual 50-75 student classes, so we decided to try the smaller class size in General Biology 1 and 2 this spring semester and compare our results to fall semester. We are trying to do the same activities and use the same assessments. Unfortunately, we decided to do this shortly before fall semester ended, and neither of us was familiar with assessment instruments that already existed and that we could adopt. We attended a Discipline-Based Research workshop and received good suggestions that we have tried to implement. However, it has been difficult to fit this in with our other obligations, especially since we don't feel that we really know what we are doing.

References that informed my work:

1. Fink, L. D. 2003. Creating significant learning experiences: an integrated approach to designing college courses. San Francisco, Jossey-Bass.

2. Understanding Evolution, teaching materials. http://evolution.berkeley.edu/evolibrary/teach/index.php, retrieved May 5, 2014.

3. Colorado Learning Attitudes about Science. http://www.colorado.edu/sei/class/CLASS-Bio.html, retrieved May 5, 2014.

Ann Smith, University of Maryland, College Park, MD – 2012-2013 Assessment

I participated in the Biology Scholars Assessment Residency in 2012. I was unable to attend ASMCUE in 2013. I am happy to complete my Biology Scholars project by attending the 2014 Capstone event.


My participation in the Assessment Residency helped me clarify my understanding of learning outcomes and how these underpin curriculum design and assessment. This work has been front and center for me in the past two years. Here are three examples.

1. Uncovering student misconceptions about microbiology. I lead a faculty learning community that has developed a concept inventory (the HPI Concept Inventory). In the last two years we have been investigating the explanations that students provide for their multiple-choice responses to the inventory. We are coding these responses to give us insight into student misconceptions. Collaborating with colleagues at Virginia Tech, we have addressed misconceptions related to antibiotic resistance. This work gives us insight into students' thinking related to the ASM microbiology curriculum. For this work, we sit together (about 10 faculty) and read student responses to determine common themes and develop a codebook. We then go back to the data and code it to determine the frequency of the themes (a small tallying sketch follows this list of examples). Our goal is to use the knowledge of student misconceptions to develop informed learning outcomes for new course assignments. I will present a poster on some of the findings at ASMCUE.

2. Outcome-Based Curriculum Design. I have worked with colleagues teaching general microbiology to write up two case studies that address learning outcomes aligned with the ASM curriculum guidelines.

3. Learning Outcome Assessment. I have moved from my role as a microbiology instructor to a position in the Office of Undergraduate Studies. In the past two years I have helped to develop a criterion-based assessment approach for our General Education Program. This includes rubrics to assess General Education Learning Outcomes.
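
As mentioned in example 1, a minimal sketch of the tallying step (counting how often each theme in the codebook appears across coded explanations) might look like the following; the code labels and responses are hypothetical placeholders, not the actual HPI codebook or data.

```python
# Hypothetical sketch of tallying misconception codes assigned to student explanations.
# The code labels and responses below are illustrative placeholders only.
from collections import Counter

coded_responses = [
    ["resistance_is_induced"],                     # codes assigned to student 1's explanation
    ["resistance_is_induced", "lamarckian_need"],  # student 2
    ["correct_selection"],                         # student 3
    ["lamarckian_need"],                           # student 4
]

theme_counts = Counter(code for codes in coded_responses for code in codes)
total_students = len(coded_responses)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}/{total_students} students ({100 * count / total_students:.0f}%)")
```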

Figure that represents the process of using the HPI Concept Inventory to uncover student thinking / misconceptions:

Example of an HPI Concept Inventory question, with codes revealed from analysis of students' explanations for their response choice:

Which is NOT true about the evolution of antibiotic resistance in bacterial populations? It can be mediated by
A. selective growth of bacteria capable of degrading antibiotics.
B. alterations of a bacterium's genetic material through mutation.
C. changes in gene expression that occur in the presence of antibiotics.
D. modification of a bacterium's genome through uptake of new genetic information.
E. I do not know the answer to this question.


References that informed my work:

1. "Peer Review | Fall 2011-Winter 2012 | Assessing General Education Learning Outcomes." Accessed April 30, 2014. http://www.aacu.org/peerreview/pr-fa11wi12/assessinggeneral.cfm.

2. Fisher, K. M. 1983. Amino acids and translation: a misconception in biology. In International Seminar on Misconceptions in Science and Mathematics, Cornell University, Ithaca, NY.

3. Anderson, T. R., and K. J. Schönborn. 2008. Bridging the educational research-teaching practice gap. Conceptual understanding, Part 1: The multifaceted nature of expert knowledge. Biochemistry and Molecular Biology Education 36:309-315.

Eric Spana, Duke University, Durham, NC – 2013-2014 Assessment

Since last year I have found that the castle-top view of weeks just doesn't work for my class. Because my course is a very small lab research class, what happens in any particular class varies greatly from semester to semester. Planning by two-hour movable modules works better, and I can use those at different times depending on the class mood. I have worked this past year to make reusable references for students to use throughout the semester (e.g., How-To-Microscope-Properly) rather than just giving the material as a lecture. This past school year I also converted one of the take-home quizzes to an in-class group quiz (actually a pop quiz). Students performed much, much better on it when able to discuss the complicated genetics with each other. Finally, although I haven't been able to institute it yet, I am also working on a pre-course/post-course assessment that covers all the learning goals of the course. I will use the pre-course assessment to identify student experience and understanding of genetics, development, and molecular biology and then use it to designate teams for the semester (teams of 3 students each); hopefully I can avoid the "BFFs" problem I had this past semester, where having a friend on a team automatically leaves one student out. At the end of the semester I can then re-administer the same assessment and see how each student improved (or not) over the semester.


I am also starting the process of revamping a genomics lab course, and thanks to this residency, I think I may actually design it the right way.

References that informed my work:

1. Scientific Teaching. Handelsman, Miller & Pfund. I use it mostly because it’s the one on my desk. I also use the materials from last year’s workshop quite a bit.

Figure 1. A newly eclosed wild-type (OregonR) Drosophila melanogaster adult inflates its wings. Right-click on the image and select “Open Hyperlink” to play the movie. The movie is a time lapse of an immobilized fly suspended from a glass needle. Low-resolution images were captured every 5 seconds for ~2 hours with a Nikon D300S using Nikon’s Camera Control Pro2 software. An undergraduate student worked out how to take these movies last summer and they have been extremely useful in examination of our mutant phenotypes by allowing us to pinpoint the stage in development where our mutant phenotype first is expressed.

Tatiana Tatum Parker, Saint Xavier University, Chicago, IL – 2013-2014 Assessment

The original goal from the Assessment workshop was to integrate more assessment and in-class exercises into my courses. I have been on sabbatical this year, which has allowed me the opportunity to restructure my Introductory Biology and Genetics classes. I have much clearer student learning outcomes, and I have a plan in place to give pre-exercise assignments such as watching videos and completing pre-class online questions. The classes are also structured now so that each unit has at least one in-class assignment. An example would be a protein folding exercise to show the importance of tertiary structure in protein formation.

We also plan to conduct a study to examine which is the better predictor of success in Introductory Biology and Chemistry classes: math placement or scientific reasoning ability. Numerous studies have examined the best predictors of collegiate success. Previous studies have examined standardized tests (ACT/SAT) (Nordstrom, 1990; Bunce and Hutchinson, 1993; Spencer, 1996), high school GPA (Carmichael et al., 1986), high school chemistry grade (Ozsogomonyan and Loftus, 1979; Craney and Armstrong, 1985; Nordstrom, 1990), personality characteristics (House, 1995), and Piagetian tasks (Bender and Milakofsky, 1982; Bunce and Hutchinson, 1993) as predictors of final chemistry course grade. Spencer (1996) found that SAT math scores and performance on the National Association of Biology Teachers exam correlated positively with class performance and GPA of first-year intro biology students. However, results of an exam that measures critical thinking and scientific reasoning did not correlate with final grade or GPA but did positively correlate with SAT math scores.
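
As a hedged sketch of how the planned predictor comparison could be run once scores are in hand, one possibility is to correlate each candidate predictor with final course grade; all arrays and names below are hypothetical placeholders, not instruments or data from this study.

```python
# Hypothetical sketch comparing two candidate predictors of introductory biology grades.
# All numbers are placeholders; a real analysis would use actual placement scores,
# scientific-reasoning scores, and final course grades paired by student.
import numpy as np
from scipy.stats import pearsonr

math_placement       = np.array([3, 2, 4, 1, 5, 3, 4, 2])          # placement level per student
scientific_reasoning = np.array([18, 14, 22, 10, 25, 17, 21, 12])  # reasoning-test score
final_grade          = np.array([81, 74, 88, 65, 93, 79, 86, 70])  # final course grade (%)

for name, predictor in [("math placement", math_placement),
                        ("scientific reasoning", scientific_reasoning)]:
    r, p = pearsonr(predictor, final_grade)
    print(f"{name}: r = {r:.2f}, p = {p:.3f}")

# The predictor with the larger (and significant) correlation would be the better
# single predictor; a multiple regression could test whether each adds information
# beyond the other.
```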

References that informed my work:

1. Bender D.S. and Milakofsky L., (1982), College chemistry and Piaget: the relationship of aptitude and achievement measures, Journal of Research in Science Teaching, 19, 205-216


2. Bunce D.M. and Hutchinson K.D., (1993), The use of the GALT (Group Assessment of Logical Thinking) as a predictor of academic success in college chemistry, Journal of Chemical Education, 70, 183-187.

3. Carmichael J.W.J., Bauer J.S., Sevenair J.P., Hunter J.T. and Gambrell R.L., (1986), Predictors of first-year chemistry grades for Black Americans, Journal of Chemical Education, 63, 333-336.

4. Craney C.L. and Armstrong R.W., (1985), Predictors of grades in general chemistry for allied health students, Journal of Chemical Education, 62, 127-129.

5. House J.D., (1995), Noncognitive predictors of achievement in introductory college chemistry, Research in Higher Education, 36, 473-490.

6. Nordstrom B.H., (1990), Predicting performance in freshman chemistry, paper presented at the American Chemical Society National Meeting, Boston, Massachusetts.

7. Ozsogomonyan A. and Loftus D., (1979), Predictors of general chemistry grades, Journal of Chemical Education, 56, 173-175.

8. Spencer H.E., (1996), Mathematical SAT test scores and college chemistry grades, Journal of Chemical Education, 73, 1150-1153.

Didem Vardar-Ulu, Wellesley College, Wellesley, MA – 2013-2014 Assessment

There were three very valuable lessons I took away from the Assessment Institute last summer:

1. The importance of backward course design, which includes outlining assessment tools clearly and early in the course design process and categorizing them into formative and summative assessments, to help set student expectations in alignment with student learning objectives.

2. The importance of explicitly including the expected out-of-class time investment in the course design and student assessment through the use of castle-top diagrams.

3. The importance of creating a master rubric for the course that reflects ALL the learning objectives for the course and clearly articulates the expectations from the students in terms of how to meet those objectives.

Until this residency, the biggest challenge I felt in designing and implementing my courses was striking a balance between acknowledging diverse learning styles, and hence creating a learning platform that offered multiple learning and assessment options, and ensuring a coherent, effective, manageable, and sustainable course design. Devoting focused and intense attention to critically reworking a full week of my Biochemistry course (which I had thought was pretty well laid out already) in light of the concepts and strategies we were introduced to in the institute, and receiving practical feedback from multiple colleagues, helped me return to my institution with concrete implementation ideas not only for the course I worked on but also for other new courses I was designing and teaching. For both of the courses I am teaching this spring (Fundamentals of Biochemistry and Art of Science), one of the learning objectives was for students to be able to demonstrate competence for independent learning within the field the course was based on (i.e., access new information, critically read and analyze primary literature, ask targeted questions, distinguish between relevant and irrelevant information, etc.) and to communicate their understanding both to an audience with a similar background (peers) and to a diverse audience (the public). Though I have always had activities and projects that encouraged these skills, this year I focused on the assessment component of these projects and have certainly seen a direct gain in student learning outcomes for the intended objectives.

Below are a few photographs from the interactive presentation my 'Chem 106: Art of Science: Think like a Scientist, Act like an Artist' course students gave at the Cambridge Science Festival last weekend, where they presented their final projects on neurotransmitters of their choice to the public.


I know this part is way over our abstract limit, but I thought I would also share with the group a unique opportunity I had to apply what I had learned in the Institute. I was invited to take part in a major assessment and revision effort for the introductory science courses at one of the major universities in Istanbul, Turkey, and in my advisory role there I had an opportunity to work with multiple faculty and teaching assistants over the summer and into the fall. What I found most interesting about this experience was that, surprisingly, many of the teaching faculty were very well aware of the importance of formulating clear learning objectives and creating learning activities they believed paralleled these objectives. However, even the word "assessment" created immediate push-back (or, in the best-case scenario, skepticism) from the faculty, as it seemed to imply for them a negatively critical evaluation of their own teaching competencies. The ability to share what I had learned and experienced in the Biology Scholars Assessment Institute with them, and to provide them with specific examples and strategies from my own work that were polished over the residency program, was truly instrumental in ensuring buy-in from this group.

References that informed my work:

1. Walvoord, Barbara E. and Anderson, Virginia Johnson, Effective Grading, San Francisco: Jossey-Bass, 2010, 2nd edition, Print.

2. Angelo, Thomas A., Classroom Assessment Techniques: A Handbook for College Teachers, San Francisco: Jossey-Bass, 1993, 2nd edition, Print.

3. Stevens, Dannelle D., and Levi, Antonia J., Introduction to Rubrics: An Assessment Tool to Save Grading Time, Convey Effective Feedback, and Promote Student Learning. Foreword by Barbara E. Walvoord.

Cuc Kim Vu, St. Catherine University, Minneapolis, MN – 2012-2013 Assessment

The BSP 2012 Assessment Residency was intense, with a lot of information delivered over a short period of time. The most pertinent information for me was "The Grid". I have had a template syllabus handed down to me for every course and had taught from these syllabi without thinking about aligning learning objectives, taxonomy levels, and the different forms of assessment. During the residency we were directed to apply the grid to a specific learning activity. I chose to focus on the case studies assigned in my Microbiology courses. This allowed me to think about how the case studies connected to the learning objectives and what level of taxonomy was being applied, and to create a stronger grading rubric. My long-term goal is to use the grid for alignment (learning objectives and the different forms of assessment, while identifying the taxonomy level) of the syllabus, unit by unit.

In Summer 2012, I worked on creating a new method for students to complete their case study assignment. Case studies have always been a part of the microbiology course in my department. Students were assigned 3-4 case studies and instructed to complete them individually outside of class time. I expected that students would do very well on the case studies and find them educational as well as fun to complete. Instead, I observed that students were very stressed out and actually did poorly on these case studies. I realized that some of my students have very little experience with solving case studies, conducting research, and assembling information. This led to changing the assignment so that students complete one group case study and 2-3 individual case studies afterwards.

In Fall 2012, I implemented the group case study in addition to the individual case studies. I observed that students were more comfortable with the assignment and constructed their response with greater expertise and less plagiarism.

In Summer 2013, I compiled data comparing previous classes that completed individual case studies to the Fall 2012 cohort that completed group and individual case studies. I presented my findings at the Case Study and PBL in STEM Education Science Case Network Pre-conference Meeting at ASMCUE 2013.

I had to take Fall 2013 off due to a motor vehicle accident and returned to teaching in Spring 2014. I will be assigning the group and individual case studies to the Spring 2014 cohort. This summer I will begin the alignment work, starting with Unit One: microbial structure and function, metabolism, growth, and genetics. I will re-evaluate and develop clear language for the learning objectives and align them to assessments.

Since there were so many great resources given during the assessment residency, I wanted to make sure that I didn't forget them, so I saved them onto my flash drive. It has been extremely useful to have these resources on hand whenever I need to review a topic. A challenge is staying on track now that I have returned to work. Ideally, it would be nice to have grants, or a pledge from institutions that support faculty in the BSP, to allow a small amount of course-load release to continue this work.

Summer 2012: Case study work - aligning with learning objectives, developing a clear grading rubric, and developing a group case study

Fall 2012: Implemented group case study

Summer 2013: Attended and presented a poster at the Case Study and PBL in STEM Education Science Case Network Pre-conference Meeting at ASMCUE 2013

Spring 2014: Continue to use group case study

Summer 2014: Alignment of Unit One in the Microbiology syllabus


References that informed my work:

1. Case Study and PBL in STEM Education Science Case Network Pre-conference Meeting at ASMCUE 2013.

2. Albanese, M. and Mitchell, S. 1993. Problem-based learning: a review of literature on its outcomes and implementation issues. Academic Medicine 68, no 1:52-81.

3. Klionsky, D. 2004. Points of view: Lectures: Can’t learn with them, can’t learn without them: Talking biology: Learning outside the book – and the lecture. Cell Biology Education 3:204-211.

Matthew Waterman, Eastern Nazarene College, Quincy, MA – 2013-2014 Assessment

Since completing the workshop I have delved further into capturing my lectures electronically so that class time can be more productive and interactive. It was a steep learning curve, but one that has been successful in many ways. The first semester of implementing my "concept jigsaw group activity" helped me find the places where my activity theory didn't match up with the practical realities of the classroom setting. I was able to tweak it on the fly, and the reimplementation this semester has gone much more smoothly. I also used the concept jigsaw activity as a springboard for flipping my Biochemistry II course, which was not even something I had considered at the time of the institute. The impulsive decision made for a lot of late nights doing video editing, but again I think it was worth it in the long run. One learning tool that I was introduced to at the assessment institute and used during this last year was the IF-AT scratch tickets. I would strongly suggest everyone consider implementing these, particularly as a way to give group quizzes. The students love the interaction (high-fiving each other when they get a question right on the first try, etc.), and the conversation and investment the students have as they discuss the answers surpassed my expectations.


[Figure: Grade distributions (A, B, C, D, F; 0-100% of students) for Exam 4 and the Final Exam in Fall 2012, Spring 2013, and Fall 2013; the concept jigsaw was introduced in Fall 2013. Chart not reproduced.]

References that informed my work:

1. Persky AM, Pollack GM. Using Answer-Until-Correct Examinations to Provide Immediate Feedback to Students in a Pharmacokinetics Course. Am J Pharm Educ. 2008;72:83-9.

2. Dihoff RE, Brosvic GM, Epstein ML, et al. Provision of Feedback During Preparation for Academic Testing: Learning is Enhanced by Immediate but Not Delayed Feedback. Psycholog Rec. 2004;54:207-331.

Maureen Whitehurst, Trident Technical College, Charleston, SC – 2013-2014 Assessment

1) What has worked well: the support from the BioScholars community, such as group email notices, the Wiki, references, and webinars, has worked well. Thank you!

2) What has been a challenge: crafting a practical project design, narrowing project scope to an achievable workload in the context of various college collateral duties and teaching course load.

3) What you still hope to accomplish as a Biology Scholars Alum: I look forward to the Microbrew opportunity at ASMCUE. My college will give me an opportunity during our October 2014 Professional Development Day to discuss my project. I hope to collect data to support the College's investment in additional iPad tablets. I want to broaden the scope of this project. I would like to see tablet usage in my College's microbiology, general biology, and anatomy and physiology courses become common and expected, much like students' use of calculators during a statistics course.

4) Figure that helps to illustrate your past year’s work:

This is a figure captured by a student to document her 'Unknown Organism' results. This figure was taken during a microbiology laboratory session using a College-provided iPad. The iPad was protected within a special sleeve so that it could be disinfected. The image was transferred by the student using the College WiFi network and included in her written report. This chain of events may seem quite pedestrian. Yet the fact that the figure exists in my statement, here, demonstrates a series of successes!

5) I have been very interested in the craft of writing course objectives. I have used the references assigned in this program.


RESEARCH SCHOLARS

Isabelle Barrette-Ng, University of Calgary, Calgary, Alberta, Canada – 2013-2014 Research

Some successes and some challenges were encountered during the implementation of the first phase of my project this year. Starting from the inspiring and stimulating discussions at the training institute in July, I prepared a plan to investigate new ways to improve both content acquisition and scientific inquiry skills in my high-enrolment (~500 students) introductory biochemistry class. I hypothesized that the use of a flipped-classroom approach in combination with inquiry-based computer simulation exercises would be effective at fostering both content acquisition and the development of scientific inquiry skills. I spent considerable effort at the start of this project exploring validated concept inventories for introductory biochemistry. The inventory from Villafane and coworkers seems to be the most suitable for my course and project. In addition, Dr. Stephen Nold provided extremely valuable discussions and helped me find the "Views About Scientific Inquiry" (VASI) questionnaire to assess learner understanding of basic concepts of scientific inquiry. Preliminary testing of both the concept inventory and the VASI questionnaire indicates that these will be extremely valuable tools for evaluating both concept acquisition and the development of scientific inquiry skills as my project progresses. During the course of the year, a new opportunity for funding teaching development appeared at my university. In collaboration with Carol Berenson, an educational development consultant at our university's Teaching and Learning Center, I applied for funding to hire a graduate student to code student responses to the VASI questionnaire and to hire an experienced transcriptionist to help record student responses from focus groups. We were excited to be informed just today that our proposal will be funded, as the use of focus groups in combination with the concept inventory and VASI questionnaire is expected to allow us to paint a rich, multidimensional picture of student experiences as we introduce different teaching strategies.

Figure: Timeline showing the sequence of student evaluation instruments and inquiry modules and focus groups planned for the project.

References that informed my work:

1. Bailey, C. P., Minderhout, V., Loertscher, J. (2012) Biochem. Mol. Biol. Educ. 40, 1-7.

2. Lederman, J. S., Lederman, N. G., Bartos, S. A., Bartels, S. L., Meyer, A. A., Schwartz, R. S. (2014) J. Research Sci. Teaching 51, 65-83.

3. Villafane, S. M., Bailey, C. P., Loertscher, J., Minderhout, V. (2011) Biochem. Mol. Biol. Educ. 39, 102-109.


Emma Feeney, Loyola University Chicago, Chicago, IL – 2013-2014 Research

My research interest lies in developing effective methods that can be used in a classroom laboratory setting to train students in science writing and communication. This is important because many students do not have the opportunity to develop these skills through undergraduate independent research experiences. An observation I have made in my classes is that, when presented with the same guided writing approach, some students' science writing skills improve while other students' science writing skills do not. My focus over the last year has been to collect data to determine how these two groups differ, in an effort to develop a more effective method to teach science writing. Since leaving the training institute last year, I focused on getting IRB approval for my study and on data collection. The biggest challenge I have faced is the IRB approval process. During my preliminary investigation of the IRB process prior to the summer institute, and through conversations at the institute, I was under the impression that my project could go through the quicker, exempt review process. However, upon returning to Loyola and speaking with the Assistant Director of Research Compliance in the Office of Research Services, I learned that all projects involving student subjects needed a full IRB review, a process that takes much longer. Thus, I was not able to start my project during the Fall 2013 semester as planned. I was able to begin data collection during the Spring 2014 semester, but I have not been able to analyze the data due to IRB restrictions on my project (approval required that data analysis not begin until after grades were submitted). Going forward, I will begin analyzing my data this summer/fall and plan to present it at the annual SABER meeting next July (July 2015).

References that informed my work:

1. Brownell, S. E., Price, J. V., & Steinman, L. (2013). A writing-intensive course improves biology undergraduates’ perception and confidence of their abilities to read scientific literature and communicate science. Advances in Physiology Education, 37, 70-79.

2. Libarkin, J. & Ording, G. (2012). The utility of writing assignments in undergraduate

26

bioscience. CBE-Life Sciences Education, 11, 39-46.

3. Maskovitz, C. & Kellogg, D. (2011). Inquiry-based writing in the laboratory course. Science, 32,919-920.

Emily Fisher
Johns Hopkins University, Baltimore, MD – 2013-2014 Research

The project implementation and data collection went well last fall. Katie and I enlisted a colleague at the Center for Educational Resources to help distribute our study surveys and de-identify the responses. We used class time to administer the concept inventory and got high participation on everything. Most importantly (and perhaps most surprisingly with a class of 300), our intervention of in-class group problem solving went extremely well. This semester, sorting through the data has been a challenge in part because we saw no whopping trends. Our in-class problem sets did not instantly change the world or change the way students studied or understood the material. We spent time trying to find a particular student population who may have benefitted, looked for smaller changes in attitudes or behavior, and still did not see large trends. One interesting piece of information is that our students value group studying and discussions but have trouble studying in groups because of scheduling conflicts. We also found that students expected to meet classmates to study with, but ended up studying with existing friends or people from their dorm or previous courses. Year #2 of our plan introduces the intervention earlier in the term, so we may succeed in expanding students’ sets of study partners by intervening before study groups are set. We also have support from the students and our co-instructors to increase the number of problem sets in the coming years, so it is encouraging that the intervention is so well-received.

As an alumna, I hope to use the Biology Scholars community to stay motivated and turn our project into a publication. I plan to attend ASM CUE and stay up on what past, current, and future scholars are studying.

Figure: Bar graph titled “Study partners: intentions early in the semester,” plotting % of responses (0–35) by study-partner category: previous biochemistry students, current biochemistry classmates, classmates met during biochemistry recitations, classmates in biochemistry lab, dormmates, students met in previous science courses, and study alone.

This graph shows responses of students who participated in both the early- and late-semester study surveys. The early survey asked students who they intended to study with for biochemistry. The late survey asked who they actually studied with during the semester. It shows that students expected to meet new study partners during biochemistry lecture or recitation, but ended up studying with people they already knew prior to biochemistry.

References that informed my work:

1. Lian, J. and He, F. Improved Performance of Students Instructed in a Hybrid PBL Format, Biochemistry and Molecular Biology Education, 2012.

2. Klegeris, A., Bahniwal, M., Hurren, H. Improvement in generic problem-solving abilities of students by use of tutor-less problem-based learning in a large classroom, CBE-Life Sciences Education, 2013.

Heather Henter
University of California San Diego, La Jolla, CA – 2013-2014 Research

Before the summer 2013 training Institute I thought that the purpose of our study was to document the effect of an authentic research experience on student attitudes about science. This is a “one teaching method works better than another” question that required controls we didn’t have. I now realize that the strength of our study is to understand how, rather than whether, student attitudes are affected by research. We can use our study to better understand how students learn.

Toward that end, we examined how the attitudes of different students changed in different ways after an in-class original research experience. The most striking result involved gender. Female students’ attitudes about science changed significantly more than male students’ attitudes (Fig. 1). Women’s attitudes about their ability to be scientists, do research, and address real-world problems increased more than men’s. This is consistent with previous studies suggesting that “making a contribution” is more important to female students than to male students.
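For readers curious about the shape of such an analysis, here is a minimal sketch, assuming one pre- and one post-course attitude score per student; the column names and values are hypothetical placeholders, not the study’s data:

    # A possible analysis of attitude change by gender (illustrative only).
    # Assumes each student has a pre- and post-course attitude score;
    # the data below are invented placeholders, not the study's data.
    import pandas as pd
    from scipy import stats

    df = pd.DataFrame({
        "gender": ["F", "F", "F", "M", "M", "M"],
        "pre":    [3.0, 2.5, 3.5, 3.0, 3.5, 2.5],
        "post":   [4.5, 4.0, 4.5, 3.5, 3.5, 3.0],
    })
    df["change"] = df["post"] - df["pre"]          # per-student change score

    female = df.loc[df["gender"] == "F", "change"]
    male = df.loc[df["gender"] == "M", "change"]

    # One-way ANOVA on change scores by gender (equivalent to a t-test with
    # two groups); reported as F(df_between, df_within) and a p-value.
    f_stat, p_value = stats.f_oneway(female, male)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")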

Several challenges remain. The training Institute helped me understand the value of triangulation in social science research. We have put this into practice by conducting focus groups to bolster and inform the results from our survey data. I have yet to code and quantify the focus group transcripts, however. I am investigating software (NVivo) for this purpose, but it will require a steep learning curve and a large time commitment. I am interested in hearing how other people get this done. Another challenge is that we created our own survey rather than using a previously validated survey; I felt that no available surveys focused narrowly enough on our specific goals. Is this going to be a problem? In the near future I hope to publish our results describing how different students are affected differently by participating in original research.

Figure 1. Across all questions, the attitudes of female students changed more than the attitudes of male students (F(1,25) = 15.7, P < 0.0005). A positive change indicates greater confidence or interest in science. Attitudes of females and males were not different at the start of the course. Black bars are the mean change in female response, grey bars are the mean change in male response.

References that informed my work:

1. Lovelace, M. and P. Brickman. 2013. Best practices for measuring students’ attitudes toward learning science. CBE-Life Sciences Education 12: 606-617.

2. Trujillo, G. and K.D. Tanner. 2014. Considering the role of affect in learning: monitoring students’ self-efficacy, sense of belonging, and science identity. CBE-Life Sciences Education 13: 6-15.

3. Auchincloss, L.C., et al. 2014. Assessment of course-based undergraduate research experiences: a meeting report. CBE-Life Sciences Education 13: 29-40.

Christy MacKinnon
University of the Incarnate Word, San Antonio, TX – 2013-2014 Research

The July Workshop was fantastic for the technical and personal support. My team leader, Dr. Stephen Nold, is the best! I’m not sure I would have finished the poster for the annual meeting without his help. Other items also worked well: the IRB process (the committee was generous and gracious in accommodating my requests for changes) and meeting an excellent colleague in statistics. The assigned reading resonated with my challenges in two ways. Our university has a tenure system with excellent criteria for rewarding teaching innovation; however, administrative adherence to those criteria is inconsistent. I’m tenured, so there was little risk in implementing the project. I expected and experienced some student resistance during the project, but my academic supervisor wasn’t very understanding of the inherent risk in trying something new (although he provided the funding for me to participate in the ASM Biology Scholars Residency). I felt the “talk” was valued at my institution, but not the “walk.” In hindsight, I should have established a “core team” with colleagues in our Center for Teaching and Learning and in the Office of Student Success, plus an interested department colleague. Going it alone resulted in a low “return on investment” for the time involved, research results, and student success. I have a sabbatical leave in Fall 2014 and will continue as a Biology Scholars Alum by preparing a manuscript about the project (which will include more data than I had when I prepared my poster). I will also plan a new course (Medical Genomics) and will build in an experimental design to assess the effectiveness of case studies on learning core genetics competencies.


Figure 1. The diagram used to help students with the annotation process and a core concept of genomic gene structure. Some probably still have bad dreams about it!

References that informed my work:

1. Carr, C. E., McNicholas, J. C., & Miller, R. R. (2009). Faculty perceptions of research, scholarly, and creative activity and grant seeking at a predominantly undergraduate institution. Research Management Review 17(1), 69-84. (Note: Accessed January 9, 2014, but no longer available on the internet).

2. Burnette, J. M., III, & Wessler, S. R. (2012). Transposing from the Laboratory to the Classroom to Generate Authentic Research Experiences for Undergraduates. Genetics 193, 367–375. http://www.genetics.org/content/193/2/367.full. Accessed July 4, 2013.

3. Kloser, M. J., Brownell, S. E., Chiariello, N. R., & Fukami, T. (2011). Integrating Teaching and Research in Undergraduate Biology Laboratory Education. PLoS Biology 9(11): e1001174. Accessed January 28, 2014.

Tim Paustian
University of Wisconsin-Madison, Madison, WI – 2012-2013 Research

I left the 2012 institute with a new sense of excitement. I have always made changes in my courses to address problems in my teaching. While I could get a qualitative sense of what worked and what did not, I never really had a set of rigorous, analytical tools and methods to actually measure the success of the approaches I tried. The Biology Scholars Research Residency gave me those capabilities and guided me through the steps necessary to perform good, solid education research.

Since leaving the residency, I have (with a team of dedicated faculty) redesigned the introductory microbiology course to incorporate active learning. We have added many different activities, and in the process I convinced several of my reluctant colleagues to give active methods a try. Using statistical analysis, I have shown marked improvement in student understanding when active learning is employed. An ongoing problem with this course is its size. Between 250 and 350 students take the course each semester, and there are no resources beyond the instructors to help facilitate assessment. I am thus forced to rely on multiple-choice tests for summative assessment, and I feel I am not getting a clear enough picture of student mastery of the material. This is a problem I am interested in addressing creatively.

I have also developed methods to help students better understand bioinformatic tools, and I won a grant to create a simple development pipeline for taking structure data from the PDB and using it to create educational animations. A major challenge for me is finding the time to communicate these interesting findings to a wider audience. A new goal is to raise the priority of these efforts in my schedule so that this important part of the process gets done.

Work Done

Table 2 illustrates the performance of students on multiple-choice exam questions. The questions could be divided into those that were taught using active learning methods and those that were taught in a more classic lecture style. Binomial statistical analysis showed that students performed better when active learning techniques were used.

Table 2. Frequency of correct exam responses by exam and question type

                              Exam 1    Exam 2    Exam 3
% correct (lecture-only)        75%       69%       79%
% correct (active-learning)     92%*      77%*      85%*

* Significantly higher than lecture-only questions, p < 0.01
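A minimal sketch of how a binomial comparison like this might be set up with SciPy is shown below; the response counts are hypothetical placeholders chosen only to mirror the Exam 1 percentages, and the lecture-only proportion is treated as the null expectation for the active-learning questions:

    # Illustrative sketch of a binomial comparison of exam-question performance
    # (hypothetical counts; not the actual course data). Requires scipy >= 1.7.
    from scipy import stats

    n_active = 300          # responses to active-learning questions (hypothetical)
    correct_active = 276    # hypothetical number answered correctly (92%)
    p_lecture = 0.75        # observed proportion correct on lecture-only questions

    # Test whether the active-learning success rate exceeds the lecture-only rate,
    # treating the lecture-only proportion as the expected probability of success.
    result = stats.binomtest(correct_active, n=n_active, p=p_lecture,
                             alternative="greater")
    print(f"p-value = {result.pvalue:.2e}")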


References that informed my work:

1. Michael, J. 2006. Where's the evidence that active learning works? Advances in Physiology Education 30:159-167.

2. Wilke, R. R., and W. J. Straits. 2001. The effects of discovery learning in a lower-division biology course. Advances in Physiology Education 25:134-141.

Kim Quillin
Salisbury University, Salisbury, MD – 2012-2013 Research

My scholarship has gone well over this past year: I presented a poster on a conceptual framework for drawing to learn (Society for the Advancement of Biology Education Research, 7/13); presented a talk on helping students overcome misconceptions (Salisbury University, 2/13); submitted an NSF grant with Stephen Thomas and Julie Libarkin from Michigan State on visual model-based reasoning (NSF-CORE, 2/13); participated in a mentoring network with six biology education research colleagues from other universities and presented a poster on the experience (Biology Leadership Conference, 3/14); presented a workshop on visual, model-based reasoning with Stephen Thomas from Michigan State (BLC, 3/14); invited and hosted a biology education researcher, Michelle Smith (University of Maine), on campus (Salisbury University, 3/14); had an abstract on my drawing-to-learn research accepted for ASM-CUE; and am now writing a paper on visual model-based reasoning for CBE-LSE. The biggest challenges have been finding enough time and money to support my scholarship efforts, since this work is not part of my job description. It has also been a challenge to build the specific skills needed to succeed in this new research area (mastery of the literature, research methods, and statistical analysis) and to maintain collaborations with faculty at other institutions, since there is not much support for this work in my own department. In the near future, after ASM-CUE, I will find out whether my NSF project was funded, submit my paper to CBE-LSE, attend SABER 2014 (including a workshop on statistical analysis), and finish analyzing data that I collected a year ago.

A table or figure that helps to illustrate past year’s work: Conceptual framework to guide the understanding and use of visual model-based reasoning.

References that informed my work:

1. Ainsworth, S., V. Prain, and R. Tytler. 2011. Drawing to learn in science. Science 333: 1096-1097.

2. National Research Council (NRC) (2012). Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering. Washington D.C.: National Academies Press.

3. Van Meter, P., & Garner, J. (2005). The Promise and Practice of Learner-Generated Drawing: Literature Review and Synthesis. Educational Psychology Review, 17(4): 285–325.

Martina Rosenberg
University of New Mexico, Albuquerque, NM – 2013-2014 Research

My original intention was to investigate the problem of transfer along a typical STEM sequence. In particular, I set out to compare students’ ability to work with recurring concepts in the context of Chemistry as well as Biochemistry in a pre-/post-test design, while at the same time assessing confidence levels on these questions. The devil was in the details of implementation. For instance, I quickly realized that using as many questions as would have been necessary to validate a pretest would exceed the time available for this class. I modified my original approach and limited myself to fewer foundational concepts. The underlying metacognitive aspects of the research question became more prominent as the semester progressed. As expected, the timeline to get IRB approval in time for the fall semester was a challenge. Luckily for me, I was already on a project that had a blanket IRB for use in my class.

What I didn’t expect were some of the results, e.g., that the median confidence level was really not changing very much. A student-level analysis did not have enough power. At that point I could have been more proactive in contacting mentors and staying in touch. I believe a check-in earlier than what was scheduled may have been useful, just to have a sounding board and to make sure I was still on track. I did appreciate the reminders about the assignments; otherwise I would have missed a few. I intend to look with a finer comb (course artifact level) and add the data from this ongoing semester.
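As an illustration only, a paired comparison of pre- and post-course confidence ratings might be sketched as below; the Wilcoxon signed-rank test is one common choice for paired Likert-type data, and the ratings shown are invented placeholders, not data from this project:

    # Illustrative paired comparison of pre- vs. post-course confidence ratings
    # (hypothetical Likert-scale data; not the actual study data).
    from scipy import stats

    pre_confidence = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3]    # one rating per student, before
    post_confidence = [4, 3, 4, 2, 4, 3, 4, 4, 2, 3]   # the same students, after

    # Wilcoxon signed-rank test on paired ordinal data; with small samples
    # (low power), a non-significant result is not surprising.
    stat, p_value = stats.wilcoxon(pre_confidence, post_confidence)
    print(f"W = {stat:.1f}, p = {p_value:.3f}")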

Although I am only semi-satisfied with what I accomplished and completed, during the middle of last semester my colleagues from chemistry and I found ourselves working on a closely related project, and we submitted an NSF-IUSE proposal.

FIGURE: Below is a timeline of what actually happened after modifying the intended project. IPSA stands for Individual Problem Solving Assessment (Anderson et al., International Journal for the Scholarship of Teaching & Learning, 2011). These were designed for this class and contain a reflective writing piece. The results were presented at Experimental Biology, April 30, 2014. At this conference I also learned more about what the ASBMB concept inventory group is working on (http://www.asbmb.org/NSF/NSFPage.aspx?id=11828). Details on their work would have been helpful at the time I started this.


References that informed my work:

1. http://www.iubmb.org/index.php?id=278
2. Villafañe et al. Biochemistry and Molecular Biology Education 39(2) (2011).
3. Shi et al. CBE—Life Sciences Education, 453–461 (2010).
4. Marbach et al. CBE—Life Sciences Education 9, 408–436 (2010).

Aeisha Thomas
Crown College, St. Bonifacius, MN – 2013-2014 Research

I started two studies during the past academic year. I will focus on the first, since my IRB for the second dictates that I review the material only after grades are in. One of the goals of the first year of the first project was to get baseline measures of student interest in biology. I also introduced science literacy in smaller bits throughout the semester, using a model similar to Krontiris-Litowitz (2013). I have decided to treat this as a preliminary study because I had low enrollment (8 of the potential 45 completed the pre-test and 3 the post-test). The students were given the CLASS-Bio survey, which is typically analyzed by breaking the questions down into 5 categories (http://www.colorado.edu/sei/class/). Since I had so few respondents and there was a high number of neutral responses, I have decided to focus on the pre-test questions that had a strong positive response (6 or more agree/strongly agree) or a strong negative response (6 or more disagree/strongly disagree). From these data, the students seemed to appreciate the value of learning biology and its relevance to the real world. In this small group, however, there were no strong positive or negative responses in the problem-solving difficulty or problem-solving effort categories, and overall strong responses appeared in only 5 of the 32 CLASS questions. Thus, I am still thinking through a meaningful way to assess the remaining responses.

I have had many challenges in my first try at Biology Education research and now have a better sense of how to proceed as I move forward.

Figure 1.

Categories                              Negative   Neutral   Positive   Question
Real World Connection                       –          –         8      14. Learning biology changes my ideas about how the natural world works.
Real World Connection                       7          1         –      19. The subject of biology has little relation to what I experience in the real world.
Enjoyment                                   1          1         6       9. I want to study biology because I want to make a contribution to society.
Enjoyment                                   1          1         6      27. I enjoy explaining biological ideas that I learn about to my friends.
Conceptual Connections/Memorization         7          1         0      19. The subject of biology has little relation to what I experience in the real world.
Problem-solving Strategies                  1          1         6       7. To understand biology, I sometimes think about my personal experiences and relate them to the topic being analyzed.
Reasoning                                   –          –         8      14. Learning biology changes my ideas about how the natural world works.

Eight students completed the CLASS-Bio survey (http://www.colorado.edu/sei/class/). The questions are typically sorted into the categories indicated. Negative indicates that the response was disagree or strongly disagree. Positive indicates that the response was agree or strongly agree.
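As a sketch of how this “strong response” tally might be automated, assuming per-student Likert responses coded as SA/A/N/D/SD (the toy data below are hypothetical, not the actual survey responses):

    # Illustrative tally of "strong" CLASS-Bio-style responses per question
    # (hypothetical responses for 8 students; not the real survey data).
    import pandas as pd

    # rows = students, columns = survey questions
    responses = pd.DataFrame({
        "Q14": ["A", "SA", "A", "A", "SA", "A", "A", "A"],
        "Q19": ["D", "SD", "D", "D", "N", "D", "D", "SD"],
    })

    positive = responses.isin(["A", "SA"]).sum()   # agree / strongly agree per question
    negative = responses.isin(["D", "SD"]).sum()   # disagree / strongly disagree per question

    # Flag questions with a strong consensus (6 or more of 8 respondents)
    summary = pd.DataFrame({"positive": positive, "negative": negative})
    summary["strong_positive"] = summary["positive"] >= 6
    summary["strong_negative"] = summary["negative"] >= 6
    print(summary)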

References that informed my work:


1. Krontiris-Litowitz, J. (2013). Using primary literature to teach science literacy to introductory biology students. J Microbiol Biol Educ. 14(1):66-77.

Katie Tifft
Johns Hopkins University, Baltimore, MD – 2013-2014 Research

My experience with implementing our Biology Scholars project so far has been surprisingly smooth. We obtained IRB approval, designed and executed the in-class group problem sessions that were the key to our project, collected data from a pre- and post-course inventory, recorded answers on related exam questions, and executed four different versions of the study survey (with help from a member of the Center for Educational Resources). The student response to all of these parts was remarkably positive. Although we did not get a very high overall response rate on the study survey, we still had enough responses to provide a substantial data set. The biggest challenge for me was definitely tackling the overwhelming amount of data that we collected. I had not fully anticipated or appreciated the challenge of collating and organizing the data, deciding which data to include in the analysis, and determining how to analyze them. Along the way I recognized that we would benefit from advice on how to analyze the data effectively and efficiently and how to apply proper statistics. The results did not show the trends we predicted, so we have to figure out how to interpret them and identify what results might be valuable or interesting. Our current challenge is to draw appropriate conclusions from the data and decide whether we should make any changes to the research plan for the second semester of the study (Fall 2014). I hope that we will be able to prepare a publication from our current results, or at least have created a baseline for future publishable studies. I definitely would not have been able to do this without the Biology Scholars experience, and whether or not this particular project is successful, I feel that I have developed the approaches and confidence to design future studies.

References that informed my work:


1. Lian, J. and F. He. Improved performance of students instructed in a hybrid PBL format. Biochem Mol Biol Educ 2013, 41(1):5-10.

Heather Verkade
Monash University, Melbourne, Victoria, Australia – 2013-2014 Research

My aim was to improve my class by showing students how to critically analyze a journal article and then having them do this in an activity directly assessed in the exam. My hypotheses were that these activities would increase their confidence in critically analyzing journal articles, improve their attitudes toward it, and increase their skill. The third hypothesis was suggested to me by both facilitators and students at the research residency workshop, so I decided to work it into the project, and I am very glad I did. I was able to adjust my existing ethics application (IRB) with no great drama (but very short on time). The project ran well, although I received fewer of the post-surveys than I expected. The most interesting data (and the data with the statistically significant change) came from testing their ability to answer simple molecular genetics questions. I found that after the semester they were better at analyzing data and were also much more careful in doing so. For example, many more considered the error bars in their answers. For this result, I thank John Geiser for insisting that I could embed the test in their final exam. Through this project and residency I learned how to really identify my research question and to simplify the overall question so the analysis is manageable. I usually get overambitious. I am hoping to publish this as a small paper in a reasonable life sciences journal. For future studies, though, I would also like to improve my statistics. I have left Monash University, and I am currently looking for another position. I am using this research residency, and this study, as a selling point to find a teaching-focused academic position. Fingers crossed.

Percentage of students with certain test answers on simple molecular genetics questions

                                        pre-test    post-test    χ-squared test for independence
Q1. Correct answer                         87%         75%         p < 0.001
Q1. Considers error bars                   14%         46%         p < 0.001
Q1. Considers error bars correctly         13%         37%         p < 0.001
Q1. Specific answer (includes time)        16%         47%         p < 0.001
Q2. Correct answer                         14%         49%         p < 0.001
Q2. Considers error bars                   11%         33%         p ≤ 0.001
Q2. Considers error bars correctly         11%         32%         p < 0.001
Q2. Answer contains unit                   25%         36%         p ≤ 0.016
Q3. Broadly correct                        60%         82%         p < 0.001
Q3. Specifically correct                   31%         58%         p < 0.001
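A minimal sketch of one of these χ-squared tests for independence, using SciPy and hypothetical pre/post class sizes (only the error-bar proportions echo the table above):

    # Illustrative chi-squared test for independence on one pre/post comparison
    # (hypothetical counts of 100 students per test; not the actual study data).
    from scipy.stats import chi2_contingency

    # Rows: pre-test, post-test; columns: considers error bars, does not
    observed = [
        [14, 86],   # pre-test: ~14% of 100 hypothetical students
        [46, 54],   # post-test: ~46% of 100 hypothetical students
    ]

    chi2, p_value, dof, expected = chi2_contingency(observed)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")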

References that informed my work:

1. Spiegelberg B. (2014) A focused assignment encouraging deep reading in undergraduate biochemistry. Biochemistry and Molecular Biology Education 42:1, 1-5.

2. Gormally C, Brickman P and Lutz M. (2012) Developing a Test of Scientific Literacy Skills (TOSLS): Measuring undergraduates’ evaluation of scientific information and arguments. CBE – Life Sciences Education 11, 364-377.
