

NSF MSP: CEEMS Project – Year Two Evaluation Summary Report August 2013

University of Cincinnati Dr. Anant Kukreti, PI

Prepared for:


NSF MSP: CEEMS Project - Year Two

Evaluation Summary Report

Evaluation Services Center

Debbie Zorn, Director

Report prepared by:

Catherine Maltbie, Research Associate

Katherine Butcher, Junior Research Associate

Imelda Castaneda-Emenaker, Senior Research Associate

With the assistance of:

Audra Morrison, Research Coordinator

© 2013 Evaluation Services Center University of Cincinnati PO Box 210105 Cincinnati, OH 45221-0105 Tel: (513) 556-3900 Fax: (513) 556-5112 http://www.uc.edu/evaluationservices/ E-mail: [email protected]


NSF MSP: CEEMS Project – Year Two

Evaluation Summary Report

Table of Contents

Evaluation Questions ....................................................................................................... 2

Procedures ........................................................................................................................ 2

Participants ........................................................................................................................................... 3

Instruments and Data Collection Procedures ................................................................................. 3

Data Analysis Strategies ..................................................................................................................... 7

Results .............................................................................................................................. 8

Project Implementation ..................................................................................................................... 8

Project Outcomes ............................................................................................................................. 15

Project Research ................................................................................................................................ 26

Sustainability ...................................................................................................................................... 26

Conclusions .................................................................................................................... 33

Appendixes ..................................................................................................................... 34

Appendix A. Copies of Instruments ............................................................................... 35

Appendix B. Teacher Surveys Results ........................................................................... 73

Appendix C. Student Surveys Results ............................................................................ 97

Appendix D. Examples of Student Work ..................................................................... 107

Appendix E. 2013 STEM Conference Evaluation Results ............................................ 113


CEEMS – Year 2 Evaluation 1

NSF MSP: CEEMS Project – Year Two

Evaluation Summary Report

The CEEMS project is the University of Cincinnati’s National Science Foundation Mathematics and Science Partnership (NSF MSP) project, funded in October 2011. The project goals are to:

1) Improve grades 6-12 science and math achievement to prepare for and increase interest in engineering or other STEM careers;

2) Develop math and science teacher knowledge of engineering, the engineering design process and challenge-based learning through training and classroom support;

3) Recruit engineering undergraduates as science and math teachers through a succinct licensure program;

4) Recruit career changers to be science and mathematics teachers;

5) Build a sustainable education licensure STEM degree-granting infrastructure that will positively impact the entire region.

During this second year of the five-year project, the focus was on continued teacher recruitment, lesson development, teacher professional development, and classroom implementation. Fifteen in-service teachers completed the 2012 Summer Institute for Teachers (SIT), and 12 of them continued with Year 2 of the professional development during the 2013 SIT. These teachers, called Cohort 1, developed and implemented 40 CEEMS units that incorporated challenge-based learning and the engineering design process. The project team recruited 24 in-service teachers for Cohort 2, and all but one started during the 2013 SIT.

During the initial planning year, a project plan and evaluation activities matrix was developed to guide the evaluation. Since the overall evaluation question, “Is the CEEMS project working?”, is complex, individual evaluation sub-questions were associated with each project activity, and benchmarks were identified. The evaluation design is a “living document”: as the project was implemented during the summer of 2012 and the 2012-2013 academic year with modifications and expanded activities, the evaluation plan evolved to provide the project team with feedback for continuous improvement during the year. This summary report provides an evaluation of progress made to date toward goals and objectives. To help the project team continue to improve its processes, evaluation results are summarized relative to the four major aspects of the CEEMS project: project implementation, teacher and student outcomes, research design, and project sustainability.

Evaluation Questions

The evaluative questions associated with each aspect of the project are shown in Table 1.

Table 1. CEEMS Project – Overarching Evaluation Questions

Project Implementation

• How consistent are the project activities, as conducted, with the project design? What are participants’ reactions to these activities?

• What implementation concerns arose during the course of CEEMS, and what solutions did the project team identify?

Outcomes

• Since the project activities involve direct interaction with teachers, what are the teacher-related outcomes associated with the project activities? These include development of the CEEMS units, teachers’ current instructional practices, and teachers’ content knowledge.

• What student impacts are there when the CEEMS teachers implement these lessons in the classroom? These include student reactions to the units, student assessment data (pre-post unit data), and student achievement data collected via state-initiated standardized tests.

Project Research

• How effective is the research design in CEEMS?

• How well do the research activities address project goals?

Project Sustainability

• What aspects of the project team’s decisions regarding design- and challenge-based instruction for supporting science and math learning are valid and sustainable after the funding ends?

• What aspects of CEEMS are portable to other learning environments?

Procedures

A mixed-methods approach was used to investigate the implementation of the CEEMS project and to document progress made toward goals and objectives. Quantitative and qualitative data were collected from all participating teachers and the majority of attendees of the 2013 STEM Conference. To reduce the data collection burden on participants, the research and evaluation teams coordinated data collection and the sharing of results.


Participants

Participants in the CEEMS evaluation include the teachers participating in project activities and 2013 STEM Conference participants. At the end of Year 2, there were 12 returning Cohort 1 in-service teachers and 23 Cohort 2 in-service teachers. In addition, 12 pre-service teachers participated in SIT courses but did not implement units because they do not have classrooms of their own. Administrators for the in-service teachers also participated in an Administrators Academy during the summer of 2012.

Evaluation data were collected from almost all 2013 STEM Conference participants. These participants were educators, business people, and community members interested in STEM education. Participants voluntarily completed 524 hardcopy surveys at the end of the conference sessions and 139 online overall conference evaluations.

Other data pertaining to the evaluation were collected via documentation of student results provided by the project team or project staff. These persons are not participants in the evaluation and their data are considered secondary data.

Instruments and Data Collection Procedures

Survey instruments and focus group/interview protocols were developed to address the evaluation questions. (See Appendix A.) Teacher surveys were developed to focus on professional development activities and their effects, and they were administered after each activity in order to provide feedback to the project team for continuous improvement.

Seminar One Evaluation for Faculty Workshops

The project team coordinated a second set of five workshops to support development of the SIT courses for returning and new-to-the-project faculty. Faculty voluntarily continued teaching these courses or were replaced if they could not continue due to scheduling conflicts. The objectives of these sessions were to introduce and reinforce challenge-based learning and engineering design concepts and to help the faculty coordinate their courses with the other courses given to SIT participants. At the end of each workshop, each participant completed an evaluation that was used to guide future instruction and discussion. The evaluations had five closed-ended questions, three open-ended questions, and a set of identifiers to help track responses from month to month.


Teacher Current Instructional Practices Survey

In order to document changes in teacher content knowledge, attitudes, and classroom behaviors, a pre-post content knowledge assessment was developed for each course, and an overall pre-post survey was developed to document participating in-service teachers’ current instructional practices associated with challenge-based and design-based learning. This survey is based on an instrument developed by researchers at the Evaluation & Assessment Center for Mathematics and Science Education at Miami University, Oxford, OH, and was modified with their permission.

The Current Instructional Practices survey measures changes in the instructional practices of the Summer Institute for Teachers (SIT) participants. The survey has two batteries of questions listing the same challenge-based/design-based learning practices: one asks about participants’ incorporation of these practices into instruction, and the second asks participants to indicate their level of confidence when using these instructional practices. The pre-survey was completed at the beginning of the 2012 SIT by all 16 participating in-service Cohort 1 teachers in June 2012 and served as baseline information on participants’ instructional practices. The survey was not given to pre-service teachers because they do not have a history of teaching from which to gauge the relationship between the incorporation of these instructional practices and their confidence level. The post-survey for Cohort 1 was administered in May 2013 during a Community of Practice meeting. The Community of Practice meetings occurred once a quarter and brought together the participating teachers, resource team, project team, and evaluation team to discuss what was happening in the classroom and the status of project activities and requirements, and to share information via informal, just-in-time professional development discussions. The pre-survey for Cohort 2 was administered during April and May of 2013, at the time of enrollment in the CEEMS project.

Administrator Academy Evaluation

Administrator support helps CEEMS teachers with classroom implementation during the academic year. To help foster this support, a 2012 Administrators Academy was held July 25-27, 2012 as part of the 2012 SIT. The academy included the project team presenting information about the project, the administrators experiencing a challenge-based learning activity, joint administrator and teacher professional development sessions, and an evaluator-facilitated discussion that explored ways the administrators could support and promote the CEEMS project. The evaluation survey consisted of six closed-ended questions and three open-ended questions. Feedback from these surveys was used to help the project team support teachers during their classroom unit implementation. The survey was given to the administrators present during the final day of the academy and was completed by all (n=9) administrator attendees.


2012 End of Summer Institute for Teachers (SIT) Evaluation Survey

Participants completed a two-page survey evaluating the SIT during the last day of the institute. The survey consisted of 20 closed-ended questions rating the usefulness of various aspects of the CEEMS project SIT experience and five closed-ended questions indicating participants’ level of satisfaction with the support provided and their overall satisfaction as a participant in the SIT during this summer. The survey ended with three open-ended questions: two asked participants to list strengths of and areas of improvement for the 2012 CEEMS project SIT, and the final question asked participants to list any areas of concern that they would like the resource team to address to help in their participation with the CEEMS project. Since in-service teachers received more support than pre-service teachers, evaluation results were separated by these two groups. All in-service teachers completed the evaluation (n=16); fewer than half of the pre-service teachers did (n=7).

2012-SIT Course Evaluations

At the end of each of the six SIT courses, attendees were asked to complete a course evaluation form. This survey consisted of 18 closed-ended questions and two open-ended questions. The form is a modification of the CECH college course evaluation form, with emphasis placed on challenge-based learning and the engineering design process. Results are used for course improvement.

Teacher-Led Professional Development Evaluation Forms

As part of the project requirements, each teacher conducted two professional development sessions with other educators at their respective schools or at professional conferences. Attendees were asked to complete a one-page survey with five closed-ended evaluation questions, two open-ended questions, and two demographic questions. The survey asked about the potential for future use of the concepts and activities discussed; responses will help the project leaders gauge project dissemination.

2013 Teacher End-of-Year Focus Group Discussion Guide

On May 6, 2013, the evaluation team conducted a 90-minute focus group with the Cohort 1 teachers. The focus group occurred at the completion of the 2013 STEM Conference in a Tangeman University Center seminar room; 14 of the 15 Cohort 1 teachers attended. The discussion was audio-recorded and a summary was shared with the project team. The discussion was guided by overarching questions: How did the year go, for yourself and for your students? What worked? What changes do you suggest? How do you want to interact with Cohort 2 teachers? Any other comments? The full discussion guide is shown in Appendix A.


2012-13 Resource Team Communication Log

In mid-January, the evaluation team developed and introduced an online communication log for resource team members to document their interactions with the CEEMS teachers. The log consisted of 11 questions: eight quick closed-ended responses and three open-ended questions summarizing the communications and the follow-up actions expected. Information from other sources could be copied and pasted into text boxes, reducing the effort needed to complete the form. All resource team members were given the survey link and asked to complete a log after they completed a communication, or series of related communications, with any of the teachers. The first communication was logged on January 18, 2013, and the 172nd on March 7.

2012 CEEMS Student Activity Feedback Survey and 2013 Revised Student Surveys

After each unit was taught, the teachers were asked to distribute Student Activity Feedback Surveys. One version of the survey was used for all grade levels. The survey contained 13 closed-ended and three open-ended questions. The first question asked the students to rate the unit, and the remaining closed-ended questions asked students to indicate their agreement with statements about the unit, including the teachers’ behaviors and the activities.

After most teachers completed their CEEMS Unit 2, the project team and evaluation team met to discuss the process being used to assess what was happening in the classrooms. The teams decided to modify the student feedback forms to get feedback after critical parts of the lesson; specifically, they wanted to understand, in more detail, what was happening in the classrooms from the students’ perspectives. The first survey was administered after the “Big Idea through Guiding Questions” discussion sessions in the classrooms. The second survey focused on the “Engineering Design Process” and was administered after that activity. Both surveys used a four-point scale to force a positive or negative response. The final survey was similar to the original student feedback survey and focused on the “Challenge-Based Learning” aspects of the entire unit; it maintained the same five-point scale so that responses were comparable to the earlier student feedback surveys, and as many questions as possible were kept similar. As part of a pilot study conducted in May 2013 by five teachers, 213 students completed the Big Idea survey after the guiding questions were identified, 198 students completed the Student Activity survey after the engineering design process activity, and 205 students completed the revised Student Feedback survey when the lesson was completed.

2013 STEM Conference – Individual Session and Overall Evaluations

The 2013 STEM Conference was a CEEMS-sponsored educational community event that took place on the UC campus May 6, 2013. Attendees were asked to complete post-session hardcopy evaluations, which were immediately collected, and they were sent an online conference evaluation the morning after the conference. The post-session evaluations were one page long and consisted of nine closed-ended and three open-ended questions. The questions addressed the presenter’s delivery of the material, an assessment of the activities, the level of understanding gained pertaining to challenge-based learning and the engineering design process, and the level of usefulness of the curricular materials presented. The online survey had more questions, 25 in total: three demographic questions, seven overall rating questions related to the CEEMS goals and objectives, seven questions evaluating conference logistics, three open-ended questions evaluating the usefulness of concepts presented, four questions asking respondents to identify the sessions they attended, and another open-ended question asking what they liked best about the sessions. To increase returns, reminder email messages were sent every week for three weeks.

Data Analysis Strategies

Data analysis involved looking across all the data collected for emergent themes. Quantitative data analysis primarily consisted of descriptive and frequency results from closed-ended questions on surveys and logs related to project activities. For the pre-post survey, a paired-sample t-test was performed; results indicated whether or not responses changed from Time 1 to Time 2. Qualitative data analysis techniques were used to analyze the data collected from the focus groups and open-ended survey questions. Text was analyzed to see what phrases, concepts, and words were prevalent throughout the participants’ responses. During this stage of the analysis, data were sorted into categories applicable to the purpose of the evaluation. Codes were defined and redefined throughout the analysis process as themes emerged. At the end of the analysis, major codes were identified as central ideas or concepts (Glesne, 2006)¹. These central ideas were assembled by pattern analysis for the development of major themes. Conclusions were drawn from major themes.
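
The pre-post comparison described above can be sketched as a paired-sample t-test. The example below is a minimal illustration using hypothetical ratings, not actual CEEMS survey data; the critical value shown is the standard two-tailed t value for df = 11 at alpha = .05.

```python
# Illustrative paired-sample t-test for a pre-post survey item.
# All ratings below are hypothetical, not actual CEEMS data.
import math
import statistics

# One rating per teacher at Time 1 (pre) and Time 2 (post), same order.
pre = [2, 3, 2, 3, 2, 3, 2, 2, 3, 2, 3, 2]
post = [4, 4, 3, 4, 3, 4, 4, 3, 4, 3, 4, 4]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_diff = statistics.mean(diffs)
sd_diff = statistics.stdev(diffs)              # sample standard deviation
t_stat = mean_diff / (sd_diff / math.sqrt(n))  # paired-sample t statistic

T_CRIT = 2.201  # two-tailed critical value, df = 11, alpha = .05
significant = abs(t_stat) > T_CRIT
print(f"t({n - 1}) = {t_stat:.2f}, significant: {significant}")
```

In practice the same comparison would be run in a statistics package, which also reports an exact p-value.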

Student baseline standardized test data were collected for the 2011-2012 academic year (Year 1). As teachers continue to participate in CEEMS activities, subsequent changes in student data will be analyzed. This analysis will include an Analysis of Variance (ANOVA) comparing mean achievement scores over time for the participating school districts, the schools where the teachers are located, and, if possible, the students taught by CEEMS teachers. In addition, pre-post unit-specific assessment data for 12 units were collected during the first classroom implementation. Due to the limited representativeness of these data, they will be reviewed qualitatively, and a more rigorous analysis will be conducted on the data collected before and after all CEEMS units in the 2013-14 academic year.
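
The planned comparison of mean achievement scores over time can be sketched as a one-way ANOVA, which contrasts between-group and within-group variance. The scores below are hypothetical placeholders, not CEEMS achievement data.

```python
# Illustrative one-way ANOVA comparing mean achievement scores across
# academic years. All scores are hypothetical, not CEEMS data.
import statistics

scores_by_year = {
    "2011-12": [70, 72, 68, 71, 69, 73, 70, 72],
    "2012-13": [74, 75, 72, 76, 73, 75, 74, 77],
    "2013-14": [78, 80, 77, 79, 81, 78, 80, 79],
}

groups = list(scores_by_year.values())
all_scores = [s for g in groups for s in g]
grand_mean = statistics.mean(all_scores)

# Between-group and within-group sums of squares.
ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2 for g in groups)
ss_within = sum((s - statistics.mean(g)) ** 2 for g in groups for s in g)

df_between = len(groups) - 1               # number of groups minus one
df_within = len(all_scores) - len(groups)  # total observations minus groups
f_stat = (ss_between / df_between) / (ss_within / df_within)
print(f"F({df_between}, {df_within}) = {f_stat:.1f}")
```

An F statistic well above the critical value for (2, 21) degrees of freedom would indicate that mean scores differ across years; a statistics package would also report the p-value.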

As the research plan is implemented, the evaluation and research teams will share common data and analysis. This work is scheduled to begin in the third year of the project.

¹ Glesne, C. (1999). Becoming qualitative researchers: An introduction. New York: Longman.


Results

Overall, the CEEMS project is steadily progressing toward the attainment of its goals and objectives. The first cohort of participating teachers has been trained, has implemented three units each, and has conducted professional development sessions at their respective schools or at professional conferences to educate others on challenge-based learning and the integration of the engineering design process into middle and high school classrooms.

The National Advisory Board (NAB) members provided positive comments and suggestions for improving future CEEMS project activities. The NAB’s positive comments were supported by our evaluation results and included: the resource team assembled by CEEMS is committed to the project and a very useful resource for the teachers; the Administrators’ Academy was received very positively; and there is strong camaraderie among project team members. Since the NAB members were charged with providing advice, they also made numerous suggestions for the project team to consider when planning. Key suggestions included:

• Provide additional support for classroom implementation, since the addition of 24 more teachers is too much for the current resource team.

• Define and clearly articulate the intended outcomes and then plan, or re-plan, the summer program around them. The NAB’s feedback provided specific suggestions for courses and the SIT organization so that engineering experiences can be balanced with building challenge-based learning opportunities while allowing for more teacher reflection.

• Explicitly use challenge-based learning as the primary pedagogy in the summer courses.

• When looking at classroom implementation, the teachers need to create an explicit assessment and evaluation plan for the units that includes finer-grained assessment data than standardized test scores. It may take multiple years before there are changes in standardized test scores at these schools and districts.

• The research agenda is very ambitious. The NAB suggested that the research have a tighter focus concentrating on the teachers’ experiences.

• Finally, the project team needs to create a timeline that explicitly defines what is expected to happen annually. This could include when certain benchmarks are met.

Project Implementation

CEEMS activities planned for Year 2 were fully implemented. Fifteen participating teachers created and implemented three units each during the 2012-13 academic year. The teachers were given support by the project team, resource team, and their administrators. Copies of all lessons are currently being made available on the project website (http://ceas.uc.edu/special_programs/ceems/CEEMS_Home.html).


2012 Summer Institute for Teachers (SIT)

The SIT was implemented as designed and overall SIT participants reported being satisfied with their experience during the summer (mean rating of 4.63 out of 5). Twenty-one teachers participated, 16 in-service teachers and six pre-service teachers. The in-service teachers reported a higher level of satisfaction than the pre-service teachers (means of 4.75 and 4.00, respectively). The in-service teachers received more support related to the creation of the products required for classroom implementation as part of the SIT (CBL unit, poster and movie). Descriptive results from the satisfaction questions are shown in Table 2.

Table 2. Satisfaction Results from 2012 SIT Participants

Please indicate your level of satisfaction or dissatisfaction with the following aspects of the CEEMS Project 2012 SIT experience.

(Each row lists, in order: In-service N, Mean*, Std. Dev.; Pre-service N, Mean*, Std. Dev.; TOTAL N, Mean*, Std. Dev.)

Access to UC Computing Services 16 4.06 1.24 5 4.20 .837 21 4.10 1.14

Support to make your CBL unit 16 4.63 .619 - - - 16 4.63 .619

Support to make your poster 16 4.44 .727 - - - 16 4.44 .727

Support to make your movie 16 4.69 .602 - - - 16 4.69 .602

Overall - Experience as a participant in the CEEMS project SIT during this summer 16 4.75 .447 3 4.00 1.00 19 4.63 .597

*Scale: 5= Very Satisfied; 4=Satisfied; 3=Neither Satisfied Nor Dissatisfied; 2=Dissatisfied; 1=Very Dissatisfied

Considering the usefulness of individual aspects of the SIT, all ratings were positive (means above 3 out of 5). For the in-service teachers, the most highly rated course was Foundations of Engineering (mean of 4.88 out of 5), followed closely by Models and Applications in Earth Systems (mean of 4.86 out of 5). The lowest rating was given by participants who took the Models and Applications in Physical Sciences course (mean of 3.25 out of 5). Pre-service teachers rated the Engineering Applications in Mathematics course higher than the in-service teachers did (mean ratings of 4.14 and 3.88 out of 5, respectively); this was the only course with responses from more than one or two pre-service teachers. Table 3 shows these overall course usefulness ratings.


Table 3. Usefulness Ratings for Courses within the 2012 SIT

Please indicate how useful or useless the following aspects of the CEEMS Project 2012 SIT experience were in helping you create Challenge-Based Learning (CBL) units.

(Each row lists, in order: In-service N, Mean*, Std. Dev.; Pre-service N, Mean*, Std. Dev.; TOTAL N, Mean*, Std. Dev.)

Overall - Foundations of Engineering Course 16 4.88 .342 - - - 16 4.88 .342

Overall - Engineering Applications in Mathematics Course 16 3.88 .885 7 4.14 .690 23 3.96 .825

Overall - Models and Applications in Physical Science Course 8 3.25 1.39 2 4.00 .000 10 3.40 1.26

Overall - Models and Applications in Earth Systems Course 7 4.86 .378 1 5.00 – 8 4.88 .354

Overall - Models and Applications in Biological Sciences Course 2 4.50 .707 2 4.00 .000 4 4.25 .500

Overall - Engineering Models Course 2 4.50 .707 1 4.00 – 3 4.33 .577

*Scale: 5= Very Useful; 4=Useful; 3=Neutral; 2=Useless; 1=Very Useless

Considering ratings given by in-service teachers for support and individual sessions, the highest mean ratings indicated that participants appreciated the high level of support they were given and appreciated being shown examples of how to connect the material learned to real-world applications. The highest usefulness rating was for the Panel Discussion with Practicing Engineers (mean rating of 4.69 out of 5). The next highest rated aspects of the SIT were the Video Creation presentation and the Coaching Sessions with Resource Team Members (mean ratings of 4.63 and 4.56 out of 5, respectively). Overall – Interactions with Pre-service Teachers had the lowest mean rating, 3.13 out of 5. This low rating may indicate that the project team needs to better facilitate opportunities for collaboration and interaction between the in-service and pre-service teacher participants. Descriptive results from the usefulness ratings of specific aspects of the SIT are shown in Table 4.

These quantitative results are consistent with the responses to the open-ended questions. The in-service teachers identified the resource team, learning about CBL and the integration of engineering concepts, and teacher collaboration as strengths of the SIT; pre-service teachers identified the breadth of knowledge incorporated in the SIT and teacher collaboration. When asked to suggest areas of improvement, the in-service teachers requested better communication of expectations, more effective inclusion of CBL into the courses, increased teacher collaboration, and revisiting the participants’ total workload and its timing. Pre-service teachers mentioned taking into account their prior knowledge. For the academic year, in-service teachers stated that they were concerned about communication of timing and deadlines. A full list of participant responses to all open-ended questions is shown in Appendix B.

Table 4. Usefulness Ratings for Different Aspects of the 2012 SIT

Please indicate how useful or useless the following aspects of the CEEMS Project 2012 SIT experience were in helping you create Challenge-Based Learning (CBL) units.

Aspect | In-service: N, Mean*, Std. Dev. | Pre-service: N, Mean*, Std. Dev. | TOTAL: N, Mean*, Std. Dev.

Overall - Interactions with Resource Team 16 4.44 .512 - - - 16 4.44 .512

Overall - Interactions with Pre-service Teachers 15 3.13 1.30 5 3.60 1.34 20 3.25 1.29

CBL in Action presentation (July 5) 15 4.27 .594 - - - 15 4.27 .594

Unit Template presentation (July 6) 16 4.44 .629 - - - 16 4.44 .629

Academic Standards presentation (July 9) 16 4.13 .500 5 3.60 .894 21 4.00 .632

Wiki presentation (July 11) 16 4.44 .814 - - - 16 4.44 .814

RET Teacher presentation (Amy Jameson-Dater, July 11) 16 3.81 .911 2 4.00 .000 18 3.83 .857

Life of an Engineer poster presentation (July 16) 14 4.29 .994 - - - 14 4.29 .994

Video Creation presentation (July 18) 16 4.63 .806 - - - 16 4.63 .806

Assessments and Rubrics presentation (July 18) 15 3.67 .900 4 3.50 1.29 19 3.63 .955

Misconceptions in Science and Math presentation (July 25) 16 4.13 .885 7 4.43 .535 23 4.22 .795

Panel discussion with practicing engineers (July 26) 16 4.69 .479 - - - 16 4.69 .479

Exploring Design in the Next Generation Science Standards lunch presentation (July 27) 14 4.29 .914 - - - 14 4.29 .914

Coaching Sessions with Resource Team Members (ongoing) 16 4.56 .512 - - - 16 4.56 .512

*Scale: 5= Very Useful; 4=Useful; 3=Neutral; 2=Useless; 1=Very Useless

2012 SIT Administrator Academy

Overall, the objectives of the Administrators' Academy were met. The project team obtained administrator buy-in for the CEEMS project. Administrators were given a snapshot of what their teachers were learning and of the project expectations, so that teachers would have an advocate as they implemented their CBL units during the academic year.

The academy was well received, with all evaluation questions receiving very high ratings. The highest rated questions pertained to the usefulness of the presentations, exercises, and activities in increasing understanding of how the CEEMS project fits into science and math education and understanding of the concepts presented (mean of 4.89 out of 5). While still high, the lowest rating was associated with the administrators indicating that they had a plan of action for CEEMS implementation in their school or district (mean of 4.43 out of 5). The administrators outlined their plans for the academic year in a separate document; these plans are discussed at the end of this report. Table 5 summarizes quantitative results for the closed-ended questions.

Table 5. Average Agreement Levels for the Closed-ended Evaluation Questions

N Mean* Std. Dev.

1. Overall, I understand the purposes and objectives of the CEEMS project. 9 4.78 .441

2. Overall, the CEEMS Administrator Academy will help me better support the participating teachers in my school or district. 9 4.78 .441

3. The information was organized so that a logical progression of ideas was presented to participants. 9 4.78 .441

4. The presentations included in this academy were helpful to my understanding of how the CEEMS project fits into science and math education. 9 4.89 .333

5. The exercises and activities helped me understand the concepts being presented. 9 4.89 .333

6. I left this academy with a plan of action for CEEMS implementation in my school or district. 7 4.43 .535

*Scale: 5= Strongly Agree; 4=Agree; 3=Neutral; 2=Disagree; 1=Strongly Disagree

The open-ended responses written by the administrator attendees support the quantitative results. The aspects that were stated as most useful were:

• active participation and meeting engineers I can bring in to talk to the students

• describing how the program works

• The program director talk about specifics and the panel of engineers. also the discussion among other admin for ideas, concerns

• I think that understanding how the challenge design process works and experiencing some of the activities will allow me to better support my science teacher

• learning about CEEMS and how to support my teacher

• learning/ challenge activities/ design activities with teachers and other admin

• resource info and overview of teacher learning

• The connection of standards to real life applications was great.

• understanding the responsibilities of the teachers and admin's role in CEEMS

The final question on the evaluation survey asked the administrators if they had any additional questions related to the resource team and their interactions with their teachers. There were two responses. The first asked the project team to "just make sure to do this," that is, to provide support to the administrators and their teachers during the academic year. The other response asked for a "summary of the overall aspects of CEEMS." The project team has since sent a project summary to all schools and districts participating in the project.

To guide thinking about next steps, each administrator was asked to answer four questions, giving examples and expected outcomes: 1) How do you plan to "market" this program to your higher level administrators? 2) How do you plan to "market" this plan to other teachers in your school and district? 3) What are your initial thoughts about how you will recruit additional teachers for next year's SIT? Do you recommend a targeted recruitment in selected school(s) or a district-wide effort? 4) How do you plan to support the 2012 SIT participants as they conduct in-school and district-wide professional development activities? A review of the action plans outlined by each administrator indicates that most administrators are at the awareness ("good to know") stage, with a few stating specific concrete actions. All responses are included in Appendix B.

Currently, CEEMS project marketing consists of lead project staff meeting with higher level administrators, with some administrators planning to highlight CEEMS information on school or district websites. These administrators also plan to use CEEMS published materials to support marketing to both administrators and teachers. The CEEMS participants will have administration support for venues to "share experiences" with other teachers. If implementing these units positively impacts student scores, those results will be shared to create enthusiasm for the project and to help recruit teachers for future SIT opportunities. The predominant strategy suggested was targeted recruitment, but the target is not the same across school districts.

Facilitation of communication was the most commonly reported form of administrator support. The administrators stated that they would initiate frequent communication, and some plan to observe the CBL units. Additionally, one school has arranged common planning times for its CEEMS participants. This support should be watched closely, as it may become a "highly suggested" recommendation for future years when multiple teachers from the same school or district participate. Multiple administrators also stated that they would facilitate professional development opportunities within their school or district. The CEEMS project team determined that it needs to monitor the administrator support given to participants and, if needed, remind administrators of what they planned to do.

2012-13 Academic Year Unit Implementation Activities

The CEEMS project provided support to participating teachers during the academic year. This support consisted of regular communication and visits from designated Resource Team members and three scheduled Community of Practice meetings at which the teachers discussed what was happening in their classrooms with each other and the project team. Teachers also received project updates from the project staff and, if needed, "just-in-time" professional development. The teachers met in November, February and May. In addition to discussions about the status of the units, the teachers were given revised project documentation at the November and February meetings, which was discussed as a group. In May, the evaluator conducted an end-of-year focus group with the teachers and distributed revised feedback surveys to five teachers who volunteered to pilot more detailed student and teacher surveys. These surveys (three for the students and three for the teachers) were developed to give the evaluator and project team more detailed information about unit implementation, which will inform how the project team provides professional development that helps teachers engage students in identifying the guiding questions and in the engineering design process. The revised surveys were discussed with the pilot teachers during another focus group, and the teachers gave the evaluation team very specific, constructive feedback for survey modifications. The modified surveys (students: after the guiding question discussion and at the end of the unit; teachers: after the guiding question discussion, on the engineering design process, and a final reflection on the entire challenge-based unit process and unit) will be distributed to all participating teachers and their students during the 2013-2014 academic year.

The resource team members' first online communication log documenting their interactions with the CEEMS teachers was entered January 18, 2013, and the 172nd on March 7. The 172 logs were split roughly evenly between new and follow-up communications. Records indicate that all CEEMS teachers had at least three documented communications. Over 50% of the communications were email messages. Fifty percent of the communications concerned "unit development," and another 25% related to "unit implementation." The most frequently identified specific topics were "engineering design-based aspects of the lessons" (16%) and "challenge-based aspects of the lessons" (12%). The open-ended comments written as part of these logs gave a broad overview of the support provided by the resource team. While not all resource team members completed these logs during spring 2013, those who did liked the format, and all resource team members will be asked to continue using the instrument. Biweekly reminders will be sent to encourage them to document communications regularly throughout the entire 2013-2014 academic year.
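The mode and topic percentages above come from simple tallies of the log records. A minimal sketch of that kind of summary, using entirely invented log entries (the field values and counts below are illustrative, not the actual CEEMS log data):

```python
from collections import Counter

# Hypothetical communication-log records (mode, topic). These entries are
# invented for illustration; they are not the actual CEEMS log data.
logs = (
    [("email", "unit development")] * 45
    + [("email", "unit implementation")] * 25
    + [("phone", "unit development")] * 20
    + [("visit", "engineering design-based aspects")] * 10
)

modes = Counter(mode for mode, _ in logs)    # how each contact was made
topics = Counter(topic for _, topic in logs) # what each contact was about
total = len(logs)

# Report each topic as a percentage of all logged communications.
for topic, n in topics.most_common():
    print(f"{topic}: {100 * n / total:.0f}%")
```

The same tally-and-percentage pattern applies whether the logs are read from a spreadsheet export or a database.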

2013 SIT Preparation Activities

The CEEMS project team conducted a series of workshops (February 5 – April 9, 2013) for faculty teaching the 2013 SIT. There were five workshops in total. Changes from the previous year's workshops were based on feedback from the evaluations of the 2012 Faculty Workshops and on suggestions from the National Advisory Board. Specifically, connections were made to mathematics skills within each course, and challenge-based learning examples were incorporated into the course instruction as much as possible.

The first workshop was for faculty new to the project; the agenda included an introduction to the project, an introduction to challenge-based and design-based learning, and a discussion with a returning faculty member. All faculty were asked to attend the second workshop, which included a more detailed discussion of CBL and lessons learned from the previous year. A few 2012 participating teachers discussed their units and classroom implementation so that the faculty could see how their course material might be applied by teachers. The third workshop included a discussion of the concept mapping used to assess participating teachers' content knowledge at the beginning and end of each course, and a discussion of academic standards for K-12 students in Ohio. The last two workshops were left open for faculty to share course outlines so they could coordinate information and design challenges across courses where appropriate. The courses included the following engineering courses: Engineering Foundations (all Year 1 teachers); Applications of Technology (all Year 2 teachers); Engineering Applications of Mathematics; Engineering Models; and Engineering Energy Systems. The science courses were: Modeling & Applications in Physical Sciences; Modeling & Applications in Biological Sciences; and Modeling & Applications in Earth Systems. All course syllabi were developed and provided to the project team prior to the beginning of the SIT. Evaluation results from the faculty attending the workshops indicate that the sessions met the goal of supporting faculty in developing or revising the SIT courses, which were taught this year by a mixture of returning and new faculty.

Additionally, the project team began recruitment activities for the 2013 SIT earlier in the academic year. The goal was to increase the number of applicants and to select 24 qualified teachers who complemented the first cohort. The project team modified recruitment materials and created a video outlining the benefits of the project. These materials are on the CEEMS website (http://ceas.uc.edu/special_programs/ceems/CEEMS_Home.html).

Project Outcomes

The project has had a positive impact on teachers and students. Cohort 1 teachers each created and implemented three CEEMS units that incorporated both Challenge-Based Learning and the Engineering Design Process. The teachers reported that the project was very worthwhile because of the positive effect it had on their students. They reported changing their pedagogy and doing things differently in the classroom, and, while the creation and implementation of the units was difficult, they said that participating in this project made them better teachers.

Teacher Outcomes

Over the course of the entire project, the CEEMS teachers have produced 40 units, each of which has been successfully implemented in a middle or high school classroom at least once. These units are being uploaded to the CEEMS Project website (http://ceas.uc.edu/special_programs/ceems/CEEMS_Home.html), where they can be used by other teachers. These curricular resources will become a major part of the project's dissemination and sustainability plans.

Since the evaluation strives to measure changes in teacher content knowledge, attitudes, and classroom behaviors, an overall pre-post survey was developed to document participating in-service teachers' current instructional practices associated with challenge-based and design-based learning. The pre-survey was completed by all 16 participating in-service Cohort 1 teachers in June 2012, at the beginning of the 2012 SIT, and served as baseline information on participants' instructional practices. The post-survey for Cohort 1 was administered in May 2013 during a Community of Practice meeting. The pre-survey for Cohort 2 was administered in April and May 2013 at the time of enrollment in the CEEMS project.

Cohort 1 Results

Results from a one-sample t-test analysis indicate significant increases in participants' reported levels of confidence for all current instructional practices listed. Results for each question can be found in Table B-3-1 in Appendix B. In an end-of-year focus group, Cohort 1 teachers indicated that the CEEMS training and support made them more comfortable implementing these challenge-based units. Tables B-3-2 and B-3-3 in Appendix B summarize the pre and post results for confidence. Reviewing the extent to which teachers reported using certain practices showed large shifts in the response distributions toward higher levels of usage. Because the response choices were categorical, means could not be computed; instead, a review of the pre-post distributions and a chi-squared analysis for the entire group indicate that usage-level responses shifted toward higher usage from the beginning to the end of the academic year, and only one person indicated never using one practice, a large decrease from the pre-survey results. Usage results are shown in Tables B-3-4 and B-3-5 in Appendix B, pre and post respectively.
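The two tests above are standard: a one-sample t-test on post-minus-pre gain scores for the confidence items, and a chi-squared comparison of the pre and post response distributions for the categorical usage items. A minimal sketch of both statistics, using invented data (the gain scores and usage counts below are illustrative, not the actual CEEMS results):

```python
import math

def one_sample_t(diffs):
    """t statistic for H0: mean difference = 0 (post minus pre)."""
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

def chi_squared(pre_counts, post_counts):
    """Chi-squared statistic comparing two response-category distributions
    (e.g., counts of Never / Tried / Used regularly at pre and post)."""
    total_pre, total_post = sum(pre_counts), sum(post_counts)
    grand = total_pre + total_post
    stat = 0.0
    for p, q in zip(pre_counts, post_counts):
        col = p + q
        exp_p = total_pre * col / grand   # expected count under independence
        exp_q = total_post * col / grand
        stat += (p - exp_p) ** 2 / exp_p + (q - exp_q) ** 2 / exp_q
    return stat

# Hypothetical confidence gains for 14 teachers (post minus pre on the
# 4-point scale); invented numbers, not the actual CEEMS data.
gains = [1, 1, 0, 2, 1, 1, 0, 1, 2, 1, 1, 0, 1, 1]
print(round(one_sample_t(gains), 2))

# Hypothetical usage counts (Never, Tried, Used regularly) at pre and post.
print(round(chi_squared([7, 6, 3], [1, 6, 9]), 2))
```

In practice the t statistic would be compared against a t distribution with n-1 degrees of freedom, and the chi-squared statistic against a chi-squared distribution with (categories - 1) degrees of freedom.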

The pre-survey results indicated that the most used practice was explicitly connecting class content to real-world examples and applications (37.5% (6) of participants used it regularly). The least used instructional practices were explicitly connecting class content to how people in STEM careers use their knowledge to address societal impacts (43.8% (7) of teachers never used it) and guiding students to break complex global problems into local, more actionable components (37.5% (6) of participants never used it). These instances of non-usage of critical challenge-based/design-based learning practices gave Project CEEMS opportunities to improve teachers' instructional practices. Survey results on the confidence scale indicated that SIT participants were not very confident in implementing these challenge-based/design-based practices. Even the instructional practice with the highest confidence level was low, with only 25% of participants reporting being very confident in explicitly connecting class content to real-world examples and applications. Almost 44% of participants reported being "not confident" in explicitly connecting class content to how people in STEM careers use their knowledge to address societal impacts, and 37.5% noted they were not confident in providing opportunities for students to take responsibility for decisions made about the processes used in solving complex problems. These areas of lowest reported confidence gave the CEEMS project opportunities to build teacher skills and confidence in using critical challenge-based/design-based learning practices.

Post-survey results indicated changes in the instructional practices used regularly in the classroom, and the top three directly supported challenge-based learning. Nine (64%) of the teachers reported that they regularly used the following practices: a) guide students in refining problems; b) guide students in planning investigations to better understand different components of problems; and c) provide students with opportunities to explore multiple solution pathways for problems. Only one teacher indicated not trying to guide students to break complex global problems into local, more actionable components; every other practice was at least tried by all teachers. For reported confidence in implementing these challenge-based/design-based learning practices, all means were 3.00 or higher on a four-point scale. The highest mean confidence was 3.64 out of 4 for the following: a) explicitly connect class content to real-world examples and applications; b) guide students in planning investigations to better understand different components of problems; c) guide students in evaluating the results of their solution pathways; and d) provide students with opportunities to refine and retry a solution pathway. All of these instructional practices were emphasized in the SIT training, reinforced in the academic-year Community of Practice meetings, and supported by the resource team members. These gains should continue as Cohort 1 participates in the second year of summer training and implements three more CEEMS units in their classrooms.

Cohort 2 Pre-Survey Results

Pre-survey results for the Cohort 2 CEEMS teachers (n=24) indicated that the most used practice was explicitly connecting class content to real-world examples and applications (33.3% (8) of participants used it regularly). The least used instructional practices were explicitly connecting class content to how people in STEM careers use their knowledge to address societal impacts (33.3% (8) of teachers never used it) and guiding students to break complex global problems into local, more actionable components (62.5% (15) of participants never used it). These instances of non-usage of critical challenge-based/design-based learning practices provide Project CEEMS with opportunities to improve teachers' instructional practices. Results are shown in Table B-3-6 in Appendix B.

Survey results for Cohort 2 (n=24) on the confidence scale indicated that they were not very confident in implementing these challenge-based/design-based practices. Even the instructional practice with the highest confidence level was low, with only 12.5% of participants reporting being very confident in guiding students in refining problems. Twenty-five percent of participants (6) reported being "not confident" in explicitly connecting class content to how people in STEM careers use their knowledge to address societal impacts. These Cohort 2 results are similar to Cohort 1 pre-survey results. Once again, the areas where teachers reported the lowest levels of confidence provide the CEEMS project with opportunities to build teacher skills and confidence in using critical challenge-based/design-based learning practices. Results are shown in Table B-3-7 in Appendix B.

Concept Knowledge

In addition to the current instructional practices surveys, the research team collected and analyzed pilot data on participants' content knowledge before and after each SIT course. A research team member worked with each faculty member to develop a concept map questionnaire for the course's content, administered on the first and last days of the course. These pilot data were difficult to analyze, and the concept map questionnaires were modified prior to their second administration during the 2013 SIT courses. The data are currently being fully analyzed and will be presented, when available, as part of the research report.

Student Outcomes

Student Work

Student work was collected in a non-systematic manner during the first classroom implementation. Between units one and two, the teachers were asked to collect and provide the evaluators with examples of student work generated during the unit. The teachers were then asked to provide pre-post student assessments demonstrating knowledge growth associated with each unit. A non-representative sample of both student work and pre-post assessments was collected. Examples of student work were organized into displays for the National Advisory Board meeting in March 2013; these displays included a variety of student work, and the evaluators included all work provided by the teachers. Figures 1 – 5 in Appendix C show these compilations. The pre-post student assessments were unit-specific, and the data sent from 12 units indicated positive student content knowledge growth. Due to the small amount of data received, these scores were not included in this summary evaluation. The evaluation will collect these data systematically in the upcoming year, as the assessments have become part of the unit preparation checklist.

Student Activity Feedback Forms

In all, Student Activity Feedback Forms were received from 1498 students in 13 participating CEEMS teachers' classrooms; of these, 1293 came from middle and high school classes taught by 12 of the teachers. The forms were collected after 21 different CEEMS lessons. See Appendix A for a copy of the Student Activity Feedback Form and Appendix D for a summary of all student activity feedback survey results.

Aggregated results from all students indicate that these lessons were positively perceived by students (overall rating of 4.09 out of 5). The statements with the highest levels of student agreement related to the teachers:

• The teacher was very good at answering our questions (mean of 4.33 out of 5)
• The teacher was able to explain the subject very easily (mean of 4.30 out of 5)
• The teacher encouraged us to ask questions (mean of 4.22 out of 5)

The students agreed that these lessons were different from their usual lessons (mean of 4.16 out of 5) and that they liked the activities (mean of 4.09 out of 5). The students reported learning a lot from the lessons (mean of 3.91 out of 5) and from the teacher (mean of 4.07 out of 5). The lowest levels of agreement were for statements related to interest in studying engineering and comfort level with studying mathematics or science.

• This lesson made me interested in learning more about Engineering (mean of 3.32 out of 5)
• This lesson helped me feel more confident about studying math (mean of 3.27 out of 5)
• This lesson made me feel more comfortable about studying science (mean of 3.54 out of 5)

When these results are disaggregated by unit (see tables in Appendix C-1), the relative ratings of the statements are mostly equivalent to the overall ratings. Overall mean ratings were 4.15, 4.03, and 4.13 out of 5 for Units 1, 2, and 3, respectively. The students reported that they learned a lot in each unit (mean ratings of 3.99, 3.87, and 3.89 out of 5, respectively). One exception to the similarity of ratings by unit is that the highest level of student agreement on Unit 3 was for "This lesson was different from other lessons I've had in this class" (mean of 4.27 out of 5). Since student results on this question for the five pilot teachers (shown in Table 7) are lower (mean of 3.99 out of 5) than for the other teachers, the evaluators will explore whether the pilot teachers use CBL or design-based activities more regularly in their classrooms. If so, the mean for this particular question should decrease over time.

Survey Revision/Pilot

As stated previously, students will complete revised surveys during the 2013-2014 academic year. As a result of teacher feedback, one survey will be administered after the guiding questions are identified, a pivotal part of the challenge-based learning approach. The second survey will be administered at the conclusion of the unit and includes reactions to the engineering design process as well as to the entire unit.

Student Baseline Outcome Data/Demographics

Initial student outcome baseline data (standardized) from academic year 2011-2012 were obtained to determine the current level of student achievement. As teachers continue in the CEEMS project, their students' achievement results will be compared year to year so that changes, ideally improvements, in mathematics and science scores can be tracked.

These baseline data were drawn from the 2012 School Year District Report Cards issued by the Ohio Department of Education. The total number of students in the ten school districts in which the 15 participating 2012 SIT teachers work is 61,744. As reported by the schools, the number of students in the courses taught by these 15 participating teachers is 976. As noted in our proposal, the project is working with some middle and high school teachers within these districts, and after five years it is estimated that the project will impact 11,700 students and 585 STEM teachers (160 pre-service and 425 in-service).

These districts' state ratings range from Academic Watch to Excellent, with a variety of demographic profiles. Oak Hills is a high-performing district with very strong ties to UC; a large number of its students receive dual credit in high school and attend UC as undergraduates. Cincinnati Public is the largest school district and the urban district surrounding UC. The majority of its students are high risk, and the district has many specialized schools, including STEM and Engineering schools. The other participating schools are suburban schools within the first-ring suburbs, where demographics and the associated teaching and learning needs are changing with the influx of diverse students and tightening budgets. Suburban school graduates represent a very important group of potential future UC undergraduates: many are first-generation college students, and the majority of UC students come from within 50 miles of the university. The rural consortium is a new partnership among nine school districts in the counties east of Cincinnati. This diverse group of schools has students of all ability levels along with the challenges of rural education. By partnering with these districts, UC is reaching Appalachian students and building stronger connections with teachers more than one hour from campus.

Tables 6 and 7 show the districts' enrollment and performance data; these districts represent all district types. There are currently 26 state indicators: 14 measure whether 75% of a district's students are proficient in Reading, Math, Writing, Science, and Social Studies at various grade levels between 3rd and 8th grade; ten measure whether over 75% of students pass each of the five disciplines on the 10th and/or 11th grade Ohio Graduation Test; and the remaining two cover attendance and graduation rates. District-level student demographics are shown in Table 8. Caucasians make up the vast majority (>75%) in two school districts and the consortium, but these districts show more diversity in academic accommodations and socioeconomic status. All of these districts show room for improvement in grade-level math and science achievement, and two districts have 20% drop-out rates. For the school buildings participating in this project, the baseline data are similar, with the lowest school rating being Continuous Improvement. The evaluation team provided spreadsheets for administrators to supply disaggregated data, but only some schools compiled these data. These data are in Tables 9-13.

In the future, data collection activities pertaining to student achievement scores will begin in the fall, and an initial spreadsheet will be generated for all schools containing the data publicly available from the Ohio Department of Education website, allowing administrators to focus their time on compiling the data that are not readily available. While math and science course offerings differ by district, students impacted by this project will have greater access to teachers with increased content knowledge and possibly to additional courses, so the total number of students in mathematics and science courses of all varieties is expected to increase over the five years. These data will be tracked over time.


Table 6. District Enrollment and Performance Data

District | Total Student Enrollment | District Type | State Indicators Met (out of 26) | Performance Index Score (out of 120) | Rating

Cincinnati 28,719 Major Urban 11 88.5 "Continuous Improvement"

Oak Hills 7,712 Urban 26 101.6 "Excellent"

Norwood City 2,056 Urban 23 98.3 “Excellent”

Princeton City 5,194 Suburban 20 95.6 “Effective”

Winton Woods City 3,349 Suburban 14 88.4 “Academic Watch”

Felicity-Franklin 1,006 Rural 14 92.2 “Effective”

Clermont Northeastern 1,622 Suburban/Rural 25 100.0 “Excellent”

Goshen 2,615 Suburban/Rural 26 103.9 “Excellent”

West Clermont 8,437 Suburban/Rural 26 100.2 “Effective”

Williamsburg 1,034 Rural 24 100.2 “Excellent”

Table 7. Percentages of Students At & Above the Proficient Level (2012 Report Card Data)

The state requirement is 75 percent for proficiency.

District | Math 3rd | Math 4th | Math 5th | Math 6th | Math 7th | Math 8th | Math 10th OGT | Science 5th | Science 8th | Science 10th OGT

Cincinnati 69.5 63.6 51.2 69.1 61.1 68.2 75.6 50.2 52.2 66.5

Oak Hills 91.7 88.2 80.3 86.8 85.6 88.6 90.1 81.6 85.9 83.3

Norwood 87.6 83.2 74.1 82.1 74.1 88.7 88.2 86.3 75.9 78.5

Princeton 80.6 78.7 76.4 83.3 64.3 67.4 80.2 71.5 69.3 72.9

Winton Woods 69.4 62.2 30.4 60.9 63.1 59.1 77.1 66.5 54.3 68.0

Felicity-Franklin 79.7 75.9 42.3 70.0 74.4 88.1 70.6 73.2 69.0 63.2

Clermont Northeastern 88.7 85.6 71.7 85.8 83.2 87.7 86.6 80.0 88.5 80.3

Goshen 94.4 87.2 87.6 96.8 85.0 89.6 91.0 90.2 82.1 84.9

West Clermont 84.2 83.1 79.9 82.5 79.0 89.1 89.7 84.8 85.3 82.7

Williamsburg 86.3 86.8 57.7 89.2 80.9 86.7 88.4 76.9 75.9 85.5


Table 8. Demographics of Students by District (All Numbers are %)

District | Male | Female | American Indian or Native American | Asian or Pacific Islander | Black or African American | Hispanic or Latino | White | No Primary

Cincinnati 51 49 0.1 1.0 65.4 3.0 25.3 5.4

Oak Hills 51 49 0.2 1.4 1.6 1.1 92.6 3.1

Norwood City 53 47 0 0.5 10.2 8.0 76.7 4.3

Princeton City 51 49 0.2 3.6 45.6 12.5 32.4 5.6

Winton Woods City 50 50 0 1.6 67.0 8.3 15.0 8.0

Felicity-Franklin 49 51 0 0 0 1.8 95.8 1.6

Clermont Northeastern 51 49 0 0 0 0.9 96.5 1.8

Goshen 53 47 0 0 0.8 2.5 93.5 2.8

West Clermont 52 48 0 1.3 1.2 1.8 92.9 2.6

Williamsburg 51 49 0 0 0 1.7 96.1 1.4

District | Limited English Proficiency (LEP) | Individualized Education Plan (IEP) | Economically Disadvantaged | Migrant

Cincinnati 4.3 20.1 72.6 0

Oak Hills 0 13.7 8.7 0

Norwood City 4.8 16.1 60.7 0

Princeton City 12.4 16.2 61.4 0.8

Winton Woods City 7.9 16.9 63.4 0

Felicity-Franklin 0 16.6 55.2 0

Clermont Northeastern 0 19.1 44.9 0

Goshen 0.6 18.6 54.2 0

West Clermont 1.2 14.1 36.9 0

Williamsburg 0 14.6 42.7 0


Table 9. School Enrollment and Performance Data

School | Total Student Enrollment | District Type | State Indicators Met | Performance Index Score (out of 120) | Rating

Withrow HS 880 Major Urban 4/12 91.9 “Effective”

Rothenberg MS 296 Major Urban 2/15 79.0 “Continuous Improvement”

Robert A. Taft HS 564 Major Urban 10/12 92.6 “Effective”

Oak Hills HS 2,619 Urban 12/12 101.2 "Excellent"

Norwood MS 352 Urban 5/6 97.5 “Excellent”

Princeton HS 1,671 Suburban 10/12 98.4 “Effective”

Winton Woods HS 1,116 Suburban 10/12 94.6 “Effective”

Felicity-Franklin MS 331 Rural 5/11 90.8 “Effective”

Clermont Northeastern HS 545 Suburban/Rural 12/12 101.9 “Excellent”

Goshen MS 590 Suburban/Rural 8/8 102.7 “Excellent”

Amelia MS 975 Suburban/Rural 8/8 97.4 “Effective”

Williamsburg 565 Rural 18/19 101.7 “Excellent”

Table 10. Percentages of Students At & Above the Proficient Level (2012 Report Card Data)

The state requirement is 75 percent for proficiency.

School | Math 3rd | Math 4th | Math 5th | Math 6th | Math 7th | Math 8th | Math 10th OGT | Science 5th | Science 8th | Science 10th OGT

Withrow HS -- -- -- -- -- -- 70.3 -- -- 60.3
Rothenberg MS 56.7 62.1 47.8 58.1 45.5 68.4 -- 39.1 10.5 --

Robert A. Taft HS -- -- -- -- -- -- 84.9 -- -- 77.1
Oak Hills HS -- -- -- -- -- -- 90.1 -- -- 83.3
Norwood MS -- -- -- -- 74.1 88.7 -- -- 75.9 --
Princeton HS -- -- -- -- -- -- 80.1 -- -- 73.0

Winton Woods HS -- -- -- -- -- -- 77.1 -- -- 68.0

Felicity-Franklin MS -- -- 42.3 70.0 74.4 88.1 -- 73.2 69.0 --

Clermont Northeastern HS -- -- -- -- -- -- 86.6 -- -- 80.3

Goshen MS -- -- -- 96.8 85.0 89.6 -- -- 82.1 --
Amelia MS -- -- -- 82.8 79.5 88.1 -- -- 80.9 --

Williamsburg HS -- -- -- 89.2 90.9 86.7 88.4 -- 75.9 85.5


Table 11. Demographics of Students by School (All Numbers are %)

School | Male | Female | American Indian or Native American | Asian or Pacific Islander | Black or African American | Hispanic or Latino | White | No Primary

Withrow HS 48 52 0 0 91.8 3.2 1.4 2.3

Rothenberg MS 53 47 0 0 95.9 0 0 0

Robert A. Taft HS 55 45 0 0 94.4 0 3.1 0

Oak Hills HS 53 47 0.4 1.6 1.5 0.7 93.3 2.5

Norwood MS 55 45 0 0 10.4 6.8 77.4 3.8

Princeton HS 52 48 0 3.2 54.2 6.9 31.5 4.1

Winton Woods HS 50 50 0 1.9 70.6 4.9 15.4 7.1

Felicity-Franklin MS 48 52 0 0 0 0 95.9 0

Clermont Northeastern HS 50 50 0 0 0 0 97.4 0

Goshen MS 52 48 0 0 0 3.6 93.2 2.3

Amelia MS 51 49 0 0 0 1.1 94.9 2.6

Williamsburg HS 51 49 0 0 0 0 97.1 0

School | Limited English Proficiency (LEP) | Individualized Education Plan (IEP) | Economically Disadvantaged | Migrant

Withrow HS 6.5 18.5 80.3 0

Rothenberg MS 0 25.6 98.9 0

Robert A. Taft HS 1.9 25.0 94.1 0

Oak Hills HS 0 14.6 0 0

Norwood MS 0 14.4 58.5 0

Princeton HS 5.8 20.1 51.9 0.7

Winton Woods HS 2.5 19.1 48.8 0

Felicity-Franklin MS 0 19.7 61.3 0

Clermont Northeastern HS 0 22.2 35.6 0

Goshen MS 0 19.5 55.4 0

Amelia MS 0 15.0 43.0 0

Williamsburg HS 0 14.1 37.4 0


Table 12. Numbers of Students in Courses Taught by CEEMS Teachers

School | Total Students Reached by CEEMS Teachers | 7th Grade | 8th Grade | 9th Grade | 10th Grade | 11th Grade | 12th Grade

Withrow HS 120 - - - - - -
Rothenberg MS 120 - - - - - -
Robert A. Taft HS 120 - - - - - -
Oak Hills HS 159 - - - 79 41 39
Norwood MS 352 200 152 - - - -
Princeton HS 120 - - - - - -
Winton Woods HS 55 - - 1 50 2 2
Felicity-Franklin MS 91 91 - - - - -
Clermont Northeastern HS 57 - - 54 3 - -
Goshen MS 69 69 - - - - -
Amelia MS 159 - 159 - - - -
Williamsburg HS 34 - - - 11 8 15

Table 13. Performance Data Provided for Students in Courses Taught by CEEMS Teachers

Measure | 7th Grade | 8th Grade | 9th Grade | 10th Grade | 11th Grade | 12th Grade

Oak Hills HS Science Scores
  Semester Exam Results - - - 77.57 69.29 68.44
  OAA/OGT Results - - - 433 456 439
  Overall GPA - - - 3.09 2.68 2.66
Winton Woods HS Math Scores
  OAA/OGT - - 435 429.2 492.5 401
Felicity-Franklin MS Math Scores
  OAA/OGT 419 - - - - -
Clermont Northeastern HS Science Scores
  OAA/OGT - - - 416 - -
Williamsburg HS Science Scores
  Semester Exam Results - - - 87.1 84.0 79.4
  OAA/OGT Results - - - - 448.3 443.9
  Overall GPA - - - 3.636 3.373 3.198


Project Research

The research activities associated with the CEEMS project are coordinated by Dr. Helen Meyer. The research component was redesigned in response to feedback from the project’s National Advisory Board, and initial instruments have been developed. Classroom observations and detailed data collection and analysis are scheduled for the upcoming academic year. The research and evaluation teams have been granted approval by the University of Cincinnati Institutional Review Board. To minimize “over-surveying” participants, the research and evaluation teams have created a comprehensive data management system to coordinate data collection and data sharing. The first and second cohorts of in-service and pre-service teacher participants have consented to participate in the evaluation activities, and the majority have agreed to be contacted by the researchers if they are chosen for the research study. As recommended by the National Advisory Board members, the revised CEEMS research design uses case studies that will focus on teachers’ experiences and the impacts of the project. This methodology has the potential to yield in-depth data on how and why the project outputs and outcomes can be attributed to project activities.

Evaluation of the research component will focus on how effectively the research results help the project team identify what supported and what inhibited attainment of project goals and begin supporting the causal model. Effectiveness will be considered in terms of outputs and outcomes (i.e., the number of participants studied and what these participants do) and in terms of the research process itself (i.e., elements of a case study design, use of reliable and valid instruments, scientific evidence related to the project’s theory of change). The theory of change associated with this project runs as follows: a) give teachers professional development on how to implement design-based and challenge-based instruction; b) follow up with school-year support (administrative support, resource team members, others); c) these produce changes in teacher attitudes and behaviors; and d) sustained change in teacher practices, which ultimately leads to a sustained increase in student achievement, the first goal of the CEEMS project.

Sustainability

Even though CEEMS is only in its second year, the project team has begun working toward sustainability. The university has approved an ACCEND program recommended as part of this project. As background: in fall 2002 the College of Engineering and Applied Science at UC created a combined BS-MS program in all of its engineering disciplines, called the Accelerated Engineering Degree (ACCEND) program, and extended it to a BS-MBA program in fall 2006. Five students are expected to take advantage of this opportunity. UC has also maintained, or increased, recruitment of undergraduates and career-changers into licensure programs. Another project objective was to create an Ohio Board of Regents engineering education endorsement for educators; these discussions are continuing.

There are currently two cohorts of teachers working in 12 participating schools. Some of the


schools have more than one cohort teacher and discussions have begun about creating support systems, especially pertaining to opportunities for professional development within schools and districts, as well as regional and national conferences.

Teacher-Led Professional Development Sessions

As part of the CEEMS dissemination and sustainability efforts, the 2012 SIT participants provided professional development to their colleagues at their schools. As of April 26, 2013, project evaluators had received surveys from 150 attendees at 12 sessions. These sessions were led by 13 of the 15 CEEMS teachers; where a school has more than one CEEMS teacher, the teachers conducted the PD jointly.

Overall, the PD sessions were rated very highly (mean of 4.59 out of 5). Attendees reported that the workshops aided their understanding of STEM education and challenge-based learning (means of 4.58 and 4.53 out of 5, respectively). The lowest-rated item indicated that teachers were likely, but not very likely, to use the concepts presented in their own classrooms (mean of 4.18 out of 5); this lower rating may reflect differences between the content taught in mathematics and science courses. These quantitative evaluation results are summarized in Table 14.

Open-ended comments supported these results. Attendees commented that the presenters were well prepared, organized, and knowledgeable. One teacher gave a specific example of how they plan to use what they learned: “I now have my unit planned for surface area and volume; students will try to open a ‘business’ in an abandoned building in the community and design the inside of it. They must find out how much of each material they need to buy in order to design and decorate their space by using surface area and volume.” Many attendees mentioned that, as a result of attending, they would try to apply challenge-based learning and the engineering design process in their classrooms. See Table 15 for a summarized list of activities attendees reported they plan to try in their classrooms. Table 16 lists suggestions from the PD participants related to the CEEMS project and future ideas for lessons. In conclusion, these teacher-led professional development sessions gave the CEEMS project positive exposure. As one attendee stated, “It has made me very excited to apply!”


Table 14. Summary of Quantitative Evaluation Results – All PD Sessions combined

Item | N | Mean* | Std. Deviation

1. Please provide an OVERALL RATING for this PD workshop. | 147 | 4.69 | .447

2. Please rate this PD workshop in aiding your understanding of STEM education. | 148 | 4.58 | .583

3. Please rate this workshop in aiding your understanding of how to implement engineering design into science and math classes. | 147 | 4.44 | .663

4. Please rate this workshop in aiding your understanding of challenge-based learning. | 148 | 4.53 | .553

5. Please rate the likelihood that you will use some of the concepts presented in this workshop in your own classroom. | 145 | 4.18 | .831

*Scale for questions 1-4: 5 = Very Good; 4 = Good; 3 = Neutral; 2 = Poor; 1 = Very Poor. Scale for question 5: 5 = Very Likely; 4 = Likely; 3 = Neutral; 2 = Unlikely; 1 = Very Unlikely.
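The descriptive statistics reported in Table 14 (N, mean, and standard deviation of 5-point Likert ratings) can be reproduced from raw survey records with a few lines of code. The responses below are hypothetical, since the raw records are not included in this report; this is a sketch of the calculation, not the actual data.

```python
# Compute N, mean, and standard deviation for a set of 5-point Likert
# responses, as reported for each item in Table 14.
import statistics

responses = [5, 5, 4, 5, 4, 5, 5, 4, 5, 5]  # hypothetical ratings on a 1-5 scale

n = len(responses)
mean = statistics.mean(responses)
# Report-style tables conventionally use the sample standard deviation
# (n - 1 denominator), which is what statistics.stdev computes.
sd = statistics.stdev(responses)

print(f"N = {n}, Mean = {mean:.2f}, Std. Deviation = {sd:.3f}")
```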


Table 15. Examples of Activities from PD that Attendees Reported They Plan to Use in Their Classrooms – All sessions combined

• Bringing real-world issues and big-ideas into the classrooms

• “Big Questions” when working on units in social studies.

• Bringing real world experience into the class room

• Create movies and posters for units that can be used in the classroom

• Creating more problem based ideas for the students to try which relate to real world.

• Design process. Bringing in more math and science concepts.

• Engineering Design.

• Examples of group work and hands-on-activities

• I really liked the student foldout to organize student-driven questions, and I can see it being used in the environmental unit of biology.

• I want to implement “Big Questions” on a regular in my English classes when working on novels, short story units etc.

• I will use the engineering loop in order for my student to understand their projects!

• I would like to allow for time for students to change their designs and revisit after initial tests in water filter designs.

• I would like to incorporate more projects into student learning in my classroom.

• Incorporating real world connections into content to make material relevant.

• More cooperation group work and hands on activities.

• More design challenges.

• Specific activities described, such as the “Bridge” and “Roller Coaster” and “Rock and Roll/Earthquakes”, iPhone Drop, “Ring & String” Glider

• Specific lessons described, such as geometric shapes, water pollution, environmental science

• The process of presenting a challenge to students and how to make it student centered.

• Using challenge-based learning and engineering-design process that use science, mathematics and engineering concepts to solve real-world big ideas

• We will work on a multi-disciplinary unit.


Table 16. Inquiries and Suggestions Related to CEEMS Stated in Additional Comments Section – All PD sessions combined

• Give more examples for specific content area

• Give more to teachers that inform them how to implement these ideas

• I would like to hear/learn more about the rubrics you used to guide students in the inquiry

• More “take-aways” would have been helpful

• My biggest difficulty is how to implement/timing/assessment

• Would be perfect if applications for [CEEMS] program was available [during this PD]

• Would like chemical and/or biological sciences applications

• Would like more information/examples pertaining to Algebra curriculum

One conference at which CEEMS teachers presented their work was the 2013 STEM Conference, sponsored by the project. Approximately 250 people attended the conference, and the evaluation results are discussed in more detail below.

2013 STEM Conference

Overall, the online evaluation indicated that the 2013 STEM Conference was successful. The overall evaluation item, “Overall, this event aided my understanding of STEM education,” received a mean response of 3.50 out of 4. There were 139 responses to the evaluation from May 7th through May 28th, approximately a 70% response rate among expected respondents.

Respondents reported that the “information presented will be USEFUL in my future educational activities” (mean of 3.45 out of 4), and they agreed that the applications, careers, and societal impacts of STEM activities discussed (CEEMS Project objectives) were addressed and led to increased understanding (means of 3.37, 3.08, and 3.17 out of 4, respectively).

At the end of the conference, respondents reported that they had activities in hand that “can be implemented in my own teaching or learning environment” (mean of 3.46 out of 4) and the “event provided me opportunities to learn about current initiatives” (mean of 3.45 out of 4). Finally, the “breadth of session options was a positive aspect of the conference” (mean of 3.42 out of 4) and the Tangeman University Center at the University of Cincinnati was an appropriate venue (mean of 3.50 out of 4). Descriptive statistics for all rating questions on the conference evaluation are summarized in Table E-1, in the Appendix.

Responses to the open-ended questions support these positive quantitative results and give the project team constructive feedback for improving similar events in the future. Respondents stated that they valued the following categories most about the conference: resources provided, networking opportunities, the variety of presentations, learning about STEM in the region, learning about challenge-based learning and the design process, Next Generation Science Standards, the


Keynote Speakers, and the fact that the project gave the participants a flash drive containing all presentations and handouts. A complete list of the most valuable aspects of the 2013 STEM Conference is shown in Table E-2, in the Appendix. Suggestions for improvements are found in Table E-3, in the Appendix. Respondents also listed STEM-related presentations or topics they would like to learn more about (shown in Table E-4, in the Appendix). These ideas focused on specific STEM subjects, integration of STEM with other subjects, challenge-based learning, collaboration with businesses, local non-profits, and higher education, project-based learning, general classroom ideas and activities, how to get started teaching STEM, professional development, and other topics. Table E-5 in the Appendix lists the sessions that respondents found most useful. As anticipated, the majority of the conference attendees were K-12 educators: respondents were predominantly K-12 teachers (63.4% of responses) who taught STEM subjects in grades 7-12 (68.6% of responses). Demographic information for the respondents is summarized in Table E-6 in the Appendix.

Results from the 524 individual session evaluations, completed at the end of each session, were very positive: attendees agreed that “Overall, the information presented during this session was very useful” (mean of 3.54 out of 4) and that they would “recommend this session to their colleagues” (mean of 3.52 out of 4). A summary of these post-session results is shown in Table E-7, in the Appendix. When the results were disaggregated by presenter type, CEEMS teacher presenters received higher mean ratings on the questions about increasing attendees’ understanding of challenge-based learning and of using engineering as a context for teaching mathematics and science topics (means of 3.50 and 3.59 out of 4 for CEEMS teacher presenters, compared with 3.39 and 3.32 for non-CEEMS presenters). Results for all questions are shown in Tables E-8 and E-9 in the Appendix.

Website Development

The CEEMS website (http://ceas.uc.edu/special_programs/ceems/CEEMS_Home.html) contains all of the information about the CEEMS project. Its pages describe the vision and goals of the project, the people involved and how to contact them, the pathways by which CEEMS educates in-service and pre-service teachers, and the courses teachers take as part of their participation in the CEEMS program.

The website also lists the current CEEMS teachers and links to the research and evaluation plan for the project, along with any publications about CEEMS. A CEEMS events page describes past and upcoming events. Last but not least, there is a link to frequently asked questions about the CEEMS project, as well as icons linking the CEEMS website to the College of Engineering and Applied Science Facebook and Twitter pages.

Table 17 lists the “hits” to the website by month. Currently, the CEEMS website is primarily


used as a tool for current and future CEEMS teachers as well as the resource team. The months with significant increases in website traffic coincided with recruitment and application periods for the CEEMS program.

Table 17. CEEMS Website “Hits” per Month

Month and Year | Hits

September 2012 756

October 2012 446

November 2012 464

December 2012 458

January 2013 1,378

February 2013 1,703

March 2013 836

April 2013 1,441

May 2013 260

June 2013 352
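The claim that traffic spikes coincided with recruitment periods can be checked programmatically against the Table 17 counts. The sketch below flags a month as a spike when its hits exceed 1.5 times the median month; that threshold is an assumption chosen here for illustration, not a rule used by the evaluation team.

```python
# Flag months whose website hits stand out from the typical month,
# using the counts reported in Table 17.
import statistics

hits = {
    "Sep 2012": 756, "Oct 2012": 446, "Nov 2012": 464, "Dec 2012": 458,
    "Jan 2013": 1378, "Feb 2013": 1703, "Mar 2013": 836, "Apr 2013": 1441,
    "May 2013": 260, "Jun 2013": 352,
}

def spike_months(counts, factor=1.5):
    """Return the months whose hits exceed `factor` times the median month."""
    cutoff = factor * statistics.median(counts.values())
    return [month for month, n in counts.items() if n > cutoff]

print(spike_months(hits))
```

With these counts the median month has 610 hits, so January, February, and April 2013 are flagged, matching the recruitment and application months described above.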


Conclusions

The CEEMS project has had a very productive Year 2. The first classroom implementation of lessons, during this second project year, was successful, and new procedures are in place to collect two student surveys and three teacher surveys after each unit next year. Each participating teacher from the 2012 SIT implemented one to three lessons in their classroom; these lessons incorporated challenge-based learning approaches and the engineering design process. Of the beginning teachers, 12 have continued into Year 2 and attended the 2013 SIT. As mentioned at the focus group, these teachers plan to teach at least three CEEMS lessons in the upcoming academic year, and most mentioned teaching one or two lessons created last year a second time.

Students reported that they learned from these lessons and rated them highly. Project staff are working with the 35 participating teachers to incorporate pre-post assessments into each unit that will help the evaluation team assess changes in student learning. These results will supplement the standardized student achievement test data obtained for the schools. The evaluation team views the pre-post assessments as a finer-grained measure of student learning, while the achievement scores are a coarser-grained measure that is highly valued by stakeholders for showing outcomes associated with project activities.

As stated in the procedures section of this report, recruitment for Cohort 2 began earlier in the school year, and the project was able to select the 24 most qualified applicants. A recruitment video highlighting the project benefits was placed on the website. The project team determined that it should continue working with each administrator to keep CEEMS at the forefront of opportunities that complement current participants’ activities.

The planning for the 2013 SIT went smoothly. The project team kept the aspects of the experience that were identified as positives and made modifications that should lead to enhanced training and increased support for returning and new teachers. The evaluation results and suggestions from the National Advisory Board were thoroughly considered. The faculty workshops were revised, leading to more coordinated courses.

The associated research activities were pared down to focus on the teachers’ experience. The instruments have been developed, and participating teachers indicated whether they were interested in being contacted about taking part in the research case studies. The evaluation and research activities, data, and analyses will be shared as appropriate, decreasing the data collection burden on teachers.

Project sustainability efforts have begun. Academic pathways have been developed and approved. The teachers are making local and regional presentations and the lessons are currently available on the website. While it is early in the life of the project, activities indicate that the school districts are benefiting from participation and continue to recommend other teachers to participate.


Appendixes


Appendix A. Copies of Instruments


Appendix A-1. Seminar 1 Evaluation for Faculty Workshops


CEEMS Summer Institute Course Development Seminar Evaluation – February 5, 2013

Seminar 1. CEEMS Vision, Goals, Structure Facilitators – Dr. Anant Kukreti and Eugene Rutz

Thank you for attending this University of Cincinnati CEEMS Project sponsored seminar. This is the first seminar in a four-seminar series, intended to support faculty course development for the CEEMS Summer Institute during Summer 2013. The project team wants to make sure that all of the faculty and staff involved with the project are working in concert toward common goals and using a common perspective. To help the seminar coordinators improve future events, we would like you to complete this brief evaluation survey and place it in the envelope provided by the entrance to this room. Completed surveys will be sent to and summarized by the UC Evaluation Services Center. All responses will remain confidential. Data will be analyzed in aggregate and no individual responses will be reported. If you have any questions or concerns, please contact Cathy Maltbie ([email protected], 6-1469). Thank you for your participation.

______________________________________________________________________________

1. Please provide an OVERALL RATING for this seminar.

○ Outstanding ○ Very Good ○ Satisfactory ○ Marginal ○ Poor

2. OVERALL, how would you rate this seminar in helping you understand the CEEMS project? ○ Outstanding ○ Very Good ○ Satisfactory ○ Marginal ○ Poor

3. OVERALL, the information presented will be USEFUL in my development of my CEEMS Summer Institute course. ○ Strongly Agree

○ Agree

○ Neither Agree Nor Disagree

○ Disagree

○ Strongly Disagree

4. This seminar’s activities helped me understand the material presented. ○ Strongly Agree

○ Agree

○ Neither Agree Nor Disagree

○ Disagree

○ Strongly Disagree

5. Please indicate your level of agreement to the following statements. The seminar provided opportunities to learn about …

Strongly Agree Agree Neutral Disagree

Strongly Disagree

a. … how to integrate challenge based learning into your course. ○ ○ ○ ○ ○

b. … the vision and goals of the CEEMS project. ○ ○ ○ ○ ○

c. … school teacher (6-12 grades) grant-related responsibilities.

○ ○ ○ ○ ○

d. … CEEMS faculty grant-related responsibilities.

○ ○ ○ ○ ○


6. What information do you anticipate will be the most helpful as you develop your course for the CEEMS Summer Institute? Why?

7. What additional support would you like to have while you develop your courses?

8. Briefly describe the primary characteristics of challenge-based learning and how design-based learning fits within its framework.

To help us target support in the future, please provide the following:

We are asking for a confidential identifier that can be used to help us track your responses over time. This information will only be used to match responses.

What are the last two digits of your office phone number? (01; 02; … 99) _______________
What is the month of your birth? (Jan = 01; Feb = 02; etc.) _______________
What is the day of your birth? (01; 02; 03; …; 31) _______________


Appendix A-2. Current Instructional Practices Survey


UC NSF MSP – CEEMS Program: Current Instructional Practices

This survey will be given to CEEMS participants at the beginning of their project involvement and repeated annually.

Challenge-based/design-based learning guides students in constructing knowledge around an initially ill-defined problem and consists of the following practices. To what extent does your current instruction incorporate these practices?

Use Regularly | Use Occasionally | Have Tried It | Never Used

Explicitly connect class content to complex problems or issues with global impact Ο Ο Ο Ο
Explicitly connect class content to real-world examples and applications Ο Ο Ο Ο
Explicitly connect these real-world applications to STEM careers Ο Ο Ο Ο
Explicitly connect class content to how people in STEM careers use their knowledge to address societal impacts Ο Ο Ο Ο
Guide students to break complex global problems into their local and more actionable components Ο Ο Ο Ο
Guide students in refining problems Ο Ο Ο Ο
Guide students in planning investigations to better understand different components of problems Ο Ο Ο Ο
Provide opportunities for students to gather information about problems or issues of importance Ο Ο Ο Ο
Provide students with opportunities to explore multiple solution pathways for problems Ο Ο Ο Ο
Guide students in weighing the pros and cons of different solution pathways Ο Ο Ο Ο
Provide opportunities for students to test their solution pathways Ο Ο Ο Ο
Guide students in evaluating the results of their solution pathways Ο Ο Ο Ο
Provide students with opportunities to refine and retry a solution pathway Ο Ο Ο Ο
Provide opportunities for students to communicate their solution pathways and results to others Ο Ο Ο Ο
Provide opportunities for students to take responsibility for the decisions they made about the processes used in solving complex problems Ο Ο Ο Ο

~ SURVEY CONTINUES ON THE BACK OF THIS PAPER ~


Please indicate your current level of confidence in implementing these challenge-based/design-based learning practices.

Very Confident | Confident | Somewhat Confident | Not Confident

Explicitly connect class content to complex problems or issues with global impact Ο Ο Ο Ο
Explicitly connect class content to real-world examples and applications Ο Ο Ο Ο
Explicitly connect these real-world applications to STEM careers Ο Ο Ο Ο
Explicitly connect class content to how people in STEM careers use their knowledge to address societal impacts Ο Ο Ο Ο
Guide students to break complex global problems into their local and more actionable components Ο Ο Ο Ο
Guide students in refining problems Ο Ο Ο Ο
Guide students in planning investigations to better understand different components of problems Ο Ο Ο Ο
Provide opportunities for students to gather information about problems or issues of importance Ο Ο Ο Ο
Provide students with opportunities to explore multiple solution pathways for problems Ο Ο Ο Ο
Guide students in weighing the pros and cons of different solution pathways Ο Ο Ο Ο
Provide opportunities for students to test their solution pathways Ο Ο Ο Ο
Guide students in evaluating the results of their solution pathways Ο Ο Ο Ο
Provide students with opportunities to refine and retry a solution pathway Ο Ο Ο Ο
Provide opportunities for students to communicate their solution pathways and results to others Ο Ο Ο Ο
Provide opportunities for students to take responsibility for the decisions they made about the processes used in solving complex problems Ο Ο Ο Ο

~ THANK YOU FOR YOUR RESPONSES! ~


Appendix A-3. 2012 Administrator Academy Evaluation


UC NSF MSP: CEEMS Project – Administrators' Academy – Summer 2012

Thank you for participating in the University of Cincinnati CEEMS Administrators' Academy. This survey is intended to obtain your initial reactions to the academy. Your responses will help the project team better support you and your teachers during the academic year and make future academies even more meaningful. Your responses will remain confidential and only group results will be reported. Thank you for your time and efforts!

Instructions:

The first section of the survey deals with overall impressions of the academy. Please think of your experience over the past three days and state your level of agreement with the following statements by marking an “X” on the response that most closely reflects your feeling.

Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree

1. Overall, I understand the purposes and objectives of the CEEMS project.

2. Overall, the CEEMS Administrator Academy will help me better support the participating teachers in my school or district.

3. The information was organized so that a logical progression of ideas was presented to participants.

4. The presentations included in this academy were helpful to my understanding of how the CEEMS project fits into science and math education.

5. The exercises and activities helped me understand the concepts being presented.

6. I left this academy with a plan of action for CEEMS implementation in my school or district.

7. What was most useful about this academy?

8. What was the most interesting or surprising aspect of the last three days? Please give us examples and tell us why.

9. The project team is planning to provide support to you and your teachers during the academic year. Are there any additional questions you have about the resource team's work with your teachers? (Responses to these questions will be provided to all participants via email.)


Appendix A-4. End of SIT Evaluation - 2012


UC NSF MSP: CEEMS Project – Summer Institute for Teachers (SIT) 2012

These questions relate to your experiences while participating in the CEEMS project SIT. Your opinion is very important as the project team works to improve the summer sessions. Your responses will remain confidential and only group results will be reported. Thank you!

Instructions: Please indicate how useful or useless the following aspects of the CEEMS Project 2012 SIT experience were in helping you create Challenge-Based Learning (CBL) units by marking an “X” on the response that most closely reflects your feeling.

Aspect of the SIT Experience | Very Useful | Useful | Neutral | Useless | Very Useless | I did not attend / Not Applicable

10. Overall – Foundations of Engineering Course

11. Overall – Engineering Applications in Mathematics Course

12. Overall – Models and Applications in Physical Science Course

13. Overall – Models and Applications in Earth Systems Course

14. Overall – Models and Applications in Biological Sciences Course

15. Overall – Engineering Models Course

16. Overall – Interactions with Resource Team

17. Overall – Interactions with Pre-service Teachers

18. CBL in Action presentation (July 5)

19. Unit Template presentation (July 6)

20. Academic Standards presentation (July 9)

21. Wiki presentation (July 11)

22. RET Teacher presentation (Amy Jameson from Dater, July 11)

23. Life of an Engineer poster presentation (July 16)

24. Video Creation presentation (July 18)

25. Assessments and Rubrics presentation (July 18)

26. Misconceptions in Science and Math presentation (July 25)

27. Panel discussion with practicing engineers (July 26)

28. Exploring Design in the Next Generation Science Standards lunch presentation (July 27)

29. Coaching Sessions with Resource Team Members (ongoing)


Instructions: Please indicate your level of satisfaction or dissatisfaction with the following aspects of the CEEMS Project 2012 SIT experience by marking an “X” on the response that most closely reflects your feeling.

Aspect of the SIT Experience | Very Satisfied | Satisfied | Neither Satisfied Nor Dissatisfied | Dissatisfied | Very Dissatisfied

30. Access to UC Computing Services

31. Support to make your CBL unit

32. Support to make your poster

33. Support to make your movie

34. Overall – Experience as a participant in the CEEMS Project SIT during this summer

35. Please identify and describe briefly at least three strengths of the CEEMS project SIT 2012. Please be as specific as possible.

36. Please identify and describe briefly at least three areas for improvement for the CEEMS project SIT 2012. Please be as specific as possible.

37. The project team is planning to provide support to you and your teachers during the academic year. At this time, are there any particular concerns that you would like the resource team to address to help you in your participation with the CEEMS project?

Thank you!


Appendix A-5. SIT Course Participant Evaluations


UC NSF MSP – CEEMS Program: Course Evaluation

Thinking about the course’s presentations, activities, or projects, please indicate your level of agreement with the following statements.

ARE YOU CURRENTLY AN IN-SERVICE TEACHER? Ο YES Ο NO; if NO, what is your College?

Items | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree | Doesn't Apply

1. The course helped broaden my understanding of the content. Ο Ο Ο Ο Ο Ο
2. The course activities or projects were an effective means to learn the concepts. Ο Ο Ο Ο Ο Ο
3. The course helped me understand challenge-based learning through the use of a design challenge. Ο Ο Ο Ο Ο Ο
4. The course activities or projects will help me apply challenge-based learning to my teaching. Ο Ο Ο Ο Ο Ο
5. The course helped me understand the engineering-design process. Ο Ο Ο Ο Ο Ο
6. The course activities or projects will help me apply the engineering-design process in my teaching. Ο Ο Ο Ο Ο Ο
7. The course provided me with ideas and examples illustrating how engineering applications use math and science knowledge, which I can use in my classes. Ο Ο Ο Ο Ο Ο
8. The course helped me understand how math and science knowledge leads to different STEM career choices. Ο Ο Ο Ο Ο Ο
9. The course helped me understand how math and science knowledge is used by engineers to solve societal problems. Ο Ο Ο Ο Ο Ο
10. The course provided opportunities to enhance my oral communication skills. Ο Ο Ο Ο Ο Ο
11. The course provided opportunities to enhance my written communication skills. Ο Ο Ο Ο Ο Ο
12. The course activities or projects helped cultivate effective teamwork. Ο Ο Ο Ο Ο Ο
13. The students in my school will benefit from my experiences in this course. Ο Ο Ο Ο Ο Ο
14. The instructor presented the concepts effectively. Ο Ο Ο Ο Ο Ο
15. The sessions allowed for questions, answers and discussions. Ο Ο Ο Ο Ο Ο
16. The course materials were well organized. Ο Ο Ο Ο Ο Ο
17. The course materials supported the concepts taught. Ο Ο Ο Ο Ο Ο
18. I would recommend that other science or mathematics teachers take this course. Ο Ο Ο Ο Ο Ο

19. Please indicate what you liked most about this course, and provide examples. (Use back of paper, if needed.)

20. Please indicate what aspects of this course you would recommend changing, and provide examples. (Use back of paper, if needed.)


Appendix A-6. Teacher-Led Professional Development Session Evaluation


UC NSF MSP: CEEMS Project – Teacher-Led Professional Development – 2012-2013

Thank you for participating in this teacher-led professional development (PD). This survey is intended to obtain your initial reactions to the PD. Your responses will help the CEEMS project team better support participating teachers. Your responses will remain confidential and only group results will be reported. Thank you for your time and efforts!

Instructions: Please think of your experience and state your rating of the following statements by marking an “X” under the response that most closely reflects your feeling.

Very Good | Good | Neutral | Poor | Very Poor

1. Please provide an OVERALL RATING for this PD workshop.

2. Please rate this PD workshop in aiding your understanding of STEM education.

3. Please rate this workshop in aiding your understanding of how to implement engineering design into science and math classes.

4. Please rate this workshop in aiding your understanding of challenge-based learning.

Very Likely | Likely | Neutral | Unlikely | Very Unlikely

5. Please rate the likelihood that you will use some of the concepts presented in this workshop in your own classroom.

6. Please provide examples of ideas or activities you will try in your classroom as a result of this professional development workshop.

7. Are there any additional comments you have about this PD?

8. Please indicate the grade level(s) you teach. (Mark the box before each grade level.)

□ K □ 1 □ 2 □ 3 □ 4 □ 5 □ 6 □ 7 □ 8 □ 9 □ 10 □ 11 □ 12

9. Please indicate the subject(s) you teach. (Mark the box before each subject.)

□ Science □ Mathematics □ Engineering □ Technology □ Language Arts □ Social Studies □ Foreign Language □ Other (write below)


Appendix A-7. Teacher End-of-Year Focus Group Discussion Guide – 2013


MAY 6th, 2013 -- CEEMS Teacher Cohort 1 - Focus Group Discussion Guide

• How did the year go for you concerning the CEEMS project?

• Was it worth it?

• What results did you see in the classroom?

o Positive results

o Negative results

• Tell me about the products you created and what you thought about them

o Lessons / Video / Poster / Professional development

• Tell me about the Resources provided by the CEEMS project

o SIT Courses / Resource Team / Project Team / Administration

• Suggestions for Change to the SIT and Program

o What would you show cohort 2 and how would it be beneficial to you?

o What would you want to refine your lesson?

o Do you want to revamp a current lesson to be re-taught?

• What is CBL?

• What about EDP?

o Do you find it hard to get back to the revision process – time wise?

• What are some things you HAVE to tell cohort 2?

• Any other comments?

• Was there something totally unexpected?


Appendix A-8. CEEMS Resource Team Communication Log - 2012


Link: https://uceducation.qualtrics.com/SE/?SID=SV_79iKfj1NP0Fporj

CEEMS Resource Team Communication Log - 2012

1. Name of CEEMS Resource Team member completing this communication log. Select your name (12): Jack Broering (1); Lori Cargile (2); Tim Dungan (3); Dennis Dupps (4); Meri Johnson (5); Kimya Moyo (6); Rob Rapaport (7); David Vernot (8); Tom Vinciguerra (9)

2. Enter date of communication.

Month (1); Date (2); Year (3)

3. Is this a new or follow-up communication? New (1) Follow-up (2) Both (3)

4. Approximate time spent on communication 15 minutes or less (1) 30 minutes to 1 hour (2) 1 to 2 hours (3) 2 to 3 hours (4) 3 to 4 hours (5) More than 4 hours (6)

5. Method of communication Email (1) Face-to-Face (2) Phone (3) CEEMS Wiki (4) Webcam/Video Conferencing (5) Other (6) ____________________

6-all. Were ALL teachers involved in the communication? Yes (1) No (2)

6. If only some teachers were involved in the communication, which CEEMS teacher(s) was (were) involved in the communication? (CHOOSE ALL THE TEACHERS INVOLVED)


Erick Allen (Withrow) (1) Chris Anderson (Princeton) (2) Caleb Barber (Norwood) (3) Curt Blimline (Williamsburg) (4) Aaron Debbink (Oak Hills) (5) Jennifer Harvey (Norwood) (6) Cathy Herzog (Amelia MS) (7) Paige Jarrell (Norwood) (8) Gina Ogden (Goshen) (9) Katie Powers (Winton Woods) (10) Matt Ritchey (Clermont NE) (11) Paul Schember (Norwood) (12) Megan Walker (Felicity-Franklin) (13) Doug Werling (Rothenberg) (14) Beverly Pryor-Young (Taft) (15)

7. Teacher(s) school district (CHOOSE ALL THAT APPLY) Cincinnati Public Schools (1) Clermont Consortium (2) Norwood (3) Oak Hills (4) Princeton (5) Winton Woods (6)

8. What was the nature, topic, of the communication? (CHOOSE ALL THAT APPLY) Administrative needs (1) Advice related to a specific activity (2) Advice related to pedagogy (3) Challenge-based aspects of the lessons (4) Content support (5) Coordinate communication among teachers (6) Engineering-design-based aspects of the lessons (7) Planning related to a site visit (8) Planning related to a teacher-led professional development (9) Technology Integration (10) Unit development (11) Unit implementation (12) Unit topic selection/identification (13) Other, please specify (14) ____________________

9. Please summarize communication.

10. Describe any necessary follow-up.

11. Any additional comments.

To submit your responses click the "SUBMIT" button below.


Appendix A-9. CEEMS Student Activity Feedback Survey – 2012


Student Feedback for CEEMS – 2012-2013

I. Please fill in the information in the box below:
Date: Period: Lesson: Grade: Teacher: School:

II. Please rate the following statements:

Item | Excellent | Good | Average | Fair | Poor
1. Overall, I would rate this lesson as…

Item | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree
2. I liked the activities we did in this lesson.
3. The lesson was very well organized.
4. The teacher was able to explain the subject very easily.
5. The teacher encouraged us to ask questions.
6. The teacher was very good at answering our questions.
7. The group work was very interesting.
8. I learned a lot from this lesson.
9. I learned a lot from the teacher.
10. This lesson made me interested in learning more about Engineering.
11. This lesson helped me feel more confident about studying math.
12. This lesson helped me feel more confident about studying science.
13. The lesson was different from other lessons I've had in this class.

14. How was the lesson different from other lessons?
15. What did you like most about this lesson?


16. If your teacher was to use this lesson again, what changes do you think should be made?


Appendix A-10. Revised Unit Implementation Surveys for Students


AFTER GUIDING QUESTIONS DISCUSSION

I. Please fill in the information in the box below:
Date: Grade: Period/Bell: Teacher: Lesson:

Thinking about the discussion your class just had to identify the guiding questions, please choose your level of agreement with the following statements:

Strongly Agree | Agree | Disagree | Strongly Disagree
1. During this discussion, my ideas were considered. ο ο ο ο

2. I understood the “Essential Question” identified. ο ο ο ο
3. I understand how the “Guiding Questions” will help us solve the “Challenge” selected. ο ο ο ο

4. I am excited about finding a solution to this “Challenge”. ο ο ο ο

5. I received guidance from my teacher when I asked for it. ο ο ο ο

6. In your own words, what is the “Big Idea” for this unit?

7. In your own words, what is the “Challenge” you are working to solve?

8. In your own words, list two or three “Guiding Questions” selected to help solve the challenge.

9. For summary purposes only, please indicate if you are a male or female.

□ Male □ Female


DISTRIBUTED AFTER THE UNIT IS COMPLETE

I. Please fill in the information in the box below:
Date: Grade: Period: Teacher: Lesson:

In a drawing or diagram, arrange the following words in the order of the steps you just used in the engineering design process activity and connect each with an arrow, with the arrow-head pointing to the activity that will come next. You can use any of these words more than once, if needed.

• Identify and Define
• Evaluate or Test Solution(s)
• Refine
• Do Again
• Select Best Solution to Try
• Gather Information
• Communicate Solution(s)
• Identify Alternative(s)
• Implement Solution(s)

a. How did you act like an engineer during this unit? Please be specific.

b. What type of engineer did you act like? _______________________________________________

(Survey continues on the back of this page.)


Please rate the unit you just completed:

1. Overall, I would rate this unit as … Ο Excellent Ο Good Ο Fair Ο Poor

Thinking about the unit you just completed, please choose your level of agreement with the following:

Strongly Agree | Agree | Disagree | Strongly Disagree

2. I received guidance from my teacher when I asked for it. ο ο ο ο

3. I learned a lot. ο ο ο ο

4. This unit related to the real world. ο ο ο ο

5. I understand how the engineering design process activity allowed us to use the guiding questions to solve the challenge selected. ο ο ο ο

6. Solving this challenge can help others, our community, and society. ο ο ο ο

7. I contributed to the group’s solution to the challenge. ο ο ο ο

8. Listening to other students’ ideas was an important part of the unit. ο ο ο ο

9. There are many solutions to this problem. ο ο ο ο

10. We were able to test our initial solution. ο ο ο ο

11. After our initial test, we were able to think about changes we wanted to make to have a better solution to the challenge. ο ο ο ο

12. I like problems best when they really make me think. ο ο ο ο

13. I am excited that we found a solution to this challenge. ο ο ο ο

14. I participated more during this unit than I usually do in class. ο ο ο ο

15. I feel using challenges is a more effective way to learn than the way we are usually taught. ο ο ο ο

16. This unit made me more interested in Engineering. ο ο ο ο

17. This unit made me feel more confident about math or science. ο ο ο ο

18. I learned about the careers related to this challenge and our solution. ο ο ο ο

19. If you were the teacher, list three changes that would make this unit better so that students learn more.

20. For summary purposes only, please indicate if you are a male or female.


Appendix A-11. 2013 STEM Conference Session Evaluation


SESSION EVALUATION: 2013 Get REAL! STEM Education Brings REAL Challenges, REAL Connections, REAL Life (May 6, 2013, Tangeman University Center, University of Cincinnati)

Thank you for attending this session of the 2013 event (funded by National Science Foundation Grant #1102990). We would like you to briefly complete this session evaluation. Your responses will be aggregated (no individual responses will be reported) and given to the conference organizers and presenters so that the conference can be improved for future events. All responses to the survey will remain confidential.

When you are finished completing this survey, please give it to the session volunteer. THANK YOU!

SESSION NAME: __________________________________________________
Presenter: _______________________________________________________

Thinking about this session, please indicate your level of agreement with the following statements (mark the corresponding circle):

Strongly Agree | Agree | Disagree | Strongly Disagree | Not Applicable

1. Overall, the information presented during this session was very useful. Ο Ο Ο Ο Ο

2. The content and/or strategies presented at this session can be easily adapted, or used as-is, in my educational setting. Ο Ο Ο Ο Ο
3. The session increased my understanding of challenge-based learning. Ο Ο Ο Ο Ο
4. The session increased my understanding of how to use engineering as a context for teaching mathematics and science topics. Ο Ο Ο Ο Ο
5. The presentation was engaging and the activities appropriate for the topic. Ο Ο Ο Ο Ο
6. The presenter was clear and easy to understand. Ο Ο Ο Ο Ο
7. The presentation accurately reflected my expectations from reading the abstract. Ο Ο Ο Ο Ο
8. I would recommend this session to my colleagues. Ο Ο Ο Ο Ο
9. The session identified real-world applications, career connections and societal impacts for teaching STEM content. Ο Ο Ο Ο Ο

10. What did you value most about this session? Why?

11. What strategy(ies) or content presented do you anticipate using in your teaching environment? Why?

12. What modifications would you recommend if this session were presented again?


Appendix A-12. 2013 STEM Conference Evaluation


Survey Link: 2013 STEM Conference Evaluation

2013 STEM Regional Conference

Conference Evaluation: 2013 Get REAL! STEM Education brings REAL Challenges, REAL Connections, REAL Life

(May 6, 2013 at Tangeman University Center, University of Cincinnati)

Thank you for attending the 2013 STEM Regional Conference at Tangeman University Center, on the campus of the University of Cincinnati. This event was sponsored by the University of Cincinnati NSF MSP: Project CEEMS (National Science Foundation Grant #1102990). Hopefully, this event provided you a chance to work with partners and colleagues to continue the journey of developing effective, enriching, and innovative STEM programs.

To help the event coordinators improve future events, we would like you to complete this brief evaluation survey within the next 7 days. The survey is being administered by the Evaluation Services Center at the University of Cincinnati on behalf of the University of Cincinnati NSF MSP: Project CEEMS. This survey should take less than ten (10) minutes to complete. All responses to the survey will remain confidential. Data will be analyzed in aggregate and no individual responses will be reported. If you have any technical concerns with the survey, we can be contacted at (513) 556-3900 or by sending an email to [email protected]. Thank you for your participation.

Please click the "Next" button to begin the conference evaluation.

1. OVERALL, this event has aided my understanding of STEM education. Very Good Satisfactory Marginal Poor

2. OVERALL, the information presented will be USEFUL in my future educational activities. Strongly Agree Agree Disagree Strongly Disagree

3. OVERALL, the information presented increased my understanding of how real-world applications of STEM impact students’ lives. Strongly Agree Agree Disagree Strongly Disagree

Page 71: NSF MSP: CEEMS Project – Year Two - ceas.uc.edu CEEMS Eval... · lesson development, teacher professional development and classroom implementation. There were 15 in-service teachers

CEEMS – Year 2 Evaluation 68

4. OVERALL, the information presented increased my understanding of STEM careers and fields of study. Strongly Agree Agree Disagree Strongly Disagree

5. OVERALL, the information presented provided me with ideas of how to connect STEM to societal impacts. Strongly Agree Agree Disagree Strongly Disagree

6. At the conclusion of this conference, I have a better understanding of how challenge-based learning can be used to enhance mathematics and science learning. Strongly Agree Agree Disagree Strongly Disagree

7. At the conclusion of this conference, I had a better understanding of how the engineering design process can be used to enhance mathematics and science learning. Strongly Agree Agree Disagree Strongly Disagree

Please click the "Next" button to continue the survey.


Please indicate your agreement level with the following statements regarding aspects of the 2013 Get REAL! STEM Education Brings REAL Challenges, REAL Connections, REAL Life event:

(Response options for items 8–14: Strongly Agree / Agree / Disagree / Strongly Disagree)

8. The event provided me opportunities to learn about current STEM initiatives.

9. The STEM conference provided opportunities to collect STEM activities that can be implemented in my own teaching or learning environment.

10. The conference facilitated collaboration across institutions or schools.

11. The breadth of the session options was a positive aspect of the conference.

12. The length of the sessions was adequate.

13. I had access to the wireless internet when I needed it.

14. The location - Tangeman University Center at the University of Cincinnati - was an appropriate venue for this type of event.

Please click the "Next" button to continue the evaluation survey.


15. What did you value most about the 2013 Get REAL! STEM Education Brings REAL Challenges, REAL Connections, REAL Life event? Why?

16. Do you have any suggestions for improvement for our next STEM event?

17. Please list STEM-related presentations or topics you would like to learn more about.

18. To help us track session attendance, please select the presentation or workshop you attended during Session A, the first session in the morning.
• The Inclusive Competitiveness Imperative: Nurturing Talent & Enterprises to Fill Jobs and Create Wealth in the 21st Century; Room 400A; Presented by Johnathan M. Holifield, Esq.
• Solving Real World Problems in Math Classrooms; Room 400C; Presented by Kimya Moyo, Erick Allen, Jennifer Harvey and Gina Ogden
• There's More to Light than Meets the Eye; Room 419; Presented by Bev Ketron, Sue Hare and Sharon Young
• Investigating a Cliff Model; Room 415; Presented by Kevin Stinson
• Stunt Design: Challenge Based Learning in the Physics Classroom; Room 425; Presented by Aaron Debbink
• Energy Transfer, Transformation and Efficiency in Toy Production; Room Atrium; Presented by Paige Jarrell
• REAL Robotics with High School Students - First Robotics Competition (FRC) Teams 1038 and 144; Room 400B; Presented by Doug Noxsel
• Engineering Design: An Instructional Strategy to Close the Gap; Room 417; Presented by Robin McGinnis, Shannon Raquet, Karen Black, Cheryl Wilson and Joe Ohradzansky
• Problem Based Learning Experiences (PBLE): Developing Authentic Learning Experiences; Room Cinema; Presented by Rob Kovacs
• Integrating Science in the K-3 Grades; Room 411; Presented by David Schklar and Kasey Dunlap
• Poster Presentation of Teaching STEM; Rooms 403 and 405

19. Please select the presentation or workshop you attended during Session B, the second session in the morning.
• Literacy Design Collaborative and STEM; Room 400A; Presented by Kathy Wright, Ronnda Cargile, Glenetta Krause, Halla Shteiwi and Antwan Lewis
• Thinking Design: Keeping the E in Engineering Design Process (EDP); Room 415; Presented by Joe Brasile and Jesse Rosenthal
• Greater Cincinnati STEM Collaborative Web Site; Room Cinema; Presented by Steve Geresy
• Challenge Based Unit - Brent Spence Bridge Redesign; Room 425; Presented by Megan Walker
• Nanoscience for Every Classroom; Room 417; Presented by Brian Pollock and Chantal Hayes
• Rock and Roll through Earth Science: Connect Science and Mathematics in Your Classroom; Room 419; Presented by Reeda Hart, Betty Stephens and Thomas Brackman
• Next Generation Science Standards (NGSS): Structure, Content and Implications for STEM; Room 411; Presented by David Vernot
• Poster Presentation of Earthquake Retrofit Project and Connecting Aerospace Engineering with Energy Transformation and Move!; Room Atrium

20. Please select the presentation or workshop you attended during Session C, the session after lunch.


• Using Blended Learning Teaching Methods to Integrate 21st Century Learning Skills; Room 400B; Presented by Cathy Macdonald and Ryan Macdonald
• Building Simulation Contraptions in your Garage; Room Atrium; Presented by Paul Schember
• Panel: Challenge Based Learning; Room 400C; Presented by Tim Dugan, Katie Powers and Cathy Herzog
• It's That Easy to Find High Quality STEM Resources!; Room Cinema; Presented by Cheryl Ghosh and Susan Kohler
• LEGO Crash Test Dummies; Room 425; Presented by Sue Hare, Bev Ketron and Sharon Young
• Alternative Energy - Wind Turbine Design; Room 415; Presented by Todd Hummer, John Nicol, Doug Noxsel and Brad Williams
• Integrating Reading with Science? But We Are Concerned with Improving Science Scores!; Room 400A; Presented by Imelda Castaneda-Emenaker and Shelly Micham
• Virtual Applications of Learning; Room 427; Presented by David Valentine and Dr. Robert Williams
• iPad Apps to Engage your Students; Room 411; Presented by Glen Schulte
• Simple, Inexpensive Engineering Design Process Ideas; Room 417; Presented by Doug Werling, Caleb Barber and Chris Anderson
• Poster Presentation of Nanotechnology: Past, Present and Future and A RET Experience Impacting the Classroom: Availability of Safe Drinking Water; Room 419

21. Please select the presentation or workshop you attended during Session D, the last session of the day.
• How Does Project Lead the Way Fit into the Career and Tech Model?; Room 400C; Presented by Angela Lewis, Chasity Rohan, Carissa Hewitt, John Sanders and Sherry Sanders
• A Model Program for Increasing the Number of Ethnic Students in STEM; Room 400B; Presented by Kenneth Simonson
• How Do You STEM up?; Room 417; Presented by Susan Emmett and Tina Gaser
• Caring for Our Watersheds; Room 415; Presented by Gwen Roth
• Engineering Makeovers; Room Atrium; Presented by Matt Ritchey and Curt Blimline
• Thinking Inside the Box: Using the Physical Space of the Classroom to Teach Mathematical Concepts and Engineering Applications; Room 425; Presented by Josh Rexhausen
• Putting the "E" in STEM: Updating Lessons You May Already Have; Room 419; Presented by Reeda Hart, Thomas Brackman and Dr. Madhura Kulkarni
• Next Generation Science Standards (NGSS): Structure, Content and Implications for STEM (Session Repeated); Room 411; Presented by David Vernot
• Building Partnerships with Business and Industry; Room 400A; Presented by Carissa Schutzman

22. Thinking back on these sessions, what was (were) the most useful session(s) you attended? Why?

Please click the "Next" button to continue the evaluation survey.


To help us target support in the future, please provide the following demographic information:

23. What is your current position? (CHECK ALL THAT APPLY)
• K-12 Teacher
• K-12 Special Education, Resource, or Inclusion Teacher
• K-12 School Administrator
• K-12 Central Office Personnel
• Post-Secondary Faculty or Staff
• Post-Secondary Administrator
• Educational Organization Faculty or Staff
• Member of Educational Outreach Organization (Please specify organization) ____________________
• Business or Industry Representative (Please specify company or organization) ____________________
• Other (Please Specify) ____________________

24. What level(s) best describes the grade level(s) you are currently teaching? If you are an administrator or supervisor, what is the grade level in your school? (CHECK ALL THAT APPLY)
• Pre-Kindergarten
• Primary (K-3)
• Intermediate (4-6)
• Middle (7-8)
• High School (9-12)
• Administrator/Supervisor (Elementary School)
• Administrator/Supervisor (Middle School)
• Administrator/Supervisor (High School)
• Administrator/Supervisor (District-wide)
• Post-secondary
• Other ____________________
• Not Applicable

25. Select the response that best describes the main subject area you are currently teaching or preparing to teach.
• Self-contained class (teach all or most academic subjects to one class)
• Math and Science
• Math only
• Science only
• Technology only
• Engineering only
• Other or multi-subject combinations (Please Specify) ____________________
• Not Applicable

This is the end of the survey. Please click the NEXT button to submit your answers. Thank you.


Appendix B. Teacher Survey Results


Appendix B-1. June 2012 SIT Evaluation Results


UC NSF MSP: CEEMS Project – Summer Institute for Teachers (SIT) 2012 Evaluation

Table B-1-1: Responses to Qualitative Questions (multiple response table)

In-service Teacher Responses: Please identify and describe briefly at least three strengths of the CEEMS project SIT 2012.

Breadth of Knowledge (n=5)
• The array of ideas and concepts given throughout the course.
• All of the information and experiences were useful.
• Workshops/sessions to give us the skills/experiences needed to be successful with the required deliverables. The skills were Wikis, video, posters, Next Gen. science standards, etc.
• Content
• I felt the misconceptions presentation and subsequent resources was very helpful.

CBL (n=9)
• Helping us create challenge-based lessons to use in the classrooms (real classroom ideas)
• Introduced CBL
• Also, challenge based learning is the way to go.
• Concept
• Encouraging growing as a math teacher - to have the time and support to invest in planning a unit in the challenge based learning style.
• I enjoyed this program. It was so worthwhile. I can't wait to work on unit 2.
• Integrating CBL and content
• Challenge-based learning for the classroom
• Application of real life math and science

Engineering Concepts (n=7)
• Strength was the concept of engineering.
• Fundamentals of Engineering course was project based and full of applications.
• Learning how to use the design process and engineering in my classroom
• Loved the Intro. to Engineering course! Great way to introduce engineering to someone who had no idea what engineering was.
• Foundations of Engineering class - everything about it.
• I also thought the Foundations of Engineering class gave us a lot of experience with how the design process works and can be implemented.
• Foundations of Engineering course - set up the whole program well

Instructor (n=2)
• Attila and Joni were great professors.
• [The instructors] Being understanding/patient with us (me) as we struggled to grasp certain aspects of the program

Positive – General Comment (n=1)
• I think that it is great


Product Oriented (n=4)
• Video/poster production
• Loved creating the poster and unit video!
• Product oriented; it is great to leave here with stuff to put up and implement in my classroom.
• Product oriented

Project Director (n=2)
• Julie was great as project director.
• Julie Steimle is a great project director! She was always on top of everything!

Recommend to Others (n=1)
• I would recommend this to math and science teachers!

Resource Team (n=10)
• Resource team members/coaching sessions along the way were very helpful
• Resource team members were extremely qualified and knowledgeable
• As well as the resource team there for help and guidance
• Coaching support was great.
• The coaches! It was awesome to have such a wealth of insight and experience and the ongoing support.
• The support was amazing. I am not sure how we would have done this without the resource team.
• Excitement of the group - from the resource team
• Enthusiasm of resource team
• Coaching was good
• Resource team members were very helpful in details of projects.

Support – General (n=1)
• Support for implementations ($, laptop, resource team, video camera, etc.)

Support – Technology (n=3)
• The provided laptops really made completion of assignments easier
• Technology
• Learning to use technology and how to apply it in my class

Support – Template (n=1)
• Overall template of the unit itself

Support – Time (n=3)
• The time we were given at UC to work on projects was very helpful.
• Focused time to work on units and implementation for the classroom
• The flexibility to work on the projects in the summer was very nice.


Teacher Collaboration (n=6)
• "Community feel" among teachers, learned to work well together with challenges we needed to overcome.
• Meeting teachers from other districts - learning from each other
• Peer feedback
• I also appreciate all the professional connections I made this summer.
• Enthusiasm of participants
• Interaction with teachers was very useful to gain ideas and contacts for future.

Workload (n=2)
• The work load expectations for math course and Earth science were realistic in scope
• Daily schedule while intense was nicely balanced

Pre-service Teacher Responses: Please identify and describe briefly at least three strengths of the CEEMS project SIT 2012.

Breadth of Knowledge (n=3)
• The classes served to remind me some of the things I had forgotten content-wise in mathematics.
• Variety of coursework
• Breadth of topics

CBL (n=1)
• Practical applications

Engineering Concepts (n=1)
• This experience will help us lead a design project with our future students

Instructors (n=2)
• Qualified presenters at workshops.
• Presenters were very willing to help all teachers and Fellows!

Positive – General Comment (n=1)
• I really think this program was well designed and implemented! :-) (originally written as a response to question 20)

Product Oriented (n=1)
• Designing and building a model and later creating a poster for presentation was an excellent project for the Fellows to execute.

Teacher Collaboration (n=3)
• I loved the opportunities to gain insight from in-service teachers. Watching them put together units taught me a lot about my future lesson planning.
• Interacting with teachers from different districts
• I am a pre-service teacher, and I enjoyed just getting to talk to people who are currently doing work that I'll be doing soon.


In-service Teacher Responses: Please identify and describe briefly at least three areas for improvement for the CEEMS project SIT 2012.

Breadth of Knowledge (n=2)
• Possible class for spreadsheet
• Use "possible class" in Intro. for Engr.

CBL (n=9)
• Classes need to focus on Engineering Applications not just content, some CBL please.
• Some electives were not CBL based
• Interaction with teachers who have used CBL in their science and math classes would be very beneficial. I know that we (teachers) can fill this role next summer. This will be invaluable for future participants in the SIT program.
• Classes need to focus on Engineering Applications not just content, some CBL please.
• Use a "unit lesson" in Intro. to Engr. next year for the new STUDENTS
• The elective course should be more connected to what we will teach -- none allowed us time to apply the learning to our specific content.
• Physical science not related to challenge based learning
• I also think that the content courses (Math, Biology, and Physics) need to be more application based -- resources and projects to do in class. I think it is more useful for teachers rather than content.
• Classes other than foundations need to have projects that incorporate challenge based learning.

Communication of Expectations (n=10)
• Communication of expectations
• Communication between coaches so that they all have the same ideas.
• Examples of deliverables needed early for next cohort.
• The coaches need to get on the same page regarding project expectations, timelines, etc.
• Resource team needs to collaborate. I was getting different feedback from everyone and did not know who to listen to!!
• Prepare a rubric ahead of time for the posters so everyone is on the same page.
• Determine requirements and expectations at beginning
• Standardize expectations and don't be afraid to give constructive criticism
• Expectations on unit were not concrete. A lot of contradictory advice. Coaches were not on the same page. Didn't know exact expectations for unit (no one seemed to).
• The resource team needs to be on the same page for the expectations for the poster/video/wiki.

Engineering Concepts (n=1)
• Better integrate the engineering process with the required university course work; integration this year was not very clear in all classes.

Mathematics Concepts (n=2)
• Place more emphasis on mathematics implementation
• Maybe having two tiers for the math class

Nothing (n=1)
• I can't think of any others; I really think this program was well designed and implemented! :-)

Product Oriented – Guidelines (n=2)
• More defined guidelines on projects
• There was some confusion about what needed to be included on the posters.

Resource Team (n=2)
• More coach interaction
• Resource team needs to be more involved.

Support – Feedback (n=2)
• We would upload stuff to our Wikis and not get timely feedback.
• Needed more feedback

Support – Technology (n=2)
• I never had a working password
• Make the wiki more user friendly (Google site was difficult to use)

Teacher Collaboration (n=2)
• More interaction with the pre-service teachers on projects and co-development of units when they will be teaching with a SIT teacher.
• Pre-service teachers - didn't see the need or relevance of them

Workload (n=6)
• Reduce the amount of work expected to be completed in the elective class. Physics class homework was so intense that it took more work than the unit production.
• The work from the elective course should not overshadow the development of the unit.
• The electives need to be on a more level playing field.
• Physical science was too much busy work
• Classes, other than that for the unit template, need to coordinate out-of-class work so it is more balanced. Physical science had too much out of class work.
• The workload for some classes was excessive given the ongoing need to work on the SIT deliverables. (Originally written as Question 22)

Workload Timing (n=6)
• Project should have been introduced at the beginning so that we could get ideas thought out ahead of time.
• The lessons must be taught within the first two weeks (how to use and do the different things -- poster, wiki, unit, video)
• Start video and poster resource teaching earlier so that more time could be used to develop them.
• Poster and Video projects need to be introduced earlier in the timeline.
• Allow for more time during the day to focus on units.
• In the first two weeks of the program, they need to shorten the Foundation classes by several days and start teaching about Wikis, posters, videos and our template. Getting them all introduced at the same time so late in the game was overwhelming.


Pre-service Teacher Responses: Please identify and describe briefly at least three areas for improvement for the CEEMS project SIT 2012.

Engineering Concepts (n=1)
• Greater emphasis on engineering in classes

Organization (n=1)
• Seminar sessions that are better organized

Prior Knowledge (n=3)
• As a Fellow, I often found myself very unfamiliar with the "jargon" used in the afternoon seminar discussions, especially about teaching standards. Many acronyms and some terminology should be explained to Fellows.
• The classes would be better if there were more equal levels of previous knowledge in the same classroom.
• If combined, start with same introduction of topic with Woodrow Wilson Fellows (Yes, I am a Fellow.)

Teacher Collaboration (n=2)
• More interaction between teachers and Fellows would be nice. Fellows want advice from current teachers.
• Either better interaction with Woodrow Wilson Fellows or separation


In-service Teacher Responses: The project team is planning to provide support to you and your teachers during the academic year. At this time, are there any particular concerns that you would like the resource team to address to help you in your participation with the CEEMS project?

Classroom Visits (n=1)
• Keep checking in periodically throughout the school year. Visit our classes and help with ideas for how to best implement our units.

Communication (n=6)
• A timeline of events
• Communication -- quickly and effective
• Meeting to discuss reflections on Unit 1. I get the best ideas "face to face", so I will want to meet.
• Make sure that they keep up to date with our units and not wait until the last minute. It's hard to change your entire unit in 2 days!! If they had suggested improvements in a timely manner, it would have worked out.
• Need a schedule for meetings. Will most communication/feedback be through wiki?
• Please provide time for meetings as early as possible. Also, are there physical deadlines for the 2nd and 3rd units?

Continue Support (n=3)
• No concerns. Keep the "good help" coming!!
• Look forward to being in touch!
• Time. It was easy to manage time because it was summer. This will be much more challenging once the school year starts.

Expectations (n=1)
• Make expectations clear and consistent.

Funds (n=1)
• Informing us of how/when we receive funds for unit deliverables/materials.

Nothing (n=2)
• None
• Nothing at this time

Support for Future Units (n=2)
• Figuring out ways/relaying information about how we move into other units. Following up with other units.
• I look forward to the continued collaboration and support!

Teacher Collaboration (n=2)
• More time for group collaboration on units. Partners did not work.
• Incorporate the pre-service teacher in a better manner.

Pre-service Teacher Responses: The project team is planning to provide support to you and your teachers during the academic year. At this time, are there any particular concerns that you would like the resource team to address to help you in your participation with the CEEMS project?

Teacher Collaboration (n=2)
• Any opportunity I can get to interact with a CPS High School Chemistry teacher to ask questions and get advice about Chemistry curriculum, lesson plans and resources like labs, worksheets, project ideas, tests, etc. would be most welcome.
• If you ever have questions about real world industry/business applications, I am happy to help if I can. (Rich Farris)


Appendix B-2. 2012 Administrators’ Academy Evaluation Results


UC NSF MSP: CEEMS Project – 2012 Administrators’ Academy Planning for the Academic Year 2012-2013

SUMMARY OF RESPONSES

1. How do you plan to “market” this program to your higher level administrators? Please give examples and expected outcomes.

Cincinnati Public Schools – Taft Information Technology High School
• I will be meeting with the Assistant Superintendent assigned to Taft. The goals and outcomes will be shared as well as the plan for Taft. The Science Curriculum Manager for CPS will also be informed. I expect CEEMS to be fully embraced.

Cincinnati Public Schools – Withrow University High School
• All information will be summarized to the principal on a quarterly basis w/ calendar of opportunities to observe the SIT participants teaching the units.

Felicity Franklin Local Schools – Felicity Franklin Middle School
• By sharing with them what I have experienced through the Academy and allowing the teacher to share their experience. I expect all other administrators will be excited to get involved and gain the resources that are available through the CEEMS project.

Goshen Local School District – Goshen Middle School
• Already sent a brief overview of Mrs. Ogden’s training this summer to be highlighted on the District website and in our community newsletter.
• Have Mrs. Ogden share in December at our District Board Meeting to highlight “what’s” happening in Goshen Middle School.
• Share walkthrough data regarding Mrs. Ogden’s progress w/ superintendent.

Norwood City School District – Norwood Middle School
• Will be listed on Building Focus. Will be shared as an emphasis to the BOE in annual Principal mtg. Will place info on middle school website and in local newspaper with photos of experiments.

Oak Hills Local School District – Oak Hills High School
• Will discuss the program at our Admin meetings. Will also discuss at district level principal meetings.
• Higher level admin will be made aware of the PD that our teacher will be providing to other staff members. They will also be encouraged to observe part or all of the challenge based unit.

Princeton City Schools – Princeton Community Middle School
• By the use of positive results gained from the program, the testimony of participants and any marketing provided by the program.

West Clermont – Amelia Middle School
• I will work closely with our Teaching and Learning Department (T&L: Tanny McGregor, M.E. Steele-Pierce and Cheryl Turner) to schedule P.D., observations and evaluations since I have transferred from the building our participating teacher is in.

Winton Woods City Schools – Winton Woods High School
• Our district is already on board in addressing the needs of our students in math, science, and engineering. Our high school is developing a STEM Pathway, so this could fit into it. We already do PBL learning in one of our schools within a school and the CBL & DBL follows this same thought.


2. How do you plan to “market” this program to other teachers in your school and district? Please give examples and expected outcomes.

Cincinnati Public Schools – Taft Information Technology High School
• The CEEMS project will be introduced during our opening PD and during the initial science dept. meeting. Regular updates will also take place. I expect CEEMS to be fully embraced.

Cincinnati Public Schools – Withrow University High School
• School Level - SIT participants will market via PD opportunities throughout the school year (i.e., Open House, Team meetings and Staff Meetings with support from administration).
• Expected Outcome - Increase team recruitment (math and science teachers).

Felicity Franklin Local Schools – Felicity Franklin Middle School
• Using the teacher in the CEEMS project to present in staff and curriculum meetings.

Goshen Local School District – Goshen Middle School
• Expected outcomes will impact OAA scores, so Pro-Ohio Science Data and reports run by Mrs. Ogden will be discussed monthly with myself.
• Mrs. Ogden will collaborate in staff meetings to share a challenge activity and overview of CEEMS.
• Share flyers with science/math staff

Norwood City School District – Norwood Middle School
• Through “workshops”, early release, and casual talk with elem. and HS math/science dept. Will learn more of the teacher participant’s experience.

Oak Hills Local School District – Oak Hills High School
• We will use our teacher who participated in the program to share his experiences in the program and to promote the program internally.
• We will share information about the program w/ all math and science teachers, but we will continue to target specific teachers who will help support this initiative. We will tie it into what we are doing w/ Asia Society.

Princeton City Schools – Princeton Community Middle School
• Results of those in the program. With improvement of results of those teachers. Marketing by using participants in the program. The participant would discuss positive experience.

West Clermont – Amelia Middle School
• At my new school I will work closely with the science and math departments (I and the district science curriculum administrator). I will recruit others (teachers from both Amelia and Glen Este Campuses) and work closely with Glen Este Administration to implement/incorporate the CEEMS schedule.

Winton Woods City Schools – Winton Woods High School
• Many of our teachers have already expressed interest in PBL; however, the CBL and DBL involved in the CEEMS project fit better for our STEM pathway and math/science classes. By utilizing our teacher and CEEMS resources we will share this program with all math and science teachers who already have an open mind regarding change : ).


3. Recruitment of additional teachers is very important. What are your initial thoughts about how you will recruit additional teachers for next year's SIT? Do you recommend a targeted recruitment in selected school(s) or a district-wide effort? Please give strategies for implementing your response.

Cincinnati Public Schools – Taft Information Technology High School
• For CPS I would suggest contacting the Science Curriculum Manager. As far as Taft goes, word will spread about CEEMS.

Cincinnati Public Schools – Withrow University High School
• School level: identify selected teachers to participate and make sure they are consistently exposed to CEEMS throughout the school year.

Felicity Franklin Local Schools – Felicity Franklin Middle School
• I would recommend a targeted recruitment. I will personally discuss the project with selected staff members. Our strategy is again to use the teacher in the project to give true examples of exactly what it is like.

Goshen Local School District – Goshen Middle School
• Targets should be determined with our district curriculum director.
• Recruitment during teacher open day with a booth might be possible.
• Meri Johnson knows the district well; sending her would be a great idea.

Norwood City School District – Norwood Middle School
• I understand why teachers need fall attendance; however, summer vacations or weddings often limit initial participation, although fall attendance is an indicator of success.

Oak Hills Local School District – Oak Hills High School
• We will share information about the program with all math and science teachers, but we will continue to target specific teachers who will help support this initiative. We will tie it into what we are doing with the Asia Society.

Princeton City Schools – Princeton Community Middle School
• Offer in-district incentives for the participants in the program. Use marketing tools from the CEEMS program.

West Clermont – Amelia Middle School
• As stated above, as well as a district-wide effort to work with the main office (T&L) for implementation of curriculum. I plan on blending this with the introduction of the new standards.

Winton Woods City Schools – Winton Woods High School
• See above.


4. How do you plan to support the 2012 SIT participants as they conduct in-school and district-wide professional development activities?

Cincinnati Public Schools – Taft Information Technology High School
• Frequent communication.
• Visibility.
• Opportunities to present.
• I will serve as a "go-to" for ANY issues and will offer support.

Cincinnati Public Schools – Withrow University High School
• The SIT participant will have a variety of opportunities to present (i.e., team meetings, department meetings, staff meetings, etc.), with support from his team leader and the science and math department chairs prior to facilitating PD activities.

Felicity Franklin Local Schools – Felicity Franklin Middle School
• By providing time frames and helping her with the presentations if needed; checking with her to see what her needs are to make sure she is successful.

Goshen Local School District – Goshen Middle School
• Meet monthly with Ogden to discuss progress.
• Provide her time to work on unit planning with her department.
• Collaborate with her to provide a PD for math/science during early release or during a whole staff meeting.
• Give positive and honest feedback to the teacher.

Norwood City School District – Norwood Middle School
• Common plan time.
• Common lunch time for working lunches.
• Half-day work time each quarter.
• Seize the district-scheduled 2-hour early release when the CEEMS group could meet.
• Build in updates with teachers to monitor success.
• Allow teachers to spend money on their visions, not mine.

Oak Hills Local School District – Oak Hills High School
• I will schedule a specific time to meet with my teacher a minimum of twice a month to support and provide input.
• I will make sure there is time during late starts and/or department meetings to provide the professional development.

Princeton City Schools – Princeton Community Middle School
• By doing frequent check-ins with participants and observations of teachers who are involved in the program.

West Clermont – Amelia Middle School
• Through our T&L department I will coordinate PD opportunities as well as science team meetings throughout the year.

Winton Woods City Schools – Winton Woods High School
• I plan on being a "sounding board" for her. I will meet with her on a regular basis, visit classes, serve as a "coach" for her to reflect, provide resources, and overall just be there to help her as needed.


Appendix B-3. 2012-2013 Teacher Current Instructional Practices Pre-Post Survey Results – Cohort 1


Table B-3-1. Cohort 1: T-Test Analysis of Post and Pre Results – Confidence in 2012-2013

| Confidence in Implementing… | Mean Difference* | Difference Standard Deviation | t | df | Sig. (2-tailed) |
|---|---|---|---|---|---|
| Explicitly connect class content to complex problems or issues with global impact | 1.286 | .914 | 5.264 | 13 | .000 |
| Explicitly connect class content to real-world examples and applications | .643 | .929 | 2.590 | 13 | .022 |
| Explicitly connect these real-world applications to STEM careers | 1.214 | .975 | 4.660 | 13 | .000 |
| Explicitly connect class content to how people in STEM careers use their knowledge to address societal impacts | 1.286 | .825 | 5.828 | 13 | .000 |
| Guide students to break complex global problems into their local and more actionable components | .857 | .770 | 4.163 | 13 | .001 |
| Guide students in refining problems | 1.286 | .611 | 7.870 | 13 | .000 |
| Guide students in planning investigations to better understand different components of problems | 1.214 | .893 | 5.090 | 13 | .000 |
| Provide opportunities for students to gather information about problems or issues of importance | 1.071 | .917 | 4.372 | 13 | .001 |
| Provide students with opportunities to explore multiple solution pathways for problems | 1.000 | .961 | 3.894 | 13 | .002 |
| Guide students in weighing the pros and cons of different solution pathways | 1.071 | .829 | 4.837 | 13 | .000 |
| Provide opportunities for students to test their solution pathways | 1.357 | 1.008 | 5.037 | 13 | .000 |
| Guide students in evaluating the results of their solution pathways | 1.429 | .938 | 5.701 | 13 | .000 |
| Provide students with opportunities to refine and retry a solution pathway | 1.500 | .650 | 8.629 | 13 | .000 |
| Provide opportunities for students to communicate their solution pathways and results to others | 1.071 | .917 | 4.372 | 13 | .001 |
| Provide opportunities for students to take responsibility for the decisions they made about the processes used in solving complex problems | 1.286 | .914 | 5.264 | 13 | .000 |
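The t statistics above follow from the paired-samples formula t = d̄ / (s_d / √n), with df = n − 1 (df = 13 implies n = 14 respondents). A minimal sketch that recomputes the first row from its reported summary statistics; the helper name is ours, and with the raw pre/post responses one would instead use `scipy.stats.ttest_rel`:

```python
import math

def paired_t_from_summary(mean_diff, sd_diff, n):
    """Paired-samples t statistic from the mean and SD of the differences."""
    t = mean_diff / (sd_diff / math.sqrt(n))
    return t, n - 1  # t statistic and degrees of freedom

# First row of Table B-3-1: mean difference 1.286, SD of differences .914, n = 14
t, df = paired_t_from_summary(1.286, 0.914, 14)
print(t, df)  # t ≈ 5.264 with df = 13, matching the table
```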


Table B-3-2. Cohort 1: Summary of Current Instructional Practices – Confidence – 06/12 (Pre)

| Item | Mean* (Std. Dev.) | Very Confident f (%) | Confident f (%) | Somewhat Confident f (%) | Not Confident f (%) |
|---|---|---|---|---|---|
| Explicitly connect class content to complex problems or issues with global impact | 2.25 (.775) | 1 (6.3%) | 4 (25.0%) | 9 (56.3%) | 2 (12.5%) |
| Explicitly connect class content to real-world examples and applications | 2.88 (.806) | 4 (25.0%) | 6 (37.5%) | 6 (37.5%) | - |
| Explicitly connect these real-world applications to STEM careers | 2.00 (.730) | - | 4 (25.0%) | 8 (50.0%) | 4 (25.0%) |
| Explicitly connect class content to how people in STEM careers use their knowledge to address societal impacts | 1.75 (.775) | - | 3 (18.8%) | 6 (37.5%) | 7 (43.8%) |
| Guide students to break complex global problems into their local and more actionable components | 2.06 (.574) | - | 3 (18.8%) | 11 (68.8%) | 2 (12.5%) |
| Guide students in refining problems | 2.13 (.619) | - | 4 (25.0%) | 10 (62.5%) | 2 (12.5%) |
| Guide students in planning investigations to better understand different components of problems | 2.25 (.856) | 1 (6.3%) | 5 (31.3%) | 7 (43.8%) | 3 (18.8%) |
| Provide opportunities for students to gather information about problems or issues of importance | 2.38 (.619) | - | 7 (43.8%) | 8 (50.0%) | 1 (6.3%) |
| Provide students with opportunities to explore multiple solution pathways for problems | 2.31 (.873) | 1 (6.3%) | 6 (37.5%) | 6 (37.5%) | 3 (18.8%) |
| Guide students in weighing the pros and cons of different solution pathways | 2.31 (.704) | 1 (6.3%) | 4 (25.0%) | 10 (62.5%) | 1 (6.3%) |
| Provide opportunities for students to test their solution pathways | 2.13 (.806) | - | 6 (37.5%) | 6 (37.5%) | 4 (25.0%) |
| Guide students in evaluating the results of their solution pathways | 2.13 (.719) | - | 5 (31.3%) | 8 (50.0%) | 3 (18.8%) |
| Provide students with opportunities to refine and retry a solution pathway | 2.00 (.730) | - | 4 (25.0%) | 8 (50.0%) | 4 (25.0%) |
| Provide opportunities for students to communicate their solution pathways and results to others | 2.25 (.683) | - | 6 (37.5%) | 8 (50.0%) | 2 (12.5%) |
| Provide opportunities for students to take responsibility for the decisions they made about the processes used in solving complex problems | 1.88 (.806) | - | 4 (25.0%) | 6 (37.5%) | 6 (37.5%) |

*Scale: Very Confident=4; Confident=3; Somewhat Confident=2; Not Confident=1
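The means and standard deviations in these confidence tables can be reproduced from the frequency counts and the 4-point scale given in the footnote. A minimal sketch using the first row of Table B-3-2 as input (the helper name is ours, not the evaluators'):

```python
import math

def summarize(freqs):
    """Mean and sample SD of Likert responses given {scale value: frequency}."""
    n = sum(freqs.values())
    mean = sum(v * f for v, f in freqs.items()) / n
    var = sum(f * (v - mean) ** 2 for v, f in freqs.items()) / (n - 1)
    return mean, math.sqrt(var)

# "Explicitly connect class content to complex problems...": 1 Very Confident (4),
# 4 Confident (3), 9 Somewhat Confident (2), 2 Not Confident (1); n = 16
mean, sd = summarize({4: 1, 3: 4, 2: 9, 1: 2})
print(round(mean, 2), round(sd, 3))  # 2.25 (.775), as reported in the table
```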


Table B-3-3. Cohort 1: Summary of Current Instructional Practices – Confidence – 05/13 (Post)

| Item | Mean* (Std. Dev.) | Very Confident f (%) | Confident f (%) | Somewhat Confident f (%) | Not Confident f (%) |
|---|---|---|---|---|---|
| Explicitly connect class content to complex problems or issues with global impact | 3.57 (.514) | 8 (57.1%) | 6 (42.9%) | - | - |
| Explicitly connect class content to real-world examples and applications | 3.64 (.497) | 9 (64.3%) | 5 (35.7%) | - | - |
| Explicitly connect these real-world applications to STEM careers | 3.29 (.726) | 6 (42.9%) | 6 (42.9%) | 2 (14.3%) | - |
| Explicitly connect class content to how people in STEM careers use their knowledge to address societal impacts | 3.07 (.616) | 3 (21.4%) | 9 (64.3%) | 2 (14.3%) | - |
| Guide students to break complex global problems into their local and more actionable components | 3.00 (.679) | 3 (21.4%) | 8 (57.1%) | 3 (21.4%) | - |
| Guide students in refining problems | 3.57 (.514) | 8 (57.1%) | 6 (42.9%) | - | - |
| Guide students in planning investigations to better understand different components of problems | 3.64 (.497) | 9 (64.3%) | 5 (35.7%) | - | - |
| Provide opportunities for students to gather information about problems or issues of importance | 3.50 (.760) | 9 (64.3%) | 3 (21.4%) | 2 (14.3%) | - |
| Provide students with opportunities to explore multiple solution pathways for problems | 3.43 (.514) | 6 (42.9%) | 8 (57.1%) | - | - |
| Guide students in weighing the pros and cons of different solution pathways | 3.43 (.514) | 6 (42.9%) | 8 (57.1%) | - | - |
| Provide opportunities for students to test their solution pathways | 3.57 (.514) | 8 (57.1%) | 6 (42.9%) | - | - |
| Guide students in evaluating the results of their solution pathways | 3.64 (.497) | 9 (64.3%) | 5 (35.7%) | - | - |
| Provide students with opportunities to refine and retry a solution pathway | 3.64 (.497) | 9 (64.3%) | 5 (35.7%) | - | - |
| Provide opportunities for students to communicate their solution pathways and results to others | 3.36 (.497) | 5 (35.7%) | 9 (64.3%) | - | - |
| Provide opportunities for students to take responsibility for the decisions they made about the processes used in solving complex problems | 3.29 (.611) | 5 (35.7%) | 8 (57.1%) | 1 (7.1%) | - |

*Scale: Very Confident=4; Confident=3; Somewhat Confident=2; Not Confident=1


Table B-3-4. Cohort 1: Summary of Current Instructional Practices – Usage – June 2012 (Pre)

| Item | Use Regularly f (%) | Use Occasionally f (%) | Have Tried It f (%) | Never Used f (%) |
|---|---|---|---|---|
| Explicitly connect class content to complex problems or issues with global impact | - | 9 (56.3%) | 7 (43.8%) | - |
| Explicitly connect class content to real-world examples and applications | 6 (37.5%) | 9 (56.3%) | 1 (6.3%) | - |
| Explicitly connect these real-world applications to STEM careers | - | 3 (18.8%) | 8 (50.0%) | 5 (31.3%) |
| Explicitly connect class content to how people in STEM careers use their knowledge to address societal impacts | - | 2 (12.5%) | 7 (43.8%) | 7 (43.8%) |
| Guide students to break complex global problems into their local and more actionable components | - | 3 (18.8%) | 7 (43.8%) | 6 (37.5%) |
| Guide students in refining problems | 2 (12.5%) | 5 (31.3%) | 7 (43.8%) | 2 (12.5%) |
| Guide students in planning investigations to better understand different components of problems | 3 (18.8%) | 7 (43.8%) | 5 (31.3%) | 1 (6.3%) |
| Provide opportunities for students to gather information about problems or issues of importance | 1 (6.3%) | 5 (31.3%) | 9 (56.3%) | 1 (6.3%) |
| Provide students with opportunities to explore multiple solution pathways for problems | 3 (18.8%) | 6 (37.5%) | 5 (31.3%) | 2 (12.5%) |
| Guide students in weighing the pros and cons of different solution pathways | 1 (6.3%) | 7 (43.8%) | 6 (37.5%) | 2 (12.5%) |
| Provide opportunities for students to test their solution pathways | 1 (6.3%) | 7 (43.8%) | 7 (43.8%) | 1 (6.3%) |
| Guide students in evaluating the results of their solution pathways | - | 7 (43.8%) | 6 (37.5%) | 3 (18.8%) |
| Provide students with opportunities to refine and retry a solution pathway | 1 (6.3%) | 5 (31.3%) | 5 (31.3%) | 5 (31.3%) |
| Provide opportunities for students to communicate their solution pathways and results to others | - | 7 (43.8%) | 7 (43.8%) | 2 (12.5%) |
| Provide opportunities for students to take responsibility for the decisions they made about the processes used in solving complex problems | - | 3 (18.8%) | 9 (56.3%) | 4 (25.0%) |


Table B-3-5. Cohort 1: Summary of Current Instructional Practices – Usage – May 2013 (Post)

| Item | Use Regularly f (%) | Use Occasionally f (%) | Have Tried It f (%) | Never Used f (%) |
|---|---|---|---|---|
| Explicitly connect class content to complex problems or issues with global impact | 6 (42.9%) | 8 (57.1%) | - | - |
| Explicitly connect class content to real-world examples and applications | 6 (37.5%) | 9 (56.3%) | - | - |
| Explicitly connect these real-world applications to STEM careers | 5 (35.7%) | 9 (64.3%) | - | - |
| Explicitly connect class content to how people in STEM careers use their knowledge to address societal impacts | 4 (28.6%) | 10 (71.4%) | - | - |
| Guide students to break complex global problems into their local and more actionable components | 2 (14.3%) | 9 (64.3%) | 2 (14.3%) | 1 (7.1%) |
| Guide students in refining problems | 9 (64.3%) | 5 (35.7%) | - | - |
| Guide students in planning investigations to better understand different components of problems | 9 (64.3%) | 4 (28.6%) | 1 (7.1%) | - |
| Provide opportunities for students to gather information about problems or issues of importance | 5 (35.7%) | 8 (57.1%) | 1 (7.1%) | - |
| Provide students with opportunities to explore multiple solution pathways for problems | 9 (64.3%) | 4 (28.6%) | 1 (7.1%) | - |
| Guide students in weighing the pros and cons of different solution pathways | 6 (42.9%) | 6 (42.9%) | 2 (14.3%) | - |
| Provide opportunities for students to test their solution pathways | 4 (30.8%) | 8 (61.5%) | 1 (7.7%) | - |
| Guide students in evaluating the results of their solution pathways | 5 (35.7%) | 8 (57.1%) | 1 (7.1%) | - |
| Provide students with opportunities to refine and retry a solution pathway | 5 (38.5%) | 7 (53.8%) | 1 (7.7%) | - |
| Provide opportunities for students to communicate their solution pathways and results to others | 4 (28.6%) | 10 (71.4%) | - | - |
| Provide opportunities for students to take responsibility for the decisions they made about the processes used in solving complex problems | 4 (28.6%) | 9 (64.3%) | 1 (7.1%) | - |


Appendix B-4. 2013 Teacher Current Instructional Practices PRE Survey Results – Cohort 2


Table B-4-1. Cohort 2: Summary of Current Instructional Practices – Usage – May 2013 (Pre)

| Item | Use Regularly f (%) | Use Occasionally f (%) | Have Tried It f (%) | Never Used f (%) |
|---|---|---|---|---|
| Explicitly connect class content to complex problems or issues with global impact | 2 (8.3%) | 12 (50.0%) | 9 (37.5%) | 1 (4.2%) |
| Explicitly connect class content to real-world examples and applications | 8 (33.3%) | 13 (54.2%) | 3 (12.5%) | - |
| Explicitly connect these real-world applications to STEM careers | 2 (8.3%) | 7 (29.2%) | 11 (45.8%) | 4 (16.7%) |
| Explicitly connect class content to how people in STEM careers use their knowledge to address societal impacts | - | 6 (25.0%) | 10 (41.7%) | 8 (33.3%) |
| Guide students to break complex global problems into their local and more actionable components | - | 2 (8.3%) | 7 (29.2%) | 15 (62.5%) |
| Guide students in refining problems | 3 (12.5%) | 9 (37.5%) | 10 (41.7%) | 2 (8.3%) |
| Guide students in planning investigations to better understand different components of problems | 3 (12.5%) | 6 (25.0%) | 13 (54.2%) | 2 (8.3%) |
| Provide opportunities for students to gather information about problems or issues of importance | 3 (12.5%) | 10 (41.7%) | 9 (37.5%) | 2 (8.3%) |
| Provide students with opportunities to explore multiple solution pathways for problems | 4 (16.7%) | 11 (45.8%) | 8 (33.3%) | 1 (4.2%) |
| Guide students in weighing the pros and cons of different solution pathways | 4 (16.7%) | 10 (41.7%) | 9 (37.5%) | 1 (4.2%) |
| Provide opportunities for students to test their solution pathways | 2 (8.3%) | 5 (20.8%) | 14 (58.3%) | 3 (12.5%) |
| Guide students in evaluating the results of their solution pathways | 1 (4.2%) | 8 (33.3%) | 12 (50.0%) | 3 (12.5%) |
| Provide students with opportunities to refine and retry a solution pathway | - | 8 (33.3%) | 13 (54.2%) | 3 (12.5%) |
| Provide opportunities for students to communicate their solution pathways and results to others | 2 (8.3%) | 9 (37.5%) | 10 (41.7%) | 3 (12.5%) |
| Provide opportunities for students to take responsibility for the decisions they made about the processes used in solving complex problems | 1 (4.2%) | 6 (25.0%) | 12 (50.0%) | 5 (20.8%) |


Table B-4-2. Cohort 2: Summary of Current Instructional Practices – Confidence – 05/13 (Pre)

| Item | Mean* (Std. Dev.) | Very Confident f (%) | Confident f (%) | Somewhat Confident f (%) | Not Confident f (%) |
|---|---|---|---|---|---|
| Explicitly connect class content to complex problems or issues with global impact | 2.38 (.875) | 2 (8.3%) | 9 (37.5%) | 9 (37.5%) | 4 (16.7%) |
| Explicitly connect class content to real-world examples and applications | 2.71 (.624) | 2 (8.3%) | 13 (54.2%) | 9 (37.5%) | - |
| Explicitly connect these real-world applications to STEM careers | 2.25 (.794) | 1 (4.2%) | 8 (33.3%) | 11 (45.8%) | 4 (16.7%) |
| Explicitly connect class content to how people in STEM careers use their knowledge to address societal impacts | 2.04 (.806) | 1 (4.2%) | 5 (20.8%) | 12 (50.0%) | 6 (25.0%) |
| Guide students to break complex global problems into their local and more actionable components | 1.83 (.637) | - | 3 (12.5%) | 14 (58.3%) | 7 (29.2%) |
| Guide students in refining problems | 2.50 (.885) | 3 (12.5%) | 9 (37.5%) | 9 (37.5%) | 3 (12.5%) |
| Guide students in planning investigations to better understand different components of problems | 2.25 (.676) | - | 9 (37.5%) | 12 (50.0%) | 3 (12.5%) |
| Provide opportunities for students to gather information about problems or issues of importance | 2.42 (.717) | 1 (4.2%) | 10 (41.7%) | 11 (45.8%) | 2 (8.3%) |
| Provide students with opportunities to explore multiple solution pathways for problems | 2.38 (.647) | - | 11 (45.8%) | 11 (45.8%) | 2 (8.3%) |
| Guide students in weighing the pros and cons of different solution pathways | 2.46 (.779) | 1 (4.2%) | 12 (50.0%) | 8 (33.3%) | 3 (12.5%) |
| Provide opportunities for students to test their solution pathways | 2.42 (.830) | 2 (8.3%) | 9 (37.5%) | 10 (41.7%) | 3 (12.5%) |
| Guide students in evaluating the results of their solution pathways | 2.29 (.908) | 2 (8.3%) | 8 (33.3%) | 9 (37.5%) | 5 (20.8%) |
| Provide students with opportunities to refine and retry a solution pathway | 2.38 (.924) | 2 (8.3%) | 10 (41.7%) | 7 (29.2%) | 5 (20.8%) |
| Provide opportunities for students to communicate their solution pathways and results to others | 2.54 (.779) | 2 (8.3%) | 11 (45.8%) | 9 (37.5%) | 2 (8.3%) |
| Provide opportunities for students to take responsibility for the decisions they made about the processes used in solving complex problems | 2.33 (.917) | 2 (8.3%) | 9 (37.5%) | 8 (33.3%) | 5 (20.8%) |

*Scale: Very Confident=4; Confident=3; Somewhat Confident=2; Not Confident=1


Appendix C. Student Survey Results


Appendix C-1. 2012-13 Student Activity Feedback Form Results


Table C-1-1. 2012-2013 Units 1-2-3 Student Activity Feedback Forms (excluding Unit 3 Pilot)

| Item (n=1293) | Mean* (Std. Dev.) | Poor f (%) | Fair f (%) | Average f (%) | Good f (%) | Excellent f (%) |
|---|---|---|---|---|---|---|
| Overall, I would rate this lesson as… | 4.09 (.823) | 14 (1.2%) | 35 (2.9%) | 165 (13.8%) | 593 (49.6%) | 388 (32.5%) |

| Item (n=1293) | Mean* (Std. Dev.) | Strongly Disagree f (%) | Disagree f (%) | Neutral f (%) | Agree f (%) | Strongly Agree f (%) |
|---|---|---|---|---|---|---|
| I liked the activities we did in this lesson. | 4.09 (.818) | 11 (.9%) | 29 (2.3%) | 222 (17.5%) | 585 (46.1%) | 421 (33.2%) |
| The lesson was very well organized. | 4.13 (1.179) | 10 (.8%) | 39 (3.1%) | 225 (17.8%) | 520 (41.0%) | 473 (37.3%) |
| The teacher was able to explain the subject very easily. | 4.30 (1.651) | 8 (.6%) | 33 (2.6%) | 168 (13.3%) | 474 (37.5%) | 582 (46.0%) |
| The teacher encouraged us to ask questions. | 4.22 (1.403) | 11 (.9%) | 27 (2.1%) | 207 (16.4%) | 482 (38.2%) | 537 (42.5%) |
| The teacher was very good at answering our questions. | 4.33 (1.639) | 8 (.6%) | 17 (1.4%) | 175 (13.9%) | 456 (36.2%) | 603 (47.9%) |
| The group work was very interesting. | 3.88 (.998) | 30 (2.4%) | 75 (6.0%) | 305 (24.2%) | 453 (36.0%) | 396 (31.5%) |
| I learned a lot from this lesson. | 3.91 (.976) | 27 (2.1%) | 64 (5.1%) | 304 (24.2%) | 459 (36.5%) | 403 (32.1%) |
| I learned a lot from the teacher. | 4.07 (.882) | 14 (1.1%) | 38 (3.0%) | 250 (19.9%) | 499 (39.7%) | 455 (36.2%) |
| This lesson made me interested in learning more about Engineering. | 3.32 (1.252) | 114 (9.1%) | 219 (17.4%) | 364 (28.9%) | 274 (21.8%) | 288 (22.9%) |
| This lesson helped me feel more confident about studying math. | 3.27 (1.188) | 116 (9.2%) | 187 (14.9%) | 425 (33.8%) | 301 (24.0%) | 227 (18.1%) |
| This lesson helped me feel more confident about studying science. | 3.54 (1.110) | 60 (5.1%) | 138 (11.8%) | 347 (29.6%) | 370 (31.5%) | 259 (22.1%) |
| This lesson was different from other lessons I've had in this class. | 4.16 (.927) | 21 (1.7%) | 44 (3.5%) | 198 (15.7%) | 445 (35.2%) | 555 (43.9%) |

*Scale for first question: 5=Excellent; 4=Good; 3=Average; 2=Fair; 1=Poor. Scale for remaining questions: 5=Strongly Agree; 4=Agree; 3=Neutral; 2=Disagree; 1=Strongly Disagree
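As a consistency check, the 5-point means and standard deviations can be recomputed from the reported frequencies. Note that the frequencies in the first row sum to 1,195 rather than the stated n=1,293, which suggests the percentages are taken over valid (non-missing) responses. A short sketch using the "Overall" row:

```python
import math

# "Overall, I would rate this lesson as..." (Table C-1-1):
# 1=Poor, 2=Fair, 3=Average, 4=Good, 5=Excellent
freqs = {1: 14, 2: 35, 3: 165, 4: 593, 5: 388}

n = sum(freqs.values())  # 1195 valid responses (fewer than the 1293 forms)
mean = sum(v * f for v, f in freqs.items()) / n
sd = math.sqrt(sum(f * (v - mean) ** 2 for v, f in freqs.items()) / (n - 1))
print(n, round(mean, 2), round(sd, 3))  # 1195, 4.09 (.823), as reported
```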


Table C-1-2. 2012-2013 Unit 1 Student Activity Feedback Forms

| Item (n=382) | Mean* (Std. Dev.) | Poor f (%) | Fair f (%) | Average f (%) | Good f (%) | Excellent f (%) |
|---|---|---|---|---|---|---|
| Overall, I would rate this lesson as… | 4.15 (.826) | 5 (1.4%) | 6 (1.7%) | 51 (14.2%) | 164 (45.8%) | 132 (36.9%) |

| Item (n=382) | Mean* (Std. Dev.) | Strongly Disagree f (%) | Disagree f (%) | Neutral f (%) | Agree f (%) | Strongly Agree f (%) |
|---|---|---|---|---|---|---|
| I liked the activities we did in this lesson. | 4.21 (.764) | 3 (.8%) | 3 (.8%) | 53 (13.9%) | 176 (46.1%) | 147 (38.5%) |
| The lesson was very well organized. | 4.15 (.888) | 4 (1.0%) | 11 (2.9%) | 69 (18.1%) | 138 (36.1%) | 160 (41.9%) |
| The teacher was able to explain the subject very easily. | 4.35 (.796) | 3 (.8%) | 9 (2.4%) | 32 (8.4%) | 144 (37.8%) | 193 (50.7%) |
| The teacher encouraged us to ask questions. | 4.20 (.899) | 5 (1.3%) | 11 (2.9%) | 60 (15.8%) | 132 (34.7%) | 172 (45.3%) |
| The teacher was very good at answering our questions. | 4.35 (.787) | 3 (.8%) | 6 (1.6%) | 38 (10.0%) | 140 (36.9%) | 192 (50.7%) |
| The group work was very interesting. | 3.96 (.995) | 8 (2.1%) | 22 (5.8%) | 80 (21.2%) | 134 (35.4%) | 134 (35.4%) |
| I learned a lot from this lesson. | 3.99 (1.018) | 10 (2.7%) | 17 (4.5%) | 85 (22.7%) | 118 (31.5%) | 145 (38.7%) |
| I learned a lot from the teacher. | 4.13 (.881) | 4 (1.1%) | 11 (2.9%) | 68 (18.0%) | 144 (38.2%) | 150 (39.8%) |
| This lesson made me interested in learning more about Engineering. | 3.53 (1.267) | 29 (7.7%) | 57 (15.1%) | 89 (23.6%) | 91 (24.1%) | 111 (29.4%) |
| This lesson helped me feel more confident about studying math. | 3.33 (1.248) | 35 (9.3%) | 59 (15.6%) | 117 (31.0%) | 79 (21.0%) | 87 (23.1%) |
| This lesson helped me feel more confident about studying science. | 3.77 (1.126) | 15 (4.0%) | 37 (9.8%) | 91 (24.2%) | 111 (29.5%) | 122 (32.4%) |
| This lesson was different from other lessons I've had in this class. | 4.24 (.892) | 5 (1.3%) | 11 (2.9%) | 53 (13.9%) | 129 (33.9%) | 182 (47.9%) |

*Scale for first question: 5=Excellent; 4=Good; 3=Average; 2=Fair; 1=Poor. Scale for remaining questions: 5=Strongly Agree; 4=Agree; 3=Neutral; 2=Disagree; 1=Strongly Disagree


Table C-1-3. 2012-2013 Unit 2 Student Activity Feedback Forms

| Item (n=525) | Mean* (Std. Dev.) | Poor f (%) | Fair f (%) | Average f (%) | Good f (%) | Excellent f (%) |
|---|---|---|---|---|---|---|
| Overall, I would rate this lesson as… | 4.03 (.820) | 6 (1.2%) | 19 (3.8%) | 68 (13.6%) | 269 (53.9%) | 137 (27.5%) |

| Item (n=525) | Mean* (Std. Dev.) | Strongly Disagree f (%) | Disagree f (%) | Neutral f (%) | Agree f (%) | Strongly Agree f (%) |
|---|---|---|---|---|---|---|
| I liked the activities we did in this lesson. | 3.97 (.832) | 5 (1.0%) | 18 (3.4%) | 106 (20.2%) | 255 (48.7%) | 140 (26.7%) |
| The lesson was very well organized. | 4.14 (1.524) | 4 (.8%) | 17 (3.3%) | 95 (18.2%) | 224 (42.8%) | 183 (35.0%) |
| The teacher was able to explain the subject very easily. | 4.35 (2.373) | 2 (.4%) | 14 (2.7%) | 79 (15.1%) | 181 (34.7%) | 245 (46.9%) |
| The teacher encouraged us to ask questions. | 4.29 (1.931) | 4 (.8%) | 8 (1.5%) | 90 (17.2%) | 193 (40.0%) | 227 (43.5%) |
| The teacher was very good at answering our questions. | 4.39 (2.371) | 4 (.8%) | 7 (1.3%) | 77 (14.8%) | 176 (33.8%) | 256 (49.2%) |
| The group work was very interesting. | 3.81 (1.028) | 16 (3.1%) | 33 (6.3%) | 138 (26.4%) | 180 (34.5%) | 155 (29.7%) |
| I learned a lot from this lesson. | 3.87 (.975) | 12 (2.3%) | 30 (5.7%) | 123 (23.6%) | 204 (39.1%) | 153 (29.3%) |
| I learned a lot from the teacher. | 4.02 (.888) | 6 (1.1%) | 16 (3.1%) | 117 (22.4%) | 207 (39.6%) | 177 (33.8%) |
| This lesson made me interested in learning more about Engineering. | 3.21 (1.279) | 57 (10.9%) | 100 (19.2%) | 153 (29.3%) | 100 (19.2%) | 112 (21.5%) |
| This lesson helped me feel more confident about studying math. | 3.31 (1.159) | 45 (8.6%) | 68 (13.1%) | 181 (34.7%) | 136 (26.1%) | 91 (17.5%) |
| This lesson helped me feel more confident about studying science. | 3.39 (1.112) | 31 (6.6%) | 62 (13.2%) | 149 (31.7%) | 149 (31.7%) | 79 (16.8%) |
| This lesson was different from other lessons I've had in this class. | 4.03 (.986) | 12 (2.3%) | 25 (4.8%) | 98 (18.8%) | 186 (35.6%) | 201 (38.5%) |

*Scale for first question: 5=Excellent; 4=Good; 3=Average; 2=Fair; 1=Poor. Scale for remaining questions: 5=Strongly Agree; 4=Agree; 3=Neutral; 2=Disagree; 1=Strongly Disagree


Table C-1-4. 2012-2013 UNIT 3 Student Activity Feedback Forms (excluding Unit 3 Pilot), n=385

Item | Mean* (Std. Dev.) | Poor f (%) | Fair f (%) | Average f (%) | Good f (%) | Excellent f (%)
Overall, I would rate this lesson as… | 4.13 (.819) | 3 (.9%) | 10 (3.0%) | 46 (13.6%) | 160 (47.3%) | 119 (35.2%)

Item | Mean* (Std. Dev.) | Strongly Disagree f (%) | Disagree f (%) | Neutral f (%) | Agree f (%) | Strongly Agree f (%)
I liked the activities we did in this lesson. | 4.13 (.832) | 3 (.8%) | 8 (2.2%) | 63 (17.4%) | 154 (42.5%) | 134 (37.0%)
The lesson was very well organized. | 4.11 (.829) | 2 (.6%) | 11 (3.0%) | 61 (16.9%) | 158 (43.6%) | 130 (35.9%)
The teacher was able to explain the subject very easily. | 4.16 (.845) | 3 (.8%) | 10 (2.8%) | 57 (15.7%) | 149 (41.2%) | 143 (39.5%)
The teacher encouraged us to ask questions. | 4.16 (.807) | 2 (.6%) | 8 (2.2%) | 57 (15.7%) | 157 (43.4%) | 138 (38.1%)
The teacher was very good at answering our questions. | 4.23 (.784) | 1 (.3%) | 4 (1.1%) | 60 (16.7%) | 140 (38.9%) | 155 (43.1%)
The group work was very interesting. | 3.89 (.951) | 6 (1.7%) | 20 (5.6%) | 87 (24.2%) | 139 (38.7%) | 107 (29.8%)
I learned a lot from this lesson. | 3.89 (.929) | 5 (1.4%) | 17 (4.7%) | 96 (26.7%) | 137 (38.1%) | 105 (29.2%)
I learned a lot from the teacher. | 4.08 (.873) | 4 (1.1%) | 11 (3.1%) | 65 (18.3%) | 148 (41.6%) | 128 (36.0%)
This lesson made me interested in learning more about Engineering. | 3.26 (1.171) | 28 (7.8%) | 62 (17.2%) | 122 (33.9%) | 83 (23.1%) | 65 (18.1%)
This lesson helped me feel more confident about studying math. | 3.15 (1.158) | 36 (10.1%) | 60 (16.8%) | 127 (35.5%) | 86 (24.0%) | 49 (13.7%)
This lesson helped me feel more confident about studying science. | 3.48 (1.049) | 14 (4.3%) | 39 (11.9%) | 107 (32.6%) | 110 (33.5%) | 58 (17.7%)
This lesson was different from other lessons I’ve had in this class. | 4.27 (.851) | 4 (1.1%) | 8 (2.2%) | 47 (13.0%) | 130 (36.0%) | 172 (47.6%)

*Scale for first question: 5=Excellent; 4=Good; 3=Average; 2=Fair; 1=Poor. Scale for remaining questions: 5=Strongly Agree; 4=Agree; 3=Neutral; 2=Disagree; 1=Strongly Disagree.
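The reported means and standard deviations can be reproduced from the frequency columns alone. As an illustrative check (not part of the original analysis), the sketch below recovers the 4.13 (.819) reported for the first item in Table C-1-4, assuming the sample (n-1) standard deviation is used and that only the 338 students who answered the item (of the 385 surveyed) are counted:

```python
# Illustrative check: recompute the mean and SD of a Likert item from its
# frequency distribution. Frequencies are from Table C-1-4, first item,
# scored 1 (poor) through 5 (excellent) per the table footnote.
from math import sqrt

freqs = {1: 3, 2: 10, 3: 46, 4: 160, 5: 119}  # poor..excellent counts

n = sum(freqs.values())                       # 338 respondents answered
mean = sum(v * f for v, f in freqs.items()) / n
# Sample (n-1) variance, the convention assumed for the reported values
var = sum(f * (v - mean) ** 2 for v, f in freqs.items()) / (n - 1)
sd = sqrt(var)

print(round(mean, 2), round(sd, 3))  # 4.13 0.819, matching the table
```

The same arithmetic applies to every row in Tables C-1-4 through C-1-7; row percentages are likewise computed against the number of students who answered that item, not the full survey n.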


Table C-1-5. 2012-2013 UNIT 3 Pilot Big Idea Student Survey, n=213

Item | Mean (Std. Dev.) | Strongly Disagree f (%) | Disagree f (%) | Agree f (%) | Strongly Agree f (%)
During this discussion, my ideas were considered. | 3.23 (.690) | 5 (2.4%) | 16 (7.6%) | 114 (54.3%) | 75 (35.7%)
I understand the “Essential Questions” identified. | 3.33 (.672) | 2 (1.0%) | 18 (8.6%) | 99 (47.1%) | 91 (43.3%)
I understood which “Essential Question” was selected to define the “challenge.” | 3.24 (.686) | 2 (1.0%) | 24 (11.5%) | 105 (50.5%) | 77 (37.0%)
I understood how the “Guiding Questions” are framed to help us solve the “challenge” selected. | 3.28 (.629) | 2 (1.0%) | 14 (6.7%) | 116 (55.5%) | 77 (36.8%)
I understood how to go from the “Big Idea” to the “Guiding Questions.” | 3.17 (.731) | 3 (1.4%) | 32 (15.3%) | 101 (48.3%) | 73 (34.9%)
I am excited about finding a solution to this “challenge.” | 3.15 (.792) | 8 (3.8%) | 28 (13.3%) | 98 (46.7%) | 76 (36.2%)
I received guidance from my teacher when I asked for it. | 3.41 (.609) | 1 (.5%) | 10 (4.9%) | 98 (47.6%) | 97 (47.1%)
I helped choose the team I will work on. | 2.41 (1.172) | 65 (31.3%) | 45 (21.6%) | 46 (22.1%) | 52 (25.0%)


Table C-1-6. 2012-2013 UNIT 3 Pilot Student Activity Survey, n=198

Item | Mean (Std. Dev.) | Strongly Disagree f (%) | Disagree f (%) | Agree f (%) | Strongly Agree f (%)
I was able to relate all aspects of this activity to my real-life. | 2.76 (.740) | 12 (6.1%) | 47 (23.7%) | 115 (58.1%) | 24 (12.1%)
There are many solutions to this problem. | 3.27 (.644) | 2 (1.0%) | 15 (7.6%) | 107 (54.3%) | 73 (37.1%)
The way we selected the “best” solution fit with my real-world experiences. | 2.89 (.722) | 7 (3.6%) | 41 (21.4%) | 111 (57.8%) | 33 (17.2%)
I understand how the engineering design process activity allowed us to use the guiding questions to solve the challenge selected. | 3.20 (.682) | 2 (1.0%) | 24 (12.2%) | 104 (52.8%) | 67 (34.0%)
We were able to test our initial solution. | 3.36 (.720) | 4 (2.0%) | 16 (8.2%) | 82 (41.8%) | 94 (48.0%)
After this initial test, we were able to think about changes we wanted to make to have a better solution to the challenge. | 3.16 (.725) | 3 (1.5%) | 29 (14.9%) | 97 (49.7%) | 66 (33.8%)
I know how this challenge, and its solution, could be solved by professionals and what jobs these professionals would work in. | 3.26 (.742) | 6 (3.1%) | 17 (8.7%) | 94 (48.0%) | 79 (40.3%)
The way we solved this challenge can be used to solve problems we have in our society or community. | 2.93 (.801) | 9 (4.6%) | 43 (21.9%) | 97 (49.5%) | 47 (24.0%)
The challenge’s solution incorporated applications from real-life into the classroom. | 3.07 (.738) | 4 (2.1%) | 34 (17.5%) | 100 (51.5%) | 56 (28.9%)
I am excited that we found a solution to this challenge. | 3.12 (.768) | 6 (3.1%) | 29 (14.8%) | 96 (49.0%) | 65 (33.2%)
I received guidance from my teacher when I asked for it. | 3.44 (.709) | 4 (2.0%) | 13 (6.6%) | 72 (36.5%) | 108 (54.8%)


Table C-1-7. 2012-2013 UNIT 3 Pilot Student Feedback Survey, n=205

Item | Mean* (Std. Dev.) | Poor f (%) | Fair f (%) | Average f (%) | Good f (%) | Excellent f (%)
Overall, I would rate this unit as… | 3.97 (.846) | 3 (1.7%) | 4 (2.3%) | 35 (20.1%) | 86 (49.4%) | 46 (26.4%)

Item | Mean* (Std. Dev.) | Strongly Disagree f (%) | Disagree f (%) | Neutral f (%) | Agree f (%) | Strongly Agree f (%)
We received guidance from our teacher when we asked for it. | 4.20 (.771) | 1 (.5%) | 2 (1.0%) | 32 (15.6%) | 86 (43.4%) | 81 (39.5%)
I learned a lot. | 4.06 (.906) | 3 (1.5%) | 7 (3.4%) | 39 (19.0%) | 82 (40.0%) | 74 (36.1%)
I helped to solve a part of a big problem. | 4.08 (.920) | 3 (1.5%) | 8 (3.9%) | 37 (18.0%) | 79 (38.5%) | 78 (38.0%)
I worked harder on this unit than I usually do on school work. | 3.58 (1.043) | 7 (3.4%) | 22 (10.7%) | 65 (31.7%) | 68 (33.2%) | 43 (21.0%)
I felt like I was doing something important. | 3.67 (1.054) | 8 (3.9%) | 18 (8.8%) | 56 (27.5%) | 74 (36.3%) | 48 (23.5%)
I contributed to the group’s solution. | 4.38 (.769) | 1 (.5%) | 6 (2.9%) | 12 (5.9%) | 81 (39.7%) | 104 (51.0%)
Listening to other students’ ideas was an important part of this unit. | 4.12 (.939) | 5 (2.5%) | 6 (3.0%) | 30 (14.8%) | 80 (39.4%) | 82 (40.4%)
I like problems best when they really make me think. | 3.66 (1.205) | 19 (9.3%) | 14 (6.8%) | 39 (19.0%) | 79 (38.5%) | 54 (26.3%)
I feel using challenges is a more effective way to learn than the way we are usually taught. | 4.00 (.926) | 2 (1.0%) | 9 (4.4%) | 48 (23.5%) | 73 (35.8%) | 72 (35.3%)
How this unit was taught (challenge-based learning approach) enhanced my interest and desire to learn. | 3.77 (.981) | 6 (2.9%) | 11 (5.4%) | 58 (28.3%) | 79 (38.5%) | 51 (24.9%)
This unit made me interested in learning more about Engineering. | 3.25 (1.216) | 23 (11.3%) | 27 (13.2%) | 65 (31.9%) | 54 (26.5%) | 35 (17.2%)
This unit helped me feel more confident about studying math or science. | 3.41 (1.043) | 8 (3.9%) | 29 (14.1%) | 71 (34.6%) | 64 (31.2%) | 33 (16.1%)
The unit was different from other lessons I’ve had in this class. | 3.99 (.915) | 3 (1.5%) | 10 (4.9%) | 39 (19.0%) | 88 (42.9%) | 65 (31.7%)
I was able to learn how the unit’s content relates to what I see in the real world. | 3.77 (.992) | 7 (3.4%) | 11 (5.4%) | 53 (26.0%) | 83 (40.7%) | 50 (24.4%)
I learned about the jobs people have that use the knowledge taught in this unit. | 3.80 (1.033) | 6 (2.9%) | 17 (8.3%) | 46 (22.5%) | 78 (38.2%) | 57 (27.9%)

*Scale for first question: 5=Excellent; 4=Good; 3=Average; 2=Fair; 1=Poor. Scale for remaining questions: 5=Strongly Agree; 4=Agree; 3=Neutral; 2=Disagree; 1=Strongly Disagree.


Appendix D. Examples of Student Work


Figure 1. Student Work Examples for Rockslides and Seismographs


Figure 2. Student Work Examples for Lunar Rover, Subduction Zone Housing, Energy Through Toys and Rockets


Figure 3. Student Work Examples for Parachutes, Slippery Slopes, and Using Math to Build Bridges


Figure 4. Student Work Examples for Extreme Makeover Highway Edition, Build a Better Product, iPhone Case, Stunt Design, and Design a Sign


Figure 5. Student Work Examples for Your Brain on Sports – You Only Get One, Renewable Energy


Appendix E. 2013 STEM Conference Evaluation Results


Table E-1. 2013 STEM Conference Evaluation Results – Descriptive Statistics for Ratings

Item | N | Mean* | Std. Dev.
1. OVERALL, this event has aided my understanding of STEM education. | 133 | 3.50 | 0.62
2. OVERALL, the information presented will be USEFUL in my future educational activities. | 132 | 3.45 | 0.58
3. OVERALL, the information presented increased my understanding of how real-world applications of STEM impact students’ lives. | 132 | 3.37 | 0.61
4. OVERALL, the information presented increased my understanding of STEM careers and fields of study. | 131 | 3.08 | 0.69
5. OVERALL, the information presented provided me with ideas of how to connect STEM to societal impacts. | 132 | 3.17 | 0.64
6. At the conclusion of this conference, I have a better understanding of how challenge-based learning can be used to enhance mathematics and science learning. | 133 | 3.38 | 0.64
7. At the conclusion of this conference, I had a better understanding of how the engineering design process can be used to enhance mathematics and science learning. | 132 | 3.23 | 0.70
8. The event provided me opportunities to learn about current STEM initiatives. | 128 | 3.45 | 0.60
9. The STEM conference provided opportunities to collect STEM activities that can be implemented in my own teaching or learning environment. | 126 | 3.46 | 0.60
10. The conference facilitated collaboration across institutions or schools. | 129 | 3.33 | 0.66
11. The breadth of the session options was a positive aspect of the conference. | 129 | 3.42 | 0.63
12. The length of the sessions was adequate. | 129 | 3.27 | 0.72
13. I had access to the wireless internet when I needed it. | 113 | 3.33 | 0.76
14. The location - Tangeman University Center at the University of Cincinnati - was an appropriate venue for this type of event. | 133 | 3.50 | 0.62

*Question 1 Scale: 4=Very Good; 3=Satisfactory; 2=Marginally; 1=Poor. Questions 2-14 Scale: 4=Strongly Agree; 3=Agree; 2=Disagree; 1=Strongly Disagree.


Table E-2. Written Responses to the following: “What did you value most about the 2013 Get REAL! STEM Education Brings REAL Challenges, REAL Connections, REAL Life event? Why?”

RESOURCES (ACTIVITIES / TECHNIQUES / DEMONSTRATIONS) TO BRING BACK (n=31)
• Activities that were engaging, grade level appropriate and challenged based. There was so much inquiry with the lessons.
• Better understanding of how to implement engineering in my science classroom.
• Demonstrations
• Getting to see what other teachers are doing in their classrooms because it gave me ideas for future lessons.
• Greater sense of awareness of community and support and great ideas for the classroom - very motivating.
• Enthusiasm and preparedness of presenters! I got "hooked" on STEM! I was loaded with ideas to bring home and actually implemented some of them already today! Thanks soooooooo much!
• Seeing STEM activities in action - having an opportunity to experience the activities firsthand.
• Having an opportunity to try out various lessons within the conference.
• I really enjoyed the sessions with people presenting what worked in their class or demonstrated actual lessons.
• Hearing from other teachers as they do things in their classrooms
• Ready to implement lesson ideas.
• Practical applications (especially in nanotechnology due to the sessions I attended) that I have already used in class!
• Strategies to incorporate STEM into my classroom.
• I value the presentations and the tried and tested lesson plans that I can adapt for my classroom.
• Workshops that presented appropriate materials that I can utilize in my classroom.
• I loved the hands-on aspect of many of the workshops I attended. It was nice to hear directly from teachers who were using these methods in the classroom and to see real examples of how they were running their day-to-day activities with their students. Being able to bring things back that I can use directly in my classroom was extremely beneficial.
• I enjoyed all of the STEM activities that were provided to help teachers implement STEM in the classroom.
• Resources for use in classroom
• The access to all of the resources was amazing. I plan to share and use them in my own classroom.
• The activities that could be incorporated in my high school math classes. I found them to be intriguing to students which should keep them motivated and engaged in learning.
• Practical lesson plans that I can implement soon in my classroom.
• The examples of lessons provided in some of the break-out sessions
• The hands on demonstrations and activities.
• This was the first conference I've attended where every single session was useful and appropriate for the age level I teach. The presenters did a great job of making the material available and easily adaptable in my classroom.
• I loved the teacher led sessions. I love it when a teacher loves a lesson or idea and wants to share it with others.
• As a teacher, I was hoping to walk away with a wealth of knowledge for bettering my classroom.
• Many presenters were actual classroom teachers who have been working on incorporating STEM in the classroom. This made it more real for me and not such an overwhelming new task.
• This gives me something I can use readily in my classroom
• There were examples that showed me some of what I am already doing is STEM applicable
• Good sessions for K-6 STEM
• Great event! Very useful for the classroom and practical. It didn't require me to completely change the way that I teach.

NETWORKING (n=19)
• Networking.
• Ability to network
• As a university professor and administrator, I wasn't really the target audience, but I found it helpful to … network with others working in STEM education.
• Enjoyed the opportunity to meet new people
• Have you thought about doing it yearly to keep the connections and networking strong? It makes a difference when you meet people in person.
• I liked making connections with other teachers in the area because they can be a great resource for me professionally.
• Networking with others about the development of real STEM.
• Being on the UC campus and getting a better understanding of how UC works with math and science teachers.
• Having an opportunity to speak with other educators and hearing what is happening in their schools and classrooms in relation to STEM education.
• Interaction with individuals
• Interactions with other teachers that are experiencing similar concerns
• I valued the opportunity to meet a significant number of teachers from across school districts.
• The interactions
• The advantage of these meeting is above what is learned in the workshops. It is the opportunity to network and share insights and ideas with others.
• The connections our organization was able to make with other groups.
• The sharing of lessons with others
• The opportunity to collaborate
• Also time to talk with colleagues.
• Making connections with others.

VARIETY OF PRESENTATIONS (n=15)
• Lots of options for learning!
• Appreciate an event that has a variety of presenters and participants with different perspectives.
• good variety
• I liked hearing different points of views.
• The variety of content. Because it allowed me to choose whatever interested me.
• Seeing the vast diversity in presenters and participants.
• The selection of topics was very relevant.
• Variety of sessions.
• Variety of courses offered.
• The option to pick and choose which session to attend
• the range of information and potential sources to use
• The range of sessions was applicable to a broad range of grades and applications.
• There were LOTS of great ideas shared at the conference.
• Lots of sessions I wanted to attend so it was tricky to choose which to go to during certain time slots.
• The session variety and organization in to "Interactives" "Panel Discussions" Etc.

LEARN ABOUT STEM IN THE REGION (n=13)
• As a university professor and administrator, I wasn't really the target audience, but I found it helpful to learn about STEM careers …
• I came to discover how STEM was influencing schools and the partnerships that have been involved. I was able to successfully ascertain that it is impacting teaching and learning and people are still serious about making STEM work.
• I loved the explanations and now have a better understanding of what STEM is all about.
• I thought the idea behind the conference was great.
• I was energized by the topics and professionals presenting. I felt informed and spoken to like a colleague and professional. I would come again and recommend this conference to anyone.
• I valued the input from teachers in the field regarding how informal learning centers can better address STEM for their students
• I was happy to become aware of some opportunities that might be available to my district in the future.
• It validated my plans for how I want to run my class room in my school's proposed STEM Academy. Everything I heard teachers were doing aligned right with my plan.
• It was an opportunity to see what is going on in the region, and compare practice to my own.
• Learning resources to help aid instruction
• Learning that STEM was alive and well in the Cincinnati area.
• Seeing actual projects and initiatives that are taking place.
• Conference was a success! Each time we have a STEM conference it always provides opportunities to learn more about STEM. Very organized, interactive and engaging.

IDENTIFIED SPECIFIC SESSIONS (n=8)
• Engineering Our Future - all the speakers were really excited about the topic.
• I valued the session by Simonson. The data gave insight to what I should be doing as a teacher to prepare my students for college.
• I valued the Holified session as well.
• I enjoyed the team from Kentucky. The ideas were powerful and I could have doubled my time with them. I was able to see them for two sessions.
• The Butler Tech Wind Kids presentation.
• Loved the Lego Test Dummy session and EIE lessons
• The iPad session was very helpful.
• The NKU center CINSAM sessions were the only ones that I found to be valuable.

LEARNING ABOUT CHALLENGE-BASED LEARNING and DESIGN PROCESS (n=6)
• I attended the STEM conference to learn how to incorporate the design process into my science classes. I was very pleased that the programs I attended helped me to accomplish that goal.
• problem and challenge based learning
• Real Life Scenarios
• Opportunities to see challenge-based learning in action and in practice.
• The opportunity to learn about the implementation of challenge based learning.
• overall knowledge gained

NGSS (NEXT GENERATION SCIENCE STANDARDS) (n=6)
• A look at NGSS.
• The session on NGSS: Structure, Content, and Implications. The presenter did an excellent job involving the audience through an activity.
• I attended the sessions addressing the standards and common core. They were extremely helpful and relevant. We are beginning to implement both of these into our curriculum and I feel like I have a better handle of these things for science.
• Learning about the NGSS and how I can enhance my curriculum to meet the engineering aspect.
• You had people who addressed exactly where we are in education with our standards nationally and locally. Then we had speakers who modeled and connected those with an activity at many different levels. Then the technology resources.
• I get real applications as well as theoretical aspects of STEM which will help me develop lessons for the new common core standards.

KEYNOTE SPEAKERS (n=6)
• I really enjoyed listening to Dr. Tonya Matthews. Very inspiring.
• Dr. Matthews’s speech was inspiring.
• The Keynote speaker, Dr. Matthews was great!
• I really enjoyed the morning keynote speaker - excellent
• The guest speaker was an excellent choice. She reminded me of why we teach inquiry based learning and ignited me in the classroom.
• Fantastic keynote, nice to hear from UC administrators.

FLASH DRIVE (n=5)
• flash drive of content
• Having materials offered on the flash drive was very nice!!
• I also really appreciated getting the flash drive with all documents - an excellent resource!
• loved that all information was given on a thumb drive to we can refer back to it and review or look at information that was presented in a different session
• I liked that we were given a flash drive with the information and that I came away with things I can use right away.

CONVENIENT CONFERENCE (n=3)
• Affordable for everyone!
• I really appreciated the conference being held locally. It was much easier to get approval to come.
• I was glad that it was during the school day and could be used as a professional day.

WORKING LUNCH (n=2)
• I liked the working lunch where we were able to talk about how we could make stem work for our school.
• I especially enjoyed the lunch time being long enough to discuss sessions we had been to and to have some preliminary collaboration time.

PRESENTATION TIMING (n=1)
• This conference was timed well, that means that I felt like I could go to the bathroom and still get to my sessions on time.

NOTHING (n=1)
• Honestly, I didn't bring home much of any value.


Table E-3. Written Responses to the following: “Do you have any suggestions for improvement for our next STEM event?” _____________________________________________________________________________________ FOCUS OF SESSIONS (n=20) • More math applications. • More Math rich relevant activities • More on PBL. • More presenters for Elementary Teachers. We need to start the design processes so much earlier.

Having sessions that tie into the Core Curriculum would be great. • More profiles on careers related to stem, and connections to businesses • More research-based, nationally recognized sessions rather than current classroom teachers telling

about their 1st year's STEM experience. • more technology components • I would like the more general sessions-technology, resources--to go more in depth and let us try the

resources • I would like to attend sessions that help curriculum personnel design STEM programs in their districts. • I would also like to see how districts with business connections were able to establish those

collaborations in order to build a successful STEM program. • Maybe narrow the focus a bit. There was just SO much information to take in. • More hands-on sessions like CINSAM that gave solid examples of activities for classroom use. They

also gave DVD's with their information to be reused in the classroom. • Most of the sessions stated k-12. I feel that you need to gear activities to specific grade levels. • Much of what was presented seemed dated. Get innovative leading-edge educators. • Need more specific presentations, example lessons. Dive into the content a little more. • Half of the speakers were just talking and didn’t adequately explain how this impacted stem. • I was disappointed in the quality of the activities that the teachers were incorporating into their

classrooms and showing us as STEM-appropriate. • Some of the activities I went to would have been better as hands on • The [sessions] need to be much longer and in much more detail. Presenters must explain how they

are implementing STEM into their classrooms, not just showing samples of the work. They need to be honest about all the obstacles that could prevent standards for being completed due to all the STEM projects being implemented in the classroom. Time is a huge factor, and none of the presenters were truly open about the time commitment.

• Having instructors make a commitment to try specific STEM activities in the classroom • At science conventions I usually desire lots of options and breakout sessions. However, I felt like the

sessions I attended really lacked any valuable applications to the classroom. Everything I saw was just something fun to do in class--an isolated event or activity that lacked any true learning objectives. Maybe that's where I'm missing the point? Is the point of STEM just to allow a student to experiment and design with no content connection? I know there are engineering classes out there that are solid (such as Kings HS with Jason Shields). Maybe these are the type of sessions that should be presented--not a variety of cute isolated activities that are FUN but pointless. It seems the process of finding presenters needs to be more selective and target toward solid examples of excellence.

VENUE / LOGISTICS / PROGRAM (n=11) • Give a suggestion as to which parking garage to use and directions for how to walk to the building

from the garage! • The Tangeman Center was an "appropriate" venue for the conference but the session rooms were

very noisy; I heard almost as much from across the room dividers and outside the doors as I did from within and found these noises overpowered speakers' voices quite a bit. Also, I know it may be difficult, logistically, to move off campus and probably not worth the tradeoffs, but UC's campus can be difficult to get to and around compared to other venues.

• The morning registration was a little awkward.

Page 123: NSF MSP: CEEMS Project – Year Two - ceas.uc.edu CEEMS Eval... · lesson development, teacher professional development and classroom implementation. There were 15 in-service teachers

CEEMS – Year 2 Evaluation 120

• The sign in process for each session was unwieldy in the larger groups. It wasn't always clear where or with whom the sig in sheets were.

• Wireless (UCGUEST) was not very good... was in an iPad APP session and couldn't access any apps...

• Make sure that the flyer has an address and zip code to log into the GPS when a participant is coming out of town. The flyer just said Tangeman University Center.

• I like the time directly linked to the title and the room in the booklet. Provides quick clarity! • A conference earlier in the school year would be helpful to teachers. • I would have liked to have had a more detailed summary of each session. It was difficult to pick from

the list with the vague information. • Larger program format and more info • Make sure presenters give an accurate description of their session. SESSIONS TOO SHORT (n=11) • Sessions weren't long enough. • Have some sessions a double block so you could really get into the subject matter in more depth. • Two presentations had to be combined into the same session time and locations which made for very

hurried explanations and activities. • I wish that each session could be a little longer so we would have more time to use the information in

our planning process. • Interactive sessions should be longer -- perhaps 45 minutes to an hour. • I like the session time, but with them trying to be interactive, it seemed like they needed 15 more

minutes. • Less time spent on introductions, more time for each session • Longer break-out sessions. • I'd make less introductions and speeches in the morning and give some more time to the sessions. • Interactive sessions need to be more than 30 minutes! I went to several sessions where the

presenters were very rushed and we did not get to "do" as much as I think they wanted to. • Perhaps make sessions 75 minutes each COORDINATION AND NUMBER OF SESSIONS (n=9) • Better coordination of grade level curriculum and breakout sessions. Three of the sessions that related

to current 8th grade standards were provided during the same session. However, none were available during two of the sessions.

• Duplicate tracks - hard to schedule the ones of most interest. Maybe more focused menu would help • Figure out how to be able to attend more of the sessions. There were many sessions that I could not

attend that I would have liked to. • Maybe squeeze in one more session in the morning? Start a bit earlier and end a wee bit later or make

lunch shorter. • I would have loved one or two more breakout sessions • It would be better if the conference could be a two-day event with repeats during different session

times (maybe with an option of registering for only one day). There were MANY workshops that I wanted to attend, but could not because they conflicted with other workshops being offered at the same time and all of the materials were not provided on the flash drive.

• Offer it over several days in order to attend more workshops • Start a little earlier so that we can attend more sessions • There were so many wonderful sessions to attend that it was difficult to choose. Sometimes there

were 2-3 sessions I wanted to attend that were scheduled simultaneously. Having people select their choices in advance of the conference and sending that information to those in charge of the event would help. Then it would be known which sessions might need to be offered more than once during the course of the day.

WORKING LUNCH (n=7) • Not enough time for the working part of the working lunch.

Page 124: NSF MSP: CEEMS Project – Year Two - ceas.uc.edu CEEMS Eval... · lesson development, teacher professional development and classroom implementation. There were 15 in-service teachers

CEEMS – Year 2 Evaluation 121

• Do not offer a "working lunch" - this is a time when teachers need a chance to meet other teachers and process the events of the day, not listen to people talk at me while I eat.

• The working lunch was a nice idea, unfortunately, it gave me less time to talk and network with new people

• No working lunch- Teachers need a chance to be off task and chat with peers without everything being facilitated.

• No working lunches!!! I hated that people were talking while we were trying to Network!
• PLEASE PLEASE PLEASE... FEWER UNNECESSARY SPEAKERS in the main hall! Our "working lunch" would have been MUCH better spent if we had been able to simply talk with the other conference attendees at our lunch tables! That time is very important for networking and discussion of conference topics with fellow educators. It's also a time to decompress, and to review what we've learned, after spending all morning in sessions. We need to have a chance to talk! Having to eat in silence while listening to the speakers up front was just plain frustrating. Also, it was frustrating that so much of the speaking we listened to in the main hall was unnecessary introductions! Every new speaker had to have somebody else introduce them, and every introduction was incredibly long. Why do I need to hear a 10-minute biography of someone who's only going to speak for 10 minutes? Waste of half of our time! Just put it in writing, in the conference brochure. The GCSC speakers during our "working lunch" were folks that the majority of us are already familiar with. We've already done the exercise they wanted us to do, already given them our ideas for involving businesses in STEM education. (Never mind that they're going about it entirely the wrong way to begin with...) So not only did it ruin our lunch conversations, it was for something that nobody at my table found useful!

• You may want to add one more room to the afternoon breakout sessions and call it the networking room. The sole purpose would to provide a location for people to talk and brainstorm. This would help to continue the conversations that you started at lunch with your questions. I collected great ideas and that was because I our table was a mix of teachers and professionals.

SESSION ROOMS TOO SMALL (n=6)
• Asking participants to choose the sessions ahead of time so you have approximate numbers for the size of the room needed. Some popular sessions were in very small rooms with some people sitting on the floor.
• Bigger classrooms
• Register for individual sessions before event
• And one of the rooms did not have enough seating for everyone.
• Some of the rooms were uncomfortably small
• Some presentation rooms were too small.

SUGGESTIONS FOR WHO TO INVITE IN FUTURE (n=5)
• Bring in legislators that support STEM education in Ohio.
• Greater involvement of non-formal educational organizations. Displays and or sessions so teachers, admins and the business community could learn more about what they offer. Perhaps similar to the Greater Cincinnati Environmental Educators annual Ultimate Educators Expos held at the Zoo every fall. The teachers in the sessions I attended were hungry for collaboration and ideas they could immediately take into the classroom.
• I teach at an independent school and only accidentally stumbled on the conference announcement. Is there a way to add independent schools on your notification list?
• More of a national audience.
• A keynote speaker who has more clarity and experience with STEM public/private school classroom experience. STM speakers who are employed by STEM schools. They can relate and model experiences for teachers who are now responsible for Engineering and Design in the Ohio Science Model Curriculum. Science ODE consultant who can address STEM issues in the NGSS or science Model Curriculum List A list of STEM "Adopt a school partnerships" (contacts) who will provide a collaboration with school district and teachers.

SPEAKER INTRODUCTIONS (n=5)
• Introductions were too long.


• Overboard with the introductions. I don't think you need to read everyone's accolades before they speak. Just let them speak! And they can provide the background they feel relevant.

• Less time spent on introductions, more time for each session
• I'd make less introductions and speeches in the morning and give some more time to the sessions.
• No need to address so many kudos for guest speakers. Just a few, we can read this in the booklet. Don't waste so much time!

WENT WELL – NO SUGGESTIONS FOR IMPROVEMENT (n=4)
• I enjoyed every session I attended
• Everything was done very well.
• It was really good.
• This was very well organized and implemented.

POSTER SESSION (n=3)
• Yes, the poster presentations needs to in one location so that people can see all posters.
• I would like to see all posters displayed in a common location for the entire conference. This would allow more people to view them.
• The "poster" sessions got stuck in the corner. For new to STEM educators these presenters would have day to day info needed to help new teachers plan


Table E-4. Written Responses to the following: “Please list STEM related presentations or topics you would like to learn more about.”
_____________________________________________________________________________________

SPECIFIC STEM SUBJECTS (n=29)

ENGINEERING (n=9)
• More engineering applications in gr. 2-6.
• ASSET program
• Engineering your Future.
• More about FIRST
• Project Lead the way
• Putting the E in STEM!
• Programming robotics 60 - 90 minutes with a break.
• Please include stuff that is affordable and feasible for most teachers / schools! The FIRST Robotics sounds cool, but is only affordable for the richest of private schools. (Annual club budgets of $35,000-$50,000!!! For an after-school club!!!) I found out later that there is a far cheaper and yet equally effective (possibly more effective, since the adult mentors don't do all the work) alternative: the FIRST Tech Challenge. Can you feature this instead next year? Ask iSPACE.
• more from NASA

MATH (n=7)
• I would also like to see more ways that Math is being integrated into the science aspects.
• I think that it is always more challenging for a math teacher to come up with STEM lessons so I would be interested in more STEM presentations for math teachers.
• Middle School Math topics that can be implemented in classrooms
• Topics in math for younger grade levels.
• Some more middle grade math lessons based around engineering.
• More about the math in the sessions.
• I'd like to see more activities that are aligned to math content areas for my grade level

LIFE SCIENCE (n=4)
• How does STEM relate to Life Sciences? Not much on biology, genetics, etc.
• life science
• Life science
• We have an organic garden at our school. STEM activities related to the garden/plants/environmental issues are topics I would like to explore further.

BIOLOGY (n=3)
• STEM in the Biology classroom.
• STEM lessons for biology
• I would like to see more applications for biology and chemistry.

CHEMISTRY (n=2)
• stem activities for chemistry
• I would like to see more applications for biology and chemistry.

NANOTECHNOLOGY / BIOTECH (n=2)
• Nanotechnology
• more Biotech

GEOLOGY (n=1)
• MATTER!!!! GEOLOGY


MATERIALS SCIENCE (n=1)
• Materials science is what our students need to know. Cincinnati is a manufacturing city. Our students will enter this workforce and it was not represented.

INTEGRATION WITH OTHER SUBJECTS (n=11)

ACROSS DISCIPLINES (n=7)
• I would like to see more global issues brought down to the students level...water in 3rd world nations, global warming...
• Ideas on how to get Science Teachers, Math Teachers, ELA, Technology teachers working together to develop meaningful, authentic STEM experiences in a school.
• Integrating other subjects, time to see how districts have integrated STEM in their district
• Integration of 21st century skills; college and career readiness
• Rule of Thumb #1: Please have a wider variety of session topics! The majority was focused on "engineering", and all of the titles and descriptions of these were so similar it was difficult to tell the various engineering sessions apart.
• Additional Ideas: 1. How about one or two interdisciplinary sessions, that focus on how to use a topic (e.g. astronomy, an environmental issue, etc.) to tie all four letters of STEM together into one classroom project?
• Using STEM resources during "test prep" time in my classroom. During OAA time we focus mostly on Math and Reading.

INTEGRATION WITH LITERACY (n=4)
• STEM & Literacy
• integration of literacy skills
• Reading in the content (Non-Fiction) in the different grade bands and other classroom integration activities.
• Combining Literacy and Science Blending the 5E's in science

CHALLENGE-BASED LEARNING (n=9)
• Challenge-based learning
• challenge-based learning
• Implementation of challenge based learning
• Deign challenges for the elementary level
• Designing challenge based learning collaborations between schools and informal education providers. Environmentally focused CBL.
• STEM challenges, who, why, and where.
• The possibilities of using service learning within the CBL classroom. The Flipped Classroom.
• more about PBLs and CBLs, examples and designs
• Problem challenge based learning as a hands on activity. Give us an example we can do there and then so we can experience it.

COLLABORATION WITH BUSINESSES, LOCAL NON-PROFITS, AND HIGHER EDUCATION (n=9)
• I would like a presentation on businesses that are open to collaborating with schools to advance STEM education.
• Building business partnerships
• Business leaders' perspectives on what they are looking for from new employees.
• Connections to business fieldtrips for teachers and students
• Get some industry people to talk about the types of students that they need. Some teachers did not have a very good understanding of the types of jobs that require a lot of technical skills but are a good match for the second tier kids (solid B students) Teachers generally understand what is involved with engineering, doctors, architects, etc.). Information about all the types of technicians, computer technicians to welders.
• more business and industry in the classroom activities more business and industry involvement in program


• opportunities to make partnerships with University of Cincinnati
• How about introducing teachers to educational non-profits and other community resources? I know you have a few sessions led by specific organizations, like iSPACE, and that's good! But get more of those - sessions by the Observatory, the Zoo, etc. And / or, bring in someone who can talk about all of these organizations in aggregate, and make the case for field trips and outreach programs in general being an important tool for teachers. Maybe a session by a panel of representatives from the Greater Cincinnati Environmental Educators (GCEE) network, to let teachers know what's out there?
• Any given by CINSAM NKU.

PROJECT BASED LEARNING (n=7)
• Project-based learning
• engineering in the elementary classroom, graphic design or other digital/computer skills for elementary
• examples of problem based learning in a physics classroom, modeling curriculum for physics
• PBL
• pbl -reducing to practice
• more about PBLs and CBLs, examples and designs

GENERAL CLASSROOM IDEAS / ACTIVITIES (n=5)
• Any specific classroom ideas for labs etc. to get the concepts across. There were many good ones this time, but more is better.
• Cheap activities
• Cheap ways to develop activities in my classroom.
• STEM activities that are directly aligned to the Ohio Model curriculum K-8 and the 6 science secondary syllabi.
• Museum of Boston STEM activities for elementary or a professional development for elementary teacher.

HOW TO GET STARTED? (n=4)
• We are just starting our STEM program in my district, and an "Intro to STEM", or "How To Get Started" workshop would be very nice!
• My school is just beginning the process of planning and integrating STEM / problem based activities. I would have loved to see a session dedicated to the real novice in this area. What do I need and need to know to do this with my students?
• Transformation of a "regular" school to a STEM school. Also how to develop a STEM academy within the walls of a traditional comprehensive high school.
• Programs that are open to all schools.

PROFESSIONAL DEVELOPMENT (n=4)
• Effective models of professional development
• Focus on highly developed programs that are models of excellence ...
• How do you take the National Standards and web of complexity and make it conceptual to younger learners? Evaluators want to see the lesson Objectives on the Board. They want it written out on your lesson plans. What Engineering/Technology Professionals are able and willing to come and speak/run an activity in the classroom?
• Museum of Boston STEM activities for elementary or a professional development for elementary teacher

STANDARDS (n=3)
• Additional sessions, or a 2 hour workshop on the new standards.
• Science standards and incorporating STEM
• More about the grade specific standards. More ideas of how to implement them.

OUTCOMES AND MEASUREMENTS (n=2)
• A few sessions of learning outcomes and measurement.
• quantitative measures of its (CBL) impact


DIVERSITY (n=1)
• Some discussion of strategies to increase STEM interest for diverse populations.

BLENDED LEARNING (n=1)
• Blended learning

FUNDING OPPORTUNITIES (n=1)
• I would like to have more information about how to access funding for my district.

TIME INVOLVED TO IMPLEMENT STEM ACTIVITIES (n=1)
• I really want to know the honest answers to how one actually implements STEM activities without running out of time to teach ALL the new standards in CCSS or NGSS.

iPad APPLICATIONS (n=1)
• The iPad applications session was awesome. I would love to learn more/practice the use of some of the applications.
____________________________________________________________________________________


Table E-5. Written Responses to the following: “Thinking back on these sessions, what was (were) the most useful session(s) you attended? Why?”

Session | Why most useful?

General Comments
• It was interactive, I liked … too because they shared 3 different CBL units.
• because they provided activities they used
• loved the resources!
• Classroom content and Teacher PD ideas.
• I have used most of these presenters in my building.
• did a very good job at presenting the topic at hand and I learned a lot
• … creating an atmosphere for collaboration rather than competition by using the framework of the whole class as a company rather than competing companies
• great lessons ready for the classroom
• Directly related to my content
• I was able to actually work through the process of the lesson and see how it would be applicable in my classroom and have the perspective my students may have when completing this project.
• They were all pretty much the same
• Those connected to giving me ideas of STEM lessons to teach in my own classroom.

A Model Program
• I want to help with recruiting students.

All (n=9)
• loved getting Reeda Hart's CDs
• I liked all of the workshops that gave me activities to take back to my own classroom. I need more time to play with the iPad aps to make that session more worthwhile. At least it gave me some places to start for next year. I liked the first session, where we could actually interact with classroom teachers. It gave me some ideas for my high school geometry course next year.
• I loved the Lego session, but all of them were useful.
• I enjoyed every one of them. The best part was the access to resources to aid instruction
• All 4 to be really good. Looking back at the sessions, I remembered that I had to pick and choose. It would be nice to see some of the talks or topics next year. This one reason I like the flash drive because I can still go and get the information
• All of the sessions were useful. I came back to school with ideas to tweak what I am doing now and a deeper understanding of the NGSS.
• because they were hands on.
• for different reasons.
• for different reasons.

Building Partnerships with Business and Industry (n=2)
• gave a great example of how Gateway working to bridge the gap between schools and industry. I would like to see more post-secondary institutions doing this work.


CBL (n=5)
• Interaction with other teachers and concrete examples of using CBL in the classroom.
• probably most practical application
• Good Concept and she shared her process of start to finish and the challenges she faced as well. It was a well-developed idea and the content was relevant curriculum.
• topics my organization is currently focused on
• They brought their experiences from the classroom

closing the gap
• closing the gap

Crash Test Dummies (n=2)
• The really allowed me to get an idea of Engineering Design by trial and error.
• I enjoyed it, but need to find its correlation to concepts and terminology.

Energy Transfer, Transformation
• The teacher was very knowledgeable.

Engineering design
• they were practical and hands-on

Engineering Makeovers
• the fact that we got to actually engineer something in a simple, quick way and test it out was fantastic

Engineering Our Future (n=2)

Inclusive Competitiveness Imperative: Nurturing Talent & Enterprise to Fill Jobs and Create Wealth in the 21st Century (n=3)

• it gave insight into the future of STEM and how it will affect the world

• Jonathan’s message should be heard by educators and business leaders alike.

Integrated reading with science! (n=5)

• Provided a couple of science literacy strategies with CLARITY that can be taken back to the classroom

• I learned many ways to integrate areas in my classroom and it helped develop an understanding of the process.

• strategies I can apply immediately
• Introduced me to the NSTA website and gave me fabulous ways to integrate STEM into reading.
• This is what our school is working towards and it was good to see it in action. Much useful information and a good web site were given as a resource.

iPad Apps (n=2)
• I thought it and STEM up were awesome for the primary grades. I feel like most things I see involving STEM are for older grades.

• would have been better if wireless allowed me on the app store

It's That Easy to Find High Quality STEM Resources

• because they had us go through web sites step by step to find our grade level specific content. They focused on us individually.

LEGO (n=3)
• Both provided good ideas for classroom activities. They were hands-on and fun.

• because they both presented new ideas I could use in my classroom.

Nanoscience for Every Classroom (n=3)

• they were practical and hands-on
• great resources
• was very useful and relevant. I liked that they included the societal link, I've never seen that done before

NASA
• NASA's website and resources were awesome!


Next Generation Science Standards (n=11)

• It was just more details I needed about changes coming down the road.

• I wish we could have been in this session all day, as it was practical and full of information that I need to make the necessary changes in my curriculum.

• David did a good job in the presentation and discussion in covering this. I found his (and others in the room) insight to be helpful as I begin planning for future lessons that I am responsible to develop.

• really helpful to finally understand that system and its implementation

• It helped me get a better idea of what the new standards look like and how they will impact my science courses.

• topics my organization is currently focused on
• did a fantastic job of breaking down the intimidating document and showing how it can be used in the classroom.
• Having been involved in STEM education in central Ohio for many years, most of the presentations were showcasing lessons and activities that I had already implemented. The NGSS presentation, however gave me a better understanding of how to link the T&E to math and science. Moreover, it gave me ideas on how to present the NGSS to school staff.

• gave me some realistic ideas about how to integrate engineering into the science and math classrooms. He was about to help us look at the next generation and common core initiatives and see how they mesh and where they differ. We also got information about websites that will be helping us in working with the revisions made to the science standards. It was very good.

PBLE session (n=2)
• I bought the app and have been very impressed with project ideas it provides.

poster presentation
• very informative and I was able to ask questions to the presenter

Putting E in STEM (n=4)
• They gave wonderful hands on ideas for the classroom. I am a third grade teacher and most of the conference seemed tailored to the upper grades. Those two sessions were worth their weight in gold.

• Hands-on activities - They also gave DVD with their info.
• The presenters were very knowledgeable and enthusiastic. I was able to walk away with a cd of the materials they presented in the session.

REAL Robotics
• No other program that completely encompasses and exposes students to what it is like to be an engineer.

Reeda Hart's group – both lessons (n=4)

• Interactive and resourceful. Using inquiry in science classes.
• She is a great presenter and everything is hands on.
• b/c of the classroom applicability and new ways to present lessons.
• Both were terrifically engaging and reminded me that I was doing was good, and with small tweaks to lessons can incorporate math and other ideas.

RET Teachers
• They were all useful. I was very intrigued to hear what the RET teachers have done and what they learned. Each person brought some unique insight or information that they learned from the program.

Rock and Roll through Earth Science (n=5)

• most directly related to my core curriculum. Presenters were enthusiastic and authentic and offered lots of ideas for implementation.

• Liked the earthquake simulation lessons.
• The presenters were very knowledgeable and enthusiastic. I was able to walk away with a cd of the materials they presented in the session.

• It gave me a new perspective one how to implement engineering in our earth science curriculum. Changed my thinking about the focus of projects/activities. A shift away from discovery and more towards problem solving.

Rock Cycle (n=2)
• They gave wonderful hands on ideas for the classroom. I am a third grade teacher and most of the conference seemed tailored to the upper grades. Those two sessions were worth their weight in gold.

• Hands-on activities - They also gave DVD with their info.

SIMPLE INEXPENSIVE ENGINEERING DESIGN PROCESS IDEAS (n=3)

• Interactive putting the engineering design process into lessons I liked the inter-activeness of the session.

• The presenters had fresh ideas that could relate directly to middle school students. They presented hands-on activities that could be directly used now that I am back in my classroom. They explained how they set up their labs, as well as how they have their students use the engineering design process throughout their labs.

Solving Real World Math Problems

• They seemed the most authentic of the STEM activities I saw.

Stunt Design (n=5)
• I was looking specifically for Physics ideas, so Stunt Design was terrific.

• good presentation with hands on activities - well thought out challenge

• It provided an entire unit that I can apply without needing to edit much

• because I feel like I can apply that exactly the way it is with little modification

• because of my lack of lessons available for physics. I also can see how to translate the process to additional lessons I have.

There is More to Light (n=3)
• Both provided good ideas for classroom activities. They were hands-on and fun.

• because they both presented new ideas I could use in my classroom.

Thinking Inside the Box.
• Gave me three activities that I can use in my classroom.

Using Blended Learning
• because I am not technologically savvy and not as aware of what is available in that area.

Virtual applications
• Because it seemed like something a few of my students would like.

Water treatment session
• valuable to me b/c I was looking for something new to do in my Chemistry classes next year.

Wind Turbine (n=3)
• I am building a wind turbine this summer and use it as a teaching tool next school year.
• hands on with copied materials.


Table E-6. Demographics of Respondents to 2013 STEM Conference Evaluation

What is your current position? (CHECK ALL THAT APPLY)         Frequency   % of Responses
K-12 Teacher                                                     77          62.6
K-12 Special Education, Resource, or Inclusion Teacher            1           0.8
K-12 School Administrator                                         2           1.6
K-12 Central Office Personnel                                     3           2.4
Post-Secondary Faculty or Staff                                   9           7.3
Post-Secondary Administrator                                      2           1.6
Educational Organization Faculty or Staff                        16          13.0
Member of Educational Outreach Organization (Listed below)       12           9.8
   • Asset Inc.
   • Cincinnati Observatory Center (n=2)
   • CINSAM
   • Civic Garden Center
   • E3 Program
   • Greater Cincinnati STEM Collaborative
   • iSPACE
   • Lakota Robotics
   • NSTA
   • Ohio Technology and Engineering Educators Association
   • YWCA Greater Cincinnati
Business or Industry Representative (Listed below)                4           3.3
   • Controls Engineer
   • Newport Aquarium
Other (Listed below)                                              7           5.7
   • CEEMS Resource Team member
   • Educational consultant
   • Government
   • Student (n=3)

What level(s) best describes the grade level(s) you are currently teaching?   Frequency   % of Responses
Pre-Kindergarten                                  3      2.5
Primary (K-3)                                    18     14.9
Intermediate (4-6)                               29     24.0
Middle (7-8)                                     43     35.5
High School (9-12)                               40     33.1
Administrator/Supervisor (Elementary School)      1      0.8
Administrator/Supervisor (Middle School)          1      0.8
Administrator/Supervisor (High School)            2      1.7
Administrator/Supervisor (District-wide)          4      3.3
Post-secondary                                   11      7.9
Other (Listed below)                              6      4.3
   • Consultant
   • ESC Science Coordinator
   • K-12 Consultant
   • Science Coach
Not Applicable                                    6      4.3


Table E-6. Demographics of Respondents to 2013 STEM Conference Evaluation (continued)

Select the response that best describes the main subject area you are currently teaching or preparing to teach.   Frequency   % of Responses
Self-contained class (teach all or most academic subject to one class)    6      5.0
Math and Science                                                         19     15.7
Math only                                                                12      9.9
Science only                                                             38     31.4
Technology only                                                           4      3.3
Engineering only                                                          6      5.0
Other or multi-subject combinations (Listed below)                       19     15.7
   • All STEM areas
   • Arts
   • Computer Science
   • gifted integrated content
   • K-12 Instructional Technology
   • Language Arts and Science
   • math and Engineering
   • Primarily science (astronomy), but some T, E, and M as well! And, some history...
   • Science & Engineering
   • Science and Social Studies
   • Science, Math, Language Arts
   • Self-efficacy
   • Social Studies
   • STEM (n=3)
   • T & E
   • Teaching social studies (but math certification too!)
Not Applicable                                                           17     14.0


Table E-7. 2013 STEM Conference Summary of All Session Evaluations

                                                                                    N     Mean*   Std. Deviation
1. Overall, the information presented during this session was very useful.        524    3.54       .577
2. The content and/or strategies presented at this session can be easily
   adapted, or used as-is, in my educational setting.                             514    3.50       .619
3. The session increased my understanding of challenge-based learning.            478    3.41       .675
4. The session increased my understanding of how to use engineering as a
   context for teaching mathematics and science topics.                           468    3.37       .730
5. The presentation was engaging and the activities appropriate for the topic.    521    3.56       .624
6. The presenter was clear and easy to understand.                                524    3.67       .517
7. The presentation accurately reflected my expectations from reading the
   abstract.                                                                      520    3.54       .632
8. I would recommend this session to my colleagues.                               518    3.52       .657
9. The session identified real-world applications, career connections and
   societal impacts for teaching STEM content.                                    506    3.55       .612

Valid N (listwise): 414

*Scale: 4=Strongly Agree; 3=Agree; 2=Disagree; 1=Strongly Disagree
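The item statistics in Tables E-7 through E-9 follow directly from the 4-point coding in the scale note: each response is coded 1-4, and the item's N, mean, and sample standard deviation are computed over the non-missing responses. A minimal sketch of that calculation, using made-up responses rather than the actual survey data:

```python
from statistics import mean, stdev

# Hypothetical responses for one item, coded per the report's scale:
# 4 = Strongly Agree, 3 = Agree, 2 = Disagree, 1 = Strongly Disagree
responses = [4, 4, 3, 4, 3, 3, 4, 2]

item_mean = mean(responses)  # average level of agreement
item_sd = stdev(responses)   # sample standard deviation (n - 1 denominator)

print(f"N={len(responses)}  Mean={item_mean:.3f}  SD={item_sd:.3f}")
# → N=8  Mean=3.375  SD=0.744
```

A mean near the 3.5 values reported in Table E-7 therefore indicates responses falling between "Agree" and "Strongly Agree" on average.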


Table E-8. Summary of Session Evaluation Results for CEEMS Teacher Presentations

                                                                                    N     Mean*   Std. Deviation
1. Overall, the information presented during this session was very useful.         73    3.60       .520
2. The content and/or strategies presented at this session can be easily
   adapted, or used as-is, in my educational setting.                              74    3.62       .516
3. The session increased my understanding of challenge-based learning.             72    3.50       .531
4. The session increased my understanding of how to use engineering as a
   context for teaching mathematics and science topics.                            74    3.59       .494
5. The presentation was engaging and the activities appropriate for the topic.     73    3.71       .485
6. The presenter was clear and easy to understand.                                 74    3.74       .470
7. The presentation accurately reflected my expectations from reading the
   abstract.                                                                       74    3.61       .519
8. I would recommend this session to my colleagues.                                74    3.65       .535
9. The session identified real-world applications, career connections and
   societal impacts for teaching STEM content.                                     74    3.61       .544

Valid N (listwise): 70

*Scale: 4=Strongly Agree; 3=Agree; 2=Disagree; 1=Strongly Disagree


Table E-9. Summary of Session Evaluation Results for non-CEEMS Teacher Presentations

                                                                                    N     Mean*   Std. Deviation
1. Overall, the information presented during this session was very useful.        451    3.53       .586
2. The content and/or strategies presented at this session can be easily
   adapted, or used as-is, in my educational setting.                             440    3.48       .633
3. The session increased my understanding of challenge-based learning.            406    3.39       .697
4. The session increased my understanding of how to use engineering as a
   context for teaching mathematics and science topics.                           394    3.32       .759
5. The presentation was engaging and the activities appropriate for the topic.    448    3.54       .641
6. The presenter was clear and easy to understand.                                450    3.66       .523
7. The presentation accurately reflected my expectations from reading the
   abstract.                                                                      446    3.53       .648
8. I would recommend this session to my colleagues.                               444    3.50       .674
9. The session identified real-world applications, career connections and
   societal impacts for teaching STEM content.                                    432    3.54       .623

Valid N (listwise): 344

*Scale: 4=Strongly Agree; 3=Agree; 2=Disagree; 1=Strongly Disagree