
TRC212: Engineering Journal Training Evaluation Plan

Proposal created by Michael Puckett
Purdue University
EDCI 577-002, Fall 2017


October 1, 2017

Dear Jennifer,

Thank you for selecting Steadfast Learning as your preferred evaluator for the TRC212 training program. We are excited about the upcoming product launch, and my team looks forward to providing a detailed, strategic evaluation of the Tennessee Robotics Club Engineering Journal Training module in the TRC212 training program. This evaluation plan includes the following items:

• Executive Summary
• Evaluation Goals and Scope
• Description of the Evaluation Process
• Evaluation Rationale, Procedures and Measurement Instruments
• Data Collection and Analysis
• Appendix of Example Instruments and Data Reports

Please review the enclosed evaluation plan for completeness and accuracy. We can work through any corrections next week during our one-on-one review session. I look forward to seeing you then. In the meantime, please feel free to reach out to me if you have any questions or concerns prior to the start of the evaluation project.

Best regards,

Michael Puckett
Owner, Steadfast Learning & Design
[email protected]
(615) 987-8560


Executive Summary

Purpose:

The purpose of this training evaluation is to measure the effectiveness of the training content and delivery platform and to inform the development of future training modules. The TRC212 program seeks to provide an online curriculum, thereby freeing team coaches and mentors to focus on preparing the team for competition. If the program is successful, the TRC212 program will decrease the amount of time the team spends on fundamental training during meetings and increase overall training time. This comprehensive evaluation plan will provide a detailed analysis of the training's effectiveness and help identify areas for improvement.

Primary Objective:

The primary objective is to measure the training effectiveness and identify improvements from the TRC212 Engineering Journal Training module which can be implemented in the design of future learning modules in the TRC212 program.

Summary of Evaluation Plan:

Steadfast Learning uses the Kirkpatrick and Kirkpatrick (2006) Four Levels of Evaluation model, widely considered the standard for strategic evaluation and assessment. We have designed a 6-month evaluation plan to perform a thorough evaluation and analysis. First, the plan will evaluate learner reaction to the training material and platform. Second, learning will be measured in terms of knowledge, skills, and attitudes. Third, we will evaluate behavior change and how the training is applied. Finally, the plan will measure the results of the training course within the TRC212 program.

Key Recommendations:

The training program should commence during the pre-season so there is adequate time for the training to be completed. The Level 1 reaction survey will be completed immediately following the post-test assessment. The performance test should be completed within 1 week of the training to allow time for coaches and mentors to complete the performance assessment and provide feedback to the learner. From there, it is recommended that we complete the behavior survey after 90 days to allow enough time for team members to use the new knowledge and skills. At the end of the season, during the 6th month, the final step of the evaluation will be completed to measure the overall results.

Data Findings:

No data has been collected at this time because this is the first module in the series. Key data to be collected include reaction survey results, learning assessment results, behavior survey responses, and detailed feedback from focus group interviews.


Evaluation Goals and Scope

Instructional Product Overview

At 211 degrees, water is hot. At 212 degrees, water boils and creates STEAM. In 2017, Tennessee Robotics Club (TRC) launched a new online training program called TRC212 with the long-term goal of developing a series of online E-Learning courses focusing on STEAM topics: Science, Technology, Engineering, Art/Design, and Mathematics.

The TRC Engineering Journal Training is the first module in the TRC212 learning program and will be used to evaluate the overall effectiveness of this type of training. This first module in the TRC212 series is designed to teach the process of submitting an online engineering notebook entry following the FIRST Tech Challenge engineering guidelines. The learning module is located at http://tennesseeroboticsclub.org/trc-engineering-journal. Table 1 provides an overview of the sections and content in the Engineering Journal Training module which will be used for the instructional product evaluation.

Module Section | Content
Let the Adventure Begin | Learning Game Activity / Kickoff Video
Section 0: Course Introduction | Course Instructions / Learning Challenge 0
Section 1: Access Engineering Journal | Interactive Content / Learning Challenge 1
Section 2: Create a New Journal Entry | Interactive Content / Learning Challenge 2
Section 3: Enter Meeting Details | Interactive Content / Learning Challenge 3
Section 4: Enter Reflection and Media | Interactive Content / Learning Challenge 4
Section 5: Review, Save and Publish | Interactive Content / Learning Challenge 5
Learning Assessment | Final Assessment

Table 1: Overview of Module 1 Contents

Purpose of the Evaluation

The TRC212 program seeks to provide an online curriculum thereby freeing up team coaches and mentors to focus on preparing the team for competition. The purpose of this training evaluation is to evaluate the effectiveness of the training content and delivery platform. If the program is successful, the TRC212 program will decrease the amount of time the team spends on fundamental training during meetings.


Evaluation Goals

The goals of the evaluation plan are to:

1. Identify areas for improvement by measuring learner reaction to the instructional content, course design, learning technology, and overall learner experience.

2. Determine the effectiveness of an online training program to provide the knowledge and skills necessary for a robotics team to successfully solve engineering problems.

3. Determine if the TRC212 online training program would provide a viable resource for future course development encouraging team members to explore STEM careers.

Primary Objective

TRC team coaches are interested in learning how team members react to the online learning module and how effectively the first module meets the learning objectives. There is also an interest in understanding whether team members continue to use the skills taught 90 days after completing the training. The primary objective is to measure the training effectiveness and discover improvements which can be implemented in the design of future learning modules in the TRC212 program.

Learner Analysis and Context

Tennessee Robotics Club (TRC) is a local FIRST Tech Challenge team whose members explore real-world engineering challenges by building robots to complete mission tasks on a challenging playing surface. TRC team members are encouraged to work together solving complex engineering challenges, building robots using engineering design principles, and exploring exciting career fields in STEM. The intended audience for the learning module includes both new and existing team members, coaches, and mentors of the Tennessee Robotics Club. TRC team members are between the ages of 14 and 18 and have at least 1 year of experience writing journal entries. Many of the team members have 6 or more years of FIRST robotics experience. The training will be completed during one of the pre-season meetings.

Technology Requirements

Team members are experienced with basic computer programming and multimedia programs. To complete the TRC Engineering Journal training module, team members will need the following technology:

Computer/Device | PC, laptop, Android tablet, or iPad
Internet | Internet access
Browser(s) | Internet Explorer 9.0 or higher, Google Chrome, Safari, Firefox
PDF | Adobe PDF viewer

Table 2: Technology Requirements


Description of the Evaluation Process

This evaluation will follow the guidelines established by the Kirkpatrick Four Level Evaluation model (Kirkpatrick & Kirkpatrick, 2006) which includes measuring the four levels defined below:

1. Reaction
2. Learning
3. Behavior
4. Results

The evaluation plan will begin in August, during the pre-season, to allow enough time for the team members to complete the online training. The Level 1 learner reaction survey will be completed immediately following the post-test assessment, which is administered at the end of the online training module. Team members will complete the performance test within 1 week of the training to allow time for coaches and mentors to complete the performance test checklist. The Level 3 behavior survey will be completed in November, after 90 days have passed, allowing enough time for team members to use the new knowledge and skills during meetings. The Level 4 portion of the evaluation plan will be completed in February at the end of the season to measure the overall results and provide a detailed summary report of key recommendations and findings to the TRC Team Manager. Please see Appendix A: Evaluation Plan Timeline for a detailed overview of the evaluation plan timeline and steps.

Evaluation Rationale, Procedures and Measurement Instruments

The following section provides a detailed overview of the four levels of evaluation. Each subsection contains a brief rationale as to why the level is important to the evaluation, how it will be accomplished, and what instruments will be used at each level.

Level 1- Evaluating Reaction

A successful course evaluation begins with the learner's reaction to the training program. "Evaluating reaction is the same thing as measuring customer satisfaction" (Kirkpatrick & Kirkpatrick, 2006, p. 27). This is an important first step in the evaluation process because understanding how the learner reacted to the training program will help determine if learning is likely to occur. Ideally, a favorable reaction to the training will result in a positive motivation to learn. Secondly, gathering feedback from the learner about the training experience will help improve the program for future participants, which is important in developing future modules in the TRC212 program.


Level 1 Evaluation Procedure

After the training is complete, team members and coaches will receive an email from the TRC Team Manager asking them to complete an anonymous course survey online. The survey will give team members the option of entering their name for follow-up; otherwise, no personally identifying information will be collected. This will help elicit honest and genuine feedback in the survey results.

Level 1 Evaluation Instrument

The reaction survey (see Appendix B) will be created using Google Forms. The reaction survey will consist of 10 questions to measure learner reaction to the course content, design, and learning technology. The reaction data will be compiled into a spreadsheet and stored on the TRC Google Drive for future reference. The numerical data will also be tabulated to create a chart that reports the survey results in a visual form. The Google Forms survey will rate the responses on a Likert scale of 1 to 5, with 1 representing "Strongly Disagree," 5 representing "Strongly Agree," and varying levels in between. Each question will include a space for learners to enter additional comments to help clarify their response selection. The last question in the survey will provide a comment section for the learner to make any suggestions about the training that were not addressed in the previous survey questions. The survey is short and should take about 10 minutes to complete.
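To illustrate how the reaction data could be compiled and tabulated, the short Python sketch below averages each survey question across respondents and writes a small summary file that could then be charted. The ratings, question labels, and file name are hypothetical placeholders, not collected data.

# Minimal sketch (Python): compiling reaction-survey responses into a
# per-question summary. All values below are hypothetical placeholders.
import csv
from statistics import mean

# One dictionary per respondent; keys Q1-Q10 hold Likert ratings from 1 to 5.
responses = [
    {"Q1": 5, "Q2": 4, "Q3": 4, "Q4": 5, "Q5": 4, "Q6": 3, "Q7": 4, "Q8": 5, "Q9": 4, "Q10": 5},
    {"Q1": 4, "Q2": 4, "Q3": 3, "Q4": 4, "Q5": 5, "Q6": 4, "Q7": 4, "Q8": 4, "Q9": 3, "Q10": 4},
    {"Q1": 5, "Q2": 5, "Q3": 4, "Q4": 4, "Q5": 4, "Q6": 4, "Q7": 5, "Q8": 4, "Q9": 4, "Q10": 5},
]

questions = [f"Q{i}" for i in range(1, 11)]
averages = {q: mean(r[q] for r in responses) for q in questions}

# Write the per-question averages to a CSV file that can be charted.
with open("reaction_summary.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Question", "Average Rating (1-5)"])
    for q in questions:
        writer.writerow([q, round(averages[q], 2)])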

Level 2- Evaluating Learning

The second step in the evaluation is to measure learning, which comprises skills, knowledge, and attitudes. As Kirkpatrick and Kirkpatrick (2006) point out, unless the learning objectives are met, no change in behavior can be expected (p. 42). Ideally, a control group would be used to help measure gains in learning, but since the team is small, consisting of only five members and three coaches, a control group is not feasible. The training process is new for all team members, so there is no need or benefit in administering a pre-test. Therefore, the primary evaluation instruments will consist of a post-test to measure knowledge and attitudes and a performance test to measure skills. These results will help measure what knowledge was learned and what skills were developed.

Level 2 Evaluation Procedure

Learning will be evaluated in two steps. The first step will be accomplished by a final assessment (See Appendix C) to measure knowledge learned. The second step will be accomplished with a performance test by having the team member submit an online engineering journal entry which will be evaluated by a coach or mentor using the performance evaluation form (See Appendix D).


Each section also contains a learning challenge (See Appendix E) which must be completed before moving on to the next section. There are 6 learning challenges to reinforce the content for the Final Assessment.

Level 2 Evaluation Instrument 1- Post Test

After completing each of the sections successfully, the learner will be presented with a 10-question final assessment to complete the training module. The final assessment will be administered online at the end of the final section. Learners must pass the test with a score of 80% or higher to receive credit for the course.

Level 2 Evaluation Instrument 2- Performance Test

Upon completion of the final assessment, the team member will submit a new online engineering journal entry for evaluation by a coach or mentor. The TRC Team Manager will review the journal entry and check for accuracy and completeness of the following areas from the training sections: meeting details, reflection and media submission.

Level 3- Evaluating Behavior

Level 3 evaluates the learner's application of new skills, revealing whether they are in fact being used beyond the training. If the first two levels reveal that the team member did learn new knowledge and skills, then Level 3 seeks to understand whether the learner applied those newly learned skills in the performance setting.

Level 3 Evaluation Procedure

Behavior change is measured using a performance survey questionnaire (see Appendix F). Kirkpatrick and Kirkpatrick (2006) recommend a survey questionnaire as the most practical way to evaluate behavior (p. 56). The performance survey questionnaire will be completed online after the 90-day mark to ask team members what new skills they were able to apply from the training course. The results of the survey help determine whether the team member exhibited any changes in behavior since the training was completed.

Level 3 Evaluation Instrument- Performance Survey Questionnaire

The Performance Survey Questionnaire comprises four questions evaluating the skill areas of entering meeting details, reflection, media submission, and completeness, each measured quantitatively on a scale of 1 to 5. The questions ask team members how consistently they apply each skill, with ratings ranging from 1 (never) to 5 (always) and varying levels of application in between. Data collected are reported in a table format representing each skill area.


Level 4- Results

As Kirkpatrick and Kirkpatrick (2006) point out, evaluating the results of the training program is probably the most difficult and likely the most important step (p. 63). This level provides an opportunity to gain valuable feedback from the coaches, mentors, team members, and even the judges. The evaluation will seek to answer important questions to help uncover the true impact of the training program. Since our group is small, a focus group is well suited for this instrument.

Level 4 Evaluation Procedure

Results will be evaluated using a Focus Group Interview (See Appendix G) to measure the overall impact of the training program. The interview will take place at the end of the season in a small group setting to provide an opportunity for each team member and coach to reflect upon the impact the training had on the season and how successful the team was at implementing the engineering journal for the competition.

Level 4 Evaluation Instrument- Focus Group Interview

Focus groups work best in small group settings of 8 to 12 members where conversations are constructive, focusing on a few key areas for discussion. Therefore, guidelines must be followed to ensure that the results collected from the focus group provide the greatest value and the most credible result (Elkeles, Phillips & Phillips, 2014, pp. 56-57). The Focus Group Interview will be conducted using the following steps, as recommended by Elkeles, Phillips and Phillips, to help the team arrive at a credible estimate of the technology training's impact:

1. Explain the goals of the focus group
2. Discuss the rules (primarily to keep the discussion focused)
3. Explain the importance of the process and how it seeks to show results
4. Select the first measure and show the improvement (Example: Online Training)
5. Identify the different factors that have contributed to the performance
6. Identify other factors that have contributed to the performance (Examples: Experience, Awards, Peer Feedback, and Coaching)
7. Discuss the linkage between the other factors
8. Repeat the process for each of the other factors
9. Allocate the improvement percentage for each factor
10. Provide a confidence estimate of that percentage allocation
11. Multiply the two percentages together to arrive at a result for each factor

The focus group will be structured around seven questions, with answers related to the Engineering Journal guidelines. These questions were selected to help the team identify the outcomes of the training and identify other factors that may have contributed to the performance success. The team's interview with the judges will also be recorded on video by one of the coaches and reviewed during the focus group interview to help provide constructive feedback for the next season.


Data Collection and Analysis

Evaluation Levels and Data Analysis: Upon completion of the evaluation instruments, the data will be collected and tabulated into a series of charts by the TRC Team Manager and the course evaluator. This data will include results from the reaction survey, the final assessment, the performance test, the performance survey, and the focus group interview. The evaluation report will be presented to the TRC Team Manager at the completion of the evaluation plan in March.

Level 1

Reporting the reaction data (see Appendix H) will be completed by compiling the results of the survey on a 5-point scale and averaging them over the number of completed surveys. On a scale from 1 to 5, an average result of 4 or higher would indicate a positive learner experience with the program. If the results are lower than 4 in one of the three evaluation areas (course content, course design, or learning technology), that would indicate an area for improvement, and the TRC Team Manager can improve that section before developing future TRC212 modules.
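As a sketch of this analysis step, the Python snippet below averages ratings for each evaluation area and flags any area whose average falls below the 4.0 target. The area names match the plan, but the ratings are invented for illustration only.

# Minimal sketch (Python): averaging reaction ratings by evaluation area and
# flagging anything under the 4.0 target. Ratings are hypothetical examples.
from statistics import mean

area_ratings = {
    "Course Content": [5, 4, 4, 5, 3],
    "Course Design": [4, 4, 5, 4, 4],
    "Learning Technology": [3, 4, 3, 4, 4],
}

for area, ratings in area_ratings.items():
    avg = mean(ratings)
    status = "positive learner experience" if avg >= 4.0 else "area for improvement"
    print(f"{area}: average {avg:.2f} -> {status}")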

Level 2- Part I

Analysis of the learning data (see Appendix I) will be completed by listing the test scores of each participant. Each section of the module contains a non-graded learning challenge that serves as a checkpoint of understanding before the learner can continue. Content from all sections of the module is combined into a single final assessment. Since each question is worth ten points, a passing score of 80% or higher is necessary to complete the training and would indicate that the learner achieved a basic understanding of the material. This test score would also indicate that the team member has the basic skills required to successfully complete an engineering journal entry for the performance test.
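To make the scoring rule concrete, the sketch below grades one hypothetical submission against the Appendix C answer key, awarding 10 points per question and applying the 80% passing threshold. The sample submission is invented for illustration.

# Minimal sketch (Python): scoring the 10-question final assessment.
# Each question is worth 10 points; 80% or higher is a passing score.
ANSWER_KEY = {1: "b", 2: "c", 3: "c", 4: "d", 5: "a",
              6: "b", 7: "d", 8: "d", 9: "b", 10: "c"}

def score_assessment(submission):
    """Return (percent score, passed) for one learner's answers."""
    correct = sum(1 for q, key in ANSWER_KEY.items() if submission.get(q) == key)
    percent = correct * 10  # 10 points per question
    return percent, percent >= 80

# Hypothetical learner who missed questions 4 and 9.
example = {1: "b", 2: "c", 3: "c", 4: "a", 5: "a",
           6: "b", 7: "d", 8: "d", 9: "a", 10: "c"}
print(score_assessment(example))  # (80, True)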

Level 2- Part II

The performance checklist measures the learning that has taken place after the training is completed. An average rating of 4.0 or higher for each of the four performance areas would indicate that the team member can properly complete a journal entry.

Level 3

Analyzing behavior change will be completed using the performance survey questionnaire (see Appendix J for an example report from a Performance Survey Questionnaire). On the rating scale of 1 to 5, an average of 4.0 or higher in each of the skill areas would indicate that the team member is effectively using the skills learned from the training course to submit an engineering journal entry.
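The same 4.0 threshold check can be sketched in code: the snippet below takes completed checklists or behavior questionnaires (one dictionary per response), averages each skill area, and reports whether the target is met. The skill names follow the plan; the ratings are hypothetical.

# Minimal sketch (Python): averaging each skill area across completed
# checklists/questionnaires and comparing against the 4.0 target.
from statistics import mean

def area_averages(completed_forms):
    """Return {skill area: average rating} across all completed forms."""
    areas = completed_forms[0].keys()
    return {area: mean(form[area] for form in completed_forms) for area in areas}

# Hypothetical responses: one dictionary per team member.
forms = [
    {"Meeting Details": 5, "Reflection": 4, "Media Submission": 5, "Completeness": 4},
    {"Meeting Details": 4, "Reflection": 3, "Media Submission": 5, "Completeness": 4},
    {"Meeting Details": 4, "Reflection": 4, "Media Submission": 4, "Completeness": 5},
]

for area, avg in area_averages(forms).items():
    verdict = "meets the 4.0 target" if avg >= 4.0 else "below the 4.0 target"
    print(f"{area}: {avg:.2f} ({verdict})")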


Level 4

Results from the Focus Group Interview will be collected after the tournament, with responses from coaches, mentors, team members, and judges. The participants will help classify the responses into four areas; example areas might include online training, team member experience, award incentive, and coaching/mentoring. Each member of the focus group will complete a participant estimation table based on the factors identified (see Appendix K). Next to each factor, the participant will allocate the percentage of the performance improvement attributed to that factor. The third column will include a confidence percentage to reflect the error of estimation. Those two numbers are then multiplied, and the result is entered in the final column as an adjusted percentage of performance improvement. For example, if a team member estimates that the online training accounted for 70% of the improvement and is 80% confident in that estimate, the adjusted percentage is 56% (70% x 80% = 56%) (Elkeles, Phillips & Phillips, 2014, p. 56). Analysis of the resulting estimation tables should provide an estimate of the percentage of impact that each factor identified by the team members and coaches had on the results.
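A short calculation sketch of this adjusted-percentage step is shown below; the factor names and percentages mirror the example table in Appendix K, and the arithmetic is simply the improvement percentage multiplied by the confidence percentage.

# Minimal sketch (Python): computing the adjusted percentage of improvement
# for each factor in a participant's estimation table.
estimates = [
    ("Online Training Program", 0.50, 0.80),
    ("Team Member Experience", 0.10, 0.80),
    ("Award Incentive", 0.20, 0.50),
    ("Coaching/Mentoring", 0.20, 0.80),
]

total_allocated = sum(improvement for _, improvement, _ in estimates)
assert abs(total_allocated - 1.0) < 1e-9  # allocations should sum to 100%

for factor, improvement, confidence in estimates:
    adjusted = improvement * confidence  # e.g., 70% x 80% = 56%
    print(f"{factor}: {adjusted:.0%} adjusted improvement")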

Data Collection Summary

Level | Objective | Measurement Instrument | Data Source | Timeframe | Responsibilities
1 | Reaction: score of 4 or higher | Reaction Survey | Team members | Immediately following training | Steadfast Learning
2 | Learning: measure knowledge and attitudes, 80% or higher | Final Assessment | Team members | Completed immediately following the course | Steadfast Learning
2 | Learning: identify skills attained, 4.0 or higher | Performance Checklist | Team members | Taken within 1 week of training | Steadfast Learning
3 | Behavior: apply new skills to performance areas | Performance Survey Questionnaire | Team members | Completed 90 days after training | Steadfast Learning
4 | Results: measure the overall results | Focus Group Interview | Team and coaches | 6 months after training is complete | Steadfast Learning

Source: Data collection plan adapted from Kirkpatrick & Kirkpatrick (2006), “HRD Initiative/ Performance Improvement Program”, pp. 324-325.


Final Thoughts on Evaluation Plan

According to the FIRST Tech Challenge Engineering Notebook Guidelines (2017), there are seven awards based on a successful and effective engineering notebook submission (p. 10). Therefore, an award placement in one of the seven categories (see Appendix L) will be considered a success, demonstrating the team's ability to successfully complete engineering journal entries. Using the results from this comprehensive evaluation plan, the TRC Team Manager will be able to evaluate the effectiveness of the TRC212 Engineering Journal Training module and improve the course design of future training modules in the TRC212 online training program.

References

Elkeles, T., Phillips, P., & Phillips, J. (2014). Measuring the success of learning through technology. Alexandria, VA: American Society for Training & Development.

FIRST Tech Challenge Engineering Notebook Guidelines. FIRST Inspires, 8 Oct. 2017, https://www.firstinspires.org/sites/default/files/uploads/resource_library/ftc/engineering-notebook-guidelines.pdf.

Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels (3rd ed.). San Francisco, CA: Berrett-Koehler Publishers.

TRC Engineering Journal. Tennessee Robotics Club, 2 Oct. 2017, www.tennesseeroboticsclub.org/trc-engineering-journal/.

Appendices

Appendix A: TRC212- Evaluation Plan Timeline and Gantt Chart
Appendix B: TRC212- Reaction Survey
Appendix C: TRC212- Final Assessment
Appendix D: TRC212- Performance Test Checklist
Appendix E: TRC212- Learning Challenge Example
Appendix F: TRC212- Performance Survey Questionnaire
Appendix G: TRC212- Focus Group Interview Questions
Appendix H: TRC212- Example Reaction Data Results
Appendix I: TRC212- Example Learning Data Results
Appendix J: TRC212- Example Performance Survey Data Results
Appendix K: Example Participant’s Estimation Table
Appendix L: FTC Award Categories based on Engineering Notebook


Appendix A: TRC212- Evaluation Plan Timeline and Gantt Chart

Project Timelines:

August- Training Program Begins
September- Training Program Ends
September- Level 1 Reaction Survey completed
September- Level 2 Learning- Final Assessment completed
September- Level 2 Learning- Performance Test completed
November- Level 3 Performance Survey completed
February- Level 4 Impact Focus Group completed


Appendix B: TRC212- Reaction Survey

Please enter your name if you would like to be contacted about your comments to help improve future training programs. Otherwise, your name is not required and the comments you enter will be anonymous.

Participant Name (Optional): ___________________________________

Survey Instructions: This survey is designed to collect reactions and comments in three areas of the training program: course content, course design, and technology. Please answer each question by circling the number next to your response.

1- Strongly Disagree 2- Disagree 3- Neutral 4- Agree 5- Strongly Agree

Comments: Please enter comments to make recommendations where applicable.

Reaction Area 1: Rate your reaction to the course content:

1. I found the course content to be relevant and engaging.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Comments:

2. The material presented in each section was organized clearly and easy to understand.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Comments:

3. I feel that I can successfully complete an Engineering Journal entry on my own.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Comments:


Reaction Area 2: Rate your reaction to the course design

4. The guided walk-throughs were easy to follow and understand.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Comments:

5. The Learning Challenges reinforced the content from each section for the final assessment.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Comments:

6. The final assessment tested me on the knowledge presented in each section.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Comments:

Reaction Area 3: Rate your reaction to the course technology:

7. The online learning module was easy to use and navigate throughout the course.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Comments:

8. The course navigation and features worked properly in each section.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Comments:


9. The videos and images properly displayed without delay.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Comments:

Reaction Area 4: Overall Learner Satisfaction:

10. Overall, I am satisfied with this training course and it met my learning needs.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Comments:


Appendix C: TRC212- Final Assessment

Okay, you have made it to the end. Now let's check what we have learned in this training module. There are 10 questions covering the steps required to successfully enter an engineering journal entry. You must pass with an 80% score to move on. After completing the final learning assessment, you will receive instructions by email to complete a brief reaction survey. Good luck! START QUIZ

1. Select the name of the link to add a photo, drawing or code snippet to your journal entry:
   a. Add New   b. Add Media   c. Add Form   d. Preview

2. What is the correct Windows keyboard shortcut to paste content?
   a. Ctrl-C   b. Ctrl-S   c. Ctrl-V   d. Ctrl-P

3. Identify the correct source to find your username and password to log in.
   a. Welcome Packet   b. trc.com   c. WordPress Registration Email   d. FTC Website

4. What is the correct Windows keyboard shortcut to select and copy content?
   a. Ctrl-S and Ctrl-V   b. Ctrl-C and Ctrl-V   c. Ctrl-S and Ctrl-C   d. Ctrl-A and Ctrl-C

5. Select the correct username format from the list below.
   a. mpuckett   b. tnroboticsclub   c. admin   d. [email protected]


6. If a Team Member completes a rough sketch of the chassis design, what type of media is best for this reflection?
   a. Photo   b. Drawing   c. Code Snippet   d. Not Required

7. Identify the correct URL that navigates to the TRC website login screen.
   a. Trc.com   b. Tnroboticsclub.com   c. Tennesseeroboticsclub.org   d. Tennesseeroboticsclub.org/wp-admin

8. Choose the task below that best describes the task goal. Remember to be SMART!
   a. Task 1: Build robot chassis   b. Task 1: Build robot   c. Task 1: Vivamus in diam turpis   d. Task 1: Build robot chassis subassembly for battery mount by end of October

9. If a Team Member completes an update to the program code, what type of media is best for this reflection?
   a. Photo   b. Code Snippet   c. Drawing

10. What is the final step to make your post live and viewable online in the Engineering Journal?
   a. Save Draft   b. Move to Trash   c. Publish   d. Preview


Final Assessment Answer Key:

1. b) Add Media
2. c) Ctrl-V
3. c) WordPress Registration Email
4. d) Ctrl-A and Ctrl-C
5. a) mpuckett
6. b) Drawing
7. d) Tennesseeroboticsclub.org/wp-admin
8. d) Task 1: Build robot chassis subassembly for battery mount by end of October
9. b) Code Snippet
10. c) Publish


Appendix D: TRC212- Performance Test Checklist

Instructions: Use the following checklist to evaluate the completion of the performance test. This evaluation will critique the following areas: Meeting Details and Tasks, Journal Reflection, Media Submission, and overall Completeness. Please circle the appropriate rating for each of the following areas:

Meeting Details

1. The Meeting Details and Tasks section was entered correctly and accurately.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Reflection

2. The journal reflection was concise and well written, providing a good summary of the meeting events.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Media Selection

3. The journal entry included a media submission that accurately represented the journal reflection.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

Completeness

4. The engineering journal entry was entered according to the training guidelines.
   1. Strongly Disagree   2. Disagree   3. Neutral   4. Agree   5. Strongly Agree

What would have made the training more beneficial to you?


Appendix E: TRC212- Learning Challenge Example


Appendix F: TRC 212- Performance Survey Questionnaire

Instructions: The purpose of this questionnaire is to determine how effectively you have been able to apply the skills learned in the Engineering Journal training course. The results of the questionnaire will be used to address any gaps in the training for future courses. Please circle the appropriate rating for each of the following areas:

Meeting Details

1. I complete the Meeting Details and Tasks section correctly and accurately for each entry.
   1. Never   2. Less of the time   3. The same   4. Most of the time   5. Always

Comments:

Reflection

2. I write clear, concise and reflective journal entries for each submission, providing a good summary of the meeting events.
   1. Never   2. Less of the time   3. The same   4. Most of the time   5. Always

Comments:

Media Submission

3. I include a media submission that accurately represents my journal reflection.
   1. Never   2. Less of the time   3. The same   4. Most of the time   5. Always

Comments:

Completeness

4. I complete the engineering journal entry according to the training guidelines.
   1. Never   2. Less of the time   3. The same   4. Most of the time   5. Always

Comments:


Appendix G: TRC 212- Focus Group Interview Questions

Q1: Did the Engineering Journal document team growth and development?
Q2: Did the Engineering Journal document team failures and struggles?
Q3: Did every team member contribute to the journal?
Q4: Was every team meeting documented with at least one reflection entry?
Q5: Did journal entries include media submissions (drawings, photos, code snippets)?
Q6: Did the Engineering Journal accurately represent our team’s journey through the season?
Q7: Did the Engineering Journal submission result in an award placement at the tournament?

Appendix H: TRC 212- Example Reaction Data Results

[Example bar chart: Reaction Data Results, rating scale of 0 to 5, for Content, Design, Technology, and Satisfaction.]

Appendix I: TRC 212- Example Learning Data Results

[Example bar chart: Learning Data Results, rating scale, for Meeting Details, Reflection, Media, and Completeness.]


Appendix J: TRC 212- Example Performance Survey Questionnaire Results

[Example bar chart: Performance Survey Questionnaire, rating scale of 0 to 5, for Meeting Details, Reflection, Media, and Completeness.]

Appendix K: Example of Focus Group Participant Estimation


Factor That Influenced Improvement | Percentage of Improvement | Percentage of Confidence | Adjusted Percentage of Improvement
Online Training Program | 50% | 80% | 40%
Team Member Experience | 10% | 80% | 8%
Award Incentive | 20% | 50% | 10%
Coaching/Mentoring | 20% | 80% | 16%
Total | 100% | |

Source: Measuring the Success of Learning Through Technology, Table 4-1, Example of a Participant's Estimation, p. 56.

Appendix L: FTC Award Categories based on Engineering Notebook

Source: FIRST Tech Challenge Engineering Notebook Guidelines, chart on p. 10.