
Questionnaire for the Schulich School of Engineering

3.1 Graduate attributes

The higher education institution must demonstrate that the graduates of a program possess the attributes listed in Sections 3.1.1 to 3.1.12. The attributes will be interpreted in the context of candidates at the time of graduation. It is recognized that graduates will continue to build on the foundations that their engineering education has provided.

Engineering programs are expected to continually improve. There must be processes in place that demonstrate that program outcomes are being assessed in the context of these attributes, and that the results are applied to the further development of the program.

The process that is being followed at the Schulich School of Engineering for graduate attributes assessment is illustrated in Figure 3. This process was developed through a series of meetings and workshops with teaching faculty and representatives from each of the School’s eight B.Sc. programs starting in Fall 2008. In this section, we provide an overview of the School’s graduate attributes assessment plan; additional details on the process can be found in Exhibit 5(b).

Figure 3. Schulich School of Engineering Graduate Attributes Planning Flowchart

(a) Indicators

The first step in the graduate attribute assessment process involves identifying a set of measurable statements, or indicators, that define the performance required to meet each graduate attribute: i.e.,

descriptors of what students must do to be considered competent in the attribute; the measurable and pre-determined standards used to evaluate learning (27 January 2011 CEAB Questionnaire template)

The Schulich School of Engineering is using the CDIO (Conceive-Design-Implement-Operate) Syllabus, listed in Exhibit 5(c), as a starting point for developing program-specific indicators. The CDIO Syllabus is effectively a very detailed list of general engineering program outcomes developed by an international community of CDIO collaborators and validated via focus-group discussions, document research, surveys, workshops, and peer review involving faculty, students, industry leaders, and senior engineering academics from a variety of universities. As shown in Exhibit 5(c), this syllabus has been mapped to ABET's program outcomes for outcomes-based assessment in US engineering; it has also recently been mapped to the CEAB graduate attributes by Canadian CDIO collaborators [1] as illustrated in Figure 4.

Figure 4. Relationship between the CDIO Syllabus and the CEAB Graduate Attributes

The mapping between the CDIO Syllabus and the CEAB graduate attributes results in a comprehensive list of outcomes that can be used as a starting point to develop specific indicators for each of the Schulich School of Engineering’s B.Sc. programs. For this part of the process, teaching faculty and program representatives reviewed and edited the CDIO Syllabus indicators to ensure that the descriptions aligned with the programs’ intended learning outcomes.

The concept of “key indicators” was adopted at the Schulich School of Engineering to keep the overall assessment process more coherent and manageable. In other words, Schulich School of Engineering programs identified a set of key indicators (from the long list of CDIO indicators) that capture the most important aspects of each of the twelve graduate attributes. This has resulted in a relatively small number of indicators per graduate attribute as described in Sections 3.1.1 to 3.1.12.

[1] Cloutier G., Hugo R. and Sellens R., "Mapping the relationship between the CDIO Syllabus and the 2008 CEAB Graduate Outcomes", Proceedings of the 6th International CDIO Conference, École Polytechnique, Montréal, June 15-18, 2010.

[Figure 4 is a correlation matrix relating the CDIO Syllabus sections (1.1-1.3, 2.1-2.5, 3.1-3.3, 4.1-4.6) to the CEAB Graduate Attributes (Criteria 3.1.1-3.1.12); each cell is marked as either a strong correlation or a good correlation.]


(b) Curriculum Mapping

The second step in the graduate attribute assessment process involves linking curriculum content/pedagogy to knowledge, practice and learning outcomes: i.e.,

a plotted representation (often in the form of a table) that shows the relationship between the learning experiences (e.g., courses, co-ops, co-curricular activities), instructional assessment methods, and intended learning for each aspect of a given program so that the relationships and connections among all the elements are easily seen (27 January 2011 CEAB Questionnaire template)

A curriculum map for all Schulich School of Engineering common core and capstone design courses was developed via a survey of common core and capstone design course instructors in the summer and fall of 2009 (curriculum mapping for the School's eight B.Sc. programs is described in the program questionnaires). This survey was based on the CDIO Syllabus outcomes as noted in (a), and was implemented in the form of a full introduce-teach-utilize (ITU) analysis (the ITU analysis survey is provided in Exhibit 5(d)).

The survey was conducted through a series of one-hour meetings with faculty involved in delivering common core courses and included questions of two types. First, the instructors used the CDIO Syllabus to map learning activities and outcomes. For each category, the instructor was asked whether the topic was introduced (i.e., superficial treatment to briefly expose the topic), taught (i.e., detailed coverage with assignments / exams), or utilized (i.e., the student is assumed to be already skilled in this area) in their course. Second, eight questions were asked that focused on determining the intended learning outcomes of the course.

As noted in (a), the CDIO Syllabus provided a good starting point for the curriculum mapping exercise. The high level of detail of the syllabus enabled a comprehensive analysis of each of the graduate attributes as well as an opportunity to refine the CDIO outcomes into key indicators for Schulich School of Engineering courses.

The ITU analysis also required common core instructors to think about how course material is delivered (i.e., introduced or taught), or alternatively, if the student needs to bring knowledge and skills to the course (i.e., utilized). This has the potential to move the survey from a simple information gathering exercise to a learning tool for the course instructor.

The results of the curriculum mapping to common core courses are provided in Exhibit 5(e).
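The ITU curriculum map described above can be sketched as a simple data structure. The course codes, indicator labels, and I/T/U assignments below are illustrative only, not taken from the School's actual map:

```python
# Sketch of an introduce-teach-utilize (ITU) curriculum map. Each course is
# mapped to the indicators it addresses, tagged 'I' (introduced),
# 'T' (taught), or 'U' (utilized). All entries are hypothetical.
ITU_MAP = {
    "ENGG 200": {"teamwork-roles": "T", "design-tradeoffs": "I"},
    "ENGG 201": {"teamwork-roles": "U"},
    "ENXX 500": {"teamwork-roles": "T", "design-tradeoffs": "T"},
}

def courses_by_level(itu_map, indicator, level):
    """Return the courses where `indicator` is handled at the given ITU level."""
    return sorted(course for course, levels in itu_map.items()
                  if levels.get(indicator) == level)

# Candidate courses for direct assessment of an indicator are typically
# those where it is taught rather than merely introduced or utilized:
print(courses_by_level(ITU_MAP, "teamwork-roles", "T"))  # ['ENGG 200', 'ENXX 500']
```

A map in this form supports both directions of the analysis: which indicators a course covers, and which courses cover an indicator.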

(c) Assessment

The third step in the graduate attribute assessment process involves establishing processes to identify, collect, and prepare data to evaluate the attainment of the key indicators described in (a) and (b). The basis for this work is the key indicators discussed previously: i.e., evidence should be collected on each key indicator.

At this stage of the process, specific courses were identified for direct assessment (using the curriculum map described previously), and decisions were made about the forms of indirect assessment to be used. To ensure that the results are well aligned, three or four forms of evidence were identified for each of the key indicators. For example, for each key indicator a combination of direct and indirect assessments was planned:

• Direct assessment: in-class, summative assessments (e.g., final exam, final project);


• Indirect assessment: surveys of final year students (in capstone courses), alumni surveys, and industry surveys.
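Combining several independent measures per key indicator can be sketched as follows; the measure names and percentages are hypothetical, and the agreement tolerance is an assumption for illustration:

```python
# Illustrative triangulation of evidence for one key indicator: several
# independent measures, each reporting the percentage of students rated
# "satisfactory" or better. All figures are invented.
evidence = {
    "final project rubric (direct)": 88.0,
    "capstone student survey (indirect)": 84.0,
    "alumni survey (indirect)": 90.0,
    "employer survey (indirect)": 86.0,
}

def triangulate(measures, tolerance=10.0):
    """The measures corroborate each other when they all fall within
    `tolerance` percentage points of one another; returns (agrees, spread)."""
    values = list(measures.values())
    spread = max(values) - min(values)
    return spread <= tolerance, spread

agrees, spread = triangulate(evidence)
print(agrees, spread)  # True 6.0
```

When the measures disagree beyond the tolerance, that disagreement itself is a finding: it flags either a gap in student attainment or a problem with one of the assessment tools.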

An assessment plan was prepared for each graduate attribute to be assessed in the upcoming academic year. For example, the assessment plan for graduate attribute 3.1.6 “individual and team work” is shown in Figure 5. In each case, the key indicators and curriculum mapping are provided (columns 1 and 2) along with details of the assessment plan: i.e., the type of assessment (column 3), where it is performed (column 4), when it is performed (column 5), who is responsible for the assessment (column 6), and who is responsible for evaluating the results (column 7).

Figure 5. Assessment Plan for Graduate Attribute 3.1.6 “Individual and team work”

As can be seen in Figure 5, direct assessment involves in-class assessment in senior courses identified through the curriculum map. In some cases, attributes may be acquired in a single course; however, in most cases attributes are introduced in junior courses, then practiced and built on in senior courses. For example, although Figure 5 shows a mapping to various common core courses (column 2), only the capstone design course ("ENXX500" in this example) is selected as a source of direct assessment for graduate attribute 3.1.6, "individual and team work" (column 4). In this case, the key indicators are taught in ENGG 200 and the capstone design course (ENXX500). They are utilized in ENGG 201, ENGG 225, ENGG 311, ENGG 317, ENGG 481, PHYS 369 (and many other program-specific courses).

Recognizing that graduate attributes must be interpreted in the context of candidates at the time of graduation, we focus primarily on summative, rather than formative assessments: i.e.,

• formative assessment: the assessment is used to promote learning; in the context of the classroom, this would be in the form of feedback to students; in the context of program assessment, the results can be used to see how students’ knowledge, skills and behaviors develop throughout a program of study;

• summative assessment: this is an assessment of student learning at a certain point in time; in the context of the classroom, this may take the form of a final exam or final project; in the context of program assessment, the assessment can be used to demonstrate student learning at the time of graduation.

Graduate Attribute: 3.1.6 "Individual and team work"

Key Indicators (column 1), each mapped to the courses ENGG200, ENGG201, ENGG225, ENGG311, ENGG317, ENGG481, and ENXX500 (column 2):

1. Identify the stages of team formation and life-cycle as well as the roles and responsibilities of team members
2. Analyze the strengths and weaknesses of the team
3. Execute the planning and facilitation of effective meetings
4. Practice conflict negotiation and resolution

For each key indicator, the assessment plan (columns 3 to 6) is:

Method(s) of Assessment  |  Source of Assessment  |  Time of Data Collection  |  Assessment Coordinator in 2011
Faculty evaluations  |  ENXX500  |  Fall & Winter  |  ENXX500 instructor
Student surveys  |  ENXX500  |  Fall & Winter  |  ENXX500 instructor
Alumni surveys  |  Online survey  |  Winter  |  Assessment Coordination Team
Employer surveys  |  Online survey  |  Winter  |  Assessment Coordination Team

Evaluation of Results (column 7): Engineering Undergraduate Studies Committee


As a result, although direct assessments of the four key indicators listed in Figure 5 occur in the first year design and communication course (ENGG 200) and many of the other courses listed in column 2, only the assessment tools from the capstone design course are used to evaluate our students’ individual and team work at the time of graduation.

For the indirect assessments listed in Figure 5, we developed a 38-question survey to address graduate attributes 3.1.1 through 3.1.12. This survey used the key indicators described in (a) as a basis for the questions, and was implemented in the form of three separate surveys (provided in Exhibit 5(f)):

(1) a self-efficacy survey for final year students registered in the capstone design courses;

(2) a self-efficacy survey of one year post-graduation alumni;

(3) a graduate competencies survey of employers of our graduates.

In order to keep the survey relatively succinct, each graduate attribute was summarized in three to four survey questions as shown below (e.g., eight key indicators for 3.1.4 "design" were reduced to three survey questions). The list of key indicators was carefully reviewed and reformulated in the form of survey questions. In some cases, multiple key indicators could be combined into one question; however, in order to keep the length of the survey reasonable, some key indicators were not addressed in the survey. As can be seen by comparing the table below and Exhibit 5(f), the survey questions were re-sorted to spread the questions relating to each graduate attribute throughout the survey.
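Spreading questions for the same attribute throughout a survey can be done with a simple round-robin interleave; this is a sketch of one way to do it, not the School's actual re-sorting method, and the attribute labels and question identifiers are placeholders:

```python
from itertools import zip_longest

# Hypothetical grouping of survey questions by graduate attribute.
by_attribute = {
    "3.1.1": ["Q-a1", "Q-a2", "Q-a3"],
    "3.1.2": ["Q-b1", "Q-b2", "Q-b3"],
    "3.1.3": ["Q-c1", "Q-c2", "Q-c3"],
}

def spread(groups):
    """Round-robin interleave: take one question from each attribute in
    turn, so adjacent questions probe different attributes."""
    rounds = zip_longest(*groups.values())
    return [q for rnd in rounds for q in rnd if q is not None]

print(spread(by_attribute))
# ['Q-a1', 'Q-b1', 'Q-c1', 'Q-a2', 'Q-b2', 'Q-c2', 'Q-a3', 'Q-b3', 'Q-c3']
```

The interleaved order reduces the chance that respondents answer a block of similar questions on autopilot.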

Graduate Attribute  Survey Question  How confident are you in your current ability to:

3.1.1  10  Use your technical knowledge to participate in a design discussion.
3.1.1  11  Describe a well-known experiment that proved an important scientific law.
3.1.1  20  Use mathematics to describe and solve engineering problems.
3.1.2  1  Apply your engineering knowledge and skills to solve a real-world problem.
3.1.2  16  Make assumptions that successfully simplify a complex problem to make it easier to work with.
3.1.2  21  After solving a problem, evaluate your initial assumptions to see if they need to be changed.
3.1.3  7  Generate a working hypothesis and a strategy to test it.
3.1.3  13  Synthesize information to reach conclusions that are supported by data and needs.
3.1.3  14  Analyze and interpret data.
3.1.4  24  Test a design solution to determine if it meets its specified needs.
3.1.4  28  Collect and interpret customer needs for a project you were given.
3.1.4  29  Analyze the trade-offs between alternative design approaches and select the one that is best for your project.
3.1.5  2  Apply an appropriate engineering technique or tool to accomplish a task.
3.1.5  6  Adapt or extend an engineering technique to accomplish a complex task.
3.1.5  25  Describe the limitations of various engineering tools and choose the best one to accomplish a task.
3.1.6  3  Get team members to make personal commitments to deliver what they had agreed to do for a project.
3.1.6  8  Review your team’s strengths and weaknesses and tell others where the team might need help.
3.1.6  12  Help two project team members with a strong and emotional disagreement resolve their differences.
3.1.6  35  At the start of a project, identify all the roles and responsibilities that your team will need to complete it.
3.1.7  19  Deliver a clear and organized formal presentation to a group of professionals.
3.1.7  22  Interpret a formal technical drawing in your engineering discipline.
3.1.7  26  Use various written styles to communicate complex engineering concepts to your colleagues.
3.1.7  30  Prepare a sketch of a design concept that is understood by your colleagues.
3.1.8  9  Identify processes in your project to ensure protection of the public and the public interest.
3.1.8  15  Identify the regulatory policies that pertain to a project that you are working on.
3.1.8  38  Identify your professional responsibilities within a large engineering project.
3.1.9  4  Identify the interactions that an engineering project has with the economic, social, health, safety, legal, and cultural aspects of society.
3.1.9  27  Apply technical, social, and environmental criteria to guide trade-offs between design alternatives.
3.1.9  34  Incorporate sustainability considerations in project decision-making.
3.1.10  18  Admit when you have made a mistake.
3.1.10  36  Identify an ethical dilemma when it occurs in a project.
3.1.10  37  Analyze opposing positions on an issue and make a judgment based on the evidence.
3.1.11  17  Apply project cost management principles to ensure that a project is completed within budget.
3.1.11  31  Identify and plan for risks in an engineering project.
3.1.11  33  Work with others to establish project objectives when different project tasks must be completed.
3.1.12  5  Recognize your strengths and weaknesses when working on a specific problem.
3.1.12  23  Identify the best approach that is suited to your learning style.
3.1.12  32  Use technical literature or other information sources to fill a gap in your knowledge.

(d) Evaluation

The fourth step in the graduate attribute assessment process involves interpreting the data and evidence accumulated through the assessment process. This not only involves the collection of evidence of student learning, but also the setting of performance targets, and the interpretation of results. The intention is to determine the extent to which graduate attributes are attained and provide information to inform the School’s continuous improvement process.

As discussed in (c), the Schulich School of Engineering's graduate attributes assessment plan involves the collection of multiple forms of evidence (both direct and indirect) for each key indicator. This validation approach – referred to as "cross-examination" or "triangulation" – is intended to increase the confidence in the end result of the assessments (i.e., when the different methods lead to the same results).


To manage the evaluation process for direct assessment in courses, Schulich School of Engineering instructors were requested to complete a reflective memo for their course that summarized the intended learning outcomes for the course (in the form of graduate attributes and key indicators), the teaching and assessment methods, student learning, and continuous improvement. For example, the reflective memo for the capstone design courses was organized as follows (reflective memo templates for the capstone design courses and ENGG 481 “Technology and Society” are provided in Exhibit 5(g)):

(1) Intended Learning Outcomes: the reflective memo for capstone design should report on graduate attributes and key indicators for 3.1.4 “design”, 3.1.6 “individual and team work”, and 3.1.7 “communication”;

(2) Teaching and Assessment Methods: the reflective memo should report on the teaching and assessment methods that were used to address the intended learning outcomes identified in (1);

(3) Student Learning: the reflective memo should report on how well students performed on each of the intended learning outcomes in (1); where possible, instructors were requested to make reference to specific data to support their conclusions;

(4) Continuous Improvement: the reflective memo should report on actions taken during the semester to improve the subject as a result of previous reflections or input from students or colleagues.

Given that each graduate attribute is assessed along multiple dimensions as discussed in (c), a graduate attribute assessment plan as shown in Figure 6 was developed to summarize the assessments.

Figure 6. Graduate Attributes Assessment Summary

The assessment summary includes most of the information from the assessment plan, but is expanded to include the target performance for each indicator (column 7), an analysis of the results of the direct assessments and the indirect assessments, and actions that resulted from the evaluation. The "target performance" is the level of performance that we want our students to achieve. In this example our target performance is based on the percentage of students who achieve "satisfactory" or better on the assessment; "satisfactory" would be defined by the assessment tool used for the indicator (e.g., a rubric, a graded exam question, etc.).

Graduate Attribute: 3.1._ (template)

Column headings: 1. Key Indicators; 2. Courses; 3. Method(s) of Assessment; 4. Source of Assessment; 5. Assessment Cycle; 6. Year(s) of Data Collection; 7. Target Performance.

For each key indicator (with its course mapping in column 2), the assessment methods are:

Method(s) of Assessment  |  Source of Assessment  |  Assessment Cycle  |  Year(s) of Data Collection
Faculty evaluations  |  Course(s)  |  Tri-annual  |  e.g., 2011 and 2014 for indicator no. 1; 2012 and 2015 for indicator no. 2
Student surveys  |  Online survey  |  Annual  |  2011-2016
Alumni surveys  |  Online survey  |  Annual  |  2011-2016
Employer surveys  |  Online survey  |  Annual  |  2011-2016

Target Performance (column 7): e.g., 85% satisfactory or better for indicator no. 1; 90% satisfactory or better for indicator no. 2

Results (direct measures) 2011: a short summary of the evaluation of the direct measures results
Results (indirect measures) 2011: a short summary of the evaluation of the indirect measures results
Actions 2011: a short summary of the results from the feedback for continuous improvement process

Each of the Schulich School of Engineering programs is responsible for setting its own performance targets, which then become the basis for graduate attribute assessment. Given that the School is in the early stages of graduate attributes assessment, our programs are just beginning to establish performance targets for each of the graduate attributes (more details on each program's progress in this area are provided in the program questionnaires). Each department has been encouraged to look at both its program's educational objectives and its students. For example, if a program's curriculum has been designed with an emphasis on specific areas, its performance targets should reflect this. To establish initial performance targets, the performance of recent student cohorts will be reviewed: i.e., past performance in the courses identified in the curriculum map will be used as a starting point for each graduate attribute's performance target (e.g., percentage of students who obtain a "C-" or higher).
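The arithmetic behind an initial target can be sketched as follows; the grade data and the letter-grade cutoff list are invented for illustration:

```python
# Sketch of deriving an initial performance target from a past cohort's
# grades in a mapped course: the percentage of students at "C-" or higher.
SATISFACTORY_GRADES = ["A+", "A", "A-", "B+", "B", "B-", "C+", "C", "C-"]

def initial_target(grades):
    """Percentage of the cohort achieving C- or better, rounded to 1 decimal."""
    satisfactory = sum(1 for g in grades if g in SATISFACTORY_GRADES)
    return round(100.0 * satisfactory / len(grades), 1)

# Hypothetical ten-student cohort: 8 of 10 at C- or better.
cohort = ["A", "B+", "C-", "D", "B", "C", "F", "A-", "C+", "B-"]
print(initial_target(cohort))  # 80.0
```

A target derived this way is only a starting point; programs would then adjust it to reflect their educational objectives and curriculum emphasis.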

(e) Feedback for Continuous Improvement

As shown in Figure 3, feedback for continuous improvement occurs at all stages of the graduate attributes assessment process, and as a result involves a wide range of individuals at the Schulich School of Engineering. Given that the direct and indirect assessments have only recently started (we have been collecting data since the Fall 2010 session), we are still developing our continuous improvement process. However, the systems are in place to ensure that the School's graduate attributes assessment process is refined and improved each year, and that the feedback from the evaluation stage is used to further develop and improve our B.Sc. programs.

The curriculum review systems at the Schulich School of Engineering have been in place for many years (e.g., curriculum committees, annual reviews, etc.): the graduate attributes assessment process complements existing processes by providing a positive environment for, and an enabler of, curriculum reform. At the Schulich School of Engineering, various groups and individuals are involved in this aspect of graduate attributes assessment and program development. More specifically, the feedback process involves teaching faculty, departmental curriculum committees, and the Schulich School of Engineering curriculum committee (the Engineering Undergraduate Studies Committee).

Since teaching faculty are primarily involved with teaching and learning activities in individual common core and program courses, their program development activities focus primarily on individual courses. For example, Schulich School of Engineering faculty members have been working with their departmental program assessment coordinators to develop forms of evidence for assessment: this work then provides feedback on the key indicators (e.g., are the indicators appropriate? can they be assessed in courses? etc.) and the curriculum mapping (e.g., is the mapping appropriate for the course?). As well, teaching faculty input on the reflective memo is being used as a form of self-assessment of teaching and learning in individual courses. For example, course instructors are asked to reflect on their students' performance and how the course can be improved in the context of the learning outcomes (i.e., the key indicators). This form of assessment is being used both to improve teaching and learning in Schulich School of Engineering courses, and as input to the curriculum review and development process.

As discussed in (d), the results of the direct and indirect assessments of each of the learning outcomes are summarized in assessment summaries (e.g., Figure 6). These summaries are prepared by the department and School assessment planning coordinators in collaboration with teaching faculty (e.g., using the reflective memo as a starting point), and used for program development by the departmental and Schulich School of Engineering curriculum committees. For example, the results of the graduate attributes assessments are being used by the curriculum committees to inform decision making around individual course topics, outcomes, instruction hours, etc., as well as decision making around the structure of a program's curriculum (e.g., sequencing of courses, selection of courses, etc.). More information on the departmental and School curriculum committees is provided in Section 3.4.8 of the program questionnaires and the School questionnaire respectively.

(f) The Graduate Attributes Assessment Process

As noted previously, work on the development of the Schulich School of Engineering's graduate attributes process began in Fall 2008. The planning timeline for the November 2011 CEAB visit is summarized in the following table:

Winter 2009

• Preliminary development of key indicators based on the CDIO Syllabus

Summer 2009 - Fall 2009

• Curriculum mapping in Common Core and B.Sc. in Mechanical Engineering using the Introduce-Teach-Utilize analysis

• Refinement of key indicators based on instructor interviews

Winter 2010

• CEAB Visit Planning Committee established to coordinate School-wide planning for the November 2011 site visit

• Associate Dean (Academic & Planning) meetings with engineering departments to review the accreditation process and introduce the new graduate attributes criterion

Summer 2010

• CEAB Visit Planning Committee retreat on CEAB graduate attributes planning

• Detailed planning for the Fall 2010 / Winter 2011 sessions developed (Appendix 5(a))

• Refinement of key indicators for graduate attributes 3.1.4, 3.1.6, and 3.1.7 based on discussions with the School's design and communication instructors

• Development of the graduate attributes survey


Fall 2010

• Pilot run of the capstone design self-efficacy survey in the first year design and communication course (ENGG 200)

• Refinement of the capstone design self-efficacy survey and development of the alumni and employer surveys

• Direct assessment of graduate attributes 3.1.4, 3.1.6, and 3.1.7 in courses

Winter 2011

• Direct assessment of graduate attributes 3.1.4, 3.1.6, 3.1.7, and 3.1.9 in courses

• Indirect assessment of all twelve graduate attributes using the capstone design, alumni, and employer surveys

Summer 2011

• Meetings with departments to refine the key indicators and course mappings for graduate attributes 3.1.1, 3.1.8, 3.1.10, and 3.1.12

Fall 2011

• Curriculum mapping of program courses in the Department of Chemical & Petroleum Engineering, the Department of Civil Engineering, the Department of Electrical & Computer Engineering, and the Department of Geomatics Engineering

• Identification of senior program courses for Graduate Attributes 3.1.2, 3.1.3, 3.1.5, and 3.1.11 in the Department of Mechanical & Manufacturing Engineering

• Review of the 2010/2011 assessments for Graduate Attributes 3.1.4, 3.1.6, 3.1.7, and 3.1.9 by the Assessment Coordination Team

Winter 2012

• Identification of senior program courses for Graduate Attributes 3.1.2, 3.1.3, 3.1.5, and 3.1.11 in the Department of Chemical & Petroleum Engineering, the Department of Civil Engineering, the Department of Electrical & Computer Engineering, and the Department of Geomatics Engineering

• Assessment of Graduate Attributes 3.1.2, 3.1.3, 3.1.5, and 3.1.11 in all departments

• Presentation of the 2010/2011 assessments to the Engineering Undergraduate Studies Committee by the Assessment Coordination Team; these assessments will be used to inform the Engineering Undergraduate Studies Committee’s curriculum review

To keep the ongoing graduate attributes assessment process manageable, the Schulich School of Engineering is following the multiple-year data collection plan shown in Figure 7. This plan involves collecting data on four graduate attributes per year, and results in two to three sets of direct assessments of each graduate attribute in each six-year accreditation


cycle. Data on the indirect assessment of all twelve graduate attributes (via surveys) is collected every year.

It should be noted that this data collection plan does not imply that classroom assessment is performed only in the years noted in Figure 7. Ongoing assessment in program courses remains important for pedagogical reasons; however, the data are not collected and analyzed for program assessment purposes during off-cycle years. By reducing the number of direct assessments that departmental and School curriculum committees must focus on in any given year, teaching faculty and curriculum committee members can apply a more concerted effort to the indicators, assessment tools, and curriculum review than if all twelve attributes were tackled every year.
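The arithmetic behind this rotation can be sketched in a few lines. This is only an illustrative round-robin model (the function and constant names are hypothetical, and the actual year-to-attribute assignments are those given in Figure 7, not generated by code): with twelve attributes and four assessed directly per year, a six-year cycle gives each attribute exactly two direct-assessment passes.

```python
# Illustrative sketch (hypothetical helper, not the School's process):
# 12 graduate attributes, 4 directly assessed per academic year,
# over a 6-year accreditation cycle -> 2 direct passes per attribute.

ATTRIBUTES = [f"3.1.{i}" for i in range(1, 13)]
YEARS = ["2010-11", "2011-12", "2012-13", "2013-14", "2014-15", "2015-16"]
PER_YEAR = 4

def rotation_schedule(attributes, years, per_year):
    """Assign attributes to years round-robin. Indirect (survey)
    assessment of all attributes still happens every year."""
    schedule = {}
    for i, year in enumerate(years):
        start = (i * per_year) % len(attributes)
        schedule[year] = [
            attributes[(start + j) % len(attributes)] for j in range(per_year)
        ]
    return schedule

sched = rotation_schedule(ATTRIBUTES, YEARS, PER_YEAR)
# Count direct-assessment passes per attribute over the cycle.
counts = {a: sum(a in chosen for chosen in sched.values()) for a in ATTRIBUTES}
assert all(c == 2 for c in counts.values())  # two direct passes each
```

The "two to three" in the text rather than exactly two reflects the plan's flexibility: an attribute flagged during evaluation is scheduled for extra direct assessment, as noted below.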

The data collection plan shown in Figure 7 is intended to be flexible. For example, if the evaluation phase reveals that a graduate attribute needs to be monitored more closely, the data collection plan will be adapted to reflect that need. In this case, direct assessment of the attribute will be included in the assessment plan more frequently until the curriculum development process has addressed the concern.

More details on individual graduate attribute assessment are provided in the following subsections. In this questionnaire, we focus on the key indicators and the assessment plans for each of the graduate attributes; details on the evaluation of each of the graduate attributes are provided in the program questionnaires.


Academic years: 2010-11, 2011-12, 2012-13, 2013-14, 2014-15, 2015-16

Graduate attributes:
3.1.1 A knowledge base for engineering
3.1.2 Problem analysis
3.1.3 Investigation
3.1.4 Design
3.1.5 Use of engineering tools
3.1.6 Individual and team work
3.1.7 Communication skills
3.1.8 Professionalism
3.1.9 Impact of engineering on society and environment
3.1.10 Ethics and equity
3.1.11 Economics and project management
3.1.12 Life-long learning

Notes:

1. = direct assessment in courses (ENGG 481, capstone in 10/11) and indirect assessment via surveys

2. = indirect assessment via surveys

Figure 7. Schulich School of Engineering Graduate Attributes Data Collection Plan


3.1.1 A knowledge base for engineering

Demonstrated competence in university level mathematics, natural sciences, engineering fundamentals, and specialized engineering knowledge appropriate to the program.

Graduate attribute 3.1.1 "a knowledge base for engineering" will be directly assessed in courses in the Fall 2012 / Winter 2013 terms. The key indicators and curriculum mapping for "a knowledge base for engineering" will be discussed in detail in Summer 2012; however, the following list of key indicators (based on the School's 2009/2010 graduate attributes planning) is currently being used for indirect assessments and will serve as a starting point for the 2012/2013 key indicators.

1. Standardized test(s): e.g., Force Concept Inventory, Mechanics Baseline Test.

2. Use mathematics to describe and solve engineering problems.

3. Use technical knowledge to inform engineering activities.

4. Describe a well-known experiment that proved an important scientific law.

3.1.2 Problem analysis

An ability to use appropriate knowledge and skills to identify, formulate, analyze, and solve complex engineering problems in order to reach substantiated conclusions.

Plans are currently underway to directly assess "problem analysis" in program courses in the Fall 2011 / Winter 2012 terms. The data collection plan for this graduate attribute is shown in Figure 8.

3.1.3 Investigation

An ability to conduct investigations of complex problems by methods that include appropriate experiments, analysis and interpretation of data, and synthesis of information in order to reach valid conclusions.

Plans are currently underway to directly assess "investigation" in program courses in the Fall 2011 / Winter 2012 terms. The data collection plan for this graduate attribute is shown in Figure 9.


Graduate Attribute: 3.1.2 "Problem analysis"

Methods of assessment (method – source of assessment – time of data collection – assessment coordinator), applied to each key indicator below:

• Faculty evaluations – program course – Fall or Winter
• Student surveys – online survey – Winter 2012 – ACT
• Alumni surveys – online survey – Winter 2012 – ACT
• Employer surveys – online survey – Winter 2012 – ACT

Evaluation of results: Engineering Undergraduate Studies Committee

Key indicators, each assessed in CHEM209, ENGG201, ENGG202, ENGG233, ENGG200, MATH211, PHYS259, AMAT307, ENGG225, ENGG349, ENGG407:

1. Apply engineering knowledge and skills to solve real-world problems.

2. Make assumptions that successfully simplify a complex problem.

3. Evaluate initial assumptions used to formulate a solution to a problem.

4. Elicit incomplete and ambiguous information.

5. Synthesize problem solutions and formulate summary recommendations.

6. Formulate a strategy for solving an engineering problem.

Figure 8. Data Collection Plan for 3.1.2 "Problem analysis"


Graduate Attribute: 3.1.3 "Investigation"

Methods of assessment (method – source of assessment – time of data collection – assessment coordinator), applied to each key indicator below:

• Faculty evaluations – program course – Fall or Winter
• Student surveys – online survey – Winter 2012 – ACT
• Alumni surveys – online survey – Winter 2012 – ACT
• Employer surveys – online survey – Winter 2012 – ACT

Evaluation of results: Engineering Undergraduate Studies Committee

Key indicators, each assessed in CHEM209, ENGG201, ENGG200, AMAT307, ENGG407:

1. Formulate an experimental concept and strategy to solve an engineering problem.

2. Generate a working hypothesis and strategy to test it.

3. Analyze and interpret experimental data.

4. Synthesize information to reach conclusions that are supported by data and needs.

Figure 9. Data Collection Plan for 3.1.3 "Investigation"


3.1.4 Design

An ability to design solutions for complex, open-ended engineering problems and to design systems, components or processes that meet specified needs with appropriate attention to health and safety risks, applicable standards, and economic, environmental, cultural and societal considerations.

Graduate Attribute 3.1.4 "design" was directly assessed in courses and indirectly assessed via surveys in the Fall 2010 / Winter 2011 terms. The data collection plan for common core courses is shown in Figure 10; the assessment summaries for "design" are provided in the program questionnaires.

3.1.5 Use of engineering tools

An ability to create, select, apply, adapt, and extend appropriate techniques, resources, and modern engineering tools to a range of engineering activities, from simple to complex, with an understanding of the associated limitations.

Plans are currently underway to directly assess "use of engineering tools" in program courses in the Fall 2011 / Winter 2012 terms. The data collection plan for this graduate attribute is shown in Figure 11.

3.1.6 Individual and team work

An ability to work effectively as a member and leader in teams, preferably in a multi-disciplinary setting.

Graduate Attribute 3.1.6 "individual and team work" was directly assessed in courses and indirectly assessed via surveys in the Fall 2010 / Winter 2011 terms. The data collection plan for common core courses is shown in Figure 12; the assessment summaries for "individual and team work" are provided in the program questionnaires.


Graduate Attribute: 3.1.4 "Design"

Methods of assessment (method – source of assessment – time of data collection – assessment coordinator), applied to each key indicator below:

• Faculty evaluations – capstone design – Fall & Winter – 2011: capstone design instructor
• Student surveys – ENGG200 & capstone design – Fall & Winter – 2010: W. Rosehart; 2011: ACT
• Alumni surveys – online survey – Winter – 2011: ACT
• Employer surveys – online survey – Winter – 2011: ACT

Evaluation of results: Engineering Undergraduate Studies Committee

Key indicators (with the courses in which each is assessed):

1. Elicit and interpret customer needs. (ENGG200, ENGG513, capstone design)

2. Interpret ethical, social, environmental, legal and regulatory influences. (ENGG200, ENGG513, capstone design)

3. Identify and explain system performance metrics. (ENGG200, ENGG513, capstone design)

4. Select concepts and analyze the trade-offs among and recombination of alternative concepts. (ENGG200, capstone design)

5. Decompose and assign function to elements, and define interfaces. (ENGG200, capstone design)

6. Use prototypes and test articles in design development. (ENGG200, ENGG233, capstone design)

7. Demonstrate iteration until convergence and synthesize the final design. (ENGG200, ENGG233, capstone design)

8. Demonstrate accommodation of changing requirements. (ENGG200, ENGG233, capstone design)

Figure 10. Data Collection Plan for 3.1.4 "Design"


Graduate Attribute: 3.1.5 "Use of engineering tools"

Methods of assessment (method – source of assessment – time of data collection – assessment coordinator), applied to each key indicator below:

• Faculty evaluations – program course – Fall or Winter
• Student surveys – online survey – Winter 2012 – ACT
• Alumni surveys – online survey – Winter 2012 – ACT
• Employer surveys – online survey – Winter 2012 – ACT

Evaluation of results: Engineering Undergraduate Studies Committee

Key indicators, each assessed in ENGG200, ENGG233:

1. Select the most appropriate engineering tool to accomplish a task from various alternatives.

2. Apply appropriate engineering techniques or tools to accomplish a task.

3. Adapt or extend an engineering technique to accomplish a task.

4. Evaluate the appropriateness of results from different engineering techniques and tools.

Figure 11. Data Collection Plan for 3.1.5 "Use of engineering tools"


Graduate Attribute: 3.1.6 "Individual and team work"

Methods of assessment (method – source of assessment – time of data collection – assessment coordinator), applied to each key indicator below:

• Faculty evaluations – capstone design – Fall & Winter – 2011: capstone design instructor
• Student surveys – ENGG200 & capstone design – Fall & Winter – 2010: W. Rosehart; 2011: ACT
• Alumni surveys – online survey – Winter – 2011: ACT
• Employer surveys – online survey – Winter – 2011: ACT

Evaluation of results: Engineering Undergraduate Studies Committee

Key indicators, each assessed in ENGG200, capstone design:

1. Identify the stages of team formation and life-cycle, as well as the roles and responsibilities of team members.

2. Evaluate team effectiveness and plan for improvements.

3. Plan and facilitate effective meetings.

4. Practice conflict negotiation and resolution.

5. Assume responsibility for own work and participate equitably.

6. Exercise initiative and contribute to team goal setting.

7. Demonstrate capacity for initiative and technical or team leadership while respecting others' roles.

Figure 12. Data Collection Plan for 3.1.6 "Individual and team work"


3.1.7 Communication skills

An ability to communicate complex engineering concepts within the profession and with society at large. Such ability includes reading, writing, speaking and listening, and the ability to comprehend and write effective reports and design documentation, and to give and effectively respond to clear instructions.

Graduate Attribute 3.1.7 "communication skills" was directly assessed in courses and indirectly assessed via surveys in the Fall 2010 / Winter 2011 terms. The data collection plan for common core courses is shown in Figure 13; the assessment summaries for "communication skills" are provided in the program questionnaires.

3.1.8 Professionalism

An understanding of the roles and responsibilities of the professional engineer in society, especially the primary role of protection of the public and the public interest.

Graduate attribute 3.1.8 "professionalism" will be directly assessed in courses in the Fall 2012 / Winter 2013 terms. The key indicators and curriculum mapping for "professionalism" will be discussed in detail in Summer 2012; however, the following list of key indicators (based on the School's 2009/2010 graduate attributes planning) is currently being used for indirect assessments and will serve as a starting point for the 2012/2013 key indicators.

1. Recognize and accept the goals and roles of the engineering profession.

2. Recognize and accept the responsibilities of engineers to society.

3. Recognize the way in which legal and political systems regulate and influence engineering.

4. Describe how professional societies license and set standards.

3.1.9 Impact of engineering on society and the environment

An ability to analyse social and environmental aspects of engineering activities. Such ability includes an understanding of the interactions that engineering has with the economic, social, health, safety, legal, and cultural aspects of society; the uncertainties in the prediction of such interactions; and the concepts of sustainable design and development and environmental stewardship.

Graduate Attribute 3.1.9 "impact of engineering on society and the environment" was directly assessed in the common core course ENGG 481 "Technology and Society", as shown in Exhibit 5(h), and indirectly assessed via surveys in the Fall 2010 / Winter 2011 terms. The data collection plan for common core courses is shown in Figure 14; the assessment summaries for "impact of engineering on society and the environment" are provided in the program questionnaires.


Graduate Attribute: 3.1.7 "communication skills"

1 2 3 4 5 6 7

Key Indicators Courses Method(s) of

Assessment

Source of

Assessment

Time of Data

Collection

Assessment

Coordinator

Evaluation of

Results

Faculty evaluations capstone design Fall & Winter 2011 – capstone design instructor

Student surveys ENGG200 & capstone design

Fall & Winter 2010 – W. Rosehart 2011 – ACT

Alumni surveys Online survey Winter 2011 – ACT

1. Construct logical and persuasive arguments.

COMS363, ENGG201, ENGG200, ENGG225, PHYS369, ENGG481, ENGG513

Employer surveys Online survey Winter 2011 – ACT

Engineering Undergraduate Studies Committee

Faculty evaluations capstone design Fall & Winter 2011 – capstone design instructor

Student surveys ENGG200 & capstone design

Fall & Winter 2010 – W. Rosehart 2011 – ACT

Alumni surveys Online survey Winter 2011 – ACT

2. Practice conciseness, crispness, precision and clarity of language.

COMS363, ENGG201, ENGG200, ENGG225, PHYS369, ENGG481, ENGG513

Employer surveys Online survey Winter 2011 – ACT

Engineering Undergraduate Studies Committee

Figure 13. Data Collection Plan for 3.1.7 "communication skills"

Page 32: 3.1 Graduate attributes

Questionnaire for the Schulich School of Engineering

44

Graduate Attribute: 3.1.7 "communication skills"

1 2 3 4 5 6 7

Key Indicators Courses Method(s) of

Assessment

Source of

Assessment

Time of Data

Collection

Assessment

Coordinator

Evaluation of

Results

Faculty evaluations capstone design Fall & Winter 2011 – capstone design instructor

Student surveys ENGG200 & capstone design

Fall & Winter 2010 – W. Rosehart 2011 – ACT

Alumni surveys Online survey Winter 2011 – ACT

3. Demonstrate writing with coherence and flow.

COMS363, ENGG201, ENGG225, ENGG349, ENGG481, capstone design

Employer surveys Online survey Winter 2011 – ACT

Engineering Undergraduate Studies Committee

Faculty evaluations capstone design Fall & Winter 2011 – capstone design instructor

Student surveys ENGG200 & capstone design

Fall & Winter 2010 – W. Rosehart 2011 – ACT

Alumni surveys Online survey Winter 2011 – ACT

4. Practice writing with correct spelling, punctuation and grammar

COMS363, ENGG201, ENGG225, ENGG349, ENGG481, capstone design

Employer surveys Online survey Winter 2011 – ACT

Engineering Undergraduate Studies Committee

Figure 13. Data Collection Plan for 3.1.7 "communication skills" (continued)

Page 33: 3.1 Graduate attributes

Questionnaire for the Schulich School of Engineering

45

Graduate Attribute: 3.1.7 "communication skills"

1 2 3 4 5 6 7

Key Indicators Courses Method(s) of

Assessment

Source of

Assessment

Time of Data

Collection

Assessment

Coordinator

Evaluation of

Results

Faculty evaluations Capstone design Fall & Winter 2011 – capstone design instructor

Student surveys ENGG200 & capstone design

Fall & Winter 2010 – W. Rosehart 2011 – ACT

Alumni surveys Online survey Winter 2011 – ACT

5. Apply various written styles (informal, formal, memos, reports, etc.)

COMS363, ENGG201, ENGG225, ENGG349, ENGG481, capstone design

Employer surveys Online survey Winter 2011 – ACT

Engineering Undergraduate Studies Committee

Faculty evaluations Capstone design Fall & Winter 2011 – capstone design instructor

Student surveys ENGG200 & capstone design

Fall & Winter 2010 – W. Rosehart 2011 – ACT

Alumni surveys Online survey Winter 2011 – ACT

6. Demonstrate sketching and drawing.

ENGG200, capstone design

Employer surveys Online survey Winter 2011 – ACT

Engineering Undergraduate Studies Committee

Figure 13. Data Collection Plan for 3.1.7 "communication skills" (continued)

Page 34: 3.1 Graduate attributes

Questionnaire for the Schulich School of Engineering

46

Graduate Attribute: 3.1.7 "communication skills"

1 2 3 4 5 6 7

Key Indicators Courses Method(s) of

Assessment

Source of

Assessment

Time of Data

Collection

Assessment

Coordinator

Evaluation of

Results

Faculty evaluations Capstone design Fall & Winter 2011 – capstone design instructor

Student surveys ENGG200 & capstone design

Fall & Winter 2010 – W. Rosehart 2011 – ACT

Alumni surveys Online survey Winter 2011 – ACT

7. Demonstrate construction of tables, graphs, and charts.

AMAT217, AMAT219, ENGG233, ENGG200, MATH211, PHYS259, CHEM357, ENGG311, ENGG225, ENGG349, PHYS369, capstone design

Employer surveys Online survey Winter 2011 – ACT

Engineering Undergraduate Studies Committee

Faculty evaluations Capstone design Winter 2011 – capstone design instructor

Student surveys ENGG200 & capstone design

Fall & Winter 2010 – W. Rosehart 2011 – ACT

Alumni surveys Online survey Winter 2011 – ACT

8. Interpret formal technical drawings and renderings.

Capstone design

Employer surveys Online survey Winter 2011 – ACT

Engineering Undergraduate Studies Committee

Figure 13. Data Collection Plan for 3.1.7 "communication skills" (continued)

Page 35: 3.1 Graduate attributes

Questionnaire for the Schulich School of Engineering

47

Graduate Attribute: 3.1.7 "communication skills"

1 2 3 4 5 6 7

Key Indicators Courses Method(s) of

Assessment

Source of

Assessment

Time of Data

Collection

Assessment

Coordinator

Evaluation of

Results

9. Deliver clear and organized formal presentations following established guidelines.
Courses: CHEM209, ENGG200, ENGG225, ENGG481, capstone design
Assessment: faculty evaluations (capstone design; Fall & Winter 2011; capstone design instructor); student surveys (ENGG200 & capstone design; Fall & Winter 2010/2011; W. Rosehart / ACT); alumni surveys (online survey; Winter 2011; ACT); employer surveys (online survey; Winter 2011; ACT)
Evaluation of results: Engineering Undergraduate Studies Committee

10. Use appropriate referencing to cite previous work.
Courses: COMS363, ENGG201, ENGG225, ENGG349, ENGG481, capstone design
Assessment: faculty evaluations (capstone design; Winter 2011; capstone design instructor); student surveys (ENGG200 & capstone design; Fall & Winter 2010/2011; W. Rosehart / ACT); alumni surveys (online survey; Winter 2011; ACT); employer surveys (online survey; Winter 2011; ACT)
Evaluation of results: Engineering Undergraduate Studies Committee

Figure 13. Data Collection Plan for 3.1.7 "communication skills" (continued)


11. Adapt format, content, organization, and tone for various audiences.
Courses: COMS363, ENGG201, ENGG225, ENGG349, ENGG481, capstone design
Assessment: faculty evaluations (capstone design; Fall & Winter 2011; capstone design instructor); student surveys (ENGG200 & capstone design; Fall & Winter 2010/2011; W. Rosehart / ACT); alumni surveys (online survey; Winter 2011; ACT); employer surveys (online survey; Winter 2011; ACT)
Evaluation of results: Engineering Undergraduate Studies Committee

Figure 13. Data Collection Plan for 3.1.7 "communication skills" (continued)

Graduate Attribute: 3.1.9 "Impact of engineering on society and the environment"
Columns: (1) Key Indicators; (2) Courses; (3) Method(s) of Assessment; (4) Source of Assessment; (5) Time of Data Collection; (6) Assessment Coordinator; (7) Evaluation of Results

1. Analyze the impact of engineering on the environment, social, knowledge, and economic systems in modern culture.
Courses: ENGG200, ENGG481, ENGG513, capstone design
Assessment: faculty evaluations (ENGG481; Winter 2011; M. Eggermont); student surveys (ENGG200 & capstone design; Fall & Winter 2010/2011; W. Rosehart / ACT); alumni surveys (online survey; Winter 2011; ACT); employer surveys (online survey; Winter 2011; ACT)
Evaluation of results: Engineering Undergraduate Studies Committee

2. Describe the important contemporary political, social, legal, and environmental issues and values.
Courses: ENGG209, ENGG481, capstone design
Assessment: faculty evaluations (ENGG481; Winter 2011; M. Eggermont); student surveys (ENGG200 & capstone design; Fall & Winter 2010/2011; W. Rosehart / ACT); alumni surveys (online survey; Winter 2011; ACT); employer surveys (online survey; Winter 2011; ACT)
Evaluation of results: Engineering Undergraduate Studies Committee

Figure 14. Data Collection Plan for 3.1.9 "Impact of engineering on society and the environment"


3. Define the processes by which contemporary values are set, and one's role in these processes.
Courses: ENGG209, ENGG481, capstone design
Assessment: faculty evaluations (ENGG481; Winter 2011; M. Eggermont); student surveys (ENGG200 & capstone design; Fall & Winter 2010/2011; W. Rosehart / ACT); alumni surveys (online survey; Winter 2011; ACT); employer surveys (online survey; Winter 2011; ACT)
Evaluation of results: Engineering Undergraduate Studies Committee

Figure 14. Data Collection Plan for 3.1.9 "Impact of engineering on society and the environment" (continued)


3.1.10 Ethics and equity

An ability to apply professional ethics, accountability, and equity.

Graduate attribute 3.1.10 "ethics and equity" will be directly assessed in courses in the Fall 2012 / Winter 2013 terms. The key indicators and curriculum mapping for "ethics and equity" will be discussed in detail in Summer 2012; however, the following list of key indicators (based on the School's 2009/2010 graduate attributes planning) is currently being used for indirect assessments and will serve as a starting point for the 2012/2013 key indicators.

1. Demonstrate an ability to make informed ethical choices.

2. Demonstrate knowledge of a professional code of ethics.

3. Evaluate the ethical dimensions of professional and scientific practice.

4. Demonstrate ethical practice.

3.1.11 Economics and project management

An ability to appropriately incorporate economics and business practices including project, risk, and change management into the practice of engineering and to understand their limitations.

Plans are currently underway to directly assess "economics and project management" in common core and program courses in the Fall 2011 / Winter 2012 terms. The data collection plan for common core courses is shown in Figure 15.

3.1.12 Life-long learning

An ability to identify and to address their own educational needs in a changing world in ways sufficient to maintain their competence and to allow them to contribute to the advancement of knowledge.

Graduate attribute 3.1.12 "life-long learning" will be directly assessed in courses in the Fall 2012 / Winter 2013 terms. The key indicators and curriculum mapping for "life-long learning" will be discussed in detail in Summer 2012; however, the following list of key indicators (based on the School's 2009/2010 graduate attributes planning) is currently being used for indirect assessments and will serve as a starting point for the 2012/2013 key indicators.

1. Reflect on one’s skills, interests, strengths, and weaknesses.

2. Describe one’s own learning style.

3. Describe the importance of developing relationships with mentors.


Graduate Attribute: 3.1.11 "Economics and project management"
Columns: (1) Key Indicators; (2) Courses; (3) Method(s) of Assessment; (4) Source of Assessment; (5) Time of Data Collection; (6) Assessment Coordinator; (7) Evaluation of Results

1. Apply the concept of the time value of money to engineering projects.
Courses: ENGG200, ENGG209
Assessment: faculty evaluations (ENGG209; Fall or Winter; ENGG209 instructor); student surveys (online survey; Winter 2012; ACT); alumni surveys (online survey; Winter 2012; ACT); employer surveys (online survey; Winter 2012; ACT)
Evaluation of results: Engineering Undergraduate Studies Committee

2. Recognize the role of financial planning and capital budgeting in engineering projects.
Courses: ENGG200, ENGG209
Assessment: faculty evaluations (ENGG209; Fall or Winter; ENGG209 instructor); student surveys (online survey; Winter 2012; ACT); alumni surveys (online survey; Winter 2012; ACT); employer surveys (online survey; Winter 2012; ACT)
Evaluation of results: Engineering Undergraduate Studies Committee

3. Describe project control for cost, performance, and schedule.
Courses: ENGG200, capstone design
Assessment: faculty evaluations (capstone design; Fall or Winter; capstone design instructor); student surveys (online survey; Winter 2012; ACT); alumni surveys (online survey; Winter 2012; ACT); employer surveys (online survey; Winter 2012; ACT)
Evaluation of results: Engineering Undergraduate Studies Committee

Figure 15. Data Collection Plan for 3.1.11 "Economics and project management"


4. Discuss the estimation and allocation of resources in engineering projects.
Courses: ENGG200, capstone design
Assessment: faculty evaluations (capstone design; Fall or Winter; capstone design instructor); student surveys (online survey; Winter 2012; ACT); alumni surveys (online survey; Winter 2012; ACT); employer surveys (online survey; Winter 2012; ACT)
Evaluation of results: Engineering Undergraduate Studies Committee

5. Identify risks and alternatives in engineering projects.
Courses: ENGG200, capstone design
Assessment: faculty evaluations (capstone design; Fall or Winter; capstone design instructor); student surveys (online survey; Winter 2012; ACT); alumni surveys (online survey; Winter 2012; ACT); employer surveys (online survey; Winter 2012; ACT)
Evaluation of results: Engineering Undergraduate Studies Committee

Figure 15. Data Collection Plan for 3.1.11 "Economics and project management" (continued)


Exhibit 5: Graduate Attributes Assessment Resources

Exhibit 5(a): Assessment Coordination Team Terms of Reference


Assessment Coordination Team (ACT) – Approved by the Dean's Executive Committee, 14 September 2011

University of Calgary – Schulich School of Engineering

Assessment Coordination Team (ACT)

TERMS OF REFERENCE

Mandate

The mandate of the Assessment Coordination Team (ACT) is to coordinate CEAB (Canadian Engineering Accreditation Board) graduate attributes assessment for the Schulich School of Engineering's undergraduate programs. ACT serves in an advisory role to the Dean's Executive Committee and reports to the Engineering Undergraduate Studies Committee and the Engineering Internship Program Standing Committee on program outcomes assessment planning and results.

The issues dealt with by ACT include, but are not limited to, the Roles and Responsibilities listed below.

Guidelines

1. The Associate Dean (Academic and Planning) is an ex-officio voting member of the Team and serves as Chair.

2. The Associate Dean (Teaching and Learning) is an ex-officio voting member of the Team.

3. The Team includes one representative from each of the five Departments, nominated by their Department Head and approved by the School's Striking Committee; Departmental representatives also serve as Assessment Coordinators for their Departments.

4. The Team also includes one representative from the Engineering Career Centre.

5. The Team meets at least two times per year: (1) to review past Fall and Winter session graduate attributes assessment results in the Spring session, and (2) to plan graduate attributes assessments for the upcoming Fall and Winter sessions.

Roles and Responsibilities

1. Coordinate graduate attribute assessments for Schulich School of Engineering B.Sc. programs.

2. Evaluate graduate attribute assessments for Schulich School of Engineering B.Sc. programs.

3. Support teaching faculty and Departmental Assessment Coordinators with assessment tools and techniques.

4. Support teaching faculty and Departmental Assessment Coordinators with graduate attributes assessment training activities.


Exhibit 5: Graduate Attributes Assessment Resources

Exhibit 5(b): CEAB Graduate Attributes Planning – Fall 2010


Schulich School of Engineering

CEAB Graduate Attribute Planning Fall 2010

R.W. Brennan 19 August 2010


Table of Contents

Description Page

Executive Summary 1

1. Graduate Attributes Planning 2

1.1 Graduate Attributes 3

1.2 Performance Indicators 4

1.3 Educational Practices/Strategies 7

1.4 Assessment: Collection of Evidence 8

1.5 Evaluation: Collection and Analysis of Evidence 13

1.6 Feedback for Continuous Improvement 16

1.7 Graduate Attributes Assessment Timeline 17

2. Example – B.Sc. in Mechanical Engineering 21

2.1 Program Educational Objectives and Student Outcomes 22

2.2 Performance Indicators and Course Mapping 23

2.3 Final Words on the CDIO Syllabus 26

References 26


List of Tables

Page

Table 1. CEAB Graduate Attributes 3

Table 2. Forms of Evidence 9

Table 3. A Typical Data Collection Plan 18

Table 4. Graduate Attribute Assessment Plan 20

Table 5. The CDIO Syllabus and Program Educational Objectives 23

List of Figures

Page

Figure 1. Graduate Attribute Planning 2

Figure 2. Graduate Attributes and Performance Indicators 6

Figure 3. Example of a Curriculum Mapping for “Communication Skills” 7

Figure 4. Collection of Evidence 10

Figure 5. Analytic Rubric for Writing Skills 11

Figure 6. Assessment Plan for “Communication Skills” 12

Figure 7. Assessment Plan and Analysis for “Communication Skills” 15

Figure 8. A Typical Data Collection Timeline 18

Figure 9. A Detail of the Expanded CDIO Syllabus 21

Figure 10. Graduate Attribute Planning and the CDIO Syllabus 22

Figure 11. B.Sc. in Mechanical Engineering Objectives and Outcomes 24

Figure 12. Student Outcomes / Graduate Attributes Mapping for the B.Sc. in Mechanical Engineering Program 25

Executive Summary

The planning process runs Graduate Attributes → Performance Indicators → Educational Strategies → Assessment → Evaluation, with feedback closing the loop:

Graduate Attributes
• What? Defined by the CEAB: what students are able to do by the time of graduation.
• When? Not a basis of accreditation until June 2014.
• 2010/11? Must have a process in place for the visit; desirable to have some experience by the visit.

Performance Indicators
• What? A refinement of the attributes: meaningful, measurable, and limited to "key performance indicators".
• When? Fall 2010; essential for assessment.
• 2010/11? Key performance indicators required in Fall 2010; must be in place for the report.

Educational Strategies
• What? How/where the attributes are demonstrated; typically a curriculum mapping to the performance indicators.
• When? Fall 2010; essential for assessment.
• 2010/11? Curriculum map required no later than the end of the Fall 2010 term; must be in place for the report.

Assessment
• What? Measurement of the performance indicators; direct: typically in-class; indirect: exit interviews and surveys.
• When? Winter 2011 to Fall 2011; unrealistic to have all assessments for the report.
• 2010/11? Should have some key assessments, in the report or at the time of the visit.

Evaluation
• What? Using the assessments to determine whether the attributes are being met; requires setting performance targets.
• When? Fall 2011; unrealistic to have all evaluations for the report.
• 2010/11? Should have targets for the report; ideally, some key evaluations by the visit.

SSE Graduate Attributes Planning page 1 of 27


“It's hard to lead a cavalry charge if you think you look funny on a horse.”

Adlai E. Stevenson

Although outcomes-based assessment is a well-established component of many national engineering accreditation systems (e.g., ABET), it is new in the Canadian context. This is not to say that outcomes-based assessment is not practiced in Canada: other national accreditation boards (e.g., in medicine) have relied on it for years, and many of our colleagues use it as part of their teaching and learning strategies. However, there is very little experience with outcomes-based assessment at the engineering program level in Canada.

Despite this, as leaders of the Schulich School of Engineering’s accreditation process we are required to do just that: we will be expected to develop outcomes-based assessment plans for each of our programs and lead our colleagues through the process. This document is intended to help with this aspect of the School’s 2010/2011 accreditation cycle. More specifically, the objectives of this document are (1) to provide background on graduate attribute assessment, and (2) to provide a general planning framework that can be used by all of the School’s departments.

1. Graduate Attributes Assessment

A draft plan for graduate attribute assessment is shown in Figure 1. This section provides a brief description of each aspect of the plan as well as a proposed timeline for the process.

Figure 1. Graduate Attributes Planning


1.1. Graduate Attributes

Since 2005, the CEAB (Canadian Engineering Accreditation Board) has been working to update its criteria in order to move towards a model that emphasizes continuous improvement, and more specifically, program outcomes. Although the new "graduate attributes" requirement, introduced in Section 3.1 of the CEAB's 2008 Accreditation Criteria and Procedures [1], does not officially come into effect until 2014 (i.e., after one accreditation cycle), the CEAB expects institutions to "demonstrate compliance with this criterion" during the current accreditation cycle [2].

The graduate attribute criterion states that institutions must demonstrate that the graduates of a program possess the attributes listed in Table 1. It is important to note that these attributes will be interpreted in the context of candidates at the time of graduation; as well, it is recognized that graduates will continue to build on the foundations that their engineering education has provided.

Table 1. CEAB Graduate Attributes [2]

3.1.1 A knowledge base for engineering: Demonstrated competence in university level mathematics, natural sciences, engineering fundamentals, and specialized engineering knowledge appropriate to the program.

3.1.2 Problem analysis: An ability to use appropriate knowledge and skills to identify, formulate, analyze, and solve complex engineering problems in order to reach substantiated conclusions.

3.1.3 Investigation: An ability to conduct investigations of complex problems by methods that include appropriate experiments, analysis and interpretation of data, and synthesis of information in order to reach valid conclusions.

3.1.4 Design: An ability to design solutions for complex, open-ended engineering problems and to design systems, components or processes that meet specified needs with appropriate attention to health and safety risks, applicable standards, and economic, environmental, cultural and societal considerations.

3.1.5 Use of engineering tools: An ability to create, select, apply, adapt, and extend appropriate techniques, resources, and modern engineering tools to a range of engineering activities, from simple to complex, with an understanding of the associated limitations.

3.1.6 Individual and team work: An ability to work effectively as a member and leader in teams, preferably in a multi-disciplinary setting.

3.1.7 Communication skills: An ability to communicate complex engineering concepts within the profession and with society at large. Such ability includes reading, writing, speaking and listening, and the ability to comprehend and write effective reports and design documentation, and to give and effectively respond to clear instructions.

3.1.8 Professionalism: An understanding of the roles and responsibilities of the professional engineer in society, especially the primary role of protection of the public and the public interest.

3.1.9 Impact of engineering on society and the environment: An ability to analyze social and environmental aspects of engineering activities. Such ability includes an understanding of the interactions that engineering has with the economic, social, health, safety, legal, and cultural aspects of society, the uncertainties in


the prediction of such interactions; and the concepts of sustainable design and development and environmental stewardship.

3.1.10 Ethics and equity: An ability to apply professional ethics, accountability, and equity.

3.1.11 Economics and project management: An ability to appropriately incorporate economics and business practices including project, risk, and change management into the practice of engineering and to understand their limitations.

3.1.12 Life-long learning: An ability to identify and to address their own educational needs in a changing world in ways sufficient to maintain their competence and to allow them to contribute to the advancement of knowledge.

As noted, the CEAB is moving towards a model that emphasizes continuous improvement. As such, the CEAB notes that there must be processes in place that demonstrate that program outcomes are being assessed in the context of the attributes listed in Table 1, and that the results are applied to the further development of the program.

The CEAB recognizes that universities are presently developing methods of evaluating the success of achieving the attributes. For this reason, the attributes will not form a basis for accreditation decisions until June 2014. As a result, CEAB program visitors are required to examine and report on the evidence presented by the program to demonstrate how it complies with each graduate attribute criterion; however, no ratings of "Acceptable", "Marginal", or "Unacceptable" will be assigned to the graduate attribute criteria. Instead, visiting teams are asked to comment on

1) how the institution is working towards developing this attribute throughout the program; and,

2) how the institution is using the information collected to improve the preparation of engineering graduates.

The methods of evaluation identified to date are based on a review of individual course materials (i.e., course binders): textbooks and supporting materials; samples of graded student work and examinations; samples of anonymous student transcripts; student design reports; models or equipment constructed by students; laboratory manuals; faculty information; etc.

For the Fall 2011 visit, the Schulich School of Engineering will also be expected to provide additional information to explicitly address the twelve graduate attributes: e.g., work term experience/reports, employer surveys, exit interviews, and graduate surveys. In the following sections, we look at how this requirement can be addressed.

1.2. Performance Indicators

Ultimately, we are required to demonstrate that graduates of each of our programs possess the attributes listed in Table 1. To accomplish this, the graduate attributes must be written in a format that is acceptable, understood, meaningful and measurable. For example, the attributes should be acceptable within the context of the program's educational objectives; they should be understood and meaningful to those involved in the assessment (e.g., faculty, students, alumni); and there should be a means of obtaining evidence to determine if the outcome has been achieved (e.g., via a direct assessment such as an exam or via an indirect measure such as a survey question).


In order to get to this point, each CEAB graduate attribute should be refined into a number of performance indicators that have the following two essential parts:

(1) content reference: subject content that is the focus of instruction (e.g., steps of the design process, chemical reaction, scientific method)

(2) action verb: directs students to a specific performance (e.g., "list", "analyze", "apply", etc.)

For example, CEAB graduate attribute 3.1.7 “Communication Skills” is stated as follows:

An ability to communicate complex engineering concepts within the profession and with society at large. Such ability includes reading, writing, speaking and listening, and the ability to comprehend and write effective reports and design documentation, and to give and effectively respond to clear instructions.

Arguably, this graduate attribute is easily understood and meaningful; however, it should be refined so that (1) it aligns with the program's educational objectives, and (2) it can be measured. First, the "communication skills" graduate attribute refers to oral and written communication, but it does not refer to graphical communication: the Schulich School of Engineering clearly values graphical communication, given the choices made in designing its common core curriculum (i.e., graphical communication is a key aspect of ENGG 200). Second, it would be very difficult to identify the type of evidence required to demonstrate this graduate attribute given the wide range of communication skills that it covers. Examples of potential performance indicators (expressed in the form of learning outcomes) are as follows:

• Demonstrate writing with coherence and flow;

• Practice writing with correct spelling, punctuation and grammar;

• Demonstrate sketching and drawing;

• etc.

In this case, evidence could be in the form of a design report submitted as a course requirement, exit interview questions (e.g., “How would you assess your ability to …?”), and/or alumni survey questions.

This leads to a number of questions – foremost of which are:

• How do we develop appropriate performance indicators for each graduate attribute?

• How many performance indicators are needed to demonstrate that graduates possess the desired attribute?

Since we will be assessing the attributes of graduates of our programs (either directly or indirectly), it is important that the performance indicators work for our programs, our students, our faculty, etc. As a result, we should develop our own (program-specific) performance indicators from the CEAB graduate attributes.

However, given the broad scope of many of the CEAB’s graduate attributes, it is possible to come up with a very large number of performance indicators. The risks here are that the process becomes unmanageable from an assessment point of view and that we (and the CEAB program visitors) become overwhelmed with data to the extent where valuable information is lost (or buried).


A useful analogy here is the "economic indicator". To deal with economic complexity, economists rely on key indicators such as the unemployment rate, housing starts, and the consumer price index, which serve as significant measures of the economy's current state and predictors of future trends.

A similar approach can be used for graduate attribute assessment to make this process more coherent and manageable. In other words, programs should identify a set of key performance indicators that capture the most important aspects of each of the CEAB’s graduate attributes. In most cases this should result in a relatively small number of performance indicators per graduate attribute (e.g., ABET assessment experts recommend as few as 3 to 4 performance indicators per student outcome [3]). The process of refining graduate attributes is illustrated in Figure 2.

Figure 2. Graduate Attributes and Performance Indicators

Developing a set of key performance indicators for each CEAB graduate attribute will take a considerable amount of time and effort, and should involve various faculty members with interest and expertise in all of the areas covered in Table 1. For example, a group of faculty members with interest/expertise in "communication skills" could be brought together to develop a short list of key performance indicators for graduate attribute 3.1.7. One approach to this group work that has proven useful at U.S. institutions involved in ABET outcomes-based assessment [4] is the affinity process (described in [5]), a brainstorming technique that facilitates clustering a large number of ideas into groups of similar solutions.

Rather than starting from scratch, programs may decide to build on existing work on learning outcomes, which can then be refined into sets of key performance indicators (e.g., the CDIO Syllabus [6]). In this case, programs do not have to “reinvent the wheel” to generate performance indicators (they can start with an existing list), but they do have to map the existing list to the CEAB graduate attributes and narrow the list down to a set of key performance indicators.


1.3. Educational Practices/Strategies

Once a set of key performance indicators is developed for each of the CEAB graduate attributes, the assessment work can begin.

As noted in the previous section, assessment can occur directly (i.e., within courses) and indirectly (i.e., through surveys and interviews). "Educational practices/strategies" relates to the first type of assessment, and more specifically, to where these assessments will occur. For example, given the graduate attributes and performance indicators shown in Figure 2, this step involves identifying which parts of the program (e.g., which courses) have learning outcomes associated with the "communication skills" graduate attribute, and where it is most appropriate to collect evidence that students can demonstrate these skills. In other words, which courses are designed in a way that lets students demonstrate the outcomes/attributes? Figure 3 provides an example of this for the B.Sc. in Mechanical Engineering program.

Figure 3. Example of a Curriculum Mapping for “Communication Skills”

It is important to recall that the CEAB graduate attributes, by definition, must be interpreted in the context of candidates at the time of graduation. In some cases, attributes may be acquired in a single course (e.g., applying engineering economic principles); however, in most cases attributes are introduced in junior courses, then practiced and built on in senior courses. For example, in Figure 3 COMS363 "Professional and Technical Communication" is concerned primarily with oral and written communication and, arguably, focuses on many of the outcomes described in CEAB graduate attribute 3.1.7. However, one could also argue that this course is introductory in nature, and as such, is intended to provide students with the communication skills that they will practice and develop throughout their engineering program. As a result, the true measure of a graduate's communication skills is an assessment of written, oral, and graphical work at the end of her/his program, such as in the capstone design course, ENME 538 "Mechanical Engineering Design Methodology and Application".

Figure 3 reflects this view of in-class assessment using two categories:


(1) Formative assessment: the assessment is used to promote learning; in the context of the classroom, this would be in the form of feedback to students; in the context of program assessment, the results can be used to see how students’ knowledge, skills and behaviours develop throughout a program of study;

(2) Summative assessment: this is an assessment of student learning at a certain point in time; in the context of the classroom, this may take the form of a final exam or final project; in the context of program assessment, the assessment can be used to demonstrate student learning at the time of graduation.

Although formative assessment is important for curriculum review, we are primarily concerned with summative assessment for graduate attribute assessment: i.e., the CEAB wants us to report on attributes at the time of graduation.

A common technique that is used to generate curriculum maps of the type shown in Figure 3 is to survey the faculty who are responsible for a program’s courses to determine if their course learning outcomes correspond to the performance indicators identified in the previous step. For example, the survey could list all of the program’s performance indicators and ask faculty to indicate which performance indicators apply to their course(s). The CDIO’s introduce-teach-utilize (ITU) analysis [7] is an example of this approach that has the added benefit of requiring faculty to think about whether the outcome is introduced, taught or utilized in their course, and as a result, whether or not a formal assessment occurs.
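The survey-and-mapping step described above is, in effect, a bookkeeping exercise: gather per-course responses, then invert them into a per-indicator curriculum map. A minimal sketch of that inversion follows; the course numbers, indicator wording, and I/T/U labels are illustrative assumptions, not actual survey data.

```python
from collections import defaultdict

# Hypothetical survey data: each instructor reports which performance
# indicators their course addresses and whether each is Introduced (I),
# Taught (T), or Utilized (U). These entries are placeholders, not the
# School's actual survey results.
survey_responses = {
    "COMS363": {"write with coherence and flow": "T"},
    "ENGG200": {"write with coherence and flow": "I",
                "demonstrate sketching and drawing": "T"},
    "ENME538": {"write with coherence and flow": "U",
                "demonstrate sketching and drawing": "U"},
}

def build_curriculum_map(responses):
    """Invert per-course responses into indicator -> {course: I/T/U}."""
    curriculum_map = defaultdict(dict)
    for course, indicators in responses.items():
        for indicator, level in indicators.items():
            curriculum_map[indicator][course] = level
    return dict(curriculum_map)

curriculum_map = build_curriculum_map(survey_responses)
for indicator, courses in sorted(curriculum_map.items()):
    # Courses marked "U" (utilized) are natural sites for summative,
    # end-of-program assessment of the indicator.
    summative = [c for c, level in courses.items() if level == "U"]
    print(f"{indicator}: {courses} | summative candidates: {summative}")
```

The I/T/U labels make the formative/summative distinction visible at a glance: indicators that are only ever "introduced" flag a curriculum gap, while "utilized" sites in senior courses are where summative evidence can be collected.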

1.4. Assessment: Collection of Evidence

Before looking at how evidence is collected, it should be noted that there is a fundamental difference between the assessment approaches that are used for the CEAB's curriculum content criteria (Section 3.3 of [2]) and the CEAB's graduate attribute criterion (Section 3.1 of [2]).

• Curriculum Content: The CEAB sets minimum curriculum component levels and states that “all students must meet all curriculum content criteria” (i.e., the “minimum path” requirement). Evidence is presented in the form of curriculum tables, and samples of student transcripts are used to confirm that students are following the curriculum.

• Graduate Attributes: The CEAB views graduate attributes assessment in the context of continuous improvement. As a result, each program sets its own performance targets, collects evidence of achievement, and uses the results for curriculum improvement (as illustrated in Figure 1). Like all continuous improvement processes, it is unrealistic to expect that all students meet all of the performance targets: otherwise, there is no room for improvement! It is also unreasonable to expect that evidence will be collected on all graduates: e.g., we cannot expect to get a 100% response rate from an alumni survey or even expect that an in-class assessment will capture all of a program’s Spring 2011 graduates. As a result, evidence will be in the form of appropriate samples of student work, interview responses, survey responses, etc.

This part of the process involves both the identification of forms of evidence of student learning, and the establishment of levels of student achievement. The basis for this work is the performance indicators from the previous section: i.e., evidence should be collected on each performance indicator.

SSE Graduate Attributes Planning page 8 of 27

Page 56: 3.1 Graduate attributes

The first questions that follow from this are: what are appropriate forms of evidence of student achievement, and how do we collect that evidence for a given performance indicator? As noted previously, graduate attribute assessment takes two forms: direct and indirect assessment. Table 2 provides examples of assessment techniques that can be used in each of these categories.

Table 2. Forms of Evidence

Direct: Exit and other interviews; Standardized exams; Locally developed exams; Portfolios; Simulations; Performance appraisal; External examiner; Oral exams; Behavioural observations

Indirect: Written surveys and questionnaires; Exit and other interviews; Archival records; Focus groups

It should be noted that whether a particular assessment method is direct or indirect depends on the nature of what is being measured and how the method is being used. A description of each of the forms of evidence listed in Table 2 is beyond the scope of this document – please refer to [8] for in-class assessment and [9] for surveys.

At this stage of the process, specific courses will need to be identified for direct assessment (using the curriculum maps described previously), and decisions will need to be made about the forms of indirect assessment that will be used. It is best to identify at least two or three forms of evidence for each performance indicator in order to check that the results align and, if not, to provide feedback to refine the measures (i.e., triangulation of results).

As noted previously, a sampling approach should be followed. For example, a representative sample of graduating students can be given exit interviews in their final year of study, an alumni survey can be used provided that enough responses are received to reach conclusions about the results (e.g., 90% confidence interval), and in-class, summative assessments can be given to classes with representative numbers of students within a cohort (e.g., a project report in a core course). Figure 4 illustrates the link between CEAB graduate attributes, performance indicators, and evidence.
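As an illustration of the sampling consideration, the sketch below computes a normal-approximation confidence interval for a survey proportion. The response counts and the `proportion_ci` helper are hypothetical, not drawn from the School’s data:

```python
import math

def proportion_ci(successes, n, z=1.645):
    """Normal-approximation confidence interval for a sample proportion.
    z = 1.645 corresponds to a 90% confidence level."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return (max(0.0, p - margin), min(1.0, p + margin))

# Hypothetical example: 104 of 120 respondents rate graduates
# "satisfactory" or higher on a performance indicator.
low, high = proportion_ci(104, 120)
```

A narrow interval at the desired confidence level suggests the response count is large enough to support conclusions about the cohort.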


Figure 4. Collection of Evidence

The next question that follows from this is: how do we set the performance targets for a program’s performance indicators? We will look at direct assessment and indirect assessment separately to address this question.

For direct assessment (i.e., in-class assessment), a level of achievement will first have to be established by the course instructor(s). For example, in the case of a technical skill, the instructor may administer an exam and equate a given percentage with satisfactory achievement of the performance indicator; in the case of professional skills like communication and teamwork, the instructor may develop a rubric [8] with written descriptions for each level of achievement. An example of this second form of assessment is shown in Figure 5: the “meets standard” level describes what is required to satisfy the outcome/performance indicator.


Figure 5. Analytic Rubric for Writing Skills (from [10])

For indirect assessment (e.g., surveys and exit interviews), responses may be binary in nature (e.g., “Can graduates of the … program write with coherence and flow?”) or require a choice to be made (e.g., “How would you rate our graduates’ ability to write with coherence and flow?”). Like direct assessment, one can set the thresholds for demonstrating the performance indicator: e.g., “Yes” in the case of a binary decision, “Satisfactory” in the case of a rating.

A second aspect of setting performance targets will be discussed in the next section: in addition to setting levels of student/graduate achievement (e.g., “70%”, “Yes”, “Satisfactory”), the program will have to determine levels of program achievement. For example, can we say that graduates of the program demonstrate a particular attribute or performance indicator if 80% of its students/graduates demonstrate “satisfactory” or higher performance? 90%? 100%?
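One way to picture the two levels of targets is the sketch below, which checks a hypothetical set of student scores against both a student-level “satisfactory” threshold and a program-level target. All numbers and the `program_achieves` helper are illustrative:

```python
# Sketch: check student-level results against a program-level target.
# The scores, thresholds, and helper name are hypothetical illustrations.

def program_achieves(scores, satisfactory=0.70, program_target=0.80):
    """Return True if the fraction of students at or above the
    student-level 'satisfactory' threshold meets the program-level
    target (e.g., 80% of students)."""
    meeting = sum(1 for s in scores if s >= satisfactory)
    return meeting / len(scores) >= program_target

scores = [0.55, 0.72, 0.80, 0.91, 0.68, 0.75, 0.83, 0.77, 0.90, 0.71]
# 8 of 10 students meet the 0.70 threshold, so an 80% program target
# is met, while a 90% target would not be.
```

The point of the sketch is that both thresholds must be chosen deliberately: raising either one changes the conclusion about the same data.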

Before concluding this section, a few words should be said about the planning aspects associated with collecting student evidence. Given that most programs will collect a large amount of evidence, it is important that the program has processes in place to manage the collection of evidence, and a means of clearly presenting its plan to the CEAB visiting team.

An assessment plan of the form shown in Figure 6 is recommended. As can be seen in this figure, all of the assessments associated with the “Communication Skills” graduate attribute are summarized on one page, along with the type of assessment, where it is performed, when it is performed, who is responsible for the assessment, and who is responsible for evaluating the results.


Graduate Attribute: 3.1.7 Communication Skills

Performance Indicator 1: Demonstrate writing with coherence and flow
• Courses: ENGG200, ENGG201, ENGG225, ENGG233, COMS363, ENGG349, ENME337, PHYS369, ENME495, ENGG513, ENME538, ENGG481
• Faculty evaluations (ENME538): Winter 2011 – Dr. A; 2014 – Dr. B
• Student surveys (ENME538): Winter 2011 – Dr. A; 2014 – Dr. B
• Alumni surveys (on-line survey): Spring 2011 – Ms. C; 2014 – Ms. C
• Evaluation of results: Department Curriculum Committee

Performance Indicator 2: Practice writing with correct punctuation and grammar
• Courses: ENGG200, ENGG201, ENGG225, ENGG233, COMS363, ENGG349, ENME337, PHYS369, ENME495, ENGG513, ENME538, ENGG481
• Faculty evaluations (ENME538): Winter 2011 – Dr. A; 2014 – Dr. B
• Student surveys (ENME538): Winter 2011 – Dr. A; 2014 – Dr. B
• Alumni surveys (on-line survey): Spring 2011 – Ms. C; 2014 – Ms. C
• Evaluation of results: Department Curriculum Committee

Performance Indicator 3: Demonstrate sketching and drawing
• Courses: ENGG200, ENGG201, ENGG225, ENGG233, ENGG349, ENME337, PHYS369, ENME495, ENME538
• Faculty evaluations (ENGG200): Fall 2011 – Dr. D; 2014 – Dr. E
• Student surveys (ENME538): Winter 2011 – Dr. A; 2014 – Dr. B
• Alumni surveys (on-line survey): Spring 2011 – Ms. C; 2014 – Ms. C
• Evaluation of results: Department Curriculum Committee

Performance Indicator 4: …
• Evaluation of results: Department Curriculum Committee

Figure 6. Assessment Plan for “Communication Skills”


Wherever possible, it is recommended that this table be kept to one or two pages (preferably one page). As will be discussed in the next section, this table can be expanded to provide a summary of the program’s evaluation of the assessment results. Given that CEAB program visitors will be looking at twelve graduate attributes per program, in addition to information on the program’s process for graduate attribute assessment, a succinct summary of the assessments will likely be appreciated.

Having said this, it is also important for the program to maintain a record of the details of the assessment process for the Department, School, and the CEAB visiting team. It is recommended that this aspect of the assessment (and evaluation) documentation be accomplished with a graduate attributes binder (or one binder per graduate attribute) that would be made available at the time of the CEAB visit. The binder(s) would contain a summary of the form shown in Figure 6, along with samples of all of the assessments used for the graduate attribute(s) and any other relevant information (e.g., analyses of results, relevant curriculum committee meeting minutes, etc.).

1.5. Evaluation: Collection and Analysis of Evidence

This phase of the process is where programs determine if their graduates are attaining the CEAB’s graduate attributes. On the surface it is a matter of comparing the results of the direct and indirect assessments described in the previous section to performance targets. However, the evaluation phase is more than just looking at the results of the performance indicators: it also involves analysis of the assessment process itself. More specifically, the assessments and their targets must be evaluated, and the results need to be interpreted and summarized.

Performance Targets

As noted in the previous section, programs are responsible for setting their own performance targets, which then become the basis for graduate attribute assessment. A common way to report performance targets is in terms of the percentage of students/graduates who demonstrate “satisfactory” or higher performance for the attribute (“satisfactory” performance is established with the assessment as described previously).

In order to establish performance targets, it is important to look at both the program’s educational objectives and its students. For example, if a program’s curriculum has been designed with an emphasis on specific areas, it may be reasonable to set the performance targets that relate to that area higher than they would be in other programs: e.g., a program that emphasizes project management may have higher performance targets associated with graduate attribute 3.1.11 than other programs. To establish initial performance targets, the performance of recent student cohorts could be reviewed: i.e., past performance in the courses identified in the curriculum map (e.g., Figure 3) could be used as a starting point for each graduate attribute’s performance target (e.g., percentage of students who obtain a “C-” or higher).

During this phase, it is important to ask if the selected performance targets are realistic. Again, this should be done in the context of the program and its students. For example, do the performance targets reflect the program’s educational objectives? Given the program’s current student cohort, are the performance targets unrealistically high or are they set too low?


Analysis of Evidence

In the previous section, it was recommended that more than one piece of evidence should be collected for each performance indicator. In most cases, this would involve one or more direct assessments (e.g., in-class assessment) and one or more indirect assessments (e.g., alumni survey, exit interview).

There are a number of benefits to collecting multiple pieces of evidence. Arguably, the most obvious benefit is that it provides a backup: at least one piece of evidence remains if data are lost or an assessment fails.

Multiple forms of evidence also help to validate the results. This approach is typically referred to as “cross-examination” or “triangulation”, and allows the program to increase confidence in the end result (i.e., when the different methods lead to the same results). With one piece of evidence, a strong argument must be made that the result is valid. With two, one runs the risk of having clashing results (which is correct?) – the hope is that, with three methods, two will produce similar results.

With graduate attributes assessment, we are typically combining evidence from different sources such as in-class assessments of student achievement, surveys of student self-efficacy (e.g., exit interviews), surveys asking alumni about their experience, and surveys asking employers to rate our graduates. Clearly, the more that different sources align, the more confidence we have that the results are valid.

Finally, multiple sources of evidence help programs to analyze the assessment process itself. In particular, any inconsistencies in the results can be used to refine the assessments (e.g., reframe survey questions, change in-class assessment, etc.).
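A simple way to operationalize this consistency check is sketched below; the results, the 0.10 tolerance, and the `flag_inconsistent` helper are hypothetical illustrations:

```python
# Sketch: flag performance indicators whose evidence sources disagree
# by more than a tolerance, so the assessments themselves can be
# reviewed. The data and helper name are hypothetical.

def flag_inconsistent(results, tolerance=0.10):
    """results maps indicator -> {source: fraction demonstrating it}.
    Return the indicators whose sources differ by more than tolerance."""
    flagged = []
    for indicator, by_source in results.items():
        values = list(by_source.values())
        if max(values) - min(values) > tolerance:
            flagged.append(indicator)
    return flagged

results = {
    "Coherence and flow": {
        "in-class": 0.82, "student survey": 0.89, "alumni survey": 0.85},
    "Punctuation and grammar": {
        "in-class": 0.75, "student survey": 0.77, "alumni survey": 0.92},
}
# The second indicator spans 0.75-0.92, exceeding the 0.10 tolerance,
# which suggests one of its assessments should be re-examined.
```

A flagged indicator does not necessarily mean the students are the problem; it may equally point to a leading survey question or an inappropriate in-class assessment.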

Presenting the Results

The analysis of the evidence can be combined with the assessment plan as shown in Figure 7. In other words, this should simply be an extension of the form shown in Figure 6 that is also supported by documentation in a graduate attribute binder. The following type of information may be included in the binder:

• Rubrics used to score communication skills (e.g., Figure 5);

• Senior survey questionnaire and results;

• Alumni survey questionnaire and results;

• Minutes of EUSC meetings where School-wide plans were discussed/developed;

• Minutes of Departmental Curriculum Committee meeting(s) where recommendations were made;

• Etc.


Graduate Attribute: 3.1.7 Communication Skills

Performance Indicator 1: Demonstrate writing with coherence and flow
• Courses: ENGG200, ENGG201, … ENGG513, ENME538, ENGG481
• Methods of assessment: faculty evaluations (ENME538); student surveys (ENME538); alumni surveys (on-line survey)
• Assessment cycle: 3 years; data collection: 2011, 2014; target performance: 85%

Performance Indicator 2: Practice writing with correct punctuation and grammar
• Courses: ENGG200, ENGG201, … ENGG513, ENME538, ENGG481
• Methods of assessment: faculty evaluations (ENME538); student surveys (ENME538); alumni surveys (on-line survey)
• Assessment cycle: 3 years; data collection: 2011, 2014; target performance: 85%

Performance Indicator 3: Demonstrate sketching and drawing
• Courses: ENGG200, ENGG201, … ENME495, ENME538
• Methods of assessment: faculty evaluations (ENGG200); student surveys (ENME538); alumni surveys (on-line survey)
• Assessment cycle: 3 years; data collection: 2011, 2014; target performance: 85%

Performance Indicator 4: …

Results (direct measures)

2011: A sample of 115 students (68% of the 2010/2011 cohort) in ENME 538 was assessed for performance indicators 1 and 2. A sample of 455 students (60% of the 2010/2011 cohort) in ENGG 200 (i.e., 4 of the 6 sections) was assessed for performance indicators 1 and 3. The percentage of the sample that demonstrated each indicator was as follows: Indicator 1 – 82%; Indicator 2 – 75%; Indicator 3 – 90%; etc.

Results (indirect measures)

2011: A survey of the ENME 538 class was conducted in March 2011 with a response rate of 62% (104 responses). The percentage of the sample that demonstrated each indicator was as follows: Indicator 1 – 89%; Indicator 2 – 77%; Indicator 3 – 92%; etc. An alumni survey was conducted in March 2011 …

Actions

2011: Based on the analysis of the results, the department asked faculty members to provide the communication skills rubrics to students along with the course assignments that give students opportunities to demonstrate their communication skills as defined by the indicators. A sub-committee of the Department Curriculum Committee met to review the performance indicators. It was decided not to make any changes at this time …

Figure 7. Assessment Plan and Analysis for “Communication Skills”


1.6. Feedback for Continuous Improvement

The overall purpose of graduate attribute assessment is to establish a process for the continuous improvement of each program’s curriculum. However, as shown in Figure 1 and implied throughout this document, feedback for continuous improvement occurs at all stages of the process.

Graduate Attributes

Although the graduate attributes were established by the CEAB, we have some input on this aspect of the process. In the context of outcomes-based assessment, graduate attributes are very similar to student outcomes or program outcomes: e.g., ABET defines “student outcomes” as “what students are able to do by the time of graduation … relate to the knowledge, skills, and behaviours that students acquire as they progress through the program” [4]. ABET encourages programs to establish their own student outcomes that are more reflective of their program’s educational objectives, then map their program-specific outcomes to ABET’s criteria. This same process could be followed with respect to the CEAB’s graduate attributes (an example will be provided later in this document).

It should be noted that the NCDEAS (National Council of Deans of Engineering and Applied Sciences) is expected to provide input to Engineers Canada on the CEAB’s graduate attributes. As we gain more experience with graduate attributes assessment, our feedback on this aspect of the accreditation process should be provided to the Dean of the Schulich School of Engineering.

Performance Indicators and Educational Practices/Strategies

It is hoped that our initial efforts to establish meaningful and measurable performance indicators are successful. The real test of our efforts will occur when they are put to use. For example, faculty will need to work with the Department’s program assessment person(s) to develop forms of evidence: this work should provide feedback on the performance indicators (e.g., whether they make sense and can be assessed) and the course mapping (e.g., is this really an outcome of the course?). Similarly, indirect evidence like surveys will require some fine-tuning (e.g., rephrasing of ambiguous or leading questions).

As noted previously, programs should focus on a relatively small set of “key performance indicators”. At this stage of the process, it is important to ask if the correct performance indicators were defined: Are they representative of the graduate attribute? Are new performance indicators required? Should some performance indicators be removed?

Assessment: Collection of Evidence

Although the purpose of collecting evidence is to assess the program’s graduates in the context of the graduate attributes, a considerable amount of information should also be available on the assessment process itself. For example:

• Forms of Evidence: Are the assessments appropriate (e.g., is a term test, a report, etc. the best way to assess the attribute)? Is the timing of the assessment appropriate (e.g., should the alumni survey be done during the Winter term)?

• Performance Targets: Do the performance targets need to be adjusted up or down?

• Triangulation: Are the various forms of evidence arriving at the same results?


• Number of Samples: Did we sample enough students/alumni/industry?

Curriculum

The information obtained from the graduate attributes assessment process should be used to inform discussions and actions about a program’s curriculum at various levels. Faculty, Department Curriculum Committee representatives, and EUSC representatives should ask themselves what the results are telling them about:

• Course Design: the emphasis in lectures and/or labs may be misaligned with the course’s learning objectives; the assessments may be inappropriate (e.g., should ethics be assessed with a multiple-choice exam?); the course may assume that students have prerequisite knowledge that they do not have; etc.

• Program Design: the course sequence may be incorrect; important program outcomes may be missed or underemphasized in the program; etc.

• Common Core Design: similar questions to “program design”, but from a shared, School-wide perspective.

Data vs. Information

As noted previously, the key here is to ensure that the graduate attribute assessment process provides the program and School with information that can be used to fine-tune the process and improve the School’s undergraduate programs. There is always the temptation to collect as much data as possible, then cross one’s fingers and hope that we can learn something. However, if the process is carefully planned from the start, and feedback is used to refine the process, we should be able to reach the point where all of our graduate attributes assessment efforts are meaningful (and manageable).

1.7. Graduate Attributes Assessment Timeline

It is very important that the School, and in turn, each program decides what is reasonable in terms of the timeline for graduate attributes assessment. The CEAB has determined that no concerns, weaknesses or deficiencies will be assessed under section 3.1 of their criteria [2] until 2014 (i.e., one full accreditation cycle). Until that time a transition and development period will be allowed, during which evidence must be provided to demonstrate how the program is complying with the graduate attribute criterion.

We certainly should aim to demonstrate that we are taking graduate attribute assessment seriously and have plans in place to ensure that the process works for the Schulich School of Engineering’s programs. Given that the process is focused on continuous improvement, it makes sense that we embrace it regardless of accreditation requirements.

However, given the School’s limited resources, the need to focus on “standard” accreditation requirements (i.e., curriculum content, course binders, questionnaires, etc.), and the need to ensure that we are establishing a continuous improvement process (and not just producing more paper for the accreditation team), what is reasonable? In order to put this in context, a typical data collection timeline for ABET programs is shown in Figure 8.

As can be seen in this figure, it is unreasonable to expect that an outcomes-based assessment (i.e., graduate attributes assessment) process can be fully realized in a matter of months. The main reason for this is that graduate attributes assessment should be a shared activity, involving input from multiple stakeholders. For example, early work on program educational objectives should involve faculty, students, alumni, industry, etc.; the more detailed work on performance indicators should involve faculty with interest/expertise in specific graduate attributes; curriculum mapping should involve all of the program’s teaching faculty; etc.

Figure 8. A Typical Data Collection Timeline (adapted from [3])

A program certainly could take a top-down approach and set up all of the assessments described in this document, require faculty to report on the assessments, and evaluate the results. This approach would, perhaps, satisfy a program visitor, but it has the potential to alienate the program’s faculty from the process and entirely miss the point of graduate attributes assessment: i.e., if the program’s faculty are not involved (other than to hand over performance results), how can curriculum improvement occur? In other words, the process degenerates into just another accreditation exercise.

Having said this, it is still important to make as much progress as possible, in as many areas as possible for the Fall 2011 accreditation visit. We only need to determine how much progress is enough. Before attempting to answer this question, we should look at a typical data collection plan used by an ABET institution (Table 3) – in this case, the ABET student outcomes are replaced by CEAB graduate attributes.

Table 3. A Typical Data Collection Plan (adapted from [3])

Academic years: 10-11, 11-12, 12-13, 13-14, 14-15, 15-16

Graduate attributes scheduled over the cycle:

• A knowledge base for engineering
• Problem analysis
• Investigation
• Design
• Use of engineering tools

As can be seen in this table, not all of the graduate attribute data collection is done at once. Instead, data collection is spread over a more manageable, multi-year cycle. For example, a 3-year cycle (as shown in Figures 6-7 and Table 3) would involve collecting data on four graduate attributes per academic year; the CEAB would be provided with two to three assessments of each graduate attribute by the next visit.
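The arithmetic of such a cycle can be sketched as follows; the short attribute labels and the `schedule` helper are illustrative, and any actual assignment of attributes to years would be a program decision:

```python
# Sketch: spread the twelve CEAB graduate attributes over a 3-year
# cycle, four per academic year. Labels and helper are illustrative.

attributes = [
    "Knowledge base", "Problem analysis", "Investigation", "Design",
    "Engineering tools", "Team work", "Communication", "Professionalism",
    "Impact on society", "Ethics and equity",
    "Economics and project management", "Life-long learning",
]

def schedule(attributes, years, per_year=4):
    """Assign consecutive blocks of `per_year` attributes to each year."""
    return {year: attributes[i * per_year:(i + 1) * per_year]
            for i, year in enumerate(years)}

plan = schedule(attributes, ["2010-11", "2011-12", "2012-13"])
# Each attribute is then revisited on the same 3-year cycle.
```

With twelve attributes and a three-year cycle, four attributes per year is the only even split, which is why the text settles on that workload.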


Please note that this is not to say that assessments should only be performed in the years noted in Table 3. Clearly, it is important for pedagogical reasons that assessment continues to occur in the program’s courses; however, the data is not collected, analyzed, etc. for accreditation purposes during the off-cycle years.

Returning to the question of “how much is enough”, it seems reasonable that programs attempt to have the preliminary planning work accomplished as early as possible in the Fall 2010 term so that some selected assessments can occur in the 2010/2011 academic year and the overall plan (with examples of assessments and preliminary evaluations) can be included in the CEAB questionnaire by Spring 2011. Table 4 summarizes a suggested graduate attributes assessment plan.

As can be seen, a considerable amount of up-front work is required by each program in Fall 2010 in order to establish a set of performance indicators and a curriculum map. Once this preliminary work is done, a relatively small number of in-class assessments can be identified for Fall 2010 and Winter 2011, and surveys can be developed (e.g., student survey, alumni survey).

In the next section, an example is provided of how the CDIO syllabus [6] can be used to facilitate the up-front work involved in this process. The example is taken from work done by the Department of Mechanical & Manufacturing Engineering over the past year.


Table 4. Graduate Attribute Assessment Plan

• Fall 2010 – Define performance indicators and map curriculum: performance indicators defined (start of term); curriculum mapped (once PIs are defined).

• Fall 2010 – Develop surveys: student survey and alumni survey developed (once PIs are defined).

• Fall 2010 – Data collection: in selected Common Core1 courses.

• Winter 2011 – Data collection: in selected Common Core and Program2 courses; surveys tested (start of term); survey implemented (end of term).

• Spring/Summer 2011 – Evaluation and design of improvements: in-class assessments evaluated/updated; surveys evaluated/updated.

• Fall 2011 – Implement improvements and data collection: in-class assessments revised; data collection in selected Common Core and Program courses.

Notes:

1. Where clear matches between the graduate attributes and Common Core courses exist (e.g., ENGG 200, ENGG 513).

2. Where key performance indicators can be assessed (e.g., capstone design courses).


2. Example – B.Sc. in Mechanical Engineering

An alternative, but complementary, approach to the one described to this point is to build on work that has already been done in this area. In particular, the CDIO syllabus [6] is effectively a very detailed list of general engineering program outcomes that has been used to expand on ABET’s program outcomes for US engineering schools. CDIO collaborators developed the syllabus and validated it via focus-group discussions, document research, surveys, workshops, and peer review that involved faculty, students, industry leaders, and senior engineering academics from a variety of universities [6].

Figure 9 shows a detail of the expanded CDIO syllabus: i.e., one topic (Experimentation and Knowledge Discovery) and a portion of its subtopics (e.g., Hypothesis Formation) and learning outcomes (e.g., select critical questions to be examined) are shown. The “[b]” beside the topic heading shows the mapping to the ABET program outcome (i.e., “an ability to design and conduct experiments, as well as to analyze and interpret data”).

Figure 9. A Detail of the Expanded CDIO Syllabus [6]

The power of this syllabus is that it is comprehensive and can be directly mapped to ABET’s program outcomes. Similarly, as described by Cloutier et al. [11], the CDIO syllabus can be mapped directly to the CEAB’s graduate attributes. Once the mapping is performed, the CDIO syllabus can be used as a starting point for the development of performance indicators as noted in section 1.2.

It should be noted that this approach does not discount the stakeholder engagement that is inherent in the first approach. Instead, the CDIO syllabus is used as a starting point for program assessment and as a means of informing and focusing the discussions around program-specific outcomes and performance criteria. Figure 10 shows how this approach can be applied to the planning process described in Section 1.


Figure 10. Graduate Attribute Planning and the CDIO Syllabus

In the remainder of this section, we will show how this approach could be applied to the B.Sc. in Mechanical Engineering program.

2.1. Program Educational Objectives and Student Outcomes

As described in Section 1.6, CEAB graduate attributes could be viewed in the same context as ABET “student outcomes”. More specifically, each program could develop its own set of student outcomes that are mapped directly to the CEAB graduate attributes as shown in Figure 10. Before looking at how this is done, it is useful to look at the relationship between “program educational objectives” and “student outcomes”.

Like graduate attributes, student outcomes are focused on what students can do at the time of graduation. However, from an employer’s perspective (e.g., industry, government, etc.), the interest is more in what graduates are expected to attain within a few years after graduation. These broader, “program educational objectives” are less in the program’s control since our graduates’ work and life experiences factor into these “outcomes”. However, broad program educational objectives help to focus a program’s more detailed student outcomes.

When starting from scratch, it is advised that programs consult with their constituents and stakeholders (e.g., industry, community, etc.) when developing their program educational objectives. As shown in Table 5, the CDIO syllabus can be used as a starting point for this work.


Table 5. The CDIO Syllabus and Program Educational Objectives

CDIO Syllabus (Level 0) → Mechanical Engineering Program Educational Objectives

1. Technical Knowledge and Reasoning → 1. Demonstrate a deep working knowledge of technical fundamentals

2. Personal and Professional Skills and Attributes → 2. Apply and master personal and professional skills and attributes

3. Interpersonal Skills: Teamwork and Communication → 3. Communicate effectively and work in multidisciplinary teams

4. Conceiving, Designing, Implementing and Operating Systems in the Enterprise and Societal Context → 4. Conceive, design, implement and operate systems in enterprise and social contexts

In this example, Level 0 of the CDIO syllabus is used as a starting point to develop more program-specific educational objectives for the B.Sc. in Mechanical Engineering. As noted, these should be developed with the input of the program’s constituents/stakeholders – however, the CDIO syllabus provides a good starting point for discussions.

The main advantage of this approach is that student outcomes (and graduate attributes) can now be linked to a very comprehensive syllabus. More specifically, the CDIO syllabus can be viewed as follows in the context of the graduate attributes phases discussed previously:

CDIO Syllabus → Graduate Attributes Assessment

• Level 0 → Program Educational Objectives

• Level 1 → Student Outcomes / Graduate Attributes

• Level 2 → Performance Indicators

For example, Figure 11 shows how the B.Sc. in Mechanical Engineering program can be articulated in terms of program educational objectives and student outcomes. Figure 12 shows how student outcomes are mapped to the CEAB graduate attributes (based on [11]).

2.2 Performance Indicators and Course Mapping

As noted above, the CDIO syllabus can now be used to generate a (long) list of potential performance indicators for each student outcome / graduate attribute: i.e., Level 2 of the syllabus. Ideally, faculty members who have interest/expertise in specific student outcomes / graduate attributes should refine the Level 2 learning outcomes (i.e., performance indicators) at this point; however, even without this initial work, curriculum mapping can begin.

SSE Graduate Attributes Planning page 23 of 27


Program Educational Objectives: The mechanical engineering program is designed such that students are able to:
1. demonstrate a deep working knowledge of technical fundamentals;
2. apply and master personal and professional skills and attributes;
3. communicate effectively and work in multidisciplinary teams;
4. conceive, design, implement and operate systems in enterprise and social contexts.

1.1 Apply the principles of the underlying sciences of mathematics, physics and chemistry

1.2 Apply the principles of core engineering fundamentals in fluid mechanics, solid mechanics and materials, dynamics, electrical circuits and machines, thermodynamics, computers and computation

1.3 Demonstrate deep working knowledge and reasoning in heat transfer, kinematics and dynamics of machines, structural mechanics, structural analysis and design, machine component design, manufacturing and production systems, thermo-fluids, manufacturing and production processes, and control systems

2.1 Analyze and solve engineering problems

2.2 Conduct inquiry and experimentation in engineering problems

2.3 Think holistically and systemically

2.4 Master personal skills that contribute to successful engineering practice: initiative, flexibility, creativity, curiosity, and time management

2.5 Master professional skills that contribute to successful engineering practice: professional ethics, integrity, currency in the field, career planning

3.1 Lead and work in teams

3.2 Communicate effectively in writing, in electronic form, in graphic media, and in oral presentations

4.1 Recognize the importance of the societal context in engineering practice

4.2 Appreciate different enterprise cultures and work successfully in organizations

4.3 Conceive engineering systems including setting requirements, defining functions, modelling systems, and managing projects

4.4 Apply a phased design process and utilize knowledge in design

4.5 Design and implement hardware and software processes and manage implementation procedures

4.6 Apply techniques for operating complex engineering systems and processes, and for managing operations


Figure 11. B.Sc. in Mechanical Engineering Objectives and Outcomes


Figure 12. Student Outcomes / Graduate Attributes Mapping for the B.Sc. in Mechanical Engineering Program


In a pilot study of the B.Sc. in Mechanical Engineering program [12], the work by Cloutier et al. [11] was extended to determine where the CEAB's twelve graduate attributes are introduced, taught, and/or utilized throughout the program. More specifically, a full introduce-teach-utilize (ITU) analysis (e.g., [7]) of the mechanical engineering curriculum was performed via a survey of the instructors of Fall 2008 and Winter 2009 courses. The survey was conducted through a series of one-hour meetings with all faculty involved in delivering the mechanical engineering program and posed questions of two types. First, the instructors used the CDIO syllabus to map learning activities and student outcomes. For each category, the instructor was asked whether the activity was introduced (i.e., superficial treatment to briefly expose the topic), taught (i.e., detailed coverage with assignments/exams), or utilized (i.e., assuming the student is already skilled in this area) in their course. Second, eight questions were asked that focused on determining the intended learning outcomes (i.e., performance indicators) of the course. The curriculum mapping shown in Figure 3 is one example of this process.
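The ITU mapping collected this way is essentially a sparse course-by-topic matrix. As an illustration only (the courses, topics, and ratings below are hypothetical examples, not results from the actual survey), such a map can be represented and queried programmatically, for instance to flag syllabus topics that are introduced or utilized somewhere in the curriculum but never explicitly taught:

```python
# Sketch of an introduce-teach-utilize (ITU) curriculum map.
# Courses, topics, and I/T/U ratings here are hypothetical examples.
from collections import defaultdict

# course -> {CDIO syllabus topic: 'I' (introduce), 'T' (teach), 'U' (utilize)}
itu_map = {
    "ENGG 200": {"2.1": "T", "3.1": "T", "4.4": "I"},
    "ENGG 349": {"2.1": "T", "2.3": "I"},
    "capstone": {"3.1": "U", "4.4": "U", "2.3": "U"},
}

def coverage_by_topic(itu_map):
    """Aggregate the set of I/T/U ratings per topic across all courses."""
    topics = defaultdict(set)
    for ratings in itu_map.values():
        for topic, level in ratings.items():
            topics[topic].add(level)
    return dict(topics)

def gaps(itu_map):
    """Topics that appear in the curriculum but are never taught ('T')."""
    return sorted(t for t, levels in coverage_by_topic(itu_map).items()
                  if "T" not in levels)

print(gaps(itu_map))  # → ['2.3', '4.4']
```

A gap query like this is one way the mapping supports continuous improvement: topics that capstone courses assume ("utilize") but that no earlier course teaches are candidates for curriculum revision.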

2.3 Final Words on the CDIO Syllabus

Programs do not have to adopt the “CDIO Approach” [6] to take advantage of the CDIO syllabus for graduate attributes assessment. As noted, the CDIO syllabus is effectively a very detailed list of general engineering program outcomes that should apply to any engineering discipline. The advantage of the approach described in this section is that the considerable amount of work accomplished by an international community of engineering educators can be used as a starting point for a program’s work on graduate attributes assessment. To achieve an effective continuous improvement process, though, it is still very important to engage faculty, students, and other stakeholders in the process to build on the CDIO work and thereby make the process specific to the School’s individual programs.

References

1. Canadian Engineering Accreditation Board, Accreditation Criteria and Procedures, 2008.

2. Canadian Engineering Accreditation Board, Accreditation Criteria and Procedures, www.engineerscanada.ca/e/files/Accreditation_Criteria_Procedures_2009.pdf, 2009.

3. G.M. Rogers, Assessment Planning Flow Chart, CD available at http://www.abet.org/ideal.shtml, 2004.

4. ABET, Self-study Questionnaire: Template for a Self-study Report, 2011-2012 Review Cycle, available at http://www.abet.org/forms.shtml, 2010.

5. K. Holtzblatt, J. Burns Wendell, S. Wood, Rapid contextual design: a how-to guide to key techniques for user-centered design, Elsevier, 2005.

6. E. Crawley, J. Malmqvist, S. Ostlund, D. Brodeur, Rethinking Engineering Education: the CDIO Approach, Springer, 2007.

7. D.J. Newman and A.R. Amir, “Innovative first year aerospace design course at MIT”, Journal of Engineering Education, July 2001, pp. 375-381.

8. S.M. Brookhart, The Art and Science of Classroom Assessment: The Missing Part of Pedagogy, ASHE-ERIC Higher Education Report, Vol. 27, No. 1, 1999.

9. L.A. Suskie, Questionnaire Survey Research: What Works, 2nd Edition, Association for Institutional Research, 1996.


10. Kent School District, http://www.kent.k12.wa.us (accessed 17 August 2010).

11. G. Cloutier, R. Hugo, and R. Sellens, “Mapping the relationship between the CDIO Syllabus and the 2008 CEAB Graduate Outcomes”, Proceedings of the 6th International CDIO Conference, École Polytechnique, Montréal, June 15-18, 2010.

12. R. Brennan and R. Hugo, “The CDIO syllabus and outcomes-based assessment: a case study of a Canadian mechanical engineering program”, Proceedings of the 6th International CDIO Conference, École Polytechnique, Montréal, June 15-18, 2010.


Exhibit 5: Graduate Attributes Assessment Resources

Exhibit 5(c): Conceive-Design-Implement-Operate (CDIO) Syllabus


CDIO Syllabus in Topical Form
• from www.cdio.org
• the “[ ]” beside the topic heading shows the mapping to the ABET program outcome (e.g., CDIO topic “2.2 Experimentation and Knowledge Discovery” maps to ABET program outcome [b] “an ability to design and conduct experiments, as well as to analyze and interpret data”)

1 TECHNICAL KNOWLEDGE AND REASONING
1.1 KNOWLEDGE OF UNDERLYING SCIENCES [a]
1.1.1 Mathematics (including statistics)
1.1.2 Physics
1.1.3 Chemistry
1.1.4 Biology
1.2 CORE ENGINEERING FUNDAMENTAL KNOWLEDGE [a]
1.3 ADVANCED ENGINEERING FUNDAMENTAL KNOWLEDGE [k]

2 PERSONAL AND PROFESSIONAL SKILLS AND ATTRIBUTES
2.1 ENGINEERING REASONING AND PROBLEM SOLVING [e]
2.1.1 Problem Identification and Formulation

• Data and symptoms • Assumptions and sources of bias • Issue prioritization in context of overall goals • A plan of attack (incorporating model, analytical and numerical solutions, qualitative analysis, experimentation and consideration of uncertainty)

2.1.2 Modeling

• Assumptions to simplify complex systems and environment • Conceptual and qualitative models • Quantitative models and simulations

2.1.3 Estimation and Qualitative Analysis

• Orders of magnitude, bounds and trends • Tests for consistency and errors (limits, units, etc.) • The generalization of analytical solutions

2.1.4 Analysis With Uncertainty

• Incomplete and ambiguous information • Probabilistic and statistical models of events and sequences • Engineering cost-benefit and risk analysis • Decision analysis • Margins and reserves

2.1.5 Solution and Recommendation

• Problem solutions • Essential results of solutions and test data


• Discrepancies in results • Summary recommendations • Possible improvements in the problem solving process

2.2 EXPERIMENTATION AND KNOWLEDGE DISCOVERY [b]
2.2.1 Hypothesis Formulation

• Critical questions to be examined • Hypotheses to be tested • Controls and control groups

2.2.2 Survey of Print and Electronic Literature

• The literature research strategy • Information search and identification using library tools (on-line catalogs, databases, search engines) • Sorting and classifying the primary information • The quality and reliability of information • The essentials and innovations contained in the information • Research questions that are unanswered • Citations to references

2.2.3 Experimental Inquiry

• The experimental concept and strategy • The precautions when humans are used in experiments • Experiment construction • Test protocols and experimental procedures • Experimental measurements • Experimental data • Experimental data vs. available models

2.2.4 Hypothesis Test, and Defense

• The statistical validity of data • The limitations of data employed • Conclusions, supported by data, needs and values • Possible improvements in knowledge discovery process

2.3 SYSTEM THINKING
2.3.1 Thinking Holistically

• A system, its behavior, and its elements • Trans-disciplinary approaches that ensure the system is understood from all relevant perspectives • The societal, enterprise and technical context of the system • The interactions external to the system, and the behavioral impact of the system

2.3.2 Emergence and Interactions in Systems

• The abstractions necessary to define and model system • The behavioral and functional properties (intended and unintended) which emerge from the system • The important interfaces among elements • Evolutionary adaptation over time


2.3.3 Prioritization and Focus

• All factors relevant to the system in the whole • The driving factors from among the whole • Energy and resource allocations to resolve the driving issues

2.3.4 Trade-offs, Judgement and Balance in Resolution

• Tensions and factors to resolve through trade-offs • Solutions that balance various factors, resolve tensions and optimize the system as a whole • Flexible vs. optimal solutions over the system lifetime • Possible improvements in the system thinking used

2.4 PERSONAL SKILLS AND ATTITUDES
2.4.1 Initiative and Willingness to Take Risks

• The needs and opportunities for initiative • The potential benefits and risks of an action • The methods and timing of project initiation • Leadership in new endeavors, with a bias for appropriate action • Definitive action, delivery of results and reporting on actions

2.4.2 Perseverance and Flexibility

• Self-confidence, enthusiasm, and passion

• The importance of hard work, intensity and attention to detail • Adaptation to change • A willingness and ability to work independently • A willingness to work with others, and to consider and embrace various viewpoints • An acceptance of criticism and positive response • The balance between personal and professional life

2.4.3 Creative Thinking

• Conceptualization and abstraction • Synthesis and generalization • The process of invention • The role of creativity in art, science, the humanities and technology

2.4.4 Critical Thinking

• The statement of the problem • Logical arguments and solutions • Supporting evidence • Contradictory perspectives, theories and facts • Logical fallacies • Hypotheses and conclusions

2.4.5 Awareness of One’s Personal Knowledge, Skills and Attitudes

• One's skills, interests, strengths, weaknesses • The extent of one's abilities, and one's responsibility for self-improvement to overcome important weaknesses • The importance of both depth and breadth of knowledge

2.4.6 Curiosity and Lifelong Learning [i]


• The motivation for continued self-education • The skills of self-education • One's own learning style • Developing relationships with mentors

2.4.7 Time and Resource Management

• Task prioritization • The importance and/or urgency of tasks • Efficient execution of tasks

2.5 PROFESSIONAL SKILLS AND ATTITUDES
2.5.1 Professional Ethics, Integrity, Responsibility and Accountability [f]

• One's ethical standards and principles • The courage to act on principle despite adversity • The possibility of conflict between professionally ethical imperatives • An understanding that it is acceptable to make mistakes, but that one must be accountable for them • Proper allocation of credit to collaborators • A commitment to service

2.5.2 Professional Behavior

• A professional bearing • Professional courtesy • International customs and norms of interpersonal contact

2.5.3 Proactively Planning for One’s Career

• A personal vision for one’s future • Networks with professionals • One's portfolio of professional skills

2.5.4 Staying Current on the World of Engineering

• The potential impact of new scientific discoveries • The social and technical impact of new technologies and innovations • A familiarity with current practices/technology in engineering • The links between engineering theory and practice

3 INTERPERSONAL SKILLS: TEAMWORK AND COMMUNICATION
3.1 TEAMWORK [d]
3.1.1 Forming Effective Teams

• The stages of team formation and life cycle • Task and team processes • Team roles and responsibilities • The goals, needs and characteristics (work styles, cultural differences) of individual team members • The strengths and weaknesses of the team • Ground rules on norms of team confidentiality, accountability and initiative

3.1.2 Team Operation


• Goals and agenda • The planning and facilitation of effective meetings • Team ground rules • Effective communication (active listening, collaboration, providing and obtaining information) • Positive and effective feedback • The planning, scheduling and execution of a project • Solutions to problems (team creativity and decision making) • Conflict negotiation and resolution

3.1.3 Team Growth and Evolution

• Strategies for reflection, assessment, and self-assessment • Skills for team maintenance and growth • Skills for individual growth within the team • Strategies for team communication and writing

3.1.4 Leadership

• Team goals and objectives • Team process management • Leadership and facilitation styles (directing, coaching, supporting, delegating) • Approaches to motivation (incentives, example, recognition, etc) • Representing the team to others • Mentoring and counseling

3.1.5 Technical Teaming

• Working in different types of teams: • Cross-disciplinary teams (including non-engineers) • Small team vs. large team • Distance, distributed and electronic environments • Technical collaboration with team members

3.2 COMMUNICATIONS [g]
3.2.1 Communications Strategy

• The communication situation • Communications objectives • The needs and character of the audience • The communication context • A communications strategy • The appropriate combination of media • A communication style (proposing, reviewing, collaborating, documenting, teaching) • The content and organization

3.2.2 Communications Structure

• Logical, persuasive arguments • The appropriate structure and relationship amongst ideas • Relevant, credible, accurate supporting evidence • Conciseness, crispness, precision and clarity of language • Rhetorical factors (e.g. audience bias) • Cross-disciplinary cross-cultural communications

3.2.3 Written Communication


• Writing with coherence and flow • Writing with correct spelling, punctuation and grammar • Formatting the document • Technical writing • Various written styles (informal, formal memos, reports, etc)

3.2.4 Electronic/Multimedia Communication

• Preparing electronic presentations • The norms associated with the use of e-mail, voice mail, and videoconferencing • Various electronic styles (charts, web, etc)

3.2.5 Graphical Communication

• Sketching and drawing • Construction of tables, graphs and charts • Formal technical drawings and renderings

3.2.6 Oral Presentation and Inter-Personal Communications

• Preparing presentations and supporting media with appropriate language, style, timing and flow • Appropriate nonverbal communications (gestures, eye contact, poise) • Answering questions effectively

3.3 COMMUNICATIONS IN FOREIGN LANGUAGES
3.3.1 English
3.3.2 Languages of Regional Industrialized Nations
3.3.3 Other Languages

4 CONCEIVING, DESIGNING, IMPLEMENTING AND OPERATING SYSTEMS IN THE ENTERPRISE AND SOCIETAL CONTEXT
4.1 EXTERNAL AND SOCIETAL CONTEXT [h]
4.1.1 Roles and Responsibility of Engineers

• The goals and roles of the engineering profession • The responsibilities of engineers to society

4.1.2 The Impact of Engineering on Society

• The impact of engineering on the environment, social, knowledge and economic systems in modern culture

4.1.3 Society's Regulation of Engineering

• The role of society and its agents to regulate engineering • The way in which legal and political systems regulate and influence engineering • How professional societies license and set standards • How intellectual property is created, utilized and defended

4.1.4 The Historical and Cultural Context

• The diverse nature and history of human societies as well as their literary, philosophical, and artistic traditions


• The discourse and analysis appropriate to the discussion of language, thought and values

4.1.5 Contemporary Issues and Values [j]

• The important contemporary political, social, legal and environmental issues and values • The process by which contemporary values are set, and one's role in these processes • The mechanisms for expansion and diffusion of knowledge

4.1.6 Developing a Global Perspective

• The internationalization of human activity • The similarities and differences in the political, social, economic, business and technical norms of various cultures • International inter-enterprise and inter-governmental agreements and alliances

4.2 ENTERPRISE AND BUSINESS CONTEXT
4.2.1 Appreciating Different Enterprise Cultures

• The differences in process, culture, and metrics of success in various enterprise cultures: • Corporate vs. academic vs. governmental vs. non-profit/NGO • Market vs. policy driven • Large vs. small • Centralized vs. distributed • Research and development vs. operations • Mature vs. growth phase vs. entrepreneurial • Longer vs. faster development cycles • With vs. without the participation of organized labor

4.2.2 Enterprise Strategy, Goals, and Planning

• The mission and scope of the enterprise • An enterprise’s core competence and markets • The research and technology process • Key alliances and supplier relations • Financial and managerial goals and metrics • Financial planning and control • The stake-holders (owners, employees, customers, etc.)

4.2.3 Technical Entrepreneurship

• Entrepreneurial opportunities that can be addressed by technology • Technologies that can create new products and systems • Entrepreneurial finance and organization

4.2.4 Working Successfully in Organizations

• The function of management • Various roles and responsibilities in an organization • The roles of functional and program organizations • Working effectively within hierarchy and organizations • Change, dynamics and evolution in organizations

4.3 CONCEIVING AND ENGINEERING SYSTEMS [c]
4.3.1 Setting System Goals and Requirements

• Market needs and opportunities


• Customer needs • Opportunities which derive from new technology or latent needs • Factors that set the context of the requirements • Enterprise goals, strategies, capabilities and alliances • Competitors and benchmarking information • Ethical, social, environmental, legal and regulatory influences • The probability of change in the factors that influence the system, its goals and resources available • System goals and requirements • The language/format of goals and requirements • Initial target goals (based on needs, opportunities and other influences) • System performance metrics • Requirement completeness and consistency

4.3.2 Defining Function, Concept and Architecture

• Necessary system functions (and behavioral specifications) • System concepts • The appropriate level of technology • Trade-offs among and recombination of concepts • High level architectural form and structure • The decomposition of form into elements, assignment of function to elements, and definition of interfaces

4.3.3 Modeling of System and Ensuring Goals Can Be Met

• Appropriate models of technical performance • The concept of implementation and operations • Life cycle value and costs (design, implementation, operations, opportunity, etc.) • Trade-offs among various goals, function, concept and structure and iteration until convergence

4.3.4 Development Project Management

• Project control for cost, performance, and schedule • Appropriate transition points and reviews • Configuration management and documentation • Performance compared to baseline • Earned value recognition • The estimation and allocation of resources • Risks and alternatives • Possible development process improvements

4.4 DESIGNING [c]
4.4.1 The Design Process

• Requirements for each element or component derived from system level goals and requirements

• Alternatives in design • The initial design • Experiment prototypes and test articles in design development • Appropriate optimization in the presence of constraints • Iteration until convergence • The final design • Accommodation of changing requirements


4.4.2 The Design Process Phasing and Approaches

• The activities in the phases of system design (e.g. conceptual, preliminary, and detailed design) • Process models appropriate for particular development projects (waterfall, spiral, concurrent, etc.) • The process for single, platform and derivative products

4.4.3 Utilization of Knowledge in Design

• Technical and scientific knowledge • Creative and critical thinking, and problem solving • Prior work in the field, standardization and reuse of designs (including reverse engineering and redesign) • Design knowledge capture

4.4.4 Disciplinary Design

• Appropriate techniques, tools, and processes • Design tool calibration and validation • Quantitative analysis of alternatives • Modeling, simulation and test • Analytical refinement of the design

4.4.5 Multidisciplinary Design

• Interactions between disciplines • Dissimilar conventions and assumptions • Differences in the maturity of disciplinary models • Multidisciplinary design environments • Multidisciplinary design

4.4.6 Multi-Objective Design (DFX)

Design for:

• Performance, life cycle cost and value • Aesthetics and human factors • Implementation, verification, test and environmental sustainability • Operations • Maintainability, reliability, and safety • Robustness, evolution, product improvement and retirement

4.5 IMPLEMENTING [c]
4.5.1 Designing the Implementation Process

• The goals and metrics for implementation performance, cost and quality • The implementation system design: • Task allocation and cell/unit layout • Work flow • Considerations for human user/operators

4.5.2 Hardware Manufacturing Process

• The manufacturing of parts • The assembly of parts into larger constructs • Tolerances, variability, key characteristics and statistical process control


4.5.3 Software Implementing Process

• The break down of high level components into module designs (including algorithms and data structures) • Algorithms (data structures, control flow, data flow) • The programming language • The low-level design (coding) • The system build

4.5.4 Hardware Software Integration

• The integration of software in electronic hardware (size of processor, communications, etc) • The integration of software with sensor, actuators and mechanical hardware • Hardware/software function and safety

4.5.5 Test, Verification, Validation, and Certification

• Test and analysis procedures (hardware vs. software, acceptance vs. qualification) • The verification of performance to system requirements • The validation of performance to customer needs • The certification to standards

4.5.6 Implementation Management

• The organization and structure for implementation • Sourcing, partnering, and supply chains • Control of implementation cost, performance and schedule • Quality and safety assurance • Possible implementation process improvements

4.6 OPERATING [c]
4.6.1 Designing and Optimizing Operations

• The goals and metrics for operational performance, cost, and value • Operations process architecture and development • Operations (and mission) analysis and modeling

4.6.2 Training and Operations

• Training for professional operations: • Simulation • Instruction and programs • Procedures • Education for consumer operation • Operations processes • Operations process interactions

4.6.3 Supporting the System Lifecycle

• Maintenance and logistics • Lifecycle performance and reliability • Lifecycle value and costs • Feedback to facilitate system improvement

4.6.4 System Improvement and Evolution

• Pre-planned product improvement • Improvements based on needs observed in operation


• Evolutionary system upgrades • Contingency improvements/solutions resulting from operational necessity

4.6.5 Disposal and Life-End Issues

• The end of useful life • Disposal options • Residual value at life-end • Environmental considerations for disposal

4.6.6 Operations Management

• The organization and structure for operations • Partnerships and alliances • Control of operations cost, performance and scheduling • Quality and safety assurance • Possible operations process improvements • Life cycle management


Exhibit 5: Graduate Attributes Assessment Resources

Exhibit 5(d): Introduce-Teach-Utilize (ITU) Survey


Nine Questions to Benchmark Curriculum and Instruction

Course ______________________________ Instructor(s) ______________________
Person Being Interviewed _____________________________
Interviewer ________________________________________ Date ________________

• What is it you would most like to improve when it comes to the quality of student learning in this course?
• OUTPUT Function: Describe what a student will be able to do after successful completion of this course (in terms of knowledge, skills, and attitudes)?
• Mapping Exercise: What learning outcomes from the CDIO Syllabus (see Page 6) at the x.x.x level of detail are addressed in this course (using categories Introduce, Teach, or Utilize given on Page 5)?
• INPUT Function: What areas of knowledge from previous courses does your course use / need to be improved?
• When do students get feedback during this course? How do they use the feedback?
• What are the most motivating aspects of the course to students?
• What are the least motivating aspects of the course to students?
• Describe the main tasks and roles of the instructor(s) in this course. What resources are available, and how are they used? Are the resources adequate or too demanding, as the course is organized today?
• What other comments would you like to make about teaching this course?


Explanations of Introduce, Teach, Utilize

Introduce:
• Touch on or briefly expose the students to this topic
• No specific learning objective of knowledge retention is linked to this topic
• Typically less than one hour of dedicated lecture/discussion/laboratory time is spent on this topic
• No assignments/exercises/projects/homework are specifically linked to this topic
• This topic would probably not be assessed on a test or other evaluation instrument

Example: At the beginning of class an example is given of the operation of an engineering system (4.6) to motivate an aspect of the design. But no explicit discussion of the design or analysis of operation is presented.
Example: An ethical problem or dilemma (2.5) is presented to the students that sets the context for an example or lecture. But no explicit treatment of ethics or its role in modern engineering practice is presented.

Teach:
• Really try to get students to learn new material
• Learning objective is to advance at least one cognitive level (e.g., no exposure to knowledge, knowledge to comprehension, comprehension to application, etc.)
• Typically, one or more hours of dedicated lecture/discussion/laboratory time are spent on this topic
• Assignments/exercises/projects/homework are specifically linked to this topic
• This topic would probably be assessed on a test or other evaluation instrument

Example: The process and methodology of product design (4.4) are explicitly presented to and practiced by the students on a project or assignment.
Example: Several workshops are presented on working in teams and group dynamics (3.1), and a coach works with students on understanding teamwork throughout the semester's team project.

Utilize:
• Assumes the student already has a certain level of proficiency in this topic
• No specific learning objective is linked to this topic, but the student will use knowledge of this topic to reach other learning objectives
• No time explicitly allotted to teaching this topic
• Assignments/exercises/projects/homework are not designed to explicitly teach this topic
• Tests or other evaluation instruments are not designed to explicitly assess this topic

Example: When teaching a topic other than communication, students are expected to utilize their skills in preparing oral presentations (3.2) which explain their work. But no further explicit instruction in communications is given.
Example: When working in a laboratory session, students are expected to utilize their skills of experimentation (2.2). But no further explicit instruction on techniques of experimentation is given.
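The three categories above can be read as a simple decision rule over a few observable facts about how a course treats a topic. A minimal sketch, assuming the thresholds stated in the bullet criteria (the class, field, and function names are our own, not part of the survey instrument):

```python
# Sketch: classifying a course's treatment of a CDIO topic as
# Introduce, Teach, or Utilize, following the criteria listed above.
# Field and function names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TopicTreatment:
    dedicated_hours: float      # lecture/discussion/laboratory time on the topic
    linked_assignments: bool    # assignments/exercises explicitly linked to it
    assessed: bool              # likely to appear on a test or other evaluation
    assumed_proficiency: bool   # students expected to already be skilled in it

def classify_itu(t: TopicTreatment) -> str:
    """Apply the Introduce/Teach/Utilize criteria as a decision rule."""
    # Utilize: proficiency assumed, no time explicitly allotted to teaching.
    if t.assumed_proficiency and t.dedicated_hours == 0:
        return "Utilize"
    # Teach: one or more dedicated hours, linked assignments, assessed.
    if t.dedicated_hours >= 1 and t.linked_assignments and t.assessed:
        return "Teach"
    # Otherwise: brief exposure only.
    return "Introduce"

print(classify_itu(TopicTreatment(2.0, True, True, False)))   # → Teach
print(classify_itu(TopicTreatment(0.5, False, False, False))) # → Introduce
```

In practice the survey relies on instructor judgment rather than a mechanical rule; the sketch only makes explicit how the three categories partition the criteria.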


Exhibit 5: Graduate Attributes Assessment Resources

Exhibit 5(e): Curriculum Map


3.1.1 A knowledge base for engineering: Demonstrated competence in university level mathematics, natural sciences, engineering fundamentals, and specialized engineering knowledge appropriate to the program.

CDIO Syllabus Topics and Courses (Introduce / Teach / Utilize):

1.1 Knowledge of Underlying Sciences — mapped courses: AMAT 217, AMAT 219, CHEM 209, ENGG 201, ENGG 202, ENGG 233, ENGG 200, MATH 211, AMAT 307, CHEM 357, ENGG 311, ENGG 317, ENGG 319, ENGG 225, ENGG 349, ENGG 407, PHYS 369, capstone

1.2 Core Engineering Fundamental Knowledge — mapped courses: AMAT 217, AMAT 219, CHEM 209, ENGG 201, ENGG 202, ENGG 233, ENGG 200, MATH 211, AMAT 307, CHEM 357, ENGG 311, ENGG 317, ENGG 319, ENGG 225, ENGG 349, ENGG 407, ENGG 513, capstone


3.1.2 Problem analysis: An ability to use appropriate knowledge and skills to identify, formulate, analyze, and solve complex engineering problems in order to reach substantiated conclusions.

CDIO Syllabus Topics, CDIO Learning Outcomes, and Courses (Introduce / Teach / Utilize):

2.1.1 Problem Identification and Formulation
• Evaluate data and symptoms
• Analyze assumptions and sources of bias
• Demonstrate issue prioritization in context of overall goals
• Formulate a plan of attack (incorporating model, analytical and numerical solutions, qualitative analysis, experimentation and consideration of uncertainty)
Mapped courses: AMAT 217, AMAT 219, CHEM 209, ENGG 201, ENGG 202, ENGG 233, ENGG 200, MATH 211, PHYS 259, AMAT 307, CHEM 357, ENGG 311, ENGG 317, ENGG 319, ENGG 225, ENGG 349, ENGG 407, PHYS 369, ENGG 513, capstone

2.1.2 Modeling
• Employ assumptions to simplify complex systems and environment
• Choose and apply conceptual and qualitative models
• Choose and apply quantitative models and simulations
Mapped courses: AMAT 217, AMAT 219, CHEM 209, ENGG 201, ENGG 202, ENGG 233, ENGG 200, MATH 211, PHYS 259, CHEM 357, ENGG 311, ENGG 317, ENGG 225, ENGG 349, PHYS 369, capstone

2.1.3 Estimation and Qualitative Analysis
• Estimate orders of magnitude, bounds and trends
• Apply tests for consistency and errors (limits, units, etc.)
• Demonstrate the generalization of analytical solutions
Mapped courses: AMAT 217, AMAT 219, ENGG 201, ENGG 202, ENGG 200, MATH 211, PHYS 259, AMAT 307, CHEM 357, ENGG 311, ENGG 317, ENGG 319, ENGG 225, ENGG 407, PHYS 369, ENGG 513, capstone

2.1.4 Analysis with Uncertainty
• Elicit incomplete and ambiguous information
• Apply probabilistic and statistical models of events and sequences
• Practice engineering cost-benefit and risk analysis
• Discuss decision analysis
• Schedule margins and reserves
Mapped courses: AMAT 217, AMAT 219, ENGG 200, PHYS 259, CHEM 357, ENGG 349, PHYS 369, ENGG 513, capstone

2.1.5 Solution and Recommendation
• Synthesize problem solutions
• Analyze essential results of solutions and test data
• Analyze and reconcile discrepancies in results
• Formulate summary recommendations
• Appraise possible improvements in the problem solving process
Mapped courses: CHEM 209, ENGG 201, ENGG 202, ENGG 233, ENGG 200, AMAT 307, ENGG 311, ENGG 317, ENGG 319, ENGG 225, ENGG 349, ENGG 407, ENGG 513, capstone


3.1.3 Investigation: An ability to conduct investigations of complex problems by methods that include appropriate experiments, analysis and interpretation of data, and synthesis of information in order to reach valid conclusions.

CDIO Syllabus topics, learning outcomes, and mapped courses (combined across the Introduce, Teach, and Utilize columns):

2.2.1 Hypothesis Formation
* Select critical questions to be examined
* Formulate hypotheses to be tested
* Discuss controls and control groups
Courses: CHEM 209, ENGG 201, ENGG 200, AMAT 307, ENGG 311, ENGG 319, ENGG 407, capstone

2.2.2 Survey of Print and Electronic Literature
* Choose the literature research strategy
* Demonstrate information search and identification using library tools (on-line catalogs, databases, search engines)
* Demonstrate sorting and classifying the primary information
* Question the quality and reliability of information
* Identify the essentials and innovations contained in the information
* Identify research questions that are unanswered
* List citations to references
Courses: CHEM 209, ENGG 201, ENGG 200, ENGG 225, capstone

2.2.3 Experimental Inquiry
* Formulate experimental concept and strategy
* Discuss the precautions when humans are used in experiments
* Execute experimental construction
* Execute test protocols and experimental procedures
* Execute experimental measurements
* Analyze and report experimental data
* Compare experimental data vs. available models
Courses: CHEM 209, ENGG 201, ENGG 233, ENGG 200, PHYS 259, CHEM 357, ENGG 311, ENGG 317, ENGG 225, PHYS 369, capstone

2.2.4 Hypothesis Test and Defense
* Discuss the statistical validity of data
* Discuss the limitations of data employed
* Prepare conclusions, supported by data, needs and values
* Appraise possible improvements in knowledge discovery process
Courses: CHEM 209, ENGG 201, PHYS 259, ENGG 311, ENGG 317, ENGG 225, PHYS 369, capstone

4.5.5 Test, Verification, Validation and Certification
* Discuss test and analysis procedures (hardware vs. software, acceptance vs. qualification)
* Discuss the verification of performance to system requirements
* Discuss the validation of performance to customer requirements
* Explain certification to standards
Courses: ENGG 233, capstone


3.1.4 Design: An ability to design solutions for complex, open-ended engineering problems and to design systems, components or processes that meet specified needs with appropriate attention to health and safety risks, applicable standards, and economic, environmental, cultural and societal considerations.

CDIO Syllabus topics, learning outcomes, and mapped courses (combined across the Introduce, Teach, and Utilize columns):

2.3.1 Thinking Holistically
* Identify and define a system, its behavior, and its elements
* Use trans-disciplinary approaches that ensure the system is understood from all relevant perspectives
* Identify the societal, enterprise and technical context of the system
* Identify interactions external to the system, and the behavioral impact of the system
Courses: CHEM 209, ENGG 202, ENGG 200, CHEM 357, ENGG 225, ENGG 349, ENGG 513, capstone

2.3.2 Emergence and Interactions in Systems
* Discuss the abstractions necessary to define and model a system
* Identify the behavioral and functional properties (intended and unintended) which emerge from the system
* Identify the important interfaces among elements
* Recognize evolutionary adaptation over time
Courses: CHEM 209, ENGG 202, ENGG 233, CHEM 357, ENGG 311, ENGG 349, ENGG 513, capstone

2.3.3 Prioritization and Focus
* Locate and classify all factors relevant to the system in the whole
* Identify the driving factors from among the whole
* Explain resource allocations to resolve the driving issues
Courses: CHEM 209, ENGG 233, ENGG 200, CHEM 357, ENGG 225, ENGG 349, ENGG 513, capstone

2.3.4 Tradeoffs, Judgement and Balance in Resolution
* Identify tensions and factors to resolve through trade-offs
* Choose and employ solutions that balance various factors, resolve tensions and optimize the system as a whole
* Describe flexible vs. optimal solutions over the system lifetime
* Appraise possible improvements in the system thinking used
Courses: ENGG 233, ENGG 200, CHEM 357, ENGG 225, ENGG 349, ENGG 513, capstone

4.3.1 Setting System Goals and Requirements
* Identify market needs and opportunities
* Elicit and interpret customer needs
* Identify opportunities that derive from new technology or latent needs
* Explain factors that set the context of the requirements
* Identify enterprise goals, strategies, capabilities and alliances
* Locate and classify competitors and benchmarking information
* Interpret ethical, social, environmental, legal and regulatory influences
* Explain the probability of change in the factors that influence the system, its goals and resources available
* Interpret system goals and requirements
* Identify the language/format of goals and requirements
* Identify initial target goals (based on needs, opportunities and other influences)
* Explain system performance metrics
* Interpret requirement completeness and consistency
Courses: ENGG 200, ENGG 513, capstone

4.3.2 Defining Function, Concept and Architecture
* Identify necessary system functions (and behavioral specifications)
* Select system concepts
* Identify the appropriate level of technology
* Analyze trade-offs among and recombination of concepts
* Identify high level architectural form and structure
* Discuss the decomposition of form into elements, assignment of function to elements, and definition of interfaces
Courses: ENGG 233, ENGG 200, capstone

4.3.3 Modeling of System and Ensuring Goals Can Be Met
* Locate appropriate models of technical performance
* Discuss the concept of implementation and operations
* Discuss life cycle value and costs (design, implementation, operations, opportunity, etc.)
* Discuss trade-offs among various goals, function, concept and structure, and iteration until convergence
Courses: capstone

4.4.1 The Design Process
* Choose requirements for each element or component derived from system level goals and requirements
* Analyze alternatives in design
* Select the initial design
* Use prototypes and test articles in design development
* Execute appropriate optimization in the presence of constraints
* Demonstrate iteration until convergence
* Synthesize the final design
* Demonstrate accommodation of changing requirements
Courses: ENGG 233, ENGG 200, ENGG 311, ENGG 317, capstone

4.4.2 The Design Phasing and Approaches
* Explain the activities in the phases of system design (e.g., conceptual, preliminary, and detailed design)
* Discuss process models appropriate for particular development projects (waterfall, spiral, concurrent, etc.)
* Discuss the process for single, platform and derivative products
Courses: ENGG 233, ENGG 200, ENGG 513, capstone

4.4.3 Utilization of Knowledge in Design
* Utilize technical and scientific knowledge
* Practice creative and critical thinking, and problem solving
* Discuss prior work in the field, standardization and reuse of designs (including reverse engineering and redesign)
* Discuss design knowledge capture
Courses: ENGG 233, ENGG 200, CHEM 357, capstone

4.4.4 Disciplinary Design
* Choose appropriate techniques, tools, and processes
* Explain design tool calibration and validation
* Practice quantitative analysis of alternatives
* Practice modeling, simulation and test
* Discuss analytical refinement of the design
Courses: ENGG 233, ENGG 200, capstone

4.5.1 Designing the Implementation Process
* State the goals and metrics for implementation performance, cost and quality
* Recognize the implementation system design
Courses: ENGG 200, ENGG 513, capstone

4.6.1 Designing and Optimizing Operations
* Interpret the goals and metrics for operational performance, cost, and value
* Explain operations process architecture and development
* Explain operations (and mission) analysis and modeling
Courses: ENGG 233


3.1.5 Use of engineering tools: An ability to create, select, apply, adapt, and extend appropriate techniques, resources, and modern engineering tools to a range of engineering activities, from simple to complex, with an understanding of the associated limitations.

CDIO Syllabus topics, learning outcomes, and mapped courses (combined across the Introduce, Teach, and Utilize columns):

1.3 Advanced Engineering Fundamental Knowledge
Courses: ENGG 201, ENGG 317, ENGG 407, capstone

4.5.2 Hardware Manufacturing Process
* Describe the manufacturing of parts
* Describe the assembly of parts into larger constructs
* Define tolerances, variability, key characteristics and statistical process control
Courses: ENMF 417, ENGG 513, capstone

4.5.3 Software Implementing Process
* Explain the breakdown of high-level components into module designs (including algorithms and data structures)
* Discuss algorithms (data structures, control flow, data flow)
* Describe the programming language
* Execute low-level design (coding)
* Describe the system build
Courses: ENGG 233, ENGG 200, ENGG 513, capstone

4.5.4 Hardware Software Integration
* Describe the integration of software in electronic hardware (size of processor, communications, etc.)
* Describe the integration of software with sensors, actuators and mechanical hardware
* Describe hardware/software function and safety
Courses: ENGG 513, capstone


3.1.6 Individual and team work: An ability to work effectively as a member and leader in teams, preferably in a multi-disciplinary setting.

CDIO Syllabus topics, learning outcomes, and mapped courses (combined across the Introduce, Teach, and Utilize columns):

3.1.1 Forming Effective Teams
* Identify the stages of team formation and life-cycle
* Interpret task and team processes
* Identify team roles and responsibilities
* Analyze the goals, needs and characteristics (work styles, cultural differences) of individual team members
* Analyze the strengths and weaknesses of the team
* Discuss ground rules on norms of team confidentiality, accountability, and initiative
Courses: ENGG 201, ENGG 202, ENGG 200, PHYS 259, ENGG 311, ENGG 225, ENGG 317, PHYS 369, ENGG 481, ENGG 513, capstone

3.1.2 Team Operation
* Choose goals and agenda
* Execute the planning and facilitation of effective meetings
* Apply team ground rules
* Practice effective communication (active listening, collaboration, providing and obtaining information)
* Demonstrate positive and effective feedback
* Practice the planning, scheduling and execution of a project
* Formulate solutions to problems (creativity and decision making)
* Practice conflict negotiation and resolution
Courses: ENGG 200, PHYS 259, ENGG 311, ENGG 225, ENGG 317, PHYS 369, ENGG 481, ENGG 513, capstone

3.1.3 Team Growth and Evolution
* Discuss strategies for reflection, assessment, and self-assessment
* Identify skills for team maintenance and growth
* Identify skills for individual growth within the team
* Explain strategies for team communication and writing
Courses: ENGG 200, ENGG 311, ENGG 225, ENGG 481, ENGG 513, capstone

3.1.4 Leadership
* Explain team goals and objectives
* Practice team process management
* Practice leadership and facilitation styles (directing, coaching, supporting, delegating)
* Explain approaches to motivation (incentives, example, recognition, etc.)
* Practice representing the team to others
* Describe mentoring and counseling
Courses: ENGG 200, ENGG 513, capstone

3.1.5 Technical Teaming
* Describe working in different types of teams: cross-disciplinary teams (including non-engineers); small teams vs. large teams; distance, distributed and electronic environments
* Demonstrate technical collaboration with team members
Courses: ENGG 513, capstone


3.1.7 Communication skills: An ability to communicate complex engineering concepts within the profession and with society at large. Such ability includes reading, writing, speaking and listening, and the ability to comprehend and write effective reports and design documentation, and to give and effectively respond to clear instructions.

CDIO Syllabus topics, learning outcomes, and mapped courses (combined across the Introduce, Teach, and Utilize columns):

3.2.1 Communications Strategy
* Analyze the communication situation
* Choose a communications strategy
Courses: ENGG 200, ENGG 209, ENGG 481, capstone

3.2.2 Communications Structure
* Construct logical, persuasive arguments
* Construct the appropriate structure and relationship amongst ideas
* Choose relevant, credible, accurate supporting evidence
* Practice conciseness, crispness, precision and clarity of language
* Analyze rhetorical factors (e.g., audience bias)
* Identify cross-disciplinary, cross-cultural communications
Courses: ENGG 201, ENGG 233, ENGG 200, ENGG 311, ENGG 225, PHYS 369, ENGG 209, ENGG 481, ENGG 513, capstone

3.2.3 Written Communication
* Demonstrate writing with coherence and flow
* Practice writing with correct spelling, punctuation and grammar
* Demonstrate formatting the document
* Demonstrate technical writing
* Apply various written styles (informal, formal memos, reports, etc.)
Courses: AMAT 217, AMAT 219, CHEM 209, ENGG 201, MATH 211, PHYS 259, ENGG 311, ENGG 225, ENGG 349, PHYS 369, ENGG 209, ENGG 481, capstone

3.2.4 Electronic/Multimedia Communication
* Demonstrate preparing electronic presentations
* Identify the norms associated with the use of email, voice mail, and videoconferencing
* Apply various electronic styles (charts, web, etc.)
Courses: AMAT 217, AMAT 219, ENGG 200, MATH 211, ENGG 311, ENGG 349, ENGG 481, capstone

3.2.5 Graphical Communication
* Demonstrate sketching and drawing
* Demonstrate construction of tables, graphs and charts
* Interpret formal technical drawings and renderings
Courses: AMAT 217, AMAT 219, CHEM 209, ENGG 201, ENGG 202, ENGG 233, ENGG 200, MATH 211, PHYS 259, CHEM 357, ENGG 311, ENGG 225, ENGG 349, PHYS 369, ENGG 481, capstone

3.2.6 Oral Presentation & Interpersonal Communication
* Practice preparing presentations and supporting media with appropriate language, style, timing and flow
* Use appropriate nonverbal communications (gestures, eye contact, poise)
* Demonstrate answering questions effectively
Courses: CHEM 209, ENGG 200, ENGG 225, ENGG 421, ENGG 481, capstone


3.1.8 Professionalism: An understanding of the roles and responsibilities of the professional engineer in society, especially the primary role of protection of the public and the public interest.

CDIO Syllabus topics, learning outcomes, and mapped courses (combined across the Introduce, Teach, and Utilize columns):

4.1.1 Roles and Responsibility of Engineers
* Recognize and accept the goals and roles of the engineering profession
* Recognize and accept the responsibilities of engineers to society
Courses: ENGG 201, ENGG 202, ENGG 233, ENGG 317, ENGG 481, ENGG 513, capstone

4.1.3 Society's Regulation of Engineering
* Accept the role of society and its agents to regulate engineering
* Recognize the way in which legal and political systems regulate and influence engineering
* Describe how professional societies license and set standards
* Describe how intellectual property is created, utilized and defended
Courses: ENGG 200, ENGG 317, ENGG 481, ENGG 513, capstone


3.1.9 Impact of engineering on society and the environment: An ability to analyze social and environmental aspects of engineering activities. Such ability includes an understanding of the interactions that engineering has with the economic, social, health, safety, legal, and cultural aspects of society, the uncertainties in the prediction of such interactions; and the concepts of sustainable design and development and environmental stewardship.

CDIO Syllabus topics, learning outcomes, and mapped courses (combined across the Introduce, Teach, and Utilize columns):

4.1.2 The Impact of Engineering on Society
* Analyze the impact of engineering on the environment, social, knowledge and economic systems in modern culture
Courses: ENGG 201, ENGG 202, ENGG 233, ENGG 200, CHEM 357, ENGG 317, ENGG 481, ENGG 513, capstone

4.1.4 The Historical and Cultural Context
* Describe the diverse nature and history of human societies as well as their literary, philosophical, and artistic traditions
* Describe the discourse and analysis appropriate to the discussion of language, thought and values
Courses: ENGG 201, ENGG 233, ENGG 200, CHEM 357, ENGG 311, ENGG 317, ENGG 225, ENGG 349, ENGG 481, ENGG 513

4.1.5 Contemporary Issues and Values
* Describe the important contemporary political, social, legal and environmental issues and values
* Define the process by which contemporary values are set, and one's role in these processes
* Define the mechanisms for expansion and diffusion of knowledge
Courses: ENGG 201, CHEM 357, ENGG 311, ENGG 317, ENGG 209, ENGG 481, ENGG 513, capstone

4.1.6 Developing a Global Context
* Describe the internationalization of human activity
* Recognize the similarities and differences in the political, social, economic, business and technical norms of various cultures
* Recognize international inter-enterprise and intergovernmental agreements and alliances
Courses: ENGG 200, CHEM 357, ENGG 311, ENGG 317, ENGG 209, ENGG 481, ENGG 513


3.1.10 Ethics and equity: An ability to apply professional ethics, accountability, and equity.

CDIO Syllabus topics, learning outcomes, and mapped courses (combined across the Introduce, Teach, and Utilize columns):

2.5.1 Professional Ethics, Integrity, Responsibility & Accountability
* Demonstrate one's ethical standards and principles
* Demonstrate the courage to act on principle despite adversity
* Identify the possibility of conflict between professionally ethical imperatives
* Demonstrate an understanding that it is acceptable to make mistakes, but that one must be accountable for them
* Practice proper allocation of credit to collaborators
Courses: ENGG 200, ENGG 513, capstone

2.5.2 Professional Behavior
* Discuss a professional bearing
* Explain professional courtesy
* Identify international customs and norms of interpersonal contact
Courses: ENMF 417, ENGG 513, capstone


3.1.11 Economics and project management: An ability to appropriately incorporate economics and business practices including project, risk, and change management into the practice of engineering and to understand their limitations.

CDIO Syllabus topics, learning outcomes, and mapped courses (combined across the Introduce, Teach, and Utilize columns):

2.4.7 Time and Resource Management
* Discuss task prioritization
* Explain the importance and/or urgency of tasks
* Explain efficient execution of tasks
Courses: AMAT 217, AMAT 219, CHEM 209, ENGG 201, ENGG 202, ENGG 233, ENGG 200, MATH 211, AMAT 307, ENGG 311, ENGG 319, ENGG 225, ENGG 349, ENGG 407, ENGG 209, ENGG 481, ENGG 513, capstone

4.2.2 Enterprise Strategy, Goals, and Planning
* State the mission and scope of the enterprise
* Recognize an enterprise's core competence and markets
* Recognize the research and technology process (centralized vs. distributed)
* Recognize key alliances and supplier relations
* List financial and managerial goals and metrics
* Recognize financial planning and control
* Describe stake-holder relations (with owners, employees, customers, etc.)
Courses: ENGG 225, ENGG 209, ENGG 481, capstone

4.3.4 Development Project Management
* Describe project control for cost, performance, and schedule
* Explain appropriate transition points and reviews
* Explain configuration management and documentation
* Interpret performance compared to baseline
* Define earned value process
* Discuss the estimation and allocation of resources
* Identify risks and alternatives
* Describe possible development process improvements
Courses: ENGG 200, ENGG 513, ENME 599, capstone

4.5.6 Implementation Management
* Describe the organization and structure for implementation
* Discuss sourcing, partnering, and supply chains
* Recognize control of implementation cost, performance and schedule
* Describe quality and safety assurance
* Describe possible implementation process improvements
Courses: capstone

4.6.6 Operations Management
* Describe the organization and structure for operations
* Recognize partnerships and alliances
* Recognize control of operations cost, performance and scheduling
* Describe quality and safety assurance
* Define life cycle management
* Recognize possible operations process improvements


3.1.12 Life-long learning: An ability to identify and to address their own educational needs in a changing world in ways sufficient to maintain their competence and to allow them to contribute to the advancement of knowledge.

CDIO Syllabus topics, learning outcomes, and mapped courses (combined across the Introduce, Teach, and Utilize columns):

2.4.5 Awareness of One's Personal Knowledge, Skills and Attitude
* Reflect on one's skills, interests, strengths, weaknesses
* Discuss the extent of one's abilities, and one's responsibility for self-improvement to overcome important weaknesses
* Discuss the importance of both depth and breadth of knowledge
Courses: AMAT 217, AMAT 219, CHEM 209, ENGG 202, ENGG 233, ENGG 200, AMAT 307, CHEM 357, ENGG 317, ENGG 319, ENGG 225, ENGG 349, ENGG 407, ENGG 481, ENGG 513, capstone

2.4.6 Curiosity and Life-long Learning
* Discuss the motivation for continued self-education
* Demonstrate the skills of self-education
* Describe one's own learning style
* Describe the importance of developing relationships with mentors
Courses: AMAT 217, AMAT 219, CHEM 209, ENGG 201, ENGG 233, CHEM 357, ENGG 311, ENGG 317, ENGG 225, ENGG 349, ENGG 209, ENGG 481, ENGG 513, capstone

2.5.4 Staying Current on the World of Engineering
* Discuss the potential impact of new scientific discoveries
* Describe the social and technical impact of new technologies and innovations
* Discuss a familiarity with current practice/technology in engineering
* Explain the links between engineering theory and practice
Courses: CHEM 209, ENGG 201, ENGG 202, ENGG 233, ENGG 200, ENGG 311, ENGG 317, ENGG 225, ENGG 481, ENGG 513, capstone


Exhibit 5: Graduate Attributes Assessment Resources

Exhibit 5(f): Surveys (final year, alumni, employer)


Exhibit 5: Graduate Attributes Assessment Resources

Exhibit 5(g): Reflective Memo Templates (ENGG 481, capstone)


Schulich School of Engineering Page 1 of 3 Reflective Memo – Capstone Design

Schulich School of Engineering Capstone Design Reflective Memo for 2010-2011

Course Number and Title:

Term(s):

Instructor(s):

1) CEAB Graduate Attributes and Performance Indicators (Learning Outcomes)

3.1.4 Design: An ability to design solutions for complex, open-ended engineering problems and to design systems, components or processes that meet specified needs with appropriate attention to health and safety risks, applicable standards, and economic, environmental, cultural and societal considerations.

1. Elicit and interpret customer needs.

2. Interpret ethical, social, environmental, legal and regulatory influences.

3. Identify and explain system performance metrics.

4. Select concepts and analyze the trade-offs among and recombination of alternative concepts.

5. Decompose and assign function to elements, and define interfaces.

6. Use prototypes and test articles for design validation.

7. Demonstrate iteration until convergence and synthesize the final design.

8. Demonstrate accommodation of changing requirements.

3.1.6 Individual and team work: An ability to work effectively as a member and leader in teams, preferably in a multi-disciplinary setting.

9. Identify the stages of team formation and life-cycle as well as the roles and responsibilities of team members.

10. Evaluate team effectiveness and plan for improvements.

11. Execute the planning and facilitation of effective meetings.

12. Practice conflict negotiation and resolution.

13. Assume responsibility for own work and participate equitably.

14. Exercise initiative and contribute to team goal setting.

15. Demonstrate capacity for initiative and technical or team leadership while respecting others' roles.


3.1.7 Communication skills: An ability to communicate complex engineering concepts within the profession and with society at large. Such ability includes reading, writing, speaking and listening, and the ability to comprehend and write effective reports and design documentation, and to give and effectively respond to clear instructions.

16. Construct logical and persuasive arguments.

17. Practice conciseness, crispness, precision and clarity of language.

18. Demonstrate writing with coherence and flow.

19. Practice writing with correct spelling, punctuation and grammar.

20. Apply various written styles (informal, formal, memos, reports, etc.).

21. Demonstrate sketching and drawing.

22. Demonstrate construction of tables, graphs and charts.

23. Interpret formal technical drawings and renderings.

24. Deliver clear and organized formal presentation following established guidelines.

25. Use appropriate referencing to cite previous work.

26. Adapt format, content, organization, and tone for various audiences.

2) Teaching and Assessment Methods
What teaching and assessment methods did you use to address the learning outcomes identified in Section 1?

3) Student Learning
How well did the students perform on each learning outcome? (Where possible, make reference to specific data to support your conclusion.)

4) Continuous Improvement
What actions did you take this semester to improve the subject as a result of previous reflections or input from students or colleagues?

What did you learn about your teaching and assessment methods this semester?

What actions do you recommend to improve this subject in the future?


Attachments
1. Course Outline

2. Copies of assessments described in Section 2

3. Copies of scoring guides or rubrics used to evaluate the assessments described in Section 2

4. Results of the assessments described in Section 3 (e.g., class records of the assessments)


Schulich School of Engineering Page 1 of 3 Reflective Memo – ENGG 481

Schulich School of Engineering Technology & Society Reflective Memo for 2010-2011

Course Number and Title:

Term(s):

Instructor(s):

1) CEAB Graduate Attributes and Performance Indicators (Learning Outcomes)

3.1.6 Individual and team work: An ability to work effectively as a member and leader in teams, preferably in a multi-disciplinary setting.

1. Identify the stages of team formation and life-cycle as well as the roles and responsibilities of team members.

2. Evaluate team effectiveness and plan for improvements.

3. Execute the planning and facilitation of effective meetings.

4. Practice conflict negotiation and resolution.

5. Assume responsibility for own work and participate equitably.

6. Exercise initiative and contribute to team goal setting.

7. Demonstrate capacity for initiative and technical or team leadership while respecting others' roles.

3.1.7 Communication skills: An ability to communicate complex engineering concepts within the profession and with society at large. Such ability includes reading, writing, speaking and listening, and the ability to comprehend and write effective reports and design documentation, and to give and effectively respond to clear instructions.

8. Construct logical and persuasive arguments.

9. Practice conciseness, crispness, precision and clarity of language.

10. Demonstrate writing with coherence and flow.

11. Practice writing with correct spelling, punctuation and grammar.

12. Apply various written styles (informal, formal, memos, reports, etc.).

13. Demonstrate sketching and drawing.

14. Demonstrate construction of tables, graphs and charts.

15. Interpret formal technical drawings and renderings.

16. Deliver clear and organized formal presentation following established guidelines.

17. Use appropriate referencing to cite previous work.

18. Adapt format, content, organization, and tone for various audiences.


3.1.9 Impact of engineering on society and the environment: An ability to analyze social and environmental aspects of engineering activities. Such ability includes an understanding of the interactions that engineering has with the economic, social, health, safety, legal, and cultural aspects of society, the uncertainties in the prediction of such interactions; and the concepts of sustainable design and development and environmental stewardship.

19. Analyze the impact of engineering on the environment, social, knowledge and economic systems in modern culture.

20. Describe the important contemporary political, social, legal and environmental issues and values.

21. Define the process by which contemporary values are set, and one's role in these processes.

2) Teaching and Assessment Methods
What teaching and assessment methods did you use to address the learning outcomes identified in Section 1?

3) Student Learning
How well did the students perform on each learning outcome? (Where possible, make reference to specific data to support your conclusion.)

4) Continuous Improvement
What actions did you take this semester to improve the subject as a result of previous reflections or input from students or colleagues?

What did you learn about your teaching and assessment methods this semester?

What actions do you recommend to improve this subject in the future?

Attachments
1. Course Outline


2. Copies of assessments described in Section 2

3. Copies of scoring guides or rubrics used to evaluate the assessments described in Section 2

4. Results of the assessments described in Section 3 (e.g., class records of the assessments)


Exhibit 5: Graduate Attributes Assessment Resources

Exhibit 5(h): Reflective Memo – ENGG 481


Schulich School of Engineering ENGG 481 Reflective Memo for 2010-2011

Course Number and Title: ENGG 481 Technology & Society

Term(s): Winter 2011

Instructor(s): Marjan Eggermont

1) CEAB Graduate Attributes and Performance Indicators (Learning Outcomes)

3.1.6 Individual and team work: An ability to work effectively as a member and leader in teams, preferably in a multi-disciplinary setting.

1. Identify the stages of team formation and life-cycle as well as the roles and responsibilities of team members.

2. Evaluate team effectiveness and plan for improvements.

3. Execute the planning and facilitation of effective meetings.

4. Practice conflict negotiation and resolution.

5. Assume responsibility for own work and participate equitably.

6. Exercise initiative and contribute to team goal setting.

7. Demonstrate capacity for initiative and technical or team leadership while respecting others’ roles.

3.1.7 Communication skills: An ability to communicate complex engineering concepts within the profession and with society at large. Such ability includes reading, writing, speaking and listening, and the ability to comprehend and write effective reports and design documentation, and to give and effectively respond to clear instructions.

8. Construct logical and persuasive arguments.

9. Practice conciseness, crispness, precision and clarity of language.

10. Demonstrate writing with coherence and flow.

11. Practice writing with correct spelling, punctuation and grammar.

12. Apply various written styles (informal, formal, memos, reports, etc.).

13. Demonstrate sketching and drawing.

14. Demonstrate construction of tables, graphs and charts.

15. Interpret formal technical drawings and renderings.

16. Deliver clear and organized formal presentations following established guidelines.

17. Use appropriate referencing to cite previous work.

18. Adapt format, content, organization, and tone for various audiences.


3.1.9 Impact of engineering on society and the environment: An ability to analyze social and environmental aspects of engineering activities. Such ability includes an understanding of the interactions that engineering has with the economic, social, health, safety, legal, and cultural aspects of society, the uncertainties in the prediction of such interactions; and the concepts of sustainable design and development and environmental stewardship.

19. Analyze the impact of engineering on environmental, social, knowledge, and economic systems in modern culture.

20. Describe the important contemporary political, social, legal and environmental issues and values.

21. Define the process by which contemporary values are set, and one's role in these processes.

2) Teaching and Assessment Methods What teaching and assessment methods did you use to address the learning outcomes identified in Section 1?

The lectures for ENGG 481 were presented by layering each new lecture on top of the previous ones. Each week a theme was introduced and traced historically from 1800 to the present. The next theme was then shown alongside the one from the previous week, so that by the end of the semester we were looking at material from about 11 or 12 themes simultaneously. This showed the students how all areas of society are linked and how technology influences these areas. The lectures are available in the ENGG 481 binder; examples of themes include Energy, Production Processes, Communication, Computation, Medicine, Design, and Financial Engineering.

Students were asked to keep a notebook of lecture material and to expand on it each week by adding two pages of research on a topic of their choice that was discussed in class. This notebook was assessed at week 6 and at the end of term using Rubric 2. Examples of research notes can be found in the ENGG 481 binder.

Students each participated in a four-week block of seminars. Each group gave an oral presentation at the end of the block, during which every student had to speak and present (Rubric 1). Students then completed an individual response (a timeline) to the seminar video. Examples can be found in the ENGG 481 binder and on the CDs.

Students also had two large individual projects, due in weeks 7 and 13 of the course. The first was an analysis project (Rubric 3) examining the impact of a technology on society using global engineering attributes. A popular choice was social media and the differences between its use in the West and its use over the past few months in the Middle East. Examples of analysis projects can be found in the ENGG 481 binder and on the CDs. This assignment focused heavily on graduate attribute 3.1.9, Impact of engineering on society and the environment.

The final deliverable for the course was a visual book report, due at the end of term. Students were asked to read a technology-related non-fiction book (examples: The Omnivore's Dilemma, The End of Oil, The Post-American World, Can Asians Think?, Citizen Engineer). They had to summarize each chapter via a data visualization; examples were given via the website The Periodic Table of Visualization Methods. I think many of our engineers will have to give presentations in the future where they have to summarize large amounts of data and information in concise, easy-to-read slides or reports. This exercise hopefully exposed them to the many different forms information can take. Examples of book reports are also available in the ENGG 481 binder.

Tables 1 through 3 summarize the assessment methods used in ENGG 481 to address the learning outcomes in Section 1.

Table 1. Assessment Methods for Graduate Attribute 3.1.6 “Individual and team work”

(Not all indicators were addressed due to limited group-work opportunities)

Performance Indicator / Learning Outcome | Evidence

3. Execute the planning and facilitation of effective meetings.

Seminar presentation (Rubric 1)

5. Assume responsibility for own work and participate equitably.

Seminar presentation (Rubric 1)

Table 2. Assessment Methods for Graduate Attribute 3.1.7 “Communication skills”

Performance Indicator / Learning Outcome | Evidence

8. Construct logical and persuasive arguments.

Visual book report project (Rubric 2)

9. Practice conciseness, crispness, precision and clarity of language.

Visual book report project (Rubric 1)

10. Demonstrate writing with coherence and flow.

Visual book report project (Rubric 1)

11. Practice writing with correct spelling, punctuation and grammar.

Visual book report project (Rubric 1)

12. Apply various written styles (informal, formal, memos, reports, etc.).

Notebook research, analysis project (Rubric 3), visual book report, individual seminar response

13. Demonstrate sketching and drawing.

Visual book report project (Rubric 1)

14. Demonstrate construction of tables, graphs and charts.

Visual book report project (Rubric 1)

16. Deliver clear and organized formal presentation following established guidelines.

Group oral presentation (Rubric 1)

17. Use appropriate referencing to cite previous work.

All projects

18. Adapt format, content, organization, and tone for various audiences.

Group presentation, individual responses, project requirements (Rubrics 1-3).

Table 3. Assessment Methods for Graduate Attribute 3.1.9 “Impact of engineering on society and the environment”


Performance Indicator / Learning Outcome | Evidence

19. Analyze the impact of engineering on environmental, social, knowledge, and economic systems in modern culture.

Analysis Project (Rubric 3: “Awareness of technology’s impact”, “Interconnectedness”)

20. Describe the important contemporary political, social, legal and environmental issues and values.

Analysis Project (Rubric 3: “Interconnectedness”, “Impact of global economy”, “Impact of decisions”)

21. Define the process by which contemporary values are set, and one's role in these processes.

Analysis Project (Rubric 3: “Impact of culture”, “Impact of ideology”, “Participation”)

3) Student Learning How well did the students perform on each learning outcome? (Where possible, make reference to specific data to support your conclusion.)

My main goal for ENGG 481 was to address graduate attribute 3.1.9, “Impact of engineering on society and the environment”, using 3.1.7 as a conduit to that goal. To assess student performance in the three graduate attributes noted in Section 1, the following summative assessments were used:

• 3.1.6 Individual and team work: Seminar Presentation (Rubric 1)

• 3.1.7 Communication skills: Book Review (Rubric 1) and Lecture Notebook (Rubric 2)

• 3.1.9 Impact of engineering on society and the environment: Analysis Project (Rubric 3)

Based on the Winter 2011 assessment results, individual student performance in each of the graduate attributes is summarized in Figure 1. As can be seen in this figure, the majority of students were assessed as “Proficient” or higher: for graduate attribute 3.1.9, “Impact of engineering on society and the environment”, 46% of the class was assessed “Proficient” and 53% was assessed “Advanced”.

4) Continuous Improvement What actions did you take this semester to improve the subject as a result of previous reflections or input from students or colleagues?

One of my goals was to keep students coming to lecture and to interest them enough in the material that doing the additional research was something they looked forward to. Unsolicited e-mails and notes from the students indicated that this was indeed the case.


Figure 1. Summary of Graduate Attributes Assessments in ENGG 481

What did you learn about your teaching and assessment methods this semester?

That the course was perhaps a bit too ambitious. I received over 1,000 items to grade; it was well worth it in the end, and it was great to see how creative and original everyone was when given the chance. I may need to re-evaluate some of the projects for next time, but I would hate to lose the outcome. I do not think a midterm and final capture the thinking of the students; their work was great to see and at times very moving.

What actions do you recommend to improve this subject in the future?

Keep the meaningful, open-ended projects. Most of the assignments were designed so students could tailor their work to their own interests (to a certain degree). Try to find creative ways to keep similar projects while doing slightly less marking.


Attachments

1. Course Outline

2. Rubric 1

3. Rubric 2

4. Rubric 3


COURSE OUTLINE Winter 2011

1. Calendar Information

ENGG 481 Technology and Society

An interpretive course on the interrelationship between technology and society. The first part of the course surveys significant historical developments within disciplinary areas such as energy, materials, production processes, structures, transport, communications, and computation. Sequence within each area: discovery, development, application, impact, future. Social and economic consequences are also considered. The latter part of the course explores contemporary problems of society and technology.

Course Hours: H(3-1.5S)

2. Learning Outcomes

At the end of this course, you will:

• be able to perform a critical analysis of the impact of engineering solutions in a global context

• be able to form opinions on how technology contributes to changes in society and vice versa

• acquire basic knowledge of many technological advances in history

• be familiar with contemporary technologies and potential future directions in technology

3. Timetable

Section | Days of the Week | Start Time | Duration (Minutes) | Location

L01 TuTh 11:00 am 75 min. ENA 201

S01 Tu 5:00 pm 75 min. A 142

S02 We 5:00 pm 75 min. A 142

S03 Th 5:00 pm 75 min. A 142

4. Course Instructors

Course Coordinator

Section Name Phone Office Email

All M. Eggermont 403-210-9888 ICT 253 [email protected]

Other Instructors: N/A

Teaching Assistants

Section Name Phone Office Email


TBA

TBA

5. Examinations

The following examinations will be held in this course: N/A

6. Use of Calculators in Examinations

N/A

7. Final Grade Determination

The final grade in this course will be based on the following components:

Component Weight

Seminar presentation 10 %

Seminar response 10 %

Lecture notebook 15 %

Analysis project part 1 30 %

Final book review 35 %

TOTAL 100 %

8. Textbook

Textbook information will be discussed on the first day of classes.

9. Course Policies

All Schulich School of Engineering students and instructors have a responsibility to familiarize themselves with the policies described in the Schulich School of Engineering Advising Syllabus available at:

http://schulich.ucalgary.ca/undergraduate/advising

10. Additional Course Information


Rubric 1 - Communication

The presenter … (rated Novice / Basic / Proficient / Advanced):

Clearly stated the purpose of the presentation

Was well organized

Was knowledgeable about the subject

Answered questions authoritatively

Spoke clearly and loudly

Maintained eye contact with the audience

Appeared confident

Adhered to time constraints

Had main points that were appropriate to the central topic

Accomplished the stated objectives


Rubric 2 - Notebook Midterm Feedback

Advanced: An entry for every lecture, including both class presentations. Strong detail in all research entries. Some image use.

Proficient: An entry for every lecture. Good detail in most research entries.

Basic: One or more missing entries. Sketchy or abbreviated detail in most research entries.

Novice: Only one or two entries. Minimal work present.


Rubric 3 – By the Metiri Group in cooperation with NCREL

Indicator Novice Basic Proficient Advanced

Awareness of technology’s impact on interconnections between nations/individuals, global economy

Novice: Student is unaware of the role that technology plays in enabling a global economy. He/she knows at a very superficial level that technology links individuals from different nations.

Basic: Student is aware that technology plays an important role in linking nations/individuals and in enabling the global economy. However, this knowledge is general, limited (e.g., student may define technology too narrowly), or includes significant misconceptions.

Proficient: Student has some understanding of the ways in which technology has been an essential part of the global economy. He/she understands some of the effects technology has had in linking nations/individuals and enabling exchange of goods, services, and information.

Advanced: Student understands, beyond grade-level expectations, how technology links nations/individuals, how it enables the global economy, and how it changes the nature of the resources (e.g., information vs. goods) that can be traded.

Understanding of the interconnectedness of the global economy

Novice: Student does not understand that economies of nations impact one another.

Basic: Student is aware that national economies impact one another, but this knowledge is general and sparse.

Proficient: Student is aware that economic conditions of one nation can impact those of other nations, but he/she is not aware of political/social/environmental issues raised by economic interdependence.

Advanced: Student understands, beyond grade-level expectations, how economies impact each other; he/she can think critically about political/social/environmental issues raised by economic interdependence.

Understanding of the impact of global economy on political decision-making

Novice: Student is unaware of the impact of economic considerations on political decision-making. He/she may be largely unaware of political events and international economic conditions.

Basic: Student is generally aware that political decisions are shaped by economic considerations; however, he/she has little knowledge of specific considerations and national/international policies.

Proficient: Student is aware of some of the economic considerations that drive political decisions. However, this knowledge is somewhat limited or tends to cast issues in black and white terms.

Advanced: Student possesses knowledge, beyond grade-level expectations, of economic considerations that drive specific national policies and decisions. He/she can critically evaluate the gains and losses that result from these policies.

Understanding the impact of decisions made by national, international organizations on societies, environment, economies

Novice: Student has no knowledge of the impacts of decisions made by national/international organizations. He/she has little knowledge of these organizations or their functions.

Basic: Student understands very generally that national and international organizations impact societal, environmental, and micro-economic conditions, but is unaware of specific policies/decisions that impact his/her world.

Proficient: Student understands how some specific decisions made by national/international organizations impact many facets of his/her day-to-day world; however, knowledge is limited or tends to cast issues in black and white.

Advanced: Student has an excellent understanding of the way specific decisions made by national/international organizations impact his/her day-to-day world. He/she is able to evaluate these issues critically and thoroughly.

Understanding of the impact of culture on political relationships

Novice: Student is unaware of the ways in which culture impacts national/personal political decision-making.

Basic: Student understands that culture impacts national/personal political decision-making, but his/her view tends to cast these issues in black and white. Knowledge is either sparse or includes significant misconceptions.

Proficient: Student understands some specific ways in which culture impacts national/personal political decision-making.

Advanced: Student has an excellent understanding of the ways in which culture impacts decision-making of specific nations/groups. This understanding is fair and takes into account multiple cultural perspectives.

Understanding of the impact of ideology, culture on decisions related to technology and access

Novice: Student is unaware of differences in societies’ access to technology and information; he/she is unaware that political ideologies and culture impact individuals’ access to these resources.

Basic: Student understands at a general level that nations differ in the degree to which they allow citizens access to technology/information. However, this knowledge is sparse.

Proficient: Student understands some of the ideological and cultural issues that drive national decisions about access to technology and information.

Advanced: Student has specific and well-developed knowledge of ways in which access to technology/information is impacted by culture and political ideology. He/she is able to transfer this knowledge when learning about similar issues with which he/she is unfamiliar.

Participation in the global society

Novice: In many cases it has not occurred to the student that persons in other nations directly influence his/her life socially, politically, and economically.

Basic: The student has a growing awareness of the global nature of the world. He/she is interested in the study of international policy and affairs, but action is limited to learning and reflection.

Proficient: The student recognizes his/her own role as an individual in a global society. As such he/she, when guided, participates locally through economic, political, or social means (e.g., donations to relief efforts, contributions to international social, health, or environmental concerns).

Advanced: The student is aware of how his/her actions and the actions of his/her country exert influence globally. He/she seeks to understand the global impact of personal actions (e.g., consumerism based on company policies, consumption of energy, or recycling), and acts accordingly.

References and Links:

Anderson, Sarah, John Cavanagh, Thea Lee, and the Institute for Policy Studies. Field Guide to the Global Economy. New York, NY: The New Press, 2000.

Annan, Kofi. Speech to Harvard University, September 17, 1998.

Castells, Manuel. The Rise of the Network Society. Oxford, U.K.: Blackwell, 1996.

Drucker, Peter (2001, November 3). "The Next Society." The Economist, Special Supplement, pp. 3-20.

The Economist (2001). Pocket World in Figures, 2001 ed. London, U.K.: Profile Books.

Foreign Policy Association (2000). Citizen's Guide to U.S. Foreign Policy: The Critical Issues. New York: Foreign Policy Association.

Friedman, Thomas L. (1999). The Lexus and the Olive Tree: Understanding Globalization. New York: Farrar, Straus & Giroux.

Friedman, Thomas L. (2001). Presentation to the Indiana Humanities Council.

Naisbitt, John (1994). Global Paradox. New York: Avon.

Sassen, Saskia (1988). The Mobility of Labor and Capital. Cambridge, U.K.: Cambridge University Press.
