TRANSCRIPT
2004-05 Accreditation Cycle
EAC Accreditation Site Visit
XYZ University
October 24-26, 2004
• Introductions
• Expectations for the team
  – documents
  – role of observers
  – support program assignments
• Overview of Schedule
• EC2000 visit details and issues
• Program overviews
Important Team Expectations
• Evaluators represent the EAC of ABET
• We are accrediting programs to state to the public that they satisfy the criteria
• Team effort—team decisions
• Confidentiality
• Conflict of interest – every visitor should have signed a conflict-of-interest statement
• Observers
  – no evaluative statements
  – exit interview—thank-you only
Confidentiality
• Do not discuss conclusions with faculty, students, and others
• Keep all materials until the July 2005 EAC meeting; at the conclusion of the accreditation process (July 2005), materials are to be destroyed
• Information specific to the institution is to remain confidential without time limit
• Institutional data is confidential except with written authorization of institution
• ABET materials only released by ABET staff
Communication
• Maintain an open line of communication with the department head
• Identify deficiencies as soon as possible
• Discuss all issues with the department head at the debriefing
• Do not discuss the recommended accreditation action with anyone except team members
Today’s Schedule
• Sunday meetings
  – Sunday a.m.
    • Brief: initial thoughts and recommendations
  – Sunday p.m.
    • More detailed discussion
    • “Pre-visit” recommended action
• Questions
• Lunch
Visit Schedule (Example): Sunday, October 24
10:30 - 11:30 AM   ABET team meeting in team room. Brief: initial thoughts and recommendations
11:30 AM/Noon      Team lunch or evaluator/chair meetings
1:30 - 4:30 PM     Meetings; examine course materials and documentation of outcomes
4:30 - 6:00 PM     Team meeting: more detailed discussion; “pre-visit” recommended action
6:00 PM            ABET team dinner
Visit Schedule (Example): Monday, October 25
[7:30 AM Transit time—meet in hotel lobby]
8:00 AM - 9:00 AM Team Meeting with engineering administration.
9:00 AM - 10:00 AM Evaluators meet with chairs or begin meetings with faculty
10:00 AM - 12:00 N Meet with faculty members to discuss program processes, outcomes, improvement
12:00 N - 1:30 PM Joint ABET team and XYZ administration lunch
1:30 PM - 3:00 PM Visit supporting departments
3:00 PM - 4:00 PM Meet with students or faculty
4:00 PM - 4:45 PM Meet with faculty or students
5:00 PM - 11:00 PM ABET team meeting and dinner
Visit Schedule (Example): Tuesday, October 26
[7:30 AM Meet in hotel lobby—evaluators may choose to schedule meetings earlier—on their own]
8:00 AM - 11:30 AM Inspect classrooms, laboratories, offices, equipment
(Additional interviews as needed: students, faculty)
11:45 AM - 12:30 PM ABET team meeting to prepare debriefing
12:30 PM - 1:00 PM Debrief chairs/dean
1:15 PM - 2:45 PM ABET team meeting and working lunch
3:00 PM - 4:00 PM Summary meeting with administration
4:00 PM Depart for airport/hotel
Support Program Assignments
Math PEV name* Time Location
Physics PEV name Time Location
Chemistry PEV name Time Location
Hum/S.S. PEV name Time Location
Libraries PEV name Time Location
Computing PEV name Time Location
*TC may help in support program reviews
Visit details
• Important forms
– Transcript & curriculum analyses
– Level of Implementation
– Program Audit Form + Explanation of Shortcoming (left on campus)
– Draft Statement—NEW FORMAT—on disc to me
– Program Evaluator Worksheet—you keep
– Short form—recommended actions
– Program Evaluator Report
PROGRAM AUDIT FORM (PROVIDE A COPY TO INSTITUTION AT EXIT MEETING)
If it doesn’t have this column, you are using an old form
Exit Statement Format
• INTRODUCTION—USEFUL PROGRAM STATISTICS
• PROGRAM ISSUES
– Strengths (special, unique or particularly conspicuous strengths)
1.
2.
– Deficiencies (In order, only for those criteria where deficiencies exist)
1. XXX
2. etc.
– Weaknesses (In order, only for those criteria where weaknesses exist)
1. YYY
2. etc.
– Concerns (In order, where concerns exist)
1. ZZZ
2. etc.
– Observations (do not have to relate to criteria)
1. etc.
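The outline above is a fixed document structure, so it can be modeled directly in code. The following is a minimal illustrative sketch only; the class and field names are hypothetical, not an ABET artifact:

```python
from dataclasses import dataclass, field

@dataclass
class ExitStatement:
    """Illustrative container mirroring the exit-statement outline above."""
    introduction: str = ""                             # useful program statistics
    strengths: list = field(default_factory=list)      # special or conspicuous strengths
    deficiencies: list = field(default_factory=list)   # in criterion order
    weaknesses: list = field(default_factory=list)     # in criterion order
    concerns: list = field(default_factory=list)       # where concerns exist
    observations: list = field(default_factory=list)   # need not relate to criteria

stmt = ExitStatement(deficiencies=["XXX"], weaknesses=["YYY"], concerns=["ZZZ"])
print(stmt.deficiencies)  # ['XXX']
```

Ordering the shortcoming lists by criterion, as the slide requires, is left to whoever fills the structure.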
EC2000 Visit details
• Consistency—things to look for
  – Evaluation and measurement of objectives
– Assessment and demonstration of outcomes
– Program improvement (closing the loop)
– Curricular and program issues
  – Faculty and students
    • As they relate to undergraduate education
Terminology
• Deficiency -- criterion is NOT satisfied.
• Weakness -- criterion is satisfied, but lacks strength of compliance to assure the quality of the program will not be compromised prior to next general review.
• Concern -- criterion is satisfied, but potential exists for non-satisfaction in the near future.
Working Definition of Key Terms
• Deficiency: assigned to any criterion that is totally or largely unmet
• Weakness: criterion is met to some meaningful extent, but compliance is insufficient to fully satisfy requirements
• Concern: criterion is fully met, but there is potential for non-compliance in the near future
• Observation: general commentary possibly, but not necessarily, related to criteria
Limit Use of Key Terms
• Use a Key Term only in reference to the overall evaluation of each criterion
Accreditation Actions
NGR Next General Review
IR Interim Report
IV Interim Visit
SC Show Cause
RE Report Extended
VE Visit Extended
SE Show Cause Extended
NA Not to Accredit
Interim visit only
Linking Actions to Terms

Results of Evaluation:
Weaknesses?    no    yes   yes   --
Deficiencies?  no    no    no    yes

Type of Review            Possible Actions
General (comprehensive)   NGR   IR    IV    SC or NA
(following a SC)          NGR   IR    IV    NA or SC*
New Program               NGR   IR    IV    NA

* Show Cause can follow a Show Cause only if progress is demonstrated.
General Review Visits
Duration of Accreditation Actions

Weak?  Def?  Action                       Duration (years)
No     No    NGR  Next General Review     6
Yes    No    IR   Interim Report          2
Yes    No    IV   Interim Visit           2
--     Yes   SC   Show Cause              1
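The table above amounts to a simple decision rule. A minimal Python sketch (illustrative only, not ABET policy; the function name is hypothetical, and the IR-vs-IV choice further depends on whether progress can be judged from a report alone):

```python
def recommended_action(has_weakness: bool, has_deficiency: bool) -> tuple:
    """Map general-review findings to an action code and duration in years,
    per the duration table above (illustrative sketch only)."""
    if has_deficiency:
        return ("SC", 1)        # Show Cause: some criterion is not satisfied
    if has_weakness:
        return ("IR or IV", 2)  # report vs. visit depends on the evidence needed
    return ("NGR", 6)           # no shortcomings: Next General Review

print(recommended_action(True, False))  # ('IR or IV', 2)
```

Note that a deficiency dominates: the `-- Yes` row applies whether or not weaknesses are also present.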
General Review Visits
Interim Actions
• Interim evaluations focus on identified deficiencies (if Show Cause) or weaknesses.
• Interim Visit: recommended only when the degree of resolution requires the evaluator to be on campus and cannot be determined by review of a report (e.g., review of student work), or when previous written information has not been effective in providing the necessary evidence. A new team is sent.
• Interim Report: recommended when resolution of shortcomings can be described by a report (e.g., faculty hiring); the current program evaluator and team chair may review the interim report, assess progress, prepare a statement (Minor U), and recommend an accreditation action.
Consistency Issues for the Team
• The depth and completeness of the evaluation from program to program
• Consistency across all programs in an institution
• The assignment of appropriate key terms (deficiency, weakness, concern) to describe shortcomings
• For weaknesses, consistency on interim recommendations--IR vs IV
Definitions
Program Educational Objectives
Program educational objectives are broad statements that describe the career and professional accomplishments that the program is preparing graduates to achieve.
Program Educational Objectives (continued)
These are also often referred to by institutions as goals, career outcomes, or standards. There are two types of objectives: those that all graduates are expected to accomplish and those that some subgroups, but not all graduates, are expected to accomplish. The audiences for objective statements are normally external constituents, such as prospective students, employers, and transfer institutions.
Program Outcomes
Program outcomes are narrower statements that describe what students are expected to know and be able to do by the time of graduation. These relate to the skills, knowledge, and behaviors that students acquire in their matriculation through the program.
Assessment
Assessment is one or more processes that identify, collect, and prepare data to evaluate the achievement of program outcomes and program educational objectives.
This reference to “or program improvement” seems out of place, since evaluating achievement of outcomes and objectives is a prerequisite to improvement, not an alternative. Besides, in our set of definitions, improvement comes after “evaluation” rather than after “assessment.”
Assessment (continued)
Often, the entire process is referred to as assessment, and the program or institution does not subdivide the overall process into component parts. While assessment data are useful for display during accreditation reviews, data alone do not provide the documented evidence for continuous improvement.
Evaluation
Evaluation is one or more processes for interpreting the data and evidence accumulated through assessment practices. Evaluation determines the extent to which program outcomes or program educational objectives are being achieved, and results in decisions and actions to improve the program.
Evaluation (continued)
Evaluation is often referred to generically by the term “assessment.” Also, “closing the loop” is a term used to describe the process of evaluation of assessment data and making improvements as a result of the analysis. It is expected that the programs would perform the evaluation of assessment data, and the documented results of the evaluation would serve as evidence of achievement of objectives and outcomes and/or evidence of an effective continuous improvement system.
Consistency Issues for Criteria 2 & 3--where issues arise
• Most consistency issues are likely to center on Criteria 2 & 3
• TCs and evaluators should understand the institution’s use of its own terminology relevant to these criteria
Level of Expectation
• How should you assess the claim “We’re working on it”?
  – Educational objectives must be defined and based on the needs of constituencies
  – All major outcomes must be defined and processes in place to assure continuous improvement
  – Assessment is an on-going activity and takes some time to reach full maturity, but some data should be available and results demonstrated
Level of Expectation
• Exactly what attributes must each graduate have?
  – A system must be in place to ensure that all graduates have, to some minimum extent, achieved the prescribed outcomes (Criterion 3) and all elements of the Professional Component (Criterion 4)
  – The level of achievement may vary, consistent with program objectives
Criterion 2: Program Educational Objectives
• Program Educational Objectives: statements that describe the expected accomplishments of graduates during the first few years after graduation¹
• Unique to the program and institution
• Consistent in all publications
¹Working definition from ABET Faculty Workshops
Criterion 2: Program Educational Objectives
• Consider a deficiency if the general intent of Criterion 2 is not met. Contributing factors may include:
  – no involvement of constituencies
  – no process-oriented approach to achieving objectives (links to curriculum)
  – no process-oriented approach to evaluate achievement of objectives
  – no data that demonstrate the extent to which objectives are met
  – no evidence of program improvement based on evaluative processes
Criterion 2: Program Educational Objectives
• Consider a weakness if the general intent of Criterion 2 is met to some extent, but not fully met. Contributing factors may include:
  – objectives are published but are not accessible to constituencies and potential students
  – limited or ad hoc involvement of constituencies
  – incomplete process for achieving objectives (links to curriculum are not clear)
  – incomplete process-oriented approach to evaluating achievement of objectives
  – evidence of program improvement based on ad hoc processes
Criterion 2: Program Educational Objectives
• Consider a concern if the general intent of the criterion is fully met, but minor issues may lead to lack of compliance in the future. Contributing factors may include:
  – objectives are published, but are changed frequently
  – objectives are evaluated, but there is limited involvement of constituencies in this process or it varies from year to year (2b)
  – program improvement processes may rely too heavily on one person
Criterion 3: Program Outcomes & Assessment
• Program outcomes: statements that describe what students are expected to know and be able to do by the time of graduation, the achievement of which indicates that the student is equipped to achieve the Program Educational Objectives¹
• ABET-designated outcomes (a-k) included in some way
• Program may add others
¹Working definition from ABET Regional Faculty Workshops
Criterion 3: Program Outcomes and Assessment
• Consider a deficiency if the general intent of Criterion 3 is not met. Contributing factors may include:
  – no documented working process(es) to produce outcomes
  – loop not closed on any outcomes
  – absence of defined goals and documented assessment results
  – no evidence that demonstrates achievement of outcomes
  – no evidence of efforts at program improvement based on assessment
Criterion 3: Program Outcomes and Assessment
• Consider a weakness if the general intent of Criterion 3 is met to some extent, but not fully met. Contributing factors may include:
  – absence of a working process(es) to produce some outcomes
  – loop closed on only some outcomes
  – defined goals and documented assessment results for only some outcomes
  – absence of demonstration of a small number of outcomes
  – incomplete or ad hoc evidence of efforts at program improvement based on assessment
Criterion 3: Program Outcomes & Assessment
• Consider a concern if the general intent of the criterion is fully met, but minor issues may lead to lack of compliance in the future. Contributing factors may include:
  – process to produce some outcomes is possibly inconsistent and may lead to circumstances in which their quality is insufficient to meet program metrics
  – loop closed on most outcomes, but some important assessment results have not been acted upon
  – inconsistent coverage or demonstration of a small number of outcomes (e.g., variation from year to year in the same course taught by different faculty members); may be overly dependent on one person
Post-Visit Process
• 30-Day Due Process Response from institution
• TC edits “Final” Statement to add 30-day response (Major U or Minor U format); checks with evaluators if needed
• TC updates PAF and Short Form
• EAC takes final accreditation action
• ABET sends Final Statement and accreditation letter to institution
• Team Chair and Program Evaluators fill out on-line evaluation forms
[Flow diagram: Draft Statement / Due Process: 14-Day Response. Boxes: program teams reporting to the Team Chair; Editor; EAC Chair; ABET Headquarters; Institution. The draft statement is sent to the institution for its 14-day due-process response, which returns to the EAC team.]

[Flow diagram: Due Process: Institutional Response. Same participants (teams, Team Chair, Editor, EAC Chair, ABET HQ); the 30-day institutional response flows back from the institution to the EAC team.]

[Flow diagram: Due Process: After 30-Day Response / Final Statement. Boxes: program teams, Team Chair, Editor, EAC Chair, ABET Headquarters, EAC Meeting, Institution, ABET President, Professional Societies. The final statement is issued to the institution.]