
Understanding Middle States’ Expectations for Assessment

Linda Suskie, Vice President, Middle States Commission on Higher Education

3624 Market Street, Philadelphia, PA 19104 | Web: www.msche.org | E-mail: LSuskie@msche.org

AIRPO, Buffalo, June 12, 2009

Today…

1. Understanding Standard 7: Institutional Assessment

2. Sharing assessment results

3. Using assessment results

4. Telling your story to Middle States

5. Questions an MSCHE reviewer might ask

Understanding Standard 7: Institutional Assessment

Planning & Assessment as a Four-Step Cycle

1. Goals

2. Programs, Services & Initiatives

3. Assessment/Evaluation

4. Using Results

What Goals Are We Talking About?

• Institutional goals (mission & strategic plan)
  – Administrative goals

• Division goals
  – Administrative unit goals

• Student learning goals
  – Institutional
  – Gen Ed curriculum
  – Academic programs
  – Student development programs
  – Support programs

Middle States’ 14 accreditation standards:

1. Mission & Goals
2. Planning
3. Resources
4. Leadership/Governance
5. Administration
6. Integrity
7. Institutional Assessment
8. Admissions
9. Student Support Services
10. Faculty
11. Educational Offerings
12. General Education
13. Related Educ. Activities
14. Asmt. of Student Learning

Institutional Effectiveness: Are We Achieving…

• Our mission & goals (Standard 7), including:
  – Student learning (Standard 14)
  – Community service
  – Scholarship
  – Diversity
  – Revenue generation
  – Productivity/efficiency
  – Access

Strategies to Assess Institutional Goals

Assessments of student learning

• Direct evidence (clear, convincing)
  – Tests & examinations
  – Assignments, papers, projects
  – Portfolios
  – Field experience evaluations

• Indirect evidence
  – Retention, graduation, placement rates
  – Surveys of students & alumni
  – Grades

Performance indicators

• “Measures that are monitored in order to determine the health, effectiveness, & efficiency” of an institution

» Michael Dolence & Donald Norris

= Key performance indicators (KPIs)
= Key quality indicators (KQIs)
= Performance measures
= Performance metrics
= Balanced scorecard
= Dashboard indicators

Popular performance indicators

• Student retention & graduation rates

• Job placement rates

• Racial/ethnic enrollment breakdowns

• Dollar value of sponsored research grants

• Licensure & certification exam pass rates

• Faculty workload
  – Student/faculty ratio
  – Average credit enrollment per FTE faculty
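Many of these indicators are simple ratios. As a rough sketch only, using hypothetical figures rather than anything from the presentation, an institutional research office might compute a few of them like this in Python:

```python
# Hypothetical counts; real figures would come from the student information
# system or IPEDS reporting files.
fall_cohort = 1200            # first-time, full-time entering students
returned_next_fall = 984      # of that cohort, enrolled again a year later
graduated_within_6_years = 648

fte_students = 5400           # full-time-equivalent students
fte_faculty = 300             # full-time-equivalent faculty
credit_hours_taught = 81000   # total credit hours delivered in the year

retention_rate = returned_next_fall / fall_cohort
graduation_rate = graduated_within_6_years / fall_cohort
student_faculty_ratio = fte_students / fte_faculty
credits_per_fte_faculty = credit_hours_taught / fte_faculty

print(f"First-year retention rate:    {retention_rate:.0%}")
print(f"Six-year graduation rate:     {graduation_rate:.0%}")
print(f"Student/faculty ratio:        {student_faculty_ratio:.0f}:1")
print(f"Credit hours per FTE faculty: {credits_per_fte_faculty:.0f}")
```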

Common state performance indicators (National Center for Public Policy & Higher Education)

• Preparation
  – Number & quality of teachers graduating in critical fields

• Participation
  – Enrollment by race, gender, income

• Affordability
  – Discounted tuition & fees as a proportion of median income

• Completion
  – Actual & predicted graduation rates based on student preparation & aptitude

• Benefits
  – Degrees awarded in critical fields
  – Sponsored research & publications

• Learning
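Most of these state measures are also simple ratios. As an illustration only, with invented numbers (nothing here comes from the presentation or from NCPPHE data), the affordability indicator above might be computed like this:

```python
# Affordability: discounted (net) tuition & fees as a proportion of median
# family income. All figures below are hypothetical.
published_tuition_and_fees = 9800.0   # sticker price
average_grant_aid = 3200.0            # institutional + state grant discounting
median_family_income = 62000.0        # state median family income

net_price = published_tuition_and_fees - average_grant_aid
affordability = net_price / median_family_income
print(f"Net tuition & fees as a share of median income: {affordability:.1%}")
# Prints: Net tuition & fees as a share of median income: 10.6%
```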

Other examples

• Participation rates (e.g., in student activities, cultural events)

• Expenditures per FTE student

• Counts of contacts, inquiries, etc.
  – Questions to library information desk
  – Referrals to counseling center

Program reviews (academic & other)

• Common criteria for academic program reviews
  – Quality
    • Resources, activities, outcomes, etc.
  – Need
    • Demand for the program
    • Competing programs
    • Centrality to mission
  – Cost and cost-effectiveness

Baldrige National Quality Program

1. Leadership

2. Strategic planning

3. Student, stakeholder, & market focus

4. Measurement, analysis & knowledge management

5. Faculty & staff focus

6. Process management

7. Organizational performance results

Other assessment strategies

• Surveys, interviews, focus groups
• “Secret shoppers”
• Observations of students, meetings, activities
• Document reviews
  – Meeting minutes, transcript analyses, e-mails, online discussions
• Online institutional portfolios
• Quality improvement tools
  – Run charts, histograms, Pareto analyses, Six Sigma analyses
  – Activity-based costing: compare outcome against cost
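The quality-improvement tools listed above are standard charting techniques. Below is a minimal sketch of one of them, a Pareto analysis, using matplotlib and hypothetical tallies of information-desk questions (the counts and category names are invented for illustration):

```python
import matplotlib.pyplot as plt

# Hypothetical tallies of questions at a library information desk.
counts = {
    "Finding articles": 410,
    "Printing problems": 265,
    "Citation help": 120,
    "Room reservations": 85,
    "Other": 60,
}

# Pareto analysis: sort categories from highest to lowest and overlay the
# cumulative percentage, to show which few categories drive most contacts.
labels = sorted(counts, key=counts.get, reverse=True)
values = [counts[k] for k in labels]
total = sum(values)
cumulative_pct = [100 * sum(values[: i + 1]) / total for i in range(len(values))]

fig, ax = plt.subplots()
ax.bar(labels, values)                       # frequency bars
ax.set_ylabel("Contacts")
ax.tick_params(axis="x", rotation=30)

ax2 = ax.twinx()                             # second y-axis for cumulative %
ax2.plot(labels, cumulative_pct, marker="o")
ax2.set_ylim(0, 110)
ax2.set_ylabel("Cumulative %")

ax.set_title("Pareto analysis of information-desk questions (hypothetical)")
fig.tight_layout()
plt.show()
```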

Your assessment strategy must align with a goal to be useful.

Sharing Assessment Results

Why are you assessing the program or curriculum?

– Validate it to others (accountability)
– Make sure it isn’t slipping
– Improve it

Keep assessment summaries useful to you and your colleagues.

• Who needs to see the results?
• Why? What decisions will they make?
• What do they need to see to make those decisions?

What decisions might the assessment help with?

• Learning goals
  – Are our learning goals sufficiently clear and focused?

• Curriculum
  – What is the value of service learning?
  – Should our courses have more uniformity across sections?

• Teaching methods
  – Is online instruction as effective as traditional instruction?
  – Is collaborative learning more effective than lectures?
  – Are we developing a community of scholars?

• Assessments
  – Have our assessments been useful?

• Resource allocations
  – Where should we commit our resources first?

Keep assessment summaries short and simple.

• Fast and easy to read and understand
  – Use short, simple charts, graphs, and lists.
    • Use PowerPoint presentations.
    • Avoid narrative text.
  – First aggregate (sum up) data, then drill down into details as needed.
  – Round results.
  – Sort results from highest to lowest.
  – Percentages may be more meaningful than averages.
    • Avoid complex statistics.
  – As you collect results over time, show trends.
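As a minimal sketch of that advice, assuming an invented set of senior-survey responses, a pandas summary can report rounded percentages sorted from highest to lowest:

```python
import pandas as pd

# Invented survey responses: one row per graduating senior.
responses = pd.DataFrame({
    "program": ["Nursing", "Nursing", "Business", "Business", "Business",
                "History", "History", "Nursing", "Business", "History"],
    "satisfied": [True, True, False, True, True,
                  False, True, True, True, False],
})

summary = (
    responses.groupby("program")["satisfied"]
    .mean()                        # proportion satisfied in each program
    .mul(100)                      # percentages are easier to grasp than means
    .round(0)                      # round: extra decimals imply false precision
    .sort_values(ascending=False)  # sort from highest to lowest
    .rename("Percent satisfied")
)
print(summary)
```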

Tell a story.

• Key questions to address:
  – What have you learned about your students’ learning & other institutional goals?
  – What are you going to do about what you have learned?
  – When, where, and how are you going to do it?

    » Doug Eder

• Focus on “big news.”
  – Identify meaningful vs. insignificant differences.

• Find someone skilled at finding the stories in reams of data.

Using Assessment Results

When Assessment Results Are Good

Publicize!

Celebrate!

When assessment results are disappointing… Example: Student retention results

• Goals: Set a special target for male students.

• Program (curriculum): Make the advisement program mandatory.

• Implementation (pedagogy): Increase professional development for advisors.

• Assessments: Identify student goals upon entry and upon exit.

• Resource allocations: Fund professional development for advisors.

Telling Your Story to Middle States

What Should Institutions Document?

• Clear statements of goals

• Organized, sustained assessment process

– Principles, guidelines, support

– What assessments are already underway

– What assessments are planned, when, & how

• Assessment results documenting progress toward accomplishing goals

• How results have been used for improvement

How Might Institutions Document This?

• Need not be a fancy bound document!

• Need not be in a consistent format or single repository

• An overview in the report to MSCHE

• A chart or “roadmap” in the report to MSCHE or an appendix

• More thorough information in the on-site “resource room,” online, and/or burned onto CD

• A few samples of student work

– Exemplary, adequate, inadequate

Do you need special assessment software?

• What are your needs?
• How will you use the software?
• Are faculty & staff ready to use it?
• Do you have IT support?
• Ask vendors for references.
• What are the real costs?
• What is the cost-benefit balance?
• Don’t rush; involve faculty in deciding.

Questions an MSCHE Reviewer Might Ask

Is the Institution Engaged in “Good” Assessment?

• Clear & important goals
• Reasonably accurate & truthful results
• Used
• Valued
• Cost effective

(Cycle: Goals → Assessments → Improvements)

For Each Goal…

• How is the goal being assessed?
• What are the results of those assessments?
• How have those results been used for improvement?

Do Institutional Leaders Support and Value a Culture of Assessment?

• Is there adequate support for assessment?
  – Overall guidance & coordination

• Are assessment efforts recognized & valued?

• Are efforts to improve teaching recognized & valued?

How Much Has Been Implemented?

• Are there any significant gaps?

What Do Assessment Results Tell Us?

• Do results demonstrate…

– Achievement of mission and goals?

– Sufficient academic rigor?

Have Assessment Results Been Used?

• Have they been appropriately shared & discussed?

• Have they led to appropriate decisions?

– Curricula and pedagogy

– Programs and services

– Resource allocation

– Institutional goals and plans

Is the Process Sustainable?

• Simple

• Practical

• Detailed

• Ownership

• Appropriate timelines

Where is the Institution Going with Assessment?

• Will momentum slow after this review?

• What Commission action will most help the institution keep moving?

Middle States’ Five “Rules” for Assessment

1. Keep it useful.

2. Tie assessments to important goals.

3. For student learning, include some “direct” evidence.

4. Use multiple measures.

5. Keep doing something everywhere, every year.

Bottom Line on Moving Ahead

Keep assessment useful.

Keep things simple.
  • Especially in terms of time
  • Don’t create unnecessary rules.

Value assessment. Just do it!

Volunteer for Middle States Evaluation Teams!

• Go to our web site (www.msche.org).

• Click on “Evaluators.”
