
Actionable Assessment of Academic Programs:
Principles and Practices for Usable Results

Jo Michelle Beld
Professor of Political Science
Vice President for Mission

Integrative Learning and the Departments – July 2014

Agenda

Principles: Conceptual frameworks

Practitioners: Case studies

Practices: Specific strategies

Principles

Utilization-focused assessment (Patton, 2008):

Focus on intended uses by intended users

Principles

Backward design (Wiggins & McTighe, 2005):

“Beginning with the end in mind”

Principles

Traditional assessment design:

Choose an assessment instrument

Gather and summarize evidence

Send a report to someone

Principles

Backward assessment design:

Identify intended uses

Define and locate the learning

Choose assessment approach

Practitioners

Sample departments/programs (pp. 2-3):

Studio Art

Chemistry

History

Statistics

Management Studies

Interdisciplinary Programs

Practitioners

Studio Art (p. 4)

• Developed evaluation form for senior exhibit that doubles as assessment instrument

• Addressed disconnect between student and faculty criteria for artistic excellence

• Revised requirements for the major

• Refocused common foundation-level courses

Practitioners

Chemistry

• Used ACS exam as final in Chem 371: Physical Chemistry

• Students outperformed national average and did well in kinetics despite limited coverage in course

• Chem 371 was retooled to focus on thermodynamics and quantum mechanics

Practitioners

History (p. 5): % “exemplary” or “satisfactory” ability to…

                                          First-year    Senior
                                          seminars      seminars
Identify historiographical debates           64%          94%
Use historiography to develop argument       54%          60%

• Examining ability to understand and work with historiography in new intermediate seminars for the major

Practitioners

Statistics

• Collaboratively designed final exam question and grading rubric in Stats 270 to examine interpretation and communication of results; two faculty graded essays

• Instructor adjusted teaching to address identified weaknesses

• Instructor benefited from mentoring by senior faculty

Practitioners

Management Studies

• Quiz scores: Teams outperformed best individual students

• Course evaluations: Students believed they learned “much” or “exceptional amount” in teams (73%)

• Team-based learning being extended to other courses

Mean Results – Management Studies 251 Course Quizzes

             Highest individual score    Team quiz score
Section A              4.36                   4.79
Section B              4.39                   4.74

Practitioners

Interdisciplinary programs (pp. 6-8):

• Collaboratively developed assessment questionnaire

• Next: Direct assessment of student work products

– Latin American Studies: AAC&U Critical Thinking rubric

– Women’s and Gender Studies: rating students’ use of terms/concepts

Principles (redux)

Uses in individual courses:

• Setting priorities for content/instruction

• Revising/expanding assignments

• Clarifying expectations for students

• Enhancing “scaffolding”

• Piloting or testing innovations/changes

• Affirming current practices

Principles (redux)

Uses in the program as a whole:

• Strengthening program coherence

• Sending consistent messages to students

• Revising program requirements

• Extending productive pedagogies

• Affirming current practices

Principles (redux)

More program uses:

• Telling the program’s story to graduate schools and employers

• Enhancing visibility to disciplinary and inter-disciplinary associations

• Supporting grant applications

• Meeting requirements for specialized accreditation

Principles (redux)

What specific use of assessment in your department or program could make assessment more meaningful for your colleagues?

Practices


“Direct” Assessment

Evidence of what students actually know, can do, or care about

“Indirect” Assessment

Evidence of learning-related experiences or perceptions

Practices

Common indirect assessment “artifacts”

Course mapping, course-taking patterns or transcript analysis

Responses to survey or interview questions about experiences, perceptions, self-reported progress, or impact of program experiences

Reflective journals

Practices

Common direct assessment “artifacts”

Theses, papers, essays, abstracts – individually or in a portfolio

Presentations and posters

Oral or written examination items

Responses to survey or interview questions that ask for examples of knowledge, practice, or value

Practices

But wait!! Aren’t we observing student work all the time anyway? What’s the difference between grading and assessment?

Practices

Grading summarizes many outcomes for one student.

Assessment summarizes one outcome for many students.

Practices

The purpose of assessment is to provide systematic, summarized information about the extent to which a group of students has realized one or more intended learning outcomes.
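To make the contrast concrete, here is a minimal Python sketch. The student names, outcomes, and scores are hypothetical, invented only to illustrate the two directions of aggregation:

    from collections import defaultdict
    from statistics import mean

    # Hypothetical ratings: (student, outcome, score on a 1-4 rubric).
    ratings = [
        ("Ana",  "argumentation",   4), ("Ana",  "use of evidence", 3),
        ("Ben",  "argumentation",   2), ("Ben",  "use of evidence", 4),
        ("Cory", "argumentation",   3), ("Cory", "use of evidence", 3),
    ]

    # Grading: many outcomes summarized for ONE student.
    by_student = defaultdict(list)
    for student, _outcome, score in ratings:
        by_student[student].append(score)
    print({s: mean(v) for s, v in by_student.items()})   # one grade per student

    # Assessment: ONE outcome summarized for many students.
    by_outcome = defaultdict(list)
    for _student, outcome, score in ratings:
        by_outcome[outcome].append(score)
    print({o: mean(v) for o, v in by_outcome.items()})   # one result per outcome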

Practices

Options to consider:

Use an instrument developed by someone else

Adapt an existing instrument

Add to something you’re already doing

Connect to institutional-level evidence

Invent something new

Practices

Rubrics

Inter-institutional level (p. 8)

Institutional level (p. 9)

Department/program level (pp. 10-11)

Course level (p. 12)

Some brief advice (pp. 13-14)

Practices

Externally-developed tests

Biology: ETS Major Field Exam

Chemistry: American Chemical Society exams

Classics: Eta Sigma Phi Latin and Greek exams

French: ACTFL Oral Proficiency interviews

Nursing: California Critical Thinking Skills Test

Psychology: Area Concentration Achievement Tests

Practices

Locally-developed tests or test items

Chemistry: Safety quiz for all lab students

Physics: Programming test items in introductory and advanced seminar courses

Statistics: Collectively-written final exam essay question in intermediate gateway course

Practices

Institutional-level surveys

Analysis and/or administration of selected items by department/division (p. 15)

NSSE (note Items 2a, 2b, 2c, 2g)

Triangulation (pp. 16-18)

Reflection (p. 19)

Practices

Locally-developed questionnaires

Program-level learning outcomes survey (Mathematics – p. 20)

Knowledge and attitudes survey (Environmental Studies – students in on-campus course and undergraduate research field experience; pre/post in both; paired with direct assessment of student work – p. 21)

Practices

Course-embedded outcomes reporting (pp. 22-23)

• Identify one or more outcomes of interest

• Identify one or more assignments that develop and demonstrate the outcome(s)

• Rate each student’s work just in relation to that outcome

• Aggregate results and suggest significance for course/program (a sketch of the aggregation step follows this list)
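A minimal sketch of the aggregation step, assuming a hypothetical three-level rubric and invented ratings of one outcome across a group of students:

    from collections import Counter

    # Assumed rubric levels and invented ratings of ONE outcome for many students.
    levels = ["exemplary", "satisfactory", "unsatisfactory"]
    ratings = ["exemplary", "satisfactory", "satisfactory",
               "unsatisfactory", "exemplary", "satisfactory"]

    counts = Counter(ratings)
    n = len(ratings)
    for level in levels:
        share = 100 * counts[level] / n
        print(f"{level:>15}: {counts[level]} of {n} students ({share:.0f}%)")

The output is exactly the kind of group-level summary a program can act on: the share of students at each performance level for a single outcome.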

Practices

Where possible, pair indirect observations of processes and perceptions with direct observations of outcomes (pp. 21, 24).

Practices

The dual goal of sampling:

Representativeness and Manageability

Practices

Examples involving comprehensive sampling:

Survey of all senior majors

Application of rubric to all research abstracts in all seminars

Application of rubric to all work submitted for senior art show

Practices

Examples involving selective sampling:

Application of rubric to randomly-selected subset of final papers in capstone course (see the sketch after this list)

Pre/post administration of locally-developed quiz in required sophomore methods course

End-of-course survey in one introductory and one senior-level course

Aggregation of results on selected items in an evaluation form for student work
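The random-subset example above can be made concrete with a short Python sketch. The file names, seed, and sample size are hypothetical, chosen only to illustrate drawing a sample that is representative yet manageable:

    import random

    # Hypothetical pool of capstone submissions (names invented for illustration).
    papers = [f"capstone_paper_{i:02d}.pdf" for i in range(1, 41)]  # 40 papers

    random.seed(2014)                      # fixed seed so the draw can be reproduced
    sample = random.sample(papers, k=10)   # rate 10 of the 40 with the rubric
    print(sorted(sample))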

Practices

Which of these assessment strategies might be appropriate for the course- or program-level use of assessment evidence you identified earlier?

A final thought….

It’s not rocket science (unless that’s what you teach)!