TRANSCRIPT
MICHAEL C. SACHS, JD, CCEP
ASSISTANT VICE PRESIDENT AND DEAN OF STUDENTS
GUTTMAN ELO SYMPOSIUM 4/2018
Direct Assessment Techniques of SLO in Student Services
© Michael C. Sachs 2018
Program Learning Outcomes
Participants will be able to:
recognize why direct assessment is important
recognize direct vs. indirect assessment
identify types of direct assessment
implement strategies on campus
Assessment
[Diagram: Assessment Cycle – Mission and Goals → SLO / Direct and Indirect Assessment]
MSCHE
The characteristics of good evidence of student learning include considerations of direct and indirect methods for gathering evidence of student learning.
(Student Learning Assessment: Options and Resources, Second Edition, 2007)
Direct Assessment
In direct assessment, students display knowledge or skills as the result of an assessment measure (presentation, test, etc.). Direct measures of student learning require students to display their knowledge and skills as they respond to the instrument itself. Objective tests, essays, presentations, and classroom assignments all meet this criterion.
(Assessment Essentials: Planning, Implementing and Improving Assessment in Higher Education Palomba, C.A., & Banta, T.W., 1999)
Indirect Assessment
In indirect assessment, learning is inferred rather than supported by direct evidence (e.g., usage data, satisfaction surveys). Students reflect on learning rather than demonstrate it.
(Palomba, C.A., & Banta, T.W., 1999)
WHAT DID THE STUDENT LEARN AT THE PROGRAM
VS.
WHAT THEY BELIEVE THEY LEARNED
Program / Event
Example: SLO Direct / Indirect
Which goal is direct / indirect?
1. Increasing attendance goals at a Title IX program from the previous year? (Indirect)
2. A program with the goal of building community (Direct or Indirect)
3. Have students write three things they learned from the program they attended (Direct, if you have predetermined learning goals / measures)
How do we know they learned something, and that they learned what we want them to learn?
Indirect Assessment in Student Services
Student Services has often relied on indirect assessment for reporting success, such as:
Satisfaction surveys (NSSE, CCSSE, etc.)
Evaluations
Attendance rates
Usage data
Quality measures
Focus groups
Indirect assessment is very useful for determining whether a student likes or enjoys an event, activity, or program, but not for conveying whether they learned something
Direct vs. Indirect
Indirect measures that provide feelings, likes, and perceptions are important in Student Services:
What if students did not like the food in the cafeteria?
If survey results constantly noted that the Financial Aid office is cold and uncaring?
The $10,000 guest speaker is “boring”
These are very important!
Indirect assessment is important in helping us understand how students perceive our programs and services, but not whether they have learned anything from the experience.
Satisfaction vs. Learning
If the program is achieving a different goal than the one intended, isn't that important to know? Let's look at an example.
Alcohol Education Program
1 Shot = 1 Glass of Wine = 1 Beer
SLO Goal: Consuming several shots can be far more intoxicating due to the alcohol content than having the same quantity of beer or wine.
Alcohol Education
Indirect:
Did you learn something from this presentation?
Direct:
What did you learn from the presentation?
List three items (+/- with predetermined answers)
Fill in the blank
Create a tool to measure the learning for small group discussions (have the recorder turn in their notes)
Unwanted Outcome
Direct Assessment Outcome:
“Cool, I can get drunk faster and gain less weight by doing shots than drinking beer.”
Indirect assessment would not likely have brought this answer forward
MSCHE Standard IV
"If offered, athletic, student life, and other extracurricular activities that are regulated by the same academic, fiscal and administrative principles and procedures that govern all other programs" (#10)
Qualitative & Quantitative vs. Direct & Indirect
CAUTION!!!
Don’t confuse operational / program assessment with SLO assessment
Don’t confuse qualitative & quantitative with direct & indirect
Operational Outcomes
Operational assessment assesses a department; it is not the only type of assessment needed.
An Operational Assessment is an efficient way to identify your institutional strengths and weaknesses through a comprehensive review of your operations. This is required by MSCHE!
Operational Examples
Financial Resources
Staff Training
Technology
Infrastructure
Staffing levels
Organization and management
Access and Equity
Compliance
Best Practices
Etc.
Operational assessment should include statements about SLO, and assessment plans should outline operational assessments, direct SLO assessment, and indirect assessment
Example: CAS – Council for the Advancement of Standards
Operational / Program Assessment Example
For All Standards
“Assessment must include qualitative and quantitative methodologies….and student learning development outcomes are being met” (this is in all standards)
CAS Examples
“Document achievement of stated goals and learning outcomes”
Qualitative & Quantitative vs. Direct & Indirect
Qualitative:
Qualitative assessment methodology involves asking participants broad, general questions, collecting the detailed views of the participants in the form of words or images, and analyzing the information for descriptions and themes
(Qualitative Inquiry and Research Design: Choosing Among Five Approaches, Creswell, J., 2007)
Data that does not lend itself to quantitative methods but rather to interpretive criteria.
(Carnegie Mellon Eberly Center Teaching Excellence & Educational Innovation)
Examples
A Musical Performance
Qualitative (Quality of Music)
Quantitative (Were the notes correctly performed?)
A Play
Qualitative (Was the performer funny?)
Quantitative (Were the lines read accurately?)
Extemporaneous Speech
Qualitative (Was it engaging and entertaining?)
Quantitative (?)
Key to Qualitative Assessment
The judge or reviewer must have expertise in the area or field – no amateurs allowed!
Sample Direct Assessment Techniques
Pre/Post test
Direct Observation
Video Observation
Completion Accuracy
Reflection Papers
Performance Observation
Demonstrations
Interviews (not focus groups)
Competitions
Portfolios
Projects
Capstones
Goal Completion
Training Others
Essays
Quick Checks during program
Work Groups/ Table Top Exercises
Juried Evaluators
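Several of the techniques above reduce to comparing scored responses against predetermined acceptable answers. As a minimal sketch of the first technique (a pre/post test), here is a hypothetical Python example; the question IDs, answer key, and student responses are invented for illustration:

```python
# Hypothetical sketch: scoring a pre/post test against a
# predetermined answer key. All data here is invented; a real
# tool would load responses from your survey platform.

ANSWER_KEY = {"q1": "b", "q2": "d", "q3": "a"}  # predetermined answers

def score(responses):
    """Count correct answers against the predetermined key."""
    return sum(1 for q, a in responses.items() if ANSWER_KEY.get(q) == a)

# One fictional student's responses before and after the program
pre = {"q1": "c", "q2": "d", "q3": "c"}
post = {"q1": "b", "q2": "d", "q3": "a"}

gain = score(post) - score(pre)  # learning gain for this student
print(f"pre={score(pre)}, post={score(post)}, gain={gain}")
```

The key point is the predetermined answer key: without it, a post test measures nothing in particular (remember the alcohol example).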
Key to Direct Assessment
A direct assessment technique is only as good as the tool used to measure it:
A post test with no predetermined acceptable answers (remember the alcohol example?)
An indirect assessment technique could have direct assessment embedded into it:
Evaluation with questions about student learning embedded into the evaluation (caution!!)
A focus group with specific questions asked to individual participants about what they learned
Rubrics
Before we get into the examples, a few words on rubrics
It is essential to create a tool such as a rubric that will measure the anticipated learning goals and quality of the answers
Rubric Defined
Rubrics … [communicate] expectations for an assignment, providing focused feedback on works in progress, and grading final products. A rubric is a document that articulates the expectations for an assignment by listing the criteria, or what counts, and describing levels of quality from excellent to poor. (4teachers.org)
Rubrics Continued
Rubrics contain three essential features:
1. Criteria students are to attend to in completing the assignment
2. Markers of quality (typically rating scales)
3. Scoring
(University of California, Berkeley Center for Teaching and Learning)
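As a rough illustration of those three features, a rubric can be modeled as a small data structure: criteria, markers of quality (a rating scale), and a scoring step. The criterion names and levels below are borrowed from the sample admissions tour guide rubric later in this presentation; the code itself is a hypothetical sketch, not part of the original talk:

```python
# Minimal sketch of a rubric as a data structure, reflecting the
# three essential features: criteria, markers of quality, scoring.
# Criterion names/levels follow the sample admissions rubric.

RATING_LEVELS = {"Beginning": 1, "Developing": 2,
                 "Accomplished": 3, "Outstanding": 4}

CRITERIA = ["Quality & Organization", "Engaging",
            "Presentation Skills", "Resource Knowledge"]

def score_observation(ratings):
    """Convert per-criterion ratings into a numeric total."""
    return sum(RATING_LEVELS[ratings[c]] for c in CRITERIA)

# One fictional direct observation of a tour guide
ratings = {"Quality & Organization": "Accomplished",
           "Engaging": "Outstanding",
           "Presentation Skills": "Developing",
           "Resource Knowledge": "Accomplished"}
print(score_observation(ratings))  # 3 + 4 + 2 + 3 = 12
```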
Now For Examples!
The following goals are examples of direct assessment processes in various areas of Student Services
Caveats:
It is essential to create a tool that measures your SLO goals
Not all examples will be feasible on all campuses
This presentation does not take into account campus resources (both human and financial)
Each institution will need to determine their specific SLO outcomes
Sample Rubric – Admissions Tour Guides – Direct Observation
Scale: Beginning 1 (Lower 50%) | Developing 2 (50%-79%) | Accomplished 3 (80%-89%) | Outstanding 4 (Top 10%)
Criteria:
Quality & Organization
Engaging
Presentation Skills
Resource Knowledge
Some EL Examples
Residence Life: RA Training
Program / SLO:
Behind Closed Doors – Role Play Training
Capstone: Students will be able to apply training to real life scenarios
Tool:
Direct observation by professional staff
Campus Police
Program / SLO:
Campus Police Active Shooter Response
Tool:
Table Top Exercise
Program / SLO:
Arrest / Stop Interview with Student
Tool:
Post Interview with Conduct Officer or Police
Sample Rubric – Campus Police (Yes/No + Notes)
Lock & Barricade Doors
Turn Off Lights
Close Blinds
Turn Off Radios & Computer Monitors
Keep Occupants Calm, Quiet and Out of Sight
Take Adequate Cover
Silence all Cell Phones
Place Signs in Exterior Windows for Injured Persons
Conduct Office
Program / SLO:
Reflection Paper with Topical Goals
Tool:
Rubric with Stated Goals for paper content
Sample Rubric – Student Conduct: Reflection Paper
Scale: Beginning | Intermediate | Exemplary
Criteria:
Self Disclosure
Paper Connected to Conduct Violation
Understand Connection to Community Standards
Food Services
Program / SLO:
Students will better understand food quality, preparation, waste management, etc.
Tool - Post Test:
With Specific Answers or fill in the blank
Sample Rubric – Food Services
Enter an ‘X’ in the correct box for your answer:
1. What items are not baked on premises or are brought in from local bakeries?
(Dinner Rolls | Wraps | Hamburger/Hotdog Buns | Hoagie/Sub Rolls | Bagels | Donuts | Sandwich Bread)
2. What percentage of meat on average is delivered frozen?
(5% | 20% | 60% | 80%)
3. Most vegetables/fruit are freshly prepared; what are the two exceptions?
(Peas | Carrots | Beets | Green Beans | Cabbage | Cling Peaches)
Theater Club
Program / SLO:
Students will produce an entertaining and semi-professional theatrical production
Tool - Video or Direct Observation
Of the Performer or the Audience
Theatre Club – Sample Rubric (each criterion scored 4 to 1; scores summed for a TOTAL)

VOICE
4 – Voice was loud and clear; words were easily understood.
3 – Student spoke clearly but it was difficult to understand some of the script; could've been louder.
2 – Voice and language were not very clear; could've been much louder.
1 – Could not understand what was being said due to unclear and low speech.

AUDIENCE
4 – Audience felt like part of the show.
3 – Was aware and well-connected to the audience.
2 – Needed more audience awareness and connection.
1 – No audience awareness or connection at all.

MEMORIZATION / IMPROVISATION (when applicable)
4 – Script was fully memorized; student improvised in place of lines.
3 – Script was almost fully memorized; some improv used to make up for missed lines.
2 – Script was partially memorized; student did not attempt improvisation.
1 – Script was not at all memorized; no improvisation used.

OVERALL
4 – Committed, cooperated & concentrated – WOW!
3 – Semi-committed, concentrated & cooperative – GREAT!
2 – Almost committed, cooperative & concentrated – NOT TOO BAD…
1 – No commitment, cooperation or concentration – MORE REHEARSAL!
Career Development
SLO:
Students who attend the resume workshop will be able to create a quality basic resume
Tool:
Scoring rubric with criteria
Sample Rubric – Career Development

Skill: PRESENTATION / FORMAT (Ranking Points: Outstanding 10, Good 8, Average 7, Unsatisfactory 6)
Outstanding: Typed or computer generated; balanced margins with eye appeal; format highlights strengths and information; appropriate fonts and point size used with variety
Good: Typed or computer generated; balanced margins; format identifies strengths and information; appropriate fonts and point size used
Average: Typed or computer generated; somewhat balanced margins; format identifies strengths and information; no variation in fonts and/or point size
Unsatisfactory: Typed or computer generated; unbalanced margins; format detracts from strengths and information; fonts distract from readability

Skill: JOB-SPECIFIC / VOLUNTEER INFORMATION (Ranking Points: Outstanding 15, Good 12, Average 11, Unsatisfactory 10)
Outstanding: All action phrases used to describe duties and skills; information demonstrates ability to perform the job; professional terminology used when describing skills
Good: 1-2 duties/skills lack action phrases; information demonstrates ability to perform the job; some professional terminology used when describing skills
Average: 3-4 duties/skills lack action phrases; some information demonstrates ability to perform the job
Unsatisfactory: 5-6 duties/skills lack action phrases; information does not clearly demonstrate ability to perform the job

Skill: SPELLING & GRAMMAR (Ranking Points: Outstanding 10, Good 8, Average 6, Unsatisfactory 4)
Outstanding: No spelling errors; no grammar errors
Good: 1-2 spelling errors; 1-2 grammar errors
Average: 3-4 spelling errors; 3-4 grammar errors
Unsatisfactory: 5-6 spelling errors; 5-6 grammar errors

TOTAL SCORE: ___
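A hedged sketch of how the ranking points above could be totaled programmatically. The skill names and point values mirror the sample rubric's "Ranking Points" rows, but the function and its data layout are invented for illustration:

```python
# Hypothetical scorer for a resume rubric. Point values follow the
# sample rubric (10/8/7/6, 15/12/11/10, 10/8/6/4); everything else
# is an assumption made for this sketch.

POINTS = {
    "Presentation/Format":
        {"Outstanding": 10, "Good": 8, "Average": 7, "Unsatisfactory": 6},
    "Job-Specific/Volunteer Info":
        {"Outstanding": 15, "Good": 12, "Average": 11, "Unsatisfactory": 10},
    "Spelling & Grammar":
        {"Outstanding": 10, "Good": 8, "Average": 6, "Unsatisfactory": 4},
}

def total_score(ratings):
    """Sum the ranking points for each skill's assigned rating."""
    return sum(POINTS[skill][level] for skill, level in ratings.items())

# One fictional student's resume, as rated by the reviewer
result = total_score({"Presentation/Format": "Good",
                      "Job-Specific/Volunteer Info": "Outstanding",
                      "Spelling & Grammar": "Average"})
print(result)  # 8 + 15 + 6 = 29
```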
Student Government
Program / SLO:
Student Government Officers will learn leadership skills
Tool:
Portfolio of their year of work, including:
Programs & Events: Successes and Challenges
Reflection Paper
Train the Trainer
Public Speaking
Registrar
Program / SLO:
Workshop on completing the graduation application: students who attend the workshop will do better than those who do not attend
Tool
Comparison of attendee results vs. non-attendee results
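One hypothetical way to run that comparison, using fabricated sample data; a real analysis would pull completion results from the Registrar's records:

```python
# Illustrative comparison of workshop attendees vs. non-attendees.
# The outcome data below is fabricated for this sketch.
from statistics import mean

# 1 = graduation application completed correctly, 0 = had errors
attendees     = [1, 1, 1, 0, 1, 1, 0, 1]
non_attendees = [1, 0, 0, 1, 0, 1, 0, 0]

rate_a = mean(attendees)       # 6/8 = 0.75
rate_n = mean(non_attendees)   # 3/8 = 0.375
print(f"attendees: {rate_a:.0%}, non-attendees: {rate_n:.0%}")
```

With real numbers, a larger gap between the two rates is (cautiously) evidence that attendees learned what the workshop intended to teach; confounds such as self-selection would still need to be considered.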
Athletics
Program/SLO:
Athletic Team Sport: Sportsmanship
Sample Rubric – Athletics: Sportsmanship
Scale: 4 Ideal | 3 Acceptable | 2 Tolerable | 1 Unacceptable | 0 Absent

Conduct – Behavior towards officials, opponents, host school, host students:
4 – Respectful, polite, gracious, positive interaction
3 – Consistently neither rude nor polite
2 – Lacking politeness; attitude is not respectful
1 – Tantrums, disrespectful, fighting, swearing
0 – Unacceptable behavior in all possible areas

Conduct – Play / Participation (rules, spirit of event, on the field):
4 – Honorable; playing under control; fully engaged in respectful play
3 – Solid good play; abides by the rules
2 – Play that follows rules but selfish or lacking true spirit
1 – Cheating; roughness; out of control; inappropriately taking advantage
0 – Unable to follow rules; unwillingness to grow as athletes

Conduct – Team Work (unity, organization, cooperation):
4 – Cooperative; united; respectful to teammates; good leadership
3 – Working together but some problems with communication
2 – Sometimes working together or disagreeing with own team
1 – Disjointed play; in-fighting; disorganized; lacking leadership
0 – No teamwork whatsoever displayed

Average Score: ___
Make it Fun
Discrimination workshop / program:
Pre/Post Test
Crossword Puzzle determining knowledge after the presentation
Integration into Assessment Plans – Sample
Decide who will participate; get buy-in
Set your SLO goals for direct assessment: 50% or more must be direct assessment of SLO in Student Services, reported to the University Assessment Committee (UAC)
Small programs: 1-2 per year; large programs: 5+ per year
Individuals create their own assessment plans and criteria
Operational assessment is separate
Yearly mapping submitted to UAC; departmental assessment plan reviewed by the SS Assessment Committee
Michael C. Sachs, JD, CCEP
Assistant Vice President and Dean of Students
msachs@jjay.cuny.edu
Questions?