TRANSCRIPT
Interpreting and Using Assessment Results

The Lehman College Assessment Council
http://www.lehman.edu/research/assessment/council-documents.php
October 20, 2010
Timeline

Spring 2011
• Middle States report due April 1
• Second completed assessment cycle of student learning goals
• Analyze evidence
• Report on how fall assessment results were used
Fall 2010
• First completed assessment cycle of student learning goals
• Identifying the goal/objective and beginning to gather evidence on the second goal (9/15)
• Report on how spring assessment results were used (11/15)
• Supporting workshops through the fall semester
• Submission of fall assessment results
• Syllabi collection
Ongoing assessment
Spring 2010
• First Assessment Plan
• Programs begin gathering evidence
• Supporting workshops
• Results and Analysis reported
• Learning objectives on syllabi
Timeline: Fall 2010

September 15: Assessment Plan identifying the second major/program learning objective to be assessed.
November 15: Completed Assessment Report, indicating how results from spring 2010 assessments are being used.
Late December/January: Results from fall assessments due.
Ongoing: Evidence gathering; meetings with ambassadors; syllabi revisions; development opportunities; planning for next spring/fall assessments.
Assessment as a Four-Step Continuous Cycle

Source: Suskie (2004), p. 4.
Step 1: Establishing Learning Goals

"Assessment begins not with creating or implementing tests, assignments, or other assessment tools but by first deciding on your goals: what you want your students to learn" (Suskie 2004: 73).
Overview of Exemplary Goals Across Departments

• Identify the contributions of key figures and events to the historical development of sociology as a scientific discipline
• Calculate and interpret descriptive and inferential statistics
• Make ethical decisions by applying the standards of the National Association of Social Workers Code of Ethics
• Design and conduct a study using an appropriate research method
• Demonstrate an understanding of the basic research process and advocacy for the protection of human subjects in the conduct of research
Department-Specific Samples of Exemplary Goals and Objectives
Step 2: Provide Learning Opportunities

• Provide multiple learning opportunities for a single goal across courses within a program.
• Articulate learning goals for every assignment.
◦ Identify specific important learning goals for each assignment, then create meaningful tasks or problems that correspond to those goals.
• Some goals are not quantifiable: habits of mind, behaviors, etc.
◦ Allow time for reflection, honest self-appraisals of actions, minute papers.
Learning Opportunities

• Provide a variety of assignments and assignment types.
◦ Examples in Suskie (2009), p. 158.
• Will students learn significantly more from a larger assignment than a shorter one, enough to justify the time that they and you will spend on it?
• Break large assignments into pieces that are due at various times ("scaffolding").

Suskie (2009), pp. 155-157.
Step 3: Assessing Student Learning: Gathering and Analyzing Data (Direct Evidence)

Direct evidence of student learning is tangible, visible, self-explanatory evidence of exactly what students have and haven't learned.

Examples of direct evidence:
◦ Embedded course assignments (written/oral) graded with a rubric
◦ Department-wide exams
◦ Standardized tests
◦ Capstone projects
◦ Field experiences
◦ Score gains, pre-test/post-test

Suskie (2004), p. 95.
Assessing Student Learning: Gathering and Analyzing Data (Indirect Evidence)

Indirect evidence provides signs that students are probably learning, but evidence of exactly what they are learning may be less clear and less convincing.

Examples of indirect evidence:
◦ Pre- and post-course surveys
◦ Open-ended questionnaire surveys
◦ Focus groups
◦ Tracking admissions to graduate and professional schools

Suskie (2004), p. 95.
Assessing Student Learning: Gathering and Analyzing Data (Direct Evidence)

Assessing Student Learning: Gathering and Analyzing Data (Indirect Evidence)
Step 4: Closing the Loop: Using Results for Improvement

In your report, you will be asked to explain the implications of the assessment results for the program:
◦ How can the results be used to improve planning, teaching, and learning?
◦ Are changes in the program suggested? If so, what kinds of changes? Are changes in the assessment plan indicated? If so, what kinds of changes? Program changes may refer to curriculum revision, faculty development, changes in pedagogy, student services, resource management, and/or any other activity that relates to student success.
◦ What, if any, additional information would help inform decision making regarding student achievement of the objective(s)?
◦ What kinds of resources will you need to make changes?
Evaluating the Quality of Your Assessment Process
Using the results from the assessment of your first goal(s), discuss plans you have regarding your:
• Learning Goals
• Curriculum
• Teaching Methods
• Assessment Strategies and Tools
(See handout for questions to guide your thinking for each of these categories.)
Assessment Council Membership
• Kofi Benefo (Sociology) [email protected]
• Salita Bryant (English) [email protected]
• *Nancy Dubetz (ECCE) [email protected]
• Robert Farrell (Lib) [email protected]
• Judith Fields (Economics) [email protected]
• Marisol Jimenez (ISSP) [email protected]
• Lynn Rosenberg (SLHS) [email protected]
• Renuka Sankaran (Biology) [email protected]
• Robyn Spencer (History) [email protected]
• Minda Tessler (Psych) [email protected]
• Janette Tilley (Mus) [email protected]
*Committee Chair
Administrative Advisor – Assessment Coordinator
Ray Galinski, [email protected]
References/Resources

Suskie, L. (2004). Assessing student learning: A common sense guide. San Francisco: Anker Publishing Co., Inc.

Suskie, L. (2009). Assessing student learning: A common sense guide. San Francisco: John Wiley & Sons, Inc.