
Page 1:

Measuring Student Learning

March 10, 2015

Cathy Sanders

Director of Assessment

Page 2:

Workshop Learning Outcomes

Upon conclusion of the workshop, participants will be able to:

1) identify and define the elements of UNC Charlotte’s Student Learning Outcomes Assessment Plan and Report;

2) develop an analytic rubric;

3) link analytic rubrics and assessment data to Performance Outcomes; and

4) use assessment data to improve student learning.

Page 3:

Elements of UNC Charlotte’s Student Learning Outcomes Assessment Plan and Report

1) Student Learning Outcome

2) Effectiveness Measure with the scoring rubric

3) Assessment Methodology

4) Performance Outcome

5) Assessment Data

6) Continuous Improvement Plan

Page 4:

Student Learning Outcomes (SLOs) Are:

competencies (i.e., knowledge, skills, abilities and behaviors) that students will be able to demonstrate as a result of an educational program

SLOs need to be specific and measurable

Page 5:

Sample SLO That Is Not Measurable Because It Is Stated Too Broadly:

“Upon conclusion, workshop participants will be knowledgeable about student learning outcomes assessment.”

Page 6:

Sample Measurable SLOs

Workshop participants will be able to:

1) identify and define the elements of UNC Charlotte’s Student Learning Outcomes Assessment Plan and Report;

2) develop an analytic rubric;

3) link analytic rubrics and assessment data to Performance Outcomes; and

4) use assessment data to improve student learning.

Page 7:

An Effectiveness Measure is:

a student artifact (e.g., exam, project, paper or presentation) that will be used to gauge students’ acquisition of the student learning outcome.

Page 8:

Important Attributes of an Effectiveness Measure

Validity: Does the Effectiveness Measure assess what it is supposed to (i.e., the knowledge, skill, ability or behavior articulated in the SLO)?

Example: An oral presentation would be a valid Effectiveness Measure for assessing students’ oral communication skills.

Reliability: Does the Effectiveness Measure consistently assess the knowledge, skill, ability or behavior of interest?

Analytic rubrics help facilitate inter-rater reliability.

Page 9:

Sample Effectiveness Measure

“Workshop participants will develop an analytic rubric to demonstrate their ability to apply their new knowledge of analytic rubrics acquired in the workshop. The analytic rubric will model rubrics used by faculty to evaluate student artifacts and will include a minimum of a 3 point rating scale and 3 evaluation criteria with cell descriptors for each criterion for each level of competency.”

Page 10:

Assessment Methodology Is:

A description of the department’s process for assessing the student learning outcome and reviewing the resulting assessment data that includes:

1) when and in what course the assessment will be administered;

2) how the student artifact will be evaluated and by whom;

3) the department’s process for collecting, analyzing and disseminating assessment data to department faculty; and

4) the department’s process for annually reviewing assessment data and deciding what changes/improvements to make.

Page 11:

Sample Assessment Methodology

“Workshop participants will develop an analytic rubric after the instructor has defined a rubric, shared a sample rubric and explained the steps in constructing a rubric. Rubrics will be collected by the instructor and a Scoring Rubric for Analytic Rubric Activity will be used to assess workshop participants’ ability to develop a rubric. The instructor will summarize and disseminate the assessment findings to OAA staff who will identify areas needing improvement and decide changes to make prior to the next training workshop.”

Page 12:

A Performance Outcome Is:

the percentage of students who will demonstrate proficiency on the student learning outcome, and the level of proficiency expected.

Page 13:

Sample Performance Outcome

“80% of workshop participants will score ‘Acceptable (2)’ or higher on the Scoring Rubric for Analytic Rubric Activity”

Page 14:

Scoring Rubric Defined

A scoring tool that uses a set of evaluation criteria that are directly tied to the student learning outcome to assess student learning.

When the content of the rubric is communicated prior to students completing the work, the grading process is very clear and transparent to everyone involved.

When scoring rubrics are used, Performance Outcomes are to be tied to the rubric scale.

Page 15:

Scoring Rubric for Analytic Rubric Activity (each criterion rated 1 to 3)

Criterion: Ability to develop a rubric scale

1 = Needs Improvement: Is unable to develop a minimum of a 3 point scale that progresses from least proficient to most proficient.

2 = Acceptable: Is able to develop a minimum of a 3 point scale that progresses from least proficient to most proficient.

3 = Accomplished: Is able to develop a scale of more than 3 points that progresses from least proficient to most proficient.

Criterion: Ability to develop evaluation criteria

1 = Needs Improvement: Is unable to identify at least 3 relevant evaluation criteria.

2 = Acceptable: Is able to identify at least 3 relevant evaluation criteria.

3 = Accomplished: Is able to identify more than 3 relevant evaluation criteria.

Criterion: Ability to develop cell descriptors

1 = Needs Improvement: Cell descriptors are missing for some evaluation criteria. Many of the cell descriptors are not sufficiently detailed to differentiate between the levels of proficiency.

2 = Acceptable: Cell descriptors are present for all evaluation criteria but some do not clearly differentiate between the levels of proficiency.

3 = Accomplished: Cell descriptors are present for all evaluation criteria and clearly differentiate between the levels of proficiency.
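As a minimal illustration (not part of the original workshop), an analytic rubric can be represented as a small data structure and checked against the minimums stated in the Effectiveness Measure: at least a 3 point scale, at least 3 evaluation criteria, and a cell descriptor for every criterion at every level. The Python sketch below uses hypothetical names and abbreviated descriptors:

    # Minimal sketch (hypothetical names/data): an analytic rubric as a data
    # structure, plus a check against the workshop's stated minimums.
    rubric = {
        "scale": {1: "Needs Improvement", 2: "Acceptable", 3: "Accomplished"},
        "criteria": {
            "Ability to develop a rubric scale": {
                1: "Unable to develop a 3 point scale ordered least to most proficient",
                2: "Develops at least a 3 point scale ordered least to most proficient",
                3: "Develops a scale of more than 3 points ordered least to most proficient",
            },
            "Ability to develop evaluation criteria": {
                1: "Identifies fewer than 3 relevant criteria",
                2: "Identifies at least 3 relevant criteria",
                3: "Identifies more than 3 relevant criteria",
            },
            "Ability to develop cell descriptors": {
                1: "Descriptors missing or too vague to separate proficiency levels",
                2: "Descriptors present for all criteria; some levels blur together",
                3: "Descriptors present for all criteria and clearly differentiate levels",
            },
        },
    }

    def meets_minimums(r: dict) -> bool:
        """Check a 3+ point scale, 3+ criteria, and a descriptor in every cell."""
        levels = set(r["scale"])
        return (
            len(levels) >= 3
            and len(r["criteria"]) >= 3
            and all(set(cells) == levels for cells in r["criteria"].values())
        )

    print(meets_minimums(rubric))  # True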

Page 16:

Steps in Constructing an Analytic Scoring Rubric

1. Determine what kind of rubric scale to use (e.g., 3 pt., 5 pt.).

2. Label each level of proficiency across the top of the rubric (e.g., “(1) Needs Improvement,” “(2) Acceptable,” “(3) Accomplished”).

3. List evaluation criteria down the left side of the rubric template.

4. Write cell descriptors of what the highest level of proficiency, “(3) Accomplished,” looks like for each criterion.

5. Write cell descriptors of what the lowest level of proficiency, “(1) Needs Improvement,” looks like for each criterion.

6. Write cell descriptors of what the mid levels of proficiency look like for each criterion.

7. Test it: use the rubric to evaluate student artifacts. Make note of important evaluation criteria that were omitted, existing criteria to eliminate, and cell descriptors that need greater specificity.

8. Revise the rubric to reflect the desired changes.

Page 17:

Analytic Rubric Activity

• Using the blank rubric template provided, develop an Oral Presentation Rubric with a minimum of a 3 point rating scale and 3 evaluation criteria with cell descriptors for each criterion for each level of competency.

Page 18:

Tying it All Together Into a Student Learning Outcomes Assessment Plan

Page 19:

SLO Assessment Plan

Student Learning Outcome: Workshop participants will be able to:

1) identify and define the elements of UNC Charlotte’s Student Learning Outcomes Assessment Plan and Report;

2) develop an analytic rubric;

3) link analytic rubrics and assessment data to Performance Outcomes; and

4) use assessment data to improve student learning.

Effectiveness Measure: Workshop participants will develop an analytic rubric to demonstrate their ability to apply their new knowledge of analytic rubrics acquired in the workshop. The analytic rubric will model rubrics used by faculty to evaluate student artifacts and will include a minimum of a 3 point rating scale and 3 evaluation criteria with cell descriptors for each criterion for each level of competency.

Page 20:

SLO Assessment Plan

Assessment Methodology: Workshop participants will develop an analytic rubric after the instructor has defined a rubric, shared a sample analytic rubric and explained the steps in constructing a rubric. Rubrics will be collected by the instructor and a Scoring Rubric for Analytic Rubric Activity will be used to assess workshop participants’ ability to develop a rubric. The instructor will summarize and disseminate the assessment findings to OAA staff who will decide changes to make prior to the next training workshop.

Performance Outcome: 80% of workshop participants will score “Acceptable (2)” or higher on the Scoring Rubric for Analytic Rubric Activity.

Page 22:

Report assessment data in the same way that the Performance Outcome is stated.

PO: 80% of workshop participants will score “Acceptable (2)” or higher on the Scoring Rubric for Analytic Rubric Activity

Assessment Data: 89% of workshop participants scored “Acceptable (2)” or higher on the Scoring Rubric for Analytic Rubric Activity
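To make the comparison concrete, here is a minimal sketch (hypothetical scores, not data from the workshop) of computing the reported percentage from participants’ rubric scores and checking it against the 80% Performance Outcome:

    # Minimal sketch (hypothetical data): percentage of participants scoring
    # "Acceptable (2)" or higher, compared to the Performance Outcome target.
    scores = [3, 2, 2, 1, 3, 2, 3, 2, 2]  # one overall rubric score per participant

    target = 0.80     # Performance Outcome: 80% at "Acceptable (2)" or higher
    threshold = 2     # "Acceptable" on the 1-3 rubric scale

    pct = sum(s >= threshold for s in scores) / len(scores)
    print(f"{pct:.0%} scored Acceptable (2) or higher; outcome met: {pct >= target}")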

Page 23:

Continuous Improvement Plan

During the annual review of assessment data, department faculty will:

1. determine whether students are meeting the Performance Outcome(s) for each Student Learning Outcome;

2. identify changes that are needed to improve future student learning;

3. develop the department’s continuous improvement plan for the upcoming year; and

4. the following year, document in the Student Learning Outcomes Assessment Plan and Report whether the changes made improved student learning.

Page 24:

Changes to Consider When Students Are Not Meeting Performance Outcomes

1. Change the assessment instrument

– Revise the assessment instrument (i.e., test questions or project requirements)

– Change to an assessment instrument that measures deeper learning (e.g., from test questions to a written paper)

– Revise or add rubric evaluation criteria

2. Change the assessment methodology

– Change what you are assessing (i.e., revise the SLO statement)

– Change when SLOs are assessed (junior year vs. senior year)

– Change how the assessment is administered (e.g., a videotaped oral presentation vs. a live oral presentation, so that students can watch afterward to self-assess)

Page 25:

Changes to Consider When Students Are Not Meeting Performance Outcomes

3. Change the curriculum

– Revise courses to provide additional coverage in areas where students did not perform well

– Revise the curriculum to increase students’ exposure to SLOs by introducing, reinforcing and emphasizing competencies throughout the curriculum

– Schedule an appointment with a CTL curriculum design specialist

4. Change the pedagogy

– Incorporate more active learning that provides students with opportunities to apply what they are learning

– Incorporate mini assessments throughout the semester to gauge earlier whether students are grasping the material, and adjust accordingly

– Incorporate active learning opportunities for students to share/teach the material they are learning in your classroom

Page 26:

Questions/Discussion