Rubric Design Workshop

Upload: lisa-m-snyder

Post on 30-Jul-2015



Rubric Design Workshop

Overview of Session

What is a rubric?

• Definition: “a set of criteria specifying the characteristics of an outcome and the levels of achievement in each characteristic.”
• Benefits:
  - Provides consistency in evaluating performance
  - Gathers rich data
  - Mixed-method
  - Allows for a direct measure of learning

Why use rubrics?

• Provides both qualitative descriptions of student learning and quantitative results
• Clearly communicates expectations to students
• Provides consistency in evaluation
• Simultaneously provides student feedback and programmatic feedback
• Allows for timely and detailed feedback
• Promotes colleague collaboration
• Helps us refine practice

Types of Rubrics - Analytic

Analytic rubrics articulate levels of performance for each criterion used to assess student learning.

Advantages
• Provide useful feedback on areas of strength and weakness.
• Criteria can be weighted to reflect the relative importance of each dimension.

Disadvantages
• Take more time to create and use than a holistic rubric.
• Unless each point for each criterion is well-defined, raters may not arrive at the same score.
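The weighting advantage above is simple arithmetic: each criterion's level score is multiplied by its weight, and the products are summed. A minimal sketch, assuming a hypothetical three-criterion rubric scored on a 1–4 scale (the criterion names, weights, and scores are invented for illustration, not taken from the workshop):

```python
# Hypothetical analytic rubric: each criterion carries a weight
# (weights sum to 1.0) and a rater-assigned level from 1 to 4.
rubric = {
    "thesis":       {"weight": 0.40, "score": 3},
    "organization": {"weight": 0.35, "score": 4},
    "mechanics":    {"weight": 0.25, "score": 2},
}

def weighted_score(rubric):
    """Combine per-criterion level scores into one weighted total."""
    return sum(c["weight"] * c["score"] for c in rubric.values())

print(round(weighted_score(rubric), 2))  # 3.1
```

Because the weights differ, a weak "mechanics" score drags the total down less than a weak "thesis" score would, which is exactly the point of weighting dimensions by importance.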

Analytic Rubric Example

Types of Rubrics - Holistic

A holistic rubric consists of a single scale with all criteria to be included in the evaluation being considered together.

Advantages
• Emphasis on what the learner is able to demonstrate, rather than what s/he cannot do.
• Saves time by minimizing the number of decisions raters make.
• Can be applied consistently by trained raters, increasing reliability.

Disadvantages
• Does not provide specific feedback for improvement.
• When student work is at varying levels spanning the criteria points, it can be difficult to select the single best description.
• Criteria cannot be weighted.


Holistic Rubric Example

Steps for Implementation

1. Identify the outcome ✔
2. Determine how you will collect the evidence ✔
3. Develop the rubric based on observable criteria
4. Train evaluators on rubric use
5. Test the rubric and revise if needed
6. Collect data
7. Analyze and report


Rubric Development – Pick your Scale

Levy, J.D. Campus Labs: Data Driven Innovation. Using rubrics in student affairs: A direct assessment of learning.

Rubric Development – Pick your Dimensions


Creating Your Rubric


Writing Descriptors

University of Florida Institutional Assessment: Writing Effective Rubrics

1. Describe each level of mastery for each characteristic
2. Describe the best work you could expect
3. Describe an unacceptable product
4. Develop descriptions of intermediate-level products for intermediate categories
5. Keep each description and each category mutually exclusive
6. Be specific and clear; reduce subjectivity
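The descriptor-writing steps above amount to filling in a grid: one descriptor for every (criterion, level) cell, with no cell left blank. A minimal sketch of that idea — the criterion name, level labels, and descriptor text here are all hypothetical examples, not workshop content:

```python
# Hypothetical mastery levels, ordered from unacceptable to best work.
LEVELS = ["unacceptable", "developing", "proficient", "exemplary"]

# One descriptor per (criterion, level) cell of the rubric grid.
descriptors = {
    "evidence": {
        "unacceptable": "No sources cited.",
        "developing":   "Sources cited but not connected to claims.",
        "proficient":   "Most claims supported by relevant sources.",
        "exemplary":    "Every claim supported by well-chosen sources.",
    },
    # ...additional criteria would follow the same pattern...
}

def missing_cells(descriptors, levels=LEVELS):
    """Return (criterion, level) pairs that still lack a descriptor."""
    return [(crit, lvl)
            for crit, cells in descriptors.items()
            for lvl in levels
            if lvl not in cells]

assert missing_cells(descriptors) == []  # grid is complete
```

A completeness check like this mirrors steps 1–4: the best and unacceptable levels anchor the scale, and the check flags any intermediate cell you have not yet described.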


Next Steps…


Training for Consistency

1. Inter-rater reliability: between-rater consistency
Affected by:
• Initial starting point or approach to the scale (assessment tool)
• Interpretation of descriptions
• Domain/content knowledge
• Intra-rater consistency

2. Intra-rater reliability: within-rater consistency
Affected by:
• Internal factors: mood, fatigue, attention
• External factors: order of evidence, time of day, other situations
• Applies to both multiple-rater and single-rater situations
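Inter-rater consistency can be quantified once two trained raters have scored the same set of artifacts. The sketch below computes raw percent agreement and Cohen's kappa, a standard chance-corrected agreement statistic; note that the rater scores are invented for illustration, and the workshop itself does not prescribe a particular statistic:

```python
from collections import Counter

def percent_agreement(a, b):
    """Fraction of artifacts on which the two raters gave the same score."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(a)
    p_o = percent_agreement(a, b)                  # observed agreement
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[k] * cb[k] for k in ca) / (n * n)  # expected by chance
    return (p_o - p_e) / (1 - p_e)

# Hypothetical scores from two raters on eight student artifacts (1-4 scale).
rater_a = [3, 2, 4, 3, 1, 2, 3, 4]
rater_b = [3, 2, 3, 3, 1, 2, 4, 4]

print(round(percent_agreement(rater_a, rater_b), 2))  # 0.75
print(round(cohens_kappa(rater_a, rater_b), 2))       # 0.65
```

Kappa is lower than raw agreement because some matches would occur by chance alone; norming sessions aim to push both numbers up before live scoring begins.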


Testing Your Rubric

• Use a meta-rubric to review your work
• Peer review - ask one of your peers to review the rubric and provide feedback on content
• Test with students - use student work or observations to test the rubric
• Revise as needed
• Test again
• Multiple raters - norm with other raters if appropriate


Allen, M.J. (2004). Assessing academic programs in higher education. Bolton, MA: Anker.

Brophy, Timothy S. University of Florida Institutional Assessment: Writing Effective Rubrics. http://assessment.aa.ufl.edu/Data/Sites/22/media/slo/writing_effective_rubrics_guide_v2.pdf

Mueller, Jon. Professor of Psychology, North Central College, Naperville, IL. Authentic Assessment Toolbox. http://jfmueller.faculty.noctrl.edu/toolbox/rubrics.htm

Teaching Commons, DePaul University. Types of Rubrics. http://teachingcommons.depaul.edu/Feedback_Grading/rubrics/types-of-rubrics.html