Developing and Using Rubrics to Evaluate and Improve Student Performance
TRANSCRIPT
presented by
Jay McTighe, Educational Consultant
6581 River Run, Columbia, MD 21044-6066
(410) 531-1610, e-mail: [email protected]
Developing and Using Scoring Rubrics
©2011 Jay McTighe page 2
Assessment Planning Framework: Key Questions

Purpose(s) for Assessment: Why are we assessing and how will the assessment information be used?
❏ diagnose student strengths/needs
❏ provide feedback on student learning
❏ provide a basis for instructional placement
❏ inform and guide instruction
❏ communicate learning expectations
❏ motivate; focus student attention and effort
❏ provide practice applying knowledge and skills
❏ provide a basis for evaluation
   __ grading
   __ promotion/graduation
   __ program selection/admission
❏ provide accountability data
❏ gauge program effectiveness

Content Standards: What do we want students to know, understand, and be able to do?

Audience(s) for Assessment: For whom are the assessment results intended?
❏ teacher/instructor
❏ students
❏ parents
❏ grade-level/department team
❏ other faculty
❏ school administrators
❏ curriculum supervisors
❏ policy makers
❏ business community/employers
❏ college admissions officers
❏ higher education
❏ general public
❏ other: ______________

from McTighe and Ferrara (1997). Assessing Learning in the Classroom. Washington, DC: National Education Association.
Framework of Classroom Assessment Approaches and Methods

How might we assess student learning in the classroom?

SELECTED-RESPONSE ITEMS
❏ multiple-choice
❏ true-false
❏ matching

CONSTRUCTED RESPONSES
❏ fill in the blank (word(s), phrase(s))
❏ short answer (sentence(s), paragraphs)
❏ label a diagram
❏ "show your work"
❏ representation(s) (web, concept map, flow chart, graph/table, matrix, illustration)

PERFORMANCE-BASED ASSESSMENTS

Products:
❏ essay
❏ research paper
❏ log/journal
❏ lab report
❏ story/play
❏ poem
❏ portfolio
❏ art exhibit
❏ science project
❏ model
❏ video/audiotape
❏ spreadsheet

Performances:
❏ oral presentation
❏ dance/movement
❏ science lab demonstration
❏ athletic skills performance
❏ dramatic reading
❏ enactment
❏ debate
❏ musical recital
❏ keyboarding

PROCESS-FOCUSED
❏ oral questioning
❏ observation ("kid watching")
❏ interview
❏ conference
❏ process description
❏ "think aloud"
❏ learning log

from McTighe and Ferrara (1997). Assessing Learning in the Classroom. Washington, DC: National Education Association.
Evaluation and Communication Methods

Evaluation Methods: How will we evaluate student knowledge and proficiency?

Selected-Response Items:
❏ answer key
❏ machine scoring

Performance-Based Assessments:
❏ performance list
❏ holistic rubric
❏ analytic rubric
❏ checklist
❏ written/oral comments

Evaluation Roles: Who will be involved in evaluating student responses, products or performances?

Judgment-Based Evaluation by:
❏ teacher(s)/instructor(s)
❏ peers/co-workers
❏ expert judges (external raters)
❏ student (self-evaluation)
❏ parents/community members
❏ employers
❏ other: ______________

Communication/Feedback Methods: How will we communicate assessment results?
❏ numerical score (percentage scores, point totals)
❏ letter grade
❏ developmental/proficiency scale
❏ narrative report (written)
❏ checklist
❏ written comments
❏ verbal report/conference

from McTighe and Ferrara (1997). Assessing Learning in the Classroom. Washington, DC: National Education Association.
Developing and Using Scoring Rubrics
©2011 Jay McTighe page 5
Criterion-Based Evaluation Tools
Criterion-based evaluation tools are used in conjunction with "open-ended" performance tasks and projects, which do not have a single, "correct" answer or solution process. Evaluation of the resulting products and performances is based on judgment guided by criteria. The goal is to make a judgment-based process as clear, consistent and defensible as possible.

Three general types of criterion-based evaluation tools are used widely in classrooms: performance lists, holistic rubrics, and analytic rubrics. A performance list consists of a set of criterion elements or traits and a rating scale. A rubric consists of a fixed measurement scale (e.g., 4 points) and descriptions of the characteristics for each score point. Note that a rubric provides a description of the levels of performance, unlike a performance list, which simply assigns scores based on identified criterion elements.

Two general types of rubrics, holistic and analytic, are widely used to judge student products and performances. A holistic rubric provides an overall impression of a student's work. Holistic rubrics yield a single score or rating for a product or performance. An analytic rubric divides a product or performance into distinct traits or dimensions and judges each separately. Since an analytic rubric rates each of the identified traits independently, a separate score is provided for each.
Benefits of Criterion-Based Evaluation Tools for Teachers and Students
Clearly defined performance criteria communicate the important dimensions, or elements of quality, in a product or performance. The clarity provided by well-defined criteria assists educators in reducing subjective judgments when evaluating student work. When an agreed-upon set of criterion-based evaluation tools is used throughout a department or grade-level team, school or district, evaluation becomes more consistent because the performance criteria do not vary from teacher to teacher. A second benefit of criterion-based evaluation tools relates to teaching. Clearly defined criteria provide more than just evaluation tools to use at the end of instruction; they help clarify instructional goals and serve as teaching targets. Educators who have scored student work as part of a large-scale performance assessment at the district or state level often observe that the very process of evaluating student work against established criteria teaches a great deal about what makes the products and performances successful. As teachers internalize the qualities of solid performance, they become more attentive to those qualities in their teaching.

Well-defined criteria provide a common vocabulary and a clearer understanding of the important dimensions of quality performance for educators. The same benefits apply to students as well. When students know the criteria in advance of their performance, they are provided with clear goals for their work. There is no "mystery" as to the desired elements of quality or the basis for evaluating (and grading) products and performances. Students don't have to guess about what is most important or how their work will be judged.
Options for Criterion-Based Evaluation Tools

TYPES OF CRITERION-BASED EVALUATION TOOLS
• Performance List (analytic)
• Scoring Rubric (holistic or analytic; generic or task-specific)

KEY QUESTIONS
• What is the purpose of this performance task or assignment (diagnostic, formative, summative)?
• What evaluation tool is most appropriate given the assessment purpose?
  ❍ performance list   ❍ holistic rubric   ❍ analytic rubric   ❍ generic   ❍ task-specific
• What is the range of the scale?
• Who will use the evaluation tool (teachers, external scorers, students, others)? If students are involved, the tool should be written in understandable "kid language."
Performance List for Graphic Display of Data (elementary level)

Key Criteria                                                   Points Possible   Self    Other   Teacher
1. The graph contains a title that tells what the data shows.       _____        _____   _____   _____
2. All parts of the graph (units of measurement, rows, etc.)
   are correctly labelled.                                          _____        _____   _____   _____
3. All data is accurately represented on the graph.                 _____        _____   _____   _____
4. The graph is neat and easy to read.                              _____        _____   _____   _____
                                                           Total                 _____   _____   _____

Performance lists offer a practical means of judging student performance based upon identified criteria. A performance list consists of a set of criterion elements or traits and a rating scale. The rating scale is quite flexible, ranging from 3 to 100 points. Teachers can assign points to the various elements in order to "weight" certain elements over others (e.g., accuracy counts more than neatness) based on the relative importance given the achievement target. The lists may be configured to convert easily to conventional grades. For example, a teacher could assign point values and weights that add up to 25, 50 or 100 points, enabling a straightforward conversion to a district or school grading scale (e.g., 90-100 = A, 80-89 = B, and so on). When the lists are shared with students in advance, they provide a clear performance target, signaling to students what elements should be present in their work. Despite these benefits, performance lists do not provide detailed descriptions of performance levels. Thus, despite identified criteria, different teachers using the same performance list may rate the same student's work quite differently.
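The point-and-grade conversion described above can be sketched in a few lines of code. This is a hypothetical illustration only: the criterion names and weights are borrowed from the graph example, and the 90-100 = A cutoffs simply mirror the sample grading scale mentioned in the text.

```python
# Hypothetical weighted performance list for a graphic display of data.
# Weights sum to 25 so totals convert directly to a percentage scale.
criteria = {
    "title": 5,
    "labels": 5,
    "accuracy": 10,   # accuracy is weighted more heavily than neatness
    "neatness": 5,
}

def percent_score(earned: dict) -> float:
    """Convert earned points to a percentage of the points possible."""
    return 100.0 * sum(earned.values()) / sum(criteria.values())

def letter_grade(pct: float) -> str:
    """Map a percentage to a letter grade (e.g., 90-100 = A, 80-89 = B)."""
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if pct >= cutoff:
            return grade
    return "F"

earned = {"title": 5, "labels": 4, "accuracy": 8, "neatness": 3}
pct = percent_score(earned)   # 80.0
print(letter_grade(pct))      # B
```

Note that the weighting happens entirely in the points-possible values; the conversion itself is a plain percentage, which is what makes performance lists easy to fold into conventional grading.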
Constructing a Criterion Performance List (example: oral presentation)

KEY QUESTIONS
• What are the key traits, elements, or dimensions that will be evaluated?
• How many score points (scale) will be needed? (Checklists only need a binary scale – yes or no – when used to evaluate the presence or absence of elements.)

Performance List for Oral Presentation

Key Traits:                          Possible Points    Earned Points
                                                        self     teacher
• well organized                           25           ______   ______
• topic explained and supported            30           ______   ______
• effective visual display                 25           ______   ______
• effective volume                          5           ______   ______
• effective rate of speech                  5           ______   ______
• appropriate inflection                    5           ______   ______
• effective posture                         5           ______   ______
Totals                                    100

☞ Teachers should review and discuss the identified elements and the scale with students prior to using the performance list for self/peer/teacher evaluation.

*adapted from materials presented by K. Michael Hibbard, Region 15 Board of Education, Middlebury, CT
Performance List for Cooperative Learning (Primary Level)

                                               Terrific   O.K.   Needs Work
1. Did I do my job in my group?
2. Did I follow directions?
3. Did I finish my part on time?
4. Did I help others in my group?
5. Did I listen to others in my group?
6. Did I get along with others in my group?
7. Did I help my group clean up?

adapted from materials developed by Dr. H.B. Lantz, ASCI (2000)
Holistic Rubric for Graphic Display of Data

3  All data is accurately represented on the graph. All parts of the graph (units of measurement, rows, etc.) are correctly labelled. The graph contains a title that clearly tells what the data shows. The graph is very neat and easy to read.

2  All data is accurately represented on the graph OR the graph contains minor errors. All parts of the graph are correctly labelled OR the graph contains minor inaccuracies. The graph contains a title that suggests what the data shows. The graph is generally neat and readable.

1  The data is inaccurately represented, contains major errors, OR is missing. Only some parts of the graph are correctly labelled OR labels are missing. The title does not reflect what the data shows OR the title is missing. The graph is sloppy and difficult to read.

A holistic rubric provides an overall impression of a student's work. Holistic rubrics yield a single score or rating for a product or performance. Holistic rubrics are well suited to judging simple products or performances, such as a student's response to an open-ended test prompt. They provide a quick snapshot of overall quality or achievement, and are thus often used in large-scale assessment contexts (national, state or district levels) to evaluate a large number of student responses. Holistic rubrics are also effective for judging the "impact" of a product or performance (e.g., to what extent was the essay persuasive? did the play entertain?). Despite these advantages, holistic rubrics have limitations. They do not provide a detailed analysis of the strengths and weaknesses of a product or performance. Since a single score is generally inadequate for conveying to students what they have done well and what they need to work on to improve, they are less effective at providing specific feedback to students. A second problem with holistic rubrics relates to the interpretation and use of their scores. For instance, two students can receive the same score for vastly different reasons. Does an overall rating of "3" on a 4-point holistic writing rubric mean that a student has demonstrated strong idea development ("4") and weak use of conventions ("2"), or vice versa? Without more specific feedback than a score or rating, it is difficult for the student to know exactly what to do to improve.
Analytic Rubric for Graphic Display of Data

Title (weight: ___)
3  The graph contains a title that clearly tells what the data shows.
2  The graph contains a title that suggests what the data shows.
1  The title does not reflect what the data shows OR the title is missing.

Labels (weight: ___)
3  All parts of the graph (units of measurement, rows, etc.) are correctly labelled.
2  Some parts of the graph are inaccurately labelled.
1  Only some parts of the graph are correctly labelled OR labels are missing.

Accuracy (weight: ___)
3  All data is accurately represented on the graph.
2  Data representation contains minor errors.
1  The data is inaccurately represented, contains major errors, OR is missing.

Neatness (weight: ___)
3  The graph is very neat and easy to read.
2  The graph is generally neat and readable.
1  The graph is sloppy and difficult to read.
An analytic rubric divides a product or performance into distinct traits or dimensions and judges each separately. Since an analytic rubric rates each of the identified traits independently, a separate score is provided for each. Analytic rubrics are better suited to judging complex performances (e.g., research process) involving several significant dimensions. As evaluation tools, they provide more specific information or feedback to students, parents and teachers about the strengths and weaknesses of a performance. Teachers can use the information provided by analytic evaluation to target instruction to particular areas of need. From an instructional perspective, analytic rubrics help students come to better understand the nature of quality work since they identify the important dimensions of a product or performance. However, analytic rubrics are typically more time-consuming to learn and apply. Since there are several traits to be considered, analytic scoring may yield lower inter-rater reliability (degree of agreement among different judges) than holistic scoring. Thus, analytic scoring may be less desirable for use in large-scale assessment contexts, where speed and reliability are necessary.
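Two of the ideas in this passage, combining independently rated traits into a weighted total and checking agreement between judges, can be made concrete with a short sketch. The trait names, weights, and scores below are hypothetical, and the percent-exact-agreement measure is only one simple stand-in for the inter-rater reliability statistics used in practice.

```python
# Hypothetical analytic-rubric scoring on a 3-point scale (as in the
# graph rubric), with accuracy weighted twice as heavily as other traits.
weights = {"title": 1, "labels": 1, "accuracy": 2, "neatness": 1}

def weighted_total(trait_scores: dict) -> int:
    """Combine per-trait scores into one weighted total, while the
    individual trait scores remain available as specific feedback."""
    return sum(weights[t] * s for t, s in trait_scores.items())

def agreement_rate(rater_a: dict, rater_b: dict) -> float:
    """Fraction of traits on which two raters gave the exact same score,
    a crude proxy for inter-rater reliability."""
    matches = sum(rater_a[t] == rater_b[t] for t in weights)
    return matches / len(weights)

a = {"title": 3, "labels": 2, "accuracy": 3, "neatness": 2}
b = {"title": 3, "labels": 3, "accuracy": 3, "neatness": 2}
print(weighted_total(a))     # 13 (out of a possible 15)
print(agreement_rate(a, b))  # 0.75 (exact agreement on 3 of 4 traits)
```

The per-trait dictionary is the analytic rubric's advantage: the student sees which dimension lost points, something a single holistic score cannot convey.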
Common Rubric for Mathematical Problem Solving

4 Expert
Problem Solving: An efficient strategy is chosen and progress towards a solution is evaluated. Adjustments in strategy, if necessary, are made along the way, and/or alternative strategies are considered. Evidence of analyzing the situation in mathematical terms and extending prior knowledge is present. A correct answer is achieved.
Reasoning and Proof: Deductive arguments are used to justify decisions and may result in formal proofs. Evidence is used to justify and support decisions made and conclusions reached. This may lead to generalizing and extending the solution to other cases.
Communication: A sense of audience and purpose is communicated. Communication of argument is supported by mathematical properties. Precise math language and symbolic notation are used to consolidate math thinking and to communicate ideas.
Representation: Abstract or symbolic mathematical representations are constructed to analyze relationships, extend thinking, and clarify or interpret phenomena.

3 Practitioner
Problem Solving: A correct strategy is chosen based on the mathematical situation in the task. Planning or monitoring of strategy is evident. Evidence of solidifying prior knowledge and applying it to the problem is present. A correct answer is achieved.
Reasoning and Proof: Arguments are constructed with adequate mathematical basis. A systematic approach and/or justification of correct reasoning is present. This may lead to clarification of the task and noting patterns, structures and regularities.
Communication: A sense of audience or purpose is communicated, and/or communication of an approach is evident through a methodical, organized, coherent, sequenced and labeled response. Formal math language is used to share and clarify ideas.
Representation: Appropriate and accurate mathematical representations are constructed and refined to solve problems or portray solutions.

2 Apprentice
Problem Solving: A partially correct strategy is chosen, or a correct strategy for solving only part of the task is chosen. Evidence of drawing on some previous knowledge is present, showing some relevant engagement in the task.
Reasoning and Proof: Arguments are made with some mathematical basis. Some correct reasoning or justification for reasoning is present, with trial and error or unsystematic trying of several cases.
Communication: Some awareness of audience or purpose is communicated, and may take place in the form of paraphrasing of the task; or some communication of an approach is evident through verbal/written accounts and explanations, use of diagrams or objects, writing, and use of mathematical symbols.
Representation: An attempt is made to construct mathematical representations to record and communicate problem solving, but they are incomplete or inappropriate.

1 Novice
Problem Solving: No strategy is chosen, or a strategy is chosen that will not lead to a correct solution.
Reasoning and Proof: Arguments are made with no mathematical basis. No correct reasoning or justification for reasoning is present.
Communication: No awareness of audience or purpose is communicated; or little or no communication of an approach is evident; or everyday, familiar language is used to communicate ideas.
Representation: No attempt is made to construct mathematical representations.

Source: Exemplars.com
COLLABORATION and TEAMWORK

Works toward the achievement of group goals.
4 Actively helps identify group goals and works hard to meet them.
3 Communicates commitment to the group goals and effectively carries out assigned roles.
2 Communicates a commitment to the group goals but does not carry out assigned roles.
1 Does not work toward group goals or actively works against them.

Demonstrates effective interpersonal skills.
4 Actively promotes effective group interaction and the expression of ideas and opinions in a way that is sensitive to the feelings and knowledge base of others.
3 Participates in group interaction without prompting. Expresses ideas and opinions in a way that is sensitive to the feelings and knowledge base of others.
2 Participates in group interaction with prompting or expresses ideas and opinions without considering the feelings and knowledge base of others.
1 Does not participate in group interaction, even with prompting, or expresses ideas and opinions in a way that is insensitive to the feelings or knowledge base of others.

Contributes to group maintenance.
4 Actively helps the group identify changes or modifications necessary in the group process and works toward carrying out those changes.
3 Helps identify changes or modifications necessary in the group process and works toward carrying out those changes.
2 When prompted, helps identify changes or modifications necessary in the group process, or is only minimally involved in carrying out those changes.
1 Does not attempt to identify changes or modifications necessary in the group process, even when prompted, or refuses to work toward carrying out those changes.

Effectively performs a variety of roles within a group.
4 Effectively performs multiple roles within the group.
3 Effectively performs two roles within the group.
2 Makes an attempt to perform more than one role within the group but has little success with secondary roles.
Generic Rubric for 21st Century Skills
Source: Marzano, R., Pickering, D. and McTighe, J. (1993). Assessing Outcomes: Performance Assessment Using the Dimensions of Learning Model. Alexandria, VA: ASCD.
Generic Analytic Speaking Rubric for World Languages

Comprehensibility
4 Responses readily comprehensible, requiring no interpretation on the part of the listener.
3 Responses comprehensible, requiring minimal interpretation on the part of the listener.
2 Responses mostly comprehensible, requiring interpretation on the part of the listener.
1 Responses barely comprehensible.

Fluency
4 Speech continuous with few pauses or stumbling.
3 Some hesitation but manages to continue and complete thoughts.
2 Speech choppy and/or slow with frequent pauses; few or no incomplete thoughts.
1 Speech halting and uneven with long pauses or incomplete thoughts.

Pronunciation
4 Accurate pronunciation enhances communication.
3 Infrequent mispronunciations do not interfere with communication.
2 Mispronunciations sometimes interfere with communication.
1 Frequent mispronunciations greatly interfere with communication.

Vocabulary
4 Rich use of vocabulary enhances communication.
3 Adequate and accurate use of vocabulary for this level enhances communication.
2 Inadequate and/or inaccurate use of vocabulary sometimes interferes with communication.
1 Inadequate and/or inaccurate use of vocabulary greatly interferes with communication.

Language Control
4 Accurate control of basic language structures.
3 Generally accurate control of basic language structures.
2 Emerging use of basic language structures.
1 Inadequate and/or inaccurate use of basic language structures.

Source: Fairfax County, VA Public Schools
http://www.fcps.edu/DIS/OHSICS/forlang/PALS/rubrics/
Task-Specific Rubric for a Science Investigation
Item 1 - Plan investigation (total possible points: 2)
a) describes how the investigation will be conducted
b) states what variables will be measured or observed; includes both solution time and temperature
c) design provides control for other variables, or renders other variables irrelevant

Item 2 - Conduct investigation and record measurements in table
Response is scored for both the quality of the presentation and the quality of the data collection.

Quality of presentation (total possible points: 2)
a) presents at least 2 sets of measurements in table
b) measurements are paired: dissolution time and temperature
c) labels table appropriately: data entries in columns identified by headings and/or units; units incorporated into headings or placed beside each measurement

Quality of data (total possible points: 3)
a) records solution time for at least three temperature points
b) measurements are plausible: time and temperature (109 to 100 degrees)
c) records solution times that decline as temperature increases

Item 3 - Draw conclusions about effect of temperature (total possible points: 2)
a) conclusion is consistent with data table or other presentation of data
b) describes relationship presented in the data

Item 4 - Explain conclusions (total possible points: 2)
a) relates higher temperature to greater energy or speed of particles (atoms, molecules, etc.)
b) makes connection between greater speed or energy of water molecules and the effect on the tablet (may be implicit)

Source: Third International Mathematics and Science Study (TIMSS)
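Task-specific criteria like these reduce scoring to a set of yes/no judgments per item. The sketch below is a hypothetical tally, not the actual TIMSS scoring protocol: it assumes one point per criterion met, capped at the item's stated total, with each True/False standing in for a judgment the scorer would make about the student's response.

```python
# Hypothetical tally for a task-specific, criterion-checklist rubric.
def score_item(checks: list, max_points: int) -> int:
    """Award one point per criterion met, capped at the item's total."""
    return min(sum(bool(c) for c in checks), max_points)

# Item 3 - Draw conclusions (total possible points: 2)
item3_checks = [
    True,   # a) conclusion is consistent with the data presented
    True,   # b) describes the relationship shown in the data
]
print(score_item(item3_checks, 2))  # 2
```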
Creating Task-Specific Rubrics from Generic
Generic Rubric for Declarative Knowledge (understanding)
4 Demonstrates a thorough understanding of the generalizations, concepts, and facts specific to the task or situation and provides new insights into some aspect of this information.
3 Displays a complete and accurate understanding of the generalizations, concepts, and facts specific to the task or situation.
2 Displays an incomplete understanding of the generalizations, concepts, and facts specific to the task or situation and has some notable misconceptions.
1 Demonstrates severe misconceptions about the generalizations, concepts, and facts specific to the task or situation.
Content Standard - Understands how basic geometric shapes are used in the planning of well-organized communities.
Task-Specific Rubric in Mathematics
4 Demonstrates a thorough understanding of how basic geometric shapes are used in the planning of well-organized communities and provides new insights into some aspect of their use.
3 Displays a complete and accurate understanding of how geometric shapes are used in the planning of well-organized communities.
2 Displays an incomplete understanding of how basic geometric shapes are used in the planning of well-organized communities and has some notable misconceptions about their use.
1 Has severe misconceptions about how basic geometric shapes are used in the planning of well-organized communities.
Source: Marzano, R., Pickering, D. and McTighe, J. (1993). Assessing Outcomes: Performance Assessment Using the Dimensions of Learning Model. Alexandria, VA: ASCD
Rubric Design Process #1 – T-Chart

One effective process for developing a rubric is to begin at the ends. In other words, to develop a rubric to assess degrees of understanding of a "big idea" or complex process, ask: What are indicators of a sophisticated understanding? What do the most effective performers do that beginners do not? Contrast these indicators with those of a novice. Similarly, when creating a rubric for skills, distinguish the qualities displayed by an expert compared to a novice. Use the following worksheet to identify specific indicators of novice versus expert.

example: persuasion

The novice ...
• assumes that presenting a clear position with a reason is sufficient to persuade
•
•
•

The expert ...
• understands that effective persuaders carefully analyze their audience to determine the most persuasive approach
•
•
•
Descriptive Terms for Differences in Degree

Use the following general terms to describe differences in degree when constructing a "first-time" scoring rubric with a 4-point scale. Once the rubric is applied, an analysis of student work will yield more precise descriptive language and/or a rubric with more gradations.

Degrees of Understanding
• thorough/complete
• substantial
• partial/incomplete
• misunderstanding/serious misconceptions

Degrees of Effectiveness
• highly effective
• effective
• moderately effective
• ineffective

Degrees of Frequency
• always/consistently
• frequently/generally
• sometimes/occasionally
• rarely/never

Degrees of Independence
student successfully completes the task:
• independently
• with minimal assistance
• with moderate assistance
• only with considerable assistance

Degrees of Accuracy
• completely accurate; all ___ (facts, concepts, mechanics, computations) correct
• generally accurate; minor inaccuracies do not affect overall result
• inaccurate; numerous errors detract from result
• major inaccuracies; significant errors throughout

Degrees of Clarity
• exceptionally clear; easy to follow
• generally clear; able to follow
• lacks clarity; difficult to follow
• unclear; impossible to follow
Four Categories of Criteria

Content – refers to the appropriateness and relative sophistication of the understanding, knowledge and skill employed.

Quality – refers to the overall quality, craftsmanship and rigor of the work.

Process – refers to the quality and appropriateness of the procedures, methods, and approaches used, prior to and during performance.

Result – refers to the impact, success or effectiveness of performance, given the purpose(s) and audience.

Example – Cooking a Meal

Here is an example in which all four types of criteria might be used to evaluate a meal in nine different ways:

Content
1. meal reflects knowledge of food, cooking, situation, and diners' needs and tastes
2. meal contains the appropriate, fresh ingredients
3. meal reflects sophisticated flavors and pairings

Quality
4. meal is presented in an aesthetically appealing manner
5. all dishes are cooked to taste

Process
6. meal is efficiently prepared, using appropriate techniques
7. the two cooks collaborated effectively

Result
8. meal is nutritious
9. meal is pleasing to all guests

NOTE: While these four categories reflect common types of criteria, we do not mean to suggest that you must use all four types for each and every performance task. Rather, you should select the criterion types that are appropriate for the goals being assessed through the task and for which you want to provide feedback to learners.
Four Categories of Criteria
Content – refers to the appropriateness and relative sophistication of the understanding, knowledge, and skill employed.
• Was the work accurate?
• Did the product reveal deep understanding?
• Were the answers appropriately supported?
• Was the work thorough?
• Were the arguments of the essay cogent?
• Was the hypothesis plausible and on target?
• In sum: Was the content appropriate to the task, accurate, and supported?

Quality – refers to the overall quality, craftsmanship, and rigor of the work.
• Was the speech organized?
• Was the paper mechanically sound?
• Was the chart clear and easy to follow?
• Did the story build and flow smoothly?
• Was the dance graceful?
• Were the graphics original?
• In sum: Was the performance or product of high quality?

Process – refers to the quality and appropriateness of the procedures, methods, and approaches used, prior to and during performance.
• Was the performer methodical?
• Was proper procedure followed?
• Was the planning efficient and effective?
• Did the reader/problem solver employ apt strategies?
• Did the group work collaboratively and effectively?
• In sum: Was the approach sound?

Result – refers to the impact, success, or effectiveness of performance, given the purpose(s) and audience.
• Was the desired result achieved?
• Was the problem solved?
• Was the client satisfied?
• Was the audience engaged and informed?
• Was the dispute resolved?
• Did the speech persuade?
• Did the paper open minds to new possibilities?
• In sum: Was the work effective?
Categories of Performance Criteria
By what criteria should understanding performances be assessed? The challenge in answering
this question is to ensure that we assess what is central to the understanding, not just what is easy to score. In
addition, we need to make sure that we identify the separate traits of performance (e.g. a paper can
be well-organized but not informative and vice versa) to ensure that the student gets specific and
valid feedback. Finally, we need to make sure that we consider the different types of criteria (e.g. the
quality of the understanding vs. the quality of the performance in which it is revealed).
Four types of performance criteria (with sample indicators)

result
Describes the overall impact and the extent to which goals, purposes, or results are achieved.
Sample indicators: beneficial, conclusive, convincing, decisive, effective, engaging, entertaining, informative, inspiring, meets standards, memorable, moving, persuasive, proven, responsive, satisfactory, satisfying, significant, useful, understood

process
Describes the degree of skill/proficiency. Also refers to the effectiveness of the process or method used.
Sample indicators: careful, clever, coherent, collaborative, concise, coordinated, effective, efficient, flawless, followed process, logical/reasoned, mechanically correct, methodical, meticulous, organized, planned, purposeful, rehearsed, sequential, skilled

quality
Describes the degree of quality evident in products and performances.
Sample indicators: attractive, competent, creative, detailed, extensive, focussed, graceful, masterful, organized, polished, proficient, precise, neat, novel, rigorous, skilled, stylish, smooth, unique, well-crafted

content
Describes the degree of knowledge of factual information or understanding of concepts, principles, and processes.
Sample indicators: accurate, appropriate, authentic, complete, correct, credible, explained, justified, important, in-depth, insightful, logical, makes connections, precise, relevant, sophisticated, supported, valid
Rubric for Degree of Transfer

3 – THE GAME. The task is presented without cues as to how to approach or solve it, and may look unfamiliar or new. Success depends upon a creative adaptation of one’s knowledge, based on understanding the situation and the adjustments needed to achieve the goal (“far transfer”). No simple “plugging in” will work, and the student who learned only by rote will likely not recognize how the task taps prior learning and requires adjustments. Not all students may succeed, therefore, and some may give up.
• In a writing class, students are given a quote that offers an intriguing and unorthodox view of a recently-read text, and are simply asked: “Discuss.”
• In a math class, students must use their knowledge of volume and surface area to solve a problem like: “What shape permits the most volume of M&Ms to be packed in the least amount of space, cost-effectively and safely?”

2 – GAME-LIKE. The task is complex but is presented with sufficient clues/cues meant to suggest the approach or content called for (or to simplify/narrow down the options considerably). Success depends upon realizing which recent learning applies and using it in a straightforward way (“near transfer”): figuring out what kind of problem this is and, with modest adjustments, using prior procedures and knowledge to solve it.
• writing: same as above, but the directions summarize what a good essay should include and what past topics and ideas apply.
• math: the above problem is simplified and scaffolded by the absence of a specific context and through cues provided about the relevant math and procedures.

1 – DRILL. The task looks familiar and is presented with explicit reference to previously studied material and/or approaches. Minimal or no transfer is required. Success requires only that the student recognize, recall, and plug in the appropriate knowledge/skill in response to a familiar (though perhaps slightly different) prompt. Any transfer involves dealing only with altered variables or details different from those in the teaching examples, and/or remembering which rule applies from a few obvious recent candidates.
• writing: the prompt is just like past ones, and the directions tell the student what to consider and provide a summary of the appropriate process and format.
• math: the student need only plug in the formulae for spheres, cubes, pyramids, cylinders, etc. to get the right answers, in a problem with no context.
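The contrast between the drill level and the game level can be made concrete. At the drill level, the volume work above is pure formula plug-in with no context or judgment required. A minimal sketch of that level of work (the shapes and dimensions here are hypothetical, chosen only for illustration):

```python
import math

# Level-1 ("drill") work: recall the formula and plug in the given values.
def volume_sphere(radius: float) -> float:
    return (4 / 3) * math.pi * radius ** 3

def volume_cube(side: float) -> float:
    return side ** 3

def volume_cylinder(radius: float, height: float) -> float:
    return math.pi * radius ** 2 * height

# No context and no decisions to make -- success requires only correct recall.
print(round(volume_cube(3.0), 2))           # 27.0
print(round(volume_cylinder(2.0, 5.0), 2))  # 62.83
```

The game-level M&Ms packing task, by contrast, cannot be reduced to a single formula call: the student must decide which shapes to compare, how to model "cost-effectively and safely," and how to combine the formulas, which is exactly the transfer the rubric is trying to capture.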
The following six-step process shows how to identify performance criteria and use them as a basis for designing a scoring rubric. The procedure begins with sorting student work and then proceeds by examining sample performance criteria from other sources.
Step 1: Gather samples of student performance that illustrate the desired skill or understanding.
Choose as large and diverse a set of samples as possible.
Step 2: Sort student work into different stacks and write down the reasons.
For example, place the samples of student work into three piles: strong, middle and weak. As the student work is sorted, write down reasons for placing pieces in the various stacks. If a piece is placed in the “sophisticated” pile, describe its distinguish-ing features. What cues you that the work is sophisticated? What are you saying to yourself as you place a piece of work into a pile? What might you say to a student as you return this work? The qualities (attributes) that you identify reveal criteria. Keep sorting work until you are not adding anything new to your list of attributes.
Step 3: Cluster the reasons into traits or important dimensions of performance.
The sorting process used thus far in this exercise is “holistic.” Participants in this process end up with a list of comments for high, medium and low performance; any single student product gets only one overall score. Usually, during the listing of com-ments someone will say something to the effect that, “I had trouble placing this paper into one stack or another because it was strong on one trait but weak on another.” This brings up the need for analytical trait scoring systems; i.e., evaluating each student’s product or performance on more than one dimension.
Rubric Design Process #3 – Categorizing Student Work
Step 4: Write a definition of each trait.
These definitions should be “value neutral” – they describe what the trait is about, not what good performance looks like. (Descriptions of good performance on the trait are left to the “high” rating.)
Step 5: Find samples of student performance that illustrate each score point on each trait.
Find samples of student work which are good examples of strong, weak and mid range performance on each trait. These can be used to illustrate to students what to do and what “good” looks like. It’s important to have more than a single example. If you show students only a single example of what a good performance looks like, they are likely to imitate or copy it.
Step 6: Continuously Refine
Criteria and rubrics evolve with use. Try them out. You’ll probably find some parts of the rubric that work fine and some that don’t. Add and modify descriptions so that they communicate more precisely. Choose better sample papers that illustrate what you mean. Revise traits if you need to. Let students help—this is a tool for learning.
Rubric Design Process #3
(continued)
Source: Arter, J. and McTighe, J. (2001). Scoring Rubrics in the Classroom: Using Performance Criteria for Assessing and Improving Student Performance.
Thousand Oaks, CA: Corwin Press
Assessment Task Blueprint: Validity Check
(Note: This is a flawed example.)

Validity requires that these elements align.

What content standards will be assessed through this task?
• Students will understand the causes and effects of the Civil War.
• Students will demonstrate knowledge of and skill in using topographical maps.

Through what authentic performance task will students demonstrate understanding?

Task Overview: You are opening a new museum on the Civil War designed to inform and engage young people. Your task is to select a decisive Civil War battle, research the battle, and construct a diorama of the battle. Attach an index card to your diorama containing the date of the battle, the names of the opposing generals, the number of casualties on each side, and the victor. Finally, create a topographical map to show an aerial view of the battlefields. Remember: Your map must be drawn to scale. Neatness and spelling count!

What student products/performances will provide evidence of desired understandings?
• diorama of Civil War battle
• topographical map of battlefield

By what criteria/indicators will student products/performances be evaluated?
diorama:
• actual Civil War battle depicted
• accurate information on index card
• neat and colorful
• correct spelling
topographical map:
• accurate topography
• drawn to scale
• includes compass rose
• correct placement of armies
• neat and colorful
Rubric for a Civil War Re-enactor
Adapted from a humorous rubric created by Dr. Tim Dangel, Anne Arundel Schools (MD)

Score Point 4
The re-enactor always wears wool from head to toe while on the battlefield or in camp. S/he eliminates all 20th century terms from vocabulary while in role. Subsists entirely on hardtack and coffee. Contracts lice and annoying intestinal ailments during extended re-enactments.

Score Point 3
The re-enactor dresses in wool from head to toe in July. S/he usually follows drill orders to march and fire rifle. Carries hardtack and coffee in haversack. Can correctly identify Union and Confederate troops while in the field.

Score Point 2
The re-enactor wears a blue uniform made of synthetic materials. S/he executes most orders, but usually 3-5 seconds after the rest of the company. Hides a Snickers bar in haversack and carries beer in canteen. Sometimes cannot remember which side wears blue and which wears gray.

Score Point 1
The re-enactor wears an Orioles cap, Hard Rock Cafe tee-shirt, and Reeboks with uniform. S/he cannot tell Union from Confederate troops. Has been heard asking, “Are you a Union or Confederate soldier?” Fires upon his fellow soldiers and frequently wounds self or fellow soldiers. Litters the 19th century campground with Twinkie and Big Mac wrappers.
Comments:
Critique These Two Rubrics

Topic: Observing and describing living things
Score Point 4 Accurately describes 4 or more attributes of plants and animals.
Score Point 3 Accurately describes 3 attributes of plants and animals.
Score Point 2 Accurately describes 2 attributes of plants and animals.
Score Point 1 Accurately describes 1 attribute of plants and animals.
Score Point 0 Does not accurately describe any attributes of plants and animals.

Topic: Persuasion (in writing or speaking)
4 – Provides 4 or more reasons.
3 – Provides 3 reasons.
2 – Provides 2 reasons.
1 – Provides a reason.
0 – Provides no reasons.
Reviewing Your Rubric
In summary, the best criteria/rubrics...
1. evaluate student performances in terms of characteristics central to Stage 1 goals, not just the surface features of the task itself. Be careful not to over-emphasize the surface features of a particular product or performance (e.g., “colorful” or “neat”) at the expense of the most important traits related to understanding (e.g., “thorough” or “explanation with support”).
2. reflect the central features of performance, not just those which are easiest to see, count or score (e.g., “at least 4 footnotes” or “no misspellings”) at the expense of the most important traits (e.g., “accurate” or “effective”).
3. split independent criteria into separate traits. In other words, do not combine distinct traits, such as “very clear” and “very organized” in the same criterion, since an essay might be clear but not organized, and vice versa.
4. emphasize the result of the performance. Ultimately, meaning-making and transfer are about results – was the paper persuasive? ...the problem solved? ...the story engaging? ...the speech informative? In other words, the criteria chosen should always highlight the purpose of the task, as indicated by results-focused criteria. Be careful not to assess for mere compliance or process (i.e., “followed all the steps,” “worked hard”).
5. balance specific feedback on the task with reference back to general goals. Ultimately, a broad understanding matters more than performance on a unique and very specific task. However, the indicators need to be specific enough to provide useful feedback as well as reliable scoring of the particular task.
Encouraging Self-Assessment and Reflection

Rubrics may be used as tools to engage students in self-evaluation, reflection, and goal setting. The following questions may be used as prompts to guide student self-evaluation and reflection.

• What do you really understand about _________?
• What questions/uncertainties do you still have about _________?
• What was most effective in _________?
• What was least effective in _________?
• How could you improve _________?
• What would you do differently next time?
• What are you most proud of?
• What are you most disappointed in?
• How difficult was _________ for you?
• What are your strengths in _________?
• What are your deficiencies in _________?
• How does your preferred learning style influence _________?
• What grade/score do you deserve? Why?
• How does what you’ve learned connect to other learnings?
• How has what you’ve learned changed your thinking?
• How does what you’ve learned relate to the present and future?
• What follow-up work is needed?
• other: __________________________________________?
Analytic Rubric for Graphic Display of Data

Name: _________________________________  Date: _______________

title (weight: ___)
3 – The graph contains a title that clearly tells what the data shows.
2 – The graph contains a title that suggests what the data shows.
1 – The title does not reflect what the data shows OR the title is missing.

labels (weight: ___)
3 – All parts of the graph (units of measurement, rows, etc.) are correctly labelled.
2 – Some parts of the graph are inaccurately labelled.
1 – Only some parts of the graph are correctly labelled OR labels are missing.

accuracy (weight: ___)
3 – All data is accurately represented on the graph.
2 – Data representation contains minor errors.
1 – The data is inaccurately represented, contains major errors, OR is missing.

neatness (weight: ___)
3 – The graph is very neat and easy to read.
2 – The graph is generally neat and readable.
1 – The graph is sloppy and difficult to read.

Goals/Actions: _______________________________________________
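An analytic rubric like this one yields a separate score per trait, which can then be combined into an overall score using the trait weights. A minimal sketch of that arithmetic, using hypothetical weights (the rubric itself leaves its weights blank):

```python
# Hypothetical trait weights -- the original rubric leaves its weights blank.
RUBRIC_WEIGHTS = {"title": 1, "labels": 1, "accuracy": 2, "neatness": 1}

def weighted_score(trait_scores: dict) -> float:
    """Combine per-trait scores (1-3) into a weighted average on the same 1-3 scale."""
    total_weight = sum(RUBRIC_WEIGHTS.values())
    weighted = sum(RUBRIC_WEIGHTS[trait] * score for trait, score in trait_scores.items())
    return weighted / total_weight

# A student strong on accuracy but weak on neatness:
scores = {"title": 3, "labels": 2, "accuracy": 3, "neatness": 1}
print(round(weighted_score(scores), 2))  # 2.4
```

Note that the per-trait scores are the useful feedback for the learner; the weighted total is mainly a convenience for grading, and weighting accuracy more heavily (as sketched here) reflects a judgment that it matters more than neatness.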
Questions To Ask When Examining Student Work
Use the following questions to guide the examination of student work.
Describe
• What knowledge and skills are assessed?
• What kinds of thinking are required (e.g., recall, interpretation, evaluation)?
• Are these the results I (we) expected? Why or why not?
• In what areas did the student(s) perform best?
• What weaknesses are evident?
• What misconceptions are revealed?
• Are there any surprises?
• What anomalies exist?
• Is there evidence of improvement or decline? If so, what caused the changes?

Evaluate
• By what criteria am I (are we) evaluating student work?
• Are these the most important criteria?
• How good is “good enough” (i.e., the performance standard)?

Interpret
• What does this work reveal about student learning and performance?
• What patterns (e.g., strengths, weaknesses, misconceptions) are evident?
• What questions does this work raise?
• Is this work consistent with other achievement data?
• Are there different possible explanations for these results?

Identify Improvement Actions
• What teacher action(s) are needed to improve learning and performance?
• What student action(s) are needed to improve learning and performance?
• What systemic action(s) at the school/district level are needed to improve learning and performance (e.g., changes in curriculum, schedule, grouping)?
• Other: _________________________________________________________?
• Other: _________________________________________________________?
Data-Driven Improvement Planning

Based on an analysis of achievement data and student work:

• What patterns of weakness are noted?

• What specific areas are most in need of improvement?
– problem solving and mathematical reasoning are generally weak
– students do not effectively explain their reasoning and their use of strategies
– appropriate mathematical language is not always used

• What specific improvement actions will we take?
– Explicitly teach (and regularly review) specific problem solving strategies.
– Develop a poster of problem solving strategies and post in each math classroom.
– Develop a “word wall” of key mathematical terms and use the terms regularly.
– Increase use of “think alouds” (by teacher & students) to model mathematical reasoning.
– Increase our use of “nonroutine” problems that require mathematical reasoning.
– Revise our problem solving rubric to emphasize explanation & use of mathematical language.