
Assessment: A key aspect of teaching and learning

Mark Wilson & Kathleen Scalise

UC Berkeley

Overview of Today’s Presentation

• Set the context – What are some problems with assessment in higher education?

• Overview of the logic & findings from Knowing What Students Know (KWSK)

• Give an example of an assessment system

An Example of Problems with Current Educational Assessment

“One of the persistent dilemmas in education is that students spend time practicing incorrect skills with little or no feedback. Furthermore, the feedback they receive is often neither timely nor informative. For the less capable student, unguided practice can be practice in doing tasks incorrectly.” — NRC report “Knowing What Students Know,” p. 87

An Example: CS 173¹

• Week 1: “Went to my first CS 173 discussion today. Went over what we are going to cover in the class. Sounds like some cool stuff…. The only thing bugging me right now is that I am not officially enrolled in CS 173 yet. I am fourth on the waitlist.”

¹ Not the real course number.

Week 4

• “Damn homework…. I went to the lab to work on it. Everyone shortly there after came in and started working on their CS 173 homework. So I ended up staying around to help everyone the best I could since I was the only one to have finished the homework…. I have no clue if it is correct, though.”

Week 8

• “Almost all of Monday was spent in the (computer lab) working on CS 173 homework again. What made this severely frustrating is that I was unable to solve the problem that I spent all day on; no one I know was able to solve that problem…. Severely frustrated, I went on home.”

Week 9

• “Midterm grade: 63/100. The mean was 55.5. Standard deviation was around 18. Would have liked mean + SD, but I will live. Still beat the mean which is what is really important.”
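In standardized terms, the student sat at z = (63 − 55.5) / 18 ≈ 0.42, about 0.4 standard deviations above the mean; the hoped-for “mean + SD” would have required roughly 55.5 + 18 = 73.5 points. Note what this feedback conveys: only relative standing in the class, nothing about which skills were weak, which is exactly the problem the KWSK quote above describes.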

Week 17

• “Went and took the exam. Kind of stupid. Bunch of short answer with some other stuff that was never covered in the homework. I found out afterwards that about a third to half of the test lifted from last year's final. So everyone who had it or had read it knew how to answer those questions perfectly. Of course I had not seen it, let alone had a copy for the test…. Needless to say the curve will be skewed.”

Some concerns about Assessment in Higher Education

• Assessments frequently fail to provide:
– useful feedback to students
– useful “feedforward” to instructors
– useful “feedforward” to administrators

• Narrowing of instruction by teaching to tests with restricted performance outcomes.

• Narrowing of student learning engagement when metacognitive needs are not satisfied.

The Assessment Triangle

• cognition – model of how students represent knowledge & develop competence in the domain

• observations – tasks or situations that allow one to observe students’ performance

• interpretation – method for making sense of the data

[Diagram: the assessment triangle, with cognition, observations, and interpretation at the vertices. Must be coordinated!]

Scientific Foundations of Assessment

• Advances in the Sciences of Thinking and Learning – the cognition vertex
– informs us about what observations are sensible to make

• Contributions of Measurement and Statistical Modeling – the interpretation vertex
– informs us about how to make sense of the observations we have made

Advances in Sciences of Thinking & Learning

• The most critical implications for assessment are derived from study of the nature of competence and the development of expertise in specific curriculum domains:
– Knowledge organization
– Characteristics of expertise
– Metacognition
– Multiple paths to competence
– Preconceptions and mental models
– Situated knowledge and expertise

Some Summary Points

• Contemporary knowledge from the cognitive sciences strongly implies that assessment practices need to move beyond discrete bits and pieces of knowledge to encompass the more complex aspects of student achievement

• Instructional programs and assessment practices based on cognitive theory exist for some areas of the curriculum

• Further work is needed to:
– translate research findings for practical use
– develop models of learning for all areas of the curriculum

Advances in Measurement: Beyond Models of General Proficiency

• Three general sets of measurement issues that can be accommodated by various models:
– continua vs. classes
– single vs. multiple attributes
– status vs. change

• Report describes a progression of models and methods of increasing complexity (a minimal example follows)
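As a concrete anchor (a standard formulation, ours rather than reproduced from the slides): the simplest member of such a progression is the Rasch model for a single continuous proficiency, which the richer models generalize along exactly the three dimensions above.

P(X_pi = 1 | θ_p, δ_i) = exp(θ_p − δ_i) / (1 + exp(θ_p − δ_i))

Here θ_p is person p's proficiency and δ_i is item i's difficulty. Latent class models replace the continuous θ with discrete classes (continua vs. classes); multidimensional models make θ a vector of attributes (single vs. multiple); growth models let θ_p change over time (status vs. change).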

Assessment Design Principles

Assessment design should always be based upon
• a model of student learning
• well-designed and tested items
• and a clear sense of the inferences about student competence that are desired
for the particular context of use.

Implications for Assessment Practice

In the classroom:
• assessment should be an integral part of instruction
• students should get information about particular qualities of their work and what they can do to improve

• students must understand learning goals and landmark performances along the way

• cognitive science findings need to be made user-friendly

Assessment Practice, cont.

• Report envisions systems of assessments that cut across contexts and that are:
– comprehensive
– coherent
– continuous

• We need to shift the emphasis toward the classroom, where learning occurs
– Example: the BEAR assessment system

The BEAR Assessment System

Example: The ChemQuery project

Reminder… the Assessment Triangle: [diagram repeats the triangle, with cognition, observation, and interpretation at the vertices]

BEAR Assessment System 1: Principles

I. Developmental perspective

II. Match between instruction and assessment

III. Management by teachers

IV. Quality evidence

BEAR Assessment System 2: Building Blocks

I. Developmental progress variables

II. Items model

III. Outcome space

IV. Measurement model (a toy sketch follows)
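To make the fourth building block concrete, here is a toy sketch in Python (ours, with simulated data) of fitting the simplest Rasch-family measurement model by alternating gradient ascent on the joint likelihood. The actual ChemQuery analyses use richer Rasch-family models (e.g., the partial credit model) and dedicated estimation software.

import numpy as np

# Simulate 0/1 responses from a Rasch model: person proficiency theta,
# item difficulty delta, P(correct) = logistic(theta - delta).
rng = np.random.default_rng(0)
n_persons, n_items = 200, 10
true_theta = rng.normal(0.0, 1.0, n_persons)
true_delta = np.linspace(-1.5, 1.5, n_items)
prob = 1.0 / (1.0 + np.exp(-(true_theta[:, None] - true_delta[None, :])))
X = (rng.random((n_persons, n_items)) < prob).astype(float)

# Joint maximum likelihood via alternating (scaled) gradient ascent.
theta = np.zeros(n_persons)
delta = np.zeros(n_items)
for _ in range(500):
    e = 1.0 / (1.0 + np.exp(-(theta[:, None] - delta[None, :])))  # expected scores
    theta += 0.5 * (X - e).mean(axis=1)   # d(logL)/d(theta), scaled by 1/n_items
    delta -= 0.5 * (X - e).mean(axis=0)   # d(logL)/d(delta), scaled by 1/n_persons
    delta -= delta.mean()                 # anchor the scale (identification)

print("estimated item difficulties:", np.round(delta, 2))

(Joint maximum likelihood is used here only for brevity; persons with all-correct or all-wrong response strings have no finite estimate under it, a detail real estimation software handles.)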

Developmental Progress Variables from ChemQuery

• Matter is composed of atoms arranged in various ways: composition, structure, properties and amount of matter.

• Change is associated with rearrangements of atoms: type, progression and conservation in change.

• Stability is maintained unless change occurs with energy input: possibilities, influence and effort of stability.

ChemQuery

Examples of items from our instrument:

Both of the solutions have the same molecular formula, but butyric acid smells bad and putrid while ethyl acetate smells good and sweet. Explain why these two solutions smell differently.

butyric acid: C4H8O2

ethyl acetate: C4H8O2

Items Design

Outcome Space (read from bottom up)

5. Generation: Students use the models to generate new knowledge and to extend models. (~graduate school)

4. Construction: Students integrate scientific understanding into full working models of the domain. (~upper division)

3. Formulation: Students combine unirelational ideas, building more complex knowledge structures in the domain. (~lower division)

2. Recognition: Students begin to recognize normative scientific ideas, attaching meaning to unirelational concepts. (~high school)

1. Notions: Students bring real-world ideas, observation, logic and reasoning to explore scientific problem-solving. (~middle school)
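To connect the outcome space to the building blocks above, here is a minimal sketch (ours; the dictionary and helper are hypothetical names) of how the five levels might be encoded as a scoring guide that turns a rater's judgment into data for the measurement model:

# Hypothetical encoding (ours) of the ChemQuery outcome space as a scoring
# guide; a human rater supplies the level, and the record feeds the
# measurement model. Automated scoring is not implied.
OUTCOME_SPACE = {
    1: "Notions",
    2: "Recognition",
    3: "Formulation",
    4: "Construction",
    5: "Generation",
}

def score_response(response_text: str, level: int) -> dict:
    """Package one rated response as a data point for later analysis."""
    if level not in OUTCOME_SPACE:
        raise ValueError(f"level must be 1-5, got {level}")
    return {"response": response_text, "level": level,
            "category": OUTCOME_SPACE[level]}

print(score_response("They have different structural formulas...", 2))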

ChemQuery

Level One: Notions

Response 1: I think there could be a lot of different reasons as to why the two solutions smell differently. One could be that they're different ages, and one has gone bad or is older which changed the smell.

Response 2: Using chemistry theories, I don't have the faintest idea, but using common knowledge I will say that the producers of the ethyl products add smell to them so that you can tell them apart.

Response 3: Just because they have the same molecular formula doesn't mean they are the same substance. Like different races of people: black people, white people. Maybe made of the same stuff but look different.

Level Two: Recognition

Response: "They smell differently b/c even though they have the same molecular formula, they have different structural formulas with different arrangements and patterns.”

ChemQuery

Quality evidence: student profile

Quality evidence: track student over time

ChemQuery – Quality evidence: To help ALL students increase understanding of chemistry

[Chart: Fall 2000 student gains, grouped by pretest score (low, middle, high): change in score level from pretest to post-test, on a scale from −2 to +2.]
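The analysis behind such a chart is straightforward to sketch; the following (ours, with made-up numbers) computes mean score-level gains grouped by pretest standing:

import pandas as pd

# Sketch (ours, simulated data): change in score level from pretest to
# post-test, grouped into low/middle/high thirds of the pretest distribution.
df = pd.DataFrame({
    "pretest":  [1, 1, 2, 2, 2, 3, 3, 4, 4, 5],
    "posttest": [2, 2, 2, 3, 3, 4, 4, 4, 5, 5],
})
df["gain"] = df["posttest"] - df["pretest"]
df["group"] = pd.qcut(df["pretest"], 3, labels=["low", "middle", "high"])
print(df.groupby("group", observed=True)["gain"].mean())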

ChemQuery team: Jennifer Claesgens, Kathleen Scalise, Angelica Stacy, Rebecca Krystiniak, Sheryl Mebane, Karen Draney, Mark Wilson

Funding: NSF. Contact: mrwilson@ [email protected]

To know what they know.

And how to help.

For More Information

• NRC’s KWSK report:
– http://www.nap.edu/catalog/10019.html
– A summary with commentaries: Measurement: Interdisciplinary Research and Perspectives (2003), whole issue #2, Erlbaum. http://bear.soe.berkeley.edu/measurement/default4.html

• BEAR Assessment System:
– Wilson, M., & Sloane, K. (2000). From principles to practice: An embedded assessment system. Applied Measurement in Education, 13(2), 181-208.

– See also: http://bear.soe.berkeley.edu/

• Living By Chemistry Project:– www.lhs.berkeley.edu/LBC