
FVU-math

Final test

EMMA Study Visit Copenhagen 9 – 12 May 2007
Pernille Pind
email: pernille@pernillepind.dk
website: www.pernillepind.dk

Organization:

2000–2001: Part of the larger project on FVU-math

Group:
• 1 representative from the ministry
• 1 PhD student
• 4 teachers from the pilot project of teacher training for FVU-math
• 1 AVU teacher (group leader)

FVU-math, a new education

Numeracy, the new aim

Content: activities, data and media, mathematical concepts and operations

Organization: contexts

(diagram: activities, data and media, and mathematical concepts and operations, organized within contexts)

Many ideas for final test:

• Test over several days with real problems
• Oral test with real activities
• Real everyday concrete material
• Video sequences in the test
• Audio sequences in the test

In the end it turned out to be a fight against a written test.

And we lost the fight!

Conditions from the ministry:
• Final test
• Issued by the ministry
• Written
• Assessment: pass/fail
• Individual testing
• Voluntary
• Printed and copied by each institution
• Flexible test dates
• Work towards a database of test questions

Our basic point of view:
• The aim is numeracy, not mini-math
• Final tests have an enormous impact on teaching

Our constitution:

Double authenticity:

• all test questions emerge from authentic documents

• all test questions should be realistic

Did that make any difference compared to other math programmes?

Yes!

In comparison with final tests for Folkeskolen and AVU:

•Authentic documents are only used as illustration, not as an actual part of the question.

• Reality is changed so that it fits the math you want to test.

•Several non-realistic questions.

Categories of test questions:

We worked with the database in mind.

We made hundreds of test questions and then started to group them into categories.

The grouping of questions turned out to be a grouping of documents.

Only a few of the categories emerged from an activity.

The categories were not a sufficient description of the test questions!

Categories of test questions:

(an x marks the level, Trin 1 and/or Trin 2, at which a category is used)

Diagrams and tables: x
Shapes: x
Shopping: x x
Instruments: x
Maps: x
Measuring: x
Numbers and codes: x
Recipes: x x
Calculate the same way (algorithms): x

Interest: x
Rates: x x
Numbers in text: x
Time: x x
Counting: x
Fill in: x x
Currency: x
Informative labeling: x x
World in two dimensions (photo): x

Question Categorizing System

As said before, the categories were not sufficient.

Every question should also be categorized according to the following parameters:

Media, Data type, Concepts, Activity, Context, Type of answer and Level of difficulty.

These were soon reduced to:

Context, Concepts, Activity, Type of answer and Level of difficulty.
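The slides do not show the OKS data model itself, but the five remaining parameters can be pictured as fields on a record per question. The sketch below is a minimal, hypothetical Python illustration; the field names, the answer-type values and the example question are assumptions, not the actual OKS schema.

```python
from dataclasses import dataclass
from enum import Enum


class AnswerType(Enum):
    """Illustrative answer types; the actual OKS values are not listed in the slides."""
    NUMBER = "number"
    TEXT = "text"
    MARK_ON_DOCUMENT = "mark on document"


@dataclass
class QuestionTags:
    """One test question tagged with the five OKS parameters named on the slide."""
    context: str            # the everyday situation the authentic document comes from
    concepts: list[str]     # mathematical concepts and operations involved
    activity: str           # what the test-taker actually does with the document
    answer_type: AnswerType
    difficulty: int         # 1, 2 or 3 (see the difficulty criteria below)


# Hypothetical example: tagging a shopping question for the database
q = QuestionTags(
    context="shopping",
    concepts=["addition", "money"],
    activity="compare prices on an authentic advert",
    answer_type=AnswerType.NUMBER,
    difficulty=1,
)
print(q)
```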

Opgave Kategoriserings System – OKS (Question Categorization System)


Level of difficulty

Level 1, 2 or 3

The following five things are taken into consideration:
1. Level of familiarity
2. Level of "noise" in the document
3. Level of openness
4. Quantity of calculations
5. Number of steps in the calculations
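The slides name these five criteria but not how they are combined into a level. A minimal sketch, assuming each criterion is rated 1–3 by the question writer and the ratings are simply averaged (an illustrative rule, not the official OKS one):

```python
def difficulty_level(familiarity: int, noise: int, openness: int,
                     amount_of_calculation: int, calculation_steps: int) -> int:
    """Combine the five criteria (each assumed to be rated 1-3) into a level 1, 2 or 3.

    Averaging and rounding is an assumed way of combining them; the slides
    do not specify the actual rule.
    """
    ratings = [familiarity, noise, openness, amount_of_calculation, calculation_steps]
    return round(sum(ratings) / len(ratings))


# Example: a familiar document with little noise and a single one-step calculation
print(difficulty_level(1, 1, 2, 1, 1))  # -> 1
```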

Enough time and predictable questions

• We wished that nobody should feel stressed by a time limit.

• We wished that nobody should feel surprised by the test questions.

Our hope was that test training in class wouldn't take up too much time.

It’s difficult!

To find documents:

• that are relevant for: the whole country, both sexes, ages between 18 and 65, all occupations…

• with very little "noise"

• that seem to be without pitfalls

To formulate questions that are realistic for many people

Test-question examples

Trin 1, opgavesæt C, forår 2005 (Trin 1 question set C, spring 2005)

Trin 2, opgavesæt B, forår 2005 (Trin 2 question set B, spring 2005)
