
We Have Standardised, Let’s Do No More

Prepared by:

Ebtesam Abdulhaleem

Supervisors:

Prof. Claudia Harsch and Dr. Neil Murray

Outline

• Introduction

• Background Information

• The Study Context

• Literature Review

• The Problem

• Aims of the Study & Research Questions

• Research Methodology

• Results

• Recommendations

Introduction

• English as a foreign language (EFL) during school education in Saudi Arabia
  - Very limited general English exposure.

• English and university education
  - English is the main medium of instruction for many academic disciplines.

Background Information

• Preparatory Year Programme (PYP)
  - An intensive programme "to improve the knowledge and skills of high school graduates before they join their desired majors at the university" (Al-Murabit, 2012).
  - Its purpose is to improve the students' English language proficiency so that they can cope with their colleges' English requirements.

• Before the commencement of the PYP, students are placed:
  - Into three different levels (Advanced (C), Intermediate (B) and Beginner (A)) based on the results of a placement test.
  - Into different tracks (e.g. the Medical/Healthcare Track) based on their high school GPA and interest.

Background Information: PYP Curriculum Framework

KSU PYD ELSD levels and their CEFR equivalents:
• Beginner: A1
• Elementary: A1-A2
• Pre-Intermediate: A2
• Pre-Intermediate Plus: A2-B1
• Intermediate: B1
• Intermediate Plus: B1-B2
• Upper-Intermediate: B2
• Advanced: C1

Medical/Health Track courses, by category and semester:
• Category A: Semester 1 - ENG141 (Quarter 1: Elementary, Quarter 2: Pre-Intermediate; A1-A2); Semester 2 - ENG146 (Quarter 3: Pre-Intermediate Plus, Quarter 4: Intermediate; B1)
• Category B: Semester 1 - ENG142 (Quarter 1: Pre-Intermediate, Quarter 2: Pre-Intermediate Plus; A2); Semester 2 - ENG137-B (Quarter 3: Intermediate, Quarter 4: Intermediate Plus; B1-B2)
• Category C: Semester 1 - ENG143 (Quarter 1: Intermediate, Quarter 2: Intermediate Plus; B1); Semester 2 - ENG147-C (Quarter 3: Upper Intermediate, Quarter 4: Advanced; C1)

Background Information

• Based on the students' track and GPA at the PYP, students are enrolled at one of the colleges associated with that track.

For example, students in the medical track can register at one of the following colleges:
  - College of Medicine
  - College of Pharmacy
  - College of Applied Medical Sciences
  - College of Dentistry
  - College of Nursing

The Study Context

• Students (all levels and tracks) sit the same standardised tests.

• The results are expected to discriminate between the students so that they can be enrolled at different colleges based on their GPA.

Assessment weighting (a small worked example follows below):

• Continuous assessment: 20%
  - Project: 10%
  - Process writing: 10%

• Mid-term and final exams: 80%
  - Mid-term exam (CBT): 30% (no writing/speaking components)
  - Final exam: 30%
  - Speaking: 10%
  - Writing: 10%
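To make the weighting concrete, here is a minimal sketch of how an overall mark could be combined from these components. The component scores are hypothetical placeholders, and the simple weighted sum is an assumption; the source does not specify the exact aggregation rule.

```python
# Minimal sketch (assumption: marks are combined as a simple weighted sum).
# The component scores below are hypothetical, not taken from the study.
weights = {
    "project": 0.10,
    "process_writing": 0.10,
    "midterm_cbt": 0.30,
    "final_exam": 0.30,
    "speaking": 0.10,
    "writing": 0.10,
}

# Hypothetical component scores, each expressed as a percentage (0-100).
scores = {
    "project": 90,
    "process_writing": 85,
    "midterm_cbt": 78,
    "final_exam": 82,
    "speaking": 88,
    "writing": 95,
}

overall = sum(weights[c] * scores[c] for c in weights)
print(f"Overall mark: {overall:.1f}%")  # weights sum to 1.0, so the result stays on a 0-100 scale
```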

Literature Review

"Whenever you read about standardised tests, the tests are based upon the assumption that the distribution of scores (and test takers' ability) are normal." (Fulcher, 2010: 42)

Noticed Problem

• The majority of the students scored high in the standardised writing exam (a quick distribution check is sketched below).

[Figures: histograms of students' scores (x-axis) against the percentage of total students (y-axis).]
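Given the Fulcher (2010) quote above and the observation that most students scored near the top of the scale, the sketch below shows one way the normality assumption could be checked for a set of writing scores. The scores are hypothetical placeholders, not the study's data.

```python
# Minimal sketch (not the study's actual analysis): checking whether a set of
# writing scores looks normally distributed, as the standardised-testing
# assumption quoted from Fulcher (2010) would require.
import numpy as np
from scipy import stats

# Hypothetical writing marks out of 10, bunched near the top of the scale.
writing_scores = np.array([9.5, 10, 9, 8.5, 10, 9.5, 9, 10, 9.5, 8, 10, 9])

# Shapiro-Wilk test: a small p-value suggests the scores are not normally distributed.
stat, p_value = stats.shapiro(writing_scores)
print(f"Shapiro-Wilk W = {stat:.3f}, p = {p_value:.3f}")

# Skewness: a strongly negative value indicates scores piled up near the maximum.
print(f"skewness = {stats.skew(writing_scores):.3f}")
```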

Aim of the Study & Research Questions

Aim:

To investigate and evaluate the suitability of the exam for the multi-level students at the PYP.

Research Questions:

• Why have all the students passed the writing exam with very high scores when the purpose of the exam is to discriminate between the students?

• Was a standardised writing exam the best option for those students?

“In order to make a decision about which tests are the best measures, we need to compare the results of the test with an independent estimate of whatever the test is designed to measure” (Fulcher, 2010:33)

Research Methodology

1. Quantitative Data:

• PYP Medical/Healthcare Track students (N = 449) used 10 CEFR scales to self-assess their proficiency in writing.

• Their PYP tutors assessed the writing proficiency of all those students using the same scales.

• Correlation analysis between the students' writing exam scores and both the students' self-assessments and the teachers' evaluations (a minimal sketch follows below).

2. Qualitative Data:

• Focus group discussions with:
  - PYP students,
  - PYP tutors and coordinators,
  - Students and staff from different Medical/Healthcare colleges at the university.
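As a rough illustration of the quantitative step described above, the sketch below runs a Spearman rank correlation between writing exam scores and CEFR-coded ratings. The data and variable names are hypothetical placeholders; in the study this analysis was run separately for each level group and for each rating source.

```python
# Minimal sketch of a Spearman rank correlation between writing exam scores
# and CEFR-coded ratings. The data below are hypothetical placeholders, not
# the study's 449 participants.
import numpy as np
from scipy.stats import spearmanr

writing_scores = np.array([9.5, 8.0, 10.0, 9.5, 10.0, 9.0, 8.5, 9.5])  # exam marks out of 10
cefr_ratings   = np.array([3,   5,   4,    6,   7,    6,   4,   5])    # CEFR bands coded 0-9

rho, p = spearmanr(writing_scores, cefr_ratings)
print(f"N = {len(writing_scores)}, Spearman rho = {rho:.3f}, p = {p:.3f}")
# In the study, this was repeated per level (Elementary/Intermediate/Advanced)
# and per rating source (students' self-assessment vs. teachers' evaluation).
```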

Results

CEFR ratings by group (students' self-assessment vs. teachers' evaluation):

Group (rating source)            N     Mean   Std. Deviation   CEFR level
Elementary level (students)      73    4.48   1.58             B1 - B1+
Elementary level (teachers)      90    3.96   1.58             B1
Intermediate level (students)    269   4.92   1.53             B1+
Intermediate level (teachers)    249   5.22   1.65             B1+ / B2
Advanced level (students)        177   6.71   1.44             B2 / B2+
Advanced level (teachers)        190   6.65   1.59             B2 / B2+

[Figure: distribution of CEFR levels (A1-C2, grouped into Basic, Independent and Proficient users) against the percentage of students, comparing teachers' evaluations with students' self-assessments for the Elementary, Intermediate and Advanced groups.]

Spearman correlations between writing exam scores and CEFR ratings, by level and rating source:

Group (rating source)                            N     Mean writing score   Mean CEFR level   Spearman correlation   Sig. (2-tailed)
Elementary level (students' self-assessment)     72    9.47                 4.49              0.271                  0.032
Intermediate level (students' self-assessment)   269   9.79                 4.92              0.003                  0.966
Advanced level (students' self-assessment)       176   9.93                 6.73              0.251                  0.000
Elementary level (teachers' evaluation)          63    9.35                 4.02              0.204                  0.085
Intermediate level (teachers' evaluation)        218   9.79                 5.18              0.043                  0.487
Advanced level (teachers' evaluation)            189   9.92                 6.66              0.161                  0.033

Focus Group Results: The Standardised Test

PYP teachers and coordinators (Pros):

• "No need for the students to place themselves in lower levels to gain higher marks."
• "Easier to administer and score."

PYP students (Pros):

• "The test was very easy."
• "We can easily get high marks."
• "There is no need to study or worry about the exam."

Focus Group Results: The Standardised Test (continued)

PYP teachers and coordinators (Cons):

• "I don't think the students are challenged enough."
• "We have one exam for all students which I don't think it's appropriate."
• "It's OK to have final standardised exam but the mid-term should be, at least, level-based and track-based exam."
• "Ever since this whole thing of standardisation across whole levels everything is just going down hell."
• The test is very easy (for all levels).
• The word limit is a disadvantage, especially for the advanced level.

College staff (Cons):

• "Students join the college with similar very high GPA, but there is a very noticed differences in their levels."
• "Most of the students' writing is weak and is not up to the expected level."

Focus Group Results: The Standardised Test (continued)

College students:

• "We did not take the exam seriously, which affected our level of learning."
• "The exam, sorry to say that, it was nonsense."
• "It would help us more if the exam was related to our level so we can take it more seriously."
• "We were happy that the test was very easy but there was no enough improvement in our levels."
• "Because we knew the test is very easy, we did not take the subject (English) seriously and now we are struggling because of that."

Recommendations

- A level-based exam would be more challenging for the students, which would motivate them to work harder to improve their writing.

- Presumably, students would take level-based exams more seriously.

- Student assessment should be based on a needs analysis of the students' actual needs (Tsai & Tsou, 2009).

- It is very important to evaluate the impact of the assessment regime on students (Gipps, 2002) and on students' outcomes.

References

Council of Europe (2001). Common European Framework of Reference for Languages: Learning, Teaching, Assessment. Cambridge: Cambridge University Press.

Fulcher, G. (2010). Practical Language Testing. London: Hodder Education.

Gipps, C. (2002). Socio-cultural perspectives on assessment. In G. Wells & G. Claxton (eds.), Learning for Life in the 21st Century, 73-88. Oxford: Blackwell.

MacDonald, J. (2004). Developing competent e-learners: The role of assessment. Assessment and Evaluation in Higher Education, 29(2), 215-226.

Tsai, Y., & Tsou, C. H. (2009). A standardised English language proficiency test as the graduation benchmark: Student perspectives on its application in higher education. Assessment in Education: Principles, Policy & Practice, 16(3), 319-330.

Thank you