2013 Monitoring of Learning Achievement
2013 MONITORING OF LEARNING ACHIEVEMENT
STUDENT PERFORMANCE REPORT
Abridged Version
JUNE, 2014 EDUCATION DEPARTMENT
HQ, AMMAN
© United Nations Relief and Works Agency for Palestine Refugees in the Near East, 2013.
All rights reserved. The contents of this publication shall not be quoted or reproduced or stored in a retrieval system,
or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the
prior written permission of UNRWA. For enquiries about use or reproduction of the text or other parts of this
publication, please contact UNRWA Department of Education [email protected]. Where use or reproduction
of this material is authorized by UNRWA, credit shall be given to UNRWA and a copy of the publication containing the
reproduced materials will be sent to UNRWA Department of Education, [email protected]. For copying or
reproduction for commercial purposes, a fee may be charged by UNRWA.
CONTENTS
Acronyms .................................................................................................................................. 4
Acknowledgements .................................................................................................................. 5
Foreword for abridged version ................................................................................................ 6
1. Introduction ........................................................................................................................ 7
1.1 Report Outline.............................................................................................................. 7
1.2 UNRWA Education: Overview ...................................................................................... 7
1.3 Education Reform Strategy (ERS)................................................................................. 8
1.4 UNRWA-Wide Assessment System ............................................................................ 10
1.5 Monitoring of Learning Achievement (MLA) ............................................................. 10
2. Methodology...................................................................................................................... 12
2.1 Development of Instruments ..................................................................................... 12
2.2 Developing Questionnaires........................................................................................ 18
2.3 Reliability and Validity of Instruments ....................................................................... 19
2.4 Data Processes ........................................................................................................... 21
2.5 Data Analysis and Reporting ...................................................................................... 23
2.6 Limitations of the MLA Survey ................................................................................... 30
3. Student Performance ........................................................................................................ 31
3.1. Grade 4 Arabic............................................................................................................ 31
3.2. Grade 4 Mathematics ................................................................................................ 42
3.3. Grade 8 Arabic............................................................................................................ 52
3.4 Grade 8 Mathematics ................................................................................................ 63
4. Conclusion and Recommendations................................................................................... 74
4.1 Methodology.............................................................................................................. 74
4.2 MLA Results................................................................................................................ 75
4.3 Implications of MLA Findings ..................................................................................... 77
4.4 Dissemination and Reporting..................................................................................... 77
4.5 Recommendations for Future MLA Studies............................................................... 77
References .............................................................................................................................. 79
Annex ...................................................................................................................................... 80
ACRONYMS
CF Curriculum Framework
EMIS Education Management Information System
ERS Education Reform Strategy
HOTS Higher Order Thinking Skills
HRCRT Human Rights, Conflict Resolution and Tolerance
ICT Information and Communication Technology
IE Inclusive Education
MLA Monitoring of Learning Achievement
MTS Medium Term Strategy
OD Organisational Development
PCSU Project Coordination and Support Unit
PLD Performance Level Descriptors
RBM Results Based Management
SBTD School-Based Teacher Development
TDSE Teacher Development and School Empowerment
TVET Technical and Vocational Education and Training
UNRWA United Nations Relief and Works Agency for Palestine Refugees in the Near East
VTC Vocational Training Centre
ACKNOWLEDGEMENTS
I would like to acknowledge the support and technical guidance of Professor Anil Kanjee in the
design, development and analysis of the MLA and the writing of this report. I would also like to
acknowledge the hard work of the Education Department Curriculum Unit and colleagues from
the DE office who steered and managed this whole process. The Agency-wide Technical
Committee and the Steering Committee, with representatives from all Fields, both played key
roles. The Technical Committee worked on the design of the question items, oversaw the
implementation of the MLA in their Fields, and supported the analysis process at its initial and
final stages. The Steering Committee, comprising Chiefs and some HQ Education staff, ensured
the strategic direction of the MLA and discussed its findings and their implications.
I would also like to acknowledge the work of those who administered the tests and those who
input the data, and of course the teachers and Head Teachers. We particularly thank the
students who undertook the tests and responded to the questionnaires, for their engagement
and support in this exercise and for the contribution it makes to UNRWA’s ongoing drive for
quality education for all.
FOREWORD FOR ABRIDGED VERSION
This abridged, interim report provides information on the performance of 57,626 UNRWA
students in the Agency-wide Monitoring of Learning Achievement (MLA) tests for Grades 4 and 8
in Arabic and Mathematics in 2013. The complete report will also include analysis of the
questionnaires for students, teachers and School Principals, and school-specific data.
Reflecting the systemic change envisaged through the Education Reform Strategy,
comprehensive and comparable data on the achievement levels of students and the quality of
education is needed. Here the Monitoring of Learning Achievement (MLA) survey, as a key
strand in the UNRWA systematic and holistic assessment approach (Output 2), should provide
meaningful information to the Agency, Fields and Schools. In this regard, and following the 2009
MLA, it was agreed in the Technical and Steering Committee Forums to broaden the scope of
the MLA: not only to include students’ academic achievements with regard to mean scores, but
also to look at performance in cognitive domains, higher order thinking skills and the overall
equity of student learning outcomes. Through the questionnaires, the MLA would also
look at classroom practices, the school environment and Reform progress in general.
Data would also be available for individual schools so that they can place themselves against
the Field and Agency results to analyse their strengths and needs.
The results of the MLA survey reveal a number of important trends with respect to: gender;
equity in achievement, with marked disparity and too many children performing below the level
expected for their grade in each subject; differences within and between Fields; and the
performance of students across the different content domains and cognitive levels in Arabic
and Mathematics.
The findings of the MLA survey have a number of implications for the development of relevant
intervention strategies to improve learning and teaching in UNRWA schools. Central to this is
the continuation of the current Education Reform process, and the importance of integrating
the proposed interventions into its systems and programmes. The specific factors that affect
learning and teaching in schools across the different Fields, as well as the capacity and skills
needed to deliver the interventions effectively, must also be addressed in order to enhance
teaching and learning in all UNRWA schools.
The full report will be finalised by August 2014.
Dr Caroline Pontefract
Director of Education
June 2014
1. INTRODUCTION
1.1 REPORT OUTLINE
This abridged report provides information on the performance of students in the Agency-wide
Monitoring of Learning Achievement (MLA) tests. The report begins with an overview of the
UNRWA Education Sector and the purpose of the MLA survey. This is followed by a detailed
explanation of the methodology employed in the collection, analysis and reporting of the data.
The third section outlines student performance results for Grades 4 and 8 in Arabic and
Mathematics, and the final chapter provides a summary of the results and recommendations
for the 2015 MLA survey. The complete report will also include analysis of the questionnaires
for students, teachers and School Principals, and school-specific data.
1.2 UNRWA EDUCATION: OVERVIEW
The United Nations Relief and Works Agency for Palestine Refugees in the Near East (UNRWA)
is mandated with providing assistance and protection to Palestine refugees pending a just
solution to their plight. The Agency provides education, comprehensive primary health care,
emergency relief, social interventions, microfinance, housing, and infrastructural support
services to Palestine refugees in its five Fields of operation: the Gaza Strip, the West Bank,
Jordan, Lebanon and the Syrian Arab Republic. The Education Programme is the largest UNRWA
programme, operating 700 schools for approximately 500,000 Palestine refugee students, nine
Vocational Training Centres, two pre-service teacher training institutions, and in-service teacher
education through Education Development Centres in four Fields. UNRWA employs over 22,700
teaching staff, and the Education Programme accounts for more than half of the Agency’s core
operational budget.
Since 2006, the Agency has been undergoing an Agency-wide Organizational Development (OD)
process to strengthen the Agency’s capacity to serve Palestine refugees more effectively. The
OD introduced the SPARE paradigm – Strategy, Policy, Accountability, Results and Envelopes (of
resources) – and decentralization, and it sought to clarify the roles and responsibilities between
HQ and Field offices within this paradigm.1 Here HQ is to provide strategic direction and policy
frameworks, while Field managers and staff are empowered in programme implementation. In
light of the OD, and the perceived need to strengthen the Education Programme based on the
academic results of the 2009 MLA and the new demands of the 21st Century, an evaluation of
the UNRWA Education Programme was undertaken by an external consulting firm, Universalia,
in 2010. This provided an objective external assessment of the ‘constant factors in schooling’
that impact on the education of Palestinian refugee children in UNRWA schools.
The Universalia Review noted that the UNRWA education system emphasised rote-learning and
memorisation, rather than the acquisition of skills and the development of understanding in
terms of higher order cognitive abilities and their application.2 In this respect, the Review
1 UNRWA, 2006.
2 Universalia, 2010.
highlighted that ‘the pre-eminent concern of UNRWA and many of its Palestine refugee
stakeholders is academic achievement, narrowly defined in terms of results on high-stakes
examinations, and bench-marked against available host government pass rates on state
examinations.’ It observed the high value that many Palestinians place on these more traditional
measures of academic achievement, and the impact this has on the provision of quality
education, referring to international research which emphasises that education needs to focus
on different dimensions of student achievement.
The Universalia Review also observed that the provision of pre-service training and professional
development of teachers in UNRWA was often fragmented and ad-hoc, and was based on an
‘obsolete’3 training model which did not provide sufficient development and support, locking
teachers into a teaching style which did not reflect a holistic view of children’s active learning
nor did it encourage inclusive education approaches whereby the educational needs of all
children are identified and addressed. The Review suggested that the enrichment and
adaptation of the curriculum would contribute to fostering higher order thinking skills.
In order to address these issues, amongst others, the Review highlighted the need for HQ to
provide strategic direction to ensure that the UNRWA Education Programme is responsive to
the needs of Palestine refugees in the 21st century and does more with the resources at hand.
1.3 EDUCATION REFORM STRATEGY (ERS)
Guided by the Universalia review and the need to provide quality education for all UNRWA
students to achieve their full potential, the Education Reform Strategy (ERS) (2011-2015) and its
corresponding Implementation Plan, were developed. The ERS seeks to bring about systemic
transformation and as such addresses education related dimensions holistically and
systemically. It uses an integrated approach to address different dimensions of quality
education and identifies eight Reform areas, four substantive programme areas, and four
support areas. Substantive programmes areas are: Teacher Development and School
Empowerment (TDSE); Inclusive Education; Curriculum and Student Assessment; and Technical
and Vocational Education and Training (TVET). Support areas are: Research, Development and
EMIS; Governance; Strategic Planning, Management and Projects; and Partnerships,
Communication and ICTs (Figure 1.1).
3 Universalia, 2010.
Figure 1.1: Overview of the UNRWA Education Reform Strategy
Within these dimensions, the Education Reform Strategy emphasises the need for development
at three levels: policy, strategy and individual capacity. The ERS also highlights the unique
context within which learning and teaching are undertaken in the UNRWA system, with UNRWA
schools in five Fields adhering to the policies and curricula of the different Host countries. Given
this context, the ERS stresses collaboration between Field offices and Headquarters, with either
able to take a ‘substantive’ lead role, and emphasises partnerships with Host Governments and
other stakeholders.
The Eight Key Outcomes Specified in the ERS

Outcome 1: Professional, qualified and motivated teacher force and empowered schools in place
Outcome 2: Equal access to quality education for all children regardless of gender, abilities, disabilities, impairments, health conditions and socio-economic status assured
Outcome 3: Relevant and quality Technical and Vocational Education and Training structure/programmes in place
Outcome 4: Curricula that support holistic approaches to learning and personal development strengthened
  Output 1: Policy and standards for curricula in place
  Output 2: A systematic and holistic assessment approach in place
  Output 3: A culture of human rights, conflict resolution and tolerance in place
  Output 4: Students’ life skills enhanced
Outcome 5: Evidence-based policy making and informed decision-making at all levels in place
Outcome 6: Effective educational governance system at all levels in place
Outcome 7: Education programme planning and management strengthened
Outcome 8: Partnerships, communication and use of education ICTs strengthened
1.4 UNRWA-WIDE ASSESSMENT SYSTEM
The Universalia Review noted that UNRWA was constrained by an emphasis on traditional
‘high-stakes’ examinations and the adherence to Host country curricula which focus on factual
content and memorisation. As such UNRWA’s examination system reflects the examination
cultures of the host countries and the MENA region, which, the Review stated, was not best
serving UNRWA students. The Review advocated for increased focus on literacy and numeracy
skills, particularly in the early grades with an overall emphasis on child-centred, activity-based
learning at all grades across all subjects.
The 2012 ‘Review of Assessment Policy and Practice in the UNRWA Education System’ observed
that the assessment system within the UNRWA education sector is dominated by the use of
testing and examinations. Little evidence was found of the use of assessment for improving
learning, and the Review noted that this limited use of formative assessment within the
classroom was due to the dominance of examinations and tests, as well as to the capacity and
experience of teachers, Education Specialists and Education Development Centre staff.
In response to these specific assessment related findings, the Education Reform Strategy
identified the development of an UNRWA-wide assessment system as one of the key areas;
specifically, Outcome 4, Output 2 specifies that ‘a systematic and holistic assessment approach’
should be in place.4 This Output covers both the administering of a range of tests (for example
MLA, TIMSS, PISA) in the core subjects but also ongoing formative assessment for learning.
1.5 MONITORING OF LEARNING ACHIEVEMENT (MLA)
Reflecting the systemic change envisaged through the ERS, comprehensive and comparable
data on the achievement levels of students and the quality of education is needed. Here the
Monitoring of Learning Achievement (MLA) survey, as a key strand in the UNRWA systematic
and holistic assessment approach, (Output 2 of the Curriculum Reform area) should provide
meaningful information to the Agency, Fields and Schools. Following the 2009 MLA, it was
agreed in the Technical and Steering Committee Forums to broaden the scope of the MLA: not
only to include students’ academic achievements with regard to mean scores, but also to look at
performance in cognitive domains, higher order thinking skills and the overall equity of student
learning outcomes. Through the questionnaires, the MLA would also look at classroom
practices, school environments and Reform progress in general. Thus, the
assessment will facilitate measuring social, physical, creative, emotional and intellectual
dimensions of educational outcomes in an inclusive way, reflecting the overall vision of UNRWA
Education Reform.
4 UNRWA, 2010.
The implementation of the new MLA is linked to the start of the implementation of the
Education Reform, as the 2012-2013 test will serve as a baseline for the Reform. A second MLA
survey is scheduled for 2015 in order to monitor progress and provide feedback on the Reform
process.
In doing so, the MLA is expected to become a comprehensive assessment umbrella for the
whole Education Programme and its Reform. In light of this, the primary purpose of the MLA is
to provide information that will:5
i) serve as a baseline on learners’ achievement;
ii) enable the identification of needs and trends, and comparisons within Fields, between Fields and globally;
iii) serve as a baseline for the Reform in terms of student performance and practices in schools;
iv) facilitate target testing;
v) enable reporting, both internally and externally.
5 Specified in October 2013 ToR.
2. METHODOLOGY
This section outlines the methodology followed in the development of instruments for the
2012-13 MLA survey, the selection of the sample, the implementation of the study, the
collection and cleaning of data, data analysis and the reporting of findings.
2.1 DEVELOPMENT OF INSTRUMENTS
A range of factors was taken into consideration while designing the UNRWA MLA survey. These
include the following:
i) International best practices in test development, administration and analysis;
ii) Differences between host country curricula used in UNRWA schools;
iii) The cost and feasibility of implementing a large-scale achievement survey, taking into
account the different conditions under which UNRWA operates; and
iv) The need for data generated by the MLA study to lead to analysis which will support
decision-making as well as the identification of strategies which will help improve
student achievement levels.
The quality of survey instruments or items ultimately determines the overall quality of the
study and the usefulness of the achievement information that it provides. For this reason a
great deal of effort was put into the development of test items to ensure that they:
i) Are appropriate for the student levels undertaking the MLA;
ii) Facilitate an analysis of the extent to which students have acquired the knowledge and
concepts required for mastery of the Grade 4 and 8 curriculum;
iii) Take into account differences in the curricula used in UNRWA Fields of operation; and
iv) Relate to international standards and expectations for student achievement in the 4th
and 8th Grades.
a) IDENTIFICATION OF SKILLS AND KNOWLEDGE TO BE INCLUDED IN THE SURVEY
Before the drafting of survey questions began, Curriculum Advisors at HQ Amman, in co-
operation with the Fields’ Education Specialists (ES) of Arabic and Mathematics, conducted a
detailed analysis of the curricula in use in all UNRWA Fields of operation. More specifically, the
ES analysed 4th and 8th Grade Arabic and Mathematics textbooks and identified the major
content and cognitive domains covering the various topics and skills that are taught. Following
this, the Fields were asked to identify their subjects and skills under these domains. Each Field
then sent its analysis back to the Advisors at HQ (A), who unified these, resulting in the
identification of the domains and skills common to all Fields. Only these common elements
were selected to be tested in the MLA.
The identified common domains/topics were cross-referenced with international standards in
terms of mathematical knowledge and literacy levels in order to ensure that the UNRWA
testing framework was aligned with internationally shared expectations of what students
should know after four and eight years of formal schooling. Content standards, which outlined
the knowledge and skills on which the tests would be based, were produced and circulated to
the Fields. The test items in both Arabic and Mathematics were based on these standards (see
Annex, Tables A, B, C and D for more details).
The process of identifying the range of skills and content that could be included in tests laid the
foundation for determining the relative weighting to be given to each content domain or area
in the tests – as the tests needed to reflect the relative weight given to different topics in
national curricula.
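As an illustration of this weighting step, the sketch below allocates a fixed number of test items across content domains in proportion to their curricular weights. The allocation rule (largest-remainder rounding) and the function name are our assumptions for illustration only; the actual MLA allocations also reflected expert judgement, so the published counts need not match this rule exactly. The domain names and weights are those used for the Grade 4 Mathematics test.

```python
# Hypothetical sketch: splitting a test's item budget across content domains
# by curricular weight. This is an illustrative rounding rule, not the
# procedure the MLA developers actually used.

def allocate_items(weights, total_items):
    """Split total_items across domains in proportion to their weights,
    using largest-remainder rounding so the counts sum exactly."""
    raw = {d: w * total_items for d, w in weights.items()}
    counts = {d: int(r) for d, r in raw.items()}
    shortfall = total_items - sum(counts.values())
    # Give the leftover items to the domains with the largest remainders.
    for d in sorted(raw, key=lambda d: raw[d] - counts[d], reverse=True)[:shortfall]:
        counts[d] += 1
    return counts

grade4_weights = {
    "Whole Numbers, Operations and Fractions": 0.50,
    "Geometry and Measurements": 0.30,
    "Algebra and Patterns": 0.10,
    "Data Representation, Statistics and Probability": 0.10,
}
print(allocate_items(grade4_weights, 45))
```

Any comparable rounding scheme would serve; the point is only that domain weights translate directly into item counts on a fixed-length test.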
b) TEST STRUCTURE
It was important to ensure that the MLA survey was in line with international best practices in
conducting achievement surveys, in the content and structure of the tests and in the manner in
which they are conducted. The ability to manage one’s time in an examination, respond to a
variety of question formats and be at ease while responding appropriately in formal test
settings are all important skills that students should acquire. To address these issues, the test
developers varied the types of questions they formulated, using objective questions (such as
true/false and matching items), open-ended questions that require specific answers, and Higher
Order Thinking Skills (HOTS) questions. The HOTS items focussed on students’ abilities to think
critically and creatively, involving analysis, synthesis and evaluation skills.
Additional HOTS questions were also included in the test for pilot purposes, as the current
curriculum does not focus on this level of cognition. The intended use of these items is to
obtain a baseline on student performance at this cognitive level which can then be used in
improving teaching practices and learning regarding the development of higher-order thinking
and problem solving skills. The items for Arabic were prepared and reviewed by external
consultants, while for Mathematics they were prepared by HQ (A) and Field teams. For each
test, a total of five pilot HOTS items were included.
In addition to identifying the content and skills to be assessed, consideration was given to the
number and types of items to be included in the tests, as well as the duration of each test (in
minutes). Tables of specifications for both the Arabic and Mathematics tests for the 4th and 8th
Grades were prepared, taking into account the cognitive domains, the content topics, the
number and types of questions, and the number of ‘anchor items’ carried over from the 2009
MLA tests.
Taking the issues outlined above into consideration, various workshops were held, with the
participation of the Technical Committees from the Fields, to construct the Tables of
Specifications for both the Mathematics and Arabic tests for the 4th and 8th Grades and to
develop the test items. In the initial stages, three forms of around 80 test items each were
produced for both subjects. These forms were modified after the pre-testing and piloting
phases.
The content domains and cognitive levels that defined the items included in the tests are noted
in the Tables of Specifications for the Mathematics and Arabic tests for the 4th and 8th Grades
below. The ‘M’ symbol refers to multiple choice questions, while the ‘F’ symbol refers to free
response questions.
Table of Specification for the 4th Grade Mathematics Test (Forms A & B)

Counts are given per cognitive behaviour in the order: Knowledge (30%), Application (50%), Reasoning (10%), HOTS (10%).

Whole Numbers, Operations and Fractions (50%)          M*: 6, 6, 0, 0    F*: 1, 5, 2, 2    Total items: 22
Geometry and Measurements (30%)                        M: 4, 5, 2, 0     F: 0, 2, 0, 1     Total items: 14
Algebra and Patterns (10%)                             M: 1, 1, 1, 0     F: 0, 1, 0, 1     Total items: 5
Data Representation, Statistics and Probability (10%)  M: 0, 0, 0, 1     F: 1, 2, 0, 0     Total items: 4
Total Number of Items (100%)                           M: 11, 12, 3, 1   F: 2, 10, 2, 4    Total items: 45

* M = Multiple Choice Questions; F = Free Response Questions
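The internal consistency of a table of specifications can be checked mechanically: within each content domain, the multiple choice (M) and free response (F) counts across the four cognitive levels should add up to the stated item total. The snippet below runs that check on the Grade 4 Mathematics figures above; the data structure itself is ours, not part of the MLA instruments.

```python
# Consistency check on the Grade 4 Mathematics table of specifications.
# Each entry holds (multiple-choice counts, free-response counts, stated total),
# with counts ordered Knowledge, Application, Reasoning, HOTS.

table = {
    "Whole Numbers, Operations and Fractions": ([6, 6, 0, 0], [1, 5, 2, 2], 22),
    "Geometry and Measurements": ([4, 5, 2, 0], [0, 2, 0, 1], 14),
    "Algebra and Patterns": ([1, 1, 1, 0], [0, 1, 0, 1], 5),
    "Data Representation, Statistics and Probability": ([0, 0, 0, 1], [1, 2, 0, 0], 4),
}

for domain, (m, f, total) in table.items():
    # Each row's M and F counts must add up to its stated domain total.
    assert sum(m) + sum(f) == total, domain

grand_total = sum(t for _, _, t in table.values())
print(grand_total)  # the four domain totals sum to the stated 45 items
```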
Table of Specification for the 8th Grade Mathematics Test (Form A)

Counts are given per cognitive behaviour in the order: Knowledge (30%), Application (50%), Reasoning (10%), HOTS (10%).

Whole Numbers, Operations and Fractions (30%)          M: 4, 7, 1, 0     F: 0, 0, 1, 2    Total items: 15
Geometry and Measurements (30%)                        M: 5, 6, 0, 0     F: 0, 1, 2, 1    Total items: 15
Algebra and Patterns (20%)                             M: 3, 3, 1, 0     F: 0, 1, 1, 1    Total items: 10
Data Representation, Statistics and Probability (12%)  M: 1, 0, 1, 0     F: 1, 2, 0, 1    Total items: 6
Proportionality (8%)                                   M: 1, 2, 0, 0     F: 0, 0, 0, 1    Total items: 4
Total Number of Items (100%)                           M: 14, 18, 3, 0   F: 1, 4, 4, 6    Total items: 50
Table of Specification for the 8th Grade Mathematics Test (Form B)

Counts are given per cognitive behaviour in the order: Knowledge (30%), Application (50%), Reasoning (10%), HOTS (10%).

Whole Numbers, Operations and Fractions (30%)          M: 4, 7, 1, 0     F: 0, 0, 1, 2    Total items: 15
Geometry and Measurements (30%)                        M: 5, 6, 0, 0     F: 0, 1, 2, 1    Total items: 15
Algebra and Patterns (20%)                             M: 3, 3, 1, 0     F: 0, 1, 1, 1    Total items: 10
Data Representation, Statistics and Probability (12%)  M: 1, 1, 0, 1     F: 1, 1, 1, 0    Total items: 6
Proportionality (8%)                                   M: 1, 2, 0, 0     F: 0, 0, 0, 1    Total items: 4
Total Number of Items (100%)                           M: 14, 18, 3, 1   F: 1, 4, 4, 5    Total items: 50
Table of Specification for Arabic Language for the 4th Grade

Counts are given per cognitive level in the order: Recognition, Understanding and Application, HOTs, Extra HOTs.

Reading Comprehension     3, 6, 2, 4            Total: 15 (30%)
Language and Structures   5, 9, 1, 0            Total: 15 (30%)
Dictation                 4, 8, 1, 0            Total: 13 (26%)
Composition               0, 1, 5, 1            Total: 7 (14%)
Total                     12, 24, 9, 5          Total: 50 (100%)
Percentages               24%, 48%, 18%, 10%    100%
Table of Specification for Arabic Language for the 8th Grade

Counts are given per cognitive level in the order: Recognition, Understanding and Application, HOTs, Extra HOTs.

Reading Comprehension     3, 9, 3, 3           Total: 18 (30%)
Language and Structures   4, 17, 3, 0          Total: 24 (40%)
Dictation                 2, 10, 0, 0          Total: 12 (20%)
Composition               0, 2, 2, 2           Total: 6 (10%)
Total                     9, 38, 8, 5          Total: 60 (100%)
Percentages               15%, 63%, 13%, 9%    100%
In the final versions of the tests, two forms for the Arabic and Mathematics tests for each
Grade were prepared: Form A and Form B. These four forms strive to include as many of the
common skills in Arabic and Mathematics as possible.
c) COMMON ITEMS FOR EQUATING
To enable the comparison of results between the 2013 and 2009 MLA surveys, a number of
items from the 2009 tests were included in the 2013 tests. These items, referred to as ‘anchor
items’, were selected from the 2009 MLA tests in both Mathematics and Arabic language for
the 4th and 8th Grades, on the basis of considerations relating to the subject matter.
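One standard way anchor items are used, sketched here purely for illustration, is linear (mean-sigma) equating: scores on the new test are mapped onto the old test's scale so that the anchor items' mean and spread match across the two administrations. The report does not state which equating method was actually applied, and the score figures below are invented.

```python
# Hedged sketch of linear (mean-sigma) equating via anchor items.
# The anchor-item scores are invented for illustration; they are NOT
# data from the 2009 or 2013 MLA.
from statistics import mean, stdev

def mean_sigma_equate(new_anchor, old_anchor):
    """Return a function mapping scores from the new test's scale onto the
    old test's scale, matching the anchor items' mean and standard deviation."""
    slope = stdev(old_anchor) / stdev(new_anchor)
    intercept = mean(old_anchor) - slope * mean(new_anchor)
    return lambda score: slope * score + intercept

# Illustrative anchor-item difficulties (proportion correct) in each cycle:
anchor_2013 = [0.42, 0.55, 0.61, 0.38, 0.70]
anchor_2009 = [0.45, 0.58, 0.66, 0.40, 0.76]

to_2009_scale = mean_sigma_equate(anchor_2013, anchor_2009)
print(round(to_2009_scale(0.50), 3))
```

Once such a mapping is fitted on the anchor items, it can be applied to all 2013 scores, making 2013-vs-2009 comparisons meaningful despite the two tests not being identical.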
For Mathematics, the weight of the mathematical content was taken into consideration in each
Grade: for example, in the 4th Grade, Numbers and Operations, Geometry, and Statistics were
considered, along with the cognitive level in each domain. Similarly, for the 8th Grade,
mathematical content such as Numbers and Operations, Geometry, Algebra and Patterns,
Statistics, Probability and Proportionality was taken into consideration, together with the
cognitive level in each domain. It is worth noting that the focus in both the 4th and 8th Grades
was on the application cognitive level, as this is the core skill in Mathematics.
With regard to Arabic language, the sub-domains in both the 4th and 8th Grades, such as
Reading Comprehension, Grammar, Dictation and Composition, were taken into consideration.
The cognitive domain was also considered, and the distribution of the questions varied
accordingly. Moreover, since comprehension questions relate to particular texts and cannot be
separated from them, these questions were carried over together with their texts. The tables
below show the common items in Arabic and Mathematics for the 4th and 8th Grades.
Common Items for Grade 8 – Forms A & B, Arabic Language

Subject   Item no.   Form   Cognitive level            Item content
Arabic    1          A      Understanding & Applying   Reading
Arabic    2          A      Knowing                    Reading
Arabic    3          A      HOTs                       Reading
Arabic    4          A      Knowing                    Reading
Arabic    5          A      Knowing                    Reading
Arabic    6          A      Understanding & Applying   Reading
Arabic    7          A      Understanding & Applying   Reading
Arabic    1          B      Knowing                    Reading
Arabic    2          B      Understanding & Applying   Reading
Arabic    3          B      Knowing                    Reading
Arabic    4          B      Understanding & Applying   Reading
Arabic    5          B      HOTs                       Reading
Arabic    6          B      Knowing                    Reading
Arabic    7          B      HOTs                       Reading
Common Items for Grade 4 – Forms A & B, Arabic Language

Subject   Item no. 2013   Form   Cognitive level            Item content
Arabic    22              A      Understanding & Applying   Grammar
Arabic    23              A      Understanding & Applying   Grammar
Arabic    24              A      Understanding & Applying   Grammar
Arabic    25              A      Knowing                    Grammar
Arabic    30              A      Understanding & Applying   Grammar
Arabic    39              A      Understanding & Applying   Dictation
Arabic    40              A      Knowing                    Dictation
Arabic    23              B      Understanding & Applying   Grammar
Arabic    24              B      Understanding & Applying   Grammar
Arabic    25              B      Understanding & Applying   Grammar
Arabic    30              B      Understanding & Applying   Grammar
Arabic    42              B      Understanding & Applying   Dictation
Arabic    43              B      Knowing                    Dictation
Common Items for Grade 8 – Forms A & B Mathematics

Subject   Item no. (2013)   Form   Cognitive level   Item content
Math      2                 A      Knowing           Numbers & Operations
Math      8                 A      Applying          Numbers & Operations
Math      25                A      Applying          Geometry
Math      34                A      Applying          Algebra & Patterns
Math      43                A      Applying          Statistics & Data Representation
Math      45                A      Reasoning         Statistics & Data Representation
Math      47                A      Knowing           Proportionality
Math      6                 B      Applying          Numbers & Operations
Math      8                 B      Applying          Numbers & Operations
Math      15                B      Extra HOTs        Numbers & Operations
Math      29                B      Reasoning         Geometry
Math      37                B      Applying          Algebra & Patterns
Math      39                B      Reasoning         Algebra & Patterns
Math      41                B      Knowing           Statistics & Data Representation
Common Items for Grade 4 – Forms A & B Mathematics

Subject   Item no. (2013)   Form   Cognitive level   Item content
Math      2                 A      Knowing           Numbers & Operations
Math      10                A      Applying          Numbers & Operations
Math      14                A      Applying          Numbers & Operations
Math      31                A      Applying          Geometry
Math      33                A      Applying          Geometry
Math      35                A      Reasoning         Geometry
Math      44                A      Applying          Statistics & Data Representation
Math      10                B      Applying          Numbers & Operations
Math      14                B      Applying          Numbers & Operations
Math      15                B      Applying          Numbers & Operations
Math      19                B      Reasoning         Numbers & Operations
Math      33                B      Applying          Geometry
Math      42                B      Knowing           Statistics & Data Representation
Math      44                B      Applying          Statistics & Data Representation

2.2 DEVELOPING QUESTIONNAIRES
The quality of education provided to students is influenced by a range of factors and cannot be
measured by test scores alone. In assessing systemic efficacy, contextual data can provide
information about attitudes to learning, and a range of factors – including practices – that may
influence learning and achievement.
As part of the MLA study, three different questionnaires were prepared by the UNRWA
Department of Education to collect data from students, teachers and School Principals
concerning a range of factors that international research has shown to influence levels of
student achievement. This information regarding the teaching and learning context in UNRWA
schools should enable the Agency to better understand and explain the level of achievement of
the students in its schools and help provide more effective support to teachers and students.
a) STUDENT QUESTIONNAIRES
Each student in Grades 4 and 8 who undertook the Arabic and Mathematics MLA tests also
completed a questionnaire for each subject. The questionnaire was completed after the tests,
at the start of the second testing session, under the oversight of the test administrator.
Students kept their student numbers with them and recorded the same number on the
questionnaires for both the Mathematics and Arabic tests.
b) TEACHER QUESTIONNAIRES
The teachers who taught Mathematics and Arabic to the students being tested also completed
questionnaires. The teachers were assured that the questionnaires were confidential and would
be used for research purposes only. The information in these surveys helped in understanding
the factors that influence (or explain) levels of student achievement in the respective subjects.
c) SCHOOL PRINCIPAL QUESTIONNAIRES
The School Principals of the schools where the tests were implemented completed the School
Principal questionnaire. The focus of the questionnaire was on the teaching and learning of
Mathematics and Arabic and the overall school climate. Questionnaires provided information
about the school premises, the school climate, the relations between the School Principal and
their teachers and students, teachers’ abilities, teaching strategies, and students’ attitudes
towards the school, teachers, Arabic and Mathematics.
2.3 RELIABILITY AND VALIDITY OF THE INSTRUMENTS
a) PRETESTING OF TEST ITEMS
In developing large-scale achievement surveys, it is important to ensure that the test
instruments are compatible with the time allotted for the tests, and that the layout of the
tests, tables and diagrams, and the wording, are clear.
To test the reliability and validity of the test instruments they were pre-tested for two days in
two boys’ and two girls’ schools in Jordan Field. Three test forms for Arabic and Mathematics
were pre-tested in two 4th Grade class sections and two 8th Grade class sections in the boys’
and the girls’ schools. The student, the teacher, and the School Principals’ questionnaires were
all pre-tested as well. All pre-tested instruments were analysed, and instruments were modified
accordingly. In particular, it was found that the time allotted for these tests should be
increased, thus an additional 15 minutes was added to both the Arabic and Mathematics
sessions.
b) PILOTING OF TEST ITEMS
The test instruments were modified in collaboration with the Technical Committee from the
Fields in a workshop held for this purpose; modifications reflected findings which emerged
through the pre-testing process. Following this, three times as many test items as were
required were prepared and piloted to ensure that the final selection of test items could draw
from a large pool of relevant test items and that these test items were of the appropriate level
(not too easy or too difficult). In 2012, these items were piloted in four schools in each of the
four Fields: Gaza, Jordan, Lebanon and West Bank.
The final selection of test items for the 2013 MLA test was based on the results of the pilot
study and also accounted for feedback provided by subject specialists from the Fields. Several
modifications were made to the test items, particularly for Arabic (both Grades 4 and 8) and
some for Mathematics. Test items that were very difficult or very easy (according to the item
difficulty and discrimination indices) were replaced by more appropriate items in response to
the piloting analysis. Additional HOTS items in Arabic were reviewed and modified by an
external consultant. The final selection of items was implemented in the four Fields of
operations – Syria Field could not participate due to the ongoing conflict.
c) TEST ADMINISTRATION
As in the 2009 MLA test, the test administration procedures were closely modelled on those
used in the TIMSS study. In conducting large-scale achievement surveys, it is important to
ensure that the conditions under which tests are administered in different contexts are as
similar as possible, since differences in test conditions could affect the validity of scores
attained by students. To promote consistency across the different testing sites, a
comprehensive Test Administration Manual was developed, which set out how test
administrators should conduct the test sessions, including the classroom layout, test
distribution and management of the testing session. In addition, all those who were responsible
for test administration attended a standardised training session on these procedures. The
administrators were teams of Education Specialists and some School Principals, all of whom
were co-ordinated by MLA Field focal points.
A team of MLA Quality Assurors (senior Field staff from the Education Programme, supported by
staff from HQ (A)), as well as other senior staff from HQ (A), was responsible for conducting
random visits to schools in which testing was taking place. The Quality Assurors assessed the
degree to which the Testing Manual was being adhered to by the test administrators. Formal
reports were prepared on each school visit; these reports indicated that the tests were
administered in a fair, reliable and consistent manner.
d) SAMPLE SELECTION
The sample for the 2013 MLA covered all UNRWA schools that offered either Grade 4 or Grade 8
education. This was done to ensure that the Agency would have comparable performance data
drawn from as wide a range of schools as possible. Although the sample covered all eligible
schools, it was not feasible to test all Grade 4 and 8 students. Instead, up to two class
sections per school were selected: in schools with four or more sections of Grade 4 or Grade 8,
two sections participated in the study.
The participating class sections were identified on the day the test was administered in a
school; this was done to avoid any possible bias through preparation or selection. The Fields
provided HQ (A) with the school lists, with school names and codes. It was agreed among the
test planners and developers to select two class sections of 4th Grade and two of 8th Grade in
schools which had four or more sections of either grade; otherwise one class section was
selected on the day the tests were implemented. The selected class sections were kept
confidential until the day of the tests and were selected using criteria set for this purpose.
The following table illustrates the criteria for the selection of the class sections:

No. of class sections   Section to be   Other section to   Total class sections
at school (Grade 4/8)   selected        be selected        to be selected
1                       A               -                  One section (Class A)
2                       B               -                  One section (Class B)
3                       B               -                  One section (Class B)
4                       C               -                  One section (Class C)
5                       A               E                  Two sections (Classes A & E)
6                       B               D                  Two sections (Classes B & D)
7                       C               F                  Two sections (Classes C & F)
8                       A               G                  Two sections (Classes A & G)
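The selection rule in the table above is a deterministic lookup from the number of class sections in a school to the sections tested. A minimal sketch (an illustration, not the official selection tool; section labels A-G stand for class sections 1-7):

```python
# Encode the class-section selection criteria from the table above.
# Keys are the number of Grade 4 (or Grade 8) sections in a school;
# values are the section labels selected on the day of testing.
SELECTION_RULE = {
    1: ["A"],
    2: ["B"],
    3: ["B"],
    4: ["C"],
    5: ["A", "E"],
    6: ["B", "D"],
    7: ["C", "F"],
    8: ["A", "G"],
}

def sections_to_test(num_sections: int) -> list[str]:
    """Return the class sections selected for testing in a school
    with the given number of sections of the tested grade."""
    if num_sections not in SELECTION_RULE:
        raise ValueError("criteria are defined for 1-8 sections only")
    return SELECTION_RULE[num_sections]
```

Note that schools with four or more sections contribute two sections each, matching the sampling description above.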
2.4 DATA PROCESSES
a) DATA COLLECTION
After the MLA instruments were administered in the Fields, the Technical Committee at each
Field gathered the envelopes containing the tests and the questionnaires. These were
categorised by subject (Arabic or Mathematics) and by Grade (4th or 8th), and stored in safe
and secure places until the marking and the data entry could take place. Three centres (in
Gaza, Jordan and Lebanon) were established to manage the marking of the MLA test booklets
and the data entry of test answers and questionnaires: the Lebanon centre handled Lebanon
Field, the Gaza centre handled Gaza Field, and the Jordan centre handled both West Bank and
Jordan Fields. An Education Specialist was nominated from WBFO to monitor the marking and
data entry process.
b) CODING OF TEST MARKS
At each of the three centres in the Fields, a team of selected teachers met to mark student
responses. At the start of the marking process, there was a meeting with all parties involved in
marking the exams in order to explain the purpose of the MLA examinations and the
procedures which were to be followed during the marking. A memorandum of correction was
discussed in detail to ensure a common understanding of how questions were to be weighted
and the manner in which students’ answers were to be judged.
At the start of marking each exam, the relevant Curriculum Advisor was present to act as
marking moderator. As such, the Curriculum Advisor was responsible for explaining marking
procedures and reviewing the quality and consistency of marking. Only free-response or open
ended questions were marked, reducing the number of questions that had to be marked (and
thereby reducing potential for error). The Curriculum staff checked 5-10 % of the corrected test
papers per day to quality-assure the process.
The following codes were used in the marking of the MLA tests. Any question answered
incorrectly was entered with the code 0. If a student answered a question worth one mark
correctly, that question was coded 1, and so on, as shown in the coding memorandum below,
which was followed in the correction of the MLA Mathematics and Arabic tests:
Coding Memorandum: Free-Response Questions

Code       Meaning
0 (zero)   Incorrect response
1          Correct response (question worth 1 mark)
2          Correct response (question worth 2 marks)
3          Correct response (question worth 3 marks)
77         Incomprehensible (response cannot be read)
99         Missing (question not attempted or not completed)
The options selected by students in multiple-choice questions were also entered, in order to
facilitate more detailed error analysis. For a correct response, the code given is in accordance
with the number of marks allotted for the question; the same procedure was applied to
multiple-choice questions, as shown in the table below. Scores obtained on constructed-
response questions were entered using pre-set codes.

Coding Memorandum: Multiple-Choice Questions

Code   Meaning
1      If the correct response is alternative (A or أ)
2      If the correct response is alternative (B or ب)
3      If the correct response is alternative (C or ج)
4      If the correct response is alternative (D or د)
77     If the response is incomprehensible (two options circled, correct option unclear)
99     Missing (question not attempted or not reached)
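The two coding memoranda can be sketched as simple scoring rules. This is a hypothetical illustration of how the entered codes might be converted to marks (the report does not publish its scoring scripts; the function names are ours):

```python
# Codes 77 (incomprehensible) and 99 (missing/not reached) carry no marks.
INVALID_CODES = {77, 99}

def free_response_score(code: int) -> int:
    """For free-response items the entered code (0-3) equals the marks awarded."""
    return 0 if code in INVALID_CODES else code

def multiple_choice_score(code: int, key: int, marks: int = 1) -> int:
    """Marks for a multiple-choice item, given the entered alternative code
    (1=A, 2=B, 3=C, 4=D), the keyed alternative, and the marks allotted."""
    if code in INVALID_CODES:
        return 0
    return marks if code == key else 0
```

Recording the selected alternative, rather than just right/wrong, is what makes the later error analysis of distractors possible.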
c) DATA ENTRY AND CLEANING
After the marking process was completed and the answer papers were stored safely, three
centres (Gaza, Jordan and Lebanon) were established in order to manage the task of data entry.
At each centre, a team of Vocational Training Centre (VTC) students was selected for the data
entry process. The data entry staff received training on how to enter data according to the
data entry memoranda: theoretical explanations with examples and discussion, followed by
practice entering data according to the set procedures until they could perform the task
properly.
The correction and the data entry process then started for all test items. Data Analysis
Information Sheets for capturing the questionnaire data were prepared, and the data gathered
was analysed and linked with the students' performance in the tests. Each member of the data
entry staff was allocated an ID number to be included on data entry forms, in order to track
data capture errors. Curriculum staff checked 5-10% of the entered test papers and
questionnaires to quality-assure the process.
After the data entry of the test scores and questionnaire responses, the data was gathered
and compiled for each Field; staff from the Research Unit at HQ (A) cleaned all the data
according to the instructions, feedback and comments provided to them by the consultant,
Professor Anil Kanji. The Fields were contacted to clarify any vague information and to
complete missing information as far as possible. Following this, blank cells were removed, as
were cells that still lacked information.
2.5 DATA ANALYSIS AND REPORTING
The data analysis for this study was conducted according to the specific design of the
instruments (i.e. two forms of the cognitive tests) and to enhance the reporting of information
for the key purpose of this report: to provide education officials with information for
identifying areas in need of intervention, and to serve as a baseline for the 2015 MLA survey.
Specifically, an IRT scaling and equating process was required to equate the different forms of
the tests and to compare results between the 2009 and 2013 MLA surveys; a standard setting
process was undertaken to enhance the reporting of scores; and descriptive analysis by
Performance Level, Content Domain and Cognitive Level was conducted to report scores
disaggregated by Gender and Field.
a) ITEM RESPONSE THEORY (IRT) SCALING AND EQUATING
The instruments for each subject area and Grade level comprise two forms, A and B, which
were randomly administered to the same sets of students. In each Grade, the students were
assigned, in approximately equal ratios, to one of: Form A (Mathematics), Form B
(Mathematics), Form A (Arabic) or Form B (Arabic). The two test forms in each combination of
subject and Grade were designed to be parallel, and did not contain any common items.
To produce scores that are comparable across test forms in each Grade, a three-stage
combination of Item Response Theory (IRT) scaling and classical linear equating process was
applied. IRT refers to a collection of mathematical models and statistical methods used to
analyse test items and test scores and measure the performance of individuals on some
construct, e.g., mathematical ability (Reise, Ainsworth & Haviland, 2005). IRT is based on the
assumption that an estimate of a person’s true ability score on any test is determined by how
much a person knows about the subject matter that is being tested and whether the test
questions (items) are easy or difficult. Thus, as a person's ability level increases, their chances
of getting a test item correct also increase.
The advantage of using IRT analysis is that when a model fits the data, the results provide
sample free item estimates and test free ability estimates (Hambleton, Swaminathan, & Rogers,
1991). Sample free item estimates mean that the item difficulty and discrimination values are
independent of the examinee sample from which they were estimated; that is, these values
belong to the item, not the group that took the test (Baker, 2001). Thus, with the correct
application of IRT, the results of a test are not affected if test items are easy or difficult or if the
ability level of students is high or low. In contrast, for analysis done using Classical Test Theory,
the difficulty and discrimination values of a test depend on the specific group of students that
take the test. An item will be easy if a high ability group of students respond to the item or
difficult if a low ability group of students respond to the item. Similarly, the score of any
student will be high on an easy test and low on a difficult test.
In the first stage, each test form is calibrated separately, producing IRT item parameters and
IRT theta scores for each student, using the 2PL model, defined in equation 1:
P(u = 1) = 1 / (1 + exp(-1.7 * a * (theta - b)))        (1)
Equation 1 describes the probability (P) that a student with a proficiency score of theta will
correctly respond to an item (u=1) whose statistical behaviour is defined by the parameters a
and b. For items with partial credit scoring, Equation 1 is interpreted as the probability that a
student with proficiency equal to theta will achieve a score greater than or equal to u.
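Equation 1 is the standard 2PL item response function, which can be computed directly. A minimal sketch (illustrative only; not the calibration software used in the study):

```python
import math

def p_correct(theta: float, a: float, b: float) -> float:
    """Equation 1: probability that a student with proficiency `theta`
    correctly answers an item with discrimination `a` and difficulty `b`
    (2PL model, with the conventional 1.7 scaling constant)."""
    return 1.0 / (1.0 + math.exp(-1.7 * a * (theta - b)))
```

When theta equals the item difficulty b, the probability is exactly 0.5, and the probability increases monotonically with theta, matching the description above.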
In the second stage, the theta distributions of the sub-samples from each form are linearly
equated by adjusting their means and standard deviations to 0 and 1, respectively. In the
third stage, the scaling constants used to perform the transformations in the second stage are
applied to the a and b parameters from the respective test forms, to situate the items on the
common {0,1} scale. Finally, the equated theta scores and item parameters are used to produce
expected proportion-correct scores, or 'true' scores, which are estimated for each student, j,
by Equation 2:

T(j) = Sum[ 1 / (1 + exp(-1.7 * a_i * (theta_j - b_i))), i = 1, 2, …, n ] / n        (2)
In Equation 2, the summation across all n test items adds up the probability that the student
will correctly answer each item, when both the proficiency of the student and the parameters
of the test items are known. Divided by the total number of items, the estimate describes the
proportion of the total score that each student would be expected to achieve if he or she were
administered all of the available test items. These true scores share the same mathematical
scale properties as percent scores.
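The linear equating stage and Equation 2 can be sketched together. This is a minimal illustration assuming dichotomously scored items with (a, b) parameter pairs, not the production scaling code used for the survey:

```python
import math

def standardize(thetas: list[float]) -> tuple[list[float], float, float]:
    """Stage 2 sketch: linearly equate a form's theta distribution to
    mean 0 and (population) SD 1; also return the scaling constants,
    which stage 3 would apply to the item parameters."""
    n = len(thetas)
    mean = sum(thetas) / n
    sd = math.sqrt(sum((t - mean) ** 2 for t in thetas) / n)
    return [(t - mean) / sd for t in thetas], mean, sd

def true_score(theta: float, items: list[tuple[float, float]]) -> float:
    """Equation 2: expected proportion-correct for one student across
    all items, where `items` holds (a, b) pairs on the common scale."""
    probs = (1.0 / (1.0 + math.exp(-1.7 * a * (theta - b))) for a, b in items)
    return sum(probs) / len(items)
```

Because each summand is a probability, the resulting true score always lies between 0 and 1, which is why it behaves like a percent score.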
To facilitate the comparison of student performance across the 2009 and 2013 MLA surveys, an
IRT anchor item procedure was applied, based on the test items that were common to both
surveys. For this analysis, all pilot HOTS items were excluded from the 2013 data, while
responses from students in Syria were excluded from the 2009 data. The table below provides
an example of the common-anchor test design, based on a 15-item pool taken by two groups of
students: students A to E take items Q1 to Q10, and students F to J take items Q6 to Q15. In
this instance, Q6, Q7, Q8, Q9 and Q10 are common to both groups of students.
Anchor Item Design

Student   Q1 Q2 Q3 Q4 Q5 Q6 Q7 Q8 Q9 Q10 Q11 Q12 Q13 Q14 Q15
A          x  x  x  x  x  x  x  x  x  x
B          x  x  x  x  x  x  x  x  x  x
C          x  x  x  x  x  x  x  x  x  x
D          x  x  x  x  x  x  x  x  x  x
E          x  x  x  x  x  x  x  x  x  x
F                         x  x  x  x  x   x   x   x   x   x
G                         x  x  x  x  x   x   x   x   x   x
H                         x  x  x  x  x   x   x   x   x   x
I                         x  x  x  x  x   x   x   x   x   x
J                         x  x  x  x  x   x   x   x   x   x
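The common-anchor design in the table above amounts to two overlapping item blocks whose intersection supplies the anchor items used for linking. A small sketch of the design (illustrative; item labels follow the example, not the actual MLA item pool):

```python
# Two overlapping 10-item blocks drawn from a 15-item pool.
GROUP_1_ITEMS = [f"Q{i}" for i in range(1, 11)]   # taken by students A-E
GROUP_2_ITEMS = [f"Q{i}" for i in range(6, 16)]   # taken by students F-J

# The anchor items are those administered to both groups; their shared
# responses let the two groups be placed on a common scale.
anchors = sorted(set(GROUP_1_ITEMS) & set(GROUP_2_ITEMS),
                 key=lambda q: int(q[1:]))
```

In the 2009/2013 comparison, the items common to both surveys play the role of these anchors.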
b) STANDARD SETTING
The standard setting process was undertaken to enhance the use of MLA results by UNRWA
education officials, teachers and school managers. Specifically, this allowed performance of
students to be reported against specific knowledge and skills adopted in the curriculum,
thereby providing additional details which could be used for identifying what students know
and can do, as well as identifying relevant interventions for addressing student learning needs.
The standard setting process required a representative panel of subject experts to:
i) identify the number of reporting levels and policy definitions for each of these levels;
ii) determine performance level descriptors (PLDs) for each subject area and grade level,
based on the common curriculum across the different Fields, outlining the specific
knowledge and skills that students functioning at the different levels know and can do;
iii) establish cut scores that demarcate performance into the different performance levels.
The panellists for this workshop comprised subject specialists from all UNRWA Fields, who were
selected based on their subject area expertise and experience as well as their knowledge of the
contexts in which the schools operate. Additional details on the standard setting process can
be found in the technical workshop report (UNRWA, 2013).
After developing and agreeing on the number of levels and policy definitions for each level, the
PLDs for each grade and subject were developed. Thereafter, the subject specialists engaged in
a three-stage iterative process of setting cut scores. In the first stage, panellists worked
individually to rate the performance of a group of minimally competent students on each item,
at each level of performance. In the second stage panellists first discussed their individual
ratings with each other, focusing on those items where the variation was found to be
unacceptably wide, after which each item was rated again. In the third stage, panel members
were provided with students’ scores to compare with their own ratings, after which each item
was again rated. The final ratings were used to calculate the cut-scores, which were provided
to the panels to review and approve.
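One common way to derive cut scores from such item-level panellist ratings is to sum each panellist's probability estimates and average across the panel. The report does not specify the exact formula, so the following is a hypothetical sketch of that aggregation step only:

```python
def cut_score(ratings: list[list[float]]) -> float:
    """Aggregate panellist ratings into a single cut score.

    `ratings[p][i]` is panellist p's estimate of the probability that a
    minimally competent student answers item i correctly. A panellist's
    individual cut score is the sum of these probabilities; the final
    cut score is the mean across the panel."""
    per_panellist = [sum(items) for items in ratings]
    return sum(per_panellist) / len(ratings)
```

The iterative discussion rounds described above serve to narrow the spread of `per_panellist` values before this final average is taken.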
The process of setting cut scores resulted in PLDs and cut scores that, for each subject and
grade, make it possible to report student performance on a continuum spanning four levels,
viz. Not Achieved, Partially Achieved, Achieved and Advanced⁶, as noted in the table below.
Policy Definitions for Key Performance Levels

Level                Level definition
Advanced             Performance at this level indicates that students demonstrate a
                     comprehensive understanding of the knowledge and skills required to
                     function at this grade level
Achieved             Performance at this level indicates that students demonstrate sufficient
                     understanding of the knowledge and skills required to function at this
                     grade level
Partially Achieved   Performance at this level indicates that students demonstrate a partial
                     understanding of the knowledge and skills required to function at this
                     grade level
Not Achieved         Performance at this level indicates that students demonstrate little or
                     no understanding of the knowledge and skills required to function at this
                     grade level
Progression and Intervention Implications for Each Performance Level

Level: Advanced
Progression implications: high likelihood of success in the next grade
Intervention implications: require little or no academic intervention, but need to be provided
with more challenging tasks to maximise their full potential

Level: Achieved
Progression implications: reasonable likelihood of success in the next grade
Intervention implications: may require some assistance with complex concepts to progress to
the Advanced level

Level: Partially Achieved
Progression implications: not likely to succeed in the next grade without support
Intervention implications: specific intervention to address knowledge gaps, while also
requiring additional and continued classroom support to progress to the required grade
(Achieved) level

Level: Not Achieved
Progression implications: unlikely to succeed in the next grade without significant support
Intervention implications: specific intervention to address knowledge gaps, while also
requiring additional teaching time and extensive and continued support within the classroom
context, to progress to the required grade (Achieved) level
⁶ Terms for each level were proposed by the Mathematics Grade 4 group.
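Once cut scores are set, reporting reduces to mapping each student's equated score onto the four levels. A minimal sketch; the cut values used below are hypothetical placeholders, not the cut scores set by the UNRWA panels:

```python
LEVELS = ["Not Achieved", "Partially Achieved", "Achieved", "Advanced"]

def performance_level(score: float, cuts: tuple[float, float, float]) -> str:
    """Map a score onto the four reporting levels.

    `cuts` holds the three ascending cut scores demarcating the levels
    for one subject and grade."""
    c1, c2, c3 = cuts
    if score < c1:
        return LEVELS[0]
    if score < c2:
        return LEVELS[1]
    if score < c3:
        return LEVELS[2]
    return LEVELS[3]
```

Because the cut scores differ by subject and grade, `cuts` would be looked up per test before classifying scores.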
For each subject and Grade level, draft performance level descriptors were developed,
discussed by the different subject area experts and adapted, as noted in the tables below.

Performance Level Descriptors for Grade 4 Arabic

Grade 4 Arabic: Level 1 - Partially Achieved
At this level, the student is expected to:
Read comprehensive and literal meanings of words and common phrases and their opposite
meanings; extract answers directly from text; ascribe a quote in the text to its source;
distinguish the masculine noun; write singular and plural forms of common nouns; distinguish
between nouns, verbs and particles; write appropriate examples of present, past and imperative
tenses; know common interrogative, demonstrative and relative pronouns; distinguish between
masculine and feminine constructions; distinguish between verb and nominal sentences;
distinguish between the two kinds of "t"; identify exclamation marks; re-arrange the words of a
scrambled 3-word sentence.

Grade 4 Arabic: Level 2 - Achieved
At this level, the student is expected to:
Demonstrate interpretational reading; define main ideas, sub-ideas and text title; distinguish
facts from opinions; judge behaviours in text; recognize proverbs; derive present tense and
imperative mood from past tense; write past tense in dual and plural forms; turn masculine into
feminine forms and vice versa; use prepositions and the negative and prohibitive "la" correctly;
use the exclamation structure of "how … he is!"; identify constituents of verb and nominal
sentences; use adverbs correctly; write the letter "t" correctly at the end of a word; write the
distinctive "a" correctly; use punctuation marks correctly; rearrange a scrambled 4-word
sentence.

Grade 4 Arabic: Level 3 - Advanced
At this level, the student is expected to:
Demonstrate symbolic and creative reading with supporting evidence from text; distinguish
between cause and effect; justify judgments of behaviours in text; formulate different kinds of
plural; formulate different kinds of sentences; conjugate verbs according to tense and mood;
use linguistic forms such as interrogative, exclamation, negative and prohibitive; turn nominal
sentences into verb sentences and vice versa; distinguish between the "t" and the "h"; write a
paragraph or text for a sign; write a narrative about a picture using suitable punctuation marks.

Performance Level Descriptors for Grade 8 Arabic

Grade 8 Arabic: Level 1 - Partially Achieved
At this level, the student is expected to:
Read comprehensive, literal and interpretational meanings of words and their antonyms; extract
direct answers from texts; distinguish between nouns and verbs and between subjects and
predicates; classify verbs according to tense; distinguish between types of sentences; define
constituents of verb sentences; distinguish between various kinds of plurals; distinguish words
with letters that are pronounced but not written and words with letters that are written but
not pronounced.

Grade 8 Arabic: Level 2 - Achieved
At this level, the student is expected to:
Distinguish among denotations of words and their meanings and use them in meaningful
sequences; interpret situations according to text; give opinions about issues; clarify the
intended meaning of phrases with reference to real-life examples; infer qualities of characters
in texts; clarify main and sub-ideas in texts; identify lines of poetry indicative of certain ideas;
distinguish between transitive and intransitive verbs and between syllabic and non-syllabic
verbs; describe the function of syllabic and non-syllabic verbs; write nouns in the dual form;
define the subjunctive case of the present tense; use the present tense in negative and
affirmative constructions; use the Five Nouns; describe marks of inflected verbs and nouns;
distinguish between negative and affirmative sentences; distinguish between the inflectional,
plural and original "و"; and write "a" correctly at the end of verbs and nouns.

Grade 8 Arabic: Level 3 - Advanced
At this level, the student is expected to:
Distinguish among symbolic and creative texts; distinguish among denotations of words and
constructions in different contexts; evaluate behaviours from their own viewpoint; infer the
author's theme from the text; distinguish between words with similar spelling but different
meanings; define the function and marks of syllabic and non-syllabic verbs; define the function
and marks of the Five Nouns; draft compositions according to the rules of writing; summarize a
text in no more than 10 lines.
Performance Level Descriptors for Grade 4 Mathematics

Grade 4 Mathematics: Level 1 - Partially Achieved
At this level, the student is expected to:
Read and write numbers of up to seven digits; recognize odd and even numbers; compare two
numbers of up to seven digits; write a number of up to seven digits in expanded form and vice
versa; add and subtract two numbers of up to seven digits; multiply two numbers of up to two
digits; recognize fractions and decimal numbers and express them in drawings and vice versa;
add and subtract fractions having the same denominator; identify units of length and of time;
identify the square and the rectangle.
Grade 4 Mathematics: Level 2 - Achieved
At this level, the student is expected to:
Write numbers using place value; arrange numbers; add and subtract two numbers of up to
seven digits; multiply a 3-digit number by a 2-digit number; divide a 2-digit number by a 1-digit
number; complete simple numerical or geometric patterns; find factors and multiples of
numbers up to 100; compare fractions with the same numerator or denominator; recognize the
place value of a digit in a decimal number; change a decimal number to a fraction and vice
versa (denominators 10, 100). Change or compare different units of length and of time, and
solve related simple problems. Calculate the area of a regular geometric shape by counting
square units. Read statistical data in frequency tables or bar graphs. Solve a simple open
statement.
Grade 4 Mathematics: Level 3 - Advanced
At this level, the student is expected to:
Find non-routine mathematical patterns; round a whole number or a decimal number to the
nearest given digit; solve open statements and non-routine problems using the four operations
(+, -, x, ÷) on numbers of up to seven digits and on like/similar fractions; explain and represent
statistical data and solve related problems; compare and arrange durations of time, solve
situations related to time in the student's daily life, and change durations of time into a new
form or unit.
Performance Level Descriptors for Grade 8 Mathematics

Grade 8 Mathematics: Level 1 - Partially Achieved
At this level, the student is expected to:
Compare and arrange rational numbers; identify the opposite of a given number; use prime
factorization; find the least common multiple (LCM) and greatest common divisor (GCD); recognize
and use powers with natural exponents; change a fraction to a mixed number and vice versa;
recognize different forms of a rational number; change a fraction or decimal number to a percentage
and vice versa. Form proportions; identify and solve linear equations; classify angles and
triangles; identify the properties of the isosceles triangle, Pythagoras' theorem and its converse;
recognize the quadrilateral and its properties; read statistical data in frequency tables or figures.
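Two of the skills listed here, finding the LCM and GCD and applying the converse of Pythagoras' theorem, can be sketched in Python; the numbers and the helper function are illustrative, not drawn from the assessment itself:

```python
import math

# GCD via Euclid's algorithm; LCM from the identity lcm(a, b) = a * b / gcd(a, b).
a, b = 12, 18
gcd_ab = math.gcd(a, b)   # 6
lcm_ab = a * b // gcd_ab  # 36

# Converse of Pythagoras' theorem: if the squares of the two shorter
# sides sum to the square of the longest side, the triangle is right-angled.
def is_right_triangle(x, y, z):
    x, y, z = sorted((x, y, z))
    return x * x + y * y == z * z

print(gcd_ab, lcm_ab)              # 6 36
print(is_right_triangle(3, 4, 5))  # True
print(is_right_triangle(2, 3, 4))  # False
```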
Grade 8 Mathematics: Level 2 - Achieved
At this level, the student is expected to:
Perform calculations on integers and rational numbers; find the square root of a perfect square;
and use powers with natural exponents. Use ratio, proportion and percentage to solve simple
problems; find the value of an algebraic expression; solve a simple equation in one unknown and use
it to solve routine problems.
Use the properties of the isosceles triangle, the sum of angles in a triangle, Pythagoras' theorem
and its converse, the midpoint theorem, and the sum of angles in a quadrilateral to solve routine
problems; use the properties of triangles and quadrilaterals to solve routine problems about area
and perimeter. Read statistical data, represent it in bar diagrams and pie charts, and calculate
its mean (average).
Grade 8 Mathematics: Level 3 - Advanced
At this level, the student is expected to:
Use the properties of the four operations (+, -, x, ÷) to simplify algebraic expressions and powers
with natural exponents. Solve linear equations and use them to solve non-routine problems; use
ratio, proportion and percentage to solve non-routine problems; use the rules of area and perimeter
to solve non-routine problems; explain and solve non-routine problems given statistical data, and
use the mean (average) to solve non-routine problems.
For each of the performance level descriptors across the subject areas and grade levels, cut-scores
were determined using the Angoff method. These cut-scores were used to categorise students into the
different levels of performance based on the scores obtained in the tests (see table below).
Grade 4 and 8 Cut-Scores (%) by Grade Level and Subject

Grade / Subject        Level 1 Partially Achieved   Level 2 Achieved   Level 3 Advanced
Grade 4 Arabic         24                           55                 83
Grade 4 Mathematics    22                           51                 73
Grade 8 Arabic         20                           46                 70
Grade 8 Mathematics    16                           38                 59
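Applied to the cut-scores above, the categorisation of a student's test score into a performance level can be sketched as follows; the function name and dictionary are ours, and only the Grade 4 Mathematics cut-scores are shown:

```python
# Cut-scores (%) for Grade 4 Mathematics, copied from the table above.
G4_MATH_CUTS = {"Partially Achieved": 22, "Achieved": 51, "Advanced": 73}

def performance_level(score, cuts):
    """Return the highest level whose cut-score the score reaches."""
    level = "Not Achieved"
    for name, cut in cuts.items():  # dicts preserve insertion order
        if score >= cut:
            level = name
    return level

print(performance_level(45, G4_MATH_CUTS))  # Partially Achieved
print(performance_level(80, G4_MATH_CUTS))  # Advanced
```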
c) DESCRIPTIVE ANALYSIS
Basic descriptive analysis was also conducted to report student performance by:
(i) overall score, (ii) content domain, and (iii) cognitive level. All results are reported using
mean scores and performance levels for each grade level and subject area, disaggregated by Field
and gender.
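The disaggregation described, mean scores by Field and gender, can be sketched with the standard library alone; the records and field names here are invented for illustration and are not the report's data:

```python
from collections import defaultdict
from statistics import mean

# Toy stand-ins for student result records; names and values are assumptions.
records = [
    {"field": "Gaza",   "gender": "Boys",  "score": 41.0},
    {"field": "Gaza",   "gender": "Girls", "score": 54.0},
    {"field": "Jordan", "gender": "Boys",  "score": 33.0},
    {"field": "Jordan", "gender": "Girls", "score": 46.0},
]

# Group the scores by (Field, gender), then take the mean of each group.
groups = defaultdict(list)
for r in records:
    groups[(r["field"], r["gender"])].append(r["score"])

group_means = {key: mean(scores) for key, scores in groups.items()}
print(group_means[("Gaza", "Girls")])  # 54.0
```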
2.6 LIMITATIONS OF THE MLA SURVEY
A number of limitations that may have had an impact on this study need to be noted. These
include:
i) Piloting of the Grade 4 and 8 instruments was conducted on students in Grades 5 and 9,
as the pilot study was only conducted in March 2013. Ideally, this pilot should have been
conducted at around the same period as the main test administration, that is, in May;
ii) The data entry and coding processes were not uniformly applied across the different
Fields, which resulted in missing data that had to be re-entered at a later stage.
3. STUDENT PERFORMANCE
This section provides information to UNRWA education officials on the performance levels of
students for two primary purposes: first, to identify key areas in need of improvement regarding
the specific knowledge and skills of students across the participating UNRWA Fields; and second,
to establish a baseline for the Education Programme Reform. In addressing these purposes, the
results are reported against the performance standards for the overall performance of students,
disaggregated by Field and gender. In addition, results are also reported by content domain and
cognitive level for each subject and Grade.
As noted in Section 2, the performance standards provide information on the percentage of students
performing at each level as well as on what students know and can do at each of these levels. For
each Grade and subject area, the performance of students is reported at four levels: Advanced,
Achieved, Partially Achieved and Not Achieved. Additional information for each level is noted in
Table 3.1, while the specific knowledge and skills for each subject and grade level are noted in
the Appendix (Tables A, B, C, D).
Table 3.1 - Performance Level Definitions

Level                Level definition
Advanced             Performance at this level indicates that students demonstrate a comprehensive understanding of the knowledge and skills required to function at this grade level.
Achieved             Performance at this level indicates that students demonstrate sufficient understanding of the knowledge and skills required to function at this grade level.
Partially Achieved   Performance at this level indicates that students demonstrate a partial understanding of the knowledge and skills required to function at this grade level.
Not Achieved         Performance at this level indicates that students demonstrate little or no understanding of the knowledge and skills required to function at this grade level.
The results are reported by subject area and Grade level beginning with Grade 4 Arabic,
followed by Grade 4 Mathematics, Grade 8 Arabic and Grade 8 Mathematics.
3.1. GRADE 4 ARABIC
a) PERFORMANCE LEVELS BY FIELDS
The mean score for all students across the four Fields on the Grade 4 Arabic assessment is 45%. The
mean scores range from 40% in Jordan to 48% in the West Bank and Gaza (see Table 3.1.1). In terms
of the Grade 4 Arabic performance standards, a total of 55% of students functioned below the
Achieved grade level (that is, 16% at the Not Achieved level and 39% at the Partially Achieved
level). Forty per cent (40%) performed at the Achieved grade level and 5% at the Advanced level
(Figure 3.1.2).
Table 3.1.1 - Mean Scores by Field
Field Mean N se
Gaza 47.83 8238 0.22
Lebanon 41.99 1351 0.52
Jordan 39.85 4749 0.29
West Bank 47.91 2392 0.41
All fields 45.11 16730 0.16
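As a consistency check, the all-Fields mean in Table 3.1.1 can be recovered as an N-weighted average of the Field means; the values below are copied from the table:

```python
# (mean, N) per Field, from Table 3.1.1.
fields = {
    "Gaza":      (47.83, 8238),
    "Lebanon":   (41.99, 1351),
    "Jordan":    (39.85, 4749),
    "West Bank": (47.91, 2392),
}

total_n = sum(n for _, n in fields.values())
weighted_mean = sum(m * n for m, n in fields.values()) / total_n

print(total_n)                  # 16730
print(round(weighted_mean, 2))  # matches the reported 45.11 up to rounding
```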
The histogram in Figure 3.1.1, which presents the distribution of students' percentage scores, is
slightly negatively skewed, consistent with the mean of 45% and with 55% of students achieving
below the Achieved grade level.
Figure 3.1.1 - Distribution of Student Scores by Performance Level
For two of the four Fields (Lebanon and Jordan), almost half of all Grade 4 students were
functioning at the Partially Achieved level, while in Gaza and the West Bank a little less than 40%
of students were at this level (Figure 3.1.2). Conversely, Gaza and the West Bank had more students
functioning at the Achieved level (i.e. at grade level), 39% and 40% respectively, compared with
29% for Lebanon and 26% for Jordan. Across the four Fields between 18% and 27% of students were
functioning at the Not Achieved level, and between 2% and 6% at the Advanced level. Gaza and the
West Bank stand out as having the higher proportions of students at the Advanced level and the
lower proportions at the levels below.
Figure 3.1.2 - Performance Standards by Field7
b) PERFORMANCE LEVELS BY GENDER
A comparison of the scores for boys and girls across all UNRWA schools (Table 3.1.2) indicates
that girls obtained a mean score of 51% while boys obtained a mean of 38%, representing a
significant difference of 13%. The differences in favour of girls were similar across the four
Fields.
Table 3.1.2 - Mean Scores by Gender and Field
Field Boys Girls
Mean se Mean se
Gaza 41.45 0.32 53.87 0.28
Lebanon 35.93 0.73 47.97 0.66
Jordan 33.38 0.40 46.07 0.38
West Bank 39.91 0.68 52.92 0.47
Total 38.44 0.23 51.11 0.20
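Assuming approximately independent samples of boys and girls, the significance of the all-Fields gap can be gauged with a simple z-statistic built from the reported means and standard errors; this is our illustration, not an analysis from the report:

```python
import math

# All-Fields means and standard errors from Table 3.1.2.
boys_mean, boys_se = 38.44, 0.23
girls_mean, girls_se = 51.11, 0.20

# z-statistic for the difference between two independent means.
z = (girls_mean - boys_mean) / math.sqrt(boys_se**2 + girls_se**2)
print(round(z, 1))  # about 41.6, far beyond any conventional significance threshold
```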
c) PERFORMANCE LEVELS BY FIELD AND GENDER
Differences in performance in favour of girls across all schools were also noted in terms of the
performance standards, with significantly fewer girls at the lower levels and significantly more
girls at the higher levels (Figure 3.1.3). At the lower levels (i.e. Not Achieved and Partially
Achieved) there are 74% of boys and 49% of girls; at the higher levels (i.e. Achieved and Advanced)
there are 26% of boys and 50% of girls.
7 For all Figures, Below Basic = Not Achieved, Basic = Partially Achieved, Required = Achieved.
Figure 3.1.3 - Performance Levels by Gender
A review of gender performance across the Fields reveals similar outcomes, with greater proportions
of boys than girls at the Not Achieved and Partially Achieved levels, and greater proportions of
girls than boys at the Achieved and Advanced levels (Table 3.1.3). Jordan is the exception, with
slightly more girls than boys at the Partially Achieved level; the small difference of 2% is likely
to be insignificant.
Gaza and the West Bank have the lowest proportions of students at the Not Achieved and Partially
Achieved levels and the greatest proportions at the Achieved and Advanced levels, suggesting these
two Fields show the strongest achievement for both genders. The most variation by gender is at the
Not Achieved and Achieved levels. The least variation is at the Partially Achieved and Advanced
levels, although the generally smaller percentages at Advanced reduce the impact of this result.
Across Table 3.1.3 as a whole, the West Bank shows the least variation from the means and Jordan
the most.
Table 3.1.3 - Performance Levels (%) by Gender and Field

Field        Not Achieved     Partially Achieved     Achieved        Advanced
             Boys   Girls     Boys   Girls           Boys   Girls    Boys   Girls
Gaza         27     9         41     34              29     49       3      8
Lebanon      33     13        49     45              18     39       0      3
Jordan       39     16        44     46              16     35       1      3
West Bank    27     9         44     36              26     49       2      7
All Fields   31     11        43     38              24     44       2      6
d) PERFORMANCE BY CONTENT DOMAINS
Further analysis was also conducted to determine how all students performed across the specific
sub-domains constituting the language curriculum: Dictation, Grammar, Reading and Writing. These
data are presented in Figure 3.1.4. Performance was relatively uniform across the four content
domains, with Dictation, Reading and Writing having average scores within the range of 50% to 54%.
Grammar was somewhat higher at 62%.
Figure 3.1.4 - Mean Scores: Content Domain for all Fields
A review of performance across the four Fields for each content domain (Table 3.1.4) shows levels
of performance generally in the range of 40% to 66%. Jordan recorded the lowest average, 40% on
Writing, while the West Bank recorded the highest, 66% on Grammar. Across the four content domains
Gaza and the West Bank scored highest, ranking first or second in every domain: Gaza is slightly
superior in Reading and Writing, the West Bank in Dictation and Grammar. Jordan and Lebanon rank
third or fourth in every domain; their lowest scores were in Writing and their highest in Grammar.
Table 3.1.4 Mean Scores: Content Domain by Field
Field Dictation Grammar Reading Writing
Mean se Mean se Mean se Mean se
Gaza 56.22 0.26 64.63 0.29 56.97 0.35 56.14 0.39
Lebanon 48.25 0.61 61.78 0.72 48.83 0.8 42.44 0.86
Jordan 48.99 0.36 54.85 0.4 48.83 0.44 40.07 0.49
West Bank 57.29 0.53 65.63 0.54 54.98 0.59 52.25 0.71
All Fields 53.68 0.19 61.77 0.21 53.72 0.24 49.92 0.27
In terms of mean scores by gender, girls obtained higher scores in all four content domains, with
mean score differences of approximately 14%-15% for Dictation, Grammar and Reading, and 22% for
Writing (Table 3.1.5). Given that the lowest mean score of the four content domains was recorded
for Writing, this difference of 22% is noteworthy.
Table 3.1.5 - Mean Scores: Content Domain by Gender
Domain Boys Girls Total
Mean se Mean se Mean se
Dictation 46.03 0.28 60.56 0.24 53.68 0.19
Grammar 54.34 0.32 68.46 0.26 61.77 0.21
Reading 46.06 0.35 60.61 0.31 53.72 0.24
Writing 38.35 0.39 60.33 0.35 49.92 0.27
A comparison of performance across the content domains in Figure 3.1.5 shows that students at the
Not Achieved level obtained their highest mean scores for Dictation and Grammar and their lowest
for Writing, which, at an average of 5%, is well below their performance in the other domains and
well below the Writing performance at the other three performance levels. Writing is also the area
of lowest scores for the Partially Achieved group, but this pattern is not replicated for the
Achieved or Advanced groups. The low average performance of students at the Not Achieved level
stands out as a specific area of concern: their performance levels, which average around 15% over
the four domains, are in sharp contrast to the Advanced group, which averages above 90% for the
same four domains.
Figure 3.1.5 - Mean Scores: Content Domain by Performance Levels
As noted previously, Grammar has the highest mean scores within each of the four Fields
(Figure 3.1.6), while Writing has the lowest. Jordan and Lebanon are notable for their low scores
in Writing.
Figure 3.1.6 - Mean scores: Content Domains by Field
At the Not Achieved and Partially Achieved levels the lowest mean scores were obtained in Writing;
at the Not Achieved level, for example, Jordan (3%) and the West Bank (4%) record the lowest mean
scores (Table 3.1.6). At the Achieved level, mean scores for Writing vary from the highest in Gaza
(85%) to the lowest in Lebanon (76%). Dictation and Grammar are the two domains with the highest
mean scores at the Not Achieved and Partially Achieved levels; at the Advanced level, however,
Writing is the highest-scoring domain for three of the four Fields, with Jordan the exception.
The key literacy skill of Reading is ranked third for all Fields at the Not Achieved level and for
three Fields at the Partially Achieved level. In other words, at the lower levels the key skills of
reading and writing showed the least proficiency.
Table 3.1.6 - Content Domain by Performance Levels and Field
             Not Achieved    Partially Achieved    Achieved    Advanced    Total
             Mean se         Mean se               Mean se     Mean se     Mean se
Gaza
Dictation 24.61 0.32 47.65 0.26 74.13 0.2 89.56 0.29 56.21 0.26
Grammar 22.57 0.35 60.55 0.31 83.85 0.16 92.39 0.23 64.62 0.29
Reading 13.14 0.32 46.7 0.4 81.37 0.26 95.49 0.3 56.96 0.35
Writing 7.03 0.25 43.5 0.45 84.9 0.26 96.81 0.33 56.13 0.39
Lebanon
Dictation 20.94 0.73 46.72 0.58 70.15 0.57 86.61 1.18 48.25 0.61
Grammar 24.42 0.92 64.3 0.64 85.45 0.46 91.62 0.91 61.78 0.72
Reading 13.96 0.74 46.16 0.85 78.24 0.68 93.07 2.16 48.83 0.8
Writing 6.61 0.57 37.73 0.92 75.56 0.79 93.41 2.08 42.44 0.86
Jordan
Dictation 21.02 0.4 48.99 0.34 75.2 0.31 91.43 0.71 48.97 0.36
Grammar 19.89 0.37 57.79 0.36 83.4 0.28 94.89 0.46 54.84 0.4
Reading 15.09 0.4 49.11 0.47 80.24 0.4 95.63 0.63 48.8 0.44
Writing 3.47 0.21 38.52 0.54 77.37 0.42 91.14 1.06 40.06 0.49
West Bank
Dictation 19.36 0.74 48.93 0.57 76.56 0.35 89.89 0.61 57.29 0.53
Grammar 19.51 0.77 61.83 0.56 84.34 0.29 93.5 0.46 65.63 0.54
Reading 14.18 0.75 45.65 0.68 75.72 0.51 92.27 0.75 54.98 0.59
Writing 3.77 0.37 38.81 0.84 79.54 0.52 93.97 0.81 52.25 0.71
All Fields
Dictation 22.37 0.23 48.16 0.18 74.49 0.15 89.75 0.24 53.67 0.19
Grammar 21.40 0.24 60.21 0.21 83.94 0.12 92.86 0.19 61.76 0.21
Reading 14.06 0.23 47.26 0.27 79.99 0.19 94.86 0.27 53.71 0.24
Writing 5.31 0.15 40.74 0.3 81.79 0.2 95.54 0.31 49.91 0.27
e) PERFORMANCE BY COGNITIVE LEVELS
Gaza and the West Bank stand out with superior mean scores (all above 52%) across the cognitive
levels (Table 3.1.7). This mirrors the results of Table 3.1.1 (Mean Scores by Field) and is
consistent with Figure 3.1.2, which shows the West Bank and Gaza as having the greater proportions
of students at the Advanced and Achieved levels.
Table 3.1.7 - Mean Scores: Cognitive Level by Field
Knowing Understanding HOTS
Mean se Mean se Mean se
Gaza 55.67 0.3 60.79 0.26 55.37 0.39
Lebanon 48.37 0.68 55.33 0.62 40.18 0.83
Jordan 49.75 0.39 51.37 0.36 40.81 0.45
West Bank 58.17 0.56 60.57 0.5 52.23 0.67
All Fields 53.76 0.21 57.65 0.19 49.56 0.26
Across each of the cognitive levels assessed, girls have higher mean scores than boys (Table
3.1.8). This is consistent with the gender differences in favour of girls in Table 3.1.2 and
Figure 3.1.3. The greatest difference in mean scores between boys and girls is for the cognitive
level classified as higher-order thinking skills (HOTS), a difference of approximately 22%.
Taking the achievement and cognitive data together further emphasises the advantage of girls over
boys for the outcomes assessed in this study. This conclusion is in line with studies of
achievement in many other international settings.
Table 3.1.8 - Mean Scores: Cognitive Level by Gender
Boys Girls Total
Mean se Mean se Mean se
Knowing 46.42 0.3 60.36 0.27 53.76 0.21
Understanding 49.98 0.28 64.55 0.23 57.65 0.19
HOTS 38.09 0.37 59.89 0.34 49.56 0.26
Pilot HOTS 18.11 0.18 25.79 0.21 22.15 0.14
A number of additional items assessing higher-order thinking skills were also included in the
instrument. These were included as pilot items, as the current curriculum does not focus on this
level of cognition. The intended use of these items was to obtain a baseline measure of student
performance at this cognitive level, for use in improving teaching and learning in the development
of higher-order thinking and problem-solving skills. As noted in Figure 3.1.7, mean scores across
the four Fields range from 45% (Jordan) to 55% (Lebanon). Contrary to prior results, Gaza and the
West Bank do not stand out at this level.
Figure 3.1.7 - Performance on Extra HOTS Items
The advantage of girls demonstrated in earlier tables and figures appears again for the pilot HOTS
items (Figure 3.1.8). Within each gender, mean scores are remarkably similar across Fields, except
for girls in Jordan, whose mean scores are noticeably lower than those of girls in the other Fields.
Figure 3.1.8 - Performance on Extra HOTS Items by Gender and Field
Figure 3.1.9 can be read in conjunction with Table 3.1.6 and Figure 3.1.1. It reaffirms the
distance between those achieving at the Advanced level and those at the Not Achieved level. It also
shows Reading and Writing as the strongest performances at the Advanced and Achieved levels and the
weakest at the Not Achieved and Partially Achieved levels. In addition, at Not Achieved and
Partially Achieved there is a steady decline in performance after the high scores for Grammar.
Dictation and Grammar, the strongest areas of performance at the Not Achieved and Partially
Achieved levels, probably articulate most closely with the cognitive skills of Knowing and
Understanding.
Figure 3.1.9 - Mean Scores: Content Domain by Performance Levels
A review of student performance by cognitive level, performance standard and Field reveals trends
similar to those reported above (Table 3.1.9). Generally, across most Fields and performance
levels, higher mean scores were obtained for Understanding, followed by Knowing, with significantly
lower scores for HOTS at the Partially Achieved and Not Achieved levels. At the Advanced level,
however, students obtained higher mean scores for the HOTS items and significantly lower mean
scores for the pilot HOTS items. This difference could be attributed to the specific nature and
focus of the pilot HOTS items: they use longer reading texts, present more than one text on a given
theme, draw on less familiar contexts, and focus on a wider range of critical thinking skills.
Table 3.1.9 - Mean Scores: Cognitive Level by Performance Standards and Field
             Not Achieved    Partially Achieved    Achieved    Advanced    Total
             Mean se         Mean se               Mean se     Mean se     Mean se
Gaza
Knowing (%) 17.78 0.28 46.74 0.32 76.52 0.23 91.02 0.33 55.66 0.30
Understanding (%) 23.87 0.27 54.3 0.22 79.56 0.13 91.36 0.21 60.78 0.26
HOTS (%) 4.84 0.19 42.09 0.4 85.05 0.22 98.4 0.18 55.36 0.39
Pilot HOTS (%) 10.34 0.33 19.1 0.27 29.42 0.32 53.37 0.93 23.47 0.21
Lebanon
Knowing (%) 17.77 0.61 45.97 0.65 74.11 0.67 87.94 1.09 48.37 0.68
Understanding (%) 22.73 0.67 55.97 0.48 78.1 0.38 89.4 1.12 55.33 0.62
HOTS (%) 4.07 0.40 34.63 0.76 74.65 0.67 95.6 1.25 40.18 0.83
Pilot HOTS (%) 10.13 0.65 19.8 0.54 32.74 0.89 60.88 4.74 21.97 0.47
Jordan
Knowing (%) 18.90 0.34 50.15 0.37 78.13 0.35 94.15 0.59 49.73 0.39
Understanding (%) 19.67 0.30 52.56 0.28 79.47 0.22 93.02 0.53 51.36 0.37
HOTS (%) 4.41 0.18 39.96 0.44 76.71 0.32 91.72 0.79 40.80 0.45
Pilot HOTS (%) 10.59 0.34 17.24 0.26 29.59 0.49 63.74 2.11 19.44 0.24
West Bank
Knowing (%) 17.04 0.66 49.55 0.58 78.81 0.4 92.19 0.67 58.17 0.56
Understanding (%) 19.11 0.61 54.16 0.43 79.58 0.26 91.29 0.44 60.57 0.50
HOTS (%) 3.77 0.29 39.14 0.67 79.06 0.42 94.81 0.53 52.23 0.67
Pilot HOTS (%) 9.10 0.62 17.85 0.44 29.53 0.58 55.65 1.84 23.02 0.38
All Fields
Knowing (%) 18.11 0.2 48.13 0.21 77.08 0.17 91.49 0.27 53.75 0.21
Understanding (%) 21.68 0.19 53.88 0.15 79.44 0.1 91.48 0.18 57.64 0.19
HOTS (%) 4.49 0.12 40.31 0.26 81.59 0.17 96.9 0.2 49.55 0.26
Pilot HOTS (%) 10.28 0.21 18.41 0.17 29.70 0.23 55.21 0.77 22.14 0.14
f) SUMMARY GRADE 4 ARABIC
A review of Grade 4 student performance across UNRWA in Arabic reveals the following:
Performance was generally moderate across all four Fields, with mean scores varying from 40% to
48%. Just under one-half of students performed at or above the Achieved grade level; about
one-fifth performed at the Not Achieved level; a little more than one-third were at the Partially
Achieved level; and 5% were at the Advanced level.
In all Fields, performance across the four Arabic content domains was also generally moderate, with
average scores ranging from 50% to 62% and a median of 54%. Grammar was highest at 62%; Dictation
and Reading were at 54% and Writing at 50%.
Generally students from Gaza and West Bank showed stronger performance than students
from Jordan and Lebanon.
Although Writing has the lowest overall average, it was the highest-scoring content domain, on
average, for students at the Achieved and Advanced levels; these students did next best at Reading.
Students at the Partially Achieved and Not Achieved levels did best on Grammar and Dictation, but
their performance on Reading and Writing showed a noticeable decline. At these lower levels the key
skills of reading and writing showed the least proficiency.
Results for all Fields indicate moderate performance across the three cognitive levels as well,
with averages of 58% for Understanding, 54% for Knowing and 50% for HOTS. Students scored highest
for Understanding at all performance standards except Advanced, where HOTS was highest; Knowing was
ranked second. Averages for the cognitive measures for Not Achieved students were about one-third
of the total group averages, except for HOTS, where the average score was less than 10% of the
group average.
In terms of mean scores by gender, girls obtained higher scores in all four content domains, with
mean score differences over boys of approximately 14%-15% for Dictation, Grammar and Reading, and
22% for Writing. There are significantly fewer girls at the lower performance levels and
significantly more girls at the higher levels. Taking the achievement and cognitive data together
emphasises the advantage of girls over boys for the outcomes assessed.
The greatest difference between mean scores for boys and girls is for the cognitive skill
classified as HOTS, which represents higher-level skills than Knowing and Understanding.
3.2. GRADE 4 MATHEMATICS
This section provides an overview of student performance on the Grade 4 Mathematics test.
a) PERFORMANCE LEVELS BY FIELD
The mean score obtained by students across all UNRWA Fields on the Grade 4 Mathematics assessment
is 41%, with mean scores ranging from 38% in Jordan to 46% in the West Bank (see Table 3.2.1). In
terms of the Grade 4 Mathematics performance standards, 68% of all UNRWA students were functioning
below the Achieved grade level, that is, 20% at the Not Achieved level and 48% at the Partially
Achieved level. Only 25% were performing at the Achieved grade level and 7% at the Advanced level
(Figure 3.2.1).
Table 3.2.1 - Mean Scores by Field
Field Mean N se
Gaza 41.75 7828 0.22
Lebanon 41.07 1421 0.52
Jordan 37.87 5119 0.28
West Bank 45.75 2390 0.45
All Fields 41.08 16758 0.16
Figure 3.2.1 - Distribution of Student Scores by Performance Level
Across three of the four Fields, approximately half of all Grade 4 Mathematics students were
functioning at the Partially Achieved level, while in the West Bank approximately 40% of students
were at this level (Figure 3.2.2). In Gaza and Lebanon, a quarter of the students were functioning
at the Achieved level (i.e. at grade level), while for Jordan and the West Bank the corresponding
percentages were 21% and 30% respectively. Across the four Fields, between 17% and 26% of students
were functioning below the Partially Achieved level.
Figure 3.2.2 - Performance Standards by Field
b) PERFORMANCE LEVELS BY GENDER
A comparison of the scores for boys and girls across all UNRWA schools (Table 3.2.2) indicates
that girls obtained a mean score of 45% while boys obtained a mean of 37%, a significant difference
of approximately 8%. Similar trends were noted across the four Fields, with girls obtaining higher
scores than boys and differences in mean scores ranging from 5% (Lebanon) to 9% (West Bank).
Table 3.2.2 - Mean Scores by Gender and Field
Field Boys Girls Total
Mean se Mean se Mean se
Gaza 37.47 0.32 45.77 0.29 41.75 0.22
Lebanon 38.82 0.77 43.80 0.74 41.32 0.54
Jordan 33.88 0.40 41.79 0.39 37.87 0.28
West Bank 40.11 0.74 49.37 0.55 45.75 0.45
All Fields 36.74 0.23 45.04 0.21 41.10 0.16
c) PERFORMANCE LEVELS BY FIELD AND GENDER
Differences in performance between girls and boys across the UNRWA schools were also noted when
comparing the performance standards (Figure 3.2.3). Higher percentages of boys were functioning at
the lower levels (i.e. Not Achieved and Partially Achieved), while higher percentages of girls were
functioning at the Achieved and Advanced levels.
Figure 3.2.3 - Performance Levels for all Schools
A review of gender performance across the Fields reveals similar trends, with higher percentages of
boys at the Not Achieved level (roughly twice as many boys as girls) and higher percentages of
girls performing at the Achieved and Advanced levels (Table 3.2.3). Approximately equal percentages
of girls and boys are functioning at the Partially Achieved level in Gaza and Lebanon, while in the
West Bank there are approximately 5% more boys and in Jordan approximately 4% more girls.
Table 3.2.3 - Performance Levels (%) by Gender and Field

Field        Not Achieved     Partially Achieved     Achieved         Advanced
             Boys   Girls     Boys   Girls           Boys   Girls     Boys   Girls
Gaza         23.6   10.4      51.8   50.8            19.1   30.7      5.5    8.2
Lebanon      29.4   12.0      48.0   49.8            19.4   32.8      3.2    5.5
Jordan       33.5   18.3      45.3   49.1            17.2   25.7      4.0    6.9
West Bank    25.2   12.4      42.3   37.6            22.5   35.2      10.1   14.8
All Fields   27.4   13.2      48.3   48.0            18.9   30.0      5.4    8.8
d) PERFORMANCE BY CONTENT DOMAINS
Further analysis was also conducted to determine how students performed across the specific
sub-domains that constitute the Mathematics curriculum: Numbers and Operations, Algebra and
Patterns, Geometry, and Statistics and Data. As noted in Figure 3.2.4, performance across the
UNRWA Fields was generally low, with average scores ranging from 32% to 43%. The highest mean
score (43%) was recorded for Numbers and Operations, followed by Geometry, while mean scores for
Algebra and Patterns and for Statistics and Data were similar.
Figure 3.2.4 - Mean Scores: Content Domain for all Fields
A review of performance across the four Fields for each content domain also reveals generally low
levels of performance, with all mean scores below 50% (Table 3.2.4). In every Field, students
obtained their highest scores for Numbers and Operations. The lowest scores were recorded for
Algebra and Patterns in all Fields except Gaza, where the lowest scores were for Statistics and
Data.
Table 3.2.4 - Mean Scores: Content Domain by Field
Field        Number & Operations    Algebra & Patterns    Geometry    Statistics & Data
Mean se Mean se Mean se Mean se
Gaza 44.01 0.24 35.53 0.31 37.14 0.21 32.38 0.37
Lebanon 46.27 0.61 24.14 0.62 36.91 0.49 27.68 0.86
Jordan 38.83 0.31 26.36 0.35 38.16 0.26 31.66 0.47
West Bank 49.67 0.52 34.12 0.55 40.09 0.39 37.28 0.73
Total 43.43 0.17 31.56 0.21 37.85 0.14 32.46 0.26
In terms of gender differences, girls obtained higher scores in all four content domains, with mean
score differences of 9% for Numbers and Operations, 8% for Algebra and Patterns, 4% for Geometry
and 12% for Statistics and Data (Table 3.2.5).
Table 3.2.5 - Mean Scores: Content Domain by Gender
Gender       Number & Operations    Algebra & Patterns    Geometry    Statistics & Data
Mean se Mean se Mean se Mean se
Boys 38.46 0.25 27.45 0.29 35.97 0.22 26.1 0.37
Girls 47.59 0.24 35.77 0.3 39.58 0.19 38.49 0.36
Total 43.27 0.18 31.83 0.21 37.87 0.15 32.62 0.27
A review of the mean scores by content domain and performance level reveals a number of trends.
Across all four content domains, students performing at the Advanced level obtain mean scores that
are several times higher than those of students at the Not Achieved level, and at least 15% higher
than those of students at the next level down, i.e. Achieved (Figure 3.2.5). A comparison across
the content domains reveals that students at the Not Achieved level obtained their highest mean
score for Geometry and their lowest for Statistics and Data, which was extremely low at only 2%.
In contrast, students at the Partially Achieved level obtained their highest scores for Numbers and
Operations followed by Geometry, with similar mean scores for Statistics and Data and for Algebra
and Patterns. At the Achieved level, students also scored highest for Numbers and Operations,
followed by Statistics and Data, Geometry, and Algebra and Patterns, with similar performance
trends displayed by students at the Advanced level.
Figure 3.2.5 - Mean Scores: Content Domain by Performance Levels
Further analysis was conducted to determine trends in mean scores across and within the four
Fields. Trends similar to the UNRWA-wide results were noted, with students at the Advanced level
obtaining the highest mean scores for Statistics and Data and for Numbers and Operations, and the
lowest scores for Geometry (Figure 3.2.6). Across all four Fields, students at the Not Achieved
level obtained markedly low scores for Statistics and Data, ranging from 1.90% (Gaza) to 2.65%
(Jordan), and scored highest in Geometry (approximately 18%).
Figure 3.2.6 - Mean Scores: Content Domains by Field
At the Partially Achieved level, however, students obtained the highest mean scores in
Numbers and Operations (approximately 38%) and the lowest scores in Statistics and Data. For
students at the Achieved level, the highest scores were for Numbers and Operations, while
mean scores for Algebra and Patterns ranged from 39% (Lebanon) to 52% (Gaza) and scores
for Geometry ranged from 49% to 53%. In contrast to the Not Achieved students, across the
four Fields students at the Advanced level obtained the lowest mean scores for Geometry, with
similarly high scores for Statistics and Data and Numbers and Operations.
Table 3.2.6 - Content Domain by Performance Levels and Field
                       Not Achieved      Partially Achieved   Achieved          Advanced
                       Mean      se      Mean      se         Mean      se      Mean      se
Gaza
Number & Operations 15.17 0.20 38.42 0.16 63.75 0.23 84.33 0.37
Algebra & Patterns 16.25 0.51 28.19 0.35 52.05 0.56 77.18 0.93
Geometry 17.48 0.32 33.39 0.21 49.90 0.30 66.71 0.54
Statistics & Data 1.90 0.22 22.48 0.38 58.37 0.62 86.00 0.85
Lebanon
Number & Operations 15.92 0.46 41.57 0.48 69.27 0.51 85.85 0.88
Algebra & Patterns 10.64 0.92 17.60 0.68 38.66 1.21 60.80 2.19
Geometry 18.35 0.75 34.89 0.55 49.24 0.74 61.63 1.64
Statistics & Data 2.30 0.57 17.98 0.89 53.46 1.56 81.07 2.34
2013 Monitoring of Learning Achievement
48
Jordan
Number & Operations 14.31 0.19 36.27 0.22 62.68 0.33 83.25 0.53
Algebra & Patterns 12.04 0.45 21.57 0.43 44.57 0.71 64.12 1.36
Geometry 19.97 0.33 37.80 0.28 53.24 0.41 68.10 0.77
Statistics & Data 2.65 0.27 27.13 0.54 62.58 0.87 86.69 1.24
West Bank
Number & Operations 14.34 0.36 39.38 0.38 66.96 0.39 88.00 0.41
Algebra & Patterns 12.29 0.83 25.16 0.70 44.56 0.86 66.29 1.23
Geometry 18.31 0.60 34.46 0.43 49.12 0.47 65.34 0.78
Statistics & Data 2.09 0.40 22.87 0.82 56.09 1.07 84.41 1.18
All Fields
Number & Operations 14.79 0.12 38.16 0.12 64.51 0.17 85.12 0.24
Algebra & Patterns 13.62 0.30 24.94 0.24 47.58 0.38 70.28 0.65
Geometry 18.64 0.21 34.96 0.15 50.59 0.21 66.34 0.38
Statistics & Data 2.25 0.15 23.53 0.28 58.66 0.44 85.42 0.59
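The size of the gap between the Advanced and Not Achieved groups can be checked directly from the All Fields rows of Table 3.2.6. The following sketch is illustrative only (the values are copied from the table; the script is not part of the report's methodology):

```python
# Advanced vs Not Achieved mean scores per Content Domain
# ("All Fields" rows of Table 3.2.6); illustrative only.
all_fields = {
    # domain: (Not Achieved mean, Advanced mean)
    "Number & Operations": (14.79, 85.12),
    "Algebra & Patterns":  (13.62, 70.28),
    "Geometry":            (18.64, 66.34),
    "Statistics & Data":   (2.25,  85.42),
}
for domain, (not_achieved, advanced) in all_fields.items():
    ratio = advanced / not_achieved
    print(f"{domain}: {advanced:.2f} / {not_achieved:.2f} = {ratio:.1f}x")
```

The ratio is about 3.6x for Geometry and 5-6x for Number and Operations and Algebra and Patterns, while for Statistics and Data it exceeds 30x, because the Not Achieved mean there is only 2.25%.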
e) PERFORMANCE BY COGNITIVE LEVELS
The assessment instruments also included items developed to address three levels of cognitive
demand: Knowing, Applying and Reasoning, performance on which is reported in Table 3.2.7.
The mean scores were highest for “Knowing”, followed by Applying and Reasoning.
Table 3.2.7 - Mean Scores: Cognitive Level by Field
Field Knowing (%) Applying (%) Reasoning (%)
Mean se Mean se Mean se
Gaza 60.94 0.27 33.89 0.22 30.90 0.25
Lebanon 50.54 0.62 38.89 0.54 30.99 0.62
Jordan 53.36 0.35 31.41 0.28 30.23 0.32
West Bank 61.05 0.52 40.24 0.47 35.02 0.50
Total 57.76 0.19 34.46 0.16 31.29 0.18
A number of Higher Order Thinking Skills (HOTS) items were also included in the Grade 4 Maths
test. However, these HOTS items were included as pilot items, given that the current curriculum
did not focus on this level of cognition. The intended use of the results for the HOTS items was
to provide specific input for improving student skills in higher order thinking and problem
solving as part of the education reform process. As noted in Figure 3.2.7, mean scores across
the four Fields on the pilot HOTS items were extremely low, ranging from approximately 10% to 14%.
Figure 3.2.7 - Mean Scores: Performance on Pilot-HOTS Items by Field
Comparison of student performance by Gender revealed trends similar to the UNRWA-wide
picture: both boys and girls scored highest for Knowing, followed by Applying, Reasoning and
HOTS. Moreover, across all four Cognitive Levels, girls obtained higher scores than boys
(Table 3.2.8).
Table 3.2.8 - Mean Scores: Cognitive Level by Gender
Knowing (%) Applying (%) Reasoning (%) HOTS (%)
Mean se Mean se Mean se Mean se
Boys 53.19 0.29 30.32 0.23 27.82 0.26 8.67 0.16
Girls 62.34 0.26 37.79 0.22 34.35 0.25 12.69 0.18
Total 58.01 0.20 34.25 0.16 31.26 0.18 10.79 0.12
Figure 3.2.8 presents student mean scores by Cognitive Level and Performance Level. Across
the four performance levels, students obtained their highest scores for Knowing and their
lowest for the HOTS items. At the Not Achieved and Partially Achieved levels, however, mean
scores were similar for Applying and Reasoning, while at the Achieved and Advanced levels
students obtained higher mean scores for Applying than for Reasoning.
Figure 3.2.8 - Mean Scores: Cognitive Level by Performance Levels
A comparison of mean scores across the Fields indicates varying performance across the four
Cognitive Levels. For Knowing, students in Gaza and West Bank obtained higher scores than
students in Jordan and Lebanon, while scores for Applying were similar for Gaza and Jordan
and similar for Lebanon and West Bank. For Reasoning, West Bank students obtained the
highest mean (35%), with similar mean scores for the three other Fields, while for the HOTS
items the West Bank also obtained the highest mean (14%).
Table 3.2.9 - Mean Scores: Cognitive Level by Field
Field Knowing (%) Applying (%) Reasoning (%) HOTS (%)
Mean se Mean se Mean se Mean se
Gaza 60.94 0.27 33.89 0.22 30.90 0.25 9.63 0.16
Lebanon 50.54 0.62 38.89 0.54 30.99 0.62 12.09 0.42
Jordan 53.36 0.35 31.41 0.28 30.23 0.32 10.76 0.22
West Bank 61.05 0.52 40.24 0.47 35.02 0.50 14.31 0.37
All Fields 57.76 0.19 34.46 0.16 31.29 0.18 10.85 0.12
Analysis of mean scores by Performance Level and Field for the different Cognitive Levels
revealed that students functioning at the Not Achieved level obtained their highest scores for
Knowing, similar scores for Applying and Reasoning, and the lowest for HOTS. At the three
other performance levels (Partially Achieved, Achieved and Advanced), the general trend was
highest mean scores in Knowing, followed by Applying, Reasoning and HOTS.
Table 3.2.10 - Mean Scores: Cognitive Level by Performance Standards and Field
               Not Achieved      Partially Achieved   Achieved          Advanced          All Fields
               Mean      se      Mean      se         Mean      se      Mean      se      Mean      se
Gaza
Knowing (%) 24.43 0.35 59.24 0.25 80.44 0.27 91.81 0.38 60.94 0.27
Applying (%) 11.27 0.18 26.77 0.14 52.13 0.23 75.75 0.40 33.89 0.22
Reasoning (%) 12.59 0.35 24.60 0.24 45.47 0.44 69.54 0.72 30.90 0.25
Pilot HOTS (%) 2.13 0.14 5.50 0.13 15.59 0.33 37.07 0.96 9.63 0.16
Lebanon
Knowing (%) 22.41 0.69 47.29 0.61 70.58 0.83 83.67 1.53 50.54 0.62
Applying (%) 13.09 0.44 34.46 0.43 58.65 0.46 75.31 1.00 38.89 0.54
Reasoning (%) 11.14 0.77 24.59 0.61 49.21 1.03 71.30 1.78 30.99 0.62
Pilot HOTS (%) 2.87 0.35 7.84 0.39 20.25 0.87 43.21 2.50 12.09 0.42
Jordan
Knowing (%) 24.01 0.34 54.69 0.32 76.71 0.39 88.93 0.56 53.36 0.35
Applying (%) 11.42 0.17 27.65 0.19 52.73 0.30 74.78 0.54 31.41 0.28
Reasoning (%) 12.00 0.34 27.10 0.35 49.02 0.56 69.69 1.03 30.23 0.32
Pilot HOTS (%) 2.35 0.14 7.46 0.20 19.92 0.51 43.25 1.30 10.76 0.22
West Bank
Knowing (%) 22.32 0.62 56.25 0.50 77.11 0.47 90.05 0.51 61.05 0.52
Applying (%) 11.58 0.33 29.13 0.33 55.13 0.38 77.73 0.50 40.24 0.47
Reasoning (%) 11.54 0.65 25.28 0.53 46.10 0.68 70.28 0.94 35.02 0.50
Pilot HOTS (%) 1.53 0.20 7.06 0.30 19.62 0.57 41.15 1.23 14.31 0.37
All Fields
Knowing (%) 23.84 0.22 56.50 0.18 78.00 0.20 90.15 0.28 57.76 0.19
Applying (%) 11.52 0.11 27.98 0.11 53.39 0.16 76.01 0.26 34.46 0.16
Reasoning (%) 12.10 0.22 25.43 0.18 46.86 0.29 69.88 0.48 31.29 0.18
Pilot HOTS (%) 2.20 0.09 6.47 0.10 17.85 0.24 39.95 0.64 10.85 0.12
f) SUMMARY: GRADE 4 MATHEMATICS
A review of Grade 4 student performance across UNRWA in Mathematics reveals the following
trends:
Performance was generally low across all four Fields, with approximately a third of Grade 4
students performing at or above the Achieved grade level, approximately two thirds
performing at the Partially Achieved level, and low percentages of students at the Not
Achieved and Advanced levels.
In all fields performance across the four Mathematics Content Domains was also generally
low, the average scores ranging from 32 to 43%, the highest for Number and Operations
(43%), followed by Geometry, while mean scores for Algebra and Patterns and Statistics
and Data were similar.
Results for all Fields indicate varying performance across the four Cognitive Levels. Across
all four performance standards, students scored highest for “Knowing” and obtained
broadly similar scores for “Applying” and “Reasoning”. Of concern, however, are the
extremely low scores for the HOTS items: even students functioning at the Advanced level
obtained a mean score of only 40% on the HOTS items.
Across all UNRWA Fields, the overall mean scores, as well as the scores by Content Domain
and Cognitive Level, were higher for girls than for boys, with significant differences
recorded when comparing across the different Fields, domains and levels.
3.3. GRADE 8 ARABIC
a) PERFORMANCE LEVELS BY FIELD
The mean score for all students across the four Fields on the Grade 8 Arabic assessment is 52%,
with scores ranging from 47% in Jordan to 56% in Gaza (Table 3.3.1). In terms of the
performance standards, a total of 57% of students achieved at or above the Achieved grade
level, that is, 37% at the Achieved level and 19% at the Advanced level (figures rounded).
There were 44% of students below the Achieved level, that is, 34% at the Partially Achieved
level and 9% at the Not Achieved level (Figure 3.3.2).
The histogram at Figure 3.3.1, which displays students' mean percentage scores, is slightly
positively skewed, consistent with the mean of 52% and with 44% of students achieving below
the Achieved grade level.
Table 3.3.1 – Mean Scores by Field
Field Mean N se
Gaza 55.98 5288 0.28
Lebanon 49.40 899 0.53
Jordan 46.82 3710 0.35
West Bank 50.66 2255 0.46
All fields 51.71 12152 0.19
Figure 3.3.1 – Distribution of Student Scores by Performance Level
A detailed review of the performance levels (Figure 3.3.2) indicates that two of the four Fields
(Gaza and the West Bank) have the largest proportions of their students at the Advanced level,
while Lebanon and Jordan have smaller proportions at Advanced. Gaza and the West Bank had
39% and 36% respectively at the Achieved level; Lebanon had more students at Achieved, with
41%. Lebanon has quite a different distribution of students compared with the other Fields:
it has the lowest proportions at Advanced and Not Achieved, but the largest proportions at
Partially Achieved and Achieved. To keep a perspective on the relative strengths of Gaza, the
West Bank and Lebanon, however, the latter had just 8% at Advanced. Over the four Fields,
between 5% and 14% of students were functioning at the Not Achieved level.
Figure 3.3.2 - Performance Standards by Field
b) PERFORMANCE LEVELS BY GENDER AND FIELD
Differences in performance in favour of girls across all schools were also noted in terms of the
performance standards (Figure 3.3.3). At the lower levels (i.e. Not Achieved and Partially
Achieved) there are 61% of boys and 29% of girls; at the higher levels (i.e. Achieved and
Advanced) there are 39% of boys and 71% of girls. There are significantly fewer girls than boys
at the lower levels and significantly more girls than boys at the higher levels.
Table 3.3.2 - Mean Scores by Gender and Field
Field Boys Girls
Mean se Mean se
Gaza 47.76 0.39 64.03 0.34
Lebanon 42.29 0.79 54.37 0.62
Jordan 37.10 0.47 55.40 0.42
West Bank 38.04 0.73 57.81 0.51
Total 42.62 0.27 59.33 0.23
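The report describes these gender gaps as significant. As an illustration of how such a comparison can be checked from the published means and standard errors, the sketch below computes an approximate two-sample z-statistic per Field from Table 3.3.2. It assumes independent samples and approximate normality; it is not the report's own procedure, which is not stated.

```python
# Approximate z-statistic for the girl-boy gap per Field, using the
# means and standard errors of Table 3.3.2. Illustrative only.
rows = {
    #            (boys mean, se), (girls mean, se)
    "Gaza":      ((47.76, 0.39), (64.03, 0.34)),
    "Lebanon":   ((42.29, 0.79), (54.37, 0.62)),
    "Jordan":    ((37.10, 0.47), (55.40, 0.42)),
    "West Bank": ((38.04, 0.73), (57.81, 0.51)),
}
for field, ((mb, seb), (mg, seg)) in rows.items():
    gap = mg - mb
    z = gap / (seb**2 + seg**2) ** 0.5   # SE of the difference
    print(f"{field}: gap = {gap:.2f} points, z = {z:.1f}")
```

All four z-values come out far above the conventional 1.96 cutoff, consistent with the report's description of the gaps as significant.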
Reviews of gender performance across the Fields for all schools (Table 3.3.3) reveal similar
outcomes. There are greater proportions of boys than girls at the Not Achieved and Partially
Achieved levels, and greater proportions of girls than boys at the Achieved and Advanced
levels. Indeed, in every cell of Table 3.3.3 girls have outperformed boys; there are no
exceptions to this pattern.
Figure 3.3.3 - Performance Levels for all Schools
Lebanon has the smallest proportion of students at Not Achieved but also the smallest
proportion at Advanced. West Bank has the greatest proportion at Not Achieved (marginally
ahead of Jordan). Lebanon has the greatest proportion at Partially Achieved (mainly because of
the large proportion of girls). The proportions at Achieved are relatively uniform, with Gaza and
Lebanon marginally ahead. Gaza has the greatest proportion at Advanced.
Looking at variations from average scores by gender: at Not Achieved, West Bank boys are
furthest below the boys' average; at Partially Achieved, Lebanon boys are above the boys'
average; at Achieved, Lebanon girls are above the girls' average; and at Advanced, Gaza girls
are above the girls' average. The most variation by gender is at Partially Achieved and
Advanced; the least is at Not Achieved and Achieved. Across Table 3.3.3 as a whole, West Bank
shows the least variation from the mean scores and Lebanon the most.
Table 3.3.3 - Performance Levels by Gender and Field
Field        Not Achieved      Partially Achieved   Achieved          Advanced
             Boys     Girls    Boys     Girls       Boys     Girls    Boys     Girls
Gaza 9.87 1.42 42.45 19.57 34.53 42.87 13.15 36.14
Lebanon 9.46 1.32 60.81 37.62 27.57 49.53 2.16 11.53
Jordan 23.68 4.77 46.15 29.75 25.92 44.16 4.35 21.32
West Bank 23.93 4.34 42.09 26.81 26.50 40.56 7.48 28.40
All Fields 16.25 3.02 44.78 25.62 30.18 43.28 8.79 28.07
c) PERFORMANCE BY CONTENT DOMAINS
A review of performance for Fields by Content Domain (Figure 3.3.4) shows uniform levels of
performance generally, with mean scores in the range of 43% for Reading to 58% for Dictation.
By Field, Lebanon shows the lowest average with 33.94% on Writing. West Bank has the highest
average with 60.39% on Dictation.
Figure 3.3.4 - Mean Scores: Content Domain for all Fields
Across the four content domains West Bank ranks first or second for all domains except
Reading (Table 3.3.4). Gaza is slightly ahead of West Bank in Grammar, Writing and Reading
while West Bank is slightly ahead of Gaza in Dictation. Jordan and Lebanon rank third or fourth
for all domains except Reading. Lebanon is slightly ahead in Grammar and Writing; Jordan is
slightly ahead in Dictation. Reading shows a different ranking with Lebanon first and West Bank
fourth.
Table 3.3.4 - Mean Scores: Content Domain by Field
Field Dictation Grammar Reading Writing
Mean se Mean se Mean se Mean se
Gaza 59.39 0.31 56.62 0.32 45.56 0.30 53.45 0.43
Lebanon 50.59 0.60 50.25 0.59 46.64 0.67 33.94 0.81
Jordan 56.38 0.41 45.49 0.37 41.32 0.35 39.12 0.50
West Bank 60.39 0.52 52.50 0.53 40.43 0.46 42.97 0.65
All Fields 58.01 0.21 51.99 0.21 43.39 0.20 45.69 0.28
As noted previously, Figure 3.3.5 shows that Dictation and Grammar have the highest mean
scores within each of the four Fields, while Reading has the lowest mean score. Jordan and
Lebanon record their lowest mean scores for Writing. The mean for Writing recorded by
Lebanon is the lowest of all means, being 19.5 percentage points below the highest mean for
Writing. Gaza and West Bank, the highest-ranking Fields over the content domains, record
their lowest means for Reading.
Figure 3.3.5 - Mean Scores: Content Domains by Field
Table 3.3.5 - Mean Scores: Content Domain by Gender
Domain Boys Girls All Fields
Mean se Mean se Mean se
Dictation 49.46 0.31 65.17 0.26 58.01 0.21
Grammar 44.52 0.30 58.25 0.28 51.99 0.21
Reading 35.44 0.28 50.06 0.25 43.39 0.20
Writing 32.67 0.40 56.60 0.35 45.69 0.28
A comparison of performance across the Content Domains (Figure 3.3.6) shows that students
at the Not Achieved level obtained their highest mean scores for Dictation and Grammar, and
their lowest for Writing, which at an average of 2.5% is well below their performance in the
other domains and well below the Writing performance at the other three performance levels.
Writing is also the content area with the lowest mean for the Partially Achieved group, but this
pattern is not replicated for the Achieved or Advanced groups: at Advanced, for example, the
mean for Writing is within two percentage points of Grammar and Dictation. Reading shows
the narrowest range of mean scores, mainly because the mean for the Advanced group is
considerably lower than their mean scores in the other content areas. Reading ranks below
Writing at Achieved and Advanced.
The low levels of average performance by students at the Not Achieved level stand out and will
be of concern; these performance levels average around 12% over the four domains. It may
also be of concern that the mean scores for Reading at Achieved and Advanced are so far below
the mean scores for Dictation and Grammar.
Figure 3.3.6 - Mean Scores: Content Domain by Performance Levels
At the Not Achieved and Partially Achieved levels, the lowest mean scores were achieved in
Writing (Table 3.3.6) with Lebanon and West Bank below the mean at Not Achieved, and
Lebanon and Jordan below the mean at the Partially Achieved level. At the other two
performance levels, Writing typically ranks second or third except for Lebanon where it is
fourth at Advanced, and for Gaza where it is first at Achieved and Advanced. At all four
performance levels, Gaza ranks first in Writing.
Dictation and Grammar are the content domains with the highest mean scores at all
performance levels. This is true also for all Fields at Not Achieved and Partially Achieved, but
there are three exceptions at Achieved and Advanced: at Achieved, Writing is ranked second
for Gaza and Jordan; at Advanced, Writing is ranked first for Gaza. The key literacy skill of
Reading is ranked third for all Fields at the Not Achieved and Partially Achieved levels, and
fourth at the Achieved and Advanced levels, except for Lebanon, where Reading is third at
Achieved and second at Advanced.
Table 3.3.6 - Content Domain by Performance Levels and Field
                       Not Achieved      Partially Achieved   Achieved          Advanced
                       Mean      se      Mean      se         Mean      se      Mean      se
Gaza
Dictation 18.32 0.8 41.6 0.42 61.52 0.31 79.74 0.33
Grammar 18.59 0.63 36.33 0.35 56.65 0.32 81.44 0.3
Reading 12.07 0.6 27.65 0.4 45.89 0.34 67.07 0.35
Writing 3.85 0.54 26.06 0.57 56.68 0.5 82.38 0.41
Lebanon
Dictation 18.46 2.62 39.87 0.77 56.58 0.63 73.08 1.11
Grammar 19.74 1.74 38.76 0.55 55.75 0.64 77.52 1.15
Reading 15.78 2.63 33.45 0.8 53.46 0.69 74.22 1.17
Writing 1.39 0.7 16.98 0.83 42.26 0.88 69.07 1.87
Jordan
Dictation 18.6 0.61 46.01 0.46 67.41 0.39 85.35 0.47
Grammar 16.58 0.45 34.31 0.35 52.64 0.38 78.3 0.5
Reading 12.65 0.45 30.76 0.41 50.49 0.39 67.29 0.53
Writing 2.05 0.26 20.76 0.56 53.55 0.57 76.76 0.73
West Bank
Dictation 17.83 0.86 46.45 0.62 68.2 0.49 85.75 0.48
Grammar 17.72 0.59 34.29 0.49 58.4 0.54 82.49 0.54
Reading 11.16 0.57 26.51 0.5 44.97 0.51 64.6 0.61
Writing 1.88 0.34 21.54 0.72 50.59 0.76 77.29 0.79
All Fields
Dictation 18.34 0.42 43.81 0.26 64 0.21 81.81 0.24
Grammar 17.47 0.31 35.5 0.21 55.69 0.21 80.87 0.23
Reading 12.23 0.31 29.07 0.24 47.78 0.22 66.88 0.26
Writing 2.46 0.21 22.57 0.33 53.36 0.32 79.82 0.33
d) PERFORMANCE BY COGNITIVE LEVELS
Only Gaza scored above the mean for All Fields at all four levels of cognitive skill at Table
3.3.7. This mirrors the mean scores by Field at Table 3.3.1, and is consistent with
Figure 3.3.2, which shows Gaza as having the greatest proportion of students at Advanced.
Lebanon is above the mean for All Fields for Knowing and Extra HOTS. West Bank is above the
mean for All Fields for Knowing. The cognitive-level results for Lebanon and West Bank are
also consistent with Table 3.3.1 and Figure 3.3.2.
At Table 3.3.8 girls are consistently above the mean for boys on all four cognitive levels. The
greatest differences in favour of girls are for HOTS and Extra HOTS, where the mean for girls
is more than 20 percentage points above the mean for boys. Taking the achievement and
cognitive data together further emphasises the stronger performance of girls over boys on the
outcomes assessed in this study, a conclusion in line with studies of achievement in many
other international settings.
The mean scores for All Fields are relatively uniform at Table 3.3.7, ranging from 42.65 to
51.84, with the greatest mean score recorded for Applying. Gaza is above the mean for All
Fields on all four levels, as it was at Table 3.3.4 (mean scores by content domain). Lebanon is
above the All Fields mean on two levels, and records the greatest score of any Field in two
skills, namely Knowing and Extra HOTS; West Bank is above the All Fields mean for Knowing
only. Jordan is below the mean for All Fields on all four levels, as it was at Table 3.3.4.
Table 3.3.7 - Mean Scores: Cognitive Level by Field
Knowing Applying HOTS Extra HOTS
Mean se Mean se Mean se Mean se
Gaza 53.68 0.29 55.52 0.29 48.69 0.38 51.77 0.43
Lebanon 55.96 0.66 47.54 0.53 35.18 0.77 55.35 0.88
Jordan 44.72 0.35 47.95 0.35 36.58 0.45 45.40 0.52
West Bank 51.15 0.48 51.32 0.48 41.44 0.59 47.67 0.68
All Fields 50.64 0.20 51.84 0.19 42.65 0.25 49.33 0.28
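The "above/below the All Fields mean" statements in the two preceding paragraphs can be read straight off Table 3.3.7. A minimal sketch, with values copied from the table (illustrative only):

```python
# Which Fields sit above the All Fields mean at each cognitive level?
# Values from Table 3.3.7.
table = {
    "Knowing":    {"Gaza": 53.68, "Lebanon": 55.96, "Jordan": 44.72,
                   "West Bank": 51.15, "All Fields": 50.64},
    "Applying":   {"Gaza": 55.52, "Lebanon": 47.54, "Jordan": 47.95,
                   "West Bank": 51.32, "All Fields": 51.84},
    "HOTS":       {"Gaza": 48.69, "Lebanon": 35.18, "Jordan": 36.58,
                   "West Bank": 41.44, "All Fields": 42.65},
    "Extra HOTS": {"Gaza": 51.77, "Lebanon": 55.35, "Jordan": 45.40,
                   "West Bank": 47.67, "All Fields": 49.33},
}
fields = ("Gaza", "Lebanon", "Jordan", "West Bank")
for level, scores in table.items():
    above = [f for f in fields if scores[f] > scores["All Fields"]]
    print(f"{level}: above All Fields mean -> {above}")
```

Only Gaza appears at every level; Lebanon appears for Knowing and Extra HOTS, West Bank for Knowing only, and Jordan never.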
At Table 3.3.8 the mean scores for girls are greater than the mean scores for boys at all four
cognitive levels, the difference in favour of girls averaging 17%. The greatest difference, of
22%, was for Extra HOTS; the least, of 12%, was for Knowing.
Table 3.3.8 - Mean Scores: Cognitive Level by Gender
Boys Girls All Fields
Mean se Mean se Mean se
Knowing 44.07 0.29 56.15 0.25 50.64 0.2
Applying 43.25 0.27 59.04 0.24 51.84 0.19
HOTS 31.3 0.35 52.16 0.32 42.65 0.25
Extra HOTS 37.21 0.41 59.48 0.35 49.33 0.28
Mean scores on the Pilot HOTS items range from 45% for Jordan to 55% for Lebanon, with an
All Fields mean of 49%. Lebanon has the highest mean; Table 3.3.7 likewise recorded Lebanon
as having the greatest score on the Extra HOTS items.
Figure 3.3.7 - Mean Scores: Pilot HOTS Items by Field
The stronger performance of girls seen in earlier tables and figures is demonstrated again at
Figure 3.3.8. Mean scores for girls have a range of 3%; mean scores for boys have a range of
12%. Means for boys in Jordan and West Bank are markedly lower than all other mean scores.
Figure 3.3.8 - Mean Scores: Pilot HOTS Items by Gender and Field
Figure 3.3.9, which shows mean scores by cognitive level and performance level, indicates a
clear and marked linear relationship. Mean scores hover around 20% for Not Achieved students and
progress a little above 75% for Advanced students. At the lower levels (Not Achieved and
Partially Achieved) Knowing shows the greatest mean scores. At the upper levels (Achieved and
Advanced) the mean scores for Understanding and Pilot HOTS items are the highest.
Understanding is ahead of Pilot HOTS at Advanced, but at Achieved these positions are
reversed.
Figure 3.3.9 - Mean Scores: Cognitive Levels by Performance Levels
Table 3.3.9 provides additional detail by Field for the data illustrated in Figure 3.3.9. Knowing
has the greatest mean score for all Fields at Not Achieved and Partially Achieved, except for
Jordan at Partially Achieved, where Understanding is marginally ahead; this difference is not
significant, however. At Achieved, Understanding has the greatest mean scores for Gaza and
the West Bank, while for Lebanon and Jordan Pilot HOTS is superior. At Advanced,
Understanding has the greatest mean scores for all Fields except Lebanon, where again Pilot
HOTS is superior. It may be worth noting that Lebanon has the greatest mean scores for Pilot
HOTS, but not for HOTS, at all performance levels.
Table 3.3.9 – Mean Scores: Cognitive Level by Performance Standards and Field
               Not Achieved      Partially Achieved   Achieved          Advanced          Total
               Mean      se      Mean      se         Mean      se      Mean      se      Mean      se
Gaza
Knowing (%) 22.83 0.91 39.53 0.47 54.2 0.35 71.11 0.34 53.68 0.29
Understanding (%) 15.37 0.4 35.14 0.25 56.71 0.21 79.25 0.23 55.51 0.29
HOTS (%) 3.85 0.38 22.55 0.42 49.86 0.37 78.42 0.36 48.68 0.38
Pilot HOTS (%) 5.45 0.7 27.63 0.66 54.91 0.55 77.33 0.45 51.76 0.43
Lebanon
Knowing (%) 27.41 2.81 45.32 0.95 62.08 0.75 76.42 1.15 55.96 0.66
Understanding (%) 15.89 1.18 35.42 0.42 53.69 0.43 74.5 0.76 47.54 0.53
HOTS (%) 3.21 0.78 18.45 0.76 42.93 0.79 72.05 1.2 35.18 0.77
Pilot HOTS (%) 10.95 2.13 38.73 1.24 65.63 0.89 83.65 1.36 55.35 0.88
Jordan
Knowing (%) 21.75 0.68 36.1 0.48 50.84 0.44 69.08 0.59 44.72 0.35
Understanding (%) 14.47 0.33 36.35 0.26 57.76 0.24 79.00 0.33 47.95 0.35
HOTS (%) 2.87 0.24 19.32 0.41 48.71 0.46 74.6 0.64 36.58 0.45
Pilot HOTS (%) 3.82 0.38 30.77 0.7 60.05 0.61 78.15 0.73 45.4 0.52
West Bank
Knowing (%) 21.78 0.94 38.38 0.66 55.61 0.57 74.07 0.58 51.14 0.48
Understanding (%) 14.52 0.44 34.55 0.35 57.73 0.34 79.66 0.37 51.31 0.48
HOTS (%) 2.30 0.27 20.41 0.58 48.61 0.57 75.01 0.63 41.43 0.59
Extra HOTS (%) 3.68 0.59 28.66 0.94 57.33 0.84 76.99 0.83 47.65 0.68
All Fields
Knowing (%) 22.2 0.46 38.7 0.29 54.17 0.24 71.43 0.26 50.64 0.2
Understanding (%) 14.76 0.22 35.48 0.15 56.93 0.14 79.12 0.17 51.84 0.19
HOTS (%) 3.00 0.17 20.67 0.25 48.66 0.25 76.78 0.28 42.64 0.25
Pilot HOTS (%) 4.42 0.3 29.95 0.41 57.84 0.35 77.64 0.34 49.32 0.28
e) SUMMARY: GRADE 8 ARABIC
A review of Grade 8 student performance across UNRWA Fields in Arabic reveals the following:
Performance was generally uniform across all four Fields, with mean scores varying from
47% to 56%. Just over one-half of students performed at or above the Achieved grade
level; about one-tenth performed at the Not Achieved level; a little more than one-third
were at the Partially Achieved level; and one-fifth were at the Advanced level.
Across the content domains, performance was generally uniform with mean scores in the
range of 43% to 58%. Dictation was highest with 58%, Grammar was 52%, Writing was
46% and Reading was lowest at 43%.
The mean scores for Reading are well below the means for Dictation and Grammar at all
performance levels and below Writing as well at Achieved and Advanced.
By Field, Lebanon shows the lowest average, with 34% on Writing. West Bank has the
highest average, with 60% on Dictation. Jordan and Lebanon record their lowest mean
scores for Writing.
Gaza and West Bank which are the highest ranking fields over the content domains
record their lowest mean scores for Reading.
Students at Partially Achieved and Not Achieved did best on Grammar and Dictation, but
their performance on Reading and Writing showed a noticeable decline. At these lower
levels, the key skills of Reading and Writing showed the lowest proficiency.
Results for all Fields indicate moderate performance across the four Cognitive levels,
with mean scores of 52% for Applying, 51% for Knowing, 49% for Extra HOTS and 43% for
HOTS. Students scored highest for Knowing at Not Achieved and Partially Achieved. At
Advanced, Understanding was highest, but at Achieved, Extra HOTS was marginally ahead
of Understanding.
Taking the achievement and cognitive data in terms of mean scores by gender, girls
obtained higher scores in all four Content Domains, with mean-score differences over
boys of 14%-16% for Dictation, Grammar and Reading and about 24% for Writing. There are
significantly fewer girls at the lower performance levels and significantly more girls at the
higher performance levels.
The greatest difference between mean scores for boys and girls is for the cognitive skill
classified as HOTS, which presumably represents higher-level skills than Knowing and
Understanding.
3.4 GRADE 8 MATHEMATICS
a) PERFORMANCE LEVELS BY FIELD
The mean score obtained by UNRWA Grade 8 Maths students across the different Fields is 35%,
with minimal differences between the Fields, ranging from 33% in Jordan to 36% in the West
Bank (Table 3.4.1).
Table 3.4.1 - Mean Scores by Field
Field Mean N se
Gaza 35.24 5358 0.23
Lebanon 34.80 837 0.51
Jordan 33.32 3454 0.28
West Bank 35.93 2337 0.39
All fields 34.79 11986 0.16
A review of the Performance Standards indicates that approximately half of all students across
the four Fields are functioning at the Partially Achieved level, and between 5 and 9% at the Not
Achieved level (Figure 3.4.1). Approximately a third of students were performing at the
Achieved Grade 8 level, while about 12% of students from Gaza, Lebanon and Jordan, and 17%
from the West Bank, were functioning at the Advanced level. Figure 3.4.2 displays the
distribution of student mean scores as well as the different performance levels within which
students fall.
Figure 3.4.1 - Performance Standards by Field
Figure 3.4.2 - Distribution of Student Scores by Performance Level
b) PERFORMANCE LEVELS BY GENDER
A review of disaggregated performance indicates that Girls generally scored significantly higher
than Boys, even though were relatively low. In all Fields besides Lebanon, Girls obtained higher
scores (Table 3.4.2).
Table 3.4.2 - Mean Scores by Gender and Field
Field        Boys              Girls
             Mean      se      Mean      se
Gaza         31.96     0.30    38.45     0.33
Lebanon      33.60     0.99    34.23     0.71
Jordan       29.09     0.38    36.99     0.40
West Bank    29.48     0.54    39.92     0.51
All Fields   30.77     0.21    38.12     0.22
A review of gender differences across the different Performance Standards reveals a higher
percentage of boys at the lower levels (Partially Achieved and Not Achieved) and a higher
percentage of girls at the higher levels (Achieved and Advanced).
Figure 3.4.3 - Performance Levels for All Schools
Disaggregated analysis by Field and Performance Standards revealed similar patterns in Gaza,
Jordan and West Bank, with higher percentages of boys functioning at the Partially Achieved
and Not Achieved levels and higher percentages of girls at the Achieved and Advanced levels.
For Lebanon, however, while a higher percentage of boys were functioning at the Not Achieved
level, a slightly higher percentage of girls (by about 4 percentage points) were functioning at
the Partially Achieved level, with similar percentages noted at the Achieved and Advanced
levels.
Table 3.4.3 - Performance Levels by Gender and Field (%)
Field        Not Achieved     Partially Achieved   Achieved         Advanced
             Boys     Girls   Boys     Girls       Boys     Girls   Boys     Girls
Gaza 8.7 3.9 54.7 45.2 28.5 34.5 8.1 16.5
Lebanon 8.6 4.2 48.3 52.5 31.7 32.4 11.4 10.9
Jordan 11.9 4.6 58.7 45.4 22.0 33.4 7.4 16.7
West Bank 14.5 5.5 54.4 42.3 22.6 29.7 8.5 22.5
All Fields 10.6 4.5 55.5 45.1 25.8 33.0 8.1 17.5
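The gender pattern in Table 3.4.3 (more boys at the lower performance levels, more girls at the higher levels) can be seen directly in the All Fields row. A minimal sketch, with values copied from the table (illustrative only):

```python
# Girl-boy differences (percentage points) at each performance level,
# All Fields row of Table 3.4.3; negative = more boys, positive = more girls.
levels = ["Not Achieved", "Partially Achieved", "Achieved", "Advanced"]
boys  = [10.6, 55.5, 25.8, 8.1]
girls = [4.5, 45.1, 33.0, 17.5]
for level, b, g in zip(levels, boys, girls):
    print(f"{level}: girls - boys = {g - b:+.1f} points")
```

The differences are negative at Not Achieved and Partially Achieved and positive at Achieved and Advanced, matching the pattern described in the text.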
c) PERFORMANCE BY CONTENT DOMAINS
A review of student performance across the different Grade 8 Mathematics content domains
reveals generally low levels of performance, with mean scores ranging from 36% to 41% (Figure
3.4.4); the higher mean scores were obtained for Algebra and Statistics.
Figure 3.4.4 - Mean Scores: Content Domain for all Fields
Further analysis reveals similarly low levels of performance across the four Fields and five
content domain areas (Table 3.4.4). Students in Lebanon obtained significantly higher mean
scores for Numbers and Algebra and significantly lower scores for Proportionality and Statistics.
Table 3.4.4 – Mean Scores: Content Domain by Field
Field        Number & Operations   Algebra & Patterns   Proportionality   Geometry        Statistics & Data
             Mean      se          Mean      se         Mean      se      Mean      se    Mean      se
Gaza 35.81 0.25 37.7 0.3 36.19 0.39 38.62 0.29 41.04 0.37
Lebanon 43.06 0.7 48.59 0.79 30.67 0.95 34.12 0.62 26.72 0.83
Jordan 36.86 0.36 44.36 0.44 35.19 0.48 31.02 0.3 41.46 0.48
West Bank 36.77 0.44 38.81 0.5 37.13 0.59 39.09 0.47 43.42 0.59
All Fields 36.81 0.18 40.6 0.22 35.7 0.26 36.21 0.19 40.62 0.25
Further analysis of student performance by Gender and Content Domain indicates that girls
obtained significantly higher scores than boys across four of the five domain areas. For
Proportionality, the scores were essentially the same.
Table 3.4.5 - Mean Scores: Content Domain by Gender
Content Domain Boys Girls
Mean se Mean se
Number & Operations 33.05 0.24 39.74 0.27
Algebra & Patterns 35.25 0.30 44.69 0.31
Proportionality 35.93 0.40 35.72 0.35
Geometry 32.38 0.27 39.45 0.26
Statistics and Data 34.74 0.37 45.76 0.35
A review of the results presented in Figure 3.4.5 indicates that students at the Not Achieved
level obtained their highest mean score for Proportionality, while students at the Advanced level
obtained their lowest mean score for Proportionality. Students at the Partially Achieved level
obtained very similar mean scores across all five content domain areas, while students
performing at the Achieved and Advanced levels scored highest for Algebra. The specific reasons
for these results are not clear, and require further investigation to determine their meaning and
value for supporting learning and teaching in UNRWA schools.
Figure 3.4.5 - Mean Scores: Content Domain by Performance Levels
A review of mean scores for the different content domain areas across the four Fields highlights
a number of interesting results (Figure 3.4.6). While student scores for Gaza and West Bank are
similar across the content domains, scores for Lebanon and Jordan varied widely. Students in
Lebanon obtained their highest mean scores for Algebra and Patterns, significantly higher than
in the other three Fields, and their lowest scores for Statistics, significantly lower than in the
other three Fields. In Jordan, the highest scores were likewise recorded for Algebra and
Patterns and the lowest for Geometry.
Figure 3.4.6 - Mean Scores: Content Domains by Field
Additional analysis was also conducted to identify specific strengths and weaknesses of
students functioning at the different Performance Levels across the four Fields in each of the
content domains (Table 3.4.6). In Gaza, students at the Not Achieved level obtained their
highest scores for Proportionality (16%) and their lowest for Statistics (9%), while at the
Partially Achieved level scores were similar across all the content domains (approximately 27%).
Students at the Achieved level scored highest for Statistics (55%) and lowest for Number and
Operations (42%), while at the Advanced level, the highest scores were obtained for Geometry
(75%) and the lowest for Number and Operations (66%). In Lebanon, similar trends were noted
across the four performance levels, with the highest scores obtained for Algebra and Patterns
and the lowest for Statistics: mean scores ranged from 7% to 17% at the Not Achieved level and
from 16% to 37% at the Partially Achieved level. At the Achieved level, scores ranged from 36%
to 61%, with students also obtaining a low score of 36% for Proportionality, while at the
Advanced level scores ranged from 55% to 79%.
In Jordan, students at the Not Achieved level also obtained their lowest scores for Statistics
(9%) and highest for Proportionality (19%), while students at the Partially Achieved level scored
lowest in Geometry (23%) and highest in Algebra and Patterns (32%). Similarly, students at the
Achieved level scored lowest in Geometry (38%) and highest in Algebra and Patterns (59%),
while at the Advanced level, students obtained their lowest score in Proportionality (56%) and
their highest in Algebra and Patterns (83%). For students at the Not Achieved level in West Bank, the
lowest and highest scores were also recorded for Statistics (10%) and Proportionality (18%),
while students at the Partially Achieved level scored highest in Statistics (31%) and obtained
similarly low scores (approximately 26%) in Number and Operations and Algebra and Patterns.
Students at the Achieved level scored highest in Statistics (57%) and lowest for Proportionality
(42%), while at the Advanced level, students scored highest in Geometry (75%) and lowest for
Proportionality (60%).
Table 3.4.6 - Content Domain by Performance Levels and Field
                       Not Achieved    Partially Achieved   Achieved        Advanced
                       Mean    se      Mean    se           Mean    se      Mean    se
Gaza
Number & Operations 14.37 0.50 26.99 0.22 42.34 0.31 65.71 0.61
Algebra & Patterns 11.16 0.55 26.84 0.27 47.63 0.36 69.74 0.62
Proportionality 15.92 1.13 28.20 0.49 43.18 0.67 60.91 0.91
Geometry 11.08 0.41 27.15 0.21 47.99 0.31 75.03 0.51
Statistics and Data 8.52 0.61 29.08 0.43 54.88 0.54 70.58 0.73
Lebanon
Number & Operations 14.78 1.15 32.56 0.64 53.04 0.84 72.01 1.31
Algebra & Patterns 17.17 1.66 36.60 0.81 60.65 0.93 79.41 1.33
Proportionality 13.04 2.83 22.81 1.16 36.00 1.60 56.77 2.76
Geometry 12.44 0.88 24.62 0.53 41.49 0.76 63.26 1.30
Statistics and Data 6.72 1.25 15.56 0.84 36.31 1.36 55.25 2.12
Jordan
Number & Operations 13.55 0.52 26.77 0.27 45.80 0.44 73.85 0.70
Algebra & Patterns 12.49 0.64 31.85 0.4 59.40 0.56 83.23 0.65
Proportionality 18.89 1.35 30.23 0.61 39.88 0.92 55.82 1.43
Geometry 10.73 0.43 22.88 0.23 38.25 0.37 61.81 0.77
Statistics and Data 8.54 0.62 30.28 0.52 57.05 0.71 74.20 0.85
West Bank
Number & Operations 13.05 0.52 25.79 0.34 43.26 0.52 68.94 0.78
Algebra & Patterns 12.3 0.69 25.63 0.41 48.13 0.63 73.97 0.84
Proportionality 18.18 1.5 29.14 0.78 42.53 1.08 60.35 1.26
Geometry 11.38 0.48 26.23 0.33 48.05 0.5 74.58 0.7
Statistics and Data 9.77 0.82 31.23 0.7 57.06 0.89 72.84 0.89
All Fields
Number & Operations 13.81 0.29 27.09 0.15 44.27 0.22 69.11 0.39
Algebra & Patterns 12.18 0.35 28.79 0.2 51.92 0.28 75.05 0.41
Proportionality 17.26 0.73 28.6 0.33 41.61 0.47 59.14 0.65
Geometry 11.11 0.24 25.53 0.14 44.86 0.22 70.62 0.38
Statistics and Data 8.73 0.37 28.89 0.29 54.42 0.38 71.15 0.47
d) PERFORMANCE BY COGNITIVE LEVELS
Items in the Grade 8 Mathematics Instruments were also intended to assess three levels of
cognitive functioning: Knowing, Applying and Reasoning. As reported in Table 3.4.7, the highest
scores were recorded for Knowing (55%) followed by Applying (33%) and Reasoning (26%).
Across the four Fields, similar performance trends were also noted.
Table 3.4.7 - Mean Scores: Cognitive Level by Field
Field Knowing (%) Applying (%) Reasoning (%)
Mean se Mean se Mean se
Gaza 55.52 0.29 33.26 0.24 25.79 0.27
Lebanon 56.36 0.66 33.06 0.59 26.19 0.71
Jordan 52.57 0.36 32.70 0.31 26.16 0.35
West Bank 55.05 0.49 34.38 0.41 28.51 0.47
Total 54.64 0.2 33.30 0.17 26.46 0.19
A number of additional items assessing higher-order thinking skills were also included in the
instruments. These were included as pilot items, as the current curriculum does not focus on
this level of cognition. The intended use of these items was to obtain a baseline measure of
student performance at this cognitive level for use in improving teaching and learning with
regard to the development of higher-order thinking and problem-solving skills. As noted in
Figure 3.4.7, mean scores across the four Fields were extremely low, ranging from 9% to 12%.
Figure 3.4.7 - Mean Scores: Performance on Pilot-HOTS Items by Field
A review of cognitive level performance by gender reveals trends similar to the UNRWA-wide
results, with girls obtaining higher scores across all four cognitive levels, although the
differences were much smaller for the Pilot HOTS items (Table 3.4.8).
Table 3.4.8 - Mean Scores: Cognitive Level by Gender
Boys Girls Total
Mean se Mean se Mean se
Knowing (%) 50.39 0.29 58.09 0.27 54.54 0.2
Applying (%) 29.39 0.23 36.55 0.24 33.25 0.17
Reasoning (%) 21.83 0.25 30.19 0.27 26.34 0.19
HOTS (%) 9.67 0.14 12.39 0.14 11.14 0.1
Additional analysis was also conducted to compare student scores on the different Cognitive
Level items at the four Performance Levels. Similar patterns of performance were noted for
students at the different Performance Levels, with the highest mean scores recorded for
Knowing, followed by Applying and Reasoning, and the lowest for the Pilot HOTS items. A
particularly concerning result is that even students functioning at the Advanced level obtained
extremely low scores for the Pilot HOTS items (i.e. a mean of 24%).
Figure 3.4.8 - Mean Scores: Cognitive Levels by Performance Levels
A comparison of scores by Field and Cognitive Level for students functioning at each of the four
Performance Levels reveals trends similar to those of the UNRWA-wide results, that is: (a)
generally low performance at all cognitive levels; and (b) the highest mean scores for Knowing,
with extremely low mean scores for the Pilot HOTS items. Students at the Not Achieved level
obtained mean scores of 21% for Knowing and 4% for Pilot HOTS, while the corresponding mean
scores for students at the Advanced level were 85% and 23% respectively (Table 3.4.9).
Table 3.4.9 - Mean Scores: Cognitive Level by Performance Standards and Field
               Not Achieved    Partially Achieved   Achieved        Advanced        All Fields
               Mean    se      Mean    se           Mean    se      Mean    se      Mean    se
Gaza
Knowing (%) 21.39 0.56 44.1 0.26 68.08 0.29 86.88 0.35 55.52 0.29
Applying (%) 9.78 0.3 23.12 0.15 41.32 0.21 65.59 0.44 33.26 0.24
Reasoning (%) 4.64 0.33 15.64 0.22 33.24 0.33 58.6 0.71 25.79 0.27
Pilot HOTS (%) 4.06 0.33 7.73 0.15 13.71 0.22 24.67 0.58 11.48 0.15
Lebanon
Knowing (%) 27.97 1.46 46.3 0.6 66.74 0.79 82.77 0.99 56.41 0.66
Applying (%) 8.78 0.71 22.56 0.39 41.97 0.52 63.07 0.99 33.07 0.59
Reasoning (%) 4.31 0.79 15.6 0.63 34.53 0.97 56.89 1.61 26.18 0.71
Pilot HOTS (%) 2.85 0.72 6.46 0.33 11.95 0.58 17.34 1.03 9.38 0.31
Jordan
Knowing (%) 21.4 0.56 42.97 0.31 65.55 0.38 83.51 0.46 52.57 0.36
Applying (%) 9.88 0.31 22.76 0.19 42.12 0.28 67.68 0.57 32.70 0.31
Reasoning (%) 4.74 0.39 16.59 0.27 34.66 0.47 60.75 0.8 26.16 0.35
Pilot HOTS (%) 2.81 0.30 7.32 0.17 12.68 0.27 20.68 0.57 10.11 0.16
West Bank
Knowing (%) 19.68 0.65 42.29 0.41 68.75 0.51 86.8 0.48 55.05 0.49
Applying (%) 10.76 0.35 22.74 0.24 41.37 0.33 67.5 0.61 34.38 0.41
Reasoning (%) 5.76 0.48 16.44 0.37 35.28 0.59 62.73 0.93 28.51 0.47
Pilot HOTS (%) 3.47 0.35 8.11 0.23 14.32 0.34 25.81 0.68 12.41 0.23
All Fields
Knowing (%) 21.33 0.33 43.58 0.17 67.41 0.2 85.69 0.24 54.64 0.2
Applying (%) 10.00 0.18 22.9 0.1 41.6 0.14 66.48 0.29 33.3 0.17
Reasoning (%) 4.93 0.22 16.07 0.15 34.09 0.24 60.11 0.45 26.46 0.19
Pilot HOTS (%) 3.46 0.18 7.59 0.1 13.41 0.15 23.42 0.35 11.12 0.10
e) SUMMARY: GRADE 8 MATHEMATICS
A review of student performance for Grade 8 Mathematics indicated the following trends:
Generally low levels of performance, with approximately 40% of students functioning at
or above the Achieved grade level;
Performance across the five content domain areas was also generally low across all
Fields, with the highest scores noted for Algebra and Patterns and Statistics (approximately
40%) and similar scores for the other three areas (approximately 36%);
Comparison of performance across the cognitive domain reveals generally low scores
across the four Fields, even at the level of “Knowing”. A particular concern is the
extremely low performance on the Pilot HOTS items (overall mean of 11%);
Girls obtained higher scores than boys, a trend that was noted across all Fields, content
domains and cognitive levels;
4. CONCLUSION AND RECOMMENDATIONS
The Monitoring of Learning Achievement (MLA) survey was conducted to obtain information on
the performance of students and schools within the UNRWA system. Specifically, the purpose of
the MLA is to:
i) determine the performance of students, teachers and school heads;
ii) identify areas in need of intervention and support for improving learning and teaching; and
iii) serve as a baseline to monitor the effect of the Education Reform process.
This concluding section provides a brief overview of the methodology, a summary of the
results, implications for interventions, and next steps regarding the dissemination and reporting
of results. The section ends by highlighting key issues for consideration in the next MLA.
4.1 METHODOLOGY
The MLA survey was conducted in all schools with Grade 4 and 8 students. Arabic and
Mathematics tests, as well as student questionnaires, were administered to a randomly selected
sample of intact Grade 4 and 8 sections, and teacher and school head questionnaires were also
administered in each school. Both the Grade 4 and 8 Arabic and Mathematics tests comprised
two Forms, which were randomly administered to students in the selected sections. The tests
were constructed in collaboration with UNRWA subject specialists from each Field, and
comprised only items from the common curriculum across the four fields. The Arabic tests
comprised the following sub-domains: Dictation, Grammar, Reading and Writing, while the
Mathematics test comprised: Numbers and Operations, Algebra and Patterns, Proportionality
(only Grade 8), Geometry and Statistics and Data. Items in both subjects focussed on three
different cognitive levels: Knowledge, Understanding and Application. In addition, a number of
Higher Order Thinking Skills (HOTS) items were also included for pilot purposes. Given the need
to compare the 2013 MLA results to those of the 2009 MLA, a number of items from the 2009
tests were also included. All tests were pre-tested and piloted in order to select valid and
reliable items for the final instruments.
The data analysis for this study reflected the specific design of the MLA administration and was
intended to enhance the use of the information reported. An IRT scaling and equating process
was required to equate the different forms of the tests, as well as to compare results between
the 2009 and 2013 MLA surveys8. A standard setting process was undertaken to enhance the
reporting of scores, while descriptive analysis by Performance Level, Content Domain and
Cognitive Level was conducted to report scores disaggregated by Gender and Field. The standard
setting process allowed the performance of students to be reported against specific knowledge
and skills adopted in the curriculum, thereby providing additional detail for use in identifying
what students know and can do, as well as in identifying relevant interventions for addressing
student learning needs. Four levels of performance were identified:
8 Results of the equating are presented in a separate report.
Advanced – Indicates that students demonstrate a comprehensive understanding of the
knowledge and skills required to function at this grade level;
Achieved – Indicates that students demonstrate sufficient understanding of the knowledge and
skills required to function at this grade level;
Partially Achieved – Indicates that students demonstrate a partial understanding of the
knowledge and skills required to function at this grade level; and
Not Achieved – Indicates that students demonstrate little or no understanding of the
knowledge and skills required to function at this grade level.
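The IRT scaling and standard-setting steps described above can be sketched in miniature. The Rasch (one-parameter IRT) model below is one common choice for such scaling, used here purely for illustration; the ability values and cut scores are hypothetical, not the MLA's actual standards.

```python
import math

def rasch_probability(theta, b):
    """Probability that a student with ability theta answers an item of
    difficulty b correctly, under the Rasch (1PL IRT) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def performance_level(score, cuts):
    """Map a scaled score to one of the four MLA performance levels,
    given three ascending cut scores (hypothetical values)."""
    levels = ["Not Achieved", "Partially Achieved", "Achieved", "Advanced"]
    level = levels[0]
    for cut, name in zip(cuts, levels[1:]):
        if score >= cut:
            level = name
    return level

# Hypothetical cut scores on the logit scale:
cuts = [-1.0, 0.0, 1.5]
```

Under these hypothetical cuts, a student with an estimated ability of 0.6 logits would be reported at the Achieved level.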
4.2 MLA RESULTS
The results of the MLA survey revealed a number of important trends with respect to student
gender, as well as within and between Fields, regarding performance of students across the
different content domains and cognitive levels in Arabic and Mathematics. The key findings
emanating from the study are noted below.
A review of Grade 4 Arabic student performance reveals the following trends:
Performance of Grade 4 students was moderate across all four Fields, with mean scores
varying from 40% to 48%.
Just under 50% of students performed at or above the Achieved level; about 20%
performed at the Not Achieved level; approximately 34% at the Partially Achieved level,
and 5% were at the Advanced level.
Similarly, performance across the four Arabic Content Domains was also generally
moderate in all Fields.
Students who performed only at Partially Achieved and Not Achieved levels did best on
Grammar and Dictation but their performance on Reading and Writing showed a
noticeable decline.
Results for all Fields also indicate moderate performance across the three Cognitive Levels,
with averages of 58% for Understanding, 54% for Knowing, and 50% for HOTS.
Girls obtained higher scores in all four Content Domains, and there were significantly fewer
girls at the lower performance levels and significantly more girls at the higher performance
levels.
A review of Grade 4 Mathematics performance reveals the following trends:
Performance was generally low across all four Fields, with mean scores ranging from 38 to
46%.
Approximately a third of Grade 4 students performed at or above the Achieved grade
level and approximately two thirds at the Partially Achieved level, with low percentages of
students at the Not Achieved and Advanced levels.
In all Fields, performance across the four Mathematics Content Domains was also generally
low, with the highest scores recorded for Number and Operations (43%), followed by Geometry,
while mean scores for Algebra and Patterns and Statistics and Data were similar.
Students scored highest for “Knowing” and obtained similar scores for “Applying” and
“Reasoning”. Of concern, however, are the extremely low scores for the HOTS items: even
students functioning at the Advanced level obtained a mean score of only 40% on the HOTS
items.
Across all UNRWA Fields, the overall mean scores of girls, as well as their scores by Content
Domain and Cognitive Level, were higher than the mean scores for boys.
A review of performance trends for Grade 8 Arabic students reveals that:
Performance was generally uniform across all four Fields with mean scores varying from
47% to 56%.
Just over 50% of students performed at or above the Achieved grade level; about 10%
performed at the Not Achieved level; a little more than 34% were at the Partially
Achieved level, and 20% were at the Advanced level.
The mean scores for Reading are well below the means for Dictation and Grammar at all
performance levels, and also below those for Writing at the Achieved and Advanced levels.
Students at the Partially Achieved and Not Achieved levels did best on Grammar and
Dictation, but their performance on Reading and Writing showed a noticeable decline.
Across all Fields, girls obtained higher scores in all four Content Domains. There were
significantly fewer girls at the lower performance levels and significantly more girls at the
higher performance levels.
Student performance for Grade 8 Mathematics reveals the following trends:
Generally low levels of performance, with only approximately 40% of students
functioning at the Achieved level or above;
Performance across the five content domain areas was also generally low across all
Fields, with the highest scores noted for Algebra and Patterns and Statistics (approximately
40%) and similar scores for the other three areas (approximately 36%);
Comparison of performance across the cognitive domain reveals generally low scores
across the four Fields, even at the level of “Knowing”. A particular concern is the
extremely low performance on the Pilot HOTS items (overall mean of 11%);
Girls obtained higher scores than boys, a trend that was noted across all Fields, content
domains and cognitive levels.
4.3 IMPLICATIONS OF MLA FINDINGS
The findings of the MLA survey have a number of implications for the development of relevant
intervention strategies to improve learning and teaching in UNRWA schools. In developing this
strategy, the following key findings need to be taken into account:
For both Grade levels, moderate performance was recorded in Arabic, with lower
levels of performance in Mathematics;
Performance was higher at the Knowledge / recall cognitive levels, and significantly lower at
the more demanding cognitive levels. Of concern are the extremely low mean scores at the
HOTS level, even for students performing at the Advanced level;
In terms of content domains, performance trends varied across the subject areas and grade
levels as well as across the four Fields, attesting to the need for Field-specific
interventions;
The performance of girls was significantly higher than that of boys at both Grade levels, in
both subjects, and across all four Fields.
In developing interventions to address the key challenges noted, it is also important to consider
the current Education Reform process, and to integrate any proposed interventions into current
systems and programmes. In addition, the specific factors that impact on learning and teaching
in schools across the different Fields, as well as the current capacity and skills to effectively
deliver the interventions, must also be addressed to ensure improvements in learning and
teaching in all schools.
4.4 DISSEMINATION AND REPORTING
The dissemination and reporting of the MLA results should be conducted within the overall
framework of the Education Reform process, and should take place at the Field, Area and
school levels, with the key objective being to provide relevant information to all stakeholders
for use in improving learning and teaching in schools. Effective dissemination processes
should include Field workshops with subject specialists, and Area workshops with school heads
and relevant teachers. In addition, specific reports for each school should be compiled and
distributed to school heads, with key advice on how the information should be interpreted and
used for improving learning and teaching.
4.5 RECOMMENDATIONS FOR FUTURE MLA STUDIES
Conducting large-scale assessment surveys is a major undertaking within any education system,
requiring significant resources and relevant capacity and skills to be effectively implemented,
and for results to be relevant and timeously reported for use in enhancing the quality of
education provided. Within this context, a number of issues should be considered for improving
future MLA surveys:
Test instruments should be reduced from the current two forms to a single form, for both
grade levels and subject areas, so as to reduce the complexity of analysis as well as the
costs and time for printing, administration, entry and coding;
Given that the 2013 test instruments were kept secure, a large number of the same items
should be included in further studies, so as to enhance the reliability and validity of
comparing (equating) scores across the two testing periods;
A similar process of enhancing involvement of Fields in the instrument development,
administration, scoring, coding and entry process should be maintained. However, more
time should be spent on ensuring greater standardisation of the different processes
applied, especially with regard to data cleaning.
In order to reduce the turnaround time for the completion of reports, there is a need to
expand the team involved in the analysis and writing process. Setting clear deadlines for
completion, based on specific analysis and writing workshops, will go a long way in this
regard.
A review of the current MLA reporting and dissemination process should be instituted,
based on consultations with Field staff, subject specialists, school heads and teachers, in
order to identify areas for improvement in future MLA studies.
REFERENCES
Baker, F. B. (2001), The Basics of Item Response Theory. ERIC Clearinghouse on Assessment and Evaluation, University of Maryland. http://edres.org/irt/baker/ (Accessed 13 January 2004)
Bond, T. G. and Fox, C. M. (2007), Applying the Rasch Model: Fundamental Measurement in the Human Sciences, 2nd Ed. New York: Routledge
Dorans, N. J., Moses, T. P. and Eignor, D. R. (2010), Principles and Practices of Test Score Equating. ETS Research Report RR-10-29
Hambleton, R. K., Swaminathan, H. and Rogers, H. J. (1991), Fundamentals of Item Response Theory. Newbury Park, CA: Sage
Kolen, M. J. and Brennan, R. L. (2004), Test Equating, Scaling, and Linking. New York, NY: Springer-Verlag
Linacre, J. M. (2012), Winsteps® Rasch Measurement Program User’s Guide. Beaverton, Oregon: Winsteps.com
Reise, S. T., Ainsworth, A. T. and Haviland, M. G. (2005), Item Response Theory: Fundamentals, applications and promise in psychological research. Current Directions in Psychological Science, 14: 95-101
Universalia (2010), The Evolving Nature of UNRWA Schools. Universalia Report, April 2010
UNRWA (2006), UNRWA’s Organizational Development Plan 2006-09: Serving Palestine Refugees More Effectively: Strengthening the Management Capacity of UNRWA
UNRWA (2009), Monitoring Learning Achievement in UNRWA Schools – Baseline Survey 2009. Amman: UNRWA
UNRWA (2010), Education Reform Strategy 2011–2015. Amman: UNRWA
UNRWA (2012), UNRWA Framework for Analysis and Quality: Implementation of the Curriculum. Amman: UNRWA
ANNEX
TABLE A) – THE COMMON STANDARDS FOR GRADE 4 MATHS
1) Numbers and Operations
The Standard
Read and write numbers within seven digits
Recognize and identify the place value of any digit in a number of seven digits
Compare and order numbers of seven digits
Write numbers within seven digits in the expanded form
Continue a numerical pattern
Add and subtract two numbers within seven digits at most, horizontally and vertically
Find the missing number in an open sentence that contains addition and subtraction notation
Solve problems using addition and subtraction
Multiply a number of two or three digits by a one- or two-digit number
Multiply a number of three digits at most by multiples of 10, 100 to 900
Solve problems using multiplication strategies
Divide a number by a one-digit number with or without remainder
Solve problems using division strategies
Identify fractions, numerators and denominators, and represent them in geometrical shapes and vice versa
Compare and arrange fractions which have the same numerators or denominators in ascending or descending order
Add and subtract two fractions with the same denominator
Solve addition and subtraction problems of fractions with the same denominator
Identify and recognize decimal numbers and write them in symbolic form
Identify the place value of the tenths and hundredths digits
Compare and arrange decimal numbers with tenths and/or hundredths digits in ascending or descending order
Round a decimal number to the nearest whole number
Convert a decimal number to a fraction, and convert a fraction whose denominator is a factor of 10 to a decimal
2) Algebra: Grade 4
The Standard
Recognize even and odd numbers
Recognize the factors and the multiples of any number which is less than or equal to 100
Solve an open sentence that contains the four basic arithmetic operations
Continue a given pattern of numbers or geometrical shapes
3) Statistics and data analysis: Grade 4
The Standard
Read data represented by tables and bars
Solve problems related to interpretation of data
4) Geometry: Grade 4
The Standard
Recognize the angle, name it, and identify its type from the drawing
Recognize and categorize quadrilaterals into squares and rectangles
Recognize length measurements and convert between the different units
Compare two lengths of two different units
Estimate the length of a line segment or a given shape
Recognize the time measurement units and convert between them (hours to minutes and minutes to seconds)
Read a clock
Compare and order different time intervals of different units
Add and subtract two time intervals represented by hours and minutes
Solve problems of adding and subtracting time intervals
Recognize squares and rectangles and identify that a square is a special kind of rectangle
Find the area of a rectangle or a square on a grid
Find the area of a highlighted region on a grid
Estimate the area of a highlighted region on a grid
TABLE B) – THE COMMON STANDARDS FOR GRADE 8 MATHEMATICS
1) Numbers and Operations
The Standard
Recognize the additive inverse of a number and find it
Carry out the four arithmetic operations on integers
Apply addition and multiplication properties to integers
Compare and order integers
Convert between fractions and mixed numbers
Recognize a rational number and write it in the simplest decimal form
Find the place value of a digit in a decimal number
Carry out the arithmetic operations on rational numbers: addition, subtraction, multiplication, and division
Compare and order rational numbers in ascending/descending order
Find the square root of a rational number or estimate its value
Recognize and use index laws with integral or zero indices
Use index laws to write an algebraic expression raised to any power in its simplest form
2) Geometry: Grade 8
The Standard
Classify angles according to their types
Recognize corresponding, alternate, complementary, and vertical angles, and use related generalizations in solving problems
Estimate the measure of a drawn angle in degrees
Define parallel, intersecting, and perpendicular lines
Classify triangles according to their angles and sides
Recognize and apply the properties of the isosceles triangle in solving problems
Recognize and apply the rule: “The sum of the angle measures of any triangle is 180°”
Recognize the concept of congruent triangles
Recognize and apply the Pythagorean theorem and its converse
Solve mathematical problems using the Pythagorean theorem
Use the following theorem to solve problems: “The length of the line segment from the vertex of a right triangle to the midpoint of the hypotenuse equals half the length of the hypotenuse”
Apply the rule: “The sum of the angle measures of the quadrilateral is 360°”
Recognize the parallelogram and use its properties in solving mathematical problems
Recognize the special kinds of parallelogram (rhombus, rectangle, and square)
Recognize and use the properties of the rhombus, rectangle, and square in solving geometrical problems
Solve problems using the perimeter of geometrical shapes (parallelogram, rhombus, rectangle, and square)
Recognize the concept of the circle and its elements
Find the area of the following shapes: triangle, trapezoid, parallelogram, rhombus, rectangle, and square
Find the area of a compound shape made up of embedded simple shapes
Employ the rules for the areas of geometrical shapes in solving problems
3) Ratio and proportion: Grade 8
The Standard
Convert between a fraction or decimal and a percent
Solve a specific proportion
Apply proportions in problem solving
Recognize the concept of percent and use it in solving problems
4) Algebra: Grade 8
The Standard
Find the prime factors of a number
Find the greatest common factor and the least common multiple of two or more numbers
Solve a linear equation in one variable
Recognize the value that represents a solution of an equation
Solve problems using linear equations
Find the value of an algebraic expression given the values of its variables
Continue a numerical or geometrical pattern
5) Statistics, probability, and data analysis: Grade 8
The Standard
Represent data by bars / circular sectors
Interpret data represented by bars, sectors or graphs
Solve problems related to data interpretation
Evaluate the mean of given data and solve related problems
TABLE C) – COMMON STANDARDS FOR GRADE 4 ARABIC
Content of the 1st component: Reading Comprehension
Cognitive Domain         No.  Outputs                                                    Gaza  WB  Jor.  Leb.  Syria
Comprehension            1.   Identify antonyms from the text                            √ √ √ √
Cognitive                2.   Identify the meaning of the word from the context          √ √ √ √
Application              3.   Use words in meaningful sentences                          √ √ √ √
Higher-thinking skills   4.   Identify homophone words                                   √ √ √ √
Comprehension            5.   Identify the odd word from a number of words               √ √ √ √
Comprehension            6.   Differentiate between good attributes and bad attributes   √ √ √ √
Comprehension            7.   Conclude the main theme of the text                        √ √ √ √
Comprehension            8.   Conclude the sub-themes from the text                      √ √ √ √
Higher-thinking skills   9.   Suggest another title for the text                         √ √ √ √
Content of the 2nd component: Exercises / Language structures and styles
Cognitive domain No. Outputs Gaza WB Jor. Leb Syria
Application | 12. Write the appropriate plural of each singular word | √ √ √ √
Comprehension | 13. Fill in the blanks with the suitable subject | √ √ X √
Application | 14. Write the base form of the verb and the imperative using the past form of the verb | √ √ √ √
Application | 15. Fill in the blanks with the suitable question word | √ √ √ √
Application | 16. Derive the subject and object noun from the base form of the verb | √ √ √ √
Comprehension | 17. Derive the unconnected pronouns from given sentences | √ √ √ √
Application | 18. Derive the dual and plural verbs from the past form of the verb | √ √ √ √
Comprehension | 19. Derive the singular form from the plural form | √ √ √ √
Application | 20. Apply the verb "to be" to a nominal clause | √ √ X √
Comprehension | 21. Identify masculine adjectives | √ √ √ √
Application | 22. Use the appropriate demonstrative pronouns in the blanks | √ √ √ √
Higher-thinking skills | 1. State his/her opinion in certain situations | √ √ √ √
Comprehension | 2. Complete the sentences meaningfully from the text | √ √ √ √
Comprehension | 3. Conclude the main character from the text | √ √ X √
Recite | 4. Identify the quotation's owner | √ √ √ √
Comprehension | 5. Describe a behaviour he/she likes from the text | √ √ √ √
Comprehension | 6. Find a proverb/wisdom mentioned in the text | √ √ X √
Comprehension | 7. Identify the main event among the minor events | √ √ X √
Higher-thinking skills | 8. Justify an event in the text | √ √ √ √
Higher-thinking skills | 9. Clarify a fact mentioned in the text | √ √ √ √
Comprehension | 10. [illegible in source] the text | √ √ √ √
Comprehension | 11. Explain a certain expression | √ √ √ √
Comprehension | 23. Identify the affirmative and negative statements | √ √ √ √
Higher-thinking skills | 24. Classify the sentence into categories (subject / verb / object) | √ √ √ √
Application | 25. Derive the feminine statement from masculine ones and vice versa | √ √ √ √
Comprehension | 26. Find the connected pronouns in given statements | √ √ X √
Comprehension | 27. Fill in the blanks with the suitable unconnected pronouns | √ √ √ √
Application | 28. Derive the nationalities from the names of countries | √ √ X X
Higher-thinking skills | 29. Paraphrase a text using the feminine form | √ √ X √
Application | 30. Change a nominal clause into a verbal clause | √ √ X √
Application | 31. Apply the verb "to be" in a nominal clause | √ √ X X
Higher-thinking skills | 32. Classify words into singular, dual, and plural | √ √ √ √
Application | 33. Fill in the blanks with the suitable relative pronoun | √ √ √ √
Application | 34. Fill in the blanks with the suitable prepositions | √ √ √ √
Application | 35. Derive the "ing" adjective from verbs ending with a glottal stop | √ √ X X
Application | 36. Use the suitable negative form in sentences | √ √ √ √
Higher-thinking skills | 37. Derive questions from given statements | √ √ √ √
Application | 38. Replace the subject with a suitable unconnected pronoun | √ √ X X
Higher-thinking skills | 39. Differentiate between feminine words with and without feminine pronouns | √ √ X X
Application | 40. Apply demonstrative pronouns in statements (this / that / these / those) | √ √ √ √
Application | 41. Derive the dual and plural words from the singular | √ √ √ √
Comprehension | 42. Negate statements using the negative form (not) | √ √ √ √
Comprehension | 43. Use the exclamation in sentences | √ √ √ √
Application | 44. Derive the feminine form from given words | √ √ √ √
Comprehension | 45. Compare nominal and verbal clauses | √ √ √ √
Application | 46. Use the subject in meaningful sentences | √ √ √ √
Comprehension | 47. Differentiate between the verb, subject, and object | √ √ √ √
Application | 48. Derive the feminine form of a nominal clause using the verb "to be" | √ √ X X
Application | 49. Derive the dual form of words ending with a glottal stop | √ √ X X
Application | 50. Derive the present tense from the past form of the verb and verbs ending with a glottal stop, according to the example | √ √ X X
Comprehension | 51. Use prepositions (in / on / above) in meaningful sentences | √ √ √ √
Comprehension | 52. Change into an object phrase | √ √ X X
Comprehension | 53. Change into a subject phrase | √ √ X X
Comprehension | 54. Use particles (before / after / in front of / behind) in meaningful sentences | √ √ √ √
Comprehension | 55. Derive the relative, demonstrative, and unconnected feminine pronouns | √ √ √ √
Content of the 3rd component: Dictation
Cognitive domain | No. | Outputs | Gaza | WB | Jor. | Leb. | Syria
Higher-thinking skills | 56. Differentiate between different forms of glottal stops | √ √ √ √
Application | 57. Change active verbs ending with a glottal stop into the passive form | √ √ X X
Application | 58. Derive the correct dual and plural past form of the verb | √ √ X √
Application | 59. Derive the correct form of words ending with a glottal stop | √ √ √ √
Application | 60. Derive the correct form of words starting with a glottal stop | √ √ √ √
Application | 61. Complete the words using the appropriate feminine form of the letter (T) | √ √ √ √
Application | 62. Derive the correct form of verbs with a glottal stop in the middle | √ √ X √
Higher-thinking skills | 63. Identify the silent letter (A) in plural verbs | √ √ X √
Application | 64. Punctuate words ending with a glottal stop | √ √ X X
Application | 65. Write words ending with a glottal stop after the letter (A) correctly | √ √ X √
Application | 66. Write words ending with a glottal stop correctly | √ √ X X
Application | 67. Derive the plural past form of the verb | √ √ X √
Cognitive | 68. Write the accusative case of words ending with a glottal stop preceded by the letter (A) | √ √ X X
Cognitive | 69. Write words with a glottal stop in the middle and at the end correctly | √ √ √ √
Cognitive | 70. Write the middle glottal stop above the letter (W) correctly | √ √ X X
Cognitive | 71. Write the glottal stop at the end of words above the letter (W) | √ √ X X
Cognitive | 72. Write the middle glottal stop above the letter (A) correctly | √ √ √ √
Cognitive | 73. Identify punctuation marks (! / . / ؟ / : / ،) | √ √ √ √
Content of the 4th component: Composition
Cognitive domain | No. | Outputs | Gaza | WB | Jor. | Leb. | Syria
Higher-thinking skills | 74. Put the words in the correct order to form meaningful statements | √ √ √ √
Higher-thinking skills | 75. Put the sentences in the correct order to form a meaningful text | √ √ √ √
Higher-thinking skills | 76. Write no more than 3 lines about a specified subject (olive tree / family / fruit / human body / four seasons) | √ √ √ √
Higher-thinking skills | 77. Write a poster about a certain value | √ √ √ √
Higher-thinking skills | 78. Compose a text using a given picture | √ √ √ √
Application | 79. Rearrange sentences correctly to form a story | √ √ √ √
TABLE D) – COMMON STANDARDS FOR GRADE 8 ARABIC
Cognitive domain | No. | Outcomes | Gaza | WB | Jordan | Syria | Leb.
Comprehension | 1. Distinguish between the semantics of words | √ √ √ √
Comprehension | 2. Distinguish between the meanings of different sentences | √ √ √ √
Recall | 3. Identify the meanings of words | √ √ √ √
Recall | 4. Identify the antonyms of words | √ √ √ √
Application | 5. Use some words from the text in meaningful sentences | √ √ √ √
Comprehension | 6. Explain a certain situation | √ √ √ √
Comprehension | 7. Justify the reason for the occurrence of a certain phenomenon | √ √ √ √
HOTS | 8. Give an opinion on a certain issue | √ √ √ √
Comprehension | 9. Identify the meaning of some statements | √ √ √ √
HOTS | 10. Act out a certain situation from real life | √ √ √ √
HOTS | 11. Conclude certain qualities of the characters from the text | √ √ √ √
Comprehension | 12. Show the passion | √ √ √ √
Comprehension | 13. Illustrate the technical image / similarities | √ √ √ √
Comprehension | 14. Illustrate the main theme | √ √ √ √
Application | 15. Use certain structures in meaningful sentences | √ √ √ √
Comprehension | 16. Conclude the sub-themes from the text | √ √
HOTS | 17. Identify the usage of the symbol in a certain text | √ √ √ √
Comprehension | 18. Explain some verses and statements | √ √ √ √
Application | 19. Employ some meanings in new situations | √ √ √ √
Cognitive | 20. Identify the literary genres and their elements (prose, essay, story) | √ √ √ √
Comprehension | 21. Identify a paragraph or verses that correspond to a given statement | √ √ √ √
Comprehension | 22. Specify the meaning of some literary terms (Maqqamat, Muwashahat) | √ √ X √
HOTS | 23. Write three paragraphs about certain qualities | √ √ √ √
HOTS | 24. Write a speech for a certain occasion considering the elements of the main speech | √ √ X X
HOTS | 25. Summarize a certain text in no more than 10 lines | √ √ √ √
Application | 26. Write a situational composition (identity card) | √ √ X √
HOTS | 27. Write a letter according to its elements | √ √ X √
HOTS | 28. Write a composition of his own based on a given saying | √ √ √ √
HOTS | 29. Write a composition using given elements | √ √ √ √
Comprehension | 30. Distinguish a noun from a verb | √ √ √ √
Comprehension | 31. Identify the subject and the predicate | √ √ √ √
Comprehension | 32. Classify tenses according to their times | √ √ √ √
HOTS | 33. Identify the types of sentences | √ √ √ √
Application | 34. Build nominal sentences according to the predicate images | √ √ √ √
Comprehension | 35. Distinguish between the types of verbs based on transitiveness and intransitiveness | √ √ √ √
Comprehension | 36. Identify the subject, the verb, and the object | √ √ √ √
Comprehension | 37. Classify verbs into infixed and non-infixed verbs (Al-Sahih & Mu'tal) | √ √ √ √
Application | 38. Identify the syntax of the infixed verbs (Mu'tal) | √ √ √ √
Comprehension | 39. Distinguish between different types of nouns (Mamdodah, Maqsorah, Manqosah) | √ √ X √
Recall | 40. Recall the number of the consonants and the vowels | X X X X
HOTS | 41. Analyze the consonants and the vowels in words | √ √ X X
Recall | 42. Distinguish between the types of oral and dental sounds | √ √ X X
HOTS | 43. Identify the extra letters in given words | √ √ X √
Comprehension | 44. Identify the affixed and the base verbs | √ √ X √
Application | 45. Scale words according to the morphological scale | √ √ X √
Evaluation | 46. Change base forms into affixed verbs with one or two letters | √ √ X √
Application | 47. Derive the present participle from a three-letter verb | √ √ X √
Application | 48. Derive the present participle from a geminated three-letter verb | √ √ X √
Application | 49. Derive the present participle from an infixed three-letter verb | √ √ X √
Application | 50. Derive the present participle from a three-letter verb (called verb Naqis) | √ √ √ √
Application | 51. Derive the present participle from a non-three-letter verb | √ √ √ √
Application | 52. Derive the past participle from a three-letter verb | √ √ √ √
Application | 53. Derive the past participle from an infixed verb | √ √ √ √
Application | 54. Derive the past participle from a Naqis verb | √ √ √ √
Application | 55. Derive the past participle from a geminated three-letter verb | √ √ √ √
Application | 56. Derive the past participle from a non-three-letter verb | √ √ √ √
Application | 57. Identify the present participle and its verb | √ √ √ √
Evaluation | 58. Distinguish between present participles and past participles that are similar in form | √ √ √ √
Comprehension | 59. Conclude Al Sifah Al Mushabha | √ √ √ √
Application | 60. Derive the exaggeration form from given verbs | √ √ X √
Application | 61. Derive the superlative noun | √ √ X √
HOTS | 62. Construct the dual form with the needed changes | √ √ √ √
Comprehension | 63. Identify the Masculine Plural form | √ √ √ √
Comprehension | 64. Distinguish between the Masculine Plural form and the Plural of Taxeer in given statements | √ √ √ √
Comprehension | 65. Identify the Feminine Plural form | √ √ √ √
Application | 66. Pluralize words into the Feminine Plural form | √ √ √ √
Application | 67. Identify the syntactic feature of the parsing of the Five Nouns | √ √ X X
HOTS | 68. Conclude the morphologically non-derived nouns | √ √ X √
HOTS | 69. Insert the accusative letter, making the necessary changes in the sentence | √ √ √ √
Application | 70. Use the Five Nouns according to their position in the sentence | √ √ X √
Comprehension | 71. Identify the sign of the nominative case of the present tense | √ √ √ √
Comprehension | 72. Identify the Al-Majzoom verb and show its sign | √ √ √ √
Comprehension | 73. Identify the sign of the static noun | √ √ √ √
Comprehension | 74. Distinguish between static and dynamic verbs | √ √ √ √
Comprehension | 75. Identify the type of the static verbs | √ √ √ √
Comprehension | 76. Identify the sign of the past static verb | √ √ √ √
Comprehension | 77. Distinguish between LA Al-Nahiyya and LA Al-Naffiyah | √ √ √ √
HOTS | 78. Correct the grammatical mistakes | √ √ √ √
Comprehension | 79. Distinguish the glottal stop | √ √ √ √
Recall | 80. Recall the topics of the glottal stop | √ √ √ √
Comprehension | 81. Classify words starting with the two types of the glottal stop, with justification | √ √ √ √
Comprehension | 82. Identify the type of the vowel "WAW" in given sentences (plural WAW, Masculine Plural WAW, or the original WAW) | √ √ √ √
Comprehension | 83. Justify the writing of the letter "A" (أ) in different shapes | √ √ √ X
Comprehension | 84. Justify the writing of the letter "A" (أ) in other different shapes | √ √ √ √
Comprehension | 85. Distinguish between words where letters are written but not pronounced | √ √ √ √
Comprehension | 86. Distinguish between words where letters are pronounced but not written | √ √ √ √
Comprehension | 87. Justify the writing of the letter "A" in the shape of the Ya'a | √ √ √ √
Comprehension | 88. Justify the writing of the letter "A" in the shape of Al-Alif Al-Qa'imah | √ √ √ √
TABLE E) – SAMPLE OF SCHOOLS IN JORDAN FIELD
Area | No. | School Name | School Code | No. of students of 5th grade | No. of students of 9th grade | No. of sections of 5th grade | No. of sections of 9th grade | Class selected for piloting
N. Amman | 1. | Baqa'a Elem. B/S 2 | 54785 | 256 | - | 8 | - | 5th
N. Amman | 2. | Baqa'a Prep. G/S 2 | 54772 | 125 | 132 | 3 | 3 | 9th
N. Amman | 3. | Nuzha Prep. G/S 2 | 54171 | 85 | 75 | 2 | 2 | 5th
N. Amman | 4. | Hashemi Prep. B/S 2 | 54382 | 97 | 119 | 2 | 3 | 9th
S. Amman | 5. | ANC Elem. G2 | 53378 | 126 | 82 | 3 | 3 | 5th
S. Amman | 6. | ANC Prep. B1 | 53389 | 90 | 128 | 2 | 3 | 9th
S. Amman | 7. | Taibeh Prep. B1 | 53484 | 150 | 150 | 3 | 3 | 5th
S. Amman | 8. | Ashrafiyah Prep. G1 | 53176 | 34 | 68 | 1 | 2 | 9th
Zarqa | 9. | Russeifeh Prep. G3 | 55373 | 70 | 78 | 2 | 2 | 5th
Zarqa | 10. | Russeifeh Prep. B3 | 55383 | 121 | 81 | 3 | 2 | 5th
Zarqa | 11. | Zarqa Prep. G1 | 55170 | 78 | 81 | 2 | 2 | 9th
Zarqa | 12. | Zarqa Prep. B1 | 55181 | 81 | 90 | 2 | 2 | 9th
TABLE F) – SAMPLE OF SCHOOLS IN WEST BANK FIELD
Area | No. | School Name | School Code | No. of students of 5th grade | No. of students of 9th grade | No. of sections of 5th grade | No. of sections of 9th grade | Class selected for piloting
Nablus | 1. | Askar Elem. Boys | 66713 | 118 | - | 3 | - | 5th
Nablus | 2. | Balata Prep. Girls | 66710 | 155 | 177 | 4 | 5 | 9th
Nablus | 3. | Qalqilia Prep. Girls | 66778 | 121 | 134 | 3 | 4 | 5th
Nablus | 4. | Qalqilia Prep. Boys | 66747 | 88 | 107 | 2 | 3 | 9th
Jerusalem | 5. | Kalandia Prep. Girls | 64706 | 80 | 101 | 2 | 3 | 5th
Jerusalem | 6. | Kalandia Prep. Boys | 64705 | 87 | 80 | 3 | 2 | 9th
Jerusalem | 7. | Ama'ri Prep. Girls | 64710 | 79 | 92 | 2 | 3 | 5th
Jerusalem | 8. | Ramalla Prep. Girls | 64701 | 76 | 65 | 2 | 2 | 9th
Hebron | 9. | Hebron Prep. Girls | 65717 | 71 | 64 | 2 | 2 | 5th
Hebron | 10. | Hebron Prep. Boys | 65719 | 82 | 57 | 2 | 2 | 9th
Hebron | 11. | Arroub Prep. Boys | 65769 | 131 | 105 | 3 | 3 | 5th
Hebron | 12. | Aida Prep. Boys | 65745 | 55 | 62 | 2 | 2 | 9th
TABLE G) – SAMPLE OF SCHOOLS IN LEBANON FIELD
Area | No. | School Name | School Code | No. of students of 5th grade (B / G) | No. of students of 9th grade (B / G) | No. of sections of 5th grade | No. of sections of 9th grade | Class selected for piloting
Central Lebanon | 1. | RAMALLAH ELEM COED | 32752 | 62 / 57 | - / - | 3 | 0 | 5th
Central Lebanon | 2. | JERUSALEM PREP. BOYS | 32789 | 76 / 75 | - / - | 2 | 3 | 5th
Central Lebanon | 3. | JALLOUD PREP GIRLS | 32797 | - / - | - / - | 0 | 4 | 9th
Central Lebanon | 4. | JERUSALEM PREP BOYS | 32789 | 76 / - | - / 75 | 2 | 3 | 9th
Saida | 5. | PALESTINE MARTYRS PREP BOYS | 33724 | 61 / - | 55 / - | 2 | 2 | 5th
Saida | 6. | FALOUJA PREP GIRLS | 33728 | - / 85 | - / 41 | 2 | 2 | 5th
Saida | 7. | QIBYA PREP GIRLS | 33737 | - / 60 | - / 60 | 2 | 2 | 9th
Saida | 8. | Sammou' Prep. Boys | 33787 | 40 / - | 132 / - | 1 | 4 | 9th
Tyre | 9. | QADISIEH ELEM GIRLS | 34750 | - / 132 | - / - | 4 | 0 | 5th
Tyre | 10. | EIN EL ASSAL ELEM BOYS | 34766 | 105 / - | - / - | 3 | 0 | 5th
Tyre | 11. | PALESTINE PREP BOYS | 34766 | 84 / - | 59 / - | 3 | 2 | 9th
Tyre | 12. | JABALIA PREP GIRLS | 34770 | - / 103 | - / 80 | 3 | 3 | 9th
TABLE H) – SAMPLE OF SCHOOLS IN GAZA FIELD
Area | No. | School Name | School Code | No. of students of 5th grade | No. of students of 9th grade | No. of sections of 5th grade | No. of sections of 9th grade | Class selected for piloting
Gaza Beach | 1. | Beach Elem. Boys "B" | 22742 | 256 | - | 6 | - | 5th
Gaza Beach | 2. | Beach Prep. Girls "C" | 22758 | 211 | 207 | 5 | 5 | 9th
Gaza Beach | 3. | New Gaza Prep. Boys "C" | 22757 | - | 322 | - | 10 | 9th
Gaza Town | 4. | Gaza Elem. COED "B" | 23740 | 97 | - | 2 | - | 5th
Gaza Town | 5. | Shaja'iya Elem. Boys "C" | 23736 | 276 | - | 6 | - | 5th
Gaza Town | 6. | Gaza Prep. Girls "B" | 23732 | 234 | 282 | 5 | 6 | 9th
Khan Younis | 7. | Khan Younis Prep. Boys "C" | 27791 | 45 | 224 | 1 | 7 | 9th
Khan Younis | 8. | Khan Younis Elem. Girls "D" | 27758 | 65 | - | 2 | - | 5th
Khan Younis | 9. | Mustafa Hafez Elem. Boys "B" | 27751 | 234 | - | 5 | - | 5th
Rafah | 10. | Rafah Prep. Girls "E" | 28769 | 188 | 299 | 4 | 7 | 9th
Rafah | 11. | Rafah Prep. Boys "C" | 28767 | 255 | 129 | 5 | 4 | 9th
Rafah | 12. | Rafah Elem. Girls | 28744 | 80 | - | 2 | - | 5th