The Division Monitoring & Evaluation System

Division Monitoring & Evaluation System

1. Introduction

2. Objectives

3. Scope

4. Performance Measures

5. School Monitoring Process

6. Control and Adjustment Point

7. M&E Tools and Techniques

8. Documents and Reports

9. Terms of Reference

10. Setting Up the School M&E System

Division Quality Management Inventory Model (QMIM)


DIVISION M&E SYSTEM

1.0 INTRODUCTION


THE FISHERMEN

AUTHOR UNKNOWN, SOURCE UNKNOWN

There was a group called 'The Fisherman's Fellowship.' They were surrounded by streams and lakes full of hungry fish. They met regularly to discuss the call to fish, and the thrill of catching fish. They got excited about fishing!

So a committee was formed to send out fishermen. As prospective fishing places outnumbered fishermen, the committee needed to determine priorities.

A priority list of fishing places was posted on bulletin boards in all of the fellowship halls. But still, no one was fishing. A survey was launched to find out why. Most did not answer the survey, but from those that did, it was discovered that some felt called to study fish, a few to furnish fishing equipment, and several to go around encouraging the fishermen.

What with meetings, conferences, and seminars, they just simply didn't have time to fish.

Now, Jake was a newcomer to the Fisherman's Fellowship. After one stirring meeting of the Fellowship, Jake went fishing. He tried a few things, got the hang of it, and caught a choice fish. At the next meeting he told his story, he was honored for his catch, and he was then scheduled to speak at all the Fellowship chapters and tell how he did it. Now, because of all the speaking invitations and his election to the board of directors of the Fisherman's Fellowship, Jake no longer has time to go fishing.

But soon he began to feel restless and empty. He longed to feel the tug on the line once again. So he cut the speaking, he resigned from the board, and he said to a friend, 'Let's go fishing.' They did, just the two of them, and they caught fish.

The members of the Fisherman's Fellowship were many, the fish were plentiful, but the fishers were few.

1.0 INTRODUCTION

1.1 Purpose of the Manual

Monitoring and evaluation is acknowledged to be one of the most important systems in an organization. It is one of the most desired systems. In conferences, seminars and workshops, participants and organizers talk about the importance of M&E, how it could facilitate an efficient operation and how organizations can benefit from it. People get so excited about the importance of M&E that it often results in many bright ideas and plans about how to implement M&E. However, the perceived importance of M&E does not always translate into actually doing and implementing M&E. Often, critical elements of the M&E system are missing. As a result, activities and events are undertaken in the name of M&E and yet most fail to provide the information needed to make good decisions. Data gathering and report writing are often mistaken for M&E itself.

The story of 'The Fishermen' draws an important parallel with the practice of M&E in organizations. Too many activities and events are undertaken in the name of M&E. Forms and data gathering instruments are developed, but they are often incoherent. Costly infrastructures and facilities are set up, but their usage is far from maximized. And generally, despite all these efforts, the most basic information requirements are missing.

It is ironic that one of the most important systems is also one of the most neglected systems in organizations. Often, there are too many fishermen's fellowships and yet the fishers are few. The main purpose of this Manual is to serve as a guide to would-be monitors and evaluators on how to operationalize an M&E System. This document illustrates the fundamental requirements and techniques of implementing M&E at the Division.

The schools (the 'fish') are plentiful. There is an urgent need to set up an efficient M&E system to enable the monitors to actually 'fish.'

1.2 Understanding M&E

The capability to get things done in an efficient way is dependent on the organization's ability to gather data, analyze data, and provide feedback to improve the way things are done. Receiving the right information at the right time is critical to an efficient operation. Information must be correct in order for decision-makers to set directions, and information must be accurate in order for field implementers to act decisively in making the necessary adjustments to improve things. Among the systems in the organization, it is the M&E system that provides such information. It is one of the systems that must be in place in an organization, a program or a project. Its importance necessitates that it be done in a very systematic manner.

M&E is more than an activity, more than collecting data and more than reporting. It is purposive, deliberate and systematically undertaken. In order to understand M&E, it is important to understand the following areas: (1) planning, (2) decision-making, and (3) continuous improvement.

1. Planning. One of the major objectives of M&E is to determine if the implementation is going according to plan. Without a plan, there is nothing to monitor and evaluate.

The Plan provides the scope of monitoring and evaluation. It provides the destination (outcomes), the directions (strategies) and the means (resources) to get to the destination. It is important to ensure that the plan is accurate, correct and clearly written, especially the targets and indicators.

The Plan defines the areas to be monitored and evaluated.

2. Decision-making. During implementation, things may not go according to plan. Every manager must make correct and timely decisions before things get out of control. The objective of every manager is to ensure that, despite changes in the frame conditions and changes in the plan, the outcomes can still be achieved. Necessary adjustments have to be made in the strategies and activities, and in the use of resources, in order to keep implementation on track according to schedule, target, time and quality.

The quality of decisions is dependent on the timeliness and completeness of information that a decision-maker has at hand. In setting up the M&E system, one of the key considerations is knowing the information requirements of key internal stakeholders: the manager, supervisor and field personnel.

The M&E function is core to decision-making.

3. Continuous improvement is a management process where delivery processes are constantly evaluated and improved in the light of efficiency, effectiveness and flexibility (Wikipedia).

1.3 What is M&E?

Monitoring and evaluation is a means to support the continuous learning processes of an organization. It is an essential component of any reform agenda, program, project or intervention. The learning and insights generated by doing monitoring and evaluation are used to promote continuous improvement in the workplace, especially in the practices and processes of an organization. The lack of it often results in poor performance, inefficient implementation and program failure. The presence of an M&E System, therefore, is important to organizations.

1.3.1 Definition

Monitoring and Evaluation (M&E) is defined as the systematic process of gathering, processing, analyzing, interpreting, and storing data and information, thereby setting into motion a series of managerial actions for the purpose of ascertaining the realization of set objectives.

M&E is composed of three interrelated processes. These are:

• Monitoring refers to the systematic observation and documentation of actual accomplishments as well as the tracking of issues, opportunities and problems that may affect implementation.

• Evaluation concerns the assessment of information (collected through monitoring) regarding the extent to which actual accomplishments conform to or deviate from the objectives set in the plan.

• Adjustment means steering the implementation. This means using the information and insights derived from evaluation, and adjusting the strategies or ways of doing things to make implementation more efficient and lead towards the realization of objectives.

The main purpose of M&E is to spur managerial actions based on the information and insights collected, processed, analyzed and interpreted by the monitors and evaluators. These managerial actions are undertaken to improve performance during implementation and to increase the likelihood of achieving the desired outcomes.

1.3.2 M&E and Decision-Making

M&E is closely linked to decision-making. Decision-making is the process of identifying and selecting alternatives, directions and/or solutions. Often, decisions are based on the accountabilities of an individual, on the vision, mission and values of an organization, and on the goals and desired outcomes of the plan. However, the quality, relevance and timeliness of decisions depend greatly on the quality and availability of data and information. Supplying sound data and information will allow individuals to meet their accountabilities, stay true to their organizational values and, most importantly, contribute significantly to the achievement of set goals and outcomes.

The M&E function is primarily set up to support the decision-making requirements of managers and staff. This is the first and most important design requirement of any M&E. The strategies, data collection and analytical techniques, and timing of M&E should be tied to the decisions and the timing of those decisions.

1.3.3 Elements of M&E

There are five major elements of M&E. These are:

• Scope. All M&E efforts should have a scope. The scope provides the standards and parameters for evaluating the performance of programs, projects and individuals. The coverage of M&E is defined by the approved or accepted plan. Without a plan, there is no scope for the M&E. Specifically, the scope will define the following M&E concerns:
  - outcomes to be achieved
  - outputs to be delivered
  - activities to be undertaken
  - resource requirements, including human, material and equipment
  - schedule or timing of implementation
  - budget or cost

• Plan versus Actual Performance. M&E is about tracking performance by comparing the approved plan (scope) against actual performance. The M&E supplies the following information:
  - Outcomes. Includes changes in the performance and/or practices of target groups and the benefits received as a result of the interventions.
  - Quality. Assesses the quality of outputs delivered versus the standards.
  - Scope/Quantity. Comparison of target outputs versus actual outputs delivered.
  - Time. Target schedule/duration versus actual date/duration of work undertaken.
  - Cost. Budget versus actual expenditure.

• Means of verification (MoV). One of the main features of any M&E system is the means of verification. MoVs are authoritative sources of information about the achievement of outputs and actual performance. The role of M&E is to provide relevant, timely, and accurate information about the achievements and status of implementation. MoVs include:
  - Status and/or Accomplishment Reports
  - Documentation of effective practices
  - Testing
  - Observation and Inspection
  - Key Informant Interviews
  - Focus Group Discussions

• Managerial actions. One of the objectives of M&E is to supply information about performance and the status or possible occurrence of external factors which may affect implementation. The role of M&E is to provide the venue for making corrective actions. Possible actions include:
  - Adjustments in the activities and strategies
  - Reallocation of resources
  - Rescheduling
  - Provision of more resources
  - Adjustment in the scope or inclusion of new strategies
  - No action at all
  - Termination/replacement of staff
  - New designs, new strategies and new plans

• External factors. M&E also keeps track of the possible occurrence of external factors that may affect actual performance. These factors include:
  - Status and/or Accomplishment Reports
  - Documentation of effective practices
  - Testing

1.4 Types of M&E

The concerns of M&E range from tracking efficiency to evaluating effectiveness. There are different types of M&E, with different focus and usage. There are four common types, summarized in Table 1-1.

Table 1-1 Types of M&E

Type of M&E: Progress Monitoring & Evaluation
  Hierarchy of Objectives: Inputs, Activities, Outputs
  Objective: Measure efficiency of implementation, focused on scope, quantity, quality, time and cost

Type of M&E: Initial Gains Evaluation
  Hierarchy of Objectives: Intermediate Objectives
  Objective: Measure effects of the intervention during implementation; improvement in performance, behavior and practices

Type of M&E: Results Monitoring & Evaluation
  Hierarchy of Objectives: Outcomes
  Objective: Measure benefits to target groups after all interventions are completed

Type of M&E: Impact Evaluation
  Hierarchy of Objectives: Goal
  Objective: Measure impact to the Goal

1.4.1 Progress Monitoring and Evaluation

Progress Monitoring and Evaluation is a systematic and objective assessment of the on-going implementation of a plan or project. Its aim is to steer implementation as efficiently as possible based on empirical facts determined through a systematic observation and documentation process and through a verifiable assessment process. Specifically, Progress M&E measures physical progress against plans and work schedules, and financial progress against cash flow and budget allocations. It is a mechanism established to assess the quality of outputs delivered, to detect early warning signs of implementation problems, and to identify external factors affecting the delivery of outputs.

Progress Monitoring and Evaluation is undertaken during the implementation stage and is an integral part of the plan-design-act-control cycle.

1.4.2 Initial Gains Evaluation

Initial Gains Evaluation keeps track of the changes or improvements in the performance and/or practices of the target groups. Initial gains represent leading indicators, the achievement of which will lead to the attainment of the desired outcomes.

Evaluations of this type are conducted at the mid-term of implementation and before the completion of the plan.

1.4.3 Results Monitoring and Evaluation

Results Monitoring and Evaluation is a type of post-implementation review (PIR) that measures the realization of outcome-level objectives. It aims to assess the effectiveness of implementation by measuring the benefits received by target groups (recipients) and by determining the changes in the behavior and practices of the target groups as a result of their application and utilization of outputs.

Results Monitoring and Evaluation focuses on effectiveness.

1.4.4 Impact Evaluation

Impact Evaluation is an ex-post type of evaluation. The objective is to determine the impact or contribution of an intervention (programs or projects) to a higher-level undertaking.

Table 1-2 Types of M&E and Description

Type of M&E: Progress M&E
  Description: Assessment of an on-going implementation. The objective is to steer implementation as efficiently as possible based on the approved plan.
  Hierarchy of Objective: Output-Activity-Input level
  Measure of Performance: Efficiency: physical accomplishment (actual versus plan)
  Timing of M&E: During implementation

Type of M&E: Initial Gains Evaluation
  Description: Keeps track of the changes or improvements in the behavior, performance and practices of the target group/s.
  Hierarchy of Objective: Intermediate level (in between outcomes and outputs)
  Measure of Performance: Initial gains: improvement in the performance of target group/s
  Timing of M&E: Middle of the implementation and before the termination of the current plan

Type of M&E: Results or Outcome M&E
  Description: Measures the realization of Outcome-level objectives; determines the effectiveness of the implementation.
  Hierarchy of Objective: Outcome level
  Measure of Performance: Effectiveness: achievement of benefits
  Timing of M&E: Immediately after the end of implementation

Type of M&E: Impact Evaluation
  Description: Measures the contribution of the interventions to a higher-level objective.
  Hierarchy of Objective: Goal level
  Measure of Performance: Impact: achievement of long-term objectives
  Timing of M&E: Post implementation (not immediate)


DIVISION M&E SYSTEM

2.0 OBJECTIVES OF THE DIVISION M&E


2.0 OBJECTIVES OF DIVISION M&E

2.1 Definition

The Division M&E System is a mechanism for gathering, processing, analyzing, interpreting, and storing data and information about the schools' performance, needs and requirements to sustain effective school-based management. Operated by the Division, it is a System which provides data, information and insights on the efficiency and effectiveness of the Division's technical support to schools. It sets into motion a series of managerial actions, adjustments and realignments for the purpose of creating a sustained impact on the quality of education provided by schools to learners.

A complete Division M&E System should have the following features:

• Organized gathering and processing
• Analysis and interpretation
• Storing of data and information
• Managerial actions
• Realization of objectives

2.2 Objectives

The main objective of the Division M&E System is to ensure the timely flow of information and insights on the effectiveness of the Division's technical assistance in improving school performance. The System is used to keep track of the Division's programs and projects. Specifically, the Division M&E System will provide data and information on:

• schools' performance. The System will allow the Division to adjust its technical assistance on SBM according to the schools' performance on enrollment, retention, completion and achievement. This will facilitate the classification and profiling of schools into high, average and low performance. The classification will be used as the major input to customizing the programs and projects of the Division based on school performance.

• participation rate. The Division M&E System provides data and information on the percentage of learners of school age participating in the basic school system and the number of out-of-school youth and indigenous people being served by the alternative learning system.

• capabilities of the school heads. One of the main target groups of the Division is the school head. The Division will track the performance and requirements of the school heads on instructional supervision and SBM.

• capabilities of teachers. Another major target group of the Division is the teacher. The tracking will include the teachers' teaching skills and mastery of the subject matter.

• efficient management of the DEDP implementation. The Division M&E System will also be used to assess the internal efficiency of the Division, especially in the implementation of the programs and projects outlined in the DEDP.

• difficulties, problems, issues or risks that hinder efficient implementation of Division programs and projects.

The Division M&E System is part of the Integrated M&E System which connects the Division to the schools and to the Region. This will enable the Division to collect and share data, information and insights from the schools to the Region and vice-versa. The integration will provide the Division with critical and timely information regarding its operations and will allow it to adjust or improve its technical assistance based on the needs and requirements of the schools. Also, the Division's documentation of practices, initial gains and results will serve as valuable input to the Region and National Offices to improve their respective programs, policies and standards.

2.3 Characteristics of a Well-designed M&E System

A complete and well-designed Division M&E System should have the following features:

• Organized data gathering and processing
• Organized data analysis and interpretation
• Systematic storing of data and information
• Facilitative of managerial actions
• Aligned with the realization of objectives

2.3.1 Organized gathering, processing, analysis and interpretation

In monitoring and evaluation, it is important that the collection of data and information be done in an orderly and systematic manner. A typical Schools Division deals with hundreds of elementary and secondary schools. It also has to track the performance of community learning centers and their service providers.

In this regard, the Division needs an organized and efficient system of gathering and sorting information to reduce repetitive, costly and time-consuming gathering of data. An organized system will facilitate the following:

• accuracy of data and information
• non-duplication of data and efforts
• more time for technical assistance

2.3.2 Systematic Storing of Data and Information

The Division M&E System is the most authoritative source of information about the performance of schools. It stores information on the performance of schools within the Division and is a repository of programs and projects that can be considered part of the effective practices of the Division. These can be shared with all schools when they need the information, which is an important input to knowledge management.

As such, the M&E System will enable the following:

• prompt retrieval of data and information when needed
• detailed recording of information
• standardized formats, documents and reports

2.3.3 Facilitative of timely managerial actions

A must-have feature of an M&E system is the ability to provide relevant information to facilitate decision-making. In this regard, deriving such information to aid decision-making and the timing of the decisions to be made are very important considerations in the design of the M&E system.

The monitoring activities and quality control points to be implemented by the Division are timed with the implementation requirements of the schools. In this way, the data, information, insights and lessons derived from the Division M&E System are immediately used for making managerial and technical actions that will support the schools.

Page 2 - 4

Page 16: The Division Monitoring &Evaluation System - deped-qmsdeped-qms.wikispaces.com/file/view/Division+M&E+System.pdf · Division Monitoring & Evaluation System 1. Introduction 2. Objectives

Division M&E System Objectives of Division M&E

2.3.4 Aligned with the realization of objectives

Last but not least, a well-designed M&E system must be able to keep track of accomplishments, initial gains and results. The main use of the System is to provide information and insights towards the realization of objectives.

Aside from ensuring the realization of objectives and targets, the Division M&E System will likewise allow the Division to:

• document effective practices
• draw lessons from failed or problematic programs and projects
• determine whether to stop, continue or make adjustments in the strategies given the early warning information


DIVISION M&E SYSTEM

3.0 SCOPE OF THE DIVISION M&E SYSTEM


3.0 SCOPE OF THE DIVISION MONITORING AND EVALUATION

3.1 M&E Coverage

The main task of the Division is to provide technical assistance to the schools and community learning centers. In order to be effective, the Division must continually improve its services by providing timely and relevant programs and projects that will benefit the schools and community learning centers. In this regard, the Division M&E System must capture information and insights on the schools' performance, the efficiency of schools, and the capabilities of school heads and teachers. The System must also obtain information about the Division's efficiency in providing technical assistance programs and projects.

Specifically, the scope of the Division monitoring and evaluation work is defined in the Division Education Development Plan (DEDP) and the School Improvement Plans (SIPs). The objectives, targets, programs and projects documented in these plans will be used to define the scope of the Division M&E System.

3.1.1 Three-Year School Improvement Plan

At the school level, the SIP will be the main reference document for the monitoring and evaluation strategies and activities of the Division. The Division will evaluate the performance of the schools based on the targets documented at the Purpose/Outcome Level objectives in the SIP. These include targets for enrollment, retention, completion and learner achievement.

The Division will also monitor the efficiency of the schools in implementing the school programs and projects specified in the SIP or AIP. Hence, the quality of the SIPs is critical to the successful operation of the Division M&E System.

3.1.2 Six-Year Division Education Development Plan

Another main reference document on the scope of the Division M&E System is the DEDP. The objectives, targets and deliverables contained in the DEDP will be used to track the efficiency of the Division and to evaluate the effectiveness of Division programs and projects in helping the schools and community learning centers improve their performance.


3.2 Types of Division M&E

The Division M&E System is divided into three types: Outcome Evaluation, Intermediate Results Evaluation and Progress M&E. These three M&E strategies are designed to measure the Division outcomes and initial gains, determine the achievement of critical leading indicators, and assess the efficiency of the Division in managing technical assistance programs and projects for schools and community learning centers. Specifically, the Division M&E System will include the following:

3.2.1 Outcome Evaluation

The primary target groups of the Division M&E System are the schools and the community learning centers. One of the main tasks of the Division M&E System is to evaluate the effectiveness of the Division's technical support programs and projects for the schools and community learning centers. This is known as the Division Outcome Evaluation.

Outcome evaluation will be conducted every three years, or at the end of every SIP cycle. The evaluation will provide the Division with information and insights on the improvements in the performance of schools and community learning centers. The same will be used as input to the preparation and/or adjustment of the DEDP.

Specifically, the M&E at this level will include the evaluation of the following:

• schools' performance
• performance of the community learning centers
• SBM level of practice of the schools
• participation of learners of school age, out-of-school youth, indigenous people and others in the basic education system
• school stakeholders' satisfaction with the quality of school services

3.2.2 Tracking Intermediate Results

The Division M&E System will also track intermediate results. Tracking Intermediate Results is a type of evaluation that is undertaken by collecting and assessing data and information that predict the achievement of the Outcome Indicators. Collecting and analyzing leading data or information is a proactive M&E strategy that will help identify the achievement or non-achievement of the outcomes even before the evaluation period takes place. Leading indicators of the Division's performance include:

• improvement in the capability of school heads on instructional supervision and SBM
• improvement in the teaching skills of teachers and their mastery of the subject matter
• improvement in the teaching skills of facilitators and their mastery of the subject matter
• improvement in the learning environment of the schools, which includes classrooms, laboratories, school equipment, textbooks, manuals and supplementary materials, and the ancillary services of the school

Intermediate Results Evaluation will be undertaken annually or as the need arises, as determined by the Division. The findings from the evaluation will be used to enhance or improve the Division's programs and projects (when the leading indicators show favorable results) and/or to implement remediation strategies (when the leading indicators show negative results).

3.2.3 Progress Monitoring

The major feature of the Division M&E System is Progress Monitoring. Its objective is to track the efficiency of both the schools and the Division in the implementation of education programs and projects outlined in the DEDP and the SIPs. Specifically, progress monitoring covers:

• the school's efficiency as per the SIP and/or Annual Improvement Plan (AIP)
• implementation of capability building programs for Division staff
• the efficiency of the Division as per the DEDP and/or Division Annual Plan (DAP)
• fiscal management vis-à-vis physical accomplishment

Table 3-1, the Division M&E Framework, outlines the scope of the Division M&E System. It shows the relationships of school performance, Division objectives and strategies, performance indicators and means of verification. It also provides information on the type of monitoring and evaluation the Division will implement to operationalize the System.


Table 3-1 Division M&E Framework

Division Goal:
  Objectives:
    1. Access: To ensure that all learners of school age are in school and are ready for school
    2. Retention: To ensure that learners who are in school will stay in school
    3. Completion: To ensure that learners who are in school will complete the requirements of the primary and secondary level
    4. Achievement: To ensure that learners demonstrate the necessary competencies at each level
  Performance Indicators (Impact Indicators):
    • Increase in enrollment
    • Learners entering the school system are ready
    • Increase in the number of learners retained in the school (retention rate)
    • Reduction in drop-outs
    • Reduction in school leavers
    • Increase in the number of learners able to complete the basic education requirements
    • Improved graduation rate
    • Improvement in the basic functional literacy skills of the learners
    • Improvement in the academic performance of learners in all subjects
    • Improvement in social skills
  Means of Verification: Enrollment Report; School Report Card; Learner Report Card; Teacher Assessment; National Achievement Test (2nd Year); Regional Achievement Test (3rd Year)
  Type of M&E / Division M&E Process: Impact Evaluation / Evaluation of School Performance

Division Level Outcome 1: Improved school performance
  Performance Indicators (Effectiveness Indicators):
    (a) Reduce disparity between high performing schools and low performing schools (in NEAT and NAT) by --- percent
    (b) Reduce disparity in enrollment, drop-out, and completion rates between high performing schools and low performing schools
    (c) Increase in satisfaction of school stakeholders with the quality of instruction in the school
    (d) Improved SBM practice of schools
  Means of Verification: Division Report Card; Division Education Development Plan (DEDP); Perception Survey; SBM Assessment Result
  Type of M&E / Division M&E Process: Outcome Evaluation / Monitoring DEDP Implementation

Division Level Outcome 2: Improved teachers' performance
  Performance Indicators:
    (a) Teachers demonstrate competencies on general content and subject-specific skills
    (b) Teachers meet the desired competencies based on the NCBTS
  Means of Verification: Division Report Card and DEDP; Teachers' Performance Assessment Report; Assessment for Math and Science teachers
  Type of M&E / Division M&E Process: Tracking Intermediate Results / Monitoring DEDP Implementation

Division Level Outcome 3: Improved school heads' performance
  Performance Indicators:
    (a) School heads demonstrate competencies on school-based management and instructional supervision
  Means of Verification: Division Report Card and DEDP
  Type of M&E / Division M&E Process: Tracking Intermediate Results / Monitoring DEDP Implementation

Division Level Outcome 4: Improved learning environment
  Performance Indicators:
    (a) Teacher-to-learner ratio is 1:45
    (b) Learner-to-textbook ratio is 1:1
    (c) Teacher-to-teacher-manual ratio is 1:1
    (d) Teachers and learners have access to school equipment, science laboratories and other facilities
    (e) Schools comply with the Standards of a Child-Friendly School
  Means of Verification: Division Report Card
  Type of M&E / Division M&E Process: Tracking Intermediate Results / Monitoring DEDP Implementation

Division Intermediate Result 1: Improved Division performance
  Performance Indicators (Leading Indicators):
    (a) Increase in the gross enrollment rate
    (b) Improvement in the net enrollment rate
    (c) Reduce disparity in the net enrollment ratio / participation rate between highly urbanized and SRA Divisions
  Means of Verification: Division Report Card and DEDP
  Type of M&E / Division M&E Process: Tracking Intermediate Results / Monitoring DEDP Implementation

Division Intermediate Result 2: Improved competencies of DepED Division and District staff in providing technical and management support to schools, community learning centers, school heads, teachers and facilitators
  Performance Indicators:
    (a) Division and District staff demonstrate competencies on educational planning, curriculum management, instructional consultancy, training and development, and monitoring and evaluation
  Means of Verification: Results of Performance Assessment
  Type of M&E / Division M&E Process: Tracking Intermediate Results / Monitoring DEDP Implementation

Division Intermediate Result 3: Management and technical assistance systems are in place and operational
  Performance Indicators:
    (a) Continuous improvement in the management and technical assistance processes of the Division
  Means of Verification: Quality Assurance Readiness Assessment Report
  Type of M&E / Division M&E Process: Tracking Intermediate Results / Monitoring DEDP Implementation

Outputs (1):
  1. On improving school performance: Division programs and projects on SBM
     Performance Indicator (Efficiency/Progress): % physical accomplishment (number of programs and projects implemented versus number of programs and projects planned in the SIP)
     Means of Verification: Division Monthly Report; Completion Report
  2. On the staff development program: capability building programs for Division staff, school heads, teachers and non-teaching staff
     Performance Indicator (Efficiency/Progress): % physical accomplishment (number of Division staff, school heads and teachers trained versus number targeted as per the DEDP)
     Means of Verification: Division Monthly Report; Completion Report
  3. On improving the learning environment: Division programs and projects related to school buildings, school equipment, textbooks and manuals
     Performance Indicator (Efficiency/Progress): % physical accomplishment (number of school facilities / infrastructures set up versus targets in the DEDP)
     Means of Verification: Division Monthly Report; Completion Report
  4. On managing Division systems and processes: Division programs and projects related to systems development and implementation
     Performance Indicator (Efficiency/Progress): % physical accomplishment (plan (DEDP) versus actual)
     Means of Verification: Division Monthly Report; Completion Report
  Others (please add)
  Type of M&E / Division M&E Process: Progress Monitoring and Evaluation / Monitoring DEDP Implementation

  (1) The scope of outputs varies depending on the target (quantity) outcomes, needs and targets specified in the DEDP.

Input: Management of Division MOOE and other financial resources
  Performance Indicator: Actual expenditure versus approved budget
  Means of Verification: Division Financial Report
  Type of M&E / Division M&E Process: Progress Monitoring and Evaluation / Monitoring DEDP Implementation

3.3 Primary Users of the Division M&E System

The Division M&E System is an internal system designed primarily to cater to the decision-making requirements of the Schools Division Superintendents (SDS), Assistant Schools Division Superintendents (ASDS), Education Supervisors (ES) and other Division staff. The System is implemented not to comply with the requirements of the Region, but as a critical support mechanism for the Division's role of providing quality and relevant programs and projects to schools and community learning centers. At the same time, the Division M&E System provides feedback to the Region and National Office on the effectiveness of existing policies and provides information on issues, concerns and opportunities for the policy agenda.

The Division M&E System, especially the Progress M&E, provides the Division implementers with up-to-date and accurate information needed in making day-to-day decisions to assure the best courses of action and support that will improve performance.


DIVISION M&E SYSTEM

4.0 PERFORMANCE MEASURES


4.0 PERFORMANCE MEASURES

Performance measures provide an accurate picture of the status of accomplishments and the achievements or outcomes attained by the Division as contained in the DEDP. The Division's performance will be assessed in the following areas:

• Impact. Pertains to the achievement of the DEDP Goal Level Objective: Learners' Outcomes.

• Effectiveness. Refers to the achievement of the DEDP Purpose Level Objectives: Schools' Performance.

• Efficiency. Pertains to the Division's implementation of programs and projects as contained in the DEDP / DAP.

• Organizational Maturity. Assessment of the practices and processes employed in the Division. Refers to the Division's compliance or adherence to the quality standard processes.

• Readiness of Division Staff. Refers to the competencies of Division staff in providing technical assistance to schools and community learning centers.

4.1 Impact

Division impact is measured in four areas. These are:

• Increase in the participation rate. The first measure of school effectiveness is the ability of the school to bring learners of school age to school. The primary indicator for access is the increase in the school's enrollment.

• Retention. School effectiveness is measured in terms of the school's ability to encourage learners who are in school to stay in school. The primary measure of success in this area is the retention rate. Other indicators, like the drop-out rate and the school leavers' rate, will also be used.

• Completion. Learners complete the requirements from Grade 1 to Grade 6, or from 1st Year to 4th Year High School. Another measure of effectiveness is the ability of the school to assist or compel the learners to complete the requirements at the elementary level or at the secondary level. The indicator to be used for this area is the completion rate, supported by other indicators like the graduation rate and the cohort survival rate to help explain the phenomena. (Common formulations of these rate indicators are sketched after this list.)

• Learners' achievement. The last, but not least, measure of school effectiveness is the learners' achievement. This pertains to the learners' demonstration of the required competencies (at every level) and their readiness to pursue the next higher level of learning. Learners' achievement is a progressive indicator that shows the progress of learners from one competency to the next. Measures to be used in achievement include the learner's grade per subject and the score in the national or regional achievement tests.
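For illustration only, the rate indicators named above are often computed as simple ratios. The formulations below are common generic ones and are not official definitions quoted from this Manual:

\[
\text{Retention rate} = \frac{\text{enrolment in grade } g+1 \text{ in school year } t+1}{\text{enrolment in grade } g \text{ in school year } t} \times 100
\]

\[
\text{Completion rate} = \frac{\text{learners who finish the final grade of the level}}{\text{learners who entered the first grade of the same cohort}} \times 100
\]

\[
\text{Drop-out rate} = \frac{\text{learners who leave school during school year } t}{\text{total enrolment in school year } t} \times 100
\]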

Achieving these four performance measures is a big challenge for school heads. These are interrelated measures, and therefore schools must be able to balance their efforts across the four areas. The achievement of these performance measures will demonstrate the effectiveness of programs, projects and other school services.

These measures are collected and analyzed every year and will be used as the main input to the adjustment or enhancement of the school's programs and projects listed in the SIP and to the preparation of the next SIP cycle.

4.2 Effectiveness

Division effectiveness is measured in four areas. These are:

• Improved school performance
• Improved performance of community learning centers
• Improved performance of school heads and teachers
• Improved learning environment

4.3 Division Efficiency

The efficiency of the Division is measured in terms of its ability to deliver education programs and projects on time and based on targets.

The performance measures for Division efficiency are:

• physical accomplishment, which plots the total accomplishment of the Division (programs and projects completed) versus the total plan or targets (planned programs and projects) on a periodic basis

• cost efficiency, which plots the Division's usage of financial resources versus the approved budget (see the illustrative formulas after this list)
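For illustration only, the two efficiency measures can be written as simple ratios. The formulations below are generic and are not quoted from this Manual:

\[
\text{Physical accomplishment (\%)} = \frac{\text{programs and projects completed to date}}{\text{programs and projects planned in the DEDP/DAP}} \times 100
\]

\[
\text{Cost efficiency (\%)} = \frac{\text{actual expenditure to date}}{\text{approved budget for the same period}} \times 100
\]

Plotted cumulatively for each reporting period, these two ratios produce the physical and cost S-curves referred to in Section 5.3.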

4.4 Organizational Maturity

The Organizational Maturity measure focuses on the operations and practices of the Division. It assesses the maturity level of the Division based on its adherence and compliance to the quality standard processes established for Division operations. The operations of the Division will be assessed using the Quality Management Inventory Model, which will determine their level of maturity based on the implementation of standard processes.

The Quality Management Inventory Model calibrates a Division's operation into:

(1) Ad hoc. The initial or entry level of readiness. A Division at this level is often characterized by temporary and informal ways of doing things. Organizational procedures or methods are not well defined and disseminated, leading to inconsistent results and poor quality of service. Its technical assistance packages are reactive, inefficient and not relevant to the requirements of its target groups. Often these packages are hand-me-down practices. Their utility value and effectiveness have not been proven, yet they are utilized year in and year out. Some may yield positive outcomes and some may offer temporary solutions.

(2) Defined. There is an effort to implement interventions as efficiently as possible by following a structured approach. There is a high awareness of commonly established management tools, techniques and procedures. But there is still a tendency to revert to ad hoc or traditional practices when confronted with a difficult situation. There is a defined process, but its application is not consistent.

(3) Integrated. Divisions at this level demonstrate a more mature and more consistent way of doing things. In this category, Divisions are able to collate, document and transform their effective practices into an integrated, well-choreographed process. There is high compliance with their own standards and processes, such that all Division units and/or individuals know what to do and understand the coordination, cooperation and collaboration requirements expected of them.

(4) Sustained. The Division's maturity at this level hinges on its commitment to excellence. It must have the ability to perform continuous improvements, always optimizing the gains or outcomes of its undertakings. Therefore, a Readiness Level 4 Region/Division should have the following traits:

  • Defined processes are regularly updated in accordance with the strengths, weaknesses, opportunities and threats faced by the schools;
  • Defined processes are improved and in sync with agency policies and directions.

4.5 Readiness of Division Staff

The quality of Division performance hinges on the readiness of Division staff to implement the programs and projects needed by schools and community learning centers. The staff's readiness will be assessed in the areas of:

• school-based management
• curriculum management
• strategic planning
• program and project management

Table 4-1 Division Performance Measures

Performance Area: Impact to Learners
  Description: This performance area is focused on the contribution of the Division's effort to support the schools and community learning centers.
  Focus of Measure: Division
  Performance Measures: Enrollment; Retention rate; Completion rate; Achievement

Performance Area: Effectiveness of the Division
  Description: The Division's effectiveness is manifested in the improvements in the performance of schools and community learning centers; changes or improvements in the competencies of school heads and teachers; improvement in the maturity level of schools in implementing SBM; and the ability of the Division to support the schools and community learning centers in upgrading the learning environment within the Division.
  Focus of Measure: Schools
  Performance Measures: Reduce disparity between high performing schools and low performing schools (in NEAT and NAT) by --- percent; Reduce disparity in enrollment, drop-out, and completion rates between high performing schools and low performing schools; Increase in satisfaction of school stakeholders with the quality of instruction in the school; Improved SBM practice of schools

Performance Area: Efficiency
  Description: The objective is to measure the Division's capability to deliver programs and projects as promised in the DEDP and DAP; the efficient delivery of such programs and projects increases the likelihood of achieving the DEDP Purpose-level objectives.
  Focus of Measure: Division
  Performance Measures: Physical accomplishment; Cost efficiency

Performance Area: Adherence to Standards
  Description: This performance measure assesses the maturity level of the Division in implementing and adhering to the quality standards set, which will assure correct and timely implementation of the Division's Core Technical Assistance Processes.
  Focus of Measure: Division
  Performance Measures: QA Readiness

Performance Area: Readiness of Division Staff
  Description: This area evaluates the capabilities of the Division staff who will provide the technical assistance to schools on SBM and to community learning centers.
  Focus of Measure: Division Staff
  Performance Measures: Competencies


DIVISION M&E SYSTEM

5.0 DIVISION MONITORING PROCESS


5.0 DIVISION MONITORING PROCESS

5.1 Definition and Scope of Monitoring Process

Division Monitoring is the process of systematically tracking accomplishments, budget and schedule against deliverables. It is a mechanism for measuring the performance of the Division, schools and community learning centers and comparing these with set standards. The systematic tracking of performance allows the Division to quality-assure the status of an on-going implementation, perform scope management, and manage external factors influencing and/or hindering the accomplishment of the objectives and targets of the Division. Specifically, the monitoring process is designed to regularly track, measure and document the following:

• the accomplishment of outputs and milestones as compared to what is specified in the plans of the Division (DEDP), the schools (SIPs) and the contracts of service providers implementing the ALS programs

• the performance of the school heads in managing the schools, including tracking the performance of school heads on SBM and instructional supervision

• the schools' implementation of the curriculum

• the operations of the community learning centers, including monitoring the learners participating in the alternative learning system and the quality assurance of facilitators and/or mobile teachers

The Division Monitoring Process will serve as a 'trip wire', providing early warning signs of issues that may affect the quality and/or hinder the completion of outputs. The monitoring process will enable the Division to make immediate corrective actions before issues become full-blown problems affecting quality, targets, schedules and budget.

The monitoring process is a review of an on-going implementation. Monitoring activities are undertaken to assess the following:

• Quality of products and services provided
• Compliance to quality standards
• Accomplishments based on scope and time
• Cost efficiency based on budget and time
• Frame conditions or external factors beyond the control of the implementers that may affect the achievement of targets


The Division Monitoring Process also covers observing, measuring and documenting events in the external environment. It includes tracking the stakeholders' support and the factors beyond the control of the Division and the schools which may affect the implementation of the plans.

The Division Monitoring Process includes: (1) Monitoring the DEDP Implementation, (2) School Performance Monitoring, and (3) Managing the ALS Programs.

Figure 5-1 Overview of the Division Monitoring Process [Figure: a one-year monitoring cycle linked to the Division & District M&E System, showing status reporting, process reviews (school visits), review of service provider (SP) proposals, contracting of SPs, mid-point M&E, and end-of-contract evaluation]

5.2 Some Guideposts in Monitoring

In implementing a monitoring program, consider the following:

• Track and manage the four core areas of management: quality, scope, time and cost.

• All throughout the DEDP implementation, beware of scope creep. Minimize it as much as you can; it will have implications for your targets, time and resources. Annual and quarterly reviews will help reduce unwanted activities.

• In reporting progress, always start with the percent physical accomplishment. Then elaborate on the reported progress or status of implementation by discussing quality concerns, scope and cost concerns.

• If after three reporting periods there are no changes or progress in the accomplishments, something is wrong. This is the reason for monthly reporting: it tracks progress.

• Track and document effective school practices. Use appreciative inquiry to determine the good practices.

• At the end of implementation, beware of the 90% accomplishment. The last 10% is usually the most difficult to implement.

5.3 Monitoring DEDP & SIP Implementation

The DEDP and SIP provide the scope of the Division monitoring process. The efficiency measures, monitoring strategies and activities of the Division will be based on the content of the DEDP and SIP. Specifically, the monitoring will be based on the accomplishment of outputs, targets and activities.

The DEDP outlines the support programs and projects of the Division for the schools. It also contains staff development programs for school heads and teachers, technical assistance support to school heads on SBM, instructional consultancy strategies for teachers, learning materials support and other support requirements of the schools and community learning centers.

On the other hand, the SIP contains the scope of work of the school for the next three years. The work is detailed yearly through the Annual Implementation Plan or AIP. The SIP/AIP is used to track the implementation efficiency of the schools.

Monitoring DEDP & SIP Implementation is a management mechanism which will allow the Division to manage its monthly operations more efficiently. It focuses on the deliverables and sees to it that these are accomplished and delivered. Tracking the DEDP and SIP implementation will also facilitate the systematic handling of concerns on the quality of technical assistance delivered to schools and community learning centers.

Specifically, the mechanism will allow the Division to manage the following:

quality and status of Division programs and projects.

Division's Physical Accomplishment (S-Curve). Involves comparing the number of


Division programs and projects implemented versus the planned/targeted

programs and projects in the DEDP.

Expenses versus budget (Cost S-Curve). Involves monitoring the Division's generation and management of its financial resources vis-à-vis the financial resources outlined in the DEDP. (A simple illustration of the physical and cost computations is sketched after this list.)

Schools' Efficiency. Involves monitoring actual progress versus plan, actual cost

versus budget.
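To make the comparisons above concrete, the short sketch below is an illustration only; the quarterly figures, record layout and helper function are hypothetical and not a prescribed DepEd format. It shows how percent physical accomplishment and budget utilization might be computed per quarter, which are the points plotted on the physical and cost S-curves.

```python
# Illustrative sketch only: hypothetical quarterly figures, not an official computation.

def percent(actual, planned):
    """Return actual as a percentage of planned (0 if nothing was planned)."""
    return round(100.0 * actual / planned, 1) if planned else 0.0

# Hypothetical cumulative figures per quarter:
# (quarter, projects completed, projects planned, expenses, budget)
quarters = [
    ("Q1", 3, 4, 250_000, 300_000),
    ("Q2", 7, 10, 620_000, 700_000),
    ("Q3", 12, 16, 1_050_000, 1_150_000),
]

for q, done, planned, spent, budget in quarters:
    physical = percent(done, planned)      # point on the physical S-curve
    cost = percent(spent, budget)          # point on the cost S-curve
    slippage = round(100.0 - physical, 1)  # percentage points behind the to-date plan
    print(f"{q}: physical accomplishment {physical}%, "
          f"slippage {slippage} pts, budget utilization {cost}%")
```

Plotting the cumulative physical and cost percentages per quarter against the corresponding planned figures produces the two S-curves referred to above.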

5.3.1 Objectives

Monitoring the DEDP and SIP implementation is done to accomplish the following:

Ensure the timely and cost-efficient delivery of programs and projects outlined in

the DEDP and SIP.

Provide immediate feedback on the quality and effectiveness of technical

assistance provided to schools, community learning centers and to

teachers/facilitators.

Provide information on the accomplishments of the Division and schools including

enabling and hindering factors that may be used as basis for adjusting and/or

improving efficiency.

Document the experiences of the Division in providing technical support to schools.

This includes effective practices and lessons learned.

Immediately address issues and risks that may affect future performance of the

Division, schools and community learning centers.

5.3.2 M&E Activities

Monitoring the DEDP implementation involves conducting the following activities:

Preparation and submission of Division Quarterly Status Report. The report

highlights the accomplishments of the Division and schools versus the targets in the

DEDP and SIP. It also includes comparison of budget versus actual resources utilized

in the implementation of programs and projects.

Conduct of team meeting with Division staff on the status of Division programs and

projects. The Division Quarterly Status Report and the schools' Quarterly Status

Report will be used as reference documents in the conduct of the team meeting.

The status reports will be used as a "trip wire" to determine issues and concerns that


demand immediate attention by the Division and districts.

Discussion of problems that affected delivery of Division services. Problems that

can be solved at the school level should be resolved immediately. Problems and/or

issues requiring decisions or support from the Division Office should be included in

the school's monthly report.

Updates on the implementation chart and on the status of activities, events and

outputs completed.

Communication of accomplishments to stakeholders.

5.3.3 Process Owner

The Schools Division Superintendent (SDS) is mainly responsible for managing the DEDP

implementation process. The SDS is one of the main beneficiaries of the data and

information generated from undertaking this process. The process provides the SDS and

other key personnel with up to date, relevant information needed in making timely

decisions and/or adjustments in the implementation of the DEDP/DAP.

The following individuals are tasked to provide the staff work needed by the SDS:

(1) Assistant Schools Division Superintendent (ASDS). The ASDS oversees the day-to-

day operations required in managing the DEDP implementation. The ASDS shall

ensure the quality of the reports and documents needed in the status reporting of

Division operation.

(2) Division M&E Coordinator. The M&E Coordinator is tasked to do the following:

collect and collate the reports submitted by units/offices within the Division

write and package the report of the Division

(3) Division Planning Officer. Assists the M&E Coordinator in the preparation of the

Division Quarterly Status Report. Specifically, the Planning Officer will provide the

planning documents or reference documents needed in the preparation of the

status reports.

(4) Division Program/Project Manager. This is a designation given to an Education

Supervisor or any other Division staff tasked to lead the implementation of a

program or project. The Program/Project Manager will provide monthly updates on

the status of programs/projects to the SDS.


5.3.4 Documents and Reports

The main output of this process is a Division Quarterly Status Report covering the Division's

accomplishments versus the targets set in the DEDP/DAP. The report shall also contain

information on the quality of accomplishments, factors facilitating the implementation and

discussion of problems and issues that affected the implementation.

The Division Quarterly Status Report is a consolidation of the following documents/reports:

school quarterly report

monthly report of program/project managers in the Division.

This quarterly status report will be used as input to the preparation of the Division Annual

Accomplishment Report.

Figure 5-2 Status Report - Quarterly

(Monthly Reports from the schools are consolidated into School Quarterly Status Reports and a School Annual Accomplishment Report, which in turn feed the Division's Quarter Reports and Accomplishment Report.)


5.4 School Performance Monitoring

Monitoring School Performance is a mechanism that will provide valuable information on

the strengths, weaknesses and challenges faced by the schools, school heads, teachers

and non-teaching staff in delivering quality education to learners. Specifically, this will

provide the Division with up-to-date information on the following:

SBM maturity level of practice of the schools

compliance of schools and their staff on the standard processes

readiness of school heads and teachers to provide quality service to learners. The

process aims to identify the strong and weak points of school heads and teachers

which will be used as basis for providing training support

5.4.1 Objectives

The objectives of School Performance Monitoring are the following:

promote the practice of continuous improvement and self-examination in the

schools

determine the relevance and applicability of school processes and practices to

improving learners' performance and to improving school efficiency

determine the knowledge and skills of staff in performing tasks based on their

compliance to the standard process. The results will be used as input to the

capability building programs for schools.

allow the Division and District to manage the technical assistance more efficiently

by adjusting its programs and projects to the changing requirements of the schools

and shifting resources where needed

design more relevant programs and projects in schools

5.4.2 M&E Activities

Monitoring school performance is a mechanism designed to gather information about the

performance, accomplishments and practices of the schools on SBM, instructional

supervision, curriculum implementation and the school programs and projects

implemented by the school. The monitoring will involve the following activities:

(1) Prepare a school monitoring plan. The plan is a quarterly plan of the education

supervisors and district supervisors which details the objectives of the school visit, the

data gathering methods and the list of schools to be visited. The preparation of the


monitoring plan is based on the technical assistance requirements of the schools

and the time the Division needs to validate the application and/or utilization of

outputs provided to schools.

(2) Analyze school reports and accomplishments. Monitoring school performance will

be triggered by the status reports submitted by the schools. The contents of the

reports will be used as basis or input by the Division monitoring staff to determine

the scope of the school monitoring.

(3) Periodic school visits to be undertaken by the education supervisors and district

supervisors. The visits will include observations, interviews, focus group discussions

with learners, teachers and school head about the practices on SBM and the

management of the curriculum. The school visits will be done randomly and

unannounced. This is to capture the actual practices of schools and

school staff in delivering quality education to learners. Activities will include:

Process Check or Review. This entails actual observation or demonstration

of compliance to established standards. The review will be undertaken

using a predetermined checklist outlining the standard processes that will

assure quality. Personnel from the Division and/or District will do actual

observations and interviews.

Review of artifacts or MoVs. Artifacts validate the claims of individuals

regarding application of certain practices or processes.

Team discussion. This includes sharing of information and insights regarding

the findings of the inventory.

(4) Conduct of a perception survey. The Division and District will be conducting a

perception survey to get feedback on the performance and quality of services

provided by the schools from the community and other stakeholders.

(5) Share information through the conduct of Division team meetings.

5.4.3 Process Owner

The process owner in the conduct of monitoring school performance is the Assistant

Schools Division Superintendent (ASDS). As process owner, the ASDS shall:

ensure that all schools are visited and given technical assistance by the education

supervisors and district supervisors.

ensure integrity of the process by making sure proper planning, data gathering


techniques and documentation are followed by the education supervisors and

district supervisors.

incorporate into the Division Status Report the findings of the monitoring teams or individuals

act on the recommendations made by the monitoring teams especially on

problems and issues requiring Division interventions

On the actual conduct of the school performance monitoring, the education supervisors

and/or the district supervisors shall compose the monitoring team. The team shall gather

and collect data and information, analyze, recommend and provide immediate feedback

to school heads.

5.4.4 Documents and Reports

The data, information and insights to be generated in monitoring school performance are included in the Division Quarterly Status Report. The status report will have a section on the schools' performance. The results of the Process Check/Review will be discussed with the Division and District staff and will be used as a case study to improve and/or sustain the

quality of services provided by Division and District staff to schools and community learning

centers.

Other documents required in this process are:

School Monitoring Plan. Provides the objectives and scope of the monitoring visits

to be undertaken by the Division and District.

Report on School Visit. For every school monitoring activity, the monitoring team or individual shall prepare a School Visit Report.

Frequency of Visit Matrix. Provides information on the number of visits undertaken by the Division and District to schools; one way to tabulate such a matrix is sketched after this list.
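Purely as an illustration of what such a matrix could look like (the school names, months and record layout below are hypothetical and not prescribed by this manual), the sketch shows one way visit counts per school per month might be tabulated from individual visit records.

```python
# Illustrative sketch only: a hypothetical frequency-of-visit tabulation.
from collections import Counter

# Hypothetical visit records: (school, month, visiting office).
visits = [
    ("San Isidro ES", "Jun", "District"),
    ("San Isidro ES", "Aug", "Division"),
    ("Mabini NHS",    "Jul", "District"),
    ("Mabini NHS",    "Jul", "Division"),
]

# Count visits per (school, month) to fill the matrix cells.
matrix = Counter((school, month) for school, month, _ in visits)

months = ["Jun", "Jul", "Aug"]
print("School".ljust(16) + "".join(m.rjust(6) for m in months))
for school in sorted({s for s, _, _ in visits}):
    counts = "".join(str(matrix.get((school, m), 0)).rjust(6) for m in months)
    print(school.ljust(16) + counts)
```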

In order to ensure efficient operation of this process, the following reference documents are

required:

DEDP/ DAP. Outlines the programs and projects of the Division for schools.

Accepted SIP/ AIP. Represents the scope of work of the school for three years (SIP)

and within a year (AIP). Provides information on the objectives, outputs, targets,

and schedules of activities.

School Quarterly Progress Report. Provides information on the physical


accomplishment of the schools as per SIP/ AIP, facilitating factors and issues and

concerns affecting school performance.

5.5 Monitoring the ALS Programs

The Division manages two major ALS programs namely, the Basic Literacy Program (BLP) and

the Accreditation and Equivalency (A&E) Program. As part of the continuous improvement

process, the Division and District will monitor the learning sessions and the quality of the

contact period between the learners and the literacy facilitators and/or instructional

managers. The scope of the monitoring covers:

profile of learners participating in the BLP and A&E programs including the

individual learners' progress

performance of instructional managers and facilitators

performance of Service Providers (SP) implementing the BLP and A&E programs

based on the implementation plan as specified in the contract

quality of inputs or service provided by literacy facilitators and instructional

managers

5.5.1 Objectives

The objective of this process is to generate feedback and information from the Division's

implementation of the BLP and A&E programs. The results of the monitoring process will

enable the Division to:

• improve the efficiency of the Division and District in implementing the ALS programs

in order to widen its coverage and further increase participation

• strengthen the Division's partnership with private groups or organizations, private

and state universities and colleges and other government organizations acting as

Service Providers.

• design technical assistance support and capability building programs for literacy

facilitators and mobile teachers in order to make them more effective partners in

providing education for all and improving functional literacy.


5.5.2 M&E Activities

The District Supervisor will be conducting at least 5 visits within the contract period. The visits

focus on the Service Provider's compliance to the provisions of the ALS Service Contract.

Field visits to be undertaken at midpoint and at the end of the contract period of

the service providers.

Interviews on the practices of instructional managers and the literacy facilitators

5.5.3 Process Owner

The designated Education Supervisor for Alternative Learning System (ALS) is the process

owner for monitoring the performance of the service providers, instructional managers and

literacy facilitators. The ALS Division Supervisor will be assisted by District Supervisors in the

conduct of actual monitoring and report writing.

5.5.4 Documents and Reports

The documents and reports in this process will be used as input to the Division

Quarterly/Annual Status Report. These include:

(1) Training Completion Report on the orientation and training conducted by the

Division and District supervisors for the facilitators, instructional managers and

service providers

(2) Initial Report prepared by the District Supervisor. This will include information on the

enrollees or learners, facilitators and instructional managers, the activities observed

and the problems and issues to be resolved.

(3) Status reports of District Supervisor on the implementation of ALS programs at the

field level. The report shall include:

process documentation of actual implementation of ALS programs

information on the networking and coordination efforts with local

government units, other line agencies and non-government organizations

evaluation of performance of learners and service providers

(4) Mid Term Report and End of Contract Report.


DIVISION M&E SYSTEM

6.0 DIVISION QUALITY CONTROL AND ADJUSTMENT POINTS


6.0 DIVISION QUALITY CONTROL & ADJUSTMENT POINTS

6.1 Quality Control & Adjustment Points

A Control Point is a mechanism for continuous improvement. It is a time-based evaluation

activity designed to assess major accomplishments at every assigned time period and to

determine achievement of critical milestones within the implementation life cycle.

A Control Point provides the transition from one major implementation stage or milestone to the next. At each stage, a quality control point is installed. Each point is designed to check and review an implementation stage. The review process will be unique per control point gate, as the objectives, requirements and problems at each stage vary. Each review gate provides the transition to the succeeding project stage.

The Control Points represent the evaluation

activities of the Division. Predetermined evaluation points are set up which will allow the

Division to assess the quality of its programs, projects and technical assistance activities to

schools and community learning centers. These provide valuable information and insights

on the effectiveness of the Division.

The Control Point is also an adjustment or

enhancement mechanism. The Division, based

on practices proven effective and lessons

learned from previous implementation, uses the

Control Point as a mechanism to adjust

strategies, programs and projects to improve

efficiency and increase likelihood of achieving

desired objectives.

The control points and the adjustment points

provide the operational framework of the

Division M&E System. The operational framework

will help establish the mechanism for systematically integrating data collection, analysis and

decision making into one cohesive process. The major processes of technical assistance,

planning, monitoring and evaluation are integrated and systematically designed into one

cohesive process called Quality Control and Adjustment Points.

Figure 6-1 Control Point (transition point from one stage to another)

Figure 6-2 Adjustments (continuous improvement)

Value Added from Quality Control and Adjustment Points

Aside from providing the operational framework of the Division M&E System, the following

are some of the potential benefits in using the quality control and adjustment points:

Communicates the intent of the Division M&E. Monitoring and evaluation can be

abused. It may be used as a "counter-intelligence agency," a mechanism for fault-finding and punishing individuals. As a result, implementers conceal the true and actual situation from evaluators out of fear and distrust of the system. Such behavior typically happens when the objectives of the evaluation are not disclosed, performance measures keep changing, and unscheduled, surprise evaluations are done.

The Quality Control and Adjustment Points provide a clear description of the M&E

objectives, performance measures, activities and resource requirements of every

review activity. The schools, community learning centers and stakeholders are well

informed about the intent of each review.

Ensures necessary implementation mechanisms and critical support infrastructure

are installed and ready for use. The Quality Control and Adjustment Points

establish review mechanisms that will ensure the installation of mechanisms or

systems and infrastructure necessary to the achievement of objectives and targets

in the plan. These mechanisms ensure the setting-up of management systems that

will facilitate the implementation of programs and projects.

Confines implementation problems to one stage. One of the common signs of a

poor M&E system is the presence of recurring problems. These problems are a

product of inaction and wrong decisions made early in the implementation stage.

A responsive M&E system should be able to detect and predict these problems.

The Division M&E is a mechanism for screening problems, including potential ones.

The stage approach allows proper and timely management of issues and concerns

before these escalate into problems that will affect the deliverables and results.

Control Points are installed before and after every major Division milestone. These

are designed to enable the Division staff to reflect on decisions and activities

already undertaken and, at the same time, allow them to be forward-looking by

assessing the implications of their previous actions.

Anticipates issues and risks. The Adjustment Points serve as an integrated review

mechanism designed to limit or reduce the exposure of the Division and schools to

issues and risks. Review points are established to identify, analyze and make

immediate adjustment in the strategies before issues and concerns evolve into

problems.

Systematizes evaluation and decision making. The Quality Control and Adjustment

Points provide the rationale for data collection, reporting, communication and


decision making. It highlights the importance of every major evaluation activity for making decisions.

Some Guideposts in Using the Quality Control and Adjustment Points

In determining the control points to be used, consider the following:

The planning stage is the most critical stage in the implementation life cycle. If plans are vague, managing and implementing them is difficult. Changing a plan during the planning stage is far more cost-efficient, and the value added in enhancing the plan is highest when done during the preparation stage.

During start up, watch out for the first 15% of implementation; if after six months the

accomplishment remains at 15%, something is terribly wrong.

Quality is a prevention process. It is a product of processes set up to ensure services

and products are fit for use.

Set up adjustment points. Every major assessment should be followed by adjustments and/or enhancements in the plan, strategies and design. Every major evaluation event should be preceded by major adjustment efforts.

No report-driven monitoring and evaluation. Evaluation should be used as a basis for adjustments and for improving performance.

At the end of implementation, beware of the 90% accomplishment. The last 10% is

usually the most difficult to implement.


Figure 6-3 Division Quality Control and Adjustment Points

6.2 Division Quality Control and Adjustment Points

Quality Control & Adjustment Points are established in order to ensure relevant, up-to-date

and timely technical assistance of the Division and districts to schools and community

learning centers. Control points are strategically placed at every major milestone in the

DEDP implementation life cycle. The control points are mechanisms to steer and manage

technical assistance to schools and learning centers.

The 5 Division Quality Control and Adjustment Points are:

SIP Appraisal (SA). A quality control mechanism designed to make sure that SIPs

are able to meet the criteria of a good plan: relevance, responsiveness and

feasibility. This is also the review point where the SIP is assessed in terms of

completeness of information and in terms of its fit for use as a reference for

monitoring and evaluation.

Start Up Review (SUR). Ensures the readiness of schools to implement the 3-year SIP. This quality control point evaluates the school's compliance in setting up critical management mechanisms before fully implementing the SIP. An example is the setting up of the M&E system.

Annual Implementation Review (AIR). A major review of the Division and schools'

implementation of their programs and projects. Assessments are made in terms of

achievements and accomplishments based on the objectives and targets in the

DEDP and SIP. The AIR is used as an adjustment point for the next implementation


year.

Mid-Term Review (MTR). A review undertaken after the first 3 years of the DEDP (at

the end of the SIP cycle). The Division evaluates its impact on the learners and its

effectiveness based on the schools' achievement of their outcomes. The results of

the MTR will serve as a major input to adjusting the next 3 years of the DEDP.

Outcome Evaluation (OE). A post implementation review conducted at the end of

the DEDP implementation. The main objective is to determine whether the outcome

level objectives and goals in the DEDP are achieved. OE investigates factors that

contributed to success and/or hindered achievement of targets. The results of the

OE will be used as input to the preparation of the next cycle DEDP.

6.3 Monitoring Process and the Quality Control & Adjustment Points

The Division M&E System is composed of two major systems: the Monitoring Process

(discussed in Part 5) and the Quality Control & Adjustment Points. These two systems gather

different but related information.

The Monitoring Process represents the daily, weekly, monthly and quarterly efforts to track

and improve the delivery of services to schools. The data, information and insights collected

in this process are immediately used for adjustments, to solve issues and problems and to

ensure the implementation progress is on track within scope, time and cost.

Figure 6-4 Division Monitoring Interface with Division Quality Control and Adjustment Points

On the other hand, the Quality Control and Adjustment Points are major evaluation points

set up to measure the achievement of outcomes, initial gains and major milestones (such as

appraisal and start up). It uses the data and information from the Monitoring Process to


provide "background stories" about what happened, what transpired, and the factors that

influenced the achievement of the major milestones.

Figure 6-4 illustrates the interaction between the two systems.

6.4 Quality Control Point #1: SIP Appraisal (SA)

The Appraisal Process is one of the quality control mechanisms of the Division and a major

activity undertaken by the Division during the preparation of the DEDP. Mainly, it is a

planning activity designed to ensure quality plans both at the school and Division levels.

SIP Appraisal has a twofold focus. First, it is a quality control mechanism to ensure the correctness and fitness for use of the school plans. It

provides the venue for both the

Division and schools to collaborate

on strengthening the school

performance. Specifically, the

Division, through the Division QMT,

reviews the plan in terms of

relevance, feasibility and

sustainability as well as in format

and presentation. The Division

provides suggestions to enhance the plan and increase its likelihood of success. The

appraisal process ends when the Division accepts the SIP for implementation.

Secondly, the SIP Appraisal serves as a data collection activity. The process of appraisal

provides the Division with detailed information on the programs and projects of the schools,

furnishing adequate information and insights to the Division on the type of technical

support the school will need to successfully achieve the objectives in the SIP. This information and these insights serve as inputs to the DEDP.

6.4.1 Guiding Principles

A poorly prepared plan is the most common cause of inefficient implementation and non-attainment of desired objectives. The cost to revise or troubleshoot a wayward implementation is much higher than the cost of revising a plan at the preparation stage. It is, therefore, important to focus more attention and spend more time ensuring the quality of plans. This is shown in Figure 6-6.

Schools will need all the assistance they can get in implementing SBM. The appraisal

process is one of the most concrete modes of assistance by the Division to the

school heads. Guiding the schools in developing a quality plan increases the

chances of an efficient and effective SBM implementation.

Figure 6-5 Appraisal Process (Input: schools prepare and submit their plans; Process: the Division assists the schools in preparing a quality plan and prepares programs and projects that will support the SIP; Output: the plan is accepted and serves as an input to the Division plan)

Before proceeding immediately to implementation, the schools need to satisfy

stakeholders about the relevance and feasibility of the proposed programs and

projects.

An approved plan ceases to be a plan of the proponent. Once approved, the plan

becomes a plan both of the proponent and decision maker, who are now both

accountable for the successful implementation of the plan.

Figure 6-6 Importance of Planning

6.4.2 Objectives

The SIP Appraisal Process is established to assist schools in the preparation of SIP. It is a

quality control mechanism of the Division that will assure relevance, feasibility and

sustainability of education programs and projects of the schools.

The main objective of this control point is to ensure the schools prepare a relevant and

implementable plan. Specifically, the Division conducts an appraisal of SIP to warrant the

following:

the statement of the problems and objectives is clear. The baseline situation and the desired situation are clearly explained and show a logical link.

SIP objectives and targets are specific, measurable and reasonable.

strategies and proposed programs and projects in the SIP are relevant. This means

that there is a logical link between the baseline situation and the proposed

strategies and programs to bring about changes or improvements in the situation.

Relevance means the plan will be able to solve the problems of the school and/or


the strategies match with available opportunities.

the schools will be able to sustain their operations by appraising the SIP and

evaluating the capacity of the school to implement the SIP, assessing the support of

stakeholders and availability or sources of funds that can back up and sustain

school programs and projects.

There are enough details in the implementation plan that can be used as input in

the development of monitoring instruments. This means that the milestones and

targets are well defined, schedules and dates are clearly specified and the

resources required are distinctly identified and outlined in the plan.

Figure 6-7 Appraisal Process Flow

6.4.3 Process Description

The Division will implement the following appraisal activities:

(1) Initial Review - check compliance to requirements. The first major step in the

appraisal process is to check the completeness of information and the compliance

to the agreed SIP format. The objective of the compliance check is to ensure the information is complete before it is handed over to the review team.

Refer to SIP Appraisal Checklist Item #1, Completeness of SIP.

(2) Assess the relevance of the SIP. Review of the Rationale or Background Section

and the Goal Chart or Objectives Section. This review establishes the relevance of

the SIP by assessing the match between the baseline situation (problems,


opportunities, strengths and weaknesses) and the desired future situation. There has

to be an agreement between the school and the Division about the baseline

situation and the desired future situation (including targets). When this requirement is

satisfied, proceed to the next step. If relevance is not satisfied, do not proceed to

the next step. Return the SIP for revision.

Refer to SIP Appraisal Checklist Item #2, Relevance of the Plan.

(3) Assess correctness of strategies. The appraisal, at this point, focuses on the

feasibility of strategies as outlined in the detailed implementation plan. This activity

will include the review of the following:

Individual programs and projects proposed in the SIP. Examine the

technical correctness of these programs and projects. Assessment includes

identifying other alternatives that may produce better results to achieve the

Outcomes.

Link between future desired situation and the proposed programs and

projects. Assess whether the proposed Outputs/Contributory Objectives are

complete and necessary.

Refer to SIP Appraisal Checklist Item #3, Necessary and Adequate.

(4) Assess the feasibility of the plan. The appraisal shall not be limited to the review of

the document but shall also include assessment of the school's capacity to

implement and sustain the plan.

Assess capacity of school to implement the SIP including the programs and

projects.

Assess capacity of stakeholders to support the school in implementing the

SIP strategies

Review the costings and estimates. The Division QMT also reviews the

assumptions and cost estimates presented in the SIP.

Refer to SIP Appraisal Checklist Items #4, #5 and #6.

(5) After appraising the relevance of the SIP and the technical correctness of the

proposed programs and projects, the last item to review and enhance is the

completeness of the Implementation Plan. This means checking the following must-have items:

targets and milestones are clearly specified

activities are broken down to the desired level

the relationships of the activities (network) are logically sequenced

activities are assigned resources (human, material, equipment, etc.)

activities are specified on a monthly (not quarterly) basis


Gantt or bar chart showing the activities on a time scale (months)

budgetary requirements are specified per activity and per month

Refer to SIP Appraisal Checklist Item #7, Detailed Implementation Plan.

Table 6-1 Guide to SIP Appraisal provides a more detailed listing of the areas to

consider in reviewing and enhancing the SIPs. It also contains possible decisions or

actions the Division may take in different scenarios.

(6) Feedback and Revision. After a thorough review of the SIP, the Division QMT will

provide feedback to the schools on areas for enhancement. The school is expected

to revise and/or enhance the SIP and submit it back for acceptance. Specifically,

this activity will include the following:

Write recommendations (alternative interventions) and suggestions on the

implementation plan.

Communicate findings and provide next steps instructions to schools.

(7) Acceptance. After the school satisfactorily complies with the requirements, the Division QMT endorses the SIP to the SDS for acceptance. (A simplified sketch of this decision sequence is given below.)
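As a non-authoritative illustration of the sequence just described (the gate names, messages and findings below are hypothetical; the actual judgment rests with the Division QMT using the Appraisal Checklist and Table 6-1), the sketch walks an SIP through the same gates in order and stops at the first failed check.

```python
# Illustrative sketch only: the appraisal gates of steps (1) to (5),
# applied to hypothetical reviewer findings.

APPRAISAL_GATES = [
    ("complete", "Return SIP: incomplete or not compliant with the agreed format"),
    ("relevant", "Return SIP: objectives do not match the baseline situation"),
    ("correct",  "Return SIP: proposed programs/projects not necessary or adequate"),
    ("feasible", "Return SIP: capacity, stakeholder support or funding not assured"),
    ("detailed", "Return SIP: implementation plan lacks targets, schedules or costings"),
]

def appraise(findings: dict) -> str:
    """Check each gate in order; the first unmet gate returns the SIP for revision."""
    for key, action in APPRAISAL_GATES:
        if not findings.get(key, False):
            return action
    return "Endorse the SIP to the SDS for acceptance"

# Hypothetical example: complete and relevant, but the strategies still need rework.
print(appraise({"complete": True, "relevant": True, "correct": False}))
```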


Table 6-1 Guide to SIP Appraisal

(Each entry lists the Focus of Appraisal, the Inquiry Areas, and the Possible Actions.)

1. Completeness of Document
Focus: To ensure the SIP is complete in terms of data, information and supporting documents.
Inquiry Areas: Does the SIP follow or comply with the prescribed format? Are the data, information and assumptions used correct and valid? Are there supporting documents? Were the stakeholders involved in, or did they participate in, the preparation of the plan?
Possible Actions: Return the SIP when it does not comply with the requirements. Proceed to the assessment of relevance when the SIP is deemed complete.

2. Relevance of the Plan (background/rationale/objectives section)
Focus: To determine if the desired objectives in the SIP match the needs and opportunities listed in the rationale or background.
Inquiry Areas: Are problems, needs and opportunities clearly described? Is there a supporting analysis? Do the objectives match the needs and opportunities identified?
Possible Actions: Return the SIP when the needs and opportunities identified are vague or when the objectives do not match the situations described. Proceed to the assessment of feasibility when relevance is clearly established or described.

3. Necessary and Adequate (Objectives, Outputs/Component/Implementation Plan)
Focus: To establish a direct link between the objectives and the proposed programs and projects in the SIP, and to provide other and better alternatives in achieving the desired objectives.
Inquiry Areas: Are the outputs/deliverables identified sufficient or adequate to achieve the desired objectives? Are the outputs/deliverables identified necessary to achieve the desired objectives? Are there better alternatives (outputs) available that will help achieve the desired objectives?
Possible Actions: If the proposed deliverables are inadequate and/or unnecessary to achieve the objectives, return the SIP for enhancements and suggest better alternatives to the school. Proceed to the next appraisal area when the outputs are considered complete and necessary.

4. Capacity of School
Focus: To determine the capacity of the school to implement the proposed programs and projects in the SIP.
Inquiry Areas: Can the school head implement and manage the programs and projects in the SIP? Can the teachers deliver programs and projects efficiently and effectively? What are the capability building requirements (needed to implement the SIP) of the school head and teachers?
Possible Actions: If yes, proceed to the next appraisal area. If no, consider the school requirements in the DEDP and make sure technical assistance support to schools is incorporated in the DEDP.

5. Stakeholders Support
Focus: To determine the level of support the stakeholders can provide to schools.
Inquiry Areas: Are the stakeholders ready and willing to participate in and support the implementation of the plan? Are they capable?
Possible Actions: If yes, proceed to the next appraisal area. If no, consider the school requirements in the DEDP and make sure technical assistance support to schools is incorporated in the DEDP.

6. Resource Generation
Focus: To determine the feasibility of implementing the plan considering the cost requirements.
Inquiry Areas: Are the cost requirements reasonable? Are there other fund sources?
Possible Actions: If yes, proceed to checking the implementation plan. If no, determine whether the Division can assist in looking for fund sources; if it cannot, downsize the plan.

7. Detailed Implementation Plan
Focus: To assess the implementability of the plan and to ensure the necessary elements (milestones and targets, resources, schedule and cost) are present in the plan.
Inquiry Areas: Are the milestones and targets clear and correct? Is there a work breakdown of outputs/activities? Are target dates specified for each milestone? Are resources and cost allotted for each milestone and activity? Is there a cash flow matrix?
Possible Actions: If yes, endorse the plan for acceptance. If no and major changes are required, return the plan for revision. If no and the revisions required are considered minor, accept the plan conditionally; the school is to submit a revised implementation plan before or during the start up stage.

6.4.4 Knowledge and Skills Requirements

Individuals who will be involved in the Appraisal Process need to have adequate

background and experience in planning and school operation. The following are some of

the competencies required for an appraisal team:

Understand the needs, problems and issues in the locality. These include knowledge

of the school programs and projects that succeeded and those that failed and the

historical performance of the school.

Good understanding of the community relationships, especially the stakeholders.

This will help the appraiser make suggestions on how to maximize the support from

stakeholders.

Good analytical skills, especially in problem analysis, opportunity analysis and in

appreciative inquiry.

Have actually used planning tools like the goal chart (logframe), work breakdown

structure, Gantt or bar chart, network chart in the preparation of a plan and in

implementing the same.

Last but not least, subject matter specialists who will assist in the review and enhancement of the proposed programs and projects of the schools.


6.4.5 Process Output

The main output of the appraisal process is the accepted SIP. Specifically, the objectives,

targets, outputs or deliverables described in the SIP represent the collective agreement of

the schools and the Division and District. The accepted SIP becomes a plan of both the

school and the Division, which will be known as the SIP Baseline Plan and will be used as

basis for the monitoring and evaluation.

The SIP Appraisal Checklist, which contains the comments, findings and suggestions of the

Division QMT, will also be documented and archived for future reference.

6.4.6 Evaluation Tools and Techniques

The appraisal process may be reinforced by selected data gathering and analytical

techniques. The objective is to be able to get enough information (through triangulation)

that will help the QMT to objectively review and enhance the SIPs for appraisal. These

include, but are not limited to, the following:

Document review. Most appropriate to use in checking the completeness and

correctness of data and information.

Interviews. Interviews may be conducted when the QMT needs to validate some

information that contradict the documented information. Include interviews with

teachers and school stakeholders, including learners.

Panel review. The QMT may opt for a panel review that will allow both the QMT and

the school head and teachers to answer and clarify questions.

Regardless of the techniques used, the main instrument for the appraisal is the Appraisal Checklist. The checklist itemizes the areas to be assessed and appraised during

the review. This will serve as a guide for the Division QMT in implementing the Appraisal

Process.


Table 6-2 SIP Appraisal Checklist¹

(For each item below, the Remarks column is marked Yes, No, or More Info Needed.)

1. Completeness of Document
Compliance to format: The SIP submitted followed the official SIP format.
Endorsement: The SIP submitted contains the signatures of school stakeholders.
Documentation: All sections/parts of the SIP are filled up.
Situational Analysis: Are the problems, issues, needs and opportunities clearly articulated in the background/rationale section?
Facts: Do the data and information quoted in the SIP come from reliable or authoritative sources?
Goal Chart: Is the goal chart correctly formulated? Are the objectives and indicators SMARTly formulated?
Purpose level objectives: The SIP contains the 4 purpose level objectives, on enrollment, retention, completion and achievement.
Implementation Plan: Is there a detailed implementation plan? Does it contain information on activities to be undertaken, people responsible and budget?
Attachments complete: All supporting data, tables, graphs and other documents are accounted for.

2. Relevance of the Plan
Vision Statement: Does the vision statement paint a picture of the future situation of the school?
Situational Analysis: The problems, issues, needs and opportunities described in the SIP are real and based on sound analysis.
Target Groups: Needs of the different target groups are clearly identified.
SIP Objectives and Targets: The objectives (in the goal chart) of the SIP are logically linked to the problems, issues, needs and opportunities described in the plan.
Targets: Targets are reasonable and attainable.
Support to National Programs: Attainment of the objectives and targets supports the national programs and international commitments.

3. Necessary and Adequate
Complete Outputs: The Outputs (programs and projects) are adequate to achieve the objectives.
Necessary: All the outputs listed are necessary to achieve the objectives.

4. Capacity of School
School Based Management: Does the school head have adequate experience in managing school programs and projects?
School Based Management: Is the school head trained on SBM and other related areas?
Teachers: Are all teachers capable of implementing the programs and projects in the SIP?
Teachers: Are there enough preparations and training for teachers to handle the programs and projects in the SIP?

5. Stakeholders Support
School Governing Council: Is the SGC operational?
Parents Community Teachers Association (PTCA): Is the PTCA active and supportive of school programs and projects?
Barangay/Municipal/City LGU: Is the LGU actively involved in school programs and projects?
Others: Are there organizations operating in the area that are supportive of education and other related undertakings?

6. Resource Generation
Budget: Is the total estimated cost required to implement the school programs and projects reasonable?
Fund Sources: Are there fund sources available?
Resource mobilization: Is the school head capable of generating financial support from different sources?

7. Detailed Implementation Plan
Activities: Are the activities listed in the WFP directly linked to the outputs/deliverables listed in the goal chart?
Work and Financial Plan: Is there a WFP? Is it presented on a monthly basis?
Targets and Schedules: Are targets plotted monthly?
Cash Flow: Are cash flow requirements plotted monthly?
Persons Responsible: Is there an assigned individual per activity?
Monitoring and Evaluation: Are M&E activities reflected in the WFP? Are there assigned resources for M&E?

¹ Checklist to be used by the Division Quality Management Team to appraise the SIPs. Additional items may be added depending on the requirements and/or intent of the Division.


6.5 Quality Control Point #2: Start Up Review

Start Up refers to preparatory activities to be undertaken by the Division and schools before

fully implementing the programs and projects contained in the education plans. These are

activities undertaken after the appraisal stage where the plan is formally accepted. This is

also known as the mobilization stage.

Among the Start Up related activities the Division and schools will implement are:

Kick off meeting. A kick off meeting signals the start of the start up stage. The

meeting brings together all the internal and key external stakeholders. The meeting

will serve as an orientation about the objectives and scope of the approved

education plan. This is to ensure that the management and staff have the same

understanding of the targets and objectives of the plan and there is agreement or

consensus in the strategies and activities to be undertaken.

Revise or update the plan. Based on the comments and suggestions at the

appraisal stage, the plan is adjusted or enhanced. Enhancements are made in

targets, strategies, events and allocation of resources.

Assign individuals to task and deliverables. A significant activity at this stage is the

mobilization of individuals who will be assigned to perform tasks and deliverables. It

is important to make responsibilities clear to all. Vague responsibility assignments

are often the major cause of conflict between units and individuals.

Figure 6-8 Start Up Stage (transition from planning to implementation)


Four people named Everybody, Somebody, Anybody and Nobody worked together. An important Outcome needed managing, and Everybody was sure that Somebody would do it. Anybody could have done it, but Nobody actually did it. Somebody got angry, because it was really Everybody's job. Everybody thought Anybody could do it, but Nobody realized that Somebody wouldn't. As it turned out, Everybody blamed Somebody when Nobody did what Anybody could have done!

- Author Unknown

Advocacy and resource mobilization. One of the important start up activities is the advocacy work, especially in generating support and/or resources from stakeholders.

Prepare status report. The report to be prepared at this stage will serve as the inception report.

The Start Up stage is also a sustainability mechanism. It involves setting up critical systems and rallying and mobilizing support for the plans. Among the mechanisms that must be set up at the start of implementation are the following:

(1) Participation of stakeholders. This refers to the stakeholders' understanding of the plan, especially the target benefits and improvements.

(2) Communication system. This includes setting up the mechanism for sharing and disseminating data and information throughout the organization. This will enable the Division, district and schools to:

coordinate efforts more efficiently, thus avoiding duplication

gain up-to-date information about the status of implementation, including issues and problems, and make timely corrective actions

know about the policies and directions of the organization in order to synchronize decisions and actions at their level

(3) Monitoring, Evaluation and Adjustment system. Plans are best estimates of the future. However, even a well-written plan will never be able to predict the future situation in exact detail. At the start of implementation, therefore, the mechanism for tracking, analyzing and adjusting the implementation plan should already be in place.

The inability to set up the critical mechanisms during start up and the failure to implement the mobilization activities often lead to implementation difficulties and inefficiency. Based on the experience of many, misunderstandings on the scope of the plan and on the roles and responsibilities of individuals could have been avoided had honest-to-goodness start up activities been undertaken. Recurring problems manifested through delays, cost overruns, poor quality of services and non-achievement of targets and outcomes are often traced to activities related to scope and role clarification and to the setting up of systems that facilitate information sharing and decision making.

In order to minimize, if not eradicate, implementation problems, the Start Up Review Process is installed as one of the quality control mechanisms in the Division M&E System.

6.5.1 Guiding Principles

Start up activities are critical to sustaining an efficient implementation. Therefore, start up activities should be planned and allocated resources. And if start up is an important part of the implementation phase, it should also be quality assured.

All stakeholders, internal or external, must be clear on what is to be achieved, how the outcomes will be achieved and what their roles and responsibilities are.

Commitment of stakeholders should be assured at the start of the implementation.

Recurring problems are symptoms of missing management systems or poorly installed mechanisms. In order to prevent or avoid these, effort must be spent on ensuring the management systems are in place before shifting to high gear in the implementation. The start up activities serve as the system check.

A good start increases the chances of successfully implementing a plan.

6.5.2 Objectives

The main objective of the Start Up Review process is to ensure the readiness of the schools to implement the SIP. Readiness is determined when the school is able to implement the required mobilization activities and has established the critical management systems that will sustain the implementation of the school's SIP.

Specifically, the Start Up Review will allow the Division and District to:

Pinpoint schools that are ready to implement the SIP and schools needing assistance in jump starting their plans. This will allow the Division and District to focus assistance on schools having difficulty launching their SIPs.

Synchronize the Division M&E system with the school M&E system. At this stage, the Division is also initiating its DEDP implementation.

In the case of the Division's alternative learning programs, start up activities are undertaken to ensure the readiness of the accredited service providers to implement the Basic Literacy Program and the A&E Program.


Figure 6-9 Start Up Review Process Flow (prepare for Start Up review; school visit; prepare documentation report; output: school ready to implement)

6.5.3 Process Description

The Start Up Review process is a three-month activity implemented immediately after the acceptance of the SIP and undertaken through school visits. In general, the Division reviews the start up related activities of the school and determines whether they comply with the standard process and requirements. Specifically, the start up review activities will include:

(1) Preparation for start up review. The Division QMT determines which schools will need more immediate assistance. As a rule of thumb, the focus will be on schools that had difficulty complying with the SIP appraisal (due to poor and vague plans); generally, the same group will have difficulty starting up.

(2) School visit. The focus of the school visit is to verify the activities undertaken by the school after acceptance of the SIP. Verification may be undertaken through:

(a) Interviews with the school head and teachers regarding the start up related activities conducted, and discussion of the mobilization problems and issues encountered

(b) Review of the following documents:

SIP. To validate whether the school has incorporated in the plan the suggestions provided by the Division QMT

AIP. To determine whether the school has already firmed up its plan for the 1st year of SIP implementation

AIP Monitoring Sheet. To determine if the school has finalized the AIP Monitoring Sheet, which will be used later in tracking school progress


Responsibility Assignment Matrix. A document that shows the assignments of teachers and non-teaching staff

(c) Assist schools in the start up activities. While in the school, the Division QMT may assist the school head and staff in complying with the requirements of the Start Up process.

(3) Preparation of the Start Up process report. The QMT will prepare a short report on the start up assistance provided to schools, including a discussion of the problems and issues that will require interventions from the Division.

6.5.4 Knowledge and Skills Requirements

The Division and District staff who will comprise the start up review team must possess the following characteristics:

Knowledgeable about the agreements made between the schools and the Division during the appraisal process

Can help the school install its M&E system

Good project management skills, especially in the use of planning tools such as work breakdown structure, network chart, Gantt or bar chart and costing

Good soft project management skills, including managing and motivating people, and negotiation skills

6.5.5 Process Output

The main report is the Division Start Up Review Report. This report is a documentation of the schools' start up accomplishments, start up problems and difficulties, and the assistance provided by the Division to jump start the SIP implementation.

The schools will also submit a status report on the implementation of their own start up activities.

6.5.6 Evaluation Tools and Techniques

A Start Up Review Checklist is developed to facilitate the process. The checklist contains the start up activities that must be implemented by the schools.

The checklist is reinforced using the following methods:

Document review. Includes review of the following documents: AIP, School INSET Plan, School Monitoring Sheet, Responsibility Assignment Matrix and the School Inception Report (Quarterly Status Report).

Interviews. Interviews with teachers and non-teaching staff on the start up related activities undertaken by the school, including their understanding of their roles and responsibilities.

Process review. Includes observation and review of the kick off meeting, negotiation, and the setting up of the M&E system.

Table 6-3 Start Up Checklist

Start Up Requirements (for each item, check Yes, On Going, or Not Started; use the Remarks column for notes)

Revision/Enhancement of School Plan
1. The school has incorporated in the SIP or AIP the suggestions/recommendations and agreements from the appraisal process
2. 1st Year AIP already finalized, including targets

Awareness of school plans and programs
3. Kick off meeting undertaken. The meeting was attended by teachers, non-teaching staff and school stakeholders.
4. Teachers and non-teaching staff are already aware of their roles and responsibilities.

Advocacy
5. School head has visited and oriented key stakeholders about the support needed by the school

School INSET
6. The school has a capability building plan for teachers and non-teaching staff

School M&E
7. AIP Monitoring Sheet already updated and ready for use
8. Orientation on School M&E completed. This means teachers and non-teaching staff are already aware of the reporting requirements and performance parameters of the school
9. Inception Report, representing the quarterly status report of the school, completed and ready for submission
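Where a Division wants to tally checklist results across many schools, the short Python sketch below is one hedged way to summarize encoded responses and flag schools that may need help jump starting their SIP. It is not part of the prescribed M&E tools; the school names, responses and the flagging rule are hypothetical.

```python
# Minimal sketch: tally Start Up Checklist responses per school and flag
# schools that may need assistance in jump starting their SIP.
# School names, responses and the flagging rule are hypothetical examples.

CHECKLIST_ITEMS = 9  # items 1-9 of Table 6-3

def summarize(responses):
    """Count responses and flag the school if items are not yet started."""
    counts = {"Yes": 0, "On Going": 0, "Not Started": 0}
    for status in responses:
        counts[status] += 1
    needs_assistance = counts["Not Started"] > 0 or counts["Yes"] < CHECKLIST_ITEMS // 2
    return counts, needs_assistance

schools = {
    "School A": ["Yes"] * 9,
    "School B": ["Yes", "On Going", "Not Started", "Yes", "On Going",
                 "Not Started", "Yes", "Not Started", "On Going"],
}

for name, responses in schools.items():
    counts, flag = summarize(responses)
    print(name, counts, "-> needs assistance" if flag else "-> ready")
```

A tally of this kind simply supports the review; the judgment on readiness still rests with the Division QMT during the school visit.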


6.6 Quality Control Point #3: Annual Implementation Review (AIR)

The Annual Implementation Review (AIR) is a quality control mechanism conducted after 1 year of implementation. The review compares the actual achievements and accomplishments of the Division, districts and schools against the achievements and accomplishments expected in the plan.

The AIR is used as a major adjustment point for both the Division and the schools in setting their strategies and activities for the next implementation year.

6.6.1 Guiding Principles

It is difficult to swallow an elephant whole; it is best to chop the elephant into pieces. Managing the implementation per segment allows more control over the quality of services and products.

Is the implementation on the right track? Continuous improvement will work only when there is regular review or evaluation of implementation.

Monitoring and evaluation does not wait for problems to occur. It provides a systematic and proactive venue for avoiding and mitigating issues and problems.

Recurring problems are symptoms of a poorly functioning monitoring system. A regular review is undertaken to solve the problem once and for all and prevent it from happening again.

6.6.2 Objectives

The annual review is undertaken to assess the initial gains generated after 1 year of implementation. It is a mechanism to track the achievement of outcomes (Division and school level) on a year-to-year basis. It is also a mechanism for assessing the efficiency of Division units, districts and schools in delivering the target outputs in the DEDP and SIP.

The Annual Implementation Review is designed to generate information and insights that will be useful for continuous improvement and in solving recurrent problems. The Review will also serve as a major adjustment point for the plans and programs of the Division and schools.

Specifically, the annual review will provide the Division with the following information:


Programs and projects that produced positive and/or encouraging (initial) results. The review will enable the Division to reinforce programs and projects that work and to improve the design of programs that produced negative results.

Technical assistance processes or practices that need further enhancement.

Accomplishment to date. An annual review provides an overall status of accomplishment since the implementation started.

Factors that facilitated implementation, as well as factors that adversely affected the delivery of services and assistance.

Figure 6-11 Annual Implementation Review Process Flow (output: Annual Plan; next process: Mid-Term Review)

6.6.3 Process Description

The Review is undertaken after 1 implementation year.

(1) Consolidate reports. At the end of the year, the schools and the Division will be completing their annual reports. A school annual report will contain the achievements (Purpose level, SIP) and the accomplishments (AIP) of the school. All school reports will be used as input to the preparation of the Division Annual Report.

(2) Analyze achievements and accomplishments. This is a pre-assessment (workshop) activity. The main task is to sort out the achievements and accomplishments of schools, community learning centers, districts and the Division and put them into one cohesive document called the Division Report Card.

To facilitate the analysis, an AIR Implementation Guide is developed. It contains the important areas to analyze (in an annual review) and provides process questions that will help the QMT analyze and formulate recommendations for the next implementation year.

(3) Assess implementation. Using the outputs of activities 1 and 2, the Division QMT will conduct a one- to two-day assessment workshop. This will be attended by school heads, district staff and Division staff. The objective of the workshop is to assess and identify the factors that facilitated the achievements and accomplishments and to collectively identify the issues and external factors that contributed to difficulties in implementation.

Depending on the requirements, size and other factors, the assessment may be undertaken through several options:

Division-wide assessment and planning workshop. The Division will conduct only 1 workshop to be attended by all schools, districts and Division units

Division-wide assessment and planning workshop by level. Similar to the option above but divided into elementary and secondary schools

Assessment per district or cluster. Simultaneous conduct of assessment and planning workshops to be facilitated by Division QMTs

The assessment workshop will focus on the following areas:

Year-end accomplishment as per AIP and DAP. Need to identify the factors that contributed to and the factors that hindered the efficient implementation of the plans

Initial gains per school outcomes. The assessment will include discussion and analysis of the school performance indicators (enrollment, retention, completion and achievement). During the workshop, discussion will focus on programs and projects to continue, documentation of lessons learned and propagation of effective practices.

Performance of teachers and school heads.

Accomplishment in ALS programs. Assessment includes the performance of community learning centers, service providers, facilitators and instructional managers

Division operations. Assessment of the Division's application of processes (standards) and practices.

The results of the assessment will be used as input to the finalization of the Division Annual Report and to the preparation of the next Division Annual Plan.

(4) Prepare next year implementation plan. Using the results of the assessment, the Division and the schools will revisit the DEDP and SIP to assess whether these need any adjustment.

(5) Document next year implementation plans. The DAP and AIPs will be documented and used as basis for monitoring implementation progress in the next year.

6.6.4 Knowledge and Skills Requirements

The Division and District staff who will spearhead the annual review process must possess the following characteristics:

has been involved in and/or is knowledgeable about the Division programs and projects, as well as the issues and concerns affecting school and community learning center operations

has a basic handle on planning tools and techniques such as logframe, work breakdown structure, network chart, Gantt or bar chart and costing techniques

competencies in progress monitoring and evaluation, and understands the concepts of physical accomplishments, scope management, scope creep, S-curve and initial gains

can write technical reports

computing skills, especially the use of word processing and spreadsheet software

6.6.5 Process Outputs

The major outputs of the Annual Implementation Review are:

Division Annual Report. Contains the achievements and accomplishments of the Division in one year. The report also contains a discussion of the factors that helped the implementation and of the issues and difficulties experienced by the Division, districts and schools.

Division Report Card. An end-of-year document that provides a comprehensive picture of the Division's performance. It contains information about the Division, including Goal level (school performance) and Outcome level (performance of school heads, teachers, instructional managers and facilitators) indicators.

DEDP Annual Plan (next year). Based on the achievements and accomplishments in the previous year, the Division prepares or adjusts the DAP.

School Annual Implementation Plan (next year).

6.6.6 Evaluation Tools and Techniques

The following are some of the M&E tools that aid the implementation of the Annual Implementation Review (AIR):


(1) Line of Balance or S-curve. This tool provides an overall status of accomplishment. It shows (in a diagram) the actual accomplishments of the Division and schools versus the outputs planned.
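For illustration only, the sketch below shows the arithmetic behind an S-curve comparison: cumulative planned outputs versus cumulative actual outputs per month, expressed as a percentage of the annual target. The monthly figures are hypothetical, not drawn from any Division report.

```python
# Minimal sketch of the S-curve comparison: cumulative planned vs actual
# accomplishment per month, as a percentage of the annual target.
# Monthly figures are hypothetical.

planned = [5, 5, 10, 10, 10, 10, 10, 10, 10, 10, 5, 5]   # outputs per month (plan)
actual  = [4, 5,  8,  9, 10,  9,  8, 10, 11,  9, 6, 4]   # outputs per month (actual)

total_planned = sum(planned)
cum_plan, cum_actual = 0, 0

for month, (p, a) in enumerate(zip(planned, actual), start=1):
    cum_plan += p
    cum_actual += a
    slippage = cum_plan - cum_actual
    print(f"Month {month:2d}: plan {100 * cum_plan / total_planned:5.1f}%, "
          f"actual {100 * cum_actual / total_planned:5.1f}% "
          f"(slippage: {slippage} outputs)")
```

Plotting the two cumulative series gives the familiar S-curve; the gap between the curves at any month is the slippage the AIR workshop discusses.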

(2) Segmentation Techniques. This is a technique used to understand and gain insights from target groups. Segmentation is a process of identifying and grouping schools based on school characteristics and accomplishments. The main objective of segmentation is to get to know the schools better in order to customize or fit the Division's technical assistance to the requirements of the school.

Specifically, the segmentation technique will allow the Division to compare similar schools (the same characteristics) and schools from different groups (different characteristics). This approach will facilitate the monitoring of schools and allow the Division to determine the unique needs, problems and requirements of schools belonging to the same segment. A minimal sketch of the grouping logic is shown after the sample groupings below.

The following groupings will be used:

(a) school characteristics (sample only)

• type: science, vocational, national high school
• location: upland, urban, rural
• facilities: high classroom need, medium classroom need, low classroom need
• leadership: schools headed by principal 2, principal 1, TIC
• teacher to learner ratio: high, medium and low

(b) school performance (sample only)

• enrollment: decreasing, increasing, stable
• retention: high, medium, low
• completion: high, medium, low
• achievement: 75 and above MPS, 50-74 MPS, below 50 MPS
• SBM practice: beginner, mature
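The sketch below illustrates how the sample groupings could be applied to assign a school to a segment. The school records and the enrollment-trend cutoffs are hypothetical; only the MPS bands follow the sample grouping above, with the band boundaries interpreted as 75 and above, 50 to 74, and below 50.

```python
# Minimal sketch of the segmentation technique: assign each school to a
# segment using the sample groupings above. The records and the 2% cutoff
# for the enrollment trend are hypothetical assumptions.

def mps_band(mps):
    if mps >= 75:
        return "75 and above MPS"
    if mps >= 50:
        return "50-74 MPS"
    return "below 50 MPS"

def enrollment_trend(previous, current):
    change = (current - previous) / previous * 100
    if change > 2:
        return "increasing"
    if change < -2:
        return "decreasing"
    return "stable"

schools = [
    {"name": "School A", "type": "national high school", "mps": 78,
     "enrollment": (950, 1010)},
    {"name": "School B", "type": "vocational", "mps": 62,
     "enrollment": (400, 385)},
]

for s in schools:
    segment = (s["type"], mps_band(s["mps"]), enrollment_trend(*s["enrollment"]))
    print(s["name"], "->", segment)
```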

(3) AIR Implementation Guide. A guide for QMT members on how to go about the process of implementing the AIR.


Table 6-4 AIR Implementation Guide
(Columns: Area to Review; Performance Measure; Process Questions; Decision Points)

A. Status of Implementation

Implementation of AIP
  Performance measure: Actual targets accomplished versus plan
  Process questions: What school practices facilitated the SIP/AIP implementation? What factors hindered the achievement of targets?
  Decision points: Identify practices to reinforce. Address factors hindering implementation.

Implementation of DAP
  Performance measure: Actual targets accomplished versus plan
  Process questions: What Division practices facilitated the DEDP/DAP implementation? What factors hindered the accomplishment of targets?
  Decision points: Continue practices contributing to efficient operations. Document and address factors hindering efficiency.

B. Initial Gains/Results (Learners)

Enrollment
  Performance measure: % increase in enrollment from Year 1 to Year 2
  Process questions: What factors led to the increase/decrease in enrollment? What school programs and projects contributed to increasing enrollment? What Division programs and projects led to the increase in participation rate?
  Decision points: Document effective/best practices. Document lessons learned. Identify programs and projects to enhance.

Retention/Completion
  Performance measure: Decrease in drop out rate; decrease in school leavers rate; increase in retention rate; increase in completion rate
  Process questions: What factors led to the improvement/reduction in retention rate? What programs and projects implemented by the schools contributed to the improvement in retention rate?
  Decision points: Document effective/best practices. Document lessons learned. Identify programs and projects to enhance.

Achievement
  Performance measure: Increase in MPS (NEAT) for Grade 6; increase in MPS (NAT) for 2nd year and 4th year
  Process questions: What factors led to the increase/decrease in the MPS (NAT) of schools? What programs and projects of the schools contributed to the improvement in the MPS?
  Decision points: Document effective/best practices. Document lessons learned. Identify programs and projects to enhance.

C. Initial Gains/Results (Performance of School Heads and Teachers)

School Based Management
  Performance measure: Improvement in the skills of school heads in managing school operations
  Process questions: In what management areas are the school heads strong, and where are they weak? What systems, processes, programs and projects of the Division contributed to the enhancement of school heads' skills?
  Decision points: Assess training programs. Align training assistance to areas where school heads are weak. Identify Division practices and systems to enhance. Reinforce systems that generate positive feedback.

Content
  Performance measure: Mastery of teachers per subject area
  Process questions: In what subject matter are the teachers strong, and where are they weak?
  Decision points: Staff development programs to continue and areas to focus on.

Teaching skills
  Performance measure: Improvement in the teachers' teaching skills
  Process questions: In what teaching skills are the teachers strong, and where are they weak?
  Decision points: Staff development programs to continue and areas to focus on.

D. Initial Gains (Alternative Learning System)

Performance of service providers
  Performance measure: Learners under the BLP achieved 100% of the core competencies in reading, writing and numeracy; 50% mastery of the core competencies under the A&E program
  Process questions: What practices of the service providers contributed to or hindered the achievement of targets? What assistance or Division programs contributed to the achievement of targets?
  Decision points: Contracts to extend. Lessons learned and effective practices to continue.

Community learning centers
  Performance measure: Learners under the BLP achieved 100% of the core competencies in reading, writing and numeracy; 50% mastery of the core competencies under the A&E program
  Process questions: What practices of the service providers contributed to or hindered the achievement of targets? What assistance or Division programs contributed to the achievement of targets?
  Decision points: Practices and programs to continue. Programs to be reinforced.

Facilitators and Instructional Managers
  Performance measure: Demonstration of skills
  Process questions: What are the areas of strength and areas for improvement?
  Decision points: Staff development programs to continue and areas to focus on.

E. Division Operations

Managing the Core Processes
  Performance measure: Compliance to standards
  Process questions: What is the maturity level of the Division in applying the processes?
  Decision points: Identify systems or processes to enhance. Train Division and District staff on the processes.
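To make the learner-level measures in Table 6-4 concrete, the short sketch below computes the year-to-year change in enrollment and in the drop-out rate for one school. All figures are hypothetical and shown only to illustrate the arithmetic.

```python
# Minimal sketch: year-to-year change in two learner-level performance
# measures from Table 6-4. All figures are hypothetical.

def percent_change(year1, year2):
    return (year2 - year1) / year1 * 100

enrollment_y1, enrollment_y2 = 1200, 1278      # learners enrolled
dropout_rate_y1, dropout_rate_y2 = 4.2, 3.5    # percent of enrollment

print(f"Enrollment change: {percent_change(enrollment_y1, enrollment_y2):+.1f}%")
print(f"Drop-out rate change: {dropout_rate_y2 - dropout_rate_y1:+.1f} percentage points")
```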


Figure 6-12 Mid-Term Implementation

6.7 Control Point #4: Mid-Term Implementation Review

The Mid-Term Evaluation is undertaken in the last 6 months of the 1st cycle of SIP implementation and in the 3rd year of DEDP implementation. It is one of two major evaluation activities to be undertaken by the Division QMT in the six-year DEDP cycle.

The Mid-Term Evaluation focuses on the achievement of the Purpose-level objectives in the SIP and the Outcome-level objectives in the DEDP. Specifically, the evaluation is designed to measure the achievement of the following:

(1) SIP (Purpose-level Objectives)

Increase in enrollment

Improvement in the retention rate

Improvement in the completion rate

Improvement in learner achievement

(2) DEDP (Outcome-level Objectives)

Improvement in the performance of school heads in school-based management and instructional supervision

Improvement in the performance of teachers in content and teaching skills


Improvement in the SBM practices of schools

Improvement in the learning environment

6.7.1 Guiding Principles

Evaluation findings gathered at the end of the implementation cycle will have little or no use at all in improving efficiency and effectiveness; at best, such findings will be used as input to the next planning process. In order to be useful and to generate the highest value added to the organization, the evaluation should be conducted during the implementation stage, where results can be used to improve or enhance efficiency and increase the likelihood of effectiveness.

Initial evaluation results will serve as the major ingredients of continuous improvement.

6.7.2 Objectives

The objectives of the Mid-Term Evaluation Review are to:

Evaluate how close the achievements and accomplishments are to the planned objectives and targets

Assess the first 3 years of DEDP implementation to determine which programs and projects should be continued or stopped

Document the effective practices and processes that contributed to the attainment of initial gains and recommend continuing to apply them in the next 3 years of implementation

Analyze the causes of problems and difficulties encountered and document these as part of the lessons learned

Identify factors that may help sustain the initial gains.

Mainly, the results of the evaluation will be used as input to enhancing the implementation strategies and the technical assistance to schools for the next three years.


Figure 6-13 Mid-Term Implementation Review Process Flow (consolidate 3-year SIP completion reports; prepare evaluation design; conduct initial gains evaluation; prepare evaluation reports; adjust DEDP. Output: DEDP adjusted for the next 3 years)

6.7.3 Process Description

The Mid-Term Implementation Review will be implemented in 5 major activities:

(1) Prepare for the Mid-Term Review. Preparatory activities include the creation of the evaluation team, the preparation of the evaluation design and the preparation of the evaluation implementation plan:

(a) Prepare implementation plan

(b) Prepare evaluation design

(c) Form and create evaluation team

Please see the Checklist for Mid-Term Implementation Review.

(2) Review and/or document Division achievements. The review will provide data and information to the Division QMT on the extent of initial gains or benefits achieved by the Division, schools and community learning centers after 3 years.

Key sources of information for the review include the Basic Education Information System (BEIS) and the Report Cards.

(3) Data Gathering. The main objective of this activity is to support the achievements documented in reports with qualitative information that will provide the stories behind the numbers and percentages reported. The validation seeks to document effective practices and draw lessons from failed undertakings.

The data gathering is divided into 3 major activities:


Data gathering to validate the achievements and accomplishments. Includes visits to schools and community learning centers and involves the use of rapid appraisal techniques

Perception survey to gather feedback from school stakeholders on the quality of services provided by the school

SBM Assessment Level of Practice

(4) Prepare Mid-Term Implementation Report.

(5) Adjust Plan. The results of the review will be used as input to the improvement of the DEDP and SIP.

The results will provide insights to the Division on which programs and projects work and which do not; therefore, enhancement of the strategies for the next 3 years is warranted. The results will also be used in helping the schools prepare a better SIP for the 2nd cycle of implementation.

6.7.4 Knowledge and Skills Requirements

The Division and District staff who will spearhead the mid-term implementation review process must possess the following characteristics:

has a basic handle on planning tools and techniques such as logframe, work breakdown structure, network chart, Gantt or bar chart and costing techniques

competencies in conducting benefits evaluation, including the use of rapid appraisal techniques such as focus group discussions, interviews, key informant interviews, transect walks, observations and inspection

knowledgeable about SBM and other issues and opportunities affecting school operations

can write technical reports

computing skills, especially the use of word processing and spreadsheet software

6.7.5 Process Outputs

The major outputs of the Mid-Term Implementation Review are:

(1) Division Mid-Term Report. Contains the achievements and accomplishments of the Division and schools after three years (after 1 SIP cycle). The report highlights the initial results, improvements and changes that took place after providing support to schools in implementing the 1st SIP cycle. Specifically, the Mid-Term Report contains the achievements of schools (using key performance indicators) and the improvements in the competencies of school heads, teachers, facilitators, instructional managers and non-teaching staff.

The report also contains a discussion of the factors that helped the implementation and of the issues and difficulties experienced by the Division, districts, schools and community learning centers.

The Division Mid-Term Report will draw information from the following:

Division Report Card. An end-of-year document that provides a comprehensive picture of the Division's performance. It contains information about the Division, including Goal level (school performance) and Outcome level (performance of school heads, teachers, instructional managers and facilitators) indicators

Stakeholders Perception Study. Contains the perceptions of parents, the community, local government units and other local organizations on the quality of education and the quality of services provided by the school to learners.

SBM Level of Practice. Results of the SBM assessment conducted by the Division in randomly selected schools.

The evaluation reports will be used as input to:

(1) DEDP Implementation Plan (next 3 years). Based on the initial achievements and accomplishments, the Division makes adjustments to the DEDP.

(2) School Improvement Plan (next cycle). The evaluation results will be used as basis for the appraisal and enhancement of the SIPs covering the next 3 years.

6.7.6 Evaluation Tools and Techniques

The following are some of the M&E tools that aid the implementation of the Mid-Term Implementation Review (MTR):

(1) Rapid Appraisal Techniques. These are 'not so quick and not so dirty' techniques for gathering qualitative information and data about school achievements and performance. Rapid appraisal is a technique for gathering information that will help explain a phenomenon. It documents the practices of the schools (what was done and what was not undertaken). It involves the use of different techniques in order to validate and triangulate information that will help derive an unbiased view of the situation. Rapid appraisal techniques include:

Key informant interviews. Key informants refer to individuals who can provide holistic and complete information about the schools. The interview may also be undertaken through a transect walk (walk-through) or through the use of a questionnaire.

Focus group discussion. Involves individuals grouped according to similar characteristics and traits. A facilitator leads the discussion and draws information from the participants. There are no right and wrong answers, but the facilitator must see to it that the discussion is focused and will generate the desired information from the participants.

Inspection. An activity that validates the claims of individuals about a practice or way of doing things.

Actual observation. In order to document the actual practice or behavior, actual observation is undertaken. This method will help validate the claims made by key informants and participants in the FGD.

Questionnaire. Predetermined questions are jotted down. These are used to guide the interviews.

(2) Segmentation Techniques. This is a technique used to understand and gain insights about target groups. Segmentation is a process of identifying and grouping schools based on school characteristics and accomplishments. The main objective of segmentation is to get to know the schools better in order to customize or fit the Division's technical assistance to the requirements of the school.

Specifically, the segmentation technique will allow the Division to compare similar schools (the same characteristics) and compare schools from different groups (different characteristics). This approach will facilitate the monitoring of schools and allow the Division to determine the unique needs, problems and requirements of schools belonging to the same segment. A short sketch comparing a school against the average of its segment follows the sample groupings below.

The following groupings will be used:

(a) school characteristics (sample only, to be developed further)

type: science, vocational, national high school
location: upland, urban, rural
facilities: high classroom need, medium classroom need, low classroom need
leadership: schools headed by principal 2, principal 1, TIC
teacher to learner ratio: high, medium and low

(b) school performance (sample only, to be developed further)

enrollment: decreasing, increasing, stable
retention: high, medium, low
completion: high, medium, low
achievement: 75 and above MPS, 50-74 MPS, below 50 MPS
SBM practice: beginner, mature
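As a complement to the grouping itself, the hedged sketch below shows one simple way to operationalise the "compare similar schools" idea: compare each school's completion rate with the average of its own segment. The segment labels and completion rates are hypothetical.

```python
# Minimal sketch: compare each school's completion rate against the average
# of its segment (schools with the same characteristics). Data are hypothetical.

from collections import defaultdict

schools = [
    {"name": "School A", "segment": "rural / high classroom need", "completion": 71.0},
    {"name": "School B", "segment": "rural / high classroom need", "completion": 64.5},
    {"name": "School C", "segment": "urban / low classroom need",  "completion": 88.0},
    {"name": "School D", "segment": "urban / low classroom need",  "completion": 90.5},
]

by_segment = defaultdict(list)
for s in schools:
    by_segment[s["segment"]].append(s["completion"])

for s in schools:
    rates = by_segment[s["segment"]]
    segment_avg = sum(rates) / len(rates)
    gap = s["completion"] - segment_avg
    print(f"{s['name']}: {s['completion']:.1f}% vs segment average "
          f"{segment_avg:.1f}% ({gap:+.1f} points)")
```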

(3) SBM Assessment. The Division is going to assess the SBM practices of the schools using the same SBM assessment tool the schools use for self-assessment. A team of assessors from the Division and District will conduct the SBM assessment.

In order to maintain uniform application of criteria and an unbiased assessment of school practice, the tool is reinforced with the consensus technique. Assessors do not immediately render judgment about the school practice; instead they jot down notes and document the school practices as observed. These documentations are discussed by the team of assessors and a consensus is reached as to whether the school is able to satisfy the level of practice.

(4) Mid-Term Implementation Review Checklist. The Checklist is for use by the Division QMT/evaluation team as a guide in the preparation, implementation and completion of the Mid-Term Review.

The Checklist will not be used, in any way, to score or grade the performance of the QMT; it serves as a guide in the implementation of evaluation activities. Its main objective is to ensure a smooth and efficient conduct of the Mid-Term Review.

The Checklist provides a listing of the activities, resources and reference documents necessary for an efficient implementation of the Mid-Term Review. The QMT will check: Yes if the condition/question is complied with; No if the condition/question posed is not met; and More Information Needed when an objective Yes or No response cannot be given due to insufficient information.
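For illustration, the sketch below applies the Yes/No/More Information Needed convention just described: given encoded responses to the checklist, it lists the items that still block the review (No) and the items that need further information before an objective response can be made. The item names and responses are hypothetical examples.

```python
# Minimal sketch: list Mid-Term Review Checklist items that need follow-up,
# using the Yes / No / More Info Needed convention described above.
# Item names and responses are hypothetical examples.

responses = {
    "Approved Mid-Term Review implementation plan": "Yes",
    "Budget allotted for the Mid-Term Review": "No",
    "Evaluators trained on results evaluation": "More Info Needed",
    "Schools to be visited already selected": "Yes",
}

to_resolve = [item for item, answer in responses.items() if answer == "No"]
to_clarify = [item for item, answer in responses.items()
              if answer == "More Info Needed"]

print("Items to resolve before the review proceeds:", to_resolve)
print("Items needing more information:", to_clarify)
```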


Table 6-5 Mid-Term Implementation Review Checklist
(For each item, check Yes, No, or More Info Needed; use the Remarks column for notes.)

1. Preparatory Activities
   Implementation Plan for Mid-Term Review: Is there an approved Mid-Term Review implementation plan?
   Resources: Are the needed resources available for use by the evaluators?
   Time Table: Can the review be finished in 3 months?
   Budget: Is there an approved budget allotted for the Mid-Term Review?
   QMT: Are the QMTs ready and available?
   Evaluators: Will there be enough evaluators from the Division and District to finish the evaluation in 3 months?
   Capability of the Evaluators: Have they been trained on results evaluation? Perception survey? SBM assessment?
   Availability of Evaluators: Are there enough QMT members/evaluators to simultaneously implement the outcome evaluation, perception survey and SBM assessment?

2. Evaluation Design
   Evaluation Design: Is there an evaluation design?
   Evaluation Design: Has it been discussed with the QMT and the members of the MTR team?
   Objectives: Are the objectives of the evaluation clear and SMART?
   Methods: Will different methods be used to triangulate and validate information?
   Sample: Have the schools to be visited already been selected?

3. Review of Achievements and Accomplishments
   Documents: Are the supporting documents complete? These include report cards, completion reports and accomplishment reports.
   Authoritative Source: Are the reference documents the most authoritative source of data and information?
   Report Cards: Are the school report cards updated?
   Segmentation: Are the schools grouped based on predefined characteristics?
   Historical Data: Do the support documents provide at least 2 years of past data?
   Division Report Card: Is the Division Report Card updated?
   BEIS: Is the BEIS data complete and up to date?

4. Pre-Data Gathering Activities
   QMTs and Evaluators: Are all members of the QMTs and the evaluators oriented about the scope, strategies and implementation plan?
   Materials: Are the questionnaires and other evaluation paraphernalia ready?
   Vehicle: Are the travel and transportation arrangements completed?
   Materials: Are the Data Processing Guide, questionnaires, interview guides and other evaluation paraphernalia ready?
   Data Gathering Guide: Is each member of the QMT and/or evaluating team provided with a copy of the data gathering guide?

5. Actual Data Gathering
   Pre-Data Gathering Conference: Was the school head given an orientation on the review to be undertaken?
   Selection of Participants: Did the QMT and/or evaluation team select the teachers, non-teaching staff, learners and others in a random and transparent manner?
   Classroom Observation: Was the pre-observation, actual observation and post-observation process followed?
   Classroom Observation: Is the documentation of the observation phenomenological and non-judgmental?
   Focus Group Discussion: Is the discussion focused and limited to the assigned topics?
   Focus Group Discussion: Is there a tandem of facilitator and documenter?
   Focus Group Discussion: Is there a minimum of 5 participants per FGD?
   Inspection: Does a school representative accompany the evaluator in the inspection?
   Post-Data Gathering Conference: Did the QMT apprise the school of the activities undertaken and the next steps?

6. Perception Survey
   Stakeholders: Are the participants in the perception survey notified?
   Instruments: Are the questionnaires and other instruments to be used in the perception survey valid and reliable?
   Key Informants: Are key informants identified and interviewed?
   Triangulate: Are the perceptions of other stakeholders gathered in order to triangulate information and minimize bias of informants?
   Processing of Findings: Are Division staff capable of using spreadsheet software and its special functions?
   Report: Are the results of the perception survey incorporated into the Mid-Term Report?

7. Consensus Building
   Encoding: Are the raw data gathered documented using word processing software?
   Consensus: Do the QMT/evaluators discuss the observations and raw data to come up with a consensus?
   Signed and Endorsed: Are the results of the consensus finalized and endorsed by the QMT/evaluators?

8. Evaluation Report
   Mid-Term Implementation Review Report: Is the MTR report endorsed by the members of the QMT/evaluators?
   Issues and Problems: Do the issues and problems raised or identified in the report have a sound basis and are they backed up with data?
   Issues and Recommendations: For every major issue raised, is there a corresponding recommendation on how to mitigate the issue?
   Presentation of Findings: Are the findings presented clearly and objectively through graphs and diagrams?
   Recommendations: Does the report contain suggestions for next steps on how to improve or enhance future implementation?
   Feedback: Are the results and recommendations properly disseminated and communicated?

9. Adjustment of Plans
   Recommendations: Do the suggested next steps find their way into the implementation plan for next year?
   Issues and Problems: Are corrections/adjustments made in the plan in order to mitigate, if not solve, the issues and problems raised in the evaluation?
   Input to Appraisal: Are the evaluation findings and recommendations used as input to the appraisal of SIPs?

10. Knowledge Management
   Sharing of Information: Are the evaluation findings shared with and discussed in the Division and District?
   Improve Design of Programs/Projects: Are the evaluation findings used to enhance the design of Division programs and projects?
   Expectations from ES: Are Division staff, especially education supervisors, knowledgeable about the results of the evaluation and the issues and problems?
   Access: Are the evaluation results made available and accessible?


Figure 6-14 DEDP Wrap Up

6.8 Quality Control Point #5: Outcome Evaluation (OE)

The last major control point of the Division M&E System is the DEDP Outcome Evaluation. Also known as results evaluation, it focuses on the achievement of the Goal-level and Purpose-level objectives of the DEDP. This process is undertaken at the end of the six-year DEDP implementation and after 2 cycles of SIP implementation.

Outcome Evaluation is undertaken in order to verify the achievement of the following:

(1) Achievement of the Division Goal

Access, as measured in terms of participation rate and increase in enrollment

Learners' stay in school, as measured by retention, drop out and completion rates

Learners achieve the desired learning competencies, as measured by the achievement tests

(2) Achievement of the Division Outcome

Reduce disparity in the performance of high-performing schools and low-performing schools (retention, completion and achievement)

Increase satisfaction of stakeholders in the delivery and quality of instruction in schools


Improvement in the SBM practices of schools

Improvement in the competencies of teachers and school heads

Improvement in the schools' learning environment

The scope of the Outcome Evaluation is detailed further in Table 6-6, Division M&E Framework.

Table 6-6 Division M&E Framework
(Columns: Objectives; Performance Indicators; Means of Verification)

Division Goal (Impact Indicators)

Access: To ensure that all learners of school age are in school and are ready for school
  Performance indicators: Increase in participation rate; increase in enrollment; learners entering the school system are ready
  Means of verification: Enrollment Report; Division Report Card

Retention: To ensure that learners who are in school will stay in school
  Performance indicators: Increase in the number of learners retained in the school (retention rate); reduction in drop outs; reduction in school leavers
  Means of verification: School Report Card

Completion: To ensure that learners who are in school will complete the requirements of the primary and secondary level
  Performance indicators: Increase in the number of learners able to complete the basic education requirements; improved graduation rate
  Means of verification: School Report Card

Achievement: To ensure that learners demonstrate the necessary competencies at each level
  Performance indicators: Improvement in the basic functional literacy skills of the learners; improvement in the academic performance of learners in all subjects; improvement in social skills
  Means of verification: Learner Report Card; Teacher Assessment; National Achievement Test (2nd Year); Regional Achievement Test (3rd Year)

Division Level Outcomes (Effectiveness Indicators)

1. Improved school performance
  Performance indicators: Reduce disparity between high performing schools and low performing schools (in NEAT and NAT) by --- percent; reduce disparity in enrollment, drop out and completion rates between high performing schools and low performing schools; increase in satisfaction of school stakeholders in the quality of instruction in the school; improved SBM practice of schools
  Means of verification: Division Report Card; Division Education Development Plan (DEDP); Perception Survey; SBM Assessment Result

2. Improved teachers' performance
  Performance indicators: Teachers demonstrate competencies on general content and subject-specific skills; teachers meet the desired competencies based on the NCBTS
  Means of verification: Division Report Card and DEDP; Teachers' Performance Assessment Report; Assessment for Math and Science teachers

3. Improved school heads' performance
  Performance indicators: School heads demonstrate competencies on school based management and instructional supervision
  Means of verification: Division Report Card and DEDP

4. Improved learning environment
  Performance indicators: Teacher to learner ratio is 1:45; learner to textbook ratio is 1:1; teacher to teacher manual ratio is 1:1; teachers and learners have access to school equipment, science laboratories and other facilities; schools comply with the Standards of a Child-Friendly School
  Means of verification: Division Report Card

Division Intermediate Results (Leading Indicators)

1. Improved competencies of DepED Division and District staff in providing technical and management support to schools, community learning centers, school heads, teachers and facilitators
  Performance indicators: Division and District staff demonstrate competencies on educational planning, curriculum management, instructional consultancy, training and development, and monitoring and evaluation
  Means of verification: Division Report Card and DEDP; Results of Performance Assessment

2. Management and technical assistance systems are in place and operational
  Performance indicators: Continuous improvement in the management and technical assistance processes of the Division
  Means of verification: Quality Assurance Readiness Assessment Report
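Two of the indicators in Table 6-6 reduce to simple arithmetic: the achievement disparity between high-performing and low-performing schools, and the learning environment ratios. The sketch below shows one hedged way to compute them; the 1:45 and 1:1 norms come from the table, while the school figures are hypothetical.

```python
# Minimal sketch for two Table 6-6 indicators: (a) the NAT MPS disparity
# between high-performing and low-performing schools, and (b) compliance with
# the teacher-to-learner (1:45) and learner-to-textbook (1:1) norms.
# All school figures are hypothetical.

high_performing_mps = [82.0, 79.5, 85.0]   # mean NAT MPS, high-performing group
low_performing_mps = [48.0, 52.5, 45.0]    # mean NAT MPS, low-performing group

disparity = (sum(high_performing_mps) / len(high_performing_mps)
             - sum(low_performing_mps) / len(low_performing_mps))
print(f"NAT MPS disparity between groups: {disparity:.1f} points")

teachers, learners, textbooks = 30, 1290, 1150
print("Teacher-to-learner ratio within 1:45:", learners / teachers <= 45)
print("Learner-to-textbook ratio meets 1:1:", textbooks >= learners)
```

Tracking the disparity figure from the baseline year onwards is one way to fill in the "by --- percent" target once the Division sets it.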

6.8.1 Guiding Principles

Connecting the dots. Evaluations are undertaken to determine how the interplay of the programs and projects implemented and the influence of external factors resulted in the realization or non-realization of the desired objectives. These same dots are used in plotting the future.

The goals, objectives and strategies in the plan provide the scope of the evaluation.

Effectiveness is demonstrated through the target group. Effectiveness is measured in terms of changes or improvements in the status of the target groups as a result of the benefits derived from the programs and projects implemented.

6.8.2 Objectives

As an integral part of the process improvement mechanism, the objectives of the Outcome Evaluation are the following:

measure the improvement in the performance of schools, school heads and teachers, instructional managers and facilitators, and non-teaching staff of schools

determine whether the Division programs and projects lead to the achievement of the Purpose-level objectives

determine whether the achievement of the Purpose-level objectives leads to the attainment of the Goal-level objectives

document factors that may have contributed to or hindered the achievement of the Purpose-level and Goal-level objectives

document and propagate best practices and lessons learned in the six years of implementation.

The findings and results of the Outcome Evaluation will be used as input to defining and formulating the next-cycle DEDP.

Figure 6-15 Outcome Evaluation Process Flow

6.8.3 Process Description

The process of conducting the Outcome Evaluation is similar to the process used in the Mid-Term Evaluation Review. The only difference is the context of the evaluation and the utilization of the evaluation results.

The Outcome Evaluation will be implemented through five major activities:

(1) Prepare for the Outcome Evaluation. Preparatory activities include the creation of the evaluation team, the preparation of the evaluation design and the preparation of the evaluation implementation plan:

form and create the evaluation team

prepare the evaluation design

prepare the implementation plan


Control Point 5. Outcome Evaluation

Outcome Evaluation Process Flow:
1. Prepare evaluation design
2. Review school achievements based on SIP
3. Conduct evaluation
4. Prepare evaluation reports
5. Prepare next cycle DEDP

Output: Next cycle DEDP
Next process: Annual Implementation Review


(2) Review and/or document Division achievements. Using the Basic Education Information System (BEIS) and the Report Cards, the Division will document the achievement of the Goal- and Purpose-level objectives in the DEDP.

(3) Data Gathering. The task is to validate the documented achievements and to document how these achievements and accomplishments were attained. Specifically, the focus of the validation is to document the processes, practices and other factors that contributed to the realization of the DEDP objectives. It includes gathering data and information at the Division, district, school, community learning center and community levels, and building consensus.

The data gathering is divided into three major activities:

data gathering to validate the achievements and accomplishments, which includes visits to schools and community learning centers and involves the use of rapid appraisal techniques

a perception survey to gather feedback from school stakeholders on the quality of services provided by the school

SBM assessment of the level of practice

(4) Prepare the DEDP Terminal Report. The terminal report describes the situation at the Division level after six years of implementation. It describes the status of the schools (using the school performance indicators) and provides a comparative assessment of performance before and after, and between and among school groups.

Specifically, the terminal report will contain the following information:

achievement of the DEDP Goal- and Purpose-level objectives

major accomplishments, challenges encountered and how these were solved or mitigated

effective practices and lessons learned

analysis of current issues, problems and opportunities

The Terminal Report is the main reference document in the preparation of the next Division plan.

6.8.4 Knowledge and Skills Requirements

The Division and District staff who will form part of the Outcome Evaluation team must possess the following characteristics:

has a basic handle on planning tools and techniques such as the logframe, work breakdown structure, network chart, Gantt or bar chart, and costing techniques

has competencies in conducting benefits evaluation, including the use of rapid appraisal techniques such as focus group discussion, interviews, key informant interviews, transect walk, observation and inspection

can write technical reports

has computing skills, especially in the use of word processing and spreadsheet software

must be objective and fair

6.8.5 Process Outputs

The major outputs of the Outcome Evaluation are:

(1) DEDP Terminal Report. Contains the achievements and accomplishments of the Division and schools after six years of implementing the DEDP and providing support to schools. The report highlights the achievements (Goal and Purpose level) and accomplishments of the Division.

The Terminal Report provides the situationer at the Division level six years after implementing the DEDP. It also contains a detailed analysis of the factors that helped the implementation and a discussion of the issues and difficulties experienced by the Division, districts, schools and community learning centers. It is a documentation of the Division's best practices and lessons learned.

The Terminal Report will draw information from the following:

Division Report Card. An end-of-year document that provides a comprehensive picture of the Division's performance. It contains information about the Division, including Goal-level (school performance) and Outcome-level (performance of school heads, teachers, instructional managers and facilitators) indicators.

Stakeholders' Perception Study. Contains the perception of parents, the community, local government units and other local organizations on the quality of education and the quality of services provided by the school to learners.

SBM Level of Practice. Results of the SBM assessment conducted by the Division in randomly selected schools.

The Terminal Report will be used as input to:

(1) the next six-year DEDP. The Terminal Report is the main input to be used in the preparation of the next cycle DEDP.

(2) reference material in the appraisal of SIPs.


6.8.6 Evaluation Tools and Techniques

The following are some of the M&E tools and techniques that aid the implementation of the Outcome Evaluation (OE):

(1) Rapid Appraisal Techniques. These are "not so quick and not so dirty" techniques for gathering qualitative information and data about school achievements and performance. Rapid appraisal is a technique for gathering information that will help explain a phenomenon. It documents the practices of the schools (what was done and what was not undertaken). It involves the use of different techniques in order to validate and triangulate information that will help derive an unbiased view of the situation. Rapid appraisal techniques include:

key informant interviews. Key informants refer to individuals who can provide holistic and complete information about the schools. Interviews may also be undertaken through a transect walk (walk-through) or through the use of a questionnaire.

focus group discussion. Involves individuals grouped according to similar characteristics and traits. A facilitator leads the discussion and draws information from the participants. There are no right and wrong answers, but the facilitator must see to it that the discussion is focused and will generate the desired information from the participants.

inspection. An activity that validates the claims of individuals about a practice or way of doing things.

actual observation. In order to document the actual practice or behavior, actual observation is undertaken. This method helps validate the claims made by key informants and participants in the FGD.

questionnaire. Predetermined questions are jotted down and used to guide the interviews.

(2) Segmentation Techniques. This is a technique used to understand and gain insights about target groups. Segmentation is a process of identifying and grouping schools based on school characteristics and accomplishments. The main objective of segmentation is to get to know the schools better in order to customize or fit the Division's technical assistance to the requirements of the school.

Specifically, the segmentation technique will allow the Division to compare similar schools (with the same characteristics) and schools from different groups (with different characteristics). This approach will facilitate the monitoring of schools and allow the Division to determine the unique needs, problems and requirements of schools belonging to the same segment. A simple illustration of this kind of grouping is sketched after the list below.

The following groupings will be used:

(a) school characteristics (sample only, to be developed further)

type - science, vocational, national high school
location - upland, urban, rural
facilities - high classroom need, medium classroom need, low classroom need
leadership - schools headed by Principal 2, Principal 1, TIC
teacher-to-learner ratio - high, medium and low

(b) school performance (sample only, to be developed further)

enrollment - decreasing, increasing, stable
retention - high, medium, low
completion - high, medium, low
achievement - 75 and above MPS, 50-74 MPS, 50 and below
SBM practice - standard, progressive, mature
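For illustration only, the following minimal sketch (in Python) shows how a Division analyst might tag sample school records with segment labels and group peers for comparison. The field names, sample figures and cut-off values are assumptions for the sketch, not prescribed DepED standards.

# Illustrative sketch only: tag sample school records with segment labels.
# Field names and cut-off values are assumptions, not prescribed standards.
from collections import defaultdict

schools = [
    {"name": "School A", "type": "national high school", "location": "urban",
     "mps": 78.0, "learners_per_classroom": 42},
    {"name": "School B", "type": "national high school", "location": "rural",
     "mps": 63.5, "learners_per_classroom": 58},
    {"name": "School C", "type": "vocational", "location": "upland",
     "mps": 47.0, "learners_per_classroom": 66},
]

def achievement_band(mps):
    # Sample achievement bands from the groupings above (MPS = Mean Percentage Score).
    if mps >= 75:
        return "75 and above"
    if mps >= 50:
        return "50-74"
    return "below 50"

def classroom_need(learners_per_classroom):
    # Rough illustrative cut-offs for classroom need.
    if learners_per_classroom > 60:
        return "high"
    if learners_per_classroom > 45:
        return "medium"
    return "low"

# Group schools that fall in the same segment so each can be compared with its peers.
segments = defaultdict(list)
for s in schools:
    key = (s["type"], achievement_band(s["mps"]), classroom_need(s["learners_per_classroom"]))
    segments[key].append(s["name"])

for key, members in segments.items():
    print(key, "->", members)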

(3) SBM Assessment. The Division is going to assess the SBM practices of the schools using the same SBM assessment tool the schools use for self-assessment. A team of assessors from the Division and District will conduct the SBM assessment.

In order to maintain uniform application of the criteria and an unbiased assessment of school practice, the tool is reinforced with the consensus technique. Assessors do not immediately render judgment about the school practice; instead, they jot down notes and document the school practices as observed. These documentations are discussed by the team of assessors, and a consensus is reached as to whether the school is able to satisfy the level of practice.

(4) Checklists. Two checklists support the Outcome Evaluation:

the Pointers for Outcome Evaluation (Table 6-7 below)

the same checklist (Table 6-5) used in the Mid-Term Implementation Review, which will be adopted for the Outcome Evaluation process.


Table 6-7 Pointers for Outcome Evaluation

For each item, mark whether the answer is Yes, No or Same (no change), then work through the corresponding process questions.

1. Participation Rate

Is there an increase in the participation rate? (Yes / No / Same)
Process questions: What programs and projects contributed to the increase in the participation rate? What external factors contributed to the increase or decrease of the participation rate?

Is the targeted participation rate achieved? (Yes / No / Same)
Process questions: If yes, to what Division programs and projects can this be attributed? If no, what hindered the improvement in the participation rate?

Is the Division participation rate better than the average within the region? (Yes / No / Same)
Process questions: If yes, what unique programs and projects contributed to this? If no, to what factors can this be attributed?

Is the Division participation rate higher or better than the national average? (Yes / No / Same)
Process questions: If yes, what unique programs and projects contributed to this? If no, to what factors can this be attributed?

2. Retention Rate and Completion Rate

Is there an increase in the retention rate? A decrease in the drop-out rate? (Yes / No / Same)
Process questions: What programs and projects of the Division contributed to the increase in the retention rate? In what grade/year level is the drop-out incidence highest?

Are the retention rate and drop-out rate of the Division higher than the regional average? (Yes / No / Same)
Process questions: To what factors and/or programs and projects can these be attributed?

Are the Division retention rate and drop-out rate higher than the national average? (Yes / No / Same)
Process questions: To what factors and/or programs and projects can these be attributed?

Is there a decrease in the number of schools with increasing drop-out rates? (Yes / No / Same)
Process questions: If yes, what school programs and projects were implemented? If no, to what factors can this be attributed?

Is there an increase in the completion rate? (Yes / No / Same)
Process questions: What programs and projects of the Division contributed to the increase in the completion rate?

3. Achievement

Is the performance of Grade 6 learners improving over the last 5 years? (Yes / No / Same)
Process questions: If yes, to what programs and projects can this be attributed? If no, to what factors, internal and external, can this be attributed?

Is the performance of 2nd year high school learners improving over the last 5 years? (Yes / No / Same)
Process questions: If yes, to what programs and projects can this be attributed? If no, to what factors, internal and external, can this be attributed?

Are the targeted learner achievement levels achieved by most of the schools? (Yes / No / Same)
Process questions: If yes, how were these achieved? What practices must be continued? If no, what lessons can be drawn from the efforts or interventions provided that should be improved or not repeated anymore?

Is the Division performance on achievement higher than the regional average (elementary and high school)? (Yes / No / Same)
Process questions: If yes, what could be the factors that allowed the Division to have higher-than-average performance within the Region? If no, what can be learned from other Divisions?

Is the Division performance on achievement higher than the national average (elementary and high school)? (Yes / No / Same)

Is there a significant number of low-performing schools whose performance improved from low to average or high? (Yes / No / Same)
Process questions: If yes, document their practices and the Division programs and projects that helped improve performance.

4. Alternative Learning Programs

Is the participation of out-of-school youth in the Division's alternative learning programs increasing over the last 5 years? (Yes / No / Same)
Process questions: If yes, probe the reasons for the increase in participation. What are the Division programs that contributed to such improvement in participation? If no, probe why.

Is there an increasing trend in the number of A&E passers? (Yes / No / Same)
Process questions: If yes, to what factors can this be attributed?

Are the CLCs meeting the standards of the Division? (Yes / No / Same)

Are the competencies/performance of facilitators and instructional managers improving? (Yes / No / Same)

5. Stakeholders' Perception

Are the perceptions of learners of the teaching and learning process improving? (Yes / No / Same)
Process questions: What areas of the teaching and learning process gained positive responses and what areas garnered negative responses?

Is there an improvement in the perception of stakeholders concerning the quality of education in the Division (as compared with 3 years ago)? (Yes / No / Same)
Process questions: In what areas or services has the school improved (facilities, teachers, school management, etc.)?

Are the perceptions of parents concerning the quality of education improving? (Yes / No / Same)
Process questions: In what areas are the perceptions positive and in what areas are they negative? Why?

Is there an improvement in the perception of local government units and others regarding the quality of education? (Yes / No / Same)
Process questions: In what areas are the perceptions positive and in what areas are they negative? Why?

6. SBM Level of Practice

Is there an increase in the number of schools belonging to level 1 that were promoted to level 2 or 3? (Yes / No / Same)
Process questions: Are they the same schools? Are there schools whose SBM practice deteriorated? What facilitated the improvement in the practices?

Is there an increase in the number of schools belonging to level 2 that were promoted to level 3? (Yes / No / Same)
Process questions: Are they the same schools? Are there schools whose SBM practice deteriorated? What facilitated the improvement in the practices?

7. Learning Environment

Is the 1:1 learner-to-textbook ratio achieved? (Yes / No / Same)
Process questions: Where is the shortage of textbooks most acute?

Is the 1:45 classroom-to-learner ratio achieved? (Yes / No / Same)
Process questions: How many schools have achieved the classroom-to-learner ratio? In how many schools is the classroom shortage most acute?

Do the learners have access to laboratories and school equipment? (Yes / No / Same)
Process questions: How many schools have provided very good access for learners to laboratories and school equipment?

Is there an ICT laboratory? (Yes / No / Same)
Process questions: If yes, how updated is the equipment? How is the utilization?

8. School Head Performance

Is there improvement in the competencies of school heads in the following areas?

Instructional supervision (Yes / No / Same)
Process questions: What training programs were received/attended by the school heads? How are they applying these training programs?

Educational planning (Yes / No / Same)
Process questions: What training programs were received/attended by the school heads? How are they applying these training programs?

Resource mobilization (Yes / No / Same)
Process questions: What training programs were received/attended by the school heads? How are they applying these training programs?

Advocacy (Yes / No / Same)
Process questions: What training programs were received/attended by the school heads? How are they applying these training programs?

Managing education programs and projects (Yes / No / Same)
Process questions: What training programs were received/attended by the school heads? How are they applying these training programs?

Managing stakeholders (Yes / No / Same)
Process questions: Are they capable of managing the school governing council? Of partnering with LGUs?

Progress tracking (Yes / No / Same)
Process questions: What training programs were received/attended by the school heads? How are they applying these training programs?

Outcome evaluation (Yes / No / Same)
Process questions: What training programs were received/attended by the school heads? How are they applying these training programs?

9. Teachers' Performance

Is there improvement in the competencies/performance of teachers in:

Subject mastery (Yes / No / Same)
Process questions: How many teachers have mastery of the subject they teach? What training programs were received/attended by the teachers? How are they applying these training programs?

Teaching skills (classroom management, student assessment, modern teaching methods, care and use of learning materials and equipment) (Yes / No / Same)
Process questions: How many teachers demonstrated the proper teaching skills? How many have been trained?

Use of ICT (Yes / No / Same)
Process questions: How many teachers are using ICT? How many have been trained in the use of ICT?


DIVISION M&E SYSTEM

7.0 MONITORING AND EVALUATION TOOLS & TECHNIQUES


7.0 DIVISION M&E TOOLS

This section enumerates tools and techniques for monitoring and evaluation.

The choice of M&E tools and techniques will influence the results of the evaluation or assessment undertaken by the Division. Selecting the most appropriate ones increases the likelihood of correct, precise and accurate results or findings. In this regard, it is important that the M&E team be familiar with the different tools and techniques, especially with the results these will generate, as well as with the context and nature of these tools.

The following is a classification of M&E tools and techniques:

(1) Tools to assess effectiveness

(2) Tools to assess Division readiness

(3) Tools to track efficiency

(4) Tools for data gathering.

7.1 Tools to Assess Division Effectiveness

The effectiveness of the Division can be measured using the following tools:

SBM Level of Practice. This is a self-administered assessment tool for the school

head. The assessment covers the six dimensions of SBM, namely, School Leadership,

Internal Stakeholder Participation, External Stakeholder Participation, Continuous

School Improvement Process, School-Based Resources and School Performance

Accountability. The result of the self-assessment will be used as input to the

adjustments in the AIP and the preparation of the next cycle SIP.

See Manual on Assessment of School-Based Management Practices.

Segmentation Techniques. This is a technique used to understand and gain insights

about target groups. Segmentation is a process of identifying and grouping schools

based on characteristics and accomplishments. The main objective of

segmentation is to get to know the schools better in order to customize or fit the

Division's technical assistance to the requirements of the school.

Specifically, the segmentation technique will allow the Division to compare similar

schools (the same characteristics) and schools from different groups (different

Page 7 - 2

Page 97: The Division Monitoring &Evaluation System - deped-qmsdeped-qms.wikispaces.com/file/view/Division+M&E+System.pdf · Division Monitoring & Evaluation System 1. Introduction 2. Objectives

Division M&E System M&E Tools and Techniques

characteristics). This approach will facilitate the monitoring of schools and allow the

Division to determine the unique needs, problems and requirements of schools

belonging to the same segment.

As an evaluation tool, the Division assess the performance of a school against the

performance of schools with similar characteristics or schools belonging to the

same typology. The performance of the schools (in a Division) are also compared or

benchmark against the performance of schools (belonging to the same typology)

in other Divisions (within the region) and against national performance (average).

Competencies Checklist. Includes list of competencies that must be demonstrated

by school heads, teachers, instructional managers and literacy facilitators.

Stakeholders Perception Survey. Refers to the perception of the stakeholders

(community, LGUs, learners, etc) on the quality of education and quality of services

provided by the schools.

Logical Framework Approach. This refers to situational tools and techniques used to

assess and explain the phenomena behind the results or outcomes. Logical

framework matrix uses problem tree, objectives tree, stakeholders analysis and SWOT

(strengths, weakness, opportunities and threats).

7.2 Tools to Assess Readiness

Intermediate objectives pertain to improvements in the practices of the Division in providing services or technical support to the schools and community learning centers. Readiness is assessed using the following tools:

Quality Management Inventory Model (QMIM). The QMIM depicts a road map that traces the Region's and Division's transformation from the use of informal processes to more established technical assistance packages and support mechanisms. It projects an organization's transition from the realm of uncertainty to more repeatable and predictable results. The Model represents a progression of capability by the Region and Division to deliver management support and technical assistance packages to their target groups.

The QMIM is also a yardstick for assessing the performance of the Region and Division. It will be used to examine the Region's and Division's processes and support mechanisms that allow them to efficiently and effectively deliver technical assistance packages to schools, school managers, teachers and the schools' non-teaching staff.

The Model is defined in four levels: (1) Ad Hoc, (2) Defined, (3) Integrated, (4) Sustained. Level 1 is the entry level; it represents a Division characterized by ad hoc processes and an informal way of doing things. As it matures, the Division Office is expected to establish its internal procedures (Level 2, Defined). The Division then improves to a stage where it is expected to manage and integrate its different mechanisms into one integrated system (Level 3, Integrated). The highest level is the Sustained level, which represents a Division that adapts, maximizes and continuously improves its way of doing things.

For a more detailed discussion, see the attached document "Quality Management Inventory Model", which describes the model and the process for undertaking the inventory on quality management.

Division Readiness on Quality Management. A self-assessment tool used to evaluate the competencies of Division and District staff on the critical areas of quality management, and to compare the approved or accepted targets in the AIP/SIP against the actual number of targets completed. See the sample readiness assessment checklist below.


Sample Readiness Assessment Tool

Readiness Assessment

This is a readiness assessment. We would like to get your perception of the readiness of your organization to implement a quality assurance system. Listed below are some of the processes on QA and M&E, key tools and techniques on planning, monitoring and evaluation, and concepts and principles of Quality Assurance. The results will help us determine our assistance to you, especially in designing the capability building programs.

Please rate your organization on each area as follows: 1 - have no or minimal knowledge or understanding of the process/tool/concepts; 2 - have been trained on the process/tool/concepts but have yet to apply or use them; 3 - have limited implementation or application of the process/tool/concepts; 4 - have been using or applying the process/tool/concepts. (An illustrative tabulation sketch follows the list of areas.)

Areas to be rated (1-4):

1. Appraisal of the SIP/DEDP. Review process on the relevance and technical correctness of the SIP/DEDP.

2. Review of Start-Up. Process of assessing the readiness of a unit to implement a newly approved plan.

3. Implementation Review. Evaluation of accomplishments (physical accomplishments) and analysis of problems and issues surrounding an implementation.

4. Mid-Term Implementation Review. Process involving the assessment of initial gains and physical accomplishments to date, and adjusting the plan.

5. Outcome Evaluation. Process involving the evaluation of results based on the Purpose-level objectives and targets contained in the Plan.

6. Process Audit or Compliance Review. Assessment of a unit's application of standards in delivering an output.

7. Education Planning. Process involved in preparing a strategic plan and a detailed implementation plan. Includes planning for resources, estimating time and cost, and setting up the appropriate organizational structure to implement the plan.

8. Logical Framework Matrix. A planning and M&E tool used to define the objectives, targets and indicators. Provides the scope of the plan.

9. SBM Assessment. A tool developed to assess a school's level of practice on SBM.

10. Implementation and Improvement Clinic. A technical assistance process for helping a unit improve its delivery of programs and projects.

11. Accreditation. Mechanism to raise the bar of excellence for schools.

12. Change Management. A structured approach to moving individuals or organizations from the current state to the future state.

13. Knowledge Management. Practices of the organization to distribute, transfer and propagate knowledge within the organization.

14. Quality Assurance. Understands the concepts and principles of quality assurance.

15. Perception Assessment. Assesses stakeholders' level of satisfaction with the delivery of basic education.
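As a rough illustration only (not part of the prescribed tool), the tabulation of such a self-assessment could be done along the following lines in Python; the area names, respondent ratings and the cut-off for flagging are assumptions.

# Illustrative sketch: tabulate readiness self-assessment ratings (scale 1-4).
# Respondent ratings and area names are hypothetical examples, not official data.
from statistics import mean

# Each respondent rates every area from 1 (no/minimal knowledge) to 4 (applying).
responses = {
    "Appraisal of the SIP/DEDP": [2, 3, 1],
    "Outcome Evaluation": [1, 2, 2],
    "Logical Framework Matrix": [3, 3, 4],
}

for area, ratings in responses.items():
    avg = mean(ratings)
    # Areas averaging below 2 are candidates for capability-building programs.
    flag = "priority for capability building" if avg < 2 else "adequate"
    print(f"{area}: average {avg:.1f} ({flag})")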


7.3 Tools to Track Efficiency

Efficiency is measured by comparing the actual implementation progress against the plan. Tracking efficiency is based on the regular collection of data and information, specifically on accomplishments. Efficiency can be measured using the following tools:

Line of Balance Method or "S" Curve. A tool that plots the total plan on a periodic basis against the actual accomplishments per period. The "S" curve diagram provides management with a clear status of implementation and shows the trend of accomplishments over time. A short sketch of how the two curves are derived follows.
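The following minimal sketch (an illustration using invented monthly figures, not a prescribed computation) shows how planned and actual outputs per period are accumulated into the two curves compared on an "S" curve chart.

# Illustrative sketch: build the cumulative planned-versus-actual series behind an S curve.
# The monthly figures are invented sample data.
from itertools import accumulate

planned_outputs_per_month = [2, 4, 6, 8, 8, 6, 4, 2]   # planned outputs per period
actual_outputs_per_month  = [1, 3, 5, 7, 8, 7, 0, 0]   # actual outputs reported so far

planned_cumulative = list(accumulate(planned_outputs_per_month))
actual_cumulative = list(accumulate(actual_outputs_per_month))

for month, (plan, actual) in enumerate(zip(planned_cumulative, actual_cumulative), start=1):
    slippage = plan - actual   # a positive value means implementation is behind plan
    print(f"Month {month}: planned {plan}, actual {actual}, slippage {slippage}")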

Gantt Chart. A type of bar chart that details the entire implementation in a single chart. It is used both as a planning tool and as a tool for tracking the progress of implementation. As a monitoring tool, the chart is used to track or plot the activities implemented or outputs delivered against the approved scope of work. It is a good tool for visualizing the accomplishments vis-a-vis the plan. A sample chart follows, together with a short sketch of how slippage against such a chart can be flagged.


Sample Gantt Chart (Plan versus Actual Implementation Time)

Activities (Start Date to Finish Date) [Status]

1.0 Preparation
1.1 Organize the Evaluation Team (Mon 05/Jan 09 to Tue 20/Jan 09) [Completed]
1.2 Orient the Evaluation Team (Tue 10/Feb 09 to Fri 13/Feb 09) [Completed]
1.3 Prepare the Evaluation Instruments (Tue 10/Feb 09 to Wed 15/Apr 09) [Completed]
1.4 Train Data Gatherers (Sat 02/May 09 to Mon 15/Jun 09) [Completed]
1.5 Prepare Data Gathering Plan (Thu 11/Jun 09 to Thu 25/Jun 09) [Completed]

2.0 Data Gathering
2.1 Kick-Off Meeting (Wed 24/Jun 09 to Tue 30/Jun 09) [Completed]
2.2 School 1 (Mon 29/Jun 09) [Completed]
2.3 School 2 (Tue 30/Jun 09)
2.4 School 3 (Mon 13/Jul 09) [On-going]
2.5 School 4 (Wed 15/Jul 09) [On-going]
2.6 Etc. (Mon 10/Aug 09 to Fri 14/Aug 09)

3.0 Analysis
3.1 Tabulate Data (Sep 20, 09 to Sep 25, 09)
3.2 Consensus Building (Sep 20, 09 to Wed 30/Sep 09)
3.3 Analyze Findings (Thu 01/Oct 09 to Mon 19/Oct 09)

4.0 Submission of Report
4.1 Write the Report (Tue 20/Oct 09 to Fri 30/Oct 09)
4.2 Communicate (Tue 03/Nov 09 to Sun 29/Nov 09)
4.3 Submit to Region (Tue 15/Dec 09)
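As a hedged illustration (the activities echo the sample above, but the cut-off date and the computation itself are assumptions, not mandated by the manual), progress tracking against such a chart reduces to comparing each activity's planned finish date with the reporting date and its reported status.

# Illustrative sketch: flag activities in a Gantt-style plan that are past their
# planned finish date but not yet reported as completed. Sample data only.
from datetime import date

activities = [
    # (code, name, planned_finish, status)
    ("2.3", "School 2", date(2009, 6, 30), "Not started"),
    ("2.4", "School 3", date(2009, 7, 13), "On-going"),
    ("3.1", "Tabulate Data", date(2009, 9, 25), "Not started"),
]

as_of = date(2009, 8, 1)   # assumed reporting cut-off date for this status check

for code, name, planned_finish, status in activities:
    if status != "Completed" and as_of > planned_finish:
        days_late = (as_of - planned_finish).days
        print(f"{code} {name}: {days_late} days past planned finish ({status})")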



Checklist or Process Audit Checklist. Checklists will be developed and used to assess the practices of schools, community learning centers, districts and Division staff in the delivery of services. The checklists will contain the standard processes set by the Division and Region.

Below is a sample checklist to be used during the Process Audit.

Sample Checklist. Implementation of SBM Assessment

For each item, mark Yes, No or More info needed, and use the accompanying note on how to verify.

1. Preparatory Activities

School Head - The school head attended the orientation on SBM and is aware of the intent of the SBM assessment. (Yes / No / More info needed)
How to verify: Interview the school head. Ask him/her some important information about the SBM assessment.

School Head - The school head understands all the terms used in the instrument and knows the process of administering, scoring and reporting. (Yes / No / More info needed)
How to verify: Interview the school head. Ask him/her some important information about the SBM assessment.

Teachers and Non-Teaching Staff - The school head oriented the teaching and non-teaching staff on the concepts of SBM, and they are aware of the purpose and process involved in the assessment. (Yes / No / More info needed)
How to verify: This is to triangulate the information provided by the school head. Ask the teachers and non-teaching staff about the purpose of the SBM assessment.

External Stakeholders - The school head oriented the school stakeholders on the concepts of SBM, and they are aware of the purpose and process involved in the assessment. (Yes / No / More info needed)
How to verify: This is to triangulate the information provided by the school head. Ask the teachers and non-teaching staff about the purpose of the SBM assessment.

External Stakeholders - The majority of the invited stakeholders attended and participated in the assessment; each dimension was represented by stakeholders. (Yes / No / More info needed)
How to verify: Check the attendance sheet.

2. Data Gathering

Copy of Assessment Tool - Everyone has a copy of the assessment tool. (Yes / No / More info needed)

Evidence - The documents provided are the "most authoritative" documents, signed and the most up to date. (Yes / No / More info needed)
How to verify: Validate the evidence presented. Check whether the dates, signatories and content of the document are sufficient.

Evidence - The school head allowed access to all school documents. (Yes / No / More info needed)
How to verify: Ask the stakeholders.

Evidence - Enough time was given to review and assess the content of the documents. (Yes / No / More info needed)
How to verify: Ask the stakeholders.

Evidence and Interview - There is agreement and consistency between the content of the documents and the responses of the interviewees. (Yes / No / More info needed)
How to verify: Look for documentation of the interviews.

3. Summarizing the Stakeholders' Responses

FGD - A group discussion was done after the inventory per dimension. (Yes / No / More info needed)
How to verify: Ask the stakeholders to validate.

Evidence - Documents or actual objects were used to validate the observation. (Yes / No / More info needed)
How to verify: Ask the stakeholders.

Check Marks - Counting of the check marks was done collaboratively and in a transparent manner. (Yes / No / More info needed)
How to verify: Ask the stakeholders, or check the raw scoring sheet.

Consensus - The rating or scoring was done through consensus. (Yes / No / More info needed)
How to verify: Ask the stakeholders.

Discussion of Results - The total score of the school was presented to and discussed with the stakeholders. (Yes / No / More info needed)
How to verify: Ask the stakeholders; look for the presentation material.

Weak Areas - In the FGD, discussion also focused on what to do about the school's weaknesses. (Yes / No / More info needed)
How to verify: Ask the stakeholders.

Strengthen SBM Practices - There was discussion on what to do and how to strengthen the school's practices. (Yes / No / More info needed)
How to verify: Ask the stakeholders; refer to the SIP, AIP or any other plan where the findings were incorporated.

Support from Stakeholders - Using the results of the assessment, the school head rallies support from the stakeholders. (Yes / No / More info needed)
How to verify: Ask the stakeholders; look for documentation of support from stakeholders.

4. Others

Minutes of the Meeting - There is documentation of the entire process, especially the FGD. (Yes / No / More info needed)
How to verify: Document review.

Documentation - The school prepared a report detailing the results of the SBM assessment. (Yes / No / More info needed)
How to verify: Document review.

Submission - The Division was furnished a copy of the School Report. (Yes / No / More info needed)

7.4 Tools and Techniques for Data Gathering

7.4.1 Purpose of Data Gathering

Data gathering activities are undertaken to objectively verify accomplishments, the demonstration and use of skills, and the utilization of a system, and to explain the unintended effects of programs and projects. It involves gathering first-hand data from various sources. Specifically, the gathering of primary data is undertaken when:

the data and information provided in the reports need to be verified, and there is a need to further explain the numbers and statistics presented in the report with "stories"

there is a need to assess the factors influencing or causing the phenomenon and to identify the facilitating and hindering factors

there is a need to document effective practices and difficulties being encountered in the application or utilization of skills, systems and facilities

there is a need to probe a report further and/or challenge a report.

7.4.2 Some Guideposts in the Selection of Tools and Techniques

The decision to gather primary data is to be taken with caution. Careful consideration should be given to the choice of tools and techniques to use. All too often, the tool selected will influence, if not dictate, the quality of the data and information that will be gathered. The following are some guideposts in selecting the appropriate tools:

Know what to collect and what to validate. There is no substitute for good planning and preparation. Identify the indicators that you want to verify and the data that will support the indicators, then determine the appropriate tool.

Triangulate. Use more than one technique in gathering data to minimize the error inherent in individual data gathering tools and techniques.

Just in time, not just in case. Collect the data that you need to make a decision; do not collect data just in case you need them in the future. This will help you avoid data overload.

Cost efficiency. Consider the costs involved in using a tool to gather data. As a general rule, always go for the less expensive technique that offers the same quality of data as the more expensive one.

7.4.3 Tools and Techniques

Rapid Appraisal Techniques. These are "not so quick and not so dirty" techniques for gathering qualitative information and data about school achievements and performance. Rapid appraisal is a technique for gathering information that will help explain a phenomenon. It documents the practices of the schools (what was done and what was not undertaken). It involves the use of different techniques in order to validate and triangulate information that will help derive an unbiased view of the situation. Rapid appraisal techniques include:

key informant interviews. Key informants refer to individuals who can provide holistic and complete information about the schools. Interviews may also be undertaken through a transect walk (walk-through) or through the use of a questionnaire.

focus group discussion. Involves individuals grouped according to similar characteristics and traits. A facilitator leads the discussion and draws information from the participants. There are no right and wrong answers, but the facilitator must see to it that the discussion is focused and will generate the desired information from the participants.

inspection. An activity that validates the claims of individuals about a practice or way of doing things.

actual observation. In order to document the actual practice or behavior, actual observation is undertaken. This method helps validate the claims made by key informants and participants in the FGD.

questionnaire. Predetermined questions are jotted down and used to guide the interviews.

Table 7-1 Rapid Appraisal Tools

Observation (direct observation)
Description: This method focuses on actual performance, actual utilization, and on-going activities and events. Observers record what they see and hear. This method is appropriate when the objective is to document the demonstration of skills in an actual setting.
Key pointers: The key item to remember in making an observation is for the observer to avoid the urge to document and analyze at the same time. The observer must jot down notes as objectively as he/she can, noting down exactly what is being observed.

Interview (key informant interviews, informal interviews, transect walk)
Description: The interview is one of the most commonly used data gathering methods. It gathers qualitative data and is a good source of "perspectives" which help explain the phenomenon being validated. The interviewer uses guides (a list of topics or open-ended questions) and probes the interviewee to elicit opinions, experiences and practices.
Key pointers: Using a guide or questionnaire can distract the interviewee. It may also cause the interviewee to be cautious about the information he/she is sharing. The key in using the interview as a method is the INTERVIEWER. The interviewer should evolve as "the instrument"; he/she must be quick to adapt and adjust to the demeanor of the interviewees.

Focus Group Discussion
Description: This method involves around 8 to 12 individuals discussing a certain subject matter. The group is assisted by a facilitator and a documenter. The facilitator asks process questions to start the discussion. The facilitator does not discriminate among the answers provided by the participants but probes and asks for clarification of the responses given. A documenter records all the responses.
Key pointers: An FGD should be "focused". If the group discusses too many topics and goes out of focus, the FGD fails, and this happens a lot of the time. The main role of the facilitator is to keep the discussion of the group FOCUSED.

Inspection (artifacts review)
Description: This data gathering method focuses on the existence of artifacts. These are outputs (in document form) developed or prepared by the target group. The existence of an output is a demonstration of skills. Inspection is also a quality control activity; it involves "seeing and touching" the materials and equipment purchased, and assessing the quality of facilities constructed.
Key pointers: Inspection is used to validate "claims" made by interviewees during the interview, the FGD and in the reports submitted. The existence of standards will facilitate the inspection process.

Questionnaire
Description: This is a structured way of gathering data. Questions about the data and/or information you want to know are put into the questionnaire. The questionnaire ensures that your concerns are covered. It also allows uniform presentation of questions to respondents, reducing the bias of the researchers.
Key pointers: In using the questionnaire, limit the questions to what you really need to know. Ensure that the questions are linked to the program design. Long questionnaires have a dismal response rate. Use simple terms and instructions. When able, provide incentives.

Perception Survey
Description: Among the tools enumerated, the perception survey is the only tool used to gather quantitative data. It is used to gather information about what people think about a performance, service or product. The data generated by the perception survey are usually considered in the preparation of a program design. It involves the use of a structured questionnaire and a selected number of respondents (based on the sampling method used). It can either be self-administered or done by professional researchers. (A simple tabulation sketch follows this table.)
Key pointers: There is a need to ensure the reliability of the questionnaires to be used, and to standardize the questionnaire and its administration.
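For illustration only (the rating scale, sample responses and the "satisfied" threshold are assumptions, not prescribed by the manual), a perception survey's quantitative summary can be as simple as the following.

# Illustrative sketch: summarize perception-survey responses on a 1-5 satisfaction
# scale into an average and a "percent satisfied" figure. Responses are sample data.
ratings = [5, 4, 4, 3, 2, 5, 4, 1, 3, 4]   # e.g., parents rating quality of services

average = sum(ratings) / len(ratings)
percent_satisfied = 100 * sum(1 for r in ratings if r >= 4) / len(ratings)

print(f"Average rating: {average:.2f} out of 5")
print(f"Respondents rating 4 or 5: {percent_satisfied:.0f}%")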


DIVISION M&E SYSTEM

8.0 DOCUMENTS AND REPORTS


8.0 DOCUMENTS AND REPORTS

8.1 Description

Management reports are data organized in an easy-to-understand format. The reports provide the stakeholders with a holistic perspective on the accomplishments and events of the period that has elapsed. It is essential that a report provides complete information on these events in order to be a useful input to decision making. Furthermore, in order to be useful and effective, reports should contain information about three essential areas:

Operational information. This describes the progress or status of implementation within the school and classroom. At the school level, it includes the school programs and projects implemented, the quality of outputs delivered, the resources generated and the expenditures of the school. At the classroom level, this may include competencies gained by students, lessons covered, and attendance.

Internal and external information. Internal information relates to all activities within the school or classroom and the stories behind those activities. This includes reporting of the major events and activities that took place inside the school and the factors that facilitated or hindered them. On the other hand, external information pertains to factors outside the school that may have influenced or affected school performance. Outside information provides good comparative information for assessing the school's own performance.

Leading and lagging information. Leading information or leading indicators provide insight into, or early warning of, a future event. Some examples of leading indicators are teachers' performance (predicting student learning), frequent absenteeism (which leads to dropping out), and good teaching and school-based management (which influence enrollment).

On the other hand, lagging information or historical information provides useful insights into current accomplishments. Reports provide a comparison of past accomplishments with accomplishments to date. For example, in reporting the drop-out rate for this year, the drop-out rates of previous years are also reflected in the report to provide the historical trend.

8.2 Some Guideposts in Management Reporting

The content of reports should be driven by the decision-making requirements of the users of the report, and not by what data are available. Avoid unnecessary information as well as unnecessary attachments. The following are some guideposts in preparing a report format:

Keep it short and simple. Readers may have little time to read a voluminous report. Keep it short but full of relevant information.

Use graphs and tables that will provide immediate information about performance and accomplishments.

Follow the format of your plan. When using coding (e.g., C.1.1), follow the coding used in the approved plan.

Provide the stories behind the numbers, but keep them simple and direct.

Avoid ambiguous words and limit jargon the reader may not understand.

Attach documents that will directly support or further explain the information in the main report, and use the most authoritative source of data.

8.3 Division Documents and Reports

Documents and reports contain information that is needed in making accurate decisions. Reports are prepared not because data are available but because decisions need to be made. The organized data contained in a report provide valuable input and insights to the Division on the future actions to undertake. Such is the importance of management reports.

The problem, however, is not the lack of reports but too many reports. There is too much information, but ironically it is not received at the right time by the right individuals. It is important to streamline the reports and ensure the timely arrival of the information that a school head needs in managing the school. The following documents and reports are identified not for reporting purposes but because of the information and insights they contain, which help the Division determine its future moves and actions.

8.3.1 Baseline Documents

The following documents provide baseline information about the school, school head, teachers, instructional managers and literacy facilitators. These baseline documents will be used as the basis for measuring the progress or accomplishments of the Division.

School Report Card. Refers to school performance, with focus on enrollment, retention, completion and the learners' academic performance.

School Profile. Inventory of the school's learning environment, learning resources and other services.

Competency Profile of the Division target groups: school heads, teachers, instructional managers and facilitators. This provides the baseline information about the capabilities of the Division target groups.

Division Report Card. Provides information on the overall performance of the Division using selected performance indicators and a comparative performance of the schools within the Division.


8.3.2 Control Documents

Control documents are the education development plans prepared by the schools and the Division. These documents provide the scope of technical assistance and the coverage of the M&E activities that will be undertaken by the Division.

SIP/AIP. Contains the outcomes, target performance indicators, target outputs and strategies of the school. This will be most useful in understanding the context of the problems and issues encountered by the school and the opportunities available to the school. The SIP/AIP will be used by the Division to track the efficiency of the schools.

DEDP/DAP. Represents the six-year plan of the Division. This control document will be used by the Division to manage its operations for the next six years. It is adjusted and detailed every year through the DAP.

8.3.3 Status Reports

Status reports provide information on the progress of implementation and on the initial gains or results of the implementation. Status reports must contain the following information (a sketch of the basic computations follows this list):

Scope or quantity (target). Provide a comparison of the actual outputs covered or delivered versus the plan.

Quality of accomplishments. Describe the characteristics of the outputs delivered in order to give readers an idea of what was accomplished and a basis for further evaluation.

Schedule. Provide a situationer on the accomplishments versus the plan. Status reports should show how fast, slow or on track the progress of implementation is.

Cost. Status reports provide an account of the expenditures vis-a-vis the approved budget.

Initial benefits or results. Describe the improvements in the practices of the target groups.

Major problems and issues. Provide a list of the major problems encountered and issues that may create more problems. The report must contain suggestions or recommendations on how to mitigate and/or solve these problems and issues.

Next steps. A status report serves as the important link between the current situation and the activities to be implemented in the next period.
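As a minimal, hedged illustration (the figures and field names below are invented, not prescribed by the manual), the scope and cost comparisons in a status report reduce to simple plan-versus-actual ratios.

# Illustrative sketch: compute the plan-versus-actual figures typically cited in a
# status report. All numbers are hypothetical sample data.
planned_outputs = 40          # outputs targeted for the period (scope/quantity)
actual_outputs = 31           # outputs actually delivered
approved_budget = 500_000.0   # approved budget for the period
actual_expenditure = 420_000.0

physical_accomplishment = 100 * actual_outputs / planned_outputs
budget_utilization = 100 * actual_expenditure / approved_budget

print(f"Physical accomplishment: {physical_accomplishment:.1f}% of target")
print(f"Budget utilization: {budget_utilization:.1f}% of approved budget")

# A large gap between the two rates is a prompt for the narrative portions of the
# report: quality of outputs, problems and issues, and next steps.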

The following are the main status reports and their objectives:

School Quarterly Progress Report. A consolidation of three monthly reports to be submitted to the Division. It contains the physical accomplishments for the quarter and a description of the programs and projects implemented. The report may also contain problems and issues encountered by the school that need to be addressed by the Division.

Monthly Report/Annual Report on ALS Programs. Contains the status of the Basic Literacy Program and the A&E Program of the Division, which are implemented through the community learning centers operated by service providers or run by the District.

Division Monthly Report. Contains the physical accomplishments of the Division versus the plan (DAP), a short description of the programs and projects implemented, and documentation of the problems, issues and opportunities encountered.

Division Annual Accomplishment Report. An end-of-year report containing the programs, projects and other services delivered by the Division. The report also provides a comparative report on the performance of the schools using selected performance indicators and the schools' level of practice on SBM.

8.3.4 Accomplishment Reports

Accomplishment reports are documents prepared and submitted at the end of every program or project or at the end of a major undertaking. The accomplishment reports include the following:

DEDP Terminal Report. This report contains the accomplishments of the Division after six years of implementing the DEDP. Specifically, it contains information on the schools' performance (comparative), the competency profile of school heads, teachers and Division staff, and an inventory of the programs and projects implemented for the schools. The completion report also includes the Division Report Card, which provides a holistic picture of the Division after six years of DEDP implementation.

Division Mid-Term Implementation Report. This report is prepared at the end of the first cycle of SIP implementation (or phase 1 of the six-year DEDP implementation). The Mid-Term Report contains information on the achievements and accomplishments of the schools and the Division after three years. The report provides information and insights that may drastically alter or affect the next three years of DEDP implementation.

Division Best Practices. Documentation of programs and projects implemented that netted positive results.

Table 8-1 Division Documents and Reports

Baseline Document: School Profile
Content: Resource profile of the school, including human and physical resources
Purpose: Document submitted by schools to the Division to determine the areas for technical assistance to schools
Timing of report: End of March of each year
As input to: DEDP and DAP

Baseline Document: School Report Card
Content: Performance indicators
Purpose: Document submitted by schools to the Division to determine the effectiveness of Division interventions as well as the impact on schools' performance
Timing of report: End of March of each year
As input to: Attachment to the annual report and the SIP completion report

Baseline Document: Division Report Card
Content: Schools' performance (comparative); competency profile of school heads, teachers and Division staff; QM assessment
Purpose: To provide baseline information on the Division, which will be used to assess the year-to-year performance of the Division and for outcome evaluation by the Region
Timing of report: Annual
As input to: Preparation of the DEDP; Region's input to the REDP

Status Report: School Monthly Report
Purpose: To report on the progress of implementation; documentation of accomplishments per month
Timing of report: End of the month
As input to: Adjustment of the next month's activities; performance assessment of teachers

Status Report: School Quarterly Report
Purpose: To show the status of AIP implementation after every 3 months
Timing of report: End of the quarter
As input to: Adjustment of the next quarter's activities; performance assessment of teachers

Status Report: School Annual Report
Purpose: To present the accomplishment report of the school after 1 year
Timing of report: March of each year, except the last year of SIP implementation
As input to: Basis for adjusting/enhancing the next year's AIP; basis for measuring the efficiency of the school head

Status Report: Learner Report Card
Purpose: To provide information about the learners' performance
Timing of report: Quarterly and end of school year
As input to: Learning management plans

Accomplishment Report: SIP Completion Report
Purpose: To provide documentation of the 3-year implementation, including lessons learned and key practices of the school
Timing of report: January to February of the 3rd year of the SIP
As input to: Next cycle SIP

Accomplishment Report: School Program/Project Report
Purpose: To provide documentation of the program/project accomplished
Timing of report: End of the program/project
As input to: Best practices

8.3.5 Other Documents

Other Documents include acknowledgement receipts or inventory receipt of property,

certificate of acceptance, memorandum receipt.


8.4 Reporting Flow and Frequency

Most reporting flows follow the organizational structure: teachers submit reports to the department heads, and the department heads report to the school head. As a result, critical and valuable information is reported up the ladder but is not disseminated fast enough to the other field units that may need it. This reporting "flaw" should be corrected if the objective is to provide timely and relevant information to the stakeholders or users of the reports.

Consider not only the vertical flow of information but also the horizontal flow. The horizontal flow of information encourages the sharing of data, information and insights, and is a much faster way of propagating effective practices. For example, an English teacher can share information about a learner directly with the Math and Science teachers rather than going through the process of reporting and submitting the information to the Department Head and the School Head.

Some guidelines on the reporting flow:

Timing. The most accurate data and information lose their usefulness if they are received late.

Not all information is reported or shared with the other levels; only the data and information needed to make decisions or adjustments. However, where additional information is requested, detailed or back-up information should be readily available.

Uniform or similar report formats. This ensures easier consolidation, comparison and analysis of information, whether the flow is vertical or horizontal.

Accuracy over precision.
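To make these guidelines concrete, the sketch below (Python, with hypothetical field names and dates; not part of any DepED system) shows one way a Division might represent a report record so that its timing, a uniform format and both vertical and horizontal distribution lists are explicit.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class DivisionReport:
    """Uniform report record: the same fields whether the flow is vertical or horizontal."""
    title: str                       # e.g. "Division Monthly Report"
    reporting_period: str            # e.g. "2024-06"
    due_date: date                   # timing: accurate data loses its usefulness if it arrives late
    key_findings: List[str]          # only what is needed for decisions or adjustments
    vertical_recipients: List[str] = field(default_factory=list)    # e.g. SDS, Regional Office
    horizontal_recipients: List[str] = field(default_factory=list)  # e.g. other coordinators

    def is_late(self, submitted_on: date) -> bool:
        return submitted_on > self.due_date

# Hypothetical example: a monthly report shared up the line and across coordinators.
report = DivisionReport(
    title="Division Monthly Report",
    reporting_period="2024-06",
    due_date=date(2024, 7, 5),
    key_findings=["Two school projects are behind schedule"],
    vertical_recipients=["SDS", "Regional Office"],
    horizontal_recipients=["SBM Coordinator", "DORP Coordinator"],
)
print(report.is_late(date(2024, 7, 10)))  # True
```

Keeping both recipient lists in the same record is one way to make the horizontal flow as routine as the vertical one.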


DIVISION M&E SYSTEM

9.0 M&E TERMS OF REFERENCE


9.0 M&E TERMS OF REFERENCE

9.1 Manifestations of a Neglected M&E System

The M&E system is acknowledged to be one of the most important systems in management. It is an important mechanism in the directing, steering and controlling functions of management, providing the information and insights that help ensure quality products and services as well as continuous improvement in the organization. Ironically, the M&E system is also one of the most often neglected systems in an organization. Here are some manifestations of an organization that has neglected its M&E function:

Nobody is in charge of M&E. If somebody is assigned to do M&E, it is a junior staff member tasked to collect data and put them together in one document.

Implementers are forced to make decisions without the benefit of data and information. There is no direct link between M&E activities and decision making.

The M&E system and its requirements are set up in the middle of an implementation.

Nobody can really say what the status of implementation is.

Scope creeps. A scope creep is an activity, event or output undertaken (which may be necessary) but is not part of the approved or agreed plan. Too many intervening activities, events or outputs lead to the non-implementation of approved programs and projects.

The same (failed) programs and projects continue to be implemented.

Never-ending collection of data. Field personnel are often burdened with requests for data and information even though these have already been reported.

Different units or individuals collect the same data simultaneously.

A sure sign of a missing M&E system is when the basic data elements are never collected.

9.2 Common Misconceptions about M&E Work

Because of this neglect, actual M&E efforts are often limited to data gathering, report writing and report submission. Such wrong perceptions have contributed to the popular belief that there is a dichotomy between M&E and decision making.

Here are some of the misconceptions about the work in M&E:

M&E is about submitting reports. Report preparation and submission is just one of the many functions of M&E, which also includes gathering data and information, analyzing them, and writing and presenting them in a format that will


facilitate decision making. These reports will be the basis for future actions and

future designs of programs and projects.

The M&E System is often equated with filling out forms, tables and matrices. The important activity of validating the data and information is often neglected; thus, the practice of merely filling out forms, tables and matrices leads to erroneous data and information.

M&E is about field visits and data gathering.

Data collected must always go up before it is disseminated down the line. As a

result, needed data and information arrive late or never at all.

Another wrong notion is that M&E exists only to meet the information requirements of the external or higher management-level unit. In fact, the M&E system is set up and put into operation in order to meet the information requirements of the internal units, especially the individuals who are responsible for the delivery of outputs and the achievement of outcomes.

These misconceptions often lead to the false notion that M&E people are just spectators, watching and observing (spying) and waiting for people to make mistakes, which they then report. The misconceptions above are the usual reasons why people shun M&E.

9.2 The M&E Function

The M&E function is very important in decision making. Every implementer will make

decisions according to their own accountabilities. Monitoring and evaluation activities are

undertaken to ensure that accountabilities and desired results are achieved. Essentially, M&E is about adjustments, which can consist of:

No changes, if no or only tolerable deviations from the plan are observed;

Changes in activities, if deviations from the plan can be counteracted by adjusting resources and activities;

Adaptation of the plan, if the strategy does not yield the expected results and effects;

Changes in the strategy, or termination of the plan, if the target purpose turns out to be unachievable due to misconceptions or changes in framework conditions.
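As a rough illustration only (hypothetical function name and thresholds, not an official DepED rule), the sketch below shows how these four adjustment options could be applied to a single monitored target.

```python
# A minimal sketch of mapping an observed deviation to one of the four
# adjustment options described above. Thresholds are illustrative only.

def recommend_adjustment(planned: float, actual: float,
                         tolerance: float = 0.05,
                         strategy_still_valid: bool = True,
                         purpose_achievable: bool = True) -> str:
    """Return one of the four M&E adjustment options for a single target."""
    if not purpose_achievable:
        return "change the strategy or terminate the plan"
    deviation = abs(planned - actual) / planned if planned else 0.0
    if deviation <= tolerance:
        return "no changes"
    if strategy_still_valid:
        return "adjust resources and activities"
    return "adapt the plan"

# Example: 80% planned completion rate vs 72% actual, strategy still considered sound.
print(recommend_adjustment(planned=0.80, actual=0.72))  # adjust resources and activities
```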


9.3 The Process Owner

9.3.1 Schools Division Superintendent

The SDS is the process owner of the Division M&E System. As process owner, the SDS must

ensure the integrity and efficiency of the System. This means providing accurate, timely and relevant information to the schools, Region and other stakeholders. The SDS will

also be the major beneficiary of the lessons and insights produced by the M&E System.

Specifically, the following outlines the roles and responsibilities of the SDS on M&E:

Overall, the SDS provides the steering and decision-making requirements of the Division's

technical support to schools and community learning centers.

The SDS shall report directly to the Regional Director and provide information on the

progress or status of the DEDP implementation. The SDS shall also raise problems and issues

affecting Division operations and recommend areas for adjustments.

The SDS shall interact closely with the following stakeholders:

Regional Office

Schools

Assistant Schools Division Superintendent

Division M&E Coordinator

Division Planning Officer

Division staff

Division Quality Management Team

As the major decision maker in the Division, the SDS shall have overall supervision of the

DEDP/DAP implementation. The SDS is to ensure that all the programs and projects

undertaken by the Division and Districts are in accordance with the accepted DEDP/DAP,

on time and within budget. In this regard, the SDS shall undertake the following:

Conduct regular meetings, workshops and evaluation

Oversee the implementation of the Quality Control and Adjustment Points

Review and endorse the DEDP Completion Report.

Lead the Division Quality Management Team. As team leader, provide directions in the conduct of the SIP Appraisal, Start Up Review, Annual Implementation Review, Mid-Term Review and Outcome Evaluation.

9.3.2 Assistant Schools Division Superintendents

The ASDS is responsible for monitoring and evaluating the progress and quality of the

Division programs and projects for schools and community learning centers or programs

outlined in the DEDP/DAP. The ASDS is directly responsible for the operational supervision


of their units. They provide technical assistance to education supervisors and district

supervisors on how to efficiently and effectively deliver the Division programs and projects.

The ASDS shall interact closely with the following stakeholders:

Schools Division Superintendent

Division Monitoring and Evaluation Coordinator

School Heads

QMT Members

The ASDS shall report directly to the SDS. He/She shall provide the SDS with the progress or

status of programs and projects of the Division, raise issues and problems affecting (or that may affect) the technical assistance support to schools and community learning centers,

and recommend areas for adjustment in the DEDP/DAP. Specifically, the following outlines

the roles and responsibilities of the ASDS:

Prepare programs and projects that will support the requirements of the Division

staff, particularly the promotional staff and the district supervisors.

Prepare and submit Monthly Report detailing the status of programs and projects

and future activities

Conduct unit meetings and workshops related to M&E concerns

Supervise day to day activities of Division staff. Monitor the provision of technical

assistance to schools and community learning centers

As member of the Division Quality Management Team, participate in the SIP

Appraisal, Start Up Review, Annual Implementation Review, Mid-Term Review and Outcome Evaluation.

9.3.3 Division M&E Coordinator

The M&E Coordinator is responsible for the overall M&E strategy and the implementation of M&E-related activities within the Division, and provides timely and relevant information to stakeholders. Due to the strategic and sensitive nature of the M&E function, it is suggested that the designation of M&E Coordinator be given to one of the Assistant Schools Division Superintendents (ASDS).

The M&E Coordinator shall interact closely with the following stakeholders:

Schools Division Superintendent

Assistant Schools Division Superintendent

Division Planning Officer

School Heads and Instructional Managers

QMT Members



The M&E Coordinator shall report directly to the SDS. The Coordinator shall provide the SDS

with interpretation and analysis of M&E data, raise issues and problems affecting (or that may affect) DEDP/DAP implementation, and recommend areas for adjustment in the

implementation plan. Specifically, the following outlines the roles and responsibilities of the

Division M&E Coordinator:

Assist in the review and revision of the DEDP objectives and strategies, particularly

in the adjustment of the OVIs (objectively verifiable indicators) and MOVs (means of verification: reports, documents, data gathering methods)

Assist in the development and adjustment of the DEDP and DAP.

Assist the ASDS in setting up the Division M&E System. Ensure that the Division M&E

System complies with the operations of the Region M&E System

Communicate to Division and District staff the requirements of the School M&E

System and the roles and responsibilities of staff on M&E

Prepare the consolidated Division Monthly Report for the SDS in accordance with the

approved reporting formats and schedule. This also includes reviewing and

validating the reports submitted by school staff and documents received from

outside the school.

Assist the education supervisors, district supervisors and other Division staff in the

preparation of their progress reports.

Record and report the Physical Accomplishments of the Division and schools.

Ensure proper documentation and safekeeping of Division reports and documents

generated during project implementation.

9.3.4 Division Planning Officer

The Planning Officer is responsible for ensuring the reliability of Division and school data. Specifically, the Planning Officer is responsible for the collection and validation of school, community learning center and Division data, and provides initial statistical analyses of the same.

The Planning Officer shall interact closely with the following stakeholders:

Schools Division Superintendent

Assistant Schools Division Superintendent

Division Monitoring and Evaluation Coordinator

School Heads

QMT Members

The Planning Officer shall report directly to the SDS and work side by side with the


M&E Coordinator. The Planning Officer shall provide initial interpretation and analysis of project data. Specifically, the following outlines the roles and responsibilities of the Division Planning Officer:

Assist in the review and revision of the DEDP objectives and strategies, particularly

in the adjustment of the OVIs (objectively verifiable indicators) and MOVs (means of verification: reports, documents, data gathering methods)

Assist in the development and adjustment of the DEDP and DAP

Assist the M&E Coordinator in the preparation of consolidated Division Monthly

Report

Ensure availability of information from the BEIS and timeliness of data.

Take the main responsibility for computer entry of data and provide initial analysis and interpretation; ensure the integrity and accuracy of data

Manage and maintain the BEIS

9.3.5 Education Supervisors / District Supervisors

The Education Supervisors and the District Supervisors are responsible for tracking the performance of the schools and the community learning centers and for helping school heads, teachers and facilitators deliver quality education programs. The supervisors are directly responsible for the monthly monitoring of school heads, teachers and facilitators. Their primary monitoring responsibility is to provide feedback to the Division in order to enhance the Division's programs and projects for schools and community learning centers.

The Supervisors shall interact closely with the following stakeholders:

Schools Division Superintendent

Assistant Schools Division Superintendent

Division Monitoring and Evaluation Coordinator

School Heads

The Supervisors shall report directly to the SDS and/or the ASDS. They shall provide the SDS/

ASDS with the progress of the training and technical support to schools and community

learning centers on a periodical basis. Specifically, the following outlines the roles and

responsibilities of the Supervisors:

Prepare program/project designs

Monitor the performance of schools and community learning centers. This

includes conduct of regular field visits.

Inform the SDS, ASDS and/or other Supervisors about school and community

learning centers' concerns to ensure prompt response to problems and issues


Prepare and submit Monthly Report detailing the status of programs and projects

and future activities

Prepare and submit an Assessment Report on school performance

As member of the Division Quality Management Team, participate in the SIP

Appraisal, Start Up Review, Annual Implementation Review, Mid-Term Review and Outcome Evaluation.

9.3.6 Coordinators

Coordinators are Division staff assigned a specific program or responsibility. These include

Division staff designated as physical facilities coordinator, procurement coordinator, SBM

coordinator, DORP coordinator and others.

Each coordinator shall be responsible for monitoring programs and/or concerns assigned to

him/her.

The Coordinators shall interact closely with the following stakeholders:

SDS

ASDS

School M&E Coordinator

School Planning Coordinator

Other Coordinators

The Coordinators are urged to establish and strengthen their horizontal link with other

coordinators and/or education supervisors and district supervisors in order to fast track the

sharing and utilization of valuable information affecting programs and projects. Specifically,

the following outlines the roles and responsibilities of the Coordinators regarding monitoring and evaluation:

Monitor the efficiency and effectiveness of programs or concerns

Monitor utilization of programs, systems, facilities and learning equipment installed at

the school and community learning center level

Prepare and submit status report

9.3.7 M&E Support Staff

The M&E Support Staff provides administrative support to the Division M&E Coordinator and is responsible for the data collection, encoding, filing and maintenance requirements of the Division M&E System.


9.4 The Quality Management Team

The Quality Management Team (QMT) is an ad hoc body composed of Division and District staff whose task is to implement the quality control and adjustment mechanisms of the Division. These mechanisms include the SIP appraisal process, start-up review, annual implementation review, mid-term implementation review and outcome evaluation.

In general, the QMT is created to ensure compliance and adherence to the objectives and

targets of the Division and to ensure uniform application of policies, standards and

processes.

The QMT is responsible for:

ensuring the quality of plans, program and project designs developed by the

Division, district and schools

ensuring that staff from the division, district and schools are adhering to the

standard processes employed to assure quality

evaluating the major milestones at the school and community learning centers

The QMTs are divided into two major groups: the Core QMT and the Area QMT.

The Core QMT is the central body or process owner of the Quality Management System.

Specifically, the Core QMT will be responsible for the following:

set up the Quality Management System in the Division

oversee the creation and formation of Area QMTs

build capability of the Area QMTs

communicate and enhance Division standards and the Quality Management

System

The Area QMTs put into operation the Quality Management System of the Division. These

teams are responsible for enforcing the quality standards of the Division and providing

technical and training support to schools and community learning centers.

Specifically, the Area QMTs are responsible for the following:

provide technical support to schools in setting up the school quality management

system

orient the schools and community learning centers on quality management

implement the quality control and adjustment points of the Division

evaluate the SBM assessment and ensure the integrity of the process


DIVISION M&E SYSTEM

10.0 SETTING UP A DIVISION M&E SYSTEM


10.0 SETTING UP THE DIVISION M&E SYSTEM

10.1 When to Set Up

The Division M&E System is set up at the Start Up Stage of the DEDP Implementation. The

DEDP provides the directions, objectives, scope of work, indicators and criteria for success

that are very important in monitoring and evaluation.

10.2 Requirements for an Effective M&E System

Effective school management requires that a well-organized monitoring and evaluation

system be designed, developed and implemented so that immediate feedback can be

obtained concerning school performance and immediate adjustments or decisions can be

made to ensure achievement of objectives and targets outlined in the school improvement

plan. The requirements for an effective M&E system include:

Commitment to do monitoring and evaluation. The primary requisite of an effective monitoring and evaluation system is the commitment of all stakeholders. This commitment is manifested by the following: (1) all have the same understanding and appreciation of the scope of monitoring and evaluation; (2) a continuing commitment to excellence and the improvement of school outcomes; and (3) objective use of the information and results of monitoring and evaluation.

Desired outcomes and objectives are realistic, clearly defined and verifiable. This simply means that the DEDP and the SIPs are correctly done. The DEDP and the SIPs provide the scope of the monitoring and evaluation; unless the plan is comprehensively and clearly prepared, the conduct of monitoring and evaluation will be difficult. There should be a clear description of the objectives, targets and milestones to be achieved in 3 to 6 years.

Standards are well established and communicated to all team members. Standards include expected competencies, the curriculum, SBM standards, the training and development process, planning requirements and the standard process for monitoring and evaluation.

Figure 10-1 Start Up Stage (diagram: "Set Up the Division M&E System" is an activity of the Start Up Stage)

System boundaries among the different levels are clearly established. This includes a clear definition and delineation of roles and responsibilities at the different levels of the organization.

A clear definition of the DEDP life cycle framework is established. The life cycle

framework puts into context the decision making requirements of the Division in

every phase, stage or milestone.

10.3 Some Guideposts in Setting Up the M&E System

The set up process must ensure the following:

Make sure the school managers, teachers and other stakeholders understand the scope of the M&E system.

There should be agreement on the performance measures that will be used in the monitoring and evaluation of the schools, that is, the outcomes and intermediate objectives in the SIP.

If you can't measure it, you can't manage it. Be sure your targets are objectively verifiable.

Consider the standards and policies of the DepED before finalizing the performance measures and reports in the system.

In the design of the system, put the requirements of the school and teachers first. The system must meet the decision-making requirements of the school head and teachers before considering or meeting the information requirements of external stakeholders.

Keep the system as simple as possible. Minimize reports; merge documents and reports when possible. Identify the most authoritative report.

Set up the system as quickly as possible.

10.4 Steps in Sett ing Up the M&E System

The five-step process in setting up the Division M&E System includes:

(1) define the scope of the M&E,

(2) design the control and adjustment points,

(3) determine the information requirements of the stakeholders,

(4) set up the monitoring process, and

(5) communicate the system.


10.4.1 Define the Scope of the M&E

The most important first step in setting up the M&E System is to clarify and define the scope

of the M&E. This involves clarifying the objectives and targets of the Division, defining the

success indicators and performance measures.

In defining the scope of the M&E, the following guide questions must be answered:

What are the education impact objectives we want to achieve?

What are the outcomes or benefits we want our target groups to experience?

What are the programs and projects that we must deliver to achieve the

outcomes? How many and when?

Are the indicators SMARTly formulated? Can they be verified?

What are the resources needed to implement the programs and projects?

The answers to the questions above are found in the Division Education Development Plan

(DEDP). It is the DEDP that provides the coverage and boundary of the M&E System. Hence,

it is very important that the DEDP is clear and accurate when it comes to the objectives, targets, programs and projects that it needs to deliver in the next 6 years. Although the DEDP has already been reviewed and approved during the appraisal period, it is recommended that it be reviewed or revisited again before full implementation is undertaken.

The following illustrates the 3-step process in defining the scope of the Division M&E system:

(1) The first step in designing the Division M&E system is to have a good understanding of the objectives, targets and strategies contained in both the SIPs and the DEDP. This means reviewing and/or updating the DEDP and the SIPs.

(2) The second step is to review and finalize the performance measures. A performance measure is composed of a number and a unit of measure: the number provides the magnitude (how much) and the unit of measure gives the number its meaning (what). Performance measures are among the critical elements of M&E; they must provide an accurate picture of the status of accomplishments or outcomes. Once the performance measures are clarified and adjusted, finalize the targets and "freeze" them. These will be the basis for the monitoring and evaluation.

Careful consideration must be given to the choice of a performance measure. Some guidelines in defining the performance measures of the school:

The fewer the better. One common pitfall in evaluation is the notion that the more data gathered and the more performance measures used, the better. This does not always follow. As a rule, the fewer the performance measures, the more accurate the picture of the situation.



Focus on the right things. Ensure that the performance measures selected are the correct measures for assessing the performance of the learners, teachers and school head. "Correct" means direct and used exclusively for one performance area only.

Integrated with other measures. Each performance measure should be connected to the other measures in order to provide a more holistic picture of the school's accomplishments and achievements.

(3) The third step is to finalize the DEDP and prepare the DAP (Year 1) based on the adjusted targets and schedule.

Defining the scope of the M&E will facilitate the design and establishment of the Division M&E System. The DEDP, with its objectives, targets, proposed strategies and activities, defines the scope of the M&E. It is important that these are revisited and finalized to formalize the scope of the M&E. This will lead to the design of the monitoring process, control points, data collection tools and techniques, and management reports.
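A minimal sketch, assuming hypothetical names and illustrative figures, of a performance measure as described above: a number plus a unit of measure, compared against a target that has been "frozen".

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PerformanceMeasure:
    """A performance measure as described above: a number plus a unit of measure."""
    indicator: str   # what is being measured, e.g. "Cohort survival rate"
    value: float     # the magnitude (how much)
    unit: str        # gives the number its meaning (what), e.g. "percent"
    target: float    # the target agreed and "frozen" during DEDP finalization

    def deviation(self) -> float:
        """Signed gap between the actual value and the frozen target."""
        return self.value - self.target

# Illustrative figures only.
csr = PerformanceMeasure("Cohort survival rate", value=68.0, unit="percent", target=75.0)
print(csr.deviation())  # -7.0
```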

10.4.2 Design the Control and Adjustment Points

The next major step in establishing an M&E system is to design the control and adjustment

points. The control points represent the core features of the M&E system. Control and

adjustment points are mechanisms for review to assess, validate and adjust (when needed)

the quality, scope, timing and cost requirements of an implementation. All the other

requirements of M&E, from reports to data requirements, will be based on these control

points.

Division Control and Adjustment Points are determined by drawing a road map of the DEDP

implementation. The road map will help illustrate the context and the relationship between

the implementation, control requirements and timing of the control. This will help establish

the link between the SIP implementation and the implementation requirements of the

Division and districts. This approach will provide the context of the decisions.

Implementation Stage and Control & Adjustment Point

A stage represents a major segment in the implementation phase, the completion of which

represents a major milestone. Each stage in the DEDP implementation represents unique

requirements and interactions as well as unique problems and issues. The stage approach gives managers more control in managing the implementation. The unique requirements of

each stage provide the context for monitoring and evaluation.

Control points are M&E review gates for evaluating major outputs and milestones. The results of

the control points are used as the basis for adjusting or enhancing the implementation.


The main reference materials in the establishment of control and adjustment points are the DEDP and the SIP. The implementation plans provide details on the critical activities to be undertaken and the targeted accomplishment dates of outputs. The following items need to be considered in the identification and design of Control and Adjustment Points:

Accomplishment of an output or major milestone. A major consideration in the setup of the Control and Adjustment Points is the set of outputs to be delivered. Outputs need to be quality assured.

Management reporting practices in the Department. Control points are patterned

after the management reporting practices of the agency. Considering the reporting practices of the agency will help meet both the requirements of the school and those of the Division, Region and National Office.

Critical path or segment in the implementation process. Critical path refers to an

activity or activities that will have major implications to other activities, outputs and

decisions to be made in the future.
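The considerations above can be recorded in a simple road-map structure. The sketch below is illustrative only; the review names are those used elsewhere in this manual (Start Up Review, Annual Implementation Review, Mid-Term Review), while the field names, dates and flags are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ControlPoint:
    """One M&E review gate on the DEDP road map (names and dates are illustrative)."""
    name: str               # e.g. "Annual Implementation Review"
    stage: str              # implementation stage it closes, e.g. "Year 1 DAP"
    milestone: str          # output or milestone being quality-assured
    scheduled_month: str    # timing aligned with the agency's reporting practices
    on_critical_path: bool  # whether slippage here affects later activities and decisions

DIVISION_CONTROL_POINTS = [
    ControlPoint("Start Up Review", "Start Up", "Division M&E System in place", "2024-01", True),
    ControlPoint("Annual Implementation Review", "Year 1 DAP", "Year 1 targets delivered", "2024-12", True),
    ControlPoint("Mid-Term Review", "Phase 1 (3 years)", "Mid-Term Implementation Report", "2026-12", False),
]

# Gates on the critical path deserve the closest monitoring attention.
for cp in DIVISION_CONTROL_POINTS:
    if cp.on_critical_path:
        print(cp.name, "-", cp.milestone)
```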

Figure 10-2 Control and Adjustment Points (diagram of the Division Quality Control & Adjustment Points along the DEDP and SIP implementation timeline)

10.4.3 Determine the Decision Making Requirements of Stakeholders

The third important step in setting up the M&E System is to determine the decision making

requirements of the stakeholders. Also known as key players, stakeholders are individuals or groups who provide support to and/or can influence the implementation of the DEDP.

Stakeholders are classified into internal and external stakeholders. Internal stakeholders

refer to Division management and staff, the people who have the highest stake in the

successful implementation of the DEDP. External stakeholders refer to individuals or groups

who provide technical support to the Division.

For both types of stakeholders, the System must be able to provide timely, accurate and

relevant information in order to ensure effective management and prompt delivery of

support and assistance. The decision making requirements of the stakeholders will dictate

the type of information that needs to be generated by the Division M&E system.

Determining the decision-making requirements of stakeholders is accomplished in 4

sequential steps. These steps assure that only the critical decision requirements of the

stakeholders are identified.

(1) First and foremost, the information requirements of the INTERNAL stakeholders

must be identified before responding to the information requirements of external

stakeholders. The Division M&E System must facilitate the information requirements

of the Division in order to help its management and staff make timely and critical

decisions that will lead to the attainment of targets and objectives contained in the

DEDP. As a system, the needs of the internal stakeholders are provided in order to

ensure efficient and effective implementation of programs and projects the

Division is accountable to provide.

The following are suggested steps in analyzing internal stakeholders and their

information requirements:

(a) Identify the stakeholders

The main M&E users at the Division and District level are the following:

Schools Division Superintendent. Accountable for the overall success of the Division and for the performance of the Division staff in providing efficient technical support to schools and community learning centers.

Assistant Schools Division Superintendent. Accountable for the operational efficiency of the Division and District in delivering programs and projects on time and as per target.

Education Supervisors. Accountable for the effectiveness of Division programs and projects for school heads, teachers, non-teaching staff, instructional managers and facilitators.

District Supervisors. Accountable for maintaining continuous assistance to


schools and community learning centers.

(b) Profile the stakeholders

Profile each stakeholder based on the following:

Functions (roles and responsibilities). Pertains to both the de jure (mandated)

and de facto (actual) functions of the individual.

Information requirements. Pertains to data and information needed by the

stakeholder in order to make a decision.

When. Refers to the time/day the information is needed by the stakeholder.

MoV. Refers to the current practices or modes used to verify the information.

It is also important to note that the functions and information requirements of certain stakeholders vary per time period. In this regard, refer to the M&E Operations Framework (macro-annual) to determine the kind of decisions a stakeholder makes and the information needed at a given time.

(c) Identify possible design considerations

The stakeholder profile and information requirements can be used as input to the

content of the report, format of the report, how the reports will be presented and

the timing of the reports.

Below is a sample Stakeholder Information Requirement Matrix for internal stakeholders.

Table 10-1 Stakeholder Information Requirement Matrix, Internal Stakeholders (sample only, intended to show the information to be gathered about each stakeholder; additional stakeholders and additional information may be added depending on the accountabilities of individuals in the Division and District)

SDS
Functions (roles and responsibilities): Provide strategic directions to schools, districts and the Division; overall management of the Division
Information needed: Performance of schools and community learning centers (Quarterly/Annual; MoV: School Report Card, Division Report Card, Division Status Report); programs and projects delivered versus plan (Monthly; MoV: Division Monthly Status Report)
Implication to M&E design: Information must reach the SDS on a quarterly basis in order to effectively provide the steering role

ASDS
Functions (roles and responsibilities): Efficient operations of Division programs and projects
Information needed: Performance of school heads, teachers, facilitators and instructional managers (start of the SY, during the SY (periodic) and end of the SY; MoV: implementation plan and status reports)

Education Supervisors
Functions (roles and responsibilities): Provide training and technical assistance to school heads, teachers, facilitators and instructional managers
Information needed: Competencies of target groups (all year round; MoV: school visits, observation and inspection)

District Supervisors
Functions (roles and responsibilities): Provide training and technical assistance to school heads, teachers, facilitators and instructional managers
Information needed: Competencies of target groups (all year round; MoV: school visits, observation and inspection)

Division Procurement Staff and Division Physical Facilities Coordinator
(rows left blank in the sample, to be completed by the Division)
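A minimal sketch, with a hypothetical structure and the SDS row of Table 10-1 as sample content, of how the internal stakeholder information requirement matrix could be kept in a machine-readable form.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StakeholderRequirement:
    """One row of a stakeholder information requirement matrix (cf. Table 10-1)."""
    stakeholder: str
    functions: str                     # roles and responsibilities
    information_needed: str
    when: str                          # time or period the information is needed
    means_of_verification: List[str]
    design_implication: str            # what the M&E system must therefore produce

internal_matrix = [
    StakeholderRequirement(
        stakeholder="SDS",
        functions="Provide strategic directions to schools, districts and the Division",
        information_needed="Performance of schools and community learning centers",
        when="Quarterly / Annual",
        means_of_verification=["School Report Card", "Division Report Card", "Division Status Report"],
        design_implication="Consolidated performance information must reach the SDS quarterly",
    ),
]

for row in internal_matrix:
    print(f"{row.stakeholder}: needs '{row.information_needed}' ({row.when})")
```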

(2) Second, identify the information requirements of the EXTERNAL stakeholders.

External stakeholders are individuals or groups who can influence and/or provide

support to the school. These groups may require information from schools in order to align their plans, programs and even their policies to the schools' requirements. The School

M&E System is also designed to meet the information needs of these stakeholders.

Unlike the internal stakeholders, whose information needs are operational concerns, the information needs of the external stakeholders are more about outcomes and results, information that is important for policy formulation and for the design of technical assistance programs for schools.

Normally, these stakeholders have their own M&E systems, whose report content, format and timing should be considered as much as possible. Hence the need to configure the School M&E System to align with the M&E systems of these stakeholders. The suggested steps in doing an external stakeholder analysis are almost the same as those for the internal stakeholders. These include:

(a) Identify the stakeholder

Not all individuals or groups outside of the school may qualify as stakeholders. In this

context, external stakeholders are those that can influence or support the school's

implementation of its programs and projects. Influence may include changes in

policies, technical assistance support and financial assistance.

(b) Profile the Stakeholder

Gather some information about the stakeholder in terms of:

Mandate. Pertains to roles and responsibilities on education as determined by law. It may also pertain to the charter or vision/mission of the organization, especially for non-government organizations or foundations.

Possible support to schools. Includes current support and potential support

that can be provided by the stakeholder.


Information. Refers to the type of information the stakeholder may need to

make decisions that will compel the stakeholder to support or assist the

school.

When. Refers to the time period the information is needed by the

stakeholder.

Means of Verification. These are current practices or documents the

stakeholder is using in its own M&E system.

(c) Determine the implication to the design of the Division M&E System.

The design of the Division M&E System must be closely linked to the design of the M&E systems of the external stakeholders, especially the Regional Office, and to the M&E systems of the schools. The report content, forms and formats must be closely linked to those being used by the schools, Region and Central Office.

For external stakeholders who are not directly mandated by law to support the

schools but are supporting the schools on their own, it is important to determine

their planning and budgeting period. This period may be the most appropriate

time to provide information about the school and the support it needs.

The table below provides an example of a Stakeholder Information Requirement Matrix for external stakeholders.

Table 10-2 Stakeholder Information Requirement Matrix, External Stakeholders (sample only; the list of stakeholders and their roles and responsibilities may vary from place to place)

Regional Office
Mandate: Provide technical assistance support to the Division; ensure the quality of Division operations
Support / possible support: Training of Division staff on management, consultancy, etc.
Information needed: Status of DEDP implementation (quarterly and as the need arises; MoV: Quarterly Report, school managers' meetings/conferences, school visits); teachers' performance (school break and semestral break; MoV: needs analysis, classroom observation)

Central Office, LGUs (province level) and other agencies
(rows left blank in the sample, to be completed by the Division)


The information requirements of the stakeholders will have implications for the design and/or content of the following:

data elements. These are the most basic pieces of information about the status of an implementation. Usually these are raw data and are important in determining or computing the performance measures. They also dictate the forms and tables to be developed.

forms and templates. The simpler the forms and templates, the better. These are the most fundamental collection tools used in documenting an event or an accomplishment.

report format. The decision-making requirements of the stakeholders will determine the format of the report. It should contain all the necessary information, the numbers and the stories behind the numbers, in order for a school head or a teacher to make the necessary adjustments or improvements in the strategies implemented.

reporting frequency. The stakeholders' need to make decisions will also dictate the reporting periods. The reports, with their numbers and stories, must be received on time by the stakeholders in order to ensure timely adjustments (if needed) or decisions.

evaluation frequency. Evaluation pertains to the external assessment undertaken to validate the accomplishments and stories written in the reports. Usually, evaluation is undertaken when the evaluating party is about to prepare its own plan.
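A minimal sketch (hypothetical structure, with illustrative entries drawn from the reports named in this manual) of how the implications above, data elements, templates, report format and reporting frequency, might be captured as a single reporting configuration.

```python
from typing import Dict, List

# Hypothetical reporting configuration: which report carries which data elements,
# on which template, how often, and to whom.
reporting_config: Dict[str, Dict[str, object]] = {
    "Division Monthly Report": {
        "data_elements": ["physical accomplishments vs DAP", "issues and opportunities"],
        "template": "division_monthly_form",   # the simpler the form, the better
        "frequency": "monthly",
        "recipients": ["SDS", "ASDS"],
    },
    "Division Report Card": {
        "data_elements": ["comparative school performance", "competency profiles"],
        "template": "division_report_card",
        "frequency": "annual",
        "recipients": ["SDS", "Regional Office"],
    },
}

def reports_due(frequency: str) -> List[str]:
    """List the reports that fall due for a given reporting period."""
    return [name for name, cfg in reporting_config.items() if cfg["frequency"] == frequency]

print(reports_due("monthly"))  # ['Division Monthly Report']
```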

As a rule of thumb, when there is a conflict between the requirements of the internal and external stakeholders, the requirements of the internal stakeholders must be met first. The decision-making requirements of the internal stakeholders (the school head and teachers) must be given priority in order to help them make immediate enhancements or remediation of the school interventions. This perspective will also ensure that M&E is more about managing and making decisions than about report writing.

10.4.4 Set Up the Monitoring Process

The next step in setting up the M&E System is to define the operating details of the system. This includes designing the monitoring process(es) that will operationalize the Division M&E System, as well as the data collection system, the reports and reporting process, and the feedback mechanism.

A Division Monitoring Process is a series of actions by the Division and Districts used to track, evaluate and analyze the performance of the schools and their target groups. It is a support process undertaken to assure the quality and relevance of Division programs and projects. These processes operationalize the data collection and reporting activities of the Division and


integrate these with the control and adjustment points of the system. Once in use, the Division M&E Process can supply the different information requirements of the Division

management, program units, support units and districts.

Define the M&E process. This includes defining the control points and events that

will be undertaken during the DEDP implementation.

Finalize the reporting requirements and disseminate them.

Formulate the M&E Terms of Reference. After detailing the requirements of the

M&E system, the next step is to define the roles and responsibilities of the school

head, teachers and staff concerning data collection, sharing of information, reporting assignments and the giving of feedback.
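As an illustration only (hypothetical names and tolerance), the sketch below shows the monthly portion of such a monitoring process: consolidate the units' reported accomplishments against plan and flag the deviations that should reach the next control and adjustment point.

```python
from typing import Dict, List

def consolidate_monthly_reports(unit_reports: Dict[str, Dict[str, float]],
                                tolerance: float = 0.10) -> List[str]:
    """Return the items whose accomplishment falls short of plan beyond the tolerance."""
    flagged = []
    for unit, figures in unit_reports.items():
        planned, actual = figures["planned"], figures["actual"]
        if planned and (planned - actual) / planned > tolerance:
            flagged.append(f"{unit}: {actual:.0f} of {planned:.0f} planned outputs delivered")
    return flagged

# Illustrative figures only.
issues = consolidate_monthly_reports({
    "District A": {"planned": 10, "actual": 9},
    "District B": {"planned": 12, "actual": 7},
})
print(issues)  # ['District B: 7 of 12 planned outputs delivered']
```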

10.4.5 Communicate the System

The last step is the conduct of a Kick-off Meeting to signal the operationalization of the Division M&E System. In football, a kick-off represents the start of the game, from which point the rules are enforced. It is important to conduct a Kick-off Meeting to inform the teachers, non-teaching staff and others that the Division M&E System is now operational.

Aside from marking the official start of the system, the Kick-off Meeting is also the venue for the Division and District to understand the system. Before enforcing the system, it is important to ensure that all staff:

(a) have a high awareness of the scope or coverage of the Division M&E System;

(b) can explain the context and rationale of the different M&E control and adjustment points;

(c) are aware of the support that can be provided by the stakeholders and the information they need to facilitate that support; and

(d) understand their roles and responsibilities in M&E.


Quality Management Inventory Model

1st draft (fn. Quality Management Inventory Model)


BACKGROUND

Technical Support To SBM

The Basic Education Sector Reform Agenda (BESRA) is a package of policy reforms that seeks to systematically improve the critical regulatory, institutional, structural, financial, cultural, physical and information conditions affecting basic education provision, access and delivery on the ground. BESRA is expected to create the critical changes necessary to further accelerate, broaden, deepen and sustain improved education efforts.

BESRA's implementation actions are focused on four main areas: (1) school-based management (SBM), to help schools better manage their operations for improved learning; (2) Competency-Based Teacher Standards, to enable more teachers to practice competency-based teaching; (3) the Quality Assurance and Accountability Framework, to provide better institutional support to learning and quality assurance; and (4) Outcomes-Focused Resource Mobilization, to ensure resources are focused on achieving desired outcomes.

The focal point of BESRA is the school, and the main integrating vehicle for BESRA implementation is SBM. Through SBM, schools are allowed to manage their own affairs to improve the delivery of education services in a sustained manner. SBM also includes strengthening school heads in resource mobilization, negotiation and partnerships with the community and stakeholders. Other assistance includes the provision of funds for priority school projects. Clearly, all major efforts, resources and funds are funneled into helping the schools manage basic education services more efficiently and effectively.

In this regard, the Region and Division play a very important role in supporting the schools. It is essential for the Region and Division to have the capability and the necessary systems and mechanisms in place to sustain their support to the schools' management of their own affairs. The Region and Division are in a strategic position to propagate, maintain the quality of, and sustain SBM interventions and results.

It is in this context that a quality inventory assessment tool was developed to ensure that Regions and Divisions are ready to assume the task of facilitating SBM. The assessment focuses on their readiness, which includes the presence of well-defined technical assistance processes and support mechanisms that will support the schools' delivery of basic education services to students.


QUALITY MANAGEMENT INVENTORY MODEL (QMIM)

The Quality Management Inventory Model is an integral part of the Quality Management

System of the Region. It is a mechanism to promote continuous improvements in the Region

and the Divisions. Its main goal is to "improve things and manage things better."

The QMIM depicts a road map that traces the Region's and Division's transformation from the use of informal processes to more established technical assistance packages and support mechanisms. It projects an organization's transition from the realm of uncertainty to more repeatable and predictable results. The Model represents a progression in the capability of the Region and Division to deliver management support and technical assistance packages to their target groups.

The QMIM is also a yardstick for assessing the performance of the Region and Division. It will be used to examine the Region's and Division's processes and support mechanisms that allow them to efficiently and effectively deliver technical assistance packages to schools, school managers, teachers and the schools' non-teaching staff.

The Model defines four levels: (1) Ad Hoc, (2) Defined, (3) Integrated and (4) Sustained. Level 1 is the entry level; it represents a Division characterized by ad hoc processes and informal ways of doing things. As it matures, the Division Office is expected to establish its internal procedures (Level 2, Defined). The Division then improves to a stage where it is expected to manage and integrate its different mechanisms into one integrated system (Level 3, Integrated). The highest level is the Sustained level, which represents a Division that adapts, maximizes and continuously improves its way of doing things.
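A minimal sketch, illustrative only, of the four readiness levels and a simple placement rule based on the characteristics described in this section; the function name and its inputs are hypothetical.

```python
from enum import IntEnum

class ReadinessLevel(IntEnum):
    AD_HOC = 1      # informal, inconsistent ways of doing things
    DEFINED = 2     # processes defined and documented, application not yet consistent
    INTEGRATED = 3  # mechanisms managed as one integrated system
    SUSTAINED = 4   # practices adapted, maximized and continuously improved

def place_division(has_defined_processes: bool,
                   applies_them_consistently: bool,
                   systems_are_integrated: bool,
                   continuously_improves: bool) -> ReadinessLevel:
    """Assign the highest level whose conditions are all met."""
    if not has_defined_processes:
        return ReadinessLevel.AD_HOC
    if not applies_them_consistently or not systems_are_integrated:
        return ReadinessLevel.DEFINED
    if not continuously_improves:
        return ReadinessLevel.INTEGRATED
    return ReadinessLevel.SUSTAINED

# Example: processes are defined but not yet applied consistently.
print(place_division(True, False, False, False).name)  # DEFINED
```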

The following discussion describes the four levels in detail:

Readiness Level 1. Ad Hoc

The initial or entry level of readiness. A Readiness Level 1 Region/Division is often characterized by temporary and informal ways of doing things. Organizational procedures or methods are not well defined or disseminated, leading to inconsistent results and poor quality of service. Its technical assistance packages are reactive, inefficient and not relevant to the requirements of its target groups. Often these packages are hand-me-down practices whose utility and effectiveness have not been proven, yet they are used year in and year out. Some may


yield positive outcomes and some may offer temporary solutions.

A Level 1 Region/Division is characterized by an absence of standards and defined processes for planning, implementation and evaluation. In cases where there are defined procedures, these are forgotten when the procedure "owner" leaves the organization or is replaced by another staff member. There is no continuity or standard way of doing things in this type of organization.

The common features of a Level 1 Region/Division are the following:

There is no clear and/or defined way of doing things;

If there is a defined process, it does not respond to the challenges and support requirements of its target clientele;

If there is a defined process, its implementation or application is not consistent;

Dependence on one or two individuals. These "effective" individuals are often hailed as heroes because they are able to move the organization to achieve results; however, when these champions leave, the organization suffers a setback; and,

Some positive results are achieved but are not sustained and/or maintained.

Region/Divisions belonging to this category may achieve good results (e.g. NAT results); however, these are not maintained, leading to poor results in the succeeding years.

If a Level 1 Region/Division aims to efficiently do things, then it must undertake the following

steps:

1. Build the capability of staff in the following areas: planning, management, supervision and control. The staff should know the fundamental principles involved in these management areas and should have a very good handle on management tools and techniques;

2. Establish or set up its own mechanisms or procedures:

• Collect and collate its practices and experiences and formulate its own set of procedures;

• Adapt other Divisions' experiences and/or approaches that have already been proven and tested;

3. Transform these mechanisms/procedures into a Region/Division policy;

4. Communicate these to Region/Division staff and build a critical mass of individuals who will guide, guard and champion the newly established process.

The transformation of a Region/Division from Level 1 to Level 2 is critical in the maturity process


of the Division. This stage is often the most frustrating part of the change management process. The introduction of new methods or practices is often met with suspicion and cynicism. Aside from the suggested steps outlined above, Region/Division management should undergo sessions on how to implement and manage change.

Readiness Level 2. Defined

It is critical first to establish the support mechanisms that will be used as the platform for

delivering the Region/Division's technical assistance packages. These support mechanisms

ensure an efficient delivery of basic education strategies and services. Region/Division

organizational processes must be defined, refined and communicated in order to guarantee

consistency in the quality of its technical assistance.

At this level, Region/Divisions start to refine and define their technical assistance packages.

These packages are detailed into specific steps, refined, standardized and documented.

Region/Division staff are oriented, trained and expected to perform these processes.

Region/Division Readiness Level 2 is about institutionalizing an efficient process or procedure

that leads to effective results.

Level 2 Region/Divisions exhibit a significant improvement from ad hoc, temporary ways of doing things to having clearly defined and concrete processes and practices. At this level, there is an effort to implement interventions as efficiently as possible by following a structured approach. There is a high awareness of commonly established management tools, techniques and procedures, but there is still a tendency to revert to ad hoc or traditional practices when confronted with a difficult situation. There is a defined process, but its application is not consistent. An example is the tendency to cut a procedure short due to time or cost constraints, thereby sacrificing quality.

Region/Divisions belonging to Readiness Level 2 may have the following characteristics:

• They have defined, formulated and established their TAP processes and support systems;

• There is a staff development program that supports the application of the defined system;

• Application of these systems or procedures is not consistent; on a case-to-case basis, defined systems and processes are ignored or not followed; and,

• Although systems are in place, these may not "talk" to one another, or there may be duplication of effort.


Unlike Readiness Level 1, Region/Divisions belonging to this category are less dependent on individuals (one or two staff) but need a strong-willed management that will enforce the quality standards, the defined procedures and the agreed programs and projects.

In order to achieve a higher readiness rating, Level 2 Region/Divisions should undertake the

following:

1. Increase staff awareness of Region/Division standards and procedures;

2. Document and disseminate standards and procedures;

3. Ensure that a critical number of Region/Division staff are knowledgeable about the procedures;

4. Keep updated on the demands (needs and opportunities) of schools and stakeholders; and,

5. Improve Region/Division management capability to enforce adherence to its own processes.

The transformation from Level 2 to Level 3 is driven by the organization's motivation to achieve more following its initial success. The satisfaction generated by a formal and coordinated process is boosted by a desire to do things more permanently and consistently.

Readiness Level 3. Integrated

Region/Divisions belonging to Readiness Level 3 demonstrate a more mature and more consistent way of doing things. In this category, Region/Divisions are able to collate, document and transform their effective practices into an integrated, well-choreographed process. There is high compliance with their own standards and processes, such that all Region/Division units and/or individuals know what to do and understand the coordination, cooperation and collaboration requirements expected of them.

A Region/Division with Level 3 Readiness demonstrates maturity in balancing the competing requirements of quality, time, cost and scope. It is able to deliver its technical assistance to schools efficiently, completing it on approved schedules and within budget. It achieves high-quality outcomes as a result of its adherence to its own standards and its defined processes.

Readiness Level 3 Region/Divisions manifest the following:

• On Quality Management. Region/Division standards are well established and are used as the basis for monitoring and evaluation;

• On Scope Management. The Region/Division is able to perform all activities in the DEDP and deliver the promised outputs; there is minimal deviation from approved plans;

• On Time Management. Tasks and activities are completed according to schedule;

• On Integration Management. Effective practices are integrated into a single common process;

• On Configuration Management. There is consistency between the staff development program and system requirements; there is horizontal integration of Region/Division mechanisms (planning, monitoring and evaluation, procurement, finance, etc.);

• On Organizational Arrangement. There is a high level of interaction and interoperability between and among units;

• On Management of Results. The Region/Division can predict and control the results of interventions (monitoring and evaluation).

This level is not affected by changes in management and/or personnel. The processes, systems and practices provide stability to the operations of the organization.

To proceed to the next level, the following actions are suggested:

1. Continuing staff development program;

2. Continuing review of its processes, procedures and systems;

3. A quality assurance mechanism that detects the positive and negative elements of

the processes;

4. Upgrading its own standards.

Table 1. Inventory Levels and Characteristics

Level 4. Sustained
Characteristics: Continuous improvement of technical assistance packages; interventions and processes are enhanced based on needs and opportunities
Technical Assistance Package: Dynamic and improving processes
Actions: High interaction with clientèle; always finding better ways of doing things

Level 3. Integrated
Characteristics: Demonstrates maturity in balancing the competing requirements of quality, scope, time and cost
Technical Assistance Package: Consistent application of processes
Actions: Continuous capability building and improvement of processes

Level 2. Defined
Characteristics: Installation of well-defined processes to guide service delivery, but inconsistent enforcement and/or application of these processes
Technical Assistance Package: Institutionalizing but still inconsistent in application
Actions: Standardize and enforce the defined processes

Level 1. Ad Hoc
Characteristics: Temporary way of doing things; processes may vary depending on situation and personalities
Technical Assistance Package: No clear way of doing things; personality dependent
Actions: Define the processes; build staff capability

Readiness Level 4. Sustained

The Region/Divisions are expected to facilitate the implementation of school-based management (SBM) at the school level. They will play a critical role in preparing schools to achieve the desired level of maturity under a decentralized management setup. Given the enormity of these tasks, the Region/Division must be a growing organization in order to be effective. It should be able to respond, adapt and adjust to the unique and changing requirements of schools. This means its preoccupation is not with the packages and processes it has established, but with adjusting and improving these to suit the challenges and requirements of its school clientèle.

The Region/Division's maturity at this level hinges on its commitment to excellence. It must have the ability to pursue continuous improvement, always optimizing the gains or outcomes of its undertakings. A Readiness Level 4 Region/Division should therefore have the following traits:

• Defined processes are regularly updated in accordance with the strengths, weaknesses, opportunities and threats faced by the schools;

• Defined processes are improved and kept in sync with agency policies and directions.

This level adheres to the principle of continuous improvement, always optimizing gains or results.

In order to maintain this level, the following efforts are suggested:

1. Regularly conduct research and evaluation studies;

2. Commit to the monitoring, evaluation and adjustment process;

3. Document lessons learned and other experiences, and feed these into the improvement and/or enhancement of Region/Division products and processes; and,

4. Learn from other Region/Divisions' experiences.

People inevitably leave an organization. Individuals who played a big role in the continuous improvement and growth of the organization will eventually retire, resign or be transferred to other stations. It is imperative that institutional memory is maintained and


handed over to the next generation of Region/Division staff, who will continue the culture of excellence established in the Region/Division. In this regard, a good knowledge management program must be in place to sustain Level 4 Region/Divisions.


Technical Assistance Package

(TAP)

The focus of the QMIM is the ability of the Region/Division to efficiently deliver its technical

assistance packages to its target group. A Technical Assistance Package is a set of activities,

developed and defined by the Region/Division into a process, designed to solve a particular

issue and/or to achieve desired education objectives or outcomes.

The QMIM classifies these packages into three types:

1. School-Based Management Packages

These packages constitute the core process areas that directly impact learning outcomes. These include school-based management, the learning management program, instructional supervision, learning materials and learning environments. Capability building assistance for school heads and teachers to improve instruction is also part of this package.

2. Management Mechanisms

These govern the operations of the Region/Division as a system. Management mechanisms provide the backbone for the implementation of Division TAPs, as they provide the integration processes required to conduct and manage operations efficiently. Management mechanisms include planning, appraisal, the staff development program, and monitoring and evaluation.

3. Support Mechanisms

These are necessary support processes that are vital to the efficient operations of the

organization. Support mechanisms refer to processes pertaining to finance, human

resource management, administration and procurement.

The management readiness of the Region/Division will be assessed using these TAPs. Essentially, the core review areas include: (1) technical assistance on SBM and instructional support to schools, (2) management processes that integrate the Region/Division operation, and (3) support processes that enable the Region/Division to deliver technical assistance as efficiently as possible. The expectation is that the Region/Divisions are in a position (i.e., with well-defined and tested procedures and mechanisms) to provide consistent, relevant and timely assistance to divisions/schools.


The efficiency and effectiveness of a Region/Division depend on how it delivers its technical assistance support to schools. Ideally, the TAPs are products of the Division's effort to continuously improve its capability to provide support to schools. The type of support that a division/school will receive depends on the maturity of the Region/Division in delivering its technical assistance packages consistently, efficiently and effectively.

Table 2. Core Review Areas - Quality Management Maturity Assessment

Review Area: Division's SBM Technical Assistance
Processes:
• Capability Development Program for School Heads and Teachers on SBM, Curriculum Implementation, and Teaching and Learning
• Educational Planning Process
• Instructional Consultancy
• Instructional Materials Development
• School Building Program
• Managing Resources and Networking

Review Area: Division Management Mechanisms
Processes:
• Division M&E System, which includes processes on SIP Appraisal, Start-Up Review, Annual Implementation Review and SIP Outcome Evaluation
• Division Education Planning Process

Review Area: Division Administrative Support Mechanisms
Processes:
• Human Resource Management (including recruitment, selection, performance appraisal and promotion)
• Procurement Process
• Finance and Budgeting


CRITICAL ASSUMPTIONS

Listed below are the critical assumptions used to define the scope and boundaries of the Quality Management Inventory Model. They also provide the context, focus and locus of the assessment.

1. Management readiness of the Region/Division should impact quality education. Its main objective is to strengthen and improve the Region/Division's delivery of programs and projects that will positively impact learners.

2. The context of the QMIM hinges on the ability of the Region to prepare and strengthen the Divisions in facilitating and sustaining support to schools on SBM. On the part of the Division, it is anchored on its ability to provide the necessary technical support for schools to effectively implement SBM. As such, the assessment will focus on the capability of the Region/Division to deliver technical assistance packages that will facilitate and sustain the schools' delivery of basic education services.

3. Management readiness is not intended to appraise the performance of the Regional

Director, Assistant Director, Schools Division Superintendent, Assistant Schools Division

Superintendent and their staff. Instead, the focus and locus of the inventory assessment

is the Region/Division as a whole. The current practices of the Region and the Divisions

will be assessed. Specifically, the focus of the assessment is on how the Region/Division

delivers/operates the technical assistance packages and support mechanisms.

4. The highest management readiness or maturity level of a Region/Division should manifest the following characteristics:

• Repeatable and predictable education results. A ready or mature Region/Division is able to produce outcomes as a result of its efforts rather than by chance. For example, a Division that achieves high NAT results in 2005 but poor NAT results in 2006 shows unpredictable performance; a mature Division is able to predict and repeat its results;


• Holistic and integrative. Technical support is not provided piecemeal; it always considers the implications and effects of a given support on other areas and takes into account both the management and instructional support requirements of front liners;

• Relevant. Its service delivery mechanisms are custom-fitted to meet the unique and changing demands or needs of schools;

• Proactive. It is always a step ahead, readily responding to the changing demands of school stakeholders;

• Timely. Delivery of basic education services is on time, based on plans and programs;

• Consistent, repeatable and simple. The Division's service delivery effort is consistent, can be repeated and is simple;

• The best technical support. It provides the most efficient and effective service delivery mechanism to schools and school stakeholders; and,

• Continuous improvement. The Division has a culture of excellence, always challenging itself to provide better and better service.

5. The inventory model ranges from the least mature stage to the most mature state. The model assumes that organizational readiness must be grown over time in order to produce repeatable success.


CRITICAL MANAGEMENT APPROACHES

One of the critical assumptions used in the Quality Management Inventory Model is

continuous improvement. Continuous improvement is based on the premise that change will

occur and will always challenge the status quo. This may be brought about by new needs and

issues, new policies and thrusts, new standards and challenges. These factors will force the

Region/Division to continually innovate, change and improve itself in order to be efficient and

effective.

In order to ensure continuous improvements, the Region/Division must adopt certain basic

management approaches. Adherence to the basic principles and techniques of these

management strategies will help sustain and maintain a high readiness level of the

Region/Division.

In order to facilitate the maturation of the Region/Division from Level 1 to Level 4, the following management approaches are essential inputs to the Region/Division staff development program: change management, project management and knowledge management.

Change Management

Change management is a structured approach to changing individuals, teams and organizations that enables them to move from the current state to a desired future state. The maturation of a Region/Division from Level 1 to Level 4 depends on how changes are introduced and applied. In this regard, Region/Division readiness programs should incorporate capability building, especially for management staff, on organizational change management. Management staff should be equipped with techniques for effectively planning, implementing and managing change for Division personnel.

A well-planned change management program is very important in the transition from

Level 1 to 2 readiness.

Knowledge Management

Knowledge management (KM) is an important ingredient of continuous improvement. KM pertains to the practices an organization uses to distribute, transfer and propagate knowledge within the organization. KM is an important input to sustaining organizational efficiency.


Project Management

Project management (PM) is the application of tools and techniques to activities in order to achieve objectives or targeted results. PM offers robust planning, implementation and control techniques that can help the Division deliver its services more efficiently. PM techniques will serve as valuable tools in standardizing organizational processes and integrating these into a more coherent and optimal process. Knowledge and skills in PM are very important for Readiness Levels 2 and 3.

The management approaches identified above should not be taken in isolation; a deliberate effort to integrate the three should be undertaken. The change management approach provides strategies for softening resistance and facilitating change when new processes are introduced. Knowledge management, on the other hand, provides the inputs and perspectives on how to manage, share, propagate and sustain these processes. Lastly, project management techniques lend themselves well to managing change and managing knowledge.


Assessment Areas

Region

The scope of the QMIM assessment for the Region includes the following technical assistance

packages to Divisions:

Technical Assistance to Divisions
• Capability Building Program for Division Management and Staff
• Learning Materials Support

Region's Management Mechanism
• Strategic Planning
• DEDP Appraisal
• Outcome Evaluation
• Progress Monitoring and Evaluation
• Assessment (learners' achievement)
• Policy Research

Administrative Support Mechanism of the Region
• Human Resource Management
• Procurement Process
• Finance and Management Support

Division

The Quality Management Inventory Model covers assessment of the Divisions' demonstration of

the following technical assistance packages:

School-Based Management Assistance to Schools
• Drop Out Reduction Program
• Capability Building Program for School Heads on SBM
• Capability Building Program for Teachers
• Learning Materials Support to Schools
• School Building Program

Division Management Mechanism
• Strategic Planning
• SIP Appraisal
• Monitoring and Evaluation
• Information Support
• INSET for Non-Teaching Staff
• Performance Evaluation

Division Administrative Support Mechanism
• Human Resource Management
• Procurement Process
• Finance and Management Support


QUALITY MANAGEMENT INVENTORY MODEL (QMIM) ASSESSMENT

Objectives

The aim of the QMIM assessment is to determine the capability of the Region/Division to implement timely, consistent and relevant technical assistance packages for its school clientèle. The assessment focuses on the ability of the Division to consistently deliver quality technical assistance packages through processes designed to improve efficiency and assure effectiveness of service.

Specifically, the assessment aims to accomplish the following:

1. Document processes and mechanisms within the Region/Division. The assessment is designed to identify effective practices within a Division Office that may be shared and replicated in other areas;

2. Determine and pinpoint areas where a Region/Division is considered mature and where it needs improvement. The results will also provide valuable insights and perspectives on which processes work and which do not;

3. Determine the support requirements of the Region/Division. These may include human resource complement, staff development, processes and systems; and,

4. Provide input to the DepED Regional Office. This assessment can be one of the strategies for the Regional Office to implement its quality assurance and accountability work. The results can be used by the Regional Office as input in designing its technical assistance support to the Division Office.

On the part of the Division, the readiness model may be used as a template by which it can

assess its own operations and determine its capability building requirements. For both the

Regional Office and Division Office, the results of the assessment provide important inputs

toward a more efficient and effective delivery of basic education services.

Methodologies

The QMIM Assessment will make use of different methods of data gathering. These include:

1. Key informant interview. A one-on-one session with the process owner or the individual/staff member directly accountable for the implementation of the technical assistance packages.

2. Interview. A session with staff who have direct knowledge of the process areas or who were involved in the process. Interviewees may also include school heads and teachers who were recipients of the service/s provided by the Division.

3. Focus group discussion. A session with selected Division staff and/or school heads and teachers discussing the practices and/or processes implemented by the Division. The FGD will help validate the claims of the key informants or process owners.

4. Artifacts review. The gathering of documents that prove the existence of a Division process or practice. This also includes inspection of the documents.

5. Observation. When possible, the Assessment Team may observe the actual process being undertaken by Division staff.

Process Owner

The Region will be the Process Owner of the QMIM assessment. The Region will create Quality

Management Team/s that will be tasked to undertake the assessment. As process owner, the

Region must ensure the following:

1. the integrity of the process. It must be undertaken as objectively as possible and used solely for continuous process improvement;

2. the usefulness of the process, especially the findings and results. These should find their way into the Regional Education Development Plan and be used as design considerations for Region programs and projects;

3. the capability building of QMT members, to assure that they are ready and able to serve as assessors; and,

4. the continuous improvement of the assessment process. This means improvement in the assessment tools, methods and management arrangements.


ASSESSMENT TOOL

The QMIM Assessment Tool is developed to facilitate the conduct of the QMIM assessment of

the Region/Division in implementing key process areas to deliver its technical assistance

packages (TAPs). The Assessment Tool contains the key process areas of the Region and

Division and the various scenarios for the four maturity levels (Ad Hoc, Defined, Integrated and Sustained).

The Assessment Tool is not a checklist but will serve as a guide for the QMTs to objectively

document the application or utilization of existing key process areas of the Region/Division. The

tool will also be used to facilitate the documentation of best or effective practices in the

Region and Divisions.


MANAGING THE QMIM ASSESSMENT

The process owner of the QMIM assessment process is the Region. The following will serve as a guide for the QMTs to be deputized by the Region to implement the QMIM assessment. The

steps listed below are the minimum requirements in conducting an efficient assessment.

Depending on the requirements of the Region, additional activities and requirements may be

added.

A. Start Up - "Chance favors the prepared mind"

The following are suggested start up activities for the Region to implement before undertaking

the Division QMIM assessment.

Step 1. Review

1. Review the concepts and principles of quality management, process improvement and the inventory levels. Make sure each assessor understands these concepts in order to ensure a shared paradigm among team members/assessors.

2. Ask each team member to familiarize himself or herself with the content of the Assessment Tool. This will minimize dependence on the Assessment Tool and allow the assessors to conduct the interviews as naturally as possible, without the interference of having to consult the tool from time to time.

3. Make sure that all team members understand how to use the Assessment Tool. This includes understanding the continuum (levels of maturity) and the documentation requirements.

4. On rapid appraisal, remind each team member that the assessment approach being utilized is the "not so quick and not so dirty" approach. Review the principles and methods of rapid appraisal.

5. Conduct a group review of the key process areas or management processes that must be present in a Division. If possible, the Team should review and familiarize themselves with the objectives, strategies and content of the Division Education Development Plan.

Step 2. Assessment Team

1. Form assessment teams based on the targeted number of Divisions to be assessed


and the target date of completion. Ensure the teams will be able to cover all divisions

based on the time allotment.

2. Assign a Team Leader for each team. The Team Leader shall:

• ensure access to documents and materials the team may need for the assessment;

• ensure the team has enough copies of the Assessment Tool and other related materials;

• coordinate Division visit schedules;

• orient the Schools Division Superintendent/interviewees on the purpose of the assessment;

• conduct an exit conference (after the assessment); and

• consolidate the reports of team members.

3. The number of team members per team should be enough to cover all the items in

the Assessment Tool and to ensure the documentation requirements are met.

Suggested minimum number of members is 5.

4. Assign team members who are very knowledgeable about the management

processes. Form a multi-disciplinary team to ensure coverage of the key process areas.

Step 3. Work Plan

1. Prepare a work plan detailing the activities to be undertaken by the Team. The work

plan should also include the schedules, logistics and financial requirements needed to

undertake the assessment.

2. Orient all the Assessment Teams and individual team members about the schedule of

the QMIM assessments and the important milestones in the plan such as the deadline

for the preparation of reports.

B. Implementation - "The Assessor is the Instrument"

Step 4. Preparing for the Division Visit

1. Send a formal communication to the SDS regarding your visit. Indicate in your

communication the purpose of your visit, the people you want to talk to and the

documents that must be made available during your visit. Also specify if you need to


interview school heads and teachers so that arrangements can be made by the

Division before the actual review.

2. Coordinate with the Division about your travel arrangements, logistical requirements

and other administrative prerequisites to ensure the smooth conduct of the assessment.

3. Make sure every team member has a copy of the Assessment Tool and other necessary forms. Have these reproduced before going to the Division.

4. Conduct a final team meeting before going to the Division. Apprise the team members of their roles and responsibilities and the scope of the evaluation.

Step 5. During the Division Visit

1. Start your visit with a courtesy call. Discuss the purpose of your visit, your plan for the day, the people you need at particular times and the documents you need to review. If the Division opts to conduct an opening program or ceremony, try to limit this to 30 minutes.

2. Assure the Division management and staff that the QMIM assessment is not meant to evaluate the performance of the Division but to ascertain its level of readiness to perform critical processes or technical assistance packages.

3. Whether you use an interview, group interview or focus group discussion, tell the respondent/s about the purpose of the activity. Inform them that you will be taking notes in the course of the discussion.

4. Don't forget to ask for evidence (MOVs) of the claims that the respondent/s made, and thank respondents for their participation as soon as you have finished your data gathering with them.

5. As soon as you have completed your data gathering activity, organize your data and meet as a team to prepare for the exit conference.

6. Conduct an exit conference with the SDS. Point out significant observations regarding the Division's level of readiness, but avoid making judgments or conclusions right away. Inform the SDS that the Team will discuss the observations, analyze the findings as a Team and make recommendations.

7. Thank the SDS and tell him/her that a complete report will be sent to him/her officially by the Regional Office.

Step 6. Post QMIM Assessment


1. Encode and consolidate the observations and documentation of the Team using the

same format. Provide each member a copy of the documentation.

2. Convene the Team to discuss and analyze the results.

3. During the Team meeting, come up with a consensus regarding the Level of Maturity

of the Division per process area. Prepare a report on the assessment.

4. Given the findings and analysis, the Team is to formulate recommendations and

suggestions to the Region on how to assist the Division in improving and/or reinforcing its

practices and/or delivery of technical assistance packages.

C. Completion - "Connect the dots"

The purpose of the QMIM Assessment is to facilitate continuous improvements in the

operations of the Division concerning its practices and its current way of doing things. The

reports, which include the findings, insights and lessons learned, should find their way into the Region and Division plans so that appropriate actions can be taken in the future. Thus, the

Completion Phase of this assessment should not only include report preparation but also

should lead towards plan enhancement.

Step 7. Region Report

1. Submit and discuss the findings, analysis and recommendations with the Regional Director.

The report shall include:

• the procedure which the Assessment Teams adopted to gather the data. The discussion should cover all the activities done before the assessment (preparations made), during the assessment (interviews and generation of MOVs) and after the assessment (collation of data, manipulation of data, how the data were analyzed, etc.);

• the results of the QMIM assessment of the Division. This section will present the findings on the various items studied and indicate the maturity level of the Division;

• the categorization of the Division vis-à-vis its level of process maturity; and,

• the actions needed in order to fast-track the progression or development of the Division's practices and processes to the next level.

2. Upon approval, provide feedback to the Division, especially on the recommendations


and suggestions of the Region.

Step 8. Region Next Steps

1. The findings of the QMIM Assessment should find their way into the Region's technical assistance packages for the Division. The results of the Assessment will be used as input by the Region to define and formulate programs and projects for the Division.

2. The findings concerning Level 3 (Integrated) or Level 4 (Sustained) should find their way into the documentation of the best/effective practices of the Division.

3. The QMIM results can also be used as input to the Outcome Evaluation to be undertaken by the Region. They can also be used to amend, enhance and/or formulate new standards and policies.

4. The Division may want to implement the assessment in the other 70% of schools not covered by the initial assessment.
