
The Division Monitoring & Evaluation System

1. Introduction
2. Objectives
3. Scope
4. Performance Measures
5. School Monitoring Process
6. Control and Adjustment Point
7. M&E Tools and Techniques
8. Documents and Reports
9. Terms of Reference
10. Setting Up the School M&E System
Division Quality Management Inventory Model (QMIM)

THE FISHERMEN
Author unknown, source unknown

There was a group called 'The Fisherman's Fellowship.' They were
surrounded by streams and lakes full of hungry fish. They met regularly to
discuss the call to fish, and the thrill of catching fish. They got excited about
fishing!!
So a committee was formed to send out fishermen. As prospective fishing
places outnumbered fishermen, the committee needed to determine
priorities.
A priority list of fishing places was posted on bulletin boards in all of the
fellowship halls. But still, no one was fishing. A survey was launched to find
out why. Most did not answer the survey, but from those that did, it was
discovered that some felt called to study fish, a few to furnish fishing
equipment, and several to go around encouraging the fishermen.
What with meetings, conferences, and seminars, they simply didn't have
time to fish.
Now, Jake was a newcomer to the Fisherman's Fellowship. After one stirring
meeting of the Fellowship, Jake went fishing. He tried a few things, got the
hang of it, and caught a choice fish. At the next meeting, he told his story,
and he was honored for his catch, and then scheduled to speak at all the
Fellowship chapters and tell how he did it. Now, because of all the
speaking invitations and his election to the board of directors of the
Fisherman's Fellowship, Jake no longer had time to go fishing.
But soon he began to feel restless and empty. He longed to feel the tug on
the line once again. So he cut the speaking, he resigned from the board,
and he said to a friend, "Let's go fishing." They did, just the two of them, and
they caught fish.
The members of the Fisherman's Fellowship were many, the fish were
plentiful, but the fishers were few.

1.0 INTRODUCTION

1.1 Purpose of the Manual


Monitoring and evaluation (M&E) is acknowledged to be one of the most important systems in an
organization. It is also one of the most desired. In conferences, seminars and workshops,
participants and organizers talk about the importance of M&E, how it can facilitate
an efficient operation and how organizations can benefit from it. People get
excited about the importance of M&E, and this often results in many bright ideas and
plans about how to implement it. However, the perceived importance of M&E does
not always translate to actually doing and implementing M&E. Often, critical elements of
the M&E system are missing. As a result, activities and events are undertaken in the name of
M&E, and yet most fail to provide the information needed in making good
decisions. Data gathering and report writing are often mistaken for M&E itself.

The story of The Fishermen draws an important parallel to the practice of M&E in
organizations. Too many activities and events are undertaken in the name of M&E. Forms
and data gathering instruments are developed, but they are often incoherent. Costly
infrastructure and facilities are set up, but their usage is far from maximized. And
despite all these efforts, the most basic information requirements are still missing.

It is ironic that one of the most important systems is also one of the most neglected
systems in organizations. Often, there are too many fishermen's fellowships and yet the
fishers are few. The main purpose of this Manual is to serve as a guide to would-be
monitors and evaluators on how to operationalize an M&E System. This document illustrates
the fundamental requirements and techniques of implementing M&E at the Division level.

The schools (fish) are plentiful. There is an urgent need to set up an efficient M&E system
to enable the monitors to actually fish.

1.2 Understanding M&E


The capability to get things done in an efficient way is dependent on the organization's
ability to gather data, analyze data, and provide feedback to improve the way things are
done. Receiving the right information at the right time is critical to an efficient operation.
Information must be correct in order for decision-makers to set directions, and information
must be accurate in order for field implementers to act decisively in making the necessary
adjustments to improve things. Among the systems in the organization, it is the M&E system
that provides such information. It is one of the essential systems that should be in place in an
organization, a program or a project. Its importance necessitates that it be done
in a very systematic manner.
M&E is more than an activity, more than collecting data and more than reporting. It is
purposive, deliberate and systematically undertaken. In order to understand M&E, it is
important to understand the following areas: (1) planning, (2) decision-making, and (3)
continuous improvement.
1. Planning. One of the major objectives of M&E is to determine if the implementation
is going according to plan. Without a plan, there is nothing to monitor and
evaluate.

The Plan provides the scope of monitoring and evaluation. It provides the
destination (outcomes), the directions (strategies) and the means (resources) to get
to the destination. It is important to ensure that the plan is accurate, correct and
clearly written, especially the targets and indicators.

The Plan defines the areas to be monitored and evaluated.

2. Decision-making. During implementation, things may not go according to plan.
Every manager must make correct and timely decisions before things
get out of control. The objective of every manager is to ensure that, despite
changes in the frame conditions and changes in the plan, the outcomes can still be
achieved. Necessary adjustments have to be made in the strategies and activities,
and in the use of resources, in order to keep implementation on track: according to
schedule, target, time and quality.

The quality of decisions is dependent on the timeliness and completeness of
information that a decision-maker has at hand. In setting up the M&E system, one of
the key considerations is knowing the information requirements of key internal
stakeholders: the manager, supervisor and field personnel.

The M&E function is core to decision-making.

3. Continuous improvement. This is a management process where delivery processes are
constantly evaluated and improved in the light of efficiency, effectiveness and
flexibility (Wikipedia).

1.3 What is M&E?


Monitoring and evaluation is a means to support the continuous learning processes of an
organization. It is an essential component of any reform agenda, programs, projects and
interventions. The learning and insights generated by doing monitoring and evaluation are
used to promote continuous improvement in the work place, especially in the practices
and processes of an organization. The lack of it often results in poor performance,
inefficient implementation and program failure. The presence of an M&E System, therefore, is
important to organizations.

1.3.1 Definition
Monitoring and Evaluation (M&E) is defined as the systematic process of gathering,
processing, analyzing, interpreting, and storing data and information thereby setting into
motion a series of managerial actions for the purpose of ascertaining the realization of
set objectives.
M&E is composed of three interrelated processes. These are:

Monitoring refers to the systematic observation and documentation of actual
accomplishments, as well as the tracking of issues, opportunities and problems that may
affect implementation.

Evaluation concerns the assessment of information (collected through monitoring)
regarding the extent to which actual accomplishments conform to or deviate from
the objectives set in the plan.

Adjustment means steering the implementation. This means using the information
and insights derived from evaluation and adjusting the strategies or way of doing things
to make implementation more efficient and lead towards the realization of objectives.
The main purpose of M&E is to spur managerial actions based on information and insights
collected, processed, analyzed and interpreted by the monitors and evaluators. These
managerial actions are undertaken to improve performance during implementation and to
increase the likelihood of achieving the desired outcomes.

1.3.2 M&E and Decision-Making

M&E is closely linked to decision-making. Decision-making is the process of identifying and


selecting alternatives, directions and/or solutions. Often, decisions are based on the
accountabilities of an individual, on the vision, mission and values of an organization and
on the goals and desired outcomes of the plan. However, the quality, relevance and
timeliness of decisions depend greatly on the quality and availability of data and
information. Supplying sound data and information will allow individuals to meet their
accountabilities, stay true to their organizational values and most importantly, significantly
contribute to the achievement of set goals and outcomes.
The M&E function is primarily set up to support the decision-making requirements of
managers and staff. This is the first and most important design requirement of any M&E. The
strategies, data collection and analytical techniques and timing of M&E should be tied-up
with the decisions and the timing of the decisions.

1.3.3 Elements of M&E

There are five major elements of M&E. These are:

Scope. All M&E efforts should have a scope. The scope provides the standards and
parameters for evaluating the performance of programs, projects and individuals.
The coverage of M&E is defined by the approved or accepted plan.
Without a plan, there is no scope for the M&E. Specifically, the scope will define the
following M&E concerns:

outcomes to be achieved

outputs to be delivered

activities to be undertaken

resource requirements, including human, material and equipment

schedule or timing of implementation

budget or cost

Plan versus Actual Performance. M&E is about tracking performance by
comparing the approved plan (scope) versus the actual performance. The M&E
supplies the following information:

Outcomes. Changes in the performance and/or practices of target
groups and benefits received as a result of the interventions

Quality. Assessment of the quality of outputs delivered versus the standards

Scope/Quantity. Comparison of target outputs versus actual outputs delivered

Time. Target schedule/duration versus actual date/duration of work undertaken

Cost. Budget versus actual expenditure

Means of verification (MoV). One of the main features of any M&E system is the
means of verification. MoVs are authoritative sources of information about the
achievement of outputs and the actual performance. The role of M&E is to provide
relevant, timely, and accurate information about the achievements and status of
implementation. MoVs include:

Status and/or Accomplishment Reports

Documentation of effective practices

Testing

Observation and Inspection

Key Informant Interviews

Focus Group Discussion

Managerial actions. One of the objectives of M&E is to supply information about
performance and about the status or possible occurrence of external factors which may
affect implementation. The role of M&E is to provide the venue for making corrective
actions. Possible actions include:


Adjustments in the activities and strategies

Reallocation of resources

Rescheduling

Provision of more resources

Adjustment in the scope or inclusion of new strategies

No actions at all

Termination/replacement of staff

New design, new strategies and new plans

External factors. M&E also keeps track of the possible occurrence of external
factors that may affect actual performance. These are conditions beyond the
control of the implementers, such as:

changes in the frame conditions under which the plan was prepared

the presence or absence of stakeholder support

other events beyond the control of the Division and the schools

1.4 Types of M&E

The concerns of M&E range from tracking efficiency to evaluating effectiveness.
There are different types of M&E with different focus and usage. There are four
common types, summarized in Table 1-1.

Table 1-1 Types of M&E

Goal: measure impact to the goal (Impact Evaluation)
Outcomes: measure benefits to target groups after all interventions are completed (Results Monitoring & Evaluation)
Intermediate Objectives: measure effects of the intervention during implementation, i.e., improvement in performance, behavior and practices (Initial Gains Evaluation)
Outputs, Activities, Inputs: measure efficiency of implementation, focused on scope, quantity, quality, time and cost (Progress Monitoring & Evaluation)

1.4.1 Progress Monitoring and Evaluation

Progress Monitoring and Evaluation is a systematic and objective assessment of an
on-going implementation of a plan or project. Its aim is to steer implementation as
efficiently as possible based on empirical facts determined through a systematic
observation and documentation process and through a verifiable assessment process.
Specifically, Progress M&E measures physical progress against plans and work schedules,
and financial progress against cash flow and budget allocations. It is a mechanism
established to assess the quality of outputs delivered, to provide early warning signs of
implementation problems and to identify external factors affecting the delivery of outputs.

Progress Monitoring and Evaluation is undertaken during the implementation stage and is
an integral part of the plan-design-act-control cycle.

1.4.2 Initial Gains Evaluation

Initial Gains Evaluation keeps track of the changes or improvements in the performance
and/or practices of the target groups. Initial gains represent leading indicators, the
achievement of which will lead to the attainment of desired outcomes.

Evaluations of this type are conducted at the mid-term of implementation and before the
completion of the plan.

1.4.3 Results Monitoring and Evaluation

Results Monitoring and Evaluation is a type of post-implementation review (PIR) that


measures the realization of outcome-level objectives. It aims to assess the effectiveness of
implementation by measuring the benefits received by target groups (recipients) and to
determine the changes in the behavior and practices of the target groups as a result of
their application and utilization of outputs.

Results Monitoring and Evaluation focuses on effectiveness.

1.4.4 Impact Evaluation

Impact Evaluation is an ex-post type of evaluation. The objective is to determine the


impact or contribution of an intervention (programs or projects) to a higher level
undertaking.


Table 1-2 Types of M&E and Description

Progress M&E
Description: assessment of an on-going implementation; the objective is to steer implementation as efficiently as possible based on the approved plan
Hierarchy of Objective: Output-Activity-Input level
Measure of Performance: efficiency, i.e., physical accomplishment (actual versus plan)
Timing: during implementation

Initial Gains Evaluation
Description: keeps track of the changes or improvement in the behavior, performance and practices of the target group/s
Hierarchy of Objective: intermediate level (in between outcomes and outputs)
Measure of Performance: initial gains, i.e., improvement in the performance of the target group/s
Timing: middle of the implementation and before the termination of the current plan

Results or Outcome M&E
Description: measures the realization of outcome-level objectives; determines the effectiveness of the implementation
Hierarchy of Objective: outcome level
Measure of Performance: effectiveness, i.e., achievement of benefits
Timing: immediately after the end of implementation

Impact Evaluation
Description: measures the contribution of the interventions to a higher-level objective
Hierarchy of Objective: goal level
Measure of Performance: impact, i.e., achievement of long-term objectives
Timing: post-implementation (not immediate)

2.0 OBJECTIVES OF DIVISION M&E

2.1 Definition
The Division M&E System is a mechanism for gathering, processing, analyzing, interpreting,
and storing data and information about the schools' performance, needs and requirements
to sustain effective school-based management. Operated by the Division, it is a System
which provides data, information and insights on the efficiency and effectiveness of the
Division's technical support to schools. It sets into motion a series of managerial actions,
adjustments and realignments for the purpose of creating a sustained impact on the
quality of education provided by schools to learners.
A complete Division M&E System should have the following features:
Organized gathering and processing
Analysis and interpretation
Storing of data and information
Managerial actions
Realization of objectives

2.2 Objectives
The main objective of the Division M&E System is to ensure the timely flow of information
and insights on the effectiveness of the Division's technical assistance in improving school
performance. The System is used to keep track of the Division's programs and projects.
Specifically, the Division M&E System will provide the following data and information on:
schools' performance. The System will allow the Division to adjust its technical
assistance on SBM according to the schools' performance in enrollment, retention,
completion and achievement. This will facilitate the classification and profiling of
schools into high, average and low performance. The classification will be used as
the major input to customizing the programs and projects of the Division based on
school performance.
participation rate. The Division M&E System provides data and information on the
percentage of learners of school age participating in the basic school system and
the number of out-of-school youth and indigenous people being served by the
alternative learning system.
capabilities of the school heads. One of the main target groups of the Division is
the school head. The Division will track the performance and requirements of the
school heads on instructional supervision and SBM.
capabilities of teachers. Another major target group of the Division is the teacher.
The tracking will include the teachers' teaching skills and mastery of the subject
matter.
efficient management of the DEDP implementation. The Division M&E System will
also be used to assess the internal efficiency of the Division, especially in the
implementation of the programs and projects outlined in the DEDP, in terms of
difficulties, problems, issues or risks that hinder efficient implementation of Division
programs and projects.
The Division M&E System is part of the Integrated M&E System which connects the Division
to schools and to the Region. This will enable the Division to collect and share data,
information and insights from the schools to the Region and vice-versa. The integration will
provide the Division with critical and timely information regarding its operations and will
allow it to adjust or improve its technical assistance based on the needs and requirements
of the schools. Also, the Division's documentation of practices, initial gains and results will
serve as valuable inputs to the Region and National Offices to improve their respective
programs, policies and standards.

2.3 Characteristics of a Well-designed M&E System


A complete and well-designed Division M&E System should have the following features:
Organized data gathering and processing
Organized data analysis and interpretation
Systematic storing of data and information
Facilitative of managerial actions
Aligned with the realization of objectives
2.3.1 Organized Gathering, Processing, Analysis and Interpretation

In monitoring and evaluation, it is important that the collection of data and information be
done in an orderly and systematic manner. A typical Schools Division deals with hundreds
of elementary and secondary schools. It also has to track the performance of community
learning centers and their service providers.
In this regard, the Division needs an organized and efficient system of gathering and sorting
information to reduce repetitive, costly and time-consuming gathering of data. An
organized system will facilitate the following:
accuracy of data and information
non-duplication of data and efforts
more time for technical assistance
2.3.2 Systematic Storing of Data and Information
The Division M&E System is the most authoritative source of information about the
performance of schools. It stores information on the performance of schools within the
Division and is a repository of programs and projects that can be considered part of the
effective practices of the Division. These can be shared with all schools when they need
the information, an important input to knowledge management.
As such, the M&E System will enable the following:
prompt retrieval of data and information when needed
detailed recording of information
standardized formats, documents and reports.
2.3.3 Facilitative of timely managerial actions
A must-have feature of an M&E system is the ability to provide relevant information to facilitate
decision-making. In this regard, deriving such information to aid decision-making and
the timing of the decisions to be made are very important considerations in the design of
the M&E system.

The monitoring activities and quality control points to be implemented by the Division are
timed with the implementation requirements of the schools. In this way, the data,
information, insights and lessons derived from the Division M&E System are immediately used
for making managerial and technical actions that will support the schools.


2.3.4 Aligned with the realization of objectives


Last but not least, a well-designed M&E system must be able to keep track of the
accomplishments, initial gains and results. The main use of the System is to provide
information and insights towards the realization of objectives.
Aside from ensuring the realization of objectives and targets, the Division M&E System will
likewise allow the Division to:
document effective practices
draw lessons from failed or problematic programs and projects
determine whether to stop, continue or make adjustments in the strategies given
the early warning information


3.0 SCOPE OF THE DIVISION MONITORING AND EVALUATION

3.1 M&E Coverage


The main task of the Division is to provide technical assistance to the schools and
community learning centers. In order to be effective, the Division must continually improve
its services by providing timely and relevant programs and projects that will benefit the
schools and community learning centers. In this regard, the Division M&E System must
capture information and insights on schools performance, efficiency of schools,
capabilities of school heads and teachers. The System must also obtain information about
the Division's efficiency to provide the technical assistance programs and projects.
Specifically, the scope of the Division monitoring and evaluation work is defined in the
Division Education Development Plan (DEDP) and the School Improvement Plans (SIPs). The
objectives, targets, programs and projects documented in these plans will be used to
define the scope of the Division M&E System.

3.1.1 Three-Year School Improvement Plan


At the school level, the SIP will be the main reference document for the monitoring and
evaluation strategies and activities of the Division. The Division will evaluate the
performance of the schools based on the targets documented at the Purpose/Outcome
Level objectives in the SIP. These include targets for enrollment, retention, completion and
learner achievement.
The Division will also monitor the efficiency of the schools in implementing the school
programs and projects specified in the SIP or AIP. Hence, the quality of the SIPs is critical to
the successful operation of the Division M&E System.

3.1.2 Six-Year Division Education Development Plan


Another main reference document on the scope of the Division M&E System is the DEDP.
The objectives, targets and deliverables contained in the DEDP will be used to track the
efficiency of the Division and to evaluate the effectiveness of Division programs and
projects in helping the schools and community learning centers improve their performance.


3.2 Types of Division M&E


The Division M&E System is divided into three types: Outcome Evaluation, Intermediate
Results Evaluation and Progress M&E. These three M&E strategies are designed to
measure the Division outcomes and initial gains, determine the achievement of critical
leading indicators and assess the efficiency of the Division in managing technical
assistance programs and projects for schools and community learning centers. Specifically,
the Division M&E System will include the following:

3.2.1 Outcome Evaluation


Primary target groups of the Division M&E System are the schools and the community
learning centers. One of the main tasks of the Division M&E System is to evaluate the
effectiveness of the Division's technical support programs and projects to the schools and
community learning centers. This is known as the Division Outcome Evaluation.
Outcome evaluation will be conducted every three years, or at the end of every SIP cycle. The
evaluation will provide the Division with information and insights on the improvements in the
performance of schools and community learning centers. The same will be used as input to
the preparation and/or adjustment of the DEDP.
Specifically, the M&E at this level will include the evaluation of the following:
schools' performance
performance of the community learning centers
SBM level of practice of the schools
participation of learners of school age, out-of-school youth, indigenous people and
others in the basic education system
school stakeholders' satisfaction with the quality of school services
3.2.2 Tracking Intermediate Results
The Division M&E System will also track the intermediate results. Tracking Intermediate Results
is a type of evaluation that is undertaken by collecting and assessing data and information
that predict the achievement of the Outcome Indicators. Collecting and analyzing
leading data or information is a proactive M&E strategy that will help identify the
achievement or non-achievement of the outcomes even before the evaluation period
takes place. Leading indicators of the Division's performance include:


improvement in the capability of school heads on instructional supervision and SBM

improvement in the teaching skills of teachers and their mastery of the subject matter

improvement in the teaching skills of facilitators and their mastery of the subject matter

improvement in the learning environment of the schools, which includes classrooms,
laboratories, school equipment, textbooks, manuals and supplementary materials
and the ancillary services of the school
Intermediate Results Evaluation will be undertaken annually or as the need arises by the
Division. The findings from the evaluation will be used to enhance or improve the Division's
programs and projects (when the leading indicators show favorable results) and/or to
implement remediation strategies when the leading indicators show negative results.
3.2.3 Progress Monitoring
The major feature of the Division M&E System is the Progress Monitoring. Its objective is to
track the efficiency of both the schools and the Division in the implementation of
education programs and projects outlined in the DEDP and the SIPs. Specifically, progress
monitoring covers:
school's efficiency as per the SIP and/or Annual Improvement Plan (AIP)
implementation of capability building programs for Division staff
efficiency of the Division as per the DEDP and/or Division Annual Plan (DAP)
fiscal management vis-a-vis physical accomplishment.
Table 3-1, the Division M&E Framework, outlines the scope of the Division M&E System. It shows the
relationships of the school performance, Division objectives and strategies, performance
indicators and means of verification. It also provides information on the type of monitoring
and evaluation the Division will implement to operationalize the System.


Table 3-1 Division M&E Framework

Division Goal: Impact Indicators
Type of M&E / Process: Impact Evaluation; Evaluation of School Performance

1. Access: To ensure that all learners of school age are in school and are ready for school
Indicators: increase in enrollment; learners entering the school system are ready
MoV: Enrollment Report; School Report Card

2. Retention: To ensure that learners who are in school will stay in school
Indicators: increase in the number of learners retained in the school (retention rate); reduction in drop-outs; reduction in school leavers
MoV: School Report Card

3. Completion: To ensure that learners who are in school will complete the requirements of the primary and secondary level
Indicators: increase in the number of learners able to complete the basic education requirements; improved graduation rate
MoV: School Report Card

4. Achievement: To ensure that learners demonstrate the necessary competencies at each level
Indicators: improvement in the basic functional literacy skills of the learners; improvement in the academic performance of learners in all subject matter; improvement in social skills
MoV: Learner Report Card; Teacher Assessment; National Achievement Test (2nd Year); Regional Achievement Test (3rd Year)

Division Level Outcomes: Effectiveness Indicators

1. Improved school performance
Indicators: (a) reduce the disparity between high performing and low performing schools (in NEAT and NAT) by --- percent; (b) reduce the disparity in enrollment, drop-out and completion rates between high performing and low performing schools; (c) increase in the satisfaction of school stakeholders with the quality of instruction in the school; (d) improve the SBM practice of schools
MoV: Division Report Card; Division Education Development Plan (DEDP); Perception Survey; SBM Assessment Result
Type of M&E / Process: Outcome Evaluation and Tracking Intermediate Results; Monitoring DEDP Implementation

2. Improved teachers' performance
Indicators: (a) teachers demonstrate competencies on general, content and subject-specific skills; (b) teachers meet the desired competencies based on the NCBTS
MoV: Division Report Card and DEDP; Teachers' Performance Assessment Report; Assessment for Math and Science teachers
Type of M&E / Process: Tracking Intermediate Results; Monitoring DEDP Implementation

3. Improved school heads' performance
Indicators: (a) school heads demonstrate competencies on school-based management and instructional supervision
MoV: Division Report Card and DEDP
Type of M&E / Process: Tracking Intermediate Results; Monitoring DEDP Implementation

4. Improved learning environment
Indicators: (a) teacher-to-learners ratio is 1:45; (b) learner-to-textbook ratio is 1:1; (c) teacher-to-teacher-manual ratio is 1:1; (d) teachers and learners have access to school equipment, science laboratories and other facilities; (e) schools comply with the Standards of a Child-Friendly School
MoV: Division Report Card

Division Intermediate Results: Leading Indicators

1. Improved Division performance
Indicators: (a) increase in the gross enrollment rate; (b) improvement in the net enrollment rate; (c) reduced disparity in the net enrollment ratio / participation rate between highly urbanized and SRA Divisions
MoV: Division Report Card and DEDP
Type of M&E / Process: Tracking Intermediate Results; Monitoring DEDP Implementation

2. Improved competencies of DepED Division and District staff in providing technical and management support to schools, community learning centers, school heads, teachers and facilitators
Indicators: (a) Division and District staff demonstrate competencies on educational planning, curriculum management, instructional consultancy, training and development, and monitoring and evaluation
MoV: Results of Performance Assessment
Type of M&E / Process: Tracking Intermediate Results; Monitoring DEDP Implementation

3. Management and technical assistance systems are in place and operational
Indicators: (a) continuous improvement in the management and technical assistance processes of the Division
MoV: Quality Assurance Readiness Assessment Report
Type of M&E / Process: Tracking Intermediate Results; Monitoring DEDP Implementation

Outputs (1): Efficiency/Progress Indicators
Type of M&E / Process: Progress Monitoring and Evaluation; Monitoring DEDP Implementation

1. On improving school performance (Division programs and projects on SBM)
Indicators: % physical accomplishment (number of programs and projects implemented versus number of programs and projects planned in the SIP)
MoV: Division Monthly Report; Completion Report

2. On the staff development program (capability building programs for Division staff, school heads, teachers and non-teaching staff)
Indicators: % physical accomplishment (number of Division staff, school heads and teachers trained versus number targeted as per the DEDP)
MoV: Division Monthly Report; Completion Report

3. On improving the learning environment (Division programs and projects related to school buildings, school equipment, textbooks and manuals)
Indicators: % physical accomplishment (number of school facilities / infrastructures set up versus targets in the DEDP)
MoV: Division Monthly Report; Completion Report

4. On managing Division systems and processes (Division programs and projects related to systems development and implementation)
Indicators: % physical accomplishment (plan (DEDP) versus actual)
MoV: Division Monthly Report; Completion Report

Others (please add)

Inputs

Management of Division MOOE and other financial resources
Indicators: actual expenditure versus approved budget
MoV: Division Financial Report
Type of M&E / Process: Progress Monitoring and Evaluation; Monitoring DEDP Implementation

(1) The scope of outputs varies depending on the target (quantity) outcomes, needs and targets specified in the DEDP.

3.3 Primary Users of the Division M&E System

The Division M&E System is an internal system designed primarily to cater to the
decision-making requirements of the Schools Division Superintendent (SDS), Assistant
Schools Division Superintendents (ASDS), Education Supervisors (ES) and other Division staff.
The System is implemented not merely in compliance with the requirements of the Region;
it is a critical support mechanism for the Division's role of providing quality and relevant
programs and projects to schools and community learning centers. At the same time, the
Division M&E System provides feedback to the Regional and National Offices on the
effectiveness of existing policies and provides information on issues, concerns and
opportunities for the policy agenda.

The Division M&E System, especially the Progress M&E, provides the Division implementers
with up-to-date and accurate information needed in making day-to-day decisions to
assure the best courses of action and support that will improve performance.


4.0 PERFORMANCE MEASURES

Performance measures provide an accurate picture of the status of accomplishments


and achievements or outcomes attained by the Division as contained in the DEDP. The
Division's performance will be assessed in the following areas:
Impact. Pertains to the achievement of the DEDP Goal Level Objective: learner
outcomes

Effectiveness. Refers to the achievement of the DEDP Purpose Level Objectives:
schools' performance

Efficiency. Pertains to the Division's implementation of programs and projects as
contained in the DEDP / DAP

Organizational Maturity. Assessment of the practices and processes employed in
the Division; refers to the Division's compliance or adherence to the quality
standard processes

Readiness of Division Staff. Refers to the competencies of Division staff in providing
technical assistance to schools and community learning centers

4.1 Impact
Division impact is measured in four areas. These are:

Increase in the participation rate. The first measure of school effectiveness is the
ability of the school to bring learners of school age to school. The primary
indicator for access is the increase in the school's enrollment.

Retention. School effectiveness is measured in terms of the school's ability to
encourage learners who are in school to stay in school. The primary measure of
success in this area is the retention rate. Other indicators, like the drop-out rate and
the school leavers' rate, will also be used.

Completion. Learners complete the requirements from Grade 1 to Grade 6, or from
1st Year to 4th Year High School. Another measure of effectiveness is the ability of
the school to assist or compel the learners to complete the requirements at the
elementary or secondary level. The indicator to be used for this area is the
completion rate, supported by other indicators like the graduation rate and the
cohort survival rate to help explain the phenomena.

Learners' achievement. The last, but not the least, measure of school effectiveness is
the learners' achievement. This pertains to the learners' demonstration of required
competencies (at every level) and their readiness to pursue the next higher level of
learning. Learners' achievement is a progressive indicator that shows the progress of
learners from one competency to the next. Measures to be used in achievement
include the learner's grade per subject and the score in the national or regional
achievement tests. (A computational sketch of these rate indicators follows at the
end of this section.)
Achieving these four performance measures is a big challenge for school heads. These are
interrelated measures and therefore schools must be able to balance their efforts in the
four areas. The achievement of these performance measures will demonstrate the
effectiveness of programs, projects and other school services.
These measures are collected and analyzed every year and will be used as the main input
to the adjustment or enhancement of the school's programs and projects listed in the SIP
and to the preparation of the next cycle SIP.
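The rate indicators above are simple ratios of actual against reference counts. The sketch below shows simplified, illustrative versions of these computations; the exact DepED formulas for retention, drop-out and completion (including how cohorts and school leavers are counted) are defined in official issuances and may differ, so the function definitions here are assumptions for demonstration only.

```python
# Illustrative only: simplified indicator formulas for demonstration.
# Official DepED definitions of these rates may differ.

def retention_rate(enrolled_this_sy: int, enrolled_last_sy: int) -> float:
    """Share of last school year's learners still enrolled this school year."""
    return 100.0 * enrolled_this_sy / enrolled_last_sy

def dropout_rate(dropouts: int, total_enrolled: int) -> float:
    """Share of enrolled learners who left school during the year."""
    return 100.0 * dropouts / total_enrolled

def completion_rate(completers: int, cohort_size: int) -> float:
    """Share of an entering cohort that completed the level's requirements."""
    return 100.0 * completers / cohort_size

# Hypothetical cohort: 220 Grade 1 entrants; 198 still enrolled a year later;
# 7 drop-outs this year; 180 eventual completers.
print(f"Retention:  {retention_rate(198, 220):.1f}%")    # 90.0%
print(f"Drop-out:   {dropout_rate(7, 220):.1f}%")        # 3.2%
print(f"Completion: {completion_rate(180, 220):.1f}%")   # 81.8%
```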

4.2 Effectiveness
Division effectiveness is measured in four areas. These are:
Improved school performance
Improved performance of community learning centers
Improved performance of school heads and teachers
Improved learning environment

4.3 Division Efficiency

Efficiency of the Division is measured in terms of its ability to deliver education programs
and projects on time and based on targets.

The performance measures for Division efficiency are:

physical accomplishment, which plots the total accomplishment of the Division
(programs and projects completed) versus the total plan or targets (planned
programs and projects) on a periodic basis

cost efficiency, which plots the Division's usage of financial resources versus the
approved budget.
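As a concrete illustration, both efficiency measures reduce to ratios of actual against planned figures. The sketch below is illustrative only: the project counts and peso amounts are hypothetical, and actual DEDP/DAP reporting templates define their own line items.

```python
# Illustrative only: the two Division efficiency measures as simple ratios.

def physical_accomplishment(completed: int, planned: int) -> float:
    """% physical accomplishment: programs/projects completed vs. planned."""
    return 100.0 * completed / planned

def budget_utilization(actual_expenditure: float, approved_budget: float) -> float:
    """Cost efficiency: % of the approved budget actually spent for the period."""
    return 100.0 * actual_expenditure / approved_budget

# Hypothetical figures: 18 of 24 planned projects completed;
# PhP 3.4M spent out of a PhP 4.0M approved budget.
print(f"Physical accomplishment: {physical_accomplishment(18, 24):.1f}%")           # 75.0%
print(f"Budget utilization:      {budget_utilization(3_400_000, 4_000_000):.1f}%")  # 85.0%
```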

4.4 Organizational Maturity


The Organizational Maturity measure focuses on the operations and practices of the Division. It
assesses the maturity level of the Division based on its adherence and compliance to the
quality standard processes established for Division operation. The operations of the Division
will be assessed using the Quality Management Inventory Model which will determine their
level of maturity based on the implementation of standard processes.


The Quality Management Inventory Model calibrates a Division's operation into:


(1) Ad hoc. The initial or entry level of readiness. A Division at this level is often
characterized by temporary and informal ways of doing things. Organizational
procedures or methods are not well defined and disseminated, leading to
inconsistent results and poor quality of service. Its technical assistance packages
are reactive, inefficient and not relevant to the requirements of its target groups.
Often these packages are hand-me-down practices; their utility value and
effectiveness have not been proven, yet they are utilized year in and year out.
Some may yield positive outcomes and some may offer temporary solutions.
(2) Defined. There is an effort to implement interventions as efficiently as possible by
following a structured approach. There is high awareness of commonly
established management tools, techniques and procedures. But there is still a
tendency to revert to ad hoc or traditional practices when confronted
with a difficult situation. There is a defined process, but its application is not
consistent.
(3) Integrated. Divisions at this level demonstrate a more mature and more consistent
way of doing things. They are able to collate, document and transform their
effective practices into an integrated, well-choreographed process. There is high
compliance with their own standards and processes, such that all Division units and/or
individuals know what to do and understand the coordination, cooperation
and collaboration requirements expected from them.
(4) Sustained. The Division's maturity at this level hinges on its commitment to
excellence. It must have the ability to perform continuous improvements, always
optimizing the gains or outcomes of its undertakings. A Readiness Level 4
Region/Division should therefore have the following traits:

defined processes are regularly updated in accordance with the strengths,
weaknesses, opportunities and threats faced by the schools;

defined processes are improved and in sync with agency policies and
directions.

4.5 Readiness of Division Staff


The quality of Division performance hinges on the readiness of Division staff to implement
programs and projects needed by schools and community learning centers. The Staff's
readiness will be assessed in the areas of:
school based management
curriculum management
strategic planning


program and project management

Table 4-1 Division Performance Measures

Impact to Learners
Description: this performance area is focused on the contribution of the Division's effort to support the schools and community learning centers
Focus of Measure: Division
Performance Measures: enrollment; retention rate; completion rate; achievement

Effectiveness of the Division
Description: the Division's effectiveness is manifested in the improvements in the performance of schools and community learning centers; changes or improvements in the competencies of school heads and teachers; improvement in the maturity level of schools in implementing SBM; and the ability of the Division to support the schools and community learning centers in the upgrading of the learning environment within the Division
Focus of Measure: Schools
Performance Measures: reduce the disparity between high performing and low performing schools (in NEAT and NAT) by --- percent; reduce the disparity in enrollment, drop-out and completion rates between high performing and low performing schools; increase in the satisfaction of school stakeholders with the quality of instruction in the school; improve the SBM practice of schools

Efficiency
Description: the objective is to measure the Division's capability to deliver programs and projects as promised in the DEDP and DAP; the efficient delivery of such programs and projects increases the likelihood of achieving the DEDP Purpose level objectives
Focus of Measure: Division
Performance Measures: physical accomplishment; cost efficiency

Adherence to Standards
Description: this performance measure assesses the maturity level of the Division in implementing and adhering to the quality standards set that will assure correct and timely implementation of the Division's Core Technical Assistance Processes
Focus of Measure: Division
Performance Measures: QA Readiness

Readiness of Division Staff
Description: this area evaluates the capabilities of the Division staff who will provide technical assistance to schools on SBM and to community learning centers
Focus of Measure: Division Staff
Performance Measures: competencies


5.0 DIVISION MONITORING PROCESS

5.1 Definition and Scope of Monitoring Process


The Division Monitoring is a process of systematically tracking accomplishments, budget
and schedule against deliverables. It is a mechanism for measuring the performance of
the Division, schools and community learning centers and comparing these with set
standards. The systematic tracking of performance allows the Division to quality assure the
status of an on-going implementation, perform scope management and manage external
factors influencing and/or hindering the accomplishment of objectives and targets of the
Division. Specifically, the monitoring process is designed to regularly track, measure and
document the following:
the accomplishment of outputs and milestones as compared to what is specified in
the plans of the Division (DEDP), the schools (SIP) and the contracts of service
providers implementing the ALS programs.
the performance of the school heads in managing the schools. This includes
tracking the performance of school heads on SBM and instructional supervision
the schools' implementation of the curriculum
the operations of the community learning centers. This also includes monitoring the
learners participating in the alternative learning system and the quality assurance
of facilitators and/or mobile teachers
The Division Monitoring Process will serve as a trip wire, providing early warning signs of
issues that may affect the quality and/or hinder the completion of outputs. The monitoring
process will enable the Division to make immediate corrective actions before issues become
full-blown problems affecting quality, targets, schedules and budget.

The monitoring process is a review of an on-going implementation. Monitoring activities are


undertaken to assess the following:
Quality of products and services provided
Compliance to quality standards
Accomplishments based on scope and time
Cost efficiency based on budget and time
Frame conditions or external factors beyond the control of the implementers that may
affect achievement of targets


The Division Monitoring Process also covers observing, measuring and documenting events
in the external environment. It includes tracking the stakeholders' support and factors
beyond the control of the Division and the schools which may affect the implementation
of the plans.
The Division Monitoring Process includes: (1) Monitoring the DEDP Implementation, (2) School
Performance Monitoring, and (3) Managing the ALS Programs.

[Figure: the Division & District M&E System over a one-year monitoring cycle: recurring process reviews (school visits) with status reporting between them, alongside the service provider (SP) track of review of proposals, contracting of the SP, mid-point M&E, and end-of-contract evaluation.]

Figure 5-1 Overview of the Division Monitoring Process

5.2 Some Guideposts in Monitoring


In implementing a monitoring program, consider the following:
Track and manage the four core areas of management: quality, scope, time and cost.

All throughout the DEDP implementation, beware of scope creep. Minimize it as
much as you can; it will have implications for your targets, time and resources.
Annual and quarterly reviews will help reduce unwanted activities.

In reporting progress, always start with the percent physical accomplishment. Then
elaborate on the reported progress or status of implementation by discussing
quality, scope and cost concerns.

If after three reporting periods there are no changes or progress in the
accomplishments, something is wrong. This is the reason for monthly reporting: it
tracks progress.

Track and document effective school practices. Use appreciative inquiry to
determine the good practices.

At the end of implementation, beware of the 90% accomplishment mark. The last 10% is
usually the most difficult to implement.

5.3 Monitoring DEDP & SIP Implementation


The DEDP and SIP provide the scope of the Division monitoring process. The efficiency
measures, monitoring strategies and activities of the Division will be based on the content of
the DEDP and SIP. Specifically, the monitoring will be based on the accomplishment of
outputs, targets and activities.

The DEDP outlines the support programs and projects of the Division for the schools. It also contains
staff development programs for school heads and teachers, technical assistance support to school
heads on SBM, instructional consultancy strategies for teachers, learning materials support and other
support requirements of the schools and community learning centers.
On the other hand, the SIP contains the scope of work of the school for the next three years. The work
is detailed yearly through the Annual Implementation Plan or AIP. The SIP/AIP is used to track the
implementation efficiency of the schools.

Monitoring DEDP & SIP Implementation is a management mechanism which will allow the
Division to manage its monthly operations more efficiently. It focuses on the deliverables
and sees to it that these are accomplished and delivered. Tracking the DEDP and SIP
implementation will also facilitate the systematic handling of concerns on the quality of
technical assistance delivered to schools and community learning centers.
Specifically, the mechanism will allow the Division to manage the following (a
computational sketch of the S-curve comparison follows this list):

quality and status of Division programs and projects

Division's Physical Accomplishment (S-Curve). Involves comparing the number of
Division programs and projects implemented versus the planned/targeted
programs and projects in the DEDP.

Expenses versus Budget (Cost S-Curve). Involves monitoring the Division's generation
and management of its financial resources vis-a-vis the financial resources outlined
in the DEDP.

Schools' Efficiency. Involves monitoring actual progress versus plan, and actual cost
versus budget.
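A minimal sketch of the S-curve comparison follows. The monthly figures are hypothetical; a real S-curve would be built from the Division's own DEDP targets and monthly reports, and would normally be plotted as a chart rather than printed.

```python
# Illustrative only: cumulative planned vs. actual accomplishment (S-curves)
# built from hypothetical monthly counts of completed programs/projects.

from itertools import accumulate

planned_per_month = [1, 2, 3, 4, 4, 4, 3, 2, 1, 1, 1, 1]  # hypothetical DEDP targets
actual_per_month  = [1, 1, 2, 3, 4, 3, 3, 2, 1, 0, 0, 0]  # hypothetical actuals

planned_cum = list(accumulate(planned_per_month))  # planned S-curve
actual_cum  = list(accumulate(actual_per_month))   # actual S-curve
total_plan  = planned_cum[-1]

for month, (plan, actual) in enumerate(zip(planned_cum, actual_cum), start=1):
    pct = 100.0 * actual / total_plan  # % physical accomplishment to date
    flag = "" if actual >= plan else "  <- behind plan"
    print(f"Month {month:2d}: plan {plan:2d}, actual {actual:2d} ({pct:5.1f}%){flag}")
```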
5.3.1 Objectives
Monitoring the DEDP and SIP implementation is done to accomplish the following:
Ensure the timely and cost efficient delivery of programs and projects outlined in
the DEDP and SIP.
Provide immediate feedback on the quality and effectiveness of technical
assistance provided to schools, community learning centers and to
teachers/facilitators.
Provide information on the accomplishments of the Division and schools including
enabling and hindering factors that may be used as basis for adjusting and/or
improving efficiency
Document the experiences of the Division in providing technical support to schools.
This includes effective practices and lessons learned.
Immediately address issues and risks that may affect future performance of the
Division, schools and community learning centers.
5.3.2 M&E Activities
Monitoring the DEDP implementation involves conducting the following activities:
Preparation and submission of Division Quarterly Status Report. The report
highlights the accomplishments of the Division and schools versus the targets in the
DEDP and SIP. It also includes comparison of budget versus actual resources utilized
in the implementation of programs and projects.
Conduct of team meeting with Division staff on the status of Division programs and
projects. The Division Quarterly Status Report and the schools' Quarterly Status
Report will be used as reference documents in the conduct of the team meeting.
The status reports will be used as a trip wire to determine issues and concerns that
demand immediate attention by the Division and districts.


Discussion of problems that affected delivery of Division services. Problems that
can be solved at the school level should be resolved immediately. Problems and/or
issues requiring decisions or support from the Division Office should be included in
the School's monthly report.
Updates on the implementation chart and on the status of activities, events and
outputs completed.
Communication of accomplishments to stakeholders.
5.3.3 Process Owner
The Schools Division Superintendent (SDS) is mainly responsible for managing the DEDP
implementation process. The SDS is one of the main beneficiaries of the data and
information generated from undertaking this process. The process provides the SDS and
other key personnel with up to date, relevant information needed in making timely
decisions and/or adjustments in the implementation of the DEDP/DAP.
The following individuals are tasked to provide the staff work needed by the SDS:
(1) Assistant Schools Division Superintendent (ASDS). The ASDS oversees the
day-to-day operations required in managing the DEDP implementation. The ASDS shall
ensure the quality of the reports and documents needed in the status reporting of
Division operation.
(2) Division M&E Coordinator. The M&E Coordinator is tasked to do the following:
collect and collate the reports submitted by units/offices within the
Division
write and package the report of the Division
(3) Division Planning Officer. Assists the M&E Coordinator in the preparation of the
Division Quarterly Status Report. Specifically, the Planning Officer will provide the
planning documents or reference documents needed in the preparation of the
status reports.
(4) Division Program/Project Manager. This is a designation given to an Education
Supervisor or any other Division staff tasked to lead the implementation of a
program or project. The Program/Project Manager will provide monthly updates on
the status of programs/projects to the SDS.


5.3.4 Documents and Reports


The main output of this process is the Division Quarterly Status Report covering the Division's
accomplishments versus the targets set in the DEDP/DAP. The report shall also contain
information on the quality of accomplishments, factors facilitating the implementation and
discussion of problems and issues that affected the implementation.
The Division Quarterly Status Report is a consolidation of the following documents/reports:
school quarterly report
monthly report of program/project managers in the Division.
This quarterly status report will be used as input to the preparation of the Division Annual
Accomplishment Report.
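The consolidation described above is essentially a roll-up of monthly records into a quarterly summary. The sketch below illustrates one possible structure; the record fields (program, outputs_done, expenditure, issues) are hypothetical, not a prescribed DepED format.

```python
# Illustrative only: rolling hypothetical monthly reports up into one quarter.

from dataclasses import dataclass, field
from typing import List

@dataclass
class MonthlyReport:
    program: str
    month: int                       # 1..12
    outputs_done: int
    expenditure: float
    issues: List[str] = field(default_factory=list)

def consolidate_quarter(reports: List[MonthlyReport], quarter: int) -> dict:
    """Summarize one quarter's monthly reports for the quarterly status report."""
    months = range(3 * quarter - 2, 3 * quarter + 1)      # e.g. Q1 -> months 1..3
    in_quarter = [r for r in reports if r.month in months]
    return {
        "quarter": quarter,
        "outputs_done": sum(r.outputs_done for r in in_quarter),
        "expenditure": sum(r.expenditure for r in in_quarter),
        "issues": [i for r in in_quarter for i in r.issues],  # carry issues upward
    }

reports = [
    MonthlyReport("SBM coaching", 1, 4, 120_000.0),
    MonthlyReport("SBM coaching", 2, 3, 95_000.0, ["two schools not visited"]),
    MonthlyReport("SBM coaching", 3, 5, 110_000.0),
]
print(consolidate_quarter(reports, quarter=1))
```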
[Figure: at the Division level, twelve monthly reports from program/project managers roll up into quarterly status reports and a year-end Accomplishment Report; the schools' quarterly status reports feed the same consolidation, ending in the Annual Accomplishment Report.]

Figure 5-2 Status Report - Quarterly


5.4 School Performance Monitoring


Monitoring School Performance is a mechanism that will provide valuable information on
the strengths, weaknesses and challenges faced by the schools, school heads, teachers
and non-teaching staff in delivering quality education to learners. Specifically, this will
provide the Division with up-to-date information on the following:

SBM maturity level of practice of the schools

compliance of schools and their staff with the standard processes

readiness of school heads and teachers to provide quality service to learners. The
process aims to identify the strong and weak points of school heads and teachers,
which will be used as the basis for providing training support
5.4.1 Objectives
The objectives of School Performance Monitoring are the following:
- promote the practice of continuous improvement and self-examination in the schools
- determine the relevance and applicability of school processes and practices to improving learners' performance and school efficiency
- determine the knowledge and skills of staff in performing tasks based on their compliance with the standard process. The results will be used as input to the capability building programs for schools.
- allow the Division and District to manage technical assistance more efficiently by adjusting their programs and projects to the changing requirements of the schools and shifting resources where needed
- design more relevant programs and projects in schools
5.4.2 M&E Activities
Monitoring school performance is designed to gather information about the performance, accomplishments and practices of the schools on SBM, instructional supervision, curriculum implementation, and the programs and projects implemented by the school. The monitoring will involve the following activities:
(1) Prepare a school monitoring plan. The plan is a quarterly plan of the education supervisors and district supervisors which details the objectives of the school visits, the data gathering methods and the list of schools to be visited. The preparation of the monitoring plan is based on the technical assistance requirements of the schools and the time the Division needs to validate the application and/or utilization of outputs provided to schools.
(2) Analyze school reports and accomplishments. Monitoring school performance will
be triggered by the status reports submitted by the schools. The contents of the
reports will be used as basis or input by the Division monitoring staff to determine
the scope of the school monitoring.
(3) Periodic school visits to be undertaken by the education supervisors and district supervisors. The visits will include observations, interviews and focus group discussions with learners, teachers and the school head about the practices on SBM and the management of the curriculum. The school visits will be done randomly and unannounced, to capture the actual and/or real practices of schools and school staff in delivering quality education to learners (a simple random-selection sketch follows this list). Activities will include:
    - Process Check or Review. This entails actual observation or demonstration of compliance with established standards. The review will be undertaken using a predetermined checklist outlining the standard processes that will assure quality. Personnel from the Division and/or District will do actual observations and interviews.
    - Review of artifacts or MoVs. Artifacts validate the claims of individuals regarding the application of certain practices or processes.
    - Team discussion. This includes sharing of information and insights regarding the findings of the inventory.
(4) Conduct of a perception survey. The Division and District will conduct a perception survey to get feedback from the community and other stakeholders on the performance and quality of services provided by the schools.
(5) Share information through the conduct of Division team meetings.
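As a hedged illustration of the random, unannounced visit selection described in activity (3), the sketch below draws a quarterly visit sample. The function name, school names and parameters are hypothetical.

    import random

    def draw_visit_schedule(schools, visits_per_quarter, seed=None):
        """Randomly pick schools to visit so visits cannot be anticipated."""
        rng = random.Random(seed)  # seed only for reproducible examples
        return rng.sample(schools, min(visits_per_quarter, len(schools)))

    schools = ["Mabini ES", "Rizal ES", "Bonifacio NHS", "Aguinaldo ES"]
    print(draw_visit_schedule(schools, visits_per_quarter=2))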
5.4.3 Process Owner
The process owner in the conduct of monitoring school performance is the Assistant
Schools Division Superintendent (ASDS). As process owner, the ASDS shall:
- ensure that all schools are visited and given technical assistance by the education supervisors and district supervisors
- ensure the integrity of the process by making sure that proper planning, data gathering techniques and documentation are followed by the education supervisors and district supervisors
- incorporate the findings of the monitoring teams or individuals into the Division Status Report
- act on the recommendations made by the monitoring teams, especially on problems and issues requiring Division interventions
In the actual conduct of school performance monitoring, the education supervisors and/or the district supervisors shall compose the monitoring team. The team shall gather and collect data and information, analyze, recommend and provide immediate feedback to school heads.
5.4.4 Documents and Reports
The data, information and insights generated in monitoring school performance are included in the Division Quarterly Status Report. The status report will have a section on the schools' performance. The results of the Process Check/Review will be discussed with the Division and District staff and will be used as a case study to improve and/or sustain the quality of services provided by Division and District staff to schools and community learning centers.
Other documents required in this process are:
- School Monitoring Plan. Provides the objectives and scope of the monitoring visits to be undertaken by the Division and District.
- Report on School Visit. For every school monitoring activity, the monitoring team or individual shall prepare a School Visit Report.
- Frequency of Visit Matrix. Provides information on the number of visits undertaken by the Division and District to schools.
In order to ensure efficient operation of this process, the following reference documents are
required:
- DEDP/DAP. Outlines the programs and projects of the Division for schools.
- Accepted SIP/AIP. Represents the scope of work of the school for three years (SIP) and within a year (AIP). Provides information on the objectives, outputs, targets, and schedules of activities.
- School Quarterly Progress Report. Provides information on the physical accomplishment of the schools as per SIP/AIP, facilitating factors, and issues and concerns affecting school performance.

5.5 Monitoring the ALS Programs


The Division manages two major ALS programs namely, the Basic Literacy Program (BLP) and
the Accreditation and Equivalency (A&E) Program. As part of the continuous improvement
process, the Division and District will monitor the learning sessions and the quality of the
contact period between the learners and the literacy facilitators and/or instructional
managers. The scope of the monitoring covers:
- profile of learners participating in the BLP and A&E programs, including individual learners' progress
- performance of instructional managers and facilitators
- performance of Service Providers (SP) implementing the BLP and A&E programs based on the implementation plan as specified in the contract
- quality of inputs or services provided by literacy facilitators and instructional managers
5.5.1 Objectives
The objective of this process is to generate feedback and information from the Division's
implementation of the BLP and A&E programs. The results of the monitoring process will
enable the Division to:
- improve the efficiency of the Division and District in implementing the ALS programs in order to widen their coverage and further increase participation
- strengthen the Division's partnership with private groups or organizations, private and state universities and colleges, and other government organizations acting as Service Providers
- design technical assistance support and capability building programs for literacy facilitators and mobile teachers in order to make them more effective partners in providing education for all and improving functional literacy
5.5.2 M&E Activities
The District Supervisor will conduct at least 5 visits within the contract period. The visits focus on the Service Provider's compliance with the provisions of the ALS Service Contract. The activities include (a simple scheduling sketch follows):
- field visits undertaken at the midpoint and at the end of the service providers' contract period
- interviews on the practices of instructional managers and literacy facilitators
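The sketch below spreads the minimum five visits evenly across a hypothetical contract period, which naturally includes a midpoint and an end-of-contract visit. The dates and the helper name are illustrative assumptions.

    from datetime import date, timedelta

    def visit_dates(start, end, n_visits=5):
        """Spread n visits evenly from contract start to end (both inclusive)."""
        span = (end - start).days
        return [start + timedelta(days=round(i * span / (n_visits - 1)))
                for i in range(n_visits)]

    # Hypothetical six-month service contract; the middle date is the midpoint
    # visit and the last date is the end-of-contract visit.
    print(visit_dates(date(2024, 1, 8), date(2024, 6, 28)))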
5.5.3 Process Owner
The designated Education Supervisor for Alternative Learning System (ALS) is the process
owner for monitoring the performance of the service providers, instructional managers and
literacy facilitators. The ALS Division Supervisor will be assisted by District Supervisors in the
conduct of actual monitoring and report writing.
5.5.4 Documents and Reports
The documents and reports in this process will be used as input to the Division
Quarterly/Annual Status Report. These include:
(1) Training Completion Report on the orientation and training conducted by the
Division and District supervisors for the facilitators, instructional managers and
service providers
(2) Initial Report prepared by the District Supervisor. This will include information on the
enrollees or learners, facilitators and instructional managers, the activities observed
and the problems and issues to be resolved.
(3) Status reports of District Supervisor on the implementation of ALS programs at the
field level. The report shall include:
process documentation of actual implementation of ALS programs
information on the networking and coordination efforts with local
government units, other line agencies and non-government organizations
evaluation of performance of learners and service providers
(4) Mid-Term Report and End-of-Contract Report.
DIVISION M&E SYSTEM
6.0
DIVISION QUALITY CONTROL AND ADJUSTMENT POINTS
6.0 DIVISION QUALITY CONTROL & ADJUSTMENT POINTS

6.1 Quality Control & Adjustment Points


A Control Point is a mechanism for continuous improvement. It is a time-based evaluation
activity designed to assess major accomplishments at every assigned time period and to
determine achievement of critical milestones within the implementation life cycle.
A Control Point provides the transition from one major implementation stage to the next. At each stage, a quality control point is installed, designed to check and review that implementation stage. The review process is unique to each control point gate because the objectives, requirements and problems at each stage vary. Each review gate closes the preceding project stage and provides the transition to the next.
[Figure 6-1 Control Point: transition point from one stage to another stage]

The Control Points represent the evaluation activities of the Division. Predetermined evaluation points are set up which allow the Division to assess the quality of its programs, projects and technical assistance activities to schools and community learning centers. These provide valuable information and insights on the effectiveness of the Division.
The Control Point is also an adjustment or enhancement mechanism. Based on practices proven effective and on lessons learned from previous implementation, the Division uses the Control Point to adjust strategies, programs and projects to improve efficiency and increase the likelihood of achieving the desired objectives.
[Figure 6-2 Adjustments: continuous improvement]
The control points and the adjustment points provide the operational framework of the Division M&E System. The operational framework helps establish the mechanism for systematically integrating data collection, analysis and decision making. The major processes of technical assistance, planning, monitoring and evaluation are integrated and systematically designed into one cohesive process called the Quality Control and Adjustment Points.
Value Added from Quality Control and Adjustment Points
Aside from providing the operational framework of the Division M&E System, the following are some of the potential benefits of using the quality control and adjustment points:
- Communicates the intent of the Division M&E. Monitoring and evaluation can be abused: it may be used as a counter-intelligence operation, a mechanism for fault-finding and punishing individuals. As a result, implementers conceal the true and actual situation from evaluators out of fear and distrust of the system. Such behavior happens when the objectives of evaluation are not disclosed, performance measures keep changing, and unscheduled, surprise evaluations are done.
The Quality Control and Adjustment Points provide a clear description of the M&E objectives, performance measures, activities and resource requirements of every review activity. The schools, community learning centers and stakeholders are well informed about the intent of each review.
- Ensures that necessary implementation mechanisms and critical support infrastructure are installed and ready for use. The Quality Control and Adjustment Points establish review mechanisms that ensure the installation of the mechanisms, systems and infrastructure necessary to achieve the objectives and targets in the plan. These ensure the setting up of management systems that will facilitate the implementation of programs and projects.
- Confines implementation problems to one stage. One of the common signs of a poor M&E system is the presence of recurring problems. These problems are a product of inaction and wrong decisions made early in the implementation stage. A responsive M&E system should be able to detect and predict these problems. The Division M&E is a mechanism for screening problems, including potential ones. The stage approach allows proper and timely management of issues and concerns before they escalate into problems that will affect the deliverables and results. Control Points are installed before and after every major Division milestone. They are designed to enable the Division staff to reflect on decisions and activities already undertaken and, at the same time, to be forward-looking by assessing the implications of their previous actions.
- Anticipates issues and risks. The Adjustment Points serve as an integrated review mechanism designed to limit or reduce the exposure of the Division and schools to issues and risks. Review points are established to identify, analyze and make immediate adjustments in the strategies before issues and concerns evolve into problems.
- Systematizes evaluation and decision making. The Quality Control and Adjustment Points provide the rationale for data collection, reporting, communication and decision making, and highlight the importance of major evaluation activities for making decisions.

Some Guideposts in Using the Quality Control and Adjustment Points


In determining the control points to be used, consider the following (a short sketch of the two quantitative guideposts follows this list):
- The planning stage is the most critical stage in the implementation life cycle. If plans are vague, managing and implementing them is difficult. Changing a plan is cheapest during the planning stage, and the value added by enhancing the plan is highest at the preparation stage.
- During start up, watch the first 15% of implementation: if after six months accomplishment remains at 15%, something is terribly wrong.
- Quality is a prevention process. It is a product of processes set up to ensure services and products are fit for use.
- Set up adjustment points. Every major assessment should be followed by adjustments and/or enhancements in the plan, strategies and design.
- Avoid report-driven monitoring and evaluation. Evaluation should be used as the basis for adjustments and for improving performance.
- At the end of implementation, beware of the 90% accomplishment. The last 10% is usually the most difficult to implement.
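A minimal sketch of the two quantitative guideposts above as automatic progress flags; the thresholds follow the text, while the function and its signature are illustrative assumptions.

    def progress_flags(percent_complete, months_elapsed):
        """Flag the start-up and closing risks named in the guideposts."""
        flags = []
        # Start-up rule: stuck at ~15% after six months signals serious trouble.
        if months_elapsed >= 6 and percent_complete <= 15:
            flags.append("stalled at start up - escalate")
        # Closing rule: the last 10% is usually the hardest to deliver.
        if percent_complete >= 90:
            flags.append("final 10% - watch closely")
        return flags

    print(progress_flags(15, 6))   # ['stalled at start up - escalate']
    print(progress_flags(92, 11))  # ['final 10% - watch closely']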
[Figure 6-3 Division Quality Control and Adjustment Points]
6.2 Division Quality Control and Adjustment Points

Quality Control & Adjustment Points are established in order to ensure relevant, up-to-date
and timely technical assistance of the Division and districts to schools and community
learning centers. Control points are strategically placed at every major milestone in the
DEDP implementation life cycle. The control points are mechanisms to steer and manage
technical assistance to schools and learning centers.
The 5 Division Quality Control and Adjustment Points are:
- SIP Appraisal (SA). A quality control mechanism designed to make sure that SIPs meet the criteria of a good plan: relevance, responsiveness and feasibility. This is also the review point where the SIP is assessed in terms of completeness of information and fitness for use as a reference for monitoring and evaluation.
- Start Up Review (SUR). Ensures the readiness of schools to implement the 3-year SIP. This quality control point evaluates the school's compliance in setting up critical management mechanisms before fully implementing the SIP. An example is the setting up of the M&E system.
- Annual Implementation Review (AIR). A major review of the Division's and schools' implementation of their programs and projects. Assessments are made in terms of achievements and accomplishments against the objectives and targets in the DEDP and SIP. The AIR is used as an adjustment point for the next implementation year.
- Mid-Term Review (MTR). A review undertaken after the first 3 years of the DEDP (at the end of the SIP cycle). The Division evaluates its impact on the learners and its effectiveness based on the schools' achievement of their outcomes. The results of the MTR serve as a major input to adjusting the next 3 years of the DEDP.
- Outcome Evaluation (OE). A post-implementation review conducted at the end of the DEDP implementation. The main objective is to determine whether the outcome-level objectives and goals in the DEDP were achieved. The OE investigates factors that contributed to success and/or hindered the achievement of targets. The results of the OE will be used as input to the preparation of the next-cycle DEDP. (A sequencing sketch of the five control points follows.)
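As a hedged illustration, the five control points can be treated as an ordered sequence over the DEDP life cycle. The codes and the helper below are assumptions made for the sketch; in practice the AIR recurs every implementation year.

    CONTROL_POINTS = [
        ("SA",  "SIP Appraisal"),
        ("SUR", "Start Up Review"),
        ("AIR", "Annual Implementation Review"),  # recurs yearly
        ("MTR", "Mid-Term Review"),               # end of the 3-year SIP cycle
        ("OE",  "Outcome Evaluation"),            # end of DEDP implementation
    ]

    def next_control_point(last_code):
        """Return the control point that follows the one just completed."""
        codes = [code for code, _ in CONTROL_POINTS]
        i = codes.index(last_code)
        return CONTROL_POINTS[min(i + 1, len(CONTROL_POINTS) - 1)]

    print(next_control_point("SUR"))  # ('AIR', 'Annual Implementation Review')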

6.3 Monitoring Process and the Quality Control & Adjustment Points
The Division M&E System is composed of two major systems: the Monitoring Process
(discussed in Part 5) and the Quality Control & Adjustment Points. These two systems gather
different but related information.
The Monitoring Process represents the daily, weekly, monthly and quarterly efforts to track and improve the delivery of services to schools. The data, information and insights collected in this process are immediately used for adjustments, to solve issues and problems, and to ensure the implementation progress is on track within scope, time and cost.
[Figure 6-4 Division Monitoring Interface with Division Quality Control and Adjustment Points: the one-year Monitoring Process (process reviews/school visits, status reporting, review of SP proposals, contracting of SP, mid-point M&E, end-of-contract evaluation) runs alongside the Quality Control and Adjustment Points (Appraisal, Start Up Review, Annual Implementation Reviews, Outcome Evaluation).]

On the other hand, the Quality Control and Adjustment Points are major evaluation points set up to measure the achievement of outcomes, initial gains and major milestones (such as appraisal and start up). They use the data and information from the Monitoring Process to provide the background story about what happened, what transpired, and the factors that influenced the achievement of the major milestones.
Figure 6-4 illustrates the interaction between the two systems.

6.4 Quality Control Point #1: SIP Appraisal (SA)


The Appraisal Process is one of the quality control mechanisms of the Division and a major
activity undertaken by the Division during the preparation of the DEDP. Mainly, it is a
planning activity designed to ensure quality plans both at the school and Division levels.
The SIP Appraisal has two major foci.
First, it is a quality control mechanism to ensure the correctness and fitness for use of the school plans. It provides the venue for the Division and schools to collaborate on strengthening school performance. Specifically, the Division, through the Division QMT, reviews the plan in terms of relevance, feasibility and sustainability as well as format and presentation. The Division provides suggestions to enhance the plan and increase its likelihood of success. The appraisal process ends when the Division accepts the SIP for implementation.
Second, the SIP Appraisal serves as a data collection activity. The appraisal process provides the Division with detailed information on the programs and projects of the schools, furnishing adequate information and insights on the type of technical support each school will need to successfully achieve the objectives in the SIP. These information and insights are inputs to the DEDP.
[Figure 6-5 Appraisal Process: input - schools prepare and submit their plans; process - the Division assists the schools in preparing a quality plan and accepts the plan; output - the Division prepares the DEDP programs and projects that will support the SIP, with the accepted plan as input to the Division plan.]

6.4.1 Guiding Principles


- A poorly prepared plan is the most common cause of inefficient implementation and non-attainment of desired objectives. The cost of revising or troubleshooting a wayward implementation is much higher than that of revising a plan at the preparation stage. It is therefore important to focus more attention and spend more time on ensuring the quality of plans, as shown in Figure 6-6.
- Schools will need all the assistance they can get in implementing SBM. The appraisal process is one of the most concrete modes of assistance the Division can provide to school heads. Guiding the schools in developing a quality plan increases the chances of an efficient and effective SBM implementation.
- Before proceeding to implementation, the schools need to satisfy stakeholders about the relevance and feasibility of the proposed programs and projects.
- An approved plan ceases to be a plan of the proponent alone. Once approved, the plan becomes a plan of both the proponent and the decision maker, who are now both accountable for its successful implementation.

[Figure 6-6 Importance of Planning]

6.4.2 Objectives
The SIP Appraisal Process is established to assist schools in the preparation of SIP. It is a
quality control mechanism of the Division that will assure relevance, feasibility and
sustainability of education programs and projects of the schools.
The main objective of this control point is to ensure the schools prepare a relevant and
implementable plan. Specifically, the Division conducts an appraisal of SIP to warrant the
following:
- the statement of problems and objectives is clear: the baseline situation and the desired situation are clearly explained and show a logical link
- SIP objectives and targets are specific, measurable and reasonable
- strategies and proposed programs and projects in the SIP are relevant, meaning there is a logical link between the baseline situation and the proposed strategies and programs to bring about changes or improvements in the situation. Relevance means the plan will be able to solve the problems of the school and/or the strategies match available opportunities
- the schools will be able to sustain their operations. The appraisal evaluates the capacity of the school to implement the SIP, the support of stakeholders, and the availability of fund sources that can back up and sustain school programs and projects
- there are enough details in the implementation plan to be used as input in the development of monitoring instruments. This means the milestones and targets are well defined, schedules and dates are clearly specified, and the required resources are distinctly identified and specified in the plan

[Figure 6-7 Appraisal Process Flow: 1. Initial Review; 2. Assess relevance; 3. Assess technical correctness of proposed programs and projects; 4. Feedback and revision; 5. SIP acceptance. Output: Accepted SIP. Next process: Start Up Review.]

6.4.3 Process Description


The Division will implement the following appraisal activities:
(1) Initial Review - check compliance with requirements. The first major step in the appraisal process is to check the completeness of information and compliance with the agreed SIP format. The objective of the compliance check is to ensure that information is complete before the SIP is handed over to the review team.
Refer to SIP Appraisal Checklist Item #1 Completeness of SIP.
(2) Assess the relevance of the SIP. Review the Rationale or Background Section and the Goal Chart or Objectives Section. This review establishes the relevance of the SIP by assessing the match between the baseline situation (problems, opportunities, strengths and weaknesses) and the desired future situation. There has to be an agreement between the school and the Division about the baseline situation and the desired future situation (including targets). When this requirement is satisfied, proceed to the next step. If relevance is not satisfied, do not proceed; return the SIP for revision.
Refer to SIP Appraisal Checklist Item #2 Relevance of the Plan.
(3) Assess the correctness of strategies. The appraisal, at this point, focuses on the feasibility of the strategies as outlined in the detailed implementation plan. This activity will include the review of the following:
    - Individual programs and projects proposed in the SIP. Examine the technical correctness of these programs and projects. The assessment includes identifying other alternatives that may produce better results in achieving the Outcomes.
    - Link between the desired future situation and the proposed programs and projects. Assess whether the proposed Outputs/Contributory Objectives are complete and necessary.
Refer to SIP Appraisal Checklist Item #3 Necessary and Adequate.
(4) Assess the feasibility of the plan. The appraisal shall not be limited to the review of the document but shall also include an assessment of the school's capacity to implement and sustain the plan:
    - Assess the capacity of the school to implement the SIP, including the programs and projects.
    - Assess the capacity of stakeholders to support the school in implementing the SIP strategies.
    - Review the costings and estimates. The Division QMT also reviews the assumptions and cost estimates presented in the SIP.
Refer to SIP Appraisal Checklist Items #4, 5 and 6.
(5) After appraising the relevance of the SIP and the technical correctness of the proposed programs and projects, the last item to review and enhance is the completeness of the Implementation Plan. This means checking that:
    - targets and milestones are clearly specified
    - activities are broken down to the desired level
    - the relationships of the activities (network) are logically sequenced
    - activities are assigned resources (human, material, equipment, etc.)
    - activities are specified on a monthly (not quarterly) period
    - a Gantt or bar chart shows the activities on a time scale (months)
    - budgetary requirements are specified per activity and per month
Refer to SIP Appraisal Checklist Item #7 Detailed Implementation Plan.
Table 6-1 Guide to SIP Appraisal provides a more detailed listing of the areas to consider in reviewing and enhancing the SIPs. It also contains possible decisions or actions the Division may take in different scenarios.
(6) Feedback and Revision. After a thorough review of the SIP, the Division QMT will provide feedback to the schools on areas for enhancement. The school is expected to revise and/or enhance the SIP and submit it back for acceptance. Specifically, this activity will include the following:
    - Write recommendations (alternative interventions) and suggestions on the implementation plan.
    - Communicate findings and provide next-steps instructions to schools.
(7) Acceptance. After satisfactorily complying with the requirements, the Division QMT
endorses the SIP to the SDS for acceptance.
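A minimal sketch of the appraisal steps as sequential gates: each gate either passes the SIP onward or returns it for revision, mirroring the proceed/return logic above. The gate names follow the steps; the check functions and the SIP record are placeholders.

    def appraise(sip, gates):
        """Run the SIP through each gate in order; stop at the first failure."""
        for name, check in gates:
            if not check(sip):
                return f"Return SIP for revision: failed '{name}'"
        return "Endorse SIP to SDS for acceptance"

    gates = [
        ("completeness", lambda s: s.get("complete", False)),
        ("relevance",    lambda s: s.get("relevant", False)),
        ("correctness",  lambda s: s.get("strategies_sound", False)),
        ("feasibility",  lambda s: s.get("feasible", False)),
        ("detail",       lambda s: s.get("plan_detailed", False)),
    ]

    sip = {"complete": True, "relevant": True, "strategies_sound": True,
           "feasible": True, "plan_detailed": False}
    print(appraise(sip, gates))  # fails at the 'detail' gate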
Table 6-1 Guide to SIP Appraisal

1. Completeness of Document
Focus: To ensure the SIP is complete in terms of data and information and that supporting documents are present.
Inquiry Areas:
- Is the SIP following or complying with the prescribed format?
- Are the data, information and assumptions used correct and valid?
- Are there supporting documents?
- Were the stakeholders involved in or did they participate in the preparation of the plan?
Possible Actions: Return the SIP when it does not comply with requirements. Proceed to the assessment of relevance when the SIP is deemed complete.

2. Relevance of the Plan (background/rationale/objectives section)
Focus: To determine if the desired objectives in the SIP match the needs and opportunities listed in the rationale or background.
Inquiry Areas:
- Are problems, needs and opportunities clearly described? Is there a supporting analysis?
- Do the objectives match the needs and opportunities identified?
Possible Actions: Return the SIP when the needs and opportunities identified are vague or when the objectives do not match the situations described. Proceed to the assessment of feasibility when relevance is clearly established or described.

3. Necessary and Adequate (Objectives, Outputs/Component/Implementation Plan)
Focus: To establish a direct link between the objectives and the proposed programs and projects in the SIP, and to provide other and better alternatives for achieving the desired objectives.
Inquiry Areas:
- Are the outputs/deliverables identified sufficient or adequate to achieve the desired objectives?
- Are the outputs/deliverables identified necessary to achieve the desired objectives?
- Are there better alternatives (outputs) available that will help achieve the desired objectives?
Possible Actions: If the proposed deliverables are inadequate and/or unnecessary to achieve the objectives, return the SIP for enhancements. Suggest better alternatives to the school. Proceed to the next appraisal area when the outputs are considered complete and necessary.

4. Capacity of School
Focus: To determine the capacity of the school to implement the proposed programs and projects in the SIP.
Inquiry Areas:
- Can the school head implement and manage the programs and projects in the SIP?
- Can the teachers deliver programs and projects efficiently and effectively?
- What are the capability building requirements (needed to implement the SIP) of the school head and teachers?
Possible Actions: If yes, proceed to the next appraisal area. If no, consider the school requirements in the DEDP; make sure technical assistance support to schools is incorporated in the DEDP.

5. Stakeholders Support
Focus: To determine the level of support the stakeholders can provide to schools.
Inquiry Areas:
- Are the stakeholders ready and willing to participate in and support the implementation of the plan?
- Are they capable?
Possible Actions: If yes, proceed to the next appraisal area. If no, consider the school requirements in the DEDP; make sure technical assistance support to schools is incorporated in the DEDP.

6. Resource Generation
Focus: To determine the feasibility of implementing the plan considering the cost requirements.
Inquiry Areas:
- Are the cost requirements reasonable?
- Are there other fund sources?
Possible Actions: If yes, proceed to checking the implementation plan. If no, can the Division assist in looking for fund sources? If not, downsize the plan.

7. Detailed Implementation Plan
Focus: To assess the implementability of the plan and to ensure the necessary elements are present in the plan (milestones and targets, resources, schedule and cost).
Inquiry Areas:
- Are the milestones and targets clear and correct?
- Is there a work breakdown of outputs/activities?
- Are target dates specified for each milestone?
- Are there resources and costs allotted for each milestone and activity? Is there a cash flow matrix?
Possible Actions: If yes, endorse the plan for acceptance. If no and major changes are required, return the plan for revision. If no and the required revisions are considered minor, accept the plan conditionally; the school is to submit a revised implementation plan before or during the start up stage.

6.4.4 Knowledge and Skills Requirements


Individuals who will be involved in the Appraisal Process need to have an adequate background and experience in planning and school operations. The following are some of the competencies required of an appraisal team:
- Understanding of the needs, problems and issues in the locality. This includes knowledge of the school programs and projects that succeeded and those that failed, and the historical performance of the school.
- Good understanding of community relationships, especially the stakeholders. This will help the appraiser make suggestions on how to maximize support from stakeholders.
- Good analytical skills, especially in problem analysis, opportunity analysis and appreciative inquiry.
- Actual experience using planning tools such as the goal chart (logframe), work breakdown structure, Gantt or bar chart and network chart in preparing a plan and in implementing the same.
- Last but not least, subject matter specialists who will assist in the review and enhancement of the schools' proposed programs and projects.
6.4.5 Process Output


The main output of the appraisal process is the accepted SIP. Specifically, the objectives, targets, outputs or deliverables described in the SIP represent the collective agreement of the school, the Division and the District. The accepted SIP becomes a plan of both the school and the Division; it will be known as the SIP Baseline Plan and will be used as the basis for monitoring and evaluation.
The SIP Appraisal Checklist, which contains the comments, findings and suggestions of the
Division QMT, will also be documented and archived for future reference.

6.4.6 Evaluation Tools and Techniques


The appraisal process may be reinforced by selected data gathering and analytical techniques. The objective is to get enough information (through triangulation) to help the QMT objectively review and enhance the SIPs for appraisal. These include, but are not limited to, the following:
- Document review. Most appropriate for checking the completeness and correctness of data and information.
- Interviews. Interviews may be conducted when the QMT needs to validate information that contradicts the documented information. Include interviews with teachers and school stakeholders, including learners.
- Panel review. The QMT may opt for a panel review that will allow both the QMT and the school head and teachers to raise and clarify questions.
Regardless of the techniques used, the main appraisal instrument is the Appraisal Checklist. The checklist itemizes the areas to be assessed and appraised during the review and serves as a guide for the Division QMT in implementing the Appraisal Process.
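As a hedged illustration of how checklist responses could be tallied, the sketch below uses the three response options from Table 6-2 (Yes / No / More Info Needed). The data shape and function name are assumptions, not a prescribed format.

    RESPONSES = ("Yes", "No", "More Info Needed")

    def summarize(answers):
        """Tally responses per area to show where revision or follow-up is needed."""
        summary = {}
        for area, items in answers.items():
            summary[area] = {r: sum(1 for v in items.values() if v == r)
                             for r in RESPONSES}
        return summary

    answers = {
        "Completeness of Document": {"Compliance to format": "Yes",
                                     "Endorsement": "More Info Needed"},
        "Relevance of the Plan":    {"Targets": "No"},
    }
    print(summarize(answers))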
Table 6-2 SIP Appraisal Checklist (1)

Each item is answered Yes, No, or More Info Needed, with space for remarks.

1. Completeness of Document
- Compliance to format: The SIP submitted followed the official SIP format.
- Endorsement: The SIP submitted contains the signatures of school stakeholders.
- Documentation: All sections/parts of the SIP are filled up.
- Situational Analysis: The problems, issues, needs and opportunities are clearly articulated in the background/rationale section.
- Facts: The data and information quoted in the SIP come from reliable or authoritative sources.
- Goal Chart: The goal chart is correctly formulated; the objectives and indicators are SMARTly formulated.
- Purpose level objectives: The SIP contains 4 purpose-level objectives, on enrollment, retention, completion and achievement.
- Implementation Plan: There is a detailed implementation plan containing information on the activities to be undertaken, the people responsible and the budget.
- Attachments complete: All supporting data, tables, graphs and other documents are accounted for.

2. Relevance of the Plan
- Vision Statement: The vision statement paints a picture of the future situation of the school.
- Situational Analysis: The problems, issues, needs and opportunities described in the SIP are real and based on sound analysis.
- Target Groups: The needs of different target groups are clearly identified.
- SIP Objectives and Targets: The objectives (in the goal chart) of the SIP are logically linked to the problems, issues, needs and opportunities described in the plan.
- Targets: Targets are reasonable and attainable.
- Support to National Programs: The attainment of objectives and targets supports the national programs and international commitments.

3. Necessary and Adequate
- Complete Outputs: The outputs (programs and projects) are adequate to achieve the objectives.
- Necessary: All the outputs listed are necessary to achieve the objectives.

4. Capacity of School
- School Based Management: The school head has adequate experience in managing school programs and projects.
- School Based Management: The school head is trained on SBM and other related matters.
- Teachers: All teachers are capable of implementing the programs and projects in the SIP.
- Teachers: There are enough preparations and training for teachers to handle the programs and projects in the SIP.

5. Stakeholders Support
- School Governing Council: The SGC is operational.
- Parents Community Teachers Association (PTCA): The PTCA is active and supportive of school programs and projects.
- Barangay/Municipal/City LGU: The LGU is actively involved in school programs and projects.
- Others: There are organizations operating in the area that are supportive of education and other related undertakings.

6. Resource Generation
- Budget: The total estimated cost required to implement the school programs and projects is reasonable.
- Fund Sources: Fund sources are available.
- Resource Mobilization: The school head is capable of generating financial support from different sources.

7. Detailed Implementation Plan
- Activities: The activities listed in the WFP are directly linked to the outputs/deliverables listed in the goal chart.
- Work and Financial Plan: There is a WFP, presented on a monthly basis.
- Targets and Schedules: Targets are plotted monthly.
- Cash Flow: Cash flow requirements are plotted monthly.
- Persons Responsible: There is an assigned individual per activity.
- Monitoring and Evaluation: M&E activities are reflected in the WFP, with resources assigned for M&E.

(1) Checklist to be used by the Division Quality Management Team to appraise the SIPs. Additional items may be added depending on the requirements and/or intent of the Division.
[Figure 6-8 Start Up Stage: transition from planning to implementation]

6.5 Quality Control Point #2: Start Up Review


Start Up refers to the preparatory activities undertaken by the Division and schools before fully implementing the programs and projects contained in the education plans. These are activities undertaken after the appraisal stage, where the plan is formally accepted. This is also known as the mobilization stage.
Among the Start Up related activities the Division and schools will implement are the following (a small responsibility-matrix sketch follows this list):
- Kick off meeting. A kick off meeting signals the start of the start up stage. The meeting brings together all the internal and key external stakeholders and serves as an orientation on the objectives and scope of the approved education plan. This ensures that management and staff have the same understanding of the targets and objectives of the plan and that there is agreement or consensus on the strategies and activities to be undertaken.
- Revise or update the plan. Based on the comments and suggestions at the appraisal stage, the plan is adjusted or enhanced. Enhancements are made in targets, strategies, events and the allocation of resources.
- Assign individuals to tasks and deliverables. A significant activity at this stage is the mobilization of the individuals who will be assigned to perform tasks and deliver outputs. It is important to make responsibilities clear to all. Vague responsibility assignments are often the major cause of conflict between units and individuals.
Four people named Everybody, Somebody, Anybody and Nobody worked together. An important Outcome needed managing, and Everybody was sure that Somebody would do it. Anybody could have done it, but Nobody actually did it. Somebody got angry, because it was really Everybody's job. Everybody thought Anybody could do it, but Nobody realized that Somebody wouldn't. As it turned out, Everybody blamed Somebody when Nobody did what Anybody could have done!

- Author Unknown

- Advocacy and resource mobilization. One of the important start up activities is the advocacy work, especially in generating support and/or resources from stakeholders.
- Prepare a status report. The report prepared at this stage will serve as the inception report.
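As flagged in the assignment activity above, here is a small illustrative Responsibility Assignment Matrix check: every deliverable should map to an accountable person before implementation shifts to high gear. The deliverables and names are hypothetical.

    # Deliverable -> responsible person; None marks an unassigned deliverable.
    ram = {
        "Reading corner set up":   "Teacher A",
        "PTCA orientation":        "School Head",
        "Quarterly status report": None,
    }

    unassigned = [d for d, person in ram.items() if person is None]
    if unassigned:
        print("Assign owners before start up ends:", ", ".join(unassigned))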
The Start Up stage is also a sustainability mechanism. It involves setting up critical systems and rallying and mobilizing support for the plans. Among the mechanisms that must be set up at the start of implementation are the following:
(1) Participation of stakeholders. This refers to the stakeholders' understanding of the plan, especially the target benefits and improvements.
(2) Communication system. This includes setting up the mechanism for sharing and disseminating data and information throughout the organization. This will enable the Division, districts and schools to:
    - coordinate efforts more efficiently, thus avoiding duplication
    - gain up-to-date information about the status of implementation, including issues and problems, and make timely corrective actions
    - know about the policies and directions of the organization in order to synchronize decisions and actions at their level
(3) Monitoring, Evaluation and Adjustment system. Plans are best estimates of the future. However, even a well-written plan will never be able to predict the future situation in exact detail. At the start of the implementation, therefore, the mechanism for tracking, analyzing and adjusting the implementation plan should already be in place.
The inability to set up the critical mechanisms during start up and the failure to implement the mobilization activities often lead to implementation difficulties and inefficiency. Experience shows that misunderstandings about the scope of the plan and about the roles and responsibilities of individuals could have been avoided had honest-to-goodness start up activities been undertaken. Recurring problems manifested through delays, cost overruns, poor quality of services and non-achievement of targets and outcomes are often traced to activities related to scope and role clarification and to the setting up of systems that facilitate information sharing and decision making.
In order to minimize, if not eradicate, implementation problems, the Start Up Review Process is installed as one of the quality control mechanisms in the Division M&E System.

6.5.1 Guiding Principles


- Start up activities are critical to sustaining an efficient implementation. Therefore, start up activities should be planned and allocated resources. And if start up is an important part of the implementation phase, it should also be quality assured.
- All stakeholders, internal or external, must be clear on what is to be achieved, how the outcomes will be achieved, and what their roles and responsibilities are. The commitment of stakeholders should be secured at the start of the implementation.
- Recurring problems are symptoms of missing management systems or poorly installed mechanisms. To prevent or avoid these, effort must be spent on ensuring the management systems are in place before shifting to high gear in the implementation. The start up activities serve as the system check.
- A good start increases the chances of successfully implementing a plan.

6.5.2 Objectives
The main objective of the Start Up Review process is to ensure the readiness of the schools to implement the SIP. Readiness is established when the school has implemented the necessary mobilization activities and has set up the critical management systems that will sustain the implementation of the school's SIP.
Specifically, the Start Up Review will allow the Division and District to:
- Pinpoint schools that are ready to implement the SIP and schools needing assistance in jump-starting their plans. This will allow the Division and District to focus assistance on schools having difficulty launching their SIPs.
- Synchronize the Division M&E system with the school M&E system. At this stage, the Division is also initiating its DEDP implementation.
- In the case of the Division's alternative learning programs, undertake start up activities to ensure the readiness of the accredited service providers to implement the Basic Literacy Program and the A&E Program.
[Figure 6-9 Start Up Review Process Flow: 1. Prepare for start up review; 2. School visit; 3. Prepare documentation report. Output: school ready to implement.]

6.5.3 Process Description


The Start Up Review process is a three-month activity implemented immediately after the acceptance of the SIP and undertaken through school visits. In general, the Division reviews the start up related activities of the school and determines whether they comply with the standard process and requirements. Specifically, the start up review activities will include:
(1) Preparation for start up review. The Division QMT determines which schools will need more immediate assistance. As a rule of thumb, the focus will be on schools that had difficulty complying with the SIP appraisal (due to poor and vague plans); generally, the same group will have difficulty starting up.
(2) School visit. The focus of the school visit is to verify the activities undertaken by the schools after the acceptance of the SIP. Verification may be undertaken through:
(a) Interviews with the school head and teachers regarding the activities conducted related to start up, and discussion of the mobilization problems and issues encountered
(b) Review of the following documents:
    - SIP. To validate whether the school has incorporated into the plan the suggestions provided by the Division QMT
    - AIP. To determine whether the school has already firmed up its plan for the 1st year of SIP implementation
    - AIP Monitoring Sheet. To determine if the school has finalized the AIP Monitoring Sheet, which will be used later in tracking school progress
    - Responsibility Assignment Matrix. A document that shows the assignments of teachers and non-teaching staff
(c) Assist schools in the start up activities. While at the school, the Division QMT may assist the school head and staff in complying with the requirements of the Start Up process.
(3) Preparation of the Start Up process report. The QMT will prepare a short report on the start up assistance provided to schools, including a discussion of the problems and issues that will require interventions from the Division.

6.5.4 Knowledge and Skills Requirements


The Division and District staff who will comprise the start up review team must possess the following characteristics:
- Knowledgeable about the agreements made between the schools and the Division during the appraisal process
- Able to help the school install its M&E system
- Good project management skills, especially in the use of planning tools such as the work breakdown structure, network chart, Gantt or bar chart, and costing
- Good soft project management skills, including managing and motivating people, and negotiation skills

6.5.5 Process Output


The main report is the Division Start Up Review Report. This report documents the schools' start up accomplishments, start up problems and difficulties, and the assistance provided by the Division to jump-start the SIP implementation.
The schools will also submit a status report on the implementation of their own start up activities.

6.5.6 Evaluation Tools and Techniques


A Start Up Review Checklist is developed to facilitate the process. The checklist contains the start up activities that must be implemented by the schools.
The checklist is reinforced using the following methods:
- Document review. Includes review of the following documents: AIP, School Inset Plan, School Monitoring Sheet, Responsibility Assignment Matrix and the School Inception Report (Quarterly Status Report).
- Interviews. Interviews of teachers and non-teaching staff on the start up related activities undertaken by the school, including their understanding of their roles and responsibilities.
- Process Review. Includes observation and review of the kick off meeting, negotiation and the setting up of the M&E system.

Table 6-3 Start Up Checklist

Each item is marked Yes, On Going, or Not Started, with space for remarks.

Revision/Enhancement of School Plan
1. The school has incorporated in the SIP or AIP the suggestions/recommendations and agreements from the appraisal process.
2. The 1st year AIP is already finalized, including targets.

Awareness of school plans and programs
3. Kick off meeting undertaken, attended by teachers, non-teaching staff and school stakeholders.
4. Teachers and non-teaching staff are already aware of their roles and responsibilities.

Advocacy
5. The school head has visited and oriented key stakeholders about the support needed by the school.

School Inset
6. The school has a capability building plan for teachers and non-teaching staff.

School M&E
7. AIP Monitoring Sheet already updated and ready for use.
8. Orientation on School M&E completed: teachers and non-teaching staff are already aware of the reporting requirements and performance parameters of the school.
9. Inception Report, representing the quarterly status report of the school, completed and ready for submission.
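A hedged sketch of scoring this checklist: a school can be treated as ready to implement only when every requirement is marked Yes. The statuses mirror the checklist columns; the items and data are illustrative.

    STATUSES = ("Yes", "On Going", "Not Started")  # allowed checklist marks

    def readiness(checklist):
        """List pending requirements; a school is ready when none remain."""
        pending = [item for item, status in checklist.items() if status != "Yes"]
        return "ready" if not pending else "not ready - pending: " + ", ".join(pending)

    checklist = {
        "SIP/AIP revised per appraisal":    "Yes",
        "1st year AIP finalized":           "Yes",
        "Kick off meeting held":            "On Going",
        "Roles and responsibilities known": "Yes",
    }
    print(readiness(checklist))  # not ready - pending: Kick off meeting held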
6.6 Quality Control Point #3: Annual Implementation Review (AIR)


The Annual Implementation Review (AIR) is a quality control mechanism conducted after one year of implementation. The review compares the actual achievements and accomplishments of the Division, districts and schools against the achievements and accomplishments targeted in the plan. The AIR is used as a major adjustment point for the strategies and activities of both the Division and the schools in the next implementation year.
[Figure: Division and District M&E System, Annual Implementation Review]
6.6.1 Guiding Principles


- It is difficult to swallow an elephant whole; it is best to chop the elephant into pieces. Managing the implementation per segment allows more control over the quality of services and products.
- Is the implementation on the right track? Continuous improvement will work only when there is regular review or evaluation of implementation.
- Monitoring and evaluation does not wait for problems to occur. It provides a systematic and proactive venue for avoiding and mitigating issues and problems.
- Recurring problems are symptoms of a poorly functioning monitoring system. A regular review is undertaken to solve the problem once and for all and prevent it from happening again.

6.6.2 Objectives
The annual review is undertaken to assess the initial gains generated after one year of implementation. It is a mechanism to track the achievement of outcomes (Division and school level) on a year-to-year basis. It is also a mechanism for assessing the efficiency of Division units, districts and schools in delivering the target outputs in the DEDP and SIP.
The Annual Implementation Review is designed to generate information and insights useful for continuous improvement and for solving recurrent problems. The Review also serves as a major adjustment point for the plans and programs of the Division and schools.
Specifically, the annual review will provide the Division with the following information:
- Programs and projects that produced positive and/or encouraging (initial) results. The review will enable the Division to reinforce programs and projects that work and to improve the design of programs that produced negative results.
- Technical assistance processes or practices that need further enhancement.
- Accomplishment to date. An annual review provides the overall status of accomplishment since implementation started.
- Factors that facilitated implementation as well as factors that adversely affected the delivery of services and assistance.

[Figure 6-11 Annual Implementation Review Process Flow: 1. Consolidate annual reports; 2. Analyze achievements and accomplishments; 3. Assess implementation (what went right and what went wrong); 4. Prepare next year plan; 5. Submit and document the next year plan. Output: Annual Plan. Next process: Mid-Term Review.]

6.6.3 Process Description


The Review is undertaken after one implementation year.
(1) Consolidate reports. At the end of the year, the schools and the Division complete their annual reports. A school annual report will contain the achievements (purpose level, per the SIP) and the accomplishments (per the AIP) of the school. All school reports will be used as input to the preparation of the Division Annual Report.
(2) Analyze achievements and accomplishments. This is a pre-assessment (workshop) activity. The main task is to sort out the achievements and accomplishments of the schools, community learning centers, districts and Division and put them into one cohesive document called the Division Report Card.
To facilitate the analysis, an AIR Implementation Guide is developed. It contains the important areas to analyze in an annual review and provides process questions that will help the QMT analyze and formulate recommendations for the next implementation year.
(3) Assess implementation. Using the outputs of activities 1 and 2, the Division QMT will
conduct a one- to two-day assessment workshop. This will be attended by school
heads, district staff and Division staff. The objective of the workshop is to assess and
identify factors that facilitated the achievements and accomplishments and to
collectively identify issues and external factors that contributed to difficulties in
implementation.
Depending on the requirements, size and other factors, the assessment may be
undertaken through one of several options:
- Division-wide assessment and planning workshop. The Division conducts
only one workshop, attended by all schools, districts and Division units.
- Division-wide assessment and planning workshop by level. Similar to the
option above, but split into separate workshops for elementary and secondary schools.
- Assessment per district or cluster. Simultaneous conduct of assessment and
planning workshops, facilitated by Division QMTs.
The assessment workshop will focus on the following areas:
- Year-end accomplishment as per AIP and DAP. The workshop needs to identify
the factors that contributed to, and the factors that hindered, the efficient
implementation of the plans.
- Initial gains in school outcomes. The assessment will include discussion
and analysis of the school performance indicators (enrollment, retention,
completion and achievement). During the workshop, discussion will focus on
programs and projects to continue, documentation of lessons learned and
propagation of effective practices.
- Performance of teachers and school heads.
- Accomplishment in ALS programs. The assessment includes the performance of
community learning centers, service providers, facilitators and instructional
managers.
- Division operations. Assessment of the Division's application of processes
(standards) and practices.
The results of the assessment will be used as input to the finalization of the Division
Annual Report and to the preparation of the next Division Annual Plan.
(4) Prepare next year's implementation plan. Using the results of the assessment, the
Division and the schools will revisit the DEDP and SIP to assess whether these need
any adjustment.
(5) Document next year's implementation plans. The DAP and AIPs will be documented
and used as the basis for monitoring implementation progress in the next year.

6.6.4 Knowledge and Skills Requirements

The Division and District staff who will spearhead the annual review process must possess the
following characteristics:
- have been involved in and/or are knowledgeable about the Division programs and
projects, as well as the issues and concerns affecting school and community learning
center operations
- have a basic command of planning tools and techniques such as the logframe, work
breakdown structure, network chart, Gantt or bar chart and costing techniques
- are competent in progress monitoring and evaluation and understand the
concepts of physical accomplishment, scope management, scope creep, the S-curve
and initial gains
- can write technical reports
- have computing skills, especially in the use of word processing and spreadsheet software

6.6.5 Process Outputs

The major outputs of the Annual Implementation Review are:
- Division Annual Report. Contains the achievements and accomplishments of the
Division in one year. The report also discusses the factors that helped
implementation and the issues and difficulties experienced by the Division,
districts and schools.
- Division Report Card. An end-of-year document that provides a comprehensive
picture of the Division's performance. It contains information about the Division,
including Goal-level (school performance) and Outcome-level
(performance of school heads, teachers, instructional managers and facilitators)
indicators.
- DEDP Annual Plan (next year). Based on the achievements and accomplishments
of the previous year, the Division prepares or adjusts the DAP.
- School Annual Implementation Plan (next year).

6.6.6 Evaluation Tools and Techniques

The following are some of the M&E tools to aid the implementation of the Annual
Implementation Review (AIR):

(1) Line of Balance or S-curve. This tool provides an overall status of accomplishment.
It shows, in a diagram, the actual accomplishments of the Division and schools
versus the outputs targeted in the plans (a minimal sketch follows).
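
As an illustration, the sketch below computes the cumulative planned versus actual outputs that an S-curve plots. It is a minimal sketch in Python; the quarterly figures and the quarterly reporting period are hypothetical assumptions, not prescribed by this manual.

```python
# Minimal S-curve sketch: cumulative planned vs. actual outputs per
# quarter. All figures are hypothetical illustrations.

from itertools import accumulate

planned_per_quarter = [10, 25, 40, 25]   # target outputs per quarter (AIP)
actual_per_quarter  = [8, 20, 35, 22]    # outputs actually delivered

planned_cum = list(accumulate(planned_per_quarter))
actual_cum = list(accumulate(actual_per_quarter))

for q, (plan, actual) in enumerate(zip(planned_cum, actual_cum), start=1):
    slippage = plan - actual
    pct = 100.0 * actual / plan
    print(f"Q{q}: planned={plan:3d} actual={actual:3d} "
          f"({pct:5.1f}% of plan, slippage={slippage})")
```

Plotting the two cumulative series against time yields the familiar S-curve; the slippage figure shows how far implementation lags the plan at each point.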
(2) Segmentation Techniques. This is a technique used to understand and gain insights
from target groups. Segmentation is a process of identifying and grouping schools
based on school characteristics and accomplishments. The main objective of
segmentation is to get to know the schools better in order to customize or fit the
Division's technical assistance to the requirements of the school.
Specifically, the segmentation technique will allow the Division to compare similar
schools (the same characteristics) and schools from different groups (different
characteristics). This approach will facilitate the monitoring of schools and allow the
Division to determine the unique needs, problems and requirements of schools
belonging to the same segment.
The following groupings will be used (see the sketch after this list):
(a) school characteristics (sample only)
- type: science, vocational, national high school
- location: upland, urban, rural
- facilities: high classroom need, medium classroom need, low classroom need
- leadership: schools headed by principal 2, principal 1, TIC
- teacher to learner ratio: high, medium and low
(b) school performance (sample only)
- enrollment: decreasing, increasing, stable
- retention: high, medium, low
- completion: high, medium, low
- achievement: 75 MPS and above, 50-74 MPS, below 50 MPS
- SBM practice: beginner, mature
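
To illustrate how the groupings above can be operationalized, the following minimal Python sketch builds segments from a typology key and a performance band. The school records, field names and the three MPS bands mirror the sample groupings but are otherwise hypothetical.

```python
# Minimal segmentation sketch: schools (hypothetical records) are grouped
# by a typology key built from their characteristics, so performance can
# be compared within and across segments.

from collections import defaultdict
from statistics import mean

schools = [
    {"name": "School A", "location": "urban",  "leadership": "principal 2", "mps": 78},
    {"name": "School B", "location": "rural",  "leadership": "TIC",         "mps": 52},
    {"name": "School C", "location": "upland", "leadership": "principal 1", "mps": 47},
    {"name": "School D", "location": "rural",  "leadership": "TIC",         "mps": 61},
]

def mps_band(mps):
    """Bands follow the sample grouping: 75+, 50-74, below 50 MPS."""
    if mps >= 75:
        return "75 and above"
    return "50-74" if mps >= 50 else "below 50"

segments = defaultdict(list)
for s in schools:
    segments[(s["location"], s["leadership"], mps_band(s["mps"]))].append(s)

for key, members in segments.items():
    avg = mean(s["mps"] for s in members)
    print(key, "->", [s["name"] for s in members], f"avg MPS={avg:.1f}")
```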
(3) AIR Implementation Guide. A guide for QMT members on how to go about the
process of implementing the AIR.

Table 6-4 AIR Implementation Guide

A. Status of Implementation

Area to Review: Implementation of AIP
Performance Measure: Actual targets accomplished versus plan
Process Questions: What school practices facilitated SIP/AIP implementation? What factors hindered the achievement of targets?
Decision Points: Identify practices to reinforce; address factors hindering implementation.

Area to Review: Implementation of DAP
Performance Measure: Actual targets accomplished versus plan
Process Questions: What Division practices facilitated DEDP/DAP implementation? What factors hindered the accomplishment of targets?
Decision Points: Continue practices contributing to efficient operations; document and address factors hindering efficiency.

B. Initial Gains/Results (Learners)

Area to Review: Enrollment
Performance Measure: % increase in enrollment, Year 1 to Year 2
Process Questions: What factors led to the increase/decrease in enrollment? What programs and projects of the schools contributed to increasing enrollment? What Division programs and projects led to the increase in participation rate?
Decision Points: Document effective/best practices; document lessons learned; identify programs and projects to enhance.

Area to Review: Retention/Completion
Performance Measure: Decrease in drop out rate; decrease in school leavers rate; increase in retention rate; increase in completion rate
Process Questions: What factors led to the improvement in the retention rate? What programs and projects implemented by the schools contributed to the improvement in retention rate?
Decision Points: Document effective/best practices; document lessons learned; identify programs and projects to enhance.

Area to Review: Achievement
Performance Measure: Increase in MPS (NEAT) for Grade 6; increase in MPS (NAT) for 2nd year and 4th year
Process Questions: What factors led to the increase/decrease in the MPS (NAT) of schools? What programs and projects of the schools contributed to the improvement in the MPS?
Decision Points: Document effective/best practices; document lessons learned; identify programs and projects to enhance.

C. Initial Gains/Results (Performance of School Heads and Teachers)

Area to Review: School Based Management
Performance Measure: Improvement in the skills of school heads in managing school operations
Process Questions: In what management areas are the school heads strong, and where are they weak? What systems, processes, programs and projects of the Division contributed to the enhancement of school heads' skills?
Decision Points: Assess training programs; align training assistance to areas where school heads are weak; identify Division practices and systems to enhance; reinforce systems that generate positive feedback.

Area to Review: Content
Performance Measure: Mastery of teachers per subject area
Process Questions: In what subject matter are the teachers strong, and where are they weak?
Decision Points: Staff development programs to continue and areas to focus on.

Area to Review: Teaching Skills
Performance Measure: Improvement in the teachers' teaching skills
Process Questions: In what teaching skills are the teachers strong, and where are they weak?
Decision Points: Staff development programs to continue and areas to focus on.

D. Initial Gains (Alternative Learning System)

Area to Review: Performance of service providers
Performance Measure: Learners under BLP achieved 100% of the core competencies in reading, writing and numeracy; 50% mastery of the core competencies under the A&E program
Process Questions: What practices of the service providers contributed to or hindered the achievement of targets? What assistance or Division programs contributed to the achievement of targets?
Decision Points: Contracts to extend; lessons learned and effective practices to continue.

Area to Review: Community learning centers
Performance Measure: Learners under BLP achieved 100% of the core competencies in reading, writing and numeracy; 50% mastery of the core competencies under the A&E program
Process Questions: What practices of the service providers contributed to or hindered the achievement of targets? What assistance or Division programs contributed to the achievement of targets?
Decision Points: Practices and programs to continue; programs to be reinforced.

Area to Review: Facilitators and Instructional Managers
Performance Measure: Demonstration of skills
Process Questions: What are the areas of strength and the areas for improvement?
Decision Points: Staff development programs to continue and areas to focus on.

E. Division Operations

Area to Review: Managing the Core Processes
Performance Measure: Compliance with standards
Process Questions: What is the maturity level of the Division in applying the processes?
Decision Points: Identify systems or processes to enhance; train Division and District staff on the processes.
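
By way of illustration, the sketch below shows the kind of year-to-year comparison that feeds the Initial Gains/Results rows of the guide: compute the change in each indicator, then route the QMT to the matching decision point. The indicator names and figures are hypothetical.

```python
# Minimal sketch of the year-over-year comparison behind the "Initial
# Gains/Results" rows of the AIR guide. Figures are hypothetical.

def pct_change(y1, y2):
    """Percent change from year 1 to year 2."""
    return 100.0 * (y2 - y1) / y1

indicators = {"enrollment": (820, 861), "retention_rate": (88.0, 91.5),
              "completion_rate": (74.0, 73.1)}

for indicator, (y1, y2) in indicators.items():
    change = pct_change(y1, y2)
    verdict = "document practices" if change > 0 else "probe hindering factors"
    print(f"{indicator}: Y1={y1} Y2={y2} change={change:+.1f}% -> {verdict}")
```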

[Diagram: Division Quality Control and Adjustment Points]
Figure 6-12 Mid-Term Implementation Review

6.7 Control Point #4: Mid-Term Implementation Review

The Mid-Term Evaluation is undertaken during the last six months of the 1st cycle of SIP
implementation and in the 3rd year of DEDP implementation. It is one of two major
evaluation activities to be undertaken by the Division QMT in the six-year DEDP cycle.
The Mid-Term Evaluation focuses on the achievement of the Purpose-level objectives in the
SIP and the Outcome-level objectives in the DEDP. Specifically, the evaluation is designed
to measure the achievement of the following:
(1) SIP (Purpose-level objectives)
- Increase in enrollment
- Improvement in the retention rate
- Improvement in the completion rate
- Improvement in learner achievement
(2) DEDP (Outcome-level objectives)
- Improvement in the performance of school heads in school-based
management and instructional supervision
- Improvement in the performance of teachers in content and teaching skills

- Improvement in the SBM practices of schools
- Improvement in the learning environment

6.7.1 Guiding Principles

- Evaluation findings gathered only at the end of the implementation cycle will have
little or no use in improving efficiency and effectiveness; at best, such findings
serve as input to the next planning process. To be useful and to generate the
highest value for the organization, the evaluation should be conducted during the
implementation stage, when results can still be used to improve or enhance
efficiency and increase the likelihood of effectiveness.
- Initial evaluation results will serve as the major ingredients of continuous
improvement.

6.7.2 Objectives
The objectives of the Mid-Term Evaluation Review are to:
- evaluate how close the achievements and accomplishments are to the planned
objectives and targets
- assess the first 3 years of DEDP implementation to determine which programs and
projects should be continued or stopped
- document the effective practices and processes that contributed to the attainment of
initial gains and recommend that they continue to be applied in the next 3
years of implementation
- analyze the causes of problems and difficulties encountered and document these
as part of the lessons learned
- identify factors that may help sustain the initial gains.
Mainly, the results of the evaluation will be used as input to enhancing the implementation
strategies and the technical assistance to schools for the next three years.

Mid-Term Implementation Review Process Flow:
1. Consolidate 3-year SIP Completion Reports
2. Prepare evaluation design
3. Conduct initial gains evaluation
4. Prepare evaluation reports
5. Adjust DEDP
Output: Adjustment of DEDP for the next 3 years (proceeds to the Implementation Stage, next 3 years, and the next control point)
Control Point 4: Mid-Term Implementation Review
Next Process: Annual Implementation Review

Figure 6-13 Mid-Term Implementation Review Process Flow

6.7.3 Process Description

The Mid-Term Implementation Review will be implemented in 5 major activities:
(1) Prepare for the Mid-Term Review. Preparatory activities include the creation of the
evaluation team and the preparation of the evaluation design and the evaluation
implementation plan:
(a) Prepare the implementation plan
(b) Prepare the evaluation design
(c) Form and create the evaluation team.
Please see the Checklist for Mid-Term Implementation Review (Table 6-5).
(2) Review and/or document Division achievements. The review will provide data and
information to the Division QMT on the extent of initial gains or benefits achieved by
the Division, schools and community learning centers after 3 years.
Key sources of information for the review include the Basic Education Information
System (BEIS) and the Report Cards.
(3) Data Gathering. The main objective of this activity is to support the
achievements documented in reports with qualitative information that provides
the "stories" behind the numbers and percentages reported. The validation seeks
to document effective practices and draw lessons from failed undertakings.
The data gathering is divided into 3 major activities:

- Data gathering to validate the achievements and accomplishments.
Includes visits to schools and community learning centers and involves the
use of rapid appraisal techniques
- Perception survey to gather feedback from school stakeholders on the quality
of services provided by the school
- SBM Assessment (Level of Practice)
(4) Prepare the Mid-Term Implementation Report.
(5) Adjust plans. The results of the review will be used as input to the improvement of the
DEDP and SIP.
The results will give the Division insight into which programs and projects work and
which do not; where warranted, the strategies for the next 3 years are enhanced.
The results will also be used to help the schools prepare a better SIP for the 2nd cycle of
implementation.

6.7.4 Knowledge and Skills Requirements

The Division and District staff who will spearhead the mid-term implementation review
process must possess the following characteristics:
- have a basic command of planning tools and techniques such as the logframe, work
breakdown structure, network chart, Gantt or bar chart and costing techniques
- are competent in conducting benefits evaluation, including the use of rapid appraisal
techniques such as focus group discussions, interviews, key informant interviews,
transect walks, observation and inspection
- are knowledgeable about SBM and other issues and opportunities affecting school
operations
- can write technical reports
- have computing skills, especially in the use of word processing and spreadsheet software

6.7.5 Process Outputs

The major outputs of the Mid-Term Implementation Review are:
(1) Division Mid-Term Report. Contains the achievements and accomplishments of the
Division and schools after three years (after one SIP cycle). The report highlights the
initial results, improvements and changes that took place after support was provided
to schools in implementing the 1st SIP cycle. Specifically, the Mid-Term Report
contains the achievements of schools (using key performance indicators) and the
improvements in the competencies of school heads, teachers, facilitators,
instructional managers and non-teaching staff.
The report also discusses the factors that helped implementation and the
issues and difficulties experienced by the Division, districts, schools and
community learning centers.
The Division Mid-Term Report will draw information from the following:
- Division Report Card. An end-of-year document that provides a
comprehensive picture of the Division's performance. It contains information
about the Division, including Goal-level (school performance) and
Outcome-level (performance of school heads, teachers, instructional
managers and facilitators) indicators.
- Stakeholders' Perception Study. Contains the perceptions of parents, the community,
local government units and other local organizations on the quality of
education and the quality of services provided by the school to learners.
- SBM Level of Practice. Results of the SBM assessment conducted by the Division in
randomly selected schools.
The evaluation reports will be used as input to:
(1) DEDP Implementation Plan (next 3 years). Based on the initial achievements and
accomplishments, the Division makes adjustments to the DEDP.
(2) School Improvement Plan (next cycle). The evaluation results will be used as the basis
for the appraisal and enhancement of SIPs covering the next 3 years.

6.7.6 Evaluation Tools and Techniques

The following are some of the M&E tools to aid the implementation of the Mid-Term
Implementation Review (MTR):
(1) Rapid Appraisal Techniques. These are "not so quick and not so dirty" techniques
for gathering qualitative information about school achievements and performance.
Rapid appraisal gathers information that helps explain a phenomenon and
documents the practices of the schools (what was done and what was not
undertaken). It involves the use of different techniques in order to validate and
triangulate information, helping derive an unbiased view of the situation. Rapid
appraisal techniques include:
- Key informant interviews. Key informants are individuals who can provide
holistic and complete information about the schools. Interviews may also be
undertaken through a transect walk (walk-through) or through the use of a
questionnaire.
- Focus group discussion. Involves individuals grouped according to similar
characteristics and traits. A facilitator leads the discussion and draws
information from the participants. There are no right or wrong answers, but
the facilitator must see to it that the discussion stays focused and generates
the desired information from the participants.
- Inspection. An activity that validates the claims of individuals about
a practice or way of doing things.
- Actual observation. In order to document the actual practice or behavior,
actual observation is undertaken. This method helps validate the claims
made by key informants and FGD participants.
- Questionnaire. Predetermined questions are jotted down and used to
guide the interviews.
(2) Segmentation Techniques. This is a technique used to understand and gain insights
about target groups. Segmentation is a process of identifying and grouping schools
based on school characteristics and accomplishments. The main objective of
segmentation is to get to know the schools better in order to customize or fit the
Division's technical assistance to the requirements of the school.
Specifically, the segmentation technique will allow the Division to compare similar
schools (the same characteristics) and compare schools from different groups
(different characteristics). This approach will facilitate the monitoring of schools and
allow the Division to determine the unique needs, problems and requirements of
schools belonging to the same segment.
The following groupings will be used:
(a) school characteristics (sample only, to be developed further)
- type: science, vocational, national high school
- location: upland, urban, rural
- facilities: high classroom need, medium classroom need, low classroom need
- leadership: schools headed by principal 2, principal 1, TIC
- teacher to learner ratio: high, medium and low
(b) school performance (sample only, to be developed further)
- enrollment: decreasing, increasing, stable
- retention: high, medium, low
- completion: high, medium, low
- achievement: 75 MPS and above, 50-74 MPS, below 50 MPS
- SBM practice: beginner, mature
(3) SBM Assessment. The Division is going to assess the SBM practices of the schools
using the same SBM assessment tool the schools use for self-assessment. A team
of assessors from the Division and District will conduct the SBM assessment.
In order to maintain uniform application of criteria and an unbiased assessment of
school practice, the tool is reinforced with the consensus technique. Assessors do
not immediately render judgment about the school practice; instead, they jot down
notes and document the school practices as observed. These documented
observations are discussed by the team of assessors, and a consensus is reached
as to whether the school satisfies the level of practice (see the sketch below).
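
A minimal sketch of the consensus rule is given below. The assessor names, the three-member team and the simple unanimity criterion are illustrative assumptions; the actual discussion is a facilitated deliberation, not a mechanical tally.

```python
# Minimal sketch of the consensus technique: each assessor records the
# level of practice suggested by his or her notes, and a rating is
# accepted only when the team converges on one level. The unanimity rule
# is a simplifying assumption for illustration.

from collections import Counter

assessor_ratings = {
    "Assessor 1": "level 2",
    "Assessor 2": "level 2",
    "Assessor 3": "level 1",
}

tally = Counter(assessor_ratings.values())
if len(tally) == 1:
    print(f"Consensus reached: {next(iter(tally))}")
else:
    # No consensus yet: the team revisits the documented observations
    # and discusses until the ratings converge.
    print("No consensus:", dict(tally), "- discuss documented evidence")
```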
(4) Mid-Term Implementation Review Checklist. The Checklist is for use by the Division
QMT/evaluation team as a guide in the preparation, implementation and completion
of the Mid-Term Review.
The Checklist will not be used, in any way, to score or grade the performance of the
QMT; it serves only as a guide in the implementation of evaluation activities. Its main
objective is to ensure a smooth and efficient conduct of the Mid-Term Review.
The Checklist lists the activities, resources and reference documents
necessary for an efficient implementation of the Mid-Term Review. The QMT will
check "Yes" if the condition/question is complied with, "No" if it is not met, and
"More Information Needed" when an objective Yes or No response cannot be given
due to insufficient information. A simple tally (see the sketch below) shows at a
glance how ready the team is.
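
The following minimal Python sketch illustrates such a tally over a handful of hypothetical checklist answers; the item wordings are abbreviated from Table 6-5.

```python
# Minimal sketch of tallying checklist responses to gauge readiness for
# the Mid-Term Review. The sample answers are hypothetical.

answers = {
    "Approved MTR implementation plan": "Yes",
    "Resources available": "Yes",
    "Review can be finished in 3 months": "More Information Needed",
    "Approved budget": "No",
}

counts = {"Yes": 0, "No": 0, "More Information Needed": 0}
for item, answer in answers.items():
    counts[answer] += 1
    if answer != "Yes":
        print(f"Follow up: {item} -> {answer}")

total = len(answers)
print(f"Ready on {counts['Yes']}/{total} items; "
      f"{counts['No']} gaps, {counts['More Information Needed']} to verify")
```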

Table 6-5 Mid-Term Implementation Review Checklist
(Each item is marked Yes, No, or More Information Needed.)

1. Preparatory Activities
- Implementation Plan for Mid-Term Review: Is there an approved Mid-Term Review implementation plan?
- Resources: Are the needed resources available for use by the evaluators?
- Time Table: Can the review be finished in 3 months?
- Budget: Is there an approved budget allotted for the Mid-Term Review?
- QMT: Are the QMTs ready and available?
- Evaluators: Will there be enough evaluators from the Division and District to finish the evaluation in 3 months?
- Capability of the Evaluators: Have they been trained on results evaluation? On the perception survey? On SBM assessment?
- Availability of Evaluators: Are there enough QMT members/evaluators to simultaneously implement the outcome evaluation, perception survey and SBM assessment?

2. Evaluation Design
- Evaluation Design: Is there an evaluation design?
- Evaluation Design: Has it been discussed with the QMT and the members of the MTR team?
- Objectives: Are the objectives of the evaluation clear and SMART?
- Methods: Will different methods be used to triangulate and validate information?
- Sample: Have the schools to be visited already been selected?

3. Review of Achievements and Accomplishments
- Documents: Are the supporting documents complete? These include report cards, completion reports and accomplishment reports.
- Authoritative Source: Are the reference documents the most authoritative sources of data and information?
- Report Cards: Are the school report cards updated?
- Segmentation: Are the schools grouped based on predefined characteristics?
- Historical Data: Do the supporting documents provide at least 2 years of past data?
- Division Report Card: Is the Division Report Card updated?
- BEIS: Is the BEIS data complete and up to date?

4. Pre-Data Gathering Activities
- QMTs and Evaluators: Are all members of the QMTs and all evaluators oriented about the scope, strategies and implementation plan?
- Materials: Are the questionnaires and other evaluation paraphernalia ready?
- Vehicle: Are travel and transportation arrangements completed?
- Materials: Are the Data Processing Guide, questionnaires, interview guides and other evaluation paraphernalia ready?
- Data Gathering Guide: Has each member of the QMT and/or evaluating team been provided with a copy of the data gathering guide?

5. Actual Data Gathering
- Pre-Data Gathering Conference: Was the school head given an orientation on the review to be undertaken?
- Selection of Participants: Did the QMT and/or evaluation team select the teachers, non-teaching staff, learners and others in a random and transparent manner?
- Classroom Observation: Was the pre-observation, actual observation and post-observation process followed?
- Classroom Observation: Was the observation documented, and is the documentation phenomenological and non-judgmental?
- Focus Group Discussion: Was the discussion focused and limited to the assigned topics?
- Focus Group Discussion: Was there a tandem of facilitator and documenter?
- Focus Group Discussion: Was there a minimum of 5 participants per FGD?
- Inspection: Did a school representative accompany the evaluator during the inspection?
- Post-Data Gathering Conference: Did the QMT apprise the school of the activities undertaken and the next steps?

6. Perception Survey
- Stakeholders: Were the participants in the perception survey notified?
- Instruments: Are the questionnaires and other instruments to be used in the perception survey valid and reliable?
- Key Informants: Were key informants identified and interviewed?
- Triangulation: Were the perceptions of other stakeholders gathered in order to triangulate information and minimize informant bias?
- Processing of Findings: Are Division staff capable of using spreadsheet software and its special functions?
- Report: Were the results of the perception survey incorporated into the Mid-Term Report?

7. Consensus Building
- Encoding: Were the raw data gathered documented using word processing software?
- Consensus: Did the QMT/evaluators discuss the observations and raw data to come up with a consensus?
- Signed and Endorsed: Were the consensus results finalized and endorsed by the QMT/evaluators?

8. Evaluation Report
- Mid-Term Implementation Review Report: Is the MTR report endorsed by the members of the QMT/evaluators?
- Issues and Problems: Do the issues and problems raised or identified in the report have a sound basis, and are they backed up with data?
- Issues and Recommendations: Is there, for every major issue raised, a corresponding recommendation on how to mitigate the issue?
- Presentation of Findings: Were the findings presented clearly and objectively through graphs and diagrams?
- Recommendations: Does the report contain suggestions for next steps on how to improve or enhance future implementation?
- Feedback: Were the results and recommendations properly disseminated and communicated?

9. Adjustment of Plans
- Recommendations: Did the suggested next steps find their way into the implementation plan for next year?
- Issues and Problems: Were corrections/adjustments made in the plan in order to mitigate, if not solve, the issues and problems raised in the evaluation?
- Input to Appraisal: Were the evaluation findings and recommendations used as input to the appraisal of SIPs?

10. Knowledge Management
- Sharing of Information: Were the evaluation findings shared with and discussed by the Division and Districts?
- Improved Design of Programs/Projects: Were the evaluation findings used to enhance the design of Division programs and projects?
- Expectations from Education Supervisors: Are Division staff, especially education supervisors, knowledgeable about the results of the evaluation and the issues and problems?
- Access: Are the evaluation results made available and accessible?

[Diagram: Division Quality Control and Adjustment Points]
Figure 6-14 DEDP Wrap Up

6.8 Quality Control Point #5: Outcome Evaluation (OE)

The last major control point of the Division M&E System is the DEDP Outcome Evaluation.
Also known as results evaluation, it focuses on the achievement of the Goal-level and
Purpose-level objectives of the DEDP. This process is undertaken at the end
of the six-year DEDP implementation and after 2 cycles of SIP implementation.
Outcome Evaluation is undertaken in order to verify the achievement of the following:
(1) Achievement of the Division Goal
- Access, as measured in terms of participation rate and increase in enrollment
- Learners' stay in school, as measured by retention, drop out and completion rates
- Learners' achievement of the desired learning competencies, as measured by the
achievement tests
(2) Achievement of the Division Outcomes
- Reduced disparity in the performance of high-performing and low-performing
schools (retention, completion and achievement)
- Increased satisfaction of stakeholders with the delivery and quality of instruction in
schools
- Improvement in the SBM practices of schools
- Improvement in the competencies of teachers and school heads
- Improvement in the schools' learning environment
The scope of the Outcome Evaluation is detailed further in Table 6-6, Division M&E Framework.

Table 6-6 Division M&E Framework

Division Goal: Impact Indicators

Objective: Access. To ensure that all learners of school age are in school and are ready for school.
Performance Indicators: Increase in participation rate; increase in enrollment; learners entering the school system are ready.
Means of Verification: Enrollment Report; Division Report Card.

Objective: Retention. To ensure that learners who are in school will stay in school.
Performance Indicators: Increase in the number of learners retained in the school (retention rate); reduction in drop outs; reduction in school leavers.
Means of Verification: School Report Card.

Objective: Completion. To ensure that learners who are in school will complete the requirements of the primary and secondary levels.
Performance Indicators: Increase in the number of learners able to complete the basic education requirements; improved graduation rate.
Means of Verification: School Report Card.

Objective: Achievement. To ensure that learners demonstrate the necessary competencies at each level.
Performance Indicators: Improvement in the basic functional literacy skills of the learners; improvement in the academic performance of learners in all subject matter; improvement in social skills.
Means of Verification: Learner Report Card; Teacher Assessment; National Achievement Test (2nd Year); Regional Achievement Test (3rd Year).

Division Level Outcomes: Effectiveness Indicators

Objective: 1. Improved school performance.
Performance Indicators: Reduced disparity between high-performing and low-performing schools (in NEAT and NAT) by --- percent; reduced disparity in enrollment, drop out and completion rates between high-performing and low-performing schools; increase in the satisfaction of school stakeholders with the quality of instruction in the schools; improved SBM practice of schools.
Means of Verification: Division Report Card; Division Education Development Plan (DEDP); Perception Survey; SBM Assessment Result.

Objective: 2. Improved teacher performance.
Performance Indicators: Teachers demonstrate competencies in general content and subject-specific skills; teachers meet the desired competencies based on the NCBTS.
Means of Verification: Division Report Card and DEDP; Teachers' Performance Assessment Report; Assessment for Math and Science teachers.

Objective: 3. Improved school head performance.
Performance Indicators: School heads demonstrate competencies in school-based management and instructional supervision.
Means of Verification: Division Report Card and DEDP.

Objective: 4. Improved learning environment.
Performance Indicators: Teacher to learner ratio is 1:45; learner to textbook ratio is 1:1; teacher to teacher manual ratio is 1:1; teachers and learners have access to school equipment, science laboratories and other facilities; schools comply with the Standards of a Child-Friendly School.
Means of Verification: Division Report Card.

Division Intermediate Results: Leading Indicators

Objective: 1. Improved competencies of DepED Division and District staff in providing technical and management support to schools, community learning centers, school heads, teachers and facilitators.
Performance Indicators: Division and District staff demonstrate competencies in educational planning, curriculum management, instructional consultancy, training and development, and monitoring and evaluation.
Means of Verification: Division Report Card and DEDP; Results of Performance Assessment.

Objective: 2. Management and technical assistance systems are in place and operational.
Performance Indicators: Continuous improvement in the management and technical assistance processes of the Division.
Means of Verification: Quality Assurance Readiness Assessment Report.

6.8.1 Guiding Principles

- Connecting the dots. Evaluations are undertaken to determine how the interplay of
the programs and projects implemented and the influence of external factors resulted
in the realization or non-realization of the desired objectives. These same "dots" are
used in plotting the future.
- The goals, objectives and strategies in the plan provide the scope of the evaluation.
- Effectiveness is demonstrated through the target group. Effectiveness is measured in
terms of changes or improvements in the status of the target groups as a result of
the benefits derived from the programs and projects implemented.

6.8.2 Objectives
As an integral part of the process improvement mechanism, the objectives of the Outcome
Evaluation are to:
- measure the improvement in the performance of schools, school heads,
teachers, instructional managers, facilitators and the non-teaching staff of schools
- determine whether the Division programs and projects led to the achievement of
the Purpose-level objectives
- determine whether the achievement of the Purpose-level objectives led to the
attainment of the Goal-level objectives
- document factors that may have contributed to or hindered the achievement of the
Purpose-level and Goal-level objectives
- document and propagate the best practices and lessons learned in the six years of
implementation.
The findings and results of the Outcome Evaluation will be used as input to defining and
formulating the next cycle DEDP.

Outcome Evaluation Process Flow:
1. Prepare evaluation design
2. Review school achievements based on SIP
3. Conduct evaluation
4. Prepare evaluation reports
5. Prepare next cycle DEDP
Output: Next Cycle DEDP
Control Point 5: Outcome Evaluation
Next Process: Annual Implementation Review

Figure 6-15 Outcome Evaluation Process Flow

6.8.3 Process Description

The process of conducting the Outcome Evaluation is similar to the process used in the
Mid-Term Evaluation Review. The only differences are the context of the evaluation and the
utilization of the evaluation results.
The Outcome Evaluation will be implemented in 5 major activities:
(1) Prepare for the Outcome Evaluation. Preparatory activities include the creation of the
evaluation team and the preparation of the evaluation design and the evaluation
implementation plan:
- Prepare the implementation plan
- Prepare the evaluation design
- Form and create the evaluation team.
(2) Review and/or document Division achievements. Using the Basic Education
Information System (BEIS) and the Report Cards, the Division will document the
achievement of the Goal and Purpose level objectives in the DEDP.
(3) Data Gathering. The task is to validate the documented achievements and to
document how these achievements and accomplishments were attained.
Specifically, the focus of the validation is to document the processes, practices and
other factors that contributed to the realization of the DEDP objectives. This includes
gathering data and information at the Division, district, school, community
learning center and community levels, and building consensus.
The data gathering is divided into 3 major activities:
- Data gathering to validate the achievements and accomplishments.
Includes visits to schools and community learning centers and involves the
use of rapid appraisal techniques
- Perception survey to gather feedback from school stakeholders on the quality
of services provided by the school
- SBM Assessment (Level of Practice)
(4) Prepare the DEDP Terminal Report. The terminal report describes the situation at the
Division level after six years of implementation. It describes the status of the schools
(using the school performance indicators) and provides a comparative assessment
of performance, "before and after" and between and among school groups.
Specifically, the terminal report will contain the following information:
- Achievement of the DEDP Goal and Purpose level objectives
- Major accomplishments, challenges encountered and how these were
solved or mitigated
- Effective practices and lessons learned
- Analysis of current issues, problems and opportunities
The Terminal Report is the main reference document in the preparation of the next
Division plan.
(5) Prepare the next cycle DEDP. Using the Terminal Report as the main reference, the
Division formulates the DEDP for the next six-year cycle.

6.8.4 Knowledge and Skills Requirements

The Division and District staff who will form part of the Outcome Evaluation team must
possess the following characteristics:
- have a basic command of planning tools and techniques such as the logframe, work
breakdown structure, network chart, Gantt or bar chart and costing techniques
- are competent in conducting benefits evaluation, including the use of rapid appraisal
techniques such as focus group discussions, interviews, key informant interviews,
transect walks, observation and inspection
- can write technical reports
- have computing skills, especially in the use of word processing and spreadsheet software
- are objective and fair

6.8.5 Process Outputs

The major outputs of the Outcome Evaluation are:
(1) DEDP Terminal Report. Contains the achievements and accomplishments of the
Division and schools after six years of implementing the DEDP and providing support
to schools. The report highlights the achievements (Goal and Purpose level) and
accomplishments of the Division.
The Terminal Report provides the situationer at the Division level six years after
implementing the DEDP. It also contains a detailed analysis of the factors that helped
implementation and a discussion of the issues and difficulties experienced by the
Division, districts, schools and community learning centers. It is a documentation of
the Division's best practices and lessons learned.
The DEDP Terminal Report will draw information from the following:
- Division Report Card. An end-of-year document that provides a
comprehensive picture of the Division's performance. It contains information
about the Division, including Goal-level (school performance) and
Outcome-level (performance of school heads, teachers, instructional
managers and facilitators) indicators.
- Stakeholders' Perception Study. Contains the perceptions of parents, the community,
local government units and other local organizations on the quality of
education and the quality of services provided by the school to learners.
- SBM Level of Practice. Results of the SBM assessment conducted by the Division
in randomly selected schools.
The Terminal Report will be used as:
(1) input to the DEDP for the next 6 years. The Terminal Report is the main input to the
preparation of the next cycle DEDP.
(2) reference material in the appraisal of SIPs.


6.8.6 Evaluation Tools and Techniques

The following are some of the M&E tools and techniques to aid the implementation of the
Outcome Evaluation (OE):
(1) Rapid Appraisal Techniques. These are "not so quick and not so dirty" techniques
for gathering qualitative information about school achievements and performance.
Rapid appraisal gathers information that helps explain a phenomenon and
documents the practices of the schools (what was done and what was not
undertaken). It involves the use of different techniques in order to validate and
triangulate information, helping derive an unbiased view of the situation. Rapid
appraisal techniques include:
- Key informant interviews. Key informants are individuals who can provide
holistic and complete information about the schools. Interviews may also be
undertaken through a transect walk (walk-through) or through the use of a
questionnaire.
- Focus group discussion. Involves individuals grouped according to similar
characteristics and traits. A facilitator leads the discussion and draws
information from the participants. There are no right or wrong answers, but
the facilitator must see to it that the discussion stays focused and generates
the desired information from the participants.
- Inspection. An activity that validates the claims of individuals about
a practice or way of doing things.
- Actual observation. In order to document the actual practice or behavior,
actual observation is undertaken. This method helps validate the claims
made by key informants and FGD participants.
- Questionnaire. Predetermined questions are jotted down and used to
guide the interviews.
(2) Segmentation Techniques. This is a technique used to understand and gain insights
about target groups. Segmentation is a process of identifying and grouping schools
based on school characteristics and accomplishments. The main objective of
segmentation is to get to know the schools better in order to customize or fit the
Division's technical assistance to the requirements of the school.
Specifically, the segmentation technique will allow the Division to compare similar
schools (the same characteristics) and schools from different groups (different
characteristics). This approach will facilitate the monitoring of schools and allow the
Division to determine the unique needs, problems and requirements of schools
belonging to the same segment.
The following groupings will be used:
(a) school characteristics (sample only, to be developed further)
- type: science, vocational, national high school
- location: upland, urban, rural
- facilities: high classroom need, medium classroom need, low classroom need
- leadership: schools headed by principal 2, principal 1, TIC
- teacher to learner ratio: high, medium and low
(b) school performance (sample only, to be developed further)
- enrollment: decreasing, increasing, stable
- retention: high, medium, low
- completion: high, medium, low
- achievement: 75 MPS and above, 50-74 MPS, below 50 MPS
- SBM practice: standard, progressive, mature
(3) SBM Assessment. The Division is going to assess the SBM practices of the schools
using the same SBM assessment tool the schools use for self-assessment. A team
of assessors from the Division and District will conduct the SBM assessment.
In order to maintain uniform application of criteria and an unbiased assessment of
school practice, the tool is reinforced with the consensus technique. Assessors do not
immediately render judgment about the school practice; instead, they jot down notes
and document the school practices as observed. These documented observations are
discussed by the team of assessors, and a consensus is reached as to whether the
school satisfies the level of practice.
(4) Checklists. Two checklists support the Outcome Evaluation:
- Pointers for Outcome Evaluation (Table 6-7)
- The same Checklist (Table 6-5) used in the Mid-Term Implementation Review,
which is adopted for the Outcome Evaluation process.

Table 6-7 Pointers for Outcome Evaluation
(Each performance area is answered Yes, No, or Same, then probed using the process questions.)

1. Participation Rate
- Is there an increase in the participation rate?
  Process questions: What programs and projects contributed to the increase in the participation rate? What external factors contributed to the increase/decrease in the participation rate?
- Is the targeted participation rate achieved?
  Process questions: If yes, to what Division programs and projects can this be attributed? If no, what hindered the improvement in the participation rate?
- Is the Division participation rate better than the average within the region?
  Process questions: If yes, what unique programs and projects contributed to this? If no, to what factors can this be attributed?
- Is the Division participation rate higher or better than the national average?
  Process questions: If yes, what unique programs and projects contributed to this? If no, to what factors can this be attributed?

2. Retention Rate & Completion Rate
- Is there an increase in the retention rate? A decrease in the drop out rate?
  Process questions: What programs and projects of the Division contributed to the increase in the retention rate? At what grade/year level is the incidence of dropping out highest?
- Are the retention rate and drop out rate of the Division higher than the regional average?
  Process questions: To what factors and/or programs and projects can these be attributed?
- Are the Division retention rate and drop out rate higher than the national average?
  Process questions: To what factors and/or programs and projects can these be attributed?
- Is there a decrease in the number of schools with increasing drop out rates?
  Process questions: If yes, what school programs and projects were implemented? If no, to what factors can this be attributed?
- Is there an increase in the completion rate?
  Process questions: What programs and projects of the Division contributed to the increase in the completion rate?

3. Achievement
- Has the performance of Grade 6 learners improved over the last 5 years?
  Process questions: If yes, to what programs and projects can this be attributed? If no, to what internal and external factors can this be attributed?
- Has the performance of 2nd year high school learners improved over the last 5 years?
  Process questions: If yes, to what programs and projects can this be attributed? If no, to what internal and external factors can this be attributed?
- Is the targeted learner achievement attained by most of the schools?
  Process questions: If yes, how was this achieved? What practices must be continued? If no, what lessons can be drawn from the efforts or interventions provided that should be improved or not repeated?
- Is the Division performance on achievement higher than the regional average (elementary and high school)?
  Process questions: If yes, what factors allowed the Division to perform above the regional average? If no, what can be learned from other Divisions?
- Is the Division performance on achievement higher than the national average (elementary and high school)?
- Is there a significant number of low-performing schools whose performance improved from low to average or high?
  Process questions: If yes, document their practices and the Division programs and projects that helped improve performance.

4. Alternative Learning Programs
- Has the participation of out-of-school youth in the Division's alternative learning programs increased over the last 5 years?
  Process questions: If yes, probe the phenomenon behind the increase in participation. What Division programs contributed to the improvement in participation? If no, probe why.
- Is there an increasing trend in the number of A&E passers?
  Process questions: If yes, to what factors can this be attributed?
- Are the CLCs meeting the standards of the Division?
- Are the competencies/performance of facilitators and instructional managers improving?

5. Stakeholders' Perception
- Are the learners' perceptions of the teaching and learning process improving?
  Process questions: What areas of the teaching and learning process gained positive responses, and what areas garnered negative responses?
- Is there an improvement in the stakeholders' perception of the quality of education in the Division (as compared to 3 years ago)?
  Process questions: In what areas or services did the school improve: facilities, teachers, school management, etc.?
- Are the parents' perceptions of the quality of education improving?
  Process questions: In what areas are the perceptions positive and in what areas negative? Why?
- Is there an improvement in the perception of local government units and others regarding the quality of education?
  Process questions: In what areas are the perceptions positive and in what areas negative? Why?

6. SBM Level of Practice
- Is there an increase in the number of schools belonging to level 1 that were promoted to level 2 or 3?
  Process questions: Are they the same schools? Are there schools whose SBM practice deteriorated? What facilitated the improvement in the practices?
- Is there an increase in the number of schools belonging to level 2 that were promoted to level 3?
  Process questions: Are they the same schools? Are there schools whose SBM practice deteriorated? What facilitated the improvement in the practices?

7. Learning Environment
- Is the 1:1 learner to textbook ratio achieved?
  Process questions: Where is the shortage of textbooks most acute?
- Is the 1:45 classroom to learner ratio achieved?
  Process questions: How many schools have achieved the classroom to learner ratio? In how many schools is the classroom shortage most acute?
- Do the learners have access to laboratories and school equipment?
  Process questions: How many schools have provided learners with very good access to laboratories and school equipment?
- Is there an ICT laboratory?
  Process questions: If yes, how up to date is the equipment? How is its utilization?

8. School Head Performance
Are there improvements in the competencies of school heads in the following areas?
- Instructional supervision
- Educational planning
- Resource mobilization
- Advocacy
- Managing education programs and projects
- Managing stakeholders (Are they capable of managing the school governing council? Of partnering with LGUs?)
- Progress tracking
- Outcome evaluation
Process questions (for each area): What training programs were received/attended by the school heads? How are they applying these training programs?

9. Teacher Performance
Are there improvements in the competencies/performance of teachers in the following areas?
- Subject mastery. How many teachers have mastery of the subject they teach? What training programs were received/attended by the teachers? How are they applying these training programs?
- Teaching skills (classroom management, student assessment, modern teaching methods, care and use of learning materials and equipment). How many teachers demonstrated the proper teaching skills? How many have been trained?
- Use of ICT. How many teachers are using ICT? How many have been trained in the use of ICT?

DIVISION M&E SYSTEM
7.0
MONITORING AND EVALUATION TOOLS & TECHNIQUES

7.0 DIVISION M&E TOOLS

This section enumerates tools and techniques for monitoring and evaluation.
The choice of M&E tools and techniques will influence the results of any evaluation or
assessment undertaken by the Division. Selecting the most appropriate ones
increases the likelihood of correct, precise and accurate results or findings. In this regard, it
is important that the M&E team be familiar with the different tools and techniques,
especially the results these will generate, as well as their context and nature.
The following is a classification of M&E tools and techniques:
(1) Tools to assess effectiveness
(2) Tools to assess Division readiness
(3) Tools to track efficiency
(4) Tools for data gathering.

7.1 Tools to Assess Division Effectiveness

The effectiveness of the Division can be measured using the following tools:
- SBM Level of Practice. This is a self-administered assessment tool for the school
head. The assessment covers the six dimensions of SBM, namely School Leadership,
Internal Stakeholder Participation, External Stakeholder Participation, Continuous
School Improvement Process, School-Based Resources and School Performance
Accountability. The result of the self-assessment will be used as input to
adjustments in the AIP and the preparation of the next cycle SIP.
See the Manual on Assessment of School-Based Management Practices.
- Segmentation Techniques. This is a technique used to understand and gain insights
about target groups. Segmentation is a process of identifying and grouping schools
based on characteristics and accomplishments. The main objective of
segmentation is to get to know the schools better in order to customize or fit the
Division's technical assistance to the requirements of the school.
Specifically, the segmentation technique will allow the Division to compare similar
schools (the same characteristics) and schools from different groups (different
characteristics). This approach will facilitate the monitoring of schools and allow the
Division to determine the unique needs, problems and requirements of schools
belonging to the same segment.
As an evaluation tool, segmentation lets the Division assess the performance of a
school against the performance of schools with similar characteristics or schools
belonging to the same typology. The performance of the schools in a Division is also
compared or benchmarked against the performance of schools of the same typology
in other Divisions (within the region) and against the national average.
- Competencies Checklist. Lists the competencies that must be demonstrated
by school heads, teachers, instructional managers and literacy facilitators.
- Stakeholders' Perception Survey. Captures the perceptions of the stakeholders
(community, LGUs, learners, etc.) on the quality of education and the quality of services
provided by the schools (see the tabulation sketch after this list).
- Logical Framework Approach. This refers to situational tools and techniques used to
assess and explain the phenomena behind the results or outcomes. The logical
framework matrix uses the problem tree, objectives tree, stakeholder analysis and SWOT
(strengths, weaknesses, opportunities and threats).
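
The tabulation sketch referred to above is given here: a minimal Python illustration that averages survey ratings per service area, overall and per stakeholder group. The 1-5 rating scale, the groups and all responses are hypothetical assumptions.

```python
# Minimal sketch of tabulating a stakeholders' perception survey rated on
# a 1-5 scale per service area. Groups, areas and ratings are hypothetical.

from statistics import mean

responses = {
    "parents":  {"quality of instruction": [4, 5, 3], "facilities": [2, 3, 3]},
    "LGU":      {"quality of instruction": [4, 4],    "facilities": [3, 2]},
    "learners": {"quality of instruction": [5, 4, 4], "facilities": [3, 4, 2]},
}

areas = {area for group in responses.values() for area in group}
for area in sorted(areas):
    all_ratings = [v for r in responses.values() for v in r.get(area, [])]
    by_group = {g: mean(r[area]) for g, r in responses.items() if area in r}
    print(f"{area}: overall={mean(all_ratings):.2f}",
          ", ".join(f"{g}={m:.1f}" for g, m in sorted(by_group.items())))
```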

7.2 Tool to Assess Readiness


Readiness is assessed against intermediate objectives, which pertain to improvements in the practices of the Division in providing service or technical support to the schools and community learning centers.
Quality Management Inventory Model. The QMIM depicts a road map that traces the Region and Division's transformation from the use of informal processes to more established technical assistance packages and support mechanisms. It projects an organization's transition from the realm of uncertainty to more repeatable and predictable results. The Model represents a progression in the capability of the Region and Division to deliver management support and technical assistance packages to their target groups.

The QMIM is also a yardstick for assessing the performance of the Region and Division. It will be used to examine the Region and Division's processes and support mechanisms that allow them to efficiently and effectively deliver technical assistance packages to schools, school managers, teachers and the schools' non-teaching staff.


[Figure: The QMIM road map - a progression through four maturity levels, from Ad Hoc to Defined, Integrated and Sustained]

The Model defines four levels: (1) Ad Hoc, (2) Defined, (3) Integrated, (4) Sustained. Level 1 is the entry level. It represents a Division characterized by ad hoc processes and an informal way of doing things. As it matures, the Division Office is expected to establish its internal procedures (Level 2, Defined). The Division then moves to a stage where it is expected to manage and combine its different mechanisms into an integrated system (Level 3, Integrated). The highest level is the Sustained level. This represents a Division that adapts, maximizes and continuously improves its way of doing things.
For a more detailed discussion, see the attached document, Quality Management Inventory Model, which describes the model and the process for undertaking the inventory on quality management.
Division Readiness on Quality Management. A self-assessment tool used to evaluate the competencies of Division and District staff in the critical areas of quality management. A sample readiness assessment checklist is shown below.


Sample Readiness Assessment Tool

Readiness Assessment
This is a readiness assessment. We would like to get your perception of the readiness of your organization to implement a quality assurance system. Listed below are some of the processes on QA and M&E, key tools and techniques on planning, monitoring and evaluation, and concepts and principles of Quality Assurance. The results will help us determine our assistance to you, especially in designing the capability building programs.
Please rate your organization on each area: 1 - have no or minimal knowledge or understanding of the process/tool/concept; 2 - have been trained on the process/tool/concept but have yet to apply or use it; 3 - have limited implementation or application of the process/tool/concept; 4 - have been using or applying the process/tool/concept.

1. Appraisal of the SIP/DEDP. Review process on the relevance and technical correctness of the SIP/DEDP.
2. Review of Start Up. Process of assessing the readiness of a unit to implement a newly approved plan.
3. Implementation Review. Evaluation of accomplishments (physical accomplishments) and analysis of problems and issues surrounding an implementation.
4. Mid-Term Implementation Review. Process involving assessment of initial gains and physical accomplishments to date, and adjusting the plan.
5. Outcome Evaluation. Process involving the evaluation of results based on the purpose-level objective and targets contained in the Plan.
6. Process Audit or Compliance Review. Assessment of a unit's application of standards in delivering an output.
7. Education Planning. Process involved in preparing a strategic plan and a detailed implementation plan. Includes planning for resources, estimating time and cost, and setting up the appropriate organizational structure to implement the plan.
8. Logical Framework Matrix. A planning and M&E tool used to define the objectives, targets and indicators. Provides the scope of the plan.
9. SBM Assessment. A tool developed to assess a school's level of practice on SBM.
10. Implementation and Improvement Clinic. A technical assistance process for helping a unit improve its delivery of programs and projects.
11. Accreditation. Mechanism to raise the bar of excellence for schools.
12. Change Management. A structured approach to moving individuals or organizations from the current state to the future state.
13. Knowledge Management. Practices of the organization to distribute, transfer and propagate knowledge within the organization.
14. Quality Assurance. Understands the concepts and principles of quality assurance.
15. Perception Assessment. Assesses stakeholders' level of satisfaction with the delivery of basic education.
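Once completed forms are returned, the ratings can be tallied to identify where capability building is most needed. A minimal Python sketch (area names and ratings below are hypothetical, not part of the prescribed tool):

    from statistics import mean

    # Each respondent rates every area from 1 (minimal knowledge)
    # to 4 (already applying). All data below are hypothetical.
    areas = ["SIP/DEDP Appraisal", "Review of Start Up", "Outcome Evaluation"]
    responses = {
        "District A": [1, 2, 3],
        "District B": [2, 2, 4],
        "District C": [1, 3, 3],
    }

    # Average rating per area across respondents; the lowest-scoring
    # areas are candidates for capability building programs.
    for i, area in enumerate(areas):
        avg = mean(ratings[i] for ratings in responses.values())
        print(f"{area}: {avg:.1f}")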


7.3 Tools to Track Efficiency


Efficiency is measured by comparing the actual implementation progress versus the plan. Tracking efficiency relies on the regular collection of data and information, specifically on accomplishments. Efficiency can be measured using the following tools:
Line of Balance Method or S Curve. A tool that plots the total plan on a periodic basis versus the actual accomplishments per period. The S Curve diagram provides management with a clear status of implementation and shows the trend of accomplishments over time.

[Figure: S Curve - cumulative plan versus actual accomplishment plotted over time]
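For illustration only, an S Curve can be produced from cumulative monthly totals; a minimal Python sketch (hypothetical figures; matplotlib is assumed available):

    import matplotlib.pyplot as plt

    months = list(range(1, 13))
    # Cumulative planned outputs per month (hypothetical).
    planned = [2, 5, 9, 15, 22, 30, 38, 45, 51, 56, 59, 60]
    # Cumulative actual outputs, reported up to month 9 only.
    actual = [1, 4, 8, 12, 18, 25, 31, 38, 44]

    plt.plot(months, planned, marker="o", label="Planned (cumulative)")
    plt.plot(months[:len(actual)], actual, marker="s", label="Actual (cumulative)")
    plt.xlabel("Month")
    plt.ylabel("Outputs delivered")
    plt.title("S Curve: plan versus actual")
    plt.legend()
    plt.show()

The gap between the two curves at any month shows how far implementation is ahead of or behind the plan.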

Gantt Chart. A type of bar chart that details the entire implementation in a chart. It is used both as a planning tool and as a tool for tracking the progress of implementation. As a monitoring tool, the chart is used to track or plot the activities implemented or outputs delivered based on the approved scope of work. It is a good tool to visualize accomplishments vis-a-vis the plan.

Sample Gantt Chart (Plan versus Actual Implementation over Time)

Activities | Start Date | Finish Date | Status
1.0 Preparation
1.1 Organize the Evaluation Team | Mon 05/Jan 09 | Tue 20/Jan 09 | Completed
1.2 Orient the Evaluation Team | Tue 10/Feb 09 | Fri 13/Feb 09 | Completed
1.3 Prepare the Evaluation Instruments | Tue 10/Feb 09 | Wed 15/Apr 09 | Completed
1.4 Train Data Gatherers | Sat 02/May 09 | Mon 15/Jun 09 | Completed
1.5 Prepare Data Gathering Plan | Thu 11/Jun 09 | Thu 25/Jun 09 | Completed
2.0 Data Gathering
2.1 Kick-Off Meeting | Wed 24/Jun 09 | Tue 30/Jun 09 | Completed
2.2 School 1 | Mon 29/Jun 09 | Mon 29/Jun 09 | Completed
2.3 School 2 | Tue 30/Jun 09 | Tue 30/Jun 09 | Completed
2.4 School 3 | Mon 13/Jul 09 | Mon 13/Jul 09 | On-going
2.5 School 4 | Wed 15/Jul 09 | Wed 15/Jul 09 | On-going
2.6 Etc. | Mon 10/Aug 09 | Fri 14/Aug 09 |
3.0 Analysis
3.1 Tabulate Data | Sept 20, 09 | Sept 25, 09 |
3.2 Consensus Building | Sept 20, 09 | Wed 30/Sep 09 |
3.3 Analyze Findings | Thu 01/Oct 09 | Mon 19/Oct 09 |
4.0 Submission of Report
4.1 Write the Report | Tue 20/Oct 09 | Fri 30/Oct 09 |
4.2 Communicate | Tue 03/Nov 09 | Sun 29/Nov 09 |
4.3 Submit to Region | Tue 15/Dec 09 | Tue 15/Dec 09 |
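The plan-versus-actual check behind such a chart can also be automated. A minimal Python sketch (activities and dates below are hypothetical):

    from datetime import date

    # (activity, planned finish date, completed?) - hypothetical entries.
    activities = [
        ("Organize the Evaluation Team", date(2009, 1, 20), True),
        ("Train Data Gatherers", date(2009, 6, 15), True),
        ("School 3 Data Gathering", date(2009, 7, 13), False),
        ("Tabulate Data", date(2009, 9, 25), False),
    ]

    status_date = date(2009, 7, 20)  # "as of" date of the status report

    for name, finish, done in activities:
        if not done and finish < status_date:
            days_late = (status_date - finish).days
            print(f"BEHIND: {name} is {days_late} day(s) past its planned finish")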


Checklist or Process Audit Checklist. Checklists will be developed and used to assess the practices of schools, community learning centers, districts and Division staff in the delivery of services. The checklists will contain the standard processes set by the Division and Region.
Below is a sample checklist to be used during the Process Audit.
Sample Checklist: Implementation of SBM Assessment
(Each item is rated Yes / No / More Info Needed.)

1. Preparatory Activities
School Head. The school head attended the orientation on SBM and is aware of the intent of the SBM assessment. Remarks: Interview the school head; ask him/her some important information about the SBM assessment.
School Head. The school head understands all the terms used in the instrument and knows the process of administering, scoring and reporting. Remarks: Interview the school head; ask him/her some important information about the SBM assessment.
Teachers and Non-Teaching Staff. The school head oriented the teaching and non-teaching staff on the concepts of SBM, and they are aware of the purpose of and process involved in the assessment. Remarks: This is to triangulate the information provided by the school head. Ask the teachers and non-teaching staff about the purpose of the SBM assessment.
External Stakeholders. The school head oriented the school stakeholders on the concepts of SBM, and they are aware of the purpose of and process involved in the assessment. Remarks: This is to triangulate the information provided by the school head. Ask the stakeholders about the purpose of the SBM assessment.
External Stakeholders. The majority of the invited stakeholders attended and participated in the assessment; each dimension was represented by stakeholders. Remarks: Check the attendance sheet.

2. Data Gathering
Copy of Assessment Tool. Everyone has a copy of the assessment tool.
Evidence. The documents provided are the most authoritative documents, signed and the most up to date. Remarks: Validate the evidence presented. Check whether the dates, signatories and content of the document are sufficient.
Evidence. The school head allowed access to all school documents. Remarks: Ask the stakeholders.
Evidence. Enough time was given to review and assess the content of the documents. Remarks: Ask the stakeholders.
Evidence and Interview. There is agreement and consistency between the content of the documents and the responses of the interviewees. Remarks: Look for documentation of the interviews.

3. Summarizing the Stakeholders' Responses
FGD. Group discussion was done after the inventory per dimension. Remarks: Ask the stakeholders to validate.
Evidence. Documents or actual objects were used to validate the observations. Remarks: Ask the stakeholders.
Check Marks. Counting of the check marks was done collaboratively and in a transparent manner. Remarks: Ask the stakeholders, or check the raw scoring sheet.
Consensus. The rating or scoring was done through consensus. Remarks: Ask the stakeholders.
Discussion of Results. The total score of the school was presented to and discussed with the stakeholders. Remarks: Ask the stakeholders; look for the presentation material.
Weak Areas. In the FGD, discussion also focused on what to do about the school's weaknesses. Remarks: Ask the stakeholders.
Strengthen SBM Practices. There was discussion on what to do and how to strengthen the school's practices. Remarks: Ask the stakeholders; refer to the SIP, AIP or any other plan where the findings were incorporated.
Support from Stakeholders. Using the results of the assessment, the school head rallies support from the stakeholders. Remarks: Ask the stakeholders; look for documentation of support from stakeholders.

4. Others
Minutes of the Meeting. There is documentation of the entire process, especially the FGD. Remarks: Document review.
Documentation. The school prepared a report detailing the results of the SBM assessment. Remarks: Document review.
Submission. The Division was furnished a copy of the School Report.

7.4 Tools and Techniques for Data Gathering


7.4.1 Purpose of Data Gathering
Data gathering activities are undertaken to objectively verify accomplishments, the demonstration and use of skills, and the utilization of a system, and to explain the unintended effects of programs and projects. It involves gathering first-hand data from various sources. Specifically, gathering of primary data is undertaken when:
the data and information provided in the reports need to be verified, or the numbers and statistics presented in the report need to be further explained with stories;
there is a need to assess the factors influencing or causing the phenomenon and to identify the facilitating and hindering factors;
there is a need to document effective practices and the difficulties being encountered in the application or utilization of skills, systems and facilities;
there is a need to probe further into a report and/or challenge a report.

7.4.2 Some Guideposts in the Selection of Tools and Techniques


The decision to gather primary data is to be taken with caution. Careful consideration should be given to the choice of tools and techniques to use. All too often, the tool selected will influence, if not dictate, the quality of data and information that will be gathered. The following are some guideposts in selecting the appropriate tools:
Know what to collect and what to validate. There is no substitute for good planning and preparation. Identify the indicators that you want to verify and the data that will support the indicators, then determine the appropriate tool.
Triangulate. Use more than one technique in gathering data to minimize the error inherent in data gathering tools and techniques.
Just in time, not just in case. Collect the data you need to make a decision, rather than collecting data in case you need them in the future. This will help you avoid data overload.
Cost efficiency. Consider the costs involved in using a tool to gather data. As a general rule, always go for the less expensive technique that offers the same quality of data as the more expensive one.

7.4.3 Tools and Techniques


Rapid Appraisal Techniques. These are "not so quick and not so dirty" techniques for gathering qualitative information and data about school achievements and performance. Rapid appraisal gathers information that will help explain a phenomenon. It documents the practices of the schools (what was done and what was not undertaken). It involves the use of different techniques in order to validate and triangulate information that will help derive an unbiased view of the situation. Rapid appraisal techniques include:
Key informant interviews. Key informants are individuals who can provide holistic and complete information about the schools. Interviews may also be undertaken through a transect walk (walk-through) or through the use of a questionnaire.
Focus group discussion. Involves individuals grouped according to similar characteristics and traits. A facilitator leads the discussion and draws information from the participants. There are no right and wrong answers, but the facilitator must see to it that the discussion is focused and will generate the desired information from the participants.
Inspection. An activity that validates the claims of individuals about a practice or way of doing things.
Actual observation. In order to document the actual practice or behavior, actual observation is undertaken. This method will help validate the claims made by key informants and participants in the FGD.
Questionnaire. Predetermined questions are jotted down. These are used to guide the interviews.
Table 7-1 Rapid Appraisal Tools

Observation (direct observation)
Description: This method focuses on actual performance, actual utilization, and on-going activities and events. Observers record what they see and hear. This method is appropriate when the objective is to document the demonstration of skills in an actual setting.
Key Pointers: The key item to remember in making an observation is for the observer to avoid the urge to document and analyze at the same time. The observer must jot down notes as objectively as he/she can, noting down exactly what is being observed.

Interview (key informant interviews, informal interviews, transect walk)
Description: The interview is one of the most commonly used data gathering methods. It gathers qualitative data and is a good source of perspectives which will help explain the phenomenon being validated. The interviewer uses guides (a list of topics or open-ended questions) and probes the interviewee to elicit opinions, experiences and practices.
Key Pointers: Using a guide or questionnaire would often distract the interviewee. It may also cause the interviewee to be cautious about the information he/she is sharing. The key to using the interview as a method is the INTERVIEWER. The interviewer should evolve as the instrument. He/she must be quick to adapt and adjust to the demeanor of the interviewees.

Focus Group Discussion
Description: This method involves around 8 to 12 individuals discussing a certain subject matter. The group is assisted by a facilitator and a documenter. The facilitator asks process questions to start the discussion. The facilitator will not discriminate among the answers provided by the participants but should probe and ask for clarification on the responses given. A documenter records all the responses.
Key Pointers: The FGD should be focused. If the group discusses too many topics and goes out of focus, the FGD fails, and this happens a lot of the time. The main role of the facilitator is to keep the discussion of the group FOCUSED.

Inspection (artifacts review)
Description: This data gathering method focuses on the existence of artifacts. These are outputs (in document form) developed or prepared by the target group. The existence of an output is a demonstration of skills. Inspection is also a quality control activity. It involves seeing and touching the materials and equipment purchased, and assessing the quality of the facilities constructed.
Key Pointers: Inspection is used to validate claims made by interviewees during the interview and FGD and in the reports submitted. The existence of standards will facilitate the inspection process.

Questionnaire
Description: This is a structured way of gathering data. Questions about the data and/or information you want to know are put into the questionnaire. The questionnaire ensures that your concerns are covered. It also allows the uniform presentation of questions to respondents, reducing the bias of the researchers.
Key Pointers: In using the questionnaire, limit the questions to what you really need to know. Ensure that the questions are linked to the program design. Long questionnaires have a dismal response rate. Use simple terms and instructions. When able, provide incentives.

Perception Survey
Description: Among the tools enumerated, the perception survey is the only tool used to gather quantitative data. It is used to gather information about what people think about a performance, service or product. The data generated by the perception survey are usually considered in the preparation of a program design. It involves the use of a structured questionnaire and a selected number of respondents (based on the sampling method used). It can either be self-administered or done by professional researchers.
Key Pointers: There is a need to ensure the reliability of the questionnaires to be used, and to standardize the questionnaire and its administration.

DIVISION M&E SYSTEM
8.0
DOCUMENTS AND REPORTS

8.0 DOCUMENTS AND REPORTS

8.1 Description
Management Reports are data organized in an easy-to-understand format. The reports provide the stakeholders with a holistic perspective on the accomplishments and events of the period that has elapsed. It is essential that the report provides complete information about the events in order to be a useful input to decision making. Furthermore, in order to be useful and effective, reports should contain information about three essential areas:
Operational information. This describes the progress or status of implementation happening within the school and classroom. At the school level, it includes school programs and projects implemented, the quality of outputs delivered, the resources generated and the expenditures of the school. At the classroom level, this may include competencies gained by students, lessons covered, and attendance.
Internal and external information. Internal information relates to all activities within the school or classroom and the stories behind the activities. This includes reporting of the major events and activities that took place inside the school and the factors that facilitated or hindered the activities. On the other hand, external information pertains to factors outside the school that may have influenced or affected school performance. Outside information provides good comparative information for assessing the school's own performance.
Leading and lagging information. Leading information or leading indicators provide insight or early warning into a future event. Some examples of leading indicators are teachers' performance (predicting student learning), frequent absenteeism (leading to dropping out), and good teaching and school-based management (influencing enrollment).
On the other hand, lagging information or historical information provides useful insights into current accomplishments. Reports provide a comparison of past accomplishments to accomplishments to date. For example, in reporting the dropout rate for this year, the dropout rates of previous years are also reflected in the report to provide a historical trend.
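A minimal Python sketch (hypothetical rates) of how such a lagging comparison can be computed for a report:

    from statistics import mean

    # Dropout rates (%) from previous years - hypothetical figures.
    history = {2006: 5.2, 2007: 4.8, 2008: 4.5}
    current_year, current_rate = 2009, 4.9

    last_year = max(history)
    print(f"Dropout rate {current_year}: {current_rate:.1f}%")
    print(f"3-year average ({min(history)}-{last_year}): {mean(history.values()):.1f}%")
    print(f"Change versus {last_year}: {current_rate - history[last_year]:+.1f} points")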

8.2 Some Guideposts in Management Reporting


The content of reports should be driven by the decision-making requirements of the user of the report, and not by what data is available. Avoid unnecessary information as well as unnecessary attachments. The following are some guideposts in preparing a report format:
Keep it short and simple. Readers may have little time to read a voluminous report. Keep the report short but full of relevant information.
Use graphs and tables that will provide immediate information about performance and accomplishments.
Follow the format of your plan. When using coding (e.g., C.1.1), follow the coding used in the approved plan.
Provide the stories behind the numbers, but keep them simple and direct.
Avoid ambiguous words and limit jargon the reader may not understand.
Attach documents that will directly support or further explain the information in the main report, and use the most authoritative source of data.

8.3 Division Documents and Reports


Documents and reports contain information that is needed in making accurate decisions. Reports are prepared not because data is available but because decisions need to be made. The organized data contained in a report will provide valuable input and insights to the Division on future actions to undertake. Such is the importance of management reports.
The problem, however, is not the lack of reports but too many reports. There is too much information, but ironically it is not received at the right time by the right individuals. It is important to streamline the reports and ensure the timely arrival of the information that a school head would need in managing the school. The following documents and reports are identified not for reporting purposes but for the information and insights they contain, which help the Division determine its future moves and actions.
8.3.1 Baseline Documents
The following documents provide baseline information about the school, school head, teachers, instructional managers and literacy facilitators. These baseline documents will be used as the basis for measuring the progress or accomplishments of the Division.
School Report Card. Refers to school performance with focus on enrollment, retention, completion and the learners' academic performance.
School Profile. Inventory of the school's learning environment, learning resources and other services.
Competency Profile of Division target groups: school heads, teachers, instructional managers and facilitators. This will provide the baseline information about the capabilities of the Division target groups.
Division Report Card. Provides information on the overall performance of the Division using selected performance indicators and a comparative performance of the schools within the Division.


8.3.2 Control Documents


Control documents are the education development plans prepared by the schools and the Division. These documents provide the scope of technical assistance and the coverage of the M&E activities that will be undertaken by the Division.
SIP/AIP. Contains the outcomes, target performance indicators, target outputs and strategies of the school. This will be most useful in understanding the context of the problems and issues encountered by the school and the opportunities available to the school. The SIP/AIP will be used by the Division to track the efficiency of the schools.
DEDP/DAP. Represents the six-year plan of the Division. This control document will be used by the Division to manage its operations for the next six years. It is adjusted and detailed every year (DAP).
8.3.3 Status Reports
Status reports provide information on the progress of implementation and on the initial gains or results of the implementation. Status reports must contain the following information:
Scope or quantity (target). Provide a comparison of the actual outputs covered or delivered versus the plan.
Quality of accomplishments. Describe the characteristics of the outputs delivered in order to give readers an idea of what was accomplished and a basis for further evaluation.
Schedule. Provide a situationer on the accomplishments versus the plan. Status reports should show whether the progress of implementation is fast, slow or on track.
Cost. Status reports provide an account of the expenditures vis-a-vis the approved budget.
Initial benefits or results. Describe the improvements in the practices of the target groups.
Major problems and issues. Provide a list of the major problems encountered and the issues that may create more problems. The report must contain suggestions or recommendations on how to mitigate and/or solve these problems and issues.
Next steps. A status report serves as the important link between the current situation and the activities to be implemented in the next period.
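For illustration, the required contents above can be captured in a simple record; a minimal Python sketch (field names are illustrative, not prescribed by this manual):

    from dataclasses import dataclass, field

    @dataclass
    class StatusReport:
        """The seven required contents of a status report."""
        scope_target: str           # outputs planned for the period
        scope_actual: str           # outputs actually delivered
        quality_notes: str          # characteristics of outputs delivered
        schedule_status: str        # e.g. "two weeks behind plan"
        cost_actual: float          # expenditures to date
        cost_budget: float          # approved budget
        initial_results: str        # improvements observed in target groups
        problems_issues: list = field(default_factory=list)
        next_steps: list = field(default_factory=list)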
The following are the main status reports and their objectives:
School Quarterly Progress Report. A consolidation of three monthly reports to be submitted to the Division. It contains the physical accomplishments for the quarter and a description of the programs and projects implemented. The report may also contain problems and issues encountered by the school that need to be addressed by the Division.
Monthly Report/Annual Report on ALS Programs. Contains the status of the Basic Literacy Program and the A&E Program of the Division, which are implemented through the community learning centers operated by service providers or run by the District.
Division Monthly Report. Contains the physical accomplishments of the Division versus the plan (DAP), a short description of the programs and projects implemented, and documentation of problems, issues and opportunities encountered.
Division Annual Accomplishment Report. An end-of-year report containing the programs, projects and other services delivered by the Division. The report also provides a comparative report on the performance of the schools using selected performance indicators and the schools' level of practice on SBM.
8.3.4 Accomplishment Reports
Accomplishment reports are documents prepared and submitted at the end of every program or project or at the end of a major undertaking. The accomplishment reports include:
DEDP Terminal Report. This report contains the accomplishments of the Division after six years of implementing the DEDP. Specifically, it contains information on the schools' (comparative) performance, the competency profile of school heads, teachers and Division staff, and an inventory of the programs and projects implemented for the schools. The completion report also includes the Division Report Card, which provides a holistic picture of the Division after six years of DEDP implementation.
Division Mid-Term Implementation Report. This report is prepared at the end of the 1st cycle of SIP implementation (or phase 1 of the six-year DEDP implementation). The Mid-Term Report contains information on the achievements and accomplishments of the schools and the Division after three years. The report provides information and insights that may drastically alter or affect the next three years of DEDP implementation.
Division Best Practices. Documentation of programs and projects implemented that netted positive results.

Table 8-1 Division Documents and Reports

School Profile (Baseline Document)
Content: Resource profile of the school, including human and physical resources.
Purpose: Submitted by schools to the Division to determine the areas for technical assistance to schools.
Timing: End of March of each year.
As input to: DEDP and DAP.

School Report Card (Baseline Document)
Content: Performance indicators.
Purpose: Submitted by schools to the Division to determine the effectiveness of Division interventions as well as their impact on the schools' performance.
Timing: End of March of each year.
As input to: Attachment to the annual report and the SIP completion report.

Division Report Card (Baseline Document)
Content: Schools' (comparative) performance; competency profile of school heads, teachers and Division staff; QM assessment.
Purpose: To provide baseline information on the Division, used to assess the year-to-year performance of the Division; also to be used for outcome evaluation by the Region.
Timing: Preparation of the DEDP; annually thereafter.
As input to: The Region's input to the REDP.

School Monthly Report (Status Report)
Content and purpose: To report on the progress of implementation; documentation of accomplishments per month.
Timing: End of the month.
As input to: Adjustment of the next month's activities; performance assessment of teachers.

School Quarterly Report (Status Report)
Content and purpose: To show the status of AIP implementation after every three months.
Timing: End of the quarter.
As input to: Adjustment of the next quarter's activities; performance assessment of teachers.

School Annual Report (Status Report)
Content and purpose: To present the accomplishment report of the school after one year.
Timing: March of each year, except the last year of SIP implementation.
As input to: Basis for adjusting/enhancing the next year's AIP; basis for measuring the efficiency of the school head.

Learner Report Card (Status Report)
Content and purpose: To provide information about the learners' performance.
Timing: Quarterly and at the end of the school year.
As input to: Learning management plans.

SIP Completion Report (Accomplishment Report)
Content and purpose: To provide documentation of the three-year implementation, including lessons learned and key practices of the school.
Timing: January-February of the 3rd year of the SIP.
As input to: Next cycle SIP.

School Program/Project Report (Accomplishment Report)
Content and purpose: To provide documentation of the program/project accomplished.
Timing: End of the program/project.
As input to: Best practices.

8.3.5 Other Documents


Other documents include acknowledgement receipts or inventory receipts of property, certificates of acceptance and memorandum receipts.


8.4 Reporting Flow and Frequency


Most reporting flows follow the organizational structure. Teachers submit reports to the department heads, and the department heads report to the school head. As such, critical and valuable information is reported up the ladder but is not disseminated fast enough to the other field units who may need the information. This reporting flaw should be corrected if the objective is to provide timely and relevant information to the stakeholders or users of the reports.
Consider not only the vertical flow of information but also the horizontal flow. Horizontal flow of information encourages the sharing of data, information and insights. It is a much faster way of propagating effective practices. For example, English teachers can share information about a learner directly with the math and science teachers rather than going through the process of reporting and submitting the information to the Department Head and then to the School Head.
Some guidelines on the reporting flow:
Timing. The most accurate data and information may lose their usefulness if received late.
Not all information is reported or shared with the other levels; only the data and information needed to make decisions or adjustments. However, in cases where additional information is requested, detailed or back-up information should be readily available.
Uniform or similar format of reports. This ensures easier consolidation, comparison and analysis of information, whether the structure is vertical or horizontal.
Accuracy over precision. A figure that is approximately right and available on time is more useful than a highly precise figure that arrives too late.

DIVISION M&E SYSTEM
9.0
M&E TERMS OF REFERENCE

9.0 M&E TERMS OF REFERENCE

9.1 Manifestations of a Neglected M&E System


The M&E system is acknowledged to be one of the most important systems in management. It is an important mechanism in the directing, steering and controlling functions of management. It provides information and insights to management to ensure quality products and services as well as continuous improvement in the organization.
Ironically, the M&E system is one of the most often neglected systems in an organization. Here are some of the manifestations of an organization that has neglected its M&E function:
Nobody is in charge of M&E. If somebody is assigned to do M&E, it is a junior staff member assigned to collect data and put them together in one document.
Implementers are forced to make decisions without the benefit of data and information. There is no direct link between M&E activities and decision making.
The M&E system and its requirements are set up in the middle of an implementation. Nobody can really say what the status of implementation is.
Scope creeps.1 There are too many intervening activities, events or outputs that lead to the non-implementation of approved programs and projects.
The same (failed) programs and projects continue to be implemented.
Never-ending collection of data. Field personnel are often burdened with requests for data and information even though these have already been reported.
Different units or different individuals collect the same data simultaneously.
A sure sign of a missing M&E system is when the basic data elements are never collected.

1 A scope creep is an activity, event or output undertaken (which may be necessary) but which is not part of the approved or agreed plan.

9.2 Common Misconceptions about M&E Work


As a result, actual efforts on M&E are often limited to data gathering, report writing and report submission. Such wrong perceptions have contributed to the popular belief that there is a dichotomy between M&E and decision making.
Here are some of the misconceptions about the work in M&E:
M&E is about submitting reports. Report preparation and submission is just one of the many functions of M&E. M&E includes the process of gathering data and information, analyzing them, and writing and presenting them in a format that will facilitate decision making. These reports will be the basis for future actions and the future designs of programs and projects.
The M&E system is often equated with filling up forms, tables and matrices. The important activity of validating the data and information is often neglected; thus, the practice of merely filling up forms, tables and matrices leads to erroneous data and information.
M&E is about field visits and data gathering.
Data collected must always go up before it is disseminated down the line. As a result, needed data and information arrive late or never at all.
Another wrong notion about M&E is that it exists to meet the information requirements of external or higher management level units. The M&E system is set up and put into operation in order to meet the information requirements of the internal units, especially the individuals who are responsible for the delivery of outputs and the achievement of outcomes.
These misconceptions often lead to the false notion that M&E people are just spectators, watching and observing (spying), waiting for people to make mistakes and then reporting these mistakes. The misconceptions above are the usual reasons why people shun M&E.

9.3 The M&E Function


The M&E function is very important in decision making. Every implementer makes decisions according to his or her own accountabilities. Monitoring and evaluation activities are undertaken to ensure that accountabilities are met and desired results are achieved. Essentially, M&E is about adjustments, which can consist of:
no changes, if no or only tolerable deviations from the plan are observed;
changes in activities, if deviations from the plan can be counteracted by adjusting resources and activities;
adaptation of the plan, if the strategy does not yield the expected results and effects;
changes in the strategy or termination of the plan, if the target purpose turns out to be unachievable due to misconceptions or changes in framework conditions.


9.4 The Process Owner

9.4.1 Schools Division Superintendent


The SDS is the process owner of the Division M&E System. As process owner, the SDS must ensure the integrity and efficiency of the System. This means providing accurate, correct, timely and relevant information to the schools, the Region and other stakeholders. The SDS will also be the major beneficiary of the lessons and insights produced by the M&E System. Specifically, the following outlines the roles and responsibilities of the SDS on M&E:
Overall, the SDS provides the steering and decision-making requirements of the Division's technical support to schools and community learning centers.
The SDS shall report directly to the Regional Director and provide information on the progress or status of the DEDP implementation. The SDS shall also raise problems and issues affecting Division operations and recommend areas for adjustment.
The SDS shall interact closely with the following stakeholders:
Regional Office
Schools
Assistant Schools Division Superintendent
Division M&E Coordinator
Division Planning Officer
Division staff
Division Quality Management Team
As the major decision maker in the Division, the SDS shall have overall supervision of the DEDP/DAP implementation. The SDS is to ensure that all the programs and projects undertaken by the Division and Districts are in accordance with the accepted DEDP/DAP, on time and within budget. In this regard, the SDS shall undertake the following:
Conduct regular meetings, workshops and evaluations
Oversee the implementation of the Quality Control and Adjustment Points
Review and endorse the DEDP Completion Report
Lead the Division Quality Management Team. As team leader, provide directions in the conduct of the SIP Appraisal, Start Up Review, Annual Implementation Review, Mid-Term Review and Outcome Evaluation

9.4.2 Assistant Schools Division Superintendents


The ASDS is responsible for monitoring and evaluating the progress and quality of the Division programs and projects for schools and community learning centers, as outlined in the DEDP/DAP. The ASDS is directly responsible for the operational supervision of their units. They provide technical assistance to education supervisors and district supervisors on how to efficiently and effectively deliver the Division programs and projects.
The ASDS shall interact closely with the following stakeholders:
Schools Division Superintendent
Division Monitoring and Evaluation Coordinator
School Heads
QMT Members
The ASDS shall report directly to the SDS. He/She shall provide the SDS with the progress or status of the programs and projects of the Division, raise issues and problems affecting (or that may affect) the technical assistance support to schools and community learning centers, and recommend areas for adjustment in the DEDP/DAP. Specifically, the following outlines the roles and responsibilities of the ASDS:
Prepare programs and projects that will support the requirements of the Division staff, particularly the promotional staff and the district supervisors
Prepare and submit a Monthly Report detailing the status of programs and projects and future activities
Conduct unit meetings and workshops related to M&E concerns
Supervise the day-to-day activities of Division staff. Monitor the provision of technical assistance to schools and community learning centers
As a member of the Division Quality Management Team, participate in the SIP Appraisal, Start Up Review, Annual Implementation Review, Mid-Term Review and Outcome Evaluation

9.4.3 Division M&E Coordinator2


The M&E Coordinator is responsible for the overall M&E strategy and implementation of
M&E related activities within the Division and provides timely and relevant information to
stakeholders.
The M&E Coordinator shall interact closely with the following stakeholders:
Schools Division Superintendent
Assistant Schools Division Superintendent
Division Planning Officer
School Heads and Instructional Managers
QMT Members

2 Due to the strategic and sensitive nature of the M&E function, it is suggested that the designation of M&E Coordinator be given to one of the Assistant Schools Division Superintendents (ASDS).


The M&E Coordinator shall report directly to the SDS. The Coordinator shall provide the SDS with interpretation and analysis of M&E data, raise issues and problems affecting (or that may affect) DEDP/DAP implementation, and recommend areas for adjustment in the implementation plan. Specifically, the following outlines the roles and responsibilities of the Division M&E Coordinator:
Assist in the review and revision of the DEDP objectives and strategies, particularly in the adjustment of the OVIs (objectively verifiable indicators) and MOVs (means of verification: reports, documents, data gathering methods)
Assist in the development and adjustment of the DEDP and DAP
Assist the ASDS in setting up the Division M&E System. Ensure that the Division M&E System complies with the operations of the Region M&E System
Communicate to Division and District staff the requirements of the School M&E System and the roles and responsibilities of staff on M&E
Prepare the consolidated Division Monthly Report for the SDS in accordance with the approved reporting formats and schedule. This also includes reviewing and validating the reports submitted by school staff and the documents received from outside the school
Assist the education supervisors, district supervisors and other Division staff in the preparation of their progress reports
Record and report the physical accomplishments of the Division and schools
Ensure proper documentation and safekeeping of Division reports and documents generated during project implementation

9.4.4 Division Planning Officer


The Planning Officer is responsible for ensuring the reliability of Division and school data. Specifically, the Planning Officer is responsible for the collection and validation of school, community learning center and Division data and will provide initial statistical analyses of the same.
The Planning Officer shall interact closely with the following stakeholders:
Schools Division Superintendent
Assistant Schools Division Superintendent
Division Monitoring and Evaluation Coordinator
School Heads
QMT Members
The Planning Officer shall report directly to the SDS and work side by side with the M&E Coordinator. The Planning Officer shall provide initial interpretation and analysis of project data. Specifically, the following outlines the roles and responsibilities of the Division Planning Officer:
Assist in the review and revision of the DEDP objectives and strategies, particularly in the adjustment of the OVIs (objectively verifiable indicators) and MOVs (means of verification: reports, documents, data gathering methods)
Assist in the development and adjustment of the DEDP and DAP
Assist the M&E Coordinator in the preparation of the consolidated Division Monthly Report
Ensure the availability of information from the BEIS and the timeliness of data
Take main responsibility for the computer entry of data and provide some initial analysis and interpretation. Ensure the integrity and accuracy of data
Manage and maintain the BEIS

9.4.5 Education Supervisors / District Supervisors


The Education Supervisors and the District Supervisors are responsible for tracking the performance of the schools and the community learning centers and for supporting the school heads, teachers and facilitators in delivering quality education programs. The supervisors are directly responsible for the monthly monitoring of school heads, teachers and facilitators. Their primary monitoring responsibility is to provide feedback to the Division in order to enhance the Division programs and projects for schools and community learning centers.
The Supervisors shall interact closely with the following stakeholders:
Schools Division Superintendent
Assistant Schools Division Superintendent
Division Monitoring and Evaluation Coordinator
School Heads
The Supervisors shall report directly to the SDS and/or the ASDS. They shall provide the SDS/ASDS with the progress of the training and technical support to schools and community learning centers on a periodic basis. Specifically, the following outlines the roles and responsibilities of the Supervisors:
Prepare program/project designs
Monitor the performance of schools and community learning centers. This includes the conduct of regular field visits
Inform the SDS, ASDS and/or other Supervisors about the concerns of schools and community learning centers to ensure prompt response to problems and issues
Prepare and submit a Monthly Report detailing the status of programs and projects and future activities
Prepare and submit an Assessment Report on school performance
As a member of the Division Quality Management Team, participate in the SIP Appraisal, Start Up Review, Annual Implementation Review, Mid-Term Review and Outcome Evaluation

9.4.6 Coordinators
Coordinators are Division staff assigned a specific program or responsibility. These include
Division staff designated as physical facilities coordinator, procurement coordinator, SBM
coordinator, DORP coordinator and others.
Each coordinator shall be responsible for monitoring programs and/or concerns assigned to
him/her.
The Coordinators shall interact closely with the following stakeholders:
SDS
ASDS
Division M&E Coordinator
Division Planning Officer
Other Coordinators
The Coordinators are urged to establish and strengthen their horizontal link with other
coordinators and/or education supervisors and district supervisors in order to fast track the
sharing and utilization of valuable information affecting programs and projects. Specifically,
the following outlines the roles and responsibilities of the Coordinators regarding monitoring
and evaluation.
Monitor the efficiency and effectiveness of programs or concerns
Monitor utilization of programs, systems, facilities and learning equipment installed at
the school and community learning center level
Prepare and submit status report

9.4.7 M&E Support Staff


The M&E Support Staff provides administrative support to the Division M&E Coordinator. The Support Staff shall be responsible for the collection of data and for the encoding, filing and maintenance requirements of the Division M&E System.


9.5 The Quality Management Team

The Quality Management Team or QMT is an ad hoc body composed of Division and District staff whose task is to implement the quality control and adjustment mechanisms of the Division. These mechanisms include the SIP appraisal process, the start-up review, the annual implementation review, the mid-term implementation review and the outcome evaluation.
In general, the QMT is created to ensure compliance and adherence to the objectives and targets of the Division and to ensure the uniform application of policies, standards and processes.
The QMT is responsible for:
ensuring the quality of plans, program and project designs developed by the
Division, district and schools
ensuring that staff from the division, district and schools are adhering to the
standard processes employed to assure quality
evaluating the major milestones at the school and community learning centers
The QMTs are divided into two major groups: the Core QMT and the Area QMT.
The Core QMT is the central body or process owner of the Quality Management System.
Specifically, the Core QMT will be responsible for the following:
set up the Quality Management System in the Division
oversee the creation and formation of Area QMTs
build capability of the Area QMTs
communicate and enhance Division standards and the Quality Management
System
The Area QMTs put into operation the Quality Management System of the Division. These
teams are responsible for enforcing the quality standards of the Division and providing
technical and training support to schools and community learning centers.
Specifically, the Area QMTs are responsible for the following:
provide technical support to schools in setting up the school quality management
system
orient the schools and community learning centers on quality management
implement the quality control and adjustment points of the Division
evaluate the SBM assessment and ensure the integrity of the process

DIVISION M&E SYSTEM
10.0
SETTING UP A DIVISION M&E SYSTEM

10.0 SETTING UP THE DIVISION M&E SYSTEM

10.1 When to Set Up


The Division M&E System is set up at the Start Up Stage of the DEDP Implementation. The
DEDP provides the directions, objectives, scope of work, indicators and criteria for success
that are very important in monitoring and evaluation.



[Figure 10-1: Start Up Stage - the Division M&E System is set up during the Start Up Stage of the DEDP implementation]

10.2 Requirements for an Effective M&E System


Effective school management requires that a well organized monitoring and evaluation system be designed, developed and implemented so that immediate feedback can be obtained concerning school performance and immediate adjustments or decisions can be made to ensure the achievement of the objectives and targets outlined in the school improvement plan. The requirements for an effective M&E system include:
Commitment to do monitoring and evaluation. The primary requisite of an effective monitoring and evaluation system is the commitment of all the stakeholders. This commitment is manifested by the following: (1) all have the same understanding and appreciation of the scope of monitoring and evaluation; (2) a continuing commitment to excellence and the improvement of school outcomes; and (3) objective use of the information and results of monitoring and evaluation.
Desired outcomes and objectives are realistic, clearly defined and verifiable. This simply means that the DEDP and the SIPs are correctly done. The DEDP and the SIPs provide the scope of the monitoring and evaluation. Unless the plan is comprehensively and clearly prepared, the conduct of monitoring and evaluation will be difficult. There should be a clear description of the objectives, targets and milestones to be achieved in 3 to 6 years.
Standards are well established and communicated to all team members. Standards include expected competencies, the curriculum, SBM standards, the training and development process, planning requirements and the standard process for monitoring and evaluation.
System boundaries among the different levels are clearly established. This includes a clear definition and delineation of the roles and responsibilities at the different levels of the organization.
A clear definition of the DEDP life cycle framework is established. The life cycle framework puts into context the decision making requirements of the Division in every phase, stage or milestone.

10.3 Some Guideposts in Setting Up the M&E System


The set up process must ensure the following:
Make sure the school managers, teachers and other stakeholders understand the scope of the M&E system.
There should be agreement on the performance measures that will be used in the monitoring and evaluation of the school, namely the outcomes and intermediate objectives in the SIP.
If you can't measure it, you can't manage it. Be sure your targets are objectively verifiable.
Consider the standards and policies of the DepED before finalizing the performance measures and reports in the system.
In the design of the system, put the requirements of the school and teachers first. The system must meet the decision making requirements of the school head and teachers before considering or meeting the information requirements of external stakeholders.
Keep the system as simple as possible. Minimize reports; merge documents and reports when possible. Identify the most authoritative report.
Set up the system as quickly as possible.

10.4 Steps in Setting Up the M&E System


The five step process in setting up the Division M&E System includes:
(1) define the scope of the M&E;
(2) design the control and adjustment points;
(3) determine the information requirements of the stakeholders;
(4) set up the monitoring process; and
(5) communicate the system.


10.4.1 Define the Scope of the M&E


The most important first step in setting up the M&E System is to clarify and define the scope of the M&E. This involves clarifying the objectives and targets of the Division and defining the success indicators and performance measures.
In defining the scope of the M&E, the following guide questions must be answered:
What are the education impact objectives we want to achieve?
What are the outcomes or benefits we want our target groups to experience?
What are the programs and projects that we must deliver to achieve the outcomes? How many and when?
Are the indicators SMARTly formulated? Can they be verified?
What are the resources needed to implement the programs and projects?
The answers to the questions above are found in the Division Education Development Plan (DEDP). It is the DEDP that provides the coverage and boundary of the M&E System. Hence, it is very important that the DEDP is very clear and accurate when it comes to the objectives, targets, programs and projects that it needs to deliver in the next 6 years. Although the DEDP has already been reviewed and approved during the appraisal period, it is recommended that it be reviewed or revisited again before full implementation is undertaken.
The following illustrates the 3 step process in defining the scope of the school M&E system:
(1) The first step in designing the Division M&E system is to have a good understanding of the objectives, targets and strategies contained in both the SIPs and the DEDP. This means reviewing and/or updating the DEDP and the SIPs.
(2) The second step is to review and finalize the performance measures.1 The performance measure is one of the critical elements of M&E. The measures must provide an accurate picture of the status of accomplishments or outcomes. When the performance measures are clarified and adjusted, finalize these targets and freeze them. These will be the basis for the monitoring and evaluation.
Careful consideration must be given to choosing a performance measure. Some guidelines in defining the performance measures of the school:
The fewer the better. One of the common pitfalls in evaluation is the notion that the more data gathered and the more performance measures used, the better. This does not always follow. As a rule, the fewer the performance measures, the more accurate the picture of the situation.
Focus on the right things. Ensure that the performance measures selected are the correct measures for assessing the learners', teachers' and school head's performance. By correct, it means direct and exclusively used for one performance only.
Integrated with other measures. A performance measure should be connected to the other measures in order to provide a more holistic picture of the school's accomplishments and achievements.
(3) The third step is to finalize the DEDP and prepare the DAP (Year 1) based on the adjusted targets and schedule.

1 A performance measure is composed of a number and a unit of measure. The number provides the magnitude (how much) and the unit of measure gives the number a meaning (what).

Defining the scope of the M&E will facilitate the design and establishment of the Division
M&E System. The DEDP, with its objectives, targets, proposed strategies and activities, defines
the scope of the M&E. It is important that these are revisited and finalized to formalize the
scope of the M&E. This will lead to the design of the monitoring process, control points, data
collection tools and techniques, and management reports.
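
To make the footnoted definition concrete, a performance measure can be represented as a simple record that pairs the number with its unit of measure and its frozen target. The sketch below is illustrative only; the field names and the sample figures are hypothetical, not a prescribed DepED format.

```python
# Illustrative sketch only; field names and sample figures are hypothetical,
# not a prescribed DepED format.
from dataclasses import dataclass

@dataclass
class PerformanceMeasure:
    name: str                   # what is being measured
    number: float               # the magnitude (how much)
    unit: str                   # gives the number a meaning (what)
    target: float               # the frozen target taken from the DEDP
    means_of_verification: str  # document or practice used to verify the number

    def met(self) -> bool:
        """True when the reported number reaches the frozen target."""
        return self.number >= self.target

# Example: a Division-level measure with an invented target
csr = PerformanceMeasure(
    name="Cohort Survival Rate",
    number=68.5,
    unit="percent",
    target=75.0,
    means_of_verification="Division Status Report",
)
print(csr.met())  # False -> flag the measure for review at the next control point
```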

10.4.2 Design the Control and Adjustment Points


The next major step in establishing an M&E system is to design the control and adjustment
points. The control points represent the core features of the M&E system. Control and
adjustment points are mechanisms for review to assess, validate and adjust (when needed)
the quality, scope, timing and cost requirements of an implementation. All the other
requirements of M&E, from reports to data requirements, will be based on these control
points.
Division Control and Adjustment Points are determined by drawing a road map of the DEDP
implementation. The road map helps illustrate the context and the relationship between
the implementation, the control requirements and the timing of the controls. This helps
establish the link between the SIP implementation and the implementation requirements of the
Division and districts. This approach provides the context for the decisions.

Implementation Stage and Control & Adjustment Point

A stage represents a major segment in the implementation phase, the completion of which
represents a major milestone. Each stage in the DEDP implementation represents unique
requirements and interactions as well as unique problems and issues. The stage approach provides
managers with more control in managing the implementation. The unique requirements of
each stage provide the context for monitoring and evaluation.

Control points are M&E review gates for evaluating major outputs and milestones. The results of
the control points are used as the basis for adjusting or enhancing the implementation.


The main reference materials in the establishment of control and adjustment points are the
DEDP and the SIP. The implementation plans provide details on the critical activities to be
undertaken and the targeted accomplishment dates of outputs. The following items
need to be considered in the identification and design of Control and Adjustment Points:
- Accomplishment of an output or major milestones. One of the major
considerations in the setup of the Control and Adjustment Points is the set of outputs to
be delivered. Outputs need to be quality assured.
- Management reporting practices in the Department. Control points are patterned
after the management reporting practices of the agency. Considering the
reporting practices of the agency will facilitate both the requirements of the school as
well as the requirements of the Division, Region and National Office.
- Critical path or segment in the implementation process. Critical path refers to an
activity or activities that will have major implications for other activities, outputs and
decisions to be made in the future.

[Diagram: Division Quality Control & Adjustment Points]
Figure 10-2 Control and Adjustment Points
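
One way to appreciate the road-map idea behind Figure 10-2 is to write the control and adjustment points down as structured data, with each review gate tied to a stage, the output it quality assures, and its timing. The sketch below is purely illustrative; the stage names, outputs and dates are invented and are not prescribed DEDP milestones.

```python
# Illustrative road map of control and adjustment points; the stage names,
# outputs and dates are invented, not prescribed DEDP milestones.
from datetime import date

control_points = [
    {"stage": "Start Up",  "output": "Approved Annual Implementation Plan",
     "review_date": date(2024, 6, 15)},
    {"stage": "Mid-Year",  "output": "Program delivery status versus plan",
     "review_date": date(2024, 11, 30)},
    {"stage": "Year End",  "output": "Annual accomplishment report",
     "review_date": date(2025, 3, 31)},
]

def due_reviews(today: date):
    """Return the review gates that are already due on a given date."""
    return [cp for cp in control_points if cp["review_date"] <= today]

for cp in due_reviews(date(2024, 12, 1)):
    print(cp["stage"], "->", cp["output"])  # Start Up and Mid-Year gates are due
```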


10.4.3 Determine the Decision Making Requirements of Stakeholders


The third important step in setting up the M&E System is to determine the decision-making
requirements of the stakeholders. Also known as key players, these refer to individuals or
groups who will provide support and/or can influence the implementation of the DEDP.
Stakeholders are classified into internal and external stakeholders. Internal stakeholders
refer to Division management and staff, the people who have the highest stake in the
successful implementation of the DEDP. External stakeholders refer to individuals or groups
who provide technical support to the Division.
For both types of stakeholders, the System must be able to provide timely, accurate and
relevant information in order to ensure effective management and prompt delivery of
support and assistance. The decision-making requirements of the stakeholders will dictate
the type of information that needs to be generated by the Division M&E system.
Determining the decision-making requirements of stakeholders is accomplished in a
sequence of steps. These steps assure that only the critical decision requirements of the
stakeholders are identified.
(1) First and foremost, the information requirements of the INTERNAL stakeholders
must be identified before responding to the information requirements of external
stakeholders. The Division M&E System must facilitate the information requirements
of the Division in order to help its management and staff make timely and critical
decisions that will lead to the attainment of the targets and objectives contained in the
DEDP. As a system, the needs of the internal stakeholders are provided for in order to
ensure efficient and effective implementation of the programs and projects the
Division is accountable for providing.
The following are suggested steps in analyzing internal stakeholders and their
information requirements:
(a) Identify the stakeholders
The main M&E users at the Division and District level are the following:
- Schools Division Superintendent. Accountable for the overall success of the
Division and the performance of the Division staff in providing efficient
technical support to schools and community learning centers.
- Assistant Schools Division Superintendent. Accountable for the operational
efficiency of the Division and District in delivering programs and projects on
time and as per target.
- Education Supervisors. Accountable for the effectiveness of Division
programs and projects for school heads, teachers, non-teaching staff,
instructional managers and facilitators.
- District Supervisors. Accountable for maintaining continuous assistance to
schools and community learning centers.


(b) Profile the stakeholders
Profile each stakeholder based on the following:
- Functions (roles and responsibilities). Pertains to both the de jure (mandated)
and de facto (actual) functions of the individual.
- Information requirements. Pertains to data and information needed by the
stakeholder in order to make a decision.
- When. Refers to the time/day the information is needed by the stakeholder.
- MoV. Refers to current practices or the mode used to verify the information.
It is also important to note that the functions and information requirements of
certain stakeholders vary per time period. In this regard, refer to the M&E Operations
Framework (macro-annual) to determine the kind of decisions a stakeholder makes
and the information these require.
(c) Identify possible design considerations
The stakeholder profile and information requirements can be used as inputs to the
content of the report, the format of the report, how the reports will be presented and
the timing of the reports.
Below is a sample Stakeholder Information Matrix for Internal Stakeholders.

Table 10-1 Stakeholder Information Requirement Matrix - Internal Stakeholders2

| Stakeholder | Functions (roles & responsibilities) | Information | When | Means of Verification | Implication to M&E Design |
|---|---|---|---|---|---|
| SDS | Provide strategic directions to schools, districts and division | Performance of schools and community learning centers | Quarterly/Annual | School Report Card; Division Report Card; Division Status Report | Information must reach the SDS on a quarterly basis in order to effectively provide a steering role |
| SDS | Overall management of Division | Programs and projects delivered versus plan | Monthly | Division Monthly Status Report | |
| ASDS | Efficient operations of Division programs and projects | Performance of school heads, teachers, facilitators and instructional managers | Start of SY; during the SY (periodical); end of SY | Implementation Plan and status reports | |
| Education Supervisors | Provide training and technical assistance to school heads, teachers, facilitators and instructional managers | Competencies of target groups | All year round | School Visit; Observation and Inspection | |
| District Supervisors | Provide training and technical assistance to school heads, teachers, facilitators and instructional managers | Competencies of target groups | All year round | School Visit; Observation and Inspection | |
| Division Procurement Staff | | | | | |
| Division Physical Facilities Coordinator | | | | | |

2 Sample only. Intended to show the information to be gathered about the stakeholder. Additional stakeholders and additional information may be added depending on the accountabilities of individuals in the Division and District.
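
One design implication of the matrix is that the reporting schedule can be read straight off it: each row tells the system who must receive what information and when. The sketch below encodes a few sample rows from Table 10-1 to show this; the rows and frequencies are samples only, not an official encoding.

```python
# Hypothetical encoding of a few Table 10-1 rows; samples only, not an
# official data format.
stakeholder_matrix = [
    {"stakeholder": "SDS",  "information": "Performance of schools and CLCs",
     "when": "quarterly",  "mov": "Division Status Report"},
    {"stakeholder": "SDS",  "information": "Programs and projects delivered versus plan",
     "when": "monthly",    "mov": "Division Monthly Status Report"},
    {"stakeholder": "ASDS", "information": "Performance of school heads and teachers",
     "when": "periodical", "mov": "Implementation Plan and status reports"},
]

def reports_due(frequency: str):
    """List who must receive which information at a given reporting frequency."""
    return [(row["stakeholder"], row["information"])
            for row in stakeholder_matrix if row["when"] == frequency]

print(reports_due("quarterly"))
# -> [('SDS', 'Performance of schools and CLCs')]
```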

(2) Second, identify the information requirements of the EXTERNAL stakeholders.

External stakeholders are individuals or groups who can influence and/or provide
support to the Division. These groups may require information in order to
align their plans, programs and even their policies to the Division's requirements. The Division
M&E System is also designed to meet the information needs of these stakeholders.
Unlike the internal stakeholders, whose information needs are operational concerns,
the information needs of the external stakeholders center more on outcomes and
results, information that is important for policy formulation and the design of technical
assistance programs for schools.
Normally, these stakeholders have their own M&E systems. Their report content, format
and timing should be considered as much as possible. Hence the need to
configure the Division M&E System to the M&E systems of these stakeholders. The
suggested steps in doing an external stakeholder analysis are almost the same as those
for the internal stakeholders. These include:
(a) Identify the stakeholder
Not all individuals or groups outside of the Division may qualify as stakeholders. In this
context, external stakeholders are those that can influence or support the Division's
implementation of its programs and projects. Influence may include changes in
policies, technical assistance support and financial assistance.
(b) Profile the Stakeholder
Gather some information about the stakeholder in terms of:
- Mandate. Pertains to roles and responsibilities in education as determined
by law. It may also pertain to the charter or vision/mission of the
organization, especially non-government organizations or foundations.
- Possible support to schools. Includes current support and potential support
that can be provided by the stakeholder.

- Information. Refers to the type of information the stakeholder may need to
make decisions that will compel the stakeholder to support or assist the
Division.
- When. Refers to the time period the information is needed by the
stakeholder.
- Means of Verification. These are current practices or documents the
stakeholder is using in its own M&E system.
(c) Determine the implication to the design of the Division M&E System.
The design of the Division M&E System must be closely linked to the design of the
M&E systems of the external stakeholders, especially the Regional Office, and to the
M&E systems of the schools. The report content, forms and formats must be closely
linked to those being used by the Schools, Region and Central Offices.
For external stakeholders who are not directly mandated by law to support the
schools but are supporting the schools on their own, it is important to determine
their planning and budgeting period. This period may be the most appropriate
time to provide information about the schools and the support they need.
The table below provides an example of a Stakeholder Information Matrix - External
Stakeholder.

Table 10-2 Stakeholder Information Requirement Matrix - External Stakeholders3

| Stakeholder | Mandate | Support / Possible Support | Information | When | Means of Verification | Implication to M&E Design |
|---|---|---|---|---|---|---|
| Regional Office | Provide technical assistance support to the Division; ensure quality of Division operations | Training of Division staff on management, consultancy, etc. | Status of DEDP implementation | Quarterly; as the need arises | Quarterly Report; School Managers Meeting/Conference; School Visit | |
| | | | Teachers' performance | School break and semestral break | Needs analysis; classroom observation | |
| Central Office | | | | | | |
| LGUs (Province level) | | | | | | |
| Other Agencies | | | | | | |

3 Sample only. The list of stakeholders and their roles and responsibilities may vary from place to place.


The information requirements of the stakeholders will have implications for the design and/or
content of the following:
- Data elements. These are the most basic information about the status of an
implementation. Usually, these are raw data and are important in determining or
computing the performance measures. This will also dictate the forms and tables
to be developed.
- Forms and templates. The simpler the forms and templates, the better. These are
the most fundamental collection tools to be used in documenting an event and an
accomplishment.
- Report format. The decision-making requirements of the stakeholders will determine
the format of the report. It should contain all the necessary information, the numbers
and the stories behind the numbers, in order for a school head or a teacher to make the
necessary adjustments or improvements in the strategies implemented.
- Reporting frequency. The need of the stakeholders to make decisions will also
dictate the reporting periods. The reports, with the numbers and stories, must be
received on time by the stakeholders in order to ensure timely adjustments (if
needed) or decisions.
- Evaluation frequency. Evaluation pertains to an external assessment undertaken
to validate the accomplishments and stories written in the reports. Usually,
evaluation is undertaken when the evaluating party is to come up with its own
plan.
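
As an illustration of how data elements feed the performance measures, the sketch below derives one measure (a number with a unit: percent) from two raw data elements. The element names and figures are invented for illustration; they do not represent actual Division data.

```python
# Invented data elements and figures, for illustration only.
data_elements = {
    "enrolment_start_of_sy": 1200,          # raw count collected through forms
    "learners_remaining_end_of_sy": 1104,   # raw count collected through forms
}

def dropout_rate(elements: dict) -> float:
    """Compute one performance measure (a number plus a unit: percent)
    from the more basic data elements."""
    start = elements["enrolment_start_of_sy"]
    remaining = elements["learners_remaining_end_of_sy"]
    return round(100.0 * (start - remaining) / start, 2)

print(dropout_rate(data_elements), "percent")  # 8.0 percent
```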
As a rule of thumb, when there is a conflict between the requirements of the internal and
external stakeholders, the requirements of the internal stakeholders must be met first.
The decision-making requirements of the internal stakeholders must be given priority in
order to help them make immediate enhancements or remediations in the interventions.
This perspective will also ensure that M&E is more about managing and making decisions
than about report writing.

10.4.4 Set Up the Monitoring Process


The next step in setting up the M&E System is to define the operating details of the system.
These include designing the monitoring process/es that will operationalize the Division M&E
System. This also includes the data collection system, the reports and reporting process, and
the feedback mechanism.
A Division Monitoring Process is a series of actions by the Division and Districts used to track,
evaluate and analyze the performance of schools and their target groups. It is a support process
undertaken to assure the quality and relevance of Division programs and projects. These
processes operationalize the data collection and reporting activities of the Division and
integrate these with the control and adjustment points of the system. Once in use, the
Division M&E Process can supply the different information requirements of the Division
management, program units, support units and districts.
- Define the M&E process. This includes defining the control points and events that
will be undertaken during the DEDP implementation.
- Finalize the reporting requirements and disseminate them.
- Formulate the M&E Terms of Reference. After detailing the requirements of the
M&E system, the next step is to define the roles and responsibilities of the Division
and District staff concerning data collection, sharing of information,
reporting assignments and the giving of feedback.

10.4.5 Communicate the System


The last step is the conduct of a Kick-off Meeting to signal the operationalization of the
Division M&E system. In football, a kick-off represents the start of a game, which means the
rules of football are now enforced. It is important to conduct a Kick-off Meeting to inform the
Division and District staff and other stakeholders that the Division M&E System is now operational.
Aside from marking the official start of the system, the Kick-off Meeting is also the venue for
the Division and District to understand the system. Before enforcing the system, it is
important to ensure that all staff:
(a) have a high awareness of the scope or coverage of the Division M&E System;
(b) can explain the context and rationale of the different M&E control and
adjustment points;
(c) are aware of the support that can be provided by the stakeholders and the
information they need to facilitate that support; and
(d) understand their roles and responsibilities in M&E.

Quality Management Inventory Model

BACKGROUND

Technical Support To SBM

The Basic Education Sector Reform Agenda (BESRA) is a package of policy reforms that seeks
to systematically improve the critical regulatory, institutional, structural, financial, cultural, physical
and information conditions affecting basic education provision, access and delivery on the
ground. BESRA is expected to create the critical changes necessary to further accelerate,
broaden, deepen and sustain improved education efforts.

BESRA's implementation of actions is focused on four main areas. These are (1) school based
management (SBM), to help schools better manage their operations for improved learning,
(2) Competency-Based Teacher Standards, to enable more teachers to practice
competency-based teaching, (3) the Quality Assurance and Accountability Framework, to provide
better institutional support to learning and quality assurance, and (4) Outcomes-Focused
Resource Mobilization, to ensure resources are focused on achieving desired outcomes.

The focal point of BESRA is the school. The main integrating vehicle for BESRA implementation
is SBM. Through SBM, schools are allowed to manage their own affairs to improve the delivery of
education services in a sustained manner. SBM also includes strengthening the school heads
in resource mobilization, negotiation, and partnerships with the community and stakeholders. Other
assistance includes the provision of funds for priority school projects. Clearly, all major efforts,
resources and funds are funneled into helping the schools manage basic education services
more efficiently and effectively.

In this regard, the Region and Division will play a very important role in supporting the schools. It
is essential for the Region and Division to have the capability and the necessary systems and
mechanisms in place to sustain their support to the schools' management of their affairs. The
Region and Division are in a strategic position to propagate, maintain the quality of, and sustain
SBM interventions and results.

It is in this context that a quality inventory assessment tool was developed to ensure the Regions
and Divisions are ready to assume the task of facilitating SBM. The assessment will focus on
their readiness. Readiness includes the presence of well-defined technical assistance
processes and support mechanisms that will support the schools' delivery of basic education
services to students.


QUALITY MANAGEMENT INVENTORY MODEL (QMIM)

The Quality Management Inventory Model is an integral part of the Quality Management
System of the Region. It is a mechanism to promote continuous improvements in the Region
and the Divisions. Its main goal is to "improve things and manage things better."

The QMIM depicts a road map that traces the Region's and Division's transformation from the use of
informal processes to more established technical assistance packages and support
mechanisms. It projects an organization's transition from the realm of uncertainty to more
repeatable and predictable results. The Model represents a progression of capability by the
Region and Division to deliver management support and technical assistance packages to
their target groups.

The QMIM is also a yardstick to assess the performance of the Region and Division. It will be
used to examine the Region's and Division's processes and support mechanisms that allow them
to efficiently and effectively deliver technical assistance packages to schools, school
managers, teachers and the schools' non-teaching staff.

The Model is defined in four levels: (1) Ad Hoc, (2) Defined, (3) Integrated, and (4) Sustained. Level 1 is the
entry level. It represents a Division characterized by ad hoc processes and an informal way
of doing things. As it matures, the Division Office is expected to establish its internal procedures
(Level 2, Defined). The Division then improves into a stage where it is expected to manage and
integrate its different mechanisms into one integrated system (Level 3, Integrated). The highest level
is the Sustained level. This represents a Division that adapts, maximizes and continuously improves
its way of doing things.

The following discussion describes the four levels in detail:

Readiness Level 1. Ad Hoc

The initial or entry level of readiness. A Readiness Level 1 Region/Division is often characterized
by temporary and informal ways of doing things. Organizational procedures or methods are
not well defined and disseminated, leading to inconsistent results and poor quality of service. Its
technical assistance packages are reactive, inefficient and not relevant to the requirements of
its target groups. Often these packages are hand-me-down practices. Their utility value and
effectiveness have not been proven, yet they are utilized year in and year out. Some may
yield positive outcomes and some may offer temporary solutions.

A Level 1 Region/Division is characterized by an absence of standards and defined processes for
planning, implementation and evaluation. In cases where there are defined procedures, these are
forgotten when the "procedure owner" leaves the organization or is replaced by another
staff member. There is no continuity and no standard way of doing things in this type of organization.

The common features of a Level 1 Region/Division are the following:

- There is no clear and/or defined way of doing things;

- If there is a defined process, it does not respond to the challenges and support
requirements of its target clientèle;

- If there is a defined process, its implementation or application is not consistent;

- Dependence on one or two individuals. These "effective individuals" are often hailed
as heroes because they are able to move the organization to achieve results. However,
when these champions leave, the organization suffers a setback; and,

- Some positive results are achieved but not sustained and/or maintained.
Region/Divisions belonging to this category may achieve good results (e.g. NAT results);
however, these are not maintained (leading to poor results in the succeeding years).

If a Level 1 Region/Division aims to do things efficiently, then it must undertake the following
steps:

1. Build the capability of staff in the following areas: planning, management, supervision
and control. The staff should know the fundamental principles involved in these
management areas and should have a very good handle on management tools and
techniques;

2. Establish or set up its own mechanisms or procedures:
- Collect and collate its practices and experiences and formulate its own set of
procedures;
- Adapt other Divisions' experiences and/or approaches that have already been
proven and tested;

3. Transform these mechanisms/procedures into Region/Division policy;

4. Communicate to Region/Division staff and build a critical mass of individuals who will
guide, guard and champion the newly established process.

The transformation of a Region/Division from Level 1 to Level 2 is critical in the maturity process
of the Division. This stage is often the most frustrating part of the change management
process. The introduction of new methods or practices is often met with suspicion and cynicism
by individuals. Aside from the suggested steps outlined above, the Region/Division
management should take sessions on how to implement and manage change.

Readiness Level 2. Defined

It is critical to first establish the support mechanisms that will be used as the platform for
delivering the Region/Division's technical assistance packages. These support mechanisms
ensure an efficient delivery of basic education strategies and services. Region/Division
organizational processes must be defined, refined and communicated in order to guarantee
consistency in the quality of its technical assistance.

At this level, Region/Divisions start to refine and define their technical assistance packages.
These packages are detailed into specific steps, refined, standardized and documented.
Region/Division staff are oriented, trained and are expected to perform these processes.

Region/Division Readiness Level 2 is about institutionalizing an efficient process or procedure
that leads to effective results.

Level 2 Region/Divisions exhibit a significant improvement from ad hoc, temporary ways of
doing things to having clearly defined and concrete processes and practices. At this level,
there is an effort to implement interventions as efficiently as possible by following a structured
approach. There is a high awareness of commonly established management tools,
techniques and procedures. But there is still a tendency to revert back to ad hoc or
traditional practices when confronted with a difficult situation. There is a defined process, but
its application is not consistent. An example is the tendency to cut short a procedure due to
time or cost constraints, thereby sacrificing quality.

Region/Divisions belonging to Readiness Level 2 may have the following characteristics:

- Have defined, formulated and established their TAP processes and support systems;

- There is a staff development program that supports the application of the defined
system;

- Application of these systems or procedures is not consistent. On a case to case basis,
defined systems and processes are ignored or not followed; and,

- Although systems are in place, these may not "talk to one another" or there may be
duplication of efforts.


Unlike Readiness Level 1, Region/Divisions belonging to this category are less dependent on
individuals (one or two staff) but need a strong-willed management that will enforce the
quality standards, defined procedures and the agreed programs and projects.

In order to achieve a higher readiness rating, Level 2 Region/Divisions should undertake the
following:

1. Increase staff awareness of Region/Division standards and procedures;

2. Document and disseminate standards and procedures;

3. Ensure a critical number of Region/Division staff are knowledgeable about the procedures;

4. Keep updated on the demands (needs and opportunities) of schools and stakeholders; and,

5. Improve Region/Division management capability to enforce adherence to its own
processes.

The transformation from Level 2 to Level 3 is a result of the organization's motivation to achieve
more as a result of its initial success. The satisfaction generated by a formal and coordinated
process is boosted by a desire to do things more permanently and consistently.

Readiness Level 3. Integrated

Region/Divisions belonging to Readiness Level 3 demonstrate a more mature and more
consistent way of doing things. In this category, Region/Divisions are able to collate, document
and transform their effective practices into an integrated, well-choreographed process. There is
high compliance with their own standards and processes, such that all Region/Division units and/or
individuals know what to do and understand the coordination, cooperation and
collaboration requirements expected of them.

A Region/Division with Level 3 Readiness demonstrates maturity in balancing the competing
requirements of quality, time, cost and quantity (scope). It is able to efficiently deliver its
technical assistance to schools, completing it based on approved schedules and
within budget. It is able to achieve high quality outcomes as a result of its adherence to its
own standards and its defined processes.

Readiness Level 3 Region/Divisions manifest the following:

- On Quality Management. Region/Division standards are well established and are used
as the basis for monitoring and evaluation;

- On Scope Management. Ability to perform all activities in the DEDP and deliver the
promised outputs; there is minimal deviation from approved plans;

- On Time Management. Tasks and activities are completed according to schedule;

- On Integration Management. Effective practices are integrated into a single common
process;

- On Configuration Management. There is consistency between the staff development
program and the system requirements; there is horizontal integration of Region/Division
mechanisms (planning, monitoring and evaluation, procurement, finance, etc.);

- On Organizational Arrangement. There is a high level of interaction and
interoperability between and among units;

- On Management of Results. The Region/Division can predict and control the results of
interventions (monitoring and evaluation).

This level is not affected by changes in management and/or personnel. The processes, systems
and practices provide stability to the operations of the organization.

To proceed to the next level, the following actions are suggested:

1. A continuing staff development program;

2. Continuing review of processes, procedures and systems;

3. A quality assurance mechanism that detects the positive and negative elements of
the processes; and,

4. Upgrading of its own standards.

Table 1. Inventory Levels and Characteristics

| Inventory Level | Characteristics | Technical Assistance Package | Actions |
|---|---|---|---|
| Level 4 Sustained | Continuous improvement of technical assistance packages; interventions and processes are enhanced based on needs and opportunities | High interaction with clientèle; always have a better way of doing things | Dynamic and improving processes |
| Level 3 Integrated | Demonstrates maturity in balancing the competing requirements of quality, scope, time and cost | Consistent application of processes | Continuous capability building and improvement of processes |
| Level 2 Defined | Installation of well defined processes to guide service delivery but inconsistent in its enforcement and/or application | Institutionalizing but still inconsistent in application | Standardized and enforced defined processes |
| Level 1 Ad Hoc | Temporary way of doing things; process may vary depending on situation and personalities | No clear way of doing things; personality dependent | Define the processes, build staff capability |
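
Since the four levels form an ordered scale, they can be sketched as a simple enumeration that pairs each level with the next action condensed from Table 1. This is a hypothetical encoding for illustration, not an official rating instrument.

```python
# Hypothetical encoding of the four QMIM levels; the actions are condensed
# from Table 1 and this is not an official rating instrument.
from enum import IntEnum

class ReadinessLevel(IntEnum):
    AD_HOC = 1      # temporary, personality-dependent ways of doing things
    DEFINED = 2     # processes defined but inconsistently applied
    INTEGRATED = 3  # consistent, integrated application of processes
    SUSTAINED = 4   # continuously improving processes

NEXT_ACTIONS = {
    ReadinessLevel.AD_HOC: "Define the processes, build staff capability",
    ReadinessLevel.DEFINED: "Standardize and enforce the defined processes",
    ReadinessLevel.INTEGRATED: "Continue capability building and process improvement",
    ReadinessLevel.SUSTAINED: "Keep enhancing packages based on needs and opportunities",
}

level = ReadinessLevel.DEFINED
print(f"Level {level.value} ({level.name}): {NEXT_ACTIONS[level]}")
```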

Readiness Level 4. Sustained

The Region/Divisions are expected to facilitate the implementation of school based
management (SBM) at the school level. They will play a critical role in preparing the schools
to achieve the desired level of maturity under a decentralized management set-up. Given the
enormity of the task, the Region/Division must be a growing organization in order to be
effective. It should be able to respond, adapt and adjust to the unique and changing
requirements of the schools. Meaning, its preoccupation is not with the packages and processes
already established, but with adjusting and improving these to suit the challenges and requirements
of its school clientèle.

The Region/Division's maturity at this level hinges on its commitment to excellence. It must
have the ability to perform continuous improvements, always optimizing the gains or outcomes
of its undertakings. Therefore, a Readiness Level 4 Region/Division should have the following
traits:

- Defined processes are regularly updated in accordance with the strengths,
weaknesses, opportunities and threats faced by the schools;

- Defined processes are improved and in sync with agency policies and directions.

This level adheres to the principle of continuous improvement, always optimizing gains or results.
In order to maintain this level, the following efforts are suggested:

1. Regularly conduct research and evaluation studies;

2. Commit to the monitoring, evaluation and adjustment process;

3. Document lessons learned and other experiences; input these to the improvement
and/or enhancement of Region/Division products and processes; and,

4. Learn from other Region/Divisions' experiences.

It is inherent for people to leave the organization. Individuals who played a big role in the
continuous improvement and growth of the organization will eventually retire, resign, or get
transferred to other stations. It is imperative that institutional memory is maintained and
handed over to the next generation of Region/Division staff who will continue the culture of
excellence established in the Region/Division. In this regard, a good knowledge management
program must be in place to sustain Level 4 Region/Divisions.


Technical Assistance Package (TAP)

The focus of the QMIM is the ability of the Region/Division to efficiently deliver its technical
assistance packages to its target group. A Technical Assistance Package is a set of activities,
developed and defined by the Region/Division into a process, designed to solve a particular
issue and/or to achieve desired education objectives or outcomes.

The QMIM classifies these packages into three types:

1. School Based Management Packages

These packages constitute the core process areas that will directly impact learning
outcomes. These include school based management, the learning management program,
instructional supervision, learning materials and learning environments. Capability
building assistance for school heads and teachers to improve instruction is also part
of this package.

2. Management Mechanisms

These govern the operations of the Region/Division as a system. Management
mechanisms provide the backbone for the implementation of Division TAPs, as they
provide the integration processes required to efficiently conduct and manage
operations. Management mechanisms include planning, appraisal, the staff development
program, and monitoring and evaluation.

3. Support Mechanisms

These are necessary support processes that are vital to the efficient operations of the
organization. Support mechanisms refer to processes pertaining to finance, human
resource management, administration and procurement.

The management readiness of a Region/Division will be assessed using these TAPs. Essentially, the
core review areas include: (1) technical assistance on the provision of SBM assistance and
instructional support to schools, (2) management processes that integrate the Region/Division
operations, and (3) support processes of the Region/Division to deliver technical assistance as
efficiently as possible. The expectation is that the Region/Divisions are in a position (i.e. with
well defined and tested procedures and mechanisms) to provide consistent, relevant and
timely assistance to divisions/schools.


The efficiency and effectiveness of a Region/Division depend on how it delivers its
technical assistance support to schools. Ideally, the TAPs are products of the Division's effort to
continuously improve its capability to provide support to schools. The type of support that a
division/school will receive depends on the maturity of the Region/Division to consistently,
efficiently and effectively deliver its technical assistance packages.

Table 2. Core Review Areas - Quality Management Maturity Assessment

| Review Areas | Processes |
|---|---|
| Division's SBM Technical Assistance | Capability Development Program for School Heads and Teachers on SBM, Curriculum Implementation, and Teaching and learning; Educational Planning Process; Instructional Consultancy; Instructional Materials Development; School Building Program; Managing resources and networking |
| Division Management Mechanisms | Division M&E System, which includes processes on SIP Appraisal, Start Up Review, Annual Implementation Review and SIP Outcome Evaluation; Division Education Planning Process |
| Division Administrative Support Mechanisms | Human Resource Management (including recruitment, selection, performance appraisal and promotion); Procurement Process; Finance and budgeting |


CRITICAL ASSUMPTIONS

Listed below are the critical assumptions used to define the scope and boundaries of the
Quality Management Inventory Model. They also provide the context, focus and locus of the
assessment.
1. The management readiness of the Region/Division should impact quality education. The
main objective is to strengthen and improve the Region/Division's delivery of programs
and projects that will positively impact the learners.

2. The context of the QMIM hinges on the ability of the Region to prepare and
strengthen the Divisions in facilitating and sustaining support to schools on SBM. On
the part of the Division, it is anchored on its ability to provide the necessary
technical support for schools to effectively implement SBM. As such, the assessment
will focus on the capability of the Region/Division to deliver technical assistance
packages that will facilitate and sustain the schools' delivery of basic education
services.

3. Management readiness is not intended to appraise the performance of the Regional
Director, Assistant Director, Schools Division Superintendent, Assistant Schools Division
Superintendent and their staff. Instead, the focus and locus of the inventory assessment
is the Region/Division as a whole. The current practices of the Region and the Divisions
will be assessed. Specifically, the focus of the assessment is on how the Region/Division
delivers/operates its technical assistance packages and support mechanisms.

4. The highest management readiness or maturity level of a Region/Division should
manifest the following characteristics:

- Repeatable and predictable education results. A ready or mature
Region/Division is able to produce outcomes as a result of its efforts rather than
by chance. For example, a Division achieves high NAT results in 2005 and poor
NAT results in 2006; a mature Division is able to predict and repeat results;


- Holistic and integrative. Technical support is not given in a piecemeal approach
but always considers the implications and effects of a support to other areas;
able to consider both the management and instructional support requirements
of front liners;

- Relevant. Its service delivery mechanisms are custom-fitted to meet the unique
and changing demands or needs of schools;

- Proactive. It is always a step ahead, readily responding or reacting to changing
demands of school stakeholders;

- Timely. Delivery of basic education services is on time, based on plans and
programs;

- Consistent, repetitive and simple. The Division's service delivery effort is consistent,
can be repeated, and is simple;

- The best technical support. Provides the most efficient and effective service
delivery mechanism to schools and school stakeholders; and,

- Continuous improvement. The Division should have a culture of excellence, always
challenging itself to provide better and better service.

5. The inventory model ranges from the least mature stage to the most mature state. The
model assumes that organizational readiness must be grown over time in order to
produce repeatable success.


CRITICAL MANAGEMENT APPROACHES

One of the critical assumptions used in the Quality Management Inventory Model is
continuous improvement. Continuous improvement is based on the premise that change will
occur and will always challenge the status quo. This may be brought about by new needs and
issues, new policies and thrusts, new standards and challenges. These factors will force the
Region/Division to continuously innovate, change and improve itself in order to be efficient and
effective.

In order to ensure continuous improvements, the Region/Division must adopt certain basic
management approaches. Adherence to the basic principles and techniques of these
management strategies will help sustain and maintain a high readiness level of the
Region/Division.

In order to facilitate the maturation process of the Region/Division from Level 1 to Level 4, the
following management approaches are essential inputs to the Region/Division staff development
program. These include change management, project management and knowledge
management.

Change Management

Change management is a structured approach to changing individuals, teams and
organizations that enables them to transform from the current state to a desired future
state. The maturation of a Region/Division from Level 1 to Level 4 depends on how
changes are introduced and applied. In this regard, the Region/Division readiness
programs should incorporate capability building, especially for management staff, on
organizational change management. Management staff should be equipped with
techniques on how to effectively plan, implement and manage changes for Division
personnel.

A well-planned change management program is very important in the transition from
Level 1 to Level 2 readiness.

Knowledge Management

Knowledge management (KM) is an important ingredient of continuous improvement.
KM pertains to the practices organizations use to distribute, transfer and propagate
knowledge within the organization. KM is an important input to sustaining organizational
efficiency.


Project Management

Project management (PM) is the use of different tools and techniques applied to activities in
order to achieve objectives or targeted results. PM boasts robust planning,
implementation and control techniques that can help the Division deliver its services in
a more efficient manner. PM techniques will serve as valuable tools in standardizing
organizational processes and integrating these into a more coherent and optimal
process. Knowledge of and skills in PM are very important for Levels 2 & 3 Readiness.

These management approaches should not be taken in isolation. A deliberate
effort to integrate the three approaches should be undertaken. The change management
approach provides the strategies on how to "soften" resistance and facilitate change when
new processes are introduced. Knowledge management, on the other hand, provides the
input and perspectives on how to manage, share, propagate and sustain these processes. And
lastly, project management techniques lend themselves well to managing change and
managing knowledge.


Assessment Areas

Region

The scope of the QMIM assessment for the Region includes the following technical assistance
packages and mechanisms:

Technical Assistance to Divisions
- Capability Building Program for Division Management and Staff
- Learning Materials Support

Region's Management Mechanism
- Strategic Planning
- DEDP Appraisal
- Outcome Evaluation
- Progress Monitoring and Evaluation
- Assessment (learners' achievement)
- Policy Research

Administrative Support Mechanism of the Region
- Human Resource Management
- Procurement Process
- Finance and Management Support

Division

The Quality Management Inventory Model covers assessment of the Divisions' demonstration of
the following technical assistance packages and mechanisms:

School Based Management Assistance to Schools
- Drop Out Reduction Program
- Capability Building Program for School Heads on SBM
- Capability Building Program for Teachers
- Learning Materials Support to Schools
- School Building Program

Division Management Mechanism
- Strategic Planning
- SIP Appraisal
- Monitoring and Evaluation
- Information Support
- INSET for Non-Teaching Staff
- Performance Evaluation

Division Administrative Support Mechanism
- Human Resource Management
- Procurement Process
- Finance and Management Support


QUALITY MANAGEMENT INVENTORY MODEL (QMIM) ASSESSMENT

Objectives

The aim of the QMIM Assessment is to determine the capability of the Region/Division to implement timely,
consistent and relevant technical assistance packages for its school clientèle. The assessment
focuses on the ability of the Division to consistently deliver quality technical assistance
packages through processes designed to improve efficiency and assure effectiveness of
service.

Specifically, the assessment aims to accomplish the following:

1. Document processes and mechanisms within the Region/Division. The
assessment is designed to solicit effective practices within a Division Office that may
be shared and replicated in other areas;

2. Determine and pinpoint areas where a Region/Division is considered mature and
where it needs improvement. The results will also provide valuable insights and
perspectives on which processes work and which do not;

3. Determine the support requirements of the Region/Division. These may include human
resource complement, staff development, and process and system support; and,

4. Provide input to the DepED Regional Office. This assessment can be one of the strategies for the
Regional Office to implement its quality assurance and accountability work. The results
can be used by the Regional Office as input in designing its technical assistance support
to the Division Office.

On the part of the Division, the readiness model may be used as a template by which it can
assess its own operations and determine its capability building requirements. For both the
Regional Office and Division Office, the results of the assessment provide important inputs
toward a more efficient and effective delivery of basic education services.

Methodologies

The QMIM Assessment will make use of different methods of data gathering. These include:

1. Key informant interview. A one-on-one session with the process owner or the
individual/staff member who is directly accountable for the implementation of the technical
assistance packages.

2. Interview. A session with staff who have direct knowledge of the process areas or who
were involved in the process. Interviewees may also include school heads and
teachers who were recipients of the service(s) provided by the Division.

3. Focus Group Discussion. A session with selected Division staff and/or school heads and
teachers discussing the practices and/or processes implemented by the Division. The
FGD will help validate the claims of the key informants or process owners.

4. Artifacts Review. Refers to gathering documents that will prove the existence of a Division
process or practice. This also includes inspection of the documents.

5. Observation. When possible, the Assessment Team may conduct observation of the
actual process being undertaken by the Division staff.

Process Owner

The Region will be the Process Owner of the QMIM assessment. The Region will create Quality
Management Team/s (QMTs) that will be tasked to undertake the assessment. As process owner, the
Region must ensure the following:

1. Integrity of the process. It must be undertaken as objectively as possible and used
simply for continuous process improvement;

2. Usefulness of the process, especially the findings and results. These should find their way
into the Regional Education Development Plan and be used as design considerations for
Region programs and projects;

3. Capability building of the QMT members, to assure that they are ready and capable to
become assessors; and,

4. Continuous improvement of the assessment process. This means improvement in the
assessment tools, methods and management arrangements.


ASSESSMENT TOOL

The QMIM Assessment Tool was developed to facilitate the conduct of the QMIM assessment of
the Region/Division in implementing key process areas to deliver its technical assistance
packages (TAPs). The Assessment Tool contains the key process areas of the Region and
Division and various scenarios using the 4 maturity levels (ad hoc, defined, integrated and
sustained).

The Assessment Tool is not a checklist but serves as a guide for the QMTs to objectively
document the application or utilization of existing key process areas of the Region/Division. The
tool will also be used to facilitate the documentation of best or effective practices in the
Region and Divisions.


MANAGING THE QMIM ASSESSMENT

The process owner of the QMIM assessment process is the Region. The following will serve as a
guide for the QMTs deputized by the Region to implement the QMIM assessment. The
steps listed below are the minimum requirements for conducting an efficient assessment.
Depending on the requirements of the Region, additional activities and requirements may be
added.

A. Start Up - Chance favors the prepared mind

The following are suggested start-up activities for the Region to implement before undertaking
the Division QMIM assessment.

Step 1. Review

1. Review the concepts and principles of quality management, process improvement and
the inventory levels. Make sure each assessor understands these concepts in order to
ensure the same paradigm among team members/assessors.

2. Ask each team member to familiarize himself or herself with the content of the
Assessment Tool. This will minimize dependence on the Assessment Tool and allow the
assessors to conduct the interviews as "normally" as possible without much interference
from looking at the tool from time to time.

3. Make sure that all team members understand how to use the Assessment Tool. This
includes understanding of the continuum (levels of maturity) and the documentation
requirements.

4. On rapid appraisal, remind each team member that the assessment approach being
utilized is the "not so quick and not so dirty" approach. Review the principles and
methods of rapid appraisal.

5. Conduct a group review of the key process areas or management processes that must be
present in a Division. If possible, the Team should review and familiarize themselves with
the objectives, strategies and content of the Division Education Development Plan.

Step 2. Assessment Team

1. Form assessment teams based on the targeted number of Divisions to be assessed
and the target date of completion. Ensure the teams will be able to cover all divisions
based on the time allotment.

2. Assign a Team Leader for each team. The Team Leader shall:
- ensure access to documents and materials the team may need for the
assessment;
- ensure the team has enough copies of the Assessment Tools and other related
materials;
- coordinate Division visit schedules;
- orient the Schools Division Superintendent and interviewees on the purpose of the
assessment;
- conduct an exit conference (after the assessment); and
- consolidate the reports of team members.

3. The number of team members per team should be enough to cover all the items in
the Assessment Tool and to ensure the documentation requirements are met. The
suggested minimum number of members is 5.

4. Assign team members who are very knowledgeable about the management
processes. Form a multi-disciplinary team to ensure coverage of the key process areas.

Step 3. Work Plan

1. Prepare a work plan detailing the activities to be undertaken by the Team. The work
plan should also include the schedules, logistics and financial requirements needed to
undertake the assessment.

2. Orient all the Assessment Teams and individual team members about the schedule of
the QMIM assessments and the important milestones in the plan such as the deadline
for the preparation of reports.

B. Implementation - The Assessor is the Instrument

Step 4. Preparing for the Division Visit

1. Send formal communication to the SDS regarding your visit. Indicate in your
communication the purpose of your visit, the people you want to talk to and the
documents that must be made available during your visit. Also specify if you need to
interview school heads and teachers so that arrangements can be made by the
Division before the actual review.

2. Coordinate with the Division about your travel arrangements, logistical requirements
and other administrative prerequisites to ensure the smooth conduct of the assessment.

3. Make sure every team member has a copy of the Assessment Tool and other
necessary forms. Have them reproduced before going to the Division.

4. Conduct a final team meeting before going to the Division. Apprise the team members of
their roles and responsibilities and the scope of the evaluation.

Step 5. During the Division Visit

- Start your visit with a courtesy call. Discuss the purpose of your visit, your plan for the
day, the people you need at a particular time and the documents you need to review.
If the Division opted to conduct an opening program or ceremony, try to limit this to 30
minutes.

- Assure the Division management and staff that the QMIM assessment is not meant to
evaluate the performance of the Division but to ascertain its level of readiness to
perform critical processes or technical assistance packages.

- Whether you use an interview, group interview or focus group discussion, tell the
respondent(s) about the purpose of the activity. Inform respondents that you will be
taking down notes in the course of the discussion.

- Don't forget to ask for evidence (MOVs) of the claims that the respondent(s) made, and
thank respondents for their participation as soon as you have finished your data
gathering with them.

- As soon as you have completed your data gathering activity, organize your data and
meet as a team to prepare for the exit conference.

- Conduct an exit conference with the SDS. Point out significant observations regarding
the Division's level of readiness but avoid making judgments or conclusions right away.
Inform the SDS that the Team will discuss the observations, analyze the findings as a
Team and make recommendations.

- Thank the SDS and tell him/her that a complete report will be sent to him/her officially
by the Regional Office.

Step 6. Post QMIM Assessment

1. Encode and consolidate the observations and documentation of the Team using the
same format. Provide each member a copy of the documentation.

2. Convene the Team to discuss and analyze the results.

3. During the Team meeting, come up with a consensus regarding the Level of Maturity
of the Division per process area. Prepare a report on the assessment.

4. Given the findings and analysis, the Team is to formulate recommendations and
suggestions to the Region on how to assist the Division in improving and/or reinforcing its
practices and/or delivery of technical assistance packages.
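
Where the Team needs a neutral starting point for the consensus discussion in item 3, the individual ratings per process area can first be consolidated mechanically, for example by taking the median and noting the spread. The sketch below assumes hypothetical ratings from five assessors; the QMIM itself prescribes consensus discussion, so this is only a starting point for that discussion, not a replacement for it.

```python
# Hypothetical ratings (1 = Ad Hoc .. 4 = Sustained) from five assessors;
# the median is only a starting point for the Team's consensus discussion.
from statistics import median

ratings = {
    "SBM Technical Assistance": [2, 2, 3, 2, 2],
    "Monitoring and Evaluation": [1, 2, 2, 1, 2],
    "Procurement Process": [3, 3, 3, 2, 3],
}

for area, scores in ratings.items():
    print(f"{area}: proposed level {int(median(scores))} "
          f"(spread {min(scores)}-{max(scores)})")
```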

C. Completion - Connect the dots

The purpose of the QMIM Assessment is to facilitate continuous improvements in the
operations of the Division concerning its practices and its current way of doing things. The
reports, which include the findings, insights and lessons learned, should find their way into the
Region and Division plans so that appropriate actions can be taken in the future. Thus, the
Completion Phase of this assessment should not only include report preparation but should
also lead towards plan enhancement.

Step 7. Region Report

1. Submit and discuss the findings, analysis and recommendations with the Regional
Director.

The report shall include:

- the procedure the Assessment Teams adopted to gather the data.
The discussion should cover all the activities done before (preparations
made), during (interviews and generation of MOVs) and after the assessment
(collation of data, manipulation of data, how the data were analyzed,
etc.);

- a description of the results of the QMIM assessment of the Division.
This section will present the findings on the various items studied,
indicating the maturity level of the Division;

- the categorization of the Division vis-à-vis the level of process maturity; and

- the actions needed in order to fast track the progression or development of
the Division's practices and processes to the next level.

2. Upon approval, provide feedback to the Division, especially on the recommendations
and suggestions of the Region.

Step 8. Region Next Steps

1. The findings of the QMIM Assessment should find their way into the Region's technical
assistance packages for the Division. The results of the Assessment will be used as
input by the Region to define and formulate programs and projects for the Division.

2. The findings concerning Level 3 (Integrated) or Level 4 (Sustained) should find their way into
the documentation of best/effective practices of the Division.

3. The QMIM results can also be used as input to the Outcome Evaluation to be
undertaken by the Region. They can also be used to amend, enhance and/or formulate
new standards and policies.

4. The Division may want to implement the assessment in the other 70% of schools not
covered by the initial assessment.
