Division Monitoring & Evaluation System
1. Introduction
2. Objectives
3. Scope
4. Performance Measures
5. School Monitoring Process
6. Quality Control and Adjustment Points
7. M&E Tools and Techniques
8. Documents and Reports
9. Terms of Reference
10. Setting Up the School M&E System
Division Quality Management Inventory Model (QMIM)
DIVISION M&E SYSTEM
1.0 INTRODUCTION

THE FISHERMEN
AUTHOR UNKNOWN, SOURCE UNKNOWN
an efficient operation and how the organizations can benefit from it. People would get
excited about the importance of M&E, often resulting in many bright ideas and plans
about how to implement it. However, the perceived importance of M&E does not always
translate into actually doing and implementing M&E. Often, critical elements of the
M&E system are missing. As a result, activities and events are undertaken in the name of
M&E, and yet most fail to provide the information needed to make good decisions.
Data gathering and report writing are often mistaken for M&E itself.
The story of The Fishermen draws an important parallel to the practice of M&E by
organizations. Too many activities and events are undertaken in the name of M&E. Forms
and data gathering instruments are developed, but they are often incoherent. Costly
infrastructure and facilities are set up, but their usage is far from maximized. And
generally, despite all the efforts stated above, the most basic information requirements are
still missing.
It is ironic that one of the most important systems is also one of the most neglected
systems in organizations. Often, there are too many fishermen's fellowships and yet the
fishers are few. The main purpose of this Manual is to serve as a guide to would-be
monitors and evaluators on how to operationalize an M&E System. This document illustrates
the fundamental requirements and techniques of implementing M&E at the Division level.
The schools (fish) are plentiful. There is an urgent need to set up an efficient M&E system
to enable the monitors to actually fish.
destination (outcomes), the directions (strategies) and the means (resources) to get
to the destination. It is important to ensure that the plan is accurate, correct and
clearly written, especially the targets and indicators.
The Plan defines the areas to be monitored and evaluated.
2. Decision-making. During implementation, things may not go according to plan.
Every manager must make correct and timely decisions before things get out of
control. The objective of every manager is to ensure that, despite changes in the
frame conditions and changes in the plan, the outcomes can still be achieved.
Necessary adjustments have to be made in the strategies and activities, and in the
use of resources, in order to keep implementation on track in terms of schedule,
targets and quality.
The quality of decisions depends on the timeliness and completeness of the
information that a decision maker has at hand. In setting up the M&E system, one of
the key considerations is knowing the information requirements of the key internal
stakeholders: the manager, the supervisor and the field personnel.
The M&E function is core to decision-making.
3. Continuous improvement is a management process where delivery processes are
constantly evaluated and improved in light of efficiency, effectiveness and
flexibility (Wikipedia).
1.3.1 Definition
Monitoring and Evaluation (M&E) is defined as the systematic process of gathering,
processing, analyzing, interpreting, and storing data and information thereby setting into
motion a series of managerial actions for the purpose of ascertaining the realization of
set objectives.
M&E is composed of three interrelated processes. These are:
Monitoring refers to the systematic observation and documentation of actual
accomplishments, as well as the tracking of issues, opportunities and problems that may
affect implementation.
Evaluation concerns the assessment of information (collected through monitoring)
regarding the extent to which actual accomplishments conform to or deviate from
the objectives set in the plan.
Adjustment means steering the implementation: using the information and insights
derived from evaluation to adjust the strategies or ways of doing things so that
implementation becomes more efficient and leads toward the realization of objectives.
The main purpose of M&E is to spur managerial actions based on information and insights
collected, processed, analyzed and interpreted by the monitors and evaluators. These
managerial actions are undertaken to improve performance during implementation and to
increase the likelihood of achieving the desired outcomes.
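The monitor-evaluate-adjust cycle described above can be sketched in code. The sketch below is purely illustrative: the function names, items and the 10-percent tolerance are assumptions introduced here, not anything prescribed by the Manual.

```python
# Illustrative sketch of the monitoring-evaluation-adjustment cycle.
# All names, figures and the tolerance threshold are hypothetical.

def monitor(planned: dict, actual: dict) -> dict:
    """Monitoring: document actual accomplishment against the plan."""
    return {k: {"planned": planned[k], "actual": actual.get(k, 0)} for k in planned}

def evaluate(observations: dict, tolerance: float = 0.10) -> dict:
    """Evaluation: flag items deviating from the plan by more than the tolerance."""
    deviations = {}
    for item, obs in observations.items():
        gap = (obs["planned"] - obs["actual"]) / obs["planned"]
        if gap > tolerance:
            deviations[item] = gap
    return deviations

def adjust(deviations: dict) -> list:
    """Adjustment: turn evaluation findings into managerial actions."""
    return [f"Adjust strategy/resources for '{item}' (behind by {gap:.0%})"
            for item, gap in deviations.items()]

planned = {"training sessions": 10, "school visits": 20}
actual = {"training sessions": 9, "school visits": 12}
actions = adjust(evaluate(monitor(planned, actual)))
```

Run end to end, only "school visits" (40 percent behind) triggers an adjustment; "training sessions" stays within tolerance. The point of the sketch is the flow of information, monitoring feeding evaluation feeding adjustment, not the particular thresholds.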
Scope. All M&E efforts should have a scope. The scope provides the standards and
parameters for evaluating the performance of programs, projects and even
individuals. The coverage of M&E is defined by the approved or accepted plan.
Without a plan, there is no scope for M&E. Specifically, the scope will define the
following M&E concerns:
outcomes to be achieved
outputs to be delivered
activities to be undertaken
budget or cost
Means of verification (MoV). One of the main features of any M&E system is the
means of verification. MoVs are authoritative sources of information about the
achievement of outputs and the actual performance. The role of M&E is to provide
relevant, timely, and accurate information about the achievements and status of
implementation. MoVs include:
Testing
Reallocation of resources
Rescheduling
No actions at all
Termination/replacement of staff
External factors. M&E also keeps track of the possible occurrence of external
factors that may affect actual performance. These factors include:
[Figure: types of Division M&E — Initial Gains Evaluation measures the effects of interventions during implementation, i.e., improvements in performance, behavior and practices toward Intermediate Objectives]
1.4.1 Progress Monitoring and Evaluation
Progress Monitoring and Evaluation is undertaken during the implementation stage and is
an integral part of the plan-design-act-control cycle.
Initial Gains Evaluation keeps track of the changes or improvements in the performance
and/or practices of the target groups. Initial gains represent leading indicators, the
achievement of which will lead to the attainment of desired outcomes.
Evaluations of this type are conducted at the mid-term of implementation and before the
completion of the plan.
DIVISION M&E SYSTEM
2.0 OBJECTIVES OF THE DIVISION M&E
2.1 Definition
The Division M&E System is a mechanism for gathering, processing, analyzing, interpreting,
and storing data and information about the schools' performance, needs and requirements
to sustain effective school-based management. Operated by the Division, it is a System
which provides data, information and insights on the efficiency and effectiveness of the
Division's technical support to schools. It sets into motion a series of managerial actions,
adjustments and realignments for the purpose of creating a sustained impact on the
quality of education provided by schools to learners.
A complete Division M&E System should have the following features:
Organized gathering and processing
Analysis and Interpretation
Storing data and information
Managerial actions
Realization of objectives
2.2 Objectives
The main objective of the Division M&E System is to ensure the timely flow of information
and insights on the effectiveness of the Division's technical assistance in improving school
performance. The System is used to keep track of the Division's programs and projects.
Specifically, the Division M&E System will provide data and information on:
school performance. The System will allow the Division to adjust its technical
assistance on SBM according to the schools' performance in enrollment, retention,
completion and achievement. This will facilitate the classification and profiling of
schools into high, average and low performance. The classification will be used as
the major input to customizing the programs and projects of the Division based on
school performance.
participation rate. The Division M&E System provides data and information on the
percentage of learners of school age participating in the basic school system and
the number of out-of-school youth and indigenous people being served by the
alternative learning system.
capabilities of the school heads. One of the main target groups of the Division is
the school head. The Division will track the performance and requirements of the
school heads on instructional supervision and SBM.
capabilities of teachers. Another major target group of the Division is the teacher.
The tracking will include the teachers' teaching skills and mastery of the subject
matter.
efficient management of the DEDP implementation. The Division M&E System will
also be used to assess the internal efficiency of the Division, especially in the
implementation of the programs and projects outlined in the DEDP.
difficulties, problems, issues or risks that hinder efficient implementation of Division
programs and projects.
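The classification of schools into high, average and low performance, mentioned in the first item above, could look something like the sketch below. The equal-weight composite score and the 85/70 cut-offs are hypothetical assumptions for illustration; the Manual does not prescribe a formula.

```python
# Hypothetical classification of schools by performance.
# The equal-weight composite and the 85/70 cut-offs are assumptions,
# not prescribed by the Manual.

def classify_school(enrollment_rate: float, retention_rate: float,
                    completion_rate: float, achievement_rate: float) -> str:
    """Return 'high', 'average' or 'low' from four indicator values (0-100)."""
    composite = (enrollment_rate + retention_rate
                 + completion_rate + achievement_rate) / 4
    if composite >= 85:
        return "high"
    if composite >= 70:
        return "average"
    return "low"

print(classify_school(90, 92, 88, 86))  # prints "high"
```

A classification like this would feed directly into customizing Division programs: each band of schools gets a different package of technical assistance.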
The Division M&E System is part of the Integrated M&E System which connects the Division
to schools and to the Region. This will enable the Division to collect and share data,
information and insights from the schools to the Region and vice-versa. The integration will
provide the Division with critical and timely information regarding its operations and will
allow it to adjust or improve its technical assistance based on the needs and requirements
of the schools. Also, the Division's documentation of practices, initial gains and results will
serve as valuable inputs to the Region and National Offices to improve their respective
programs, policies and standards.
2.3.1 Organized Gathering and Processing of Data and Information
In monitoring and evaluation, it is important that the collection of data and information be
done in an orderly and systematic manner. A typical Schools Division deals with hundreds
of elementary and secondary schools. It also has to track the performance of community
learning centers and their service providers.
In this regard, the Division needs an organized and efficient system of gathering and sorting
information to reduce repetitive, costly and time-consuming gathering of data. An
organized system will facilitate the following:
accuracy of data and information
non-duplication of data and efforts
more time for technical assistance
2.3.2 Systematic Storing of Data and Information
The Division M&E System is the most authoritative source of information about the
performance of schools. It stores information on the performance of schools within the
Division and is a repository of programs and projects that can be considered part of the
effective practices of the Division. These can be shared with all schools when they need
the information, an important input to knowledge management.
As such, the M&E System will enable the following:
prompt retrieval of data and information when needed
detailed recording of information
standardized formats, documents and reports.
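A minimal sketch of what systematic storing with prompt retrieval could look like is given below. The class name, field names and school identifier are invented for illustration only; any real implementation would follow the Division's own standardized formats.

```python
# Hypothetical repository of school performance records.
# The class, field names and school identifiers are illustrative only.

class DivisionRepository:
    def __init__(self):
        self._records = {}  # keyed by school ID for prompt retrieval

    def store(self, school_id: str, year: int, indicators: dict) -> None:
        """Record indicators in a standardized format: one entry per school-year."""
        self._records.setdefault(school_id, {})[year] = dict(indicators)

    def retrieve(self, school_id: str, year: int) -> dict:
        """Prompt retrieval of a stored record; raises KeyError if absent."""
        return self._records[school_id][year]

repo = DivisionRepository()
repo.store("ES-001", 2010, {"enrollment": 520, "retention_rate": 94.0})
```

The design point is simply that records are keyed and formatted uniformly, which is what makes prompt retrieval, detailed recording and non-duplication achievable.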
2.3.3 Facilitating Timely Managerial Actions
A must-have feature of an M&E system is the ability to provide relevant information to
facilitate decision-making. In this regard, deriving such information to aid decision-making
and the timing of the decisions to be made are very important considerations in the design
of the M&E system.
The monitoring activities and quality control points to be implemented by the Division are
timed with the implementation requirements of the schools. In this way, the data,
information, insights and lessons derived from the Division M&E System are immediately
used for making managerial and technical decisions that will support the schools.
DIVISION M&E SYSTEM
3.0 SCOPE OF THE DIVISION M&E SYSTEM
Objectives / Performance Indicators / Means of Verification / Type of M&E and Division M&E Process

1. Improved school performance
Performance Indicators: (a) Reduce the disparity between high performing schools and low performing schools (in NEAT and NAT) by --- percent; (b) Reduce the disparity in enrollment, drop-out and completion rates between high performing schools and low performing schools
Means of Verification: Division Report Card; Division Education Development Plan (DEDP)
Type of M&E: Outcome Evaluation; Process: Monitoring DEDP Implementation

2. Improved teachers' performance
Performance Indicators: (a) Teachers demonstrate competencies on general content and subject-specific skills; (b) Teachers meet the desired competencies based on the NCBTS
Means of Verification: Division Report Card and DEDP; Teachers' Performance Assessment Report; Assessment for Math and Science teachers
Type of M&E: Tracking Intermediate Results; Process: Monitoring DEDP Implementation

3. Improved school heads' performance
Performance Indicators: (a) School heads demonstrate competencies on school-based management and instructional supervision
Means of Verification: Division Report Card and DEDP
Type of M&E: Tracking Intermediate Results; Process: Monitoring DEDP Implementation

4. Improved learning environment
Performance Indicators: (a) Teacher-to-learner ratio is 1:45; (b) Learner-to-textbook ratio is 1:1; (c) Teacher-to-teacher-manual ratio is 1:1; (d) Teachers and learners have access to school equipment, science laboratories and other facilities; (e) Schools comply with the Standards of a Child-Friendly School
Means of Verification: Division Report Card

1. Improved Division performance
Performance Indicators: (a) Increase in the gross enrollment rate; (b) Improvement in the net enrollment rate; (c) Reduce the disparity in the net enrollment ratio / participation rate between highly urbanized and SRA Divisions
Means of Verification: Division Report Card and DEDP
Type of M&E: Tracking Intermediate Results; Process: Monitoring DEDP Implementation
1 The scope of outputs varies depending on the target (quantity) outcomes, needs and targets specified in the DEDP.
The Division M&E System is an internal system designed primarily to cater to the
decision-making requirements of the Schools Division Superintendent (SDS), Assistant
Schools Division Superintendents (ASDS), Education Supervisors (ES) and other Division staff.
The implementation of the System is not merely a matter of compliance with the
requirements of the Region but a critical support mechanism for the Division's role of
providing quality and relevant programs and projects to schools and community learning
centers. At the same time, the Division M&E System provides feedback to the Region and
National Offices on the effectiveness of existing policies and provides information on
issues, concerns and opportunities for the policy agenda.
The Division M&E System, especially the Progress M&E, provides the Division implementers
with up-to-date and accurate information needed in making day-to-day decisions to
assure the best courses of action and support that will improve performance.
DIVISION M&E SYSTEM
4.0 PERFORMANCE MEASURES
4.1 Impact
Division impact is measured in four areas. These are:
Increase in the participation rate. The first measure of school effectiveness is the
ability of the school to bring learners of school age to school. The primary
indicator for access is an increase in the school's enrollment.
Retention. School effectiveness is also measured in terms of its ability to encourage
learners who are in school to stay in school. The primary measure of success in
this area is the retention rate. Other indicators like the drop-out rate and the school
leavers' rate will also be used.
Learners complete the requirements from Grade 1 to Grade 6 or from 1st Year to 4th
Year High School. Another measure of effectiveness is the ability of the school
to assist or compel the learners to complete the requirements at the elementary
level or at the secondary level. The indicator to be used for this area is the
completion rate, supported by other indicators like the graduation rate and the
cohort survival rate to help explain the phenomena.
Learners' achievement. The last, but not the least, measure of school effectiveness is
the learners' achievement. This pertains to the learners' demonstration of the required
competencies (at every level) and their readiness to pursue the next higher level of
learning. Learners' achievement is a progressive indicator that shows the progress of
learners at every level.
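The impact indicators above can be computed in more than one way. The simplified formulas below are common textbook definitions offered for illustration only; they are assumptions, not the official DepEd formulas.

```python
# Simplified, illustrative formulas for two of the impact indicators.
# These are common textbook definitions, not official DepEd formulas.

def retention_rate(enrolled_this_year: int, enrolled_last_year: int) -> float:
    """Share of last year's learners still enrolled this year, in percent."""
    return 100.0 * enrolled_this_year / enrolled_last_year

def completion_rate(completers: int, cohort_size: int) -> float:
    """Share of an entering cohort that completes the level, in percent."""
    return 100.0 * completers / cohort_size

print(retention_rate(475, 500))   # prints 95.0
print(completion_rate(410, 500))  # prints 82.0
```

However the rates are defined in practice, the M&E point stands: each indicator needs a clear numerator, denominator and reference period before it can be tracked consistently across schools.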
4.2 Effectiveness
Division effectiveness is measured in four areas. These are:
Improved school performance
Improved performance of community learning centers
Improved performance of school heads and teachers
Improved learning environment
DIVISION M&E SYSTEM
5.0 DIVISION MONITORING PROCESS
The Division Monitoring Process also covers observing, measuring and documenting events
in the external environment. It includes tracking the stakeholders' support and the factors
beyond the control of the Division and the schools which may affect the implementation
of the plans.
The Division Monitoring Process includes: (1) Monitoring the DEDP Implementation, (2) School
Performance Monitoring, and (3) Managing the ALS Programs.
The DEDP outlines the support programs and projects of the Division for the schools. It also contains
staff development programs for school heads and teachers, technical assistance support to school
heads on SBM, instructional consultancy strategies for teachers, learning materials support and other
support requirements of the schools and community learning centers.
On the other hand, the SIP contains the scope of work of the school for the next three years. The work
is detailed yearly through the Annual Implementation Plan or AIP. The SIP/AIP is used to track the
implementation efficiency of the schools.
Monitoring DEDP & SIP Implementation is a management mechanism which will allow the
Division to manage its monthly operations more efficiently. It focuses on the deliverables
and sees to it that these are accomplished and delivered. Tracking the DEDP and SIP
implementation will also facilitate the systematic handling of concerns on the quality of
technical assistance delivered to schools and community learning centers.
Specifically, the mechanism will allow the Division to manage the following:
quality and status of Division programs and projects.
Division's Physical Accomplishment (S-Curve). Involves comparing the number of
[Figure: reporting timeline showing a Monthly Report submitted for each of the twelve months]
accomplishment of the schools as per SIP/ AIP, facilitating factors and issues and
concerns affecting school performance.
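The S-Curve comparison mentioned above amounts to comparing cumulative planned accomplishment against cumulative actual accomplishment, period by period. The sketch below illustrates that; the monthly percentages are invented figures, not from the Manual.

```python
# Illustrative S-Curve check: compare cumulative planned versus actual
# physical accomplishment per month. All figures are hypothetical.

from itertools import accumulate

def slippage(planned_monthly, actual_monthly):
    """Per-month slippage: cumulative planned minus cumulative actual."""
    cum_planned = list(accumulate(planned_monthly))
    cum_actual = list(accumulate(actual_monthly))
    return [p - a for p, a in zip(cum_planned, cum_actual)]

planned = [5, 10, 15, 20]   # percent of work planned per month
actual  = [5,  8, 12, 20]   # percent of work actually accomplished
print(slippage(planned, actual))  # prints [0, 2, 5, 5]
```

A growing gap between the two curves signals that implementation is falling behind schedule and that adjustments in strategies or resources are needed.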
DIVISION M&E SYSTEM
6.0 DIVISION QUALITY CONTROL AND ADJUSTMENT POINTS
decision making. It highlights the importance of any major evaluation activities for
making decisions.
Quality Control & Adjustment Points are established in order to ensure relevant, up-to-date
and timely technical assistance of the Division and districts to schools and community
learning centers. Control points are strategically placed at every major milestone in the
DEDP implementation life cycle. The control points are mechanisms to steer and manage
technical assistance to schools and learning centers.
The 5 Division Quality Control and Adjustment Points are:
SIP Appraisal (SA). A quality control mechanism designed to make sure that SIPs
are able to meet the criteria of a good plan: relevance, responsiveness and
feasibility. This is also the review point where the SIP is assessed in terms of
completeness of information and in terms of its fit for use as a reference for
monitoring and evaluation.
Start Up Review (SUR). Ensures the readiness of schools to implement the 3-year SIP.
This quality control point evaluates the compliance of the school in setting up critical
management mechanisms before fully implementing the SIP. An example is the setting
up of the M&E system.
Annual Implementation Review (AIR). A major review of the Division's and the schools'
implementation of their programs and projects. Assessments are made in terms of
achievements and accomplishments based on the objectives and targets in the
DEDP and SIP. The AIR is used as an adjustment point for the next implementation year.
Mid-Term Review (MTR). A review undertaken after the first 3 years of the DEDP (at
the end of the SIP cycle). The Division evaluates its impact on the learners and its
effectiveness based on the schools' achievement of their outcomes. The results of
the MTR will serve as a major input to adjusting the next 3 years of the DEDP.
Outcome Evaluation (OE). A post-implementation review conducted at the end of
the DEDP implementation. The main objective is to determine whether the
outcome-level objectives and goals in the DEDP are achieved. The OE investigates
factors that contributed to success and/or hindered the achievement of targets. The
results of the OE will be used as input to the preparation of the next-cycle DEDP.
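The five control points form an ordered sequence of gates over the DEDP life cycle. The sketch below simply encodes that ordering; the timing labels paraphrase the text and the helper function is a hypothetical illustration.

```python
# The five Quality Control & Adjustment Points as an ordered sequence of gates.
# The timing labels paraphrase the Manual's text; the helper is illustrative.

CONTROL_POINTS = [
    ("SA",  "SIP Appraisal",               "before implementation"),
    ("SUR", "Start Up Review",             "at the start of SIP implementation"),
    ("AIR", "Annual Implementation Review", "every implementation year"),
    ("MTR", "Mid-Term Review",             "after the first 3 years of the DEDP"),
    ("OE",  "Outcome Evaluation",          "at the end of DEDP implementation"),
]

def next_control_point(completed):
    """Return the next gate (code, name, timing) given completed codes, or None."""
    for code, name, timing in CONTROL_POINTS:
        if code not in completed:
            return (code, name, timing)
    return None

print(next_control_point(["SA"])[0])  # prints "SUR"
```

Modeling the control points as an ordered list makes the steering idea explicit: each milestone must be passed, in sequence, before implementation moves on.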
6.3 Monitoring Process and the Quality Control & Adjustment Points
The Division M&E System is composed of two major systems: the Monitoring Process
(discussed in Part 5) and the Quality Control & Adjustment Points. These two systems gather
different but related information.
The Monitoring Process represents the daily, weekly, monthly and quarterly efforts to track
and improve the delivery of services to schools. The data, information and insights collected
in this process are immediately used for adjustments, to solve issues and problems and to
ensure the implementation progress is on track within scope, time and cost.
On the other hand, the Quality Control and Adjustment Points are major evaluation points
set up to measure the achievement of outcomes, initial gains and major milestones (such as
appraisal and start up). They use the data and information from the Monitoring Process to
provide the background story about what happened and the factors that influenced the
achievement of the major milestones.
Figure 6-2 illustrates the interaction between the two systems.
Figure 6-6 Importance of Planning
6.4.2 Objectives
The SIP Appraisal Process is established to assist schools in the preparation of SIP. It is a
quality control mechanism of the Division that will assure relevance, feasibility and
sustainability of education programs and projects of the schools.
The main objective of this control point is to ensure the schools prepare a relevant and
implementable plan. Specifically, the Division conducts an appraisal of SIP to warrant the
following:
the statement of the problems and objectives is clear. The baseline situation and
the desired situation are clearly explained and logically linked.
SIP objectives and targets are specific, measurable and reasonable.
strategies and proposed programs and projects in the SIP are relevant. This means
that there is a logical link between the baseline situation and the proposed
strategies and programs to bring about changes or improvements in the situation.
Relevance means the plan will be able to solve the problems of the school and/or
[Figure: SIP Appraisal Process flow — 2. Assess relevance; 3. Assess technical correctness of proposed programs and projects; 4. Feedback and revision; 5. SIP acceptance; Output: Accepted SIP, leading to the next control point]
opportunities, strengths and weaknesses) and the desired future situation. There has
to be an agreement between the school and the Division about the baseline
situation and the desired future situation (includes targets). When this requirement is
satisfied, proceed to the next step. If relevance is not satisfied, do not proceed to
the next step. Return the SIP for revision.
Refer to SIP Appraisal Checklist Item #2 Relevance of the Plan.
(3) Assess correctness of strategies. The appraisal, at this point, focuses on the
feasibility of the strategies as outlined in the detailed implementation plan. This activity
will include a review of the following:
Individual programs and projects proposed in the SIP. Examine the
technical correctness of these programs and projects. The assessment includes
identifying other alternatives that may produce better results in achieving the
Outcomes.
Link between the desired future situation and the proposed programs and
projects. Assess whether the proposed Outputs/Contributory Objectives are
complete and necessary.
Refer to SIP Appraisal Checklist Item #3, Necessity and Adequacy.
(4) Assess the feasibility of the plan. The appraisal shall not be limited to the review of
the document but shall also include assessment of the school's capacity to
implement and sustain the plan.
Assess capacity of school to implement the SIP including the programs and
projects.
Assess capacity of stakeholders to support the school in implementing the
SIP strategies
Review the costings and estimates. The Division QMT also reviews the
assumptions and cost estimates presented in the SIP.
Refer to SIP Appraisal Checklist Items # 4,5 and 6.
(5) After appraising the relevance of the SIP and the technical correctness of the
proposed programs and projects, the last item to review and enhance is the
completeness of the Implementation Plan. This means checking the following
must-have items:
targets and milestones are clearly specified
activities are broken down to the desired level
the relationships of the activities (network) are logically sequenced
activities are assigned resources (human, material, equipment, etc.)
activities are specified on a monthly (not quarterly) basis
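The appraisal steps above behave like a sequence of gates, any of which can send the SIP back for revision. The sketch below illustrates that gate logic; the boolean checks stand in for the Division Quality Management Team's judgment and are not an automated substitute for it.

```python
# Illustrative gate logic for the SIP appraisal steps described above.
# Each boolean stands in for the Division QMT's judgment on that step.

APPRAISAL_STEPS = ["completeness", "relevance", "correctness of strategies",
                   "feasibility", "implementation plan"]

def appraise(sip: dict):
    """Return ('accepted', None) or ('returned for revision', failed_step)."""
    for step in APPRAISAL_STEPS:
        if not sip.get(step, False):
            return ("returned for revision", step)
    return ("accepted", None)

draft = {"completeness": True, "relevance": False}
print(appraise(draft))  # prints ('returned for revision', 'relevance')
```

The ordering matters: there is no point assessing the correctness of strategies in a SIP that is incomplete or irrelevant, which is exactly the sequencing the appraisal process prescribes.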
1. Completeness of Document
Objective: To ensure the SIP is complete in terms of data, information and supporting documents.
Guide Questions: Is the SIP following or complying with the prescribed format? Are the data, information and assumptions used correct and valid? Are there supporting documents? Were the stakeholders involved in or did they participate in the preparation of the plan?
Decision: Return the SIP when it does not comply with the requirements; proceed to the assessment of relevance when the SIP is deemed complete.
4. Capacity of School
Objective: To determine the capacity of the school to implement the proposed programs and projects in the SIP.
Guide Questions: Can the school head implement and manage the programs and projects in the SIP? Can the teachers deliver the programs and projects efficiently and effectively? What are the capability-building requirements (needed to implement the SIP) of the school head and teachers?
Decision: If yes, proceed to the next appraisal area. If no, consider the school's requirements in the DEDP and make sure technical assistance support to schools is incorporated in the DEDP.
5. Stakeholders Support
Objective: To determine the level of support the stakeholders can provide to schools.
Guide Questions: Are the stakeholders ready and willing to participate in and support the implementation of the plan? Are they capable?
Decision: If yes, proceed to the next appraisal area. If no, consider the school's requirements in the DEDP and make sure technical assistance support to schools is incorporated in the DEDP.
6. Resource Generation
Objective: To determine the feasibility of implementing the plan considering the cost requirements.
Guide Questions: Are the cost requirements reasonable? Are there other fund sources?
Decision: If yes, proceed to checking the implementation plan. If no, can the Division assist in looking for fund sources? If not, downsize the plan.
Area / Remarks
1. Completeness of Document
Vision Statement: Does the vision statement paint a picture of the future situation of the school? (Yes / No / More Info Needed)
Situational Analysis: The problems, issues, needs and opportunities described in the SIP are real and based on sound analysis. (Yes / No / More Info Needed)
Target Groups: The needs of the different target groups are clearly identified. (Yes / No / More Info Needed)
SIP Objectives and Targets: The objectives (in the goal chart) of the SIP are logically linked to the problems, issues, needs and opportunities described in the plan. (Yes / No / More Info Needed)
1 Checklist to be used by the Division Quality Management Team to appraise the SIPs. Additional items may be added depending on the requirements and/or intent of the Division.
Area / Remarks
4. Capacity of School
5. Stakeholders Support
6. Resource Generation
Budget: Is the total estimated cost required to implement the school programs and projects reasonable? (Yes / No / More Info Needed)
Fund Sources: Are there fund sources available? (Yes / No / More Info Needed)
Resource Mobilization: Is the school head capable of generating
Area / Remarks
Activities: Are the activities listed in the WFP directly linked to the outputs/deliverables listed in the goal chart? (Yes / No / More Info Needed)
Work and Financial Plan: Is there a WFP? Is it presented on a monthly basis? (Yes / No / More Info Needed)
Targets and Schedules: Are targets plotted monthly? (Yes / No / More Info Needed)
Cash Flow: Are cash flow requirements plotted monthly? (Yes / No / More Info Needed)
Persons Responsible: Is there an assigned individual per activity? (Yes / No / More Info Needed)
Monitoring and Evaluation: Are M&E activities reflected in the WFP? Are there assigned resources for M&E? (Yes / No / More Info Needed)
Advocacy and resource mobilization. One of the important start-up activities is advocacy work, especially generating support and/or resources from stakeholders.
Prepare status report. The report to be prepared at this stage will serve as the
inception report.
The Start Up stage is also a sustainability mechanism. It involves setting up critical systems and rallying and mobilizing support for the plans. Among the mechanisms that must be set up at the start of implementation are the following:
(1) Participation of stakeholders. This refers to the stakeholders' understanding of the
plan, especially the target benefits and improvements.
(2) Communication system. This includes setting up the mechanism for sharing and
disseminating data and information throughout the organization. This will enable the
Division, district and schools to:
coordinate efforts more efficiently, thus avoiding duplication
gain up-to-date information about the status of implementation including
issues and problems, and make timely corrective actions
know about policies and directions of the organization in order to
synchronize decisions and actions at their level
(3) Monitoring, Evaluation and Adjustment system. Plans are best estimates of the
future. However, even a well-written plan will never be able to predict in exact detail
the future situation. At the start of the implementation, therefore, the mechanism for
tracking, analyzing and adjusting the implementation plan should be already in
place.
The inability to set up the critical mechanisms during start up and the failure to implement the mobilization activities often lead to implementation difficulties and inefficiency. Based on the experience of many, misunderstandings on the scope of the plan and on the roles and responsibilities of individuals could have been avoided had honest-to-goodness start-up activities been undertaken. Recurring problems manifested through
delays, cost overruns, poor quality of services and non-achievement of targets and outcomes are often traced to gaps in scope and role clarification and in the setting up of systems that facilitate information sharing and decision making.
In order to minimize, if not eradicate, implementation problems, the Start Up Review Process is installed as one of the quality control mechanisms in the Division M&E System.
6.5.2 Objectives
The main objective of the Start Up Review process is to ensure the readiness of the schools to implement the SIP. Readiness is determined when the school is able to implement the required mobilization activities and has established the critical management systems that will sustain the implementation of the school's SIP.
Specifically, the Start Up Review will allow the Division and District to:
Pinpoint schools that are ready to implement the SIP and schools needing
assistance in jump starting their plans. This will allow the Division and District to focus
assistance on schools having difficulty launching their SIPs.
Synchronize the Division M&E system with the school M&E system. At this stage, the
Division is also initiating its DEDP implementation.
In the case of the Division's alternative learning programs, start up activities are
undertaken to ensure readiness of the accredited service providers to implement
the Basic Literacy Program and the A&E Program.
Start Up Review Process Flow (Implementation Stage)
6.6.2 Objectives
The annual review is undertaken to assess the initial gains generated after 1 year of
implementation. It is a mechanism to track the achievement of outcomes (Division and
school level) on a year to year basis. It is also a mechanism for assessing the efficiency of
Division units, districts and schools in delivering the target outputs in the DEDP and SIP.
The Annual Implementation Review is designed to generate information and insights that
will be useful for continuous improvement and in solving recurrent problems. The Review will
also serve as a major adjustment point for plans and programs of the Division and schools.
Specifically, the annual review will provide the Division with the following information:
Programs and projects that produced positive and/or encouraging (initial) results. The review will enable the Division to reinforce programs and projects that work and to improve the design of programs that produced negative results.
Technical assistance processes or practices that need further enhancements.
Accomplishment to date. An annual review provides an overall status of
accomplishment since the implementation started.
Factors that facilitated implementation as well as factors that adversely affected
delivery of services and assistance.
Annual Implementation Review Process Flow (Implementation Stage)
1. Consolidate annual reports
2. Analyze achievements and accomplishments
3. Assess implementation (what went right & what went wrong)
4. Prepare next year plan
5. Submit and document the next year plan (leads to the next control point)
important areas to analyze (in an annual review) and provides process questions
that will help the QMT to analyze and formulate recommendations for the next
implementation year.
(3) Assess implementation. Using the outputs of activities 1 and 2, the Division QMT will conduct a one- to two-day assessment workshop attended by school heads, district staff and Division staff. The objective of the workshop is to assess and identify the factors that facilitated the achievements and accomplishments and to collectively identify issues and external factors that contributed to difficulties in implementation.
Depending on requirements, size and other factors, the assessment may be undertaken under one of several options:
Division-wide assessment and planning workshop. The Division conducts only one workshop, attended by all schools, districts and Division units.
Division-wide assessment and planning workshop, by level. Similar to the option above, but divided into separate workshops for elementary and secondary schools.
Assessment per district or cluster. Simultaneous conduct of assessment and planning workshops, facilitated by Division QMTs.
The assessment workshop will focus on the following areas:
Year-end accomplishment as per AIP and DAP. There is a need to identify the factors that contributed to, and the factors that hindered, the efficient implementation of the plans.
Initial gains per school outcomes. The assessment will include discussion
and analysis of the school performance indicators (enrollment, retention,
completion and achievement). During the workshop, discussion will focus on
programs and projects to continue, documentation of lessons learned and
propagation of effective practices.
Performance of teachers and school heads.
Accomplishment in ALS programs. The assessment includes the performance of community learning centers, service providers, facilitators and instructional managers.
Division operations. Assessment of Division's application of processes
(standards) and practices.
The results of the assessment will be used as input to the finalization of the Division
Annual Report and input to the preparation of the next Division Annual Plan.
(4) Prepare next year implementation plan. Using the results of the assessment, the
Division and the schools will revisit the DEDP and SIP to assess whether these need
any adjustment.
(5) Document next year implementation plans. The DAP and AIPs will be documented and used as the basis for monitoring implementation progress in the next year.
(1) Line of Balance or S-curve. This tool provides an overall status of accomplishment. It shows (in a diagram) the actual accomplishments of the Division and schools versus the planned outputs.
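The idea behind the S-curve can be illustrated with a short sketch: accumulate planned and actual outputs per period and compare the running totals. All figures and names below are hypothetical sample data, not Division records.

```python
# Sketch: cumulative planned vs actual outputs for an S-curve comparison.
# The monthly figures are illustrative assumptions only.
from itertools import accumulate

planned_per_month = [2, 3, 5, 8, 8, 5, 3, 2]   # outputs targeted each month
actual_per_month = [1, 3, 4, 6, 7, 6, 4, 3]    # outputs actually delivered

planned_cum = list(accumulate(planned_per_month))
actual_cum = list(accumulate(actual_per_month))

for month, (p, a) in enumerate(zip(planned_cum, actual_cum), start=1):
    slippage = p - a  # positive means implementation is behind plan
    print(f"Month {month}: planned {p}, actual {a}, slippage {slippage}")
```

Plotting the two cumulative series against time produces the characteristic S-shaped curves that let the QMT see, at a glance, whether delivery is ahead of or behind plan.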
(2) Segmentation Techniques. This is a technique used to understand and gain insights from target groups. Segmentation is a process of identifying and grouping schools based on school characteristics and accomplishments. The main objective of segmentation is to get to know the schools better in order to customize or fit the Division's technical assistance to the requirements of the school.
Specifically, the segmentation technique will allow the Division to compare similar
schools (the same characteristics) and schools from different groups (different
characteristics). This approach will facilitate the monitoring of schools and allow the
Division to determine the unique needs, problems and requirements of schools
belonging to the same segment.
The following groupings will be used:
(a) school characteristics (sample only)
  type: science, vocational, national high school
  location: upland, urban, rural
  facilities: high classroom need, medium classroom need, low classroom need
  leadership: schools headed by Principal 2, Principal 1, TIC
  teacher-to-learner ratio: high, medium, low
(b) school performance (sample only)
  enrollment: decreasing, increasing, stable
  retention: high, medium, low
  completion: high, medium, low
  achievement: 75 and above MPS, 50-74 MPS, 50 and below
  SBM practice: beginner, mature
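The grouping logic above could be sketched as follows. The sample schools, field names and MPS cut-offs are illustrative assumptions (the text's achievement bands overlap at 50; the sketch treats below 50 as the lowest band), not Division data.

```python
# Sketch: group schools into segments by shared characteristics so that
# technical assistance can be tailored per segment. Records are hypothetical.
from collections import defaultdict

schools = [
    {"name": "School A", "type": "national high school", "location": "rural", "mps": 78},
    {"name": "School B", "type": "science", "location": "urban", "mps": 61},
    {"name": "School C", "type": "national high school", "location": "upland", "mps": 47},
]

def achievement_band(mps):
    # Bands follow the sample cut-offs in the text: 75+, 50-74, below 50 MPS.
    if mps >= 75:
        return "75 and above MPS"
    if mps >= 50:
        return "50-74 MPS"
    return "below 50 MPS"

segments = defaultdict(list)
for school in schools:
    key = (school["type"], achievement_band(school["mps"]))
    segments[key].append(school["name"])

for key, members in segments.items():
    print(key, "->", members)
```

Each segment key combines a characteristic and a performance band, so schools in the same segment can be compared with one another and given the same assistance package.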
(3) AIR Implementation Guide. A guide for QMT members on how to go about the
process of implementing the AIR.
A. Status of Implementation

Implementation of AIP
  Indicator: Actual targets accomplished versus plan
  Process questions: What school practices facilitated the SIP/AIP implementation? What factors hindered the achievement of targets?
  Use of results: Identify practices to reinforce; address factors hindering implementation

Implementation of DAP
  Indicator: Actual targets accomplished versus plan
  Process questions: What Division practices facilitated the DEDP/DAP implementation? What factors hindered the accomplishment of targets?
  Use of results: Continue practices contributing to efficient operations; document and address factors hindering efficiency

Performance of service providers
  Indicator: Learners under BLP achieved 100% of the core competencies in reading, writing and numeracy; 50% mastery of the core
  Process questions: What are the practices of the service providers that contributed to / hindered the achievement of targets? What assistance or Division programs contributed to the achievement of
  Use of results: Contracts to extend; lessons learned and effective practices to continue
Figure 6-12 Mid-Term Implementation
6.7.2 Objectives
The objectives of the Mid-Term Evaluation Review are to:
Evaluate how closely the achievements and accomplishments are to the planned
objectives and targets
Assess the first 3 years of DEDP implementation to determine which programs and
projects should be continued or stopped
Document the effective practices and processes that contributed to attainment of
initial gains and make recommendations to continue applying them in the next 3
years of implementation
Analyze the causes of problems and difficulties encountered and document these
as part of the lessons learned
Identify factors that may help sustain the initial gains.
Mainly, the results of the evaluation will be used as input to enhancing the implementation
strategies and technical assistance to schools for the next three years.
5. Adjust DEDP. Output: adjustment of the DEDP for the next 3 years (leads to the next control point).
information from the participants. There are no right and wrong answers but
the facilitator must see to it that the discussion is focused and will generate
the desired information from the participants.
Inspection. This activity validates the claims of individuals about a practice or way of doing things.
Actual observation. In order to document the actual practice or behavior, actual observation is undertaken. This method will help validate the claims made by key informants and participants in the FGD.
Questionnaire. Predetermined questions are jotted down and used to guide the interviews.
(2) Segmentation Techniques. This is a technique used to understand and gain insights about target groups. Segmentation is a process of identifying and grouping schools based on school characteristics and accomplishments. The main objective of segmentation is to get to know the schools better in order to customize or fit the Division's technical assistance to the requirements of the school.
Specifically, the segmentation technique will allow the Division to compare similar
schools (the same characteristics) and compare schools from different groups
(different characteristics). This approach will facilitate the monitoring of schools and
allow the Division to determine the unique needs, problems and requirements of
schools belonging to the same segment.
The following groupings will be used:
(a) school characteristics (sample only, to be developed further)
  type: science, vocational, national high school
  location: upland, urban, rural
  facilities: high classroom needs, medium classroom needs, low classroom needs
  leadership: schools headed by Principal 2, Principal 1, TIC
  teacher-to-learner ratio: high, medium, low
(b) school performance (sample only, to be developed further)
  enrollment: decreasing, increasing, stable
  retention: high, medium, low
  completion: high, medium, low
  achievement: 75 and above MPS, 50-74 MPS, 50 and below
  SBM practice: beginner, mature
(3) SBM Assessment. The Division is going to assess the SBM practices of the schools
using the same SBM assessment tool the schools are using to do self-assessment.
A team of assessors from the Division and District will conduct the SBM assessment.
In order to maintain uniform application of criteria and an unbiased assessment of school practice, the tool is reinforced with a consensus technique. Assessors will not immediately render judgment about the school practice but instead jot down notes and document the school practices as observed. This documentation is discussed by the team of assessors, and a consensus is reached as to whether the school satisfies the level of practice.
(4) Mid Term Implementation Review Checklist. The Checklist is for use by the Division QMT/evaluation team as a guide in the preparation, implementation and completion of the Mid-Term Review.
The Checklist will not be used, in any way, to score or grade the performance of the QMT but will serve as a guide in the implementation of evaluation activities. Its main objective is to ensure a smooth and efficient conduct of the Mid-Term Review.
The Checklist provides a listing of activities, resources, and reference documents necessary for an efficient implementation of the Mid-Term Review. The QMT will check: Yes if the condition/question is complied with; No if the condition/question posed is not met; and More Information Needed when an objective Yes or No response cannot be given due to insufficient information.
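The three-option scoring rule could be captured in a small helper like the one below. The item texts and the `summarize` function are hypothetical illustrations, not part of the manual's official procedure.

```python
# Sketch: tally checklist responses for a review readiness check.
# Valid responses mirror the manual: "Yes", "No", "More Info Needed".
from collections import Counter

VALID = {"Yes", "No", "More Info Needed"}

def summarize(responses):
    """Count responses per category, rejecting anything outside the three options."""
    for item, answer in responses.items():
        if answer not in VALID:
            raise ValueError(f"{item}: invalid response {answer!r}")
    return Counter(responses.values())

# Hypothetical sample items, loosely based on the checklist areas.
sample = {
    "Evaluation report disseminated": "Yes",
    "Adjustments reflected in next year's plan": "More Info Needed",
    "Findings used as input to SIP appraisal": "No",
}
print(summarize(sample))
```

A tally like this makes it easy to see how many items still need follow-up (the "No" and "More Info Needed" counts) before the review can be closed.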
1. Preparatory Activities
7. Evaluation Report
Feedback: Results and recommendations are properly disseminated and communicated. [Yes / No / More Info Needed]

8. Adjustment of Plans

Recommendations: Suggested next steps find their way into the implementation plan for next year. [Yes / No / More Info Needed]

Issues and Problems: Corrections/adjustments are made in the plan in order to mitigate, if not solve, the issues and problems raised in the evaluation. [Yes / No / More Info Needed]

Input to Appraisal: Evaluation findings and recommendations are used as input to the appraisal of the SIP. [Yes / No / More Info Needed]

9. Knowledge Management

Sharing of Information: Evaluation findings are shared and discussed with the Division and District. [Yes / No / More Info Needed]

Improve Design of Programs/Projects: Evaluation findings are used to enhance the design of Division programs and projects. [Yes / No / More Info Needed]

Expectations from ES: Division staff, especially education supervisors, are knowledgeable about the results of the evaluation and the issues and problems. [Yes / No / More Info Needed]

Access: Evaluation results are made available and accessible. [Yes / No / More Info Needed]
Figure 6-14 DEDP Wrap Up
The scope of the Outcome Evaluation is detailed further in Table 6-# Division M&E Framework
Completion: To ensure learners who are in school will complete the requirements of the primary and secondary level
  Indicators: Increase in the number of learners able to complete the basic education requirements; improved graduation rate
  Data source: School Report Card

Achievement: To ensure that learners demonstrate the necessary competencies at each level
  Indicators: Improvement in the basic functional literacy skills of the learners; improvement in the academic performance of learners in all subject matter; improvement in social skills
  Data sources: Learner Report Card; Teacher Assessment; National Achievement Test (2nd Year); Regional Achievement Test (3rd Year)

1. Improved school performance
  Indicators: Reduced disparity between high performing schools and low performing schools (in NEAT and NAT) by --- percent; reduced disparity in enrollment, drop out, and completion rates between high performing schools and low performing schools; increase in the satisfaction of school stakeholders in the quality of instruction in the school; improved SBM practice of schools
  Data sources: Division Report Card; Division Education Development Plan (DEDP); Perception Survey; SBM Assessment Result
1. Improved competencies of DepED Division and District staff in providing technical and management support to schools, community learning centers, school heads, teachers and facilitators
  Indicator: Division and District staff demonstrate competencies in educational planning, curriculum management, instructional consultancy, training and development, and monitoring and evaluation
  Data sources: Division Report Card and DEDP; Results of Performance Assessment
6.8.2 Objectives
As an integral part of the process improvement mechanism, the objectives of Outcome
Evaluation are the following:
measure the improvement in the performance of schools, school heads and
teachers, instructional managers and facilitators and non-teaching staff of schools
determine whether the Division programs and projects lead to the achievement of
Outcome Evaluation Process Flow
1. Prepare evaluation design
2. Review school achievements based on SIP
3. Conduct evaluation
4. Prepare evaluation reports
(2) Review and/or document Division achievements. Using the Basic Education
Information System (BEIS) and the Report Cards, the Division will document the
achievement of the Goal and Purpose level objectives in the DEDP.
(3) Data Gathering. The task is to validate the documented achievements and to document how these achievements and accomplishments were realized. Specifically, the focus of the validation is to document the processes, practices and other factors that contributed to the realization of the DEDP objectives. This includes gathering data and information at the Division, district, school, community learning center and community level, and building consensus.
The data gathering is divided into 3 major activities:
Data Gathering to validate the achievements and accomplishments.
Includes visits to schools and community learning centers and involves the
use of rapid appraisal techniques
Perception survey to gather feedback from school stakeholders on quality
of services provided by the school
SBM Assessment Level of Practice
(4) Prepare DEDP Terminal Report. The terminal report describes the situation at the
Division level after six years of implementation. It describes the status of the schools
(using the school performance indicators) and provides a comparative assessment
of performance in terms of before and after and between and among school
groups.
Specifically, the terminal report will contain the following information:
Achievement of the DEDP Goal and Purpose level objectives
Major accomplishments, challenges encountered and how these were
solved or mitigated
Effective practices and lessons learned
Analysis of current issues, problems and opportunities
The Terminal Report is the main reference document in the preparation of the next
Division plan.
1. Participation Rate

Is there an increase in the participation rate? [Yes / No / Same]
  What programs and projects contributed to the increase in the participation rate? What external factors contributed to the increase/decrease of the participation rate? If yes, to what Division programs and projects can these be attributed?

Is the targeted participation rate achieved? [Yes / No / Same]
  If no, what hindered the improvement in the participation rate?

Is the Division participation rate better than the average within the region? [Yes / No / Same]
  If yes, what unique programs and projects contributed to this? If no, to what factors can this be attributed?

Is the Division participation rate higher or better than the national average? [Yes / No / Same]
  If yes, what unique programs and projects contributed to this? If no, to what factors can this be attributed?

3. Achievement

Is the performance of Grade 6 learners improving in the last 5 years? [Yes / No / Same]
  If yes, to what programs and projects can this be attributed? If no, to what factors, internal and external, can this be attributed?

Is the performance of 2nd year high school learners improving in the last 5 years? [Yes / No / Same]
  If yes, to what programs and projects can this be attributed? If no, to what factors, internal and external, can this be attributed?

Is the targeted learner achievement achieved by most of the schools? [Yes / No / Same]
  If yes, how was this achieved? What practices must be continued? If no, what lessons can be drawn from the effort or interventions provided that should be improved or not repeated anymore?

Is the Division performance on achievement higher than the regional average? [Yes / No / Same]
  If yes, what could be the factors that allowed the Division to have higher than
5. Stakeholders Perception

Is the perception of learners of the teaching and learning process improving? [Yes / No / Same]
  What areas of the teaching and learning process gained positive responses and what areas garnered negative responses?

Is there an improvement in the perception of stakeholders concerning the quality of education in the Division (as compared to 3 years ago)? [Yes / No / Same]
  In what areas or services has the school improved: facilities, teachers, school management, etc.?

Are the perceptions of parents concerning the quality of education improving? [Yes / No / Same]
  In what areas are the perceptions positive and in what areas are they negative? Why?

Is there an improvement in the perception of local government units and others regarding the quality of education? [Yes / No / Same]
  In what areas are the perceptions positive and in what areas are they negative? Why?

7. Learning Environment

Is the 1:1 learner-to-textbook ratio achieved? [Yes / No / Same]
  Where is the shortage of textbooks most acute?
9. Teachers Performance

Are there improvements in the competencies/performance of teachers on:

Subject mastery [Yes / No / Same]
  How many teachers have mastery of the subject they teach? What were the training programs received/attended by the teachers? How are they applying these training programs?

Teaching skills (classroom management, student assessment, modern teaching methods, care and use of learning [Yes / No / Same]
  How many teachers demonstrated the proper teaching skills? How many have been trained?
DIVISION M&E SYSTEM
7.0
MONITORING AND EVALUATION TOOLS & TECHNIQUES
Division M&E System M&E Tools and Techniques
7.0 DIVISION M&E TOOLS
This section enumerates tools and techniques for monitoring and evaluation.
The choice of M&E tools and techniques will influence the results of the evaluation or assessment undertaken by the Division. Selection of the most appropriate ones increases the likelihood of correct, precise and accurate results or findings. In this regard, it is important that the M&E team be familiar with the different tools and techniques, especially the results these will generate, as well as their context and nature.
The following is a classification of M&E tools and techniques:
(1) Tools to assess effectiveness
(2) Tools to assess Division Readiness
(3) Tools to track efficiency
(4) Tools for data gathering.
characteristics). This approach will facilitate the monitoring of schools and allow the
Division to determine the unique needs, problems and requirements of schools
belonging to the same segment.
As an evaluation tool, segmentation allows the Division to assess the performance of a school against the performance of schools with similar characteristics, or schools belonging to the same typology. The performance of the schools (in a Division) is also compared or benchmarked against the performance of schools (belonging to the same typology) in other Divisions (within the region) and against the national average.
Competencies Checklist. Includes list of competencies that must be demonstrated
by school heads, teachers, instructional managers and literacy facilitators.
Stakeholders Perception Survey. Refers to the perception of the stakeholders
(community, LGUs, learners, etc) on the quality of education and quality of services
provided by the schools.
Logical Framework Approach. This refers to situational tools and techniques used to assess and explain the phenomena behind the results or outcomes. The logical framework matrix uses the problem tree, objectives tree, stakeholders analysis and SWOT (strengths, weaknesses, opportunities and threats).
The QMIM is also a yardstick to assess the performance of the Region and Division. It will be used to examine the Region's and Division's processes and support mechanisms that allow them to efficiently and effectively deliver technical assistance packages to schools, school managers, teachers and the schools' non-teaching staff.
The Model defines four maturity levels: (1) Ad Hoc, (2) Defined, (3) Integrated, and (4) Sustained. Level 1 is the entry level. It represents a Division characterized by ad hoc processes and an informal way of doing things. As it matures, the Division Office is expected to establish its internal procedures (Level 2, Defined). The Division then progresses to a stage where it is expected to manage and combine different mechanisms into an integrated system (Level 3, Integrated). The highest level is the Sustained level. This represents a Division that adapts, maximizes and continuously improves its way of doing things.
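As a purely illustrative aid, the four-level progression could be represented as a simple lookup; the one-line summaries paraphrase the text and the `describe` helper is a hypothetical convenience, not part of the QMIM itself.

```python
# Sketch: the four QMIM maturity levels, summarized from the text.
QMIM_LEVELS = {
    1: ("Ad Hoc", "ad hoc processes and an informal way of doing things"),
    2: ("Defined", "internal procedures established"),
    3: ("Integrated", "different mechanisms managed as one integrated system"),
    4: ("Sustained", "adapts, maximizes and continuously improves its way of doing things"),
}

def describe(level):
    # Return a one-line description of the given maturity level.
    name, summary = QMIM_LEVELS[level]
    return f"Level {level} ({name}): {summary}"

for level in sorted(QMIM_LEVELS):
    print(describe(level))
```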
For a more detailed discussion, see attached document Quality Management
Inventory Model which describes the model and the process for undertaking the
inventory on quality management.
Division Readiness on Quality Management. A self-assessment tool used to evaluate the competencies of Division and District staff on the critical areas of quality management. Compare the approved or accepted targets in the AIP/SIP versus the actual number of targets completed. A sample checklist on readiness assessment appears below.
3.0 Analysis
3.1 Tabulate Data: Sept 20, 09 to Sept 25, 09
1. Preparatory Activities

School Head: The school head attended the orientation on SBM and is aware of the intent of the SBM assessment. [Yes / No / More Info Needed]
  Remarks: Interview the school head. Ask him/her some important information about the SBM assessment.

School Head: The school head understands all the terms used in the instrument and knows the process of administering, scoring and reporting. [Yes / No / More Info Needed]
  Remarks: Interview the school head. Ask him/her some important information about the SBM assessment.

Teachers and Non-Teaching Staff: The school head oriented the teaching and non-teaching staff on the concepts of SBM, and they are aware of the purpose and process involved in the assessment. [Yes / No / More Info Needed]
  Remarks: This is to triangulate the information provided by the school head. Ask the teachers and non-teaching staff about the purpose of the SBM assessment.

External Stakeholders: The school head oriented the school stakeholders on the concepts of SBM, and they are aware of the purpose and process involved in the assessment. [Yes / No / More Info Needed]
  Remarks: This is to triangulate the information provided by the school head. Ask the teachers and non-teaching staff about the purpose of the SBM assessment.

External Stakeholders: The majority of the invited stakeholders attended and participated in the assessment; each dimension was represented by stakeholders. [Yes / No / More Info Needed]
  Remarks: Check the attendance sheet.

2. Data Gathering

Copy of Assessment Tool: Everyone has a copy of the assessment tool. [Yes / No / More Info Needed]

Evidence: The documents provided are the most authoritative, signed and most up-to-date documents. [Yes / No / More Info Needed]
  Remarks: Validate the evidence presented. Check the dates, signatories and content of the document if sufficient.
Evidence: The school head allowed access to all school documents. [Yes / No / More Info Needed]
  Remarks: Ask the stakeholders.

Evidence: Enough time was given to review and assess the content of the documents. [Yes / No / More Info Needed]
  Remarks: Ask the stakeholders.

Evidence and Interview: There are agreements and consistency between the content of the documents and the responses of the interviewees. [Yes / No / More Info Needed]
  Remarks: Look for documentation of the interviews.
and preparation. Identify the indicators that you want to verify and the data that will support the indicators, then determine the appropriate tool.
Triangulate. Use more than one technique in gathering data to minimize the error inherent in data gathering tools and techniques.
Just in time, not just in case. Collect data that you need to make decisions; do not collect data in case you need them in the future. This will help you avoid data overload.
Cost efficient. Consider the costs involved in using a tool to gather data. As a general rule, always go for the less expensive technique that offers the same quality of data as the more expensive one.
the interviews.
Table 7-1 Rapid Appraisal Tools
Observation (direct observation)
Description: This method focuses on actual performance, actual utilization, and on-going activities and events. Observers record what they see and hear. This method is appropriate when the objective is to document the demonstration of skills in an actual setting.
Notes: The key item to remember in making an observation is for the observer to avoid the urge to document and analyze at the same time. The observer must jot down notes as objectively as he/she can, noting down exactly what is being observed.

Interview (key informant interviews, informal interviews, transect walk)
Description: The interview is one of the most commonly used data gathering methods. It gathers qualitative data and is a good source of perspectives which help explain the phenomenon being validated. The interviewer uses guides (lists of topics or open-ended questions) and probes the interviewee to elicit opinions, experiences and practices.
Notes: Using a guide or questionnaire would often distract the interviewee. This may also cause the interviewee to be cautious about the information he/she is sharing. The key in using the interview as a method is the INTERVIEWER. The interviewer should evolve as the instrument. He/she must be quick to adapt and adjust to the demeanor of the interviewees.

Focus Group Discussion
Description: This method involves around 8 to 12 individuals discussing a certain subject matter. The group is assisted by a facilitator and a documenter. The facilitator asks process questions to start the discussion. The facilitator does not discriminate among the answers provided by the participants but probes and asks for clarification on the responses given. A documenter records all the responses.
Notes: An FGD should be focused. If the group discusses too many topics and goes out of focus, the FGD fails. And this happens a lot of the time. The main role of the facilitator is to keep the discussion of the group FOCUSED.

Inspection (artifacts review)
Description: This data gathering method focuses on the existence of artifacts. These are outputs (in document form) developed or prepared by the target group. The existence of an output is a demonstration of skills. Inspection is also a quality control activity. It involves seeing and touching the materials and equipment purchased, and assessing the quality of facilities constructed.
Notes: Inspection is used to validate claims made by interviewees during interviews and FGDs and in the reports submitted. The existence of standards will facilitate the inspection process.

Questionnaire
Description: This is a structured way of gathering data. Questions about the data and/or information you want to know are put into the questionnaire. The questionnaire ensures that your concerns are covered. It also allows uniform presentation of questions to respondents, reducing the bias of the researchers.
Notes: In using the questionnaire, limit the questions to what you really need to know. Ensure that the questions are linked to the program design. Long questionnaires have a dismal response rate. Use simple terms and instructions. When able, provide incentives.

Perception Survey
Description: Among the tools enumerated, the perception survey is the only tool used to gather quantitative data. It is used to gather information about what people think about a performance, service or product. The data generated by the perception survey are usually considered in the preparation of a program design. It involves the use of a structured questionnaire and a selected number of respondents (based on the sampling method used). It can either be self-administered or done by professional researchers.
Notes: There is a need to ensure the reliability of the questionnaires to be used, and a need to standardize the questionnaire and its administration.
DIVISION M&E SYSTEM
8.0
DOCUMENTS AND REPORTS
Division M&E System Documents and Reports
8.1 Description
Management Reports are data organized in an easy-to-understand format. The reports provide the stakeholders with a holistic perspective on the accomplishments and events of the period that has elapsed. It is essential that the report provides complete information on the events in order to be a useful input to decision making. Furthermore, in order to be useful and effective, reports should contain information about three essential areas:
Operational information. This describes the progress or status of implementation
happening within the school and classroom. At the school level, it includes school
programs and projects implemented, quality of outputs delivered, resources
generated and the expenditures of the school. At the classroom level, this may
include competencies gained by students, lessons covered, and attendance.
Internal and external information. Internal information relates to all activities within
the school or classroom and the stories behind the activities. This includes reporting
of the major events and activities that took place inside the school and factors that
facilitated or hindered the activities. On the other hand, external information
pertains to factors outside the school that may have influenced or affected school
performance. Information from outside provides good comparative information for assessing the school's own performance.
Leading and lagging information. Leading information or leading indicators provide insight or early warning into a future event. Some examples of leading indicators are teachers' performance (predicting student learning), frequent absenteeism (leading to dropping out), and good teaching and school-based management (influencing enrollment).
On the other hand, lagging information or historical information provides useful insights into current accomplishments. Reports provide a comparison of past accomplishments to accomplishments to date. For example, in reporting the dropout rate for this year, the dropout rates of previous years are also reflected in the report to provide a historical trend.
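The lagging-information idea can be sketched in code. The school years and rates below are purely illustrative assumptions, not data from this Manual; the point is only that a report should place the current figure beside previous years so the reader sees the trend:

```python
# Hypothetical dropout rates per school year; figures are illustrative
# assumptions only.
history = {"2006-2007": 4.2, "2007-2008": 3.8, "2008-2009": 3.1}

def trend_report(rates):
    """Render year-on-year rates so the current figure is read in context."""
    lines = [f"  {year}: {rate:.1f}%" for year, rate in sorted(rates.items())]
    return "Dropout rate, historical trend:\n" + "\n".join(lines)

print(trend_report(history))
```

Sorting by school year keeps the most recent figure last, where the reader naturally compares it against the preceding years.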
Use graphs and tables that will provide immediate information about performance
and accomplishments;
Follow the format of your plan. When using coding (e.g., C.1.1), follow the one used in the approved plan.
Provide stories behind the numbers, but keep it simple and direct.
Avoid ambiguous words and limit jargon the reader may not understand.
Attach documents that will directly support or explain further the information in the
main report. And use the most authoritative source of data.
contain problems and issues encountered by the school that need to be addressed
by the Division
Monthly Report/Annual Report on ALS Programs. Contains the status of the Basic
Literacy Program and A&E Program of the Division which are implemented through
the community learning centers operated by service providers or run by the District.
Division Monthly Report. Contains the physical accomplishments of the Division
versus the plan (DAP), short description of programs and projects implemented and
documentation of problems, issues and opportunities encountered.
Division Annual Accomplishment Report. An end-of-year report containing the
programs, projects and other services delivered by the Division. The report also
provides a comparative report on the performance of the schools using selected
performance indicators and the schools' level of practice on SBM.
8.3.4 Accomplishment Reports
Accomplishment reports are documents prepared and submitted at the end of every program or project, or at the end of a major undertaking. Accomplishment reports provide the following:
DEDP Terminal Report. This report contains the accomplishments of the Division after
six years of implementing the DEDP. Specifically, it contains information on the
schools' performance (comparative), competency profile of school heads, teachers
and Division staff and an inventory of the programs and projects implemented for
the schools. The completion report also includes the Division Report Card which
provides a holistic picture of the Division after six years of DEDP implementation.
Division Mid-Term Implementation Report. This report is prepared at the end of the
1st cycle of SIP implementation (or phase 1 of the six year DEDP implementation). The
Mid-Term Report contains the information on the achievements and
accomplishments of the schools and the Division after three years. The report
provides information and insights that may drastically alter or affect the next three
years of the DEDP implementation.
Division Best Practices. Documentation of programs and projects implemented that
netted positive results.
(row continued from the previous page)
Document/Report: … Card (Type: Document)
Content: indicators
Purpose: …schools to Division to determine the effectiveness of Division interventions as well as to determine the impact on schools' performance
Timing: …of each year
As Input to: Annual report and SIP completion report

Document/Report: Division Report Card (Type: Baseline Document)
Content: Schools' performance (comparative); competency profile of school heads, teachers and Division staff; QM assessment
Purpose: To provide baseline information on the Division. This will be used to assess the year-to-year performance of the Division; also to be used for outcome evaluation by the Region
Timing: Preparation year of DEDP; annual
As Input to: Region's input to the REDP

Document/Report: School Monthly Report (Type: Status Report)
Purpose: To report on the progress of implementation; documentation of accomplishments per month
Timing: End of the month
As Input to: Adjustment of next month's activities; performance assessment of teachers

Document/Report: School Quarterly Report (Type: Status Report)
Purpose: To show the status of AIP implementation after every 3 months
Timing: End of the quarter
As Input to: Adjustment of next quarter's activities; performance assessment of teachers

Document/Report: School Annual Report (Type: Status Report)
Purpose: To present the accomplishment report of the school after 1 year
Timing: March of each year, except the last year of SIP implementation
As Input to: Basis to adjust/enhance the next year's AIP; basis for measuring the efficiency of the SH

Document/Report: Learner Report Card (Type: Status Report)
Purpose: To provide information about the learners' performance
Timing: Quarterly and EO SY (end of school year)
As Input to: Learning Management Plans

Document/Report: SIP Completion Report (Type: Accomplishment Report)
Purpose: To provide documentation of the 3-year implementation, including lessons learned and key practices of the school
Timing: January to February of the 3rd year of SIP
As Input to: Next cycle SIP

Document/Report: School Program/Project Report (Type: Accomplishment Report)
Purpose: To provide documentation of the program/project accomplished
Timing: End of the program/project
As Input to: Best practices
DIVISION M&E SYSTEM
9.0
M&E TERMS OF REFERENCE
Division M&E System M&E Terms of Reference
Scope creep is an activity, event or output undertaken (which may be necessary) but which is not part of the approved or agreed plan.
facilitate decision making. These reports will be the basis for future actions and
future designs of programs and projects.
The M&E System is often equated to filling up forms, tables and matrices. The
important activity of validating the data and information is often neglected, thus,
the practice of filling up forms, tables and matrices leads to erroneous data and
information.
M&E is about field visits and data gathering.
Data collected must always go up before it is disseminated down the line. As a
result, needed data and information arrive late or never at all.
Another wrong notion about M&E is that it exists to meet the information requirements of the external or higher management-level unit. In fact, the M&E system is set up and put into operation in order to meet the information requirements of the internal units, especially the individuals who are responsible for the delivery of outputs and the achievement of outcomes.
These misconceptions often lead to the false notion that M&E people are just spectators watching and observing (spying), waiting for people to make mistakes and then reporting those mistakes. The misconceptions above are the usual reasons why people shun M&E.
of their units. They provide technical assistance to education supervisors and district
supervisors on how to efficiently and effectively deliver the Division programs and projects.
The ASDS shall interact closely with the following stakeholders:
Schools Division Superintendent
Division Monitoring and Evaluation Coordinator
School Heads
QMT Members
The ASDS shall report directly to the SDS. He/She shall provide the SDS with the progress or status of programs and projects of the Division, raise issues and problems affecting (or that may affect) the technical assistance support to schools and community learning centers, and recommend areas for adjustment in the DEDP/DAP. Specifically, the following outlines the roles and responsibilities of the ASDS:
Prepare programs and projects that will support the requirements of the Division
staff, particularly the promotional staff and the district supervisors.
Prepare and submit Monthly Report detailing the status of programs and projects
and future activities
Conduct unit meetings and workshops related to M&E concerns
Supervise day to day activities of Division staff. Monitor the provision of technical
assistance to schools and community learning centers
As a member of the Division Quality Management Team, participate in the SIP Appraisal, Start-Up Review, Annual Implementation Review, Mid-Term Review and Outcome Evaluation
Due to the strategic and sensitive nature of the M&E function, it is suggested that the designation of M&E Coordinator be given to one of the Assistant Schools Division Superintendents (ASDS).
The M&E Coordinator shall report directly to the SDS. The Coordinator shall provide the SDS with interpretation and analysis of M&E data, raise issues and problems affecting (or that may affect) DEDP/DAP implementation, and recommend areas for adjustment in the implementation plan. Specifically, the following outlines the roles and responsibilities of the Division M&E Coordinator:
Assist in the review and revision of the DEDP objectives and strategies, particularly in the adjustment of the OVIs (objectively verifiable indicators) and MOVs (means of verification: reports, documents, data gathering methods)
Assist in the development and adjustment of the DEDP and DAP.
Assist the ASDS in setting up the Division M&E System. Ensure that the Division M&E
System complies with the operations of the Region M&E System
Communicate to Division and District staff the requirements of the School M&E
System and the roles and responsibilities of staff on M&E
Prepare the consolidated Division Monthly Report for the SDS in accordance with the approved reporting formats and schedule. This also includes reviewing and validating the reports submitted by school staff and documents received from outside the school.
Assist the education supervisors, district supervisors and other Division staff in the
preparation of their progress reports.
Record and report the Physical Accomplishments of the Division and schools.
Ensure proper documentation and safekeeping of Division reports and documents
generated during project implementation.
M&E Coordinator. The Planning Coordinator shall provide initial interpretation and analysis
of project data. Specifically, the following outlines the roles and responsibilities of the
Division Planning Coordinator:
Assist in the review and revision of the DEDP objectives and strategies, particularly in the adjustment of the OVIs (objectively verifiable indicators) and MOVs (means of verification: reports, documents, data gathering methods)
Assist in the development and adjustment of the DEDP and DAP
Assist the M&E Coordinator in the preparation of consolidated Division Monthly
Report
Ensure availability of information from the BEIS and timeliness of data.
Mainly responsible for the computer entry of data and provide some initial analysis
and interpretation. Ensure integrity and accuracy of data
Manage and maintain the BEIS
Prepare and submit Monthly Report detailing the status of programs and projects
and future activities
Prepare and submit an Assessment Report on school performance
As a member of the Division Quality Management Team, participate in the SIP Appraisal, Start-Up Review, Annual Implementation Review, Mid-Term Review and Outcome Evaluation
9.3.5 Coordinators
Coordinators are Division staff assigned a specific program or responsibility. These include
Division staff designated as physical facilities coordinator, procurement coordinator, SBM
coordinator, DORP coordinator and others.
Each coordinator shall be responsible for monitoring programs and/or concerns assigned to
him/her.
The Coordinators shall interact closely with the following stakeholders:
SDS
ASDS
School M&E Coordinator
School Planning Coordinator
Other Coordinators
The Coordinators are urged to establish and strengthen their horizontal link with other
coordinators and/or education supervisors and district supervisors in order to fast track the
sharing and utilization of valuable information affecting programs and projects. Specifically,
the following outlines the roles and responsibilities of the Coordinators regarding monitoring
and evaluation.
Monitor the efficiency and effectiveness of programs or concerns
Monitor utilization of programs, systems, facilities and learning equipment installed at
the school and community learning center level
Prepare and submit status report
Quality Management Team or QMT is an ad hoc body composed of Division and District
staff whose task is to implement the quality control and adjustment mechanisms of the
Division. These mechanisms include: SIP appraisal process, start up, annual implementation
review, mid-term implementation review and outcome evaluation.
In general, the QMT is created to ensure compliance and adherence to the objectives and
targets of the Division and to ensure uniform application of policies, standards and
processes.
The QMT is responsible for:
ensuring the quality of plans, program and project designs developed by the
Division, district and schools
ensuring that staff from the division, district and schools are adhering to the
standard processes employed to assure quality
evaluating the major milestones at the school and community learning centers
The QMTs are divided into two major groups: the Core QMT and the Area QMT.
The Core QMT is the central body or process owner of the Quality Management System.
Specifically, the Core QMT will be responsible for the following:
set up the Quality Management System in the Division
oversee the creation and formation of Area QMTs
build capability of the Area QMTs
communicate and enhance Division standards and the Quality Management
System
The Area QMTs put into operation the Quality Management System of the Division. These
teams are responsible for enforcing the quality standards of the Division and providing
technical and training support to schools and community learning centers.
Specifically, the Area QMTs are responsible for the following:
provide technical support to schools in setting up the school quality management
system
orient the schools and community learning centers on quality management
implement the quality control and adjustment points of the Division
evaluate the SBM assessment and ensure the integrity of the process
DIVISION M&E SYSTEM
10.0
SETTING UP A DIVISION M&E SYSTEM
Division M&E System Setting Up
1 A performance measure is composed of a number and a unit of measure. The number provides the magnitude
(how much) and the unit of measure gives the number a meaning (what)
Focus on the right things. Ensure that the performance measures selected are the correct measures for assessing the learners', teachers' and school head's performance. By correct, we mean measures that are direct and used exclusively for one performance area only.
Integrated with other measures. A performance measure should be connected to the other measures in order to provide a more holistic picture of the school's accomplishment and achievement.
(3) Third step is to finalize the DEDP and prepare the DAP (Year 1) based on the
adjusted targets and schedule.
Defining the scope of the M&E will facilitate the design and establishment of the Division M&E System. The DEDP, with its objectives, targets, proposed strategies and activities, defines the scope of the M&E. It is important that these are revisited and finalized to formalize the scope of the M&E. This will lead to the design of the monitoring process, control points, data collection tools and techniques, and management reports.
A stage represents a major segment in the implementation phase, the completion of which represents a major milestone. Each stage in the DEDP implementation presents unique requirements and interactions as well as unique problems and issues. The stage approach provides managers with more control in managing the implementation. The unique requirements of each stage provide the context for monitoring and evaluation.
Control points are M&E review gates for evaluating major outputs and milestones. The results of the control points are used as the basis for adjusting or enhancing the implementation.
The main reference materials in the establishment of control and adjustment points are the DEDP and the SIP. The implementation plans provide details on the critical activities to be undertaken and the targeted accomplishment dates of outputs. The following items need to be considered in the identification and design of Control and Adjustment Points:
Accomplishment of an output or major milestone. One of the major considerations in the setup of the Control and Adjustment Points is the set of outputs to be delivered. Outputs need to be quality assured.
Management reporting practices in the Department. Control points are patterned
after the management reporting practices of the agency. Considering the
reporting practices of the agency will facilitate both requirements of the school as
well as the requirements of the Division, Region and National Office.
Critical path or segment in the implementation process. Critical path refers to an
activity or activities that will have major implications to other activities, outputs and
decisions to be made in the future.
2 Sample only. Intended to show the information to be gathered about the stakeholder. Additional stakeholders and
additional information may be added depending on the accountabilities of individuals in the Division and District
LGUs (Province level)
Other Agencies
3 Sample only. The list of stakeholders and their roles and responsibilities may vary from place to place.
The information requirements of the stakeholders will have implications for the design and/or content of the following:
data elements. These are the most basic information about the status of an implementation. Usually, these are raw data and are important in determining or computing the performance measures. This will also dictate the forms and tables to be developed.
forms and templates. The simpler the forms and templates, the better. These are the most fundamental collection tools to be used in documenting an event and an accomplishment.
report format. The decision-making requirements of the stakeholders will determine the format of the report. It should contain all the necessary information (the numbers and the stories behind the numbers) in order for a school head or a teacher to make the necessary adjustments or improvements in the strategies implemented.
reporting frequency. The need of the stakeholders to make decisions will also dictate the reporting periods. The reports with the numbers and stories must be received on time by the stakeholders in order to ensure timely adjustments (if needed) or decisions.
evaluation frequency. Evaluation pertains to an external assessment undertaken to validate the accomplishments and stories written in the reports. Usually, evaluation is undertaken when the evaluating party is to come up with its own plan.
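The relationship between data elements and performance measures can be illustrated with a short sketch: raw data elements are combined to compute a performance measure, which is a number plus a unit of measure. The field names and figures below are hypothetical assumptions for illustration only, not values prescribed by this Manual:

```python
# Illustrative sketch: raw data elements (hypothetical enrolment and
# dropout counts) feed the computation of a performance measure,
# i.e., a number together with its unit of measure.

def dropout_rate(enrolled, dropped_out):
    """Dropout rate as a percentage of enrolment."""
    if enrolled <= 0:
        raise ValueError("enrolment must be positive")
    return 100.0 * dropped_out / enrolled

# Raw data elements gathered through school forms (hypothetical figures):
data_elements = {"enrolled": 500, "dropped_out": 15}

rate = dropout_rate(**data_elements)
print(f"Dropout rate: {rate:.1f}%")  # the number (3.0) with its unit (%)
```

The forms and templates determine which raw fields (here, `enrolled` and `dropped_out`) are captured; the performance measure is then derived from them rather than collected directly.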
As a rule of thumb, when there is a conflict between the requirements of the internal and external stakeholders, the requirements of the internal stakeholders must be met first, before the external. The decision-making requirements of the internal stakeholders (the school head and teachers) must be given priority in order to help them make immediate enhancements or remediations in the school interventions. This perspective will also ensure that M&E is more about managing and making decisions than about report writing.
integrates these into the control and adjustment points of the system. Once in use, the Division M&E Process can supply the different information requirements of the Division management, program units, support units and districts.
Define the M&E process. This includes defining the control points and events that will be undertaken during the DEDP implementation.
Finalize the reporting requirements and disseminate them.
Formulate the M&E Terms of Reference. After detailing the requirements of the M&E system, the next step is to define the roles and responsibilities of the school head, teachers and staff concerning data collection, sharing of information, reporting assignments and giving feedback.
Quality Management Inventory Model
BACKGROUND
The Basic Education Sector Reform Agenda (BESRA) is a package of policy reforms that seeks to systematically improve the critical regulatory, institutional, structural, financial, cultural, physical and information conditions affecting basic education provision, access and delivery on the ground. BESRA is expected to create the critical changes necessary to further accelerate, broaden, deepen and sustain improved education efforts.
BESRA's implementation actions are focused on four main areas: (1) school-based management (SBM), to help schools better manage their operations for improved learning; (2) Competency-Based Teacher Standards, to enable more teachers to practice competency-based teaching; (3) the Quality Assurance and Accountability Framework, to provide better institutional support to learning and quality assurance; and (4) Outcomes-Focused Resource Mobilization, to ensure resources are focused on achieving desired outcomes.
The focal point of BESRA is the school, and the main integrating vehicle for BESRA implementation is SBM. Through SBM, schools are allowed to manage their own affairs to improve the delivery of education services in a sustained manner. SBM also includes strengthening the school heads in resource mobilization, negotiation, and partnerships with the community and stakeholders. Other assistance includes the provision of funds for priority school projects. Clearly, all major efforts, resources and funds are funneled into helping the schools manage basic education services more efficiently and effectively.
In this regard, the Region and Division play a very important role in supporting the schools. It is essential for the Region and Division to have the capability and the necessary systems and mechanisms in place to sustain their support for the schools' management of their own affairs. The Region and Division are in a strategic position to propagate, maintain the quality of, and sustain SBM interventions and results.
It is in this context that a quality inventory assessment tool was developed to ensure that the Regions and Divisions are ready to assume the task of facilitating SBM. The assessment will focus on their readiness. Readiness includes the presence of well-defined technical assistance processes and support mechanisms that will support the schools' delivery of basic education services to students.
The Quality Management Inventory Model is an integral part of the Quality Management System of the Region. It is a mechanism to promote continuous improvement in the Region and the Divisions. Its main goal is to "improve things and manage things better."
The QMIM depicts a road map that traces the Region and Division's transformation from the use of informal processes to more established technical assistance packages and support mechanisms. It projects an organization's transition from the realm of uncertainty to more repeatable and predictable results. The Model represents a progression in the capability of the Region and Division to deliver management support and technical assistance packages to their target groups.
The QMIM is also a yardstick for assessing the performance of the Region and Division. It will be used to examine the Region and Division's processes and support mechanisms that allow them to efficiently and effectively deliver technical assistance packages to schools, school managers, teachers and the schools' non-teaching staff.
The Model defines four levels: (1) Ad Hoc, (2) Defined, (3) Integrated, and (4) Sustained. Level 1 is the entry level. It represents a Division characterized by ad hoc processes and informal ways of doing things. As it matures, the Division Office is expected to establish its internal procedures (Level 2, Defined). The Division then improves to a stage where it is expected to manage and integrate its different mechanisms into one system (Level 3, Integrated). The highest level is the Sustained level. This represents a Division that adapts, maximizes and continuously improves its way of doing things.
The initial or entry level of readiness. A Readiness Level 1 Region/Division is often characterized by temporary and informal ways of doing things. Organizational procedures or methods are not well defined and disseminated, leading to inconsistent results and poor quality of service. Its technical assistance packages are reactive, inefficient and not relevant to the requirements of its target groups. Often these packages are hand-me-down practices. Their utility value and effectiveness have not been proven, yet they are utilized year in and year out. Some may
If there is a defined process, it does not respond to the challenges and support requirements of its target clients;
Dependence on one or two individuals. These "effective individuals" are often hailed as heroes because they are able to move the organization to achieve results. However, when these champions leave, the organization suffers setbacks; and,
Some positive results are achieved but not sustained and/or maintained.
Region/Divisions belonging to this category may achieve good results (e.g., NAT results); however, these are not maintained, leading to poor results in succeeding years.
If a Level 1 Region/Division aims to efficiently do things, then it must undertake the following
steps:
Collect and collate its practices and experiences and formulate its own set of
procedures;
Adapt other Divisions' experiences and/or approaches that have already been proven and tested;
4. Communicate to Region/Division staff and build a critical mass of individuals who will
guide, guard and champion the newly established process.
The transformation of a Region/Division from Level 1 to Level 2 is critical in the maturity process
of the Division. This stage is often the most frustrating part of the change management process. The introduction of new methods or practices is often met with suspicion and cynicism by individuals. Aside from the suggested steps outlined above, the Region/Division management should conduct sessions on how to implement and manage change.
It is critical first to establish the support mechanisms that will be used as the platform for
delivering the Region/Division's technical assistance packages. These support mechanisms
ensure an efficient delivery of basic education strategies and services. Region/Division
organizational processes must be defined, refined and communicated in order to guarantee
consistency in the quality of its technical assistance.
At this level, Region/Divisions start to define and refine their technical assistance packages.
These packages are detailed into specific steps, refined, standardized and documented.
Region/Division staff are oriented, trained and are expected to perform these processes.
Has defined, formulated and established its TAP processes and support systems;
There is a staff development program that supports the application of the defined
system;
Application of these systems or procedures is not consistent. On a case-to-case basis, defined systems and processes are ignored or not followed; and,
Although systems are in place, these may not "talk to one another" or there may be duplication of efforts.
Unlike Readiness Level 1, Region/Divisions belonging to this category are less dependent on individuals (one or two staff) but need strong-willed management who will enforce the quality standards, defined procedures and the agreed programs and projects.
In order to achieve a higher readiness rating, Level 2 Region/Divisions should undertake the
following:
The transformation from Level 2 to Level 3 results from the organization's motivation to achieve more following its initial success. The satisfaction generated by a formal and coordinated process is boosted by a desire to do things more permanently and consistently.
On Quality Management. Region/Division standards are well established and are used
as basis for monitoring and evaluation;
On Scope Management. Ability to perform all activities in the DEDP and deliver the
This level is not affected by changes in management and/or personnel. The processes, systems and practices provide stability to the operations of the organization.
3. A quality assurance mechanism that detects the positive and negative elements of
the processes;
[Summary table: Technical Assistance Package, with columns for Inventory Level, Characteristics and Actions. The recoverable fragments indicate that, at the entry level, the way of doing things may vary depending on the situation and personalities (personality dependent), and that the corresponding actions are to define processes and build staff capability.]
The Region/Division's maturity at this level hinges on its commitment to excellence. It must have the ability to perform continuous improvements, always optimizing the gains or outcomes of its undertakings. Therefore, a Readiness Level 4 Region/Division should have the following traits:
Defined processes are improved and in sync with agency policies and directions;
This level adheres to the principle of continuous improvement, always optimizing gains or results.
In order to maintain this level, the following efforts are suggested:
3. Document lessons learned and other experiences; input these into the improvement and/or enhancement of Region/Division products and processes; and,
People inevitably leave the organization. Individuals who played a big role in the continuous improvement and growth of the organization will eventually retire, resign or be transferred to other stations. It is imperative that institutional memory is maintained and
handed over to the next generation of Region/Division staff who will continue the culture of excellence established in the Region/Division. In this regard, a good knowledge management program must be in place to ensure that Region/Divisions remain at Level 4.
The focus of the QMIM is the ability of the Region/Division to efficiently deliver its technical
assistance packages to its target group. A Technical Assistance Package is a set of activities,
developed and defined by the Region/Division into a process, designed to solve a particular
issue and/or to achieve desired education objectives or outcomes.
These packages constitute the core process areas that directly impact learning outcomes. These include school-based management, the learning management program, instructional supervision, learning materials and learning environments. Capability building assistance for school heads and teachers to improve instruction is also part of this package.
2. Management Mechanisms
3. Support Mechanisms
These are necessary support processes that are vital to the efficient operations of the
organization. Support mechanisms refer to processes pertaining to finance, human
resource management, administration and procurement.
The management readiness of the Region/Division will be assessed using these TAPs. Essentially, the core review areas include: (1) technical assistance on the provision of SBM assistance and instructional support to schools, (2) management processes that integrate the Region/Division operation, and (3) support processes of the Region/Division to deliver technical assistance as efficiently as possible. The expectation is that Region/Divisions are in a position (i.e., with well-defined and tested procedures and mechanisms) to provide consistent, relevant and timely assistance to divisions/schools.
CRITICAL ASSUMPTIONS
2. The context of the QMIM hinges on the ability of the Region to prepare and
strengthen the Divisions in facilitating and sustaining support to schools on SBM. On
the part of the Division, it is anchored on their ability to provide the necessary
technical support for schools to effectively implement SBM. As such, the assessment
will focus on the capability of the Region/Division to deliver technical assistance
packages that will facilitate and sustain the schools' delivery of basic education
services.
relevant. Its service delivery mechanisms are custom-fitted to meet the unique and changing demands or needs of schools;
timely. Delivery of basic education services is on time, based on plans and programs;
the best technical support. Provides the most efficient and effective service
delivery mechanism to schools and school stakeholders; and,
5. The inventory model ranges from the least mature stage to the most mature state. The model assumes that organizational readiness must be grown over time in order to produce repeatable success.
One of the critical assumptions used in the Quality Management Inventory Model is
continuous improvement. Continuous improvement is based on the premise that change will
occur and will always challenge the status quo. This may be brought about by new needs and
issues, new policies and thrusts, and new standards and challenges. These factors will force the Region/Division to continually innovate, change and improve itself in order to remain efficient and effective.
In order to ensure continuous improvements, the Region/Division must adopt certain basic
management approaches. Adherence to the basic principles and techniques of these
management strategies will help sustain and maintain a high readiness level of the
Region/Division.
In order to facilitate the maturation of the Region/Division from Level 1 to Level 4, the following management approaches are essential inputs to the Region/Division staff development program: change management, project management and knowledge management.
Change Management
A well-planned change management program is very important in the transition from Level 1 to Level 2 readiness.
Knowledge Management
Project Management
Project management (PM) is the application of different tools and techniques to activities in order to achieve objectives or targeted results. PM offers robust planning, implementation and control techniques that can help the Division deliver its services in a more efficient manner. PM techniques will serve as valuable tools in standardizing organizational processes and integrating these into a more coherent and optimal process. Knowledge and skills in PM are very important for Level 2 and Level 3 readiness.
The management approaches identified above should not be taken in isolation. A deliberate effort to integrate the three approaches should be undertaken. The change management approach provides the strategies on how to "soften" resistance and facilitate change when new processes are introduced. Knowledge management, on the other hand, provides the input and perspectives on how to manage, share, propagate and sustain these processes. And lastly, project management techniques lend themselves well to managing change and managing knowledge.
Assessment Areas
Region
The scope of the QMIM assessment for the Region includes the following technical assistance
packages to Divisions:
Strategic Planning
DEDP Appraisal
Outcome Evaluation
Policy Research
Procurement Process
Division
The Quality Management Inventory Model covers assessment of the Divisions' demonstration of
the following technical assistance packages:
Strategic Planning
SIP Appraisal
Information Support
Performance Evaluation
Procurement Process
Objectives
The aim of the QMIM assessment is to determine the capability of the Region/Division to implement timely, consistent and relevant technical assistance packages for its school clientele. The assessment focuses on the ability of the Division to consistently deliver quality technical assistance packages through processes designed to improve efficiency and assure effectiveness of service.
4. Input to DepED Regional Office. This assessment can be one of the strategies for the
Regional Office to implement its quality assurance and accountability work. The results
can be used by the Regional Office as input to design its technical assistance support
to the Division Office.
On the part of the Division, the readiness model may be used as a template by which it can
assess its own operations and determine its capability building requirements. For both the
Regional Office and Division Office, the results of the assessment provide important inputs
toward a more efficient and effective delivery of basic education services.
Methodologies
The QMIM Assessment will make use of different methods of data gathering. These include:
1. Key informant interview. A one-on-one session with the process owner or the individual/staff member who is directly accountable for the implementation of the technical assistance
packages.
2. Interview. Session with staff who have direct knowledge of the process areas or who were involved in the process. Interviewees may also include school heads and teachers who were recipients of the services provided by the Division.
3. Focus Group Discussion. Session with selected Division staff and/or school heads and
teachers discussing the practices and/or processes implemented by the Division. The
FGD will help validate the claims of the key informants or process owner.
4. Artifacts Review. Refers to gathering documents that will prove the existence of a Division process or practice. This also includes inspection of the documents.
5. Observation. When possible, the Assessment Team may conduct observation of the
actual process being undertaken by the Division staff.
Process Owner
The Region will be the Process Owner of the QMIM assessment. The Region will create Quality
Management Team/s that will be tasked to undertake the assessment. As process owner, the
Region must ensure the following:
2. usefulness of the process, especially the findings and results. These should find their way into the Regional Education Development Plan and be used as design considerations for Region programs and projects; and,
3. capability building of the QMT members to assure that they are ready and capable of becoming assessors.
ASSESSMENT TOOL
The QMIM Assessment Tool was developed to facilitate the conduct of the QMIM assessment of the Region/Division in implementing key process areas to deliver its technical assistance packages (TAPs). The Assessment Tool contains the key process areas of the Region and Division and various scenarios across the four maturity levels (ad hoc, defined, integrated and sustained).
The Assessment Tool is not a checklist but will serve as a guide for the QMTs to objectively
document the application or utilization of existing key process areas of the Region/Division. The
tool will also be used to facilitate the documentation of best or effective practices in the
Region and Divisions.
The process owner of the QMIM assessment process is the Region. The following will serve as a guide for the QMTs deputized by the Region to implement the QMIM assessment. The steps listed below are the minimum requirements for conducting an efficient assessment. Depending on the requirements of the Region, additional activities and requirements may be added.
The following are suggested start-up activities for the Region to implement before undertaking the Division QMIM assessment.
Step 1. Review
2. Ask each team member to familiarize himself or herself with the content of the Assessment Tool. This will minimize dependence on the Assessment Tool and allow the assessors to conduct the interviews as "normally" as possible, without the interruption of having to consult the tool from time to time.
3. Make sure that all team members understand how to use the Assessment Tool. This
includes understanding of the continuum (level of maturity), and documentation
requirements.
4. On rapid appraisal, remind each team member that the assessment approach being utilized is the "not so quick and not so dirty" approach. Review the principles and methods of rapid appraisal.
5. Conduct a group review of the key process areas or management processes that must be present in a Division. If possible, the Team should review and familiarize themselves with the objectives, strategies and content of the Division Education Development Plan.
and the target date of completion. Ensure the teams will be able to cover all divisions
based on the time allotment.
2. Assign a Team Leader for each team. The Team Leader shall:
ensure access to documents and materials the team may need for the
assessment
ensure the team has enough copies of the Assessment Tools and other related
materials
3. The number of team members per team should be enough to cover all the items in
the Assessment Tool and to ensure the documentation requirements are met.
Suggested minimum number of members is 5.
4. Assign team members who are very knowledgeable about the management
processes. Form a multi-disciplinary team to ensure coverage of the key process areas.
1. Prepare a work plan detailing the activities to be undertaken by the Team. The work
plan should also include the schedules, logistics and financial requirements needed to
undertake the assessment.
2. Orient all the Assessment Teams and individual team members about the schedule of
the QMIM assessments and the important milestones in the plan such as the deadline
for the preparation of reports.
interview school heads and teachers so that arrangements can be made by the
Division before the actual review.
2. Coordinate with the Division about your travel arrangements, logistical requirements and other administrative prerequisites to ensure the smooth conduct of the assessment.
3. Make sure every team member has a copy of the Assessment Tool and other necessary forms. Have them reproduced before going to the school.
4. Conduct a final team meeting before going to the school. Apprise the team members of their roles and responsibilities and the scope of the evaluation.
Start your visit with a courtesy call. Discuss the purpose of your visit, your plan for the
day, the people you need at a particular time and the documents you need to review.
If the Division opted to conduct an opening program or ceremony, try to limit this to 30
minutes.
Assure the Division management and staff that the QMIM assessment is not meant to evaluate the performance of the Division but to ascertain its level of readiness to perform critical processes or technical assistance packages.
Whether you use an interview, group interview or focus group discussion, tell the respondent/s the purpose of the activity. Inform respondents that you will be taking down notes in the course of the discussion.
Don't forget to ask for evidence (MOVs) of the claims that the respondent/s made, and thank the respondents for their participation as soon as you have finished your data gathering with them.
As soon as you have completed your data gathering activity, organize your data and
meet as a team to prepare for the exit conference.
Conduct an exit conference with the SDS. Point out significant observations regarding the Division's level of readiness but avoid making judgments or conclusions right away. Inform the SDS that the Team will discuss the observations, analyze the findings as a Team and make recommendations.
Thank the SDS and tell him/her that a complete report will be sent to him/her officially
by the Regional Office.
1. Encode and consolidate the observations and documentation of the Team using the same format. Provide each member a copy of the documentation.
3. During the Team meeting, come up with a consensus regarding the Level of Maturity
of the Division per process area. Prepare a report on the assessment.
4. Given the findings and analysis, the Team is to formulate recommendations and suggestions to the Region on how to assist the Division in improving and/or reinforcing its practices and/or delivery of technical assistance packages.
5. Submit and discuss the findings, analysis and recommendations with the Regional Director.
procedure which the Assessment Teams adopted to gather the data. The discussion should cover all the activities done before the assessment (preparations made), during the assessment (interviews and generation of MOVs) and after the assessment (collation of data, manipulation of data, how the data were analyzed, etc.);
describes the results of the QMIM assessment of the Division. This section presents the findings on the various items studied, indicating the maturity level of the Division.
1. The findings of the QMIM Assessment should find their way into the Region's technical assistance packages for the Division. The results of the Assessment will be used as input by the Region to define and formulate programs and projects for the Division.
2. The findings concerning Level 3 (Integrated) or Level 4 (Sustained) should find their way into the documentation of best/effective practices of the Division.
3. The QMIM results can also be used as input to the Outcome Evaluation to be
undertaken by the Region. It can also be used to amend, enhance and/or formulate
new standards and policies.
4. The Division may want to implement the assessment in the other 70% of schools not covered by the initial assessment.