DEPARTMENT OF EDUCATION
REGION VII, CENTRAL VISAYAS
MAR 0 7 2013
REGIONAL MEMORANDUM
No. 149, s. 2013
1. For the information and guidance of all concerned, enclosed is a copy of Unnumbered
Memorandum from USEC Rizalino D. Rivera, Undersecretary for Regional Operations and Chairman,
SBM-TWG dated February 4, 2013 entitled "SBM ASSESSMENT SCORING MATRIX: GUIDE FOR
OPERATIONAL TRY-OUT," which is self-explanatory.
CARMELITA T. DULA ON
Director III
Officer-in-Charge
CTD/CCL/FCS/apv
22/02/2013 10:52 6337236 DEPED-RADIO ROOM PAGE 03
TO: REGIONAL DIRECTORS
    ALL SCHOOLS DIVISION SUPERINTENDENTS

FROM: RIZALINO D. RIVERA
      Undersecretary for Regional Operations and Chair, SBM-TWG
The issuance of DepEd Order No. 83, s. 2012 on the revised SBM Framework, Assessment
Process and Tool provides for the unified implementation of the enhanced SBM Practice
and School Accreditation Program through PASBE (Philippine Accreditation System for
Basic Education).
As per DepEd Order No. 83, s. 2012, the SBM Level of Practice is determined by a
composite score: sixty percent (60%) from demonstrated Performance Improvements
(PI) along the thematic areas of Access, Efficiency, and Quality, and forty percent
(40%) from the result of the validated self-assessment process using the
standardized SBM Assessment Rubric, which comprises the Governance portion.
The scoring matrix [attachment 1] released with this memorandum was formulated from
a systematic review of 5-year historical data to determine mean performance and
standard deviations (SD) across school typologies. The means (with consideration of
SD) were used as baselines or "markers" to chart the annual targets towards the EFA
goals (2013-2015). The projected change in each performance indicator was scaled to
reflect marginal to high improvements reflective of the school system's development
process.
The scoring matrix was developed by a combined team of experts and practitioners. It
will be subjected to an operational try-out between February and March 2013 to ensure
the reliability and applicability of the measures and metrics across school typologies
in the country. Subsequently, a revised version will be developed and issued through a
Supplemental Guideline in March 2013 to signal the commencement of the synchronized
SBM assessment in April-May 2013, following the National Orientation of the SBM-PASBE
Coordinating Teams.
For the period of operational try-out, only the participating divisions are encouraged to
conduct the assessment and undertake the analysis of results thereof, for input to the
final scoring matrix. An official memorandum will be issued to these divisions for the
purpose.
Should you have questions and/or clarifications on this memorandum, you may contact Ms.
Maria Katrina L. Gregorio, SBM Secretariat, at (02) 633-7216 or by email at
sbmpasbe@gmail.com.
1. The Implementing Guidelines to the Revised SBM Framework, Assessment Process and
Tool were issued through DepED Order No. 83, s. 2012. The revised process emphasizes the centrality
of learners and recognizes differentiated SBM practice leading to the improvement of basic
education delivery.
2. The revised process underscores the guiding principles of A Child- and Community-
Centered Education Systems (ACCESs). It illustrates the interrelationships of various mechanisms at
the school level and how these are used to improve school systems and processes. It shows how
School Improvement Planning (SIP), the School Governing Council (SGC), the accountability system, and
the use of a development fund (such as an SBM grant) for continuous improvement projects (CIPs) can
help the school and its stakeholders accelerate learning performance and improve curriculum
delivery and school governance through SBM practice.
3. With this, the connection between SBM practice and school performance as its ultimate
outcome becomes more evident and quantifiable.
4. For a school to determine its level of SBM practice, it must first demonstrate performance
improvement across four (4) thematic areas: Access, Efficiency, Quality, and Governance.
Access, efficiency, and quality shall comprise 60% of the resulting level of practice, while 40% shall
come from the results of the SBM peer assessment using the Document Analysis-Observation-
Discussion (D-O-D) process, which shall pertain to governance. The validation process for
governance is already detailed in DepED Order No. 83, s. 2012.
5. The determination of norms and criteria for the identification of the 60% learning
outcomes and the final scoring matrix was established in consultation with select experts and field
practitioners.
7. This memorandum provides all elements in the scoring matrix and steps in the overall
computation of the level of SBM practice. The steps in assessment and scoring shall be as follows:
7.1 As eligibility criteria for a school to determine its level of practice, it must demonstrate an
acceptable performance improvement over a period of three (3) years - certified by
the division in the following areas: access, efficiency, and quality. This shall be the
prerequisite for the school to request validation from the division. (For example, for
schools which are ready to conduct the assessment this year, SY 2010-2011 shall be the
baseline year for computing the improvement).
7.2 Each thematic area is given a corresponding weight based on the school's mandate and
expected organizational outcomes: access (45%), efficiency (25%), and quality
(30%).
7.3 The performance indicator for access is enrolment; if need be, it shall be supported by
the school-age population (6-11 for elementary and 12-15 for secondary), taken from
the Barangay Hall and duly certified by the Barangay Captain/Chairman. For efficiency:
drop-out rate (DR), cohort survival rate (CSR), and completion rate (CR). For quality:
achievement rate in terms of mean percentage scores (MPS) in the National
Achievement Test (NAT).
7.4 For each indicator, a scale/range was set by determining the baseline and the
historical trend for the past five (5) years, then getting the average and comparing it
with international and country standards. Depending on the increase in improvement, a
score will be given as Marginal, Average, or High, with equivalent points of 1, 2, and 3
respectively. The table below shows the increase in improvement and the equivalent
rating and point/s for each indicator:
7.5 Based on the rating, compute the total points per thematic area and multiply
them by the assigned weight. Add up the weighted scores for all thematic areas. Based
on the resulting score, school performance in terms of the identified learning outcomes
will be classified as "Good", "Better", or "Best". The table below shows the interval
score for each category.
7.6 The resulting score in item 7.5 shall be multiplied by 60%, which is the weight given for
the improvement of learning outcomes [see Annex 2 for sample computation].
7.7 Schools that qualify under the "Better" or "Best" categories shall be
eligible for validation by the division and/or region using the D-O-D process as detailed
in DepED Order No. 83, s. 2012. Schools classified as "Good" are encouraged to work
further on improving their performance to qualify for at least Level I status.
7.8 To arrive at the Level of Practice, compute the score in the 40% D-O-D validation. As
reflected in DepED Order No. 83, s. 2012, the % weight for each principle in relation to
improving learning outcomes and school operations is as follows:
7.9 Applying the computation in the said DepED Order, add up the resulting points per
principle. Depending on the resulting score, the school shall be classified as "Good",
"Better", or "Best".
7.10 Multiply the score from item 7.9 by 40%, which is the weight
given to governance (refer to DepED Order No. 83, s. 2012 for details).
7.11 Add the scores gathered from Parts I and II. The resulting score will be the basis for the
computation of the Level of SBM Practice.
7.12 A sample computation for a school and a blank School Scoring Matrix are attached as
Annexes 2a and 2b for easy reference.
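The steps above amount to a short weighted-sum computation, which can be sketched in code. This is a minimal illustration only, assuming the weights used in the Annex 2-b sample (access .45, efficiency .25, quality .30, which sum to 1.00), the 1-2-3 point scale, the Good/Better/Best cut-offs, and the 60/40 composite described in this memorandum; all function and variable names are hypothetical and not part of any official DepEd tool.

```python
# Illustrative sketch of the SBM Level of Practice computation (items 7.2-7.11).
# Weights and cut-offs as used in the Annex 2-b sample; names are hypothetical.

AREA_WEIGHTS = {"access": 0.45, "efficiency": 0.25, "quality": 0.30}

def performance_score(ratings):
    """Weighted sum of per-area points (1 = Marginal, 2 = Average, 3 = High)."""
    return sum(ratings[area] * weight for area, weight in AREA_WEIGHTS.items())

def classify(score):
    """Map a score on the 0.50-3.00 scale to Good / Better / Best."""
    if score >= 2.50:
        return "Best"
    if score >= 1.50:
        return "Better"
    return "Good"

def level_of_practice(pi_score, dod_score):
    """Composite: 60% performance improvement + 40% D-O-D governance score."""
    return pi_score * 0.60 + dod_score * 0.40

# Worked example matching Annex 2-b:
pi = performance_score({"access": 2, "efficiency": 3, "quality": 2})   # 2.25
composite = level_of_practice(pi, 2.12)                                # about 2.20
```

With the memo's cut-offs, a composite of about 2.20 falls in the 1.50-2.49 band, i.e. "Better", corresponding to Level II (Maturing).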
8. The scoring matrix for SBM Assessment resulting from the expert validation shall be
subjected to an operational try-out from February to March 2013 in selected schools. The results of
the try-out will be finalized and released as supplementary guidelines to DepED Order No. 83, s. 2012.
SBM NORMS IN PERFORMANCE INDICATORS
The determination of norms and criteria for the identification of the 60% learning outcomes and the final scoring matrix
was done through a review of the literature and of existing international and country benchmarks, goals, and standards across
the thematic areas: access, efficiency, and quality. More specifically, the group reviewed five-year historical performance
vis-à-vis the benchmarks and the EFA goals, computed the average rate of increase, and extrapolated the needed
increase to reach the targeted outcome in 2016. They also considered the population of schools according to enrolment,
the standard deviation, and the threshold in performance within the range.
AREAS / INDICATORS / NORMS & BENCHMARKS / PROPOSED PH STANDARDS / JUSTIFICATION

ACCESS (45%)
Indicator: Enrolment Increase

Norms/Benchmarks (Philippines):
  A. Net Enrolment Rate, SY 2011-2012: Elem: 91.21; Sec: 62.0; combined: 76.61
     Increase needed to reach the EFA Goal 2016 (100%): 5.84
     Average for 5 years: Elem: 89.59; Sec: 60.69; combined: 75.14
     Increase needed to reach the EFA Goal 2016 (100%): 6.21
  B. Enrolment increase (average, Sample Urban Division vs. Rural Division - Elem: 6.51; Sec: 16.23): 11.37
     Secondary (IU, non-IU, Annex HS, with Special Program): 16.23
       Rural: Division X (average increase for 5 years): 16.0
       Urban: Division Y (average increase for 5 years): 17.0
     Elementary (Central, Non-Central, Multi-grade, with Special Program): 6.51
       Rural: Division X (average increase for 5 years): 2.61
       Urban: Division Y (average increase for 5 years): 10.21
  C. Enrolment rate based on the Community Mapping, certified by the Brgy. Captain,
     Division Planning Officer, and SDS.

Proposed PH Standards:
  A. Enrolment Increase:
     1 - Marginal: at least 3% increase
     2 - Average: at least 5% increase
     3 - High: at least 7% increase
  B. Enrolment Rate based on Community Mapping:
     1 - Marginal: at least 85% enrolment
     2 - Average: at least 90% enrolment
     3 - High: at least 95% enrolment

Justification:
  * The Marginal rating starts at a 3% increase since this is the lowest rate based on
    the average 5-year increase of Sample Rural Division X.
  * On the assessment result: notations on the annual growth rate at the community level
    should be considered; hence, community mapping should be conducted, certified by the
    SDS, Planning Officer, and Brgy. Captain.
  * Example only: enrolment based on the annual population growth rate, Philippines,
    SY 2011-2012: 1.8 (Elem); 2.3 (Sec); average: 2.05.
    (Footnote: "Finland's Educational System Best in the World," December 2012.)

QUALITY
Indicator: Achievement rate (MPS in the NAT)

Norms/Benchmarks:
  Rating scale: Poor: 0-25%; Below Average: 26-50%
  Philippines, average for 5 years: Elem: 66.66 (or 67%)
  To reach the EFA Goal 2016 (75%): a 2% increase per year is needed

Proposed PH Standards:
  Option 1:
    1 - Marginal: at least 2% increase
    2 - Average: at least 5% increase
    3 - High: a 7% increase, or at least 75% MPS
  Option 2:
    1 - Marginal: 26-50%
    2 - Average: 51-75%
    3 - High: 76-100%
Annex 2a
Note: Only schools with a Performance Improvement of "Better" or higher can apply to the Division for SBM Validation.

Legend:
  Numerical Rating Scale    Description    Interpretation
  0.50 - 1.49               Good           Developing (Level I)
  1.50 - 2.49               Better         Maturing (Level II)
  2.50 - 3.00               Best           Advanced (Level III)
Annex 2-b
Sample Computation

Step 2: Based on the rate of increase in Appendix 1, determine the score for each
indicator: Marginal (1), Average (2), or High (3).

Step 3: Multiply the percentage weight by the raw score.
  1.) Access (45%): 2 x .45 = 0.90
  2.) Efficiency (25%): 3 x .25 = 0.75
  3.) Quality (30%): 2 x .30 = 0.60

Step 4: Add the weighted scores together to get the subtotal: 0.90 + 0.75 + 0.60 = 2.25.

As an interpretation, we can now determine the school's performance level: Better (2.25).

  Categories    Interval scores
  Good          0.5 - 1.4
  Better        1.5 - 2.4
  Best          2.5 - 3.0
Because the D-O-D validation yielded 2.12, the school is again at the "Better" level
under the same categories:
Categories Interval scores
Good 0.5-1.4
Better 1.5-2.4
Best 2.5-3.0
We then compute the composite, taking 60% from the performance indicators and 40% from
the D-O-D validation:

2.25 x .60 = 1.35
2.12 x .40 = 0.85

Adding these, we get 1.35 + 0.85 = 2.20.

The final interpretation is that the school is at Level II (Maturing), since 2.20 falls
under Level II in this table:
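The sample arithmetic above can be replayed in a few lines. This is a sketch only; the two-decimal rounding at each step follows the memo's convention, and the raw scores and weights are those of the Annex 2-b sample.

```python
# Replay the Annex 2-b composite computation with the memo's rounding.
weighted = [2 * 0.45, 3 * 0.25, 2 * 0.30]   # access, efficiency, quality
pi_subtotal = round(sum(weighted), 2)       # 2.25 -> "Better"
dod_score = 2.12                            # result of the 40% D-O-D validation
pi_part = round(pi_subtotal * 0.60, 2)      # 1.35
dod_part = round(dod_score * 0.40, 2)       # 0.85
composite = round(pi_part + dod_part, 2)    # 2.20 -> Level II (Maturing)
```

Note that rounding each 40%/60% part before summing (as the memo does) gives 2.20, whereas summing first and rounding once would give 2.198, i.e. the same classification either way.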