
THE REPUBLIC OF UGANDA

Ministry of Gender, Labour and Social Development

Report on Data Quality Assessment and
Improvement for Orphans and Other
Vulnerable Children Services in Uganda

July 2016
This report presents findings of a data quality assessment and improvement (DQAI) exercise that was
undertaken by the Ministry of Gender, Labour and Social Development (MGLSD), in response to a
recommendation by the Orphans and Other Vulnerable Children (OVC) Monitoring and Evaluation
Technical Working Group (OVC M&E TWG) for routine assessment of the quality of data reported
through the OVC Management Information System (OVCMIS).

Contact:

Ministry of Gender, Labour and Social Development (MGLSD),


Plot 13 Lumumba Avenue (Simbamanyo Building),
P.O. Box 7136, Kampala Uganda,
Tel: +256 (0) 414 347 854, +256 (0) 414 347 855, +256 (0) 414 343 572
Website: http://www.mglsd.go.ug
E-mail: ps@mglsd.go.ug
TABLE OF CONTENTS

TABLE OF CONTENTS................................................................................................................................ i
LIST OF FIGURES ..................................................................................................................................... iii
ACRONYMS ............................................................................................................................................ iv
ACKNOWLEDGEMENT ............................................................................................................................. v
FOREWORD ............................................................................................................................................. v
EXECUTIVE SUMMARY .......................................................................................................................... vii
1.0 INTRODUCTION ........................................................................................................................... 1
1.1 BACKGROUND TO OVC PROGRAM DATA MANAGEMENT .......................................................... 1
1.2 THE OVC DATA QUALITY ASSESSMENT AND IMPROVEMENT (DQAI) ......................................... 2
1.3 RATIONALE FOR THE OVC DATA QUALITY ASSESSMENT AND IMPROVEMENT (DQAI) .............. 3
1.3.1 OBJECTIVES OF THE OVC DQAI ........................................................................................... 3
1.3.1.1 OVERALL OBJECTIVE OF THE OVC DQAI .......................................................................... 3
1.3.1.2 SPECIFIC OBJECTIVES OF THE OVC DQAI......................................................................... 3
1.3.2 THE MGLSD OVC DQAI CONCEPTUAL FRAMEWORK .......................................................... 4
2.0 METHODOLOGY .......................................................................................................................... 5
2.1 STUDY DESIGN......................................................................................................................... 5
2.2 PROGRAM INDICATORS TRACKED .......................................................................................... 5
2.3 STUDY POPULATION, SAMPLE AND SAMPLING METHODOLOGY........................................... 5
2.4 DATA SOURCES, TOOLS AND COLLECTION ............................................................................. 6
2.5 ORIENTATION OF DQAI TEAMS, PILOT TESTING OF TOOLS AND FIELD PROCEDURES ........... 7
2.6 FIELD DATA COLLECTION ........................................................................................................ 7
2.7 DATA MANAGEMENT; DATA ENTRY, CLEANING AND ANALYSIS ............................................ 8
3.0 FINDINGS ................................................................................................................................... 10
3.1 DESCRIPTIVE CHARACTERISTICS OF OVC SERVICE PROVIDERS ASSESSED ............................ 10
3.2 M&E SYSTEM ASSESSMENT .................................................................................................. 12
3.2.1 M&E STRUCTURE, FUNCTIONS AND CAPABILITIES TO HANDLE OVC INFORMATION ...... 12
3.2.2 UNDERSTANDING OF INDICATOR DEFINITIONS AND REPORTING GUIDELINES ............... 15
3.2.3 AVAILABILITY OF DATA-COLLECTION TOOLS AND REPORTING FORMS............................ 16
3.2.4 DATA MANAGEMENT PROCESSES .................................................................................... 18
3.2.5 USE OF DATA FOR DECISION MAKING .............................................................................. 20
3.3 DATA VERIFICATIONS ............................................................................................................ 22
3.3.1 DOCUMENTATION REVIEW............................................................................................... 22
3.3.1.1 OVERALL STATUS OF DATA SOURCES ........................................................................... 22
3.3.1.2 COMPLETENESS OF ALL AVAILABLE DATA SOURCES .................................................... 24

i | Uganda Country Report - Quality of Data Reported by OVC Service Providers, January-March 2016
3.3.1.3 VERIFICATION OF THE AVAILABLE INFORMATION COVERING THE PERIOD UNDER REVIEW ........ 25
4.0 DISCUSSIONS AND RECOMMENDATIONS ................................................................................. 28
4.1 GENERAL STRENGTHS AND WEAKNESSES ................................................................................ 28
4.2 RECOMMENDATIONS FOR IMPROVED DOCUMENTATION OF OVC DATA ............................... 29
5.0 ANNEXES ................................................................................................................................... 31
ANNEX I: DATA COLLECTION AND ASSESSMENT TOOLS ................................................................... 31
1) INFORMATION PAGE................................................................................................................. 31
2) DATA VERIFICATION AND VALIDATION TOOL (DVV FORM) FOR SERVICE DELIVERY SITE ........ 31
3) DATA MANAGEMENT ASSESSMENT TOOL (SERVICE POINT & DISTRICT LEVEL M&E UNIT) .... 33
4) DATA VERIFICATION AND SYSTEM ASSESSMENT SHEET – RECOMMENDATIONS.................... 35
5) GENERAL OBSERVATIONS AND NOTABLE GOOD PRACTICES AT SITE ...................................... 36
6) FEED BACK TO SITE – M&E SYSTEM ASSESSMENT.................................................................... 36
7) FEED BACK TO SITE - VERIFICATION .......................................................................................... 37
ANNEX II: SCHEDULE FOR THE ORIENTATION OF FIELD TEAMS ON THE OVC DQA ......................... 38
ANNEX III: DATA DISPARITY STATUS BY IP AND SERVICE POINTS ..................................................... 39
ANNEX IV: LIST OF DQAI TEAM MEMBERS ....................................................................................... 44
ANNEX V: LIST OF SERVICE PROVIDERS ASSESSED ........................................................... 45

LIST OF FIGURES

Figure 1: Conceptual Framework for the OVC DQAI ................................................................................ 4
Figure 2: Map of Uganda Showing Distribution of Sites by Region ...................................................... 10
Figure 3: Overall Rating of M&E Systems Functionality ....................................................................... 12
Figure 4: M&E Structure, Functions and Capabilities ........................................................................... 12
Figure 5: Understanding of Indicators and Reporting Guidelines ........................................................ 15
Figure 6: Availability of Data Collection and Reporting Tools............................................................... 16
Figure 7: Use of National Tools and Forms for Multiple Partner Reporting ......................................... 17
Figure 8: Adherence to National Confidentiality Guidelines ................................................................ 19
Figure 9: Data Management Files in Some of the Service Providers Visited. ........................................ 20
Figure 10: Overall Status of Data Sources ............................................................................................. 23
Figure 11: Number of OVC Service Providers by Data Disparity ........................................................... 24
Figure 12: Disparity between Service Provider Data Extracted from the OVCMIS and Register Recount
.............................................................................................................................................................. 24
Figure 13: Data Disparity Trend (as it moves from the register to the OVCMIS) ................................. 25

ACRONYMS

AOET AIDS Orphans Education Trust


BOCY Better Outcomes for Children and Youth
CDC Centres for Disease Control and Prevention
CDO Community Development Officer
CCDs Central Coordinating Teams
CPA Core Programme Area
CEM/UPHS Uganda Private Health Support
CSI Child Status Index
DCDO District Community Development Officer
DOD Department of Defence
DOS Department of State
DQAI Data Quality Assessment and Improvement
HIV/AIDS Human Immunodeficiency Virus/Acquired Immunodeficiency Syndrome
ICOBI Integrated Community Based Initiative
IRB Institutional Review Board
IPs Implementing Partners
KCCC Kamwokya Christian Caring Community
KIWEPI Kitgum Women Peace Initiative
MakSPH Makerere University School of Public Health
M&E Monitoring and Evaluation
MEEPP Monitoring and Evaluation of Emergency Plan Progress.
METS Monitoring and Evaluation Technical Support Programme.
MGLSD Ministry of Gender, Labour and Social Development
MJAP Makerere Mbarara Joint AIDS Programme
MURWP Makerere University Walter Reed Project
NSPPI National Strategic Programme Plan of Interventions
OVC Orphans and Other Vulnerable Children
OVI Uganda Orphans and Other Vulnerable Children Vulnerability Index
OVC_NIU Orphans and Other Vulnerable Children National Implementation Unit
OVC M&E TWG OVC Monitoring and Evaluation Technical Working Group
RECO-PIN RECO Industries – Production for Improved Nutrition
PEPFAR President’s Emergency Plan for AIDS Relief
PSWO Probation and Social Welfare Officer
RDQA Routine Data Quality Assessment
RHSP Rakai Health Services Programme
SDS Strengthening Decentralisation for Sustainability (SDS) Project
SOP Standard Operating Procedure
SOCY Sustainable Outcomes for Children and Youth
UEC/UCMB Uganda Episcopal Conference/ Uganda Catholic Medical Bureau
UNICEF United Nations Children’s Fund
UPMB Uganda Protestant Medical Bureau
USAID United States Agency for International Development

FOREWORD

The Ministry of Gender, Labour and Social Development has the principal mandate on the welfare of
children, as enshrined under the Constitution of Uganda (Chapter 4) and the Local Government Act
(1997), which provide for ensuring implementation of national policies and adherence to
performance standards. The Constitution empowers the Ministry to inspect, monitor and, where
necessary, offer technical advice, support supervision and training.

In addition, the Children Act Cap. 59, Section 10, provides for the decentralized duty of all Local
Government Councils, from village to district level, to safeguard and promote the welfare of
children within their area, assisted by officers of the respective Local Government Councils.

The MGLSD has been implementing the OVCMIS for reporting services provided to vulnerable
children and their households in all the districts of Uganda. In the same disposition, the MGLSD is
mandated to regularly undertake DQAI exercises to verify the quality of services provided to OVC.

In July 2016, the MGLSD teamed up with the two monitoring and evaluation (M&E) agents, MEEPP
and METS, as well as other OVC partners and District Community Based Services Departments
(CBSD), to conduct the first national OVC DQAI in seventy (70) OVC service provision points in
twenty-four (24) districts, sampled based on the January-March 2016 data input into the OVCMIS.

The success of the OVC DQAI is indeed owed to the strong partnership between the Government of
Uganda and its development partners, particularly PEPFAR through USAID, CDC and other agencies.
Findings of the OVC DQAI will certainly guide interventions for reducing child vulnerability in the
Country.

ACKNOWLEDGEMENT

The Ministry of Gender, Labour and Social Development (MGLSD) is grateful to all development
partners and service providers for the continued support to the orphans and other vulnerable
children (OVC) response in the Country. Without you, the Government of Uganda alone would not be
reaching out to all the children and their households in need of support.

I would like in a special way to thank the United States Government and the American people for the
support to OVC response and, in particular for supporting the operationalisation of the OVC
Management Information System (OVCMIS). It should be noted that without comprehensive data, it
will be difficult to tell whether a difference is being made in the lives of vulnerable children.

I would also like to thank the Monitoring and Evaluation of Emergency Plan Progress (MEEPP-II)
Project for the immense improvements to the OVCMIS, including development of resource materials
and data tools that have been widely disseminated to almost all OVC service providers. Many
improvements have been made to the OVCMIS and to other resource materials such as standard
operating procedures (SOPs), registers and other guidelines. I call upon all OVC service providers to
make use of these tools to improve services for OVC and their households.

Coming back to the issue of data quality assessment and improvement (DQAI): first of all, I am happy
that we are now emphasising data quality, having strengthened data reporting over a long period.
Now that we are registering 100% reporting by districts, we need to emphasise data quality, for it is
quality information that can be used for planning and advocating for the children’s development
agenda.

On this note, I would like to thank the MEEPP Project and the Monitoring and Evaluation Technical
Support (METS) Project which, with support from the President’s Emergency Plan for AIDS Relief
(PEPFAR) through the United States Agency for International Development (USAID) and the Centers
for Disease Control and Prevention (CDC), assisted this first ever comprehensive national OVC data
quality assessment.

Let me also take this chance to thank the OVC National Implementation Unit (OVC_NIU) and the OVC
Monitoring and Evaluation (M&E) Technical Working Group (OVC M&E TWG) for coordinating OVC
response in a manner that has brought all service providers to work together.

Finally, I thank all those who participated in and accepted to be assessed during the OVC DQAI
exercise. I call upon all stakeholders to receive these findings appreciatively and use them to improve
service delivery. Those who were found to be performing well should aim at maintaining their
standards or raising them even further.

EXECUTIVE SUMMARY

BACKGROUND
This report gives a comprehensive account of the processes, findings and recommendations of the
OVC data quality assessment and improvement (DQAI) exercise, whose intent was to validate data
reported during the January-March 2016 quarter and establish strengths and gaps in the monitoring
and evaluation (M&E) systems employed by OVC service providers to manage OVC data.

In 2014, the Ministry (MGLSD) commissioned an assessment of the OVCMIS to assess the
functionality of the system with regard to achieving its intended objective. Findings from the
assessment highlighted data quality gaps including invalid, incomplete, unreliable and untimely data.
The web-based system routinely provides data for planning responses to the plight of vulnerable
children in Uganda.

The MGLSD, working with stakeholders, instituted measures for improving the quality of data
reported through the OVCMIS, which mainly included standardization and rollout of protocols.
Despite the improvements to the OVCMIS, there was evidence of internal data inconsistencies, as
seen in the January-March 2016 quarterly report, which implied that data in the OVCMIS was not of
the best quality.

RATIONALE
The MGLSD and Implementing Partners (IPs) are required to routinely conduct DQAIs to inform
planning, monitoring and performance management. Under the guidance of the MGLSD, the OVC
M&E Technical Working Group recommended an OVC DQAI to verify the quality of data reported
during the January-March 2016 quarter and assess data management systems, well aware that
DQAIs help to ascertain data quality, strengthen data management and increase trust in data.

This DQAI was intended to improve data management systems for OVC programmes in Uganda. The
specific objectives were to: 1) assess M&E systems at service delivery level; 2) validate data
reported; 3) identify challenges in collecting, recording, collating and reporting data; and 4) verify
services provided to OVC and their households during the January-March 2016 quarter.

METHODOLOGY
The MGLSD, in partnership with the two PEPFAR M&E agents, MEEPP and METS, randomly sampled
and assessed 70 out of a total of 1,590 OVC service providers that reported during the January-March
2016 quarter. Service provider selection followed a stratified random sampling design with
proportional-to-size allocation, taking into account representativeness of the nine (9) technical
support zones, IPs and service providers that reported during the quarter.
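The stratified, proportional-to-size sampling described above can be sketched as follows. The total of 1,590 reporting providers and the sample of 70 come from the report; the zone names and per-zone counts below are hypothetical, and the largest-remainder rounding is one reasonable convention the report does not specify.

```python
import math
import random

def allocate_sample(strata_totals, total_sample):
    """Proportional-to-size allocation of a fixed sample across strata.

    strata_totals: dict mapping stratum (e.g. technical support zone)
    to its number of reporting service providers.
    Returns a dict mapping stratum to the number of providers to sample.
    """
    grand_total = sum(strata_totals.values())
    # Raw proportional shares, then largest-remainder rounding so the
    # allocations sum exactly to total_sample.
    raw = {z: total_sample * n / grand_total for z, n in strata_totals.items()}
    alloc = {z: math.floor(r) for z, r in raw.items()}
    shortfall = total_sample - sum(alloc.values())
    for z in sorted(raw, key=lambda z: raw[z] - alloc[z], reverse=True)[:shortfall]:
        alloc[z] += 1
    return alloc

def draw_sample(providers_by_zone, allocation, seed=1):
    """Simple random draw (without replacement) within each stratum."""
    rng = random.Random(seed)
    return {z: rng.sample(providers_by_zone[z], allocation[z])
            for z in providers_by_zone}
```

With three hypothetical zones totalling 1,590 providers, `allocate_sample(zones, 70)` returns per-zone sample sizes summing to 70, and `draw_sample` then picks the individual providers at random within each zone.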

Two indicators were assessed, namely: 1) number of individuals served and 2) number of OVC
graduated. Assessment teams comprised MGLSD, MEEPP, METS, IPs and District officers. The
assessment focused on three major components, namely: 1) M&E systems readiness assessment;
2) data crosschecks and validation; and 3) analysis of gaps, weaknesses and areas for improvement.

FINDINGS

OPPORTUNITIES NOTED

1. Overall, most of the assessed OVC service providers had staff prioritized for data management,
although data management tools, skills and practices for attaining good quality data still call for
improvement. The following opportunities were noted:
 59% of sampled OVC service providers had good M&E systems for managing OVC data.
 71% of sampled service providers had good, well laid-out job schedules and staff for managing data.
 54% of sampled service providers have an electronic data management system and, in some cases,
web-based computerized databases.
 53% of service providers are consistently utilizing national OVCMIS tools for prioritization,
identification, assessment, monitoring and graduation using the community structures.
 All sampled OVC service providers are using government structures in implementing activities,
which strengthens partnership and supervision; for example, supporting the sub-county CDO in
reaching out to OVC.
 Many CSOs are increasingly reporting directly on the OVCMIS, which has reduced the workload on
the PSWO.
 65% of service providers had a clear understanding of indicator definitions and reporting guidelines.
 59% of service providers had favourable data management processes.
 IPs are increasingly playing the catalyst role of improving data management, as evident among
service providers and districts with close links to IPs, which had better data management practices.

GAPS NOTED

1. Misinterpretation of indicators and reporting guidelines, with the following related findings:
 35% of sampled service providers generally lacked clarity on what should be recorded in source
documents.
 45% of sampled service providers misunderstood how to derive variables for the indicator
"OVC served".
 56% did not understand how to compute "OVC graduated".
2. The majority of service providers lacked filled (completed) forms and reports for validation and
verification. The following were noted:
 40% lacked a complete package of national OVCMIS tools. Only 46% had appropriate client cards,
registers and archived summary reports for the January-March 2016 quarter.
 Poor filling (completion) of source documents, with only 40% of sampled OVC service providers
holding data sources (client forms, registers and summary reports) for the period Jan.-Mar. 2016.
 Data collection tools were not in use in some of the CSOs even where they existed, because: staff
were not trained to use OVC tools; some IPs prefer their own OVC tools; electronic databases were
used without reference to hardcopy registers; and funders were not emphasizing usage of MGLSD
tools.
3. Data validity in the OVCMIS was not acceptable in many service provision sites visited, with only
44% of data being valid and complete. Other data quality challenges noted include:
 Unexplained sources of reports: a few sites rejected data in the OVCMIS and claimed to never have
reported.
 Little or no documentation, although the site had submitted a report.
 Heavy reliance on electronic databases with no primary forms.
 Some service providers reported pre-identified clients as served before actual services were
provided; thus, in some service points, data sources ended up being pre-identification and enrolment
forms rather than service registers.
 Some sites providing HIV services reported children under HIV-positive care without documentation
of additional services.
 Dialogue meetings with caregivers extrapolated to the OVC; for example, birth registration
sensitization meetings reported as birth registration for the individual OVC.
4. Limited clarity of data management processes, with only 59% of OVC service providers having good
data management processes.
5. Gross data disparity between data in the OVCMIS and primary tools: 73% of OVC service points
showed gross disparity of over +/-10% compared with data in source documents. Additionally:
 Data in the OVCMIS was 32% more than the data counted on site for the indicator "total individual
OVC served".
 The majority of the data discrepancies were introduced during data entry into the OVCMIS; the
DQAI noted likely transcription errors during data entry.
 Two sites rejected data in the OVCMIS and claimed to never have reported.
6. Generally, poor use of data for decision-making was noted; only 55% demonstrated evidence of
data use.
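The disparity figures cited above (the +/-10% tolerance and the 32% over-reporting) suggest a simple computation, sketched below under the assumption that the register recount on site is the reference value; the report does not spell out the exact formula used by the DQAI teams.

```python
def disparity_pct(reported, recounted):
    """Percentage by which the OVCMIS-reported figure departs from the
    register recount, with the recount treated as the reference.
    Returns None when there is nothing on site to verify against."""
    if recounted == 0:
        return None
    return 100.0 * (reported - recounted) / recounted

def flag_gross_disparity(reported, recounted, threshold=10.0):
    """True when the absolute disparity exceeds the +/-10% tolerance
    (the threshold used in this DQAI), or when it cannot be computed."""
    d = disparity_pct(reported, recounted)
    return d is None or abs(d) > threshold
```

For example, a site that reported 132 OVC served against a recount of 100 has a +32% disparity and would be flagged, while a site reporting 105 against 100 falls within the tolerance.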

LIMITATIONS OF THE OVC DQAI


The following limitations were encountered during the DQAI;
• A few of the sampled service providers had not started providing services yet they had reported
through the OVCMIS and were therefore sampled for this DQAI; thus data validity could not be tested.
• One site had closed office yet it had reported through the OVCMIS and was hence in the sample,
introducing a 1% error to the findings (due to the lower response rate).

• Poor record keeping made tracing of data sources difficult; in a few cases, data without
completed primary data collection tools could not be accurately categorized by reporting period
and,
• Retrieval of records on OVC in refugee camps in Kiryandongo district was impeded by access
rights to refugee data; the DQAI had not obtained the requisite permission from the Office of
the Prime Minister, which is in charge of refugee data management.

STRENGTHS OF THE OVC DQAI


The following factors made the DQAI exercise possible;
 Leadership by the MGLSD ‘opened doors’ at all levels; the introductory letters and active
presence of the Ministry increased the interest, understanding and participation of stakeholders.
 Involvement and commitment of IPs and funders of OVC service providers was at its optimum,
and this added value to the exercise.
 The systematic/scientific process of randomization (sampling) of participating OVC service
providers makes the results representative of the national picture.
 An impressively high response rate: 69 of 70 (99%) OVC service providers targeted were
reached, although assessments were conducted in 64 of 70 (91%) of the OVC service providers
sampled.
 The appreciative approach not only helped in collecting data but also improved the capacity of
the OVC service providers visited.

RECOMMENDATIONS
In order to improve the quality of data in the OVCMIS, it is crucial to undertake the following:
Orient all OVC service providers and stakeholders to appreciate, understand and use the MGLSD
revised OVCMIS data management tools. This calls for continuous support supervision, mentoring and
coaching for adherence to usage of data management tools. During coaching and mentorship,
regularly tracked indicators, for instance child graduated, child served and child provided with
psychosocial support, should be clarified for collective interpretation.

Routinely undertake gap analyses and encourage stakeholders to print and distribute sufficient stock
of the revised data management tools, especially the integrated OVC register (Form 004) and the
quarterly reporting tool (Form 100), in the already existing formats.

There is a need to popularise, enforce and customise the use of national data collection tools
through appreciation of, use of and adherence to standard procedures, including guidance on
indicator definitions, particularly how to track, compute and report key indicators.

Additionally, refocus data quality improvement efforts to reduce errors during data collection,
transcription and entry, and clarify data management processes and expectations for improved data
use to strengthen decision support systems.

Importantly, there is a strong desire to reward OVC service providers that report accurate, timely
and complete data; this calls for strategies to eliminate data manipulation and to recognise service
providers that meet the dimensions of data quality.

In the medium to long term, there is a need to standardize the electronic (computerised) databases in
use by different actors and harmonise them to capture the same variables as in the integrated OVC
register. This automation of tools (particularly Form 004) should allow direct upload of primary data
from the register into the OVCMIS (online database). Provided data confidentiality guidelines can be
met, this innovation will significantly support generation of accurate reports based on primary
sources uploaded online, with little or no transcription errors arising during data summarisation.
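A standardized upload path would also allow automated checks against the core data quality dimensions the report emphasises (completeness, validity, timeliness). The sketch below is illustrative only: the field names and rules are hypothetical and do not reproduce the actual Form 100 layout.

```python
from datetime import date

# Hypothetical field names; the actual Form 100 layout is not
# reproduced in this report.
REQUIRED_FIELDS = ("district", "service_provider", "quarter",
                   "ovc_served", "ovc_graduated")

def check_report(record, deadline):
    """Return a list of data-quality problems for one quarterly report:
    completeness (no missing required fields), validity (non-negative
    counts, graduated <= served) and timeliness (submitted by deadline)."""
    problems = []
    for f in REQUIRED_FIELDS:
        if record.get(f) in (None, ""):
            problems.append(f"missing field: {f}")
    served = record.get("ovc_served")
    graduated = record.get("ovc_graduated")
    if isinstance(served, int) and served < 0:
        problems.append("ovc_served is negative")
    if isinstance(served, int) and isinstance(graduated, int) and graduated > served:
        problems.append("ovc_graduated exceeds ovc_served")
    if record.get("submitted_on") and record["submitted_on"] > deadline:
        problems.append("submitted after deadline")
    return problems
```

Checks of this kind, run at upload time, would surface the kinds of errors the DQAI found (incomplete reports, implausible counts, late submission) before the data enters the OVCMIS.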

1.0 INTRODUCTION
1.1 BACKGROUND TO OVC PROGRAM DATA MANAGEMENT
Children constitute 55% of Uganda’s total population of 34.8 million people (UNICEF, 2015¹ and
UBOS, 2014²). This amounts to an estimated 19 million children, of whom 8% are orphans and close to
43% are critically or moderately vulnerable (UBOS, 2014 and MGLSD, 2011). The national household
survey of 2010 (UBOS, 2010³) noted that 14% of children are orphaned and about 45.6% of this
orphan-hood is attributed to HIV/AIDS, with 105,000 children in the age category of 0-14 years being
HIV positive.
Uganda is among the top ten countries in the world with high maternal, new-born and child mortality
rates (UNICEF, 2015). Indicators of child poverty include undernutrition, which stands at 33%; limited
access to clean water and sanitation, which stands at 30%; child trafficking, which is a major concern;
and poor quality of education and high school dropout rates, among many other unmet needs. The
OVC situation analysis study of 2010 noted that at least one in every four households has an orphan
and three (3) million children live below the poverty line (MGLSD, 2011)⁴.

Orphan-hood in Uganda remains a big challenge, with the proportion of children that are orphaned
increasing from 11.5% in 1999/2000 to 13.4% in 2002/2003 and 14.8% in 2005/2006, although in
2009/2010 the magnitude reduced slightly to 14% (MGLSD, 2010)⁵. It is against this background that
the National Strategic Programme Plans of Interventions (NSPPI-1 and 2) were developed to guide the
efforts of all OVC service providers aimed at alleviating the plight of OVC.

Government, through the Ministry of Gender, Labour and Social Development (MGLSD), and
Implementing Partner (IP) programs are addressing the plight of critically and moderately vulnerable
children as stipulated in the national OVC policy and the second National Strategic Programme Plan of
Interventions (NSPPI, 2010/11-2015/16)⁶.

The NSPPI is built on the National Development Plans, Social Development Sector Strategic
Investment Plans as well as the investment plans of other line government ministries. It provides a
framework for effective and coordinated response to child vulnerability and, articulates strategies for
implementation of comprehensive, effective and quality services.

To measure the extent of reach to address child vulnerability, the MGLSD established the Orphans
and Other Vulnerable Children Management Information System (OVCMIS) as a source of routine
data. The OVCMIS is a web based system accessible through the portal; http://ovcmis.mglsd.go.ug/.
Since 2007, it has been used by OVC program planners and implementers both within and outside
government to routinely report data on indicators for monitoring interventions. The resultant online
reports are analysed to determine the degree of access to comprehensive services or lack thereof.

In 2014⁷, the MGLSD commissioned an assessment of the OVCMIS to identify strengths, gaps and
what needed to be improved. Findings from the assessment highlighted factors that affect quality of
data, including lack of data completeness, validity, reliability and timeliness. Recommendations were

1. UNICEF, 2015. Situational Analysis of Children in Uganda.
2. UBOS, 2014. National Population and Housing Census 2014 Provisional Results.
3. UBOS, 2010. Uganda National Household Survey, 2009/2010.
4. MGLSD, 2011. National Strategic Programme Plan of Interventions for OVC, 2011/12-2015/16.
5. MGLSD, 2011. National Strategic Programme Plan of Interventions for OVC, 2011/12-2015/16.
6. MGLSD, 2011. National Strategic Programme Plan of Interventions for Orphans and other Vulnerable Children, 2011/12-2015/16.
7. MGLSD, 2015. Assessment of the Uganda National Orphans and other Vulnerable Children Management Information System (OVCMIS).

made to address short and long term goals, among others: revamp the OVCMIS to make it user
friendly, update program indicators, and develop user manuals and procedures to assist users to
navigate the system.

The MGLSD, working with the Monitoring and Evaluation of Emergency Plan Progress (MEEPP-II)
Project and other stakeholders, embarked on revamping the database, revising data management
tools and developing standard operating procedures (SOPs) for data collection and processing.
Additionally, the MGLSD improved tools for child vulnerability identification, prioritization,
assessment, enrolment, monitoring and graduation (Table 1).

Table 1: Tools for Identification, Enrolment, Service Provision, Monitoring and Graduation

CODE | REVISED OVCMIS TOOL | OLD OVCMIS TOOL
OVCMIS Form 005 | OVC Pre-Identification and Registration Form | Child Status Index (CSI) and OVC Vulnerability Index (OVI)
OVCMIS Form 006 | Uganda OVC Vulnerability Prioritization Tool | CSI and OVI
OVCMIS Form 007 | Household Vulnerability Assessment Tool | CSI, OVI, Child Enrolment Card
OVCMIS Form 008 | Child Enrolment and Monitoring Card | CSI, OVI and Child Enrolment Card
OVCMIS Form 004 | Integrated OVC Register | OVC Register, OVC Service Register, Case Management Book, Referral Register and Exit Register
OVCMIS Form 009 | Referral Form for OVC | Referral Form (Register)
OVCMIS Form 100 | OVCMIS Quarterly Report | OVCMIS Quarterly Report

Following the upgrade of the OVCMIS, the MGLSD, in partnership with stakeholders, embarked on extensive popularisation and dissemination of the revamped OVCMIS to users at local and lower local government levels. As part of the rollout, regional and district level orientation meetings were held for OVC service providers to build capacity to use the OVCMIS and report quality data. As a result, the system's data completeness and usability have greatly improved, and it is now used by stakeholders including the President's Emergency Plan for AIDS Relief (PEPFAR) to report OVC program implementation.

1.2 THE OVC DATA QUALITY ASSESSMENT AND IMPROVEMENT (DQAI)


The OVCMIS is increasingly being used as a source of data for decision and policy making. However, no national DQAI on OVC had ever been undertaken and, despite the improvements in the OVCMIS and improved reporting by service providers, the internal data quality check report in the database showed glaring evidence of data inconsistencies in the January-March 2016 quarter report, for instance missing and incomplete reports and incorrectly disaggregated data. This partly implied that data in the OVCMIS was not of the required quality to be relied upon.

The January-March 2016 reporting period was notably the first in which PEPFAR extracted its Quarter II programme reports from the OVCMIS, an additional reason for the DQAI to increase trust in the national system and gauge service reach to vulnerable persons.

2 | Uganda Country Report - Quality of Data Reported by OVC Service Providers, January-March 2016
This first ever national OVCMIS DQAI focused on OVC service providers supported by PEPFAR through
Centres for Disease Control and Prevention (CDC), Department of Defence (DOD), Department of
State (DOS) and United States Agency for International Development (USAID). OVC service providers under non-PEPFAR support were also assessed, among them Compassion International, Samaritan's Purse, Save the Children and World Vision.

1.3 RATIONALE FOR THE OVC DATA QUALITY ASSESSMENT AND IMPROVEMENT (DQAI)
To build stakeholder confidence in and acceptance of OVCMIS data for decision making, it is essential to guarantee data quality. Stakeholders that currently use the OVCMIS for program reporting are required to conduct routine DQAIs to inform program planning, monitoring and performance management.

Regular DQAIs help to ascertain the quality of data, identify data quality issues, strengthen data management processes, and increase trust and confidence in data used for decision making (Sanjay, 2010⁸; WHO, 2010⁹; Bergdahl et al., 2007¹⁰; Rouse, 2005¹¹). OVC data quality improvement ensures that the data collected are accurate, reliable and measure what they are intended to measure.

The MGLSD is mandated to conduct DQAIs on selected OVC indicators in order to validate quarterly and annual reports. The DQAI was aimed at improving data quality and eliminating the errors identified. The Ministry expects OVC data to be reliable, accurate, complete, timely and valid, and to maintain confidentiality.

It is in this context that the MGLSD's OVC Monitoring and Evaluation Technical Working Group (OVC M&E TWG) recommended a DQAI to verify the quality of data reported during the January-March 2016 quarter and to assess data management systems.

In July 2016, the MGLSD, with support from PEPFAR, conducted a DQAI in 70 OVC service provision sites in 25 districts, sampled based on the January-March 2016 data entered into the OVCMIS. The exercise ran from 4th to 15th July 2016, covering the nine MGLSD technical support zones: Central-1, Central-2, East-central, Eastern, North-central, North-eastern, North-western, South-western and Western.

1.3.1 OBJECTIVES OF THE OVC DQAI


1.3.1.1 OVERALL OBJECTIVE OF THE OVC DQAI
The main objective was to improve data management systems for OVC programs in Uganda.

1.3.1.2 SPECIFIC OBJECTIVES OF THE OVC DQAI


The specific objectives were to:
1.) Assess M&E systems for OVC programs at service delivery and district levels.
2.) Validate OVC data reported during the January-March 2016 quarter.
3.) Identify challenges faced in collecting, recording, collating and reporting OVC data.
4.) Verify services provided to OVC and their households during January-March 2016 quarter.

8 Sanjay Seth, 2010. Data Quality Assessment Review. http://www.sas.com/content/dam/SAS/en_us/doc/servicebrief/data-quality-assessment-106301.pdf (accessed 27th June 2016)
9 http://ec.europa.eu/eurostat/documents/64157/4373903/05-Handbook-on-data-quality-assessment-methods-and-tools.pdf/c8bbb146-4d59-4a69-b7c4-218c43952214
10 http://searchdatamanagement.techtarget.com/feature/Dissecting-data-measurement-Key-metrics-for-assessing-data-quality
11 Handbook on Data Quality Assessment Methods and Tools (2007). Wiesbaden, European Commission.

1.3.2 THE MGLSD OVC DQAI CONCEPTUAL FRAMEWORK
The conceptual framework for the OVC DQAI is illustrated in Figure 1 (below). Generally, the quality of
data is dependent on the underlying data management and reporting systems with the assumption
that stronger systems should produce better quality data.

In other words, for good quality data to be produced by and through a data management system, key
functional components need to be in place at all levels of the system including the points of service
delivery, the intermediate level(s) where the data are aggregated (for instance; service points, sub-
counties and/or districts), and the M&E unit at the highest level (the OVC National Implementation
Unit) to which data are reported.

The DQAI tools are therefore designed to: 1) verify the quality of the data; 2) assess the system that produces that data; and 3) develop action plans to improve both the quality of the data and the system that produces it.

Figure 1: Conceptual Framework for the OVC DQAI: Data Management and Reporting Systems,
Functional Areas and Data Quality

2.0 METHODOLOGY
2.1 STUDY DESIGN
The DQAI employed a descriptive cross-sectional design using qualitative and quantitative data collection approaches. A cross-sectional design was appropriate because it enabled data to be collected from various sources for the same purpose at a single point in time from a sample of the population (Creswell, 2014)¹².

2.2 PROGRAM INDICATORS TRACKED


The DQAI focused on two indicators tracked during the reporting period (January-March 2016);

1) Number of individuals served. This composite indicator measures the national response to critically and moderately vulnerable individuals. It looks at coverage of services across all the core program areas (CPAs). It is reported quarterly in the OVCMIS, disaggregated by 1) gender (male, female) and 2) age (under 1 year, 1-4, 5-9, 10-14, 15-17, 18-24 and 25+ years).
2) Number of individual children graduated. This indicator looks at how many of the served OVC have been empowered to move out of vulnerability and can compete with non-OVC. It is reported quarterly in the OVCMIS, disaggregated by 1) gender (male, female) and 2) age (under 1 year, 1-4, 5-9, 10-14 and 15-17 years).

2.3 STUDY POPULATION, SAMPLE AND SAMPLING METHODOLOGY


The assessment population included all sites that reported OVC services in the OVCMIS for the
January-March 2016 period. A stratified random sampling design with a proportionate sample size
allocation was used. To compute the number of OVC service providers to be sampled, it was assumed
that at least 60% of the service providers had favourable M&E systems and practices, an acceptable margin of error between reported data and data in source documents, and reliable data collection, processing and management systems.

Precise estimation of the proportion of service providers (P) with favourable practices within a 12% margin of error (e) at 95% certainty was computed using a simple formula (Kish, 1965)¹³:

n = Z² P(1 − P) / e² = (1.96² × 0.6 × 0.4) / 0.12² ≈ 64

where Z = 1.96 is the standard normal value for 95% confidence.
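The calculation can be reproduced with a short script (a sketch; the function name is ours, and the result is rounded to the nearest integer to match the reported n = 64):

```python
def kish_sample_size(p, e, z=1.96):
    """Minimum sample size for estimating a proportion p within a
    margin of error e, at the confidence level implied by z
    (z = 1.96 for 95% confidence), rounded to the nearest integer."""
    return round(z ** 2 * p * (1 - p) / e ** 2)

# Parameters used for the DQAI provider sample: P = 0.6, e = 0.12
print(kish_sample_size(p=0.6, e=0.12))  # 64
```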
To ensure representation of the nine regions and IPs across the different development agencies, an
adjusted total of 70 OVC service providers were sampled. The sampled OVC service providers
contributed nearly 25% of the outputs for the January-March 2016 quarter.

For the OVC level verification of data, a two-stage cluster design was superimposed on the stratified random sample design for the DQAI. Assuming that at least 90% of the children received the listed services, a sample size of 216 OVC was needed to verify this within a 5% error margin at 95% certainty. In total, 1,590 OVC service providers reported data in the January-March 2016 quarter. In each region, the service providers were listed and sampled using probability proportional-to-size sampling. These were expected to be visited for the DQAI.
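As an illustration of probability proportional-to-size selection, the sketch below uses the cumulative-size (systematic) method; the provider names and service counts are hypothetical, and the function is ours rather than the actual sampling tool used in the DQAI:

```python
import random

def pps_systematic(units, sizes, n):
    """Select n units with probability proportional to size (PPS),
    using the cumulative-size (systematic) selection method."""
    total = float(sum(sizes))
    interval = total / n                 # sampling interval
    start = random.uniform(0, interval)  # random start within first interval
    targets = [start + k * interval for k in range(n)]
    chosen, cum, i = [], 0.0, 0
    for unit, size in zip(units, sizes):
        cum += size
        # a unit is selected for every target falling in its size band
        while i < n and targets[i] <= cum:
            chosen.append(unit)
            i += 1
    return chosen

# Hypothetical example: five providers, with the number of OVC
# served in the quarter as the size measure.
random.seed(42)
providers = ["SP-A", "SP-B", "SP-C", "SP-D", "SP-E"]
served = [500, 120, 80, 300, 1000]
print(pps_systematic(providers, served, n=2))
```

Larger providers are more likely to be drawn, which is why the sampled 70 providers could contribute nearly a quarter of the quarter's outputs.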

12 Creswell, J. W., 2014. Research Design: Qualitative, Quantitative and Mixed Methods Approaches. 4th Edition. Thousand Oaks, CA: SAGE Publications.
13 Kish, L., 1965. Survey Sampling. New York: Wiley.

Engagement of vulnerable children required Institutional Review Board (IRB) approval, a process that could not be accelerated to fit within the assessment schedule; household visits to engage the 216 sampled beneficiaries therefore did not take place.

The sample size of service providers was allocated proportionately across the geographical regions according to the number of OVC served during the quarter.

2.4 DATA SOURCES, TOOLS AND COLLECTION


The standard data sources for the two indicators, number of individuals served and number of individual OVC graduated, are available at service provider level. The two indicators are captured as part of routine service provision using client cards and registers (for example, Forms 004 and 007). These data are summarized quarterly in the standardized quarterly reporting form (Form 100), which is either submitted directly by the service provider into the OVCMIS or sent to the district and entered by the District OVCMIS Focal Person. One copy is kept at the service provider's office, another is taken to the sub-county of operation, and the original is submitted to the district.

Table 2: OVC DQAI Indicators and their Data Sources

INDICATOR                            DATA SOURCES
Number of individuals served         Form 004: OVC Integrated Register; Form 100: OVCMIS Quarterly Report; OVC Service Register; Case Management Book; Form 007: Household Vulnerability Assessment Tool; Client Files
Number of individual OVC graduated   Form 004: OVC Integrated Register; Form 100: OVCMIS Quarterly Report; Form 007: Household Vulnerability Assessment Tool; Client Files

Data collection was conducted using semi-quantitative structured questioning, guided interviewing, observation, and review of pre-primary, primary and secondary data sources. The checklists were a customized version of the MEASURE Evaluation Routine DQA (RDQA) tool (Annex 1).

The RDQA tool was automated in Microsoft Excel, with dashboards providing quick analysis and feedback. Hardcopies were printed to allow use where electronic copies were not practical. Findings for each service provider were documented in the Microsoft Excel templates.

The RDQA tool has the following templates;


1) Service provider identification data.
2) Document review template.
3) Data validation template
4) M&E systems assessment template, which assesses the following components:
i. M&E structure, functions and capabilities;
ii. Understanding of indicators and reporting guidelines;
iii. Availability of data-collection tools and reporting forms for OVC programs;
iv. Data management processes, including data quality assurance; and

v. Use of data for decision making.
5) Household OVC (service) verification tool.
6) A feedback form for summarizing findings of the DQAI and providing feedback to service provider staff on the preliminary findings.
7) Action plan template for prioritizing the main areas of weakness and developing action plans to improve data quality.

2.5 ORIENTATION OF DQAI TEAMS, PILOT TESTING OF TOOLS AND FIELD PROCEDURES
All teams were trained during a three-day workshop conducted by the MGLSD in collaboration with MEEPP, METS and OVC supporting agencies (Annex II). The orientation comprised didactic sessions on the first day covering:
• Introduction to the DQAI concept, background, rationale and priority indicators.
• Tools, processes and mechanisms for routine data collection, reporting and data flow.
• DQAI methods and service provider procedures: etiquette and confidentiality.

The orientation was followed by one-day pilot assessments at six service providers in Mukono and Kayunga districts. The piloted service providers were not part of the main assessment sample, apart from one service provider in Mukono district that also fell within the main sample.

On the third day of the training, lessons learned were shared, data collection instruments were improved accordingly, and field teams held breakout sessions to develop their field schedules and finalize deployment plans and data collection logistics.

2.6 FIELD DATA COLLECTION


Under the guidance of the Ministry, two central coordinating teams (CCTs) and several field teams were formed. The CCTs collated service provider reports into a common Microsoft Excel OVC DQAI database, ensured completeness of field reports, and coordinated field teams so that all selected service providers were appropriately visited. Each field team had at least four people: a team leader and field officers. Teams comprised staff from the MGLSD, MEEPP, METS, IPs and district staff involved in service provision and supervision (Annex IV).

The MGLSD notified districts and IPs, who in turn coordinated visits to service points. Field teams visited districts, held introductory discussions with district leads, and shared the objectives of the DQAI with the District Community Development Officer (DCDO) and/or Probation and Social Welfare Officer (PSWO). The teams then agreed on the itinerary for visiting the sampled service points.

For the service provider visits, the DQAI teams were joined by the PSWO as a guide. At the service outlet level, team leaders introduced the objectives of the assessment to service provider management and service provision staff, and management allocated appropriate personnel to support the DQAI processes on site. The following actions took place at each service provider:

1) Filled the service provider identification data tool.


2) Conducted a key informant interview, made observations and completed the M&E systems
assessment tool.
3) Reviewed the documents used in the January-March 2016 quarter for OVC reporting and
filled the Documents Review Template.

4) Validated the data reported in the January-March 2016 quarter and filled the Data Validation Template through a manual recount of outputs, to determine the level of accuracy, that is, the degree to which submitted results compared with the validated count at the service provider.
5) Summarized key findings from the M&E systems assessment, data validation and household verification into the DQAI feedback form and provided feedback to service provider staff on the preliminary findings.
6) Together with the service provider, district and IP, prioritized weaknesses and documented actions for improving data quality in the Action Plan Template.

In conducting this DQAI, a Consent Form was signed and a Confidentiality Agreement was upheld. Efforts were made to maintain confidentiality and etiquette in order to protect privacy and to adhere to the basic ethical principles of research: justice, beneficence and respect for persons. The Assessment Team did not photocopy or remove documents from any service provider.

From conception to execution of the OVC DQAI, attention was paid to the guiding principles of working with children (UNICEF, 2014¹⁴). In all circumstances the following are key:

1) Best interest of the child is a primary concern in decision making regarding children, Article 3.
2) Do no harm.
3) Maintain confidentiality.
4) Pursue non-discrimination.
5) Promote right to life, survival and development, Article 6.
6) Promote right to participation by seeking child’s opinion, Article 12.

2.7 DATA MANAGEMENT; DATA ENTRY, CLEANING AND ANALYSIS


Data collected were entered at the point of collection into an electronic Microsoft (MS) Excel database formatted to manage both numerical and text values. A hardcopy backup was filled in as well. The templates had coded options to ensure accurate entry, and dashboards that eased feedback to service providers.

Completed service provider reports/templates were submitted to the CCTs each day for preliminary review, and any inconsistencies detected were referred back to the field teams for correction. Data analysis was conducted at two levels: preliminary analysis at site level and analysis of pooled data at central level:
A. Preliminary analysis at service provider level was automated to ease feedback. Dashboards in MS Excel spreadsheets summarised statistics with automated colour codes identifying areas of strong or weak performance. In addition, formulae to calculate summary statistics, such as ratios between reported values and manual recounts or composite scores, were built in (green when results are adequate, with disparity within ±5%; yellow when disparity is at an acceptable level of ±5-10%; and red when disparity exceeds ±10%). Preliminary service provider level analyses were used for feedback, especially to identify areas in need of corrective action.
B. For analyses at central level, data from all DQAI sites were pooled in MS Excel. Key variables for each service provider were transferred to composite sheets in which each row denotes a service provider and each column a variable.

For pooled data, analyses were conducted to address the specific objectives:

14 UNICEF, 2014. Convention on the Rights of the Child. http://www.unicef.org/crc/index_30177.html

1) Assess M&E systems for OVC programs at service delivery and district levels.
Service provider level data were listed by district and service point. Outputs were tabulated with colour codes that show a service provider's performance at a glance. Comments were summarized under themes to explain the assessment findings.
2) Validate OVC data reported during the January-March 2016 quarter.
Findings were summarized showing: availability and completeness of data collection tools; levels of disparity between OVCMIS data and register counts; levels of disparity between service provider level report data and register counts; and levels of disparity between report data filed at the district and register counts.

Levels of data disparity were computed as follows:

Percentage disparity = (Result in Report Under Review − Joint Recount Figure) / Joint Recount Figure × 100
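The disparity computation, together with the colour bands used in the dashboards (green within ±5%, yellow ±5-10%, red beyond ±10%), can be sketched as follows; the figures in the example are hypothetical:

```python
def percentage_disparity(reported, recount):
    """Percentage disparity between a reported figure and the
    joint recount figure, as defined in the DQAI methodology."""
    return (reported - recount) / recount * 100

def colour_code(disparity):
    """Classify a disparity using the dashboard bands:
    green  : within +/-5%  (adequate, no meaningful disparity)
    yellow : within +/-10% (acceptable level of disparity)
    red    : beyond +/-10%"""
    d = abs(disparity)
    if d < 5:
        return "green"
    if d <= 10:
        return "yellow"
    return "red"

# Hypothetical example: a provider reported 430 OVC served,
# but the joint recount found 400.
d = percentage_disparity(430, 400)
print(round(d, 1), colour_code(d))  # 7.5 yellow
```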
3) Verify services provided to OVC and their households during the January-March 2016 quarter.
This objective was not met since IRB approval could not be obtained in time.
4) Identify challenges faced in collecting, recording, collating and reporting OVC data.
A list of challenges was compiled from the key informant interview responses.

3.0 FINDINGS
3.1 DESCRIPTIVE CHARACTERISTICS OF OVC SERVICE PROVIDERS ASSESSED

A. GEOGRAPHICAL SCOPE

A total of 70 service providers were sampled in 25 districts for the DQAI. However, data was obtained
from 64 service points (refer to Annex V).

FIGURE 2: MAP OF UGANDA SHOWING DISTRIBUTION OF SITES BY REGION

Region          Districts
North Eastern   Kaabong
North Central   Kitgum, Lira
North Western   Arua, Nebbi
East Central    Buikwe, Kayunga
Eastern         Bukedea, Mbale, Tororo
Western         Kabarole, Kasese, Kiryandongo
South Western   Bushenyi, Mbarara, Rukungiri
Central 1       Kampala, Luwero, Mukono, Wakiso
Central 2       Kalungu, Kyankwanzi, Lwengo, Masaka, Rakai

Six OVC service providers could not be assessed, for the following reasons:

1) Reach the Children in Mukono district had closed its office (ended operations) and thus was not assessed.
2) Onward in Buikwe district denied the DQAI team access to files, with the administrator noting that Onward was ceasing operations because children under its support had dropped out of school even when provided with scholastic materials. Onward planned to hand over operations to the Sisters of St. Joseph Child to Youth Development Foundation. The DQAI team visited Sisters of St. Joseph; however, the replacement was outside the sample and its report could not be merged.
3) Interaid in Kiryandongo district required permission from the Office of the Prime Minister, which could not be obtained in time for the DQAI; it was thus not assessed.
4) Our Lady of Peace Children's Ark in Luwero district was assessed, but the data was mishandled and could not be merged with the other datasets for timely analysis.
5) Kabarole Hospital in Kabarole district was not assessed as the OVC focal point person was on sick leave and no one else could attend to the DQAI team. The team instead visited Child Development Fund, but this was outside the sample and its data could not constitute part of the larger sample.
6) Baylor in Kabarole district was not assessed as the team failed to schedule a meeting with the OVC focal person. The office was closed and efforts to reach the OVC focal point person were in vain.

While the team visited 70 sites for assessment, six (6) of them could not be thoroughly assessed due to:

1) Absence of eligible OVC interventions; the case of SVI in Kaabong district, which generally reaches out to vulnerable households and communities but does not distinguish beneficiaries as OVC or non-OVC, calling for guidance from the district to streamline the definition of service.
2) Transfer of files to a newly recruited CSO that was not part of the sample; the case of two sites under TPO Uganda in Bushenyi and Rukungiri districts, whose files were transferred to ICOBI; ICOBI could not be assessed since it was outside the sample.

3.2 M&E SYSTEM ASSESSMENT
Of the 64 sites assessed, 59% (38) had functional M&E systems (Figure 3). Detailed analysis of M&E system functionality showed that 71% of the sites assessed had the M&E structure, functions and capabilities to handle OVC information; 63% had staff who understood the indicator definitions and reporting guidelines; and 47% had data collection and reporting tools available. Results also indicated that 27% were not using their data for decision making.

Figure 3: Overall Rating of M&E Systems Functionality

3.2.1 M&E STRUCTURE, FUNCTIONS AND CAPABILITIES TO HANDLE OVC INFORMATION


Under this component, the following indicators were assessed:

Indicator I: Existence of person(s) responsible for recording service delivery data
Indicator II: Processes exist to ensure that data compilation and reporting are completed in the event that the responsible staff member is not available
Indicator III: Existence of designated staff responsible for reviewing reports prior to submission
Indicator IV: Feedback process about the quality of data (reports)
Indicator V: Receipt of regular OVC support supervision according to the guidelines
Indicator VI: Person(s) responsible for collecting and reporting OVC data have been trained in record keeping and reporting
Figure 4: Performance of CSOs on M&E Structure, Functions and Capabilities

A. INDICATOR I: EXISTENCE OF PERSON(S) RESPONSIBLE FOR RECORDING SERVICE DELIVERY DATA
In 76% of service providers (Figure 4), the responsibility of recording service delivery data as part of M&E processes is well organised, managed and resourced, while in 6% the responsibility of recording service delivery information in the registers is not clearly assigned to the relevant staff and is not pursued to create a supportive team culture.

Among the 76% of service providers, the chain of data flow from the community to the M&E unit was well spelt out. In some cases it involves para-social workers (volunteers) who work under the guidance of the M&E unit. The para-social workers are oriented on applying the vulnerability identification, prioritisation, assessment, enrolment, monitoring and graduation forms; they collect and submit data to the M&E unit for validation and synthesis.

B. INDICATOR II: PROCESSES EXIST TO ENSURE THAT DATA COMPILATION AND REPORTING IS
COMPLETED IN THE EVENT THAT THE RESPONSIBLE STAFF IS NOT AVAILABLE TO DO THE JOB
The assessment noted that 68% of the 64 OVC service points practice a teamwork protocol that ensures data compilation and reporting are completed when the responsible staff member is absent (Figure 4).

In 16% of the service providers, a shortage of human resources was noted as a deterrent, with the available staff swamped with competing tasks that compromise the delivery of quality data. In some instances, weak organisational capacity, compounded by a high turnover rate, meant some duties were executed by interns who were not sufficiently oriented or motivated to commit to the heavy workload and multi-tasking.

C. INDICATOR III: EXISTENCE OF STAFF FOR REVIEWING REPORTS PRIOR TO SUBMISSION


Review of documents, including reports, is a conditional requirement guided by institutional policies and is purely a management role, as noted by the OVC service providers visited. Notably, 77% of the 64 service providers visited have procedural units responsible for reviewing reports prior to submission. Additionally, the OVCMIS quarterly report (Form 100) requires that a supervisor reviews and appends a signature before submission, and signed reports were evidently available.

D. INDICATOR IV: FEEDBACK PROCESS ABOUT THE QUALITY OF DATA (REPORTS)


Feedback on reports was regarded as greatly enriched by quarterly regional review meetings and, in some instances, district-based quarterly review meetings. However, onsite feedback from the district on the quality of reports submitted by OVC service points was found lacking, although 63% of service providers reported full receipt of feedback on submitted reports (Figure 4). This 63% is largely owed to partners that give feedback on reports as part of contractual obligations with funders.

E. INDICATOR V: SUPPORT SUPERVISION TO THE SERVICE PROVIDER


Support supervision and data auditing were done concurrently, to transfer the knowledge, attitudes and skills essential for successful M&E of the OVC response and to audit data quality. Among the 60% of the 64 service providers that reported receiving supervision, it was largely done by IP staff and only infrequently by the PSWO (Figure 4).

The problem of insufficient supervision by national, district and IP supervisors is compounded by poor documentation of what little support is given, with only a few OVC service providers providing evidence of action point papers following supervision events.

In some cases, the only proof of supervision was a signed visitors' book, which is inadequate. For districts, supervision from the national level ended with the Technical Support (TSO) approach under the SUNRISE-OVC project. This gap cascaded below the national level, with districts also blamed for not adequately providing feedback on reports submitted by CSOs.

F. INDICATOR VI: TRAINING OF STAFF IN RECORD KEEPING AND REPORTING


During the last five years of implementation of the NSPPI-2, the MGLSD through partner projects
including MEEPP and SUNRISE-OVC improved technical and organisational capacities of OVC service
providers through practical orientation on OVC related concepts.

In 79% of the 64 service providers assessed, staff responsible for record keeping and reporting had been oriented on the OVCMIS data management tools and have in turn cascaded the knowledge to other staff in their workplaces.

In order to sustain efforts, the MGLSD, districts and IPs need to continually mentor users to
appreciate national tools, encourage usage and adherence to standard procedures for OVCMIS data
management.

3.2.2 UNDERSTANDING OF INDICATOR DEFINITIONS AND REPORTING GUIDELINES
In order to measure the understanding of indicator definitions and reporting guidelines among OVC
service providers, the following indicators were assessed;

1) Whether staff understand what should be recorded in register,


2) Whether staff understand what should be included on the OVC quarterly report,
3) Whether staff understand to whom and when the reports should be submitted,
4) In case of errors, does the service provider know how to communicate and effect changes in a
report that was previously submitted to the district,
5) Are the written instructions adequate to ensure standardized recording and reporting of OVC
data?
6) Do the staff understand how to derive the variables for the indicator number of OVC served?
7) Do the staff understand how to derive the variables for the indicator number of OVC graduated?

Below are the findings;

Figure 5: Understanding of Indicators and Reporting Guidelines

With the completion of enhancements and rollout of the OVCMIS data management tools, 63% of OVC service providers demonstrated complete understanding of indicator definitions and reporting guidelines (Figure 3). However, close to 35% of the 64 OVC service providers neither knew what should be recorded in OVC source documents (register, Form 004) nor understood what should be included in the OVC quarterly report (Form 100), suggesting that the service provider staff interviewed had probably not been oriented on the improved tools (Figure 5).

The OVCMIS Helpdesk and continued technical assistance during district and regional level events have increased understanding of and adherence to the SOP for reporting, as demonstrated by the 78% of service providers with a very good understanding of whom reports should be submitted to and of reporting timelines. However, the SOPs were largely inadequate, with only 59% appreciating the adequacy of instructions on data recording and reporting.

The most prevalent gaps were in the computation of variables for determining the number of OVC
served and graduated. This calls for additional attention to the rollout of the Indicator Booklet to
increase conversance with the indicator definitions and reporting guidelines.

3.2.3 AVAILABILITY OF DATA-COLLECTION TOOLS AND REPORTING FORMS FOR OVC SERVICES
Under this component, the following were assessed;

1) Availability of national forms and tools for data collection and reporting,
2) Consistency in use of forms and tools,
3) Partner acceptance of national forms and tools,
4) Sufficient stock of blank forms and tools,
5) Refill plan for used up forms and tools and,
6) Relevance of variables in existent forms and tools.

Below are the findings:

A. AVAILABILITY OF TOOLS

Upon upgrade and popularisation of the OVCMIS improvement tools, the DQAI team expected availability,
sufficient stock and use of data tools; however, many service providers lacked tools. Only 60% of the
64 service providers had the national tools, 14% lacked national tools altogether and 25% had only
some of the national tools, which evidently affects consistent and logical use of the tools (figure 6).

Figure 6: Availability of Data Collection and Reporting Tools

Some of the service providers did not have any national tools at all. Twenty-nine percent (29%) of
service points that lacked tools operate within the central districts of Kampala, Mukono and Wakiso.

B. CONSISTENCY IN USE OF FORMS AND TOOLS

The DQAI noted inconsistencies in the use of national tools among the 61% (39) of service points with
national tools. On average, 51% of those with all the national tools consistently used them to
document service provision, while 8% of the service points with all the national tools did not use
them at all.

C. SERVICE PROVIDER ACCEPTANCE OF NATIONAL FORMS AND TOOLS

Fifty-eight (58) percent of the service providers assessed completely agreed to the use of national
tools, while 23% use a mix of national tools and their own (project-generated) tools to manage OVC
data. However, 19% do not agree to the use of national tools (figure 7 below).

Figure 7: Use of National Tools and Forms for Multiple Partner Reporting

D. SUFFICIENT STOCK OF BLANK FORMS AND TOOLS

Although the MGLSD concluded the rollout of OVCMIS improvement tools in all districts in the
country, a number of service providers had few or no tools at all. Though 61% of service providers
have national tools, sufficient stock was reported by only 31%, and only 23% had the assurance of a
refill plan.

E. REFILL PLAN FOR USED UP FORMS AND TOOLS

Despite the understanding that the MGLSD, in collaboration with stakeholders, is mandated to avail
national tools to service providers, there seems to be no clear mechanism for delivery of the various
tools. Many service providers clearly stated that there was no plan whatsoever for getting tools for
the next quarter, while a few (23%) expected the MGLSD to deliver them.

F. RELEVANCE OF VARIABLES IN EXISTENT FORMS AND TOOLS

Regarding the suitability of the variables required to compile a quarterly report, 60% of service
providers acknowledged that the existing tools adequately capture the relevant data, while 24% found
some aspects problematic. For example, while the MGLSD emphasises the household model of graduation,
the tool (Form 100) captures the number of individual OVC graduated, meaning the indicator looks at
the child rather than the household, which confuses some service providers. Sixteen percent (16%) of
the service providers visited expressed concern that Form 100 (the quarterly reporting tool) does not
capture narratives, which reduces its versatility in meeting information needs.

3.2.4 DATA MANAGEMENT PROCESSES
Under this component of the M&E systems assessment, the DQAI looked at the following items;

1) Tracking of unique variables within and across service delivery points to avoid double
counting,
2) Data quality controls in place for ensuring compilation of accurate quarterly reports,
3) Availability of electronic database and, quality controls for transcription errors,
4) Application of national confidentiality guidelines,
5) Clarity, accessibility and utilisation of SOPs for data management and,
6) OVC data archiving and storage and, timely submission of the OVCMIS quarterly report.

Findings noted are as follows:

A. TRACKING OF UNIQUE VARIABLES WITHIN AND ACROSS SERVICE DELIVERY POINTS

The DQAI indicated that 63% of service providers allocate unique identifiers to each household and
OVC. It is important to note that a unique number issued to an OVC does not necessarily avoid
multiple counting of the same OVC across service delivery points, since there is no central authority
for allocating identity numbers; each service point is at liberty to allocate a number whose format is
not shared with other service providers. Seventeen (17) percent of service providers did not issue
unique identifiers to their beneficiaries due to limited understanding of their applicability and
relevance.
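As an illustration of how a shared identifier format could mitigate the cross-site double counting described above, the sketch below composes a namespaced OVC identifier from district and service point codes. The scheme, codes and function names are hypothetical assumptions for illustration, not an MGLSD standard.

```python
# Hypothetical sketch: a namespaced beneficiary identifier combining district,
# service point and a local serial number, so that identifiers pooled centrally
# remain distinguishable across service delivery points. All codes are invented.

def make_ovc_id(district_code: str, service_point_code: str, serial: int) -> str:
    """Compose an identifier that cannot collide across service points."""
    return f"{district_code.upper()}-{service_point_code.upper()}-{serial:06d}"

def find_duplicates(identifiers: list[str]) -> set[str]:
    """Return identifiers that appear more than once in a pooled national list."""
    seen, dupes = set(), set()
    for ovc_id in identifiers:
        if ovc_id in seen:
            dupes.add(ovc_id)
        seen.add(ovc_id)
    return dupes

# Two service points in the same district can no longer produce colliding numbers:
hh_a = make_ovc_id("KLA", "SP01", 1)  # 'KLA-SP01-000001'
hh_b = make_ovc_id("KLA", "SP02", 1)  # 'KLA-SP02-000001'
```

With a central registry enforcing such a format, the same child re-registered at another service point could at least be flagged for review rather than silently counted twice.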

B. DATA QUALITY CONTROLS FOR ENSURING COMPILATION OF ACCURATE QUARTERLY REPORTS

The 61% of service providers with national OVCMIS tools and forms equally have strong data quality
controls for the compilation of accurate data, partly due to application of the integrated OVC
register and associated tools, which allow tracking of individual OVC. However, 24% of service
providers lack well-established recording and reporting systems that allow tracking of unique
individuals within and across service delivery points to minimise double counting.

C. AVAILABILITY OF ELECTRONIC DATABASE

The DQAI found that 71% of the service providers had well-managed M&E systems and, amongst these,
56% run electronic data management systems, in some cases functional web-based computerized data
management systems. Notably, service providers under BOCY and UPHS support have electronic
databases customised to generate automated quarterly reports (Form 100).

However, all service providers running electronic databases had challenges in the management of
hardcopy registers, weakening the evidence of pre-primary and primary data sources as well as
backup. The MGLSD needs to support these service providers to customise their electronic databases
to capture the same variables as the integrated OVC register for accurate (matched) recording.

D. QUALITY CONTROLS FOR MANAGEMENT OF ELECTRONIC DATA

Whereas 67% of the service providers have electronic databases with inbuilt quality controls, many
cannot produce accurate quarterly reports (Form 100): it was evident during the assessment that a
number of reports at service provider offices differed from the reports on the district file and from
data extracted from the OVCMIS. Additionally, most of the computerised databases were at odds with
the existing pre-primary and primary data sources, with service providers lacking supporting
documentation.

Some service providers gave the impression of having migrated from paper-based tracking to complete
reliance on electronic data capture, which unfortunately is not the case; rather, this reflects weak
emphasis on the use of national tools and forms and the unavailability of hardcopy national tools. At
some service points, the assessment noted transcription and system errors leading to excess
disparity.

This calls for harmonisation of the existing electronic databases to capture the same variables as are
in the integrated OVC register and for improvement of the electronic databases to track data errors.
Most of all, service providers should be mentored to ensure that records of assessments, home visits
and monitoring are backed up.

E. APPLICATION OF NATIONAL CONFIDENTIALITY GUIDELINES

In working with children, the MGLSD expects service providers to adhere to the guiding principles of
child safeguarding as are contained in numerous local, national and international bylaws, ordinances,
laws and protocols that promote access to rights and full realisation of potential.

Among the 64 service providers (figure 8), 62% observe the principles of child safeguarding well,
through interventions that do no harm, working in the best interest of the child by encouraging
participation, and pursuing non-discrimination on the basis of gender or any misdemeanours.

Figure 8: Adherence to National Confidentiality Guidelines

There is a need to support a number of OVC service providers working in the districts of; Buikwe,
Kaabong, Kasese, Lira, Luwero and Lwengo to appreciate the need to maintain confidentiality for all
child care and protection information and documents.

F. STANDARD OPERATING PROCEDURES (SOPs) FOR DATA MANAGEMENT

Service providers credited the MGLSD and IPs for efforts to standardise procedures for data
management across the country but, there were conspicuous gaps in the majority of service providers:
slightly more than half (56%) had gaps in the availability of job aids with clear explanations on
archiving and handling OVC records.

G. UTILISATION OF SOPS IN OVCMIS DATA MANAGEMENT

Among OVC service providers with some or all national SOPs, 59% comply with, and rely entirely on,
the available written guidelines and organisational policies on information management to guide the
handling of OVC data. Some service providers (like TASO Mbale) have managerial policies and
procedures to guide data management. In other service providers, the national SOP guidelines and
relevant instruments existed but were never used or openly displayed for quick reference. This partly
demonstrates the need for additional orientation and mentoring on the application of SOPs.

On a positive note, OVC service providers are increasingly engaging and utilising government
structures in implementation (service delivery), which improves sustainability and supervision. Owing
to this engagement, Community Based Services Departments (CBSDs) in many districts, despite limited
resources, are increasingly leading and guiding non-state actor and civil society responses to child
vulnerability.

H. OVC DATA ARCHIVING AND STORAGE

In 11% of the 64 service providers assessed, adequacy of space, data security and ease of access were
deficient, with scattered, inefficient and unreliable databases. Among the 52% of service providers
with adequate archiving and electronic systems, there appears to be a relationship between adequate
archiving and the presence of written guidelines (figure 9). Furthermore, 37% of service providers
could not strongly demonstrate adequate archiving and storage systems, indicating a need to improve
their data management capabilities.

Figure 9: Data Management Files in Some of the Service Providers Visited.

I. TIMELY SUBMISSION OF OVCMIS QUARTERLY REPORT

The assessment recorded near-average performance on timely submission of reports, with just 58% of
OVC service providers acknowledging the need for, and adhering to, timely submission of quarterly
OVC reports as specified in the national SOP for reporting, while 16% lacked a clear understanding of
reporting timelines and the appropriate channels for reporting. In some cases, the OVC service
provider rejected reports extracted from the OVCMIS as not theirs, or claimed never to have reported
at all.

3.2.5 USE OF DATA FOR DECISION MAKING


In order to assess the use of OVC data, attention dwelt on the following items;

1) Evidence of OVC data use at the service point,
2) Staff training in data analysis and interpretation,
3) Capacity to analyse and interpret OVC data,
4) Data use to inform decisions and, any programmatic decisions ever taken by the service
delivery site based on analysed OVC data/results.

Findings are as below:

Overall performance of OVC service providers on data use for decision making averaged 52%. Usage of
data at a service point was judged on the display of charts and graphs in reports and any other
narratives showing decisions made on the basis of analysed data. However, judging data usage by a
given service provider needs clearer parameters that reflect the mode of operation of the particular
service provider.

Regarding training and resident capacity to analyse and interpret data, the assessment revealed
below-average capacity: 49% of staff across the 64 service providers were trained while the other 51%
had no training at all, which could explain the merely average usage of data among many service
providers.

Fifty-four (54) percent of the service providers assessed reported using data for programmatic
decision making. Common decisions made across service provision points on the basis of analysed data
included evidence-based performance appraisal and target setting, resource mobilisation and
allocation and, effective planning for the provision of services to moderately and critically
vulnerable children, among others.

3.3 DATA VERIFICATIONS
3.3.1 DOCUMENTATION REVIEW
At the service delivery point, the DQAI team reviewed specific M&E tasks performed during service
provision, appreciated record management and got a clear picture of what is being “counted” and
reported to subsequent data aggregation levels. The following data verification steps were
undertaken:

i. A narration of the connection between the delivery and documentation of services to OVC,
ii. A detailed review of data sources to establish availability, timeliness and completeness of
data,
iii. A thorough verification of the adequacy of data management tools and forms and,
iv. A detailed analysis of procedures for handling errors.

Tracing of the numbers of individuals reported as served or graduated during the January-March 2016
quarter was undertaken through a recount of individuals recorded in the data sources, in particular
the OVC service register, the integrated OVC register and/or any other databases.

Recounted numbers were matched with the data extracted from the OVCMIS. Two crosschecks were also
done to compare: 1) a copy of the OVC service provider quarterly report (Form 100) with the recounts
from the primary source document (register) and, 2) a copy of the service provider report (Form 100)
filed at the district (with the PSWO) with the counts from the primary source document (register).

For variations noted, a feasible justification was sought from the service provision team and, in a
bid to ascertain data accuracy and reliability, the two (2) crosschecks were performed against the
pre-primary and primary data sources, namely beneficiary lists, activity reports, home visit forms,
child vulnerability indices and other inventories.

The decision rules on whether to accept or reject a data variation were based upon the following:
i. A range of -5% to +5% represented data with no disparity,
ii. A range of ±5% to ±10% represented data with acceptable disparity and,
iii. A range greater than ±10% represented data in the rejection zone.
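The decision rules above can be sketched as a simple classifier. Measuring disparity as the relative difference between a reported figure and the register recount is an assumption consistent with the crosschecks described in this section, not a formula stated in the SOPs:

```python
# Sketch of the DQAI decision rules, assuming disparity is the relative
# difference between a reported count and the recounted (register) value.

def disparity_pct(reported: int, recounted: int) -> float:
    """Disparity as a percentage of the register recount."""
    return 100.0 * (reported - recounted) / recounted

def decision(reported: int, recounted: int) -> str:
    """Apply the accept/reject rules: within 5% no disparity, within 10% acceptable."""
    d = abs(disparity_pct(reported, recounted))
    if d <= 5:
        return "no disparity"
    if d <= 10:
        return "acceptable disparity"
    return "rejection zone"
```

For example, a register recount of 1,000 against a reported count of 1,060 gives a 6% disparity, which falls in the acceptable band.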

3.3.1.1 OVERALL STATUS OF DATA SOURCES


Figure 10 shows the overall status of data sources, including availability, completeness and evidence
of information for the period under review, expressed in percentages. The chart shows that 56% of the
data reported during the January-March 2016 quarter lacked adequate data sources to justify service
provision and can be regarded as invalid.

Figure 10: Overall Status of Data Sources

Only 46% of indicator source documents were available and, of these, 46% were complete in terms of:
1) the reported count relevant to the indicator in question, 2) the reporting period, 3) the date of
submission of the report and, 4) a signature from the staff who submitted the report. Only 40% of
the witnessed data had all information covering the January-March 2016 quarter.
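The four completeness criteria above amount to a field-presence check on each source document. The sketch below uses invented field names for illustration; the actual layout of Form 100 and the registers differs:

```python
# Minimal completeness check over the four criteria listed above. The keys are
# illustrative assumptions, not the real Form 100 field names.

REQUIRED_FIELDS = ("indicator_count", "reporting_period", "submission_date", "submitted_by")

def is_complete(document: dict) -> bool:
    """A source document counts as complete only if all four fields are filled."""
    return all(document.get(field) not in (None, "") for field in REQUIRED_FIELDS)

unsigned_report = {
    "indicator_count": 412,
    "reporting_period": "Jan-Mar 2016",
    "submission_date": "2016-04-10",
    "submitted_by": "",  # missing signature, so the document is incomplete
}
```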

JUSTIFICATION FOR OBSERVED AVAILABILITY OF DATA SOURCE DOCUMENTS

The following reasons were noted as possible explanations for data source variance:

i. Some service points lacked registers to ease tracking of individuals served or graduated, and
some data sources did not indicate the reporting period, with some dated as far back as 2013.
ii. Some service points did not serve OVC as reflected in the OVCMIS report and did not know what
the OVCMIS was, which raised questions about how their records ended up on the system.
iii. Poor record keeping by some service providers, where individual files lacked updated records;
as a result, some of the records could not be accessed for verification.
iv. Some data source documents are reportedly kept at the national M&E unit and not at the site
office.
v. Existence of computerized databases not backed by primary records, which were perceived as
time-wasting and extra workload. In some sites, community facilitators had custody of source
documents owing to their delegated responsibility of reaching out to OVC.
vi. In one site, records were kept in a lockable cupboard and the keys were unavailable at the
time of the visit.
vii. Access to data source documents was denied, citing the need for authorization from higher
offices whose approval could not be obtained by the end of the feedback session; for example,
service providers working in the refugee settlement in Kiryandongo district requested
authorisation from the Prime Minister's office before the DQAI team could access data.
viii. Some service providers lacked record keeping skills, so files could not easily be retrieved.
ix. Implementers report by phone but forget to deliver primary source documents to the M&E unit.
x. In a few instances, primary data sources were attached to vouchers as accountability for funds,
and what was available (if at all) were duplicates of beneficiary forms and acknowledgements.

3.3.1.2 COMPLETENESS OF ALL AVAILABLE DATA SOURCES
Figure 11 shows the data disparity status among the service points assessed. Recounts of data to
validate the number of OVC reported as served or graduated, comparing the register with data
extracted from the OVCMIS, were conducted in all 64 sites visited, while data source crosschecks
comparing counts from the register with records in quarterly reports filed at the service provider's
office and the district office were conducted only on condition that a quarterly report was available
on file at the site or district.

Figure 11: Proportion of OVC Service Providers by Data Disparity

While comparing data extracted from the OVCMIS with counts from the register kept at the service
provider office, gross over-reporting (excess disparity greater than ±10%) was noted among 73% of
the 64 service providers assessed (figure 11). Only 17.5% of service providers had data with no
disparity (figure 12).

Figure 12: Disparity between Service Provider Data Extracted from the OVCMIS and Register Recount

Similarly, crosschecks of data in registers against quarterly reports on file were done, but only for
service providers that could avail a copy (or copies) of the quarterly report (Form 100) on file at
the service provider and/or district level. Comparison of counts from the register with records in
the quarterly report on file at the service provider office showed disparity among 65% of service
providers. When the comparison was done for counts from the register against counts from the
quarterly report on file at the district, excess disparity was noted among 68% of service providers.

The trend indicated an increase in data disparity as data left the service provider and was entered
into the OVCMIS (figure 13). The justification for this observed increase could not easily be
uncovered by the DQAI exercise, as a number of service providers could not explain the anomaly,
which according to some service providers could be errors due to transcription, summarisation and
manipulation of data.

Figure 13: Data Disparity Trend (as it moves from the register to the OVCMIS)

The DQAI indicated gross over-reporting, with data in the OVCMIS being 32% more than the service
provider counts for total served as recorded in the registers, while reports on site and district
files compared with source documents were of acceptable quality, with differences within ±10%. This
implies that the majority of the data discrepancies are introduced during the process of data entry
into the OVCMIS. Additionally, some of the data in the OVCMIS had no supporting documentation at
district and site level; for example, two sites rejected data in the OVCMIS and claimed never to have
reported.

The following reasons were noted as possible explanations for incomplete data sources:

i. M&E staff irregularly carry out quarterly reviews of data sources to ensure completeness
before reporting, as they are deeply involved in the implementation of OVC interventions and
other tasks.
ii. Implementers and M&E staff lack skills and knowledge in procedures for ensuring adherence
to the dimensions of data quality, completeness and data precision in particular.
iii. Transfer of staff before proper handover to incoming M&E staff, and poor induction of M&E
staff.
iv. M&E staff not only perform M&E activities but also implement activities; as a result, they
perform their document review tasks ineffectively, compromising the completeness of data
sources.

3.3.1.3 VERIFICATION OF THE AVAILABLE INFORMATION COVERING THE PERIOD UNDER REVIEW
Table 2: Summary of Data Verification Factors

Number of Individuals Served
  Data in the OVCMIS: 33,850
  Service Provider Joint Verification - Register count: 25,665; Factor: 32%
  Onsite Joint Verification - Quarterly report: 24,566; Register count: 23,024; Factor (Crosscheck 1): 7%
  District Joint Verification - Filed report: 21,211; Register count: 20,561; Factor (Crosscheck 2): 3%

Number of Individual OVC Graduated
  Data in the OVCMIS: 1,822
  Service Provider Joint Verification - Register count: 149; Factor: 1,223%
  Onsite Joint Verification - Quarterly report: 1,017; Register count: 148; Factor (Crosscheck 1): 587%
  District Joint Verification - Filed report: 987; Register count: 148; Factor (Crosscheck 2): 567%

Colour code:
  Green: ratio 0.95 to 1.05 - No disparity (limited disparity)
  Yellow: ratio 0.90 to 1.10 - Acceptable disparity
  Red: ratio greater than 1.10 - Excess disparity
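The verification factors in Table 2 appear to be derived as the difference between the higher-level figure and the register recount, expressed as a percentage of the recount. This formula reproduces the factors for the served indicator (32%, 7% and 3%) and the onsite and district factors for graduation (587% and 567%); it is an inference from the table values rather than a documented OVCMIS formula:

```python
# Inferred computation of the Table 2 verification factors: relative difference
# between a reported figure and the register recount, rounded to a whole percent.

def verification_factor(reported: int, recount: int) -> int:
    return round(100 * (reported - recount) / recount)

# "Number of Individuals Served" row of Table 2:
served_factors = (
    verification_factor(33_850, 25_665),  # OVCMIS vs register -> 32
    verification_factor(24_566, 23_024),  # quarterly report vs register -> 7
    verification_factor(21_211, 20_561),  # filed report vs register -> 3
)
```

Applied to the graduation row, the onsite and district factors (587% and 567%) also match, which supports the inference.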

During the DQA exercise at many service provision points, the available data sources appeared
incomplete, with a number of fields left unfilled. The explanations largely mirror those given under
3.3.1.1 above: data sources that did not indicate the reporting period, service providers reflected
in the OVCMIS report that did not serve OVC, poor record keeping, data sources kept at the national
M&E unit, locked cabinets whose keys were off-site, access denied pending authorization from higher
offices, phone-based reporting without delivery of primary documents to M&E officers and, primary
data sources attached to vouchers as accountabilities before beneficiary forms and acknowledgements
could be photocopied. One additional reason was noted: some service providers lacked registers that
record age-disaggregated data as required in Form 100.

SPECIFIC REASONS FOR DATA DISPARITY

i. Incomplete data sources due to failure by responsible staff to play their roles in time.
ii. Data source documents for the indicators under review are scattered in different locations, so
staff could not trace some of them in time.
iii. Some organizations have electronic databases that are not supported by the primary records
used to record activities during implementation and monitoring processes.
iv. Inaccurate tallying of individuals served from activity reports and beneficiary
acknowledgment forms to the onsite report.
v. OVC reports in Strengthening Decentralisation for Sustainability (SDS) Project sites are
attached as accountability to vouchers; due to resource constraints, the staff concerned lack
the means to procure paper and photocopy or print file copies for reference purposes.
vi. District PSWOs were discovered to be approving CSO data on the system before receiving the
hard copy report, contrary to the SOPs that require them to validate the data before approval.
This makes it difficult to track concocted data and take appropriate action to ensure data
quality.
vii. Some district reports are submitted to national M&E units but a copy of the approved report
is not returned to the site as reference reporting material.
viii. Some reports are compiled during workshops and submitted without consulting data source
documents, as observed during regional data review meetings where reports appear on the
system overnight and no one can present the hard copy of the data entered.
ix. Some site staff recorded OVC served in notebooks and forgot to transfer the data to
registers; when the time for reporting approaches, staff resort to guesswork due to memory
lapses.
x. Some service points are understaffed and therefore overwhelmed with the implementation of
interventions to the detriment of reporting.
xi. There are possibilities of data manipulation, as many reports could not be reconciled with
each other at the different levels of reporting.

4.0 DISCUSSIONS AND RECOMMENDATIONS
4.1 GENERAL STRENGTHS AND WEAKNESSES

General Strengths and Opportunities for improving OVC data management:
1. Strong data management systems in many of the service providers visited, who had:
   i. Good understanding of indicators.
   ii. Electronic databases and, in some cases, web-based computerized databases.
   iii. Well organised storage cabinets with clear archiving systems.
2. IPs that have just started operation have taken time to ensure that vulnerability assessment is
done using the community structures.
3. Some service providers are using government structures in implementing activities, which
strengthens partnership, e.g. supporting the sub-county CDO in service provision.
4. Many CSOs are reporting directly on the system, which reduces the workload of the PSWO, and many
CSOs are using procedures and guidelines for information management.
5. District PSWOs, although miserly funded, are diligently reporting for all the CSOs in their
districts.
6. IPs can be a catalyst for improved data management, as it was evident that CSOs with close links
to the IPs were performing better.

General Weaknesses and Threats to strengthening OVC data management:
1. Data validity in the OVCMIS was not acceptable in many of the service providers visited; only 46%
of data sources were available, 46% of the available data sources were complete and, 40% of the
data for the January-March 2016 quarter were available.
2. Some service providers reported pre-identified clients as served before actual services were
provided. Some of the IPs conducted pre-identification exercises that were reported as OVC served
although there is no documentation of any additional service provided. Similarly, some CSOs that
provide HIV services reported HIV-positive children in care with no documentation of OVC services.
3. Community dialogue meetings with caregivers were extrapolated to the OVC; for example, birth
registration sensitization meetings were reported as birth registration of OVC.
4. Data collection tools are not used in some of the CSOs where they were delivered, mainly because
CSOs were not trained to use the OVC tools, some IPs use other OVC tools for primary client
capture and electronic databases without using the OVC registers, or usage is not emphasized by
the 'funders'. Also, registers are not updated for all the services that are provided, and some
service providers are not using the recommended service guidelines, which challenges the tracking
of national variables, e.g.:
   i. Individual child versus household approach, which makes graduation of OVC difficult to track.
   ii. Vocational training with no start-up kits that lead to graduation and empowerment, as some
   IPs classified start-up kits as non-allowable costs.
5. Unexplained sources of reports and data errors, some due to:
   i. Service providers disowning reports in the system, claiming never to have submitted any report,
   ii. Little or no documentation although service providers submitted reports,
   iii. Possible transcription errors and system errors, likely from the service provider entry module.

4.2 RECOMMENDATIONS FOR IMPROVED DOCUMENTATION OF OVC DATA

The following suggestions were made by service providers as necessary to improve data availability,
completeness and general data quality.

i. Orient service providers and stakeholders to appreciate, understand and use the MGLSD revised
OVCMIS data management tools. This calls for continuous supervision, mentoring and coaching for
adherence to the use of the data management tools. During coaching and mentorship, regularly
tracked indicators such as 'child graduated', 'child served' and 'child provided psychosocial
support', among others, should be clarified for collective (universal) interpretation.
Additionally;
 Coaching and mentoring of service providers should promote adherence to service
delivery standards that meet or even surpass nationally approved standards.
 Routinely undertake gap analyses and encourage stakeholders to print and distribute
sufficient stock of the revised data management tools; Form 004 and Form 100.
 There is a need to popularise, enforce and customise national data collection tools
through appreciation, use of and adherence to standard procedures, including guidance
on indicator definitions, particularly how to track, compute and report key indicators.

ii. Refocus data quality improvement efforts to reduce errors during data collection,
transcription and entry and, clarify data management processes and expectations for
improved data use to influence decision support systems. Additionally;
 In order to address the gaps in data source and information availability and, data
source completeness, there is a need to institute good data storage practices,
namely; well-labelled files with separators by reporting quarters.
 IPs and districts should support service providers to develop and implement data
quality improvement plans and monitor implementation.
 Routinely conduct internal and external DQAIs to institutionalize this good practice in
data quality management.
 Continuously empower districts to enforce standards for data management. Districts
and IPs should support CSOs to access the recommended reporting forms and data
collection tools and orient them on their use. Districts and IPs should conduct
technical support supervision to address data quality gaps as and when identified.
 Indicator source documents should be authenticated (for example, signed and dated
by the implementing staff, receiver and verifier of the documents) before a
quarterly report is compiled.
 Where data source documents are kept at the national/regional M&E unit, they should
be photocopied and a folder with reporting-period separators created at the service
point.

iii. In the medium to long term, support standardisation of the electronic (computerised)
databases in use by different actors and harmonise them to capture the same variables
as the integrated OVC register. Additionally, there is a need to;

29 | Uganda Country Report - Quality of Data Reported by OVC Service Providers, January-March 2016
 Develop a universal (national) computerised database and share it with service
providers (to reduce the reinvention of separate electronic databases by different
service providers).
 This automation of data management tools (particularly the integrated OVC register)
should allow direct upload of primary data from the register into the OVCMIS online
database. This will support generation of accurate reports based on primary data
uploaded into the online database.
 Improve computerised databases to track data errors and to suit/adapt to Form 004.
 Ensure that pre-primary records, including assessment and enrolment forms and home
visit records, are current and available. All computerised databases should be backed
by the primary data collection tools from which the entered data were obtained, and
only individuals served and recorded in project-approved, up-to-date OVC registers
should be reported to the OVCMIS for a particular quarter.
 Service provider managers should not only provide staff with the resources required
to implement the data quality improvement plan, but also supervise staff as they
implement it and monitor its actualisation in order to improve data quality. Service
providers should plan and budget for data validation events.
 Before quarterly reports are submitted to intermediate aggregation levels, an M&E
reflection meeting should be held to enable staff to review the quality of the
reported data and close any information gaps.
 Importantly, there is a strong desire for rewards for OVC service providers that
report accurate, timely and complete data; this calls for strategies to eliminate
data manipulation and to recognise service providers that meet the dimensions of data quality.
 Provided data confidentiality guidelines can be met, direct upload of primary data
from the integrated OVC register will significantly support the generation of
accurate reports, with little or no transcription error arising during data
summarisation.
iv. IPs and funding agencies should give prompt clearance on allowable and disallowable
interventions, particularly in relation to economic strengthening support: some service
providers supported OVC to complete apprenticeship skilling but could not offer start-up
kits (capital) because these were disallowed, which undermined comprehensive support and
deepened a sense of disempowerment.
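Several of the recommendations above (items ii and iii) call for controls that detect double counting and reporting of individuals not in the approved register before a quarterly report is submitted. The sketch below illustrates one such pre-submission check; the function name and data structures are illustrative, not part of the OVCMIS.

```python
def pre_submission_check(register_ids: set, reported_ids: list) -> dict:
    """Flag duplicates and individuals reported but not in the approved OVC register.

    register_ids: unique client IDs recorded in the approved OVC register.
    reported_ids: client IDs about to be included in the quarterly report.
    """
    seen, duplicates = set(), []
    for rid in reported_ids:
        if rid in seen:
            duplicates.append(rid)  # double counting within the report
        seen.add(rid)
    unregistered = sorted(seen - register_ids)  # reported but never registered
    return {
        "duplicates": duplicates,
        "unregistered": unregistered,
        "clean": not duplicates and not unregistered,
    }

# Example: child C003 is reported twice and C099 is absent from the register.
result = pre_submission_check({"C001", "C002", "C003"},
                              ["C001", "C003", "C003", "C099"])
```

A report would only pass to the next aggregation level when `result["clean"]` is true; otherwise the flagged IDs are resolved at the M&E reflection meeting recommended above.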

5.0 ANNEXES
ANNEX I: DATA COLLECTION AND ASSESSMENT TOOLS
1) INFORMATION PAGE

Name of Program under review: OVC Region: District:


IP Providing OVC Services Name of Service Provider: Date:
Name Designation Email Cell Phone
Assessment Team 1
2
Indicators Reviewed: 1) Total number of individuals served Jan-Mar 2016; 2) Number of individual children graduated in Jan-Mar 2016

Primary Validation: Primary data source (Registers) vs Service Provider Report in OVCMIS reports for the Quarter Jan-Mar, 2016
Cross-Check 1: Primary data source (Registers) vs secondary data source (onsite OVC Reports for the Quarter Jan-Mar, 2016)
Cross-Check 2: Primary data source (Registers) vs District filed report for the Service Provider for the Quarter Jan-Mar, 2016

Reporting Period Verified: Jan - March 2016


Respondents Name Designation Email Cell Phone
1
Primary contact:
2

2) DATA VERIFICATION AND VALIDATION TOOL (DVV FORM) FOR SERVICE DELIVERY SITE
Service Delivery Site:
District: Implementing Partner at Site:

Indicator Reviewed: 1) Total number of individuals served this period; 2) Number of individual children graduated
Date of Review:
Reporting Period Verified:
Part 1: Data Verifications

A - Documentation Review (scored for each of: 1) Total number of individuals served this period; 2) Number of individual children graduated; with COMMENTS per indicator):
1 Review available data sources for the reporting period being verified. Are all necessary data sources available for review? (Y/N). Briefly comment as appropriate. If no, determine how this might have affected reported numbers.
2 Are all available data sources complete (essential data fields)? Briefly comment as appropriate. If no, determine how this might have affected reported numbers.
3 Review the data sources: Is information available covering the period under review? Briefly comment as appropriate. If no, determine how this might have affected reported numbers.

Part 1: Data Verifications (continued)
B - Recounting Reported Results (scored for each of: 1) Total number of individuals served this period; 2) Number of individual children graduated; with COMMENTS per indicator): Recount results from the data source, compare the verified numbers to the numbers reported by the service delivery site, and explain discrepancies (if any).
4 Enter the number of clients reported by the site during the reporting period from the site summary report (OVCMIS). [A] (Jan - Mar 2016)
5 Recount the number of clients during the reporting period by reviewing the data source (Register(s)). [B] (Jan - Mar 2016)
6 Calculate the ratio of reported to recounted numbers. [A/B] (Jan - Mar 2016)
7 For each indicator: What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing data source, other)?
C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 10 OVC files and verifying whether these OVC were recorded in the OVC registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from the Case management book to the Register and from the Register to the Case management book).
CROSS-CHECK 1: Cross-check the primary data source (Register) with the secondary data source (onsite Quarterly report), for each of: 1) Total number of individuals served this period; 2) Number of individual children graduated; with COMMENTS per indicator. (If the cross-checks performed differ from the planned cross-checks entered on the Information page, specify the cross-checks performed in the comment cells.)
1.1 Enter the number of clients reported by the site during the reporting period from the site summary report (onsite Quarterly Report). (Jan - Mar 2016)
1.2 Recount the number of clients during the reporting period by reviewing the data source (use the joint count from the Register(s)). (Jan - Mar 2016)

1.3 Calculate the % difference for Cross-Check 1. What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing data source, other)? (Jan - Mar 2016)

CROSS-CHECK 2: Cross-check the joint count from the site Register with the District filed report, for each of: 1) Total number of individuals served this period; 2) Number of individual children graduated; with COMMENTS per indicator. (If the cross-checks performed differ from the planned cross-checks entered on the Information page, specify the cross-checks performed in the comment cells.)
2.1 Compare the recount with the number of clients during the reporting period in the filed district report. [A] (Jan - Mar 2016)
2.2 Enter the number of clients during the reporting period recounted from the site OVC register in 5 above. [B] (Jan - Mar 2016)
2.3 Calculate the % difference for Cross-Check 2. What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing data source, other)? (Jan - Mar 2016)
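The recount-and-compare arithmetic used in Parts B and C of the tool above (the [A/B] ratio and the % difference against the joint count) can be sketched as follows. This is a minimal illustration of the calculations only; the function names are ours, not part of the official tool.

```python
def verification_ratio(reported: int, recounted: int) -> float:
    """Ratio of reported to recounted results [A/B]; 1.0 means exact agreement."""
    if recounted == 0:
        raise ValueError("recounted count must be non-zero")
    return reported / recounted

def percent_difference(reported: int, recounted: int) -> float:
    """Percentage by which the reported figure deviates from the recount;
    positive values indicate over-reporting, negative under-reporting."""
    if recounted == 0:
        raise ValueError("recounted count must be non-zero")
    return 100.0 * (reported - recounted) / recounted

# Example: a site reports 120 children served, but the register recount finds 100,
# i.e. a ratio of 1.2 and a +20% difference (over-reporting).
ratio = verification_ratio(120, 100)
diff = percent_difference(120, 100)
```

The same two functions cover the OVCMIS, onsite and district comparisons: only the source of the `reported` figure changes between the primary validation and the two cross-checks.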

3) DATA MANAGEMENT ASSESSMENT TOOL (SERVICE POINT & DISTRICT LEVEL M&E UNIT)

Service Delivery Site:


District: Implementing Partner at Site
Indicator Reviewed: 1) Total number of individuals served this period; 2) Number of individual children graduated
Date of Review:
Reporting Period Verified:
Component of the M&E System | Check the answer that applies: Yes Completely / Partly / Not at All / Not Applicable | COMMENTS (Please provide detail for each response; detailed responses will help guide strengthening measures).
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities to handle OVC information

1 Is the responsibility for recording OVC service delivery information in the Registers clearly assigned to the relevant staff? (Briefly describe who fills the Registers [OVC register, 005, 006, 007, or any other tools used to identify, verify, assess, enrol and register OVC in your OVC program and report results] and how they are filled at this site, e.g. mode of service delivery such as through school, direct to the beneficiary, church or community.)
Are there processes in place to ensure that data compilation and reporting is completed in the event that the responsible staff is not
2 available to do the job (e.g. shared duties, a team approach etc.)? (Briefly describe for all Registers used for OVC Services [OVC register,
Referral register, register])
Are there designated staff responsible for reviewing periodic reports prior to submission to the next level? (Note that the reviewer can be
3
from any level - Site, district or national levels)
Does the OVC service provider receive regular feedback on the quality of their submitted reports? (Specify persons, frequency and form of
4
providing the feedback)
Does the OVC service provider receive regular OVC supportive supervisory visits from district and/or national level staff and/or other
5 Organisation according to the guidelines? (…If yes, specify the team that provided the support supervision and date for the last visit (month
and year (mm/yyyy)))
Are the person(s) responsible for collecting and reporting OVC data trained/ oriented in data collection & reporting? (In the comments
6 section specify: 1) if the site has an OVC/Records person, 2) if the records person assists with OVC reporting, 3) if the site has a training /
orientation plan for new staff in data collection)

II- Understanding of Indicator Definitions and Reporting Guidelines

Has the site been provided with the National written M&E guidelines for its sub-reporting level on …
……….what should be recorded in the source documents/registers. (probe and comment on whether the data collection team at the site
7
understands the questions/variables to be filled in the various source documents/registers)
………., what should be included on the OVC quarterly report. (probe and comment on whether the data collection team at the site
8
understands how the OVC quarterly summary reports are compiled)
9 ……...… to whom the reports should be submitted.
10 ……… when the reports are due.
……… in case of errors, does the site know how to communicate and effect changes in a report that was previously submitted to the
11
district. (in the comment section briefly describe the process used at the site)
12 Are the written instructions provided adequate to ensure standardised recording and reporting of OVC data?
13 Do the OVCMIS staff understand how to derive the variables for the indicator "OVC Served" in the register and Quarterly report?
14 Do the OVCMIS staff understand how to derive the variables for the indicator "Number of OVC graduated"
III - Availability of Data-collection Tools and Reporting Forms for OVC services
Does the OVC service provider have the national OVC forms/tools to be used at their reporting level? (In the comment section explain if the
15
following OVC tools are currently available at the site (specify version where possible), 1) OVC Register; 2) OVC cards, summary tool...........
…If yes, are the standard forms/tools consistently used at the Site? (probe and list reasons in case standard forms/tools are not used
16
consistently)
If there are multiple organizations that are implementing activities at this service point, do they use the national reporting forms and follow
17
the same reporting timelines? (Name the organization/ Implementing Partners in the comment section and tools used if different)

Are there sufficient stocks of blank primary data collection tools/registers and summary forms at the site. (Enquire and comment on when
18 the existing stock of OVC registers & summary forms are expected to run out -estimate based on list of tools provided in #16
above)...period of 1 quarter
19 Does the site have a regular refill program of the OVC Tools when they are used up? (Enquire where and how the site gets these refills)
Do the Primary data collection tools / registers have all the relevant questions/ variables needed to compile the OVC Quarterly reports?
20
(Enquire about how the staff obtain the various answers needed in the compilation of the said reports)
IV- Data Management Processes
21 Does the established OVC recording and reporting system allow tracking of unique individuals within and across Service Delivery Points to avoid double counting at this site? (E.g. can a client served with OVC services be tracked through the OVC registers over time?)
Are there data quality controls in place for ensuring compilation of accurate quarterly reports (e.g. controls for detection of data
22
inconsistencies; incomplete /incorrect OVC Reporting, missing data & transcription errors)?
23 Does the site have computerized OVC data and/ reports (If yes, specify what data is computerized and the computer packages used)
Ask only where applicable, For computerised sites are there quality controls in place for when data from Registers/paper-based forms are
24 entered into a computer to ensure the accuracy of data entry (e.g. edit and/or logic checks, post-data entry verification, etc.). (In the
comment section, briefly explain the data quality controls and also enquire about their computerized data backup program)
25 Confidentiality of OVC information: Is the OVC data maintained according to national confidentiality guidelines?
Are there SOPs/ Job Aids/Guidelines that describe how site documents should be handled and archived clearly written and accessible to all
26
staff?
If yes, are the staff aware and using the above SOPs/ Job Aids/Guidelines? (Briefly explain, and also enquire if the site has the latest OVC
27
Guidelines)
Is the data archiving/storage system at the site adequate? (Assess data storage, security and ease of accessibility & retrieval of OVC site
28
records (e.g. filing cabinets, storage rooms etc.)
Is the OVC Quarterly report submitted in a timely manner? (Enquire and observe if the 'Oct- Dec 2015' & Jan-Mar 2016 quarterly reports
29
were compiled and ask to see copy, cross check on the date of compilation)
V - Use of data for decision making
Is there evidence of OVC data use at the site? (Check for evidence of data analysis & reporting other than the official reports e.g. charts,
30
graphs, maps, etc. (Ask to see if not displayed)?
31 Have the OVC services staff been trained in data analysis and interpretation? (Probe for trainings done and duration)
32 Is there a staff member at the site who takes the lead in analysis and interpretation of OVC data?
Is the analysed data / results presented / disseminated to other information system stakeholders in a timely manner so that the
33 information can be used to inform decisions? (Probe about examples of target audiences for the data e.g. Site administration, sub county
council, Donors, IPs)
34 Are there any programmatic decisions taken by the service delivery site based on analysed OVC data / results. (Ask to see example)

4) DATA VERIFICATION AND SYSTEM ASSESSMENT SHEET – RECOMMENDATIONS FOR THE OVC SERVICES
District: Service Delivery Site: Implementing Partner at site:

Based on the findings of the systems’ review and data verification at the service site, please describe four key challenges to the data quality identified and recommended strengthening measures, with an estimate of
length of time the improvement measure could take. These should be discussed with site staff.
Identified Weaknesses | Action Point (specific activities on improving OVC services) | Timeline (when do you hope to achieve this?) | Responsible Person(s) (person/organisation that will be responsible for accomplishing this task) | Resources (what is required in order to carry out the agreed-upon activities?)
1
2

5) GENERAL OBSERVATIONS AND NOTABLE GOOD PRACTICES AT SITE

Service Delivery Site:


District: Implementing Partner at site:
i) General Observations about Site:
1
ii) Notable Good M&E Practices at Site:
1
2

6) FEED BACK TO SITE – M&E SYSTEM ASSESSMENT


MINISTRY OF GENDER, LABOUR AND SOCIAL DEVELOPMENT
DATE:
Introduction: This document summarises findings, recommendations and actions for future reference and implementation to strengthen the OVCMIS at National, District, IP and Sub-county/Site level
A1 Documentation review at site
Variable | Availability of all required data sources (registers etc.) | Completeness of all available data sources | Availability of data for period under review | Good practices | Areas for Improvement
Number of Individuals served
Number of OVC graduated
A2 Understanding of variable & Validation of Reported Data against this Site
Variable | Understanding of how to derive variable for reporting | Validated Count at Site vs Report in OVCMIS | Validated Count at Site vs Reported in Report at site | Good practices | Areas for Improvement
Number of Individuals served
Number of OVC graduated
B M&E System Assessment
Component | Observations | Good practices | Areas for Improvement
I - M&E Structure, Functions and Capabilities to
handle OVC information
II - Understanding of Indicator Definitions and
Reporting Guidelines
III - Availability of Data-collection Tools and Reporting
Forms for OVC services

IV- Data Management Processes


V - Use of data for decision making
Recommendations and Action Points for the OVC Services
Based on the findings of the systems’ review and data verification at the service site, please describe four key identified areas that need to be addressed to improve data quality at the site. Work with the site staff
to recommend strengthening measures, with an estimate of the length of time the improvement measure could take.
Identified Areas of Improvement | Action Points (specific activities on improving OVC services) | Timeline (when do you hope to achieve this?) | Responsible Person(s) (person/organisation that will be responsible for accomplishing this task) | Resources (what is required in order to carry out the agreed-upon activities?)
1
2
Name Designation Email Cell Phone Signature
Assessment Team 1
2

7) FEED BACK TO SITE - VERIFICATION


For each indicator (Number of Individuals served; Number of OVC graduated), the verification feedback records: the primary data source (Registers) and data aggregation; the secondary data source (OVCMIS summary reports); the DQAI joint count in the period; the Quarterly Summary Report figures at OVCMIS, at site and at district; the percentage deviation of each of these reports from the joint count; and comments on each deviation.

ANNEX II: SCHEDULE FOR THE ORIENTATION OF FIELD TEAMS ON THE OVC DATA QUALITY ASSESSMENT
MINISTRY OF GENDER, LABOUR AND SOCIAL DEVELOPMENT
RIDAR HOTEL, 29TH JUNE – 1ST JULY, 2016
TIME TOPIC FACILITATOR CHAIRPERSON
Day 1: WEDNESDAY, 29/6/16, ORIENTATION TO TOOLS
8.00 - 9.00 AM Check-in and Registration Admin. J. S. Kaboggoza,
9:00 - 9:15 AM Introductions Lydia Wasula Assistant
9:15 - 9.35 AM Background, Objectives and Rationale and, Priority Indicators for the OVC DQA Sarah Kyokusingura Commissioner
9:35 - 9.45 AM Opening Remarks Commissioner Youth and Children Children
9:45 - 11.00 AM Tools, Processes and Mechanisms for Routine Data Collection, Reporting and Data Flow Willy Etwop
11.00 – 11.30 REFRESHMENT BREAK Admin.
11:30 – 11:50 AM Factors that could affect data quality Abiasali Mungo Kyateka Mondo,
 Primary level Assistant
 Secondary Level (Summarizing/Aggregation) Commissioner
11:50 – 12.30 DQA Methods: M&E Systems Assessment Tool Sarah Kyokusingura Youth
12:30 – 1:30 DQA Methods: Data Source Verification and Validation Tools Richard Ongom Opio, Fatumah Matovu
1:30 – 2:30 Lunch Admin.
2.30 – 3:40 DQA Methods: Household Verification Tool Livingstone Kamugisha Lydia Wasula,
3.40 – 4.10 DQA Methods: Working out overall site recommendations and general observations; General Observations, Herbert Mulira Head of NIU
Notable Good M&E practices & Identified weaknesses, proposed action plans, timelines for corrective actions,
resource requirements
4:10 – 4:30 DQA Methods: Provision of Feedback to sites; Wrap up of findings and recommendations & Provision of Livingstone Kamugisha
feedback to service provider Denis Oyirwoth
4:30 – 5:00 Site Procedures: Etiquette and confidentiality Margaret Mugerwa
5:00 – 5:30 Team assignments and pilot logistics Sarah Kyokusingura
5.30 End of Day 1
Day 2: 30/6/2016, Pilot Testing in Selected Sites
8.00 am Depart to pilot sites Team Leaders Obadiah
9.00 am Arrive at pilot sites Kashemeire, MEO
9.00 – 4.00 Pilot DQA methods and tools
4.30 - Travel back from pilot sites
Day 3: Friday 1/7/2016, Feedback from Pilot Testing
9:00 – 9:15 Welcome and overview Obadiah Kashemeire Kenneth
9:15 – 9:30 Guided group discussion of experience with pilot test All Teams Ayebazibwe, Head
9.30 – 11.00 Group Presentations All Teams of IT
10:30 – 10:45 Tea Break Admin.
10:45 – 11:30 Group Presentations All Teams Obadiah
11:30 – 1:00 DQA: Discussion of revision of tools and methods Kashemeire, MEO
1:00 – 2:00 Lunch Admin.
2:00 – 2:30 DQA team schedule All teams Eric
2:30 – 2:50 Administrative/ Accounting Requirements Admin. Munyambabazi
3:20-3:50 Closure Assistant Commissioner

ANNEX III: DATA DISPARITY STATUS BY IP AND SERVICE POINTS
Table 3: DATA DISPARITY STATUS BY IMPLEMENTING PARTNER AND SERVICE POINTS

Implementing Partner | Service Provider | M&E SYSTEMS ASSESSMENT: AREAS ASSESSED AND SCORES | DATA VERIFICATION: AREAS ASSESSED AND SCORES
Systems assessment legend: Green = 2.5 to 3.0 (Very good); Yellow = 1.5 to 2.5 (Fair); Red = < 1.5 (Poor)
Data verification legend: Green = -0.95 to +1.05 (No Disparity / Limited Disparity); Yellow = -0.9 to +1.1 (Acceptable Disparity); Red = > 1.1 (Excess Disparity)
Systems assessment columns: M&E structure, functions & capabilities to handle OVC information; Understanding of indicator definitions & reporting guidelines; Availability of data-collection tools & reporting forms; Data management processes; Use of data for decision making; Average score of data management systems (DMS) assessment
Data verification columns: % difference between service provider quarterly report & register (n = 36 sites); % difference between report at district & register (n = 32 sites); % difference between consolidated OVCMIS & register (n = 39 sites); Comment
AVSI/SCORE 1. Meeting Point - Kitgum 2.7 2.3 2.3 3.0 3.0 2.7 86% -7% -7% Good DMS, gross over reporting in
OVCMIS but acceptable data quality
in the site report.
2. Meeting Point - 3.0 3.0 2.7 3.0 3.0 2.9 -23% -13% -23% Good DMS but with gross under
Kampala reporting
Baylor Comprehensive Eastern & West Nile 3. Baylor - Adumi sub county 3.0 2.0 1.7 1.4 1.0 1.8 6% 6% 6% Fair DMS and acceptable reporting
Baylor SNAPS 4. Baylor Uganda Bwera 2.8 2.0 2.7 1.4 1.8 2.1 Acceptable DMS but data not
sub county available for validation
BOCY 5. AIDS Information 2.7 2.3 2.0 2.6 3.0 2.5 0% 0% 0% Good DMS, accurate reporting
Centre
6. CARITAS Lira 2.8 2.6 2.0 2.0 2.8 2.5 -30% 10% 10% Good DMS, gross over reporting in
OVCMIS but acceptable data
quality in the site report
7. KIWEPI 2.3 2.4 2.0 2.6 2.8 2.4 0% 0% 0% Acceptable DMS; accurate reporting
8. TASO, Mbale 2.3 2.8 2.7 2.9 3.0 2.7 -4% -7% -7% Good DMS & OVC MIS with
acceptable site reports
CEM/UPHS 9. Action For Behavioral 3.0 2.6 2.8 3.0 3.0 2.9 100% 0% Good DMS. Gross over reporting in
Change OVCMIS, accurate reporting at site
but no district filed report
10. Agape Nyakibale OVC 3 2.9 1.5 2.3 2.8 2.5 -51% -17% -51% Good DMS, acceptable over
Project reporting
11. AOET Lira 2.8 3 2.5 2.1 2.8 2.6 0% 0% 0% Very good DMS and good quality
data
12. Bukedi Diocese Mobil 2.8 2.9 2.8 2.6 2.4 2.7 97% -2% -2% Good DMS, gross over reporting in
Farm School OVCMIS but accurate reporting in

site and district filed report
13. Bweranyangi Parish 2.7 2.6 2.3 2 1.8 -31% -31% -31%
OVC Project
14. Chain Foundation 2.7 2.8 2.2 2.4 2.6 -45% -57% -57%
Uganda
15. Kakinga CDC 3 2.8 2 2.4 2.4 5% 5% 5%
16. Katente Child Care 2.7 2.8 1 2.7 2.8 -13% -13%
Project
17. Kiyita Family 3 2.9 2.8 2.8 2.8 -16% 1% -1%
Development Alliance
18. Mary Muke Solidarity 3 2.9 2.3 2.8 3 2% 2% 2%
OVC
19. Mbarara Archdiocese 3 2.6 1.8 2.1 2 -8% -8% -8%
20. Namirembe Diocese 3 2.9 2.3 2.8 3 160% 29% 29%
21. BUSODA 2.8 2.6 2.5 2.6 1 35% 91% 91%
22. Caring Hands 2.8 3 2 2.4 3
23. Fishing Community 2.7 2.8 2.5 2.2 2.2 163% 37% 37%
Health Initiative
Compassion 24. Kamwokya Child 1.5 1.3 1 2.9 1.6 -52%
International Development Centre
DOD/ RTI/ 25. RTI/ UPDF- 1st Division 2.3 1.6 0.5 1.6 0.6
UPDF Headquarters - Kakiri
26. UPDF/RTI Rubongi 2.3 2.5 2.3 2.2 1.4 -53% -53% -53%
Barracks
Mildmay 27. Bajja Initiatives for 2.8 2.8 2.7 2 1 3% 3%
Uganda Community
Empowerment
28. Child to Child Outreach 2.3 1.1 0.7 1.1 1
Ministries
29. Kakunyu Parents 2.8 2.8 2.8 1.8 1.8 -31% -31% -31%
Support Association

30. Lugazi OVC Care Group 3 3 2.8 2.3 2.8 -4% -4% -4%
31. Nkobazambogo Youth 3 2.9 2.5 2.8 2.8 0% 0% 0%
Group
32. Nkoni Parish 2.3 2.8 3 1.8 1 0% 0% 0%
Community AIDS
Program
33. Omoding Community 2 2.4 2.5 1.4 1
Pivot School
34. Your Neighbor OVC 3 3 2.7 2.1 2.4 0% 0% 0%
Initiative
35. Reach Out Kkasaala 2.8 1.4 1.3 2.7 3
MJAP 36. MJAP-Mbarara 2.8 2.8 1.8 3 2.8
37. MJAP-Kampala 2.8 2.5 2.5 2.8 3
MUWRP 38. MUWRP- Kayunga 2.5 2.8 2.8 2.3 2.2 54% 54%
Other 39. PSWO - Kitgum 2.8 2.9 2.5 2.1 2.6 -100%
40. PSWO - Mbale 2.8 2.8 2.3 2.7 3 -8% -8% -8%
41. Save the children 2.8 2.5 2.7 2.8 2.4 -32%
42. SVI 1 1 1 1 1
43. Amuca SDA 1.8 2.3 2.7 1.9 1.6 236% 236% 236%
44. Samaritan's Purse 3 2.6 2.5 2.4 1
45. World Vision 2.7 2.4 2.5 1.9 2.6
46. ARUWE 2.7 2.4 2.5 1.9 2.6
PIN 47. Islamic Outreach 3 3 2.7 2.7 3 25% 0% 0%
Centre
48. RECO Industries Ltd 2.7 2.5 1.7 2.4 2.2
RHSP 49. Community Enterprise 3 2.8 0.8 1.8 2.6
Development
Organisation
50. Kayonza Child 2.7 2.3 1.8 1.4 2.6 9% 9% 9%
Development Centre
51. Kitovu Mobile Limited 3 3 3 3 3 -58% -58% -58%

Reach Out 52. Reach Out Mbuya- 3 2.4 1.7 2.7 3
Banda Site
SOCY 53. TPO-Uganda - 1 1 1 1 1
Rukungiri
54. TPO-Uganda - Bushenyi 1 2.1 1.8 2.6 2.2
UEC/UCMB 55. Kamwokya Christian 2.2 2.5 2.2 2.3 2.8 19% 19% 20%
Caring Community
56. Kasanga PHC 3 2.9 2.8 3 2.8 1%
57. Kitovu Hospital ACT 3 2.9 2.7 3 2.8 10% 10% 10%
Program
58. St Francis Nsambya 2.7 2.6 1.2 1.9 1.8
Hospital Home Care
Department
59. St. Luke Hospital Angal 2 2 1.7 1.9 2.2
60. Villa Maria Hospital 3 2.8 2.5 2.1 1
61. Virika Hospital 2.5 2.8 2.3 2.7 1.8 111%
UPMB 62. Ruharo Mission Hospital 1.8 1.3 1.8 2.1 1.8
World Food Programme 63. Action Against Hunger 1 1.3 1.2 1 1
64. Community Action for Health 1.3 2.1 1.2 1.4 1.2
National Average Data Disparity 2.6 2.5 2.1 2.3 2.2 14% 6% 4%
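The annex reports disparity as a percentage but does not spell out the arithmetic behind the "% difference" columns. The sketch below shows the conventional calculation, assuming % difference = (reported count − register recount) / register recount × 100; the function name and the example figures are illustrative and are not drawn from the report.

```python
# Hedged sketch: how a "% difference" value between a service provider's
# quarterly report and the source register recount is presumably derived.
# Assumption (not stated in the annex): positive values indicate
# over-reporting relative to the register; negative values, under-reporting.

def percent_difference(reported: float, register_count: float) -> float:
    """Relative difference of a reported count against the register recount."""
    if register_count == 0:
        raise ValueError("register recount must be non-zero")
    return (reported - register_count) / register_count * 100.0

# Illustrative figures: a provider reports 680 children served,
# but the register recount at the site finds 500.
print(round(percent_difference(680, 500)))  # → 36 (over-reporting)
```

Under this assumption, a value such as -8% means the quarterly report undercounts the register by 8%, while 236% flags gross over-reporting relative to the source documents.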
Table 4: AVERAGE DATA DISPARITY STATUS BY IMPLEMENTING PARTNER

Columns: Implementing Partner | M&E structure, functions & capabilities to handle OVC information | Understanding of indicator definitions & reporting guidelines | Availability of data-collection tools & reporting forms | Data management processes | Use of data for decision making | % difference between service provider quarterly report & register (n = 36 sites) | % difference between report at district & register (n = 32 sites) | % difference between consolidated OVCMIS & register (n = 39 sites)
1 AVSI/SCORE 2.9 2.7 2.5 3 3 32% -10% -15%
2 MJAP 2.8 2.7 2.15 2.9 2.9
3 PIN 2.9 2.8 2.2 2.6 2.6 25% 0% 0%
4 MUWRP 2.5 2.8 2.8 2.3 2.2 54% 54%
5 Baylor 2.9 2 2.2 1.4 1.4 6% 6% 6%
6 BOCY 2.5 2.5 2.2 2.5 2.9 -9% 1% 1%
7 CEM/UPHS 2.9 2.8 2.2 2.5 2.5 28% 3% 1%
8 Compassion International 1.5 1.3 1 2.9 1.6 -52%
9 DOD/ RTI/ UPDF 2.3 2.05 1.4 1.9 1 -53% -53% -53%
10 Mildmay Uganda 2.7 2.5 2.3 2 1.9 -5% -5% -7%
11 Save the Children 2.8 2.5 2.7 2.8 2.4 -32%
12 Probation Office 2.8 2.85 2.4 2.4 2.8 -54% -8% -8%
13 RHSP 2.9 2.7 1.9 2.1 2.7 -25% -25% -25%
14 Reach Out 3 2.4 1.7 2.7 3
15 Samaritan's Purse 3 2.6 2.5 2.4 1
16 World Vision 2.7 2.4 2.5 1.9 2.6
17 UEC/UCMB 2.6 2.6 2.2 2.4 2.2 35% 15% 15%
18 Amuca SDA 1.8 2.3 2.7 1.9 1.6 236% 236% 236%
19 UPMB 1.8 1.3 1.8 2.1 1.8
20 SOCY 1 1.6 1.4 1.8 1.6
21 World Food Programme 1.15 1.7 1.2 1.2 1.1
22 SVI 1 1 1 1 1
National Average Data Disparity for Implementing Partners 2.6 2.5 2.1 2.3 2.2 14% 6% 4%
ANNEX IV: LIST OF DQAI TEAM MEMBERS

Name Title Organization
CENTRAL COORDINATING TEAM 1 - MEEPP
1) Lydia Wasula Head, OVC_NIU MGLSD
2) Livingstone Kamugisha Kakuru OVCMIS Consultant MEEPP
3) Dr. Sarah Kyokusingura Senior M&E and Advocacy Advisor MEEPP
4) Dr. Margaret Mugerwa Senior M&E and Advocacy Advisor MEEPP
FIELD TEAM MEMBERS - MEEPP
1) Moritz Magall Senior Social Development Officer MGLSD
2) Mary Auma Senior Probation & Social Welfare Officer Lamwo District Local Government
3) Winnie Nantabo Kisakye Senior Probation & Social Welfare Officer Ibanda District Local Government
4) Cathy Mugoya Karen Senior Probation & Social Welfare Officer Sironko District Local Government
5) Damalie Nabwire District Community Development Officer Bulambuli District Local Government
6) Esther Nandase Senior Probation & Social Welfare Officer Namutumba District Local Government
7) Nelson Odela District Community Development Officer Kaberamaido District Local Government
8) Willy Etwop OVCMIS Consultant MEEPP
9) Fatuma Matovu Program Assistant MEEPP
10) Abiasali Mungo OVCMIS Consultant MEEPP
11) Richard Ongom Opio Consultant MEEPP
12) Joseph Kimera Intern MEEPP
13) Denis Oyirwoth Consultant MEEPP
14) Immaculate Baseka Program/ Field Operations Assistant MEEPP
15) Gorretti Kiiza Mbabazi MEL Specialist USAID Learning Contract QED
16) Ruth Ankunda MEL Officer USAID Learning Contract QED
17) Julius Batemba M&E Manager Plan International - Uganda
18) Sumaya Babirye Mulumba M&E Officer UPHS
19) Solomon Asaba Programme Officer –OVC UPHS
20) Saviour Atama M&E Officer UPHS
21) Specioza Namakula M&E Officer BOCY/BANTWANA
FIELD TEAM MEMBERS - METS
1) Eric Munyambabazi MIS MakSPH-METS
2) Jennifer Balaba MIS MakSPH-METS
3) Herbert Mulira MIS MakSPH-METS
4) Doreen Ntale M&E MakSPH-METS
5) Dianah Birungi M&E MakSPH-METS
6) Innocent Musoke M&E MakSPH-METS
7) Cindy Ngonzi Social Worker MGLSD
8) Obadiah Kashemeire M&E Officer MGLSD
9) Andrew Timothy Kamugasa District Community Development Officer Lyantonde
10) Colleens Pimer Senior Probation & Social Welfare Officer Zombo
11) Farook Yiga Senior Probation & Social Welfare Officer Butambala
12) Jimmy Wale Ameko Senior Probation & Social Welfare Officer Moyo
13) Kenneth Ayebazibwe Head, IT Systems MGLSD
14) Isaac Bitamale Community Development Officer Hoima
15) Shellina R. Abaho Senior Probation & Social Welfare Officer MGLSD
16) Freeman Kato Senior Probation & Social Welfare Officer MGLSD
17) Daniel Kalema M&E Reach Out Mbuya
18) Richard Katende M&E Mild May/Comprehensive Central
19) George Sempiira S/W OVC Program Mild May/Comprehensive Central
20) Paul Masereka DMS Mild May/Comprehensive Central
21) Francis Xavier Wasswa M&E Musph/Fellows/Rakai
22) Godfrey Okiria OVC Social Worker Baylor/ Comprehensive Eastern & West Nile
23) Sarah Nakimera M&E Baylor College of Medicine/PIDC/SNAPS – WEST
24) Josephine Sanyu Social Worker MUFM/MJAP
25) Chris Mugara Retention Coordinator IDI
26) Davis Kitamirike Data Officer IDI
27) Kenneth Kakiiza Data Manager CAF
ANNEX V: LIST OF SERVICE PROVIDERS ASSESSED

No. Service Provider District
1 Action Against Hunger Kaabong
2 Action for Behavioural Change Tororo
3 Action for Rural Women’s Empowerment [ARUWE] Kyankwanzi
4 Agape Nyakibale OVC Project Rukungiri
5 AIDS Information Centre Arua
6 Amuca SDA Lira
7 AOET Lira Lira
8 Bajja Initiatives for Community Empowerment Kalungu
9 Baylor Uganda-Bwera sub county Kasese
10 Baylor-Adumi sub county Arua
11 Bukedi Diocese Mobile Farm School Tororo
12 BUSODA Masaka
13 Bweranyangi Parish OVC Project Bushenyi
14 Caring Hands Wakiso
15 CARITAS Lira Lira
16 Chain Foundation Uganda Mukono
17 Child to Child Outreach Ministries Buikwe
18 Community Action for Health Kaabong
19 Community Enterprise Development Organisation Rakai
20 Fishing Community Health Initiative Masaka
21 Islamic Outreach Centre Bukedea
22 Kakinga CDC Rukungiri
23 Kakunyu Parents Support Association (KPSA) Lwengo
24 Kamwokya Child Development Centre Kampala
25 Kamwokya Christian Caring Community Kampala
26 Kasanga PHC Kasese
27 Katente Child Care Project Mukono
28 Kayonza Child Development Centre Rakai
29 Kitovu Hospital ACT Program Masaka
30 Kitovu Mobile Limited Rakai
31 KIWEPI Kitgum
32 Kiyita Family Development Alliance Wakiso
33 Lugazi OVC Care Group Masaka
34 Mary Muke Solidarity OVC Wakiso
35 Mbarara Archdiocese Mbarara
36 Meeting Point Kitgum
37 Meeting Point Kampala Kampala
38 MJAP Kampala
39 MJAP Mbarara
40 MUWRP Kayunga
41 Namirembe Diocese Wakiso
42 Nkobazambogo Youth Group Masaka
43 Nkoni Parish Community AIDS Programme (NKOCAP) Lwengo
44 Omoding Community Pivot School Lwengo
45 PSWO, Kitgum Kitgum
46 PSWO, Mbale Mbale
47 Reach Out Kkasaala Luwero
48 Reach Out Mbuya- Banda Site Kampala
49 RECO Industries Ltd Bukedea
50 Ruharo Mission Hospital Mbarara
51 Samaritan's Purse Kiryandongo
52 Save the Children Kiryandongo
53 St. Francis Nsambya Hospital Home Care Department Kampala
54 St. Luke Hospital Angal Nebbi
55 SVI Kaabong
56 TASO, Mbale Mbale
57 TPO-Uganda Bushenyi
58 TPO-Uganda Rukungiri
59 UPDF/RTI 1st Division Headquarters - Kakiri Barracks Wakiso
60 UPDF/RTI Rubongi Barracks Tororo
61 Villa Maria Hospital Kalungu
62 Virika Hospital Kabarole
63 World Vision Kyankwanzi
64 Your Neighbour OVC Initiative Masaka
Total 64
ANNEX VI: LIST OF INDIVIDUALS WHO CONTRIBUTED TO THE COMPILATION OF THIS REPORT

Name Title Organisation
Lydia Wasula Coordinator OVC_NIU
Obadiah Kashemeire M&E Officer OVC_NIU
Charles Etoma Senior Statistician MGLSD
Herbert Mulira MIS TA MUSPH_METS
Livingstone Kamugisha OVCMIS_MES SSS/MEEPP
Julius Batemba MER Manager Plan International
William Mbonigaba Project Coordinator TPO
Solomon Asaba P/O_OVC CEM/UPHS
Bathsheba Bahumwire SPWO Rukungiri DLG
Jennifer Balaba MIS TA MUSPH_METS
Andrew Kamugasa Timothy SPWO Lyantonde DLG
Abiasali Mungo OVCMIS_MES SSS/MEEPP
Ambrose Muhumuza M&E Team Lead CEM/UPHS
Jane Kyosiimire OVC Officer MUWRP/Walter Reed
Paul Maseruka DMS Mildmay Uganda
Darius Kimera Senior Geo- Info Officer UBOS
Willy Etwop OVCMIS_MES SSS/MEEPP