
Community Development Strategy

Prepared by
the Centre of Excellence for Evaluation
Wednesday, March 13th, 2002



Community Development Strategy
Context
A study of the state of evaluation in the federal government conducted by the Treasury
Board Secretariat in 2000 demonstrated that evaluation capacity had diminished from a
position of strength in the 1980s. Compared to other OECD countries such as the United
States and Australia, the federal government was, in recent years, investing relatively
little in the evaluation function. At the time of the study, a survey of Heads of Evaluation, conducted on behalf of the Treasury Board Secretariat, estimated that approximately 230 evaluators were active in the federal government. The survey indicated that an additional 120 evaluators would be required to ensure implementation of the revised Evaluation Policy, an increase of more than 50% in the complement of evaluators in the federal government.
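A quick check of the implied increase (a reconstruction from the figures above; the study itself states only the percentage):

\[ \frac{120}{230} \approx 0.52, \]

that is, just over a 50% increase on the existing complement.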
The Treasury Board Secretariat introduced the revised Evaluation Policy to reposition
the evaluation function by making managers more accountable for its use in Results-based Management, and to signal a strengthening of the discipline of evaluation within
the federal government. Concurrent with the introduction of the revised Evaluation
Policy, the Treasury Board Secretariat created the Centre of Excellence for Evaluation
(CEE) to support the implementation of the new Policy and champion initiatives aimed at
capacity building across the system. The Treasury Board ministers directed the CEE to
work with the Evaluation Community in the federal government to develop a human
resource strategy, hereafter referred to as the Community Development Strategy (CDS).

Objective
The objective of the CDS is to assist in renewing and repositioning the evaluation function within the federal government, and in building the capacity of evaluation units, for both evaluators and managers. The intention is to ensure that, within the federal government, evaluation is employed to:

• assess the effectiveness of programs, initiatives and policies; and,

• assist with the design and redesign of programs that support Results-based Management.

As an outcome of the CDS, the CEE aims to assist the Evaluation Community in increasing its current numbers (estimated at 230) by 30 per cent, or 70 additional positions, by 2005-2006.
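This target is consistent with the longer-term goal of roughly 300 evaluators noted under Timeframe below (arithmetic reconstructed from the stated figures, not given in the source):

\[ 230 \times 0.30 = 69 \approx 70, \qquad 230 + 70 = 300. \]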

Intended Client Groups


The revised Evaluation Policy and the CEE provide the context for the CDS. As such,
the strategy serves two primary client groups:

• program evaluators; and,

• managers who are accountable for Results-based Management and employ the discipline of evaluation to ensure the design and execution of effective programs, policies and initiatives.

With respect to evaluators, the CDS involves:

• recruitment plans; and,

• plans for training and development for program evaluators employed at all levels by the federal government.

Given the current demographics of the federal public service, the key element of the
CDS is to assist evaluation managers to recruit, train, develop and retain professional
program evaluators to assist with the implementation of the revised Evaluation Policy.
As for managers, the CDS aims to provide them with orientation on the uses of evaluation in the context of Results-based Management. In fulfilling this aspect of the strategy, the CEE will contribute to an integrated management training plan to inform managers on all aspects of modern comptrollership. In addition, the CEE will seek opportunities to work with evaluation managers in departments and agencies to influence program managers at all levels to employ evaluation methods and expertise as a key element of Results-based Management.

Operating Principles
The CEE is guided in the implementation of the CDS, and other initiatives, by a Senior
Advisory Committee (SAC) composed of selected Heads of Evaluation, as well as a
representative from the Centre of Excellence for Internal Audit. One member of the SAC
serves as a champion for Community Development. The Champion heads a subcommittee, the Community Development Advisory Group, the mandate of which is to
provide specific and expert advice to the CEE on the CDS. In addition, the CEE consults
directly with the Heads of Evaluation of departments and agencies subject to the
Evaluation Policy, on an as-needed basis, to ensure that objectives are aligned with
community needs.
The organizational structure of the CEE includes the position of Manager, Capacity Building. The primary purpose of the position, and of the two FTEs that support it, is to provide leadership and facilitation for the evaluation community in the areas of training and development and, where appropriate, recruitment.
As a matter of course, the CEE will encourage the community to lead on specific
initiatives where it makes sense to do so. As an example, the CEE will actively encourage
Heads of Evaluation to collaborate on recruitment and staffing initiatives within the
framework of delegated authorities.

Partners and Service Deliverers


The CEE, and the Evaluation Community, will actively seek partnerships to ensure full
implementation of the CDS. For guidance on matters of recruitment and retention, the
CEE will rely on the Public Service Commission. Potential partners for training and
developmental opportunities include:

• the Canadian Evaluation Society;

• the American Evaluation Association;

• universities and other institutions of higher learning, to supply learning and developmental opportunities (including structured courses, workshops and information sessions);

• federal government training organizations, including the Canadian Centre for Management Development (CCMD), Training and Development Canada (TDC) and departments such as Statistics Canada, which have demonstrated a leadership role in the development and delivery of specialized methodology courses of particular interest to evaluators; and,

• policy centres within the Treasury Board Secretariat, such as the Centre of Excellence for Internal Audit and the Comptrollership Modernization Office. The CEE is aware that many functional communities are also involved in capacity building. The intention is to learn from good/best practices, and to use or modify existing programs and processes wherever feasible to meet the overall objectives of our Community Development Strategy. We are working in partnership with the Centre of Excellence for Internal Audit (TBS) on specific projects and will continue to seek out similar alliances within TBS.

The CEE will work with the Evaluation Community to determine community needs and
issues from a global perspective, to establish priorities, and to ensure that these are addressed in a timely and cost-effective manner. The intention of the CDS is to support
the goal of community renewal and repositioning as required by the Evaluation Policy. In
most cases, the CEE will not itself deliver training or development to the Evaluation
Community. Rather, the role of the CEE will be to seek out and establish practical
options to ensure that evaluators receive the appropriate training and development
opportunities. Partners, as identified above, will be a potential source for the delivery of
training and development for the Evaluation Community.

Timeframe
The CDS is designed to support ongoing Evaluation Policy implementation. As such, it
includes both short-term and longer-term objectives. The short-term objectives include
the establishment of formalized mechanisms for entry-level recruitment, implementation
of a pilot internship program and the establishment of a recruitment/retention and
training and development program for in-career evaluators. The key longer-term
objective of the CDS is to help departments and agencies increase the numbers of
program evaluators actively engaged in the implementation of the Evaluation Policy to
numbers approaching 300 by 2005-2006, and to support them with an appropriate and
sustainable training and development plan. Key elements of the current and upcoming years are outlined below.

Key Elements of Year One Activities (2001-2002)


In its first year of operation, the CEE has completed several initiatives to support
implementation of the CDS. These include:

• A Demographic Study, the purpose of which is to provide an up-to-date picture of the current evaluation community, particularly with respect to career paths, anticipated retirements, and recruitment needs. The Demographic Study provides key information on the community's training and development needs.

• Competency Profiles for entry-level, intermediate and senior program evaluators. The profiles support the development of system-wide training and development initiatives. The competency profiles may also assist with the development of assessment tools for recruitment.

• A study of the potential for a community-wide recruitment effort using the Post-Secondary Recruitment Program. Learning from the examples of the FORD/IARD program (the recruitment program for financial officers and internal auditors), the Accelerated Economist Training Program and the Policy Development and Research Program, the CEE is exploring the possibility of implementing a recruitment program for entry-level evaluators on behalf of the community.

• A preliminary study on an internship program to support the development of new recruits to the evaluation function. Building on the best practices of internship programs already in operation, the CEE is developing an internship program for evaluators, planned as a pilot project for 2002-2003.

• A review of options for a training and development program for mid-career and senior evaluators. The evaluation community requires a coherent and structured approach to training and development to ensure that professional evaluators are acquainted with, and able to employ, the appropriate methodologies and approaches to support ongoing implementation of the Evaluation Policy.
A synopsis of the key reports completed to date, and related to these initiatives, is
available in Annex A.

Key Elements of Year Two Activities (2002-2003)


Based on the foundation work of the first year, the CEE will seek the cooperation of the Evaluation Community in introducing the key elements of the CDS. These elements relate to entry-level recruitment, training and development, as well as to recruitment/retention and training and development for in-career evaluators. The CEE anticipates that, within a two-year time frame, the Strategy and Plan will include the following elements:

• a pilot internship program;

• a mentoring program to ensure knowledge transfer from senior evaluators to junior evaluators, across the federal government;

• generic staffing actions to identify a pre-qualified pool of evaluators to fill intermediate and senior positions;

• community-based networking events, including work groups on aspects of methodology and the application of evaluation practices;

• formal training events;

• good/best practice guides to support the conduct of evaluation to the standards inherent in the Policy; and,

• redevelopment and renewal of the CEE web presence, devoted to information and tools to support evaluators and managers. The initiative may include formal partnering with the Comptrollership Community extranet project. Elements of the CEE web presence will include:
  - training and development opportunities;
  - information on policy implementation; and,
  - evaluation methodology.

Evaluation of the CDS


As a key component of the operations of the CEE, the CDS will be evaluated as part of
the 18-month evaluation of the Policy in 2002-2003, and as an element of the overall
Policy evaluation in 2005-2006.

Annex A
Treasury Board Secretariat
Centre of Excellence for Evaluation

COMMUNITY DEVELOPMENT STRATEGY


Summary of Foundation Reports
INTRODUCTION
In February 2001, the Treasury Board Secretariat (TBS) introduced a revised federal government Evaluation Policy, effective April 1, 2001. Strengthening the evaluation community, and synchronizing its capacity with the demands arising from the recent emphasis on performance measurement and the Comptrollership mode of governance, have been identified as priorities.
The Centre of Excellence for Evaluation was established as part of TBS to serve these
objectives. TBS, the Centre and others have sponsored a number of studies whose aim is to define the demographics, required competencies, and training and development needs of the Evaluation Community, with a view to building community capacity. This Annex contains summaries of the major findings of these studies.

1. Professional Development Activities for the Evaluation Community: Needs Analysis, TBS in partnership with CES, June 12, 2000
A survey was distributed to evaluation and review units in February 2000 to determine the evaluation community's needs for professional development activities. The
sample used in the analysis includes 42 departments and agencies that fall under the TB
evaluation policy and have evaluation units. The response rate was 50%.
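The implied number of responding units (arithmetic on the stated figures; the report gives only the rate):

\[ 42 \times 0.50 = 21 \text{ responding departments and agencies.} \]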
Survey Results: Current Course Offerings: There was a moderate to high interest in all courses currently offered by the CES and TBS. The following lists these courses by respondents' level of interest, from strongest on down:

• Program Evaluation Research Methods

• Performance Management and Performance Indicators for Evaluators

• Performance-based Management

• Essential Skills Series (not in partnership with TBS).

Survey Results: Possible Future Offerings: The number of potential participants suggests some interest in the proposed course offerings, in particular:

• Integrating Evaluation, Audit and Performance Measurement

• Evaluation and Accountability Frameworks.

Other Topics: While respondents suggested additional topics that would be of interest, further study is required as there was no overriding consensus on any one topic.
Recommendations:

• TBS continue to support the offering of the current courses over the immediate term.

• CES and TBS continue to offer opportunities for informal presentations on topics of interest to the community through information sessions.

• CES and TBS identify courses and seminars that most closely match the interests of the respondents, as described in the survey, for publication on the CES web site or in CES mail-out publicity to members.

• TBS and CES review the communication on course offerings to ensure that departments and agencies are fully aware of what is available.

• CES and TBS develop a reliable course calendar and training strategy.

2. The Demographic Profiles of the Internal Audit and Evaluation Communities in the Canadian Federal Public Service, Personnel Psychology Centre, PSC, December 2001
General Demographics: A demographic survey was administered to the Evaluation and
Internal Audit communities of employees located in 58 departments and agencies. The
response rate was estimated to be as high as 61%. The respondents tended to be career
public servants, the majority having more than 12 years of service to the federal
government and being located in the more senior levels of their respective occupational
groups. While five functional groups were identified for profiling analysis, this summary
focuses on the Evaluation group and the Evaluation & Review (E&R) group. The
Evaluation group reported 42.9% of employees with less than 12 years of experience,
about 10% above the norm of the groups included in the study. Some 30.2% of the
Evaluation group reported 12 to 25 years of service and 27% reported more than 25
years of service. Many were new to their current position (38.7% reported less than 1
year, about 12 times the norm), suggesting a recent increase in staffing activity.
Education: Some 64.1% of evaluators have an MA and 7.8% have a PhD. Only 6.2% of
the Evaluation group and 10.9% of the Evaluation and Review group reported having no
university education.
Occupational Groups: The Evaluation group is drawn primarily from the ES occupational group (60.3%), but there are also AS representatives (23.8%). Almost twice as many ES 6s were located in the E&R group compared to the Evaluation group, and there was a broader spread at lower-level ES positions in Evaluation, suggesting some renewal there. Some 50% of the E&R respondents were in ES 6 positions, or EX minus one; in Evaluation, only 30% were at that level.
Previous Functional Groups and Department or Agency: Almost half the
respondents (48.4%) indicated that, in the past, they were not from one of the functional
groups represented in the survey. Some 21.9% of the Evaluation group reported coming from the same functional area, while only 8.7% of the E&R group did (the lowest correspondence between current and past function of all groups in the study). Many reported that their previous position was in the federal government, which is consistent with the finding that
this group is comprised of career civil servants. Only 18.5% of the Evaluation group
spent 10 or more years in their previous position, which confirms more recent mobility in
the Evaluation positions. In addition, only 32% of the current ESs were previous
members of the ES community. Evaluators (ESs) tend to migrate from functions other
than the 6 principal groups in the survey (10.3% came from the PM group).
Eligibility for Retirement: Almost one third of the respondents indicated that they are eligible for retirement within the next 5 years. The Evaluation group reported 11.1% with 2 or fewer years to retirement and 14.3% with 3 to 5 years to retirement; the E&R group reported 17.4% with 2 or fewer years and 13% with 3 to 5 years. This is approximately 4 times the norm.
To maintain the status quo assuming attrition 3 years from now, there would have to be
an estimated 22% increase in the current number of employees for Evaluation.
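The 22% estimate is consistent with simple replacement arithmetic. A minimal reconstruction, assuming roughly 18% of the Evaluation group departs within three years (the 11.1% eligible within 2 years plus about half of the 3-to-5-year band; the report does not spell out this assumption): if a fraction a of current staff leaves, the remainder must grow by a/(1-a) to restore the original complement,

\[ \frac{a}{1-a} = \frac{0.18}{0.82} \approx 0.22. \]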
Plans to Stay or Leave the Government in the Next 3 Years: Results show that 7.9% plan to leave definitively, 15.8% might leave or stay, and 76.2% plan to stay.
Supervisory/managerial responsibilities: Data in this area are weak, as 32% of respondents left the fields blank. Of those who entered data, 44% reported that they did not directly supervise or manage employees; the remainder tended to manage small shops, with 48% managing groups of five or fewer.
Need for Training and Development: Training was identified as a relatively high priority. The Evaluation and E&R groups consider that their needs, both now and for the future, are high and about the same (i.e., little gap between current and future needs), suggesting that they sense a more pervasive need overall.
Sources of Training: The majority (66.3%) ranked government-provided workshops, conferences and seminars as their most common current sources of training. When asked for the best sources of training, government-sponsored or government-provided conferences, workshops and seminars were ranked as the first choice by 46.8% of respondents and as the second choice by 31.4%. The second-best sources of training were those offered by professional organizations, with 21.8% naming the CES as a source of learning in the past two years. It appears that sponsorship for courses offered by professional organizations would be a welcome component of a future plan.

Percent of work effectively learned on the job: Almost half thought that 60% or more of the work could be learned on the job, while the other half felt that less than 60% could be effectively learned on the job.
Training and Development Needs: The Evaluation and E&R groups identified the following priorities:

• Knowledge/understanding of government (legislation, policies, public administration, staffing, etc.), especially of the mechanics of client organizations

• Skill/ability in:
  - General tools, skills and methodologies
  - Data analysis
  - RMAFs
  - Management
  - Interpersonal aspects of work with clients

As can be seen, the hands-on types of needs in the skills and abilities section ranked as
more important overall than the acquisition of knowledge and understanding.
Role of the Centre of Excellence for Evaluation: Respondents' clear expectation was that the Centre of Excellence should lead, performing a centralized function in developing the required capacity, standards and policies, and even in providing training and development services. This would include the education of client departments and the functional community as to their respective roles in this new enterprise. The need for TBS to build capacity was a unanimous response.
Report's Conclusions: While there are indications of recent recruitment activity in small pockets, and some recent staffing activity, the eligibility-to-retire rates combined with the self-reported plans to leave the government suggest a need for strategic HRM action now. As there is not a direct university stream out of which to draw talent, it is suggested that on-the-job training may not fill the demographic gaps rapidly enough to maintain community stability and, at the same time, transmit corporate memory. This may be a situation where the development of corporate recruitment programs along the lines of the AETP or CAP, with some opportunity for a non-management stream, would address the demands in an accelerated and systematic format. The development of talent pools from employees within the government, such as through the AEXDP corporate program, might help to cultivate and diversify the skills of the community in a more horizontal way.


3. Competency Profile for Federal Public Service Evaluation Professionals, Personnel Psychology Centre, November 2001
The Evaluation Profile was modeled on the fourteen competencies contained in the
Profile of Public Service Leadership Competencies and has been adapted to reflect the
specific culture, values, needs, and future goals and challenges of the federal Public
Service Evaluation Community. In all, 355 professionals across the evaluation and audit
communities took part in the survey, representing a response rate of 61%.
Clustering of Competencies by Theme: The 14 evaluation competencies are
organized into five clusters, as follows:
1. Intellectual:
Cognitive Capacity
Creativity
2. Future Building:
Visioning
3. Management:
Action Management
Organizational Awareness
Teamwork
Partnering
4. Relationship:
Interpersonal Relations
Communication
5. Personal:
Stamina and Stress Resistance
Ethics and Values
Personality
Behavioral Flexibility
Self-confidence
For each competency there is a generic description as well as a set of 3 to 5 behavioral
indicators tailored specifically to each of 3 evaluation groups (i.e. junior, intermediate,
and senior). The behavioral indicators are examples of how a particular competency
may manifest itself in concrete behavioral terms.


Importance of Competencies by Level: The following groupings resulted from the information collected and can be used as a guide, augmented by the consideration of specific job/position and departmental needs:
1. Key Competencies Across All Levels:
Cognitive Capacity
Communication
2. Additional Competencies Deemed Important Across All Levels:
Organizational Awareness
Interpersonal Relations
Personality
Action Management
Teamwork
Self-confidence
Behavioral Flexibility
Ethics and Values
3. Competencies Deemed Important for Senior Evaluators:
Creativity
Visioning
Stamina and Stress Resistance
Partnering
Human Resource Management Applications: The Evaluation Profile is intended as a tool; departments may choose to use the complete profile or parts of it, or to tailor it as required for specific departmental needs. For example, managers may apply the competencies to the following human resources management functions:

• Entry-level recruitment

• Training and development

• Performance feedback

• Team building

4. Pre-Implementation Study for a Program Evaluation Internship Initiative, Goss Gilroy Inc., Nov 18, 2001

The objective of this study was to provide information, views and best practices about internships, and to ascertain how these could be used for future recruitment in the area of program evaluation. This work was undertaken as a preliminary step in the development of an Internship Program for Evaluators, work currently underway. The key findings of the study include:

• The activities of program evaluators vary according to level and department. Most, however, are involved in the preparation of frameworks/RMAFs and the front-end and back-end work of evaluations. As program evaluators gain experience, they play an active advisory role for managers and senior management.

• According to the competency profiles, key competencies of program evaluators include cognitive and communication skills, and organizational awareness. Interview results suggest that this means good background knowledge of government and the department, good knowledge of evaluation approaches and techniques, good contract and project management skills, good oral and written communication skills, and good interpersonal skills.

• According to past training experiences, evaluators prefer more hands-on training, with some recipes and how-to type courses.

• There is an expressed need for internships and support for a mentoring approach.

• Internships should incorporate training and mentoring. Training should include a government introduction course/workshop, some technical courses in program evaluation, a project/contract management course, and a course on government communication.

• Mentoring should involve a coordinator, mentors, mentees, and perhaps facilitators. The process should include a workshop for mentors, a formal matching process, an objective-setting process, mentor/mentee and overall group meetings, and exit satisfaction questionnaires. The regional location of interns (NCR, regional offices) should be considered in the planning phase.

• Past experience shows that effective recruitment, support from senior management, appropriate training courses, and good mentoring/support are key success factors in personnel development strategies.

• Many documents and courses have been developed by the PSC and could be adapted for internship and mentoring purposes.

5. Toward a Coordinated Approach for the Post-Secondary Recruitment for Evaluators in the Government of Canada, Jacques Berard, Dec. 10, 2001

This report explores how the Post-Secondary Recruitment Program (PSR) could be used in its present state, or with various modifications, to achieve the aim of building the evaluation community's capacity.
Assumptions for a coordinated approach to PSR:

• Key stakeholders will come to an agreement that a centrally coordinated, inter-departmental recruitment drive is both effective and efficient for them, and that the advantages of such an approach outweigh its initially onerous logistical and administrative disadvantages.

• Departments will agree to free up the necessary resources to develop common screening tools on which they will agree.

• Heads of evaluation will agree to pay into a common pool to conduct recruitment drives even when they may be competing with each other for the best candidates.

• People responsible for hiring evaluation staff will agree to delegate their candidate-screening power to staff of other departments, as 36 departmental screening agents will not be needed on site to select the best candidates.

• Key stakeholders will agree to modify their practices of classifying staff in different occupational groups; harmonization of entry levels for candidates is required.

• Once candidates have been deemed acceptable and placed in the PSC's database, they will turn down other job offers until they are hired by the federal government.


Findings:

Defining the Junior Evaluation Officer:

• The core competencies from the competency profile need to be assessed through the proper tools and incorporated into a generic job description.

• A bachelor's degree is the minimum required, and a Master's degree would be preferable, in one of the following disciplines:
  - Public administration
  - Economics
  - Statistics
  - Social sciences (sociology, political science)
  - Science

• Other selection criteria include:
  - satisfactory performance on the Written Communication Test (WCT) and on the Graduate Recruitment Test (GRT);
  - a BBB level of bilingualism;
  - willingness to travel and/or to relocate to Ottawa; and,
  - upon selection for hiring, obtaining at least an Enhanced Security clearance.

• A generic job description is required. This study outlines the key Universal Classification Standard content of a junior evaluator job description (see pages 20-23 of the study).

Recommendation: The PSC and the CEE collaborate to make a case to Human
Resources offices to harmonize the occupational groupings for newly hired junior
evaluators and to institute an ES 2 entrance level.

How to Make Best Use of the PSR to Select Junior-Level Evaluators:

• Since September 2001, the PSC has operated an electronic process in which departments can interact with the information contained in a database to search for the best candidates. In addition, once a department has officially encumbered its vacant positions, its remaining short-listed candidates can be accessed by other departments for hiring.

Options for a PSR process:

A. The basic, existing PSR. Not recommended, but viable in the short term.

B. Enhanced PSR: the basic PSR plus a test through which candidates must demonstrate their knowledge of evaluation theory, design, methods, and techniques.

C. Enhanced PSR (+): Options A plus B, with a Group Exercise added that allows for additional measurement.

D. Fully Enhanced PSR: Options A, B and C, plus a One-on-One Simulation Interaction.

E. Complete Assessment: Options A, B, C and D, plus a Structured Interview.

F. Integrated Assessment Centre: blends and integrates all instruments of Option E into one package.

Note that all options except Option A require the collaboration and coordination of departments in developing tools, reviewing and accepting them, and administering them to candidates, as well as a correction and appeals process.

Recommendations:

• PSR campaigns be run twice a year (late fall for early spring hiring, and winter for early summer hiring), with a view to filling available posts rather than creating a roster of available on-call candidates.

• The following step-wise approach be used to implement a coordinated PSR for the evaluation community:

  2001-2002: Use the basic PSR and draw candidates from the overflow of other departments to fill the available posts.

  2002-2003:
  - Spring campaign: Develop an inter-departmental agreement on the assessment instruments and on administration arrangements for Option F, and implement Option B.
  - Fall campaign: Implement Option B plus the Structured Interview of Option E, and develop plans for the implementation of Option F in 2003-2004 and beyond, for a sound and complete integrated candidate assessment process in the future.

  2003-2004 and following: Implement and pre-test Option F.

  2004-2005: Amend Option F and implement it as the community's standard.
Conclusions:

• Several additional tests are needed to turn the PSR into the powerful recruitment tool it could be for the evaluation community.

• One-time costs to departments vary from $12,500 to $74,250, depending on the level of complexity and rigour of the PSR process.

• Based on extrapolations of the number of positions to be filled between fiscal years 2000-2001 and 2004-2005, the estimated cost to the 36 departments in need of evaluators is a little more than $5,000 each (direct costs to be paid to the PSC) to put in place a rigorous screening process and hire the estimated 116 needed individuals (see the rough arithmetic after this list).

• Money is not really the important issue. The challenges are:
  - common selection criteria;
  - a generic job description;
  - common occupational group classification of the new staff;
  - a test correction and appeals process; and,
  - the composition and authority of the interviewing teams working alongside PPC psychologists.
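As a rough check on the per-department figure above (an illustrative reading of the stated numbers, not a calculation given in the study):

\[ 36 \times \$5{,}000 \approx \$180{,}000 \text{ in total direct costs}, \qquad \frac{\$180{,}000}{116} \approx \$1{,}550 \text{ per hire.} \]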
