LEVEL-3 EVALUATION REPORT
For the TESP Training Program
January 2017

M&E Unit

Tertiary Education Support Program


Higher Education Commission
Islamabad

1 List of Abbreviations

HEC     Higher Education Commission, Islamabad
HEI     Higher Education Institution
IBA     Institute of Business Administration, Karachi
IMS     Institute of Management Sciences, Peshawar
IP      Implementation Partner
TESP    Tertiary Education Support Program
Table of Contents

1 List of Abbreviations
2 List of Figures
3 List of Tables
Executive Summary
4 Introduction
    4.1 About Tertiary Education Support Project
    4.2 About the Training Program
    4.3 About the Evaluation
    4.4 Objectives of the Level-3 Evaluation
    4.5 Specific Objectives of the Evaluation
    4.6 Methodology
    4.7 Limitations
5 Key Findings
    5.1 Profiles of the Respondents
        5.1.1 Number of training participants and respondents to the survey
        5.1.2 Number of training participants and respondents to the survey, IP wise
        5.1.3 Grade wise profile of the respondents
        5.1.4 Role wise profile of the respondents
    5.2 Relevance of Training to Work
        5.2.1 Overall Relevance of Training to Work
        5.2.2 Relevance of Training, IP wise
        5.2.3 Comparison of IPs on weighted average score about relevance of training
    5.3 Application of New Learning
        5.3.1 Overall Application of New Learning
        5.3.2 Application of new learning, IP wise
        5.3.3 Comparison of IPs on weighted average score about application of learning
    5.4 Improved Performance as a Result of New Learning
        5.4.1 Overall Improved Performance
        5.4.2 Improved performance as a result of new learning, IP wise
        5.4.3 Comparison of IPs on weighted average score about improved performance as a result of training
        5.4.4 Examples of Improved Performance
    5.5 Ability to Transfer Learning from the Training to Others
        5.5.1 Ability to transfer learning from the training to others, IP wise
        5.5.2 Ability to transfer learning from the training to others, training theme wise
        5.5.3 Description of transfer of learning
    5.6 Major Enablers in Applying the New Learning to Workplace
    5.7 Major Barriers that Hampered the Application of New Learning to Workplace
    5.8 Comments and Suggestions by the Respondents
6 Conclusion
7 Recommendations
8 Annexure A

2 List of Figures

Figure 1: Grade wise profile of the respondents
Figure 2: Role wise profile of the respondents
Figure 3: Role wise profile of the respondents, disaggregated by training theme
Figure 4: Rating of relevance of training to work
Figure 5: Comparison of IPs on weighted average score about relevance of training
Figure 6: Rating of ability to apply learning to work
Figure 7: Comparison of IPs on weighted average score about application of learning
Figure 8: Rating of improved performance as a result of the training
Figure 9: Comparison of IPs on weighted average score about improved performance as a result of training
Figure 10: Ability to transfer learning from the training to others
Figure 11: Ability to transfer learning from the training to others, IP wise
Figure 12: Ability to transfer learning from the training to others, training theme wise
Figure 13: Major enablers in applying the new learning to workplace
Figure 14: Major barriers that hampered the application of new learning to workplace


3 List of Tables

Table 1: Number of training participants and respondents to the survey
Table 2: Number of training participants and respondents to the survey, trained at IBA
Table 3: Number of training participants and respondents to the survey, trained at IMS
Table 4: Weighted score on relevance of training conducted by IBA
Table 5: Weighted score on relevance of training conducted by IMS
Table 6: Comparison of IPs on weighted average score about relevance of training
Table 7: Weighted score on the ability of respondents to apply the new learning from the training conducted by IBA
Table 8: Weighted score on the ability of respondents to apply the new learning from the training conducted by IMS
Table 9: Weighted score on improved performance as a result of the training, conducted by IBA
Table 10: Weighted score on improved performance as a result of the training, conducted by IMS


Executive Summary
This report presents the Level-3 Evaluation of the training program conducted by the TESP Secretariat for the HEC, Islamabad. The training program was conducted under eight thematic areas. The target participants of the training were administrative staff members of HEIs from more than 70 public sector universities of Pakistan. A total of 1735 participants attended the training program. Most of the participants belonged to the higher and middle management of the HEIs, holding portfolios such as Vice Chancellor, Pro-Vice Chancellor, Registrar, Treasurer, Controller of Examinations, Dean, Director, Additional Director, Deputy Director, Administrator, Assistant Director, Assistant Treasurer and Librarian. Some of the participants were faculty members of the HEIs. Approximately 1100 participants were contacted through email, and 534 participants responded to the online questionnaire. 20% of the respondents belonged to the higher management (grade 20 and above) and 76% to the middle management (grades 17-19).
Overall, the evaluation showed very encouraging results for the training. 82% of the respondents found the training highly relevant¹ to their job assignments. 59% of the respondents highly agreed² that they were able to apply their new learning at their workplace. 63% of the respondents showed high agreement with having improved their performance at the workplace as a result of the training, and quoted a number of examples of how their performance had improved. 84% of the respondents reported that they were able to transfer their new learning to others. Most of the participants identified their colleagues as the biggest support in applying the new learning to their work. The lack of financial, technological and human resources was identified as the biggest barrier to applying the new learning at work.
The participants emphasized continuing such training programs in the future and urged that they be conducted at least once every six months for all administrative staff of HEIs, a group previously overlooked in capacity building programs.
The evaluation finds the training program very successful and recommends continuing it in the future, with emphasis on improving the nomination process to attract the most relevant participants. The evaluation also recommends monitoring and evaluating both the training process and its outcomes to ensure better performance and achieve optimal results.

¹ Highly relevant, here, means that the participants marked option 4 or 5 on a Likert scale of 0-5, where 0 meant Not Relevant and 5 meant Very Relevant.
² Highly agreed, here, means that the participants marked option 4 or 5 on a Likert scale of 0-5, where 0 meant Not at All and 5 meant Very Much. For further details and data break-up, please see inside the report.


4 Introduction
4.1 About Tertiary Education Support Project
The Tertiary Education Support Project (TESP) has been supported by the World Bank to assist the Government of Pakistan in the implementation of the Medium-Term Development Framework for Higher Education, 2011-15 (MTDF-HE II). The aim of the project is to support the government's reforms in tertiary education under two components: Component 1 - Program Financing, and Component 2 - Capacity Building, Policy Design and Monitoring and Evaluation (M&E). The overall objectives of the project are to:
i.   Improve fiscal sustainability and expenditure effectiveness;
ii.  Enhance quality and relevance of teaching, learning and research;
iii. Improve equitable access to tertiary education; and
iv.  Strengthen governance and management of HEIs.

In order to coordinate and provide operational support for carrying out the different activities of the Project, a TESP Secretariat has been established at the Higher Education Commission (HEC). The TESP Secretariat coordinates with various focal persons from different organs of the HEC and reports the implementation progress to the World Bank on behalf of the HEC.

4.2 About the Training Program


Capacity building of the staff of HEIs/HEC has been one of the most important and prominent assignments of TESP. As part of its capacity building plan, TESP-HEC hired the professional services of two implementation partners (IPs) from leading public sector institutions: the Institute of Management Sciences (IMS), Peshawar, and the Institute of Business Administration (IBA), Karachi, for imparting training to the staff of HEIs/HEC. IMS conducted trainings from March 9, 2015 to June 15, 2015, and IBA from April 28, 2015 to September 27, 2015. Together, the two institutions trained 1735 participants from HEC/HEIs on the following eight thematic areas:
i.    Leadership and Change Management
ii.   Human Resource Management
iii.  Information Technology Management
iv.   Financial Management
v.    Intellectual Property Rights
vi.   Developing and Assessing Proposals
vii.  Project Planning Implementation and Evaluation
viii. Quality Assurance in Higher Education

4.3 About the Evaluation


During the trainings, the TESP Secretariat provided technical assistance to the Implementation Partners (IPs) to conduct evaluation of the training, building upon the Kirkpatrick evaluation model (for details, please visit http://www.kirkpatrickpartners.com/OurPhilosophy/TheKirkpatrickModel). Evaluation of the training program was conducted on two levels during the training:

Level-1: Reaction of participants to training facilitation, delivery, course contents, etc.
Level-2: Learning of participants, measured through pre- and post-tests.


Prior to the start of its next phase of capacity building of HEIs in 2016-17, the TESP Secretariat intended to engage the short-term services of a third-party individual consultant to conduct the Level-3 evaluation of the above mentioned training programs. However, the senior management of HEC did not agree to the proposal. Therefore, the evaluation was conducted in-house by the TESP M&E team.

4.4 Objectives of the Level-3 Evaluation


The overall purpose of the evaluation is to know whether the learning from the training has translated into improved actions and behavior at the workplace. The results of the survey will be shared internally with the TESP team in the form of infographics and summarized tables. The findings may be useful for the team in improving strategies for future training programs.

4.5 Specific Objectives of the Evaluation

The specific objectives of the evaluation are as follows:

i.   To assess the relevance of the training contents to the requirements of the HE sector and the professional needs of the participants.
ii.  To assess the effectiveness of the training in terms of achieving its outcomes.
iii. To assess the enablers of, and barriers to, sustainably implementing the learning from the trainings.
iv.  To explore the way forward to improve the relevance and effectiveness of future training programs.


4.6 Methodology
In order to obtain answers to the above-cited objectives, an online survey questionnaire was designed. The survey questionnaire was pilot-tested and important changes were made to it before the final launch of the survey. The evaluation team contacted the participants through email and sent them the link to the Google survey form (attached as Annexure A). A few days after launching the survey, the participants were reminded through email and telephone to fill out the survey form. The survey was open for responses for 14 days. The training was attended by 1735 participants. A total of 534 responses were received; these were analyzed through pivot tables to disaggregate the data by training theme and by IP. There were three open-ended questions to obtain qualitative responses, which were thematically analyzed.
The draft report was shared with the program implementation team. Their queries for further analysis and disaggregation of the data were noted and responded to. Also, their general feedback to improve the usability of the contents was incorporated into the final report.

4.7 Limitations
The study encountered certain limitations. First, not all participants could be contacted. The M&E team had contact information for around 1600 of the 1735 total participants. A data cleaning process was carried out before contacting the participants: some participants had not provided their email addresses, and some had attended multiple trainings, so duplicate email addresses were removed. Finally, around 1100 participants could be contacted through email. This resulted in considerable missing data for analyzing the effects of the training. Second, among those who were contacted, 534 participants responded. This introduces a self-selection bias: there is always a possibility that those who self-selected to respond to the questionnaire performed better than those who did not. Third, as per the initial activity plan, the Level-3 Evaluation was to be conducted by a third party within a maximum of six months of the training. However, the administrative process for approval took considerable time, and the evaluation was finally conducted in-house about one year after the training. The delay might have made it difficult for the participants to recall the training contents and their relevance to their work.


5 Key Findings
5.1 Profiles of the Respondents


5.1.1 Number of training participants and respondents to the survey


The following table lists the eight themes of the training program and the number of participants in each. It also shows the number of respondents and the response rate (respondents as a percentage of participants):
Table 1: Number of training participants and respondents to the survey

Training Theme | Total Participants | Total Respondents | Percentage
a) Leadership and Change Management | 275 | 94 | 34%
b) Human Resource Management | 235 | 71 | 30%
c) Information Technology Management | 216 | 90 | 42%
d) Financial Management | 227 | 54 | 24%
e) Intellectual Property Rights | 189 | 76 | 40%
f) Developing and Assessing Proposals | 229 | 51 | 22%
g) Project Planning Implementation and Evaluation | 259 | 49 | 19%
h) Quality Assurance in Higher Education | 105 | 49 | 47%
Total | 1735 | 534 | 31%

A total of 1735 participants attended the training programs, organized under eight thematic areas. TESP contacted around 1100 participants with valid email addresses, and 534 participants responded to the survey, making the overall response rate 31%. This is very encouraging and helpful in understanding the usefulness of the training program in improving the performance of the participants at the workplace. The response rate under each thematic area is also quite encouraging: every theme has at least 49 respondents, and the lowest response rate is 19%.


5.1.2 Number of training participants and respondents to the survey, IP wise


In order to analyze and disaggregate the results IP wise, it was important to ensure that there were a sufficient number of respondents trained at each of the IPs. Tables 2 and 3 show the IP wise respondents of the survey, in relation to the number of participants.
Table 2: Number of training participants and respondents to the survey, trained at IBA

Training Theme | Total Participants | Total Respondents | Percentage
a) Leadership and Change Management | 130 | 44 | 34%
b) Human Resource Management | 90 | 38 | 42%
c) Information Technology Management | 111 | 39 | 35%
d) Financial Management | 102 | 31 | 30%
e) Intellectual Property Rights | 95 | 33 | 35%
f) Developing and Assessing Proposals | 100 | 21 | 21%
g) Project Planning Implementation and Evaluation | 100 | 18 | 18%
h) Quality Assurance in Higher Education | 0 | 0 | 0%
Total | 728 | 224 | 31%

Table 3: Number of training participants and respondents to the survey, trained at IMS

Training Theme | Total Participants | Total Respondents | Percentage
a) Leadership and Change Management | 145 | 50 | 34%
b) Human Resource Management | 145 | 33 | 23%
c) Information Technology Management | 105 | 51 | 49%
d) Financial Management | 125 | 23 | 18%
e) Intellectual Property Rights | 94 | 43 | 46%
f) Developing and Assessing Proposals | 129 | 30 | 23%
g) Project Planning Implementation and Evaluation | 159 | 31 | 19%
h) Quality Assurance in Higher Education | 105 | 49 | 47%
Total | 1007 | 310 | 31%

As the training was imparted by two different IPs, it was important to compare and analyze the effectiveness of their respective training programs. Therefore, the respondent data has been disaggregated IP wise. Although the two IPs trained different numbers of participants, their response rates were the same, i.e. 31%.


5.1.3 Grade wise profile of the respondents

[Figure 1: Grade wise profile of the respondents (bar chart of the share of respondents in each grade, from grade 14 to grade 21)]

It is encouraging to find that 20% of the respondents belonged to the higher management (grade 20-21)
of the HEIs and 76% of the respondents belonged to the middle management (Grade 17-19) of HEIs.


5.1.4 Role wise profile of the respondents

[Figure 2: Role wise profile of the respondents (Admin: 79%, Faculty: 21%)]

The training program was primarily meant for the administrative staff of the HEIs. However, it is surprising that as many as 21% of the respondents were faculty members.


The following chart shows the data disaggregated by training theme for administrative staff and faculty members.
[Figure 3: Role wise profile of the respondents, disaggregated by training theme (share of administrative staff and faculty members in each training theme)]

The chart above reveals that the training on Intellectual Property Rights had 40% of its participants from the faculty and the training on Developing and Assessing Proposals had 61% of its participants from the faculty. The optimum level of participation from administrative staff could not be attracted to these two training themes because the nomination was left to the higher management of the relevant HEIs (including the VC and the Registrar). The nomination process needs to be improved so that only the most relevant participants are attracted to the training.


5.2 Relevance of Training to Work


5.2.1 Overall Relevance of Training to Work


The training participants were asked to rate the relevance of the training to their work on a scale of 0-5, where 0 meant Not Relevant and 5 meant Very Relevant.

In response, the participants gave the following ratings:


[Figure 4: Rating of relevance of training to work (distribution of ratings on the 0-5 scale)]

An overwhelming majority (82%) of the respondents found the training highly relevant to their job assignments, marking option 4 or 5. This shows the overall positive perception of the participants about the relevance of the training.
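As a minimal sketch of how the share of "highly relevant" ratings (option 4 or 5) is obtained from individual responses, the example below uses a small hypothetical list of ratings rather than the actual survey data.

```python
# Hypothetical ratings on the 0-5 relevance scale (not the actual survey data).
ratings = [5, 4, 4, 3, 5, 2, 4, 5, 1, 4]

# "Highly relevant" in this report means a rating of 4 or 5.
highly_relevant_share = sum(1 for r in ratings if r >= 4) / len(ratings)
print(f"{highly_relevant_share:.0%}")  # 70% for this hypothetical sample
```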


5.2.2 Relevance of Training, IP wise


In order to evaluate the relevance of the training conducted by the two different IPs, it was important to disaggregate the data. Therefore, the ratings for each training were placed in the following tables and their weighted sum was calculated by multiplying the number of responses at each rating level by the corresponding rating [for example, the weighted sum for the training on Leadership and Change Management was calculated as (1x0=0)+(0x1=0)+(0x2=0)+(1x3=3)+(19x4=76)+(23x5=115)=194]. The weighted sum was then divided by the sum of responses to obtain the average score.
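The same calculation is sketched below in Python, using the rating counts from the worked example above (1, 0, 0, 1, 19 and 23 responses at rating levels 0 to 5 for the IBA training on Leadership and Change Management); the function name and data layout are illustrative only.

```python
# Minimal sketch of the weighted average score used in Tables 4-10.
def weighted_average(counts_by_rating):
    """counts_by_rating[r] = number of respondents who marked rating r (0-5)."""
    weighted_sum = sum(rating * count for rating, count in enumerate(counts_by_rating))
    total = sum(counts_by_rating)
    return weighted_sum, total, weighted_sum / total

# Worked example from the text: IBA, Leadership and Change Management.
counts = [1, 0, 0, 1, 19, 23]
w_sum, n, avg = weighted_average(counts)
print(w_sum, n, round(avg, 1))  # 194 44 4.4
```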
Table 4: Weighted score on relevance of training conducted by IBA

Type of training | Weighted Sum | Sum of Responses | Average Score
a) Leadership and Change Management | 194 | 44 | 4.4
b) Human Resource Management | 164 | 38 | 4.3
c) Information Technology Management | 150 | 39 | 3.8
d) Financial Management | 131 | 31 | 4.2
e) Intellectual Property Rights | 135 | 33 | 4.1
f) Developing and Assessing Proposals | 82 | 21 | 3.9
g) Project Planning Implementation and Evaluation | 74 | 18 | 4.1
h) Quality Assurance in Higher Education | N/A | N/A | N/A
Total | 930 | 224 | 4.2

Table 5: Weighted score on relevance of training conducted by IMS

Type of training | Weighted Sum | Sum of Responses | Average Score
a) Leadership and Change Management | 199 | 50 | 4.0
b) Human Resource Management | 133 | 33 | 4.0
c) Information Technology Management | 222 | 51 | 4.4
d) Financial Management | 94 | 23 | 4.1
e) Intellectual Property Rights | 172 | 43 | 4.0
f) Developing and Assessing Proposals | 116 | 30 | 3.9
g) Project Planning Implementation and Evaluation | 136 | 31 | 4.4
h) Quality Assurance in Higher Education | 221 | 49 | 4.5
Total | 1293 | 310 | 4.2


5.2.3 Comparison of IPs on weighted average score about relevance of training


It is interesting to find that the two IPs had the same overall score i.e. 4.2 on satisfaction from the
relevance of training. However, as the devil is in the detail, the following comparison reveals the
differences of the two IPs on satisfaction levels on different training themes.

Table 6: Comparison of IPs on weighted average score about relevance of training

Training Theme | IBA | IMS
a) Leadership and Change Management | 4.4 | 4.0
b) Human Resource Management | 4.3 | 4.0
c) Information Technology Management | 3.8 | 4.4
d) Financial Management | 4.2 | 4.1
e) Intellectual Property Rights | 4.1 | 4.0
f) Developing and Assessing Proposals | 3.9 | 3.9
g) Project Planning Implementation and Evaluation | 4.1 | 4.4
h) Quality Assurance in Higher Education | N/A | 4.5

[Figure 5: Comparison of IPs on weighted average score about relevance of training (weighted satisfaction score by training theme, charted from Table 6)]

The participants trained at IBA on Information Technology Management rated the relevance of the training to their work lower than the participants of the other trainings conducted at the same IP. The qualitative responses endorse this, as some participants of the IT training at IBA Karachi complained that the training was of a very basic level and not relevant to their work.
Similarly, the participants trained at both IBA and IMS on Developing and Assessing Proposals rated the relevance of the training to their work lower than the participants of the other trainings. This is further explained by the qualitative responses, where some respondents complained that the training on Developing and Assessing Proposals was too basic for professionals already engaged in proposal development.


5.3 Application of New Learning


5.3.1 Overall Application of New Learning


The participants were asked to rate their ability to apply the new learning from the training to their work on a scale of 0-5, where 0 meant Not at All and 5 meant Very Much. The participants marked their ratings as detailed below:

[Figure 6: Rating of ability to apply learning to work (distribution of ratings on the 0-5 scale)]

59% of the respondents highly agreed (by marking option 4 or 5) that they were able to apply their new learning from the training at their workplace.


5.3.2 Application of new learning, IP wise


As explained in the previous section, to evaluate the ability of respondents to apply the new learning from the trainings conducted by the two different IPs, it was important to disaggregate the data. Therefore, the ratings for each training were placed in the following tables and their weighted sum was calculated by multiplying the number of responses at each rating level by the corresponding rating. The weighted sum was then divided by the sum of responses to obtain the average score. The following tables explain the process:

Table 7: Weighted score on the ability of respondents to apply the new learning from the training conducted by IBA

Type of training | Weighted Sum | Sum of Responses | Average Score
a) Leadership and Change Management | 163 | 44 | 3.7
b) Human Resource Management | 142 | 38 | 3.7
c) Information Technology Management | 138 | 39 | 3.5
d) Financial Management | 118 | 31 | 3.8
e) Intellectual Property Rights | 117 | 33 | 3.5
f) Developing and Assessing Proposals | 71 | 21 | 3.4
g) Project Planning Implementation and Evaluation | 65 | 18 | 3.6
h) Quality Assurance in Higher Education | N/A | N/A | N/A
Total | 814 | 224 | 3.6

Table 8: Weighted score on the ability of respondents to apply the new learning from the training conducted by IMS

Type of training | Weighted Sum | Sum of Responses | Average Score
a) Leadership and Change Management | 175 | 50 | 3.5
b) Human Resource Management | 116 | 33 | 3.5
c) Information Technology Management | 181 | 51 | 3.5
d) Financial Management | 79 | 23 | 3.4
e) Intellectual Property Rights | 140 | 43 | 3.3
f) Developing and Assessing Proposals | 97 | 30 | 3.2
g) Project Planning Implementation and Evaluation | 117 | 31 | 3.8
h) Quality Assurance in Higher Education | 182 | 49 | 3.7
Total | 1087 | 310 | 3.5


5.3.3 Comparison of IPs on weighted average score about application of learning

Training Theme | IBA | IMS
a) Leadership and Change Management | 3.7 | 3.5
b) Human Resource Management | 3.7 | 3.5
c) Information Technology Management | 3.5 | 3.5
d) Financial Management | 3.8 | 3.4
e) Intellectual Property Rights | 3.5 | 3.3
f) Developing and Assessing Proposals | 3.4 | 3.2
g) Project Planning Implementation and Evaluation | 3.6 | 3.8
h) Quality Assurance in Higher Education | N/A | 3.7

[Figure 7: Comparison of IPs on weighted average score about application of learning (weighted score by training theme, charted from Tables 7 and 8)]

When the data is further analyzed by thematic area and IP wise, there is a clear indication that the training provided by IBA on Financial Management received the highest satisfaction from the respondents. Also, IMS received relatively lower satisfaction from the participants for the trainings on Intellectual Property Rights and on Developing and Assessing Proposals; on the latter, IBA also received relatively lower satisfaction.


5.4 Improved Performance as a Result of New Learning


5.4.1 Overall Improved performance


The participants were asked to rate their improved performance as a result of the training on a Likert scale of 0-5, where 0 meant Not at All and 5 meant Very Much.

[Figure 8: Rating of improved performance as a result of the training (distribution of ratings on the 0-5 scale)]

Overall, 63% of the respondents showed high agreement with having improved their performance at the workplace as a result of the training, by marking option 4 or 5.


5.4.2 Improved performance as a result of new learning, IP wise


Table 9: Weighted score on improved performance as a result of the training, conducted by IBA

Type of training | Weighted Sum | Sum of Responses | Average Score
a) Leadership and Change Management | 165 | 44 | 3.8
b) Human Resource Management | 145 | 38 | 3.8
c) Information Technology Management | 143 | 39 | 3.7
d) Financial Management | 123 | 31 | 4.0
e) Intellectual Property Rights | 115 | 33 | 3.5
f) Developing and Assessing Proposals | 67 | 21 | 3.2
g) Project Planning Implementation and Evaluation | 67 | 18 | 3.7
h) Quality Assurance in Higher Education | N/A | N/A | N/A
Total | 825 | 224 | 3.7

Table 10: Weighted score on improved performance as a result of the training, conducted by IMS

Type of training | Weighted Sum | Sum of Responses | Average Score
a) Leadership and Change Management | 172 | 50 | 3.4
b) Human Resource Management | 122 | 33 | 3.7
c) Information Technology Management | 194 | 51 | 3.8
d) Financial Management | 83 | 23 | 3.6
e) Intellectual Property Rights | 154 | 43 | 3.6
f) Developing and Assessing Proposals | 94 | 30 | 3.1
g) Project Planning Implementation and Evaluation | 115 | 31 | 3.7
h) Quality Assurance in Higher Education | 186 | 49 | 3.8
Total | 1120 | 310 | 3.6


5.4.3 Comparison of IPs on weighted average score about improved performance as a result of training

Training Theme | IBA | IMS
a) Leadership and Change Management | 3.8 | 3.4
b) Human Resource Management | 3.8 | 3.7
c) Information Technology Management | 3.7 | 3.8
d) Financial Management | 4.0 | 3.6
e) Intellectual Property Rights | 3.5 | 3.6
f) Developing and Assessing Proposals | 3.2 | 3.1
g) Project Planning Implementation and Evaluation | 3.7 | 3.7
h) Quality Assurance in Higher Education | N/A | 3.8

[Figure 9: Comparison of IPs on weighted average score about improved performance as a result of training (weighted satisfaction score by training theme, charted from Tables 9 and 10)]

Further probing of the data, thematically and IP wise, reveals that the participants of the training on Developing and Assessing Proposals reported relatively lower satisfaction. This holds true for both IPs. The highest satisfaction was reported by the participants of the training on Financial Management at IBA. These results are consistent with the previous comparison of the IPs on the ability of the participants to apply the learning from the training. In the following section, the qualitative data further confirms and explains the findings of the figure above.


5.4.4 Examples of Improved performance


One of the important aspects of the evaluation was its qualitative data collection to thoroughly
understand the quantitative data. Thematic analysis of the qualitative data reveals the following
important findings, arranged according to the training themes.
5.4.4.1 a) Leadership and Change Management
Improved leadership: changed from a boss to a leader.
Improved emotional intelligence: improved relationship with upper management, peers and
subordinates.
Was able to motivate the subordinates.
Took extra assignment involving leadership role at work.
5.4.4.2 b) Human Resource Management
Improved recruitment process.
Improved team work.
Improved motivation of employees: introduced the Best Employee Award.
Developed job descriptions for various positions.
Developed SOPs for various assignments.
Strengthened the training and development process of employees.
Improved performance appraisal mechanism.

5.4.4.3 c) Information Technology Management


Contributed towards adoption of IT based environment, LMS, CMS, development of IT
infrastructure, building cloud structure, networking, establishment of CISCO academy,
automation of IT support system and promotion of IT culture.
5.4.4.4 d) Financial Management
Improved asset management system by maintaining inventory.
Developed SOPs for financial management.
Improved accounting and record keeping and partially implemented double entry book keeping
system.
Improved the procurement mechanism.
Invested university assets in more profitable ventures.
5.4.4.5 e) Intellectual Property Rights
Improved awareness regarding HEC Plagiarism Policy
Helped in eliminating plagiarism and promoting ethical research.
Provided assistance to file patents, submit copyrights applications, register trademarks and
protect intellectual property rights.
Promoted the use of Turnitin.
Helped in drafting the university policy on intellectual property rights.
5.4.4.6 f) Developing and Assessing Proposals
Enabled to produce proposals for research and development projects
Improved capacity to review and assess research proposals


5.4.4.7 g) Project Planning Implementation and Evaluation


Improved capacity for project planning, implementation and reporting on various formats (PC-I to PC-V).
Enhanced capacity for decision making, management of work plan and cash plan and designing
the model for results based management (RBM).
5.4.4.8 h) Quality Assurance in Higher Education
Improved implementation of quality assurance instruments: compiled Self-Assessment Reports
(SARs) for various programs, developed KPIs, included learning outcomes in the course outlines
and teaching plans.
Improved ranking of the QEC and the HEI.


5.5 Ability to Transfer Learning from the Training to Others


The participants were asked whether they were able to pass on their learning to others. In response, 84% of the participants responded in the affirmative.

[Figure 10: Ability to transfer learning from the training to others (Yes: 84%, No: 16%)]


5.5.1 Ability to transfer learning from the training to others, IP wise


The question of transfer of learning was further analyzed IP wise. However, the participants of the two training venues reported no significant difference in their ability to transfer their learning to others, as shown in the figure below.

[Figure 11: Ability to transfer learning from the training to others, IP wise (IBA, Karachi: Yes 85%, No 15%; Margalla Hotel, Islamabad: Yes 83%, No 17%)]


5.5.2 Ability to transfer learning from the training to others, training theme wise
Furthermore, the ability of the participants to transfer learning was also analyzed training theme wise.

[Figure 12: Ability to transfer learning from the training to others, training theme wise (percentage of participants answering Yes in each theme)]

The participants of the training on Human Resource Management reported the highest ability to transfer their learning to others. On the other hand, the participants of the training on Developing and Assessing Proposals reported the lowest ability to transfer their learning to others. This comes as no surprise when the finding is compared with the previous two sections, where the participants reported the lowest ratings and satisfaction for the training on Developing and Assessing Proposals.


5.5.3 Description of transfer of learning


To probe the ability of the participants to pass on their learning to others, the respondents were asked to briefly explain how they were able to transfer their learning. The following points give an overview of how the TESP training participants passed on their skills and knowledge to their colleagues, faculty members, seniors, juniors, students, fellow researchers, etc.
1. Participants shared that they were able to transfer skills and knowledge to others through different modes, including the following:

Sharing of training material and notes
Discussions
Guidance
Orientation of newly inducted staff
Short presentations
Making the learning part of routine working practices
Making the learning part of classroom lectures and teaching practices
Briefings
Formal training workshops
Seminars
Symposiums; and
On-the-job assistance and motivation, etc.

2. Some best practices were also observed in the respondents' responses, as below:

Addition of a new unit on Quality Assurance in Higher Education to the B.Ed. (Hons.) program
Inclusion of an exclusive session on Intellectual Property Rights in Research Methodology workshops
Shifting from a single-entry to a double-entry system in accounting management practices
Developing an inventory management system, which was non-existent before the training
Increased level of interest in developing new project proposals


5.6 Major Enablers in Applying the New Learning to Workplace


The participants were asked about the major support they had in applying the new learning to their work. The participants had the option to select multiple enablers in response. The following figure shows the major enablers (number of respondents selecting each):
[Figure 13: Major enablers in applying the new learning to workplace (My colleagues: 297; My supervisor: 144; Availability of human resource: 114; Availability of technological resource: 92; Not applicable, i.e. not able to apply the new learning: 58; Availability of financial resource: 53)]

Most of the participants identified their colleagues as the biggest support in applying the new learning to their work.


5.7 Major Barriers that Hampered the Application of New Learning to Workplace


The participants were asked about the major barriers that hampered the application of their new learning to their job assignments. The participants had the option to select multiple barriers in response. The following graph shows the major barriers (number of respondents selecting each):
[Figure 14: Major barriers that hampered the application of new learning to workplace (Lack of financial resource: 167; Not applicable, i.e. faced no barriers: 147; Lack of technological resource: 139; Lack of human resource: 132; My supervisor: 42; My colleagues: 20)]

The lack of financial, technological and human resources was identified as the biggest barrier to applying the new learning at work.


5.8 Comments and Suggestions by the Respondents


In addition to the specific questions about the training, the participants were also invited to give open comments and suggestions to improve future training programs. Their responses have been thematically analyzed and arranged according to the training cycle.

Training needs assessment (TNA) should be conducted systematically.
The nomination process should be improved to ensure that only relevant staff attend the training.
The training outlines should be shared with the participants in advance.
The trainees should be arranged in homogeneous groups with similar levels of skills and knowledge.
The trainers must have relevant practical experience in their area of expertise.
Adopt the case study method and develop local case studies for the training.
Discuss real-time scenarios.
The examples discussed in the training should be relevant to the university context, not to the corporate sector.
Discuss relevant success stories and failures.
The training should be more practical than theoretical and should add value to the knowledge and skills of the professionals at their workplace.
The training should be more interactive, and more time should be allocated for discussion.
The duration of the daily sessions should be reduced and the number of training days increased.
Provide international exposure and foreign training.
Foreign trainers should be invited to share their experiences and best practices.
Conduct field visits to other institutions to demonstrate best practices.
Make arrangements to cascade the training at the university level.
Such trainings should be arranged more frequently, at least twice a year.
The learning of each day should be measured.
Level-3 evaluation is irrelevant if a participant does not hold an administrative position.
Future trainings should be organized on specific issues related to university management, such as the admission process, the examination process and the Information Technology Infrastructure Library (ITIL).


6 Conclusion
Overall, the evaluation finds the training program very successful, based on the feedback of the respondents. 82% of the respondents found the training highly relevant to their job assignments. 59% of the respondents highly agreed that they were able to apply their new learning at their workplace. 63% of the respondents showed high agreement with having improved their performance at the workplace as a result of the training, and quoted a number of examples of how their performance had improved. 84% of the respondents reported that they were able to transfer their new learning to others. Most of the participants identified their colleagues as the biggest support in applying the new learning to their work. The lack of financial, technological and human resources was identified as the biggest barrier to applying the new learning at work.
Other than analyzing the overall data, the evaluation also separately analyzed the ratings of the individual training themes conducted by the two IPs. It turns out that the respondents reported a lower level of relevance for the training on IT Management conducted by IBA. Similarly, the respondents gave lower ratings on relevance, improved performance and ability to transfer learning to others for the training on Developing and Assessing Proposals, conducted by both IBA and IMS. The qualitative data confirms the above, as some participants considered these trainings too basic for professionals already engaged in proposal development and IT management.
The participants appreciated TESP for organizing the training program. Many of the participants requested that such training programs be held more frequently, at least every six months. This underscores the importance of the training program and the need to continue it in the future.
The evaluation noted that the nomination process has room for improvement. The intended participants were the higher and middle management of HEIs. However, 21% of the respondents were faculty members and 4% of the respondents were even below grade 17.


7 Recommendations
The nomination process needs to be improved for future training programs. The nominating authority should be clearly informed about the training contents and the intended participants so that only the intended and relevant administrative staff are trained by TESP.
Contact information of the training participants, including email address and phone number, is important for follow-up and for assessing the utility of the training, and should be collected more carefully. In addition to the official email address, which is subject to change with a change of position, a personal email address should also be obtained. This will reduce the bias caused by missing data in future research.
The evaluation recommends monitoring and evaluating both the training process and its outcomes to ensure better performance and achieve optimal results. The contract should include provisions to penalize an IP that fails to meet the quality standards and the satisfaction of the participants.


8 Annexure A
Google Survey Form

Level-3 Evaluation of the TESP Training Participants


Background:
Last year, during March-September 2015, the Tertiary Education Support Program (TESP) imparted various trainings in Islamabad and Karachi. During the trainings, evaluation of the training was conducted on two levels, and we are very grateful for your responses and insights.
Prior to the start of its next phase of capacity building in 2016-17, the TESP Secretariat intends to
conduct Level-3 Evaluation of the previously conducted training programs.
The overall purpose of the Level-3 Evaluation is to know if the learning from training has transformed
into improved actions and behavior at work place. It will be used to improve training strategies and
their results in future.
Anonymity of the respondents will be kept intact and the individual responses will be kept confidential.
* Required

Skip to question 1.

Profile
1. Your Designation *

(If you are a faculty member and have an


administrative role, please mention the
Administrative Designation ONLY that is relevant
to the training program, e.g. Director QEC/ORIC)

2. Grade/Equivalent to Grade *

3. Please select the training program attended *

(In case you attended multiple training programs, please select only one that is the most relevant to
your work) Mark only one oval.


a) Leadership and Change Management


b) Human Resource Management
c)

Information Technology Management

d) Financial Management
e) Intellectual Property Rights
f)

Developing and Assessing Proposals

g) Project Planning Implementation and Evaluation


h) Quality Assurance in Higher Education
4. Venue of the training * Mark only one oval.

Margalla Hotel, Islamabad


IBA, Karachi
Skip to question 5.

Training Outcomes
6. How do you rate the relevance of the training to your work? * Mark only one oval.

0 (Not relevant) to 5 (Very relevant)

7. How much were you able to apply the new learning from the training to your work?

* Mark only one oval.


0 (Not at all) to 5 (Very much)

8. How much has your performance improved as a result of the training? * Mark only

one oval.
0 (Not at all) to 5 (Very much)

9. Could you please give an example of how the training resulted in improved

performance in your job assignment?


10. Were you able to pass on your learning from the training to others? * Mark only

one oval.
Yes
No
11. If you were able to pass on your learning from the training to others, please briefly

describe how you were able to do that.

Skip to question 11.

Enablers and Barriers


11. What was the major support you had in applying new learning to your work? Check all that apply.

Not applicable (in case you were not able to apply the new learning to your work)
My supervisor
My colleagues
Availability of human resource
Availability of financial resource
Availability of technological resource
Other:
12. What were the major barriers that hampered the application of your new learning to your job

assignment?
Check all that apply.


Not applicable (In case you faced no barriers in applying new learning to your work)
My supervisor
My colleagues
Lack of human resource
Lack of financial resource
Lack of technological resource
Other:
13. Any other comments/suggestions to improve the training programs in future?


TESP SECRETARIAT
Higher Education Commission, Islamabad
Top Floor, HRD Building, H-8, Islamabad, Pakistan
Contact: asadkhan@hec.gov.pk; Phone: +92-51-90808140
