
Running Head: EVALUATION PLAN

Evaluation Plan for the Adult Education Program
Kaylea Algire
July 26, 2013
James Madison University

Introduction

The AHRD program stands at a crucial point in its history. The introduction of the new ASTD competency model, a demographic shift, and the growth of the program all require an updated system that accommodates both learners and faculty (Wilcox, 2013). Findings from the AHRD case study bring some of these issues to the forefront. Further research is needed to determine how extensive an update the program requires. Recommendations based on the case study and other evidence will identify areas for growth in the AHRD program. Finally, the evaluation of the implementation will allow the evaluators to test the effectiveness of the evaluation objectives and measure the growth in satisfaction among both faculty and staff.

Evaluation Plan Purpose

The AHRD case study makes several reasons for an evaluation evident (Wilcox, 2013). First, the changes in the program's social and interpersonal dynamics need to be addressed. Second, the program's growth rate and its effect on faculty and staff impact the success of participants. Lastly, the new ASTD Competency Model reflects a changing landscape in industry expectations for the entry-level workforce. These three factors form the purpose of an evaluation plan for the AHRD program at JMU. Completing this evaluation plan will accomplish two objectives:

1. By the end of the implementation of this plan, internship and employment placement will remain above 90%.
2. By the end of the implementation of this plan, the program will be better equipped to accommodate growth through revised program objectives, new adjunct faculty, and new course options.

Each of these objectives involves measuring student and faculty satisfaction with the program. To increase both student and faculty success, an evaluation of the current program and implementation of a new strategy are necessary for future growth. The plan will take approximately six months for Phases I and II and a year for Phase III; the entire plan can be completed by the Fall 2015 school year.

Program History

The Adult Education/Human Resource Development program is housed in the College of Education at James Madison University. Since its inception twenty years ago, the program had maintained a steady enrollment of fewer than ten students. In 2006, however, enrollment doubled and has continued to grow in the years since (Wilcox, 2013). The case study attributes this growth to increased demand in industry for instructional designers and professionals with a background in adult education (Wilcox, 2013). The current structure of the program (detailed in Figure 1) is linear for traditional students: most students begin in the fall and graduate four semesters later. The Master's degree is a thirty-six-credit experiential learning program, attractive to most students because of its theory-to-practice emphasis.

Figure 1: AHRD Program Structure


Currently, there are three full-time professors and several adjunct faculty. The faculty offer at least six courses a semester to accommodate up to fifty students across the two cohort years. First-year students remain in their cohort for the first two semesters and then branch out when taking electives or other concentration-focused courses. Second-year students focus mainly on completing final comprehensive exams and finishing their concentrations. There is currently little cross-mingling between the first- and second-year students in the program. In the past there were social activities geared toward both cohorts; however, the 2012-2013 school year was an exception.

Target Population

The target population for this type of program is drawn from all graduate students and graduate faculty at JMU. Specifically, this evaluation plan focuses on the current students, alumni, faculty, and staff involved in the AHRD program at JMU. This population is small; however, its opinions and observations are the most valuable source of the kind of information the evaluator needs for this project.

Stakeholders

Both internally and externally, the JMU AHRD program affects many stakeholders. Primarily, the program touches the faculty, staff, and students enrolled in it; those who attend classes and work for the department have a vested interest in its success or failure. It also affects the College of Education and the University at large. These stakeholders are affected by the success or failure of the program for two reasons: attracting new students to the University and increasing alumni giving for existing programs. Externally, there are many investors in the program, such as JMU alumni. Alumni of the program have an interest in maintaining its reputation and encouraging other professionals to attend. Employers of alumni have a direct relationship with the faculty and staff, recruiting the best and the brightest to their companies. In addition to these employers, other corporations in need of human resource development professionals are also stakeholders in the kind of program that JMU supports. Lastly, the network of AHRD professionals could also be a stakeholder, supporting and encouraging people to continue their education and furthering the practice of strong learning values in the workplace. Some of these stakeholders are included in Figure 2. The important thing to remember about stakeholders in the AHRD program is how widely the effects of any changes would be distributed across these existing relationships.


Figure 2: AHRD Logic Model

Program Objectives

Presently there are ten program objectives that span all levels of the program and point toward its outcomes. These objectives are listed in Figure 2 above. The program objectives, while inclusive, do not include measurable indicators by which students will know they have accomplished each goal. According to the Centers for Disease Control and Prevention (2012), the SMARTer the goals, the easier it is to measure success (p. 1). Similarly, the program objectives are not aligned with the most recent American Society for Training and Development competency model for AHRD professionals. Both of these discrepancies need to be remedied for stakeholders to have a clear vision of expectations and standards of success. The following objectives are suggested updates to the existing objectives so that participants will have specific and measurable indicators of success. Figure 3 outlines the 2013 ASTD Competency Model that guides these updated objectives.

Figure 3: ASTD 2013 Competency Model

Suggested Updated Program Objectives

1. By the end of the first year, students will evaluate systems theory, analytic systems, principles of adult development, learning theory, leadership theory, and current trends during their comprehensive exams.
2. Before matriculation, students interpret business, industry, educational, and other organizational settings in their work experience or internship.
3. Students support effective organizational relationships that foster teaching, learning, and performance improvement appropriate to the context through their mentor relationship during the program.
4. Students evaluate teaching, learning, and performance improvement efforts in each course in which they participate in order to hone their skills and the skills of their cohort.
5. Students analyze, design, develop, implement, and evaluate appropriate curricula in appropriate modes (including distance, action, self-directed, transformative, and informal learning) for individual, team, organizational, and social learning through team projects and comprehensive evaluations during the program.
6. Students facilitate and lead team-based learning, planning, organizing, and evaluation appropriate to the context as part of their coursework experience.
7. Students experiment with appropriate technologies for designing and developing learning solutions for clients both in and outside of the program.
8. Students recognize and respond responsibly to issues of diversity and ethics within the field of human resource development.

9. Students demonstrate the ability to articulate and forecast the vision and role of teaching, learning, and performance improvement appropriate to the context in order to be change agents in their field.
10. Students interpret and create research to deepen their understanding of the variety of areas in human resource development as part of a comprehensive exam.

Research Design

The size of the program dictates the best research design. In this case the program is too small and intimate to support a control group, which rules out experimental designs. However, because survey data and interviews are feasible, a quasi-experimental design is the best option. To gather the appropriate data, a survey will be administered to current students and alumni. Interviews will be conducted with the faculty SMEs and staff because of the relatively small size of this group compared to the students. Within the survey, Likert statements and open- and closed-ended questions will measure usability, satisfaction, and suggestions for improvement. Likert statements will help ensure construct validity for part of the data collected (McDavid, 2013, p. 13). Internal validity will be a challenge: it must be confirmed that the survey is in fact measuring the effect of the program on student success or satisfaction. To increase construct validity, the survey will be peer-reviewed to uncover any hidden biases or inequities. Interview questions will undergo similar review to strengthen internal validity.

External validity poses another set of challenges for the researchers. The relative size and specificity of the program limit its generalizability to other, similar programs. Comparable programs, such as Villanova University's human resource development program, exist in a different environment with different stakeholders and audiences. For example, most HRD programs are housed in a school of business or are offered only as online degrees. Some of JMU's closest competitors are the University of Louisville and Villanova; neither reflects the same or similar demographics as JMU. A large part of this evaluation plan will be conducted through faculty SMEs using tested interview questions. To gain the most valuable data, these interviews will first be coded and then triangulated to identify commonalities and best practices for the program. Triangulation will be employed to strengthen confidence in the validity of the measurement instruments (McDavid, 2013, p. 125). Figure 2 illustrates some of the predicted causal linkages between inputs, outputs, and outcomes in the program. All of the inputs are specific to JMU and the AHRD program, which further complicates external validity.

Reliability is another challenge in this research design. While the survey and interviews could be conducted at one of JMU's competitors, there is no guarantee that the results could be replicated in their settings. To increase the reliability of measures, Cronbach's alpha can be applied (McDavid, 2013, p. 152). Coders will also need to be trained to secure the reliability of the measurement instruments. Measurement validity concerns whether one measure captures one construct (p. 156). The survey needs to be reviewed for face validity, treating students as lay people in this plan, and the faculty interviews need to be examined for content validity, since the faculty act as the SMEs. Both groups can be considered homogeneous because they are being asked about their common experience of the AHRD program at JMU. Both instruments collect the nominal- and ordinal-level measurements required to build the necessary relationships, gain trust, and explore causality between the program and student success (p. 161).

Confidentiality will be guaranteed through anonymity in the survey and a number code assigned to each of the SMEs. Unless the SMEs or survey respondents allow their names or other identifying data to be used, responses will be kept anonymous and available only to the researcher. Identifying data will be destroyed upon completion of the study.
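As a concrete illustration of the reliability check mentioned above, the sketch below computes Cronbach's alpha for a block of Likert items. It is a minimal example only; the item names and sample responses are hypothetical and do not come from the actual survey.

```python
# A minimal sketch (not the actual survey instrument): estimating Cronbach's alpha
# for a block of Likert items to check internal consistency before analysis.
import pandas as pd

def cronbachs_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame whose columns are Likert items."""
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses on a 1-5 Likert scale (rows = respondents, columns = items).
responses = pd.DataFrame({
    "satisfaction_q1": [4, 5, 3, 4, 5, 2],
    "satisfaction_q2": [4, 4, 3, 5, 5, 3],
    "satisfaction_q3": [3, 5, 2, 4, 4, 2],
})

alpha = cronbachs_alpha(responses)
print(f"Cronbach's alpha: {alpha:.2f}")  # values near or above 0.7 are commonly treated as acceptable
```

In practice the columns would be the actual Likert statements exported from the survey software rather than these placeholder items.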


Data Collection

Overall, this evaluation plan is a mixed-methods, quasi-experimental project. The data will be collected concurrently: while the quantitative portion is being completed by survey respondents, the researcher will complete an abbreviated literature review of best practices among JMU's top competitors. Once all the data are collected, they will be analyzed to determine next steps for implementation. A qualitative component is necessary for two reasons. First, it will provide the background on which to base recommended improvements. Second, it will ground the findings in the dialogue around HRD program creation and maintenance. After the interviews are conducted, a concurrent triangulation approach will be used to discover areas of convergence or divergence between the data and the research (McDavid, 2013, p. 207). McDavid (2013) describes this approach as one in which qualitative and quantitative data are complementary and, when used together, strengthen the design (p. 207). The sample will not be random, due to the size of the available population. Interviews will be conducted on an individual basis to avoid bias on the part of the interviewer, and they will be recorded with the interviewee's permission. Once an interview is completed, the interviewer will transcribe the notes and recording for later coding. A related methodology will be used for the open-ended questions in the survey. The purpose of both the open-ended questions and the interviews is to draw out the perceived successes, failures, and suggestions for growth from both sides of the program. For example, the director of the AHRD program will have a particular perspective on what does and does not work for the success of her students. Likewise, students in the workplace and in internships will have a different perspective on new technologies and models that current students should be aware of as they enter the workforce. Both perspectives will be invaluable when reevaluating course offerings and program objectives and when considering adjunct faculty. Survey data will be collected using peer-reviewed questions, with the option to provide further feedback through e-mail correspondence. The survey will be administered through Qualtrics or comparable survey software in order to use the available data analysis tools. Both survey and interview questions will be reviewed using Miles and Huberman's (1994) ways of confirming qualitative findings (McDavid, 2013, p. 216). This will strengthen the robustness and reliability of the data gathered with these instruments.
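The plan calls for interview transcripts to be coded before triangulation. One common way to check that two trained coders apply the coding scheme consistently is Cohen's kappa; the source does not prescribe this statistic, so the sketch below is illustrative only, and the code labels and transcript segments are hypothetical.

```python
# Illustrative sketch: agreement between two hypothetical coders on interview segments.
# Cohen's kappa corrects raw percent agreement for agreement expected by chance.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two equal-length lists of categorical codes."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    expected = sum((counts_a[label] / n) * (counts_b[label] / n) for label in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned by two coders to the same ten transcript segments.
coder_1 = ["advising", "courses", "mentoring", "courses", "advising",
           "internship", "courses", "mentoring", "advising", "courses"]
coder_2 = ["advising", "courses", "mentoring", "advising", "advising",
           "internship", "courses", "courses", "advising", "courses"]

print(f"Cohen's kappa: {cohens_kappa(coder_1, coder_2):.2f}")
```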


Data Analysis

Each measurement instrument requires its own form of analysis. Survey results will be analyzed using the statistical analysis tools available in the survey software chosen for administration, and graphs and charts of the data will be included in the final evaluation to support program recommendations. Interview transcripts will be triangulated with the qualitative research findings to isolate convergences and incongruities in the program. From these data sources, best practices will be derived and then applied to the improvement plan. This in-depth data analysis will allow the researcher to recommend specific, evidence-based steps for program improvement as part of this evaluation plan. Once the data are analyzed, they will be presented in document form to the program director for any additional revisions. After revisions are included, a final presentation will be made to all SMEs, and an abbreviated version will go to all survey respondents with the director's permission.
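If survey results are exported from the survey software for offline analysis, a descriptive summary of the Likert items might look like the sketch below. The file name, column names, and grouping variable are hypothetical; the actual export layout would depend on the software used.

```python
# Hypothetical sketch: summarizing exported Likert responses by respondent group
# (e.g., first-year vs. second-year students) to support charts in the final report.
import pandas as pd

# Assumed CSV layout: one row per respondent, a "cohort" column, and 1-5 Likert columns.
responses = pd.read_csv("ahrd_survey_export.csv")  # hypothetical file name

likert_columns = [c for c in responses.columns if c.startswith("q_")]

# Mean, spread, and response count for each item, split by cohort.
summary = (
    responses
    .groupby("cohort")[likert_columns]
    .agg(["mean", "std", "count"])
    .round(2)
)
print(summary)

# Share of favorable responses (4 or 5) per item, a common way to report Likert data.
favorable = (responses[likert_columns] >= 4).mean().round(2)
print(favorable.sort_values(ascending=False))
```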


Recommendations

To provide the best possible set of recommendations to the AHRD program, several of JMU's AHRD competitors were considered for their best practices. The top three considered were Villanova University, the University of Louisville, and the University of Georgia (Villanova, 2013; UL, 2013; UGA, 2013). Villanova's program focuses specifically on HRD, has an internship requirement and several options for comprehensive exams, and covers HRM topics in addition to HRD topics. Appendix A features the graduation checklist for the HRD degree. Some of the courses, such as Project Management and Workforce Planning, touch on the new competencies of Talent Management and Knowledge Management in organizations (Villanova, 2013). The University of Louisville offers a Human Resource and Organizational Development master's degree available both on campus and through distance learning, with classes that specifically address facilitation and change management, reflecting the new competency model. Appendix B features the checklist for Workplace Learning and Performance (UL, 2013). Lastly, the University of Georgia offers courses both on campus and in a distance learning environment, including courses specific to training and facilitation skills as well as change management. The program is geared toward working professionals and offers classes on weeknights on a rotating semester basis. UGA's program is the most similar to JMU's in terms of demographics, course offerings, and structure (UGA, 2013). Based on the case study information, previous survey data from the 2010 cohort, and comparisons with these programs, several recommendations are evident:

1. Re-align program objectives to the current ASTD competencies.
2. Introduce at least one new course to address coaching, training skills, and talent management.
3. Reinvigorate the HRM concentration.
4. Hire two adjunct faculty to accommodate the course load.
5. Decrease the advisee load to strengthen the advising relationship.
6. Rejuvenate the mentor program.
7. Make work experience or an internship mandatory.
8. Split new student admissions more evenly between experienced professionals and new graduates.
9. Separate the leadership and facilitation class into two different classes.
10. Re-arrange the course structure.

The least complicated recommendations will be the simplest to implement. First, rebalancing the makeup of the program will alleviate the clique issue; this is a maturity challenge, and with more experienced professionals it may not be as large an issue. Second, re-aligning the program objectives will keep students competitive and give them a better sense of workplace obligations. Third, make the mentor program a staple. It may be better to look beyond current students and recruit alumni for this role, which could do two things: it will help foster relationships for future employment, and it will provide current students with guidance on program and career choices.

The next group of recommendations will be more challenging but will ultimately make the program and its stakeholders more competitive. Introducing new courses that deal specifically with talent management, coaching, and training skills addresses newer aspects of the AHRD workplace. Several people in both cohorts do not come into the program with these skills, and they could use a specific course in which to develop them. Similarly, leadership and facilitation could be divided into two courses with a stronger corporate focus in addition to the current introspective philosophy. To accommodate these new courses, at least two adjunct faculty will need to be hired. Adjuncts are preferable because of the specific skill sets they can provide (Bettinger & Long, 2010, p. 611). These would ideally be working professionals in the field who specialize in training skills, coaching, and talent management. Judy Rannow is a good example of this type of adjunct: she provides a specific skill set that is in demand in the market and can offer courses relevant to it. More faculty like Judy will help diversify the non-core courses so that students can develop more niche concentrations as they prepare for the workforce.

The last four recommendations follow from the previous six. First, decreasing the advisee load among full-time faculty will occur naturally with the addition of two more adjuncts. The number of first- and second-year cohort members who received internships or jobs immediately following the end of the second semester was lower than in previous years; if advisors had fewer students to place, this issue could be addressed. Similarly, making work experience or an internship mandatory will increase students' marketability. It is a crucial part of Villanova University's experience, and Villanova also allows students with significant work experience to opt out with permission from the department head. Another simple adjustment could be made to the course sequence: courses need to be arranged to best reflect continuity in instructional topics. For example, switching EDUC 641 and AHRD 630 would deepen the connection students make between theory and practice as they complete their thesis research. Lastly, the HRM concentration needs to be revamped. A relationship between the College of Business and the College of Education can focus on the common goal of preparing both sets of students for the new workplace. Management has become a critical part of the new ASTD 2013 competencies (ASTD, 2013), and current students in the AHRD program lack a working knowledge of this area. If HRM remains a concentration option, the courses need to be in place. For both sets of students to enter the workforce as ethical and experienced professionals, this relationship is in both colleges' best interests.

The evaluator recommends that all interventions be implemented in stages to continue producing competitive alumni for the field: begin with the least complicated adjustments and then phase in the more disruptive ones. The fundamentals are strong and should not be adjusted; however, there are minor things that can be done to maintain the program's competitiveness in the field. None of the recommendations is meant to be derogatory toward any component of the current program. Most of these recommendations are based on research into JMU's nearest competitors and do not reflect the opinion of the evaluator.

Evaluation

Following Kirkpatrick's four-level evaluation model, the final evaluation of the recommendations and their implementation will capture students' and staff's initial reactions to the changes, evaluate the learning that occurred during the process, assess changes in organizational behavior, and measure the results of the updated program (Chapman, 2012). The evaluator intends for the implementation of the recommendations to occur in three phases. Table 1 visualizes the evaluation plan; its purpose is to illustrate that evaluation will occur at different times during different portions of the implementation. Each phase needs to be evaluated as soon as possible after implementation. Reaction needs to be judged by the first- and second-year students in order to measure initial effectiveness, and learning will be measured at different times depending on when the implementation intersects with the course sequence.

Phase I (Recommendations 1, 6, 8)
Reaction: Survey after the first semester
Learning: Include in the first survey
Behavior: Include as part of the written comprehensive exams for first-year students
Results: Measure graduating cohort employment rates after implementation

Phase II (Recommendations 2, 4, 9)
Reaction: Survey at the end of the first year post-implementation
Learning: Include in the end-of-year survey
Behavior: Include as part of the comprehensive exams, depending on the cohort year
Results: Measure graduating cohort employment rates after implementation

Phase III (Recommendations 3, 5, 7, 10)
Reaction: Survey at the end of the second year after implementation
Learning: Include in the end-of-year survey
Behavior: Include as part of the comprehensive exams, depending on the cohort year
Results: Measure graduating cohort employment rates after implementation

Table 1: Implementation and Evaluation Plan

Next, to measure behavior change, a measure will need to be included as part of one of the comprehensive examinations, either as a quick post-exam survey or as a final question in the comprehensives. The purpose of the behavior measurement is not only to gauge changes in satisfaction, but also to see whether participants and staff feel their skills have changed and whether they have changed the way they work and participate in school because of the adjustments to the program. Lastly, results will be measured by increases in students employed at graduation, first-year students with internships, overall satisfaction, and the strength of corporate partnerships. By implementing all phases of this plan, JMU will remain competitive with others in the field and build the relationships necessary for students to continue to view their education as a critical part of their career development.
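As a small illustration of the results-level measure, the sketch below computes a graduating cohort's internship and employment placement rate and checks it against the 90% target stated in the plan's first objective. The roster records are hypothetical.

```python
# Hypothetical sketch: checking a graduating cohort's placement rate against the 90% target.
PLACEMENT_TARGET = 0.90

# Each record marks whether a graduate secured employment or an internship by graduation.
cohort = [
    {"student": "A", "placed": True},
    {"student": "B", "placed": True},
    {"student": "C", "placed": False},
    {"student": "D", "placed": True},
    {"student": "E", "placed": True},
]

placement_rate = sum(record["placed"] for record in cohort) / len(cohort)
print(f"Placement rate: {placement_rate:.0%}")
print("Meets 90% objective" if placement_rate >= PLACEMENT_TARGET else "Below 90% objective")
```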


Conclusion

Most of the recommendations will not cause the AHRD program financial hardship. By reallocating funds to areas such as adjunct faculty rather than a full-time faculty member, students will be exposed to different skills on a rotating basis, receive more targeted advising, and build deeper relationships for their career development. Some costs will be incurred; however, with more faculty, more students can be accommodated, and thus more income can be allocated to the department. All of these recommendations should be viewed as an investment in the program's future. The philosophy and greater purpose stay the same, with only a few adjustments. Overall, the JMU program is sound in the fundamentals of human resource development. With improvements such as additional faculty, added courses, and required internships, the program will remain competitive and continue to send strong candidates into the workforce. This is an exciting opportunity to grow and change with the industry, and the AHRD program can become a leader in doing so.


References

ASTD. (2013). The ASTD competency model. Retrieved from http://www.astd.org/Certification/Competency-Model.aspx

Bettinger, E. P., & Long, B. T. (2010). Does cheaper mean better? The impact of using adjunct instructors on student outcomes. The Review of Economics and Statistics, 92(3), 598-613.

CDC. (2012, Nov 02). Communities of practice (CoPs). Retrieved from http://www.cdc.gov/phcommunities/

CDC. (2012, Nov 02). Develop SMART objectives. Retrieved from http://www.cdc.gov/phcommunities/resourcekit/evaluate/smart_objectives.html

CDC. (2012, Nov 02). Resources. Retrieved from http://www.cdc.gov/phcommunities/resourcekit/resources.html

Chapman, A. (2012). Donald L. Kirkpatrick's training evaluation model: The four levels of learning evaluation. Retrieved from http://www.businessballs.com/kirkpatricklearningevaluationmodel.htm

McDavid, J. C., Huse, I., & Hawthorn, L. R. L. (2013). Program evaluation and performance measurement (2nd ed.). Washington, DC: Sage.

University of Georgia. (2013). Retrieved from http://www.coe.uga.edu/leap/academic-programs/adult-education/

University of Louisville. (2013). Retrieved from http://louisville.edu/education/degrees/ms-hrod-wlp/files/ms-hrod/

Villanova University. (2013). Retrieved from http://www1.villanova.edu/villanova/artsci/hrd.html

Wilcox, D. (2013). AHRD case. Harrisonburg, VA. Retrieved from https://blackboard.jmu.edu/bbcswebdav/pid-3191276-dt-content-rid8552193_1/courses/EDUC625_OP01_G62_SM13/AHRD%20Case.pdf


Appendix A: Villanova Graduation Checklist


Appendix B: University of Louisville Workplace Learning and Performance Checklist

