
Evaluation and Program Planning 25 (2002) 15–22

www.elsevier.com/locate/evalprogplan

A participatory evaluation of an inner-city science enrichment program

Glenda Quintanilla a, Thomas Packard b,*

a Department of Children and Family Services, Adoptions Division, Los Angeles County, Los Angeles, CA, USA
b School of Social Work, San Diego State University, 5500 Campanile Drive, San Diego, CA 92182-4119, USA

Abstract

A participatory evaluation (PE) of an inner-city science enrichment program for elementary school youth was conducted using an evaluation team consisting of staff, board members, students, parents, and representatives of the agency's major funder. This evaluation team designed and implemented the entire evaluation with guidance from an evaluator-consultant and researchers from a local university. Data gathered from surveys of alumni, parents, students, and teachers revealed high satisfaction with the program and a validation of the hands-on teaching model. Alumni reported that the program impacted their scientific and social competence. The highly participative design process was seen as very successful by all participants, and offers useful guidelines for other PE, including active participation of all stakeholders, commitment to a shared vision, and a good match between the organization and the evaluator. © 2002 Elsevier Science Ltd. All rights reserved.

Keywords: Participatory evaluation; Organizational learning; Science enrichment; Appreciative inquiry

1. Introduction

This paper describes a participatory evaluation (PE) of an inner-city science enrichment program in San Diego: the Elementary Institute of Science (EIS), which for over 30 years has provided after-school, weekend, and summer programs of hands-on science activities for youth who historically have grown up to be underrepresented in the science and technology fields. This case study will review the participative evaluation process used to design and implement the evaluation, resulting in immediately usable findings and the creation of an internal capacity for ongoing evaluation and organizational learning for program improvement.

After providing a conceptual foundation for the evaluation process used (PE and organizational learning) and a review of similar science enrichment programs, the program will be described. Next, the evaluation process will be summarized, using Burke's (1998) participative evaluation model. After a brief summary of evaluation findings, we will share some lessons learned.

2. Issues in human services evaluation

Problems in adequately evaluating human service organizations and in actually using evaluation findings are well known. The limitations of traditional methods have been eloquently articulated by Schorr (1997), who criticized the overemphasis on evaluability in traditional terms. She supports "new approaches to the evaluation of complex interventions [which] share at least four attributes: They are built on a strong theoretical and conceptual base, emphasize shared interests rather than adversarial relationships between evaluators and program people, employ multiple methods and perspectives, and offer both rigor and relevance" (p. 147). PE has emerged as a model which addresses Schorr's concerns, particularly by emphasizing the shared interests of stakeholders.

Patton (1997) has suggested that this participation of stakeholders leads to an increased use of evaluation findings. Also, according to Burke (1998), "with low involvement by the key stakeholders, there is a risk that program beneficiaries will lose a sense of ownership of the evaluation findings. PE needs to be seen as an investment in the future" (p. 48).

3. Participatory evaluation

Participatory evaluation "is generally used to describe situations where stakeholders are involved in evaluation decision making as well as share joint responsibility for the evaluation report with an external evaluator" (Turnbull, 1999, p. 131).

* Corresponding author. Tel.: +1-619-594-6723; fax: +1-619-594-5991. E-mail address: tpackard@mail.sdsu.edu (T. Packard).

0149-7189/02/$ - see front matter © 2002 Elsevier Science Ltd. All rights reserved. PII: S0149-7189(01)00045-3

According to Burke (1998), there are several key elements of a PE process. First, of course, key stakeholders must be involved in the decision making. The discussion below will describe the processes used here to ensure in-depth involvement of all key stakeholders in the process. Second, the inequities of power and voice among participating stakeholders must be acknowledged and addressed. Power dynamics which could be expected in a case such as this would include relatively low influence on the part of parents and students, and high influence of the representative of the agency's key funder; staff; and 'high status' board members, in this case those with graduate degrees. Perhaps because of the agency's philosophy of highly valuing young people and parents, there was no evidence of domination of decision making by particular individuals or roles. This will be discussed in more detail in Section 10.

Other elements of a PE process according to Burke include: (a) "an expanded role for evaluators that includes elements of organizational learning and planned change" (p. 46), (b) multiple and varied data collection methods, (c) an action component, (d) the building of the organization's capacity for doing its own evaluation in the future, and (e) an educational focus, on both individual and collective learning. The ways in which this project incorporated all of these elements will be discussed below.

While the use of PE is growing (Cousins & Whitmore, 1998), Turnbull (1999) has noted the lack of empirical research regarding this method. One study of PE which is relevant to the project described here concerned a large not-for-profit organization which developed internal PE capabilities (Minnett, 1999). This project parallels the Minnett study, on a smaller scale, using external evaluators.

4. Organizational learning

Another principle guiding the project discussed here is organizational learning (Argyris & Schon, 1996). While historically human service organizations often agreed to evaluation only under pressure to satisfy the needs of funders or other outside groups, there has been growing interest in using evaluations as opportunities to learn about program operations to enhance effectiveness. Organizational learning has become a popular approach for this. According to Argyris and Schon (1996, p. xxi), organizational learning "refers broadly to an organization's acquisition of understandings, know-how, techniques, and practices of any kind and by whatever means."

According to DiBella and Nevis (1998), "there are three essential criteria of organizational learning: First, new skills, attitudes, values, and behaviors are created or acquired over time… Second, what is learned becomes the property of some collective unit… Third, what is learned remains within the organization or group even if individuals leave" (pp. 25–26).

Due to EIS's philosophy of valuing and accenting positive perspectives and visions, the organization also used principles of appreciative inquiry in this evaluation. According to Cooperrider and Srivastva (1987), appreciative inquiry is "an innovative theory capable of inspiring the imagination, commitment, and passionate dialogue required for the consensual re-ordering of social conduct" (p. 131). This mode of inquiry is "distinguished by the art and the science of asking powerful, positive questions" (Cooperrider, 1996, p. 5). Appreciative inquiry seeks to reveal the positive elements of an organization in order to help achieve its ideal future (for a current review, including some critical analysis, see a special issue of the OD Practitioner: Sorensen, Yaeger, & Nicoll, 2000).

5. Inner-city youth and science education

National statistics indicate that as students enter middle and high school, their participation in advanced courses in science and math decreases. This decreased participation may be due to "the reticence of elementary teachers to teach science [which] has been linked to their perceptions of their ability to understand and teach science well" (Worch, Gabel, & Odell, 1994, p. 401). It may also stem from a lack of hands-on science instruction (Worch et al., 1994). One national study reports that the number of students of color (Hispanic and African-American) and female students enrolled in science and math courses is lower compared to other students (Campbell, Voelkl, & Donahue, 1998).

This situation has contributed to the creation of after-school and weekend hands-on science enrichment programs geared towards minority and inner-city youth, who seem to have the least opportunity to experience hands-on science instruction in school (Brownstein & Destino, 1995). Brownstein and Destino state that hands-on science instruction both engages students in the process of inquiry and influences social engagement in participating students. Furthermore, they cite "a considerable amount of evidence [which] supports the connection made between the classroom learning environment and cognitive and affective learning outcomes" (p. 29).

6. Some science enrichment programs

As part of this evaluation, a 'best practices' survey was conducted to learn of the practices of other science enrichment programs which could be adapted by EIS. Three notable examples will be summarized here.

Hands-on Science Outreach (HOSO) "is a once-a-week, after school, recreational science enrichment program for children from pre-Kindergarten (age 4) through sixth grade" (Goodman & Rylander, 1993, p. 1) located in

Rockville, Maryland. Three eight-week sessions are offered each year during the fall, winter, and spring.

The Saturday Science QUEST is a science enrichment program for elementary children and preservice elementary teachers. The program "explore[s] concepts in specific areas of science with an emphasis on the idea that science is all around us and fun to learn" (Worch et al., 1994, p. 1). Also, "children spend 2.5–3 h per week exploring one major [science] theme or topic…using a hands-on approach" (Worch et al., 1994, p. 1). It also "use[s] university laboratories and other science facilities…[in order to] influence children to pursue science and science-related careers" (Worch et al., 1994, p. 2). Additionally, cooperative learning is encouraged to enhance positive interdependence.

The Saturday Science Academy (SSA) of Clark Atlanta University is a science enrichment program for African American third- to eighth-grade students located in Atlanta, Georgia. The SSA was created in 1979 as a way of addressing students' limited exposure to science, especially among African American students (Brownstein & Destino, 1995).

7. Background and setting

Founded in 1964 in an inner-city neighborhood of San Diego as a 'hands-on science enrichment program', the EIS believes that children need to experience science 'up close and personal' in order to become acquainted with this field and discover the vast knowledge and opportunities which this field generates. As a result, it adheres to the idea of early intervention as a critical helping tool for children between the ages of 7 and 13. This hands-on, experiential strategy emphasizes that children need to experience science and technology personally in order to understand and appreciate these subjects. Research has found that students often lose interest in these subjects because they are typically taught through lectures and textbooks, with decreasing opportunities for hands-on projects due to shrinking educational budgets (Worch et al., 1994, p. 401).

EIS provides a variety of classes geared to actualizing students' success, including "a variety of activities and experiments in electronics, computer science, biology, engineering, chemistry, physics, oceanography, photography, and ecology" (Elementary Institute of Science, 1996, p. 1). EIS uses college students or recent college graduates as teachers.

All classes, including biology, computers, engineering, photography, and natural sciences, provide hands-on instruction to stimulate, encourage, and nurture each student's natural inquiry about the world. Students are not tested, ranked, or scored in order to measure the level of knowledge they have gained. Each instructor is allowed to devise a unique curriculum relevant to his or her area of expertise. EIS offers classes in after-school, Saturday, and summer sessions. The summer session lasts 9 weeks, and the other sessions run from October to May.

This program model has been refined over the years, and staff, children, and parents have seemed very satisfied with it. However, although EIS has been operating for more than 30 years, it had never undergone a program evaluation to assess its operating strategies and techniques. As a result, the EIS 1995 strategic plan included implementation of a program evaluation as a strategic goal.

8. The evaluation design process

This evaluation will be described using the PE process suggested by Burke (1998), based on her PE principles which were summarized earlier. The PE steps are: (a) deciding to do it; (b) assembling the team; (c) making a plan; (d) collecting the data; (e) synthesizing, analyzing, and verifying the data; (f) developing action plans for the future; and (g) controlling and using outcomes and reports. While Burke reported some significant problems with one particular PE, the case here represents a successful application of PE principles and methods.

8.1. Deciding to do it

As part of an agency strategic planning process, the board and staff decided that a program evaluation would help the agency accomplish several goals, including developing and strengthening peer learning opportunities, establishing a structure for tracking and involving alumni, and supporting an ongoing planning process. An evaluation was expected to provide documentation of program results, seen as important to agency fundraising because "corporate and foundation funders expressed the need for measurable outcomes before considering funding support" (Jacobs Center for Nonprofit Innovation, 1997, p. 18). The agency also wanted to contact EIS alumni in order to 'uncover stories' about their EIS experience and discover commonalities which could lead to improvement for EIS. Therefore, specific goals of the evaluation were to learn about program outcomes and accomplishments and to learn how the program could become even better. The plan intended to include program 'success indicators' such as the percentages of students who return for subsequent programs and who go on to college or science-related fields, student interest in science, and students emulating respectful relationships. These success indicators were later used as a basis for developing evaluation questions. The decision to begin an evaluation process was made by staff, board members, parents, and the agency's major funder. Students were not represented in this decision, but these other stakeholders resolved to involve them in all subsequent activities.

8.2. Assembling the team

Next, an evaluation design team was formed. Members included representatives from the agency Board of Directors; staff; students; parents; and consultants from the

program's major funder, the Jacobs Center for Nonprofit Innovation (JCNI). The team explicitly intended to use principles of PE and organizational learning to design and implement an evaluation that was truly driven by program participants – youth, parents, staff, and board members – using outside experts to ensure that the process and methods selected were sound from a research perspective. For the initial design phase, expertise, primarily in the form of the design and facilitation of planning workshops, was provided by a consultant employed by the funder, JCNI. This consultant was a doctoral student in urban planning who had extensive experience in community development and group facilitation.

8.3. Making a plan

The original evaluation design team completed the EIS evaluation design during five workshops (typically 3 h Saturday sessions) throughout 1997. This resulted in a shared vision for the organization, including desired outcomes and proposed evaluation questions, inputs, indicators, and outputs. In order for the evaluation to fit with the mission, values, philosophical assumptions, and goals of EIS, the team and the evaluation consultants recognized that the evaluation process would need to be a hands-on process, which would require the participation of all key EIS stakeholders – students, staff, parents, and the Board of Directors. The team intended that evaluation would become an ongoing process, integrated into the operation of EIS and owned by its stakeholders because it would be their work product.

Following are the evaluation questions which the design team developed.

1. Does EIS increase students' interest and confidence in the area of science?
2. Does EIS increase students' scientific competence?
3. Does the EIS environment nurture social competence?
4. Does EIS help students see a wider range of life options (e.g. higher education, choice of science careers)?
5. How involved are families in the EIS program?
6. What visions and suggestions for the future do students, parents, alumni, and staff have?

After the plan was completed, the group decided to take a break from the intense process they had just completed while they searched for an evaluation consultant to assist with implementation of the evaluation. Eight months later, a smaller evaluation team composed of a representative subset of the original design team reconvened to plan implementation. To create a representative mix from each stakeholder group, team members chosen included two EIS board members (a professor of psychology who chaired the team, and a physician researcher at a local internationally renowned medical research center), three parents, three students who were also members of the agency's Youth Board, two consultants from the agency's major funder, and two staff: the executive director and the education director. The decision to include students in all aspects of the evaluation flowed naturally from the organization's mission, vision, and values. The Youth Board had a prominent role in the organization, most recently through their commitment to raise funds for the capital campaign for a new facility (Elementary Institute of Science, 2000). To assist with implementation they hired the authors: a professor and a research assistant from a local university. In addition to possessing teaching and research skills, the professor had been an organization development consultant for 15 years, using a participative philosophy for working with client organizations. We worked with the team as evaluation consultants, providing methodological expertise and facilitation of team meetings. This team held three Saturday sessions of 3 h each over a 4-month period to develop an implementation plan for the evaluation.

8.4. Collecting the data

The team made decisions regarding the targets for the evaluation, data collection instruments, and evaluation methods. They decided to gather data from alumni, parents, students, and staff, using multiple data collection methods, including structured interviews of alumni, pre- and post-survey questionnaires for parents whose children attended the program, questionnaires for teachers at the end of each program series (fall to spring, summer), a short questionnaire for students older than age 10, and an analysis of program documentation. The team determined that the evaluation would be simple and unobtrusive, and would be linked with the agency's strategic plan. An additional aspect of the evaluation was a review of best practices in hands-on elementary science education, in which one of the authors looked for similar programs from across the country to enable the evaluation team to assess other practices for possible adoption to improve their program.

There were spirited discussions regarding several key issues, such as whether or not scientific competence or attitudes towards science should be explicitly measured and whether or not changes in self-esteem should be measured using standardized tests. The group ultimately decided not to use such measures, because their obtrusiveness outweighed the value which they might have provided. This discussion also led the team to conclude that the aspect of scientific competence which the program attempted to develop was focused on process skills such as the scientific method rather than on scientific knowledge, which was seen as a responsibility of the schools. The team concluded that questions regarding social competence would best be answered by asking alumni directly rather than through standardized self-esteem measures. These discussions had a side effect of refocusing and clarifying for the group the true intentions of the program, which emphasized increasing life options for youth.

Another benefit of this participative approach to evaluation emerged through the discussion regarding who should

conduct the interviews of alumni. One of the programs discovered through the best practices analysis was a visioning process for the City of Chicago (Cooperrider, 1996), which involved having young people conduct interviews with community leaders. Based on the Chicago experience and EIS's own philosophy, the team decided to train some of the Youth Board members to conduct alumni interviews. Two Saturday sessions were held for the training. This process also served as a pilot test of the instrument and a way to refine the interview questions and protocol. Triads, consisting of an interviewer, an interviewee (an alumnus), and an observer, role-played the interview process. The agency's philosophy of involving and respecting youth was demonstrated at one session during which the evaluation team chair, a professor of psychology, role-played an interview with a youth. The youth provided very useful feedback on the wording of questions and the way the professor asked them, with the professor responding appreciatively and adjusting his approach. Another example of the team's working to cultivate equitable relationships among all stakeholders has been the use of a youth member as a meeting chair. For one meeting at which the Chair was absent, a youth member was asked to chair the meeting and functioned effectively in that role.

Another component of the Evaluation Team meetings was a ritual in which the team reviewed the evaluation goals at the beginning of each meeting. Because the majority of the participants had never participated in the planning and implementation of an evaluation process, the review of the evaluation goals provided them with a reminder of their role in the evaluation. In addition, it helped keep the team focused during meetings. A second ritual was an oral critique at the end of each session, providing participants with an opportunity to recognize their own and others' participation in the evaluation process and to learn how to make future meetings even more effective.

The team decided to collect data in the following phases:

I: alumni interviews;
II: questionnaires filled out by parents before and after their children's participation in the program;
III: questionnaires filled out by teachers at the end of each program;
IV: questionnaires filled out by students at the end of each program;
V: collection and analysis of program documentation and student demographics.

The team decided to continue interviewing alumni until at least 50 had been contacted, and to make all other aspects of the evaluation ongoing activities.

8.5. Synthesizing, analyzing, and verifying the data

Consistent with principles articulated by Burke (1998), data from alumni, students, parents, and teachers were collated by the researchers and presented at workshops to the evaluation team and to the board of directors at two of their regular meetings. While this paper is not intended to report evaluation findings, some selected summary information may be of interest to a reader curious about the outcomes of the process. Fifty-three alumni were interviewed, with over 70% reporting that EIS definitely or probably impacted their confidence regarding both science and social skills. Eighty-seven percent reported that EIS definitely or probably increased their interest in science. The parent surveys began with the 1999 summer program, and the student surveys began with the summer 2000 program. Parents and students reported very high levels of satisfaction with the program. Parents reported the achievement of stated program outcomes, including students' increased interest in science, enjoyment of an intellectual experience, and learning how to do experiments.

The final aspect of the evaluation, gathering of program documentation, was seen as less urgent but very important in the long term. As part of the development of the evaluation, agency staff recognized that they needed to develop a better system for tracking ongoing program activities, such as demographic data on students and teachers, program data including students enrolled in each program and the length of their participation, and parent involvement. Staff and evaluators, with support of the rest of the Evaluation Team, identified program and demographic data to be collected on a regular basis. These data will be entered into a comprehensive database for ongoing monitoring and analysis of student demographics and trends in enrollment and program activities.

The Evaluation Team and the agency's Board of Directors were pleased with the findings, since results were positive and highlighted opportunities for improvement that were already planned (e.g. a larger facility) or were feasible for implementation (e.g. year-round photography classes). One board member noted that provisions should be made in the new building for enough room for the photography program. The board also encouraged staff to pay more attention to orientation of new teachers, which staff had already planned based on their earlier review of the results. Other suggestions from parents and alumni, such as the interest in a program for older teenagers, will be used in agency planning.

8.6. Developing action plans

The team made a commitment to use surveys of parents, teachers, and students on an ongoing basis, with results fed back to staff to help improve program offerings and operations. Data will also be used in reports to the board and funders, and for proposals submitted to foundations for additional funding. Notably, several months ago a prospective funder who had a particular interest in enhancing the involvement of women in science careers asked the agency for feedback data from female alumni. The researchers were

able to extract this information from the alumni survey data and summarize it for use by the agency. Such requests will be easy to accommodate in the future because the evaluation system will be institutionalized as an ongoing aspect of agency operations.

8.7. Controlling and using outcomes and reports

Consistent with one of Burke's principles, the data remain in the hands of the stakeholders, to use to address the original evaluation purposes: learning about program accomplishments and identifying opportunities for program improvements. After each academic-year program and each summer program, evaluators will collate data and the Evaluation Team will meet to review results and develop action plans. These results and action plans will also be presented to the Board.

9. Lessons learnt

The alumni survey administration went well and met expectations. The only change from the original plan was that students were trained as interviewers but were not able to actually conduct any of the interviews. Most alumni did not want to come into the agency where the students could interview them, so the interviews were conducted by the researchers, almost all by phone. The convenience of a phone interview made it easier to get alumni cooperation, which outweighed not having students do the interviews. We would only suggest to researchers considering using this procedure that they facilitate explicit discussions regarding the costs and benefits of the use of stakeholders as interviewers and encourage the stakeholders to make a thoughtful decision based on their particular circumstances. The interview training process was nevertheless useful in helping refine the questions and protocol, as well as in giving the team members an experiential feel for the data collection process.

When staff and the researchers told the Evaluation Team about the incompleteness of the agency's database of alumni, all realized the importance of keeping more thorough and up-to-date records for future use. In the authors' experience, many community-based organizations do not place a high priority on maintaining data which could facilitate contact with former clients for follow-up and evaluation. The experiences here may help demonstrate the value of such data management to administrators who in the future may want to contact former clients for evaluation purposes. Of course, such arrangements would need to be explained in advance to clients, and client consent obtained for follow-up.

After the data were fed back to the Evaluation Team and the board, members of the team were interviewed or received a questionnaire to provide their observations on the evaluation process and its results. A research assistant who had not previously been affiliated with the project conducted the interviews. A total of eight stakeholders responded. Interviews were held with two board members, the original evaluation consultant (she was involved as an evaluation team member but did not conduct the evaluation or analysis of data), one parent, and one youth board member. Surveys were sent to two parents, two youth, and two staff, all of whom had been members of the Evaluation Team, and three of these surveys were returned.

Respondents were asked for their observations on what factors had helped make the evaluation successful, what could have been done better, and what recommendations they would offer to others doing such evaluations. Key success factors noted included active participation of all stakeholders, the evaluation consultants' leadership and guidance, and a good match between the client and evaluators. An important implication here is that the client organization should clearly articulate not only the technical requirements of evaluation consultants but also the agency's philosophy and values and any expectations regarding how the client and consultant will work together. Before a consultant is chosen, there should be an open and frank discussion of client and prospective consultant philosophies, roles, and decision-making expectations to ensure alignment.

Evaluation Team respondents also concluded that the process could have been shortened, especially in the design phase, perhaps by moving more quickly to a smaller group so that decisions could be made more quickly, and by having shorter intervals between meetings. A more explicit and disciplined use of a timeline would help keep the process moving so that momentum is maintained. If consultants are sensitive to the time availabilities and pacing preferences of the stakeholders (who in this case were mainly volunteers, including board members and parents with day jobs and students with time commitments including athletics and other extracurricular demands), they can regularly have the group review progress to ensure that events are on schedule or dates are adjusted. The stakeholders suggested that such pushing from the consultants might have helped. Because of our participative philosophy, we may have held back from being overly directive; but it seems that if a truly collaborative climate has been established, consultants need not hold back from making directive suggestions, as long as decision making remains with the stakeholders.

Based on the successes here, both the stakeholders and the consultants strongly encourage others to involve key stakeholders heavily throughout the process while using the expertise of consultant/evaluators to manage project details. We also learned the value of building the evaluation process into regular program development. In this case, the evaluation was linked with the agency's strategic plan and even its capital fundraising campaign, enhancing its value and relevance. Finally, the stakeholders would recommend to others that they trust their instincts. As one respondent put it: "most people close to an organization have a pretty good idea of
G. Quintanilla, T. Packard / Evaluation and Program Planning 25 (2002) 15±22 21

what information they need." A consultant's role then becomes providing expertise: helping the stakeholders put program models and objectives into evaluable form, and suggesting valid and reliable data collection options.

One final lesson, perhaps obvious but worth emphasizing, is that developing a comprehensive evaluation that builds ownership and stakeholder participation takes time. This is particularly true with volunteer stakeholders who have other responsibilities. Meetings may need to be scheduled on evenings and weekends, and spread out over time. Consultants and stakeholders should be patient as well as focused, and should keep a long-term perspective. This will be easier if the evaluation is seen as the development of a new and valuable element in the life of an organization rather than as a one-shot endeavor.

10. Final thoughts

In a recent discussion of PE, Cousins and Whitmore (1998) identified several issues for consideration in future evaluations. The experiences on this project suggest responses to some of their questions. First, who controls the evaluation is often at issue. A related ethical question concerns who owns the findings and can dictate their use. There are also more technical questions: Who selects participants? How is technical quality defined, and by whom? How will participants be trained? Finally, they ask, what conditions enable PE to flourish? The evaluation process described here was successful due in part to power sharing among all key stakeholders. The evaluation was funded by the agency's main funding source, and one might expect such a funder to dominate decision-making. In fact, agency staff, board members, students, and parents participated fully in all decisions, with the funder's representative acting as an equal partner. The data were seen as owned by the Evaluation Team and the Board of Directors. The evaluation team, with guidance and recommendations from the outside evaluator, made the technical decisions.

This process also confirmed learnings articulated by King (1998). High levels of interpersonal and organizational trust were developed through the collaborative vision-setting process, in which participants created shared meanings of their experiences over time. Perhaps because of the organization's long-standing participative, parent- and student-centered philosophy, its history of community involvement, and its stable, long-term financial support from a local foundation, the power issues that often emerge in formal organizations were rarely relevant here; when they did emerge, they were explicitly addressed through discussions of the goals and processes of the evaluation and the roles of each participant. In fact, the only substantive issues concerned consultant suggestions to consider making the evaluation more rigorous in traditional ways (e.g. the use of a control group, standardized measurement instruments, and sampling techniques), and these were easily resolved after discussions of the pros and cons of the various options in this setting. The consultants provided technical suggestions and guidance, and facilitated discussions of issues such as questionnaire construction and the limitations of methods that did not meet traditional standards such as randomization and control groups. The team took the time to address such issues thoroughly, even when this slowed the process. Doing so in a collaborative spirit seemed to work well, with the team making the key decisions and any resulting methodological limitations noted.

Since its creation in 1964, EIS has valued children's inquiry about their world. The principles of appreciative inquiry and organizational learning seem to have worked here largely because they were consistent with the agency's philosophy of creating a positive and exciting learning environment for students and nurturing their inherently inquisitive nature. These principles were implemented in both explicit and subtle ways. The initial evaluation design was known as the 'Learning Lab.' Each meeting of the evaluation team began with a review of the goals of the evaluation: to learn about program outcomes and accomplishments, and to learn how the program could be improved. Each meeting ended with a critique of the meeting's process and the results to date, with refinements and new actions planned to ensure that the group built upon what it was learning and improved its operations. Appreciative inquiry principles were reflected in particular in the framing of the evaluation's purposes and questions in positive terms rather than as a focus on problems. The initial vision of the Learning Lab was based on positive principles such as the empowered child, strong families, and beloved community, and emphasized activities including 'constant affirmation and encouragement' (Elementary Institute of Science, 1997). Meeting critiques focused not only on opportunities for improvement but also on recognition of what was going well and appreciation for those involved.

EIS was able to design an evaluation process consistent with its mission, vision, values, philosophical assumptions, and goals. Support and involvement from its stakeholders, including staff, students, parents, volunteers, board members, and funders, helped EIS throughout the design and implementation of its evaluation process. Their participation ensured that the variables studied were those considered important. Despite differences in age and education, members of the evaluation team came together easily around the common purpose of implementing the evaluation. The organization has a long-established culture of valuing diversity, particularly with respect to ethnicity and age, and this expedited team building on the evaluation to the extent that little specific attention needed to be paid to relationship building.

Finally, this project supports conclusions by Whitmore (1998) regarding the essential ingredients for a successful participative evaluation. There was a receptive context, with
staff, board, youth, and parents, as well as the agency's main funder, being interested in the evaluation; the evaluators were committed to a participatory process and possessed strong 'people skills' in areas such as meeting facilitation; and enough time was allowed, over a period of three years, for the vision of the evaluation to be developed and then implemented. All of these factors contributed to building agency capacity for organizational learning and evaluation, so that the process has been institutionalized and has become part of the culture of the organization.

Acknowledgements

Thanks to Ms Doris Anderson, Dr Billy Vaughn, Ms Teresa Lingafelter, and the rest of the Evaluation Team at EIS; and to the Jacobs Center for Nonprofit Innovation for their support.

References

Argyris, C., & Schon, D. (1996). Organizational learning II: Theory, method, and practice. Reading, MA: Addison-Wesley.
Brownstein, E., & Destino, T. (1995). Science enrichment outreach. Science Teacher, 62(2), 28–31.
Burke, B. (1998). Evaluation for a change: Reflections on participatory methodology. In E. Whitmore (Ed.), Understanding and practicing participatory evaluation. New directions for evaluation, Vol. 80 (pp. 43–56). San Francisco: Jossey-Bass.
Campbell, J., Voelkl, K., & Donahue, P. (1998). NAEP 1996 trends in academic progress. US Department of Education, National Center for Education Statistics. (On-line). Available: http://nces.ed.gov/NAEP/96report/97986.shtml (October 1998).
Cooperrider, D. (1996). The child as agent of inquiry. OD Practitioner, 28(2), 5–11.
Cooperrider, D., & Srivastva, S. (1987). Appreciative inquiry in organizational life. In W. Pasmore & R. Woodman (Eds.), Research in organizational change and development (pp. 129–169). Greenwich: JAI Press.
Cousins, J., & Whitmore, E. (1998). Framing participatory evaluation. In E. Whitmore (Ed.), Understanding and practicing participatory evaluation. New directions for evaluation, Vol. 80 (pp. 5–24). San Francisco: Jossey-Bass.
DiBella, A., & Nevis, E. (1998). How organizations learn. San Francisco: Jossey-Bass.
Elementary Institute of Science (1996). 31 years of hands-on science enrichment. Science News, 1, 1.
Elementary Institute of Science (1997, May 31). Learning lab. Available from the Elementary Institute of Science, 588 Euclid Ave., San Diego, CA 92114.
Elementary Institute of Science (2000). A campaign for young minds. Available from the Elementary Institute of Science, 588 Euclid Ave., San Diego, CA 92114.
Goodman, I., & Rylander, K. (1993). An evaluation of children's participation in the hands on science outreach program. Cambridge: Sierra Research Associates.
Jacobs Center for Nonprofit Innovation (1997). Elementary Institute of Science: Campaign planning study. San Diego: Jacobs Center for Nonprofit Innovation.
King, J. (1998). Making sense of participatory evaluation practice. In E. Whitmore (Ed.), Understanding and practicing participatory evaluation. New directions for evaluation, Vol. 80 (pp. 57–68). San Francisco: Jossey-Bass.
Minnett, A. (1999). Internal evaluation in a self-reflective organization: One nonprofit agency's model. Evaluation and Program Planning, 22, 353–362.
Patton, M. (1997). Utilization-focused evaluation. Thousand Oaks, CA: Sage Publications.
Schorr, L. (1997). Common purpose. New York: Anchor Books Doubleday.
Sorensen, P., Yaeger, T., & Nicoll, D. (2000). OD Practitioner, 32(1).
Turnbull, B. (1999). The mediating effects of participation efficacy on evaluation use. Evaluation and Program Planning, 22, 131–140.
Whitmore, E. (1998). In E. Whitmore (Ed.), Understanding and practicing participatory evaluation. New directions for evaluation, Vol. 80 (pp. 95–100). San Francisco: Jossey-Bass.
Worch, E., Gabel, D., & Odell, M. (1994). Saturday science QUEST: A science enrichment program for elementary children and preservice elementary teachers. School Science and Mathematics, 94(8), 401–406.
