
Practice: Get quality data into your evaluator's hands
Key Action: Use techniques to ensure valid and reliable data

VIGNETTE: Challenges of an MSAP Rigorous Evaluation for an Interdistrict Program

Purpose: Conducting a rigorous evaluation poses particular challenges for interdistrict magnet programs. For example, communication and data collection must be coordinated across multiple districts, and appropriate comparison schools may need to be identified outside the interdistrict consortium. In this interview, an interdistrict magnet director describes challenges that emerged throughout the rigorous evaluation process, and the strategies her evaluation team used to overcome them. Reflecting on her lessons may be useful as you consider the feasibility of a rigorous evaluation design in your own district.

Note: Although this vignette addresses the evaluation challenges of interdistrict programs, the insights about communication and comparison school selection may be useful for all magnet programs.

Source: Interview with Karla Fawbush, Director of Magnet School Programs, Northwest Suburban Integration School District, MN, on November 21, 2008.

Questions for Reflection


1. What rigorous evaluation issues do you have in common with the Northwest Suburban
Integration School District?

2. What additional issues may you need to address as you design your plan for rigorous
evaluation?

3. How would you assess your district’s or consortium’s capacity to support a rigorous
evaluation of your magnet programs?

4. Are there rigorous evaluation demands you anticipate your district or consortium cannot
meet? If so, how might this affect your evaluation design?


Challenges of an MSAP Rigorous Evaluation for an Interdistrict Program

Background: Northwest Suburban Integration School District (NWSISD) staff received federal
Magnet Schools Assistance Program (MSAP) funding to support new magnet schools in both the
2004 and 2007 cycles. After the district's 2004 MSAP proposal for conducting a rigorous
evaluation was denied, its 2007 proposal was accepted. In this interview, Karla Fawbush,
Director of Magnet School Programs for the interdistrict consortium, describes how the evaluation
team used lessons learned during the 2004 cycle to create the rigorous evaluation design for the
2007 proposal. One aspect of the improved 2007 design was a stronger process for selecting
appropriate comparison schools.

Some history

The NWSISD is a consortium of seven school districts created in 2001 after two of the districts
were declared "racially isolated" by the Minnesota state legislature. In Minnesota, if a district has
20% more students of color than a contiguous district, it is considered racially isolated. The
schools and communities of the NWSISD work together to create solutions to intentional or
unintentional segregation. One of these solutions is the formation of magnet schools. The two
other interdistrict consortiums in the metro area run the operations of their magnet schools
themselves; in contrast, the NWSISD has 15 magnet schools run by the individual districts in its
consortium. A student can choose one of three magnet themes: Science, Technology, Engineering
and Math (STEM); International Baccalaureate (IB); or Visual, Performing, Literary and New
Media Arts. That student can start elementary school in one district, continue on to middle school
or junior high in another district, and then finish the strand in high school in a third district.

One challenge for our consortium is coordinating the magnet school operations of our seven
member districts, which vary in size, infrastructure, and programs. We have small districts with
two to four schools, and we also have the largest district in the state. Members of our consortium
include urban, suburban, and rural districts. They differ, for example, in how they coordinate
curriculum and staff development, how they transport students, and how they manage student
data. Even before we dealt with the issue of the rigorous evaluation, we had to learn a great deal
about how each district functions and whom to contact for various kinds of information. For
example, some districts have a staff person for communications, while others do not. Specific
roles and responsibilities vary greatly from district to district. We had to work out the complexities
of being an interdistrict consortium, so that by the time we were awarded the second MSAP grant,
which included the rigorous evaluation, we had much of that structure in place.


The challenge of finding comparison schools

There are some challenges specific to being an interdistrict program when it comes to finding
comparison schools for a rigorous evaluation. For example, most of our member districts have
elementary schools, middle schools, and high schools, but one of our bigger districts has
elementary schools that run through 6th grade and junior highs that run through 9th grade. In
preparing to gather data to compare schools, we need to take into account that one high school
may have students in grades 9-12 while a potential comparison school might have grades 10-12.
Since the state writing test takes place in 9th grade, we would either decide not to use that test
as a measure of student achievement, or we would want to find another comparison school.
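
One simple way to screen for this kind of mismatch is to check whether a tested grade falls within both schools' grade spans before adopting that test as an outcome measure. The sketch below is illustrative only, not the district's actual tooling; the grade spans and test-grade assignments are hypothetical placeholders.

```python
# Illustrative sketch (hypothetical values): drop outcome tests whose
# tested grade is not served by BOTH the magnet school and a candidate
# comparison school.

MAGNET_GRADES = range(9, 13)       # hypothetical: magnet high school serves grades 9-12
COMPARISON_GRADES = range(10, 13)  # hypothetical: candidate serves grades 10-12

# State tests and the grade in which each is administered (hypothetical)
STATE_TESTS = {"reading": 10, "math": 11, "writing": 9}

def usable_tests(magnet_grades, comparison_grades, tests):
    """Return the tests administered in a grade served by both schools."""
    shared_grades = set(magnet_grades) & set(comparison_grades)
    return [name for name, grade in tests.items() if grade in shared_grades]

print(usable_tests(MAGNET_GRADES, COMPARISON_GRADES, STATE_TESTS))
# -> ['reading', 'math']  (the 9th-grade writing test drops out)
```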

A major challenge in our rigorous evaluation design was selecting comparison schools for our
three new MSAP high schools, two of which are urban. The first step to finding appropriate
comparison schools was trying to match schools based on head count and demographic data: the
total number of students; the percentages of students who receive free and reduced-price lunch;
those who have limited English proficiency; and those who are part of a minority ethnic or racial
group. We focus mainly on our black population because that’s the largest racial/ethnic group in
the two urban schools. For our more suburban magnet high school, it was fairly easy to find a
comparable school since there are other similar suburban schools within our consortium. We
simply went to the Minnesota Department of Education (MDE) website and looked at the October
1, 2006, demographic data to find a school within a member district that had similar
subpopulations to those in our magnet high school.
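
A minimal sketch of the kind of demographic screening described here appears below. The school names, numbers, and field names are hypothetical; in practice the inputs would come from the MDE enrollment files, and the closest matches would still be vetted by hand for program overlap.

```python
# Illustrative sketch (hypothetical data): rank candidate comparison
# schools by demographic similarity to a magnet school, using the kinds
# of measures named above -- total enrollment, percent free/reduced-price
# lunch, percent limited English proficient, percent Black students.

MAGNET = {"enrollment": 800, "pct_frl": 55.0, "pct_lep": 12.0, "pct_black": 40.0}

CANDIDATES = {
    "School A": {"enrollment": 900,  "pct_frl": 52.0, "pct_lep": 10.0, "pct_black": 38.0},
    "School B": {"enrollment": 2400, "pct_frl": 30.0, "pct_lep": 4.0,  "pct_black": 15.0},
    "School C": {"enrollment": 750,  "pct_frl": 60.0, "pct_lep": 15.0, "pct_black": 45.0},
}

def distance(magnet, candidate):
    """Sum of absolute percentage-point gaps, plus the enrollment gap
    scaled so a 100-student difference counts like a one-point gap."""
    d = abs(magnet["enrollment"] - candidate["enrollment"]) / 100.0
    for key in ("pct_frl", "pct_lep", "pct_black"):
        d += abs(magnet[key] - candidate[key])
    return d

ranked = sorted(CANDIDATES, key=lambda name: distance(MAGNET, CANDIDATES[name]))
print(ranked)
# -> ['School A', 'School C', 'School B']  (closest match first; program
#    offerings still need a manual check, as the interview describes)
```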

But with the two more urban magnet high schools, it was more difficult. There are no schools of
any size that are similar in demographics within our member districts, so we had to go outside of
the consortium to find comparison schools. This was an additional complication to factor into the
process. We always start with the data that’s downloadable from the MDE website—head counts
and demographic information—but, of course, there’s additional information that we need for the
rigorous evaluation, so we need to obtain that data directly from the individual districts. We
narrowed the search for comparable schools to our local metro area because we thought it would
be easier to contact those districts to get their assistance. Initially we were hoping to find a
Minneapolis school because we had already gotten data from them during the previous grant
cycle. But as it turned out, the Minneapolis school district is starting its own programs with
treatments that are similar to those of the NWSISD magnet schools, so none of those schools
would have made a good comparison school.

After reviewing the data, we discovered that there were about five high schools in the metro area
that were pretty close to our two urban magnet schools in terms of demographics. They weren't
perfect matches because school size differed, but we knew an exact size match would be difficult
to find: one of our urban magnet high schools has about 800 students and the other has several
thousand. Then, among the five potential comparison schools that were the closest demographic
match, we also had to verify their programs. It turned out that one of the stronger comparison
school matches was just starting an International Baccalaureate Diploma Programme, which is
too similar to what we are implementing in our own magnet schools. So we went with the next
closest match. From there, we called their assessment and data management people and got
permission to collect and use the data from their district for the rigorous evaluation.


Communicating with comparison schools outside our consortium

The staff members who work with the data have been very gracious about sharing it, but some of
the things we need for the rigorous evaluation, like pulling grade-point averages and individual
test score data, aren't tasks they regularly perform. We appreciate their collaboration and their
agreement to continue this process over the course of three years, even though they are not part
of our consortium.

Interacting with so many administrators throughout the consortium gives me the opportunity to
make our evaluation as thorough as possible. For example, through a very skilled assessment
director in one of our districts, we've learned about potential issues with one of the state tests.
Originally, we planned to use the math, reading, writing, and science tests, but after talking with
this assessment director, we felt that the science test was too new to use this year in the rigorous
evaluation. Since last year was the first time the science test had been administered, some
students taking the test on the computer experienced technical difficulties. In addition, the science
test wasn't used for AYP (Adequate Yearly Progress), and none of the three new Northwest
Suburban magnet schools has science as its principal theme. We factored those points into the
final evaluation plan and did not use the science test in the data analysis for the first rigorous
evaluation report.
