
NATIONAL FORUM OF EDUCATIONAL ADMINISTRATION AND SUPERVISION JOURNAL VOLUME 31, NUMBER 1, 2013

UTILIZATION OF 360-DEGREE FEEDBACK IN PROGRAM ASSESSMENT: DATA SUPPORT FOR IMPROVEMENT OF PRINCIPAL PREPARATION
Bobbie Eddins
Jeffrey Kirk
Dorleen Hooten
Brenda Russell
Texas A&M University-Central Texas
ABSTRACT

Crucial to each school leader's success in the complex environment of a prek-12 campus is development of knowledge and skill in a relevant preparation program. In collaboration with school district leaders and school leadership practitioners, faculty members in the school of education at a regional university have designed and implemented a learn-as-you-go self-assessment approach for continuous improvement of the master's degree/principal certification program for prek-12 school leaders. Grounded by a focusing mission and an action research case study design, the approach utilizes 360-degree feedback that provides multiple-source data linked to the state and national principal standards. It borrows the circular concept from the widely utilized individual 360-degree feedback assessment practice that provides both depth and breadth concerning a leader's effectiveness. Data from internal and external 360-degree feedback sources include an ongoing review of school leadership literature, a self-assessment by program faculty, a critical review by educational leadership experts, an analysis of internal and external student performance data, focused conversations with advisory groups, and perceptions of program completers as well as their supervisors as they move forward on professional leadership pathways. This feedback supports data-informed decisions that strengthen key program components: student recruitment and selection, program curriculum, instructional delivery including a rigorous internship and mentoring support, stakeholder involvement, program staffing and faculty development, and program planning and evaluation.

Introduction

From the moment school leaders are hired, they are expected to mobilize the school community in improvement efforts focused on success for all students. Leading schools today means developing a culture of joint responsibility focused on a vision of prosperity, opportunity, and creativity (Hargreaves & Shirley, 2009). It requires growing student, teacher, staff, parent, and community leaders who are engaged in shared inquiry and innovative responses concerning learning challenges across an increasingly diverse community of practice (Seashore Louis, Leithwood, Wahlstrom, & Anderson, 2010; Sergiovanni, 2006; Glickman, 2003). Crucial to each school leader's success in this complex environment is development of knowledge and skill through relevant preparation.

Principal preparation program quality is habitually criticized by policy makers, school leaders, scholars, and professional organizations. Concerns are raised about the relevance of practice-to-theory connections related to real-time school purpose and capacity building (Fullan, 2009; Orr, King, & LaPointe, 2010); the need for a better aligned and more rigorous curriculum (Darling-Hammond, LaPointe, Meyerson, & Orr, 2009; Lashway, 2003); the lack of substantial clinical experiences (Frye, Bottoms, & O'Neill, 2005; Levine, 2005); and the underutilization of program assessment (Orr, 2009). In short, for principal preparation program completers to make an impact as school leaders, they must engage in program learning that is relevant to prek-12 school environments, delivered in a coherent and engaging sequence, facilitated by knowledgeable and experienced faculty, and assessed through both internal and external measures of effectiveness.

The Critical Role of Program Assessment

A growing emphasis on accountability for educator preparation continues to increase motivation for ongoing assessment about the program and quality guarantees for program completers. Program evaluative assessment, the process of systematically collecting and analyzing data about a program to determine its significance and improve its performance, provides valid and reliable information to decision makers about program results. As with any assessment process, data is collected that supplies answers about key program components and outcomes. Inquiry concerning program improvement includes questions such as: Does the program succeed in doing what it was meant to do? How effectively is the program functioning? What modifications are needed to meet program goals?

Meaningful assessment of school leadership preparation is critical to continuous program improvement, particularly in view of the constantly shifting requirements of certifying entities and the ever-changing knowledge and skill set associated with school leadership roles. The importance of this type of assessment for principal preparation programs rests in its usefulness as footing for change that will improve student learning and completer success. Effective programs continually assess themselves against their vision, mission, and goals, leveraging improvement based on measurable data rather than impulse or tradition. Program self-assessment in partnership with the program's many stakeholders provides positioning that reflects a broad consensus concerning the most efficacious approaches to preparing school leaders (Glasman, Cibulka, & Ashby, 2002, p. 283). The results provide program faculty, institutions, and consumers with information to inform decision-making and policy development.

Those closely involved with preparation programs depend on assessment processes to answer a number of questions related to the effectiveness of their programs. Program-specific focusing questions may include: What added benefit or value do program participants receive? How well prepared are the program completers for entry-level school leadership positions? How do employers feel about their school leaders who completed the program? How is the program perceived by participants, completers, the university, and the field? What impact do the completers have on the school campuses they serve? (Texas Principals Leadership Initiative, 2002).

The answers to the questions come from a variety of sources in the form of perception and performance data. In much the same way that a leader's performance is scrutinized by internal and external stakeholders in the widely accepted practice of individual 360-degree feedback (Bracken, Dalton, Jako, McCauley, & Pollman, 1997), the school leadership preparation program may be centered within a circle of known and trusted internal and external feedback sources. When the many perception and performance data sources impart a surround-sound or 360-degree effect, the combined evidence offers a more inclusive look at program effectiveness (Hernez-Broome & Hughes, n.d.). This feedback, coupled with relational supports such as coaching and action research, offers a path forward for mindful program improvement.

An Assessment Approach to Anchor Continuous Improvement

In collaboration with regional school district leaders and school leadership practitioners, school leadership program faculty members in the department of professional education and policy studies within a regional university's school of education have designed and implemented a learn-as-you-go planning and assessment approach for continuous improvement of the school leadership preparation program. All improvement efforts are anchored by a focusing mission crafted and utilized by program faculty and stakeholders: "The School Leadership Preparation Program prepares school leaders who are capable of facilitating the creative work of school communities that leads to student and school success in the midst of a constantly changing environment" (Department of Educational Leadership and Policy Studies at Texas A&M University-Central Texas, n.d.).

The following program learning goals, coupled with defining sets of competencies, track the state principal standards and mirror the state certification test framework: (1) improvement of a school community, including vision, culture, communication, collaboration, and ethical leadership; (2) development, implementation, and evaluation of a school's instructional program which meets the needs of all students and includes curriculum, instruction, faculty and staff growth, and instructional problem solving and decision making; and (3) assurance of a safe and effective learning environment, including resource use and physical and support system management (Texas Education Agency, 2010). In addition, the program curriculum is organized by a set of integrated strands of learning in crucial areas such as generative and systemic decision-making processes, integrated system management, ethical leadership processes, and social justice and equity that unfold and are practiced across course activities, all organized in a Program Instructional Strands by Course Alignment Learning Matrix tied to state and national school leadership standards as well as the NCATE ELCC program standards. Unpacking program curricula across courses ensures learning in all goal areas; however, the skillful integration of key elements is critical to school leadership effectiveness (Robinson, 2010; Tucker, Young, & Koschoreck, 2012).

The case study format is built around a set of seven key components: student recruitment and selection, program curriculum, instructional delivery including a rigorous internship and mentoring support, stakeholder involvement, program staffing and faculty development, program improvement planning, and program evaluation. The corresponding continuums for the components, which provided the spine for a multi-year state principal preparation improvement process (Texas Principals Leadership Initiative, 2002), are consistent with a more recently developed list of common features from an in-depth study of eight leadership development programs (Darling-Hammond, LaPointe, Meyerson, & Orr, 2009). Based on findings from the study, special emphasis wording has been added to the original continuums for student support, practicum experiences, and the internship experience. Initial ratings about approach, implementation, and outcomes for each of the seven component continuums have given program faculty a baseline for tracking improvement within the case study format. Exemplar Approach language, with special emphasis wording in brackets for each continuum, is provided below (Texas Principals Leadership Initiative, 2002).

Program Involvement of Stakeholders

Faculty view collaboration between internal and external individuals and groups as critical to keeping the program aligned and viable. Input from a large group of stakeholders (students, graduates, practitioners, and other university faculty) is solicited on a regular basis using multiple methods. Involvement in implementing the vision of the program is integrated in all improvement undertakings. The faculty is actively engaged in work with personnel from schools, districts, and state agencies. These personnel serve on advisory groups for the program.

Program Improvement Planning

Engagement in planning is proactive and focused on maintaining the vitality of the preparation program. Complacency is low and the sense of urgency is high; a persistent discomfort with the status quo characterizes the faculty's approach to improvement planning. A clearly defined continuous improvement process is used to introduce and test proposed program changes. Alignment with the faculty's shared purpose and vision for the program is the litmus test for any proposed change. Program improvement is a collaborative rather than individual undertaking. Multiple types and sources of data [tied to a set of goals linked to state and national school leadership standards] are used to guide program improvement decision making.

Program Content

The program curriculum is a set of carefully sequenced learning experiences focused on the principal's primary role in providing leadership for the improvement of teaching and learning in elementary and secondary schools. Integrated into the program content is a continuum of knowledge and skills that becomes more sophisticated as the program progresses. Technical knowledge is examined in terms of its relationship to the teaching and learning process (e.g., instead of learning a school finance formula, the focus is on learning how to budget allocated resources to achieve the school's improvement goals). Course offerings vary in format and credit designation.

Program Delivery

The faculty [in consultation with program stakeholders] decides on the types of learning experiences needed by the students as they progress through the program. Accordingly, faculty members, individually or as teams, introduce and use a wide variety of instructional strategies (problem-based learning, action learning, real-time practicum activities) and media. Students are members of cohort groups that are considered to function as learning communities. [Expert mentoring and peer coaching are utilized.] Performance is assessed using departmental rubrics that are published and which include a variety of perspectives: self, peer, and faculty. Group performance is also assessed. Students receive regular and ongoing feedback about their performance [on both coursework and in an intensive internship anchored by standards].

Program Staffing and Faculty Development

The university's strategic plan gives priority to the recruitment and hiring of a core of full-time faculty with contemporary professional experience. A proactive approach is used to identify and recruit high-caliber full-time and part-time faculty. Part-time faculty are used to provide expertise not possessed by the full-time faculty and are considered a key part of the faculty. Full-time faculty teach the same number of courses or hours per week as their peers in other graduate programs, have equivalent advising loads, receive equivalent compensation, and are rewarded equally for the quality of their teaching and their engagement in scholarship. The need for faculty to be engaged in the field with practitioners is valued [and] workloads are adjusted to accommodate work in the field. Resources are allocated to address development needs of the entire program faculty.

Student Recruitment and Selection

The faculty has developed a profile or portrait of the successful student. There is a written recruitment strategy that is shared with schools and districts for feedback and to launch aggressive recruitment initiatives. The goal is to recruit a diverse, capable, high-potential student body. Criteria for student selection are published and used systematically during the selection process. Admission decisions are made by the faculty upon review of the admission documents. There is conscious control over the number of students admitted. Commitment to a comprehensive set of measures is consistent with the demands and expectations of the program.

Program Evaluation

All aspects of the program are rigorously evaluated in an ongoing effort to keep the program aligned with the needs of the field and best practices in leadership preparation. Feedback from those with an interest in the program is sought on a regular basis. There is a systematic plan for assessing the objectives of the program, the quality of instruction and instructors, and the nature of the curriculum and its fit with student needs. The adopted evaluation plan calls for the use of multiple forms of data in addition to feedback from students and practitioners.

360-Degree Feedback Sources and Measurements

Grounded by the focusing mission and the case study components, the continuous improvement approach utilizes 360-degree feedback that provides multiple-source, multiple-tool data linked to the state and national (ISLLC) school leadership standards as well as NCATE ELCC program standards. Some assessment data, such as performance on course practicum activities and final assessments, student course evaluations, internship performance, and the master's comprehensive examination, provide more of an internal data picture. External measures include performance on the state certification examination as well as perception data from program completer and school district satisfaction surveys, providing rich data about program effectiveness. This feedback supports data-informed decisions that strengthen the seven key program components.

Student Perceptions and Performances

Feedback from students ensures an abundance of data about their knowledge, skill, and satisfaction in relation to program learning. Assessments to measure student learning are embedded in every course, measuring movement towards mastery of course learning objectives through a common set of program Grading Criteria and Rubrics. Program rubrics are used for collaborative participation, written products, course presentations, and professional and school portfolio development. Assessments for learning such as case studies, action learning, problem-based learning projects, and action research are built into course assessment plans. Many course assessments bear the label of practicum activity because of the application to issues and opportunities in students' professional settings. Assessment of learning measurements include student performance on the following: course final assessments covering all course and learning objectives, integrated practicum projects, action planning and performance during the internship experience, and performance on the master's degree comprehensive examination and the state principal certification examination.

The use of student course evaluations provides a tool to measure student perceptions concerning the relevance of content and the effectiveness of delivery. Students are also surveyed as they exit the program upon completion and again when they serve in school leadership roles. Completer data is organized by role and years beyond completion.

School District Perceptions about Completer Performance

Of increasing importance to the continuous improvement work by principal certificate program faculty are the perceptions and recommendations of school district, community, and state stakeholders. Principal mentors who supervise students' internship experiences and coach program alumni in school leadership positions are surveyed concerning their perceptions of student and alumni performance. Performance feedback for the university's educator preparation programs is included in the state's Accountability System for Educator Preparation (ASEP). Information included in this system, along with comprehensive needs assessment data provided annually by the region's districts, informs program policy and practice recommendations advanced twice a year by the Strategic Partners Education Advisory Council (SPEAC), a 43-member stakeholder council that ensures a range of perspectives as it works to fulfill its commitment to provide recommendations for continuous improvement of all educator preparation programs housed at the university.

Early in planning efforts, SPEAC organizers determined that an organized practitioner voice for each educator certification area was needed to inform the council's work. Certificate Area Practitioner Sub-Councils (CAPS) have now been established to examine research, best practice, and current program data, with recommendations for the certificate areas then routed through the larger council. Thus, the Principal CAPS advisory group has been convened as a 24-member sub-council, with participation by some of the region's most effective elementary, middle, and high school principals. They represent a diverse mix of campuses by size, location, and district affiliations in the region. They provide perception data about alumni performance in their districts, review data collected from other feedback sources, and make recommendations for mindful program improvement.

Experts from the Field of Educational Leadership

Student and school district feedback is bolstered by a periodic external Critical Friends Review, which provides feedback to the faculty engaged in the work of program improvement. It is not a "gotcha" process; rather, it is a comprehensive examination of current program practice by respected colleagues engaged in the field of educational leadership in programs in other states. The feedback clusters around the seven program quality components, providing suggestions that will leverage improvement defined by the component continuums.
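As a purely illustrative sketch, and not a description of the program's actual tools, the Python fragment below shows one way multi-source feedback of this kind might be rolled up by component. The feedback source names, the 1-4 continuum rating scale, and the component keys are hypothetical assumptions chosen only to make the idea concrete.

```python
from collections import defaultdict

# The seven program quality components named in the case study format.
COMPONENTS = [
    "student recruitment and selection",
    "program curriculum",
    "instructional delivery",
    "stakeholder involvement",
    "program staffing and faculty development",
    "program improvement planning",
    "program evaluation",
]


def summarize_feedback(ratings):
    """Average hypothetical 1-4 continuum ratings per component across sources.

    `ratings` maps a feedback source (e.g., a faculty self-assessment, a
    completer survey, a critical friends review) to a dict of
    component -> rating. Only components that received at least one
    rating appear in the summary.
    """
    totals = defaultdict(list)
    for source, component_ratings in ratings.items():
        for component, rating in component_ratings.items():
            totals[component].append(rating)
    return {
        component: sum(totals[component]) / len(totals[component])
        for component in COMPONENTS
        if totals[component]
    }


# Hypothetical example: two sources rating two of the components.
example = {
    "faculty self-assessment": {"program curriculum": 3, "program evaluation": 2},
    "completer survey": {"program curriculum": 4, "program evaluation": 3},
}
print(summarize_feedback(example))
# {'program curriculum': 3.5, 'program evaluation': 2.5}
```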

Program Improvement Highlights

Faculty members use data from all of the measurement tools to strengthen course content and instructional delivery. The Course Design Reflections process supplies each course instructor with a protocol to synthesize multiple sources of data and make decisions about needed improvements. Each instructor is responsible for any needed content revision, resource update, and activity and assessment adjustment. Full-time and adjunct program faculty meet twice a year to check for alignment issues and depth of content coverage related to the state principal standards, the domains and competencies of the principal certification examination framework, the 2008 leadership standards from the Interstate School Leaders Licensure Consortium, and the leadership preparation program standards from the NCATE/Educational Leadership Constituent Council. The Program Instructional Strands by Course Alignment Learning Matrix is used to anchor the alignment discussion.

Course reflection and collaborative alignment efforts have yielded some very useful assessment tools, including a case study group response format for analyzing and responding to ethical-issue case studies; a template for portfolio artifact summary and reflection development; and templates for diagnosing and leveraging processes, relationships, and group cultures within a complex system.

Over a three-year period, the course assessment data and course design reflections have provided the impetus to transform program instructional delivery from predominantly face-to-face to a hybrid or blended format, with 49% of course instruction occurring online. The updating and improvement of course resources, objectives, and learning activities to strengthen the relevance of course learning is ongoing. Intensive work has also been completed that pulls strands of learning for social justice and ethical decision making across the courses. Program faculty has also worked to ensure greater skill development in the latest technologies related to course curriculum.

In addition to being part of the 360-degree feedback loop, the Principal CAPS advisory group provides political and structural support as it carries some of the freight involved in change efforts: carefully examining all perception and performance data, determining the issues to address each year, uncovering strategies to meet identified needs, and hosting a collective voice. To date, the group has recommended more of a practice-to-theory approach to program design, a longer and more comprehensive internship experience, a more customized recruitment and preparation approach for individual school districts, a more in-depth selection process for program participants, and maintenance of a blended rather than totally online program delivery. Plans are underway to implement all of the recommendations.

Improving the program has also included an ongoing search for additional feedback sources and measurement tools. One example is the need for measurement and feedback data to gauge outcomes related to leadership effects on prek-12 school conditions and school improvement. Housed at the National Center for the Evaluation of Educational Leadership Preparation and Practice and based on the NCATE ELCC standards, the School Leadership Preparation and Practice Survey (SLPPS) series has been chosen as a step towards meeting this need.

Developed to measure key concepts in a recently developed conceptual framework of leadership preparation inquiry, the series includes the following: (1) a survey of preparation program features to be completed by program faculty, (2) a school leader survey for program completers and school leader alumni, and (3) a survey for teachers who work with school leaders who are program alumni (Pounder, 2012). It will be used for identification of areas for program and course improvement; for development of a case for program resources and support; and for research focused on the relationship between program design/delivery, graduate outcomes, and on-the-job school improvement work.

Conclusion

In summary, meaningful course and program assessment for a school leadership preparation program is crucial to its continuous improvement, particularly in view of the constantly shifting requirements of certifying entities and the ever-changing knowledge and skill set associated with school leadership roles. For program completers to make an impact as school leaders, they must engage in learning that is relevant to prek-12 school environments, delivered in a coherent and engaging sequence, and assessed by both internal and external measures of effectiveness. Development, implementation, and improvement of such a program has been the goal of program faculty at a regional university. Informing the field of the faculty's journey extends the dialogue, offering a realistic look at a comprehensive, 360-degree feedback-driven case study method that provides mindful support for continuous program improvement.

References

Bracken, D., Dalton, M., Jako, R., McCauley, C., & Pollman, G. (1997). Should 360-degree feedback be used only for developmental purposes? Greensboro, NC: Center for Creative Leadership.

Darling-Hammond, L., LaPointe, M., Meyerson, D., & Orr, M. (2009). Preparing school leaders for a changing world. San Francisco, CA: Jossey-Bass.

Department of Educational Leadership and Policy Studies at Texas A&M University-Central Texas. (n.d.). About us: School leadership program mission statement. Retrieved from http://www.ct.tamus.edu/departments/educationleadership/index.php

Frye, B., Bottoms, G., & O'Neill, K. (2005). The principal internship: How can we get it right? Atlanta, GA: Southern Regional Education Board.

Fullan, M. (2009). Motion leadership: The skinny on becoming change savvy. Thousand Oaks, CA: Corwin.

Glasman, N., Cibulka, J., & Ashby, D. (2002). Program self-evaluation for continuous improvement. Educational Administration Quarterly, 38, 257-288.

Glickman, C. (2003). Holding sacred ground. New York, NY: Wiley and Sons.

Hargreaves, A., & Shirley, D. (2009). The fourth way: The inspiring future for educational change. Thousand Oaks, CA: Corwin.

Hernez-Broome, G., & Hughes, R. (n.d.). Leadership development: Past, present, and future [Electronic version]. Human Resource Planning, 24-32. Retrieved from http://www.ccl.org/leadership/pdf/research/cclLeadershipDevelopment.pdf

Lashway, L. (2003). Transforming principal preparation. ERIC Digest. Eugene, OR: ERIC Clearinghouse on Educational Management.

Levine, A. (2005). Educating school leaders. Princeton, NJ: Education Schools Project.

Orr, M. T. (2009). Program evaluation in leadership preparation and related fields. In M. D. Young & G. Crow (Eds.), Handbook of research on the education of school leaders. New York, NY: Taylor & Francis.

Orr, M. T., King, C., & LaPointe, M. (2010). Districts developing leaders: Lessons on consumer actions and program approaches from eight urban districts. Boston, MA: Education Development Center.

Pounder, D. (2012). School leadership preparation and practice survey instruments and their uses. Journal of Research on Leadership Education, 7(2), 254-274.

Robinson, V. (2010). From instructional leadership to leadership capabilities: Empirical findings and methodological challenges. Leadership and Policy in Schools, 9, 1-26.

Seashore Louis, K., Leithwood, K., Wahlstrom, K., & Anderson, S. (2010). Investigating the links to improved student learning: Final report of research findings. Minneapolis, MN: University of Minnesota.

Sergiovanni, T. (2006). Rethinking leadership. Thousand Oaks, CA: Sage.

Texas Education Agency. (2010). Texas Examinations of Educator Standards (TExES) preparation manual 068. Retrieved from http://texes.ets.org/assets/pdf/testprep_manuals/068_principal_82762_web.pdf

Texas Principals Leadership Initiative. (2002). The Texas leadership preparation network lighthouse initiative: A continuous improvement process for educational leadership programs. Austin, TX: Author.

Tucker, P., Young, M., & Koschoreck, J. (2012). Leading research-based change in educational leadership preparation: An introduction. Journal of Research on Leadership Education, 7(2), 155-171.
