RESEARCH ASSOCIATE
Luke Maher

RESEARCH MANAGER
Lisa Geraci

TABLE OF CONTENTS
I. Research Methodology
II. Executive Overview
III. Assessment Content
IV. Assessment Structure
V. Outreach and Assessment Distribution
VI. Data Use for Institutional Improvement

Project Sources
Education Advisory Board's internal and online research libraries (www.educationadvisoryboard.com)
National Center for Education Statistics [NCES] (http://nces.ed.gov/)
Maples, Glenn, Anna M. Greco, and John R. Tanner. "Post Graduation Assessment of Learning Outcomes." Journal of College Teaching and Learning 5.3 (March 2008): 33-38.
© 2011 The Advisory Board Company
I. RESEARCH METHODOLOGY
Research Parameters
The Council interviewed administrators with oversight of post-graduation student outcomes assessments
at research universities.
University C: Northeast; private, not-for-profit research university (very high research activity); midsize city; enrollment 10,600 / 4,300
II. EXECUTIVE OVERVIEW
Key Observations
All contacts advise against the establishment of an explicit definition of student success, recommending instead that administrators use post-graduation assessments to identify institutional strengths and areas of improvement. Contacts believe that any measurable definition of student success (for instance, a definition based on mid-career income or graduate education rates) inevitably implies that some alumni are not successful, and for that reason such definitions are not useful.
Assessments rely primarily on multiple choice questions, although most assessments include
a small number of qualitative questions as well. Contacts recommend avoiding qualitative-
response questions when possible and reserving necessary qualitative questions until the end of
the assessment.
Most contacts do not survey employers or use social media to track alumni. However,
administrators at some contact institutions maintain institutional databases that track corporate
and employer relationships to cultivate favorable hiring policies.
Assessment data is typically used to advise students about potential careers, promote
outcomes to prospective students, influence strategic planning initiatives, and guide
curricular development for academic units. Contacts recommend distributing assessment data
widely across the institution so that stakeholders can use the data as they see fit.
III. ASSESSMENT CONTENT
Across contact institutions, post-graduation assessments examine six primary subjects: demographics,
career development, graduate and professional education, discipline-specific activities, community
impact, and reflections on undergraduate education.
Demographics
Although institutions already maintain databases of alumni demographic information, contacts suggest
that the collection of demographic data with post-graduate assessments serves two purposes: first, it
provides updates of current demographic information (e.g., place of residence, personal status, etc.); and
second, it allows administrators to easily analyze assessment data based on demographics. For instance, contacts suggest that it is easier to determine the average income level of alumni who majored in a particular discipline if the assessment includes a question on undergraduate major than to cross-reference assessment responses with the institution's database of undergraduate majors.
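As an illustrative sketch of that second purpose (all field names here are hypothetical, not drawn from any contact institution's survey), averaging reported income by undergraduate major becomes a single grouping pass once the major is collected on the assessment itself:

```python
# Illustrative sketch only: field names ("undergrad_major", "income") are
# hypothetical. Rows could come from csv.DictReader over exported responses.
from collections import defaultdict

def average_income_by_major(rows):
    """Average reported income per undergraduate major."""
    totals = defaultdict(lambda: [0.0, 0])  # major -> [running sum, count]
    for row in rows:
        totals[row["undergrad_major"]][0] += float(row["income"])
        totals[row["undergrad_major"]][1] += 1
    return {major: s / n for major, (s, n) in totals.items()}

sample = [
    {"undergrad_major": "Engineering", "income": "70000"},
    {"undergrad_major": "Engineering", "income": "90000"},
    {"undergrad_major": "History", "income": "50000"},
]
print(average_income_by_major(sample))  # {'Engineering': 80000.0, 'History': 50000.0}
```

Had the major not been asked on the survey itself, the same computation would require joining responses against the registrar's records, which is the extra step the contacts recommend avoiding.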
Sample questions:
What was your undergraduate major? Minor? What was your undergraduate grade point average?
What is your gender? What is your race or ethnic group? What is your citizenship?
What is your personal status (e.g., married/living with partner, widowed, separated, divorced,
single)? How many children do you have?
What level of education have your parents (or legal guardians) completed?
Did you receive financial aid as an undergraduate? If so, what kind (e.g., need- or merit-based)?
Career Development
All assessments collect information on career development. Longitudinal studies assess not only a graduate's current employment, but also how his or her career has evolved since graduation (i.e., whether he or she has been employed at multiple companies or in multiple industries). Most contacts suggest that
career advancement is the primary indicator of student success. However, some contacts caution that
income level should not be conflated with career advancement, as consistent work in one industry is often
more reflective of success than high income.
Sample questions:
In what type of organization is your principal employment? In what industry?
What is the name of your current employer? What is your job title?
Which of the following best describes your current employment: entry-level, middle management,
senior management, executive-level, chief executive-level?
Have you been employed in the same field since beginning your career? How many times have
you changed fields of employment?
How satisfied are you with your current career?
How well did your undergraduate education prepare you for your current career?
Graduate and Professional Education
Assessments at all contact institutions collect information on graduate and professional education, such as
the subject area of the graduate degree and whether or not the degree is terminal. Several contacts also
suggest that administrators should assess graduates' informal education (e.g., work-related training and
certifications, not-for-credit programs, etc.). Additionally, all institutions ask about the amount of debt
that alumni have accrued over the course of their education. According to contacts, continuing education
is a primary indicator of success and academic quality, as advancement into graduate education reflects
appropriate undergraduate preparation.
Sample questions:
How many years after your undergraduate degree did you start your graduate or professional
education?
What graduate and/or professional degrees have you received? In what graduate and/or
professional programs are you currently enrolled?
What is the total amount you have borrowed to finance your graduate or professional education?
Your entire education (including undergraduate)?
How well did your undergraduate education prepare you for graduate and/or professional school?
Discipline-Specific Activities
Many contacts recommend allowing academic programs to add department-specific question modules to
assessments. For instance, assessments at several contact institutions include separate modules for
engineering programs, as several important career milestones for engineering graduates (such as professional certifications or membership in professional societies) are not relevant to graduates in any other program. Contacts at University D suggest that each undergraduate academic college develop a
question module specific to its program and distribute it to alumni as a part of the standard alumni
assessment. This information is useful in program development within academic units.
Sample questions:
Did your undergraduate education sufficiently cover governmental technology policymaking for your current career?
What professional certifications have you received?
Are you a member of any professional societies? If so, which societies?
What professional skills are most relevant to your current career? How well did your
undergraduate education help you in developing those skills?
Community Impact
Several assessments collect information on altruism, volunteer work, and community leadership. This
includes financial contributions to charitable causes, leadership in community and non-profit
organizations, and the amount of time spent volunteering or engaging in pro bono work. Contacts
recommend collecting this information because service and altruism are typically part of a higher education institution's mission; administrators can use the data to guide the institution's efforts in service and community leadership.
Sample questions:
In the past twelve months, have you been involved (beyond making a donation) with any charity or non-profit organizations? If so, what type of organizations?
In the past twelve months, have you donated money to a charity?
How often are you involved in volunteer work?
Reflections on Undergraduate Education
Sample questions:
In the past year, have you received any publications about this institution? If so, what
publications?
Have you attended an alumni event in the last five years? If so, what type of event? How did you
find out about the event?
Do you often work with people from other cultures or countries in your current employment?
Based on what you know now, how well did your undergraduate experience prepare you to
understand social problems? To write effectively? To be an active member of your community?
How much emphasis should this institution place on faculty research? Liberal arts courses? Moral
development? Diversity in the student body?
What organizations were you involved in as an undergraduate? What sports?
IV. ASSESSMENT STRUCTURE
Question Structure
Multiple Choice Questions
Assessments across all contact institutions rely primarily on multiple choice questions, which typically
take one of two forms: single-selection or multiple-selection. Single-selection questions are appropriate when asking about demographics or career development, where administrators can assume a single answer (for instance, that respondents have one principal employer). Because they ask for only one answer, single-selection questions are suited to assessing any non-opinion-based information:
What was the total amount of money you borrowed to finance your graduate or professional
education? If you have not yet completed your degree, please estimate the total amount you expect to
borrow. Do not include undergraduate borrowing.
Did not pursue further education
$0
$1 to $9,999
$10,000 to $19,999
$20,000 to $29,999
$30,000 to $39,999
$40,000 to $49,999
$50,000 to $74,999
$75,000 to $99,999
$100,000 to $149,999
$150,000 or more
More than $0, but unable to estimate amount
Multiple-selection questions allow respondents to select all answers that apply, and they are useful for
assessing subjects in which respondents may have multiple interests, involvements, or accomplishments.
They are common in questions regarding graduate and professional education, discipline-specific
activities, community impact, and reflections on undergraduate education:
Please indicate which of the following types of organizations you have participated in without pay in the past 12 months, as well as the capacity in which you participated (response columns: member or participant; officer or leader; monetary contribution beyond dues).
Your secondary school
Your undergraduate institution
Undergraduate fraternity or sorority
Graduate or professional school
Your children's school
Artistic or cultural organizations
Environmental organizations
Hospitals or health-related organizations
Local government boards
Political campaigns or organizations
Professional associations
Social service or welfare organizations
Religious organizations
Youth activities or sports organizations
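When tabulating multiple-selection responses such as the organization grid above, each respondent can count toward several categories at once, so a simple per-option tally is the natural summary. A minimal sketch (the list-of-lists response format below is an assumption for illustration, not a format used by any contact institution):

```python
# Illustrative sketch: tallying multiple-selection answers, where one
# respondent may have checked several organization types.
from collections import Counter

def tally_multiselect(responses):
    """Count how many respondents selected each option.

    `responses` is a list of lists: the options each respondent checked.
    """
    counts = Counter()
    for selected in responses:
        counts.update(set(selected))  # de-duplicate within one respondent
    return counts

sample = [
    ["Professional associations", "Religious organizations"],
    ["Professional associations"],
    [],
]
print(tally_multiselect(sample)["Professional associations"])  # 2
```

Unlike a single-selection tally, the per-option counts here can sum to more than the number of respondents, which is the expected behavior for "mark all that apply" items.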
Emphasizing Clarity in Multiple Choice Questions
Several contacts caution that multiple choice questions can easily become cumbersome, and respondents may find it difficult to respond meaningfully if questions are too long or unclear; contacts recommend that length, number of required responses, and clarity of directions guide question development. At University C, a past alumni survey contained the following question:
[Grid question, abridged: for each of the following aspects of undergraduate education, respondents rated two types of emphasis on scales from "None" to "A lot," with a "Do not know" option: faculty research; teaching undergraduates; broad, liberal arts education; intercollegiate athletics; extracurricular activities; commitment to intellectual freedom.]
Upon analysis, administrators found that alumni responses did not provide any meaningful or notable information. Contacts suggest that respondents were overwhelmed by the large number of responses required in the question, as well as by the two types of emphasis they were asked to consider. After identifying the primary purpose of the question (to determine the ideal institutional emphasis on each aspect of undergraduate education), contacts now ask a similar question that is shorter and clearer.
Qualitative Questions
Most qualitative questions ask for open-ended reflections or advice, allowing respondents to personally
interpret the prompt and offer their opinions:
1. Looking back on your learning experiences at this institution, what types of experiences
were the most valuable with regard to developing leadership skills?
Contacts recommend avoiding qualitative-response questions when possible, for two primary reasons: first, qualitative responses require more staff time and resources to analyze than quantitative responses; and second, alumni are not likely to answer a large number of qualitative questions. Contacts recommend reserving necessary qualitative questions for the end of the survey, by which point respondents are thinking critically about their undergraduate education and are more likely to provide thoughtful responses.
Survey Development
Across contact institutions, assessments are developed by a variety of campus stakeholders, including the
offices of institutional research, career services, central administrators, and academic units. Contacts
recommend the engagement of all these offices in assessment development so that the survey addresses
the needs of each stakeholder. For instance, career services staff members are best equipped to develop
questions about post-graduation employment, while representatives from academic units know what
information is most useful to program and curricular development. Most contacts recommend the
formation of an assessment committee to engage campus stakeholders.
At University D, a faculty advisory board evaluates assessments before each distribution, typically on a yearly basis. Contacts there recommend that each question in a potential survey be read aloud and that committee members be asked how they would use the information from that question. If no member can explain how the information will be used, the question is removed from the survey. Contacts suggest that this process is vital to ensuring that the survey is as concise and direct as possible, as alumni are more likely to respond to shorter surveys.
Survey Format
Assessment measures at all contact institutions are distributed to alumni electronically through email. Typically, alumni receive a link to a secure website and a unique username, which they must enter to complete the response. University F supplements its electronic survey with a paper survey mailed to alumni without an email address; although this strategy increases the response rate, contacts suggest that developing, distributing, collecting, and analyzing both paper and electronic surveys adds substantial cost to the assessment. Most contacts recommend that assessments be distributed only electronically, as electronic distribution balances cost-effectiveness with a satisfactory response rate.
V. OUTREACH AND ASSESSMENT DISTRIBUTION
Office Collaboration for Contact Information
At most contact institutions, the development office provides a list of alumni contact information for
survey distribution. However, most contacts agree that the development office should not add questions to
assessments, because development administrators may attempt to solicit donations to the institution from
respondents. Contacts identify three strategies for managing the involvement of the development office in
assessment distribution.
Response Rates
The average response rate for post-graduation assessments is approximately 20 percent. However, contacts suggest that response rates are higher for recent graduates than for older cohorts, so administrators should expect lower rates of return on assessments sent to alumni who graduated many years ago.
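The practical effect of that decline can be sketched numerically. In this hypothetical model, the 20 percent baseline comes from the report, but the per-year decay rate is purely an illustrative assumption:

```python
# Illustrative sketch: projected responses per graduating cohort, assuming
# a ~20% baseline rate (from the report) that erodes for older cohorts
# (the decay rate below is an assumption, not a reported figure).
def expected_responses(cohort_size, years_since_graduation,
                       base_rate=0.20, annual_decay=0.03):
    rate = max(base_rate - annual_decay * years_since_graduation, 0.0)
    return round(cohort_size * rate)

print(expected_responses(1000, 0))   # recent cohort: 200
print(expected_responses(1000, 5))   # older cohort: 50
```

A projection of this kind can help administrators decide how many alumni must be contacted in each cohort to yield a usable number of responses.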
Maintain Consistent Alumni Communication: Some contacts suggest that alumni are more likely to respond to a survey if they regularly receive other communication from the institution, such as the institution's alumni magazine or a campus newsletter. In order to maximize the return rate on a planned longitudinal assessment, administrators can increase the distribution of these publications in the year before assessment distribution.
Assessment Frequency
"At three to five years, graduates are still close enough to their undergraduate education that they can tell us how prepared they were [for their career] when they left." - Council Interview

At most contact institutions, longitudinal assessments are distributed to graduating classes every three to five years; surveys are sent to alumni who graduated five, ten, fifteen, and twenty years before the year of distribution. Some contacts question whether administrators should assess alumni more than a few years after graduation; contacts at University D believe that alumni who graduated three to five years prior to survey distribution are ideal targets for questions about
academic quality, as their professional and educational attainments are more likely to have resulted from
their undergraduate experiences. The activities of alumni more than three to five years after graduation
are more likely to be affected by post-graduate professional or educational experiences. However, most
contacts believe that longitudinal assessments provide a greater depth of information about long-term
student outcomes and recommend distributing assessments to students at five-year intervals after their
graduation and through the twentieth year.
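The five-year cadence described above reduces, in any given distribution year, to a simple cohort calculation; a minimal sketch:

```python
# Sketch of the five-year survey cadence described above: in a given
# distribution year, survey the classes that graduated 5, 10, 15, and
# 20 years earlier.
def cohorts_to_survey(distribution_year, intervals=(5, 10, 15, 20)):
    return [distribution_year - years for years in intervals]

print(cohorts_to_survey(2011))  # [2006, 2001, 1996, 1991]
```

Institutions following University D's three-to-five-year preference could simply pass a different `intervals` tuple.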
Outreach to Non-Graduates
The authors of "Post-Graduation Assessment of Learning Outcomes" (see Project Sources) recommend that assessments include outreach to students who do not graduate from the institution. In order to
fully assess academic success and its relationship to academic quality, administrators should also seek the
perspectives of those students who do not complete their undergraduate education. This provides two
primary benefits: first, it allows administrators to identify areas for improvement in the retention and
graduation of students; and second, it provides benchmarking information for administrators to see how
the outcomes for graduates differ from the outcomes for non-graduates.
Administrators can reach out to non-graduates using contact information from the students' time at the institution. However, administrators should contact non-graduates soon after their time at the institution (i.e., within five years) in order to ensure that contact information is still accurate. Administrators should
also anticipate a low response rate for non-graduate assessments: although many non-graduates may be
unwilling to respond to a survey, those who were dissatisfied with the institution may want to express
their reasons for leaving to administrators.
Contacting Employers
Administrators at contact institutions typically do not survey employers. Contacts suggest that surveying
employers about individual graduates would be costly and inefficient, as the same data can be collected
directly from alumni. However, several contacts recognize the importance of cultivating favorable hiring
policies among local and national employers; contacts suggest that this is best accomplished through
consistent communication with employers. At University B, administrators maintain a database of all
organizations that employ graduates; staff members record every interaction (e.g., attendance at a donor
event, contact at a career fair, etc.) with each organization, and the information is shared across the
institution's corporate relations, development, and institutional research offices. This allows administrators to monitor all communication with employers and ensure that positive relationships are established both with employers' leadership teams and with their human resources and hiring departments.
VI. DATA USE FOR INSTITUTIONAL IMPROVEMENT
All contacts advise against the development of an explicit definition of post-graduate student success. Contacts note that no measurable definition of student success (such as one based on income level, career satisfaction, or community engagement) applies to all graduates; some contacts express concern that identifying successful alumni implies that other alumni are not successful. Instead of establishing a single definition of success, contacts recommend using assessment data to identify institutional strengths and areas of improvement.
Executive Summary Document
At several contact institutions, administrators create an executive summary document that provides an overview of the assessment's results. These documents typically summarize data about four subjects: career development, graduate and professional education, community impact, and reflections on undergraduate education; additionally, they describe the assessment's methodology and response rate. Contacts suggest that a summary document is a clear, concise way to demonstrate the value of the institution's education to a wide variety of campus stakeholders.
Data Use by Campus Stakeholders
Because assessments collect data on so many aspects of the post-graduation experiences of alumni,
administrators find a wide variety of uses for assessment data on campus. The chart below details the use
of assessment data by subject.
Community Impact: Community impact data can be used to identify more effective ways to engage alumni in the campus community by highlighting service and volunteering opportunities in which alumni are involved. This provides administrators with the opportunity to engage students, faculty, and staff in similar service opportunities, encouraging alumni to actively support service initiatives at the institution.
VII. APPENDIX: ASSESSMENT TOOL AT UNIVERSITY F
All references in the following questions to "the year following graduation" refer to the 12 months after you received your bachelor's degree from this institution.
1. What was your primary activity (the one to which you devoted the most time) in the year following your graduation? Mark one answer in Column 1A.
In what other (secondary) activities were you involved the year following your graduation?
Mark as many as apply in Column 1B.
What are your current primary and secondary activities? In Column 2, mark one primary activity
and as many secondary activities as apply for the current year.
Columns: 1A: Primary, year following graduation; 1B: Secondary, year following graduation; 2A: Primary, currently; 2B: Secondary, currently
Employed for pay
Student in degree program
Internship, paid or unpaid
Seeking employment
Raising a family
Volunteer activities
Military Service
Other (please specify)
2. If you were employed for pay as your primary activity in the year following graduation, what were
your reasons for making this choice? Mark all that apply.
3. Have you enrolled in a graduate degree program since graduating from your undergraduate
institution?
4. Please mark all degrees you have received in Column 1, and any degree programs in which you
are currently enrolled in Column 2.
Columns: 1: Degrees received; 2: Currently enrolled
Second bachelor's degree
Law degree (e.g., LL.B, J.D.)
Medical degree (e.g., M.D., D.D.S., D.V.M.)
Master's Degrees
Master of arts or science
Business
Engineering
Other professional master's (e.g., M.S.W.)
Other master's degree
Doctoral Degrees (e.g., Ph.D.)
Biological sciences
Engineering or other applied sciences
Humanities or arts
Physical sciences
Social sciences
Professional doctorate (e.g., education)
Other doctorate (please specify)
4.A Did you attend (or are you now attending) your undergraduate institution for any of your
graduate programs?
No. Yes.
4.B How well do you think your undergraduate institution prepared you for graduate or
professional school when you compare yourself with others in your graduate program(s)?
PROFESSIONAL SERVICES NOTE
The Advisory Board has worked to ensure the accuracy of the information it provides to its members.
This project relies on data obtained from many sources, however, and The Advisory Board cannot
guarantee the accuracy of the information or its analysis in all cases. Further, The Advisory Board is not
engaged in rendering clinical, legal, accounting, or other professional services. Its projects should not be
construed as professional advice on any particular set of facts or circumstances. Members are advised to
consult with their staff and senior management, or other appropriate professionals, prior to implementing
any changes based on this project. Neither The Advisory Board Company nor its programs are
responsible for any claims or losses that may arise from any errors or omissions in their projects,
whether caused by the Advisory Board Company or its sources.
© 2011 The Advisory Board Company, 2445 M Street, N.W., Washington, DC 20037. Any
reproduction or retransmission, in whole or in part, is a violation of federal law and is strictly prohibited
without the consent of the Advisory Board Company. This prohibition extends to sharing this
publication with clients and/or affiliate companies. All rights reserved.