
UNIVERSITY LEADERSHIP COUNCIL

Assessment of Post-Graduation Student Outcomes:
Measurement and Communication of Graduate Success

Custom Research Brief

TABLE OF CONTENTS
I. Research Methodology
II. Executive Overview
III. Assessment Content
IV. Assessment Structure
V. Outreach and Assessment Distribution
VI. Data Use for Institutional Improvement

RESEARCH ASSOCIATE
Luke Maher

RESEARCH MANAGER
Lisa Geraci

THE ADVISORY BOARD COMPANY
WASHINGTON, D.C.
I. RESEARCH METHODOLOGY
Project Challenge
Leadership at a member institution approached the Council with the following questions:

How do institutions define and assess post-graduate success?


What types of longitudinal indicators do institutions use to measure the impact of academic
program quality on graduates' success?
How do institutions assess the relationship between program-level student learning outcomes and
post-graduate success?
What systems track internal and external indicators of student success?
Who on campus is responsible for collecting data on post-graduation student success? How is the
information collected?
At what intervals or career milestones are graduates measured?
What role do social media play in the collection of data on post-graduate student success?
How do institutions use assessment data for internal improvement?
What influence does assessment data have on academic program quality?
What advice do institutions offer for longitudinal assessments of post-graduation student success?

Project Sources
Education Advisory Board's internal and online (www.educationadvisoryboard.com) research libraries
National Center for Education Statistics [NCES] (http://nces.ed.gov/)
Maples, Glenn, Anna M. Greco, and John R. Tanner, "Post Graduation Assessment of Learning
Outcomes," Journal of College Teaching and Learning 5.3 (March 2008): 33-38.

© 2011 The Advisory Board Company
Research Parameters
The Council interviewed administrators with oversight of post-graduation student outcomes assessments
at research universities.

A Guide to the Institutions Profiled in this Brief

Institution    Location                 Type                     Approximate Enrollment    Classification
                                                                (Total / Undergraduate)
University A   Midwest: Large City      Private not-for-profit   9,800 / 4,200             Research University (very high research activity)
University B   Midwest: Large City      Private not-for-profit   7,800 / 2,600             Research University (high research activity)
University C   Northeast: Midsize City  Private not-for-profit   10,600 / 4,300            Research University (very high research activity)
University D   Northeast: Small City    Public                   45,200 / 38,600           Research University (very high research activity)
University E   Northeast: Large City    Private not-for-profit   25,000 / 11,900           Research University (very high research activity)
University F   Northeast: Small City    Private not-for-profit   20,900 / 13,900           Research University (very high research activity)

Source: National Center for Education Statistics

II. EXECUTIVE OVERVIEW
Key Observations

All contacts advise against the establishment of an explicit definition of student success,
recommending instead that administrators use post-graduation assessments to identify
institutional strengths and areas of improvement. Contacts believe that any measurable
definition of student success (for instance, a definition based on mid-career income or graduate
education rates) inevitably implies that some alumni are not successful, and for that reason such
definitions are not useful.

Contacts recommend assembling a committee of campus stakeholders to review the
assessment tool before distribution and to analyze results. In addition to the assessment
administrators, contacts suggest that the committee should have representation from the offices of
institutional research, career services, the institution's central administration, and a variety of
academic units.

Across contact institutions, post-graduation assessments examine six primary subjects:
demographics, career development, graduate and professional education, discipline-specific
activities, community impact, and reflections on undergraduate education. Contacts believe
that these areas provide a thorough understanding of alumni's post-graduation lives, and
information on these areas can help administrators guide the development of academic
programs, career services, and strategic goals for the institution.

Assessments rely primarily on multiple choice questions, although most assessments include
a small number of qualitative questions as well. Contacts recommend avoiding qualitative-
response questions when possible and reserving necessary qualitative questions until the end of
the assessment.

Assessments at all contact institutions are distributed electronically. Longitudinal
assessments are typically distributed every three to five years and are sent to alumni who
graduated five, ten, fifteen, and twenty years before the year of distribution. Contacts
recommend that electronic assessments be divided among several short webpages rather than one
long webpage, as this allows respondents to feel that they are moving quickly through the
assessment, which encourages a higher survey completion rate.

Longitudinal post-graduation assessments typically receive a response rate of
approximately 20 percent. Contacts caution that the response rate for more recent graduates is
higher than the rate for more distant graduates, so administrators should anticipate a lower
response in longitudinal assessments.

Most contacts do not survey employers or use social media to track alumni. However,
administrators at some contact institutions maintain institutional databases that track corporate
and employer relationships to cultivate favorable hiring policies.

Assessment data is typically used to advise students about potential careers, promote
outcomes to prospective students, influence strategic planning initiatives, and guide
curricular development for academic units. Contacts recommend distributing assessment data
widely across the institution so that stakeholders can use the data as they see fit.

III. ASSESSMENT CONTENT
Across contact institutions, post-graduation assessments examine six primary subjects: demographics,
career development, graduate and professional education, discipline-specific activities, community
impact, and reflections on undergraduate education.
Demographics
Although institutions already maintain databases of alumni demographic information, contacts suggest
that the collection of demographic data with post-graduate assessments serves two purposes: first, it
provides updates of current demographic information (e.g., place of residence, personal status, etc.); and
second, it allows administrators to easily analyze assessment data based on demographics. For instance,
contacts suggest that it is easier to determine the average income level of alumni who majored in a
particular discipline if the assessment includes a question on undergraduate major than by cross-
referencing assessment responses with the institution's database of undergraduate majors.
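To make the point above concrete, here is a minimal sketch (field names and figures are hypothetical, not from the brief) of computing average reported income by undergraduate major directly from assessment responses, with no cross-referencing against a separate institutional database:

```python
# Hypothetical assessment responses; each record already carries the
# respondent's undergraduate major because the survey asked for it.
from collections import defaultdict

responses = [
    {"major": "History", "income": 62000},
    {"major": "Biology", "income": 71000},
    {"major": "History", "income": 58000},
]

totals = defaultdict(lambda: [0, 0])  # major -> [income sum, respondent count]
for r in responses:
    totals[r["major"]][0] += r["income"]
    totals[r["major"]][1] += 1

averages = {major: s / n for major, (s, n) in totals.items()}
print(averages)  # {'History': 60000.0, 'Biology': 71000.0}
```

Because the major is captured on the survey itself, the aggregation needs only the response file, which is exactly the convenience the contacts describe.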

Sample questions:
What was your undergraduate major? Minor? What was your undergraduate grade point average?
What is your gender? What is your race or ethnic group? What is your citizenship?
What is your personal status (e.g., married/living with partner, widowed, separated, divorced,
single)? How many children do you have?
What level of education have your parents (or legal guardians) completed?
Did you receive financial aid as an undergraduate? If so, what kind (e.g., need- or merit-based)?

Career Development
All assessments collect information on career development. Longitudinal studies assess not only a
graduates current employment, but also how his or her career has evolved since graduation (i.e., whether
he or she has been employed at multiple companies or in multiple industries). Most contacts suggest that
career advancement is the primary indicator of student success. However, some contacts caution that
income level should not be conflated with career advancement, as consistent work in one industry is often
more reflective of success than high income.

Sample questions:
In what type of organization is your principal employment? In what industry?
What is the name of your current employer? What is your job title?
Which of the following best describes your current employment: entry-level, middle management,
senior management, executive-level, chief executive-level?
Have you been employed in the same field since beginning your career? How many times have
you changed fields of employment?
How satisfied are you with your current career?
How well did your undergraduate education prepare you for your current career?

Graduate and Professional Education
Assessments at all contact institutions collect information on graduate and professional education, such as
the subject area of the graduate degree and whether or not the degree is terminal. Several contacts also
suggest that administrators should assess graduates' informal education (e.g., work-related trainings and
certifications, not-for-credit programs, etc.). Additionally, all institutions ask about the amount of debt
that alumni have accrued over the course of their education. According to contacts, continuing education
is a primary indicator of success and academic quality, as advancement into graduate education reflects
appropriate undergraduate preparation.
Sample questions:
How many years after your undergraduate degree did you start your graduate or professional
education?
What graduate and/or professional degrees have you received? In what graduate and/or
professional programs are you currently enrolled?
What is the total amount you have borrowed to finance your graduate or professional education?
Your entire education (including undergraduate)?
How well did your undergraduate education prepare you for graduate and/or professional school?

Discipline-Specific Activities
Many contacts recommend allowing academic programs to add department-specific question modules to
assessments. For instance, assessments at several contact institutions include separate modules for
engineering programs, as several important career milestones for engineering graduates (such as
professional certifications or membership in professional societies) are not relevant to graduates in any
other program. Contacts at University D suggest that each undergraduate academic college develop a
question module specific to its program and distribute it to alumni as a part of the standard alumni
assessment. This information is useful in program development within academic units.
Sample questions:
Did your undergraduate education provide sufficient instruction in governmental technology
policymaking for your current career?
What professional certifications have you received?
Are you a member of any professional societies? If so, which societies?
What professional skills are most relevant to your current career? How well did your
undergraduate education help you in developing those skills?


Community Impact
Several assessments collect information on altruism, volunteer work, and community leadership. This
includes financial contributions to charitable causes, leadership in community and non-profit
organizations, and the amount of time spent volunteering or engaging in pro bono work. Contacts
recommend collecting this information because service and altruism are typically part of a higher
education institution's mission, so administrators can use the data to guide the institution's efforts in
service and community leadership.
Sample questions:
In the past twelve months, have you been involved (beyond making a donation) with any charity
or non-profit organizations? If so, what type of organizations?
In the past twelve months, have you donated money to a charity?
How often are you involved in volunteer work?

Reflections on Undergraduate Education

Assessments at all contact institutions ask alumni to reflect on their undergraduate experience. This
includes information about how graduates were involved with the campus, how well the institution
prepared them for their future, and how engaged they are in the institution's alumni affairs. Several
institutions also ask alumni about the relative importance of their life priorities (e.g., raising a family,
being financially secure, engaging in creative work, etc.). Additionally, some contacts recommend that
assessments include questions on any strategic academic initiatives at the institution (e.g., international
learning, interdisciplinary learning, etc.) in order to establish benchmarks to gauge the success of those
initiatives. Alumni reflections allow administrators to understand how alumni define success for
themselves, and they can also help administrators determine institutional priorities, such as an
improvement in alumni engagement or an increase in the availability of extracurricular activities on
campus.

Sample questions:
In the past year, have you received any publications about this institution? If so, what
publications?
Have you attended an alumni event in the last five years? If so, what type of event? How did you
find out about the event?
Do you often work with people from other cultures or countries in your current employment?
Based on what you know now, how well did your undergraduate experience prepare you to
understand social problems? To write effectively? To be an active member of your community?
How much emphasis should this institution place on faculty research? Liberal arts courses? Moral
development? Diversity in the student body?
What organizations were you involved in as an undergraduate? What sports?

IV. ASSESSMENT STRUCTURE
Question Structure
Multiple Choice Questions
Assessments across all contact institutions rely primarily on multiple choice questions, which typically
take one of two forms: single-selection or multiple-selection. Single-selection questions are appropriate
when asking about demographics or career development, as administrators assume that respondents are
employed by a single employer. More broadly, because single-selection questions ask for only one
answer, they are useful for assessing any non-opinion-based information:
What was the total amount of money you borrowed to finance your graduate or professional
education? If you have not yet completed your degree, please estimate the total amount you expect to
borrow. Do not include undergraduate borrowing.
Did not pursue further education
$0
$1 to 9,999
$10,000 to 19,999
$20,000 to 29,999
$30,000 to 39,999
$40,000 to 49,999
$50,000 to 74,999
$75,000 to 99,999
$100,000 to 149,999
$150,000 or more
More than $0, but unable to estimate amount

Multiple-selection questions allow respondents to select all answers that apply, and they are useful for
assessing subjects in which respondents may have multiple interests, involvements, or accomplishments.
They are common in questions regarding graduate and professional education, discipline-specific
activities, community impact, and reflections on undergraduate education:
Please indicate which of the following types of organizations you have participated in without pay
in the past 12 months, as well as the capacity in which you participated.
(Response options for each row: Monetary contribution beyond dues; Member or participant;
Officer or leader)
Your secondary school
Your undergraduate institution
Undergraduate fraternity or sorority
Graduate or professional school
Your children's school
Artistic or cultural organizations
Environmental organizations
Hospitals or health-related organizations
Local government boards
Political campaigns or organizations
Professional associations
Social service or welfare organizations
Religious organizations
Youth activities or sports organizations

Emphasizing Clarity in Multiple Choice Questions
Several contacts caution that multiple choice questions can easily become cumbersome, and respondents
may find it difficult to respond meaningfully if the questions are too long or unclear; contacts recommend
that length, number of required responses, and clarity of directions guide question development. At University
C, a past alumni survey contained the following question:

Cumbersome Question Structure

Indicate how much emphasis you believe this institution currently places and how much it should
place on each of the following:
(Two rating scales per item, Current Emphasis and Emphasis Should Be, each ranging from None
to A lot, plus Do not know)
Faculty research
Teaching undergraduates
Broad, liberal arts education
Intercollegiate athletics
Extracurricular activities
Commitment to intellectual freedom

Upon analysis, administrators found that alumni responses did not provide any meaningful or notable
information. Contacts suggest that respondents are overwhelmed by the large number of responses
required in the question, as well as the two types of emphasis they are asked to consider. After identifying
the primary purpose of the question (to determine the ideal institutional emphasis on each aspect of
undergraduate education), contacts now ask a similar question that is shorter and clearer:

Streamlined Question Structure

How would you change the emphasis that this institution places on each of the following:
(Response options for each item: Reduce a great deal; Reduce somewhat; Keep about the same;
Increase somewhat; Increase a great deal; Do not know)
Faculty research
Teaching undergraduates
Broad, liberal arts education
Intercollegiate athletics
Extracurricular activities
Commitment to intellectual freedom

This question is shorter, requires fewer responses, and contains only one set of directions, making it more
accessible to respondents.

Qualitative Questions
Most qualitative questions ask for open-ended reflections or advice, allowing respondents to personally
interpret the prompt and offer their opinions:

1. Looking back on your learning experiences at this institution, what types of experiences
were the most valuable with regard to developing leadership skills?

2. What advice would you offer to current undergraduates at this institution?

Contacts recommend avoiding qualitative-response questions when possible, for two primary reasons:
first, qualitative responses require more staff time and resources to analyze than quantitative responses;
and second, alumni are not likely to respond to a large number of qualitative questions. Contacts
therefore recommend reserving any necessary qualitative questions for the end of the survey, where
respondents are already thinking critically about their undergraduate education and are more likely to
provide a thoughtful response.
Survey Development
Across contact institutions, assessments are developed by a variety of campus stakeholders, including the
offices of institutional research, career services, central administrators, and academic units. Contacts
recommend the engagement of all these offices in assessment development so that the survey addresses
the needs of each stakeholder. For instance, career services staff members are best equipped to develop
questions about post-graduation employment, while representatives from academic units know what
information is most useful to program and curricular development. Most contacts recommend the
formation of an assessment committee to engage campus stakeholders.

Establishment of an Assessment Committee

At University D, a faculty advisory board evaluates assessments before each distribution, typically on
a yearly basis. Contacts there recommend that each question in a potential survey be read aloud and
that committee members be asked how they will use the information from that question. If no members
have an answer for how the information will be used, the question is removed from the survey.
Contacts suggest that this process is vital to ensure that the survey is as concise and direct as possible,
as alumni are more likely to respond to shorter surveys.
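The committee's review rule can be sketched as a simple filter (the question text and stated uses below are hypothetical): a question survives only if at least one member states how its answer will be used.

```python
def review(questions, stated_uses):
    """Keep only questions that have at least one stated use."""
    return [q for q in questions if stated_uses.get(q)]

# Hypothetical review session: each question maps to the uses members stated.
uses = {
    "What is your job title?": ["career advising", "outcomes reporting"],
    "What is your favorite color?": [],  # no stated use: removed from survey
}
print(review(list(uses), uses))  # ['What is your job title?']
```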

Survey Format
Assessment measures at all contact institutions are distributed electronically to alumni through email.
Typically, alumni receive a link to a secure website and a unique username, which they must enter to
complete the response. University F supplements its electronic survey with a paper survey that is
mailed to alumni without an email address; although this strategy increases the response rate, contacts
suggest that the development, distribution, collection, and analysis of both paper and electronic surveys
add a substantial additional cost to the assessment. Most contacts recommend that assessments be
distributed only electronically, as this balances cost-effectiveness with a satisfactory response rate.

Structuring Electronic Surveys in Segments


Contacts advise that ease of survey navigation is a vital consideration for assessment development;
surveys must be simple, clear, and require minimal time investment (typically less than 30 minutes),
or alumni do not respond to them. Aside from phrasing questions clearly, contacts suggest that
electronic surveys should be structured so that respondents navigate several short pages rather than
one long page. This allows respondents to feel that they are making fast progress through the survey,
as they must click through several pages. If questions are consolidated onto one long page, alumni
may feel that the survey is too long and opt not to take it.
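The paging advice above can be sketched in a few lines (the page size is an illustrative assumption, not a figure from the brief):

```python
def paginate(questions, per_page=5):
    """Split a flat question list into several short pages."""
    return [questions[i:i + per_page] for i in range(0, len(questions), per_page)]

# Twelve hypothetical questions become three short pages instead of one long one.
pages = paginate([f"Q{n}" for n in range(1, 13)], per_page=5)
print([len(p) for p in pages])  # [5, 5, 2]
```

Each click to the next short page gives the respondent a visible sense of progress, which is the effect the contacts describe.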

V. OUTREACH AND ASSESSMENT DISTRIBUTION
Office Collaboration for Contact Information
At most contact institutions, the development office provides a list of alumni contact information for
survey distribution. However, most contacts agree that the development office should not add questions to
assessments, because development administrators may attempt to solicit donations to the institution from
respondents. Contacts identify three strategies for managing the involvement of the development office in
assessment distribution.

Strategies for Working with the Development Office


1. Grant assessment administrators access to the institution's alumni-tracking database.
Contacts advise that assessment administrators have access to the institution's alumni-tracking
database so that they can generate contact lists for the appropriate graduating classes. At
University A, institutional research staff members find it difficult to secure the necessary contact
lists from the development office, as development staff members want to ensure that donors are
not overwhelmed by communications from the institution. This prevents assessments from being
distributed as regularly as administrators would prefer.
2. Only release collected data in aggregate form.
In order to assure respondents that their responses will not be used to generate additional
solicitations for donations, contacts at University E recommend that assessment data be shared
only without any of the respondents' identifying information (e.g., names, addresses, etc.).
Contacts suggest that alumni are more willing to respond if they know that their individual
responses (especially their salary range) will not be shared with the development office.
3. Do not send a survey in concert with any solicitations.
Several contacts suggest that alumni are less likely to respond to a survey if the outreach materials
include any reference to solicitations, links to donation sites, or publication subscription offers
from the institution. Contacts recommend that outreach for assessments be sent as a standalone
communication.
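The aggregate-only release strategy can be sketched in code (the field names are assumptions for illustration): identifying fields are stripped from each record before data leave the assessment team.

```python
# Hypothetical identifying fields to drop before sharing responses.
IDENTIFYING_FIELDS = {"name", "email", "address"}

def anonymize(record):
    """Return a copy of a response with identifying fields removed."""
    return {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}

record = {"name": "A. Alum", "email": "a@example.com",
          "salary_range": "$50,000 to 74,999", "major": "Chemistry"}
print(anonymize(record))
# {'salary_range': '$50,000 to 74,999', 'major': 'Chemistry'}
```

Sharing only the anonymized records keeps salary ranges and other sensitive answers from being traced back to individual alumni.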

Response Rates
The average response rate for post-graduation assessments is approximately 20 percent. However,
contacts suggest that response rates for more recent graduates are higher than those for earlier
graduates, so administrators should expect lower rates of return on assessments sent to cohorts
further removed from graduation.
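As rough planning arithmetic consistent with the pattern above, administrators can estimate expected returns per cohort. The cohort sizes and per-cohort rates below are illustrative assumptions, not figures from the brief; only the roughly 20 percent overall average and the decline with cohort age come from the contacts.

```python
# Years since graduation -> assumed cohort size and assumed response rate.
cohort_sizes = {5: 2000, 10: 2000, 15: 2000, 20: 2000}
assumed_rates = {5: 0.28, 10: 0.21, 15: 0.16, 20: 0.12}  # declines with age

expected = {y: round(cohort_sizes[y] * assumed_rates[y]) for y in cohort_sizes}
print(expected)  # {5: 560, 10: 420, 15: 320, 20: 240}
```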

Strategies for Increasing Assessment Response Rates

Administrators at several contact institutions incentivize assessment completion.

Offer Incentives
At University D, alumni who complete the survey are entered into a drawing for a gift card to
Amazon.com; a winner is selected at random from that pool. Other institutions make a donation
to the graduating class gift (typically between one and five dollars) for each completed survey.

Maintain Consistent Alumni Communication
Some contacts suggest that alumni are more likely to respond to a survey if they regularly receive
other communication from the institution, such as the institution's alumni magazine or a campus
newsletter. In order to maximize the return rate on a planned longitudinal assessment,
administrators can increase the distribution of these publications in the year before assessment
distribution.

Assessment Frequency
At most contact institutions, longitudinal assessments are distributed to graduating classes every three
to five years; surveys are sent to alumni who graduated five, ten, fifteen, and twenty years before the
year of distribution. Some contacts question whether administrators should assess alumni more than a
few years after graduation; contacts at University D believe that alumni who graduated three to five
years prior to survey distribution are ideal targets for questions about academic quality, as their
professional and educational attainments are more likely to have resulted from their undergraduate
experiences. The activities of alumni more than three to five years after graduation are more likely to
be affected by post-graduate professional or educational experiences. However, most contacts believe
that longitudinal assessments provide a greater depth of information about long-term student outcomes
and recommend distributing assessments to alumni at five-year intervals after graduation, through the
twentieth year.

"At three to five years, graduates are still close enough to their undergraduate education that they
can tell us how prepared they were [for their career] when they left."
- Council Interview
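The distribution schedule described above amounts to simple arithmetic; a minimal sketch (the function name is ours, not the brief's):

```python
def target_classes(distribution_year, intervals=(5, 10, 15, 20)):
    """Return the graduating classes to survey in a given distribution year."""
    return [distribution_year - years for years in intervals]

# For a 2011 distribution, the classes of 2006, 2001, 1996, and 1991 are surveyed.
print(target_classes(2011))  # [2006, 2001, 1996, 1991]
```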
Outreach to Non-Graduates
The authors of "Post-Graduation Assessment of Learning Outcomes" (see Project Sources) recommend
that assessments include outreach to students who do not graduate from the institution. In order to
fully assess academic success and its relationship to academic quality, administrators should also seek the
perspectives of those students who do not complete their undergraduate education. This provides two
primary benefits: first, it allows administrators to identify areas for improvement in the retention and
graduation of students; and second, it provides benchmarking information for administrators to see how
the outcomes for graduates differ from the outcomes for non-graduates.
Administrators can reach out to non-graduates using contact information from the students' time at the
institution. However, administrators should contact non-graduates soon after their time at the institution
(i.e., within five years) in order to ensure that contact information is still accurate. Administrators should
also anticipate a low response rate for non-graduate assessments: although many non-graduates may be
unwilling to respond to a survey, those who were dissatisfied with the institution may want to express
their reasons for leaving to administrators.

Question Modules for Non-Graduates


Assessments for non-graduates should survey the same subjects as those for graduates; however,
questions asking about undergraduate education should be revised to ask about a student's time at
the institution. Other than that revision, the six primary assessment sections (demographics, career
development, graduate education, community impact, discipline-specific activities, and reflections on
the student's time at the institution) can remain the same as those in the assessments for graduates.
Administrators may then add a question module specific to non-graduates, including questions on
undergraduate degree completion, reasons for leaving the institution, and potential improvements to
the institution.
Sample questions:
Did you complete your undergraduate degree at another institution? If so, what institution? What
was your major?
What were your reasons for leaving this institution?
What services would have improved your experience at this institution? Would those services
have compelled you to stay at the institution?
Did you meet regularly with your academic advisor? How would you improve the advising system
at this institution?

Contacting Employers
Administrators at contact institutions typically do not survey employers. Contacts suggest that surveying
employers about individual graduates would be costly and inefficient, as the same data can be collected
directly from alumni. However, several contacts recognize the importance of cultivating favorable hiring
policies among local and national employers; contacts suggest that this is best accomplished through
consistent communication with employers. At University B, administrators maintain a database of all
organizations that employ graduates; staff members record every interaction (e.g., attendance at a donor
event, contact at a career fair, etc.) with each organization, and the information is shared across the
institutions corporate relations, development, and institutional research offices. This allows
administrators to monitor all communication with employers and ensure that positive relationships are
established both with employers' leadership teams and with their human resources and hiring
departments.
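A shared interaction log of the kind University B maintains might be sketched as follows (the structure and field names are assumptions for illustration, not the institution's actual system): every contact with an employing organization is recorded so that the corporate relations, development, and institutional research offices see a common history.

```python
from dataclasses import dataclass, field

@dataclass
class Organization:
    """One employing organization and its recorded interaction history."""
    name: str
    interactions: list = field(default_factory=list)

    def record(self, date, kind, office):
        """Log one interaction (e.g., career fair contact, donor event)."""
        self.interactions.append({"date": date, "kind": kind, "office": office})

# Hypothetical usage: two offices log contacts with the same employer.
acme = Organization("Acme Corp")
acme.record("2011-03-04", "career fair contact", "career services")
acme.record("2011-05-12", "donor event attendance", "development")
print(len(acme.interactions))  # 2
```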

Avoidance of Social Media in Data Collection

Most contacts do not use social media to collect information on post-graduation student success;
contacts suggest that email and/or paper surveys reach more alumni (especially among older
populations) and produce more detailed, reliable information than social media. However, some
contacts believe that social media can be used to notify alumni that they should anticipate receiving a
survey, which may bolster response rates. Additionally, contacts at University B occasionally update
alumni employment information in the institution's database based on alumni LinkedIn profiles.

VI. DATA USE FOR INSTITUTIONAL IMPROVEMENT
All contacts advise against the development of an explicit definition of post-graduate student success.
Contacts contend that no measurable definition of student success (such as one based on income
level, career satisfaction, or community engagement) applies to all graduates; some contacts express
concern that identifying successful alumni implies that other alumni are not successful. Instead of
establishing a single definition of success, contacts recommend using assessment data to identify
institutional strengths and areas of improvement.
Executive Summary Document
At several contact institutions, administrators create an executive summary document that provides an
overview of the assessment's results. These documents typically summarize data about four subjects:
career development, graduate and professional education, community impact, and reflections on
undergraduate education; additionally, they describe the assessment's methodology and rate of response.
Contacts suggest that a summary document is a clear, concise way to demonstrate the value of the
institution's education to a wide variety of campus stakeholders.

Strategies for Writing an Executive Summary Document


Keep the document short.
Administrators at several contact institutions release a full report of assessment responses
(including alumni responses to each question), but those documents are typically dense and
inaccessible. Contacts recommend that the summary document be no longer than ten pages.
Highlight a small number of notable answers.
In addition to the assessment overview, executive summary documents at contact institutions also
include a limited number of notable statistics or qualitative answers (typically no more than two
per page). Notable answers are those that reflect especially well on the institution, such as a high
average mid-career salary or a high rate of alumni engagement. Other common statistics include
employment rates of graduates one year after graduation and motivation for making financial
contributions to the institution. The inclusion of graphic representations of these responses breaks
up large blocks of text and makes the document more accessible to a broad audience.
Make the document's presentation as professional as possible.
At contact institutions, the executive summary document is posted online and is publicly
accessible. As such, contacts recommend that the document be formatted as an official university
publication, including a cover page, official logos, and university contact information.

Data Use by Campus Stakeholders
Because assessments collect data on so many aspects of the post-graduation experiences of alumni,
administrators find a wide variety of uses for assessment data on campus. The chart below details the use
of assessment data by subject.

Type of Data: Institutional Use

Demographics
Administrators use demographic data to update alumni contact information and personal statuses in
institutional databases.

Graduate and Professional Education
Data on graduate education is used for three primary purposes:
Advise students who are interested in graduate education. For instance, administrators can use
data on alumni who go on to law school to help pre-law students select an appropriate
undergraduate major.
Promote the institution to prospective students. High attainment of graduate and professional
education among alumni demonstrates the value of an institution's undergraduate program to
prospective students and parents.
Develop strategic goals for the institution. Based on assessment data, administrators may
designate an increase in graduate and professional education as a strategic priority for the
institution or for particular academic programs.

Career Development
Career development data is used for two primary purposes:
Advise students about post-graduation career options. Career services staff members use the data
to show students potential career paths for their major based on alumni experiences.
Promote the institution to prospective students. At University E, career services staff members
present assessment data at admissions events for prospective students; contacts suggest that
information about post-graduation outcomes is especially compelling for prospective athletes and
underrepresented minorities, as it demonstrates the added value of the institution's education over
other, less expensive institutions.

Discipline-Specific Activities
Faculty members and department leaders use data on discipline-specific activities to inform
curricular development, such as the inclusion of new skills and revisions to departmental
requirements. In the engineering-specific module of a past alumni assessment, graduates of the
engineering program at University C indicated that they would have preferred more technical writing
experience in their undergraduate education. Departmental administrators now include writing
requirements across several courses in the undergraduate engineering curriculum.

Community Impact
Community impact data can be used to identify more effective ways to engage alumni in the campus
community by highlighting service and volunteering opportunities in which alumni are involved. This
gives administrators the opportunity to engage students, faculty, and staff in similar service
opportunities, encouraging alumni to actively support service initiatives at the institution.

Reflections on Undergraduate Education
Central administrators use alumni reflections to guide strategic planning initiatives and set
priorities for institutional growth. At University C, the dean of undergraduate education meets with
external auditors every two years to review the undergraduate curriculum for rigor, depth, and
quality; the dean and auditors use assessment data to evaluate the effectiveness of the curriculum
and the attainment of learning outcomes.

VII. APPENDIX: ASSESSMENT TOOL AT UNIVERSITY F
All references in the following questions to "the year following graduation" refer to the 12 months
after you received your bachelor's degree from this institution.

1. What was your primary activity (the one to which you devoted the most time) in the year
following your graduation? Mark one answer in Column 1A.
In what other (secondary) activities were you involved in the year following your graduation?
Mark as many as apply in Column 1B.
What are your current primary and secondary activities? In Column 2, mark one primary activity
and as many secondary activities as apply for the current year.

1: Year Following Graduation            2: Currently
A: Primary    B: Secondary              A: Primary    B: Secondary
Employed for pay
Student in degree program
Internship, paid or unpaid
Seeking employment
Raising a family
Volunteer activities
Military Service
Other (please specify)

2. If you were employed for pay as your primary activity in the year following graduation, what were
your reasons for making this choice? Mark all that apply.

Not applicable
Was eager to apply my skills in the workplace
Wanted to begin to influence people and events directly
Desired the income associated with employment
Needed non-academic experience prior to further education
Needed time to sort out options
Tired of being in school
An advanced degree involved too much debt
Not admitted to preferred graduate or professional school
Advanced study inappropriate to my career goals at the time
Wanted to start paying off my undergraduate debts
Other: _______________________________

3. Have you enrolled in a graduate degree program since graduating from your undergraduate
institution?

No. Skip to Question 5.
Yes. Go to Question 4.

4. Please mark all degrees you have received in Column 1, and any degree programs in which you
are currently enrolled in Column 2.

1: Degrees Received       2: Currently Enrolled
Second bachelor's degree
Law degree (e.g., LL.B., J.D.)
Medical degree (e.g., M.D., D.D.S., D.V.M.)
Master's Degrees
Master of arts or science
Business
Engineering
Other professional master's (e.g., M.S.W.)
Other master's degree
Doctoral Degrees (e.g., Ph.D.)
Biological sciences
Engineering or other applied sciences
Humanities or arts
Physical sciences
Social sciences
Professional doctorate (e.g., education)
Other doctorate (please specify)

4.A Did you attend (or are you now attending) your undergraduate institution for any of your
graduate programs?

No. Yes.

4.B How well do you think your undergraduate institution prepared you for graduate or
professional school when you compare yourself with others in your graduate program(s)?

I was very well prepared.
I was generally well prepared.
I was adequately prepared.
I was inadequately prepared.

PROFESSIONAL SERVICES NOTE

The Advisory Board has worked to ensure the accuracy of the information it provides to its members.
This project relies on data obtained from many sources, however, and The Advisory Board cannot
guarantee the accuracy of the information or its analysis in all cases. Further, The Advisory Board is not
engaged in rendering clinical, legal, accounting, or other professional services. Its projects should not be
construed as professional advice on any particular set of facts or circumstances. Members are advised to
consult with their staff and senior management, or other appropriate professionals, prior to implementing
any changes based on this project. Neither The Advisory Board Company nor its programs are
responsible for any claims or losses that may arise from any errors or omissions in their projects,
whether caused by the Advisory Board Company or its sources.

© 2011 The Advisory Board Company, 2445 M Street, N.W., Washington, DC 20037. Any
reproduction or retransmission, in whole or in part, is a violation of federal law and is strictly prohibited
without the consent of the Advisory Board Company. This prohibition extends to sharing this
publication with clients and/or affiliate companies. All rights reserved.
