
Collection Analysis: Overview
Sponsored by the ALCTS CMDS Measures & Education Committees

Peggy Johnson,
Associate University Librarian
University of Minnesota
m-john@umn.edu
“Culture of Assessment/Evaluation”
 Way to demonstrate
• Relevance
• Value
• Impact
 Considered from the view of
• Users
• Stakeholders
Amos Lakos & Shelley Phipps – “Culture of Assessment”
John Crawford – “Culture of Evaluation”
“Those who fail to move in
the direction of systematic
assessment will be unable
to cope with the
increasingly difficult
questions that promise to
confront collection officers
in years to come.”
Mark Sandler, Univ. of MI
Why do we do it?
As part of good management.
 Accountability: To demonstrate to funders
and clients that the service is delivering
the benefits expected when the
investment was made
 Decision-making: To ensure that resources
are being used efficiently and effectively
(an internal control mechanism)
 Marketing: To report success and
accomplishments (public relations)
Collection assessment assumes that the criteria for success are defined and understood by those doing the assessment and those to whom it is being reported.
What is it?
A mechanism to determine:
 If the collection is meeting its objectives

 How well it is serving its users

 In which ways or areas it is deficient, and what remains to be done to develop the collection
 If selectors are performing their
responsibilities effectively
 How to allocate collections/access funds
How is assessment different
from evaluation?
 Evaluation determines how well the
collection supports the goals, needs,
and curriculum of the parent
organization.
 Assessment examines or describes
collections either in their own terms
or relative to other collections and
checklists.
Who is the audience?
 Accreditation agencies
 Parent organization (administration,
board, senior management)
 Library administration
 CDM supervisor
 Selector
 User community or communities
 Consortial partners
How can we do it well?
 Simple
 Practical
 Repeatable
 Clear focus
 Understandable results
 Meaningful results
 Results lead to action
Collection-based Measures Look at:

 Size

 Growth

 Coverage (depth, breadth, balance)


Collection-based Measures
 Checking lists, catalogs, bibliographies
 Evaluating the collection directly
 Compiling comparative statistics
 Application of collection standards

Use- and User-based Measures Look at:

 Who is using the collection?

 How often?

 What are users’ expectations?

 What are users’ needs?

 What are their perceptions?


Use- & User-based Measures
 Circulation studies

 In-house use studies

 Survey of users

 Shelf availability studies

 Analysis of online usage of electronic resources

 Analysis of ILL statistics

 Citation studies

 Document delivery tests

 Cost-per-use
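
Cost-per-use, the last measure above, is simply what was paid for a resource divided by how many times it was used. A minimal sketch in Python; the subscription cost and download count are hypothetical placeholders, not figures from this presentation:

```python
def cost_per_use(annual_cost: float, uses: int) -> float:
    """Divide what was paid for a resource by how often it was used."""
    if uses == 0:
        return float("inf")  # paid for, but never used
    return annual_cost / uses

# Hypothetical example: a package costing $12,000/year with 3,400 downloads.
print(f"${cost_per_use(12_000, 3_400):.2f} per use")  # $3.53 per use
```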
Quantitative Measures
 Count things
• Use
• Expenditures
• Titles
• Physical items
Quantitative Measures
 Titles

 Circulation transactions

 Expenditures

 E-metrics

 ILL transactions

 Ratios (monographs/serials; volumes/students; expenditures/degree programs; electronic/print)
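
The ratios above are straightforward divisions of one count or expenditure by another; the sketch below illustrates this with entirely hypothetical counts standing in for real ILS, budget, and enrollment figures:

```python
# Hypothetical counts, for illustration only.
counts = {
    "monograph_titles": 420_000,
    "serial_titles": 38_000,
    "volumes": 1_900_000,
    "students": 25_000,
    "electronic_spend": 4_200_000,
    "print_spend": 1_100_000,
}

ratios = {
    "monographs per serial": counts["monograph_titles"] / counts["serial_titles"],
    "volumes per student": counts["volumes"] / counts["students"],
    "electronic vs. print spend": counts["electronic_spend"] / counts["print_spend"],
}

for label, value in ratios.items():
    print(f"{label}: {value:.1f}")
```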
E-Metrics
 Online sessions
 Documents downloaded
 Records downloaded
 Virtual visits
 Turn-aways
 Alert usage
 Personal profile users
 Remote versus onsite usage
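
Most of these e-metrics reduce to simple tallies over usage data. A sketch of how such tallies might be computed, assuming a simplified session-record layout invented here for illustration (real figures would come from COUNTER reports, vendor portals, or proxy/server logs):

```python
from collections import Counter

# Invented session records, for illustration only.
sessions = [
    {"location": "remote", "downloads": 3, "turned_away": False},
    {"location": "onsite", "downloads": 0, "turned_away": True},
    {"location": "remote", "downloads": 1, "turned_away": False},
]

totals = Counter()
for s in sessions:
    totals["sessions"] += 1
    totals["documents_downloaded"] += s["downloads"]
    totals["turn_aways"] += int(s["turned_away"])
    totals[f"{s['location']}_sessions"] += 1

print(dict(totals))
# 3 sessions, 4 documents downloaded, 1 turn-away, 2 remote / 1 onsite
```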
Qualitative Research
“A process of inquiry that draws data from the context in which events occur . . . using induction to derive possible explanations based on observed phenomena”

Gorman and Clayton, Qualitative Research for the Information Professional: A Practical Handbook, 2nd ed. (London: Facet, 2005)
Qualitative Measures Look at:

 Strengths
 Weaknesses
 Non-strengths
Qualitative Measures

 Provide the context


 Offer a way to understand the
attitudes that inform the statistics
Qualitative Measures
 Focus groups
 Online or printed surveys
 Interviews (structured or
unstructured)
 Observation
Collection Analysis Methods

Quantitative, use- or user-based:
• ILL statistics
• Circulation statistics
• In-house use statistics
• Document delivery statistics
• Shelf availability statistics
• E-metrics

Quantitative, collection-based:
• Collection size and growth
• Materials budget size and growth
• Collection size standards and formulas
• Expenditures by subject
• Ratios

Qualitative, use- or user-based:
• User opinion surveys
• User observation
• Focus groups
• Interviews

Qualitative, collection-based:
• List checking
• Verification studies
• Citation analysis
• Direct collection checking
• Collection mapping (assigning conspectus levels)
Steps in a Collection Analysis Project

Plan → Do → Study → Act (a repeating cycle)
Where to start?
 Define the question or problem
 Determine metrics to use
 Decide:
• Where to locate the information
• Who will collect the information
• Who will analyze and report the
information
• Who will act on the information
Remember
 Choose measures that matter
 Choose an approach that is simple
 Don’t aim for perfection—good ‘nuff
is OK
 Don’t do it once and never again
 Know your audience
 Present data in a context—explain
what it means
