
UCD LIBRARY

http://www.ucd.ie/library/bibliometrics
bibliometrics@ucd.ie

Bibliometrics -
an introduction
© UCD Library 2010 Version 3 Mar 2010
Section

Bibliometrics:
an overview
• Research impact can be measured in many ways: quantitative approaches
include publication counts, amount of research income, number of PhD students,
size of research group, number of PI projects, views and downloads of online
outputs, number of patents and licences obtained, and others.

• Use of bibliometrics and citation analysis is only one of these quantitative
indicators.

• The ability to apply it, and its importance in the overall assessment of research,
varies from field to field.

• Attempts at quantitative measures can be contrasted with the main alternative
assessment approach - qualitative peer-review in various forms.

• The balance between use of bibliometrics and peer-review in assessing
academic performance, at both the individual and unit levels, is currently a “hot
topic” being played out locally, nationally and internationally.

• This section provides an introductory overview of the field - the others look in
more depth at: the key uses of bibliometrics for journal ranking and individual
assessment; the main metrics available; and the main data sources and packaged
toolkits available.

What and Why
• Bibliometrics are ways of measuring patterns of authorship, publication,
and the use of literature.
“In our view, a quality judgment on a research unit or institute can only be
given by peers, based on a detailed insight into content and nature of the
research conducted by the group …. impact and scientific quality are by no
means identical concepts.”
[Bibliometric Study of University College Dublin 1998-2007, CWTS, February 2009]

• The pressure to use bibliometrics stems from the quantitative nature of the
result, which could be argued to have advantages. It also holds out the
possibility of an efficiency advantage, producing a variety of statistics quite
quickly in comparison to the resource-intensive nature of peer-review of
quality and innovation of intellectual thought.

• Any move to use these bibliometric approaches as proxy indicators of the
impact or quality of published research is highly controversial, even in those
disciplines where citation analysis “works” in that much research output is
indexed in the main citation data sources.

• Bibliometric analysis has formed one part of the local UCD Research
Excellence Framework strategy, looking at the impact of the research at
institutional and unit level.
• The 2 other key areas where bibliometrics are commonly used are:

• as part of the assessment of an individual in relation to
consideration for promotion, tenure and grant funding

• to consider where to publish research in order to obtain maximum
visibility and citation rate by targeting high-impact titles

• Despite its many shortcomings, ranking tables for universities give
considerable weighting to bibliometrics in their calculations

The building blocks
A source dataset

• Collecting the citation information is a huge task, and all sources are highly
selective - only 5 Schools at UCD, for example, have 80% or more of their
substantive research outputs indexed in the ISI citation indexes (CI)

• The main source datasets are those of ISI, SCOPUS and Google Scholar,
plus subject-specialist options in some fields

• Each collects the citation information from the articles in a select range of
publications only – the overlap between the content of these sources has
been shown to be quite modest in particular studies.

Metric tools and techniques applied to the data source

• Basic building blocks are a series of techniques such as the h-index, impact
factor, eigenfactor, SJR and SNIP - these formulae transform the raw
data into various quantitative evaluations of both journals and individual
researchers

Publication counts

Publication counts measure productivity but arguably not impact - 28% of the
8,077 items of UCD research from 1998-2007 indexed in the ISI Citation Indexes
were not cited other than by self-citations, and overall as much as 90% of the
papers published in scientific journals are never cited. The free web-based
SCImago product provides an easy graphical presentation of the proportion of
uncited material for each journal title.

Citation analysis

• Most current bibliometric indicators are based around analysis of citations -
the key concept is that the number of times you get cited is meaningful

• There are two main approaches to citation analysis:

• Citation counts - total number of citations, total number of citations over
a period of time, or total number of citations per paper, plus more
sophisticated measures such as the number of papers cited more than x times,
or the number of citations in the x most cited papers

• Normalisation and “crown indicators” - citation counts alone are
commonly used, but raw counts are close to meaningless unless normalised by
some combination of time, journal of publication, and broad or narrow field of
research. This benchmarking approach is the most commonly used at
present (an outline formula is given below). There are various initiatives to
provide metrics sufficiently normalized on a number of criteria that both
journals and individuals can be compared across disciplines, such as
SNIP for journals and the universal h-index for individual researchers.
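In outline, a field-normalised (“crown”) score of this kind takes the form:

    normalised impact = (citations per paper for the unit)
                        ÷ (mean citations per paper for world publications of the
                           same field, year and document type)

so a score above 1 indicates citation performance above the world average for
the field.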

Issues & Limitations
“….We publish in books and monographs, and in peer-reviewed journals.
However, we have a range of real requirements that include official reporting
to state agencies and authorities; public archaeology and communication in
regional and local journals and in interdisciplinary publication across several
journals, that most bibliometrics are incapable of measuring“
[UCD academic]

• In some fields it is not the tradition to cite extensively the work that your
scholarship and research is building upon – yet this is the whole principle
of the citation analysis system.

• Seminal research is also often taken for granted and not cited

• Where citation is common, the data sources often do not index the
publications where research in a field is typically published – local
publications, non-English material, monographs, conference and working
papers are poorly indexed

• Negative citations are counted as valid

• The system can be manipulated by such means as self-citation, multiple
authorship, splitting outputs into many articles, and journals favouring
highly cited review articles

• Defining the field and the level of granularity at which to benchmark is
difficult, and can dramatically alter the result for an individual or group
when using normalized benchmarked scores

• Inappropriate use of citation metrics, such as using the Impact Factor of a
journal to evaluate an individual researcher’s output, or comparing h-indices
across fields, ignoring the variations found in citation patterns

“The terrible legacy of IF is that it is being used to evaluate
scientists, rather than journals, which has become of
increasing concern to many of us. Judgment of individuals
is, of course, best done by in-depth analysis by expert
scholars in the subject area. But, some bureaucrats want a
simple metric. My experience of being on international
review committees is that more notice is taken of IF when
they do not have the knowledge to evaluate the science
independently”
[Alan Fersht “The most influential journals: Impact Factor
and Eigenfactor” PNAS April 28, 2009 vol. 106 no. 17]

Section

Where Should I Publish?
Journal Ranking Tools

 The impact or ranking of the journals in which an academic publishes their
work is often taken into account when applications for tenure, promotion and
grants are considered. It is therefore important to keep this in mind when
choosing a journal in which to publish.
 Discussed below are 5 of the main tools used for journal rankings. Each tool
uses different metrics to rank journals and each also has different journal
coverage. These factors should be kept in mind when assessing journal
rankings.
 Journal metrics should only be compared within the same discipline or sub-
discipline due to varying citation traditions – though the new SNIP metric in
SCOPUS attempts to normalize across all disciplines to enable such cross-
comparison.
 At present, none of the journal ranking tools adequately categorise multi-
disciplinary journals.
 Some research areas have their own indexing database, which contains a wider
range of source materials and provides citation analysis; where this is the
case, it may be preferred to the multi-disciplinary tools covered in this
section.
 The use of journal rankings to assess research output is not appropriate for
some disciplines; e.g. in the arts and humanities, research output in journals is
low and citations are infrequent. In some fields, conference proceedings are the
main outlet for disseminating research, and in general journal ranking tools do
not cover these adequately.

ISI Journal Citation Reports
• Journal Citation Reports (JCR) forms part of the subscription-based ISI
suite of products known as Web of Knowledge, which also includes
Web of Science. JCR is the original journal ranking tool, first developed in
the 1950s, and it is the current market leader for journal rankings.

• JCR allows you to search for individual journals or to compare groups of
journals by subject category. JCR provides a range of metrics for each
journal, covering impact over 2 and 5 years, immediacy of citing, whether
citing continues over time, etc. JCR also provides eigenfactor metrics.

• Key metric: Journal Impact Factor (JIF)
The journal impact factor is the average number of citations received in a
year by articles published in a journal in the previous 2 years, e.g. a journal’s
JIF for the year 2008:
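In its standard form:

    JIF(2008) = citations received in 2008 by items the journal published in 2006 and 2007
                ÷ number of citable items the journal published in 2006 and 2007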

SCImago
 SCImago is a freely available web resource at
http://www.scimagojr.com/. It uses Scopus data to provide metrics
and statistical data for journals.

 As well as the Journal Rank Indicator, SCImago provides a number of
other metrics and statistics for journals, and it allows you to search for
journals individually or comparatively by discipline and sub-discipline.

 Key metric: SCImago Journal Rank Indicator (SJR)

 The SJR is much like the JIF in principle. However, it goes a step further by
mimicking the Google PageRank algorithm: it assigns higher
value/weight to citations from more prestigious journals. The SJR covers a
3-year citation window.

e.g. a journal’s SJR for 2008:
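In outline (the full calculation is iterative, since each journal’s prestige depends
in turn on the prestige of the journals citing it):

    SJR(2008) ≈ prestige-weighted citations received in 2008 by items published in 2005-2007
                ÷ number of items published in 2005-2007

where each citation is weighted by the SJR of the citing journal.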

 Note that SCImago also gives a calculation identical to the ISI Impact
Factor to enable comparison with JCR using a different data source

eigenfactor.org
• eigenfactor.org is a freely available web resource that provides metrics for
journals using data from ISI’s JCR. As well as the eigenfactor score, the
website also provides the Article Influence score, which is more directly
comparable with the JCR JIF.

• Key metric: Eigenfactor
As with SCImago’s SJR, the developers of the eigenfactor use a method
similar to Google’s PageRank algorithm to rank journals, i.e. the
eigenfactor of a journal is based on the citations it receives from other
journals, and citations from highly ranked journals are given more weight
than others (a sketch follows below).

• The eigenfactor score also takes into account other variables, like the
disciplinary relationships between citing and cited journals, and it covers a
5-year citation window. Furthermore, the eigenfactor score is a measure of
the overall impact of a journal, not that of its individual articles (as is the
case with JIF and SJR). For these reasons it is considered quite robust.
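To make the PageRank principle concrete, here is a minimal Python sketch of
prestige-weighted journal scoring. It is illustrative only – the published
Eigenfactor and SJR algorithms add further steps such as field normalisation,
handling of self-citations and a 5-year target window – and the three-journal
citation matrix is invented for the example.

    # Minimal sketch of PageRank-style journal ranking; illustrative only,
    # not the published Eigenfactor or SJR algorithm.
    # cites[i][j] = citations from journal i to journal j.

    def prestige_scores(cites, damping=0.85, iterations=100):
        n = len(cites)
        scores = [1.0 / n] * n
        for _ in range(iterations):
            new_scores = []
            for j in range(n):
                # Each citation from journal i passes on a share of i's own
                # score, so citations from highly ranked journals count more.
                inflow = sum(scores[i] * cites[i][j] / max(sum(cites[i]), 1)
                             for i in range(n))
                new_scores.append((1 - damping) / n + damping * inflow)
            scores = new_scores
        return scores

    # Invented example: journal 0 is cited most, and by well-cited journals,
    # so it receives the highest score.
    example = [[0, 1, 0],
               [5, 0, 1],
               [4, 2, 0]]
    print(prestige_scores(example))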

Google Scholar & Harzing’s Publish or Perish


 Harzing’s Publish or Perish (PoP) software is free to download
(http://www.harzing.com/pop.htm) and uses Google Scholar (GS) data
to provide metrics for journals. However, you cannot search for groups of
journals by subject category as in the other tools.

 Key metric: h-index for journals
The h-index of a journal is the number of its papers that have been cited at
least h times, e.g. a journal has an h-index of 13 if 13 of its papers have been
cited at least 13 times. The date range for articles included in the h-index is
flexible and can be set as desired; however, the citation window cannot be
fixed, so all citations are included.

 This metric might be the most useful for some humanities and social
science subjects as GS, generally speaking, covers more material in these
areas. It also has better coverage of conference proceedings which might
benefit subjects like computer science. The date range flexibility of the h-
index might also suit disciplines where published research is slower to
impact on subsequent publications.

SCOPUS
 SCOPUS enhanced its Journal Analyzer product in 2009/2010 and has
made deals with both SCImago and CWTS Leiden. The SJR
calculation from SCImago is now included in the product

 Key metric: SNIP (Source Normalized Impact per Paper)

 This is unique to the SCOPUS product and its built-in toolset.

 Created by Professor Henk Moed at CWTS, University of Leiden, Source
Normalized Impact per Paper (SNIP) measures contextual citation impact by
weighting citations based on the total number of citations in a subject field. The
impact of a single citation is given higher value in subject areas where citations
are less likely, and vice versa.

 Details:
- Measures contextual citation impact by ‘normalizing’ citation values
- Takes a research field’s citation frequency into account
- Considers immediacy - how quickly a paper is likely to have an impact in a
given field
- Accounts for how well the field is covered by the underlying database
- Calculates without use of a journal’s subject classification to avoid delimitation
- Counters any potential for editorial manipulation
More information about SNIP: http://www.journalindicators.com
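In outline, the published definition takes the form:

    SNIP = RIP ÷ RDCP

where RIP is the journal’s raw impact per paper (citations in the current year to
its papers of the previous 3 years, divided by the number of those papers) and
RDCP is the relative database citation potential of the journal’s subject field,
reflecting how heavily and how quickly that field cites and how well the
database covers it.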

CONCLUSION
Using different data sources and different metric tools means that journals can
score better or worse in the different products.

The chart below shows that, for two Bioethics titles, which journal scores higher
varies from metric to metric.

[Chart: A Comparison of Key Indicators for 2 Journals in Bioethics. Values
recoverable from the chart: JCR JIF 4.378 vs 1.013; SCImago SJR 0.108;
Eigenfactor 0.0035008; SCOPUS SNIP 0.367. *These are the current metrics
available for these journals.]

Section

Bibliometrics for your CV

• Bibliometric measurements can be used to assess the output and impact of an
individual’s research. These types of measures are often taken into account
when applications for tenure, promotion or grants are considered. It is
important to be aware of the various metrics that can be used in such
assessment.

• Metrics range from simple publication or citation counts to mathematical
formulae which take into account both the output and impact of a researcher’s
work.

• The three main bibliometric tools, Web of Science, Scopus and Google Scholar
(in collaboration with Publish or Perish software or the Scholarometer browser
add-on for Firefox and Chrome), provide automatic metrics for individual
researchers, and they also contain the raw data that can be used to manually
calculate or verify metrics. There are also some specialised tools for certain
disciplines.

• The bibliometric tools each cover a different range of data, and metrics for the
same individual vary across the 3 products. This should be kept in mind when
assessing individual metrics in any of the tools.

The Metrics
• A huge variety of metrics have been developed to help assess the output of
researchers. Here are some of the most popular:

• Total number of papers: a simple count of the number of papers a
researcher has published.

• Total number of citations: a count of all the citations received by a
researcher’s published works.

• The h-index has become the most popular metric for assessing the output
of individuals since it was developed by Hirsch in 2005. The h-index of an
individual is the number of their papers that have been cited at least h times,
e.g. a researcher has an h-index of 25 if 25 of their papers have been cited at
least 25 times.

• A number of variations on the h-index have emerged (a sketch of the first
of these follows the list below). These include:

• Egghe’s g-index, which gives more weight to the most highly cited
papers

• The individual h-index, which accounts for co-authorship in
calculating impact

• The contemporary h-index, which gives less weight to older cited papers

• The age-weighted citation rate, which also accounts for the age of
papers.
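As a concrete illustration, here is a short Python sketch (not the code of any of
the tools below) computing the basic h-index and Egghe’s g-index from a list of
per-paper citation counts:

    # Illustrative h-index and g-index calculations from per-paper citation
    # counts; a sketch, not any product's actual implementation.

    def h_index(citations):
        # h = the largest h such that h papers have at least h citations each.
        ranked = sorted(citations, reverse=True)
        return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

    def g_index(citations):
        # g = the largest g such that the top g papers together have at least
        # g*g citations, so a few very highly cited papers raise g above h.
        ranked = sorted(citations, reverse=True)
        total, g = 0, 0
        for rank, c in enumerate(ranked, start=1):
            total += c
            if total >= rank * rank:
                g = rank
        return g

    cites = [42, 18, 12, 7, 6, 5, 2, 1, 0]
    print(h_index(cites))  # 5: five papers have at least 5 citations each
    print(g_index(cites))  # 8: the top 8 papers together have 93 >= 64 cites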

• The bibliometric tools discussed below provide some or all of these
metrics for individual researchers.

Web of Science

• Web of Science (WoS) is part of the ISI suite of products and is the current
market leader for bibliometrics.

• WoS allows you to use the Author Finder to identify a single author and
view a list of their publications including citations.

• For this list of publications you can also generate a Citation Report. This
provides metrics including the h-index, total number of papers and total
number of citations. Charts and year-by-year citation analysis are also
provided.

• Another product from the ISI suite, Essential Science Indicators, covers
22 fields in science and provides data for ranking scientists.

Google Scholar plus Publish or Perish


• Publish or Perish software (PoP) works with Google Scholar data
to produce metrics for published material.

• Google Scholar covers a lot of material not well represented in the
other bibliometric tools, and it may be the best option for
researchers whose work is poorly covered by those tools.

• PoP produces a wide range of metrics for individuals, many more
than are provided by the other tools.

• In PoP it is important to check the list of results for errors and
duplicates.

Scopus
• Scopus allows you to conduct an Author Search to identify a single
author. The search contains useful tools for author disambiguation - by
country, affiliation etc

• For each author you can view a list of publications including citations.

• You can use the Citation Tracker function to produce a Citation
Overview for an author’s publications, which includes the h-index and a
year-by-year analysis of citations for each paper.

• It must be kept in mind that Scopus only includes citations from 1996
onwards. This has a big impact on citation data and metrics for authors
with careers predating 1996; for such authors the Scopus h-index does
not adequately represent their body of work.

Some alternative tools


There may be specialist databases for your field that offer citation tools and good
coverage of the literature; here are some examples:

Spires - free resource covering physics literature. Includes bibliometric data.
http://www.slac.stanford.edu/spires/

Medline - free resource indexing life science and biomedical publications. Includes
citation data. http://medline.cos.com/

CiteSeer - free resource for computer and information science publications.
Includes citation data. http://citeseer.ist.psu.edu/citeseer.html

ArXiv - open access; covers physics, mathematics, computer science, quantitative
biology, quantitative finance and statistics. http://arxiv.org/

Section

Calculating the h-index:
Web of Science, SCOPUS or Google Scholar?

• The h-index, a method of measuring the productivity and impact of an
academic’s work, is often used as a component or metric in the ranking of
higher education institutions.

• The principal citation databases used in this exercise are Web of Science
(WoS), Scopus and Google Scholar (GS), and the pros and cons of using
each of the three databases to calculate the h-index are discussed below.

• The h-index can be defined as: A scientist has an index h if h of his/her Np papers
have at least h citations each, and the other (Np – h) papers have no more than h citations
each*, e.g. a researcher has an h-index of 25 if 25 of their papers have been cited
at least 25 times.

• These databases are selective in their journal coverage and some disciplines are
better served than others. Also, conference proceedings and monographs,
which are key research outlets in some subjects, are not adequately covered.
These factors should be kept in mind when assessing the h-indices of
researchers in such disciplines.

Web of Science
Pros

• Excellent depth of coverage (from 1900-present for some journals)

• A huge number of the records are enhanced with cited references

• Recently added 700 regional journals

• First database to incorporate the h-index (using a good graphical display)

• Can view the h-index minus self citations

• Recently included conference proceedings

• Can view stray and orphan records using the “cited references” search

Cons

• Coverage is not as wide as Scopus (about 11,500 journals)

• Better coverage of sciences than arts and humanities

• Doesn’t cover monographs at all

• Facilities for finding and distinguishing between authors are not great

• Western, English language bias

Google Scholar
Pros

• Covers not only journals but academic websites, grey literature, pre-prints,
theses etc

• Also includes books from the Google Books project

• Includes both print-born and born-digital items

• Always has a master record or creates one from citations (so stray and
orphan records are directly visible in the results list)

Cons

• Does not automatically calculate the h-index (but can use “Publish or
Perish” software to do this)

• Doesn’t provide a list of journals covered (peer-reviewed or otherwise)

• Does not indicate timescale covered

• Covers some suspect material e.g. course reading lists, student projects etc

• Poorer coverage of print-born material

• Hit counts and citation counts can be suspect as they are often inflated

• No way to distinguish between authors who have the same initials

• Results often contain duplicates of the same article (usually as pre-prints
and post-prints)

Scopus
Pros

• Includes more than 16,500 journal titles

• Covers some books and conference proceedings

• The ‘more’ feature allows you to quickly view stray and orphan records

• Very strong coverage of science and technology journals

• Contains useful tools for author disambiguation

• Automatically generates the h-index

Cons

• Depth of coverage is not as impressive as the width; many journals are only
covered for the last 5 years.

• Poor coverage of arts and humanities (recently improved somewhat as
more journals added)

• The citation-enhanced part of Scopus only dates back to 1996. This results
in a very skewed h-index for researchers with careers longer than this.

• Even citations to pre-1996 articles in articles published after 1996 are not
included in the h-index calculation.

Section

Bibliometric toolkit:
the ISI product suite

• The Thomson Reuters ISI product set is the market leader for bibliometrics
and its range of products is the most widely used

• Product suite includes Web of Science, Journal Citation Reports, Essential
Science Indicators, Sciencewatch, Incites, ResearcherID and others – this
section just covers the key resources

• Journal Citation Reports is the original journal ranking tool, first developed in
the 1950s, and it is the current market leader for journal rankings

• There are serious issues with use of these tools in many fields, particularly
humanities, applied technologies and multidisciplinary areas, due to lack of
coverage of the literature and inadequate categorisation in these areas

ISI – key facts
Key facts

• Citations date back to 1900, the longest set available in any product

• 9,200 journals covered, 80% in the “hard” sciences

Some points and some limitations to note

• Highly selective citation sets as the building block of all the metrics, given
estimates of 100,000+ journal titles in existence

• Open Access journals and conference proceedings are poorly covered,
despite some additions

• Monographs are not included at all

• Strong English language and US bias

• Lack of standard author names – all result sets must be checked and
pruned – tools are provided to pick all forms of author name

• Lack of standard affiliation details, and only 3 organisations can be
attributed to each individual author

Main alternative citation data sources for comparison

• SCOPUS from Elsevier

• Google Scholar with Publish or Perish software application

• Subject specialist databases with citation information, available in some
fields - Life Sciences get better coverage in Medline for example, Computer
Science in CiteseerX or ACM

Web of Science
Cited Reference Search for an author or group –
Find the articles that cite a person's work, analyse citing material by geography,
discipline, document type. Includes all citations to the author’s work, not just those
in the selected titles indexed by ISI

Create a citation report for an author –
Includes publications and citations per year, and the h-index for the author. These
reports use only citations to materials in the select titles indexed in the product
itself for the calculations

Check citations for an individual article –
Get a count of cites for an individual article, view citing articles, obtain a citation
map

Cited Reference Search for a journal –
Find all the articles citing materials in a particular journal title, limiting to a date
range of publication if desired, and obtain a citation report for the journal content
including publications and citations per year, and the h-index for the journal

ISI Journal Citation Reports
Key points

• An annual subscription product appearing 6 months after the year of
analysis

• JCR covers over 6,950 science and over 1,980 social science journals

• Provides 171 subject categories for science and technology and 55 in the
social sciences – multidisciplinary areas are categorised poorly or not at all, so
the product is of little use in such areas

• It takes 3 years for a journal to appear in JCR and this time lag is
problematic in fast moving areas – and for new journals

Get a list of top ranked journals in your field – sort to highlight a number
of different aspects: count of citations; 2-year impact; 5-year impact; the
immediacy index, which measures how soon articles in a journal are cited; and
the half-life, which measures whether citing continues over time for a journal’s
content. Also provided are 2 eigenfactor metrics, which weight citations by the
impact of the citing journals and whether they in turn are heavily cited
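For reference, the standard JCR definitions of those two measures: the
immediacy index for year Y is the number of citations received in Y by articles
the journal published in Y, divided by the number of articles published in Y;
the cited half-life is the median age of the journal’s articles cited during Y.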

Check an individual journal title to see its impact and ranking

Obtain similar metrics for an individual title; also includes analysis year by year
and graphical presentation, details of citing and cited journals, and ranking of
the journal title in all relevant categories in JCR

Essential Science Indicators
• Offers data for ranking scientists, institutions, countries, and journals

• Covers 22 fields of science

• Uses a rolling 10-year trend analysis and updates every 2 months to
progressively expand the current year.

Science Watch
• Tracks trends and performance in basic research using select data from
Essential Science Indicators

Section

Bibliometric toolkit:
SCOPUS

• SCOPUS was launched by Elsevier in 2004

• The suite of citation analysis tools is called Research Performance Management
– RPM

• SCOPUS offers Live Chat to support users of the product

• SCOPUS has improved its Journal Analyzer package and now provides SJR
and SNIP metrics in the product as alternatives to the ISI JIF

SCOPUS – key facts
Key facts

 The database contains over 30 million citations

 Citations are only indexed from publications from 1996, which is a
disadvantage in comparison to the ISI suite of products

 The main calculations such as author h-index are also only worked out for
publications from 1996

 There are 15,800 peer-reviewed titles in SCOPUS - a lot more than in ISI,
and this makes the product better for some fields such as Engineering

 The range of materials included is said to be better for the humanities, and in
2009 a significant increase in such titles took place

 There is more European content and more material in languages other
than English than in ISI – 60% of coverage is outside the USA

 Open Access titles, proceedings, web pages, patents, book series are all
included

Main alternative citation data sources for comparison

 Thomson Reuters ISI product suite including Web of Knowledge and
Journal Citation Reports

 Google Scholar with Publish or Perish software application

 Subject specialist databases with citation information, available in some
fields - Life Sciences get better coverage in Medline for example, Computer
Science in CiteseerX or ACM

Authors
Citation Tracker

Use the tools provided to select one author or a group of authors – you can
filter by affiliation, country and subject area and pick variant forms of your
name. This is a strong feature of the product providing assistance in isolating
the correct research outputs, particularly where names are common

Then click on Citation Tracker to create a Citation Overview of the selected
author or group of authors – using the affiliation search, a similar approach can
be used to get an overview of various broad UCD subject areas as well

The Citation Overview shows:

• the number of documents cited in any year range from 1996 to the current year

• the pattern of citation by year, and total cites

• self-citations, which can be removed if wished

• the citing documents, which can be viewed – however, unlike ISI, there is no
analysis offered by geography, discipline or document type

• the author(s) h-index, based on publications from 1996 onwards. For
example, there may be 16 documents found that are cited, but the h-index
is calculated only on the 12 published from 1996

• graphs for the h-index, publication count and citation count

Journals
Journal Analyzer

Use Journal Analyzer to select a journal title or build up a group of titles. The
analysis includes both graphs and chart displays of:

• Number of articles published per year in the title(s) from 1996 to date

• Number of citations in total to the articles per year

• Number of articles per year not cited, as a percentage

• Selecting a number of titles allows you to see the relative performance of
each – this is a key feature of the product

• 2 key metrics for journal ranking are provided – the SJR and the SNIP,
which can give quite varying results

• Unlike JCR, this analyzer does not provide ranked lists of journals for your
field. However, the free SCImago website uses the Scopus data set to
provide this type of ranked listing comparable to the ISI JCR product
http://www.scimagojr.com/

Articles
 Each document returned in a search has an indication of the number of
citations to it found in Scopus

 The citing documents can be viewed in turn as a list if desired

 Or, for any particular article in the set, citation tracker can be used to get a
more detailed analysis of the citations by year for that item - as well as
viewing the citing articles

Section

Bibliometric toolkit:
Google Scholar with
Publish or Perish or
Scholarometer

• Google Scholar is one of three principal tools (the others being ISI and
Scopus) used to generate bibliometrics for researchers and for published
research material.

• Publish or Perish works with Google Scholar data to automatically generate
bibliometric data.

• Scholarometer is a new browser add-on for Firefox or Chrome offering similar
functionality, but working in a different manner and offering options to select
name forms and to remove or merge articles.

Google Scholar
About Google Scholar

 Google Scholar (GS) is a specialised Google search engine designed to
retrieve scholarly data on the web.

 It was launched in beta version in November 2004 and is still in its beta or
test phase.

 GS searches publisher websites, pre-print repositories, university websites,
books, technical reports etc for scholarly information.

Some Advantages of GS for Bibliometrics

• A free alternative to fee-based citation databases like Scopus and Web of
Science (WoS).

• Covers a diverse range of sources, which can lead to higher citation counts
for some articles.

• Includes material not indexed by other citation databases. As a result, it can
be better for disciplines poorly served by its competitors.

• Includes books, which is advantageous for disciplines where monographs
are a principal research outlet, for example in the arts and humanities.

• Better coverage of many conference proceedings than its competitors.

• Provides citation counts for each document that it returns. You can click
on the citation count to view the citing documents.

Some Disadvantages of GS for Bibliometrics

• Does not provide a list of the journals that it covers.

• Does not indicate the timescale covered.

• Covers some suspect material e.g. course reading lists, student projects etc

• Poorer coverage of print-based material

• Even the advanced search options are quite limited:

 there is no way to distinguish between authors who have the same
initials.

 it is impossible to conduct a comprehensive search for a journal
which has various title abbreviations.

• Results often contain duplicates of the same article (usually as pre-prints and
post-prints) or even false hits.

Publish or Perish
About Publish or Perish

 Publish or Perish (PoP) is a piece of software which is free to download
from the web at http://www.harzing.com/pop.htm. It works with Google
Scholar data to generate metrics for authors, journals and articles.

 The software was developed by Anne-Wil Harzing, Professor in
International Management at the University of Melbourne.

 As PoP uses GS data, the advantages and disadvantages of using GS should
be kept in mind when analysing PoP results. PoP is limited to the same
search functionality as GS.

 Using PoP you cannot view groups of journals together or categories of
journals as in Scopus and ISI.

 PoP results can be copied into Windows applications like Excel or saved as
text files for further analysis.

 Results lists should be checked for errors and false hits deselected.

Scholarometer
 Scholarometer is a browser add-on for Firefox or Chrome
and provides an alternative to Publish or Perish if using Google
Scholar as the underlying data source for bibliometric analysis.

 It was developed by Diep Thi Hoang and Fil Menczer at the School of
Informatics and Computing, Indiana University.

 You can find out more at http://scholarometer.indiana.edu

 It is fairly new, with fewer than 1,000 downloads at the time of writing
(March 2010) – give it a try; here is a beginner guide:

 It operates as a side bar in Firefox and offers similar facilities to
Publish or Perish, but a more limited set of metrics

 Notable features are the ability to input various name forms, to remove
unwanted forms of a name, to merge duplicate articles and to remove
from the result set individual items that are not by the right author – this
improves on the limited abilities Publish or Perish provides, which
make it extremely difficult to use for common names

 Another feature of this product is the use of Web 2.0 – when searching
for an author, searchers can allocate tags for the research area of
interest, and these then appear in Twitter and are used in some metrics.
This is an extremely experimental aspect of the product


Section

A brief bibliography

General
Editorial: rating research performance
Watson, Roger
2009, October
Journal of Clinical Nursing 18(20), pp. 2781-2782
http://dx.doi.org/10.1111/j.1365-2702.2009.02926.x
A good brief two-pager on impact factors, the h- and g-index, and some pitfalls

Citation Analysis: A Comparison of Google Scholar, Scopus, and Web of Science
Yang, Kiduk and Meho, Lokman
2007
Proceedings of the American Society for Information Science and Technology 43(1), pp. 1-15
http://www3.interscience.wiley.com/cgi-bin/fulltext/116328907/PDFSTART

Data sources for performing citation analysis: an overview
Neuhaus, Christoph and Daniel, Hans-Dieter
2008
Journal of Documentation 64(2), pp. 193-210
http://www.emeraldinsight.com/Insight/viewPDF.jsp?contentType=Article&Filename=html/Output/Published/EmeraldFullTextArticle/Pdf/2780640202.pdf

Whose metrics? On building citation, usage and access metrics as information services
for scholars
Armbruster, Chris
2009
Working Paper Series, Research Network 1989, Berlin, Germany
http://ssrn.com/abstract=1464706

Journal Rankings

The most influential journals: Impact Factor and Eigenfactor
Fersht, Alan
2009
PNAS 106(17), pp. 6883-6884
http://www.pnas.org/content/106/17/6883.full

Eigenfactor
Crisp, Michael G
2009
Collection Management 34(1), pp. 53-56
http://dx.doi.org/10.1080/01462670802577279

Comparison of SCImago journal rank indicator with journal impact factor
Falagas, Matthew and Kouranos, Vasilios D.
2008
The FASEB Journal 22, pp. 2623-2628
http://www.fasebj.org/cgi/content/short/22/8/2623

Individual author ranking

Ambiguity, bias, and compromise: an ABC of bibliometric-based performance indicators
Todd, P A
2009
Environment and Planning A 41(4), pp. 765-771
http://www.envplan.com/epa/editorials/a424.pdf

On the robustness of the h-index
Vanclay, Jerome K.
2007
Journal of the American Society for Information Science and Technology 58(10), pp. 1547-1550
http://arxiv.org/ftp/cs/papers/0701/0701074.pdf

