This guide gives a brief overview of some tools for measuring scholarly impact that are available through the UNR Libraries and on the free web.
Journal Level Metrics
Article Influence Score (an Eigenfactor metric): An expression of the average influence of a journal's articles over the first 5 years after publication. It is calculated by dividing a journal's Eigenfactor Score by the number of articles in the journal; the scores are then normalized. Scores greater than 1 indicate above-average influence per article; scores less than 1 indicate below-average influence.
Cited Half-Life: The median age of a journal's articles that were cited in the year in question. If a journal has a cited half-life of 5, then half of the citations the journal received that year were to articles published within the previous 5 years.
Eigenfactor Score: A metric that rates journals on the basis of citations but gives higher weight to those citations coming from more influential journals. Self-citations are not included. An alternative to the impact factor.
Impact Factor: A measure of the frequency with which articles in a journal are cited. It is calculated by taking the total number of citations in the current year to articles published in the previous 2 years and dividing by the total number of articles published in the previous 2 years.
Self-Citation: A reference to an article from the same journal. Self-citations can have an effect on the calculation of impact factors.
SJR (SCImago Journal Rank): A metric that takes into account both the number of citations and the prestige of the journals from which the citations come over the course of a 3-year period. An alternative to the impact factor.
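As a worked example, the impact factor arithmetic described above can be sketched in Python. The journal and all of its counts here are hypothetical, chosen only to illustrate the division:

```python
# Hypothetical figures for a journal's 2023 impact factor.
citations_in_2023_to_2021_2022_articles = 600  # citations made in 2023 to items from the prior 2 years
articles_published_2021_2022 = 200             # items the journal published in those 2 years

# Impact factor = citations in the current year to the previous
# 2 years' articles, divided by the number of those articles.
impact_factor = citations_in_2023_to_2021_2022_articles / articles_published_2021_2022
print(impact_factor)  # 3.0
```

Note that both numbers come from whichever database computes the metric, so the same journal can have different values in different tools depending on coverage.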
Article/Author Level Metrics
Altmetrics: Metrics that go beyond traditional citation-based indicators and incorporate factors such as coverage in blogs, social media, and peer-networking platforms.
Cited Reference Search: A search for articles that have cited a previously published work.
H-Index: A measure of an author's articles that have the highest impact. The h-index is based on an author's most cited articles and the number of citations those articles have received in other publications. An author with an index of h has published h articles, each of which has been cited in other articles at least h times.
i10 Index: The number of articles by an author that have at least 10 citations (used by Google Scholar Citations).
Immediacy Index: The average number of times an article is cited in the year it is published.
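The h-index and i10 index definitions above can be expressed as short Python functions; the citation counts in the usage example are hypothetical:

```python
def h_index(citation_counts):
    """Largest h such that the author has h papers cited at least h times each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citation_counts):
    """Number of papers with at least 10 citations (as used by Google Scholar)."""
    return sum(1 for cites in citation_counts if cites >= 10)

# Hypothetical author with 7 papers and these citation counts:
papers = [25, 12, 11, 8, 5, 3, 0]
print(h_index(papers))    # 5  (5 papers have at least 5 citations each)
print(i10_index(papers))  # 3  (3 papers have at least 10 citations)
```

Because the inputs are citation counts from a particular database, the same author will often have different h-index values in, say, Web of Science, Scopus, and Google Scholar.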
Use more than one tool.
No database indexes all journals or all issues of journals, and some include publication formats (dissertations, blog posts, book chapters, conference papers, etc.) that are not covered in other databases.
Be aware of where the data comes from for any tools you use.
For example, Publish or Perish uses Google Scholar data; if an author's work is poorly covered in Google Scholar, Publish or Perish will probably produce unreliable metrics, as it will draw from only some of that author's publications. When a tool does not specify where its data comes from, assume there will be gaps in coverage. Be aware, too, that some data may be manipulated: see Manipulating Google Scholar Citations and Google Scholar Metrics: simple, easy, and tempting.
Be wary of trying to compare across disciplines.
Journal, article, and author-level metrics are all calculated on the basis of citations. Because citation practices vary greatly among the disciplines, it is unwise to compare across disciplines.
Be aware of variation in author names.
An author's name may appear differently in different databases, and even within a single database. One entry may include a middle initial, another only the first initial. Only some tools, such as Microsoft Academic Search, provide features for name disambiguation.
With free tools, be aware that they may change, disappear, or be unreliable.
Don't count on free tools always being there as a source of data. For example, Google has a history of taking down services and tools that don't meet its expectations. Here's a good Scholarly Kitchen piece on the risks of free services: Mendeley, Connotea, and the Perils of Free Services.
Beware of possible predatory journals and publishers.
Although the open access movement has created opportunities for expanding the reach and impact of scholarly work, some open access journals lack credibility. See Selecting an Open Access Journal for tips on evaluating and selecting a journal to publish in.
Beware of questionable metrics.
A number of questionable companies have created their own scholarly metrics. Watch out for these misleading metrics.