Scholars have tried to measure the impact of their research for decades. One of the oldest and most popular approaches is journal-level metrics based entirely on citations, but newer methods measure impact at the article or item level and track other forms of impact, such as how many times an article is shared on social media, cited in Wikipedia, or mentioned in the news media.
This guide covers some of the main scholarly metrics and tools out there, but you can also explore the Metrics Toolkit for more information on these and other resources.
No metric is helpful without context. Always interpret the number for an article, journal, or author in the context of others in the same discipline. A Journal Impact Factor of 1.5 might seem low on its own, but in many disciplines it is actually high. Other metrics, such as the H index, are inherently biased against newer researchers because they factor in the number of works an author has written. A new researcher with just two articles under their belt can never have an H index higher than 2, no matter how many citations those two articles receive.
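To see why the H index caps out at an author's publication count, here is a minimal sketch of the standard H index calculation in Python (the function name h_index and the sample citation counts are illustrative, not drawn from any particular tool or dataset):

```python
def h_index(citation_counts):
    """Return the h-index: the largest h such that the author has
    at least h papers with at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

# Two heavily cited articles still cap the h-index at 2.
print(h_index([250, 180]))        # 2
# A hypothetical author with five papers cited 40, 22, 9, 3, and 1 times:
print(h_index([40, 22, 9, 3, 1])) # 3
```

The cap is built into the definition: the H index can never exceed the number of publications, which is why comparing it across career stages is misleading.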
Use more than one tool. No database indexes all journals or all issues of a journal, and some include publication formats (dissertations, blog posts, book chapters, conference papers, etc.) that others do not cover.
Be aware of where the data comes from for any tool you use. For example, Eigenfactor uses Web of Science data; if an author's work is poorly covered in Web of Science, Eigenfactor will provide only limited metrics, since it draws from just some of that author's publications. When a tool does not specify where its data comes from, assume there are gaps in coverage. Be aware, too, that some data may be manipulated: see Manipulating Google Scholar Citations and Google Scholar Metrics: simple, easy, and tempting.
With free tools, be aware that they may change, disappear, or be unreliable. Don't count on a free tool always being there as a source of data; Google, for example, has a history of shutting down services and tools that don't meet its expectations.
Beware of journals and publishers that are not transparent about fees or do not provide genuine peer review. A number of questionable companies have also created their own scholarly metrics with names that sound similar to established metrics.