Research impact can be measured in many ways. Citation databases and alternative metric tools can be used to assess the output and impact of an individual paper or of a researcher's body of work.
It is important to be aware of the various metrics applied in such assessments, the different data sources available and their limitations. Metrics range from simple publication or citation counts (so-called research output metrics) to formulas that take into account both the output and the impact of a researcher's work (author metrics such as the H-index, and journal metrics such as the Journal Impact Factor). Altmetrics are based on social web data rather than traditional citation counts. See this handout for a summary of what bibliometrics are, their main applications, the main tools and metrics, and some of the issues and limitations associated with metrics-based research assessment. The Leiden Manifesto for research metrics presents ten principles for best practice.
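To illustrate what such formulas typically look like (a general, textbook-style description, not tied to any specific tool): the two-year Journal Impact Factor of a journal in year Y is

\mathrm{JIF}_Y = \frac{\text{citations received in year } Y \text{ to items published in years } Y-1 \text{ and } Y-2}{\text{number of citable items published in years } Y-1 \text{ and } Y-2}

and an author's H-index is the largest number h such that h of their publications have each been cited at least h times. The values reported by the tools discussed below may deviate slightly from these textbook definitions, for example because databases differ in what they count as a citable item.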
See the video Bibliometrics for the individual (3:32) by Prof. Dermot Diamond of Dublin City University to get an idea of the value and use of bibliometrics when evaluating an individual's research impact.
Various tools are available for generating a range of bibliometrics. A handful of databases provide tools that allow you to view or track citations to individual papers, or to generate reports based on the citation statistics of a paper or group of papers.
At the EUR the three main bibliometric tools available are:
1. Web of Science and Journal Citation Reports - databases from Clarivate Analytics, formerly Thomson Reuters (paid through our subscriptions);
2. Scopus - a database from Elsevier (paid through our subscriptions); and
3. Google Scholar (using Publish or Perish software and Google Scholar Citations; open to everyone).
These tools provide automatic metrics for individual researchers and they also contain the raw data that can be used to manually calculate or verify metrics.
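As a minimal sketch of what such a manual calculation could look like, the short Python example below computes an H-index from a list of per-paper citation counts. The counts used here are made-up example numbers, not data from any of these tools; in practice you would export citation counts from Web of Science, Scopus or Google Scholar / Publish or Perish.

# Minimal sketch (Python): compute an H-index from per-paper citation counts.
# The counts below are hypothetical example data.
def h_index(citation_counts):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(ranked, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

example_counts = [25, 17, 12, 8, 5, 4, 2, 1, 0]  # hypothetical citation counts
print(h_index(example_counts))  # prints 5

With these example counts the result is 5, because five papers have each been cited at least five times.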
Tip
When assessing individual metrics in any of these tools, keep in mind that each tool covers a different range of data and returns different results for an author. Web of Science and Scopus don't cover all available journals and literature. Google Scholar covers more, but it's impossible to know its boundaries (which are subject to change) and Google Scholar doesn't take the source of a citation into account. Therefore our advice would always be: gather metrics from a variety of sources in order to get the most complete picture of a paper's or person's influence.
A fourth database, introduced in 2018, is Dimensions. Dimensions brings together grants, publications, citations, alternative metrics, clinical trials, patents and policy documents on one platform. Part of Dimensions is open to everyone, but registering gives you access to more options (paid through our subscriptions).
Certain disciplines, journals and document types may not be well represented in the more traditional sources for citation analysis, such as Web of Science and Scopus. In that case it becomes necessary to find alternative resources and tools for locating citations to an author or published work. Here is a list of other databases, available via the library's website, that offer citation options.
Alternative metrics or Altmetrics
But there's more. "Altmetrics" refers to alternative ways of assessing the impact of authors and publications, usually by including their contributions and mentions in social media (e.g. blogs, X (Twitter)). It has become a means of measuring the broader societal impact of scientific research and has the potential to complement more traditional citation-based metrics. Altmetrics are signals that come earlier than citations - e.g. likes, shares, followers, downloads, posts, mentions and comments (usually accumulated soon after publication) - and may be seen as signals of visibility or communication.
Some examples: Altmetric Explorer (with EUR license), PlumX Metrics and ImpactStory.
For more information about these suppliers of altmetrics, and tips & tricks see the Altmetrics page in this guide.
There has been much debate about the use of bibliometrics in academia. Many academics feel that scholarly metrics place too much emphasis on the quantity of work rather than on the quality of the work being produced. Another aspect of this debate is the concern that metrics pressure authors to publish "hot-topic" articles in only the most "impactful" journals instead of producing and experimenting with more original work. The use of altmetrics has also added fuel to this debate, as many believe that mentions of articles and presentations on the social web should be included in the review of their scholarly impact.
Furthermore, differences in publishing practices between disciplines mean that bibliometrics cannot be compared across disciplines. Bibliometrics are generally focused on citation data from journal articles. They may therefore be less relevant in disciplines that are less reliant on journal publishing, such as the arts, humanities, social sciences, computing science and engineering.
A number of challenges also surround altmetrics. Perhaps most prominent are the lack of robustness and the limited uptake of social media in several disciplines and countries. But although their meaning and use are still under discussion, you could - carefully - use altmetrics for self-assessment or career development.
Have a look at the video Limitations of bibliometrics (1:43) about some of the limitations of using journal impact data in the social sciences and humanities, and the publication types that are missed.
The Research Evaluation and Assessment Service (REAS) team of the University Library - member of the Research Intelligence Community EUR - is available to consult with faculty and staff about measuring research impact. Contact the REAS team to make an appointment.
More information about the broader scope of research support at EUR (like grants, research data management, publishing or research integrity) can be found at the website of Erasmus Research Services.