
Measuring academic impact: Journal level metrics

Journal level metrics

Journal level metrics measure the influence of a journal, taking into account the number of citations received by articles published in the journal.

These metrics, originally developed to study the scholarly communication system, are nowadays used for various purposes:

  • By librarians to decide which journals the library should subscribe to
  • By publishers and editors to see how their journal is performing compared with other journals
  • By researchers to decide in which journal to publish
  • By research managers in the assessment of research performance

Several journal level metrics exist, with different calculation methods and based on different underlying datasets. The main journal level metrics are the Journal Impact Factor, the CiteScore, the SJR (SCImago Journal Rank) and the SNIP (Source Normalized Impact per Paper). The sections below introduce these metrics: how are they calculated, where can you find them and how can you use them? Due to differences in citation patterns between disciplines, you can't always use a journal level metric to compare journals from different disciplines.

Be aware that journal level metrics are not an indicator of the quality of individual articles within a journal, or of the quality of a researcher. There is a lot of discussion about the use of journal level metrics. Watch, for example, how Nobel Laureates speak out against the role of impact factors in research in the video The research counts, not the journal (1:27).

Why can journal level metrics be important for you?

Why do journal level metrics matter for PhD candidates?

  • When you are searching for literature, journal level metrics give an indication of the importance of a journal within its subject field. Of course this doesn’t mean that journals with no or a low Journal Impact Factor can’t contain relevant articles!
  • When you have to decide to which journal you submit your article, you can compare the journal level metrics of relevant journals for your subject.
  • Some universities or faculties use the Journal Impact Factor of journals to measure the performance of their researchers, for example by giving extra credits for publishing in a journal with a high Journal Impact Factor. In that case, it’s important to know what journal level metrics are available, and for which purposes they can and can’t be used. For example: the journal level metrics of a journal don’t tell you anything about the quality or relevance of individual articles published in that journal.

JIF: Journal Impact Factor

The Journal Impact Factor is the number of citations in the JCR year to items published in the two previous years, divided by the total number of articles and reviews published in those two previous years.

For example:
In 2020 the Journal of Happiness Studies received 986 citations to items published in 2018 and 2019. In 2018 and 2019 the journal published a total of 256 citable items. The Journal Impact Factor 2020 of the Journal of Happiness Studies is therefore 986/256 = 3.852.

Calculation Journal Impact Factor 2020 for the Journal of Happiness Studies
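
To make the calculation explicit, here is a minimal sketch in Python that reproduces the example above (the function name is ours, purely for illustration):

```python
# Minimal sketch of the Journal Impact Factor calculation (hypothetical helper,
# not an official Clarivate tool), using the figures from the example above.

def journal_impact_factor(citations_to_prev_two_years: int,
                          citable_items_prev_two_years: int) -> float:
    """Citations in the JCR year to items from the two previous years,
    divided by the number of citable items (articles & reviews) from those years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Journal of Happiness Studies, JIF 2020:
# 986 citations in 2020 to items published in 2018-2019; 256 citable items in 2018-2019.
jif_2020 = journal_impact_factor(986, 256)
print(round(jif_2020, 3))  # 3.852
```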

The Journal Impact Factor (JIF) is published in the Journal Citation Reports (JCR) of Clarivate Analytics (formerly Thomson Reuters). You can search for a particular journal or create a list by filtering on categories. JIFs from previous years (going back to 1997) are available in the Journal Profile page of a journal.

Be aware that you often can't use the JIF to compare journals directly. How high a JIF is depends on the citation culture within a discipline: the number of citations given and the age of these citations differ per discipline. The JIF doesn't take these differences into account: it is not field normalized. A JIF of 2.000 can therefore be high in one discipline, but relatively low in another.

To compare journals from different disciplines, you can use the quartile or the JIF percentile. These metrics show how a journal is performing compared to the other journals within the same category. When a journal is in the highest quartile (Q1), it is within the top 25% of the journals in its category. When you check the quartiles of the journals with a JIF of 2.000 in the table below, you can compare them.

Title                          | Category               | Journal Impact Factor 2020 | Rank in Category | Quartile | JIF Percentile
Philosophy & Public Affairs    | Ethics                 | 2.000                      | 20/56            | Q2       | 65.18
Philosophy & Public Affairs    | Political Science      | 2.000                      | 98/182           | Q3       | 46.63
Journal of Environmental Law   | Law                    | 2.000                      | 43/151           | Q2       | 71.85
Journal of Environmental Law   | Environmental Studies  | 2.000                      | 104/125          | Q4       | 17.20
Journal of Asian Public Policy | Area Studies           | 2.000                      | 13/80            | Q1       | 84.38
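
As an illustration, the quartile and JIF percentile in the table above can be derived from a journal's rank and the size of its category. The sketch below uses a common formulation of the percentile, (N - rank + 0.5) / N × 100, which reproduces the values in the table (small deviations, as in the Political Science row, can occur through ties or changes in category size); it is not an official Clarivate specification.

```python
import math

def jif_percentile(rank: int, journals_in_category: int) -> float:
    """Common formulation of the JIF percentile: (N - rank + 0.5) / N * 100."""
    return (journals_in_category - rank + 0.5) / journals_in_category * 100

def jif_quartile(rank: int, journals_in_category: int) -> str:
    """Q1 = top 25% of the category, Q2 = the next 25%, and so on."""
    return f"Q{math.ceil(rank / journals_in_category * 4)}"

# Philosophy & Public Affairs in the Ethics category: rank 20 out of 56 journals.
print(jif_quartile(20, 56), round(jif_percentile(20, 56), 2))  # Q2 65.18
# Journal of Asian Public Policy in Area Studies: rank 13 out of 80 journals.
print(jif_quartile(13, 80), round(jif_percentile(13, 80), 2))  # Q1 84.38
```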

You can find the quartile and percentile scores of a journal on the Journal Profile page in the JCR under the header ‘Rank by Journal Impact Factor’. A journal can be in multiple categories: the quartile and percentile scores can be different per category.

View the video Journal Citation Reports 2021 (3:39) for more information about the new interface of the Journal Citation Reports.

CiteScore

CiteScore counts the citations received in a four-year window to articles, reviews, conference papers, book chapters and data papers published in those four years, and divides this by the number of those document types published in the same four years.

For example:
The Journal of Happiness Studies received 2,220 citations in the years 2016 to 2019 to articles, reviews, conference papers, book chapters and data papers published in 2016, 2017, 2018 and 2019. In 2016, 2017, 2018 and 2019 the journal published in total 476 articles, reviews, conference papers, book chapters and data papers, indexed by Scopus. The CiteScore 2019 of the Journal of Happiness Studies is 2,220/476 = 4.7. 
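
A minimal sketch of this calculation in Python, using the figures from the example above (the function name is ours, purely for illustration):

```python
# Minimal sketch of the CiteScore calculation (method in use since June 2020),
# using the Journal of Happiness Studies figures from the example above.

def citescore(citations_in_window: int, documents_in_window: int) -> float:
    """Citations received in a four-year window to documents (articles, reviews,
    conference papers, book chapters and data papers) published in that window,
    divided by the number of those documents."""
    return citations_in_window / documents_in_window

# CiteScore 2019: 2,220 citations in 2016-2019 to 476 documents published in 2016-2019.
print(round(citescore(2220, 476), 1))  # 4.7
```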


If a journal doesn't have a publication history of four years within Scopus, Scopus calculates the CiteScore based on the available articles and the citations these articles received. This is not visible on the Source details page in Scopus; see for example the Annual Review of Criminology, which has been indexed in Scopus since 2018. Its CiteScore 2019 is based on the articles published in 2018 and 2019.

Please note: the calculation method of CiteScore was changed in June 2020, see https://blog.scopus.com/posts/citescore-2019-now-live for more information. The older CiteScores (before 2019) were also recalculated using this new method.

The CiteScore of a journal is available in Scopus from 2011 onwards, on the Source details page of the journal. You can also visit the freely available website https://www.scopus.com/sources.

The CiteScore of a journal doesn’t take differences in citation cultures between disciplines into account, it’s not field normalized. Therefore you can’t use the CiteScores to compare journals from different disciplines. To compare journals you can use the CiteScore rank or percentile – they indicate the relative standing of a journal in its subject field.

The CiteScore rank and percentile are available on the Source details page of the journal in Scopus.  A journal can be assigned to multiple categories – in that case the journal will have multiple ranks and percentiles.


The video Scopus Tutorial: CiteScore metrics in Scopus (3:07) explains how CiteScore is calculated and how you can use it.

SJR: SCImago Journal Rank

The SCImago Journal Rank is a measure of the scientific influence of journals that accounts for both the number of citations received by a journal and the importance or prestige of the journals where the citations come from. The SJR indicator expresses the average number of weighted citations received in the selected year by the documents published in the selected journal in the three previous years, i.e. weighted citations received in year X to documents published in the journal in years X-1, X-2 and X-3. A detailed description of the calculation is available here.

Due to the iterative calculation method it’s impossible to calculate the SJR of a journal yourself.
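
To give an idea of what 'iterative' means here, the toy sketch below illustrates the general principle behind prestige-weighted metrics such as SJR: citations act like votes, and a vote from a prestigious journal weighs more than one from a little-cited journal. The journals and citation counts are entirely made up, and this is not the actual SJR algorithm.

```python
import numpy as np

# Toy illustration of prestige-weighted citation counting, the idea behind SJR:
# a citation from a prestigious journal counts for more than a citation from a
# little-cited journal. This is NOT the actual SJR algorithm, and the citation
# counts below are invented purely for illustration.

# citations[i][j] = citations from toy journal i to toy journal j (hypothetical data)
citations = np.array([
    [0, 30, 10],
    [20, 0, 5],
    [5, 15, 0],
], dtype=float)

# Each journal passes its prestige on to the journals it cites, in proportion to
# its citation counts; repeating this until the scores stabilise gives a prestige
# score per journal.
shares = citations / citations.sum(axis=1, keepdims=True)
prestige = np.ones(3) / 3              # start with equal prestige for every journal
for _ in range(100):                   # iterate until the scores stabilise
    prestige = prestige @ shares       # prestige flows along the citations
    prestige /= prestige.sum()         # keep the scores normalised

print(prestige.round(3))               # prestige-weighted score per toy journal
```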

The most recent SJR of a journal is available in Scopus, on the Source details page of the journal, or on the freely available website https://www.scopus.com/sources. The SJR website - https://www.scimagojr.com/ - shows the historical values of the SJR (from 1999).

See for example the SJR of the Journal of Happiness Studies.

The SJR is field normalized: the differences in citation practice between subject fields are evened out in the calculation. This means that you can use the SJR values to compare journals from different disciplines directly.

SNIP: Source Normalized Impact per Paper

SNIP is the ratio of a source's average citation count per paper and the citation potential of its subject field. SNIP measures a source’s contextual citation impact by weighting citations based on the total number of citations in a subject field. Links to a detailed description of the methodology can be found here. It is not possible to calculate the SNIP of a journal yourself.
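
As a rough illustration of the idea only (the numbers and the single 'citation potential' value below are hypothetical; the actual CWTS methodology is more involved):

```python
# Rough illustration of the idea behind SNIP: the raw citations per paper are
# divided by the citation potential of the journal's subject field. The numbers
# are hypothetical; the real CWTS methodology is more involved.

def snip(citations_per_paper: float, field_citation_potential: float) -> float:
    """A source's average citations per paper relative to its field's citation potential."""
    return citations_per_paper / field_citation_potential

# Two hypothetical journals with the same raw citations per paper:
print(snip(4.0, field_citation_potential=4.0))  # 1.0 in a field where citing is common
print(snip(4.0, field_citation_potential=2.0))  # 2.0 in a field where citing is rare
```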

The most recent SNIP of a journal is available in Scopus, on the Source details page of the journal, or on the freely available website https://www.scopus.com/sources. The CWTS Journal Indicators website - http://www.journalindicators.com/ - shows the historical values of the SNIP (from 2006).

See for example the SNIP of the Journal of Happiness Studies.

SNIP is field normalized: it takes into account the characteristics of the subject field, here defined as the set of documents citing that source. It considers the length of the reference lists, the speed at which citation impact matures, and the extent to which the database used covers the field’s literature. This means that you can use the SNIP to compare journals from different disciplines directly.

Be aware of bogus journal metrics!

Next to the Journal Impact Factor, CiteScore, SJR and SNIP you might encounter other journal level metrics, for example on the homepage of a journal or in an e-mail you receive from a publisher or editor.

Be aware that fake, bogus or predatory journal metrics also exist. Examples are the CiteFactor and JIFACTOR. An example of a journal with a bogus impact factor on its homepage can be found here. In most cases editors can submit their journal to obtain such an impact factor, and often they have to pay for it. These metrics are sometimes used by predatory publishers to give a predatory journal an official touch.

When in doubt, check whether you can find a detailed description of how the impact factor is calculated and whether the underlying data are available. The publisher Wiley has offered some guidelines on how to spot fake metrics. You can also ask the University Library; we can do a check as well.
