Bibliometrics can aid in benchmarking the research performance of individuals, institutions, departments or research centres against other comparable individuals, institutions, departments or research centres. As the image above suggests, it is important that “apples” are compared with “apples”, not with watermelons or blueberries.
There are a number of bibliometric tools that can be used for benchmarking. In this Thing, we will cover the two major benchmarking tools used in universities around the world, InCites and SciVal, introduce you to key metrics used in benchmarking exercises, and, in Challenge Me, we will briefly outline VOSviewer, a bibliometric network mapping tool.
Exercise: Search your library’s available online databases to find out if your institution uses these tools (you might have to search for “Scopus” and “Web of Science”, as these are the respective source databases).
As they are proprietary tools (InCites is owned by Clarivate and SciVal by Elsevier), you may find your university or research centre does not subscribe to them. However, it is good to get an understanding of what they can measure and benchmark as well as how they are used by institutions, departments etc.
InCites Benchmarking & Analytics is a customised, web-based research evaluation tool. InCites can be used to analyse institutional productivity, monitor collaboration activity, identify influential researchers, and discover areas of opportunity.
- Read the InCites LibGuide for beginners, “InCites benchmarking & analytics: learn the basics“.
- Watch “InCites benchmarking & analytics: quick tour” on YouTube to learn the basics about benchmarking in InCites.
SciVal offers quick, easy access to the research performance of thousands of research institutions from over two hundred nations worldwide. Using SciVal, researchers can visualise their research performance, benchmark themselves against peers, discover collaborative partnerships, and analyse research trends.
- Watch “Benchmarking in SciVal“, which gives you a quick introduction to using SciVal in this way.
- For more information on using SciVal in benchmarking exercises, read “Benchmarking with SciVal in scholarly communication and research services“.
- Finally, read Chapter 3, “Bibliometric analyses” (pages 9-12) of “Bibliometrics for research management and research evaluation” from Centre for Science and Technology Studies (CWTS), Leiden University.
Consider: How useful are SciVal and/or InCites for understanding or benchmarking the research in your institution? How does your organisation use benchmarking to rank groups or individuals?
It is important to choose metrics that allow you to confidently compare apples with apples and to be aware of the relative strengths and weaknesses of each one.
Read pages 7-20 of “Using SciVal responsibly: a guide to interpretation and good practice“. There is also a similar guide available for InCites.
Scholarly Output is the number of publications or outputs produced by a researcher, group of researchers, or institution during a specific time frame. Consider: What are the key strengths and weaknesses of the Scholarly Output indicator as outlined in the above document?
Citation Count is the number of citations received during a specific citation time frame by a research output or set of outputs.
Consider: What are the weaknesses of the citation count metric?
Field-Weighted Citation Impact (FWCI) is the ratio of the actual number of citations received by an output to date to the “expected” number of citations for an output with similar characteristics (same field, publication year and document type).
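To make the ratio concrete, here is a minimal illustrative sketch in Python. This is not Elsevier’s actual implementation: in SciVal the “expected” value is the world average for outputs of the same field, year and document type, whereas here it is simply passed in as a number.

```python
def fwci(actual_citations: float, expected_citations: float) -> float:
    """Field-Weighted Citation Impact for a single output.

    FWCI = actual citations / expected citations, where the expected
    value is the average for similar outputs (same field, year, type).
    A value of 1.0 means the output is cited exactly as often as the
    world average for comparable outputs.
    """
    if expected_citations <= 0:
        raise ValueError("expected citations must be positive")
    return actual_citations / expected_citations

# An article with 12 citations, in a field where comparable outputs
# average 8 citations, is cited 50% above the world average:
print(fwci(12, 8))  # 1.5
```

Note that an FWCI above 1.0 indicates above-average citation impact for the field, which is why it is safer than raw citation counts when comparing outputs from fields with very different citation cultures.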
Outputs in Top Citation Percentiles counts the number or percentage of outputs that fall into the top 1%, 5%, 10% or 25% of the world’s most highly cited outputs.
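The following sketch shows the general idea of a percentile cut-off. It is a simplification I am assuming for illustration: SciVal and InCites rank each output against world benchmarks for its field and year, whereas this example simply ranks outputs within the supplied sample.

```python
def top_percentile_outputs(citation_counts: list[int], percentile: int = 10) -> list[int]:
    """Return the citation counts of outputs in the top X% of the sample.

    Simplified: real benchmarking tools compare each output against
    field- and year-specific world thresholds, not just this sample.
    """
    n = len(citation_counts)
    # How many outputs qualify for the top X% (at least one).
    cutoff_count = max(1, n * percentile // 100)
    ranked = sorted(citation_counts, reverse=True)
    threshold = ranked[cutoff_count - 1]
    return [c for c in citation_counts if c >= threshold]

# Ten outputs; the top 10% is the single most-cited one:
counts = [3, 50, 7, 0, 12, 9, 21, 1, 4, 6]
print(top_percentile_outputs(counts, 10))  # [50]
```

Ties at the threshold are all included here; real tools document their own tie-breaking and threshold rules, so always check the metric definition before reporting.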
Publications in Top Journal Percentiles is similar to Outputs in Top Citation Percentiles, except that it measures impact at the journal level: it is the number or percentage of outputs published in the most highly cited journals within a field. This indicator is useful for understanding publishing behaviour.
Consider: What should you not use the Publications in Top Journal Percentiles indicator for?
Take a look at the following universities’ guidelines that provide advice on how to use different tools for benchmarking:
- Australian Catholic University (ACU) explains “Benchmarking”.
- Auckland University of Technology (AUT) provides “Institutions’ performance and benchmarking”.
Exercise: List at least three questions that can be answered at the individual researcher, department and/or research group level with benchmarking.
Consider: Imagine a situation where you were asked to create a benchmarking report for an academic or research department. What factors would you consider when finding an appropriate entity to compare them against? How would you present the report? Would you provide any context to the data?
Tools like SciVal and InCites will create some metric visualisations for you; however, there are also free third-party tools you can use to create more complex and specific visualisations.
VOSviewer is a bibliometric network mapping tool developed by CWTS at Leiden University. It can be used to visualise many different types of bibliometric networks, with journals, researchers, or individual publications as the nodes, and with links constructed from citation, bibliographic coupling, co-citation, or co-authorship relations. You can also create term maps.
Read “Bibliometric mapping for current and potential collaboration detection” to find out how term maps can be created in VOSviewer.
VOSviewer supports data from Scopus, Web of Science, PubMed and Dimensions. Dimensions is a citation database product created by Digital Science and has a free version that anyone can access.
Read “Discovering relationships between researchers and publications using Dimensions data just got a lot more colorful!” for instructions on how to use Dimensions data in VOSviewer.
Exercise: Have a go at exporting a dataset from Dimensions and importing it into VOSviewer to start mapping patterns of collaboration – what trends appear in the collaboration networks compared with those of a similar institution?