
Research Metrics

Article Level Metrics

 

Article level metrics are often used to indicate the impact of scholarly journal articles. The most common metrics are based on citations, such as times cited and other field-normalized metrics. Note that these metrics are only available for articles indexed in the specific citation databases used.

Recently, altmetrics, which capture the immediate response to publications in online environments (e.g., social media), have been gaining increasing attention. They may be useful for drafting an impact case study, which articulates the significance and reach arising from research beyond academia.

 

Times Cited

The citation count of an article (times cited) is the number of times it is included in the reference lists of other articles or books. The counts are available only for articles indexed in the citation databases used, and are specific to each database.
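To illustrate how database-specific these counts are, the sketch below (an illustration only, not part of any platform's official workflow) looks up the citation count for a single DOI in two openly accessible sources, the Crossref and OpenAlex public APIs; other databases such as Web of Science or Scopus will typically report different numbers for the same article.

```python
# A minimal sketch (for illustration only): look up the citation count for one
# article in two openly accessible sources. Counts usually differ between
# databases because each indexes a different set of citing works.
import requests

doi = "10.1162/qss_a_00146"  # any DOI will do; this one is cited later in this guide

# Crossref: "is-referenced-by-count" holds citations from Crossref-indexed works
crossref = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30).json()
print("Crossref times cited:", crossref["message"]["is-referenced-by-count"])

# OpenAlex: "cited_by_count" holds citations from OpenAlex-indexed works
openalex = requests.get(f"https://api.openalex.org/works/https://doi.org/{doi}", timeout=30).json()
print("OpenAlex times cited:", openalex["cited_by_count"])
```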

 

Use and limitations

Use
  • Compared with journal-based indicators (e.g., the journal impact factor), it offers information at the relevant level of granularity, i.e., the individual research article.

Limitations
  • There is no indication of whether articles are cited for positive or negative reasons.
  • Citation performance is a lagging indicator that takes years to turn into a robust signal (not useful for evaluating recent scholarship).
  • It is not well suited to comparing researchers at different career stages or in different disciplines.
  • Citation patterns can be skewed by author and journal reputations.

 

Access

Popular platforms include:

 

Reference

  • Declaration on Research Assessment (DORA). (2024). Guidance on the responsible use of quantitative indicators in research assessment. DORA. https://doi.org/10.5281/zenodo.10979644

 

Field-normalized Citation Indicators

Field-normalized citation indicators such as Category Normalized Citation Impact (CNCI) or Field Weighted Citation Impact (FWCI) represent attempts to correct for the citation variability arising from differences between fields, types, and ages of publications. In general:

  • A value of 1 is as expected for the world average.
  • Values above 1 are above average.
  • Values below 1 are below average.

These values are available only for articles indexed in the citation databases used, and are specific to each database.
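As a rough illustration of the idea behind these indicators (a simplified sketch, not the exact formula used by InCites or Scopus), each article's citation count is divided by the expected citations, i.e., the average citations of publications sharing its field, document type, and publication year. The field names and counts below are invented for demonstration.

```python
# A simplified, illustrative field-normalized calculation. Providers such as
# InCites (CNCI) and Scopus (FWCI) use their own subject categories and citation
# windows; the papers and counts below are invented for demonstration only.
from statistics import mean

papers = [
    {"field": "Oncology", "type": "Article", "year": 2020, "citations": 30},
    {"field": "Oncology", "type": "Article", "year": 2020, "citations": 10},
    {"field": "History",  "type": "Article", "year": 2020, "citations": 4},
    {"field": "History",  "type": "Article", "year": 2020, "citations": 2},
]

def expected_citations(paper, corpus):
    """Average citations of publications sharing the same field, type, and year."""
    peers = [p["citations"] for p in corpus
             if (p["field"], p["type"], p["year"]) == (paper["field"], paper["type"], paper["year"])]
    return mean(peers)

for p in papers:
    normalized = p["citations"] / expected_citations(p, papers)
    print(f'{p["field"]}: {p["citations"]} citations -> normalized impact {normalized:.2f}')
# 1.00 means exactly the average of comparable publications; 1.50 means 50%
# above average; 0.50 means 50% below average.
```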

 

Use and limitations

Use
  • They account for the field, document type, and age of publications in the calculation.

Limitations
  • It can be difficult to define which papers belong in which fields.
  • For datasets comprising only tens or hundreds of papers, such indicators are less reliable because of the impact of highly cited outliers.
  • They are mean-based indicators, which cannot represent the often highly skewed distribution of citations across publications (a few highly cited papers can pull the mean well above what is typical).

 

Access

Popular platforms include:

  • Category Normalized Citation Impact (CNCI)
    Platform: InCites (for Web of Science), http://find.lib.hku.hk/record=alma991044377799703414
    Guide: https://incites.help.clarivate.com/Content/Indicators-Handbook/ih-normalized-indicators.htm
  • Field Weighted Citation Impact (FWCI)
    Platform: Scopus, http://find.lib.hku.hk/record=alma991012748039703414
    Guide: https://service.elsevier.com/app/answers/detail/a_id/14894/supporthub/scopus/~/what-is-field-weighted-citation-impact-%28fwci%29%3F/

 

 

Reference

  • Declaration on Research Assessment (DORA). (2024). Guidance on the responsible use of quantitative indicators in research assessment. DORA. https://doi.org/10.5281/zenodo.10979644

 

Smart Citations by scite

Smart Citations use artificial intelligence to reveal how a scientific paper has been cited, by providing the context of each citation and a classification describing whether the citing paper provides supporting or contrasting evidence for the cited claim, or simply mentions it.
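To make the idea of citation-intent classification concrete, the toy sketch below uses simple keyword cues; scite's actual system uses a deep learning model trained on labelled citation statements (Nicholson et al., 2021), so this only illustrates the kind of output such a classifier produces.

```python
# A toy illustration of citation-intent classification. scite's actual system
# uses a deep learning model trained on labelled citation statements; this
# keyword heuristic only shows the kind of output such a classifier produces.
def classify_citation_statement(text: str) -> str:
    text = text.lower()
    supporting_cues = ("consistent with", "confirms", "in agreement with", "replicates")
    contrasting_cues = ("in contrast to", "contradicts", "fails to replicate", "unlike")
    if any(cue in text for cue in supporting_cues):
        return "supporting"
    if any(cue in text for cue in contrasting_cues):
        return "contrasting"
    return "mentioning"

statements = [
    "Our results are consistent with the findings of Smith et al. (2019).",
    "In contrast to Lee (2020), we observed no effect of dosage.",
    "Several article level metrics have been proposed (see Jones, 2018).",
]
for s in statements:
    print(classify_citation_statement(s), "-", s)
```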

 

Use and limitations

Use
  • It adds contextual information on how the citing paper used the citation (supporting, mentioning, or contrasting).

Limitations
  • The precision of the machine learning classification model is limited.
  • The coverage of articles analyzed by scite may be considered limited. It draws on open access repositories and a variety of open sources identified by Unpaywall, together with other relevant open access document sources; subscription articles are available through indexing agreements with over a dozen publishers.

 

Access

scite: http://find.lib.hku.hk/record=HKU_IZ61623882910003414

  1. Sign up for or log in to your personal account
  2. Enter title, author, keywords or DOI to search
  3. Locate the desired article and view the citations in context

    [Screenshot: scite search]

 

References

  • Nicholson, J. M., Mordaunt, M., Lopez, P., Uppala, A., Rosati, D., Rodrigues, N. P., Grabitz, P., & Rife, S. C. (2021). scite: A smart citation index that displays the context of citations and classifies their intent using deep learning. Quantitative Science Studies, 2(3), 882-898. https://doi.org/10.1162/qss_a_00146

  • Nicholson, J. M., Uppala, A., Sieber, M., Grabitz, P., Mordaunt, M., & Rife, S. C. (2021). Measuring the quality of scientific references in Wikipedia: an analysis of more than 115M citations to over 800 000 scientific articles. The FEBS Journal, 288(14), 4242-4248. https://doi.org/10.1111/febs.15608

     

Altmetrics (Alternative Metrics)

Altmetrics (alternative metrics) attempt to capture the amount of attention a research output has received in non-academic outlets. They have gained attention as online platforms such as social media, online reference managers, scholarly blogs, and online repositories have become deeply embedded in the system of scholarly communication. Different types of altmetric scores, which can be calculated for articles, books, data sets, presentations, and more, can be obtained from a range of commercial providers.

 

Use and limitations

Use
  • Details of the original mentions contributing to the altmetric scores can be useful for a broader examination of research contributions, e.g., highlighting citing policy documents to demonstrate social impact, or highlighting citing patents to demonstrate innovative impact. They may be useful for drafting an impact case study, which includes specific, high-magnitude and well-evidenced articulations of significance and reach arising from research beyond academia.

Limitations
  • Often presented as a composite score, it represents a weighted measure of all the attention picked up for a research output, not a raw total of the number of mentions (see the sketch after this list).
  • Some of the activities included, especially those associated with social media, can be prone to being gamed.
  • It provides little context for the type and purpose of engagement, and is therefore difficult to interpret in terms of broader research impact.
  • Older papers may be under-represented, as social media platforms are relatively new.
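To illustrate the first limitation above, the sketch below computes a weighted composite score from a handful of invented mention counts; the weights are hypothetical and are not the actual weights used by Altmetric or any other provider.

```python
# Illustration of a weighted composite score. The weights below are hypothetical
# and are NOT the weights used by Altmetric or any other provider; the point is
# only that the score is a weighted measure, not a raw total of mentions.
mentions = {"news": 2, "policy_documents": 1, "blogs": 3, "social_media_posts": 40}

assumed_weights = {"news": 8, "policy_documents": 3, "blogs": 5, "social_media_posts": 0.25}

raw_total = sum(mentions.values())
composite = sum(count * assumed_weights[source] for source, count in mentions.items())

print("Raw number of mentions:", raw_total)     # 46
print("Weighted composite score:", composite)   # 44.0
```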

 

Access

Altmetric Explorer: http://find.lib.hku.hk/record=alma991044668008803414

  1. Conduct a search with title, keywords, or DOI within the full Altmetric database
     [Screenshot: Step 1, click Edit Search to run a search]
  2. Identify the desired research output and open its Altmetric details page
  3. View the Attention Score and the sources, e.g., social media, news, policies, and patents
     [Screenshot: Step 3, click the tabs to view details]
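The same Attention Score and per-source breakdown can also be retrieved programmatically. The sketch below uses Altmetric's public Details Page API, which at the time of writing allows limited free lookups (larger-scale use requires an API key); the exact response field names assumed here should be checked against the current documentation.

```python
# A sketch of retrieving the same details programmatically via Altmetric's
# public Details Page API (rate-limited; larger-scale use needs an API key).
# Field names below ("score", "cited_by_*_count") reflect the API's documented
# output but should be verified against the current documentation.
import requests

doi = "10.1057/s41599-020-0394-7"  # example DOI, taken from the reference list below
resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=30)

if resp.status_code == 200:
    data = resp.json()
    print("Altmetric Attention Score:", data.get("score"))
    # Per-source mention counts are reported in fields named cited_by_*_count
    for key, value in sorted(data.items()):
        if key.startswith("cited_by_") and key.endswith("_count"):
            print(key, value)
else:
    print("No Altmetric record found for this DOI (HTTP status", resp.status_code, ")")
```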

 

 

Reference

  • Declaration on Research Assessment (DORA). (2024). Guidance on the responsible use of quantitative indicators in research assessment. DORA. https://doi.org/10.5281/zenodo.10979644
  • Mingers, J., & Leydesdorff, L. (2015). A review of theory and practice in scientometrics. European Journal of Operational Research, 246(1), 1-19. https://doi.org/10.1016/j.ejor.2015.04.002
  • Reichard, B., Reed, M. S., Chubb, J., Hall, G., Jowett, L., Peart, A., & Whittle, A. (2020). Writing impact case studies: a comparative study of high-scoring and low-scoring case studies from REF2014. Palgrave Communications, 6(1), 31. https://doi.org/10.1057/s41599-020-0394-7