Tell Your Best Story:

Teaching Bibliometrics at the University of Manitoba

Cody Fullerton

ND-MB ACRL Symposium

May 13, 2019

Background

  • Began teaching bibliometrics in Summer 2016
  • UManitoba Faculty Strike, Fall 2016
  • New Language in Collective Agreement
  • Interest in Metrics dropped
  • Change of Game Plan
  • Results

Strike and Collective Agreement

  • Faculty went on strike for a number of reasons:
    • Workload Protection
    • Research Metrics (Performance Indicators)
    • Job Security
    • Salary was taken off the table

Strike and Collective Agreement

  • 22 Days Later...
    • Among many other wins, UMFA bargained to include a restriction on Performance Indicators:
      • 20.B.1.6.4  "Research Metrics shall not be used as a substitute for a more comprehensive assessment of quality and quantity."

The Fallout

  • Faculty took this as a huge win (as they should)
  • But, interest in metrics training took a steep dive
  • Faculty thought:
    • Why do I need to know about metrics when I'm protected from them now?
      • (A reasonable thought)

Change of Game Plan

  • How can we pique faculty interest in metrics?
  • What do they need to know and often seek outside assistance with?
  • How can the library change its role around metrics?

Our Answer:

Grant Applications

Objectives

  • Understand the difference between traditional metrics and altmetrics
  • Use various metric tools to collate data that is useful to your professional narrative
  • Reviewing your CV, learn to identify the research products that most strongly represent both your impact and your research
  • Using a template, write a brief summary that incorporates the metrics you have gathered and/or analyzed

Recommendations

  • Avoid prestige metrics, such as journal-level metrics, unless they are descriptive, like “Top 1% or 10% ranked journal” (found in SciVal)
  • Use field-weighted metrics whenever possible as they are designed to be comparative (e.g. CiteScore, SNIP, SJR)
  • The H-index is based on natural numbers and assumes a substantial body of work to generate the index; it cannot be used for comparisons, even within a field of research, and is ineffective for graduate students (see the sketch after this list)
  • Use altmetrics that are descriptive:
    • To convey scholarly uptake, use Scopus/SciVal Views, Mendeley reader counts
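
For reference (not part of the original slides), here is a minimal Python sketch of how the H-index is computed from per-paper citation counts; the citation numbers in the example are hypothetical:

  def h_index(citation_counts):
      # Largest h such that h papers each have at least h citations
      h = 0
      for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
          if cites >= rank:
              h = rank
          else:
              break
      return h

  print(h_index([25, 8, 5, 3, 0]))  # -> 3 (the 4th-ranked paper has only 3 citations, fewer than 4)

Because the index can only grow with more papers and more citations, a graduate student's small body of work will almost always yield a very low value, which is why the recommendation above cautions against it.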

"When using publication data for any purpose, it is essential to go beyond the numbers to create a narrative that provides contextual background as to productivity and impact. Crafting a narrative based on publication data depends on the intended purpose."

Carpenter, Cone and Sarli. 2014. doi: 10.1111/acem.12482

Exercise - Metrics

  1. Review a CV for publication list (can be your own)
  2. Locate some of the following metrics:
    • Citation Counts - Scopus, Web of Science
    • Percentile Benchmark - Scopus
    • CiteScore - Scopus
    • SNIP - Scopus
    • Impact Factor (5 Years) - Web of Science
    • Journal Ranking (top 1%, 10%, or Category) - Web of Science
  3. Record the Values (a programmatic sketch follows this exercise)
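
As a rough programmatic counterpart to step 2 (not part of the original exercise, and using the free Crossref REST API rather than the subscription Scopus/Web of Science interfaces), citation counts can be pulled by DOI; the DOI below is the one cited earlier in this deck (Carpenter, Cone and Sarli 2014):

  import requests

  def crossref_citation_count(doi):
      # Crossref reports citations in the "is-referenced-by-count" field
      resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
      resp.raise_for_status()
      return resp.json()["message"].get("is-referenced-by-count", 0)

  print(crossref_citation_count("10.1111/acem.12482"))

Note that Crossref, Scopus, and Web of Science index different source sets, so their counts will differ.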

Exercise - Altmetrics

  1. Review a CV for publication list (can be your own)
  2. Locate some of the following metrics (if possible):
    • Scholarly Activity; Scholarly Commentary; Social Activity; Mass Media
    • Recall that you can find these, per publication, in Scopus (via the Plum Print), ImpactStory, individual sources (e.g. a repository or publisher record), or Altmetric (via the bookmarklet on a bibliometric source)
  3. Record the Values (see the sketch below)
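
Similarly (again not part of the original exercise), the free Altmetric details API can surface some of these counts by DOI; the response field names used here are assumptions and may need adjusting:

  import requests

  def altmetric_summary(doi):
      resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=30)
      if resp.status_code == 404:  # Altmetric has no record for this DOI
          return None
      resp.raise_for_status()
      data = resp.json()
      # Field names below are assumptions about the Altmetric payload:
      return {
          "news_mentions": data.get("cited_by_msm_count", 0),
          "tweets": data.get("cited_by_tweeters_count", 0),
          "mendeley_readers": data.get("readers", {}).get("mendeley", 0),
      }

  print(altmetric_summary("10.1111/acem.12482"))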

Writing a Metrics Narrative (1)

“My contribution to the (x) papers produced during the time I worked in (x lab) consisted of [list technical contribution]. Collectively, these (x) papers have been cited over (x) time(s) in [Scopus/Web of Science].”

Writing a Metrics Narrative (2)

“This project produced (x) non-peer-reviewed articles, (x) reports for (y) purpose, (x) magazine articles, (x) newspaper articles, (x) presentations for (y) audiences, and (x) book chapters.”

Writing a Metrics Narrative (3)

“My most highly cited work (brief cite) has been cited (x) times, viewed (x) times by online readers with (x) full text downloads. Additionally, the work has been referred to by news media outlets (x) times; tweeted (x) times worldwide; saved in (x) Mendeley accounts and commented on (x) times in blog sources.”

Results

  • Grad Students were very interested and "GradSteps" sessions were well attended
  • Faculty were also more interested, but wanted more one-on-one attention
  • Still attempting to get rid of the stigma surrounding bibliometrics
  • Proselytizing about ORCID, which is becoming more active at UManitoba

Other Resources

Questions?
