Metrics


By Simon Hettrick

The panel Chair commented that this was the best-argued and most unequivocal case for funding he had seen in any large-grant-style panel.

Feedback on phase 2

  • Reviewed our stakeholders

  • Focussed on key activities

  • Measured our effectiveness on these activities

Now...

Next...

  • Review our effectiveness metrics

  • Start measuring impact

  • Report at Advisory Board in November

Effectiveness themes

  1. Our appeal is widespread

  2. We provide the knowledge needed to change software practices

  3. Software problems require a diverse organisation

  4. We are the established leader in research software

Effectiveness vs Impact

Example problems

Popularity

  1. Invest in researching popularity of similar organisations
  2. Invest in determining usefulness of content

Power of vox pops?

"I gave them a problem that I didn't know how to start answering, and they stepped in without a problem."

 

"It's really made a profound change to the way we do coding"

 

"the training material has been immensely beneficial"

1. Are these compelling?

2. Do we perform in-depth analysis of our impact?

Events

66% of attendees report an increase in confidence.

28% report that their confidence increased greatly.

1. Is this sufficient?

2. Support with case studies?

3. Run a regular follow-up survey?

Citations


"About 70% [of researchers] say their research would be impractical without [software]"

  1. Use page hits as our success metric?

  2. Start writing papers? (Many provisos here)

  3. Develop a blog system that automatically creates DOIs?

Pitfalls and balancing acts

Time spent <> Quality

Metrics for us <> Metrics for funders

Effectiveness <> Impact

Precision <> Clarity

20170518 Metrics report at Advisory Board meeting

By Simon Hettrick
