
IREG Forum on University Rankings

16-17 May 2013

Dr Lisa Colledge

Snowball Metrics Program Director

l.colledge@elsevier.com, www.snowballmetrics.com

Snowball Metrics



Snowball Metrics are…

• Endorsed by a group of distinguished UK universities to support their strategic decision making

• Tried and tested methodologies that are available free-of-charge to the higher education sector

• Clear, unambiguous definitions that enable apples-to-apples comparisons, so universities can benchmark themselves against their peers and judge the excellence of their performance

Snowball Metrics are unique because:

• Universities drive this bottom-up

• Academia–industry collaboration



Trends in Research Management


Growing recognition of the value of data/metrics to inform and monitor research strategies, to complement but not replace existing methods

“Unless you have [data] you cannot make informed decisions; you would be acting based on opinions and hearsay.”

Frustration over the lack of a manageable set of standard metrics for sensible measurements

“[There is little] thought leadership and knowledge development around best practice.”

Frequent similar data requests from external bodies looking at performance in a way that is not necessarily of most value to universities themselves

“The principal drivers for our systems are often external… but they shouldn’t be… a research strategy should… be developed… to respond to our strengths and the external environment, our systems should be defined to run our business.”


University-driven (bottom-up) benchmarking is very important


This report recommended that universities and funders should work more collaboratively, and develop stronger relationships with suppliers

Universities need to benchmark themselves to know their position relative to their peers, so they can strategically align resources to their strengths and weaknesses

“Universities should work together more to make their collective voice heard by external agencies.”

“The lack of a long-term vision makes it hard to… co-operate within a university let alone across the sector.”

“Suppliers do not know what research offices do on a daily basis.”

“How educated are we at asking suppliers the right questions?”


Snowball Metrics address university-driven benchmarking

“Someone needs to take ownership of the process: it is impossible to please all of the people all of the time so somebody needs to be strong enough to stand behind decisions and follow through.”


Snowball Metrics Project Partners

“It would be great if the top five [universities] could collaborate”


The project partners…

• Agree a pragmatic approach from the point of view of the research office

• Endorse metrics to generate a dashboard that supports university strategy

• Draw on and combine university, proprietary and third party / public data

• Ensure that the metrics can be calculated, and in the same way, by universities with different systems and data structures



Main roles and responsibilities

• Everyone is responsible for covering their own costs

• University project partners
– Agree the metrics to be endorsed as Snowball Metrics
– Determine methodologies to generate the metrics in a commonly understood manner to enable benchmarking, regardless of systems

• Elsevier
– Ensure that the methodologies are feasible when applied to real data, prior to publication of the recipes to share with the sector
– Distribute the recipes using our communications networks
– Manage the day-to-day running of the global program

• Outside the remit of the Snowball Metrics program
– Nature and quality of data sources used to generate Snowball Metrics
– Provision of tools to enable the global sector to generate and use Snowball Metrics



Snowball Metrics are feasible



Metrics can be size-normalised
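The body of this slide is an image. As a minimal sketch of what size-normalisation means here, the example below divides a raw count by a size denominator so that institutions of different scale become comparable; the institution names, counts and FTE figures are all invented for illustration.

```python
# Size-normalisation: divide a raw metric by a measure of institutional size.
# All names and figures below are invented, for illustration only.
institutions = {
    "University A": {"scholarly_output": 5200, "academic_fte": 2600},
    "University B": {"scholarly_output": 1800, "academic_fte": 600},
}

for name, data in institutions.items():
    per_fte = data["scholarly_output"] / data["academic_fte"]
    print(f"{name}: {per_fte:.2f} outputs per academic FTE")

# University A: 2.00 outputs per academic FTE
# University B: 3.00 outputs per academic FTE
# Raw counts favour University A; per FTE, University B is more productive.
```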



Metrics can be “sliced and diced”
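This slide is also an image. The sketch below shows what “slicing and dicing” means in practice: breaking one overall count down by a dimension such as discipline. The records and field names are invented for illustration.

```python
from collections import Counter

# "Slicing and dicing": break one overall metric down by a dimension.
# Records and field names are invented, for illustration only.
publications = [
    {"title": "Paper 1", "discipline": "Medicine"},
    {"title": "Paper 2", "discipline": "Physics"},
    {"title": "Paper 3", "discipline": "Medicine"},
    {"title": "Paper 4", "discipline": "Engineering"},
]

output_by_discipline = Counter(p["discipline"] for p in publications)
print(output_by_discipline)
# Counter({'Medicine': 2, 'Physics': 1, 'Engineering': 1})
```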



Recipe Book shares the methods with the sector free of charge


Input Metrics
- Applications Volume
- Awards Volume

Process Metrics
- Income Volume
- Market Share

Output Metrics
- Scholarly Output
- Citation Count
- h-index
- Field-Weighted Citation Impact
- Publications in Top Percentiles
- Collaboration

First set of Snowball Metrics: university and discipline levels only

www.snowballmetrics.com/metrics
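Of the output metrics above, the h-index is the simplest to illustrate: it is the largest h such that at least h publications have h or more citations (Field-Weighted Citation Impact, by comparison, is the ratio of actual citations to the citations expected for publications of the same field, type and age). A minimal sketch of the standard calculation, with invented citation counts:

```python
def h_index(citation_counts):
    """Largest h such that at least h publications have >= h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

# Invented citation counts, for illustration only.
print(h_index([12, 9, 7, 7, 4, 2, 1]))  # -> 4
```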


Elsevier and Snowball Metrics


Declaration from the project partners

Agreed and tested methodologies… are and will continue to be shared free-of-charge

None of the project partners will at any stage apply any charges for the methodologies

Any organisation can use these methodologies for their own purposes, public service or commercial. (Extracts from the Statement of Intent, October 2012)

Universities are also requesting the provision of calculated metrics

Some organisations do not want to use the recipe book themselves, and are approaching Elsevier for help to implement and use Snowball Metrics

Elsevier charges for its support in this, and can offer the metrics in a Custom Report and in Pure (its Current Research Information System)

We plan to continue to build commercial tools to help any universities who want to adopt Snowball Metrics but prefer not to generate them in house


Global benchmarking


The framework spans three activity areas, each with metrics for Research Inputs, Research Processes, and Research Outputs and Outcomes:

Research
- Inputs: Research applications, Research awards
- Processes: Research income
- Outputs and outcomes: Publications & citations, Collaboration (co-authorship), Impact / Esteem

Post-Graduate Education
- Inputs: Post-graduate research
- Processes: Post-graduate experience
- Outputs and outcomes: Completion rates

Enterprise Activities
- Inputs: Industrial income and engagement
- Processes: Contract turnaround times, Industry research income
- Outputs and outcomes: Patenting, Licensing income, Spin-out generation / income

These numerators can be “sliced and diced” and normalised for size using denominators:
- People: Researchers, Role
- Organisations: Institution, Institutional unit, External groupings, Funder type
- Themes / Schemes: Award type, Subject area / keywords

Vision: Snowball Metrics drive quality and efficiency across higher education’s research and enterprise activities, regardless of system and supplier


Achieving the vision

• Continue to agree, test and share new metrics to illuminate research and enterprise activities

• Facilitate adoption by the sector by “translating” the metrics into standard data formats that can be easily understood by systems (a hypothetical sketch of such a format follows this list)

• Ensure that Snowball Metrics support global benchmarking as far as possible
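The deck does not show what such a standard data format looks like, so the snippet below is purely a hypothetical sketch of how one metric value might be represented for exchange between systems; every field name and value is an assumption, not a published Snowball Metrics format.

```python
import json

# Hypothetical record shape only; NOT a published Snowball Metrics format.
metric_record = {
    "metric": "Scholarly Output",         # metric name from the recipe book
    "institution": "University A",        # invented institution
    "slice": {"discipline": "Medicine"},  # a "slice and dice" dimension
    "period": {"year": 2012},
    "value": 1450,                        # invented value
}

print(json.dumps(metric_record, indent=2))
```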


Vision: Snowball Metrics drive quality and efficiency across higher education’s research and enterprise activities, regardless of system and supplier


Global vs national standards for benchmarking

Snowball Metrics start life with a national perspective – currently the UK. The aim is to “promote” all aspects of Snowball Metrics as far as possible to a global standard


[Venn diagram of UK metrics, Country 1 metrics and Country 2 metrics – illustrative only, testing underway]

- Common core: where benchmarking against global peers can be conducted. The aim is to make this as big as possible.
- Shared features: where benchmarking between Countries 1 and 2, but not the UK, can be conducted, e.g. regional benchmarking.
- National peculiarities: can support benchmarking within Country 1, but not globally, i.e. national benchmarking.


Possible end point of a metric


Version enabling global benchmarking

• e.g. Discipline represented by a universal journal classification

Multiple versions enabling regional benchmarking

• e.g. Discipline represented by subject mapping of a regional body

Multiple versions enabling national benchmarking

• e.g. Discipline represented by the UK’s HESA Cost Centres

(Increasing circle of peers for benchmarking)

Illustrative only, testing underway
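To make the idea concrete, the sketch below counts the same publications under two discipline classifications, a hypothetical universal journal scheme and a hypothetical national scheme; the circle of peers for the resulting metric is limited to those who share the scheme used. All mappings and records are invented.

```python
from collections import Counter

# The same publications mapped to disciplines under two classification schemes.
# Both schemes and all records are invented, for illustration only.
universal_scheme = {"Journal X": "Physics & Astronomy",
                    "Journal Y": "Clinical Medicine"}
national_scheme = {"Journal X": "Cost Centre A (Physics)",
                   "Journal Y": "Cost Centre B (Clinical Medicine)"}

publications = [{"journal": "Journal X"},
                {"journal": "Journal X"},
                {"journal": "Journal Y"}]

for scheme_name, scheme in [("universal", universal_scheme),
                            ("national", national_scheme)]:
    counts = Counter(scheme[p["journal"]] for p in publications)
    print(scheme_name, dict(counts))
```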


THANK YOU FOR YOUR ATTENTION!

Contact Dr Lisa Colledge: l.colledge@elsevier.com or [email protected]