Looking to the future: closing the gaps in our assessment approach

Transcript of the presentation

ORCID - CASRAI, May 18-19 2015, University of Barcelona

Paul Wouters

2

Outline

• Diagnosis: current state in the organisation of science and scholarship

• Four key problem areas

• Searching for solutions

• A look at the future

3

Diagnosis

7

• A severe imbalance between the dollars available for research and the still-growing scientific community in the United States.

• The training pipeline produces more scientists than relevant positions in academia, government, and the private sector can absorb.

• Hyper-competition for the resources and positions that are required to conduct science suppresses the creativity, cooperation, risk-taking, and original thinking required to make fundamental discoveries.

• Overvaluing translational research detracts from an equivalent appreciation of fundamental research of broad applicability.

• As competition for jobs and promotions increases, the inflated value given to publishing in a small number of so-called “high impact” journals has put pressure on authors to rush into print, cut corners, exaggerate their findings, and overstate the significance of their work.

• Today, time for reflection is a disappearing luxury for the scientific community.

• The quality of evaluation has declined.

8

Research leaders face key questions

• How should we monitor our research?

• How can we profile ourselves to attract the right students and staff?

• How should we divide funds?

• What is our scientific and societal impact?

• What is actually our area of expertise?

• How is our research connected across disciplines?

9

Research leaders need more, not less, strategic intelligence

• Increasing demand for information about research:

– hyper-competition for funding

– globalization

– industry-academic partnerships

– interdisciplinary research challenges

– institutional demands on research & university management

• Increased supply of data about research:

– web-based research

– deluge of data-producing machines and sensors

– increased social scale of research: international teams

– large scale databases of publications, data, and applications

10

Four main problems in current academic research

• The funding system

• The career structure

• The publication system

• The evaluation system

11

Funding system

• level of funding

• balance between project and infrastructure funding

• balance between blue sky and focused funding

• relationship between research and teaching

• one size fits all?

12

Career structure

• PhDs and postdocs as cheap labour

• hyper-competition

• mismatch between training and job opportunities

• lack of dual careers

• emerging separation between researchers and teachers

• increasing inequalities

• lack of diversity in workforce (this may be improving)

13

Growth of scientific literature

Price (1963); Larsen & Von Ins (2010)
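As a hedged illustration of the growth these studies describe (the specific doubling time is an assumption that varies by field and period; Price's classic estimates were on the order of 10-15 years), the underlying exponential model is:

\[ N(t) = N_0 \, 2^{t/T_d} \]

where \(N_0\) is the initial size of the literature and \(T_d\) the doubling time; for example, \(T_d = 15\) years corresponds to an annual growth rate of roughly \(2^{1/15} - 1 \approx 4.7\%\).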

14

Codification by publication

• The publication system is the basis for communication, teaching and codification

• Hence, all evaluation systems in science and scholarship are in the end based on publications

• Commercial interests have been able to use the publication system as a source of vast profits

• Publishing for the smallest audience possible?

• Evaluation systems have developed on the basis of information (systems) about these publications:

– peer review in various formats

– scientometrics and bibliometrics

➡ discrepancy between evaluation criteria and the social and economic functions of science

➡ evaluation methods (esp. qualitative) have not adapted to increased scale of research

➡ available quantitative measures are often not applicable at the individual level

➡ lack of recognition for new types of work that researchers need to perform

Evaluation Gap

Evaluations are liminal

One often has the feeling that there should have been a clear-cut plan for the purpose and process of an evaluation, but this is often not the case. (…) people realize too late that they had very different notions of plans for evaluation (…) The purpose of the evaluation constitutes an ongoing controversy rather than a common logical starting point.

(p. 15)

Evaluation Machines

• Primary function: make stuff auditable

• Mechanization of control – degradation of work and trust? (performance paradox)

• Risks for evaluand and defensive responses

• What are their costs, direct and indirect?

• Microquality versus macroquality – lock-in

• Goal displacement & strategic behaviour

Constitutive effects

• Limitations of conventional critiques (e.g. ‘perverse or unintended effects’)

• Effects:

– Interpretative frames

– Content & priorities

– Social identities & relations (labelling)

– Spread over time and levels

• Not a deterministic process

• Democratic role of evaluations

Effects of indicators

• Intended effect: behavioural change

• Unintended effects:

– Goal displacement

– Structural changes

• The big unknown: effects on knowledge?

• Institutional rearrangements

• Does quality go up or down?

20

Searching for solutions

21

New trends in assessment

• Increased bibliometric services at university level available through databases

• Increased self-assessment via “gratis bibliometrics” on the web (h-index, Publish or Perish, etc.; see the h-index sketch after this list)

• Emergence of altmetrics

• Increased demand for bibliometrics at the level of the individual researcher

• Societal impact measurements required

• Career advice – where to publish?
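As a concrete illustration of the self-assessment metrics mentioned above, here is a minimal sketch of the h-index computation; the citation counts are made-up example values:

```python
def h_index(citations):
    """Largest h such that at least h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Illustrative example: five papers with these citation counts give h = 4.
print(h_index([10, 8, 5, 4, 3]))
```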

23

CWTS Advanced Analytics

• Tailor-made analysis based on network analysis, text mining and visualisation techniques (a minimal sketch follows this list)

• Research strengths analysis

• Find blind spots/hot spots

• Identification of partners/potential new staff

• Enhanced collaborative network analysis
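As referenced above, a minimal sketch of the kind of term co-occurrence network that underlies such analyses; the documents, term list, and thresholds here are illustrative assumptions, whereas real analyses use full publication records and much larger vocabularies:

```python
# Build a small weighted term co-occurrence network (the basis of
# science-mapping visualisations); clustering and density maps would
# be layered on top of such a network.
from itertools import combinations
from collections import Counter

import networkx as nx  # third-party; pip install networkx

documents = [
    "citation analysis of clinical neurology research",
    "altmetrics and citation impact of neurology journals",
    "research assessment and citation indicators",
]
terms = ["citation", "neurology", "altmetrics", "assessment", "indicators"]

# Count how often each pair of terms appears in the same document.
cooccurrence = Counter()
for doc in documents:
    present = [t for t in terms if t in doc]
    for pair in combinations(sorted(present), 2):
        cooccurrence[pair] += 1

G = nx.Graph()
for (a, b), weight in cooccurrence.items():
    G.add_edge(a, b, weight=weight)

print(G.edges(data=True))
```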

24

Citation density map: clinical neurology

25

Strengths and weaknesses: University Profiles (Leiden)

26

Coverage by fields: citations vs. Mendeley readership (Source: Rodrigo Costas)

27

Coverage by fields (Source: Rodrigo Costas)

• Other relevant patterns:

– Twitter: stronger in Social Sciences and General Medicine, weaker in Natural Sciences and Humanities

28

Coverage by fields (Source: Rodrigo Costas)

• Blogs and news media have a strong focus on multidisciplinary journals!
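A minimal sketch of how such a coverage-by-field figure can be computed: the share of publications per field with at least one citation and at least one Mendeley reader. The records below are invented for illustration; real input would come from a bibliometric database joined with an altmetrics source.

```python
import pandas as pd

# Hypothetical publication records with field, citation counts, and
# Mendeley readership; purely illustrative values.
records = pd.DataFrame({
    "field": ["Social Sciences", "Social Sciences", "Natural Sciences",
              "Natural Sciences", "Humanities"],
    "citations": [0, 5, 12, 3, 0],
    "mendeley_readers": [4, 7, 9, 0, 0],
})

# Coverage = fraction of publications in a field with at least one event.
coverage = records.groupby("field").agg(
    citation_coverage=("citations", lambda s: (s > 0).mean()),
    mendeley_coverage=("mendeley_readers", lambda s: (s > 0).mean()),
)
print(coverage)
```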

29

CWTS Monitor - Meaningful Metrics

• A new, interactive way of doing bibliometric analyses

• Powerful web-based application:

– User-friendly reporting interface

– Robust cleaned WoS database run by CWTS

– Fair and correct benchmarking with state-of-the-art indicators (an example indicator formula follows this list)

– Highly configurable to client’s specific needs

• Professional bibliometric reporting in your hands

• Scientists affiliated with CWTS at Leiden University provide expert support
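The slide does not name specific indicators; as one hedged example of the field-normalised benchmarking CWTS is known for, the mean normalized citation score (MNCS) compares each publication with its field and year:

\[ \mathrm{MNCS} = \frac{1}{n}\sum_{i=1}^{n} \frac{c_i}{e_i} \]

where \(c_i\) is the number of citations of publication \(i\) and \(e_i\) the expected (world-average) number of citations for publications of the same field, publication year, and document type; an MNCS of 1 means citation impact at the world average.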

30

CWTS Monitor: Select-Visualise-Conclude

31

What can we do with altmetrics?

• Indicators for evaluation of scientific performance or societal impact?

– Mendeley – strongest potential!

– Twitter, blogs, news media, etc. – Not yet there…

• Two analytical opportunities:

– Social media reception/interest of research topics

– Audiences/communities around topics and scientific publications

Source: Rodrigo Costas

The aim is to give researchers a voice in evaluation:

➡ evidence-based arguments
➡ shift to a dialogue orientation
➡ selection of indicators
➡ narrative component
➡ Good Evaluation Practices
➡ envisioned as a web service

ACUMEN Portfolio: career narrative, expertise, output, influence

Career Narrative: links expertise, output, and influence together in an evidence-based argument; the included content is negotiated with the evaluator and tailored to the particular evaluation

Output
- publications
- public media
- teaching
- web/social media
- data sets
- software/tools
- infrastructure
- grant proposals

Expertise
- scientific/scholarly
- technological
- communication
- organizational
- knowledge transfer
- educational

Influence
- on science

- on society

- on economy

- on teaching

Evaluation Guidelines - aimed at both researchers and evaluators
- development of evidence-based arguments (what counts as evidence?)
- expanded list of research output
- establishing provenance
- taxonomy of indicators: bibliometric, webometric, altmetric
- guidance on use of indicators
- contextual considerations, such as stage of career, discipline, and country of residence

34

Looking at the future

36

Circulation of knowledge

• focus not only on knowledge production but perhaps even more on the use of existing knowledge

• differentiate the mission and roles of knowledge institutions

• create regional knowledge centres

• reform education and teach new “21st century skills”

• invest in quality of teaching and training

• restructure the way we work: combine work with learning

41

Key challenges in research information system building

• Will the information infrastructure contain high-quality data and indicators?

• Will it enable and support context- and mission-sensitive research assessments?

• Will it enable application of research information for primary research purposes (e.g. in VREs)? (A minimal sketch of retrieving such information via the ORCID public API follows this list.)

• Will the public sector remain master in its own house or will it hand over control to the private sector?

• Will it be possible to truly open up the research agenda to all stakeholders – open science in a democratic society?
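As referenced above, a hedged sketch of pulling structured research information from the ORCID public API (v3.0). The ORCID iD used is the well-known example record 0000-0002-1825-0097, and the JSON field names follow the public API layout as I understand it; both should be checked against the current documentation.

```python
import requests

ORCID_ID = "0000-0002-1825-0097"  # illustrative example ORCID iD
url = f"https://pub.orcid.org/v3.0/{ORCID_ID}/works"

# The public API returns JSON when asked for it explicitly.
response = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
response.raise_for_status()

# Works are grouped by external identifier; take the first summary per group.
for group in response.json().get("group", []):
    summary = group["work-summary"][0]
    title = summary["title"]["title"]["value"]
    pub_date = summary.get("publication-date") or {}
    year = (pub_date.get("year") or {}).get("value", "n.d.")
    print(year, title)
```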

42

https://vimeo.com/127723908

43

A look at the future

• universities have developed robust and reliable information systems, with coupled databases and robust identifiers for objects and people

• publication, citation data and all meta data about research are freely available (companies give these data away as part of their business model)

• the current publication system is replaced by a web-based open access platform – the role of the journal has radically changed (and there are far fewer journals around); the role of books has been upgraded

• universities have developed effective HRM policies with serious career prospects and less oversupply of cheap labour. Not every researcher wants to become a professor anymore (or we create many more types of professorships)

• all universities and research institutes have adopted the Leiden Manifesto for Research Metrics: http://www.nature.com/news/bibliometrics-the-leiden-manifesto-for-research-metrics-1.17351

• research evaluation has become routine, but also context- and mission-sensitive and a crucial instrument in research itself – the current one-size-fits-all approach is abandoned: the evaluation machines serve research and not the other way around

• most global university rankings have disappeared or have transformed themselves into serious information services

• research funding consists of four components, each with different criteria and goals: scalable infrastructure; innovative projects; heritage funding; and applied and teaching-based research

44

Thank you for your attention

[email protected]