Linking policy initiatives to available data
Assessment of scholarly activity in SSH and Law in a new perspective
Thed van Leeuwen
Centre for Science and Technology Studies (CWTS), Leiden University
Post-Conference Seminar, Vilnius, Lithuania
September 25th, 2013
Outline
• Policy context
• Proposed solutions
• Case study in a Dutch university
• Linking it together!
• Conclusions, discussion, and future steps
Policy context
Overview of the organization of Dutch research evaluation
Standard Evaluation Protocol (SEP – 2003, 2009)
• Association of Dutch Universities (VSNU)
• National Research Council (NWO)
• Royal Dutch Academy of Sciences (KNAW)
Judging Research on its Merits (2005)
Report “Quality indicators for research in the Humanities” (Committee on quality indicators for the humanities, November 2011).
Report “Towards a framework for the quality assessment of social science research” (Committee on quality indicators for the social sciences, March 2013).
Key issues that were addressed in both reports:
– How to deal with heterogeneity? [without ‘standardizing’ it away]
– Publication cultures
– Societal relevance
Proposed solutions
Quality indicators for research in the Humanities

Quality aspect: Scholarly output
• Assessment criterion: Scholarly publications
  – Articles
  – Monographs
  – Chapters in books
  – Dissertations
  – Other output
• Assessment criterion: Scholarly use of output
  – Reviews
  – Citations
  – Other evidence of use
• Assessment criterion: Evidence of scholarly recognition
  – Scholarly prizes
  – Personal grants
  – Other evidence of recognition

Quality aspect: Societal quality
• Assessment criterion: Societal publications
  – Articles in specialist publications
  – Monographs for a wider public
  – Chapters in books for a wider public
  – Other societal output
• Assessment criterion: Societal use of output
  – Projects in collaboration with civil-society actors
  – Contract research
  – Demonstrable civil-society effects
  – Other evidence of use
• Assessment criterion: Evidence of societal recognition
  – Societal prizes
  – Other evidence of societal recognition
A case study in a Dutch University
Bibliometric analysis of output in a Dutch university: A case study on research output ‘04-’09
• Scientific disciplines cover medicine, social sciences, law, philosophy, history, and economics & business.
• Publication data: internal output registration system (METIS), covering 2004-2009.
• Various types of scientific output were included.
• Purpose of the study: to analyze the ‘impact’ of the university.
Difference between the internal registration system & representation in WoS
• Dominance of the university hospital in the WoS realm is extremely visible
• Law and Humanities ‘disappear’ in the WoS realm
[Figure: share of output per unit — (bio)medicine, social sciences, philosophy, history, law, School of Management, School of Economics — for all publications versus WoS publications]
Composition of the output of the university in METIS: the external coverage of a university
• The category General is in some cases voluminous
• All units do have journal publications!
[Figure: distribution of METIS output types — BOOK, CASE, CHAP, CONF, GEN, JOUR, MGZN, PAT, RPRT, THES — per unit]
Does it have impact?
• Taking all publications into consideration does not make any sense!
• For two units international visibility increases!
[Figure: citations per publication (CPP) per unit, shown for WoS papers, all non-WoS papers, and non-WoS cited papers only]
Linking it together!

Criterion: Scholarly publications
– Articles
– Monographs
– Chapters in books
– Dissertations
– Other output

Criterion: Scholarly use of output (influencing other scholars)
– Reviews
– Citations
– Other evidence of use

Criterion: Evidence of scholarly recognition (review committees, editorial boards, etc.)
– Scholarly prizes
– Personal grants
– Other evidence of recognition

METIS categories linked to these indicators: journals, books, chapters, theses, (WoS) book reviews, WoS/Scopus/GS citations, and other.
Societal quality

Criterion: Societal publications
– Articles in specialist publications
– Monographs for a wider public
– Chapters in books for a wider public
– Other societal output

Criterion: Societal use of output
– Projects in collaboration with civil-society actors
– Contract research
– Demonstrable civil-society effects
– Other evidence of use

Criterion: Evidence of societal recognition
– Societal prizes
– Other evidence of societal recognition

METIS categories linked to these indicators: non-scholarly journals, monographs for a wider public, chapters in books for a wider public, media appearances, reports, participation in advisory councils or the public debate, and other.
Conclusions, Discussion, and future steps
Some conclusions of the study
1. The METIS data clearly showed the possibility of linking the scientific outlets (registered in METIS) to the proposed assessment schemes …
2. … which also makes it possible to focus on societal quality!
3. Building an environment that supports research assessment in the SSH should be done in close collaboration with the scholarly community involved.
4. Citation analysis of non-WoS source material seemed a fruitful approach.
Some recommendations …
1. The next challenge is to add the possible audiences of the various outlets now linked to the indicators.
2. In addition to this search for audiences, the request to ‘value’ the various indicators will inevitably pop up!
3. A challenge in designing indicators based on such a system is to avoid treating it as a numbers game.
4. A national, discipline-wide initiative to register research output and societal impact seems called for …
Future steps…
1. We have explored the possibility of conducting a follow-up study within Leiden University, to further improve the methodology and discuss the outcomes with researchers and research managers.
2. We have planned a workshop to discuss the possibilities of coming to a national system of data collection that could support assessment procedures as shown in this presentation.
Dessert!
Development of authorship across all domains of scholarly activity
[Figure: mean number of authors per publication, 1981–2011, per discipline — from multidisciplinary journals and the (bio)medical and life sciences, through the natural sciences, engineering, and social sciences, to law, economics, the humanities, and the arts]
Definitions of JIF and Hirsch Index
• Definition of JIF:
– The mean citation score of a journal: the citations received in year T by the documents the journal published in years T-1 and T-2, divided by the number of citable documents published in those two years.
• Definition of h-index:
– The ‘impact’ of a researcher: with the oeuvre’s publications sorted by received citations in descending order, the h-index is the highest rank position at which the number of citations still equals or exceeds that rank position.
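The two definitions above can be sketched in a few lines of Python. This is illustrative only; the function and variable names are mine, not from the talk:

```python
def h_index(citations):
    """h-index: largest h such that h publications have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def impact_factor(citations_in_year_t, citable_items_t1_t2):
    """JIF for year T: citations received in year T by items published in
    years T-1 and T-2, divided by the citable items published in those years."""
    return citations_in_year_t / citable_items_t1_t2

print(h_index([10, 8, 5, 4, 3]))   # 4: four papers have at least 4 citations
print(impact_factor(450, 150))     # 3.0
```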
Problems with JIF
• Some methodological problems of JIF:
– Was/is calculated erroneously.
– Not field normalized.
– Not document type normalized.
– Underlying citation distributions are highly skewed.
• Some conceptual problems of JIF:
– Inflates the impact of all researchers publishing in the journal.
– Promotes journal publishing, as JIF is easily measured.
– Stimulates one-indicator thinking.
– Is based on expected values only, and does not relate to reality.
– Ignores other scholarly virtues.
Problems with H-index
• Some bibliometric-mathematical problems of the H-index:
– Is mathematically inconsistent in its behavior.
– Tends to rise only; no decrease is possible, and it is thus conservative by nature.
– Not field normalized.
• Some bibliometric-methodological problems of the H-index:
– How to define an author?
– In which bibliographic/metric environment?
• Some conceptual problems of the H-index:
– Is biased against youth, and favors age and experience.
– Is biased against selective researchers, and favors highly productive scientists.
– No relationship between H-index and research quality.
– Ignores other elements of scholarly activity.
– Promotes one-indicator thinking.
Thank you for your attention!
Any questions? Ask me, or mail me:
leeuwen@cwts.nl
Appendix on H-index
The H-Index and its limitations
The H-Index, defined as …
• The H-Index is the rank position in a publication set, sorted by citations in descending order, at which the number of received citations still equals or exceeds the rank position of that publication.
• Idea of an American physicist, J. Hirsch, who published about this index in the Proc. NAS USA.
Examples of Hirsch-index values
• Environmental biologist, output of 188 papers, cited 4,788 times in the period 80-04.
• Hirsch-index value of 31
• Clinical psychologist, output of 72 papers, cited 760 times in the period 80-04.
• Hirsch-index value of 14
[Figure: two plots of citations against ranked publications; the h-index is the point where the citation count meets the rank — h = 31 for the environmental biologist, h = 14 for the clinical psychologist]
• Actual versus field-normalized impact (CPP/FCSm) displayed against the output.
• Large output can be combined with a relatively low impact.
[Figure: scatter plot of CPP/FCSm against total publications for researchers from medicine, physics, biology, chemistry, psychology, engineering, environmental sciences, mathematics, the social sciences, and the humanities]
• H-Index displayed against the output.
• Larger output is strongly correlated with a high H-Index value.
[Figure: scatter plot of H-index against total publications for the same researchers]
Consistency: Definition
Definition. A scientific performance measure is said to be consistent if and only if, for any two actors A and B and for any number n ≥ 0, the ranking of A and B given by the performance measure does not change when A and B both have a new publication with n citations.
Consistency: Motivation
• Consistency ensures that if the publishing behavior of two actors does not change over time, their ranking relative to each other also does not change.
• Consistency ensures that if the individual researchers in one research group X outperform the individual researchers in another research group Y, the former research group X as a whole outperforms the latter research group Y.
Inconsistency of the h-index
[Figure: publication–citation profiles of Actor A and Actor B, before (h = 4 and h = 6) and after both add the same new publications (h = 6 and h = 8), illustrating the inconsistency of the h-index]
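The point of the figure can be reproduced in a few lines of Python. The numbers below are a minimal counterexample of my own, not the exact ones from the slide: Actor A starts strictly ahead of Actor B, both add one new publication with the same number of citations, and the ranking changes to a tie — exactly what consistency forbids.

```python
def h_index(citations):
    """Largest h such that h publications each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    return max((rank for rank, c in enumerate(ranked, 1) if c >= rank), default=0)

# Before: Actor A is ranked strictly above Actor B.
actor_a = [4, 4, 4, 4]   # h = 4
actor_b = [6, 6, 6]      # h = 3

# Both actors add one new publication with the same citation count (n = 6).
actor_a_after = actor_a + [6]   # h stays 4 (the 5th paper would need >= 5 citations)
actor_b_after = actor_b + [6]   # h rises to 4

print(h_index(actor_a), h_index(actor_b))              # 4 3  -> A ahead
print(h_index(actor_a_after), h_index(actor_b_after))  # 4 4  -> tie: ranking changed
```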
Problems with the H-Index
• For serious evaluation of scientific performance, the H-Index is not suitable as an indicator, as the index:
– Is insensitive to field-specific characteristics (e.g., differences in citation cultures between medicine and other disciplines).
– Does not take into account the age and career length of scientists; a small oeuvre necessarily leads to a low H-Index value.
– Is inconsistent in its ‘behaviour’.
Appendix on JFIS
Other journal impact measures …
• JFIS (CWTS) Journal-to-Field Impact Score
– A field- and document type normalized journal impact score, based on more publication data and longer citation windows.
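The slides do not give the JFIS formula, but the general idea of a field- and document-type normalized score can be sketched as follows: relate each publication's citations to the mean citations per publication in the journal's field for that document type, then average the ratios. This is an illustration in that spirit, not CWTS's actual computation; the function name and the example numbers are mine:

```python
def normalized_impact(journal_pubs, field_mean_cpp):
    """Mean field-normalized citation score of a journal.

    journal_pubs: list of (doc_type, citations) for the journal's publications.
    field_mean_cpp: mean citations per publication in the journal's field,
    keyed by document type (the normalization baseline).
    """
    ratios = [cites / field_mean_cpp[doc_type] for doc_type, cites in journal_pubs]
    return sum(ratios) / len(ratios)

# Hypothetical journal with two articles and one review:
pubs = [("article", 6), ("article", 2), ("review", 12)]
baseline = {"article": 2.0, "review": 8.0}
score = normalized_impact(pubs, baseline)  # (3.0 + 1.0 + 1.5) / 3, roughly 1.83
```

A score above 1 means the journal is cited above its field's average for the mix of document types it publishes, which is what makes scores comparable across fields.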
Journals with their JFIS values
------------------------------------------------------------------------------------------------------------------------------------------
JOURNAL JFIS Ranking - Field
------------------------------------------------------------------------------------------------------------------------------------------
CELL 7.06 ( 1 - Biochem & Mol Biol)
REV MOD PHYSICS 5.15 ( 2 - Physics)
ANN REV CELL DEV BIOL 5.04 ( 3 - Biochem & Mol Biol)
CHEMICAL REVIEWS 4.90 ( 4 - Chemistry)
NATURE MEDICINE 4.73 ( 5 - Medicine)
ANN REV OF BIOCHEM 4.64 ( 6 - Biochem & Mol Biol)
ANNALS OF MATHEMATICS 4.46 ( 7 - Mathematics)
NATURE BIOTECHNOLOGY 4.07 ( 8 - Biotech & Appl Microb)
ACTA MATHEMATICA 4.01 ( 9 - Mathematics)
BULL AM MATH SOC 4.00 ( 10 - Mathematics)
ANN REV CELL BIOL 3.78 ( 11 - Biochem & Mol Biol)
J AM MATH SOC 3.71 ( 12 - Mathematics)
J ROYAL STAT SOC B 3.49 ( 13 - Statistics & prob)
PROG CHEM ORG NAT PROD 3.35 ( 14 - Organic Chem)
ACTA METALL MATER 3.19 ( 15 - Metall & Met Eng)
ANGEW CHEM-INT EDIT 3.15 ( 16 - Chemistry)
PHYS REV LETT 3.13 ( 17 - Physics)
J MICROELECTROMECH SYST 3.04 ( 18 - Elec & Electr Eng)
J RHEOLOGY 3.02 ( 19 - Mechanics)
INVENT MATH 3.01 ( 20 - Mathematics)
------------------------------------------------------------------------------------------------------------------------------