Scholarly Impact Metrics: Traditions and Transformations

A guided tour of mainstream approaches to measuring scholarly impact, e.g. citation count and impact factor, and a glimpse into new developments, namely altmetrics.

Transcript of "Scholarly Impact Metrics: Traditions and Transformations"

Page 1

Scholarly Impact Metrics: Traditions and Transformations

A guided tour of mainstream approaches to measuring scholarly impact, e.g. citation count and impact factor, and a glimpse into new developments, namely altmetrics.

Page 2

Content

• Why count citations?

• What can they be used for?

• Where to get citation information?

• What are the metrics for individuals?

• What are the metrics for journals?

• What are the metrics for institutions?

Part 1: TRADITIONS

Page 3

Why count publications and citations?

To measure the output and impact of scientific research.

Page 4

Citations: In-text and References

A citation is a reference to a paper or book used in one's research. In-text citations appear within the body of the paper; the full references are appended at the end.

Page 5

Citations

ACTIVITY 1

With reference to the following short excerpt from a journal paper:
1. Identify where an in-text citation is needed.
2. Count the number of times in-text citations occur within the two paragraphs.

Time: 3-5 minutes. Have a quick read, then discuss and identify the answers as a group.

Page 6

Previous studies have indicated that researchers are interested in scholarly communication issues, with attitudes and acceptance of new models and ways of working varying among populations in different fields, academic settings, and countries. In general, traditional publishing practices continue to be reinforced both by institutionalized structures such as tenure and promotion criteria and by “a fundamentally conservative set of faculty attitudes”. New dissemination practices appear to flourish primarily for in-progress or preliminary communication, rather than for final, formal publication.

Posting a paper to an individual’s web page has been shown to be nearly twice as common as any other method for sharing research informally. Discipline-based open archives are fairly popular: the proportion of mathematicians depositing research papers in a subject repository has been variously gauged at 38% and 20%, higher than the average for all subjects. Such online self-archiving, officially recommended by the International Mathematical Union, follows the important pre-Internet tradition of sharing mathematics preprints. In pure mathematics, more researchers consider preprints the most essential resource (44%) than journal articles (approximately 33%); this was the highest preprint rating in any discipline, followed by statistics and operational research at 25%. Besides the preprints and published articles usually deposited, a 2009 survey found that 14% of mathematics / statistics faculty members had deposited raw materials such as images or field notes in a repository, and 10% had deposited raw data sets.

An excerpt from K. Fowler's "Mathematicians' views on current publishing issues".

Page 7

[Screenshot of the PDF: "Mathematicians' views on current publishing issues"]

Page 8

Why Does One Need to Cite?

1. Research is not done in isolation; it builds on the work of others.

2. Citations provide evidence for arguments and add credibility by demonstrating that you have sought different viewpoints.

3. They let you engage in academic conversations: responding to this person, agreeing with that person, adding something to what has been said, etc.

4. They allow readers to find your sources of information (footprints).

…among other reasons.

Page 9

Some Issues with Citations

1. Authors do not cite the majority of their influences. A study by MacRoberts & MacRoberts (1989) found that authors cite only about 30% of their influences.

2. Biased citing. In the same study, the authors found that while some facts were correctly cited, others were never credited or were credited to secondary sources.

3. Self-citing.

4. Some citations are affirmative; others may be negative. Authors tend to avoid giving negative credit.

5. Different disciplines have different citation rates.

Page 10

Pitfalls and Limitations in Citation Analysis

• Positive and negative citations are not distinguished

• Intended or unintended omissions

• Biased citing practices

• Informal influences are not included

• Assumptions about the reasons for citing a work may not be valid

• Different fields have different citation rates

• Some types of publications tend to have higher citation rates

• Problems in indexing and citing references: clerical errors, misinterpretation

• Non-comprehensive coverage

Page 11

Sources of Citation Data

1. Web of Science (output, citation count)

2. SciVerse Scopus (output, citation count)

3. Google Scholar (output, citation count)

4. Journal Citation Reports (impact factor)

5. SCImago (journal ranking)

6. Essential Science Indicators (highly cited papers, hot papers, etc.)

Page 12

Citation Database: What’s in it

[Venn diagram: contents and overlap of Citation Database A and Citation Database B]

Page 13

The "Big Three": the overlap is quite modest

ISI / Web of Science:
• Approx. 12,000 journals
• Limited coverage of the humanities, e.g. monographs not included
• Limited coverage of OA journals and conference papers, despite some recent additions
• Majority Anglo-Saxon in origin; English-language bias
• Contains useful tools for author disambiguation (links to ResearcherID)
• Oldest coverage, from the 1900s (for the Sciences and Social Sciences)

Scopus:
• Approx. 18,000 titles
• Greater geographic spread than WoS: 60% is from outside the U.S.
• Better inclusion of non-journal material, e.g. conference papers
• Contains useful tools for author disambiguation
• Limited historical coverage (from 1995)

Google Scholar:
• Widest range of material included, although no list of covered journals is published
• Gaps in the coverage of publishers' archives; no indication of the timescale covered
• Results often contain duplicates of the same article (e.g. pre-prints, post-prints)
• No way to distinguish between authors with the same initials
• Difficult to search for a journal that has various title abbreviations

Page 14

Overlaps (as of early 2013)

[Venn diagram of the overlaps among the three databases; only the Google Scholar label survived transcription]
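To make "modest overlap" concrete, here is a toy Python sketch; the journal names and the overlap measure are illustrative assumptions, not data from the slides:

```python
# Hypothetical journal lists for two databases (made-up names).
wos = {"Journal A", "Journal B", "Journal C"}
scopus = {"Journal B", "Journal C", "Journal D", "Journal E"}

shared = wos & scopus                      # journals indexed by both databases
overlap = len(shared) / len(wos | scopus)  # shared fraction of the combined list
print(sorted(shared))     # ['Journal B', 'Journal C']
print(round(overlap, 2))  # 0.33 -- only a third of the combined list is shared
```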

Page 15

Researcher Metrics: h-index

h-index: a measure of productivity and influence. A researcher with an index of h has published h papers, each of which has been cited at least h times.

Physics:
Name | h-index
E. Witten | 110
A. J. Heeger | 107
M. L. Cohen | 94
A. C. Gossard | 94

Life Sciences:
Name | h-index
S. H. Snyder | 191
D. Baltimore | 160
R. C. Gallo | 154
P. Chambon | 153

Source: Hirsch, J. E. (2005). An index to quantify an individual's scientific research output. PNAS, 102(46), 16569-16572.
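The h-index is easy to compute directly from a researcher's per-paper citation counts. A minimal sketch in Python, using made-up citation numbers:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:  # the paper at this rank still supports h = rank
            h = rank
        else:
            break
    return h

# Five papers cited 10, 8, 5, 4 and 3 times: four papers have >= 4 citations
# each, but there are not five papers with >= 5 citations, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # 4
```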

Page 16


Strengths of h-index:
• Simple to calculate
• Combines output and impact
• Depicts "durable" performance rather than single achievements

Weaknesses of h-index:
• Discriminates against young researchers
• Will not capture small but high-quality output
• May not depict recent performance
• h never declines, so one can "rest on one's laurels"

Page 17

Journal Metrics

A. Journal Citation Reports (Web of Knowledge)
Science edition: over 8,000 journals; Social Sciences edition: over 2,600 journals.

• Impact factor

• Five-year impact factor

• Cited half-life

• Immediacy index

Page 18

JCR – Impact Factor

The journal Impact Factor (IF) is the average number of times articles from the journal published in the past two years have been cited in the JCR year.

IF of Journal A (2011) = (number of times Journal A's 2009 & 2010 papers were cited in 2011, per Web of Science) ÷ (number of citable papers published in Journal A in 2009 & 2010)

In short: an Impact Factor of 1.0 means that, on average, the articles published one or two years ago have been cited once; an Impact Factor of 2.5 means they have been cited two and a half times. A journal's higher citation rate means an article published there has a higher chance of being cited.
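The two-year formula above is simple arithmetic; a sketch with hypothetical numbers:

```python
def impact_factor(cites_to_prev_two_years, citable_items_prev_two_years):
    """Two-year impact factor: citations in the JCR year to items from the
    previous two years, divided by the citable items in those two years."""
    return cites_to_prev_two_years / citable_items_prev_two_years

# Hypothetical Journal A: its 2009-2010 papers drew 250 citations in 2011,
# and it published 100 citable items across 2009-2010.
print(impact_factor(250, 100))  # 2.5
```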

Page 19

Journal Metrics

B. SCImago Journal Rank (SJR): uses SciVerse Scopus data.

It expresses the average number of weighted citations received in the selected year by the documents published in the selected journal in the three previous years. It takes into account both the number of citations received by the journal and the prestige of the journals the citations came from.
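SJR itself is an iterative, PageRank-style computation; the following is only a highly simplified sketch of the prestige-weighting idea, with made-up prestige scores and citation counts:

```python
# NOT the real SJR algorithm -- just the weighted-citation idea in miniature.
citing_journals = {
    "Journal X": {"prestige": 2.0, "citations": 30},  # prestigious source
    "Journal Y": {"prestige": 0.5, "citations": 10},  # less prestigious source
}
docs_prev_three_years = 80  # documents the cited journal published in the 3 prior years

weighted_citations = sum(j["prestige"] * j["citations"]
                         for j in citing_journals.values())
print(weighted_citations / docs_prev_three_years)  # 0.8125 weighted citations per document
```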

Page 20

Journal Impact Factor Game

1. Editors requiring authors to include papers from the journal, or affiliated journals, in their references

2. Rejecting negative studies, regardless of quality

3. Publishing mainly popular-science articles that deal with hot topics

4. Publishing summaries of articles with relevant citations to them

5. Publishing articles that add citations but are not counted as citable items

6. Favouring review articles over original papers

Source: Falagas, M. E., & Alexiou, V. G. (2008). The top ten in journal impact factor manipulation. Arch. Immunol. Ther. Exp., 56, 223-226.

Page 21

Journal Titles Suppressed in JCR

• JCR monitors journal self-citations; a self-citation rate below 20% is considered acceptable.

• The list of titles suppressed for the year is reported in the release notes of each JCR version.

• Which titles are suppressed? "Suppressed titles were found to have anomalous citation patterns resulting in a significant distortion of the Journal Impact Factor, so that the rank does not accurately reflect the journal's citation performance in the literature."
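The self-citation check is a simple ratio; a sketch with hypothetical counts:

```python
def self_citation_rate(self_citations, total_citations):
    """Fraction of a journal's incoming citations that come from itself."""
    return self_citations / total_citations

# Hypothetical journal: 150 of its 1,000 incoming citations are self-citations.
print(f"{self_citation_rate(150, 1000):.0%}")  # 15% -- below the ~20% level JCR accepts
```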

Page 22

Institutional Rankings

Some of the players are:

1. Times Higher Education (THE) World University Rankings

2. QS World University Rankings

3. Academic Ranking of World Universities (ARWU)

Page 23

ACTIVITY 2

Identify the institutions ranked 1-5, 6-10, 11-15, 16-20, 21-25, etc. in the 3 different rankings. Take note of the names of institutions that appear in all 3 rankings.

[Worksheet table: rows numbered 1-5; columns THE 2012-2013, QS 2012, ARWU 2012]

Page 24

Top 10

Rank | THE World University Rankings 2012-2013 (released Oct 2012) | QS World University Rankings 2012 (released Sep 2012) | Academic Ranking of World Universities 2012 (released Aug 2012)
1 | Caltech | MIT | Harvard University
2 | Stanford University | University of Cambridge | Stanford University
3 | University of Oxford | Harvard University | MIT
4 | Harvard University | University College London | University of California, Berkeley
5 | MIT | University of Oxford | University of Cambridge
6 | Princeton University | Imperial College London | Caltech
7 | University of Cambridge | Yale University | Princeton University
8 | Imperial College London | University of Chicago | Columbia University
9 | University of California, Berkeley | Princeton University | University of Chicago
10 | University of Chicago | Caltech | University of Oxford

Institutions in red occur in all 3 rankings.

Page 25

Ranked 11-20

Rank | THE 2012-2013 | QS 2012 | ARWU 2012
11 | Yale University | Columbia University | Yale University
12 | ETH Zurich | University of Pennsylvania | University of California, Los Angeles
13 | University of California, Los Angeles | ETH Zurich | Cornell University
14 | Columbia University | Cornell University | University of Pennsylvania
15 | University of Pennsylvania | Stanford University | University of California, San Diego
16 | Johns Hopkins University | Johns Hopkins University | University of Washington
17 | University College London | University of Michigan | Johns Hopkins University
18 | Cornell University | McGill University | University of California, San Francisco
19 | Northwestern University | University of Toronto | University of Wisconsin-Madison
20 | University of Michigan | Duke University | The University of Tokyo

Institutions in purple occur in all 3 rankings.

Page 26

Ranked 21-30

Rank | THE 2012-2013 | QS 2012 | ARWU 2012
21 | University of Toronto | University of Edinburgh | University College London
22 | Carnegie Mellon University | University of California, Berkeley | University of Michigan
23 | Duke University | University of Hong Kong | ETH Zurich
24 | University of Washington | Australian National University | Imperial College London
25 | University of Texas at Austin | National University of Singapore | University of Illinois at Urbana-Champaign
26 | Georgia Institute of Technology | King's College London | Kyoto University
27 | University of Tokyo | Northwestern University | New York University
28 | University of Melbourne | University of Bristol | University of Toronto
29 | National University of Singapore | Ecole Polytechnique Federale de Lausanne | University of Minnesota
30 | University of British Columbia | University of Tokyo | Northwestern University

Page 27

THE's Methodology (since 2010)

The Times Higher Education World University Rankings use 13 performance indicators, grouped under 5 areas that cover the core missions of institutions: teaching, research, knowledge transfer, and international outlook.

1. Teaching, the learning environment: 30%. Reputation survey, staff-to-student ratio, ratio of undergraduates to PhD degrees awarded, etc.

2. Research (volume, income, and reputation): 30%. University reputation accounts for 17% and is based on an academic reputation survey. Research productivity, scaled against staff numbers, accounts for 6%: the number of papers per academic published in journals indexed by Thomson Reuters.

Page 28

THE Continued

3. Citations, research influence: 30%. Reflects the role of universities in spreading new knowledge and ideas, measured by the number of times a university's published works have been cited by others. Data is provided by Thomson Reuters: papers published between 2006 and 2010, with citations counted over 2006 to 2011. The data is normalised to reflect variations in citation volume between different disciplines (a small normalisation sketch follows this list).

4. Industry income (innovation, knowledge transfer): 2.5%. How much research income an institution earns from industry.

5. International outlook (staff, students, and research): 7.5%. Diversity on campus and international research collaboration: the ability of the university to attract international students and faculty, plus the proportion of papers over the 5 years that have at least one international co-author.
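The discipline normalisation in point 3 can be pictured as dividing a paper's citations by the average for its field. A sketch with made-up field baselines (the real baselines also condition on publication year and document type):

```python
# Made-up world-average citations per paper for two fields.
field_baseline = {"mathematics": 3.2, "molecular biology": 18.5}

def normalised_impact(citations, field):
    """Citations relative to the field average (1.0 = world average)."""
    return citations / field_baseline[field]

# The same raw count of 8 citations is strong in mathematics
# but below average in molecular biology.
print(round(normalised_impact(8, "mathematics"), 2))        # 2.5
print(round(normalised_impact(8, "molecular biology"), 2))  # 0.43
```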

Page 29

QS (since 2011)

• Academic reputation, from a global survey: 40%

• Employer reputation, from a global survey: 10%

• Citations per faculty (data from Scopus): 20%. The latest 5 years of data are used, and the total citation count is factored against the number of faculty, taking the size of the institution into account. Self-citations have been excluded since 2011.

• Faculty-student ratio: 20%

• Proportion of international students: 5%

• Proportion of international faculty: 5%

These weights sum to 100%, so the overall score is a weighted sum of the indicator scores; a sketch follows below.
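A minimal illustration of the weighted composite, with invented indicator scores for a hypothetical institution:

```python
# QS 2012 weights from the list above; the scores are invented for illustration.
weights = {
    "academic_reputation":    0.40,
    "employer_reputation":    0.10,
    "citations_per_faculty":  0.20,
    "faculty_student_ratio":  0.20,
    "international_students": 0.05,
    "international_faculty":  0.05,
}
scores = {
    "academic_reputation":    92.0,
    "employer_reputation":    85.0,
    "citations_per_faculty":  78.0,
    "faculty_student_ratio":  88.0,
    "international_students": 95.0,
    "international_faculty":  90.0,
}
composite = sum(weights[k] * scores[k] for k in weights)
print(round(composite, 2))  # weighted composite on the same 0-100 scale
```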

Page 30

ARWU (since 2003)

Criteria | Indicator | Code | Weight
Quality of Education | Alumni of an institution winning Nobel Prizes and Fields Medals | Alumni | 10%
Quality of Faculty | Staff of an institution winning Nobel Prizes and Fields Medals | Award | 20%
Quality of Faculty | Highly cited researchers in 21 broad subject categories | HiCi | 20%
Research Output | Papers published in Nature and Science* | N&S | 20%
Research Output | Papers indexed in Science Citation Index-Expanded and Social Science Citation Index | PUB | 20%
Per Capita Performance | Per capita academic performance: the weighted scores of the above 5 indicators divided by the number of FTE academic staff | PCP | 10%
Total | | | 100%

* For institutions specialized in humanities and social sciences, such as the London School of Economics, N&S is not considered; its weight is reallocated to the other indicators.

Page 31

How Does This Affect Our Users?

The metrics in play: research output (productivity) and citation count, h-index, g-index, etc. (influence).

User | Uses of data & metrics
Faculty | Grant proposals, performance appraisal, promotion & tenure (research impact)
Early-career researcher | Demonstrating research impact; seeking employment in academic / research institutions
School & admin staff | Selection of staff (recruitment, promotion, tenure), selection of external examiners, internal ranking of journals (Tier 1, 2, 3)
University & country | Institutional rankings, ROI

Page 32

Part 2: TRANSFORMATIONS

Page 33

Activities

To access Journal Citation Reports, start from the Library toolbar or the Library home page:
1. Click on "Databases"
2. Click on "Citation Databases"
3. Click on "Journal Citation Reports"
4. Click on "I Accept"

To access SCImago, search in Google and select "Scimago Journal & Country Rank".