Is success divisible? How to evaluate success successfully?! New Frontiers In Evaluation


Transcript of Is success divisible? How to evaluate success successfully?! New Frontiers In Evaluation

Page 1: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation

Is success divisible? How to evaluate success successfully?!

New Frontiers In Evaluation, Session F "Talking about Success", April 25th, 2006, 9.40 a.m. – 12.30 p.m.


Stefan Hornbostel, Saskia Heise
IFQ – Institut für Forschungsinformation und Qualitätssicherung
Institute for Research Information and Quality Assurance
Godesberger Allee 90, D-53175 Bonn
www.forschungsinfo.de

Page 2: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


IFQ project "Final Reports"

The work invested in final reports receives little to no appreciation:

- for the scientists, only the findings and publications from a project matter, not the writing of a final report;

- for the expert reviewers, the judgment on the funded project has already been given;

- from the viewpoint of the funding agency, a project is finished when its funding period ends.

We suggest that the information included in a final report should be suitable for monitoring research activity. On the basis of our project, we would like to investigate whether final reports, in particular the documented output of a project, could be used as an evaluative tool in DFG research funding.

Page 3: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


IFQ project "Final Reports"

The objectives of our project "Final Reports" are

(1) to build up a research monitoring system

(2) to develop an information tool that provides web access to the findings

Page 4: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


German Research Foundation (Deutsche Forschungsgemeinschaft, DFG)

Guidelines for final reports (Manual)

1. General Information (DFG reference number, applicants, project leader, institution, topic, funding period, publications)

2. Research Activities and Findings (research questions and objectives, work conducted (in particular discrepancies and scientific failures), findings, application, connections, usability, patents, industrial co-operations, co-operation partners, project collaborators, diploma and PhD theses, postdoctoral qualifications)

3. Summary (generally understandable presentation, unexpected results in the course of the project, dissemination of information outside the scientific community, possibly press coverage)
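To make this structure concrete, the following is a minimal sketch of how the three sections of the final-report manual could be captured as a structured record for the research monitoring system mentioned earlier. The field names and types are illustrative assumptions, not an official DFG or IFQ schema.

```python
# A minimal sketch: modelling the three sections of the DFG final-report manual
# as a structured record for a research monitoring system.
# Field names and types are illustrative assumptions, not an official schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class GeneralInformation:
    dfg_reference_number: str
    applicants: List[str]
    project_leader: str
    institution: str
    topic: str
    funding_period: str
    publications: List[str] = field(default_factory=list)

@dataclass
class ResearchActivitiesAndFindings:
    research_questions: str
    work_conducted: str                  # including discrepancies and scientific failures
    findings: str
    patents: List[str] = field(default_factory=list)
    industrial_cooperations: List[str] = field(default_factory=list)
    cooperation_partners: List[str] = field(default_factory=list)
    phd_theses: List[str] = field(default_factory=list)

@dataclass
class FinalReport:
    general: GeneralInformation
    activities_and_findings: ResearchActivitiesAndFindings
    summary: str                         # generally understandable presentation
```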

Page 5: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Guidance for drafting an expert review of final reports

Short expert review

1. Comment on the form of the report (outline, layout)

2. Comment on the content (level of achievement, duration, methodology)

3. Evaluation of the findings (proportionality of results to the funding provided, quality of publications, additional requirements)

4. Additional Comments

Page 6: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Information included in final reports of DFG-funded projects (Heise 2006)

Page 7: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


"What relevant information could be asked from the project leaders, and how?" (Färkkilä 2004)

Page 8: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Which of the following purposes of research evaluation apply to your organisation? (N = 15 European funding agencies)

[Three charts, each showing the importance of the purpose rated from high to low:]

- To ensure that research funding instruments achieve their aims
- To inform policy, planning and strategic decision-making
- To allocate funding on the basis of indicators

Page 9: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


What does success mean from the viewpoint of a funding agency?

Page 10: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Page 11: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Co-authorship in selected years - SCI papers from all fields

Year | Share of single-author papers | Mean number of co-authors
1980 | 24.8 %                        | 2.64
1990 | 15.7 %                        | 3.34
2000 | 10.7 %                        | 4.16

Source: Glänzel, W. & Schubert, A. (2004): Analyzing Scientific Networks Through Co-Authorship. In: Moed, H.F. et al. (eds.): Handbook of Quantitative Science and Technology Research, 257-276.
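Both quantities in the table can be derived from nothing more than the number of authors per paper. The sketch below is illustrative only: the author counts are invented, and "mean number of co-authors" is taken here as authors minus one, which may differ from the exact definition used by Glänzel & Schubert.

```python
# Illustrative sketch: compute the share of single-author papers and the mean
# number of co-authors from a list of author counts per paper.
# The counts below are invented; "co-authors" is taken as authors minus one.
authors_per_paper = [1, 2, 4, 3, 1, 6, 2]

single_share = sum(1 for n in authors_per_paper if n == 1) / len(authors_per_paper)
mean_coauthors = sum(n - 1 for n in authors_per_paper) / len(authors_per_paper)

print(f"Share of single-author papers: {single_share:.1%}")
print(f"Mean number of co-authors:     {mean_coauthors:.2f}")
```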

Page 12: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Level: Individual scientist (multi-authorship)

(author, co-author, co-writer, sub-author, contributor, hyper-author)

- Paternity test / creatership
- Shareholder / fractional counting vs. complete counting
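The difference between fractional and complete counting can be illustrated with a small sketch: under complete counting every co-author receives full credit for a paper, under fractional counting the credit of 1/n is shared among its n authors. The paper data below are hypothetical and serve only to show the bookkeeping.

```python
# Sketch of complete vs. fractional counting at the level of the individual
# scientist. Each paper is given as its list of authors (hypothetical data).
from collections import defaultdict

papers = [
    ["A", "B"],       # two co-authors
    ["B"],            # single-author paper
    ["A", "C", "D"],  # three co-authors
]

complete = defaultdict(float)    # every author gets full credit (1.0) per paper
fractional = defaultdict(float)  # credit 1/n shared among the n authors

for authors in papers:
    for name in authors:
        complete[name] += 1.0
        fractional[name] += 1.0 / len(authors)

for name in sorted(complete):
    print(f"{name}: complete = {complete[name]:.2f}, fractional = {fractional[name]:.2f}")
```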

Page 13: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Level: Project

Page 14: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


“Furthermore, many sources of research funding expect researchers to acknowledge any support that contributed to the published work. Just as citation indexing proved to be an important tool for evaluating research contributions, we argue that acknowledgements can be considered as a metric parallel to citations in the academic audit process.”

C. Lee Giles and Isaac G. Councill: Who gets acknowledged: Measuring scientific contributions through automatic acknowledgement indexing. In: Proceedings of the National Academy of Sciences 101(51) pp. 17599-17604, Dec. 21, 2004.

Of 335,000 unique research documents in the CiteSeer computer science archive, 188,052 were found to contain acknowledgements (roughly 56%).

Page 15: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


C. Lee Giles and Isaac G. Councill: Who gets acknowledged: Measuring scientific contributions through automatic acknowledgement indexing. In: Proceedings of the National Academy of Sciences 101(51) pp. 17599-17604, Dec. 21, 2004.

“both the German Science Foundation (Deutsche Forschungsgemeinschaft) and the United Kingdom Engineering and Physical Sciences Research Council (EPSRC) display a steady upward trend in the proportion of acknowledgements received each year during the 1990’s while the Office of Naval Research and IBM slowly become overshadowed by other entities over the decade”.

Page 16: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Multidimensional Access Methods - Gaede, Günther (1997) (253 citations)
"partially supported by the German Research Society (DFG/SFB 373) and by the ESPRIT Working Group CONTESSA"
www.wiwi.hu-berlin.de/~gaede/survey.rev.ps.Z

Efficient PRAM Simulation on a Distributed Memory Machine - Karp, Luby, der Heide (1992) (72 citations)
"Science Institute at Berkeley ... supported in part by DFG-Forschergruppe 'Effiziente Nutzung massiv...'"
ftp.uni-paderborn.de/doc/techreports/Informatik/tr-ri-93-134.ps.Z

Decoding Choice Encodings - Nestmann, Pierce (1996) (52 citations)
"supported by the DFG, Sonderforschungsbereich 182, project C2, and by ..."
www.cs.auc.dk/~uwe/self/doc/concur96.ps.gz

Statistical Models for Co-occurrence Data - Hofmann, Puzicha (1998) (24 citations)
"was supported by the German Research Foundation (DFG) under grant #BU 914/3-1."
publications.ai.mit.edu/ai-publications/1500-1999/AIM-1625.ps

Multiresolution Analysis of Arbitrary Meshes - Matthias Eck, Tony DeRose, Tom Duchamp, Hugues Hoppe, Michael Lounsbery, Werner Stuetzle (1995) (222 citations)
"This work was supported in part by a postdoctoral fellowship for the lead author (Eck) from the German Research Foundation (DFG), Alias Research Inc., Microsoft Corp., and the National Science Foundation under grants ..."

On Evaluating Decision Procedures for Modal Logic - Ullrich Hustadt, Renate A. Schmidt (1997) (58 citations)
"We thank Christoph Weidenbach and Andreas Nonnengart for their critical comments. The work of the second author is supported by the TraLos-Project funded by the DFG"
http://www.ag2.mpi-sb.mpg.de/~schmidt/publications/MPI-I-97-2-003.ps.gz

An asymptotically optimal multiversion B-tree - ..., Seeger, Peter Widmayer (1996) (57 citations)
"We want to thank an anonymous referee for an extraordinary effort and thorough discussion that led to a great improvement in the presentation of the paper. This work was partially supported by grants ESPRIT 6881 of the European Community and Wi810/2-5 of the Deutsche Forschungsgemeinschaft DFG"
http://medoc.springer.de:9999/Journals/vldb/tocs/../papers/6005004/60050264.ps.gz
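The entries above show the raw material behind acknowledgement indexing. The sketch below illustrates the basic idea in simplified form: locate the text after an acknowledgement heading in a paper's full text and match it against a list of funding-agency names. The sample texts, agency aliases and the regular expression are illustrative assumptions, not the CiteSeer implementation described by Giles & Councill.

```python
# Simplified sketch of automatic acknowledgement indexing: find the text after
# an "Acknowledg(e)ment(s)" heading and check it for known funding agencies.
# Agency aliases, sample texts and the regex are illustrative assumptions only.
import re
from collections import Counter

AGENCIES = {
    "DFG": ["DFG", "Deutsche Forschungsgemeinschaft", "German Research Foundation"],
    "EPSRC": ["EPSRC", "Engineering and Physical Sciences Research Council"],
    "NSF": ["NSF", "National Science Foundation"],
}

def acknowledged_agencies(fulltext: str) -> set:
    """Return the set of agencies mentioned after the acknowledgement heading."""
    match = re.search(r"acknowledge?ments?(.*)", fulltext, re.IGNORECASE | re.DOTALL)
    section = match.group(1).lower() if match else ""
    return {
        agency
        for agency, aliases in AGENCIES.items()
        if any(alias.lower() in section for alias in aliases)
    }

papers = [
    "... Acknowledgements. Supported by the Deutsche Forschungsgemeinschaft (DFG).",
    "... Acknowledgments: partially funded by the National Science Foundation.",
]

counts = Counter(agency for text in papers for agency in acknowledged_agencies(text))
print(counts)  # e.g. Counter({'DFG': 1, 'NSF': 1})
```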

Page 17: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


In preparing our project "Final Reports", we conducted exploratory telephone interviews (average duration 20 minutes) with DFG applicants and asked how they assess the attribution of project results to those who produced them. Researchers from chemistry, engineering and educational science participated in the study.

What does success mean from the viewpoint of a recipient of a grant?

Page 18: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Chemistry

The interviewed chemists generally consider it possible to assign research results to individual projects. The exception is interdisciplinary or networked projects, where cross-linking and project overlaps occur and projects are stimulated from other contexts, so that a one-to-one attribution of ideas is no longer possible.

They do, however, consider it possible to assign authors to a project context and to published findings.

Without exception, the interviewed applicants from chemistry acknowledge the third-party funding organisation in all publications that emerge from a project. In the case of multiple funding, they name every funding agency involved.

Page 19: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Engineering

The engineering scientists consider it difficult to relate project output to exactly one project. Results emerge from internal co-operations with other chairs and from interactions between scientists that may be relevant to several projects.

While publications can be attributed to authors relatively unambiguously, assigning findings to project contexts is more difficult because of parallel projects and is often only reconstructed later.

Engineers co-operate with other scientists and with industry; co-operation is sometimes agreed only for the duration of a particular project, sometimes for longer periods.

In the end, however, all interviewed engineering scientists acknowledge the third-party funding organisation in their publications.

Page 20: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Educational Science

The interviewed scientists consider it possible to assign project output and publications to a concrete project context, although publications from a project may appear only years later (5-10 years). Publications can be related to persons, and results are always clearly recognisable.

All interviewed persons acknowledge the third-party funding organisation. In educational science this is considered very important: funding by the DFG stands for quality.

In educational science, co-operation exists over longer periods of time and is usually made concrete in the run-up to a project framework.

Page 21: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation

DFG rejection rate 1995-2004, in % (DFG 2004)

1995: 27.6
1996: 32.9
1997: 39.1
1998: 42.5
1999: 38.0
2000: 40.5
2001: 47.4
2002: 48.5
2003: 44.5
2004: 46.3


Increasing pressure on peer assessment and grant/project evaluation:

- Funding agencies call for acknowledgments (as a kind of "property right")
- Scientists feel obliged to assign outcomes to projects
- Use of acknowledgments as indicators

Page 22: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Lessons learned:

- Success has many fathers, but failure is an orphan.

- The relation between a project and its outcome is a social construction to the same degree as the relationship between author and publication.

- What we measure is the attribution constructed by scientists, not the causal dependency.

- The evaluation window is not identical with the term of a project.

- Project outcome is more than publications (PhDs, networks, transfer, patents, dissemination of information outside the scientific community, ...), and some of it is hard to measure.

- Therefore, success indicators (such as citations) should be combined with peer assessment of final reports.

- Objectives of the systematic use of final reports are: information about programme performance, information for applicants, and information for peers about the past performance of applicants.

Page 23: Is success divisible? How to evaluate success successfully?!  New Frontiers In Evaluation


Thank you!