Turning Data into Insight





Transcript of Turning Data into Insight

Turning Data into Insight

2 Who We Are
Founded in 2002
Stable staff of full-time, seasoned practitioners (20)
Successfully completed hundreds of projects for science-based organizations
High rate of repeat business attests to satisfied and confident clients
Fully customized versions of four important databases: Web of Science, Scopus, Medline and Questel's PlusPat
Regularly contribute to the scientific literature and conferences on S&T indicators and program evaluation

3 Our Two-Pronged Expertise
EVALUATION
Program Evaluation
Evaluation Assessment/Framework
Results-Based Management Tools
Logic Model/Data Collection Matrix
Environmental Scan/Data Review
Impact/Cost-Effectiveness Analysis
Web Surveys & Interviews
Case Studies & Roundtables
MEASUREMENT
Scientometrics & Technometrics Using Bibliometric Methods/Data
Output/Citations/Impact Factor
Performance Measurement
Comparative Analysis/Benchmarking
Collaboration Networks/Strategy
Positional Analysis (SWOT)
Policy/Management Support

4 Indicators & Approaches We Use/Developed
Bibliometric and technometric indicators
Production and productivity: number of papers (frac/full), papers per GERD, papers per capita, papers per researcher, share, index of specialization
Impact: Average of Relative Citations (ARC), Average of Relative Impact Factors (ARIF), number of citations in the X% most cited papers, distribution of papers by centiles of citations (a computational sketch of an ARC-style calculation follows the transcript)
Collaboration: collaboration rates (international, inter-institutional, inter-sectoral), matrix of collaboration, affinity of collaboration, impact of collaborations
Interdisciplinarity, recency, network measures

5 Indicators & Approaches We Use/Developed
Multicriteria analyses
Positional analyses
Scale-free indicators
Identification of topics: clustering using co-citations, co-word analyses, Latent Dirichlet Allocation (LDA), bibliographic coupling, TF-IDF (see the topic-identification and bibliographic-coupling sketches after the transcript)
Network visualization
Geographical mapping (GIS)
Classification: journal- or article-based, mutually exclusive or not

6 Use and Development of Standards
Optimization of processes
Robustness and knowledge of limits
Comparability and benchmarking
Communication
Interoperability
Inertial, not flexible
Fallacious perception of measuring reality
Importance of context and multiple pieces of evidence

7 Conclusion (more of an introduction)
There is no doubt about the usefulness of standards
Potential difficulties in developing universal standards
It may be easier to work towards universal standards than to set standards
The development of Wikimetrics
We are working on a mutually exclusive journal-based classification that takes into account the natural organization of scientific journals
Collaborators have translated the classification into 19 languages
The entire classification and method used are on our website and available for non-commercial use
We are awaiting feedback

8 The Science Behind Science Policy
Contact: Grégoire Côté, VP Bibliometrics, Science-Metrix
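
The Average of Relative Citations (ARC) named on slide 4 normalizes each paper's citation count against papers of the same field and publication year before averaging. Below is a minimal Python sketch of that style of calculation, not Science-Metrix's exact implementation; the dictionary-of-papers input format is an assumption, and in practice the (field, year) baselines would come from the whole citation database rather than from the measured set itself.

```python
from collections import defaultdict

def average_relative_citations(papers):
    """ARC-style score: each paper's citation count is divided by the mean
    citation count of its (field, year) group, then the relative values
    are averaged over the set of papers."""
    # Mean citations per (field, year) group -- the normalization baseline.
    totals, counts = defaultdict(float), defaultdict(int)
    for p in papers:
        key = (p["field"], p["year"])
        totals[key] += p["citations"]
        counts[key] += 1
    baselines = {k: totals[k] / counts[k] for k in totals}

    # Relative citation score per paper, then the average over the set.
    relative = [
        p["citations"] / baselines[(p["field"], p["year"])]
        for p in papers
        if baselines[(p["field"], p["year"])] > 0
    ]
    return sum(relative) / len(relative) if relative else 0.0

# Toy example: two fields, one publication year.
papers = [
    {"field": "Chemistry", "year": 2010, "citations": 12},
    {"field": "Chemistry", "year": 2010, "citations": 4},
    {"field": "Ecology", "year": 2010, "citations": 3},
    {"field": "Ecology", "year": 2010, "citations": 9},
]
print(average_relative_citations(papers))  # 1.0 for this toy set
```

With this kind of normalization, a value of 1.0 means the set is cited at the average level for its fields and years, and values above 1.0 indicate above-average impact.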
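
Slide 5 lists LDA and TF-IDF among the topic-identification techniques. The sketch below shows one common way to apply them with scikit-learn; the toy abstracts, the two-topic setting, and the vectorizer options are illustrative assumptions, not the presenters' actual pipeline, which would run over full bibliographic records.

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfTransformer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "gene expression regulation in cancer cells",
    "tumor suppressor gene mutation and cancer risk",
    "satellite remote sensing of ocean surface temperature",
    "sea surface temperature trends from satellite observations",
]

# LDA works on raw term counts; TF-IDF weights are computed separately
# (useful, e.g., for ranking terms or building co-word maps).
counts = CountVectorizer(stop_words="english")
X = counts.fit_transform(abstracts)
tfidf = TfidfTransformer().fit_transform(X)  # term-weight matrix, shown for illustration

# Fit a two-topic LDA model and print the top terms per topic.
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = counts.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [terms[j] for j in topic.argsort()[-4:][::-1]]
    print(f"topic {i}: {', '.join(top)}")
```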
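
Bibliographic coupling, also on slide 5, links two papers by the number of cited references they share. The following sketch computes pairwise coupling strengths for a toy set of reference lists; the paper identifiers and references are made up for illustration.

```python
from itertools import combinations

# Reference lists per paper (hypothetical identifiers).
references = {
    "paper_A": {"ref_1", "ref_2", "ref_3"},
    "paper_B": {"ref_2", "ref_3", "ref_4"},
    "paper_C": {"ref_5", "ref_6"},
}

# Coupling strength = number of references two papers have in common.
coupling = {
    (a, b): len(references[a] & references[b])
    for a, b in combinations(references, 2)
    if references[a] & references[b]
}
print(coupling)  # {('paper_A', 'paper_B'): 2}
```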