NISO Altmetrics Initiative: A Project
Update
Martin Fenner (http://orcid.org/0000-0003-1419-2405), Technical Lead, PLOS Article-Level Metrics
Chair, NISO Alternative Assessment Metrics Project Steering Group
Image: http://fivethirtyeight.com/interactives/world-cup/
Phase I
• Describe the current state of the altmetrics discussion
• Identify potential action items for further work on best practices and standards
Steering Committee
• Euan Adie (Altmetric)
• Amy Brand (Harvard University/Digital Science)
• Mike Buschman (Plum Analytics)
• Todd Carpenter (NISO)
• Martin Fenner (Public Library of Science, Chair)
• Gregg Gordon (Social Science Research Network)
• William Gunn (Mendeley)
• Michael Habib (Reed Elsevier)
• Nettie Lagace (NISO)
• Jamie Liu (American Chemical Society)
• Heather Piwowar (ImpactStory)
• John Sack (HighWire Press)
• Peter Shepherd (Project Counter)
• Christine Stohn (Ex Libris, Inc.)
• Greg Tananbaum (Scholarly Publishing & Academic Resources Coalition)
In-Person Meetings
• October 9, 2013 in San Francisco
• December 11, 2013 in Washington, DC
• January 23, 2014 in Philadelphia
All meetings were streamed and recorded
Interviews
• Thirty researchers, administrators, librarians, funders (and others)
• Semi-structured interviews
• March – April 2014
Approach
• Open format: lightning talks, brainstorming, breakout groups, etc.
• Include all stakeholders
• Focus on collecting unstructured input
• Make all material (including audio recordings of steering group) publicly available
Quality and Gaming (San Francisco brainstorming notes)
• Should metrics be hidden to prevent herd mentality? (yes, like Reddit)
• Define credible sources for inclusion: Twitter, Facebook
• A problem is data quality and provenance
• Quality assessment of studied data
• Validity and reliability of altmetrics
• Data quality & validity: how valid is the altmetrics data we have to assess?
• Approaches to factor out the effects of gaming, e.g., not counting self-citation (a sketch follows this list)
• Define acceptable promotion versus gaming
• Auto-spam detection; trolling
• Interactions with more traditional metrics
• Define what behaviors are considered cheating/gaming
• Define what the reaction to gaming should be: public sharing? Data tainting? Ignoring it?
• Are there any “ungameable” systems out there at all, and can we learn anything from them?
• What are effective measures to audit published altmetrics for accuracy?
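One recurring note above, factoring out self-citation and self-promotion, is easy to state but fiddly in practice. Here is a minimal sketch of the idea, assuming a hypothetical event layout and pre-resolved author identities; a real system would first need identity resolution, e.g., matching ORCID iDs to social media accounts.

```python
# Sketch: discount "self" events when counting mentions of a paper.
# The event layout and identifier scheme are assumptions for
# illustration, not any provider's actual data model.

def filtered_count(events, author_ids):
    """Count events whose actor is not one of the paper's authors.

    events     -- iterable of dicts like {"actor_id": "...", "type": "tweet"}
    author_ids -- set of identifiers known to belong to the authors
    """
    return sum(1 for e in events if e["actor_id"] not in author_ids)

events = [
    {"actor_id": "orcid:0000-0003-1419-2405", "type": "tweet"},  # an author
    {"actor_id": "twitter:someone_else", "type": "tweet"},
]
print(filtered_count(events, {"orcid:0000-0003-1419-2405"}))  # -> 1
```

The hard part is not the filter itself but deciding, per the notes above, where acceptable promotion ends and gaming begins.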
Potential Action Items: Data Quality and Gaming
• Promote and facilitate use of persistent identifiers.
• Research issues surrounding the reproducibility of metrics across providers.
• Develop strategies to improve data quality through normalization of source data across providers (a minimal sketch follows this list).
• Explore creation of standardized APIs and download/exchange formats to facilitate data gathering.
• Develop strategies to increase trust, e.g., openly available data, audits, or a clearinghouse.
• Study potential strategies for defining and identifying systematic gaming of new metrics.
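The normalization and exchange-format items are the most concrete of these. As a minimal sketch, assuming two invented provider payloads (real providers each expose their own schemas and would need their own mappings), normalization means projecting every response onto one common record keyed by a persistent identifier:

```python
# Sketch of normalizing provider-specific payloads into a common record.
# Both input layouts below are hypothetical.
from dataclasses import dataclass

@dataclass
class MetricRecord:
    doi: str       # persistent identifier of the research output
    source: str    # e.g. "twitter", "mendeley"
    provider: str  # which aggregator reported the count
    count: int

def normalize_provider_a(payload: dict) -> list:
    # Hypothetical provider A: {"doi": "10.1234/x", "tweets": 12, "readers": 40}
    doi = payload["doi"].lower()
    return [
        MetricRecord(doi, "twitter", "provider_a", payload.get("tweets", 0)),
        MetricRecord(doi, "mendeley", "provider_a", payload.get("readers", 0)),
    ]

def normalize_provider_b(payload: dict) -> list:
    # Hypothetical provider B nests counts: {"identifier": ..., "metrics": [...]}
    doi = payload["identifier"].lower()
    return [
        MetricRecord(doi, m["source"], "provider_b", m["total"])
        for m in payload["metrics"]
    ]
```

Once every provider maps onto the same record, reproducibility of metrics across providers (the second item above) can be checked mechanically rather than anecdotally.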
Next Steps
• Finalize and release white paper and draft new work item proposal for standards/best practices based on the study
• Proposal vetted by NISO leadership and members
• Proposal approved and working groups formed for Phase II of the project
Lessons Learned?
Definitions
• Altmetrics
• Article-Level Metrics
• Research Assessment
• Scholarly Impact
• Quality of Research
Data Quality and Gaming
• Disconnect between frequent concerns by users and lack of (public) activity by altmetrics providers
• Strategies to improve data quality include standards, audits, open data, and a central clearinghouse
• Users of altmetrics data (researchers, institutions, funders) should be more proactive in demanding data quality checks and transparency
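That last point can be made concrete: if two providers report counts for the same output and source, a large discrepancy is a signal worth auditing. A small sketch follows, where the input layout and the 25% tolerance are assumptions, not an established threshold:

```python
# Sketch: flag (output, source) pairs where providers disagree beyond
# a relative tolerance. counts[(doi, source)] maps provider -> count.

def audit(counts, tolerance=0.25):
    flagged = []
    for (doi, source), by_provider in counts.items():
        values = list(by_provider.values())
        if len(values) < 2:
            continue  # only one provider reported; nothing to compare
        lo, hi = min(values), max(values)
        if hi > 0 and (hi - lo) / hi > tolerance:
            flagged.append((doi, source, dict(by_provider)))
    return flagged

counts = {("10.1234/x", "twitter"): {"provider_a": 120, "provider_b": 45}}
print(audit(counts))  # the 120-vs-45 disagreement gets flagged
```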
Grouping and Aggregation
• Common practice for a single research output, e.g. for multiple versions, multiple locations, or grouping of all metrics into a single score
• Common practice for multiple research outputs, e.g. by journal, by author, or by institution
• Challenge of uniquely identifying authors, institutions and funders associated with these research outputs
• Aggregation remains problematic; more empirical evidence and transparency are needed
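Grouping for a single research output usually starts with collapsing identifier variants (resolver prefixes, preprint versus published version) onto one canonical key before summing. A minimal sketch, where the version mapping is an assumption the real data would have to supply:

```python
# Sketch: group counts per (canonical DOI, source) across versions and
# locations. Prefix stripping is standard DOI hygiene; version_map is a
# hypothetical lookup from e.g. a preprint DOI to the published DOI.
from collections import defaultdict

def canonical_doi(identifier, version_map=None):
    doi = identifier.lower()
    for prefix in ("https://doi.org/", "http://dx.doi.org/", "doi:"):
        if doi.startswith(prefix):
            doi = doi[len(prefix):]
    return version_map.get(doi, doi) if version_map else doi

def group_counts(records, version_map=None):
    totals = defaultdict(int)
    for r in records:
        totals[(canonical_doi(r["id"], version_map), r["source"])] += r["count"]
    return dict(totals)
```

Collapsing those per-source totals further into a single composite score is exactly the step the last bullet flags as needing more empirical evidence.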