Transcript of Eval_Elec_Resources.pptx

Page 1: Eval_Elec_Resources.pptx

EVALUATING AND SELECTING ONLINE RESOURCES: AN AMERICAN LIBRARY ASSOCIATION TECH SOURCE WORKSHOP

Jill E. Grogg,

Electronic Resources Librarian and Associate Professor

The University of Alabama Libraries

[email protected]

Rachel A. Fleming-May,

Assistant Professor

School of Information Sciences, The University of Tennessee-Knoxville

[email protected]

Page 2: Eval_Elec_Resources.pptx

OUR INTEREST IN THIS ISSUE:

Jill: Electronic Resources Librarian, ARL Library

Rachel: Former Practitioner, ARL Library; research interest in “Use” on a practical and theoretical level

Use…USAGE: measurement of e-resource usage

The Concept of Electronic Resource Usage and Libraries (Library Technology Reports, Aug./Sept. 2010)1

Page 3: Eval_Elec_Resources.pptx

COLLECTION PRACTICES REDUX

OLD: Supply-side
• Use is not primary
• Print-based
• Inputs only led to large institutions held hostage by rankings (e.g., ARL)
• Growth rate not sustainable

NEW: Demand-driven
• Information is widely, cheaply available
• Patron demand and use analysis drive collection decisions
• Assessment culture

21st c. Library: from custodians of scholarship to enabling a digital environment for scholarship

Page 4: Eval_Elec_Resources.pptx

DATA-DRIVEN DECISION-MAKING

• Evolution of “cost-effective”
• Use multiple variables to make big decisions
• Triangulate:
  1. Usage statistics and cost per use
  2. User feedback, even something as simple as Survey Monkey
  3. External measures of quality, where applicable (Eigenfactor or Impact Factors)
• Seek continuing education opportunities for statistical analysis
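The first leg of the triangulation above is simple arithmetic: cost per use is the annual subscription cost divided by a usage count (e.g., COUNTER-style full-text downloads). A minimal sketch, with all database names and figures invented for illustration:

```python
# Cost-per-use: annual subscription cost divided by a usage count
# (e.g., COUNTER-style full-text downloads). All figures are hypothetical.

def cost_per_use(annual_cost, uses):
    """Return cost per use, or None if there were no recorded uses."""
    if uses <= 0:
        return None  # avoid division by zero; flag for manual review
    return annual_cost / uses

# Hypothetical databases: (annual cost in dollars, downloads this year)
databases = {
    "Database X": (25000, 12500),
    "Database Y": (8000, 400),
    "Database Z": (15000, 0),  # no recorded use: investigate, don't just cancel
}

for name, (cost, uses) in sorted(databases.items()):
    cpu = cost_per_use(cost, uses)
    label = f"${cpu:.2f}/use" if cpu is not None else "no recorded use"
    print(f"{name}: {label}")
```

A low cost per use is only one signal; as the slide says, it should be read alongside user feedback and external quality measures before any cancellation decision.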

Page 5: Eval_Elec_Resources.pptx

AGENDA:
• Discuss approaches to understanding the use and value of e-resources
  - Overview of current concepts and practice related to e-resource usage measurement
  - Discussion of approaches to augmenting data
• Learn the basics of negotiation for librarians, including a BATNA (Best Alternative to a Negotiated Agreement)

Page 6: Eval_Elec_Resources.pptx

Question:

How do you currently measure the quality of the electronic resources you provide for your patrons?

USE?

Page 7: Eval_Elec_Resources.pptx

What is “use,” really?

AN EVENT?

SOMETHING THAT CAN BE MEASURED?

…WITH NUMBERS?

Why does it matter?

“The principle of usefulness says simply that libraries should collect what patrons use.”2

Page 8: Eval_Elec_Resources.pptx

TO MEASURE USE WE FOCUS ON Inputs…and Outputs

Inputs: number of patrons who enter the building; number of dollars invested in e-resources

Outputs: number of book circulations; number of electronic article downloads

Page 9: Eval_Elec_Resources.pptx

HOW VALUABLE IS DATABASE X TO UNDERGRADUATE STUDENTS AT OUR INSTITUTION?

Current Approach to Assessment:

Focus (ostensible): Undergraduate students’ use of a specific information resource

Presumption: Use is proof of usefulness/value

Tool(s): Usage reports (log-ons to and downloads from Database X); cost data

Data: Statistical; time-specific; some granularity (possible to identify some detail about individual sessions)

Focus of assessment (actual): Cost of Database X ÷ number of times Database X is logged on to by a segment of the user population

Enhanced understanding? Compare measures of access to those of other databases; measures for Database X at peer/aspirational institutions

Potential outcomes: Decision to keep/eliminate subscription based on perceived “performance” or importance of Database X to students

Page 10: Eval_Elec_Resources.pptx

“Among other changes, the Complete College Tennessee Act funds higher education based in part on success and outcomes, including higher rates of degree completion.”

So What? Why can’t we continue to assess things that way?

Page 11: Eval_Elec_Resources.pptx

Potential Drawbacks?

• Many instances of use are removed from the library, thus unobservable

• Statistical data about usage provides a sketch of when and where, and (in a more limited sense) what…

• …but no “why.”

USE IS FREQUENTLY ASSESSED IN ORDER TO GENERATE “OBJECTIVE” DATA FOR DECISION MAKING.

Page 12: Eval_Elec_Resources.pptx

Understanding USE Matters.

“Questions such as, ‘Who uses these resources?’ or ‘Are these huge outlays of funds justified in terms of use, or value derived from use?’ or ‘What difference do all of these resources make to students and faculty in universities?’ must be answered if university administrators, trustees, students, and faculty are expected to support ever-increasing levels of funding for the acquisition and development of these resources and services.”5

Page 13: Eval_Elec_Resources.pptx

Use is often treated as a PRIMITIVE CONCEPT in Library and Information Science:

an idea so fundamental to the theoretical framework as to be indefinable, even when presented as a phenomenon to be measured and quantified.

Page 14: Eval_Elec_Resources.pptx

“An obvious problem is that there is no clear definition of what comprises ‘use,’ nor is it likely that library science will soon develop one, for it is as elusive as the concept of information, with which it is confounded.”3

Page 15: Eval_Elec_Resources.pptx

“as the pendulum swings from physical library use to online use of libraries, we need to develop measurement and assessment methods to accurately portray how users are using the library”

“some of the basic ‘natural laws of library and information science’ may not apply as well or as consistently in the realm of electronic information discovery and use”4

Page 16: Eval_Elec_Resources.pptx

SO, IS USE A PRIMITIVE CONCEPT?

No. Use does not, in fact, have a singular conceptual meaning in the LIS domain and can signify many actions, processes, and events.

Page 17: Eval_Elec_Resources.pptx

THE USE TYPOLOGY: DIMENSIONS OF USE

I. Use as an Abstraction
  Ia. Use as a Facilitator
II. Use as an Implement
III. Use as a Process
IV. Use as a Transaction
  IVa. Use as a Connector

Page 18: Eval_Elec_Resources.pptx

“Of the 57,148 households [surveyed], 27,511 (48.1%) had a household member who used the public library in the past year.”6

Use as an Abstraction

• A GENERAL TERM FOR ALL TYPES OF LIBRARY/INFORMATION USE

• DISASSOCIATED FROM ANY SPECIFIC INSTANCE OF THE PHENOMENON

Page 19: Eval_Elec_Resources.pptx

USE AS A TRANSACTION

• Isolated instances of library or information use
• Can be recorded and quantified
• Removed from the user
  - Vendor-supplied data (COUNTER compliant or otherwise)
  - Transaction log analysis, including page view time measurement (are they really reading?)
  - Log-ons: what about database timeouts?

Page 20: Eval_Elec_Resources.pptx

“statistics provided by electronic book vendors…show that [our] community uses e-books quite heavily. The data do not show, however, how books are used. For instance, the available statistics show that a book has been accessed but do not differentiate between a one-second click on a title and a five-hour immersion in a book…the data also do not tell us why an electronic version of a book was used instead of the paper version.”8
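One partial remedy for the one-second-click problem in the quote above: where raw transaction logs with timestamps are available (not all vendors supply them), sessions can at least be bucketed by duration before counting. A sketch under stated assumptions; the log format and the duration thresholds are invented for illustration, not any vendor’s API:

```python
# Bucketing session durations from a hypothetical transaction log, to
# separate bounce-like accesses from sustained reading. Log format and
# thresholds are illustrative assumptions.

from datetime import datetime

def classify(seconds):
    """Label a session by duration; cutoffs are arbitrary local choices."""
    if seconds < 10:
        return "bounce"      # e.g., a one-second click on a title
    if seconds < 600:
        return "browse"
    return "sustained"       # e.g., a long immersion in a book

# Hypothetical log: (session start, session end) timestamps
sessions = [
    ("2010-03-01 09:00:00", "2010-03-01 09:00:02"),
    ("2010-03-01 10:15:00", "2010-03-01 10:22:30"),
    ("2010-03-01 13:00:00", "2010-03-01 17:45:00"),
]

fmt = "%Y-%m-%d %H:%M:%S"
counts = {"bounce": 0, "browse": 0, "sustained": 0}
for start, end in sessions:
    dur = (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds()
    counts[classify(dur)] += 1

print(counts)  # a raw "3 accesses" figure hides these distinctions
```

Even this only refines the “what”; it still cannot answer why an electronic version was chosen over print, which is where the qualitative methods discussed later come in.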

Page 21: Eval_Elec_Resources.pptx

UNDERSTANDING OF USE AS A PROCESS

[Diagram: Db A log-on; article download; visit to the reference desk]

Application of library/information resources, materials, and/or services…
• To complete a complex or multi-stage task
• To the solution of a problem

REMEDY?

“This study reveals that undergraduate students experience information use in a complex, multi-tiered way that needs to be addressed by higher educators when creating information literacy pedagogy.”7

Page 22: Eval_Elec_Resources.pptx

Abandon Statistics? No.

AUGMENT STATISTICAL ASSESSMENT WITH OTHER APPROACHES:
• REQUIRES MULTIPLE DATA COLLECTION METHODS
• REQUIRES “BIPARTISAN” SUPPORT, I.E., WORKING WITH PUBLIC SERVICES TO GAIN A FULLER UNDERSTANDING OF HOW AND WHY PATRONS USE THE RESOURCES THEY DO.

“USE AS PROCESS” = FUNDAMENTAL SHIFT IN APPROACH TO EVALUATION.

Page 23: Eval_Elec_Resources.pptx

HOW VALUABLE IS DATABASE X TO UNDERGRADUATE STUDENTS AT OUR INSTITUTION?

Current Approach to Assessment:

Focus (ostensible): Undergraduate students’ use of a specific information resource

Presumption: Use is proof of usefulness

Tool(s): Usage reports (log-ons to and downloads from Database X); cost data

Data: Statistical; time-specific; some granularity (possible to identify some detail about individual sessions)

Focus of assessment (actual): Cost of Database X ÷ number of times Database X is logged on to by a segment of the user population

Enhanced understanding? Compare measures of access to those of other databases; measures for Database X at peer/aspirational institutions

Potential outcomes: Decision to keep/eliminate subscription based on perceived “performance” or importance of Database X to students

Page 24: Eval_Elec_Resources.pptx

TOOLS
• Usage statistics, plus…
• Students’ own
  - Words: interviews, focus groups, surveys, research journals
  - Actions: observed behavior (incl. usability), citations in school work, improved understanding (after instruction in use of a specific resource)

DATA: QUESTIONS TO ANSWER

• Who are the students using this resource… or not?

• Why do they use this particular resource (e.g., JSTOR) …instead of another?

• When and Where do they use it?

• How do they use it? For what purposes?

• How do they feel about this resource and its role in their schoolwork?

Page 25: Eval_Elec_Resources.pptx

HOW VALUABLE IS DATABASE X TO UNDERGRADUATE STUDENTS AT OUR INSTITUTION?

Focus of Assessment: Undergraduate students’ use of a specific information resource

Presumption: It’s not possible to fully understand the importance of an information resource through observation alone.

Tool(s): Multi-method

Potential partnerships? Instruction librarians; undergraduate course instructors; student workers; graduate students

Data: Statistical (what and where); granular and individual; affective

Enhanced understanding? True value of particular resource or resource type in the learning process

Potential outcomes: Decision to keep/eliminate subscription based on enhanced picture of Database X’s “performance” or importance to students; generation of reportable data regarding resource usage outcomes

Page 26: Eval_Elec_Resources.pptx

CHALLENGES & STRATEGIES

Challenges:
• Lack of time for more sophisticated data collection
• Lack of research expertise

Strategies: partnerships
• Outside institution: similar institutions; consortial partners
• Within institution: academic departments; students; instructors
• Other entities: research groups; data collection units; student success programs

Page 27: Eval_Elec_Resources.pptx

THE LIB-VALUE PROJECT:

• Grant funded by IMLS, December 2009-2012. Principal Investigators: Carol Tenopir, UTK; Martha Kyrillidou, ARL; Paula Kaufman, UIUC

• Purpose: “…to study the value of academic libraries to students, faculty, policymakers, funders…” and Return on Investment (ROI) in academic libraries

• Comprehensive: models incorporate all inputs in the library system (faculty, staff, students, library resources) and determine how each influences the system; articulate all values of the library and areas of investment and return

Functional Areas: Teaching/Learning; Research; Social/Professional
Scholarly Endeavors: e-Science; Collaborative Scholarship; Institutional Repositories

Slide adapted from Carol Tenopir’s presentation, “Value, Outcomes, and Return on Investment of Academic Libraries (Lib-Value) (funded by IMLS),” at the January 2010 ARL Assessment Forum, Boston, MA.

Page 28: Eval_Elec_Resources.pptx

KEY QUESTIONS:
• Does the reputation of a university’s library influence
  - Enrollment?
  - Recruitment of faculty and students?
  - Material or financial donations?
• Do library resources and/or services play a role in
  - Student success?
  - Retention?

Page 29: Eval_Elec_Resources.pptx

CHALLENGES & STRATEGIES, PART II:

Challenge:
• Unsure of focus/type of research to conduct

Strategies: align with institutional priorities
• Regional accreditation
• Supporting student learning
• Facilitating a culture of assessment
• Engagement in the Scholarship of Teaching and Learning (SoTL)

Page 30: Eval_Elec_Resources.pptx

QUESTIONS:

• What are you doing at your library NOW to evaluate electronic resource purchases?

• What would you like to be doing?

• Why are you NOT doing it?

• What could you STOP doing in order to perform more granular and expansive data analysis?

Page 31: Eval_Elec_Resources.pptx

ACTIVITY:

• Think of an evaluative project at your institution
  - What is the objective of this project?
  - What is the question this project is designed to answer?
  - Which specific tools are being applied to answering the question?
  - What kind of data is being generated or gathered to answer this question?
  - Are there tools, approaches, or data that might be more suitable to the job? Which? Why?

Page 32: Eval_Elec_Resources.pptx

• Different evaluations for different products: one checklist cannot fit everything

• Determination of “cost-effective” = community needs analysis

• Discipline-specific, project-based = manageable

• Stewards of e-resources = using evidence rather than assumption to justify expenses

Page 33: Eval_Elec_Resources.pptx

SAMPLE WORKSHEET

Page 34: Eval_Elec_Resources.pptx

COMMUNITY NEEDS ANALYSIS, SIMPLIFIED
• Why do you want to conduct a community needs analysis for e-resources? (Goals & objectives)
• Assess current collection/services situation: what data do you need to conduct the assessment, and how will you do this (methods)?
• Who is your community? What data do you need to answer this question, and how will you do this (methods)?
• What does your community need? What data do you need to answer this question, and how will you answer it (methods)?

Page 35: Eval_Elec_Resources.pptx

PATRON-DRIVEN ACQUISITION

• Exploit vendor-provided resources: they can help with workflow (e.g., ebrary, EBL)

• Collection management becomes risk management
  - Maintaining the largest pool of titles possible?
  - Removing and adding titles based on demand?
  - Building rules for different publishers
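“Building rules for different publishers” can be made concrete as a small decision table: how many short-term loans to allow before a purchase triggers, and a price cap for unmediated buying. A minimal sketch; the publisher names, thresholds, and price caps are invented for illustration, not any PDA platform’s actual configuration:

```python
# Sketch of per-publisher patron-driven acquisition rules: how many
# short-term loans before triggering a purchase, and a price cap for
# auto-purchase. All names and numbers are hypothetical.

DEFAULT_RULE = {"loans_before_purchase": 3, "max_auto_purchase_price": 150.00}

RULES = {
    "Publisher A": {"loans_before_purchase": 2, "max_auto_purchase_price": 200.00},
    "Publisher B": {"loans_before_purchase": 5, "max_auto_purchase_price": 100.00},
}

def decide(publisher, prior_loans, list_price):
    """Return the next action for a newly requested title."""
    rule = RULES.get(publisher, DEFAULT_RULE)
    if prior_loans < rule["loans_before_purchase"]:
        return "short-term loan"
    if list_price <= rule["max_auto_purchase_price"]:
        return "auto-purchase"
    return "refer to selector"  # too expensive to buy unmediated

print(decide("Publisher A", 0, 180.00))  # short-term loan
print(decide("Publisher A", 2, 180.00))  # auto-purchase
print(decide("Publisher B", 5, 180.00))  # refer to selector
```

Encoding the rules this way makes the risk-management trade-offs explicit and reviewable, rather than buried in ad hoc per-title decisions.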

Page 36: Eval_Elec_Resources.pptx

NEGOTIATION AND E-RESOURCES

• In the current environment, negotiation is a given

• Licenses control risk and are written by lawyers
  - Who is at greatest risk for infringement? The library? The vendor/publisher/content provider? The user?

• Approach all with dispassion

• Licenses, like evaluation, should reflect local needs

Page 37: Eval_Elec_Resources.pptx

BATNA

• BATNA: Best alternative to a negotiated agreement – alternatives almost always exist

• Use it in lieu of a bottom line

• Fisher and Ury: BATNA as key to going into negotiation confidently

• Is free good enough for your user community, based on your community needs analysis, including evaluation of usage statistics?

Page 38: Eval_Elec_Resources.pptx

NEGOTIATION CHECKLIST(S)

• Conducted background research?
• Identified BATNA?
• Determined deal breakers?
• Differentiated less important items from true deal breakers?
• Communicated with appropriate personnel?
• Created a shared document that outlines internally agreed-upon definitions, etc., for:
  - Authorized users
  - Signatory authority
  - Jurisdiction

Page 39: Eval_Elec_Resources.pptx

LICENSING CHECKLIST(S)

Licensing experts* advise including:
• The name of the licensor and licensee
• The name of the person in your organization who has negotiating and signing authority for agreements
• A description of the content being licensed
• The duration of time for licensing the content

*Becky Albitz, Rick Anderson, Trisha Davis, Fiona Durrant, Ann Okerson, and more

Page 40: Eval_Elec_Resources.pptx

SAMPLE ITEMS TO INCLUDE

• Rights (should all be yes)
• Organization X’s responsibilities (should all be no)
• Vendor responsibilities (should all be yes)
• Unacceptable terms (and why, with applicable policies, state statutes, etc.)

Page 41: Eval_Elec_Resources.pptx

QUESTIONS?

Thank you for your time!

Page 42: Eval_Elec_Resources.pptx

1. Fleming-May, Rachel A., and Jill E. Grogg. 2010. The concept of electronic resource usage and libraries. Vol. 46, Library Technology Reports.

2. Swigger, Keith, and Adeline Wilkes. 1991. The use of citation data to evaluate serials subscriptions in an academic library. Serials Review 17 (2):41-46; 52.

3. Ibid.

4. Peters, Thomas A. 2002. What's the use? the value of e-resource usage statistics. New Library World 103 (1172/3):39-47.

5. Miller, Rush, and Sherrie Schmidt. 2002. E-Metrics: Measures for Electronic Resources. Serials: The Journal for the Serials Community 15 (1):19-25.

6. Sin, Sei-Ching Joanna, and Kyung-Sun Kim. 2008. Use and non-use of public libraries in the information age: A logistic regression analysis of household characteristics and library services variables. Library & Information Science Research 30 (3):207-215.

7. Maybee, Clarence. 2006. Undergraduate perceptions of information use: The basis for creating user-centered student information literacy instruction. The Journal of Academic Librarianship 32 (1):79-85.

8. Levine-Clark, Michael. 2006. Electronic Book Usage: A Survey at the University of Denver. portal: Libraries and the Academy 6 (3):285-299.

9. Luther, Judy. 2008. University investment in the library: What's the return? In Library Connect White Papers.

10. Tenopir, Carol. 2010. University Investment in the Library, Phase II: An International Study of the Library's Value to the Grants Process. In Library Connect White Papers.