
Assessment Committee April 26, 2016

9:30 a.m. to 10:30 a.m.

Sheraton Wall Centre Vancouver Gulf Islands BCD

Members:
Bob Fox, Chair (Louisville) 2015-2016
Austin Booth (Buffalo) 2015-2017
Colleen Cook (McGill) 2015-2016
Bella Gerlich (Texas Tech) 2015-2017
Arnold Hirshon (Case Western) 2015-2016
Artemis Kirk (Georgetown) 2015-2016
Jennifer Paustenbaugh, Vice Chair (BYU) 2015-2018
Mary-Jo Romaniuk (Manitoba) 2015-2017
Betsy Wilson (Washington) 2015-2016

Staff Liaison: Sue Baughman (ARL)

Agenda

09:30 a.m. Welcome and Introductions – Bob Fox, Chair
Review of Agenda and Documentation
The committee will address several topics in order to provide feedback to ARL staff on upcoming activities. Two documents will guide the discussion; as a result, a work plan for the remainder of 2016 will be developed for the committee’s review. Documents:

§ Assessment Committee (AC) Activities
§ ARL Assessment Team* Priorities

09:35 a.m. Validation of Current and Ongoing Activities (Section 1) – AC Activities
The committee is asked to validate the four activities listed on page 1.

09:40 a.m. Activities for Committee Discussion (Section 2, pages 2 and 3) – AC Activities
The committee is asked for its input and feedback on three topics:

1. Develop the cohort of assessment professionals. What are the Committee’s goals for developing this community? What strategies could be employed to do this? The Library Assessment Conference Steering Committee recommends that a two- to three-hour afternoon session be held on November 2 for ARL libraries’ assessment staff and survey coordinators. What topics should be included if we proceed with this meeting?


2. Contribute to the creation of SPEC Kit topics for the 2017 proposal process. Each spring a list of possible topics is identified to be included in the call for proposals. The topics generally fall within ARL’s strategic priorities.

In 2017 four proposals will be accepted. The list of 2016 topics included:

a. Coordinated stewardship of collective collections
b. Workforce implications of new services
c. Inclusive design initiatives of either spaces or services, involving participation of diverse, underrepresented communities
d. Creating and maintaining inclusive climates through staffing, collections, and programming
e. Non-traditional expertise among library professionals
f. Prevalence of disciplinary expertise among staff
g. Acquiring and managing new forms of scholarship
h. Data management
i. Innovations in engaging users with collections
j. Strategies for articulating values and impact of library services

Are these topics still relevant for 2017, and what other topics should be added?

3. Consider next steps and timing involved with developing outcomes-based strategic metrics and dashboards/Balanced Scorecard initiatives. What should be included to be useful to ARL members? The ARL Assessment Team will conduct research on data warehouse protocols in 2016.

10:20 a.m. ARL Assessment Team* Priorities
The team has listed nine items that it believes will improve communications to ARL directors and their staff and enhance users’ experiences with surveys and services. Does the committee have feedback or questions on any of the items listed?

10:27 a.m. Next Steps

* Members of the ARL Assessment Team: Sue Baughman, Henry Gross, Shaneka Morris, Angela Pappalardo, Gary Roebuck, and Amy Yeager.

Other Documents:
1. Assessment Committee Final Report (November 18, 2015)
2. Assessment Program Report submitted by the ARL Assessment Team (February 22, 2016)


ARL Assessment Committee

Activities from the November 18, 2015 Report (created April 12, 2016)

1. Current and Ongoing Projects

1. A. Project or Activity: Study of ARL historical salary data by examining minority- and gender-based salary gaps. Administered a survey that examines the extent to which gender and minority salary differences can be explained by factors beyond gender and race. The survey asked questions regarding family size, family-related leaves, attitudes toward promotion, and years of experience, among other variables. A subsample of ARL institutions participated in this part of the study.

Status: Quinn Galbraith (Brigham Young), Visiting Program Officer, is completing his research project on salary equity issues. He presented a preliminary report to the Committee in March. The Committee identified other areas of interest for exploration.

Notes: The Committee suggested a program session for the Fall Association Meeting in September 2016. Stanley Wilder is planning to update his review of trends as part of this session.

1. B. Administer current facilities and special collections surveys - May 2016

1. C. Administer LibQUAL+ and ClimateQUAL on request through 2016.

1. D. Complete ARL’s commitment to Measuring Up, the IMLS-funded grant researching impact and metrics of institutional repositories.



2. Discussion Topics – Assessment Committee Meeting – April 26

2. A. Develop the cohort of assessment professionals. What are the Committee’s goals for developing this community? What strategies could be employed to do this?

Discussion Topic: The Library Assessment Conference Steering Committee recommends that a half-day session be held on November 3 for ARL libraries’ assessment staff and survey coordinators. What topics should be included if we proceed with this meeting?

2. B. Contribute to the creation of SPEC Kit topics for the 2017 proposal process.

Each spring a list of possible topics is identified to be included in the call for proposals. The topics generally fall within ARL’s strategic priorities. In 2017 four proposals will be accepted. Are these topics still relevant for 2017, and what other topics should be added? The list of 2016 topics included:

• Coordinated stewardship of collective collections
• Workforce implications of new services
• Inclusive design initiatives of either spaces or services, involving participation of diverse, underrepresented communities
• Creating and maintaining inclusive climates through staffing, collections, and programming
• Non-traditional expertise among library professionals


• Prevalence of disciplinary expertise among staff
• Acquiring and managing new forms of scholarship
• Data management
• Innovations in engaging users with collections
• Strategies for articulating values and impact of library services

2. C. Develop outcomes-based strategic metrics and dashboards/Balanced Scorecard initiatives. What should be included to be useful to ARL members?

The ARL Assessment Team will conduct research on data warehouse protocols in 2016.



3. Overall Program Review – Issues, Activities and Programs to Be Included and/or Considered

3. A. Deliver LibQUAL+, ClimateQUAL and MINES for Libraries.

Pending the review of the overall assessment program, if these protocols are recommended for continuation, they will need to be updated or re-grounded.

The ARL Assessment Team recommends putting any additional MINES implementations on hold until after the overall review of the assessment program. ARL staff are not trained to support this protocol except to run final reports, so it depends on outside consultants for setup activities with libraries.

3. B. Administer descriptive surveys: facilities, special collections, budget, salary, and statistics

3. C. Standardize survey and accreditation practices with partner organizations

3. D. Review ARL publications including SPEC Kits to determine ongoing viability, pricing models, etc.



4. Future Issues, Activities and Programs for 2017 and Beyond

4. A. Consider possible new ideas for measuring outcomes and value. Examples:
1) Linking student success to information-seeking attitudes and behaviors
2) Linking existing data sources and institutional outcomes
3) Identifying the value of libraries in student success

4. B. Rethink the StatsQUAL infrastructure and service delivery (streamlining infrastructure and technology updates)

4. C. Share outcomes-based strategic metrics and dashboards/Balanced Scorecard initiatives

The ARL Assessment Team will conduct research on data warehouse protocols in 2016.


Association of Research Libraries Assessment Program

ARL Assessment Team Priorities

April 18, 2016

The ARL Assessment Team reviewed the list of short- and long-term suggestions presented in its report dated February 22, 2016. Two lists of activities are presented. The first is a list of priorities for 2016 that the Assessment Team believes will improve communications to ARL members and their staff and enhance users’ experiences with surveys and services. Several other ARL teams will be important in successfully completing these tasks: Administrative, Communications, and IT. The second list includes tasks that are already underway or will not take a large amount of staff time and resources to complete. The Assessment Team suggests that the longer-term suggestions included in its report be fully considered as part of the overall review of the assessment program. Pending review of these priorities with the Assessment Committee, the ARL Assessment Team will create a detailed work plan with assignments for tasks and due dates.

Members of the Assessment Team are: Sue Baughman, Henry Gross, Shaneka Morris, Angela Pappalardo, Gary Roebuck, and Amy Yeager.

1. All Surveys and Services. Create an overall communication plan for the Assessment Program.

2. ARL Descriptive Surveys. Create an online, self-paced training module for Survey Coordinators that complements the one-on-one support. Create a page on ARL Statistics to house the help videos and training materials.

3. ARL Descriptive Surveys. Create a quarterly Assessment Program Newsletter/Press Release that includes information about all descriptive surveys for the coming year and communicates survey due dates, publication timelines, and the dates that the Assessment Program plans to send reminder/follow-up communications. Situate the newsletter for ease of discovery.

4. ARL Descriptive Surveys. Develop strategies to refresh the culture of assessment for ARL member libraries, stressing the importance of this data collection and the benefit of maintaining a collection cycle. Some initial ideas could include:

a. Add a segment to the New Directors Orientation program.
b. Post a message from the Chair of the Assessment Committee to the membership at the launch of the data collection cycle.
c. Add articles in the quarterly newsletter regarding the importance of data collection, how data is used, best practices in data use, etc.
d. Modify the data collection cycle so that all surveys are due in October of each year. (This would also mean that institutions whose fiscal year ends on September 30 would have a more realistic deadline.)


5. ClimateQUAL. Develop or update a series of tools in support of the protocol to reduce the amount of time needed to orient new libraries to the survey. Create an online, self-paced training module for survey coordinators. Revise the ClimateQUAL procedures manual. Update the ClimateQUAL website.

6. LibQUAL+. Create a SMART Sheet (Google planning tool) for all due dates and work processes.

7. LibQUAL+. Create a dashboard for subscribers to improve the user experience.

8. MINES for Libraries. Put a moratorium on adding any new implementations of this tool until after the overall review of the assessment program. Complete this protocol for the two institutions currently engaging with it.

9. StatsQUAL Infrastructure. Review the need to update the technology and conduct research on possible platforms.

Internal Activities for the Assessment Team in 2016

1. ARL Assessment Blogs. Move the ClimateQUAL and general assessment blogs to the ARL main website. Refresh the main blog sites as part of the overall brand refresh.

2. ClimateQUAL. Develop stored-data procedures and a quality-control report to enable more than one Assessment Team member to create reports.

3. ClimateQUAL. Enumerate the features that are needed to more effectively support internal creation of custom reports.

4. Measuring Up Grant. Complete ARL’s tasks for the grant.

5. Research to Support Future Discussions. Conduct preliminary research in the following areas:
a. Ways to bring the different sources of data together (creating a warehouse).
b. State of the art and platforms to support the building of customized surveys.

6. ARL and StatsQUAL Web Pages. Identify information regarding statistics and assessment on the main ARL webpages and the StatsQUAL webpages that should be reviewed and updated during the overall ARL brand work.


Assessment Committee Final Report
November 18, 2015

Charge of Committee

The Assessment Committee (AC) is a standing committee that develops and oversees ARL’s role in describing the contributions of libraries to research, teaching, learning, and community service as captured through outcomes assessment. This program develops new analytics that are responsive to the changing roles and needs of ARL members while also maintaining longitudinal and comparative peer data. A primary goal is to empower agile and dynamic decision-making by members through the mining and use of timely and relevant data in new ways that will enable integration with other data used by parent institutions and professional organizations. Key concerns include demonstrating the value and cost effectiveness of library resources and services in ways that promote alignment with institutional outcomes.

The AC connects with other statistical and assessment entities that generate university rankings, provide benchmarks, identify resource strengths and weaknesses, monitor organizational performance, and measure productivity. Similarly, the AC monitors these academic indicators to recognize potential changes in library direction and involvement (for instance, by integrating learning and research analytics tools to develop improvements in existing services or the development of new services). The committee also supports ARL’s internal assessment and evaluation processes.

Assessment is not the exclusive domain of the Assessment Committee. It is the intent of the ARL Design process that all Design Teams engage in assessment.

Framework and Context for the Committee

1. Describe the Breadth and Focus of the Committee

● Engage in assessment activities and developments that have strategic importance to member libraries by enabling innovation through the testing and experimentation of new methods, and generating reliable and valid evidence through well-tested approaches.

● Emphasize approaches that move libraries from simply articulating to demonstrating and predicting how libraries impact research, teaching, learning and community service, and how libraries are transforming from being information repositories to sites of knowledge production.

● Support library directors by providing data that enables them to grasp quickly the key strengths of their parent institutions and libraries, including providing indicators and data about human resources and organizational development.

● Promote multiple methods and approaches that articulate the value of research libraries in helping parent institutions achieve desired outcomes.

● Work with partner organizations to streamline and standardize statistical surveys and to help define and understand accreditation requirements related to libraries.


● Continue to collect strategic longitudinal data that demonstrate the enduring investments and values of research institutions and research libraries.

● Engage library assessment staff with expertise in data collection, development of analytical interfaces, quantitative and qualitative research methods, survey research and data management, the presentation of research results, and marketing and communicating the value of libraries in compelling and engaging ways.

● Through liaisons, the AC also intersects with and supports the work of other ARL committees, particularly the committees on Policy and Advocacy, Diversity and Inclusion, and Member Engagement and Outreach. Similarly, it supports the SoA initiatives as they work to develop new methods to evaluate the success of Collective Collections, Scholarly Dissemination Engine, Libraries that Learn, Innovation Lab, and the ARL Academy.

2. Current Programs, Projects and Activities of the Association That Fit Within the Scope of the Committee

Membership funded -- Historical and foundational data on investments and human resources:

● ARL Statistics (longitudinal trends)
  o Outcome: describe ARL member libraries
● ARL Annual Salary Survey
  o Outcome: describe salaries and demographics of the professional workforce in ARL libraries

Cost recovery -- Established ‘new measures’ ARL protocols:

● LibQUAL+ (statistically significant trends); can be linked to student and faculty performance if the library implements the confidential protocol
  o Outcome: capture library service quality perceptions on information access, service, and library as place
● ClimateQUAL
  o Outcome: implement and improve your organizational climate and diversity assessment
● MINES for Libraries; can be linked to student and faculty performance if the technology infrastructure is set up appropriately
  o Outcome: measure the impact of networked electronic services

Grant-funded -- Testing and experimentation:

● LibValue Toolkit; making available multifaceted outcomes assessment protocols
  o Outcomes: learning, scholarly reading, information commons impact, digitized collections impact
● Measuring Up – current IMLS grant
  o Outcome: understanding impact metrics for institutional repositories

Convene experts and methodologists as well as other organizations that are engaged in work similar to ours in higher education, including continuing engagement with NCES/IPEDS, ACRL, CARL, etc.


Build community through shared approaches and perspectives, such as the Library Assessment Conference and related training like the Strategic Assessment workshops. Continue the current partnership with the University of Washington to support the Library Assessment Conference and engage key strategic alliances for future sustainability.

3. Priorities: Based on committee conversations and feedback from ARL directors, we anticipate maintaining the mix of programs indicated above, but note that this is always subject to ongoing review for relevancy. Going forward, we expect to see a growing emphasis in both committee and ARL staff work devoted to assessing outcomes important to institutional scorecard priorities: university-level indicators such as increasing research productivity/awareness and improving student success (for example, the percent going to graduate school), retention, and graduation rates, as well as organizational indicators such as salary equity, workplace climate, and updating learning spaces. For some of these outcome areas we need to increase awareness of existing products and services, and for other areas we need to develop new approaches and protocols. It may be useful to think of a hierarchy of needs from the immediate to the longer term, along the lines depicted by the Global Libraries Initiative Impact Planning and Assessment Roadmap and/or along the lines of the Value Scorecard models.

Below are some example projects grouped into three categories: current activities underway, projects that are considered for transformation, and new projects.

Examples of existing projects underway:

● Supporting VPO researching salary equity issues - Dec 2016
● Capturing investment on facilities and special collections with new survey tools - May 2016
● Delivering LibQUAL+, ClimateQUAL and MINES for Libraries to libraries that engage - 2016
● IMLS-funded grant researching impact and metrics of institutional repositories as well as engaging VPOs - Nov 2018
● Developing the cohort of assessment professionals (community of practice) and new tools through the Library Assessment Conference (October 2016 and planning 2018) and Strategic Assessment workshops as needed

Examples of projects that we are considering for transformation:

● Capture the effects of the links between student success and information-seeking attitudes and behaviors (LibQUAL+ Confidential), and reground LibQUAL+ (two to five years)

● Rethink the StatsQUAL infrastructure and service delivery (one to three years, possibly as part of the ARL rebranding)

● Transform SPEC Kits to provide information where we collectively decide we need more information (planning new approach for 2017)

● Share outcomes-based strategic metrics and dashboards/Balanced Scorecard initiatives (sharing twice a year in 2016-2018)


● Standardize survey and accreditation practices with partner organizations (establish partnership)

Examples of new ideas for capturing outcomes:

● How do research analytics in the disciplines compare across different institutions? Capture and import from existing data sources and link them to institutional outcomes for peer comparisons across different segments (large/small, private/public, etc.); work with members in defining useful segments among ARL member libraries for creating dashboards (2016 or 2017 or 2018 or not now)

● Assessing the state of outcomes assessment: in collaboration with a VPO, capture the effectiveness of outcomes assessment in ARL libraries (this can take place through a coordinated effort of focus groups across interested ARL libraries) (2016 or 2017 or 2018 or not now)

● What is the value of libraries in student success? In collaboration with a VPO, capture efforts in ARL libraries that relate libraries to student success measures (2016 or 2017 or 2018 or not now)

Due to the number of potential new projects, it will be essential for the committee to prioritize its efforts. A way of mapping activities and articulating responsibilities for ARL staff, committee members, assessment professionals, and other organizations would be useful. There is a need for the AC to articulate how we can support the other design team projects while we also leave space for the AC to decide what the key assessment priorities and projects are. If there are research directions important for the membership to engage, the AC can bring these forward.

As a way of encouraging new ways of thinking about assessment, a number of questions were captured by Elliott Shore during his listening tours with ARL member libraries and their staff:

• What tools can we use to tackle today’s problems?
• How can we measure what scholars do (scholarly productivity)?
• Are there predictive models relevant to our work?
• How do we present dynamic information?
• How can we rely more on trend analysis?
• Who are the leaders in this? How are they doing it? Who’s experimenting with the best stuff right now?

As the Association charts new priorities, it would be useful to reflect on how these questions can inform the work of the Committee. The chair of the Assessment Committee (Bob Fox), the vice-chair (Jennifer Paustenbaugh), and ARL staff (Martha Kyrillidou) also had the benefit of interacting with the Coordinating Committee before (Sept 23) and after the ARL membership meeting where priorities for projects were defined. As articulated in the reports of the other enabling capacity committees and design teams, we see assessment being an integral part of all of these components. In particular, we see the potential of developing a strong thread in research data management, where many of the issues are parallel to those we face in assessment, with a slightly different scale and focus. Furthermore, assessment has a strong community of practice and can engage the talents of assessment staff from member institutions to assist with key priorities of the association. Already, expressions of interest in serving as Visiting Program Officers have been articulated by a number of institutions, and the members of the Assessment Committee stand ready to engage their staff in service of the association’s priorities.

Assessment Committee
Robert Fox, chair (Louisville)
Jennifer Paustenbaugh, vice-chair (BYU)
Austin Booth (Buffalo)
Colleen Cook (McGill)
Bella Gerlich (Texas Tech)
Arnold Hirshon (Case Western)
Artemis Kirk (Georgetown)
Mary-Jo Romaniuk (Manitoba)
Betsy Wilson (Washington)
Staff Liaison: Sue Baughman (ARL)


Assessment Program

Executive Summary February 22, 2016

This report was prepared by the ARL Assessment Team to provide the status of each of the components of the current assessment portfolio. Our goal is to assist the Assessment Committee with its planning and prioritization of activities and projects, to inform the new program director of assessment, and to effectively plan the operational elements for the next 3 to 6 months. Each section of this report describes the trajectory and suggests future directions for that part of the work. It begins with a review of the technical infrastructure, a key part of a successful assessment program.

General observations drawn from the report:

● The portfolio of services is rich in variety but contains many discrete components that make it difficult for libraries to conduct a comprehensive scan of the data from all of the surveys they utilize.

● The signature LibQUAL+ program needs a thorough review in order to regain its market share both within and beyond ARL libraries.

● The StatsQUAL technical infrastructure is out of date and is hampering efforts to provide quality service to users of ARL’s assessment products.

● The way in which the programs have been organized has put an undue burden on staff to customize, modify, explain, and implement ARL’s current suite of assessment tools.

● Critical next steps include incorporating a communications and marketing plan; a capacity-focused needs assessment and process analysis; strategic planning and budgeting; and systematized development, testing, and incorporation of user feedback into all survey programs.

The ARL Board recently approved the recommendation from the Financial Strategies Task Force to conduct a review of the assessment program. Such a review will ensure the continued success of programs that should be ongoing and, more importantly, facilitate discussions about a future framework for the services and programs that should be provided by ARL. The Assessment Committee’s report to the Board provides a foundation for beginning this discussion. The members of the ARL Assessment Team stand ready to support this effort and the Assessment Committee. Some of the operational challenges and suggestions provided in this report could be addressed with minimal financial resources and focused staff time. Implementation of a number of the suggestions would depend on the outcome of the program review.

Respectfully submitted,
Elliott Shore, Executive Director
Assessment Team: Sue Baughman, Henry Gross, Shaneka Morris, Angela Pappalardo, Gary Roebuck, Amy Yeager


Technical Overview

The current StatsQUAL platform was written in-house (initially with contractors) using Microsoft’s ASP.NET and SQL Server and is hosted on dedicated servers at Rackspace (https://www.rackspace.com). There is a single install of the StatsQUAL software that powers all of the survey instruments. The StatsQUAL data management system and its associated websites are separate from the corporate ARL site. Each survey or suite of surveys has a separate user interface and login page on its own domain. The StatsQUAL software inspects the URL being visited to determine which of these instruments to access. Account holders have a single account that is activated for the surveys their library permits them to access. While this process creates some problems, such as when a bug introduced in a feature for one instrument affects all the others at the same time, it also means that fixing a bug in one instrument fixes it immediately in all of them. It also greatly simplifies development: there is no need to maintain separate branches (versions) of code for each instrument, and installing a new version only has to be done once. Rarely has a feature been delayed due to a bug that it caused in another instrument.

In addition to the StatsQUAL survey platform, content sections have been created on this website for news, events, publications, and general informational pages. These pages duplicate information found on ARL’s main website, while some content can only be found on these instrument-specific pages. This creates a sub-optimal experience for users and additional work for the Assessment Team. The Assessment Team recommends merging all of the content on these programmatic sites into the main ARL website and removing it from the StatsQUAL platform, with the domains redirecting to the appropriate pages on arl.org.
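To make the single-install design concrete, the sketch below illustrates in Python (not the platform’s actual ASP.NET code) how one application instance can select a survey instrument from the hostname of an incoming request. All domain names except libqual.arl.org, which the report mentions as an example subdomain, are hypothetical placeholders, as is the `resolve_instrument` function itself.

```python
# Hypothetical sketch of host-based instrument routing, as described
# above: one install, many instruments, chosen by the visited domain.

# Map each instrument's domain (or proposed arl.org subdomain) to an
# instrument name. These entries are illustrative, not ARL's config.
INSTRUMENTS = {
    "www.libqual.example.org": "LibQUAL+",
    "www.climatequal.example.org": "ClimateQUAL",
    "libqual.arl.org": "LibQUAL+",  # subdomain layout proposed in the report
}

def resolve_instrument(host: str) -> str:
    """Return the instrument that should handle a request for `host`."""
    # Normalize the Host header before lookup (case, optional port).
    host = host.lower().split(":")[0]
    try:
        return INSTRUMENTS[host]
    except KeyError:
        raise ValueError(f"No instrument configured for host {host!r}")
```

Because every domain resolves into the same lookup table, a fix to `resolve_instrument` (or any shared code behind it) immediately applies to all instruments, which is the trade-off the report describes: shared bugs, but also shared fixes and a single deployment.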
With this change, what would remain on the StatsQUAL platform are data collection, extraction, and analysis for LibQUAL+, ARL Statistics, and ClimateQUAL, moving them from their own domains to subdomains under arl.org (e.g., libqual.arl.org). The content remaining on the StatsQUAL website and all of the information transferred to the arl.org site need to be thoroughly reviewed and updated. MINES for Libraries does not actually collect any data through this domain due to the nature of the instrument. The Assessment Team also suggests that existing commercial products be considered, or that an RFP be issued to find a partner to manage the technical side of the assessment program, giving the team more time to focus on the content of the surveys. The team believes that the scope of StatsQUAL can be narrowed without eliminating the flexibility it offers, providing a reliable platform that meets user needs.

LibValue - Technical Issues

There was interest in trying to operationalize some of the tools developed as part of the LibValue grant. However, adequate resources were never secured to accomplish this. The site is purely informational, with the exception of a JavaScript-based ROI calculator, which could be included on another site in a manner similar to the Strategic Thinking and Design Interactive Strategic Plan Database, and become an area for people to submit and share their research. If this idea were to proceed, finding a different mechanism to accomplish it is recommended. It is also recommended that this site be maintained as informational and all content merged into the main ARL website, similar to the other information-only StatsQUAL websites. Finally, the University of Tennessee, Knoxville is currently hosting a bibliographic database that ARL has agreed to take over at some point; when that time comes, an off-the-shelf bibliographic platform could support it.

Assessment Blogs

ARL hosts two WordPress blogs, one for ClimateQUAL participants and one for general assessment information. These blogs should be merged into ARL’s main website if possible, or at least added to ARL’s other WordPress sites if they must remain separate.

Subscriptions

In 2010, ARL decided to monetize some of the data in ARL Statistics and LibQUAL+ in a subscription model. How this works differs between the two instruments, but the model was expanded beyond what was initially intended, and resources were not given to ensure a positive user experience. As a result, there is a lot of confusion about the model, which has produced a very poor experience for users as well as staff. The recommendation is to remove the subscription models from these sites; if there is a desire to keep some of the data behind a paywall, then solutions for distributing it other than through the StatsQUAL platform should be identified.


ARL Descriptive Surveys

The ARL Descriptive Surveys are a series of assessment efforts that describe collections, expenditures, staffing, staff salaries, budget projections, sources of funds expended, library facilities, and special collections in ARL member libraries through, respectively, the ARL Statistics [1], the ARL Annual Salary Survey, the annual Budget Survey, the Source of Funds Survey, and the Facilities Inventory. The Association began collecting and publishing annual ARL Statistics data in 1961-62. ARL Salary Survey statistics have been collected and published annually since 1972-73. ARL has collected Budget Survey data since 2008-09. The Source of Funds Survey data collection began in 2009-10, and the Facilities Inventory is the newest data collection effort, with its first survey cycle in 2015. Currently, the ARL Statistics and the Salary Survey data are collected annually with 100% participation from all member libraries. Member libraries and non-member organizations [2] heavily use the data from these two surveys for benchmarking and research purposes. The participation rate for the Budget Survey has increased steadily since the 2011-12 survey, and member libraries are keen to obtain benchmarkable data about student library fees (Source of Funds Survey) and renovation expenditures (Facilities Inventory) in ARL libraries. Unlike the annual ARL Statistics and Salary Surveys, the Source of Funds Survey is collected in odd-numbered years, and Facilities Inventory data are collected every three years. The operational challenges associated with coordinating these data collection efforts include:

1. Communication - Currently, announcing open surveys, answering questions about definitions and deadlines, and training new Survey Coordinators all occur via email through the [email protected] inbox. In the past, each descriptive survey was announced separately via email, resulting in up to five survey announcements annually. In years where Survey Coordinator turnover is high and there is much confusion about survey deadlines and definitions, managing these communiqués takes up the bulk of staff time and attention and threatens to delay the data verification process.

2. Planning - The ARL Statistics data are used to calculate the ARL Investment Index, commonly referred to as the annual ranking. The Assessment Program routinely granted extensions to libraries that needed more time to submit the ARL Statistics data. The Investment Index cannot be calculated and the ARL Statistics publication cannot be produced until all data are received, cleaned, and verified. This usually means that the release dates for the annual ranking and the publication vary from year to year based on the duration of the extensions, making it difficult to plan the publication timeline in advance and meet members’ information needs.

[1] The ARL Statistics survey collects data about special collections as part of the survey cycle.
[2] Non-member organizations pay a subscription fee to access the ARL Statistics data set. Access to the ARL Salary Survey is not included in the subscription.


3. Data analytics - As more surveys have been added, easy-to-use ways to extract the newer survey data have not kept pace. Currently, only the institution-level [3] ARL Statistics data are available in Analytics; however, repeated requests are received to make the historical Law and Health Science Library data available there as well.

4. Managing Subscriptions - ARL offers a number of subscription options related to the ARL Statistics and the Salary Survey. In 2010, ARL decided to monetize the previously open-access ARL Statistics data and began offering subscriptions to ARL Statistics Analytics, which provides access to the ARL Statistics data sets going back to 1908 and to the analytical interface for the data. Over the past five years, subscribers have become confused about the ARL Statistics Analytics subscription: some subscribe to Analytics thinking that the print publication is included (it is not), and some subscribe to the print when they really want access to the data through Analytics.

The suggestions to address these issues are:

Short Term, 2016-2017

1. Communication

a. Create a quarterly Assessment Program Newsletter/Press Release that includes all descriptive surveys for the coming year and communicates survey due dates, publication timelines, and the dates that the Assessment Program plans to send reminder/follow-up communications. The quarterly newsletters could be placed in the news section of the corporate ARL site and in the publications database on the ARL Statistics site. This would streamline communications about the descriptive surveys.

b. Create an online, self-paced training module for Survey Coordinators that complements the one-on-one support that the Assessment Program prides itself on. A page on the ARL Statistics site to house the help videos and training materials could be created. This would facilitate and enhance the training of new Survey Coordinators and allow staff to focus more energy on data preparation and data mining. Examples:

1. ARL Position Description Bank
2. ACRL Metrics

c. Currently, survey cycles are not publicized on the corporate ARL site or the ARL Statistics site. Repurposing some of the empty space on these two sites to publicize open surveys would provide another access point for information about the survey timelines.

[3] Institution-level data reflect the sum of the Main + Law + Health Science ARL Statistics surveys.


Longer Term, Beyond 2017

1. Communication - Assess the current communication plan for the descriptive surveys to ensure that publications and communications are properly branded and that the target audiences are reached.

2. Planning

a. Assess the suite of descriptive surveys and subscription products to see if current offerings meet the needs of the membership.

b. Assess the impact of generous extensions for data submissions and the efficiency of techniques used to clean the data and produce the publications. Automating as much of the data cleaning and production as possible is proposed so that the publication timelines can be consistent and reliable.

3. Data Analytics

a. Assess the feasibility of including data from all descriptive surveys in Analytics. This would be part of a larger site-wide effort and content strategy to improve the discoverability of desired information.

b. Conduct a cost/benefit analysis regarding future in-house enhancements to the current suite of analytical tools in Analytics. Perhaps utilizing a third party vendor would be a better way to maximize staff capacity and meet members’ needs.

4. Managing Subscriptions - Assess the ROI associated with managing and maintaining subscriptions to ARL Statistics Analytics.


ClimateQUAL

Current Program

ClimateQUAL, formerly called the Organizational Climate and Diversity Assessment, is an online survey that collects information about staff perceptions concerning their library's (a) commitment to the principles of diversity, (b) organizational policies and procedures, and (c) staff attitudes. The survey addresses a number of climate issues, such as diversity, teamwork, learning, and fairness, as well as current managerial practices and staff attitudes and beliefs. It contains questions designed to understand how organizational procedures and policies affect staff perception of service quality in a library setting. Respondents are also asked to answer questions about individual identity issues as well as their team or work unit. This program has five aims: (1) foster a culture of healthy organizational climate and diversity; (2) help libraries better understand staff perceptions of organizational climate and diversity; (3) facilitate the ongoing collection and interpretation of staff feedback; (4) identify best practices in managing organizational climate; and (5) enable libraries to interpret and act on data. Libraries use the ClimateQUAL data to improve their organizational climate and culture for delivering superior services to the communities they serve. Today, ClimateQUAL is a standardized, scalable, and empirically sound assessment tool for research libraries interested in improving their organizational climate and diversity culture. The cost to implement the protocol is $5,000. The ClimateQUAL Team oversees operational and technical aspects of the protocol and meets as needed to review survey efficacy, implementation guidelines, data reporting, and other issues. Team members are Sue Baughman, Henry Gross, Paul Hanges (University of Maryland), Charles Lowry (retired ARL Executive Director), Shaneka Morris, and Gary Roebuck.
2016

Four libraries, two ARL members and two non-ARL libraries, are implementing ClimateQUAL between January and March 2016. Four additional libraries are considering implementing the instrument in 2016. Sixty-six libraries administered the survey from 2007 through December 2015. Of this number, 10 libraries have administered the instrument more than once, which is an expectation of all libraries that participate in this protocol.

Historical Background

The ClimateQUAL protocol is a joint endeavor between ARL and the University of Maryland Industrial and Organizational Psychology (I/OP) program. The instrument was developed in 1999 by the University of Maryland Libraries in partnership with the I/OP program as a protocol for measuring the climate and culture of the University of Maryland Libraries and was called the Organizational Climate and Diversity Assessment (OCDA). In 2007, ARL and the UM Libraries, in partnership with the I/OP program, tested the generalizability of the OCDA protocol across multiple library organizations. During Phase I, five ARL institutions tested a modified OCDA survey and validated the hypothesis that a healthy organization provides better customer service. In 2008, during Phase II, ten ARL and non-ARL institutions expanded the pilot, further refining the protocol. The protocol was transferred to ARL in 2009 as an ongoing operation of its library assessment service and was renamed ClimateQUAL.

Challenge Points

The operational challenges associated with the ClimateQUAL protocol include:

1. The communication process requires a good deal of interaction with the library staff who are leading the administration of the survey. This orientation includes mapping the organizational structure to teams or departments and discussion of the anonymity and confidentiality of the survey.

2. The interface of ClimateQUAL requires a bit of staff involvement to create reports; it is not fully automated.

3. The data that libraries receive require some organization development experience to create improvement strategies. ARL staff (Baughman and Morris) review each report with the team leaders at the library conducting the survey to respond to questions. Due to lack of ARL staff time, additional support is not possible without creating a consulting engagement. Many libraries do not have the additional resources to contract with ARL or other consultants for additional support. Members of the ClimateQUAL team are publishing a book on ClimateQUAL (expected spring 2016 publication date), which includes a chapter on developing improvement strategies.

The suggestions to address these issues are:

Short Term, 2016-2017

1. Integrate the ClimateQUAL registration announcement into the annual/quarterly communications about the ARL Statistics and Salary Survey. A communication plan should be created to maximize awareness of this protocol and its value to libraries.

2. Develop or update a series of tools in support of the protocol to reduce the amount of time needed to orient new libraries to the survey:

a. Create an online, self-paced training module for survey coordinators. See ACRL Metrics as an example.
b. Revise the ClimateQUAL procedures manual.
c. Update the ClimateQUAL website.

3. Install Crystal Reports on multiple computers so that more than one Assessment Team member can create reports. This will reduce the reliance on the Director of Administration and Operations for reports and SPSS files.


Long Term, Beyond 2017

1. Using LibQUAL as a template, modify the administrative interface to create more efficient and easier access. Users would have:

a. Access to and tabulation of data from the survey dashboard for participating libraries and ARL Staff

b. Access to SPSS data files within the Data Repository for ARL Staff

2. Conduct a review with ClimateQUAL users of the types of data available to ensure they are most useful (e.g., how racial data are reported, types of reports available, etc.). This review will enable the ClimateQUAL Team to address multiple issues and complete a thorough refresh of the protocol.


LibQUAL

Current Program

LibQUAL is a customer satisfaction survey offered to the library community by the Association since 2000. It is delivered through a custom-built platform that allows libraries to configure and launch their surveys, access data and reports, compare their results over time, and benchmark against their peers. In keeping with the Association’s commitment to provide access to its content to people with disabilities, the survey questionnaire is compatible with screen readers, and a Voluntary Product Accessibility Template (VPAT) is available to help participating libraries comply with their institutions’ accessibility mandates. The survey questionnaire has a responsive design and can be completed on mobile devices.

Historical Background

Through 2015, there have been 2,877 institutional surveys implemented across 1,307 institutions in 29 countries, with 18 language translations and over 2.3 million respondents. Hundreds of articles and conference papers have been published about the protocol. Early growth in the number of participating institutions leveled off in the mid-2000s, and participation has declined since 2007, both among ARL member institutions and overall.

Challenge Points

The implementation of new features annually since 2010 has improved the protocol for participating libraries but has led to the accrual of technical debt that is beginning to affect the reliability of the system. As a result of these challenges, ARL staff have instituted a moratorium on custom survey implementations and new feature development until the technical infrastructure is stabilized. A more fundamental challenge is the need to update the questionnaire to reflect changes that have taken place in research libraries since the original grounding research was conducted in the 1990s, particularly in the Library as Place and Information Control dimensions of the survey.
Two comments from libraries participating in the 2013 LibQUAL Canada consortium are representative of the types of feedback received about the survey:

- We have participated in 2010 and 2013. All the automated features of LibQUAL are great, as is the ability to compare between years. However, LibQUAL itself is less and less appealing. Some of our users are confused and even alienated by the questions and it does not serve as a useful promotional vehicle for us, often the point of surveys.

- We received feedback that the LibQUAL instrument is confusing, looks out-of-date, and some of our faculty refused to answer it because they took issue with some of the questions. It may be due for a redesign. That said, it has provided us with some data to show how we’ve improved over time.

The suggestions to address these issues are:


Short Term, 2016-2017

1. Refresh the website and user documentation materials.

2. Continue to improve usability of the platform and questionnaire.

3. Investigate and implement a report-writing application that produces documents meeting WCAG 2.0 accessibility requirements.

4. Evaluate and possibly eliminate the membership subscription service.

Longer Term, Beyond 2017

1. Engage interested ARL member libraries and other stakeholders in research on regrounding the survey.

2. Focus on future improvements that benefit ARL member libraries and strengthen ARL’s leadership in assessment.

3. Institute a development process that incorporates needs assessment, planning and budgeting, development, testing, user feedback, and communications and marketing.

The biggest challenge on the technology side is maintenance of an aging system. The repercussions of keeping everything in a “live” system, including slowdowns due to the size of the database, are now fully felt. Accommodations for legacy data have been necessary when updating the system. Some steps to address this have been taken with the creation of an analytical data store, but that is only a first step; further improvements could be made to how the actual survey and response data are stored.
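The analytical-data-store idea mentioned above can be illustrated with a minimal sketch. The schema, table names, and figures are invented for the example, and SQLite stands in for the production SQL Server:

```python
import sqlite3

# Minimal sketch (invented schema; SQLite standing in for SQL Server) of an
# analytical data store: aggregate closed-survey responses out of the "live"
# transactional database into a separate store, so that large historical data
# does not slow down the production system.

live = sqlite3.connect(":memory:")
live.execute("CREATE TABLE responses (survey_id INT, year INT, score REAL)")
live.executemany(
    "INSERT INTO responses VALUES (?, ?, ?)",
    [(1, 2014, 7.25), (1, 2014, 6.75), (1, 2015, 7.5)],
)

analytics = sqlite3.connect(":memory:")
analytics.execute(
    "CREATE TABLE yearly_means (survey_id INT, year INT, mean_score REAL)")

# The ETL step: aggregate once in the live store, load into the analytical one.
rows = live.execute(
    "SELECT survey_id, year, AVG(score) FROM responses GROUP BY survey_id, year"
).fetchall()
analytics.executemany("INSERT INTO yearly_means VALUES (?, ?, ?)", rows)

print(analytics.execute(
    "SELECT year, mean_score FROM yearly_means ORDER BY year").fetchall())
# [(2014, 7.0), (2015, 7.5)]
```

Queries for reporting then run against the analytical store, leaving the live database to handle only active survey sessions.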


LibQUAL+ Participation 2000-2015

This chart illustrates the longitudinal use of the protocol by all participating institutions and by ARL member institutions.


ARL Member Participation as a Percentage of Total LibQUAL+ Participation, by Year


ARL Members and Frequency of LQ Survey Use (2000-2015)*

Overall, 125 organizational units within ARL member institutions have used LibQUAL+ at least once. Of these, 20.8% have used LibQUAL+ only one time and 48.8% have used LibQUAL+ three or fewer times, while 5.6% have participated ten or more times.

* This count represents discrete organizational units (e.g. main library, law library, etc.). Thus, some members may be represented in multiple rows (e.g. an institution’s main library has used LQ once but its law library has used it three times).
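The percentages above correspond to whole numbers of the 125 organizational units; a quick arithmetic check:

```python
# Quick check that the reported percentages correspond to whole numbers of
# the 125 organizational units (figures taken from the paragraph above).
total_units = 125
for label, pct in [("used LibQUAL+ only once", 0.208),
                   ("used it three or fewer times", 0.488),
                   ("participated ten or more times", 0.056)]:
    print(f"{label}: {round(pct * total_units)} of {total_units}")
# used LibQUAL+ only once: 26 of 125
# used it three or fewer times: 61 of 125
# participated ten or more times: 7 of 125
```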


LibQUAL+ Revenue and Expenses 2000-2015

Year    Revenue    Expenses   Surplus (Deficit)   Notes
2000    24,000     1,541      22,459
2001    84,000     40,612     43,388
2002    541,210    256,623    284,587
2003    687,157    288,025    399,132
2004    545,136    460,419    84,717
2005    380,247    594,421    (214,174)           Revenue was below budgeted expectations
2006    850,198    732,606    117,592
2007    912,023    1,175,693  (263,670)           Developer contractor costs for transition from ColdFusion to .NET platform
2008    530,807    956,496    (425,689)           Developer contractor costs for transition from ColdFusion to .NET platform
2009    691,166    570,558    120,608
2010    731,450    379,349    325,101
2011    704,092    529,177    174,915
2012    614,080    325,542    288,538
2013    840,236    477,530    362,706
2014    502,695    406,852    95,843
2015*   439,340    265,033    174,307
Totals  9,077,837  7,460,477  1,617,360

*2015 figures are unaudited
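As a consistency check, the totals row can be reproduced from the yearly figures:

```python
# Reproducing the totals row from the yearly figures in the table above
# (amounts in US dollars; 2015 figures unaudited).
rows = [  # (year, revenue, expenses)
    (2000, 24_000, 1_541), (2001, 84_000, 40_612),
    (2002, 541_210, 256_623), (2003, 687_157, 288_025),
    (2004, 545_136, 460_419), (2005, 380_247, 594_421),
    (2006, 850_198, 732_606), (2007, 912_023, 1_175_693),
    (2008, 530_807, 956_496), (2009, 691_166, 570_558),
    (2010, 731_450, 379_349), (2011, 704_092, 529_177),
    (2012, 614_080, 325_542), (2013, 840_236, 477_530),
    (2014, 502_695, 406_852), (2015, 439_340, 265_033),
]
revenue = sum(r for _, r, _ in rows)
expenses = sum(e for _, _, e in rows)
print(revenue, expenses, revenue - expenses)  # 9077837 7460477 1617360
```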


MINES for Libraries

Historical Background

MINES for Libraries® was developed by Brinley Franklin, former Vice Provost for University Libraries, University of Connecticut, and Terry Plum, former Assistant Dean for Technology and Director, Simmons Graduate School of Library and Information Science at Mount Holyoke Program, as an online transaction-based survey to supplement a library cost analysis study that was originally developed in the 1980s. In May 2003, MINES was adopted as part of the Association of Research Libraries New Measures Program and is now one of ARL’s library assessment tools offered through StatsQUAL. MINES collects data not available through vendor-supplied usage statistics and can help libraries better understand the impact of their networked electronic resources. It can give insight into questions such as:

● Who is using the resources, in terms of status (e.g., undergraduate) or affiliation (e.g., School of Business).
● The locations from which networked services are being used (enabling librarians to plan user support services accordingly).
● The purpose and intent of use, permitting academic librarians to identify which category of electronic resources contributes most to their institution’s primary mission of instruction/education, funded research, patient care, public service, and other activities.

The survey is also valuable in the refinement of collection development and service decisions.

Implementation and Pricing

MINES gathers the data points cited above through the deployment of a short survey (3-5 questions) delivered at the point of use of an e-journal, database article, or digital collection or service. Users are queried only once per online session, while data are tracked to reflect the use of multiple resources. The MINES protocol recognizes and addresses the problem of non-respondents to Web surveys and represents a random sample, allowing institutions to make inferences about the population. Ideally, the survey is administered in real time over the course of a year in periodic two-hour time blocks using a random moments sampling plan; different time periods can be utilized. The participating library or consortium is responsible for implementing the survey’s technical infrastructure, so that a comprehensive sampling plan can survey all electronic service users, regardless of their point of entry (OPAC, library Web, etc.). Successful local assessment infrastructures have included IP-validating scripts, scripts generating links for databases and journals, and OpenURL technologies. Two of the more comprehensive assessment infrastructures involve placing the survey at the campus router or using EZproxy, a widely adopted rewriting proxy server. Any library running EZproxy can locally implement an application that presents the MINES survey to networked users as they initiate a session and captures networked services usage, both locally and remotely, during the sampled time periods.

During MINES engagements, ARL has historically provided:

● Advice and recommendations for setting up the library’s assessment infrastructure to increase the number of potential survey respondents (through the engagement of Terry Plum as a consultant).
● Advice on the structure and content of the survey (again in tandem with Terry Plum as a consultant).
● Validation and analysis of the data and preparation of a final report. At the conclusion of the survey period, the library or consortium receives an analysis that provides insights on the impact of networked electronic services by analyzing the use of digital resources and identifying the demographics and purpose of use.
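The random-moments sampling plan and once-per-session rule described above might be sketched as follows. The block count, seed, and function names are illustrative assumptions, not part of the MINES specification:

```python
import random

# Hedged sketch of a "random moments" sampling plan: pick a set of two-hour
# blocks at random from the survey year and present the survey only during
# those blocks, at most once per user session. The number of blocks sampled
# here (60) is an assumption for the example, not a MINES requirement.

HOURS_IN_YEAR = 365 * 24
BLOCK_HOURS = 2

def pick_sample_blocks(n_blocks: int, seed: int = 42) -> list[int]:
    """Return starting hours (0..8759) of n randomly chosen two-hour blocks."""
    rng = random.Random(seed)
    starts = range(0, HOURS_IN_YEAR, BLOCK_HOURS)  # non-overlapping blocks
    return sorted(rng.sample(list(starts), n_blocks))

def should_survey(hour: int, sampled: set, session_surveyed: bool) -> bool:
    """Survey only inside a sampled block and only once per session."""
    in_block = (hour - hour % BLOCK_HOURS) in sampled
    return in_block and not session_surveyed

blocks = set(pick_sample_blocks(60))  # 60 blocks = 120 sampled hours per year
print(should_survey(next(iter(blocks)), blocks, session_surveyed=False))  # True
```

Because every non-overlapping block has the same chance of selection, responses gathered in the sampled hours support inference about use across the whole year, which is the property the protocol relies on.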

The fee charged for MINES has ranged from $7,000 to $15,000 per year, depending on the length of the implementation and the final deliverables.

Challenge Points

The operational challenges associated with the MINES protocol include:

● While the protocol itself provides invaluable information, how it is implemented has varied widely due to the myriad ways authentication and provision of access to networked resources are handled at educational institutions. This has been a major roadblock to scaling it widely (as LibQUAL has), since it requires technical staff with fairly sophisticated expertise (typically available only at large institutions). At one point, the Assessment Team was working with EZproxy to standardize implementation in some way, but when ownership of this product transferred to OCLC, this process reset and stalled.

● Much of the advice provided to participants on the technical implementation of the protocol and the development and customization of the survey instrument has been through the contracting of Terry Plum as a consultant. As Terry has now retired from his post at Simmons College and has joined a library management consulting practice, his availability may change, impacting ARL’s ability to accept and manage future MINES engagements.

Suggestions

The suggestions to address these issues are:

1. ARL needs to evaluate whether MINES continues to fit into the portfolio of assessment tools it wishes (and has the capacity) to offer its members and others in the academic library community.


2. ARL needs to clarify its intellectual property rights in relation to the MINES protocol with its original developers. The matters of ownership and licensing rights and their duration should be clarified and put in writing.

3. Since Terry Plum has been so heavily involved in each engagement, ARL needs to approach him to determine his willingness and availability to continue supporting this project in this capacity. If he is not available, serious consideration has to be given to securing the necessary expertise (both on technical implementation as well as survey design) elsewhere. This expertise is not currently available in-house and is essential to the successful completion of a MINES project.

4. Until the issues above are addressed, it is suggested that ARL suspend the acceptance of any new MINES implementation requests.


LibValue

Values, Outcomes, and Return on Investment of Academic Libraries ("LibValue") was a three-year study funded by the Institute of Museum and Library Services to define and measure ways in which libraries create value through teaching and learning, research, and social, professional, and public engagement. The study was a collaboration among the University of Tennessee, Knoxville, the University of Illinois at Urbana-Champaign Libraries, and ARL, with partners at Syracuse University and Bryant University. Principal investigators were Carol Tenopir, professor in the School of Information Sciences and director of the Center for Information and Communication Studies, University of Tennessee, and Paula Kaufman, former professor and university librarian, University of Illinois at Urbana-Champaign. ARL’s role was to provide coordination and communications for the project. LibValue was initially implemented between December 1, 2009, and November 30, 2012; the project received a no-cost extension through 2013 for additional communications activities. ARL hosted six webinars about the project in February-August 2013. In 2013-2014, ARL launched a website intended to serve as an archival home for the project (replacing a website hosted by the University of Tennessee) and to provide a forum for libraries to share related research. There was consideration of ARL taking over and expanding a bibliographic database of articles on ROI and value developed for the project, which is currently hosted by the University of Tennessee.

Measuring Up

“Measuring Up: Assessing Accuracy and Reported Use and Impact of Digital Repositories” is an IMLS-funded grant, and ARL is one of the collaborators on the project. The goals of the Measuring Up project are to address the challenges of accurate reporting on the use of digital repositories through web analytics software and to produce recommendations for best practices for measuring their impact.
ARL's role in the project is to provide Google Analytics training, conduct two surveys, analyze the data, and report on the findings. Sue Baughman and Gary Roebuck are serving on the project team. The principal investigator for the project is Kenning Arlitsch, Dean of the Library at Montana State University. The project will be completed in 2017.



ASSESSMENT PROGRAM UPDATES SINCE FEBRUARY 2016

In February 2016, the ARL Assessment Team prepared a report for the Board and the Assessment Committee on the status of each component of the current assessment portfolio. The purpose of the report was to assist the Committee with its planning and prioritization of activities and projects and to develop a work plan for 2016. The Assessment Committee met in February and March to review the Team’s report, provide feedback on the position announcement for the Program Director for Assessment, and plan for its April 26 meeting in Vancouver.

A. Library Assessment Conference

• Published the 2014 proceedings in February. Two papers were featured in Research Library Issues no. 288.

• Reviewed 250 proposals for papers, short papers, and poster sessions for the October 31–November 2, 2016 conference to be held in Crystal City, Virginia.

• Opened registration for 2016 conference with early bird registration ending July 1, 2016.

• Released the application announcement on March 25 for ten scholarships for individuals from historically underrepresented racial and ethnic groups to attend the conference.

• Members of the Steering Committee are Sue Baughman (co-chair, ARL), Jackie Belanger (Washington), Bob Fox (Louisville), Steve Hiller (co-chair, Washington), Lisa Hinchliffe (Illinois, Urbana-Champaign), Martha Kyrillidou (consultant for ARL), Vivian Lewis (McMaster), Megan Oakleaf (Syracuse), Jennifer Paustenbaugh (Brigham Young), and Stephen Town (retired, University of York).

B. LibQUAL+

• Twelve ARL institutions and 77 non-ARL institutions are registered to implement LibQUAL+ in 2016. Registration is ongoing and open until November 1. Participation by type of institution is: 84 college or university libraries, 2 law libraries, 1 European business school library, and 2 community college libraries.

• Seven libraries in LIBER (the Association of European Research Libraries) completed the survey, and a combined results report is being prepared.

• Creating highlights of 2015 survey administrations.

C. ClimateQUAL Highlights

• Five institutions administered the survey in 2016. Five additional libraries have inquired about the protocol.

• Finalized chapters for a book about ClimateQUAL, to be published this summer.

D. Annual Survey Activities

• ARL Statistics 2014–2015 (1 data submission needed before data verification can be completed; expected completion June–July 2016).



• ARL Annual Salary Survey 2015–2016 (data verification underway; expected completion June 2016).

E. Strategic Assessment Seminar

• Conducted a strategic assessment seminar with 18 participants on April 14–15, the fifth session held since 2014. The seminar is led by Steve Hiller (Washington) and Raynna Bowlby (consultant) and is designed for individuals who are responsible for leading their library's assessment program and who seek to develop a strategic agenda for assessment.

• Intended learning outcomes are: (1) Appreciate the strategic intent and changing focus of library assessment; (2) Be familiar with the rationale for library assessment activities; (3) Increase knowledge of library assessment methods, tools, and protocols; (4) Consider effective uses of metrics and evidence for planning, decision-making, communicating, engaging and action; (5) Think through approaches to collaborative and institution-specific library assessment; (6) Position your role and responsibilities to cultivate a culture of assessment; and (7) Maximize your effectiveness and value to the organization.

F. IMLS Grant–Measuring Up

• Administered surveys to academic library directors and institutional repository (IR) managers for the IMLS-funded Measuring Up grant, which aims to identify models for capturing reliable web usage data for institutional repositories and digital collections.

• Data analysis is underway for 98 director responses and 82 IR manager responses.

G. Salary Equity and Gender Review

• Quinn Galbraith, Visiting Program Officer from Brigham Young University, is finalizing his research project, which included a survey of a sample of ARL libraries' staff, examining the extent to which gender and minority salary differences can be explained by factors beyond gender and race, as well as a historical review of ARL salary data.

H. Program Director for Assessment

• Released the position announcement on April 12.

• Appointed the search committee: Sue Baughman, chair of the search committee and ARL Deputy Executive Director; Ryan Brennan, ARL Accountant; Bob Fox, chair of the Assessment Committee and Dean, University Libraries, Louisville; Artemis Kirk, member of the Assessment Committee and University Librarian, Georgetown; Wendy Lougee, University Librarian and McKnight Presidential Professor, Minnesota; Liz Mengel, Associate Director, Collections and Academic Services, The Sheridan Libraries, Johns Hopkins University; Gary Roebuck, ex officio to the search committee and ARL Director of Administration and Operations; and Judy Ruttenberg, ARL Program Director.
