
Setting a Course for your Library: Data Analysis Strategies


Reporting: Trustees, managers

% chg = (new − old) / old; as a spreadsheet formula: =(G3-H3)/H3, where G3 holds the YTD 2016 value and H3 the YTD 2015 value.

Cumulative        | LibA      | LibB      | LibC    | LibD      | District  | YTD 2016  | YTD 2015  | % chg
Circulation       | 1,142,260 | 1,485,443 | 741,673 | 1,275,918 | 2,043,090 | 6,688,384 | 7,023,412 | -4.8%
Counter           | 418,116   | 537,371   | 272,610 | 481,701   | 0         | 1,709,797 | 1,699,333 |
Holds picked up   | 201,654   | 284,757   | 144,161 | 283,928   | 0         | 914,500   | 924,253   |
Patron PC use     | 42,832    | 37,747    | 20,037  | 36,608    | 0         | 137,224   | 151,734   |
Mobile app visits |           |           |         |           | 207,629   | 207,629   | 104,899   |
Wireless use      |           |           |         |           | 272,569   | 272,569   | 210,343   |

Programs, etc.
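As a quick illustration of the % chg column above, here is a minimal Python sketch (not from the original handout) that applies the same (new − old) / old formula to the YTD circulation totals:

```python
# Percent change between two year-to-date totals: (new - old) / old
def pct_change(new, old):
    return (new - old) / old

ytd_2016 = 6_688_384  # district circulation, YTD 2016 (from the table above)
ytd_2015 = 7_023_412  # district circulation, YTD 2015

print(f"Circulation % chg: {pct_change(ytd_2016, ytd_2015):.1%}")  # -4.8%
```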


New building / renovation or shelf reconfiguration

Collection    | Total collection size | Anticipated % checked in | Number checked in | # per 3' shelf | Total shelves per collection | 5-shelf units | 3-shelf units
Adult fiction | 17,873 | 74% | 13,226 | 35 | 378   | 76  | 126
Adult NF      | 34,000 | 70% | 23,800 | 23 | 1,035 | 207 | 345
YA fiction    | 5,106  | 52% | 2,655  | 20 | 133   | 27  | 44
YA NF         | 1,225  | 85% | 1,041  | 30 | 35    | 7   | 12
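The shelf counts in this worksheet follow from the collection size, the share expected to be checked in, and the number of items that fit on a 3-foot shelf. The sketch below is one way to compute them in Python; the round-up choices are assumptions, and they reproduce the Adult fiction row above (other rows in the handout appear to round slightly differently):

```python
import math

def shelf_plan(collection_size, pct_checked_in, items_per_shelf):
    """Estimate shelving needs for one collection, using 3-foot shelves."""
    number_checked_in = round(collection_size * pct_checked_in)
    shelves = math.ceil(number_checked_in / items_per_shelf)  # total 3' shelves
    five_shelf_units = math.ceil(shelves / 5)
    three_shelf_units = math.ceil(shelves / 3)
    return number_checked_in, shelves, five_shelf_units, three_shelf_units

# Adult fiction row: 17,873 items, 74% expected checked in, 35 items per shelf
print(shelf_plan(17_873, 0.74, 35))  # -> (13226, 378, 76, 126)
```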


Trend lines – quick and easy


Trendline reliability: A trendline is most reliable when the R-squared value is at or near 1.00.
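Excel reports R-squared directly when you add a trendline to a chart. For anyone working outside Excel, here is a rough equivalent in Python with numpy; the yearly circulation figures are invented for illustration only:

```python
import numpy as np

years = np.array([2012, 2013, 2014, 2015, 2016])
circ = np.array([7.6e6, 7.4e6, 7.3e6, 7.0e6, 6.7e6])  # hypothetical yearly totals

slope, intercept = np.polyfit(years, circ, 1)  # linear trendline
predicted = slope * years + intercept

ss_res = np.sum((circ - predicted) ** 2)       # residual sum of squares
ss_tot = np.sum((circ - circ.mean()) ** 2)     # total sum of squares
r_squared = 1 - ss_res / ss_tot                # near 1.00 = more reliable trendline

print(f"slope per year: {slope:,.0f}   R-squared: {r_squared:.3f}")
```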


Group discussion

Spend a few minutes reviewing Crystal Springs Library's statistics in your handout.

Using the library metrics and remembering their community needs assessment, what direction / goals would you propose for the Crystal Springs Public Library?

Discuss ideas at your table.


Survey Topic Areas

Immediate Survey


Follow-Up Survey


Follow-Up Survey (continued)


Analyzing qualitative data: the open-ended question

• Step 1: Read through comments

• Step 2: Condense and categorize data

o Underline common words, themes

o Use margins for notes

• Step 3: Describe themes or categories

• Step 4: Share findings

Example: Staff Appreciation

• Thank you (peer or supervisor)

• Financial Reward

• Celebration

• Director/Board Recognition
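As one way to sketch Steps 2 and 3 in code, the snippet below tags comments by keyword and tallies the categories. The keywords and sample comments are made up for illustration; they are not the workshop's survey data.

```python
from collections import Counter

# Hypothetical keyword lists for each staff-appreciation category
CATEGORIES = {
    "Thank you": ["thank", "appreciate"],
    "Financial Reward": ["bonus", "raise", "gift card"],
    "Celebration": ["party", "lunch", "celebrat"],
    "Director/Board Recognition": ["director", "board", "award"],
}

def code_comment(comment):
    """Return every category whose keywords appear in a comment."""
    text = comment.lower()
    return [cat for cat, words in CATEGORIES.items()
            if any(w in text for w in words)]

comments = [  # illustrative responses only
    "A simple thank you from my supervisor means a lot",
    "A staff party or lunch together would be nice",
    "Recognition from the director at a board meeting",
]

tally = Counter(cat for c in comments for cat in code_comment(c))
print(tally)  # counts per category
```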


Condensed categories and counts from the staff comments:

Thank you, informal (6)
Financial Reward (3)
Celebration (7): pro (3), con (2), meh (2)
Director/Formal Recognition (2)

Worksheet grid: one row per respondent (Shelver, Shelver, Clerk, Clerk, Librarian, Librarian, Branch Manager, Branch Manager) with columns Thank You | Financial | Celebration | Director/Formal, filled in below.


Role           | Thank You | Financial | Celebration | Director/Formal
Shelver        | X         |           | M           |
Shelver        | X         |           | P           | X
Clerk          | X         |           | C           |
Clerk          | X         |           | P           |
Librarian      | X         |           | M           |
Librarian      | X         |           | M           |
Branch Manager | X         | X         | C           |
Branch Manager | X         |           | P           | X

(P = pro, C = con, M = meh)
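Assuming the matrix is read as reconstructed above, a short pandas sketch (not part of the handout) can condense it back into counts per category and a pro/con/meh breakdown for Celebration:

```python
import pandas as pd

# One row per respondent; marks mirror the matrix above (P/C/M = pro/con/meh)
rows = [
    ("Shelver",        "X", "",  "M", ""),
    ("Shelver",        "X", "",  "P", "X"),
    ("Clerk",          "X", "",  "C", ""),
    ("Clerk",          "X", "",  "P", ""),
    ("Librarian",      "X", "",  "M", ""),
    ("Librarian",      "X", "",  "M", ""),
    ("Branch Manager", "X", "X", "C", ""),
    ("Branch Manager", "X", "",  "P", "X"),
]
cols = ["Role", "Thank You", "Financial", "Celebration", "Director/Formal"]
df = pd.DataFrame(rows, columns=cols)

# How many respondents mentioned each category at all?
print((df[cols[1:]] != "").sum())

# Celebration sentiment: pro / con / meh counts
print(df["Celebration"].value_counts())
```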

Question: What did you like most?

Activity/Topic: Respondent stated that they enjoyed the specific activity or the topic covered in the program.
Social: Respondent stated that they enjoyed spending time with their friends or interacting with other people.
Educational: Respondent stated that education or learning was what they liked most.
Creativity: Respondent stated that they enjoyed the opportunity to express their creativity.
Fun: Respondent stated that having fun was what they liked most.
Personal Benefit: Respondent described a personal benefit that they gained through their participation.

Tally sheet ("Drop Project Outcome answers here"), one column per category; column totals:

TOTALS (Activity/Topic, Social, Educational, Creativity, Fun, Personal Benefit): 1, 4, 1, 3, 1, 1


Benchmarking

• Types of benchmarking

o Process (internal) benchmarking

• Quality assessment tool. Be the best for your patrons

o Comparative (external) benchmarking

• How you ‘measure up’ to your peers

Pros & Cons

• Highlight excellence / underperformance

• Identify gaps

• Adaptable based on customer needs


Pros & Cons (continued)

• Staff resistance

o Pride

o Fear of change

• Not a quick method

• Lack of resources (Time & money)

Process benchmarking

• Compares processes

• Select a process that needs improvement

o Real life examples

• Time spent shelving books

• Call center dropped calls

• Moving into a smaller space


Benchmarking road map

Determine what to benchmark

Define metrics

Develop data collection methodology

Collect data

Road Map continued

Identify performance or practice gap

Identify reasons for deficiencies

Develop action plan

Institutionalize as part of continuous performance plan


Discuss with a partner

What process in your library would benefit from process benchmarking?

Comparative benchmarking

• Quantitative

o Examples

• Library Journal Index (Star Library ratings)

• State standards

• State statistical reports

• Local comparisons


Uses for external benchmarking

• Advocacy

• Funding requests

• Measure success

• Public Relations – why we’re great

How to choose peers

• Relevance

• Strategy

• The “right” numbers

o Best combination of criteria

o Enough, but not too many peers


Criteria for selecting peers

• Population of legal service area

• Total operating expenditures

• Legal basis type

• Urban/suburban/rural setting

• Number of outlets

• Size of staff
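As a sketch of how these criteria could be applied to a statistical extract, the snippet below filters a table of candidate libraries on population, expenditures, locale, and outlets. The file name, column names, and thresholds are assumptions for illustration, not the official IMLS or state-report field names:

```python
import pandas as pd

# Hypothetical extract of annual public library statistics
libraries = pd.read_csv("public_library_stats.csv")

my_population = 48_000        # legal service area population (example value)
my_expenditures = 2_100_000   # total operating expenditures (example value)

peers = libraries[
    libraries["service_area_population"].between(my_population * 0.8,
                                                 my_population * 1.2)
    & libraries["total_operating_expenditures"].between(my_expenditures * 0.8,
                                                        my_expenditures * 1.2)
    & (libraries["locale"] == "suburban")
    & libraries["outlets"].between(1, 3)
]

print(len(peers), "candidate peer libraries")
```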


IMLS Data Tools (imls.gov)

Subscription Data Tools

• PLA Metrics

o Public Library Data Service (PLDS)

• Edge Initiative

• Counting Opinions


State Data Tools (lrs.org)


Improvement through benchmarking

• Analyze the data

• Set goals and develop an action plan

• Monitor the process

Group discussion

• What are Crystal Springs Public Library's key statistics when implementing peer comparisons?

• How can you use the comparison statistics to ask the board for more resources? What would you ask for?