In Search of Patterns at the Desk


"In Search of Patterns at the Desk: An Analysis of Reference Question Tracking Logs" is being presented at the 4th QQML 2012 International Conference in Limerick, Ireland.


In Search of Patterns at the Desk:
An Analysis of Reference Question Tracking Logs

Susan [Gardner] Archambault
Loyola Marymount University, Los Angeles (USA)

Loyola Marymount University

Private Catholic University in Los Angeles, CA

5,600+ undergraduate and 1,900+ graduate students

William H. Hannon Library opened in 2009

Reference Service at LMU

24/5 Information Desk staffed by students, library staff, and outsourced staff

Desk encounters recorded using Gimlet question tracking system

14,210 volumes in the print Reference Collection

Over 200 Electronic Databases

Gimlet Interface

Purpose of Study

Content of questions (subject, difficulty level)

Content of answers (characteristics of sources used, accuracy)

Patterns (by patron type, service provider, subject, or time)

Develop reference question tagging scheme

Methodology

Content analysis of LMU reference questions, Fall 2010/Spring 2011 academic year

From an Excel data dump, deleted all non-reference questions and all questions not asked at the Info Desk
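As a rough illustration only (not the authors' actual workflow), the sketch below shows how such a filtering pass could look in pandas, assuming a Gimlet Excel export with hypothetical column names question_type and location:

```python
import pandas as pd

# Hypothetical Gimlet export; the file name and column names are assumptions.
df = pd.read_excel("gimlet_export_2010_2011.xlsx")

# Keep only reference questions that were asked at the Information Desk.
keep = (df["question_type"] == "Reference") & (df["location"] == "Info Desk")
reference_only = df[keep]

reference_only.to_excel("reference_questions_clean.xlsx", index=False)
```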

Methodology

Took free text Q&A fields and recoded into “Reference Tag,” “School/College,” “Subject,” “Exact Source,” and “Quality”

New fields finalized after several rounds of 50-question sample calibrations and “norming sessions” by 3 coders
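As a hedged sketch (the study's coding was done by hand by the three coders), the finalized controlled vocabularies could be enforced after coding with pandas categoricals; the tag and quality values below are taken from the final lists on the following slides:

```python
import pandas as pd

# Controlled vocabularies from the finalized coding scheme.
REFERENCE_TAGS = [
    "Catalog Use & Lookup", "Database Help & Use", "External Web Page",
    "Internal Web Page", "Reference Book (print)", "Referral",
    "Reserves", "Retrieval", "Other",
]
QUALITY = ["Inappropriate sources recommended", "Incomplete", "Acceptable"]

def validate_codes(df: pd.DataFrame) -> pd.DataFrame:
    """Cast hand-coded columns to categoricals; values outside the scheme become NaN."""
    df["Reference Tag"] = pd.Categorical(df["Reference Tag"], categories=REFERENCE_TAGS)
    df["Quality"] = pd.Categorical(df["Quality"], categories=QUALITY)
    return df
```

Any out-of-scheme value surfaces as a missing entry, which makes typos in the hand-coded fields easy to spot.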

Old Reference Tags (Beginning)

1 Citation Style

2 External Web Page

3 Known Item

4 Reference Book

5 Referral

6 Reserves

7 Retrieval

8 Search Construction

9 Topic Source

Final Revised Reference Tags

1 Catalog Use & Lookup

2 Database Help & Use

3 External Web Page

4 Internal Web Page

5 Reference Book (print)

6 Referral

7 Reserves

8 Retrieval

9 Other

School/College

1 Business

2 Communication & Fine Arts

3 Education

4 Film & Television

5 Law

6 Liberal Arts

7 Science

8 General Interest

Quality

1 Inappropriate sources recommended

2 Incomplete

3 Acceptable

Methodology

• Sampled from 3,422 total questions

• Random 20% sample from all questions at levels 1-3 difficulty on READ (Reference Effort Assessment Data) Scale

• All questions included from levels 4-6

• Total sample size=931 questions
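A minimal sketch of this stratified sampling step, assuming the cleaned data carry a numeric READ-level column (the column and file names are assumptions):

```python
import pandas as pd

df = pd.read_excel("reference_questions_clean.xlsx")  # hypothetical cleaned file

easy = df[df["read_level"].between(1, 3)]   # READ levels 1-3
hard = df[df["read_level"].between(4, 6)]   # READ levels 4-6

# 20% random sample of the easier questions, plus every level 4-6 question.
sample = pd.concat([easy.sample(frac=0.20, random_state=1), hard])
print(len(sample))  # in the study, this sample came to 931 questions
```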

Methodology

Analyzed sample in SPSS to look at frequencies and relationships

Examined standardized residuals for significance
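For readers without SPSS, an equivalent check can be sketched with pandas and scipy: cross-tabulate two coded fields, run a chi-square test, and compute standardized residuals (observed minus expected, divided by the square root of expected); cells with an absolute residual above roughly 2 are the ones driving a significant result. File and column names here are assumptions:

```python
import pandas as pd
from scipy.stats import chi2_contingency

sample = pd.read_excel("reference_sample_coded.xlsx")  # hypothetical coded sample

# Cross-tabulate two coded fields, e.g. Reference Tag by patron type.
table = pd.crosstab(sample["Reference Tag"], sample["patron_type"])

chi2, p, dof, expected = chi2_contingency(table)
std_resid = (table - expected) / expected ** 0.5  # (observed - expected) / sqrt(expected)

# Flag the cells that contribute most to a significant chi-square.
print(f"chi2 = {chi2:.1f}, p = {p:.4f}")
print(std_resid[std_resid.abs() > 2].stack())
```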

Selected Frequencies

Reference Tag: Totals

Catalog Use & Lookup: Exact Sources

Top Vendors: Database Help & Use (More Than 5x)

Exact Source: Database Help & Use (Used More Than 5 Times)

Database Times Used
JSTOR 47
Academic Search Complete 45
Proquest 45
PsycINFO 33
Business Source Complete 28
Lexis Nexis 23
Ebsco 21
MLA Intl Bibliography 18
ERIC 17
ATLA 14
Bibloi 14
ABI Inform 13
Mergent 13
CQ Researcher 12
OneSearch 12
WorldCat 12
Business & Co. R.C. 10
GVRL 9
Lit. Resource Cntr 9
Euromonitor 8
Lit. Criticism Online 8
Soc. Abstracts 8
Science Direct 7
Biography in Context 6
CMMC 6
Opposing Viewpts 6
Proquest Dissert. 6
Sage Jnls Online 6

Exact Source: External Web Page

Exact Source: Internal Web Page

Exact Source: Ref Book (Print)

College

Liberal Arts: Subject Areas

Science: Subject Areas

CFA: Subject Areas

Accuracy: Student Worker Versus Librarian

o Database Use & Lookup: • Students

recommend more general sources versus subject-specific

Who Answered: Above Level “3” Difficulty

LAC 11
Librarians 252
Staff 33
Students 15

Monthly Patterns

Shorter & Less Difficult

Longer Questions (16+ min.)

More Difficult Questions (Above “3”)

Patterns By Hour

Longer Questions (16+ min.)

More Difficult Questions (Above “3”)

Day of Week Patterns: Difficulty Level (Above “3”)

Databases with Higher Difficulty Level (Above “3”)

Database Name Times
Academic Search Complete 30
JSTOR 30
Proquest 28
PsycINFO 20
Business Source Complete 19
MLA Int'l Bib. 14
Ebsco 13
ATLA 12
ERIC 12
Lexis Nexis 11
Business & Co. Resource Cntr. 10
CQ Researcher 10
ABI Inform 10
Mergent 10
Google Scholar 8
OneSearch 8
Euromonitor 7
Gale Virtual Ref. Library 7
Communication & Mass Media Compl. 6
Lit. Resource Cntr. 6

Patterns By College & Subject

Colleges with Longer Questions (16+ min.)

College Number of Questions

Business 48

Communication & Fine Arts 21

Education 6

Law 5

Liberal Arts 87

Science 10

Colleges with Higher Difficulty Level (Above “3”)

College Times

Business 63

Communication & Fine Arts 27

Education 14

Film & Television 6

Liberal Arts 148

Science 14

College of Liberal Arts: Subjects with Higher Difficulty (Above “3”)

Subject Times

English 29

History 17

Philosophy 5

Psychology 15

Sociology 5

Theology 23

Patterns: Fall Versus Spring

Total CFA Questions by Semester

Fall 32

Spring 17

Subject Fall Spring

English 29 11

Psychology 18 7

More Business Questions On Monday

Day of Week Number of Questions

Sunday 7

Monday 23

Tuesday 18

Wednesday 15

Thursday 8

Friday 10

Saturday 6

More Theology Questions On Tuesday

Day of Week Number of Questions

Sunday 5

Monday 11

Tuesday 23

Wednesday 13

Thursday 10

Friday 10

Limitations of Study

Interdisciplinary questions could not be categorized by subject easily

Despite the “norming” sessions, coders coded independently, so interrater reliability was not measured

Small sample size (20%) for first three difficulty levels

Dependent on desk staff to accurately record all stats

Key Findings: Collections

Print reference collection used in only 5.9% of all questions

A small group of sources answered the majority of reference questions (the 29 unique reference titles used represent just 0.2% of all possible titles)

Key Findings: Collections

95 unique databases used (48% of all databases available)

Key Findings: Collections

24% of all reference questions required an internal web page (LibGuide etc.) as a source

50% of all reference questions required the library catalog as a source

41% of all reference questions required a database as a source

Key Findings: Staffing

More difficult and longer reference questions cluster in Oct.-Nov. and Mar.-Apr.; less difficult and shorter questions in Sept.

Double-staff the desk, with librarian expertise, Mon.-Wed. between 2-6 p.m.; Sat. is lighter

Key Findings: Staffing

Librarians answered 81% of all the difficult questions (above a “3”)

Key Findings: Databases

Good candidates for database workshops, based on frequency and difficulty:

JSTOR

Proquest (vendor)

Ebsco (vendor; e.g., Academic Search Complete and PsycINFO)

Business Source Complete

Lexis Nexis

Key Findings: Subjects

Subject areas we serve the most at the Desk (based on difficulty/volume):

Business, English, Psychology, Theology, History, Education

Key Findings: Methodology

For the reference tagging scheme, a source-based approach worked better than a strategy-based one

Thank You to the Other Coders

Alexander Justice
Reference Librarian / Reference Collection Development Coordinator
Loyola Marymount University, Los Angeles
Email: ajustice@lmu.edu

Andrew Toot
Overnight Information Desk Supervisor
LAC / Loyola Marymount University, Los Angeles
Email: andrewtoot@gmail.com

Additional Acknowledgements

Thank you to the William H. Hannon Library Research Incentive Travel Grant

Thank you to the LMU Office of Assessment/Laura Massa

Additional Information

READ Scale: readscale.org
Gimlet: gimlet.us
PPT: bit.ly/deskpattern

Contact Info:
Susan [Gardner] Archambault
Email: susan.gardner@lmu.edu
Twitter: @susanLMU