BibEval – A framework for usability evaluations of online library services
Thomas Weinhold, Bernard Bekavac, Sonja Hamann*
Swiss Institute for Information Research (SII), Namics AG*
Libraries in the Digital Age (LIDA) 2014
16-20 June 2014, Zadar (Croatia)
e-lib.ch – Swiss electronic library
Innovation and cooperation initiative with 20 sub-projects
Vision: creation of a national portal to improve access to and retrieval of scientific information (www.e-lib.ch)
[Map of Switzerland showing the sub-projects and their locations: ACCEPT, Infonet Economy, RODIN (Genève); Multivio (Martigny); e-codices (Fribourg); Infoclio.ch (Bern); swissbib (Basel); Best-Practices, DOI desk, E-Depot, e-rara.ch, Kartenportal.ch, Long-term preservation, Marketing e-lib.ch, Metadata servers, retro.seals.ch, Web portal e-lib.ch (Zürich); ElibEval, Information literacy, Search skills (Chur)]
Sub-project "ElibEval"
Usability evaluations of web sites and applications developed in the context of e-lib.ch
Conception and creation of analytical tools to support information providers in carrying out their own evaluations
Situation of libraries
Changes in environment and increasing competition
Mission:
"[..] encourage the library and information sector to work with partners and users to maximize the potential of digital technology to deliver services that enable seamless and open access by users to cultural and information resources."
(IFLA Strategic Plan 2010-15, http://www.ifla.org/files/hq/gb/strategic-plan/2010-2015.pdf)
Offer the same ease of use, robustness and performance as internet search engines combined with the quality, trust and relevance traditionally associated with libraries
Challenges for libraries
Merging of heterogeneous information
Organizing the interaction of various systems so that users can pursue their objectives without hindrance ("don't burden users with library internals")
[Diagram: libraries coordinate the physical collection, digital collection, library catalogue, databases, website and additional information across the tasks of indexing, management, presentation, support and archiving]
User-perceived quality of library online services
(Tsakonas & Papatheodorou, 2006)
“The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.”
(ISO 9241-Part 11: Guidance on usability)
Usability evaluation methods
Two main criteria to categorize usability evaluation methods:
When: formative vs. summative evaluation
Who: user-oriented/empirical vs. expert-oriented/analytical methods
Usability evaluation of online library services
As in our own project, libraries generally use a wide spectrum of methods for usability evaluations
Kupersmith (2012) provides a good overview
According to this literature review, the most commonly used method is user observation / usability testing
Observation of real user behaviour
Time-consuming and expensive
Heuristic evaluation is a widely used instrument (cheaper, quicker)
Heuristic evaluation
Experts examine whether an interface is compliant with established usability principles (the "heuristics")
Nielsen's heuristics (1994):
1. Visibility of system status
2. Match between system and real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose and recover from errors
10. Help and documentation
(http://www.nngroup.com/articles/ten-usability-heuristics/)
Motivation for the development of library specific heuristics
Most studies limit themselves to common heuristics, e.g. Nielsen's 10 heuristics
Lack of library-specific recommendations (e.g. Clyde 1996, Clausen 1999, Raward 2001)
Problems of common heuristics:
too generic for an in-depth analysis
extensive knowledge in the field of user interface design is needed
Our goal: develop more detailed heuristics which
are suited to the specific requirements of library websites
are easy to use and allow a judgement even by non-experts
assist developers in building user-friendly library websites
Methodical approach
Three cornerstones:
literature review "usability evaluations in libraries"
best-practice analysis of library websites
focus group (to discuss and further refine our concept)
Result:
modular, hierarchically structured list of evaluation criteria
all website elements and evaluation criteria classified as mandatory or optional
This concept aims at maximizing the applicability of the heuristics for libraries of different sizes and types.
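The modular, hierarchically structured catalogue described above can be sketched as a small data model. This is a minimal sketch with hypothetical class names and invented example criteria (sub-sectors are collapsed into components for brevity); the actual BibEval catalogue is more detailed.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Criterion:
    question: str
    mandatory: bool = True  # criteria are classified as mandatory or optional

@dataclass
class Component:
    name: str
    mandatory: bool = True
    criteria: List[Criterion] = field(default_factory=list)

@dataclass
class Sector:
    name: str
    components: List[Component] = field(default_factory=list)

# Hypothetical excerpt: one sector with two components
catalogue = [
    Sector("search & exploration", components=[
        Component("simple search", mandatory=True, criteria=[
            Criterion("Is the search box visible on every page?"),
            Criterion("Are search suggestions offered?", mandatory=False),
        ]),
        Component("advanced search", mandatory=False, criteria=[
            Criterion("Can users combine search fields?"),
        ]),
    ]),
]

def mandatory_questions(sectors):
    """Collect the questions every library should evaluate."""
    return [c.question
            for s in sectors
            for comp in s.components if comp.mandatory
            for c in comp.criteria if c.mandatory]

print(mandatory_questions(catalogue))
# -> ['Is the search box visible on every page?']
```

Because components themselves carry the mandatory/optional flag, a small library can restrict an evaluation to the mandatory subset while a large one walks the full tree.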
BibEval – Structure
4 sectors, divided into sub-sectors
Different components in each sub-sector
Questions/evaluation criteria for each hierarchy level
[Diagram: the four sectors (search & explore the collection(s); information & communication; personalization & customization; user participation), with "search & exploration" expanded into sub-sectors such as presentation & access and components such as simple search and advanced search]
Usage of BibEval – Selection of sectors and components
http://www.cheval-lab.ch/en/usability-of-library-online-services/criteria-catalogue-bibeval/
BibEval – Severity rating and comments
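Recording a severity rating and a comment per criterion can be sketched as follows. The 0-4 scale below is modeled on Nielsen's well-known severity ratings; the scale, function name and example findings are illustrative assumptions, not the actual BibEval implementation.

```python
# Hypothetical severity scale, modeled on Nielsen's 0-4 severity ratings;
# the scale actually used by BibEval may differ.
SEVERITY = {0: "no problem", 1: "cosmetic", 2: "minor", 3: "major", 4: "catastrophic"}

findings = []

def rate(criterion, severity, comment=""):
    """Record one evaluator judgement: criterion, severity, free-text comment."""
    assert severity in SEVERITY
    findings.append({"criterion": criterion, "severity": severity,
                     "label": SEVERITY[severity], "comment": comment})

# Invented example judgements
rate("Is the search box visible on every page?", 3,
     "Search box is missing on the detail view.")
rate("Are search suggestions offered?", 1)

# Problems sorted worst-first, as a report would present them
for f in sorted(findings, key=lambda f: -f["severity"]):
    print(f["severity"], f["label"], "-", f["criterion"])
```

Sorting by severity lets the resulting report surface the most critical problems first.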
BibEval – Reports / Export functions
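An export function along these lines could serialize the collected findings, e.g. to CSV for further analysis in a spreadsheet. The field names and example data are assumptions for illustration, not the actual BibEval export format.

```python
import csv
import io

# Invented example findings, as produced during an evaluation session
findings = [
    {"component": "simple search", "criterion": "Search box visible?",
     "severity": 3, "comment": "Missing on detail view"},
    {"component": "advanced search", "criterion": "Fields combinable?",
     "severity": 0, "comment": ""},
]

def export_csv(findings):
    """Serialize findings to a CSV string with a header row."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["component", "criterion", "severity", "comment"])
    writer.writeheader()
    writer.writerows(findings)
    return buf.getvalue()

print(export_csv(findings))
```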
BibEval – Project administration
Project Administration
Conclusions and further work
One criticism levelled against heuristic evaluation: in-depth knowledge of HCI is required in order to apply this method correctly (Blandford et al. 2007; Warren 2001)
In formulating our evaluation criteria, we focused on the end-user perspective
Continuous improvement of our criteria catalogue to keep it up to date
Extension of our web application (e.g. deleting questions, adding own criteria)
Enable libraries to conduct self-evaluations
Further development through community