CARES_usability_study



User-testing an interactive mapping site

Background information: I prepared this report after conducting a web usability study for a group called CARES (Center for Agricultural, Resource and Environmental Systems) at the University of Missouri. It was distributed to CARES staff and college administrators.

The CARES site enables anyone to access public information (GIS data, economic models, census data, and much more) online. Presenting such a large amount of data in a user-friendly format was quite a challenge, and we decided that conducting focus groups would be the best way to make sure people could use the site effectively.

I coordinated the usability project, which included training observers, recruiting evaluators, facilitating debriefing sessions, and writing follow-up reports summarizing what we had learned during the sessions.

GOALS

The main goal of this study was to determine the usability of the CARES “Map Room,” which is a core component of the new CARES website. A secondary goal was to determine if people could use the site's new main page and navigation bars to access other parts of the site.

METHODOLOGY

We developed two short lists of tasks (one for each group of evaluators). The tasks were designed to take about half an hour to complete; a pre-tester completed a set of tasks in 20 minutes. The tasks were:

* find a CARES staffer’s contact info
* find a project manager
* make a map of a particular area showing county boundaries, rivers and streams, highways, and incorporated areas
* zoom in on a town
* add roads and streets
* print the map

We asked two groups of five people each to evaluate the new CARES website. These participants (“evaluators”) were lay professionals -- all were familiar with personal computing and using the internet but none were computer scientists or expert GIS users. The evaluators were told they'd be testing the "user-friendliness" of an interactive mapping website.

Members of the CARES staff (“observers”) were paired with these evaluators. Observers had been given log sheets to keep track of evaluator actions and comments, as well as the amount of time spent on particular tasks. Observers had also been advised not to communicate with the evaluators during the session or to offer help unless an evaluator got stuck on a particular problem for more than five minutes.


During a brief welcome and overview, the evaluators were told they’d have about 30 minutes to work on a set of tasks. They were encouraged to think aloud as they worked through these tasks and were reassured that they were not being tested, but rather that they were testing the site.

After the welcome and overview, the evaluators turned on their computer monitors, which displayed the front page of the new CARES website. They began work on their tasks and the observers recorded the time and the evaluators' actions and comments.

After approximately 35 minutes we got the group together again for a discussion and debriefing.

RESULTS

Overall, the evaluators were able to complete their tasks (with some prompting -- see specifics below) and were very positive and enthusiastic about the site. After exploring the site, several evaluators requested the URL so they could visit it again on their own. Several evaluators outlined specific ideas for how they would use the site.

Several usability problems surfaced -- notes from our debriefing session follow.

OBSERVATIONS and RECOMMENDATIONS

Toolbar / Help

* Observation: Evaluators thought they were selecting a tool by clicking an icon in the “help” section on the right side of the screen. Since the “pan” tool was selected by default, the evaluators ended up panning when they thought they were zooming in on an area. This led to serious usability problems; sometimes the users ended the session and started over because they had become too disoriented to continue.

* Observation: Evaluators also seemed reluctant to click on the “i” or “info” tool, which would have allowed them to identify any attribute on the map and view information about it. The “i” icon is apparently unfamiliar to non-GIS users -- most of the observers said they ended up having to prompt evaluators to use the “i” tool.

* Recommendation: Default to the “i” tool instead of the “pan” tool. When users click on the map, it won't move as it does when the “pan” tool is active. Instead, a user-friendly dialog box will appear, prompting the user to select an attribute to learn more. In this way, users will learn about the “i” tool at the same time they realize that a tool has been selected for them.


* Recommendation: Create a “tool help” button on the toolbar. This would direct users’ attention to the toolbar instead of dividing attention between the help toolbar and the actual toolbar.

* Observation: The yellow tool buttons seemed to attract users’ attention, often to the point of distracting them from the gray tool buttons. Even though the gray tools were often more appropriate, the evaluators seemed not to see them.

* Recommendation: Make all tool buttons the same color.

* Observation: Some users did not realize they had to click at the corners to draw a polygon. Some users did not realize they had to double-click to stop drawing.

* Observation: Once users understood that the “real” toolbar was at the top and the “help” toolbar was to the side, they tended to abandon the help toolbar in favor of the one-line explanations that appear when the mouse is moved over a tool in the toolbar.

Vocabulary / terms

* Observation: Some users had trouble with “incorporated areas,” “hydrography,” and/or “interactive map.” Several users didn't understand the scale ratios.

* Recommendations: Might want to define some words and terms in parentheses (for instance, print “(More detailed)” alongside the 1:100,000 designation). Also might cross-list some common layers (rivers and streams under “water quality” as well as “hydrography,” for instance).

Labeling

* Observation: Users wanted to see labels on towns, rivers, etc., especially when they printed a map out. As discussed above, they were unaware that they could use the “i” tool to identify features.

* Recommendation: Yan is already taking care of this.

Navigation

* Observation: Users generally seemed willing to use the “modify map” option to add data layers to their maps. This is encouraging. There was some confusion, though, over how to start a new map.


* Recommendation: Replace the “Interactive map” option on the blue bar at top with “Start new map.” It might also be useful to have the “Download” function lead to a page within the Map Room that defaults to downloading the data selected for that particular map.

* Observation: Most users were surprised that the main page's “Contact us” link led to an e-mail window.

* Recommendation: “Contact us” should lead to the staff contact info page. Might want to add an “E-mail webmaster” link to the header or footer on every page.

* Observation: Evaluators tended to use their browsers’ “Back” and “Print” buttons.

* Recommendation: As suggested earlier, a yellow toolbar might focus people's attention on the site’s buttons. If it's possible to make a “Back” arrow on the site's toolbar, that would keep people focused on the CARES toolbar instead of resorting to the browser’s “Back” button.

Extent

* Observation: Some users wanted to see a state-wide extent and were frustrated when the site prompted them to choose a county.

* Recommendation: The statewide option would be more obvious if the “Select a county” drop-down box listed “Statewide” as the default selection.
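As a rough sketch of this recommendation (the function name and option shape are invented for illustration, not taken from the CARES site), the extent drop-down could be built with “Statewide” as the preselected first entry:

```typescript
// Hypothetical sketch: build the extent drop-down options with "Statewide"
// as the default selection, followed by the county list.
interface ExtentOption {
  label: string;
  selected: boolean;
}

function buildExtentOptions(counties: string[]): ExtentOption[] {
  return [
    { label: "Statewide", selected: true },
    ...counties.map((c) => ({ label: c, selected: false })),
  ];
}
```

Listing “Statewide” first and preselected makes the statewide extent visible to users who would otherwise be stopped by the county prompt.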

General comments

* Several people wanted to see and interact with data from other states -- there was some curiosity as to the extent of CARES' datasets. Might want to explain this in the "About CARES" section.

* One user wanted access to metadata. This user apparently did not notice the “i” buttons beside each dataset name. Hyperlinked words like "More info..." after the name of each dataset might be more effective than the "i" button.

* Several users wanted a tutorial linked from the main page.

* The "Map Room" link on the main page was confusing to some users. They suggested that a phrase like "Make a map" would be more explicit.

FURTHER RESEARCH

Once these problems are addressed, the stage will be set for another round of tests later this summer.


Some topics we might want to explore are:

* Whether these improvements to the interactive mapping section resulted in greater usability
* The effectiveness of the site's decision support tools
* Types of questions people come to the CARES site with
* Types of questions people think the CARES site should be able to answer

Follow-up: CARES staff dramatically improved the web site's usability based on the results of this study. We conducted another round of focus groups later that summer, and CARES staff unveiled the new user-friendly web site at www.cares.missouri.edu at a “launch party” in November 2000.