You can’t manage what you can’t measure – User Vision Breakfast Briefing

Posted on 23-Jan-2018


Transcript of “You can’t manage what you can’t measure” – User Vision Breakfast Briefing

You can’t manage what you can’t measure

January 2016


CX metrics


https://www.experiencedynamics.com/blog/2015/03/30-ux-statistics-you-should-not-ignore-infographic

Contents


Why measure?

• The value of collecting metrics around user behaviour

Methodology

• Analytics

• A/B testing

• True Intent studies

• User testing I

• User testing II: Remote quantitative

• One-minute survey

• Top task identification

• Tree testing

• Site search metrics

Benchmarking

Dashboards

Why measure? The value of collecting metrics around user behaviour


The value of measuring behaviour

Measuring user behaviour can tell you:

What users are doing on your site

What problems they are having

Why they are having those problems

Whether users can accomplish what they came to your site to do

Which paths are most likely to lead to success

How satisfied users are with your site

These data are about the user experience


Methodology: How metrics are collected

Analytics


Key metrics: ► Conversion rates ► Bounce rates ► Time on site

Best for: ► Understanding what users are doing ► Measuring the impact of changes over time
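As a sketch of how these three analytics metrics are typically derived, the snippet below computes them from a handful of hypothetical session records. The field names (`pages`, `converted`, `seconds`) are illustrative, not taken from any real analytics tool.

```python
from statistics import mean

# Hypothetical session records: pages viewed, whether the session converted,
# and visit duration in seconds. Purely illustrative data.
sessions = [
    {"pages": 1, "converted": False, "seconds": 12},
    {"pages": 5, "converted": True,  "seconds": 340},
    {"pages": 3, "converted": False, "seconds": 95},
    {"pages": 1, "converted": False, "seconds": 8},
]

# Conversion rate: share of sessions that completed the goal action.
conversion_rate = sum(s["converted"] for s in sessions) / len(sessions)
# Bounce rate: share of single-page visits.
bounce_rate = sum(s["pages"] == 1 for s in sessions) / len(sessions)
# Average time on site, in seconds.
avg_time_on_site = mean(s["seconds"] for s in sessions)

print(f"Conversion: {conversion_rate:.0%}, bounce: {bounce_rate:.0%}, "
      f"avg time: {avg_time_on_site:.0f}s")
```

Real analytics packages apply more nuanced definitions (e.g. how a "session" is cut, or whether bounces count toward time on site), so treat this as the arithmetic skeleton only.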

A/B testing


Key metrics: ► Conversion rates

Best for: ► Testing specific hypotheses ► Comparing two (or more) options ► Testing changes before implementing them
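Deciding whether version B really beats version A usually comes down to a significance test on the two conversion rates. The following is a minimal sketch of a two-proportion z-test using invented visitor counts; real A/B platforms layer sequential-testing corrections on top of this.

```python
from math import sqrt, erf

# Illustrative counts, not real data: visitors and conversions per variant.
n_a, conv_a = 1000, 50   # variant A: 5.0% conversion
n_b, conv_b = 1000, 70   # variant B: 7.0% conversion

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate under H0
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error
z = (p_b - p_a) / se
# Two-tailed p-value via the normal CDF, Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p_value:.3f}")
```

With these numbers the difference hovers around the conventional p = 0.05 threshold, which is exactly why sample-size planning matters before declaring a winner.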

A/B testing


Which version did more testers click on – A or B?


www.whichtestwon.com

A/B testing


www.usefulusability.com

True intent studies


Key metrics: ► User goals ► Path analysis ► Subjective ratings of success ► Subjective ratings of experience (e.g., SEQ, SUS, NPS)

Best for: ► Understanding why users are visiting your site ► Assessing the effectiveness of different journeys/paths ► Collecting more reliable satisfaction data

True intent studies


Will you help us improve?

What have you come to this site to do today? (multiple choice)

***Survey pops under until user is finished***

Were you able to accomplish what you came here to do today? (yes/no)

How easy or difficult did you find this site to use? (1-7 scale)

[Additional questions as needed]

Thank you and goodbye!

User testing


Key metrics: ► Completion rates ► Subjective ratings of experience (e.g., SEQ, SUS)

Best for: ► Understanding why usability problems exist
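For reference, SUS scoring follows a fixed formula: ten statements rated 1-5, odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5 to give a 0-100 score. A small sketch with a hypothetical participant's answers:

```python
def sus_score(responses):
    """Standard SUS scoring: 10 items on a 1-5 scale, result on 0-100."""
    assert len(responses) == 10
    total = sum(
        r - 1 if i % 2 == 0 else 5 - r  # i is 0-based, so even index = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# One hypothetical participant's answers (invented for illustration):
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # → 85.0
```

A single SUS score means little on its own; it becomes useful when compared across benchmark rounds or against published norms.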

User testing


Participant: “Oh, that’s great, that’s all the information I need. And I love the colours they use here.”

Facilitator: “On this 1-7 scale, how easy or difficult did you think it was to find the information you were looking for?” (SEQ)

Participant: “Let me see… 3 is slightly difficult? I’d give it a 3. My spelling isn’t very good, but the search should still have worked better than it did.”

Remote quantitative user testing


Key metrics: ► Success / failure ► Confidence in answer ► Time on task

Best for: ► Adding a reliable, quantitative metric to user testing ► Ideal for benchmarking

Remote quantitative user testing


Scoring a task, examples:

► Success, within ideal timeframe: 100
► Success, but 2 minutes over ideal time: 75
► Success, within ideal timeframe, but not sure of answer: 85
► Wrong answer, low confidence: 40
► Wrong answer, high confidence: 10
► Timeout, exceeded 5 minutes: 40
► Gave up: 40
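One way to encode a rubric like the examples above is a small scoring function. The specific deductions here (12.5 points per minute over the ideal time, 15 points for low confidence) are assumptions reverse-engineered from the example scores, not a published formula.

```python
def task_score(outcome, confident=True, overtime_minutes=0):
    """Score a benchmarking task 0-100 under an assumed rubric
    matching the slide's examples."""
    if outcome == "success":
        score = 100 - 12.5 * overtime_minutes  # e.g. 2 min over ideal -> 75
        if not confident:
            score -= 15                        # e.g. unsure of answer -> 85
        return max(score, 0)
    if outcome == "wrong":
        # Confident wrong answers are "disasters" and score lowest.
        return 10 if confident else 40
    return 40                                  # timeout or gave up

print(task_score("success"))                      # → 100.0
print(task_score("success", overtime_minutes=2))  # → 75.0
print(task_score("success", confident=False))     # → 85.0
print(task_score("wrong", confident=True))        # → 10
```

The value of a scheme like this is less the exact numbers than its consistency: apply the same rubric across benchmark rounds and the scores become comparable over time.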

One-minute survey


Key metrics: ► Ranked order of site attributes – good and bad

Best for: ► Giving direction to website improvements ► Prioritizing projects

“Please choose the THREE factors from the list below that best describe your actual experience with the [ ] website”

One-minute survey


26 statements in total

Top task identification


Key metrics: ► What customers most want to do on a website or intranet – top tasks ► What they definitely don’t want to do – tiny tasks

Best for: ► Establishing user needs ► Complex, high-traffic websites with a lot of content and user types
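The analysis behind a top-task survey is essentially a vote tally: respondents pick their most important tasks from a long list, and the characteristic result is that a handful of tasks attract most of the votes while a long tail of tiny tasks gets very few. A sketch with invented vote counts:

```python
from collections import Counter

# Hypothetical vote counts from a top-task survey (invented numbers):
votes = Counter({
    "Check opening hours": 410, "Contact support": 380, "Track an order": 350,
    "Read company news": 40, "Meet the team": 25, "View press releases": 15,
})

total = sum(votes.values())
running = 0
for task, n in votes.most_common():  # tasks in descending vote order
    running += n
    print(f"{task:24s} {n:4d}  {running / total:5.1%} cumulative")
```

In this made-up example the top three tasks collect over 90% of all votes — the pattern that justifies designing the site around a small top-task set.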

Top task identification


www.xkcd.com

Tree testing


Key metrics: ► First-click success rate ► Task success and failure rates

Best for: ► Evaluating information architecture and category labels after a card sorting exercise
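Both tree-testing metrics reduce to simple proportions over participant results. A minimal sketch, with invented result records and an assumed "correct" top-level branch:

```python
# Hypothetical tree-test results: for each participant, the first category
# clicked and whether they ended at the correct node in the tree.
results = [
    {"first_click": "Products", "succeeded": True},
    {"first_click": "Support",  "succeeded": False},
    {"first_click": "Products", "succeeded": True},
    {"first_click": "About",    "succeeded": False},
]
correct_first = "Products"  # assumed: the branch that contains the answer

first_click_rate = sum(r["first_click"] == correct_first
                       for r in results) / len(results)
success_rate = sum(r["succeeded"] for r in results) / len(results)
print(f"First-click success: {first_click_rate:.0%}, "
      f"task success: {success_rate:.0%}")
```

Comparing the two rates is informative in itself: a high first-click rate with low task success points at problems deeper in the tree rather than at the top-level labels.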

Tree testing


Site search metrics


Key metrics: ► Search results precision ► Search usability ► Content ‘searchability’

Best for: ► Addressing search issues ► Auditing content ► Comparing against search best practice
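Search results precision is commonly measured as precision-at-k: the share of the first k results that are relevant to the query. A sketch under that convention, with invented result IDs and relevance judgements:

```python
def precision_at_k(results, relevant, k=10):
    """Share of the top-k results judged relevant to the query."""
    top = results[:k]
    return sum(r in relevant for r in top) / len(top)

# Invented example: results for a 'weekend events' query, with a manually
# judged set of relevant pages.
results = ["events-dec", "events-jan", "parking", "history", "events-weekend"]
relevant = {"events-dec", "events-jan", "events-weekend"}
print(precision_at_k(results, relevant, k=5))  # → 0.6
```

The hard part in practice is the relevance judgement, which needs a human reviewer per query; the arithmetic itself is trivial.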

Site search metrics


Search on ‘weekend events’ on 17 December 2015

Metrics and the lifecycle of a website

[Diagram mapping methods to website lifecycle stages: top task identification, user testing, analytics, site search measurement, A/B testing, 1-minute survey, true intent]

Benchmarking: How metrics can be used to track site performance over time

The value of measuring behaviour over time

Benchmarking can tell you whether changes to your site make users more successful and satisfied.

Benchmarking can tell you how best to allocate development resources to improve the overall user experience.
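Whether a change between benchmark rounds is real or noise can be checked with a confidence interval on the difference in completion rates. A sketch with invented round sizes, using the normal approximation:

```python
from math import sqrt

# Two benchmark rounds (hypothetical numbers): participants and successes.
n1, ok1 = 120, 78    # round 1: 65% completion
n2, ok2 = 120, 96    # round 2: 80% completion

p1, p2 = ok1 / n1, ok2 / n2
# Standard error of the difference between two independent proportions.
se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
lo, hi = (p2 - p1) - 1.96 * se, (p2 - p1) + 1.96 * se  # 95% CI

print(f"Change: {p2 - p1:+.0%} (95% CI {lo:+.1%} to {hi:+.1%})")
```

If the whole interval sits above zero, as it does for these invented numbers, the improvement between rounds is unlikely to be sampling noise.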


Benchmarking


www.dilbert.com

All metrics

Completion rates, SEQ, SUS, NPS, user goals, path analysis, subjective ratings of success, conversion rates, time on task, bounce rates, time on site, top tasks, tiny tasks, success/failure/disaster rates, search results precision, search usability, content searchability, take action priorities, first-click success rate, ERQ

Example: informational website

[The ‘All metrics’ map repeated, highlighting those relevant to an informational website]

Example: Benchmarking for an informational website

ORIGINAL SITE – Benchmarking metrics: ► Top task identification ► Success/failure/disaster rates ► Time on task ► Take action priorities

Example: Benchmarking for an informational website

Benchmarking metrics: ► Success/failure rates ► Time on task ► SEQ ► Take action priorities ► Search results precision ► Search usability ► Content searchability

Tools for an informational website

Benchmarking metrics: ► Success/failure rates ► Time on task ► SEQ ► Take action priorities ► Search results precision ► Search usability ► Content searchability

Tools: ► Remote quant user testing ► One-minute survey ► Site search metrics

E-Commerce website: increase conversion


[The ‘All metrics’ map repeated, highlighting those relevant to an e-commerce conversion goal]

E-Commerce benchmarking: increase conversion

Benchmarking metrics: ► User goals ► Subjective success ► SEQ ► NPS ► Conversion rates ► Take action priorities

Tools for an E-Commerce website: increase conversion

Benchmarking metrics: ► User goals ► Subjective success ► SEQ ► NPS ► Conversion rates ► Take action priorities

Tools: ► True intent study ► Analytics ► One-minute survey
