Transcript of Webinar: Common Mistakes in A/B Testing

Page 1: Webinar: Common Mistakes in A/B Testing

A/B-testing Mistakes & Quick Fixes

Page 2: Webinar: Common Mistakes in A/B Testing

CRO Expert & A/B-testing Ninja

Partner Manager, Benelux & Nordics

Your hosts

Page 3: Webinar: Common Mistakes in A/B Testing

No. 1 Website Optimization Platform

Delivering the best customer experiences at every touchpoint on the web and in mobile apps

Page 4: Webinar: Common Mistakes in A/B Testing

No. 1 in Scandinavia in online conversion rate optimization

The only 2-Star Solution Partner in Scandinavia!

Page 5: Webinar: Common Mistakes in A/B Testing

Agenda

● Brief overview of A/B-testing
● Common A/B-testing mistakes
● Some customer cases
● Summary
● Q&A

Page 6: Webinar: Common Mistakes in A/B Testing

Brief overview of A/B-testing

Page 7: Webinar: Common Mistakes in A/B Testing

What is A/B-testing?

➔ An optimization method that involves testing different versions of a web page (or app)

➔ The variations are identical except for a few things that might affect a user's behavior

➔ Statistical calculations are made to check that the observed effect is not a coincidence (a simplified example follows below)
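
To make the third point concrete, here is a minimal sketch (not from the webinar) of the kind of significance calculation an A/B-testing tool performs behind the scenes, using a plain two-proportion z-test. The visitor and conversion counts are made-up example numbers.

# Two-proportion z-test: is the difference between the original (A)
# and the variation (B) likely to be more than coincidence?
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # two-sided
    return z, p_value

z, p = z_test(conv_a=400, n_a=10_000, conv_b=460, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.3f}")   # p < 0.05 suggests it is not a coincidence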

Page 8: Webinar: Common Mistakes in A/B Testing

Here’s how it works

Page 9: Webinar: Common Mistakes in A/B Testing

An A/B-testing tool in a nutshell
Three primary things:

● Visitors are randomly selected to see different variations (cookies are stored; see the sketch below)
● Keeping track of your KPIs
● Downloading content from the cloud or redirecting the visitor to a different URL
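
As a rough illustration of the first point (and not Optimizely's actual implementation), a tool can assign visitors to variations deterministically and persist the assignment, so a returning visitor always sees the same version. The function names and IDs here are hypothetical.

# Assign each visitor to a variation in a stable, pseudo-random way.
import hashlib

VARIATIONS = ["original", "variation_1"]

def assign_variation(visitor_id: str, experiment_id: str) -> str:
    # Hashing visitor + experiment makes the split stable per visitor
    # but independent across experiments.
    digest = hashlib.md5(f"{experiment_id}:{visitor_id}".encode()).hexdigest()
    return VARIATIONS[int(digest, 16) % len(VARIATIONS)]

# In practice the result is stored in a cookie so the assignment
# survives page reloads and return visits.
print(assign_variation(visitor_id="abc-123", experiment_id="signup-test"))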

Page 10: Webinar: Common Mistakes in A/B Testing

Why should you test?

➔ To learn more about the visitor's behaviour in order to formulate new hypotheses

➔ To achieve your online goals, e.g. increased sales or more leads

Page 11: Webinar: Common Mistakes in A/B Testing

10 common mistakes

Page 12: Webinar: Common Mistakes in A/B Testing

You test everything

Put your findings into buckets (red = not suitable for testing):

Testing area: There is an obvious opportunity to shift behaviour, expose insight or increase the number of conversions.

Just Do It (JFDI): Issues where a fix is easy to identify or the change is a no-brainer.

Explore: You need more information to triangulate the problem. If an item is in this bucket, you need to do further digging and gather more data points.

Page 13: Webinar: Common Mistakes in A/B Testing

No (analytics) integration

● Troubleshooting tests
● (Segmenting results)
● Tests that "flip"
● Tests that don't make any sense
● Broken tests
● What drives the difference

Page 14: Webinar: Common Mistakes in A/B Testing

Best-in-class Integrations

Page 15: Webinar: Common Mistakes in A/B Testing

Your test will finish in 100 years!

★ Use a test duration calculator (a simplified version of the calculation is sketched below)
★ https://www.optimizely.com/resources/sample-size-calculator
★ http://apps.conversionista.se/visual-test-duration-calculator/
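
For reference, here is a simplified sketch of the calculation such calculators perform: the sample size per variation needed to detect a given relative lift, assuming a two-sided 95% significance level and 80% power. The baseline rate, lift and traffic numbers are made-up examples.

# Approximate sample size per variation for a two-proportion test.
def sample_size_per_variation(baseline_rate, relative_lift):
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha, z_beta = 1.96, 0.84                 # 95% significance, 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int(((z_alpha + z_beta) ** 2 * variance) / (p2 - p1) ** 2)

n = sample_size_per_variation(baseline_rate=0.03, relative_lift=0.10)
print(n)              # visitors needed in EACH variation (roughly 53,000)
print(2 * n / 5_000)  # ~days to finish at 5,000 visitors/day with a 50/50 split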

Page 16: Webinar: Common Mistakes in A/B Testing

You draw conclusions based on an ongoing test

Page 17: Webinar: Common Mistakes in A/B Testing

Optimizely's Stats Engine
● New way of measuring significance in a dynamic environment

Results
● Make a decision as soon as you see significant results
● Test many goals and variations accurately at the same time
● No extra work for experimenters

Traditional Statistics vs Stats Engine:

● Percent of tests with winners or losers declared: 36% vs 22%
● Percent of tests with a change in significance declaration: 37% vs 4%

(The effect of peeking with traditional statistics is illustrated with a small simulation below.)
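
The peeking problem the Stats Engine addresses can be shown with a quick simulation (illustrative only, not how Stats Engine itself works): if both variations are identical and you check a traditional fixed-horizon p-value after every batch of visitors, stopping at the first p < 0.05, you declare a "winner" far more often than the nominal 5%.

# Simulate repeatedly peeking at an A/A test with traditional statistics.
import random
from math import sqrt, erf

def p_value(conv_a, n_a, conv_b, n_b):
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) or 1e-9
    z = (conv_b / n_b - conv_a / n_a) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

def peeking_finds_false_winner(rate=0.05, batch=200, batches=40):
    ca = cb = na = nb = 0
    for _ in range(batches):
        ca += sum(random.random() < rate for _ in range(batch)); na += batch
        cb += sum(random.random() < rate for _ in range(batch)); nb += batch
        if p_value(ca, na, cb, nb) < 0.05:
            return True          # stopped early on a spurious "winner"
    return False

runs = 200
hits = sum(peeking_finds_false_winner() for _ in range(runs))
print(f"False positives with peeking: {hits / runs:.0%}")  # well above 5%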

Page 18: Webinar: Common Mistakes in A/B Testing

Your hypothesis is crap

Use input from:
● Segmenting
● Customer service
● Session replay
● Eyetracking
● User testing
● Form analytics
● Search analysis
● A/B-testing
● Web analysis
● Competitors
● Customer contacts
● Surveys

Page 19: Webinar: Common Mistakes in A/B Testing

Solution: Question your ideas

http://dah.la/hypothesis-creator

Read the blog post about how to use the formula: https://conversionista.se/ab-test-hypoteser/

IAR

Page 20: Webinar: Common Mistakes in A/B Testing

Magine TV: Internet TV Streaming Service

Page 21: Webinar: Common Mistakes in A/B Testing

The challenge: More leads without changing the sign-up

Page 22: Webinar: Common Mistakes in A/B Testing

The landing page

Page 23: Webinar: Common Mistakes in A/B Testing

Scroll map analysis: Generates a map based on where the visitors of your website click or scroll

Page 24: Webinar: Common Mistakes in A/B Testing

The analysis with Google Analytics

In the funnel visualization reports, we found a bigger drop-off between the signup page and the thank-you page than between the landing page and the signup page

Page 25: Webinar: Common Mistakes in A/B Testing

The Hypothesis

Since we have observed that [we have a big drop-off between the Signup and the Thank You page], by [analyzing the data in Google Analytics and Crazy Egg], we want to [move up the "Instructions"], which should lead to [more people signing up]. The effect will be measured by [the number of people signing up].

http://dah.la/hypothesis-creator

The hypothesis formula

Page 26: Webinar: Common Mistakes in A/B Testing

The test: Original vs. Variation

KEY CHANGES: Move the instructions to the top of the page

Page 27: Webinar: Common Mistakes in A/B Testing

Variation 1 outperforms the Original

(Results chart: the micro conversion goals and the macro conversion goal)

Page 28: Webinar: Common Mistakes in A/B Testing
Page 29: Webinar: Common Mistakes in A/B Testing

Your tests are not prioritized

(Prioritization matrix: Opportunity from low to high on one axis, Effort from low to high on the other)

Opportunity factors to take into consideration:
➔ Potential
➔ Scale
➔ Goal

Effort factors to take into consideration:
➔ Complexity
➔ Resources
➔ Decisions

(A simple way to turn these factors into a ranked backlog is sketched below.)
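
One simple way to apply these factors (a sketch with made-up scores, not a method shown in the webinar) is to rate each idea on opportunity and effort and rank the backlog:

# Score test ideas: high opportunity and low effort rise to the top.
ideas = [
    {"name": "Move instructions above the fold", "opportunity": 4, "effort": 1},
    {"name": "Rebuild the whole checkout",       "opportunity": 5, "effort": 5},
    {"name": "New button colour in the footer",  "opportunity": 1, "effort": 1},
]

for idea in ideas:
    idea["score"] = idea["opportunity"] / idea["effort"]

for idea in sorted(ideas, key=lambda i: i["score"], reverse=True):
    print(f'{idea["score"]:.1f}  {idea["name"]}')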

Page 30: Webinar: Common Mistakes in A/B Testing

Spotify: Spotify's Premium Trial Flow

Page 31: Webinar: Common Mistakes in A/B Testing

Original

● Premium Trial page (US)
● High drop-off
● Asked to provide credit card details to start the premium trial

Page 32: Webinar: Common Mistakes in A/B Testing

Input

● User testing
● Short survey →
   ○ Data shows that the primary reason not to start the premium trial is:
      ■ Does not want to give away their credit card details

Page 33: Webinar: Common Mistakes in A/B Testing

Test Hypothesis

"Since we have observed through DATA ANALYSIS that a large share of those who leave the premium flow (in the data) do so because they DO NOT WANT TO GIVE AWAY their payment details, we will explain WHY they have to provide them, which should lead to more people doing so. We will measure this in the number of purchases."

http://dah.la/hypothesis-creator

The hypothesis formula

Page 34: Webinar: Common Mistakes in A/B Testing

Hypothesis: "Give the user a reason..."

B: "We only use this to verify your account, you won't be charged anything for your trial"

C: "We need this because our music deals only allow free trials for users that are credit card or PayPal holders"

D: "We need this just in case you decide to stay Premium after your free month"

Page 35: Webinar: Common Mistakes in A/B Testing

Test Results

(Chart: conversion per variation on the credit card page and the Thank You page)

C. "Because of our music..."
B. "Verify your account…"
D. "If you want to continue..."
A. Original

Page 36: Webinar: Common Mistakes in A/B Testing
Page 37: Webinar: Common Mistakes in A/B Testing

You run a “bad” test

Page 38: Webinar: Common Mistakes in A/B Testing

Swedoffice: B2B E-Commerce Site

Page 39: Webinar: Common Mistakes in A/B Testing

Original

Page 40: Webinar: Common Mistakes in A/B Testing

Solution

Page 41: Webinar: Common Mistakes in A/B Testing

Test Results: No difference between the variations

A/B-test (1)

Original Variation

Page 42: Webinar: Common Mistakes in A/B Testing

Why?!

Page 43: Webinar: Common Mistakes in A/B Testing

Retake

Page 44: Webinar: Common Mistakes in A/B Testing

A/B-test (2)

Conversions: +6%
Revenue per Visitor: +10%

Original Variation

Page 45: Webinar: Common Mistakes in A/B Testing
Page 46: Webinar: Common Mistakes in A/B Testing

You don’t isolate the variations and end up with no change

Page 47: Webinar: Common Mistakes in A/B Testing

Different traffic sources not taken into consideration

Maximize ROI on your PPC investment

Page 48: Webinar: Common Mistakes in A/B Testing

Optimizely: How Optimizely maximized ROI on their PPC investment

Page 49: Webinar: Common Mistakes in A/B Testing

Google Keyword-Insertion

Page 50: Webinar: Common Mistakes in A/B Testing

Creating Symmetry
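
The idea behind the symmetry: the ad headline uses Google Ads keyword insertion (e.g. "{KeyWord:Website Optimization}"), and the landing page mirrors the same keyword so visitors see the phrase they just clicked on. A minimal sketch of the landing-page side follows; the query parameter name ("kw"), the route and the default text are assumptions, not Optimizely's actual setup.

# Landing page that echoes the ad keyword passed in the URL,
# e.g. /landing?kw=A%2FB%20Testing
from html import escape
from flask import Flask, request

app = Flask(__name__)

@app.route("/landing")
def landing():
    # Fall back to a generic headline if no keyword was passed along.
    keyword = request.args.get("kw", "Website Optimization")
    return f"<h1>Get More Conversions with {escape(keyword)}</h1>"

if __name__ == "__main__":
    app.run()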

Page 51: Webinar: Common Mistakes in A/B Testing

Original

Page 52: Webinar: Common Mistakes in A/B Testing

Variation


Page 53: Webinar: Common Mistakes in A/B Testing

Results

● 39% increase in # of Sales Leads
● Bounce Rate decreased
● Google Quality Score went up
● Cost per Lead went down

Page 54: Webinar: Common Mistakes in A/B Testing
Page 55: Webinar: Common Mistakes in A/B Testing

Don't get risky - be aware of bugs

- Make sure not to direct all traffic to a "broken" or badly performing variation
- Preview your variations in cross-browser tests
- Use phased rollouts to avoid dissatisfaction (a generic rollout sketch follows below)
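
A generic way to implement such a phased rollout (a sketch assuming a hash-based split, not Optimizely's specific feature) is to expose the new variation to a small, stable share of visitors and ramp the percentage up as confidence grows:

# Phased rollout: start at e.g. 5%, then raise ROLLOUT_PERCENT step by step.
import hashlib

ROLLOUT_PERCENT = 5

def sees_new_variation(visitor_id: str) -> bool:
    digest = hashlib.md5(f"rollout:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100     # stable bucket 0-99 per visitor
    return bucket < ROLLOUT_PERCENT

print(sees_new_variation("abc-123"))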

Page 56: Webinar: Common Mistakes in A/B Testing

Phased Rollouts

Page 57: Webinar: Common Mistakes in A/B Testing

Phased Rollouts: The Sad Story...

Page 58: Webinar: Common Mistakes in A/B Testing

Using code blocks to be flexible

Page 59: Webinar: Common Mistakes in A/B Testing

Phased Rollouts: The Happy Story...

Page 60: Webinar: Common Mistakes in A/B Testing

Inkcards’ challenge

Page 61: Webinar: Common Mistakes in A/B Testing

Phased Rollouts

Page 62: Webinar: Common Mistakes in A/B Testing
Page 63: Webinar: Common Mistakes in A/B Testing

Summary: Common Testing Mistakes

➔ You test everything on your site
➔ No integrations
➔ Your test will finish in 100 years
➔ You draw conclusions based on an ongoing test
➔ You put in too little effort on your hypothesis
➔ Your test isn't prioritized
➔ You don't learn anything
➔ You change everything at once
➔ You don't account for different traffic sources
➔ Be aware of bugs

Page 64: Webinar: Common Mistakes in A/B Testing

Key takeaways

1. The only bad test is the one where you don't learn anything

2. Expect the unexpected

3. Only test where you can trigger a behaviour change - where we make decisions

4. Formulate your test hypothesis WELL !important

REMEMBER & DON'T FORGET

Page 65: Webinar: Common Mistakes in A/B Testing

Q&A

Page 67: Webinar: Common Mistakes in A/B Testing

Thanks!

CRO Expert & A/B-testing Ninja

Partner Manager, Benelux & Nordics

conversionista.se optimizely.com