

Evaluation of Effectiveness

Part 5: Principles: How to Win the Battle of the Buzz

Chapter 19


Questions We’ll Answer

• How well do you understand why and how advertising evaluation is conducted?

• Can you list and explain the stages of message evaluation?

• What are the key areas of media evaluation?

• How are campaigns and IMC programs evaluated?

CHAPTER KEY POINTS


• Many executives feel advertising is successful only if it produces sales.

• Others feel advertising should emphasize long-term brand building.

• If advertising delivers the desired communication effects, but sales don’t increase, was the advertising ineffective?

How does impact work?

IMPACT: DOES IT WORK?


• Intuitive analysis is based on an experienced manager’s judgment.

• Measurement tracks consumer responses with structured feedback like response cards and calls.

• Formal evaluation is necessary:
– Financial stakes are high—production of a :30 spot averages $200,000; national media costs run several million.
– Advertising optimization—reducing the risk of failure by testing, analyzing, and tracking performance, and making changes to improve it.
– Identify “best practices”—what works and what doesn’t, so brand advertising continues to improve.

Evaluating Effectiveness

IMPACT: DOES IT WORK?


• Testing—to predict results
– Sample ads are tested before they run.

• Monitoring—to track performance
– Performance is tracked to see if anything needs to be changed.

• Measurement—to evaluate the results
– The results, or actual effects, are measured after the campaign runs.

Types of Evaluation

IMPACT: DOES IT WORK?


1. Developmental research
• Pretesting to see if an idea will work, or whether another is better.

2. Concurrent research
• Tracking studies and test marketing to see how the campaign is unfolding and how messages and media are working.

3. Posttesting research
• Comparing the impact of a campaign after it’s over against a benchmark, baseline, or other starting point.

4. Diagnostic research
• Taking an ad apart to see which elements are working and which aren’t; examining it frame by frame or piece by piece.

Stages of Evaluation

IMPACT: DOES IT WORK?


• It’s difficult to measure advertising’s effect on sales:
– Other factors affect sales (e.g., pricing, distribution, competition), making it hard to isolate advertising’s impact.
– Effects are delayed, so it’s hard to link sales to advertising.

• Communication effects can be measured as surrogate measures for sales impact:
– Awareness of the advertising, purchase intention, preference, liking.

• Good evaluation plans, as well as effective promotional work, are guided by a model of how people respond to advertising.

Facets: Measuring Responses

IMPACT: DOES IT WORK?


• Companies that conduct research and use diagnostic methods to identify an ad’s strong and weak points:

– Ameritest: brand linkage, attention, motivation, communication, flow of attention and emotion through the commercial.

– ARS: persuasion, brand/ad recall, communication.

– Diagnostic Research: brand recall, main idea, attribute statements (importance, uniqueness, believability).

– IPSOS-ASI: recall, attention, brand linkage, persuasion (brand switch, purchase probability), communication.

– Mapes and Ross: brand preference change, ad/brand recall, idea communication, key message delivery, like/dislike, believability, comprehension, desire to take action, attribute communication.

– Millward Brown: branding, enjoyment, involvement, understanding, ad flow, brand integration, feelings toward the ad, main stand-out idea, likes/dislikes, impressions, persuasion, new news, believability, relevance.

– RoperASW: overall reaction, strengths and weaknesses, understanding, clutter-busting, attention, main message, relevance, appeal, persuasiveness, motivate trial, purchase intent.

Copy Testing

MESSAGE EVALUATION


• Concept Testing
– Compares the effectiveness of various message strategies and their creative ideas (the Big Idea).

• Pretesting
– Helps marketers make final go/no-go decisions about finished or nearly finished ads, using photoboards or animatics.

• Diagnostics
– Designed to diagnose the strengths and weaknesses of ideas, to improve work still in development or to learn more in order to improve subsequent advertisements.

Message Development Research

MESSAGE EVALUATION


During Execution: Concurrent Testing

MESSAGE EVALUATION

• Coincidental Surveys
– In broadcast media, random calls to the target market determine station choices, ads they’ve seen or heard, and brand perceptions.

• Tracking Studies
– Every 3 to 6 months, measure top-of-mind brand awareness (a tally sketch follows this list).
– Brand tracking follows the performance of the brand itself.

• Test Markets
– Evaluate product variations or campaign and media elements.
– Generally two or more test markets, with other markets as controls.
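A minimal sketch of how top-of-mind awareness could be tallied from one wave of tracking-study interviews. The survey data, brand names, and the rule of counting the first unaided mention are illustrative assumptions, not from the text.

from collections import Counter

# Hypothetical unaided-awareness responses from one tracking wave.
# Each list holds the brands a respondent named, in the order mentioned;
# the first mention is treated as that respondent's top-of-mind brand.
responses = [
    ["BrandA", "BrandB"],
    ["BrandB"],
    ["BrandA", "BrandC", "BrandB"],
    ["BrandC"],
    ["BrandA"],
]

top_of_mind = Counter(mentions[0] for mentions in responses if mentions)
total = sum(top_of_mind.values())

for brand, count in top_of_mind.most_common():
    print(f"{brand}: {count / total:.0%} top-of-mind awareness")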


Posttesting: After-Execution Research

MESSAGE EVALUATION

• Breakthrough: attention—interest, enjoyability, liking
• Engagement tests—eye-tracking as readers scan ads
• Memory tests—recognition tests, recall tests, unaided recall, aided recall
• Emotion tests—MRI measures brain activity
• Likeability tests—relevant, important, enjoyable, entertaining, fun
• Persuasion tests—intention to buy, motivation
• Inquiry tests—measure the number of responses to an ad
• Scanner research—tally up purchases and collect consumer buying info
• Single-source research—advertising and brand purchase data come from the same households, linking advertising to sales


• How did each media vehicle perform? Were reach and frequency objectives met? (A sketch of this check appears below.)

• Services include Simmons-Scarborough, Arbitron, and MediaMark.

• For outdoor, traffic counts don’t equal exposure.

• For Web or Internet advertising, what is measured, and how does it compare to traditional media: hits, click-throughs, minutes spent?

• Alternative or guerrilla marketing is even more difficult to equate to traditional media.

Evaluating Audience Exposure

MEDIA EVALUATION
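A sketch of the reach-and-frequency check mentioned above, assuming the common planning relationship GRPs = reach (%) x average frequency; the planned and delivered figures are hypothetical.

# Planned vs. delivered media performance (hypothetical figures).
planned = {"reach_pct": 65, "avg_frequency": 4.0}
delivered = {"reach_pct": 58, "avg_frequency": 4.6}

def grps(plan):
    # Gross rating points: reach (in percent) times average frequency.
    return plan["reach_pct"] * plan["avg_frequency"]

print("Planned GRPs:  ", grps(planned))
print("Delivered GRPs:", grps(delivered))
print("Reach objective met:", delivered["reach_pct"] >= planned["reach_pct"])
print("Frequency objective met:", delivered["avg_frequency"] >= planned["avg_frequency"])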


Advertising ROI and Media Efficiency

MEDIA EVALUATION

• Return on investment (the cost-to-sales ratio) is hard to calculate because many factors affect sales.

• How do you determine if you’re overadvertising or underadvertising?

• Wearout—recall stabilizes or declines and irritation increases until response falls off or stops (can be a combination of creative impact and media buying).

• Media optimization—the goal is optimum media performance: getting the most impact for the investment. (A sketch of the basic efficiency arithmetic appears below.)
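A short sketch of the efficiency arithmetic behind these points: an advertising-to-sales ratio as a rough ROI indicator, and cost per thousand (CPM) as a media-efficiency comparison. All figures, and the assumption that incremental sales can be isolated at all, are hypothetical.

# Hypothetical campaign figures.
ad_spend = 2_500_000           # total advertising cost
incremental_sales = 9_000_000  # sales attributed to the campaign (hard to isolate in practice)

ad_to_sales_ratio = ad_spend / incremental_sales
print(f"Advertising-to-sales ratio: {ad_to_sales_ratio:.1%}")

# CPM = cost per thousand impressions, a common media-efficiency yardstick.
vehicles = {
    "Prime-time TV": {"cost": 350_000, "impressions": 12_000_000},
    "Magazine":      {"cost":  60_000, "impressions":  1_800_000},
    "Web display":   {"cost":  25_000, "impressions":  5_000_000},
}
for name, v in vehicles.items():
    cpm = v["cost"] / (v["impressions"] / 1000)
    print(f"{name}: CPM = ${cpm:.2f}")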


• Last, and perhaps most important, stage in the development of a campaign plan.

• Determines whether the campaign’s message and media were effective.

• Measures the overall impact on the brand, but the pieces are still evaluated to determine their individual effectiveness.

Why evaluate campaigns?

EVALUATING MARKETING COMMUNICATIONS CAMPAIGNS


• Certain marketing communication functions, such as public relations and sales promotion, do some things better than others.

• An integrated plan uses the best tools to accomplish the desired effect.

Marcom Tools

EVALUATING MARKETING COMMUNICATIONS CAMPAIGNS

Principle: Advertising is particularly effective in accomplishing such objectives as creating exposure, awareness, and brand image, and delivering brand reminders.


• The objective is to generate an immediate behavioral response (a transaction, a purchase).

• Use toll-free numbers, mail-in coupons, a Web site or email address, or an offer in the copy.

• Response is easy to measure in terms of effectiveness and ROI.
– Total responses divided by thousands of pieces mailed = responses per thousand (RPM); see the sketch below.

Direct Response

EVALUATING MARKETING COMMUNICATIONS CAMPAIGNS
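A minimal sketch of the response-per-thousand arithmetic referenced above; the mailing volume and response count are hypothetical.

# Hypothetical direct-mail results.
pieces_mailed = 250_000
responses = 3_100

response_rate = responses / pieces_mailed
rpm = responses / (pieces_mailed / 1000)  # responses per thousand pieces mailed

print(f"Response rate: {response_rate:.2%}")
print(f"Responses per thousand (RPM): {rpm:.1f}")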


• It may be necessary to evaluate both trade and consumer promotions.

• Payout analysis compares the costs of a promotion to the sales it is expected to generate.

• Breakeven analysis finds the point at which the total revenue generated by the promotion equals its total cost. (A sketch of both calculations appears below.)

Sales Promotion

EVALUATING MARKETING COMMUNICATIONS CAMPAIGNS
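A sketch of the payout and breakeven calculations for a consumer promotion. The promotion cost, expected incremental volume, and contribution margin are hypothetical assumptions.

# Hypothetical promotion: a coupon drop with redemption costs.
promotion_cost = 400_000             # printing, distribution, redemption handling
expected_incremental_units = 180_000
contribution_per_unit = 2.50         # margin earned on each incremental unit sold

# Payout analysis: expected incremental contribution minus the promotion's cost.
expected_payout = expected_incremental_units * contribution_per_unit - promotion_cost
print(f"Expected payout: ${expected_payout:,.0f}")

# Breakeven analysis: the volume at which incremental contribution equals the cost.
breakeven_units = promotion_cost / contribution_per_unit
print(f"Breakeven volume: {breakeven_units:,.0f} incremental units")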


• Measure success in getting out the message in terms of outputs and outcomes
– Output: materials produced and distributed; how many press releases ran
– Outcome: acceptance and impact of materials; changes in public opinion

• Content analysis: Was coverage favorable?

• Public opinion studies: Have attitudes, behaviors, or knowledge changed?

Public Relations

EVALUATING MARKETING COMMUNICATIONS CAMPAIGNS


• Traffic volume
– Page views
– Site visitors

• Click-through rates
– Ads sold as pay-per-click

• Cost per lead
– An attempt to measure ROI using a conversion rate (the percentage of visitors who complete the desired action); see the sketch below.

Web Site Evaluation

EVALUATING MARKETING COMMUNICATIONS CAMPAIGNS
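A sketch of how these Web metrics fit together: click-through rate, conversion rate, and cost per lead. The traffic and spend figures are hypothetical.

# Hypothetical figures for one month of paid Web advertising.
impressions = 2_000_000
clicks = 8_000
ad_spend = 12_000
visitors = 7_500    # site visitors arriving from the ads
conversions = 300   # visitors who complete the desired action

click_through_rate = clicks / impressions
conversion_rate = conversions / visitors
cost_per_lead = ad_spend / conversions

print(f"Click-through rate: {click_through_rate:.2%}")
print(f"Conversion rate:    {conversion_rate:.1%}")
print(f"Cost per lead:      ${cost_per_lead:.2f}")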


Special Advertising Situations

EVALUATING MARKETING COMMUNICATIONS CAMPAIGNS

• Retail advertising
– Objective: generate store traffic. Measured by simple counts of people at promotions and events.
– Objective: visibility. Measured by participation counts at events or “how-to” classes, and by sign-up and fill-out forms.
– Objective: loyalty. Measured by participation in frequency clubs or loyalty programs.

• B2B advertising and international advertising: see the following slides.


Special Advertising Situations

EVALUATING MARKETING COMMUNICATIONS CAMPAIGNS

• B2B advertising
– Objective: generate responses and sales leads. Lead count based on calls, emails, and cards returned to the advertiser.
– Objective: conversion. Conversion rate: the number of leads who make a purchase (a sketch of this arithmetic follows below).
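A minimal sketch of the B2B lead-count and conversion-rate arithmetic mentioned above; the lead sources and counts are hypothetical.

# Hypothetical B2B lead sources and outcomes.
leads = {"calls": 140, "emails": 320, "cards_returned": 90}
purchases = 66  # leads that became customers

total_leads = sum(leads.values())
conversion_rate = purchases / total_leads

print(f"Total leads: {total_leads}")
print(f"Conversion rate: {conversion_rate:.1%}")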


Special Advertising Situations

EVALUATING MARKETING COMMUNICATIONS CAMPAIGNS

• International advertising
– Difficult to evaluate because of the number of markets, the distance, the cost, and the variety of cultures.
– Evaluation should focus initially on pretesting, to help correct big problems (due to unfamiliarity with the culture, language, or consumer behavior) before they occur.


• It’s difficult to evaluate and estimate the impact of synergy.

• Brand tracking can measure campaign effectiveness by adding and taking away ingredients, and studying the effects of those changes.

• The challenge: look at the big picture rather than individual pieces and parts.

• Advertisers seek an evaluation method that brings all the individual metrics together to efficiently and effectively evaluate and predict communication effectiveness.

Campaign Evaluation

EVALUATING MARKETING COMMUNICATIONS CAMPAIGNS


All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher. Printed in the United States of America.

Copyright © 2009 Pearson Education, Inc. Publishing as Prentice Hall