Improving Response to Household Surveys Using Mail Contact to Request Responses over the Internet: Results From a Series of Experiments
For: Statistics Korea, September 16, 2014
By
Don A. Dillman*
*Regents Professor, Department of Sociology and the
Social and Economic Sciences Research Center,
Washington State University, Pullman, WA 99164-4014
Background
Thank you for the opportunity to speak to this conference.
Let me begin by qualifying my remarks.
This is my first trip to Korea, and I have much to learn about Korean culture and the implications that has for designing effective surveys.
I am looking forward to learning from you this week.
In this tutorial I will describe survey challenges we are facing in the U.S. and ways of overcoming them.
I look forward to learning from you whether these experiences might apply to Korea.
We are losing the telephone and random
digit dialing for sample survey data
collection in the U.S.
Low response rates; public opinion surveys < 10%.
Poor landline household coverage; 60% overall and 45% of households with children.
Cell-phone fill-ins are person-specific
Geographic location of respondents is uncertain.
Less and less interaction in the U.S. now occurs over voice telephone.
Communication is now mostly asynchronous (e.g. texting)
Telephone surveys have become a cultural misfit.
We now bank electronically, buy goods electronically, and get most of our information over the Internet; why should surveys still be done by telephone?
The Internet is not yet a good
replacement for the telephone
Household coverage is around 75-80%.
Household member use of the internet is not universal; a division of labor in some households
There is no household sample frame
Individuals have multiple email addresses.
Email-only contact produces low response rates.
Smart phones are replacing laptops for some people, and they are not survey-friendly.
Internet response rates from email-only contact are often as low as telephone response rates.
In the U.S. we need to get past
two beliefs that are no longer true
Telephone alone is adequate for most surveys.
Web, using only email contact, is an adequate replacement for most telephone surveys.
Telephone-only surveys have many significant drawbacks: social desirability, forced shortening of questions, severe length limitations, and more.
Email-only/web surveys do not solve response rate concerns for general populations; in addition, they accentuate nonresponse error (higher-education people are more likely to respond).
Are mail surveys a possibility?
Perhaps. They now provide our best coverage for households, via address-based sampling of U.S. Postal Service residential addresses.
Mail is our highest response rate mode (except for government in-person surveys).
Five regional, state and inter-state tests of 12-page questionnaires from WSU, 2007-2012, produced response rates of 38-71% (mean 53%).
Postal mail’s function is changing (fewer letters from friends and less bill-paying) but, other functions are increasing.
Postal mail has become a partner for “orders by the web” and even delivery for United Parcel Service (UPS) and Federal Express.
But mail has also changed from a low cost to high cost mode of delivery (especially compared to the internet).
To overcome coverage and
response rate problems…
Mixed-mode surveys are increasingly common.
There are two quite different uses of mixed-mode:
Offer multiple response modes for answering questionnaires.
Use different contact modes to encourage people to respond by that mode or another mode.
Research reveals that “mode of contact” is considerably more powerful than the offered “response mode” for improving response rates.
How can mail contact be used to
get people to respond over the
Internet?
The rationale for using this approach to surveying and details for implementing it are provided in: Internet, Phone, Mail and Mixed-Mode Surveys: The Tailored Design Method, 4th edition, by Dillman, Smyth and Christian. John Wiley & Sons, Hoboken, NJ, USA.
It is a 75% revision of the 3rd edition published in 2009.
I will provide a synopsis of these methods in the remainder of this presentation.
Journal citations are provided at the end.
Book and related website wiley.com/go/dillman
provide methodological details of experiments
and survey implementation procedures
Coverage provides a compelling reason to
use household postal contact for
household surveys
The U.S. Postal Service “Delivery Sequence File”
includes 95-97% of all residential addresses in the U.S.
Household coverage is far better than for any telephone
or Internet list
Generally available through two contractors licensed by
the U.S. Postal Service
This list is frequently updated
Only occupied households receive delivery
Household addresses can be used to deliver mail
questionnaires and/or request completion of a web
survey
Mail contact does not restrict us to
Mail Responses
As web use increases, we can expect a greater % of
sample members who are willing and able to respond by
web
If enough responses are obtained, web response might
be less costly in terms of data entry and postage than
mail response
It is possible web response can be obtained more
quickly than mail response
Can we develop push-to-web systems that are as effective as, or more effective than, paper alone?
A potential strength of mail
contact
We can make multiple contacts and shape each in a different way to draw recipient attention and interest.
We can change later contacts to focus on different types of recipients (the nonrespondents).
Heterogeneity of households poses special challenges: number of adults, related vs. unrelated, education, employment, economic resources.
But, we need criteria for shaping each request.
Our response model focuses on making
behaviorally relevant changes between
initial contact and follow-ups
Initial Contact Elements
Survey topic
Individual questions
Questionnaire layout
Packaging of survey request
Correspondence appearance and
content
Incentives
Follow-up Adaptations
Reminder of intentions
Communications with new
arguments
Reinforcement of reciprocal
obligations
Change in implementation procedures, e.g., FedEx
Fit between mode of contact and
response mode
We use social exchange concepts to get people to go from a mail request to a web response.
Rewards or benefits: what one expects to
gain, and typically does gain, from being a
respondent.
Burden or costs: what one will have to give
up or spend to be a respondent.
Trust in delivery of benefits: expectation that in the long run rewards from being a respondent outweigh the burden and risks of responding.
What is different about Internet surveys and what can we do to overcome problems?
Benefits vary: Do not have to find a mailbox to return
questionnaire.
Faster to provide answers for some.
Technologies easier for some people but not others.
Overall, the benefit of responding by whatever mode is usually not great (see various preference studies).
What is different about responding over the Internet and what can we do to overcome these barriers?
Additional burden (or costs) for some people:
Effort required to get from letter to computer when we don’t have email addresses.
Computer literacy is low for some individuals.
Computer is not immediately operational.
Emails from a stranger are hard to find after 1-2 days.
Old software and poor connections for some. Required answers. Some devices are not amenable to answering most surveys.
What is different about Internet surveys and what can we do to overcome problems?
Trust is a much larger issue in Internet surveys:
Really bad things can happen on the Internet: malware and harvesting of addresses or account numbers.
Trust of email from a “stranger” is low.
The source of emails and websites is sometimes faked.
Will answers be stored in corporate or government files for future reference?
Legitimacy may be hard to verify.
To apply and test these ideas for improving response, we conducted five address-based household studies.
1. Lewiston, ID-Clarkston, WA Survey 2007
2. Washington Community Survey 2008
3. Washington Economic Survey 2009
4. WA, PA, AL Tri-state Electricity Survey 2011
5. WA and NE Water Management Survey 2012
All tests involved using mail contact to push respondents to the web by withholding the paper questionnaire from the initial mailings.
These five studies involved:
Designing the “next” study based upon results from the
previous study(ies); we added new features in each test
to see how response rates were affected and to reduce
non-response error.
35 experimental treatments were implemented, some of
which were controls carried forward from study to study.
Ineffective strategies were not carried forward.
Design Criteria
20-25 minute surveys
12-page questionnaires (on paper)
90-140 individual responses required
Used visual design principles and unified mode construction for
web and mail
Over time, we tested these
elements:
Pure mode choice (mail and web)
Effects of withholding the paper questionnaire until late:
Web+mail: withholding mail until the 3rd of 4 contacts
2web+mail: withholding mail until the 4th and final contact
Effects of requesting paper only response
Effect of providing web response directions
Effect of $5 cash incentive with web response request
Effect of $5 cash incentive with paper response request
Effect of a second incentive ($2 to $4)
Effect of out-of-state vs. in-state university sponsorship on response from other states
Can we use a postal request and incentives to obtain web responses?
E-mail survey requests cannot include token
cash incentives in a meaningful way—thus one
of our most effective ways of achieving response
is not available in such surveys.
Can we use mail contacts to deliver a token
incentive, while still requesting a web response?
Does it make a difference if we use an incentive
with more than one of the requests?
Why consider small cash incentives with the request?
They lower nonresponse error (differences
between respondents and the sample frame).
They work together with number of contacts.
From a social exchange perspective, we provided a small reward with the request to respond and encouraged development of trust that the study was important.
Two previous experiments illustrate these effects.
The kind of incentive makes a
big difference in results
Sending token $ with the request improves response rates significantly and reduces nonresponse error
Material incentives sent with request help, but
are much less effective than $
Payments afterwards, including charity
donations, are less effective
Explanation is the difference between social
and economic exchange
Enclosed incentives are not just used to improve response rates, but also to reduce nonresponse error
[Figure: response rates by age group (<35, 36-49, 50-60, 61+), with and without a $2 incentive, from the 1993 Survey of Washington State New Drivers License Holders.]
Incentives: a test of enclosed vs. post-payment incentives after each of four contacts
Response rates by contacts and incentives:

Incentive        1st Mailing (%)   2nd Mailing (%)   3rd Mailing (%)   4th Mailing (%)
No incentive          20.7              36.7              46.7              52.0
$1 cash               40.7              52.0              61.3              64.0
$5 cash               48.7              60.7              66.7              71.3
$5 check              52.0              62.7              66.7              67.3
$10 check             44.0              56.7              62.0              66.7
$20 check             54.0              70.7              75.3              79.3
$40 check             54.0              63.3              66.0              69.3
Promise of $50        23.3              43.3              53.3              56.7

Note: Each treatment group contained 150 subjects (James and Bolstein 1992).
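The table's main pattern, enclosed incentives far outperforming the promised $50 payment, can be verified with a quick calculation of each treatment's percentage-point gain over the no-incentive control at the final mailing (a minimal Python sketch using only the figures above):

```python
# Final-mailing response rates (%) from James and Bolstein (1992), as tabled above.
final_rates = {
    "No incentive": 52.0,
    "$1 cash": 64.0,
    "$5 cash": 71.3,
    "$5 check": 67.3,
    "$10 check": 66.7,
    "$20 check": 79.3,
    "$40 check": 69.3,
    "Promise of $50": 56.7,
}

control = final_rates["No incentive"]

# Percentage-point gain of each incentive over the no-incentive control.
gains = {k: round(v - control, 1) for k, v in final_rates.items() if k != "No incentive"}

for incentive, gain in sorted(gains.items(), key=lambda kv: -kv[1]):
    print(f"{incentive}: +{gain} points")
```

The promised $50 gains only 4.7 points over the control, while the enclosed $5 cash gains 19.3, consistent with the social exchange (vs. economic exchange) explanation.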
The research goal we were
pursuing
What elements should we hook together, and in what way, so that we could get high response rates and response quality (little or no nonresponse error)?
Perhaps, we thought, a “TDM” could be
developed for combining web and mail
responses, and not need to mix aural
methods that would introduce significant
measurement differences.
Experiment 1. 2007 Lewiston, ID-
Clarkston, WA Study: The prototype
The “Lewiston-Clarkston” study was the first of five experiments testing how we could use mail contacts to push people to respond by the web.
I will go into more detail setting up this study than the other
experiments to give you some background on our methods
This was a regional test in a blue-collar, lower-income, rural region of the U.S. If we can get elements of a method for pushing people to the web to work here, then maybe we can get it to work elsewhere (e.g., state-wide, national).
Detailed results are published in: Smyth, J.D., Dillman, D.A., Christian, L.M., & O’Neill, A. 2010. American Behavioral Scientist 53: 1423-1448.
The data collection procedures
12-page questionnaire, 50 items, up to 80 responses (depending upon branching); a 20-25 minute survey
Four contacts:
Pre-notice letter (we eliminated this from later experiments)
Questionnaire (or web request)
Thank-you postcard
Replacement questionnaire (adjusted by treatment)
$5 token cash incentive included with the initial mail questionnaire or web request
Data collected November 7, 2007, to January 10, 2008
We tailored our design to the
survey topic and location
Use of pictures of location to be surveyed
Creation of common screens for mail and web
Use of common branding for mail and web
Choice of stationery, envelopes and content based upon rethinking of personalization strategies, given that names could not be used
Unified-mode construction for mail and web
Tailoring/personalizing the survey
to the location and population
Photos taken of local landmarks, artwork, and symbols
to make survey recognizable and visually attractive
For example, consider the
cover and back page of the
mail questionnaire
Consider the opening page of
the web questionnaire
Design of the web survey—focus
on population not sponsor
Example: Question 2. Similar design format to the paper survey, and use of a familiar image in the upper left-hand corner of the screen.
We used a unified design between mail (on left) and web (on right)
Personalized Correspondence
All letters used WSU stationery.
A photo of the questionnaire cover was used to tie the different elements together.
Exterior of Envelopes (2nd and 4th Contacts)
Used WSU address labels
Used a return label showing the photo from the survey cover and the survey title to increase familiarity
We compared four treatments
1. Mail preference with web mention: Send mail
questionnaire and mention web with initial request
2. Push-to-mail: Send mail questionnaire but withhold
mention of web for about two weeks
3. Push-to-web: Web invitation with no mail questionnaire,
but explain that mail questionnaire will be sent in about
two weeks
4. Equal preference: It is your choice!
Initial withholding of mail
option drove 41% to the web!
Treatment                                         Web (%)   Paper (%)   Total (%)
Mail preference with web mention                      4         58          62
Push-to-mail (web in third contact)                   1         70          71
Push-to-web (mail questionnaire in 3rd of 4)         41         14          55
Equal preference (choice)                            13         50          63
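The totals are just web plus paper rates; the headline that withholding mail "drove 41% to the web" corresponds to web responses making up 41 of the 55 total points in the push-to-web row. A minimal sketch of that arithmetic, using the table's figures:

```python
# Response rates (%) by treatment from the Lewiston-Clarkston table above,
# as (web, paper) pairs.
treatments = {
    "Mail preference with web mention": (4, 58),
    "Push-to-mail": (1, 70),
    "Push-to-web": (41, 14),
    "Equal preference (choice)": (13, 50),
}

# Total response rate and the web share of completed responses per treatment.
totals = {name: web + paper for name, (web, paper) in treatments.items()}
web_share = {name: round(100 * web / totals[name]) for name, (web, _) in treatments.items()}

for name in treatments:
    print(f"{name}: total {totals[name]}%, {web_share[name]}% of completes came by web")
```

Only the push-to-web treatment pushes the web share to roughly three-quarters of completes; offering a choice drops it to about a fifth.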
Push-to-mail had the highest response rate; push-to-web had the lowest.
[Figure: stacked web and mail response rates by treatment: mail preference with web mention 62% (4 web + 58 mail), push-to-mail 71% (1 + 70), push-to-web 55% (41 + 14), choice preference 63% (13 + 50).]
When given the initial choice of web or mail, in the mail preference with web mention and choice preference groups, few respondents chose web.
High response rates are desirable, but we also need to focus on nonresponse error
It does not help much to improve response rates if
our respondents are different from non-respondents
on variables important to the study objectives
Thus, we need to compare respondent
characteristics on web vs. mail within the different
treatment groups.
If respondents are different from nonrespondents in
ways that affect results of the data analysis, then we
have a significant problem
In the push-to-web treatment,
web and mail respondents
were quite different
[Figure: demographic comparison of web vs. mail respondents in the push-to-web treatment; asterisks mark differences significant at p ≤ .05.]
Web and mail respondents in the push-to-web group were also different on 7 of 24 substantive attitude/opinion items:

Item                                                          Responded by Mail   Responded by Web   Diff.
% attached to the area                                              90.0               80.4          -9.6
% think willingness for community involvement has increased         47.7               31.7         -16.0
% think fish population increased                                   18.9               38.0          19.1
% more internet use improves quality of life                        43.4               62.1          18.7
% think more cell use improves quality of life                      26.9               44.1          17.2
% think environmental protection is too weak                        16.3               30.7          14.4
% gray wolves not threat to domestic animals                         2.5                9.9           7.4

Note: all differences shown are significant at the .05 level.
But the complete push-to-web group was quite similar to the complete push-to-mail treatment
[Figure: demographic comparison of the complete push-to-web and push-to-mail groups; one difference significant at p ≤ .05.]
Conclusions from 2007
Lewiston-Clarkston study
Web on its own brings in specific types of
respondents and leaves others out
Our best chance of reducing nonresponse
error from a web study is to include a mail
option.
Web alone is not desirable!
Web and mail used together bring in a wider
range of respondents that is comparable to
mail used alone
Immediate Implications
Getting 41% of households to respond over the
Internet to a mail request, and another 14% to a
paper follow-up was considered quite
successful!
We immediately started the process of seeing if
we could transition this accomplishment to state
and national data collection.
In the next segment I will present results from such tests.
Experiments 2 and 3: Moving
from regional to state-wide
data collection (WCS & WES)
Tests aimed at isolating factors that affected response
A similar model was used, i.e. personalize questionnaire
to the state with pictures
We also experimented with new implementation procedures that we thought might improve response and also help us understand reasons for nonresponse.
2008 Washington Community
Survey (WCS)
Example of the mail version:
Response rate trends for WCS were similar to 2007 LCS; we can “push” 2/3 of responses to the web, but obtained a lower response rate (46% vs. 57%)
[Figure: WCS response rates: $5 push-to-web w/card 46% (31.3 web + 15.0 mail), $5 push-to-mail 55% (51.9 + 3.1), $5 mail-only 57% (56.7).]
A $5 cash incentive with the WCS request was very effective for increasing response rates, especially for push-to-web groups
[Figure: mail response: $5 incentive 52.5% vs. without $5 39.2% (+13.3*); push-to-web response: $5 incentive 31.3% vs. without $5 13.4% (+17.9*). * p ≤ .05.]
Demographic results in the push-to-web
group were similar to 2007 LCS. The mail
follow-up brought in different kinds of
respondents than did the web.
[Figure: demographic comparison of web vs. mail follow-up respondents in the push-to-web group: education (HS or less), age (65+), number in household (2 or less), married, employed, income ($25K/year or less); asterisks mark differences significant at p ≤ .05.]
The combined push-to-web group was demographically similar to the mail-only group
[Figure: demographic comparison of push-to-web vs. mail-only groups on education (HS or less), age (65+), number in household (2 or less), married, employed, income ($25K/year or less).]
Complete push-to-web group was more representative than the web-only respondents (comparison to the U.S. Census Bureau’s American Community Survey)
[Figure: comparison of web-only respondents, the complete push-to-web group, and ACS benchmarks on education (HS or less), children in household, number in household (2 or less), married, employed, income ($25K/year or less).]
Experiment 3. 2009 Washington
Economic Survey (WES)
Continued building on prior studies (LCS &
WCS), and used methods we knew worked.
New question: Will sending the mail follow-up with
a second $5 incentive and in a Priority Mail (PM)
envelope increase response rates?
2009 WES: using the state map to connect visually with the sample. Example of the mail version:
Using Priority Mail + a second $5 incentive
increased response rates, particularly for
the mail-only group; the effect was due
entirely to incentive (not shown)
[Figure: WES response rates: Web+Mail PM+$5 52% (33.8 web + 18.2 mail), Web+Mail 48% (32.7 + 14.9), Mail-only PM+$5 68% (68.4), Mail-only 59% (58.8).]
What about item nonresponse?
If mail has higher item nonresponse rates, then perhaps that negates the gain from the additional responses.
Thus, it was important to evaluate this. (See Messer, Edwards and Dillman, Survey Practice, 2012.)
Web vs. mail item non-response in
the push-to-web groups for LCS,
WCS, and WES
Item nonresponse rates lower for web
[Figure: item nonresponse rates, web vs. mail, within the push-to-web groups: 2007 LCS 2.7 vs. 6.2; 2008 WCS 2.7 vs. 6.9; 2009 WES 6.1 vs. 11.6.]
BUT, push-to-web and mail-only groups have
similar overall item nonresponse rates; Mail
respondents in push-to-web groups were less
able to respond (older with less education).
[Figure: overall item nonresponse rates, push-to-web vs. mail-only: 2007 LCS 3.6 vs. 5.0; 2008 WCS 4.2 vs. 4.2; 2009 WES 8.0 vs. 8.1.]
Mail-only was less expensive per
respondent than push-to-web
Average WCS & WES costs/respondent
[Figure: average WCS & WES cost per respondent: push-to-web $39.05, mail-only $30.26.]
Why was web more
expensive?
Fewer respondents for allocating costs.
Web survey construction was not free. I
had to pay for that staff time and network
costs just as any outside survey sponsor
would have to pay.
The mailing costs were about the same,
leaving only data entry costs for mail as
the major cost difference.
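The arithmetic behind these per-respondent figures is simple cost allocation: fixed costs (questionnaire design, web programming) plus per-return costs (postage, data entry), divided by the number of completed responses. The sketch below illustrates only the mechanism; the dollar amounts and respondent counts are hypothetical assumptions, not the study's budget (only the $39.05 vs. $30.26 outcome above is from the study).

```python
def cost_per_respondent(fixed_costs: float, cost_per_return: float, n_respondents: int) -> float:
    """Total survey cost divided by the number of completed responses."""
    return (fixed_costs + cost_per_return * n_respondents) / n_respondents

# Hypothetical illustration (assumed figures): web carries extra fixed
# programming/hosting cost and yields fewer completes, while mail pays
# per-return data entry on every completed questionnaire.
web = cost_per_respondent(fixed_costs=12_000, cost_per_return=5.0, n_respondents=460)
mail = cost_per_respondent(fixed_costs=8_000, cost_per_return=8.0, n_respondents=570)
print(f"push-to-web ~${web:.2f}/respondent, mail-only ~${mail:.2f}/respondent")
```

With these assumed numbers the web mode still costs more per respondent despite lower per-return costs, which is the mechanism described above.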
Push-to-web was not faster after
the first week.
[Figure: 2009 WES cumulative response over time for six treatments: $5 Web+Mail; $5 Web+Mail PM; $5 Web+Mail PM+$5; $5 Mail-only; $5 Mail-only PM; $5 Mail-only PM+$5.]
Some limitations of the WCS & WES
Conducted for a statewide population in the same state as the sponsor
Washington also had higher-than-average Internet penetration and SES levels (vs. the U.S.)
Can we survey in other states with similar
results, and push even harder for web (withhold
mail through three mailings)?
Experiment 4: 2011 Three-State
Electricity Survey (TSES)
Continued building on prior studies (LCS, WCS, WES)
New questions:
Can push-to-web (web+mail) be used effectively in...
1) More distant states?
2) States with lower SES and Internet access?
Is 2web+mail (withholding mail until the 4th contact instead of the 3rd) more effective for pushing respondents to the web?
States in the 2011 Electricity Survey. Examples of the mail covers:
Push-to-web was less effective than in-state, especially in Alabama with lower SES & Internet access
Alabama: lower Internet penetration and SES, also distant
Pennsylvania: demographically similar but distant
Washington: control population
[Figure: TSES response rates: Alabama Web+Mail 31% (11.4 web + 19.6 mail), Mail-only 38%; Pennsylvania Web+Mail 34% (12.8 + 21.2), Mail-only 49%; Washington Web+Mail 48% (28.0 + 20.3), Mail-only 50%.]
Key observation:
We began to worry at this point about the effect of web requests from unknown sources.
The web is a scary place! People worry about
viruses. How do we make these contacts for
a web response legitimate and effective?
We need to focus on improving trust
A double push-to-web (hold mail to 4th contact) was more effective than web+mail in Pennsylvania, but not in Washington
Web+mail: 1) $5 web request, 2) reminder, 3) $2 mail follow-up, 4) reminder
2Web+mail: 1) $5 web request, 2) reminder, 3) $2 web request, 4) mail follow-up
[Figure: Pennsylvania: 2Web+Mail 44% (32.3 web + 11.9 mail) vs. Web+Mail 34% (12.6 + 21.5); Washington: 2Web+Mail 37% (19.4 + 17.6) vs. Web+Mail 48% (28.0 + 20.3).]
Conclusions from Tri-state study
Was there a backlash against web? No. A very small
number of respondents called to request a paper
questionnaire. The more common question was, “Why is
someone in Washington interested in our (distant)
state?”
2web+mail may be the best design for increasing web
response rates, particularly in more distant populations
In WA, the web+mail design performed even better than
in the 2008 & 2009 statewide studies (WCS, WES)
However, in PA and AL, only about 1/3 of web+mail respondents
chose web, and total web+mail response rates were significantly
lower than in WA
Experiment 5: 2012 Water Management
Survey in Washington and Nebraska
Continued building on prior studies (LCS, WCS,
WES, TSES)
We have now adopted the double push-to-web
(2web+mail) for all designs
New questions:
Is within-state university sponsorship more effective at obtaining responses than out-of-state university sponsorship?
Does mode matter when great distances exist between sponsor and sampled households?
How do residents respond to an out-of-state sponsor vs. a within-state sponsor?
[Figure: map highlighting WA and NE.]
Experiment 5. 2012 Water Management
Survey, Cross-State sponsorship
Examples of the mail covers:
Within-state sponsored surveys achieved higher response rates than out-of-state sponsored surveys in both states and across both modes
[Figure: Nebraska residents: Nebraska sponsor Mail-only 57%, 2Web+Mail 53% (38.2 web + 15.2 mail); Washington sponsor Mail-only 47%, 2Web+Mail 39% (23.5 + 15.2). Washington residents: Washington sponsor Mail-only 51%, 2Web+Mail 43% (32.0 + 10.6); Nebraska sponsor Mail-only 47%, 2Web+Mail 37% (25.7 + 11.2).]
When we combined data across states, we found the same trends; mail-only groups also obtained higher response rates than 2web+mail groups
[Figure: local (within-state) sponsor: Mail-only 54%, 2Web+Mail 48% (35.0 web + 13.2 mail); distant (out-of-state) sponsor: Mail-only 47%, 2Web+Mail 38% (24.6 + 13.1).]
In sum, within-state-sponsored surveys obtained higher response rates than out-of-state-sponsored surveys
We see similar patterns across Washington and
Nebraska and across the two modes.
Within-state-sponsored surveys (as compared with out-of-state-sponsored surveys) achieve about:
4-10% higher response rates for mail-only groups
6-15% higher response rates for web groups of the 2web+mail mode
The same response rates for mail groups of the 2web+mail mode
Tentative conclusion: it's important for recipients of survey requests to know and trust the survey sponsor, but we have “barely” begun to investigate this issue.
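These ranges can be recovered by differencing the within-state and out-of-state sponsorship figures reported for the Water Management Survey (a small Python sketch; the percentages, and their pairing to sponsors, are as read from the charts above):

```python
# (within-state sponsor, out-of-state sponsor) response-rate components (%)
# from the 2012 Water Management Survey, by residents' state.
mail_only = {"NE": (57, 47), "WA": (51, 47)}
web_component_2webmail = {"NE": (38.2, 23.5), "WA": (32.0, 25.7)}
mail_component_2webmail = {"NE": (15.2, 15.2), "WA": (10.6, 11.2)}

def sponsor_advantage(groups):
    """Within-state minus out-of-state sponsor, in percentage points."""
    return {state: round(within - out, 1) for state, (within, out) in groups.items()}

print(sponsor_advantage(mail_only))                # mail-only: the 4-10 point range
print(sponsor_advantage(web_component_2webmail))   # web portion of 2web+mail: the 6-15 point range
print(sponsor_advantage(mail_component_2webmail))  # mail portion of 2web+mail: roughly zero
```

The mail-only differences come out to 10 (NE) and 4 (WA) points, and the web-portion differences to about 15 and 6 points, matching the ranges stated above.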
Final summary: What did we learn
from the five studies?
1. Lewiston, ID-Clarkston, WA Regional Study 2007
2. Washington Community Survey 2008
3. Washington Economic Survey 2009
4. WA, PA, AL Tri-state Electricity Survey 2011
5. WA and NE Water Management Survey 2012
Response rates for push-to-web
versus mail-only (or mostly) designs,
2007-2011 studies
Mail-only vs. web+mail (withhold mail from the first two contacts):

Study                        Mail-only   Web+Mail (web + mail follow-up = total)
Lewiston-Clarkston              71%        41% + 14% = 55%
Washington Community            57%        31% + 15% = 46%
Washington Economic             68%        34% + 18% = 52%
Washington Electricity          50%        28% + 20% = 48%
Pennsylvania Electricity        46%        12% + 22% = 34%
Alabama Electricity             38%        11% + 20% = 31%
Response rates for more stringent
tests of 2web+mail vs. mail-only
designs, 2011-2012 studies
Mail-only vs. 2web+mail (withhold mail until the fourth contact):

Study                      Mail-only   2Web+Mail (web + mail follow-up = total)
Washington Electric           50%        19% + 18% = 37%
Pennsylvania Electric         46%        32% + 12% = 44%
Washington Water              51%        32% + 11% = 43%
Nebraska Water                50%        24% + 15% = 39%
Summary of findings (1)
1. Response rates averaged 53% (range 38% to 71%) across 10 postal-only treatments on various state populations (Washington to Alabama).
2. Response rates averaged 43% (range 31% to 55%) across 10 push-to-web treatment groups.
3. There are significant differences between web and mail respondents (education, age, income, marital status).
4. Demographically, the web+mail treatment respondents are similar to mail-only respondents.
5. A web+mail approach results in an average of about 62% of responses coming in over the web.
Summary of findings (2)
6. Offering a choice of modes in the first contact
(mail vs. web) lowers response rates.
7. Offering a choice of modes results in a much
greater proportion (80%) of responses coming in
by mail.
8. A $5 token cash incentive with an initial web
request (paper alternative withheld) dramatically
improves web and total (31% vs. 13%) response
rates.
9. A second cash incentive in the 3rd or 4th contact
also improves response rates by 5-10 percentage
points.
Why not simply give respondents a choice of responding by paper vs. web?
Many surveyors think offering “choice” will improve response.
Choice produces two effects:
A lower response rate.
Most people respond by mail
However, we may be able to change that by adding email augmentation—a quick email that follows postal mailing containing a token incentive (reward) and provides an electronic link to make it easier (less costly) for people to respond. (See: Millar, Morgan and Don A. Dillman. 2011. Improving response to Web and Mixed-Mode Surveys. Public Opinion Quarterly 75 (2): 249-269)
For some survey situations (but usually not households) we can get both postal and email addresses.
Don A. Dillman at Statistics Korea September
2014
80
Results from a student survey in which only mail contact was used to request responses vs. when email augmentation was used to request responses only over the web (Millar and Dillman, 2011)
[Bar chart of response rates (%), p = .001: Web + Email Aug. 59.7; Mail 51.3; Choice 47.7; Web 42.3]
We applied these results to a critical survey conducted under time constraints
Needed to survey 600 post-prelim graduate students with a 12-page questionnaire in less than a month and produce a report within six weeks.
Day 1: Postal request for web response with $2
Day 4: Email augmentation with electronic link
Day 10: Second email
Day 18: Postal request with paper questionnaire
Day 22: Final email augmentation
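For surveyors automating a similar design, the five-contact schedule above can be expressed as a simple data structure that a fielding system iterates over. This is an illustrative sketch only, not part of the original study; the `Contact` class and `contact_dates` helper are hypothetical names chosen for this example.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass(frozen=True)
class Contact:
    day: int   # days after fielding begins (Day 1 = launch)
    mode: str  # "postal" or "email"
    note: str  # what the contact contains

# The five-contact schedule described above.
SCHEDULE = [
    Contact(1,  "postal", "web request with $2 incentive"),
    Contact(4,  "email",  "augmentation with electronic link"),
    Contact(10, "email",  "second email reminder"),
    Contact(18, "postal", "paper questionnaire"),
    Contact(22, "email",  "final email augmentation"),
]

def contact_dates(start: date) -> list[tuple[date, str]]:
    """Map each scheduled contact onto a calendar date, given the launch date."""
    return [(start + timedelta(days=c.day - 1), c.mode) for c in SCHEDULE]
```

Encoding the schedule as data rather than ad-hoc mailings makes it easy to audit that every sample member received each contact on the intended day.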
The paper questionnaire was mailed on April 14th
77% response rate from web + mail; email augmentation produced a 31 percentage point increase in 10 hours!
Elaboration
Final response rate was 77%.
The response rate increased an additional 12 percentage points after the postal questionnaire was sent; half of those respondents answered by paper and half by web.
The paper questionnaire went to 200 individuals; 32% of them responded.
The mixed-mode approach with email augmentation was quite effective.
How did this happen?
Postal mail used a token cash incentive to provide a reward and draw attention to the attached communication.
Email augmentation of that contact provided an electronic link, decreasing the costs of responding.
Multiple communications by multiple modes, plus the paper copy, improved trust that responding to the survey was important.
Another way of looking at what we did: we systematically applied social exchange.
Day 1: Postal request for web response with $2 (reward with request; trust encouraged by sponsorship)
Day 4: Email augmentation with electronic link (decreases inconvenience, i.e., cost)
Day 10: Second email (survey is important; a social reward)
Day 18: Postal request with paper questionnaire (reduces cost for some respondents; also conveys the message that the survey is important)
Day 22: Final email (survey is important; trust encouraged by repeated contacts)
The strength of social exchange for influencing behavior depends upon:
Understanding the relative strength of means for affecting costs, rewards, and trust
Achieving additive effects from multiple actions
Enhancing those additive effects through:
o multiple contacts
o multiple modes of communication
o changing communication across contacts to reach different audiences
o offering multiple modes of responding
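The additive-effects idea can be illustrated with a toy response-propensity model in which each design feature contributes on a log-odds scale. This is purely an illustration of the reasoning, with invented coefficients; it is not a model estimated from the studies above, and the feature names are hypothetical.

```python
import math

# Hypothetical additive effects (log-odds scale) of design features on an
# individual's propensity to respond. Values are invented for illustration
# only; they are NOT estimates from the experiments described above.
EFFECTS = {
    "baseline": -1.5,           # no special design features
    "cash_incentive": 0.9,      # reward
    "email_augmentation": 0.5,  # lowers the cost of responding
    "paper_follow_up": 0.4,     # reaches non-internet users
    "repeated_contacts": 0.3,   # builds trust and conveys importance
}

def response_propensity(features: list[str]) -> float:
    """Additive model: feature effects sum on the logit scale, then a
    logistic transform maps the total to a probability of responding."""
    logit = EFFECTS["baseline"] + sum(EFFECTS[f] for f in features)
    return 1 / (1 + math.exp(-logit))
```

Under such a model, each added feature raises the predicted propensity, which is the sense in which multiple contacts, modes, and incentives are "additive."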
Towards the Future
Web+mail data collection models using mail contact only have significant potential.
When we can use email contact (only available for certain populations), we do even better.
In the future I am hopeful we can go all-electronic, but we are not there yet.
Research needs to continue in response to ongoing changes in who uses the internet and how they interact with it.
Much remains to be learned.
Thank you!
Don A. Dillman, Washington State Univ. Social and Economic Sciences Research Center and Department of
Sociology, Pullman, WA 99164-4014
Contact: [email protected]
http://www.sesrc.wsu.edu/dillman/
Selected references
1. Smyth, J.D., Dillman, D.A., Christian, L.M., & O'Neill, A. (2010). "Using the Internet to survey small towns and communities: Limitations and possibilities in the early 21st century." American Behavioral Scientist 53: 1423-1448.
2. Dillman, D.A., Smyth, J.D., & Christian, L.M. (2014). Internet, Phone, Mail and Mixed-Mode Surveys: The Tailored Design Method, 4th edition. John Wiley & Sons.
3. Messer, B.L., & Dillman, D.A. (2011). "Surveying the General Public Over the Internet Using Address-Based Sampling and Mail Contact Procedures." Public Opinion Quarterly 75(3): 429-457.
4. Messer, B.L., Edwards, M.L., & Dillman, D.A. (2012). "Determinants of Web and Mail Item Nonresponse in Address-Based Samples of the General Public." Survey Practice, April. http://www.surveypractice.org
Selected references, page 2
5. Millar, M., & Dillman, D.A. (2011). "Improving Response to Web and Mixed-Mode Surveys." Public Opinion Quarterly 75(2): 249-269.
6. Messer, B.L. (2012). "Pushing households to the web: Results from Web+mail experiments using address-based samples of the general public and mail contact procedures." Ph.D. dissertation, Washington State University, Pullman.
7. Edwards, M.L., Dillman, D.A., & Smyth, J.D. (forthcoming). "An Experimental Test of the Effects of Survey Sponsorship on Internet and Mail Survey Response." Public Opinion Quarterly.
For Additional Information
For additional information on these studies contact Don Dillman at: [email protected]
Web page information is at:
http://www.sesrc.wsu.edu/dillman/
Postal address: Don A. Dillman, Ph.D.
133 Wilson Hall
Washington State University
Pullman, WA 99163-4014
United States of America