
Evaluating a Web-based Establishment Survey of U.S. Academic Institutions Using a Web-based Response Behavior Survey

Emilda B. Rivers, National Science Foundation

Scott D. Crawford, Survey Sciences Group, LLC


Background

Graduate Students and Postdoctorates in Science and Engineering (GSS)

• Four sponsors

• Conducted annually since 1972

• U.S. academic institutions

• Web data collection introduced in 1998


Background

Graduate Students and Postdoctorates in Science and Engineering (GSS)

National estimates
• For fall each year
• Graduate enrollment
• Postdoctoral (postdoc) appointments
• In science, engineering, and health-related disciplines


Background

Graduate Students and Postdoctorates in Science and Engineering (GSS)

Enrollment data by
• Discipline (field/area of study)
• Geographic location
• Demographics (citizenship, sex, race/ethnicity)
• Highest degree granted
• Sources and mechanisms of support


Background

Graduate Students and Postdoctorates in Science and Engineering (GSS)

Data collection
• Starts in October/November each year
• “Deadline” is set for January 31
• Activities wrap up in July/August


Background

Graduate Students and Postdoctorates in Science and Engineering (GSS)

• Each institution (or similar) in the GSS is assigned a contact person (coordinator)

• Coordinator receives…
  • Introductory email
  • Mailed packet with paper surveys and Web instructions
  • Individualized contacts over approx. 10 months


Background

Graduate Students and Postdoctorates in Science and Engineering (GSS)

Coordinators are asked to…

• Confirm contact information

• Identify new or defunct departments/programs/centers with graduate students or postdocs

• Provide enrollment data for eligible departments/programs/centers


Background

Graduate Students and Postdoctorates in Science and Engineering (GSS)

High response rates (98%+) are typical
• But most responses arrive after the January deadline
• Responses are often incomplete or require imputation


Background

Feedback

Complex survey

• Multiple respondents
• Multiple record-keeping sources


Current SRS (Science Resources Statistics) Need

Close a major gap in knowledge

• Learn more about the individuals assigned the task of providing postdoc data in the GSS

• Learn more about the process they use to provide postdoc data in the GSS

What is the impact of response behavior on GSS postdoc data quality?


The Pilot Response Behavior Survey (RBS)

A survey focused on…

• Respondent characteristics
• Organizational (institutional or departmental) characteristics
• Postdoc definitions
• Response behavior
• Respondent perception of quality


The Pilot Response Behavior Survey (RBS)

Generally, the RBS focuses on understanding as much as possible about the response process for postdoc data in a setting where the unit of interest is an establishment.


Pre-Test RBS

Conducted Fall 2005 between 9/21/05 and 10/06/05 (15 days)

• Letter / Email invitation (experimental)
• Email / Letter reminder (experimental)
• Email reminder
• Letter reminder
• Email reminder


Pre-Test RBS

• 432 individuals invited to the pre-test

288 responded (66.67% AAPOR RR2)

258 completed the survey (89.58% completion rate); 14 of the partials reported they were not the appropriate respondent (rate arithmetic sketched below)

• Mean length: 23.71 minutes overall
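For readers who want to verify the reported rates, here is a minimal Python sketch of the arithmetic, assuming (consistently with the figures above) that the RR2 value is simply respondents over invitees and the completion rate is completes over respondents.

    # Minimal sketch of the rate arithmetic reported above.
    # Assumes RR2 = respondents / invitees and
    # completion rate = completes / respondents; counts are from the slide.
    invited = 432      # invited to the pre-test
    responded = 288    # completes + partials
    completed = 258    # completed the survey

    response_rate = responded / invited        # AAPOR RR2 as reported
    completion_rate = completed / responded    # completion rate as reported

    print(f"Response rate:   {response_rate:.2%}")    # 66.67%
    print(f"Completion rate: {completion_rate:.2%}")  # 89.58%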


Pre-Test RBS Lessons Learned

Rapid follow-up data collection procedure can work

Pilot changes:
• Longer data collection period
• Telephone contact


Pre-Test RBS Lessons Learned

• The literature supporting mail contact before email contact may not hold for this population

• The experimental design showed no difference in response rate (RR) or completion rate (CR) by mode of first contact (see the test sketch below)

Pilot changes:
• Keep mail first, due to the benefit of more control over the timing of follow-up contacts
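As a sketch of how a “no difference” finding like this can be checked, here is a simple two-proportion z-test comparing response rates by mode of first contact. The per-arm counts below are hypothetical placeholders (the slides do not report the breakdown by experimental arm), not the study’s actual figures.

    import math

    def two_proportion_ztest(success_a, n_a, success_b, n_b):
        """Two-sided z-test for a difference between two response proportions."""
        p_a, p_b = success_a / n_a, success_b / n_b
        p_pool = (success_a + success_b) / (n_a + n_b)            # pooled proportion
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # normal tail
        return z, p_value

    # Hypothetical per-arm counts (NOT reported in the slides): 216 invitees per arm.
    z, p = two_proportion_ztest(success_a=146, n_a=216,   # letter-first arm
                                success_b=142, n_b=216)   # email-first arm
    print(f"z = {z:.2f}, p = {p:.3f}")  # a large p-value is consistent with
                                        # "no difference" by mode of first contact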


Pre-Test RBS

Interesting Findings

While this was a pre-test, several interesting findings emerged that will inform the development of the pilot survey


Respondent Characteristics

Respondent education was low, especially for those who did not have “primary” responsibility for collecting GSS data

Percent of respondents by highest education level:

Education level   Primary Contact   Not Primary Contact
Less than BA      15.4              37.1
BA/BS             25.1              22.9
MA/MS             28.5              17.1
Prof Degree        4.1               3.0
Doctoral          26.9              20.0


Respondent Characteristics

Most respondents did not have sufficient knowledge of the institution’s computer systems to find the postdoc data

Knows the computer system (%), by data item:

Grad Counts          83.1
Grad Demos           81.9
Grad Enroll Status   80.2
Grad Financial       76.0
Postdoc Counts       51.4
PD Citizen           50.0
Non Fac Res Staff    47.1


Response Behavior

Primary contact or not…

• Respondents were clearly able to respond on their own to some but not all of the data requested

• A large percentage of “primary” respondents were unable to provide all of the postdoc-related data

Percent, by data item and contact type:

Data item            Primary Contact   Not Primary Contact
Grad Counts          77.6              87.9
Grad Demos           77.3              81.7
Grad Enroll Status   76.2              87.1
Grad Financial       57.8              69.0
Postdoc Counts       46.2              57.2
PD Citizen           44.3              54.8
Non Fac Res Staff    39.0              51.4


Response Behavior

Although it took 10 months for a near-complete response, a large number of respondents reported that the current data collection schedule is a good one.

Reported the current schedule is a good time (%):
Grad Counts      94.1
Postdoc Counts   84.0


Response Behavior

Oddly, when those who thought October – January was not a good time were asked when a good time would be… they overwhelmingly chose October – January as “better months”

Better month (% selecting each):
Jul-04       2.4
Aug-04       1.4
Sep-04       2.2
Oct-04      33.6
Nov-04      39.3
Dec-04      35.7
Jan-05      36.9
Feb-05      19.4
Mar-05       8.7
Apr-05       6.6
May-05       3.7
Jun-05       4.5
Jul-05       3.3
Aug-05       2.7
Sep-05       3.1
Don’t Know  28.6


Respondent Behavior

Approx. 1/3 of institutions rely on the paper survey as a worksheet for completing the Web survey later


Postdoc Data

• Large proportion of respondents report no standard way that postdoc data are organized

• Further, of those who DO have a standard postdoc definition at their institution, approx. 1/3 report that the standard differs within the institution

This raises the question… how does the respondent provide postdoc data when there is so much variation?


Benefits of Web Survey for the RBS

• Mode already familiar from the GSS

• Rapid data collection

• Reduction of respondent burden through…
  • Flexibility of when they complete the survey
  • Ability to customize the questionnaire to the respondent


RBS Benefits

The Pre-Test RBS proved to be valuable for…

• Providing baseline data about respondents in establishment surveys

• Identifying and quantifying areas in which survey data collection may be changed to improve the quality of data being collected

• Identifying areas where more in-depth study will be required to fully understand the impact of the survey process on postdoc data quality


RBS Limitations

• The current design focuses on individual respondents, while it is likely that multiple respondents are truly involved at the reporting-unit level

• Relies on self-reported assessments of quality

• A same-mode design may allow mode-induced errors to go undetected


Next Steps

• Conduct Pilot RBS

• Analyze results

• More in-depth study


Contact the authors

Scott D. Crawford
Survey Sciences Group, LLC
scott@surveysciences.com
734-231-4600 x100

Emilda B. Rivers
National Science Foundation
erivers@nsf.gov
703-292-7773