Designing Effective Web Surveys
In: Public Opinion Quarterly, Vol. 73, No. 3, pp. 592-595
ISSN: 0033-362X
In: Public Opinion Quarterly, Vol. 68, No. 1, pp. 57-80
ISSN: 0033-362X; 1537-5331
In: Social Science Computer Review, Vol. 27, No. 2, pp. 196-212
ISSN: 1552-8286
Web surveys offer new opportunities for achieving high-quality responses to open-ended questions because the interactive nature of the web allows questions to be tailored to individual respondents. This article explores how respondents' level of interest in the topic of a question influences whether they provide a response and the quality of their answers. In addition, we examine whether an interactive follow-up probe, asked after people submit their initial response to the open-ended question, can improve the quality of responses. We find that respondents' interest in the question topic significantly affects their responses, and that interactive probing can improve response quality for some respondents, particularly those very interested in the question topic. Nonresponse remains a significant problem for open-ended questions: we found high item nonresponse rates for the initial question and even higher nonresponse to the probe, especially among those less interested in the topic. Consequently, interactive probing should be reserved for a few key open-ended questions within a survey where high-quality responses are essential, and it may be most effective for respondents who are already motivated to provide a response.
In: Sociological Methods & Research, Vol. 37, No. 3, pp. 393-425
ISSN: 1552-8294
This paper explores how the visual design of scalar questions influences responses in web surveys. We present the results of five experiments embedded in two web surveys of university students. We find that consistently presenting the positive end of the scale first does not impact responses but increases response times. Displaying the categories in multiple columns influences how respondents process the scale and increases response times. Separating the midpoint, "don't know" option, or endpoints spatially does not impact responses when the visual and conceptual midpoints align. Removing the graphical layout of the scale influences responses when lower numbers indicate more positive categories and increases response times. Finally, response times are longer for polar-point scales with numeric labels, but there are no differences in responses. Overall, our results suggest that the visual design of response scales impacts measurement, but that some manipulations produce larger and more significant differences than others.
In: Public Opinion Quarterly, Vol. 72, No. 1, pp. 103-113
ISSN: 0033-362X; 1537-5331
In: Public Opinion Quarterly, Vol. 71, No. 1, pp. 113-125
ISSN: 0033-362X; 1537-5331
In: Public Opinion Quarterly, Vol. 73, No. 2, pp. 325-337
ISSN: 1537-5331
In: Journal of Survey Statistics and Methodology, Vol. 12, No. 3, pp. 674-696
ISSN: 2325-0992
Using multiple modes of contact has been found to increase survey participation over a single contact mode. Text messaging has emerged as a new mode for contacting survey participants in mixed-mode survey designs, especially for surveys that include web and/or phone data collection. However, it is unclear how best to combine text messages with mailings and other outreach contacts to improve response rates and data quality. To explore the effectiveness of using text messaging as a contact mode, we conducted a full factorial experiment that varied the sequencing of text messages with mailing contacts (early versus late reminder) and the time text messages were sent (morning versus afternoon). The experiment was implemented in a follow-up wave of a mixed-mode, nationally representative longitudinal survey with two sample groups (Cooperative versus Other Respondents). For Cooperative Respondents, text reminders seemed to be effective at increasing completion rates, with the early text reminder being somewhat more effective than the late text reminder, at least early in the field period. For Other Respondents, text invitations were effective at improving the completion rate, but the effects diminished quickly once the invitation letter was sent. Additionally, the early text reminder appeared to be more effective than the late text reminder at increasing completion rates for Other Respondents. The sequencing of text messages did not affect data quality across sample groups or substantially impact nonresponse. The time of day the text messages were sent did not affect any of the outcome measures examined.
In: American Behavioral Scientist, Vol. 53, No. 9, pp. 1423-1448
ISSN: 1552-3381
Researchers who are interested in small towns and rural communities in the United States often find that they need to conduct their own sample surveys because many large national surveys, such as the American Community Survey, do not collect enough representative responses to make precise estimates. In collecting their own survey data, researchers face a number of challenges, such as sampling and coverage limitations. This article summarizes those challenges and tests mail and Internet methodologies for collecting data in small towns and rural communities using the U.S. Postal Service's Delivery Sequence File as a sample frame. Findings indicate that the Delivery Sequence File can be used to sample households in rural locations by sending them invitations via postal mail to respond to either paper-and-pencil or Internet surveys. Although the mail methodology is quite successful, the results for the Internet suggest that Web surveys alone exclude potentially important segments of the population of small towns and rural communities. However, Web surveys supplemented with postal questionnaires produce results quite similar to those of mail-only surveys, representing a possible cost savings for researchers who have access to Web survey capabilities.