Mobile Web Surveys: A Total Survey Error Perspective
While mobile phones have challenged telephone survey researchers for some time, the Internet and Web capabilities of these devices have begun to receive attention only in the last few years. Internet-enabled smartphones can affect survey data collection in a number of ways, and the implications for various sources of error are only now being fully explored. There are three broad approaches to the opportunities and challenges posed by the mobile Web.
In daily activity, data flows constantly through cameras, the Internet, satellites, radio frequencies, sensors, household appliances, cars, smartphones, tablets and the like. Among the tools currently in use, mobile devices, especially mobile phones, smartphones and tablets, are the most widespread, and their use has become prevalent in everyday life in both developed and developing countries. Shopping, reading newspapers, participating in forums, designing and completing surveys, communicating with friends and making new ones, filing tax returns and getting involved in politics are all examples of how ingrained mobile technology is in the modern lifestyle.
Mobile devices support a wide range of heterogeneous activities and, as a result, offer great potential in terms of the different types of data that can be collected. This book explores the use of mobile devices to collect, analyse and apply research data in various research contexts, aiming to provide detailed and up-to-date knowledge of a comparatively new field of study. It considers several aspects: the main methodological possibilities and issues; comparison and integration with more traditional survey modes and ways of participating in research; the quality of collected data; use in commercial market research; the representativeness of studies based only on the mobile population; the current spread of mobile devices in several countries; and so on. The book thus presents research findings from a wide range of countries and contexts.
This book was developed in the framework of WebDataNet’s Task Force 19. WebDataNet was created in 2009 by a group of researchers focused on the discussion of data collection methods. Supported by the European Union programme for the Coordination of Science and Technology, WebDataNet has become a unique, multidisciplinary network that brings together leading web-based data collection experts from many institutions, disciplines and backgrounds in more than 35 countries.
Previous studies have not found effective ways of encouraging participants to use smartphones to complete web surveys. We hypothesize that conditional differential incentives (the amount depending on the device the respondent uses to complete the web survey) can increase overall participation rates and the proportion of respondents who use a particular device in web surveys. We conducted an experiment using a volunteer online access panel in Russia, with 5,474 invitations sent to regular mobile Internet users. We varied the invitation mode (SMS vs. e-mail) and the encouragement to use a particular device for completing the survey – mobile phone or personal computer (PC). SMS increased the proportion of mobile web respondents, while e-mail increased the proportion of PC web respondents. As expected, differential incentives increased overall participation rates by 8–10 percentage points when higher incentives were offered for completing the survey on a mobile phone. Contrary to expectations, offering higher incentives to PC web respondents did not produce higher participation rates than the control condition. Both encouraging the use of a mobile phone and offering higher incentives were effective at increasing the proportion of respondents using mobile devices. In terms of both participation rates and the proportion of respondents using mobile devices, offering incentives 50% higher was as efficient as offering incentives 100% higher for mobile web respondents. Offering higher incentives to mobile web respondents also affected sample composition: participation rates were significantly higher among females and among respondents with higher education.
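A difference in participation rates like the 8–10 percentage points reported above is typically assessed with a two-proportion z-test. The sketch below is purely illustrative: the counts are hypothetical and do not come from the study; only the method is shown.

```python
from math import sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test for a difference in participation rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return p_a - p_b, z

# Hypothetical counts: 360 completes out of 1,000 invitations in the
# higher-mobile-incentive condition vs. 270 out of 1,000 in the control
# condition -- a 9-point difference, in the range the abstract reports.
diff, z = two_proportion_z(360, 1000, 270, 1000)
print(round(diff, 2), round(z, 2))  # → 0.09 4.33
```

A |z| above 1.96 corresponds to significance at the conventional 5% level, so a difference of this size with samples of this size would be statistically significant.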
There is some evidence that a scrolling design may reduce breakoffs in mobile web surveys compared to a paging design, but there is little empirical evidence to guide the choice of the optimal number of items per page. We investigate the effect of the number of items presented on a page on data quality in two types of questionnaires: with and without user-controlled skips. Three versions of a 30-item instrument were compared, with 5, 15 or all 30 questions presented on a page, in two different surveys, one with skips and one without. We found that displaying all 30 items on a page reduced the breakoff rate by almost a third compared to presenting 5 items per page in the questionnaire without skips; however, the difference was not statistically significant. In both surveys, with and without skips, completion times were significantly lower in the 30-items-per-page condition, but item nonresponse rates were also higher. We conclude with practical recommendations to guide design choices for mobile web survey questionnaires.
In this paper, we conduct a meta-analysis of breakoff rates in mobile web surveys. We test whether optimization of web surveys for mobile devices, invitation mode (SMS vs. e-mail), survey length, stating the expected duration in the invitation, survey design (scrolling vs. paging), prerecruitment, the number of reminders, design complexity (grids, drop-down questions, sliders, images, a progress indicator), incentives, the opportunity to skip survey questions, and the opportunity to select the preferred mode (PC or mobile web) have an effect on breakoffs. The meta-analysis is based on 14 studies (39 independent samples) conducted using online panels, both probability-based and non-probability-based. We found that mobile-optimized surveys, e-mail invitations, shorter surveys, prerecruitment, more reminders, a less complex design, and the opportunity to choose the preferred survey mode decrease breakoff rates in mobile web surveys. No effect was found for a scrolling design, incentives, stating the expected duration in the invitation, or the opportunity to skip survey questions.
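Pooling a rate across independent samples, as a meta-analysis of breakoff rates requires, is commonly done by inverse-variance weighting: each sample's rate is weighted by the reciprocal of its sampling variance. The sketch below shows only the fixed-effect case, with hypothetical breakoff counts rather than the 39 samples analysed in the paper.

```python
from math import sqrt

def pooled_rate(samples):
    """Inverse-variance (fixed-effect) pooled rate.

    samples: list of (breakoffs, starts) tuples, one per independent sample.
    Returns the pooled rate and its standard error.
    """
    weights, estimates = [], []
    for k, n in samples:
        p = k / n
        var = p * (1 - p) / n          # binomial variance of the sample rate
        weights.append(1 / var)        # precision = inverse variance
        estimates.append(p)
    w_sum = sum(weights)
    pooled = sum(w * p for w, p in zip(weights, estimates)) / w_sum
    se = sqrt(1 / w_sum)               # SE of the weighted mean
    return pooled, se

# Hypothetical breakoff counts from three samples
samples = [(120, 800), (90, 500), (200, 1600)]
p, se = pooled_rate(samples)
print(round(p, 2))  # → 0.14
```

A random-effects model would additionally estimate between-sample heterogeneity and inflate the standard error accordingly; the fixed-effect version above is the simpler starting point.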
A large body of findings in survey research suggests that responses to sensitive questions are situational and can vary with context. The methodological literature demonstrates that social desirability biases are less prevalent in self-administered surveys, particularly in Web surveys, where there is no interviewer and less risk of presenting oneself in an unfavorable light. Since the number of mobile Web users is growing, we focused our study on the effects of the device used (PC or cell phone) in Web surveys on respondents’ willingness to report sensitive information. To reduce selection bias, we carried out a two-wave cross-over experiment using a volunteer online access panel in Russia. Participants were asked to complete the questionnaire in both survey modes: PC Web and mobile Web. We hypothesized that features of mobile Web usage may affect response accuracy and lead to more socially desirable responses than in the PC Web survey mode. We found significant differences in the reporting of alcohol consumption by mode, consistent with our hypothesis, but other sensitive questions did not show similar effects. We also found that the presence of familiar bystanders had an impact on responses, while the presence of strangers had no significant effect in either survey mode. Contrary to expectations, we did not find evidence of a positive impact of completing the questionnaire at home, or of trust in data confidentiality, on the level of reporting. These results can help survey practitioners design Web surveys and improve the quality of data collected on different devices.