
How Face-to-Face Interviews and Cognitive Skill Affect Item Non-Response: A Randomized Experiment Assigning Mode of Interview

Published online by Cambridge University Press: 13 June 2016

Abstract

Technology and the decreased cost of survey research have made it possible for researchers to collect data using new and varied modes of interview. These data are often analyzed as if they were generated by similar processes, but modes of interview may produce differences in response simply because of the presence or absence of an interviewer. In this paper, we explore the differences in item non-response that result from different modes of interview and find that mode makes a difference. The data come from an experiment in which we randomly assigned an adult population to an in-person or self-completed survey after subjects agreed to participate in a short poll. For nearly every topic and question format, we find less item non-response in the self-completed mode. Furthermore, we find that the difference in non-response across modes is exacerbated for respondents with low levels of cognitive ability. Moving from high to low levels of cognitive ability, an otherwise average respondent can be up to six times more likely to say “don’t know” in a face-to-face interview than in a self-completed survey, depending on the type of question.
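To make the design concrete, the sketch below simulates the kind of analysis the abstract describes: random assignment of respondents to a face-to-face or self-completed mode, a binary “don’t know” outcome, and a logit model with a mode × cognitive-skill interaction. This is a hypothetical illustration under assumed effect sizes, not the authors’ code or estimates; all variable names and coefficients are invented for demonstration.

```python
# Minimal sketch (assumptions, not the paper's data or estimates) of a
# mode-of-interview experiment: randomized mode assignment, a simulated
# "don't know" outcome, and a logit with a mode x cognitive-skill interaction.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

face_to_face = rng.integers(0, 2, size=n)   # randomized mode assignment (1 = in person)
cog_skill = rng.normal(0, 1, size=n)        # standardized cognitive-skill score

# Simulated data-generating process: non-response is more likely face to face,
# and the mode gap widens as cognitive skill falls (negative interaction).
logit = (-2.0 + 0.8 * face_to_face - 0.5 * cog_skill
         - 0.6 * face_to_face * cog_skill)
dont_know = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([face_to_face, cog_skill,
                                     face_to_face * cog_skill]))
model = sm.Logit(dont_know, X).fit(disp=False)

# Predicted "don't know" probability for an otherwise-average respondent
# at low (-2 SD) versus high (+2 SD) skill, in each mode.
for skill in (-2.0, 2.0):
    for mode in (0, 1):
        p = model.predict([[1.0, mode, skill, mode * skill]])[0]
        label = "face-to-face" if mode else "self-completed"
        print(f"skill={skill:+.0f} SD, {label}: P(don't know) = {p:.3f}")
```

With the assumed coefficients, the predicted probabilities reproduce the qualitative pattern in the abstract: the two modes yield similar non-response rates for high-skill respondents, while low-skill respondents are several times more likely to say “don’t know” face to face than in the self-completed mode.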

Type
Original Articles
Copyright
© The European Political Science Association 2016 


Footnotes

*

Andrew Gooch, Postdoctoral Fellow, Institution for Social and Policy Studies and the Center for the Study of American Politics, 77 Prospect Street, New Haven, CT 06511 (andrew.gooch@yale.edu). Lynn Vavreck, Professor of Political Science and Communication Studies, University of California, Los Angeles, 4289 Bunche Hall, Los Angeles, CA 90095 (lvavreck@ucla.edu). This research was supported by a grant from the National Science Foundation (SES-1023940). The authors thank Brian Law for managing the project at the MGM Grand and Felipe Nunes, Sylvia Friedel, Gilda Rodriguez, Adria Tinnin, and Chris Tausanovitch for their participation in Las Vegas. Doug Rivers and Jeff Lewis provided programming support; John Aldrich, Larry Bartels, Alan Gerber, Gary Jacobson, Simon Jackman, Vince Hutchings, Gary Segura, John Zaller, and Brian Humes helped with the design of the experiment. Finally, the authors are grateful to Mike Thies, who provided valuable feedback on drafts of the paper. To view supplementary material for this article, please visit http://dx.doi.org/10.1017/psrm.2016.20

Supplementary material

Gooch and Vavreck supplementary material 1 (File, 148 KB)
Gooch and Vavreck supplementary material 2 (PDF, 32.7 KB)
Gooch and Vavreck supplementary material 3 (PDF, 180.9 KB)
Gooch and Vavreck supplementary material 4 (File, 87.1 KB)