
Bystanders and response bias in face-to-face surveys in Africa

By Zack Zimbalist

Public opinion surveys are crucial sources for understanding the public’s perceptions, values, and attitudes across the world. By conducting such surveys repeatedly with random samples, social scientists can track how responses change over time. This allows researchers to capture the dynamics of social perceptions on a range of topics, from the economy and health to education, crime, corruption, democracy, and government performance.

Ideally, respondents feel secure enough to disclose accurate information (avoiding reporting bias and item non-response) in the context of a face-to-face interview. Yet survey research in political science seldom accounts for peer effects caused by bystanders. Existing research focuses primarily on the effects of parents and spouses on self-reports of illicit activities or marriage-related issues. Moreover, these studies have mainly been carried out in industrialized countries; the few conducted in developing countries are confined to a similarly narrow set of survey questions and a small sample of countries.

This is thus the first study to investigate bystander effects across a large sample of developing countries and a broad set of questions on social, political, and economic outcomes. Studying the presence of bystanders is important because third parties are often present in population surveys, especially in developing-country contexts where extended family members and communities live in close proximity. In the Afrobarometer survey, for example, a bystander is present at 34% of interviews: 16% of respondents in the total sample are accompanied by non-familial bystanders, 6% by their spouses, and 12% by their children.

Using survey data from over 45,000 households across 34 African countries (collected by the Afrobarometer), my new article, “Bystanders and response bias in face-to-face surveys in Africa,” finds that bystanders, especially non-familial ones, substantially affect responses to an array of questions, some sensitive and some not. The paper also demonstrates that these biased responses run counter to the biases induced by fear (linked to the perception of a state interviewer) and are most likely explained by social desirability concerns when one is among peers (a few people or a small crowd). The biases are far rarer for interviews conducted in the presence of only a spouse or children.

Let me provide a few examples from the article. First, in the presence of non-familial bystanders, respondents understate the extent (or supply) of democracy and their satisfaction with democracy, and they report higher levels of fear of political violence. These results run counter to the overstatement of the supply of, and satisfaction with, democracy that occurs when respondents perceive the interviewer to be from the government. I argue that those overstatements reflect a fear of criticizing the government’s performance on this dimension. By contrast, the opposite effect in the presence of non-familial bystanders is most likely driven by social desirability concerns about what one’s neighbors believe to be the appropriate answer.

Second, respondents observed by their peers express more disapproval of the performance of their MPs, local government, and mayor. Here again, it seems likely that expressing disapproval of politicians and government is, on average, the socially desirable response. This contrasts with the systematically higher approval reported when respondents perceive a state interviewer (a fear-induced bias).

Third, in line with the social desirability of reporting disapproval of elected officials, respondents observed by non-familial bystanders report higher levels of corruption in the office of the presidency, among MPs, and among government officials. Again, this runs counter to the fear-induced tendency to underreport corruption to perceived state interviewers.

In addition to misreporting, both kin and non-kin bystanders are strongly associated with higher rates of item nonresponse. The levels of nonresponse, and the gaps between bystander and non-bystander interviews, were largest for arguably sensitive questions where a “don’t know” answer could serve as both a satisficing and a socially desirable response.

This article’s results suggest the need for additional steps to measure and mitigate bystander presence. To measure bystander bias in contexts outside Africa, other surveys, such as the other regional barometers and Pew polls, would do well to include a question on the presence of bystanders. Mitigating bystander-induced biases is a thornier challenge that requires further experimentation across contexts. One alternative is self-administration in high-literacy contexts, which eliminates the biases caused by bystanders overhearing answers; some research has shown that respondents are more willing to answer sensitive questions when those questions are self-administered (see Krumpal, 2013, for a review). In addition, indirect modes of administration such as endorsement experiments, list experiments (also known as the item count or unmatched count technique) (Glynn, 2013), or randomized response techniques (RRT) (Coutts & Jann, 2011; Rosenfeld, Imai, & Shapiro, 2015) could be tested. Despite their limitations, indirect methods may improve data collection on sensitive questions, and they are easier to implement than self-administration in low-literacy contexts. Further research would help clarify whether, and to what extent, these methods obtain more reliable estimates across different contexts.
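To make the logic of one of these indirect methods concrete, here is a minimal sketch of how a list experiment recovers the prevalence of a sensitive item from aggregate counts alone. The data, item wording, group sizes, and prevalence are hypothetical illustrations, not results from the article or from Afrobarometer.

```python
import numpy as np

# Minimal sketch of a list (item count) experiment estimator on simulated,
# hypothetical data. The control group counts how many of 3 innocuous items
# apply to them; the treatment group receives the same list plus one sensitive
# item (e.g., "I paid a bribe to a government official"). Because respondents
# report only a total count, no individual answer to the sensitive item is
# revealed -- to the interviewer or to any bystander.

rng = np.random.default_rng(0)

# Simulated counts: each innocuous item applies with probability 0.5;
# the sensitive item applies with probability 0.2 (the quantity to recover).
control = rng.binomial(3, 0.5, size=500)
treatment = rng.binomial(3, 0.5, size=500) + rng.binomial(1, 0.2, size=500)

# The difference in mean counts estimates the share answering "yes"
# to the sensitive item (about 0.2 by construction here).
estimate = treatment.mean() - control.mean()

# A simple standard error for the difference in means.
se = np.sqrt(treatment.var(ddof=1) / len(treatment)
             + control.var(ddof=1) / len(control))

print(f"Estimated prevalence of sensitive item: {estimate:.3f} (SE {se:.3f})")
```

The difference in mean counts identifies the sensitive-item prevalence precisely because individual answers are never disclosed, which is the same property that makes such designs worth testing where bystanders may overhear responses.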

Overall, the article provides new evidence of substantial bystander effects across a range of survey questions in a large sample of democratic and non-democratic African countries. Securing private interviews is a sine qua non for obtaining accurate data. Where that is not possible, alternative techniques could be deployed to ensure that respondents are free to provide honest assessments and perspectives on important economic, political, and social questions.

Read the full article on IJSRM here.