Are item-missings related to later attrition?
A follow-up on last month's post. Respondents do seem to be less compliant in the waves before they drop out of a panel survey. This does not, however, necessarily lead to worse data. So, what else do we see before attrition takes place? Let's have a look at missing data:
First, we look at missing data in a sensitive question on income amounts. Earlier studies (here, here, here) have already found that item nonresponse on sensitive questions predicts later attrition. I find that item nonresponse does increase before attrition, but only because respondents become more likely to refuse to give an answer. That increase is largely driven by respondents who will later refuse to participate in the study as a whole. So, item refusals are a good predictor of later study refusals. The proportion of “Don’t know” answers, by contrast, does not increase over time.
Missing income data in the BHPS in the 5 waves before attrition
Does this finding for a sensitive question extend to all survey questions? No. Across all questions combined, I find that refusals increase before attrition takes place, but from a very low base (see the Y-axis scale in the figure below). Moreover, there is no difference between the groups: those who drop out of the survey do not have more item-missings than respondents who are “always interviewed”. It may seem odd that item-missings increase even for respondents who always happily participate. I suspect, however, that this is because both interviewers and respondents may have known in the last wave(s) that the BHPS was coming to an end after 18 years of interviewing.
Missing data for all survey questions in the BHPS in the waves before attrition
What to do with this information? It seems that later study refusals can be identified using a combination of item nonresponse and survey compliance indicators. Once these respondents are identified, the next step would be to target them with survey design features that try to prevent attrition. These design features should address the concerns and motivations that lead such respondents to drop out of the survey.
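As a rough illustration of the identification step, here is a minimal sketch that flags respondents whose item-refusal counts rise across waves. All field names and the threshold are invented for illustration; real panel data (such as the BHPS) would require its own coding of item refusals and compliance indicators.

```python
# Hypothetical sketch: flag respondents whose item refusals rise across waves,
# as candidates for attrition-prevention targeting.
from collections import defaultdict

def flag_likely_refusers(records, min_increase=2):
    """records: iterable of (respondent_id, wave, n_item_refusals) tuples.

    Returns the set of respondent ids whose refusal count in their latest
    observed wave exceeds that of their first wave by at least `min_increase`
    (an arbitrary illustrative threshold).
    """
    by_resp = defaultdict(dict)
    for rid, wave, n_refusals in records:
        by_resp[rid][wave] = n_refusals
    flagged = set()
    for rid, waves in by_resp.items():
        first, last = min(waves), max(waves)
        if waves[last] - waves[first] >= min_increase:
            flagged.add(rid)
    return flagged

# Example: respondent "B" shows a rising refusal count, "A" does not.
records = [
    ("A", 1, 0), ("A", 2, 0), ("A", 3, 1),
    ("B", 1, 0), ("B", 2, 2), ("B", 3, 4),
]
print(flag_likely_refusers(records))  # {'B'}
```

In practice one would combine such an indicator with the compliance measures discussed in last month's post, rather than rely on a single threshold.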
- Do respondents become sloppy before attrition?
- measurement and nonresponse error in panel surveys
- Imagine we have great covariates for correcting for unit nonresponse...
- Personality predicts the likelihood and process of attrition in a panel survey
- Dependent Interviewing and the risk of correlated measurement errors