Dependent Interviewing and the risk of correlated measurement errors

Longitudinal surveys ask the same people the same questions over time, so questionnaires tend to become rather boring for respondents after a while. “Are you asking me this again? You asked that last year as well!” is what many respondents probably think during an interview. As methodologists who manage panel surveys, we know this process may be rather boring, but in order to document change over time, we simply need to ask respondents the same questions over and over.

Some measures of change over time become biased if we simply repeat questions year after year. For example, we know that if we ask respondents about their occupation in two consecutive years, fewer than half of them receive the same occupational codes both times. We know from other statistics (e.g. tax returns) that this is not true: most people stay in the same occupation over time. Now, you may think, dear reader, that this is probably due to the fact that occupation is rather difficult to measure and code in general, and you are right. Unreliable questions will lead to a lot of spurious change over time.

Dependent Interviewing helps to make codes consistent over time and reduce such spurious change. The idea is that, instead of coding occupation independently year after year, you ask respondents in year 2 the question “Last year, you said you were a bank teller, is that still the case?”. There are many different variants of this Dependent Interviewing question, and the exact wording matters for the outcomes, especially because we do not want respondents to say “yes” too easily to the questions we ask.

“Last year, you told me you worked as a bank teller, is that still the case?”
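In a computer-assisted interview, this kind of question is typically built by pre-loading the respondent's answer from the previous wave into the question text, with a fallback to the independent question when no prior answer exists (new panel members, or item nonresponse last year). A minimal sketch of that routing logic, with a hypothetical function name and wording taken from the example above:

```python
def proactive_di_question(preload=None):
    """Build a proactive dependent-interviewing prompt.

    `preload` is the occupation reported in the previous wave, if any.
    Falls back to the independent question when no prior-wave answer
    is available.
    """
    if preload:
        return ("Last year, you said you worked as a "
                f"{preload}. Is that still the case?")
    return "What is your current occupation?"


# A returning respondent gets the dependent question:
print(proactive_di_question("bank teller"))
# A new respondent gets the independent question:
print(proactive_di_question())
```

A “yes” to the dependent question would then carry the previous year's occupational code forward unchanged, which is exactly where the risk of false confirmations discussed below comes in.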

Recently, a paper I wrote on the effects of various forms of Dependent Interviewing came out in Field Methods. It was actually the first paper I wrote for my Ph.D., and I started work on it in 2006, so it has been quite a journey to get this story on paper and published. I am very happy to see it in print now. We ran an experiment in which we tried out different DI designs in a four-wave panel study, to assess the data quality of each design. Specifically, we looked at whether respondents might falsely confirm data from the previous year that we knew contained measurement error. The bottom line of the study is that when Dependent Interviewing is applied to income amount questions over time, it does improve data quality, and we do not need to worry much about respondents wrongly agreeing to pre-loaded data from the previous year. Read the full paper here.

Peter Lugtig
Associate Professor of Survey Methodology

I am an associate professor at Utrecht University, department of Methodology and Statistics.
