
Dependent Interviewing and the risk of correlated measurement errors

Longitudinal surveys ask the same people the same questions over time, so questionnaires tend to become rather boring for respondents after a while. “Are you asking me this again? You asked that last year as well!” is what many respondents probably think during an interview. As methodologists who manage panel surveys, we know this process may be rather boring, but in order to document change over time, we need to ask respondents the same questions over and over.
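One way to make repeated questions less tedious is proactive dependent interviewing: instead of re-asking a question from scratch, the questionnaire feeds forward the answer from the previous wave and asks the respondent to confirm or update it. A minimal sketch of that routing logic, with entirely hypothetical function and variable names:

```python
def dependent_question(prior_answer, topic="employer"):
    """Build this wave's question text from last wave's answer.

    With no prior answer (e.g. a new panel member), fall back to the
    standard independent question.
    """
    if prior_answer is None:
        return f"What is your current {topic}?"
    # Proactive dependent interviewing: confirm rather than re-ask.
    return (f"Last year you told us your {topic} was '{prior_answer}'. "
            f"Is that still the case?")

print(dependent_question("Acme Ltd"))
print(dependent_question(None))
```

Confirming a fed-forward answer is faster for the respondent, though (as the heading suggests) it risks correlated measurement errors across waves: an error recorded in one wave can be carried forward into the next.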

Measurement and nonresponse error in panel surveys

I am spending time at the Institute for Social and Economic Research in Colchester, UK, where I will work on a research project that investigates whether there is a trade-off between nonresponse and measurement errors in panel surveys. Survey methodologists have long believed that multiple survey errors have a common cause. For example, when a respondent is less motivated, this may result in nonresponse (in a panel study, attrition) or in reduced cognitive effort during the interview, which in turn leads to measurement errors.
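The common-cause idea can be illustrated with a toy simulation in which a single latent "motivation" score drives both the chance of dropping out and the size of measurement error. All quantities and parameters here are invented for illustration, not taken from the project:

```python
import random

random.seed(1)

def simulate_respondent():
    """One respondent: low motivation raises both attrition risk and error."""
    motivation = random.random()                 # latent motivation in [0, 1]
    drops_out = random.random() > motivation     # low motivation -> attrition
    true_value = random.gauss(50, 10)            # the quantity we try to measure
    # Low motivation -> less cognitive effort -> noisier report.
    error_sd = 2 + 8 * (1 - motivation)
    reported = true_value + random.gauss(0, error_sd)
    return drops_out, abs(reported - true_value)

sample = [simulate_respondent() for _ in range(1000)]
stayer_err = [e for d, e in sample if not d]
dropout_err = [e for d, e in sample if d]

print("mean |error|, stayers:  ", sum(stayer_err) / len(stayer_err))
print("mean |error|, dropouts: ", sum(dropout_err) / len(dropout_err))
```

Because the same latent variable feeds both mechanisms, the respondents most likely to attrit are also those who report with the most error, which is exactly why reducing one error source (e.g. by keeping reluctant respondents in the panel) might inflate the other.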

Planned Missingness

I recently gave a talk at an internal seminar on planned missingness for a group of developmental psychologists. The idea behind planned missingness is that you can shorten interview time or reduce costs if you decide, as a researcher, not to administer all your instruments to everyone in your sample. When you either randomly assign people to receive a particular instrument, or do so by design (e.g. only collecting biomarkers in an at-risk group), your missing data will be Missing Completely At Random (MCAR) or Missing At Random (MAR), respectively.
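A minimal sketch of the random-assignment variant, assuming a hypothetical three-form design in which every respondent is randomly assigned two of three question modules (so the omitted module is missing completely at random):

```python
import random

random.seed(42)

MODULES = ["A", "B", "C"]  # illustrative survey instrument blocks

def assign_forms(respondent_ids):
    """Randomly drop one module per respondent -> missingness is MCAR."""
    assignments = {}
    for rid in respondent_ids:
        skipped = random.choice(MODULES)
        assignments[rid] = [m for m in MODULES if m != skipped]
    return assignments

forms = assign_forms(range(6))
for rid, administered in forms.items():
    print(rid, administered)
```

Because the omitted module depends only on the coin flip and not on any respondent characteristic, the resulting gaps are MCAR. In the design-based case (biomarkers collected only in the at-risk group), missingness instead depends on an observed variable (risk status), which is what makes it MAR rather than MCAR.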

Is panel attrition the same as nonresponse?

All of my research is focused on methods for collecting and analyzing panel survey data. One of the primary problems of panel survey projects is attrition, or drop-out: over the course of a panel survey, many respondents decide to no longer participate. Last July I visited the panel survey methods workshop in Melbourne, at which we had extensive discussions about panel attrition: how to study it, what its consequences (bias) are for survey estimates, and how to prevent it from happening altogether.

Panel conditioning

In late August of 2011 I attended the Internet Survey Methodology Workshop, which brought together people from academia, official statistics, and market research agencies. One of the issues discussed there has kept me thinking since: the topic of panel conditioning. Some people seem really worried that respondents in panel surveys start behaving or thinking differently because of repeated participation in a survey. Panel conditioning is closely linked with the issue of ‘professional’ respondents.