Since the mid-2000s, several research organizations have set up large probability-based online panel surveys. In these panels, individuals or households are followed over time to study change. When the respondents also represent the general population well, such panels can be used to study change at the population level, or to conduct cross-sectional surveys of the general population. To ensure that findings from these online panels can be generalized to the population, they start from a probability-based sample. After recruiting people offline, those without Internet access are given a computer and an Internet connection at home, so that the panel represents people both with and without Internet access. However, both initial nonresponse in the recruitment phase and attrition over time threaten the external validity of probability-based online panels. This chapter uses the Dutch LISS panel as an example to investigate the extent of nonresponse and attrition bias in such a panel. To do so, we distinguish nine groups of respondents, each following a distinct pattern of dropout. Within each group, we examine the correlates of attrition and compare them to the correlates of initial nonresponse. We show that initial nonresponse and attrition are two very different processes in a probability-based panel survey: the correlates of early attrition differ markedly from the correlates of initial nonresponse. We also find large differences between the correlates of different types of attrition, implying that attrition at various stages of the panel survey is selective. In terms of the contribution to overall nonresponse bias, initial nonresponse contributes more than attrition. The chapter concludes with a discussion of our findings and their implications for survey practice.