Survey Design: Response Bias

This is part 4 of our series on Survey Design; previous segments are available in our archives.

One of the great things about surveys is that they allow you to ask deep and probing questions of your customers. One of the bad things about surveys is that your customers can choose not to answer those questions. In fact, the vast majority of your customers will never fill out a survey you send to them.

Bias is a big problem with surveys. In most cases, the customers who answer your surveys are not representative of the overall population of your customers. Worse yet, there is no guarantee that your customers are answering your survey truthfully!

There are a number of kinds of bias that can infect your data, and I’ll review some of them here so you can try to avoid them:

  • Self-selection Bias. When you send a survey, each recipient can choose to complete it or ignore it. Since respondents self-select, they will not represent the entirety of your audience! You will need to take this into account and interpret the results accordingly.
  • Incentive Bias. One way to overcome self-selection bias is to provide an incentive, typically in the form of money, to complete a survey. However, this introduces Incentive Bias: the respondent is now motivated by money instead of their interest in helping you. That makes them more likely to answer questions carelessly or inaccurately, since they get the incentive for completing the survey, not for completing it truthfully.
  • Completion Bias. The longer your survey, the less likely respondents are to complete the entire thing. That means the results for questions at the end will be less reliable than those at the beginning because the self-selection bias is even stronger. Randomizing the order of your questions can help, but that might not be possible, so you’ll need to take the completion rate for each question into account when interpreting results (see the sketch below this list).
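
To make completion bias concrete, here is a minimal sketch of how you might compute per-question completion rates. The data shape (a list of dicts keyed by question ID) and the question IDs are assumptions for illustration, not a real schema:

```python
# Minimal sketch: per-question completion rates. Assumes each response is a
# dict of question ID -> answer, with missing keys (or None) for skipped
# questions; the data shape and question IDs are hypothetical.

def completion_rates(responses, question_ids):
    """Return {question_id: fraction of respondents who answered it}."""
    total = len(responses)
    return {
        qid: round(sum(1 for r in responses if r.get(qid) is not None) / total, 2)
        for qid in question_ids
    }

# Later questions typically show lower completion as respondents drop off.
responses = [
    {"q1": "Daily", "q2": 4, "q3": "Yes"},
    {"q1": "Weekly", "q2": 3},   # stopped after q2
    {"q1": "Daily"},             # stopped after q1
]
print(completion_rates(responses, ["q1", "q2", "q3"]))
# {'q1': 1.0, 'q2': 0.67, 'q3': 0.33}
```

If a question at the end of the survey was only answered by a third of respondents, treat conclusions drawn from it with correspondingly more caution.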

One way to overcome these kinds of bias is to triangulate results with other sources of data. If you ask your customers how often they use your product, you can compare the results with the data from your analytics system. If they claim to use the product every day but your data says they use it once a week, you know you have some bias in your survey results.
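
Here is a rough sketch of what that kind of triangulation might look like. The field names, the frequency-to-sessions mapping, and the 2x threshold are all illustrative assumptions to show the idea, not a prescription:

```python
# Rough sketch: flag customers whose claimed usage is well above what analytics
# observed. The field names, the frequency-to-sessions mapping, and the 2x
# threshold are illustrative assumptions, not a real schema.

REPORTED_SESSIONS_PER_WEEK = {"Daily": 7, "Weekly": 1, "Monthly": 0.25}

def usage_gaps(survey_rows, sessions_last_week):
    """Yield (customer_id, claimed, observed) where the claim looks inflated."""
    for row in survey_rows:
        claimed = REPORTED_SESSIONS_PER_WEEK.get(row["usage_frequency"], 0)
        observed = sessions_last_week.get(row["customer_id"], 0)
        if claimed > 2 * max(observed, 1):  # crude threshold; tune for your data
            yield row["customer_id"], claimed, observed

survey_rows = [
    {"customer_id": "c1", "usage_frequency": "Daily"},
    {"customer_id": "c2", "usage_frequency": "Weekly"},
]
sessions_last_week = {"c1": 1, "c2": 1}  # pulled from your analytics system

for customer, claimed, observed in usage_gaps(survey_rows, sessions_last_week):
    print(f"{customer}: claims ~{claimed} sessions/week, analytics shows {observed}")
```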

Keep in mind that surveys don’t need to be anonymous. You can tie survey responses back to specific customers and combine their responses with everything else you know about them. Customers may be more truthful when they are anonymous, so it is up to you to decide whether tying the data back to specific customers is worth any additional bias.
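
If you do keep responses identifiable, combining them with your customer records is a simple join. A minimal sketch, with hypothetical customer_id, plan, tenure_months, and satisfaction fields:

```python
# Minimal sketch: join identifiable survey responses with existing customer
# records. The customer_id, plan, tenure_months, and satisfaction fields are
# hypothetical.

customers = {
    "c1": {"plan": "enterprise", "tenure_months": 18},
    "c2": {"plan": "free", "tenure_months": 2},
}

survey_responses = [
    {"customer_id": "c1", "satisfaction": 9},
    {"customer_id": "c2", "satisfaction": 4},
]

# Merge each response with what you already know about that customer,
# so you can slice answers by plan, tenure, and so on.
enriched = [
    {**response, **customers.get(response["customer_id"], {})}
    for response in survey_responses
]
print(enriched)
```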

Tomorrow we’ll jump into the best part of surveys – dimensioned analysis!


Quote of the Day: “So, don’t ask me no questions / And I won’t tell you no lies” – Lynyrd Skynyrd, “Don’t Ask Me No Questions”