How to perform smart sampling and data checking?

Drawing on 10,000+ projects run on the platform, our market research experts have compiled common concerns about sampling and data checking, along with their suggestions.

Is there a need to set a more specific predefined panel?

Generally, it is ideal to define a specific predefined panel so that the survey respondents closely match your target audience. However, if your target audience is very niche, you can forgo some criteria and specify a broader predefined panel to optimise the sampling cost.

For illustration, Research Project A targets a niche audience of active users of a certain product from a certain brand. The sample is further narrowed with the criterion of using a certain range of formats or sizes of the product, because those are the most commonly purchased. In this case, the sample is difficult to reach and thus requires a higher sampling cost. Because the specified formats or sizes are already the most commonly purchased, we recommend specifying a more general predefined panel (i.e. active users of the product from the brand) to achieve an optimal sampling cost.

Should I accept respondents who provided short and quick open-ended responses?

If you have included open-ended questions such as likes/dislikes feedback questions, oftentimes you will receive short and quick responses such as “nothing” or “I don’t know”. You might feel inclined to remove these respondents, but we would like to reassure you that this is not a problem and there is no need to exclude them.

Due to the nature of open-ended questions, some respondents view the questions as time-consuming and mentally overwhelming to answer regardless of the incentives offered. In addition, those quick responses can be accepted as “neutral responses” that represent a neutral stance. It might be better to have more respondents sharing their opinions and thoughts, but it is also not realistic to assume everyone has something to write about.

Therefore, we suggest keeping these quick and short responses as long as they are not deliberately fraudulent. To reduce such occurrences, you can reduce the number of open-ended questions in your survey. In addition, we recommend phrasing the questions more precisely and engagingly so respondents are more willing to express their thoughts.

Should I accept some form of inconsistent or inattentive responses from respondents?

If you have implemented interrelated questions (e.g. a question about purchase frequency and another about purchase timeframe), you might notice that some respondents can provide inconsistent or inattentive responses. Naturally, you may think of using conditional display logic for the questions or question options to ensure that respondents will answer the questions more accurately.

However, suppose you have many questions that are related to each other. In that case, it will take a significant amount of time to implement complicated logic, which might create unexpected logic errors in the survey or restrict respondents too much in their responses.

We propose that it is more feasible to accept a certain number of responses that are not fully attentive or consistent to save time and cost, at least for questions that are not your primary research focus, such as demographic questions. Although not desirable, it is realistic to anticipate that respondents might be careless and not attentive enough to remember their answers to previous questions. This is especially apparent when the survey is longer and more complicated, e.g. multiple similar questions asked across a series of product concepts.

As long as the responses are not deliberately fraudulent, a certain degree of leeway for response inconsistency is acceptable in exchange for less complicated display logic that could otherwise be prone to errors.

Alternatively, we recommend including a prompt or reminder about their answers to the previous related questions, such as “You have selected …… in the previous question”, within your question. This will prompt them to respond with their previous answer in mind. You can implement this using JavaScript to customise the survey.
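As a minimal sketch of this idea: the function names, the element id, and the way earlier answers are looked up below are all assumptions for illustration; replace them with your own survey platform's identifiers and answer-retrieval mechanism.

```javascript
// Build the reminder sentence from a previously recorded answer.
// Returns an empty string if the earlier question was skipped.
function buildReminder(previousAnswer) {
  if (!previousAnswer) {
    return "";
  }
  return `You have selected "${previousAnswer}" in the previous question.`;
}

// Prepend the reminder to a follow-up question's text on the page.
// "answers" is a hypothetical object of recorded answers keyed by
// question id, and "elementId" is the id of the question's text element.
function showReminder(answers, questionId, elementId) {
  const reminder = buildReminder(answers[questionId]);
  const el = document.getElementById(elementId);
  if (el && reminder) {
    el.textContent = reminder + " " + el.textContent;
  }
}
```

For example, calling `showReminder(answers, "purchase_frequency", "q12-text")` before the follow-up question is shown would display the respondent's earlier purchase-frequency answer above the new question.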

Get in touch for pre-defined panel and survey customisation

If you would like our assistance in obtaining a specific sample that is not listed in the existing predefined panels, or in implementing research customisation, please feel free to contact us for a quote. Our team of research experts are here to help.

Written on 25 May 2022 by:
Jun Jie Chow
Market Researcher

Read these articles next:

- Example Conjoint Report on Ice-cream (Research tools, Case studies; 2 October 2019): Example conjoint analysis report for preferences in ice-cream cones.
- Random effect models with lmer function in R (Research tools; 19 August 2021): Random effects are everywhere in survey data. Let's try to do appropriate modelling for them in R!
- How to Do Segmentation (subgroup analysis)? (Target audience, Research tools; 7 July 2020): Learn how to do segmentation on the platform with this visual guide.