We’ll be the first to admit it – data quality processes are not the most exciting topic, and that reputation can make people hesitant to dig deeper into the matter. However, in this age of DIY surveys and online polling, it’s easy for data quality to suffer.
Read below for how to apply simple quality assurance techniques to the survey research process.
Quality Control vs. Quality Assurance:
First, it is important to distinguish between quality control (QC) and quality assurance (QA).
Quality Control Processes:
- Employed after research has been conducted to identify potential faults within collected data
- Reactive vs. Proactive
- Can be subjective
Quality Assurance Processes:
- Employed before research has been conducted to limit potential faults within collected data.
- Proactive vs. Reactive
- More objective
QC is certainly a necessary piece of a well-rounded strategy, but too often it receives more emphasis than quality assurance.
Quality assurance supplements what the QC process lacks – ultimately forming an approach that is proactive instead of reactive.
At Elevated Insights, we do everything we can to build in quality assurance techniques throughout the quantitative research process. Below I’ll share just a few of the ways we go about this.
1) Design your survey in a way that is conducive to data quality:
While there are a multitude of survey design techniques for data quality, some of the most important we incorporate into our surveys are:
- Mobile optimization – as access to smartphones and tablets increases, more and more online surveys are taken on a mobile device. When designing and testing surveys, Elevated Insights takes a “mobile-first” approach.
- Survey length – we recommend keeping surveys to 10 minutes or less. Surveys that exceed this threshold are more vulnerable to respondent fatigue, which in turn can lead to poor-quality data and/or lower response rates.
- Managing engagement & gamification – we choose to employ more engaging questions (slider scales, drag and drops, etc.), use images where applicable, and “gamify” certain question types. Below is an example of a MaxDiff question where respondents interact with the question by playing a card-sort game.
2) Leverage your survey platform to test your survey:
Many survey platforms provide built-in tools that decrease the likelihood of sending out a flawed survey. These platforms typically allow you to:
- Manually test through a “preview” of the survey
- Spell check
- Automatically check survey accessibility
- Generate test, or “simulated”, responses
Generating simulated data can be extremely useful to identify errors in the survey related to survey flow/logic, quotas, terminates, etc. These errors can be easily identified by checking this simulated data within built-in reporting tools or even by manually checking the raw data.
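Checks like these can also be scripted against the exported raw data. Below is a minimal sketch of how simulated responses might be scanned for skip-logic violations and quota overages; the field names (`owns_pet`, `pet_type`, `region`) and the specific rules are hypothetical assumptions, not part of any particular platform’s export format.

```python
# Sketch: scan simulated survey responses for logic and quota errors.
# Field names below are illustrative assumptions.

def check_skip_logic(responses):
    """Flag responses where the pet_type follow-up was answered
    even though the respondent said they own no pet."""
    errors = []
    for i, r in enumerate(responses):
        if r.get("owns_pet") == "No" and r.get("pet_type") is not None:
            errors.append(i)  # follow-up should have been skipped
    return errors

def check_quota(responses, field, cap):
    """Count responses per value of `field`; return values over the cap."""
    counts = {}
    for r in responses:
        counts[r[field]] = counts.get(r[field], 0) + 1
    return {value: n for value, n in counts.items() if n > cap}

simulated = [
    {"owns_pet": "Yes", "pet_type": "Dog", "region": "West"},
    {"owns_pet": "No",  "pet_type": "Cat", "region": "West"},  # logic error
    {"owns_pet": "No",  "pet_type": None,  "region": "East"},
]

print(check_skip_logic(simulated))          # → [1]
print(check_quota(simulated, "region", 1))  # → {'West': 2}
```

Running a pass like this over a few hundred simulated completes surfaces routing and quota mistakes before any real respondent ever sees them.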
3) Create an objective framework for tagging and cleaning responses:
When tagging responses for removal, it is important to first set an objective framework for removing responses from the dataset.
Although tagging responses subjectively on a case-by-case basis may seem like an obvious route to higher-quality data, we must be careful not to unintentionally introduce our own bias.
This can be achieved by setting “response error thresholds” – standards a response must hit before being removed. A couple of examples:
- Tagging responses with more than two “poor data quality indicators” (e.g., selecting a red herring, providing a gibberish open-end, straight-lining)
- Setting a minimum amount of time that must be spent on the survey
The reality is, respondents ARE going to make mistakes (we’re all human). However, this does not necessarily mean their entire response is invalid.
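A threshold framework like the one above could be sketched as follows. The field names, the specific indicator checks, and the cutoff values are all illustrative assumptions – a real framework would use whichever indicators the study design supports.

```python
# Sketch of an objective "response error threshold" framework.
# Field names, checks, and cutoffs are illustrative assumptions.

MIN_SECONDS = 180     # assumed minimum plausible completion time
MAX_INDICATORS = 2    # remove only when a response trips MORE than this

def count_indicators(resp):
    """Count poor-data-quality indicators for one response."""
    indicators = 0
    if resp.get("red_herring_selected"):             # chose a trap answer
        indicators += 1
    if len(set(resp.get("grid_answers", []))) == 1:  # straight-lined a grid
        indicators += 1
    open_end = resp.get("open_end", "")
    if open_end and not any(c in "aeiou" for c in open_end.lower()):
        indicators += 1                              # crude gibberish check
    if resp.get("duration_seconds", MIN_SECONDS) < MIN_SECONDS:
        indicators += 1                              # implausibly fast
    return indicators

def tag_for_removal(responses):
    """Return indices of responses exceeding the error threshold."""
    return [i for i, r in enumerate(responses)
            if count_indicators(r) > MAX_INDICATORS]

responses = [
    {"red_herring_selected": True, "grid_answers": [3, 3, 3, 3],
     "open_end": "zxqwrt", "duration_seconds": 95},              # trips 4
    {"red_herring_selected": True, "grid_answers": [1, 4, 2, 5],
     "open_end": "I liked the flavor", "duration_seconds": 420},  # trips 1
]

print(tag_for_removal(responses))  # → [0]
```

Note that the second respondent trips one indicator (a missed red herring) but is kept – a single mistake does not invalidate an otherwise thoughtful response, which is exactly the point of using a threshold rather than a zero-tolerance rule.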
Elevated Insights would love to assist with your upcoming research needs, and we pledge to deliver quality insights by applying both quality assurance and quality control techniques to our data collection process. Shoot us an email – we would love to talk through ways we can help!
For more information on data quality processes, please see our Beating the Cheaters E-Book, which you can access here.