Respondent quality and consistency

In market research fieldwork on October 7, 2011 by sdobney

Having just finished processing another survey, we've run into the issue of respondent quality again, this time with regard to an online panel. But it comes up on all types and modes of survey, and it is (or at least should be) a core part of checking data in.

Respondent quality in online research tends to get a lot of attention because a great deal of online research uses panels. A panel has often itself been recruited online, so it's difficult to know who has been recruited, or to verify and validate who they are. With large numbers of panels offering monetary incentives to take part, there's a tendency to find so-called 'professional respondents': people in it for the money, whose answers may reflect an attempt to complete as many surveys as possible in the shortest time rather than to give properly considered responses. We've seen this from consumers and from specialist panels such as hospital consultants.

Respondent quality has become a big potential issue as online research has become the dominant form of market research. However, respondent quality touches all market research surveys, whatever the mode. And it's not just professional respondents. Respondents may give poor-quality responses for a variety of reasons: the question might be badly phrased, for a start, or it might assume a level of knowledge or interest the respondent simply doesn't have.

In a pan-European telephone survey for a global courier company we actually graded all the respondents according to how well their answers matched known sales data, so we could see how responses varied by level of knowledge or recall. In face-to-face research we’ve seen respondents hurried and harried by interviewers into giving answers and, like the best of people, the respondent tries to please the interviewer. In a face-to-face pilot for a major (and lengthy) questionnaire into green issues, we almost fell at the first hurdle when the opening question ‘What environmental issues can you think of?’ was countered with a blank look from the elderly respondent and the question ‘What does environment mean?’

Respondent quality and errors are a fact of life. In trade-off questionnaires like conjoint, we can run consistency checks, and we've found about 4-5% of respondents in the UK and about 6-8% of respondents in the US giving inconsistent or irrational answers. In modelling we assume there is an error term in the data respondents give us. We all make wrong, bad or erroneous choices and answers some of the time.
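One common form of trade-off consistency check is to repeat a choice task later in the questionnaire and see whether each respondent answers it the same way both times. A minimal sketch in Python, with an invented data layout (the original doesn't specify one):

```python
# Sketch: flag inconsistent respondents in a choice-based (conjoint-style)
# survey by repeating one choice task and comparing the two answers.
# The data layout here is hypothetical, for illustration only.

def consistency_rate(responses):
    """responses: list of (first_answer, repeat_answer) tuples, one per
    respondent. Returns (consistency rate, indices of inconsistent respondents)."""
    inconsistent = [i for i, (a, b) in enumerate(responses) if a != b]
    rate = 1 - len(inconsistent) / len(responses)
    return rate, inconsistent

answers = [("A", "A"), ("B", "B"), ("A", "C"), ("B", "B")]
rate, flagged = consistency_rate(answers)
print(rate)     # 0.75 -- one of four respondents changed their answer
print(flagged)  # [2]
```

In practice a single changed answer wouldn't condemn a respondent (the error term above allows for honest slips); it's the pattern across several repeated or logically linked tasks that matters.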

To help, smart questionnaires will include consistency checks both within the survey and outside it. In-survey checks might ask the same question in slightly different ways, or check trade-off questions for consistency. For instance, you might ask about brand recall and then which brands have been bought: it's difficult to buy a brand you can't remember (though we do all make mistakes, so it is possible to remember the brand only at the purchase question). Out-of-survey consistency checks might include comparisons against known external data, or the use of invented companies or brands to test the validity of answers.
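The recall-versus-purchase check described above amounts to a simple set comparison. A sketch in Python, with hypothetical field names:

```python
# Sketch: flag brands a respondent reports buying but did not recall
# earlier in the survey. Field names are hypothetical.

def recall_purchase_flags(respondent):
    recalled = set(respondent["brands_recalled"])
    bought = set(respondent["brands_bought"])
    # Buying a brand you couldn't recall is possible (memory is fallible),
    # but it's worth flagging for a closer look.
    return bought - recalled

r = {"brands_recalled": ["Acme", "Globex"],
     "brands_bought": ["Acme", "Initech"]}
print(recall_purchase_flags(r))  # {'Initech'}
```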

We, like most research agencies, do a detailed questionnaire-by-questionnaire check of responses (eye-balling the data). We will drop questionnaires that seem 'funny' and, if a panel is involved, ask the panel provider to check the bona fides of the respondents (we've done this for medical panels where doctors just weren't giving considered opinions). We will check for internal consistency, time spent on the questionnaire and the style of answers (e.g. key-pressing), run statistical analysis against other respondents for potential outliers, and validate written-in answers for sense and relevance. For trade-off studies, we've found it's very easy to pick up inconsistent respondents. And where we have data, we will check against known information. We don't expect perfect answers, and sometimes that shows too. For more information on fieldwork and quality considerations for online research, see our main site.
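Two of the routine checks mentioned above, time spent on the questionnaire and key-pressing style, can be automated as simple flags. A minimal sketch, with purely illustrative thresholds:

```python
# Sketch: flag 'speeders' (implausibly fast completion) and
# 'straight-liners' (the same key pressed down a whole grid of
# rating questions). Thresholds are illustrative, not recommendations.

def quality_flags(duration_secs, grid_answers, min_secs=180):
    """Return a list of data-quality flags for one respondent."""
    flags = []
    if duration_secs < min_secs:
        flags.append("speeder")
    if len(set(grid_answers)) == 1:
        flags.append("straight-liner")
    return flags

print(quality_flags(95, [3, 3, 3, 3, 3]))   # ['speeder', 'straight-liner']
print(quality_flags(420, [4, 2, 5, 3, 4]))  # []
```

Flags like these are a triage step, not a verdict: a flagged questionnaire still gets eye-balled before it is dropped.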

