Survey design has a significant impact on respondent engagement and data quality. A study spanning 1,500 surveys and 800,000 responses found that more complex surveys, those with more questions and answer matrices, produced lower respondent engagement scores. Lower engagement in turn led to higher abandonment rates, less consistent responses and lower-quality data overall. The findings show that survey design must be optimized to maintain respondent engagement and ensure high-quality research data.
MarketTools has collected SurveyScore data for more than 10,000 surveys, with over 2.6 million completes. These surveys span product categories (such as food and beverage, financial, technology, entertainment, health and beauty, health care and travel) and research methods (such as concept screening, line and package optimization, and attitude and usage studies).

Table 1: Survey Complexity (High to Low Engagement)

Design attributes                  Moderate complexity   Medium complexity   High complexity
SurveyScore                        35                    9                   4
Survey length (min)                9                     16                  17
Total survey pages                 38                    39                  43
Total number of questions          40                    41                  45
Avg. number of rows/matrix         4                     13                  13
Avg. number of columns/matrix      5                     6                   6
Total number of matrix questions   8                     8                   8
Our team sought to determine whether certain survey design variables could reliably predict the composite engagement measure of respondent behavior and perception that comprises TrueSample SurveyScore. We built a model to predict engagement using survey design variables and the TrueSample SurveyScore database as inputs. Predictability is an indication that survey design impacts engagement in a consistent way, implying that we could recommend adjustments to the design variables that would minimize adverse effects on engagement. Specifically, we modeled the impact of more than 20 survey design variables (independent variables) that are within the control of survey designers, such as survey length and total word count, on several respondent engagement measures (dependent variables) reflecting the respondents' perception of the survey and their behavior during the survey.

Clear indication

The research revealed that a multivariate model capturing the complex interaction among design variables is able to predict overall engagement, comprising both experiential and behavioral variables. The fact that the impact of these variables is predictable provides a clear indication that survey design directly influences respondent perception and behavior, i.e., engagement, in a consistent way. This means that survey designers do have some degree of control in improving engagement. It also means that the SurveyScore can be predicted before deploying a survey, to help guide design modifications.

We uncovered another interesting finding when we examined the influence of particular survey design elements on specific aspects of engagement, such as survey rating or partial rates. While survey length proved to be generally predictive of most respondent engagement measures, there was wide variation in the design variables that were most influential in driving various measures of engagement. For example, for the survey rating measure, one of the most predictive design variables was the elapsed time per page of the survey. For the speeding measure, however, elapsed time per page was not even in the top five most important design variables.

Thus, adjusting just one parameter may not be sufficient to elicit desirable behavior from respondents, nor will it singlehandedly improve their perception of the survey-taking experience. Instead, the findings reveal that engagement is driven by a complex interaction among design variables. This means that simple survey design guidelines or rules are inadequate for motivating the desired respondent engagement. There is no axiom that applies in all cases, such as, "Surveys that require more than 20 minutes result in poor respondent engagement." In fact, our researchers uncovered several examples of long surveys that had a higher-than-normal survey rating as well as a lower-than-normal partial rate, which would run contrary to what one would expect if length alone were a deciding variable. Conversely, we found examples of short surveys that had a lower-than-normal survey rating because of the design of other variables.

An effect on quality

With the impact of survey design on respondent engagement established, the research team endeavored to determine whether engagement had an effect on data quality. The TrueSample SurveyScore database allowed us to test this hypothesis. MarketTools fielded three surveys with varying levels of complexity, categorized as moderate, medium and high. We analyzed 1,000 completes for each survey. The experimental surveys had the same series of questions about demographics, products purchased, etc., but differed based on the number of products respondents said they purchased. The level of complexity increased as more products were chosen and more brand attribute questions were displayed. In the moderate category, respondents were asked one question per product. In the medium-complexity category, respondents received 17 brand attribute questions per product. In the high-complexity category, respondents were asked 17 questions for every product chosen, plus additional open-ended questions.

We computed and compared the SurveyScore for the three surveys. Predictably, it dropped precipitously at the higher complexity levels: the medium- and high-complexity surveys received extremely low scores, as shown in Table 1.

Next, we conducted a series of statistical tests to evaluate the effect of respondent engagement on data quality. By conducting different analyses, we were able to examine data quality from various angles for a more comprehensive review. Specifically, we investigated the following. Will unengaging surveys:

• Increase the odds of sample bias?
• Make respondents more apt to answer the same question inconsistently?
• Make respondents more prone to random answer choices?
• Make respondents more likely to provide inconsistent answer choices?
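The article does not publish the test procedures behind these questions, but the second check, answering the same question inconsistently, could be approximated with a simple test-retest match rate on a question asked twice in the same survey. A minimal sketch with hypothetical answer data (the function name and inputs are illustrative, not MarketTools' actual method):

```python
def consistency_rate(first_answers, repeat_answers):
    """Share of respondents giving the same answer to a question
    asked twice within the survey (a simple trap-question check)."""
    assert len(first_answers) == len(repeat_answers)
    matches = sum(a == b for a, b in zip(first_answers, repeat_answers))
    return matches / len(first_answers)

# Hypothetical data: each list holds one answer per respondent,
# for the same question asked at two points in the survey.
engaged   = consistency_rate(["A", "B", "A", "C", "B"],
                             ["A", "B", "A", "C", "B"])
unengaged = consistency_rate(["A", "B", "A", "C", "B"],
                             ["C", "B", "A", "A", "D"])
print(engaged, unengaged)  # 1.0 0.4
```

A lower rate on the unengaged panel would be one concrete signal of the inconsistency the list above asks about.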
…opposed to one of the available products. Once we removed the "none" option from our model, the prediction accuracy dropped significantly for the high-complexity survey. In addition, the lower-scoring surveys had more violations in price selection order, meaning the respondents tended to violate the expected order of selecting a lower unit price over a higher one. The net result: surveys with a low SurveyScore translated to lower predictability and thus to lower data quality.

Take responsibility

Our conclusion? Researchers must take responsibility for data quality by removing bad respondents and designing surveys that keep good respondents engaged. Research professionals now have evidence that survey design not only influences whether respondents abandon a survey but also impacts the data for those who complete it.

The ability to predict the effect of various survey design variables on respondent engagement will help survey designers maximize engagement and increase the reliability of their data. Researchers no longer have to assume that a long survey will jeopardize the quality of the results, since we have shown that it is possible to compensate for the adverse effects of certain design variables by adjusting others. By using engagement measurement and prediction tools, researchers can know that survey design affects data quality, can measure engagement to help improve survey design and can optimize design to enhance the reliability of results.
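The price-selection-order check mentioned earlier is not specified in detail; one plausible reading counts choices where a respondent picked an option with a higher unit price than the cheapest option shown. A hedged sketch of that reading (all names and data hypothetical):

```python
def price_order_violations(choices):
    """Count choices where the respondent picked an option whose unit
    price exceeds the cheapest option shown. Illustrative only; not
    the study's actual scoring rule."""
    violations = 0
    for shown_unit_prices, chosen_index in choices:
        if shown_unit_prices[chosen_index] > min(shown_unit_prices):
            violations += 1
    return violations

# Each entry: (unit prices shown, index of the option chosen).
choices = [
    ([1.99, 2.49, 2.99], 0),  # picked the cheapest: no violation
    ([1.99, 2.49, 2.99], 2),  # picked the priciest: violation
    ([0.99, 0.89], 1),        # picked the cheaper: no violation
]
print(price_order_violations(choices))  # 1
```

A higher violation count on low-SurveyScore surveys would mirror the pattern the study reports.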