online research

Designed to engage

What is the impact of survey design on respondent engagement?

By Nallan Suresh and Michael Conklin

snapshot: When researchers effectively use the many facets of survey design at their disposal, they make great strides toward enhancing the respondent’s experience and the quality of the data they provide.

Editor’s note: Nallan Suresh is senior director, panel analytics, at San Francisco research firm MarketTools Inc. He can be reached at nallan.suresh@markettools.com. Michael Conklin is chief methodologist in the Minneapolis office of MarketTools Inc. He can be reached at michael.conklin@markettools.com. To view this article online, enter article ID 20100704 at quirks.com/articles.

© 2010 Quirk’s Marketing Research Review (www.quirks.com). Reprinted with permission from the July 2010 issue. This document is for Web posting and electronic distribution only. Any editing or alteration is a violation of copyright.

The impact of questionable online survey respondents on data quality is well-documented. Previous research-on-research by our firm, MarketTools Inc., has shown that fake, duplicate or unengaged respondents compromise data quality. But what about the design of the survey, which may affect all respondents, with both good and bad intentions?

MarketTools conducted a comprehensive study that examines the effect of survey design on data quality and found that, in order to ensure the quality of research data, researchers must not only remove “bad” respondents from their samples, they must also design surveys that keep the good respondents engaged.

Are interrelated
Experienced researchers have long assumed that survey design, respondent engagement and data quality are interrelated. For example, it seems obvious that long and complex questionnaires will increase the likelihood of undesirable behaviors such as speeding and survey abandonment, and that data quality will suffer if there is a high percentage of unengaged respondents in the survey sample.

As we sought to understand and quantify the effect of survey design on respondent engagement and data quality, we used our firm’s TrueSample SurveyScore measurements from over 1,500 surveys and 800,000 responses to conduct a two-phase research study. Phase 1 of our research evaluated whether survey design influences the way respondents perceive a survey and how they behave while answering survey questions. Phase 2 of our research examined the effect design variables and engagement measures have on the quality of response data. Simply put, we sought to determine whether “good” respondents, driven “bad” by poorly designed and complex surveys, could reduce data quality. If so, we can help researchers optimize their survey design to improve overall data quality.

Show the impact
TrueSample SurveyScore is designed to be an objective measure of survey engagement and to help show researchers the impact that survey design has on engagement. It is a function of both experiential variables, such as respondents’ rating of the survey-taking experience, and behavioral variables, such as survey abandonment and speeding.
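The article treats SurveyScore as a composite of experiential and behavioral signals but does not publish its formula, so the following is only a minimal sketch of how such a composite could be assembled; the field names, weights and 0-100 scaling are assumptions for illustration, not MarketTools’ actual method.

```python
# Illustrative sketch only: the TrueSample SurveyScore formula is not published
# in the article, so the weights and scaling below are assumptions.
from dataclasses import dataclass

@dataclass
class SurveyStats:
    mean_rating: float   # experiential: mean respondent rating of the survey (1-10 scale assumed)
    abandon_rate: float  # behavioral: share of starts that were abandoned (0-1)
    speeder_rate: float  # behavioral: share of completes flagged as speeders (0-1)

def composite_engagement_score(s: SurveyStats) -> float:
    """Blend experiential and behavioral signals into a single 0-100 score.
    Higher ratings raise the score; abandonment and speeding lower it."""
    experiential = (s.mean_rating / 10.0) * 100                       # rescale rating to 0-100
    behavioral_penalty = 100 * (0.6 * s.abandon_rate + 0.4 * s.speeder_rate)
    return max(0.0, min(100.0, 0.5 * experiential + 0.5 * (100 - behavioral_penalty)))

print(composite_engagement_score(SurveyStats(7.8, 0.12, 0.05)))
```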
To date, MarketTools has collected SurveyScore data for more than 10,000 surveys, with over 2.6 million completes. These surveys span product categories (such as food and beverage, financial, technology, entertainment, health and beauty, health care and travel) and research methods (such as concept screening, line and package optimization, and attitude and usage studies).

Table 1: Survey Complexity (High to Low Engagement)

Design Attributes                   Moderate Complexity   Medium Complexity   High Complexity
                                    SurveyScore = 35      SurveyScore = 9     SurveyScore = 4
Survey length (min)                 9                     16                  17
Total survey pages                  38                    39                  43
Total number of questions           40                    41                  45
Avg. number of rows/matrix          4                     13                  13
Avg. number of columns/matrix       5                     6                   6
Total number of matrix questions    8                     8                   8

Our team sought to determine whether certain survey design variables could reliably predict the composite engagement measure of respondent behavior and perception that comprises TrueSample SurveyScore. We built a model to predict engagement using survey design variables and the TrueSample SurveyScore database as inputs. Predictability is an indication that survey design impacts engagement in a consistent way, implying that we could recommend adjustments to the design variables that would minimize adverse effects on engagement. Specifically, we modeled the impact of more than 20 survey design variables (independent variables) that are within the control of survey designers - such as survey length and total word count - on several respondent engagement measures (dependent variables) reflecting the respondents’ perception of the survey and behavior during the survey.
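The article reports building a multivariate model from the SurveyScore database but does not name the model class or the full list of design variables, so the sketch below is a stand-in for that kind of design-variable-to-engagement regression; the synthetic data, the handful of column names and the gradient-boosting choice are assumptions for illustration only.

```python
# Minimal sketch, not MarketTools' actual model: synthetic data stands in for
# the TrueSample SurveyScore database, and the model class is an assumption.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1500  # the study drew on roughly 1,500 surveys
designs = pd.DataFrame({
    "length_min":          rng.uniform(5, 30, n),
    "total_word_count":    rng.uniform(500, 6000, n),
    "total_pages":         rng.integers(10, 60, n),
    "matrix_questions":    rng.integers(0, 15, n),
    "avg_rows_per_matrix": rng.integers(2, 15, n),
})
# Placeholder outcome: engagement falls with length and matrix burden.
survey_score = (60 - 1.2 * designs["length_min"]
                - 1.5 * designs["avg_rows_per_matrix"]
                + rng.normal(0, 5, n))

model = GradientBoostingRegressor()
# High cross-validated R^2 would indicate that design variables predict
# engagement in a consistent way, as the article argues.
print(cross_val_score(model, designs, survey_score, cv=5).mean())
```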
Clear indication
The research revealed that a multivariate model that captures the complex interaction among design variables is able to predict overall engagement, comprised of both experiential and behavioral variables. The fact that the impact of these variables is predictable provides a clear indication that survey design directly influences respondent perception and behavior, i.e., engagement, in a consistent way. This means that survey designers do have some degree of control in improving engagement. This also means that the SurveyScore can be predicted prior to deploying a survey to help guide design modifications.

We uncovered another interesting finding when we examined the influence of particular survey design elements on specific aspects of engagement, such as survey rating or partial rates. While survey length proved to be generally predictive of most respondent engagement measures, there was wide variation in the design variables that were most influential in driving various measures of engagement. For example, for the survey rating measure, one of the most predictive design variables was the elapsed time per page of the survey. For the speeding measure, however, elapsed time per page was not even in the top five most important design variables.

Thus, adjusting just one parameter may not be sufficient to elicit desirable behavior from respondents, nor will it singlehandedly improve their perception of the survey-taking experience. Instead, the findings reveal that engagement is driven by a complex interaction among design variables.

This means that simple survey design guidelines or rules are inadequate for motivating the desired respondent engagement. There is no axiom that applies in all cases, such as, “Surveys that require more than 20 minutes result in poor respondent engagement.” In fact, our researchers uncovered several examples of long surveys that had a higher-than-normal survey rating as well as a lower-than-normal partial rate, which would run contrary to what one would expect if length alone were a deciding variable. Conversely, we found examples of short surveys that had a lower-than-normal survey rating because of the design of other variables.

An effect on quality
With the impact of survey design on respondent engagement established, the research team endeavored to determine whether engagement had an effect on data quality. The TrueSample SurveyScore database allowed us to test this hypothesis. MarketTools fielded three surveys with varying levels of complexity, categorized as moderate, medium and high. We analyzed 1,000 completes for each survey. The experimental surveys had the same series of questions about demographics, products purchased, etc., but differed based on the number of products respondents said they purchased. The level of complexity increased as more products were chosen and more brand attribute questions were displayed. In the moderate category, respondents were asked one question per product. In the medium-complexity category, respondents received 17 brand attribute questions per product. In the high-complexity category, respondents were asked 17 questions for every product chosen, plus additional open-ended questions.

We computed and compared the SurveyScore for the three surveys. Predictably, it dropped precipitously with the higher complexity levels. The medium- and high-complexity surveys received an extremely low score, as shown in Table 1.

Next, we conducted a series of statistical tests to evaluate the effect of respondent engagement on data quality. By conducting different analyses, we were able to examine data quality from various angles for a more comprehensive review. Specifically, we investigated the following.

Will unengaging surveys:

• Increase the odds of sample bias?
• Make respondents more apt to answer the same question inconsistently?
• Make respondents more prone to random answer choices?
• Make respondents more likely to provide inconsistent answer choices?
• Make respondents tend to select “none” as an answer choice?
We examined whether a high abandonment rate could cause bias in completed responses and thereby reduce overall data quality. In other words, as the surveys became more complicated and their SurveyScore dropped, did the makeup of the respondents change and create the potential for biased data?

The answer was yes. As illustrated in the diagram in Figure 1, respondents who completed the medium- or high-complexity surveys were more tolerant of the increased question load (the more products they selected, the more questions they were asked), leading to bias in those groups compared to the group of respondents who completed the moderate survey. The graph on the left of Figure 1 shows that as the number of products selected increased - thereby increasing the number of questions to be answered - the partial or abandonment rate grew for the more complicated surveys.

As shown in the graph on the right of Figure 1, of those respondents who did not abandon the survey, the percentage who selected five products was much lower for the medium- and high-complexity surveys than it was for the moderate survey. So, while the actual data had a higher percentage of respondents that had purchased five products, many of these did not make it through the survey, resulting in sample bias.
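A minimal sketch of the kind of tabulation behind Figure 1 (the figure itself is not reproduced here): it assumes a hypothetical respondent-level table with “cell”, “n_products” and “completed” columns, which are illustrative names rather than the study’s actual data layout.

```python
# Sketch of a Figure 1 style check: abandonment by question load, and the
# product-count mix among completers, compared across complexity cells.
import pandas as pd

resp = pd.DataFrame({  # tiny placeholder sample
    "cell":       ["moderate", "moderate", "high", "high", "high", "medium"],
    "n_products": [5, 2, 5, 5, 1, 3],
    "completed":  [True, True, False, False, True, True],
})

# Partial (abandonment) rate by cell and by number of products selected.
partial_rate = 1 - resp.groupby(["cell", "n_products"])["completed"].mean()

# Share of completes who had selected five products: if this share shrinks in
# the more complex cells, heavy buyers are dropping out and biasing the sample.
completes = resp[resp["completed"]]
five_product_share = completes.groupby("cell")["n_products"].apply(lambda s: (s == 5).mean())

print(partial_rate, five_product_share, sep="\n\n")
```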
Our research also tested whether the respondents’ ability to answer the same questions consistently during a single survey was a function of the survey’s complexity. We measured the consistency of the responses to questions that were repeated in separate sections of the survey, and we found that recall discrepancies increased as the SurveyScore dropped - proof that more complicated surveys lead to inconsistent and unreliable responses and lower data quality.
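A quick sketch of how a repeated-question discrepancy rate can be computed; the column names “q_first” and “q_repeat” are hypothetical stand-ins for a question asked in two separate sections of the same survey.

```python
# Recall-discrepancy sketch: compare answers to the same question asked twice.
import pandas as pd

answers = pd.DataFrame({
    "q_first":  ["brand A", "brand B", "brand A"],
    "q_repeat": ["brand A", "brand C", "brand A"],
})
recall_discrepancy_rate = (answers["q_first"] != answers["q_repeat"]).mean()
print(recall_discrepancy_rate)  # compare this rate across surveys with different SurveyScores
```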
We then measured the consistency of responses across all possible question pairs to develop an inconsistency metric. This metric enabled us to determine if a given selection was random or closer to the expected response. The more unusual this pairing was - meaning the likelihood of its occurrence was low given the incidence of all the other options for these questions - the higher the departure from the expected value and the higher the inconsistency metric. Our finding was that inconsistency increased as the SurveyScore dropped, contributing to lower overall data quality for the more complex surveys (Figure 2).
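The article describes the inconsistency metric only qualitatively, so the sketch below uses surprisal under independence of the marginal option incidences as a stand-in scoring rule; the data frame and the exact formula are assumptions, not the published metric.

```python
# Pairwise inconsistency sketch: score each respondent's answer pairs by how
# unlikely they are given the incidence of each option (surprisal under
# independence of the marginals is used here as a stand-in).
from itertools import combinations
import math
import pandas as pd

answers = pd.DataFrame({          # rows = respondents, columns = questions
    "q1": ["a", "a", "b", "a"],
    "q2": ["x", "x", "y", "y"],
    "q3": ["m", "n", "m", "m"],
})
marginals = {q: answers[q].value_counts(normalize=True) for q in answers.columns}

def inconsistency(row: pd.Series) -> float:
    scores = []
    for qi, qj in combinations(answers.columns, 2):
        expected = marginals[qi][row[qi]] * marginals[qj][row[qj]]
        scores.append(-math.log(expected))   # rarer expected pairs -> higher surprisal
    return sum(scores) / len(scores)

answers["inconsistency"] = answers.apply(inconsistency, axis=1)
print(answers["inconsistency"])  # average by survey and relate to its SurveyScore
```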
Finally, we sought to determine if surveys with a low SurveyScore caused respondents to lose focus and provide inconsistent or unpredictable responses. To measure the choice predictability of each of the surveys, we used a discrete choice model (DCM) exercise (Figure 3). Specifically, we tried to predict respondents’ product selections on two tasks based on their selections on seven other tasks (DCM sections were identical across all surveys). We asked, for example, that respondents select the one product they would prefer to buy from each page, if any, and based on their answers to previous questions, we tried to predict their response. The respondents could also choose “none” as a response, indicating that they would choose none of the products.

During this exercise, we noticed that the accuracy of the prediction (when the selection of “none” was also included) was 75-79 percent for all surveys, a relatively high prediction rate. However, the model for the medium- and high-complexity surveys gave a much greater emphasis to the “none” selection, meaning that the respondents for these surveys tended to select no product, as opposed to one of the available products. Once we removed the “none” option from our model, the prediction accuracy dropped significantly for the high-complexity survey. In addition, the lower-scoring surveys had more violations in price selection order, meaning the respondents tended to violate the expected order of selecting a lower unit price over a higher one. The net result: surveys with a low SurveyScore translated to lower predictability and thus to lower data quality.
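The study’s DCM itself is not specified in the article, so this sketch swaps in a naive per-respondent frequency predictor just to show the mechanics of scoring two holdout tasks with and without the “none” option; the data frame, column names and predictor are illustrative assumptions, not the study’s model.

```python
# Holdout-prediction sketch: train on seven choice tasks, score tasks 8-9,
# and compare accuracy with and without "none" as an allowed prediction.
from collections import Counter
import pandas as pd

choices = pd.DataFrame({              # hypothetical long format: one row per task
    "resp":   [1] * 9 + [2] * 9,
    "task":   list(range(1, 10)) * 2,
    "choice": ["prodA"] * 9 + ["none"] * 5 + ["prodB"] * 4,
})
train = choices[choices["task"] <= 7]
test = choices[choices["task"] > 7]

def predict(resp_id: int, allow_none: bool) -> str:
    """Predict a respondent's holdout choice from their most frequent training choice."""
    seen = train.loc[train["resp"] == resp_id, "choice"]
    if not allow_none:
        seen = seen[seen != "none"]
    counts = Counter(seen)
    return counts.most_common(1)[0][0] if counts else "none"

for allow_none in (True, False):
    preds = test["resp"].map(lambda r: predict(r, allow_none))
    acc = (preds == test["choice"]).mean()
    print(f"accuracy ({'with' if allow_none else 'without'} 'none'): {acc:.2f}")
```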
Take responsibility
Our conclusion? Researchers must take responsibility for data quality by removing bad respondents and designing surveys that keep good respondents engaged. Research professionals now have evidence that survey design not only influences whether respondents abandon a survey but also impacts the data for those who complete it.

The ability to predict the effect of various survey design variables on respondent engagement will help survey designers maximize engagement to increase the reliability of their data. Researchers no longer have to assume that a long survey will jeopardize the quality of the results, since we have shown that it is possible to compensate for the adverse effects of certain design variables by adjusting others. By using engagement measurement and prediction tools, researchers can know that survey design affects data quality, can measure engagement to help improve survey design and optimize design to enhance the reliability of results. | Q