This presentation focuses on issues in the collection, validation, and analysis of data obtained via social media platforms. It was presented at the symposium “Using Virtual Platforms to Engage Stakeholders in Research: Weaving the Threads Together” in Denver, Colorado, March 19, 2018. More information about Trial Promoter is available here: http://trialpromoter.org.
1. Trial Promoter: A Web-Based Tool to Test
Stakeholder Engagement in Research on
Social Media
Katja Reuter, PhD*; Alicia MacLennan, MS; Namquyen Le, MPH; Praveen Angyan, MS
*Speaker: Assistant Professor of Clinical Preventive Medicine, Department of Preventive Medicine, Keck
School of Medicine of USC; Director of Digital Innovation and Communication, Southern California Clinical
and Translational Science Institute (SC CTSI), University of Southern California (USC)
Presented at “Using Virtual Platforms to Engage Stakeholders in Research: Weaving the Threads Together,”
Denver, Colorado, March 19, 2018
Issues in the collection, validation, and analysis of
data obtained via social media platforms
2. Enabling Rigorous Evaluative Digital and
Social Media Studies
Develop evidence-based
digital clinical research
recruitment methods
Develop evidence-based
digital health promotion
interventions
3. Trial Promoter: http://trialpromoter.org
Reuter et al. Trial Promoter: A Web-Based Tool for Boosting the Promotion of Clinical
Research Through Social Media. J Med Internet Res 2016;18(6):e144. doi:10.2196/jmir.4726
4. Collecting Data from Multiple Experiments
TRIAL PROMOTER
SANDBOX
Experiment 1 Experiment 2 Experiment 3 Experiment 4
Messaging strategies to be tested
Automated distribution across multiple platforms
5. Example: Assessment of Psycholinguistic Methods

Experiment | Condition/Factor (psycholinguistic method) | Condition/Level(s) (number of variants) | Original FDA message stems used for message development* | Final messages generated
I | Perspective-taking | 3 (you vs. we vs. anyone/everyone) | 10 | 30 (3x10)
II | Information-packaging | 2 (specific new information mentioned first vs. last) | 10 | 20 (2x10)
III | Numeracy | 2 (percentage vs. raw numbers) | 10 | 20 (2x10)
IV | Information-packaging x numeracy | 4 (2x2) | 8 | 32 (4x8)

Total number of messages generated: 102
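The message totals in the table above follow directly from the factorial design: each experiment generates (number of factor levels) x (number of message stems) messages. A quick arithmetic check:

```python
# Per-experiment message counts: (factor levels, FDA message stems),
# taken from the table above.
experiments = {
    "I: perspective-taking": (3, 10),       # you vs. we vs. anyone/everyone
    "II: information-packaging": (2, 10),   # new information first vs. last
    "III: numeracy": (2, 10),               # percentage vs. raw numbers
    "IV: packaging x numeracy": (4, 8),     # 2x2 factorial, 8 stems
}

counts = {name: levels * stems for name, (levels, stems) in experiments.items()}
total = sum(counts.values())
print(counts)  # 30, 20, 20, 32 messages per experiment
print(total)   # 102, matching the table total
```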
8. SOCIAL MEDIA → WEBSITE → CONVERSION
SOCIAL MEDIA: Retweets, replies, likes (Twitter); shares, comments, likes (Facebook); reposts, comments, likes (Instagram)
WEBSITE: Social media user clicks on a link in a social media message.
CONVERSION: Social media user carries out the requested behavior.
PRIMARY OUTCOME VARIABLES
• Clicks on website links
• Number of sessions (visits to landing page)
• Number of surveys completed
• Number of contact requests
• Number of people enrolled
SECONDARY OUTCOME MEASURES
• Impressions
• Clicks
• Number of website pages viewed (pageviews)
• Time spent on webpage
• Cost
11. Issues in the Collection and Validation of Data
– Metrics
• Metrics vary among platforms, e.g., Twitter reports ‘likes’ (the number of times a
user liked a tweet or ad), whereas Facebook reports ‘reactions’ (the number of reactions on
your ads: like, love, haha, wow, sad, or angry).
• Tracking metrics on a per-message basis is difficult (solution: use a
tracking URL with Google UTM parameters). Set a unique value for the
‘campaign content’ (utm_content) parameter, e.g.,
http://sc-ctsi.org/?utm_source=twitter&utm_medium=ad&utm_campaign=tcors%20project&utm_term=cessation&utm_content=1-message
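The per-message tracking URL described above can be built with Python's standard library; the function name here is illustrative, and the parameter values mirror the example URL:

```python
from urllib.parse import urlencode, quote

def tracking_url(base_url, source, medium, campaign, term, content):
    """Append Google UTM parameters so each message variant is
    identifiable in analytics; 'content' must be unique per message."""
    params = {
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
        "utm_term": term,
        "utm_content": content,  # unique per-message identifier
    }
    # quote_via=quote encodes spaces as %20 (as in the example URL)
    return base_url + "?" + urlencode(params, quote_via=quote)

url = tracking_url("http://sc-ctsi.org/", "twitter", "ad",
                   "tcors project", "cessation", "1-message")
print(url)
# http://sc-ctsi.org/?utm_source=twitter&utm_medium=ad&utm_campaign=tcors%20project&utm_term=cessation&utm_content=1-message
```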
12. Issues in the Collection and Validation of Data
– Platform policies
• Twitter requires additional time for ad review (solution: submit ads 3 business
days before the scheduled display date)
• Ads can be delayed or even denied (e.g., for anti-tobacco content; Twitter does
not allow clinical trials to be advertised)
13. Issues in the Collection and Validation of Data
– Data
• Organic engagement data are cumulative (solution: daily manual data
download), whereas ad data are reportable per 1-day period
• Facebook organic data are deleted after 180 days (solution: daily manual
data download)
– Start and end times for ads
• Times are approximate
• Budget may be exhausted before requested end time and ad is no longer
shown
14. Issues in the Collection and Validation of Data
– Tracking click times (via ClickMeter)
• Distinguish between bots (e.g., search engine crawlers, RSS feed generators,
validators) and human clicks (unique IP address within the last 30 minutes)
• ClickMeter tracks click times with per-second granularity (vs. Google
Analytics’ per-hour granularity)
– Google Analytics
• Free (vs. subscription fee for ClickMeter)
• Enable conversion tracking
– IP addresses
• Exclude IP addresses of study team members from the data wherever feasible
• Dynamic IPs reset with every connection (e.g., on mobile devices)
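The click-validation rules above (drop bot-like traffic, exclude study-team IPs, count an IP only once within 30 minutes) can be sketched as follows; the bot markers and IP addresses are illustrative, not ClickMeter's actual implementation:

```python
from datetime import datetime, timedelta

BOT_MARKERS = ("bot", "crawler", "spider", "feed", "validator")  # illustrative
TEAM_IPS = {"203.0.113.5"}  # example study-team address to exclude

def unique_human_clicks(clicks, window=timedelta(minutes=30)):
    """Filter chronologically sorted (timestamp, ip, user_agent) records:
    drop bot-like user agents and study-team IPs, and count an IP only
    once within a 30-minute window (mirroring the rule described above)."""
    last_seen = {}
    kept = []
    for ts, ip, ua in clicks:
        if ip in TEAM_IPS or any(m in ua.lower() for m in BOT_MARKERS):
            continue  # bot or study-team traffic
        if ip in last_seen and ts - last_seen[ip] < window:
            continue  # repeat click from the same IP within the window
        last_seen[ip] = ts
        kept.append((ts, ip, ua))
    return kept

t0 = datetime(2018, 3, 1, 12, 0)
clicks = [
    (t0, "198.51.100.1", "Mozilla/5.0"),
    (t0 + timedelta(minutes=5), "198.51.100.1", "Mozilla/5.0"),      # repeat
    (t0 + timedelta(minutes=10), "198.51.100.2", "FeedFetcher-Bot"), # bot
    (t0 + timedelta(minutes=40), "198.51.100.1", "Mozilla/5.0"),     # new window
]
print(len(unique_human_clicks(clicks)))  # 2
```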
15. Issues in the Analysis of Data – Biostatistician’s Perspective
Melissa L. Wilson, MPH, PhD, Assistant Professor, Department of Preventive
Medicine, Keck School of Medicine of USC, University of Southern California
“A main challenge is the large number of zeros that result from the
kind of data we generate from digital platforms. Because the data is
count data and distribution is heavily skewed toward 0
(overdispersed), we use the negative binomial distribution to model
it.
From this model, we can estimate marginal effects to obtain the
click through rates, for example for messages or website use.
However, these models don’t fit perfectly so we are considering
using Bayesian methods.”
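The overdispersion described above can be illustrated with a minimal stdlib sketch; the click counts here are fabricated for illustration, and a real analysis would fit a negative binomial regression with a statistics package:

```python
from statistics import mean, pvariance

# Illustrative daily click counts with many zeros, as described above.
clicks = [0, 0, 0, 0, 0, 0, 1, 0, 0, 2, 0, 0, 0, 5, 0, 0, 1, 0, 0, 12]

m = mean(clicks)       # sample mean
v = pvariance(clicks)  # population variance
print(m, v)            # variance far exceeds the mean -> overdispersed

# Method-of-moments estimate of the negative binomial dispersion
# parameter alpha, where Var(Y) = mu + alpha * mu**2. alpha > 0 means
# a Poisson model (Var = mu) would understate the variance.
alpha = (v - m) / m**2
print(round(alpha, 2))
```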
16. Looking Ahead
1. Looking for collaborators and partners to further test and develop
the tool
2. Standardized reporting of metrics across supported platforms
3. Standardized targeting and budget development across supported
platforms
4. R21 proposal with Vanderbilt University to evaluate the effectiveness of
automated messaging for enhancing the signup rate in a national
research registry among patients with rare diseases
5. Assessment of the influence of clinical trial variables such as disease
type, location, and demographics on the effectiveness of automated social
media recruitment messaging
17. Thank you!
Contact Information
Email: katja.reuter@usc.edu
Twitter: @dmsci
Blog: https://digitalmediaandscience.wordpress.com
Katja Reuter, PhD
Assistant Professor of Clinical Preventive Medicine, Institute for Health Promotion and Disease
Prevention Research, Department of Preventive Medicine, Keck School of Medicine of USC
18. Trial Promoter Interface
Shows imported clinical trial information and disease keywords that were
included in the test messages
Reuter et al. Trial Promoter: A Web-Based Tool for Boosting the Promotion of Clinical Research
Through Social Media. J Med Internet Res 2016;18(6):e144. doi:10.2196/jmir.4726. PMID:27357424
19. Parameterized Message Templates
Reuter et al. Trial Promoter: A Web-Based Tool for Boosting the Promotion of Clinical Research
Through Social Media. J Med Internet Res 2016;18(6):e144. doi:10.2196/jmir.4726. PMID:27357424
Local Trial Promoter interface shows parameterized message templates for
Twitter and Facebook that were used during testing.
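Parameterized message templates of the kind shown here can be sketched with simple placeholder substitution; the field names and message text below are illustrative, not Trial Promoter's actual schema:

```python
# A template with placeholders that are filled per clinical trial.
template = ("A new {disease} study is enrolling near {city}. "
            "Learn more: {url}")

# Illustrative imported trial record (hypothetical values).
trial = {
    "disease": "lung cancer",
    "city": "Los Angeles",
    "url": "http://trialpromoter.org",
}

message = template.format(**trial)
print(message)
# A new lung cancer study is enrolling near Los Angeles. Learn more: http://trialpromoter.org
```

Filling one template with many trial records, and one trial record into many template variants, is what lets a small number of templates produce the large message sets described earlier.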
20. Efficiency
Compared the labor and cost of Trial Promoter with the labor and cost of a
social media manager
The generation of a social media message takes an average of 5-15 minutes,
including selecting the content, writing the message, adding an optional image
or video, and posting the message (based on internal analysis).
According to jobs and recruiting site Glassdoor, the national average annual
salary for a social media manager in 2015 was US $52,000.
A social media manager would have required 43-131 hours of labor, equivalent
to US $1800-$5400 in labor costs, to generate the number of messages Trial
Promoter generated in 10 weeks (525).
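The labor estimate above follows from the per-message timing; the hourly rate used for the cost range is not stated in the source, so the figure below is inferred from the stated dollar amounts:

```python
# 525 messages at 5-15 minutes each, as stated above.
messages = 525
low_hours = messages * 5 / 60    # 43.75 hours
high_hours = messages * 15 / 60  # 131.25 hours
print(low_hours, high_hours)

# The $1800-$5400 range implies roughly $41/hour of labor
# (an inferred assumption, not given in the source).
hourly_rate = 41
print(round(low_hours * hourly_rate), round(high_hours * hourly_rate))
```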