Artificial Intelligence and Mobile Apps for Mental Healthcare: A Social Informatics Perspective
1. Artificial Intelligence and Mobile Apps for Mental Healthcare: A Social Informatics Perspective
Alyson Gamble
Simmons University
Boston, Massachusetts, USA
Internet Technologies & Society 2020 Conference (ITS 2020), São Paulo, Brazil
4-7 February 2020
3. Introduction: Research and researcher
Researcher
■ Third year PhD student
■ Former science librarian
– Education technology
– Research ethics
– Research data management
■ Identify as queer
Research
■ Preliminary, pre-case study: literature review
■ User experience of biomedical engineering tools
– Dr. Rong Tang
■ Social informatics work
– Dr. Colin Rhinesmith
4. Introduction
Social informatics studies the social aspects of information technologies.
Artificial Intelligence (AI) is used in mobile applications for mental healthcare (MHapps).
Chatbots (chatterbots) are AI apps that interpret user input and reply accordingly.
5. Framework
■ Borrowed theory
– Understanding social work/psychology theories within information science context
■ Queer theory
– Gender and sexuality are socially constructed
■ Social informatics
– Focus on avoidance of technological fundamentalism/determinism
10. MHapps: Basics
■ Mobile mental health is not part of a new landscape.
■ Internet-based mental health care and online interventions have been utilized for many years.
■ Virtual humans have been employed since 1973 to assist with mitigating suicide risk.
■ As mobile devices became more prevalent, MHapps developed, including some that integrated lessons from online therapy.
■ Chatbots were included among these mobile healthcare strategies.
12. Chatbots: Basics
■ Application of AI with either a text- or voice-based communication interface
■ Depending on the interface, the chatbot can read written language, including emojis, or translate speech into text, which the bot then reads and uses to craft a response
– This process is “a conversation.”
■ Complexity ranges from simple response tools to more involved communications, whether the chatbot uses decision trees, keyword matching, machine learning, natural language processing, or other techniques
■ Technology is selected based on the intended application of the bot
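The two simplest techniques named above, keyword matching and decision trees, can be sketched in a few lines. This is a minimal illustration only: every rule, response, and function name below is hypothetical and not drawn from any real MHapp.

```python
# Minimal sketch of two basic chatbot techniques: keyword matching and a
# decision tree. All rules and wording are hypothetical illustrations.

KEYWORD_RULES = {
    "sad": "I'm sorry you're feeling sad. Would you like to try a breathing exercise?",
    "anxious": "Anxiety is hard. Can you tell me more about what's worrying you?",
}

# Each node maps to (prompt shown to the user, {user choice: next node}).
DECISION_TREE = {
    "start": ("Would you like to (1) talk or (2) see resources?",
              {"1": "talk", "2": "resources"}),
    "talk": ("I'm listening. How are you feeling today?", {}),
    "resources": ("Here is a list of self-help resources.", {}),
}

def keyword_reply(message: str) -> str:
    """Return the first canned response whose keyword appears in the input."""
    text = message.lower()
    for keyword, response in KEYWORD_RULES.items():
        if keyword in text:
            return response
    return "Tell me more."

def tree_step(state: str, choice: str) -> tuple[str, str]:
    """Advance the decision tree: map the user's choice to the next node."""
    _, transitions = DECISION_TREE[state]
    next_state = transitions.get(choice, state)  # stay put on unknown input
    return next_state, DECISION_TREE[next_state][0]
```

Real MHapp chatbots layer machine learning and natural language processing on top of scaffolding like this; the selection, as the slide notes, depends on the bot's intended application.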
13. A brief history of chatbots
■ 1966: ELIZA, the clinical Rogerian psychotherapist
■ 1972: PARRY, the paranoid schizophrenic
■ 1972: ELIZA meets PARRY via ARPANET at the first International Conference on Computer Communications
■ 1991: Dr. Sbaitso (Sound Blaster Acting Intelligent Text to Speech Operator) employs a therapeutic model
■ 1995: ALICE (Artificial Linguistic Internet Computer Entity) launches; its AIML (Artificial Intelligence Markup Language) is later used in other bots
■ 2000: ALICE wins the Loebner Prize as “the most human computer”
■ 2001: ALICE wins the Loebner Prize for the second time
■ 2004: ALICE wins the Loebner Prize…again!
14. Comparison of MHapp chatbots

| MHapp Chatbot1 | Country of Development | Audience | Interface | Approach | Issues |
| --- | --- | --- | --- | --- | --- |
| Shim | Sweden | Public; free | iOS; pre-defined text and free text | Positive psychology; third-wave CBT2 | Privacy regulations unclear; no published clinical studies |
| Woebot | United States | Public | iOS and Android; FB Messenger; pre-defined text and free text | Decision trees; CBT; longer user interactions than Shim | FB privacy regulations |
| Wysa | India & United Kingdom | Public; can upgrade to coaching | iOS and Android (left FB in May 2018) | Self-reflection; CBT; DBT3 | Limited clinical background of developers |
| Therachat | United States | Public; clinical | iOS and Android | Sentiment analysis; journaling | Only covered by HIPAA4 in clinical setting; no published clinical studies |

1 Information current as of 24 April 2019; 2 CBT: Cognitive Behavioral Therapy; 3 DBT: Dialectical Behavior Therapy; 4 HIPAA: Health Insurance Portability and Accountability Act of 1996
16. The world is facing a shortage of mental health clinicians, while the rate of mental illness grows.
Many individuals facing mental illness will continue to need in-person treatment from a variety of clinicians throughout their lives.
Depending on the type, treatment, and severity of their conditions, people with mental illness may face institutionalization and unemployment, or may lead otherwise productive and healthy existences.
17. Mental healthcare disparities
By the numbers…
■ 9 per 100,000: ratio of mental healthcare workers to people (WHO, 2017)
■ 60%: U.S. counties without psychiatrists (Garg & Glick, 2018)
■ 20%: people in the U.S. with mental illness (Bakker et al., 2016)
By the issue…
■ Stigmatization of mental illness
– Less likely to seek help
– Low treatment adherence
■ Lack of healthcare coverage, e.g.:
– Rural populations
– Economically disadvantaged
■ Intersectional identities, e.g.:
– Age
– LGBTQ+
18. MHapp chatbots can…
■ Provide a more advanced form of search for therapeutic techniques while creating a database of user queries
■ Offer just-in-time advice and resources
■ Provide space to express emotions and thoughts
■ Track moods and analyze sentiments
19. MHapp chatbots can…
■ Use natural dialogue to emulate a human connection
– Through this exchange, create a potentially therapeutic environment
– A user’s positive relationship with the chatbot can help improve the user’s mental health
■ Help people:
– Work through issues
– Develop mental health literacy
– Practice therapeutic techniques
21. MHapp chatbots can’t…
■ Replace face-to-face therapy
■ Always protect user privacy
■ Always pass the Turing Test
– Repetitive interactions can have a negative effect
22. Gaps in clinical testing of MHapp chatbots
■ Efficacy
– General low efficacy rates of MHapps
– Can be unclear to users which parts have been clinically tested
■ Standards
– Lack of clinical standards for operation
– Safety resources not always required
– American Psychiatric Association (APA)’s App Evaluation Model1 not required
■ Privacy
– Lack of regulations for data privacy
– Users unaware of what information is collected and where it is stored
– When utilized via another platform (e.g., Facebook), the chatbot is regulated by that platform
■ Security
– Lack of regulations for data security
– Unclear who has access to the user databases
■ Audience
– Approaches don’t work for all people and conditions
– Can exacerbate mental health concerns related to Internet use
– Perceived lack of privacy and security can contribute to fear of stigma

1 Steps to evaluate an MHapp: gathering basic information about the app, checking its privacy and safety, evaluating its efficacy, assessing its ease of use, and determining its data sharing capabilities (APA, 2018).
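The five steps of the APA App Evaluation Model can be sketched as an ordered checklist. The step identifiers and the pass/fail logic below are hypothetical illustrations for this slide, not part of the APA model itself.

```python
# Sketch of the APA App Evaluation Model's five steps as a simple checklist.
# Step names paraphrase the footnote above; the logic is illustrative only.

APA_STEPS = [
    "background",      # gather basic information about the app
    "privacy_safety",  # check its privacy and safety
    "efficacy",        # evaluate its efficacy
    "ease_of_use",     # assess its ease of use
    "data_sharing",    # determine its data sharing capabilities
]

def unmet_steps(app_review: dict[str, bool]) -> list[str]:
    """Return, in order, the evaluation steps an app has not yet satisfied."""
    return [step for step in APA_STEPS if not app_review.get(step, False)]
```

The ordering matters in the APA model: earlier steps (basic information, privacy and safety) are meant to be settled before efficacy and usability are weighed.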
23. While an AI chatbot may provide a person with a place to access therapeutic tools, a forum to discuss issues, and a way to track moods and increase mental health literacy, AI is not a replacement for a therapist.
Unlike psychiatrists, AI cannot prescribe lifesaving medications.
AI is a tool in addressing mental illness, not a panacea. It must be treated as one element of a sociotechnical system, not as the whole.
Ultimately, if AI chatbots and other MHapps are to have a positive impact, they must be regulated, and society must avoid techno-fundamentalism in relation to AI for mental health.
28. I’m not a clinician…
…and one person can’t fix all the issues surrounding the mental healthcare system…
…but, based on my preliminary research, I can provide recommendations for how to address concerns about MHapp chatbots.
29. Address concerns on three levels…
Client
• View interactive policies in clear language from start of use
• Provide clear explanation of tool limitations
• Give access to transparent advertising
Clinician
• Advocate for protection of patient confidentiality
• Be continuously engaged by developers
• Engage clients in conversations about use of MHapps and MHapp chatbots
Regulatory
• Be governed by an agency1
• Follow guidelines2
• Recognize potential harm by AI3

1 e.g., U.S. Substance Abuse and Mental Health Services Administration (SAMHSA)
2 e.g., Electronic Privacy Information Center’s Code of Fair Information Practices
3 e.g., AI Now Institute
31. Future Research
Dissertation
■ Mixed-methods studies of information seeking behaviors of LGBTQ+ individuals with mental health concerns
– Focus on intersectional stigma
– How do these people utilize MHapps and MHapp chatbots?
■ Understanding how people locate and utilize the information contained in these tools can help inform the development of standards to govern the applications’ future iterations, as well as societal implications for their implementations
Post-Doc (the dream…)
■ Expand to underserved communities, particularly those without LGBTQ+ support
■ User-experience testing of similar tools, especially in underrepresented populations
■ Work with clinicians to develop tools tailored to specific needs
33. References & Acknowledgements
■ Bibliography available at https://bit.ly/31yEXKu
■ The author wishes to thank Drs. Colin Rhinesmith and Rong Tang for their guidance on this manuscript, as well as Kathy Da Silva, Corey Rainboth, Aimee Melvin, Adam McInnish, and Bill Ames for their support during the initial stages of this research.
■ Funding for travel to ITS 2020 was provided by Simmons University.