Talking Tech - the art and science of communicating complex ideas (Bristech2018)
1. Talking Tech
THE ART AND SCIENCE OF COMMUNICATING COMPLEX IDEAS
cecilia@ceciliaunlimited.com
@CeciliaUnltd
2. Bowles, Cennydd - Future Ethics
Over the coming decades, our industry will ask people to trust us with their data, their vehicles, and even their families’ safety.
3.
4. ‘I’ve listened to you for an hour and I’m none the wiser.’
‘None the wiser perhaps my lord, but certainly better informed.’
5. The post-truth world
‘I think that the people of this country have had enough of experts’
Michael Gove, 6 June 2016
6. What Michael Gove actually said
‘I think that the people of this country have had enough of experts from organisations with acronyms saying that they know what is best and getting it consistently wrong’
9. Dan McQuillan
‘If we cannot understand exactly what is being weighed in the balance, it is very hard to tell under what circumstances harm may be caused or in what ways the operations might be unethical’
People’s Councils for Ethical Machine Learning, Social Media + Society, April–June 2018: 1–10
10. “There is a general uneasiness around the potential of AI to disrupt existing jobs. This is compounded by a lack of communication between employers and employees when it comes to keeping staff updated with the introduction of new technology.”
Investors in People, ‘Artificial Intelligence at Work: Perception & Attitudes’, 2018
11. Emily Steiner on Twitter
“Their lives are short & their bodies transitory; the days pass quickly, centuries roll by, & death comes before you know it.”
“We have described everything briefly, because people prefer simple things that don’t take long to explain.”
12.
13. “if you ever write an article that has robots or artificial intelligence in the headline, you are guaranteed that it will have twice as many people click on it”
House of Lords Select Committee on Artificial Intelligence, April 2017
21. Marketing yourself
“Expertise and training in the fields of Catalysis, Design of Experiments (DoE) and Principal Component Analysis (PCA). An experienced scientist with additional expertise in automation, multivariate data analysis, process development and problem solving.”
22.
23. Trust and Regulation
‘…organizations and institutions that produce these breakthroughs seem to be more innovative and flexible than those responsible for anticipating and coping with their effects… This situation has generated serious challenges for society in the past, and the future holds promise of even more serious challenges.’
Stern et al., 2009, ‘Generic Lessons Learned About Societal Responses to Emerging Technologies Perceived as Involving Risks’
27. Don’t talk about the science
The first rule of climate science?
Don’t talk about the science
28. Language matters
‘The war on climate change’ vs. ‘the race against climate change’
Flusberg, S.J., Matlock, T. and Thibodeau, P.H., 2018. War metaphors in public discourse. Metaphor and Symbol, 33(1), pp.1-18.
29. Be clear and honest about risks….
A common misunderstanding is that a 100-year flood is likely to occur only once in a 100-year period. In fact, there is approximately a 63.4% chance of one or more 100-year floods occurring in any 100-year period. On the Danube River at Passau, Germany, the actual intervals between 100-year floods during 1501 to 2013 ranged from 37 to 192 years.
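(Speaker note: the 63.4% figure follows from simple probability. A 100-year flood has, by definition, a 1% chance of occurring in any given year, so the chance of at least one such flood in 100 years is 1 − 0.99¹⁰⁰. A quick sketch, assuming each year is independent:)

```python
# Probability of at least one "100-year" flood within a 100-year window,
# assuming the flood has a 1/100 chance each year and years are independent.
annual_p = 1 / 100
years = 100

p_none = (1 - annual_p) ** years      # no flood in any of the 100 years
p_at_least_one = 1 - p_none

print(round(p_at_least_one * 100, 1))  # prints 63.4 (percent)
```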
30. …but focus also on the consequences
“While scientists tend to focus on the probability a risk will be realized, the public tends to focus on the consequences.”
“Public judgments of the seriousness of hazards are related to the extent to which a possible consequence is dreaded, especially if the consequence is potentially unbounded in its effects.”
Stern et al., 2009, ‘Generic Lessons Learned About Societal Responses to Emerging Technologies Perceived as Involving Risks’
31. Don’t be afraid to complicate the narrative
‘When people encounter complexity, they become more curious and less closed off to new information. They listen, in other words.’
Amanda Ripley, ‘Complicating the narratives’, Medium, June 2018
32. “Anecdotes may not be data, but anecdotes are the stories we tell in order to make the data mean something to people.”
Andrew Thaler, ‘When I talk about climate change’, Southern Fried Science, Jan 2017
35. Why are we so worried about AI? Surely humans are always able to pull the plug?
“People asked a computer, ‘Is there a God?’ And the computer said, ‘There is now,’ and a bolt of lightning fused the plug.”
Stephen Hawking, quoted in The Times, 14 October 2018
42. If you’re informing an engaged audience, talk about the tech
If you’re selling, talk about the benefits
If you’re persuading, talk about values and identity
43. Help people build a reliable conceptual model
Be transparent about risks
Use complexity to encourage curiosity
Use the right language – or change it
Tell a good story
Build a compelling vision of the future
The House of Lords Select Committee on Artificial Intelligence last year reported that:
Journalists covering AI informed us that it was often difficult to cover AI and automation issues in a responsible and balanced manner, given the current level of public interest in the subject. Sarah O’Connor, employment correspondent for the Financial Times, told us that “if you ever write an article that has robots or artificial intelligence in the headline, you are guaranteed that it will have twice as many people click on it”, and at least some journalists were sensationalising the subject in order to drive web traffic and advertising revenues.
So the worse the likely consequence, the lower the level of trust, even if it’s highly unlikely to happen.
One of the things that emerges from the climate change research is how polarized the debate is, particularly in the US: you either believe in climate change or you don’t; you either agree something should be done or you don’t. Complex issues are rarely that binary, but as we know, our media is good at setting extreme views against one another in a false dichotomy. I don’t know if this is because they patronizingly assume that people need this simplistic view in order to understand, or because they know it sells newspapers and generates clicks. But actually introducing complexity and nuance can be highly effective in getting people to engage better with a topic.
Researchers have a name for the kind of divide America is currently experiencing. They call this an “intractable conflict,” as social psychologist Peter T. Coleman describes in his book The Five Percent, and it’s very similar to the kind of wicked feuds that emerge in about one out of every 20 conflicts worldwide. In this dynamic, people’s encounters with the other tribe (political, religious, ethnic, racial or otherwise) become more and more charged. And the brain behaves differently in charged interactions. It’s impossible to feel curious, for example, while also feeling threatened.
Columbia University – Difficult Conversations Lab
And finally, a couple of tried and tested favourites for all comms people: stories and visions of the future. You’ve almost certainly heard or used the phrase ‘anecdotes are not data’, but this is what marine science and conservation consultant and underwater robotics expert Andrew Thaler has to say about it.
To finish, I’m going to show you a great example of an organization using a lot of these ideas. This is KPMG’s short film about artificial intelligence. Have a watch and see: they do all the things we’ve just been discussing.
They show you that AI is about doing the boring, the ordinary, the mundane
You don’t see a single humanoid robot in here – they’re all robot arms and familiar-looking tech that doesn’t frighten us
They hark back to the industrial revolution to say ‘look, we’ve done this before, it didn’t kill us, it made us better’
They tell you it’s already all around you and has been since 1914
They change the language – intelligent automation instead of AI
They don’t talk about the tech – they talk about the outcomes
They make it about humans
They paint a compelling, positive vision of the future