We focus on Invisible Interfaces and their influence on digital experiences. With 5G creating the foundation for greater ‘invisibility’ in our interactions with technology, we’ll discuss what this could mean for the UX and CX industry.
3. Some Principles…
• It is positively encouraged to disagree.
• I hope to raise a few questions - the answers are unknown.
• Covid19 implications run throughout.
• First of a number of discussion sessions over the coming months:
• Digital Doubles
• New growth measures
• Conscious Customers
• Life Centred Design
4. We’ll discuss….
• The Advent of 5G
• The World of ‘invisibility’
• Biometrics
• AI
• The world of ‘Neuro’.
• A revision of HCI?
• Ethics & Privacy
• Testing Invisible Interfaces & the future
6. The path has been laid…
• The Best Interface Is No Interface – Golden Krishna
• Don’t Make Me Think – Steve Krug (maybe we need to think more)
• Ruined By Design – Mike Monteiro
• Hooked (I have a real issue with this and him…) – Nir Eyal
• Hippo – Pete Trainor
7. But is being redrawn…
• The Age of Surveillance Capitalism – Shoshana Zuboff
• The Perils of Perception – Bobby Duffy
• The Death of the Gods – Carl Miller
• Human Compatible – Stuart Russell
8. 5G enables…
• ‘True’ Self-driving cars
• ‘True’ Remote Healthcare & personalised medicine
• ‘True’ Super-sharp resolution for video conferences
• ‘True’ Frictionless interactions
• Smart Cities
• Deeply Integrated AI
• Life Centred Design
• ‘Bigger’ Big Data (huge post Covid19 growth)
9. What does that mean?
1. Creates ‘Always on’ interfaces.
2. Humans become a more direct ‘data source’
3. Creating exhausts/trails that look much deeper into individuals
4. Enabling the creation of ‘on the fly’ personalisation and experiences to a
level not seen before.
So:
• What implications and opportunities does a true ‘always on’ capability offer
in terms of experience design?
10. “Digital technology has had a direct impact on the way humans now engage
- at a much deeper level than we think.
We have become a black and white, yes and no, right and wrong society,
directly (and maybe subconsciously) influenced by the binary foundations of
digital (platforms) – the 1’s and 0’s that provide the basis for the industry.
We have ceased to be analogue, we have ceased to see the grey. The grey
is where knowledge is found, ideas are created and culture grows.
Singularity is already here – and we didn’t even need AI to achieve it.”
14. The ‘ideal’ use case
“I use voice assistants for accessibility reasons. I am quadriplegic and they
help me lead an independent life. They assist with tasks others take for
granted like adjusting my heating and lights, window blinds, and TV and
music.
I have considered the privacy implications but for me the independence
these assistants offer me far outweighs my privacy concerns.”
15. The not so ‘ideal’ use case
In Portland, Oregon, a woman discovered that her Echo had taken it upon
itself to send recordings of private conversations to one of her husband’s
employees.
In a statement, Amazon said that the Echo must have misheard the wake
word, misheard a request to send a message, misheard a name in its
contacts list and then misheard a confirmation to send the message, all
during a conversation about hardwood floors.
16. “Alexa, sing a song”
“Alexa, find quick dinner recipes”
“Alexa, turn on Whisper mode”
The days when you had to wake up your partner to ask Alexa to turn off the alarm are over. Whisper
mode is a feature that allows you to whisper to Alexa, who in turn whispers when responding to you.
“Alexa, how do I receive calls?”
“Alexa, remind me to call Dad in 10 minutes”
“Alexa, what’s 13 degrees Celsius in Fahrenheit?”
“Alexa, play relaxing music”
“Alexa, play Trivia Hero”
“Alexa, announce breakfast is ready”
Ask Alexa to make announcements on all compatible Echo devices in your household or the Alexa
app.
“Alexa, how can I create my own story?”
“Alexa, give me a book recommendation”
“Alexa, open Relaxing Piano”
“Alexa, what's on my to-do list?”
“Alexa, give me a limerick”
“Alexa, play the song that goes, ‘I’ve got that sunshine in my pocket’”
“Alexa, how do I make slime?”
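The commands above all reduce to the same pattern: a wake word, then an intent with slots (the task, the minutes, the song). A minimal sketch of that pattern follows; the intents and regex phrasings are invented for illustration and are not Amazon's actual API.

```python
# Minimal sketch of the wake-word + intent + slots pattern behind
# commands like "Alexa, remind me to call Dad in 10 minutes".
# The intents and phrasings below are invented; this is not Amazon's API.
import re

WAKE_WORD = "alexa"

# Each intent is a regex over the utterance that follows the wake word.
INTENTS = {
    "reminder": re.compile(r"remind me to (?P<task>.+) in (?P<minutes>\d+) minutes"),
    "play":     re.compile(r"play (?P<item>.+)"),
}

def parse(utterance: str):
    """Return (intent_name, slots) or (None, {}) when nothing matches."""
    text = utterance.lower().strip()
    if not text.startswith(WAKE_WORD):
        return None, {}               # wake word not heard: do nothing
    text = text[len(WAKE_WORD):].lstrip(", ")
    for name, pattern in INTENTS.items():
        match = pattern.search(text)
        if match:
            return name, match.groupdict()
    return None, {}                   # wake word heard, but intent unknown
```

The two failure branches at the end are where real voice products live or die: mishearing the wake word, or matching the wrong intent, is exactly how the Portland incident above unfolds.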
17. The experience of voice
• Micro interactions (and experiences) as sources of data gathering feeding
into wider more commercially attractive experiences.
• A single touchpoint within a larger ‘journey’?
• A single interaction for a specific need and service.
• Requires increased allowances for error – a need to design additional
error recognition into experiences.
18. ‘Observational’ tech…
• The changing foundations of the human/digital relationship have been in place for some time now:
• Security cameras.
• Number plate recognition systems.
• Fingerprint or facial access to our own devices.
• Auto passport gates at airports.
19. “The face is a window into a person’s mind.”
– Aristotle
20. Facial Recognition – a practical
example of the challenges.
On average, adults in urban cultures scowl when they are angry only about 30% of the time.
Which means that some 70% of the time adults do not scowl when angry.
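The 30% figure makes the inference problem concrete: once base rates are considered, even a detected scowl says little about anger. A back-of-envelope Bayes calculation follows; only the 30% comes from the slide, while the anger base rate and the scowl-while-not-angry rate are invented purely for illustration.

```python
# Back-of-envelope Bayes: how informative is a scowl about anger?
# Only the 30% figure comes from the slide; the other two rates are
# ASSUMED here purely for illustration.
p_scowl_given_angry = 0.30   # from the slide: adults scowl when angry ~30% of the time
p_angry = 0.05               # ASSUMED: 5% of observed moments involve anger
p_scowl_given_calm = 0.10    # ASSUMED: concentration, bad jokes, etc. also cause scowls

# Total probability of seeing a scowl at a random moment
p_scowl = (p_scowl_given_angry * p_angry
           + p_scowl_given_calm * (1 - p_angry))

# Bayes' rule: probability the person is actually angry, given a scowl
p_angry_given_scowl = p_scowl_given_angry * p_angry / p_scowl
```

Under these assumed rates a detected scowl implies anger only about 14% of the time, which is the slide's point: the expression alone does not reveal the emotion.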
21. Facial Recognition – an advanced
example
• Alipay (part of Alibaba) provides a service called ‘Smile to Pay’ – which enables users to do exactly what it sounds like.
In an imaginary use case, could we therefore see 5G allowing us to withdraw money from ATMs simply by looking into a screen?
How do we design, test, monitor and manage for that?
22. Gesture, Haptics & Movement
• Google Soli (the foundation for the Pixel phone’s gesture controls)
• BMW – ICE controllers
• IoT sensors within Care Homes.
• Haptic responses to movement to reassure.
• Partially sighted interactions
• ‘Embracelet’ – creating positive emotions remotely. A physical reaction from a ‘remote’
intent
• Neuro Gestures.
• Intercepting the brain’s signals to the hands and legs to predict movements
23. Some Biometric discussion points
• How can we accurately test for gesture?
• What can be learnt from voice interactions in terms of experience design?
How much control do the designers really have? How successful are true
voice interactions at present?
• In terms of facial recognition – how can this truly be understood for an
individual?
• How much more specific do ‘personas’ need to be if we take into account deeper visual behaviours?
• How do we blend a mix of screen & invisible interactions into our research
and design processes?
25. Three (very broad) levels of ‘AI’
• Perception
• Automating Judgement
• Predicting (Social & Individual) Outcomes
Each of these has raised (ethical) concerns, uncertainty and confusion among the public, fed primarily by the digital & tech sector.
26. Perception
Genuine, rapid technological progress
• Content identification (Shazam, reverse image search)
• Face recognition*
• Medical diagnosis from scans
• Speech to text
• Deepfakes*
* Ethical concerns because of high accuracy
27. Automating Judgement
Far from perfect, but improving
• Spam detection
• Detection of copyrighted material
• Automated essay grading
• Hate speech detection
• Content recommendation
Ethical concerns in part because some error is inevitable
29. How do we test AI?
• Perception
• Automating Judgement
• Predicting (Social & Individual) Outcomes
30. The core questions to ask of AI
algorithms
1. Is it any good when tried in new parts of the real world?
2. Would something simpler, and more transparent and robust, be just as good?
3. Could I explain how it works (in general) to anyone who is interested?
4. Could I explain to an individual how it reached its conclusion in their particular
case?
5. Does it know when it is on shaky ground, and can it acknowledge uncertainty?
6. Do people use it appropriately, with the right level of scepticism?
7. Does it actually help in practice?
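The seven questions above can be carried around as a literal review checklist. A small sketch follows; the data structure and the "answered yes" pass rule are our own invention, not part of any published standard.

```python
# The seven questions, captured as a reusable review checklist.
# The structure and the "answered yes" pass rule are our own sketch.
AI_REVIEW_QUESTIONS = (
    "Good when tried in new parts of the real world?",
    "Would something simpler and more transparent be just as good?",
    "Can we explain how it works in general?",
    "Can we explain how it reached a conclusion in a particular case?",
    "Does it acknowledge uncertainty on shaky ground?",
    "Do people use it with the right level of scepticism?",
    "Does it actually help in practice?",
)

def open_questions(answers: dict) -> list:
    """Return the questions not yet answered with an explicit 'yes' (True)."""
    return [q for q in AI_REVIEW_QUESTIONS if answers.get(q) is not True]

# Example: a team that has so far only validated real-world performance.
remaining = open_questions({AI_REVIEW_QUESTIONS[0]: True})  # six questions left open
```

Making "unanswered" the default is deliberate: an algorithm passes only on explicit evidence, never by omission.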
31. A testing & evaluation framework
Currently, Phase 1 testing of "AI" algorithms is completed regularly – Phases 2-4 are not. We need to be aware of this and consider how we can control the environmental and ‘algorithmic’ growth.
• Phase 1 – Pharmaceuticals: Safety – initial testing on human subjects. Algorithms: Digital testing – performance on test cases.
• Phase 2 – Pharmaceuticals: Proof-of-concept – estimating efficacy and optimal use on selected subjects. Algorithms: Laboratory testing – comparison with humans, user testing.
• Phase 3 – Pharmaceuticals: Randomized controlled trials – comparison against existing treatment in a clinical setting. Algorithms: Field testing – controlled trials of impact.
• Phase 4 – Pharmaceuticals: Post-marketing surveillance – for long-term side effects. Algorithms: Routine use – monitoring for problems.
32. Some AI discussion points
• Where should experience designers sit in the development of AI solutions?
• Should we consider creating positive friction when needed with
interactions that are led by AI?
• Can we blend AI testing in with usability & user experience testing?
• Is the ‘user’ actually more than just the ‘physical’ human – is it a blend of AI and human?
As Experience Designers I think we have a deep responsibility to
understand AI and the tests that any algorithm has undergone. This feeds
into our research and designs.
35. Creating ‘Certainty’
1. Improve the data set on an individual, and the tech and AI capabilities, to such an extent that you can develop a profile that accurately predicts the next move of a human.
2. Simplify the human to make them more predictable.
38. Neuro Marketing….
• Did not exist in 2002.
• Is seen as a serious approach to ‘persuading/influencing’ human
behaviour.
• To get people to do what you want them to do.
• But – can the findings of an fMRI scan or a Facebook wristband reveal someone’s actual action?
• “I wasn’t going to buy that, Amazon thought I wanted to..”
• “I didn’t mean to think that – my brain did it without me knowing”
39. Neuro Marketing discussion points
• How do we design for intent and not action?
• Should we?
• How will our work change with the ‘advent’ of Neuro Marketing? Will it
change at all?
• Will deep personalisation become more and more engrained?
• How will research develop?
• What testing could accommodate such an ability to ‘look’ inside an individual’s brain?
40. Neuro Marketing Forecasting….& AI
• What behaviours might ‘we’ expect to happen in the population?
• What behaviours might ‘we’ expect people to show when presented with a specific argument, language or product?
• Will we create more individual ‘false’ situations (remember FB) to create,
develop and learn intent?
• What do we as experience designers do with that?
• Should we do anything?
42. Who is in control and what’s the
impact?
Human or ‘Computer’?
Do we design for an AI prediction or AI making a judgement?
For intention or action?
To pre-empt movements?
How can we possibly accommodate the speed of this change and data flow?
Can we integrate multiple invisible interfaces across a journey and
accommodate significant uncertainty and larger margins for error?
43. Who is in control and what’s the
impact?
• What ‘value’ will humans gain from interactions that are invisible and
‘powered’ by AI, neuro etc..?
• As interfaces and the underlying tech and (decisions) become invisible
should we look to create experiences that are more emotionally fulfilling?
• Should we therefore focus even more on creating real ‘delight/joy’ in our
experience design?
44. Influenced Behaviour
• In almost all situations, experience design and tech influence behaviour. This can be positive (for the individual and society) – reducing vehicle speed, preventing shoplifting, staying at home, etc.
• It can also be negative and restrict ‘true’ human behaviour – for example, I’m constantly delighted when Amazon or Spotify recommends products or music I dislike because I’ve randomly clicked on various links in an attempt to confuse and game the tracking systems.
45. Influenced Behaviour
• Capturing, processing and modelling data created by influenced behaviour
creates influenced solutions.
• The skills of an experience designer become more refined and understood
– data is more clearly seen as just one part of the deal (try telling
customers that….)
• Therefore creating a need for wider experience and service design skills.
• Sociologists
• Anthropologists
• Psychologists
• Philosophers….. (I would say that..)
46. Q- Will ‘we’ rail against the tech that
supports invisibility or embrace it?
• Will negatively influenced behaviour come to more prominence?
• Forcing designers to create more and more ‘restrictive’ experiences.
• To demand that experience professionals increase skillset & resource.
• Is people’s appreciation of big data (with Covid19) here to stay?
• In a watershed moment, is it being seen as a real leveller and enabler?
• Is the workplace changing beyond recognition? If so, are invisible
interfaces going to accelerate or hinder change?
48. We aimed for tech surpassing human capability.
We created tech surpassing human vulnerability.
49. But what is privacy today?
• Big data is being seen by some as a saviour in the current climate.
• Will this change society’s relationship with privacy?
• How could this affect experience design?
• Are we opening up to being ‘observed’ for the greater societal and human
good?
50. Privacy is a human right
• We will keep all your personal data for as long as your account remains
open. You can close your account at any time using your account settings.
If you close your account, we will delete the personal data associated with
your account.
• If your account is inactive for a period of eighteen months, we reserve the
right to close your account and delete your personal data.
51. Our Privacy Challenge
• Privacy by Design.
• Is it our responsibility to build privacy into the research, solutions and experiences that we design?
• How will experiences taking place across multiple interfaces protect
privacy? Should we lead these discussions?
• How do we build an ‘openness’ on privacy into our ‘users’ awareness?
Should we? Is it our job?
52. Thinking about Ethics
• Should we use findings from Neuro technology to create experiences?
• Should we try and influence thoughts?
• How do we create ethical experiences that have significant AI inputs –
when the algorithm is out of our control?
• Should we therefore ensure that we position ourselves at the centre even more?
• Should we become the ‘ethical’ voice of the user?
• Persuasion and influence will still be the battleground – but much more personalised than before. What is our stance/position?
53. Q- Will Invisible Interfaces create more
division?
• Will screens be used by those that cannot afford ‘sentient’ hardware and
applications?
• Or will they be the preserve of the few – with everyone else being ‘forced’
to use sentient-led technology.
• Will we see observational technology becoming the norm?
54. Wilful blindness
• …failing to see - or admit to ourselves or our colleagues - the issues and
problems in plain sight… we prefer ignorance as we are afraid of
questioning…
55. Some questions for you.
• Can you see and communicate the unintended consequences of invisibility
early enough to clients and come up with solutions?
• Do you know enough about the tech being used by clients and the data
ramifications of the use of that tech?
• Do you see yourself as a UX’er, UI designer or someone developing
‘solutions’ that will enhance people’s lives?
• How do you feel when you are developing a solution you know is not for
the benefit of the user (we don’t live in an ideal world or industry – do we?)
57. Our (early) approach
• With screen-based interactions we have developed ‘buffers’ – body
language, facial expressions and the occasional sigh tend to offer
glimpses into the true feelings of a user.
• We’ve started developing tools to look at gait, speed of walk (and speech)
in addition to more pronounced body and facial clues alongside true tone
of voice and use of specific language trigger words and terminology.
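As a sketch of how one of those observational signals (speed of walk) might be derived from timestamped beacon sightings; the beacon names, layout and coordinates below are invented for illustration, and a real deployment would calibrate positions against the venue.

```python
# Illustrative sketch: estimating walking speed from two timestamped
# sightings at beacons with known positions. The layout is invented.
from math import dist

BEACONS = {"entrance": (0.0, 0.0), "aisle_3": (12.0, 5.0)}  # metres (ASSUMED layout)

def walking_speed(beacon_a: str, beacon_b: str, t_a: float, t_b: float) -> float:
    """Average speed in m/s between two beacon sightings."""
    metres = dist(BEACONS[beacon_a], BEACONS[beacon_b])
    seconds = t_b - t_a
    if seconds <= 0:
        raise ValueError("sightings must be in time order")
    return metres / seconds

speed = walking_speed("entrance", "aisle_3", t_a=0.0, t_b=10.0)  # 13 m in 10 s -> 1.3 m/s
```

Any threshold applied to such a speed (hurried vs. browsing, say) would need grounding in real observed data rather than assumption, which is exactly the research challenge the slide describes.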
58. Our (early) approach
• Using beacons and IoT sensors (and platform) to create a networked
environment in which to distract/observe/develop relationships and
understand behaviours.
• Using wearables in context. Data and learnings can be extrapolated from
understanding context in detail.
• Investigating ethical neuro technology that can understand reaction but do
not directly influence decisions.
59. The future Experience Designer?
• The platforms that we use, test, develop and enhance have been
developed by technologists.
• These platforms are ‘stable’…
• Sociologists, Anthropologists, Psychologists are all moving to the forefront
as technology becomes more embedded in everyday lives.
• The increased use of these skills in experience design will allow
technology to become an enabler rather than the lead.
• Moving more naturally into Service Design & Blueprint.
60. The future Experience Designer?
• Understands AI, its limitations and best use cases.
• Has an ethical design mindset
• Understands the developing human-computer relationship - (as the human
is the interface) – so the ‘user’ actually becomes more than just the
human.
61. Upcoming Discussions
• Digital Doubles – Mid April
• Life Centred Design – May
• Experience UX Blog - https://www.experienceux.co.uk/ux-blog/
• Remote workshops and training ongoing
• The Need for Humane Tech – 16th April - www.firesmoke.co
Instead, they did something else with their faces. People also scowled when they were not angry. “They scowl when they’re concentrating, they scowl when someone tells them a bad joke, they scowl when they have gas, they scowl for lots of reasons,”
Does AI remove interactions? Instead of going through a journey designed by ‘us’, do we go through a journey designed by ‘it’?
Will it remove the need for us to interact and engage with those we previously have? Will it remove interfaces?
We also have to look at the impact and potential implications of neuroscience when we are considering invisible interfaces. The claims made by some in the digital industry as to the impact of neuro marketing are outlandish at best and misleading at worst. But, but, but – we need to take this into account when we are trying to understand the tension between a customer’s true need and our client’s underlying beliefs.
How can we show examples of true behaviour if the client is committed to using neuro approaches to influence user behaviour?