A talk about UX, trust and privacy, and how these are becoming increasingly important in human-computer interaction. The connection we have with our smart-everythings is no longer merely about exchanging data back and forth. Our connection with computers now revolves around values we normally find in human-to-human relationships: understanding and trust.
We humans expect these machines to do the computing effort to understand us: smart personalisation. We also expect that we can trust these machines – and the companies behind them – to keep what they learn about us to themselves. We expect them to respect our privacy. Our security.
And as designers, we need to deliver great, personal experiences. We also need to deliver trustworthy products. We owe it to both our users and the people who hire us to actively think about privacy, and to implement privacy in the flows and designs we deliver.
THE FOUR PHASES OF COGNIZANT COMPUTING
1. SYNC ME: store copies of my digital assets and keep them in sync across all end points and contexts.
2. SEE ME: know where I am (and have been) on the internet and in the real world; understand my mood and context to better align services.
3. KNOW ME: understand what I want and need and proactively present it to me.
4. BE ME: act on my behalf based on learned and explicit rules.
Meanwhile, at the Apple Watch announcement: experiences for the most personal device ever are...
relevant: are you engaging at the right moment?
glanceable: can you deliver value in milliseconds?
personal: do you approach people in the right manner?
“No other Apple device has ever been so connected to the wearer. It is important to be mindful of this connection.”
Apple Watch Human Interface Design Guidelines, 2015
THE PRIVACY CHALLENGE
Concern about privacy jumped 5 points between 2014 and 2015.
2nd Annual Poll on How Personal Technology is Changing our Lives – Microsoft, January 2015
LOSS OF CONTROL
91% of adults ‘agree’ or ‘strongly agree’ that consumers have lost control over how personal information is collected and used by companies.
Pew Research Privacy Panel Survey, January 2014
USER INCONSISTENCY
38% of those who say they are not concerned about privacy online nevertheless say they do mind companies using information about them.
71% of those who are happy to share personal information with companies and brands they like are concerned about how the information collected about them is being used by companies.
Personalisations vs Privacy, Ipsos MORI, 2014
THE FACEBOOK PARADOX
91% of adults feel consumers have lost control over how personal information is collected and used by companies.
Yet 58% of the entire adult population (and 71% of internet users) is on Facebook.
“Increased knowledge of the consumer and the fine-tuning of offers that are perceived as personal and highly relevant should lead to an increase in spend.”
Gartner on the monetization of contextualization
DATA VALUE EXCHANGE
BUSINESS: customer insights, optimising processes & products, accountability (on ADDD), data-driven decisions.
CONSUMER: insights in own behaviour, peer comparison, personalisation (right message), relevance (right person, right moment).
The value on the consumer side should be equal or more (“magic moon” vs “standard moon” vs “no moon”).
UX VS PRIVACY: THE UNTRUTH
“The truth is that collecting information about people allows you to make significantly better products, and the more information you collect, the better products you can build.”
Dustin Curtis, “Privacy VS. User Experience” (2014)
#FALSE (or at least a very one-sided truth)
MORE DATA, BETTER PRODUCTS?
Whether or not better products can be made by collecting more user data is a matter of:
• Context
• Opinion
• Causation vs correlation
• Data quality (accurate? relevant? complete?)
• Define: better product
UX DESIGN & PRIVACY @ SENTIANCE
Businesses can deliver a great user experience and tremendous value to both customers and the company, but only if they safeguard their users’ privacy and security.
SENTIANCE: FROM MOBILE DATA TO SMART LIFE
Level 1 (N1) events: walking, running, standing, biking, car, bus, train, tram, subway, boat, motorcycle, airplane*, zigzagging, suddenly stopped, climbing*, horse*, sleeping, ...
Level 2 (N2) moments: waking up, arriving at home, work, the gym, ..., shopping, lunch, in a meeting, watching tv, working, in company*, inactive, agitated*, noisy environment*, @home, ...
Level 3 (N3) profiles: aggressive driver, city worker, couch potato, workaholic, sportive, ...
SECURITY ≠ PRIVACY (it is, however, an essential requirement)
“Security is a very important topic, but it’s primarily a technical topic, and to a large extent it’s a very well-understood one. If you pay attention to security, it is possible to get it right, whereas privacy is something that’s much more fluid and is much more about social norms, expectations, implicit contracts between consumers and providers.”
Pilgrim Beart – co-founder of AlertMe
THE INTERNET OF EVERYTHING: DESIGNER ROLES ARE UNDERGOING CHANGE
UX design has been extended to address all aspects of a product or service as perceived by its users – and that includes the control they have over their personal data, their privacy.
BEING CREDIBLE
User Experience Honeycomb (Peter Morville): useful, usable, desirable, valuable, findable, accessible, credible.
Credible in 2004: the information you present to users.
Credible in 2015: taking responsibility to keep personal data safe.
PRIVACY-BY-DESIGNER: DELIVER BOTH PERSONAL & TRUSTWORTHY
A. We need to deliver great, personal experiences.
B. We need to deliver trustworthy products.
We owe it to both our users and the people who hire us to actively think about privacy, and to implement privacy in the flows and designs we deliver.
1. You need to fully understand the end goal (by asking the right questions).
WHY: Why are we doing this? What do you want to achieve?
WHAT: What is required to achieve this? What is the best way?
WHO: Who is impacted by this? What do they expect?
GOAL → DATA → PIA
2. There are rules, guidelines and toolkits (which continuously evolve).
UX: Apple, Android, ... design guidelines; interaction patterns; best (and worst) practice examples; models & frameworks; user research methods; emerging trends; ...
UX toolkits: OmniGraffle, Illustrator, Fireworks, pen & paper, ...
PRIVACY: existing & upcoming EU law (GDPR); local privacy act & royal decrees; local telecommunications law; privacy watchdog recommendations; ToS of the platform (iOS, Android); internal policies; ...
PRIVACY toolkits: information classification, risk assessments, privacy policies, PIA framework, ...
General Data Protection Regulation:
• EXPANDED SCOPE: any organization processing personal data of EU residents
• PRIVACY-BY-DESIGN & DEFAULT
• EXPLICIT INFORMED CONSENT
• DATA BREACH NOTIFICATION: to the DPA and possibly consumers
• DPO REQUIRED: users/month threshold or location data
• DATA PORTABILITY
4. You cannot do it alone (it is multi-disciplinary and cross-departmental).
Privacy requires a clear mandate to get things done: everybody accepts it is important, but not a single department has it as a priority.
Have privacy as part of the project plan and estimates as soon as possible.
There is a continuous need to explain the significance of privacy in the overall product & company picture.
Having privacy as a deliverable avoids the delays & soaring costs of adding it after the fact.
“Privacy is not only a fundamental right, it can also be a competitive advantage.” – Neelie Kroes
Conform to EU legislation? Then you are ready for the world market.
People can trust you with their digital identities → sets you apart from the competition.
5. The devil is in the details (and the cost of mistakes is high).
Privacy: up to 1,000,000 EUR fine or up to 2% of the annual worldwide turnover in the case of an enterprise, whichever is greater (Draft GDPR, art. 79); customer complaints; customers leaving.
UX: customer complaints; customers leaving.
6. Practice honest communication (from the start).
Consider a breach likely – and prepare accordingly.
From VISA’s ‘Responding to a Data Breach – Communications Guidelines for Merchants’: do not play the victim; express regret; take ownership; be accountable.
Crisis communications (works for downtime communication too):
1. What happened? (Tell what you know at that time.)
2. What is being done *NOW*? (Investigate, take systems offline, ...)
3. How does this affect your customers? (Both short- and long-term.)
4. What are you doing to minimize risk? What can your customers do?
5. How do people get more information or updates? (Follow up.)
6. What are you doing to prevent this from happening again?
Privacy does not benefit from a “do first, ask forgiveness later” strategy.
(Avoid: “Hey, we just lost all this data of yours you did not know we had in the first place.”)
Be clear about which data you gather and what for; set correct expectations.
FROM THE START: informed explicit consent.
(Avoid: “Hey, we just lost all this data of yours you did not know we had in the first place.”)
Require a clear affirmative action; use plain language.
1. You need to fully understand the end goal – ask the right questions.
2. There are rules, guidelines and toolkits – rules & tools evolve. Fast.
3. Less is more – value & proportionality.
4. You can’t do it alone – multi-disciplinary and cross-departmental.
5. The devil is in the details – and the cost of mistakes is high.
6. Honest communication – from the start.
As designers, then, what can we easily do that improves both UX and privacy?
1. Design for Explicit: Opt-In
EXPLICIT: “IF YOU AGREE, PLEASE CHECK THIS BOX:” – or a clear YES/NO choice.
NOT EXPLICIT (hidden opt-out): “By signing this contract, you agree we have the right to collect and pass on all your information. In case you do not want your bank to pass on your credit information to third partners and other divisions, please write ‘I do not agree’ on the contract and hand it over to the person behind the till.”
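The difference between an explicit opt-in and a hidden opt-out can be made concrete in code. A minimal Python sketch (all names here are illustrative, not from the talk): consent only counts when it comes from a clear affirmative action, never from a pre-checked box.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    purpose: str   # what the data will be used for, in plain language
    granted: bool
    timestamp: str


def record_consent(purpose: str, checkbox_checked: bool,
                   checkbox_prechecked: bool = False) -> ConsentRecord:
    """Record consent only for a clear affirmative action.

    A pre-checked box is not explicit consent: treat it as no consent,
    even if the user left it checked.
    """
    granted = checkbox_checked and not checkbox_prechecked
    return ConsentRecord(purpose=purpose, granted=granted,
                         timestamp=datetime.now(timezone.utc).isoformat())


# Explicit opt-in: the user actively ticked an unchecked box.
opt_in = record_consent("share credit information with partners", True)

# Hidden opt-out: the box was pre-checked; this must not count as consent.
opt_out = record_consent("share credit information with partners",
                         True, checkbox_prechecked=True)
```

The same rule generalises to any consent UI: the default state must mean "no", and only a deliberate user action flips it to "yes".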
3. Design for Choice: Consent
In your designs and flows, take into account both having and not having the data. Design personalized experiences for when you have the data, and good alternatives for when you do not.
With location data: “Today will be sunny. Weather for Olen, Belgium – where we know you live.” Without: “Check out the weather!” with a city picker (e.g. Antwerpen).
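Designing for both branches can be sketched in a few lines of Python (function and message texts are illustrative assumptions): the point is that the no-data path is a real alternative, not an error state.

```python
from typing import Optional


def weather_greeting(city: Optional[str]) -> str:
    """Render a personalised message when location data is available,
    and a good generic alternative when it is not."""
    if city:
        # Personalised experience: we have consented location data.
        return f"Today will be sunny in {city}."
    # No location consent: still deliver value, just less personalised.
    return "Check out today's weather - pick your city."


print(weather_greeting("Olen"))  # personalised path
print(weather_greeting(None))    # fallback path, no dead end
```

Treating `None` as a first-class input forces the design to answer "what does the user get without the data?" before shipping.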
Privacy as a trading function: the more permissions an app requires (GPS, microphone, camera, accelerometer, gyroscope, light, GSR, Wi-Fi, calendar, SMS, browsing history, in-app usage), the more added value users expect from it. User acceptance depends on that expected value keeping pace.
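The trading function can be sketched as a rough proportionality check. The sensitivity weights and the value scale below are illustrative assumptions, not data from the talk; the shape of the rule is what matters.

```python
# Illustrative sensitivity weights per permission (assumption).
SENSITIVITY = {
    "accelerometer": 1, "gyroscope": 1, "light": 1,
    "wifi": 2, "in_app_usage": 2, "calendar": 3,
    "gps": 4, "microphone": 4, "camera": 4,
    "sms": 5, "browsing_history": 5,
}


def permissions_justified(requested: list[str], expected_value: int) -> bool:
    """Rough proportionality check: the combined sensitivity of the
    permissions an app requests should not exceed the value the user
    can expect in return (both on the same arbitrary scale)."""
    cost = sum(SENSITIVITY.get(p, 3) for p in requested)
    return expected_value >= cost


# A flashlight app asking for SMS and GPS fails the check.
flashlight_ok = permissions_justified(["sms", "gps"], expected_value=2)

# A navigation app asking for GPS with high expected value passes.
navigation_ok = permissions_justified(["gps"], expected_value=8)
```

Even as a back-of-the-envelope rule, running every permission request through such a check makes the "trade" explicit at design time instead of at the app-store review.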
5. Design for Because: explain your magic.
When users know of the existence of a certain algorithm, their satisfaction with the product increases over time, probably as they start to understand its workings better. Yet when they discover an algorithm they were previously unaware of, users feel betrayed.
Worst-case scenario: “In the extreme case, it may be that whenever a software developer in Menlo Park adjusts a parameter, someone somewhere wrongly starts to believe themselves to be unloved.” – Eslami et al.
6. Design for Transparency: show people their data selves.
If we are going to allow algorithms and expert rules to steer our behaviour, we must be able to verify that they understand us correctly. Allow for:
- Correction
- Reset
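A "data self" with correction and reset can be sketched as a small Python class (all names are illustrative assumptions): users can see everything the system has inferred about them, fix it, or wipe it.

```python
class DataSelf:
    """A user-visible view of inferred data, with correction and reset."""

    def __init__(self) -> None:
        self._inferences: dict[str, str] = {}

    def learn(self, key: str, value: str) -> None:
        # The algorithm's guess about the user.
        self._inferences[key] = value

    def show(self) -> dict[str, str]:
        # Transparency: expose the full inferred profile, not a subset.
        return dict(self._inferences)

    def correct(self, key: str, value: str) -> None:
        # Correction: the user's word beats the algorithm's guess.
        self._inferences[key] = value

    def reset(self) -> None:
        # Reset: the user can start over at any time.
        self._inferences.clear()


profile = DataSelf()
profile.learn("home_city", "Antwerpen")
profile.correct("home_city", "Olen")   # the inference was wrong
profile.reset()                        # and the user can always wipe it
```

The design choice here is that `correct` and `reset` are part of the same interface as `learn`: editing the profile is not an afterthought bolted onto a read-only view.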
7. Design for forming secure habits.
“Burner accounts”: Kinja introduced these for anonymous commenting, and made private keys understandable through metaphor: “...if you lose the burner key initially issued we will not be able to retrieve this information for you or reset the account. Save your key.”
Snowden Challenge at SXSW: “Combine exceptional UX with privacy at INCEPTION, not afterwards.” – Edward Snowden’s challenge to startups at SXSW
As the need for permanent access to data increases, so does the need for ethics & morality.
From weak AI (expert systems) via machine learning, deep learning, recommender systems and autonomous systems towards strong AI (singularity) – with a transition period in between (the danger zone).
GOOD UX AND PRIVACY
Privacy is about more than data. Privacy does not benefit from a “do first, ask forgiveness later” strategy.
It’s their data, not yours (safeguarding it is a joint effort, though). It is their choice. Design the best possible experience, regardless of the choice.