Privacy Risk Models for Designing Privacy-Sensitive Ubiquitous Computing Systems

Jason Hong (Carnegie Mellon)
Jennifer Ng (Carnegie Mellon)
Scott Lederer (University of California, Berkeley)
James Landay (University of Washington)
Motivation

 Ubiquitous Computing is Coming
– Advances in wireless networking, sensors, and devices
– Greater awareness of and interaction with the physical world
 “But what about my privacy?”
– Ex. Find Friends, E911
Motivation

 But Hard to Design Privacy-Sensitive Ubicomp Apps
 Discussions on privacy generate lots of heat but not much light
– Big Brother, overprotective parents, telemarketers, genetics…
– Many conflicting values
– Often end up talking past each other
– Hard to have reasoned debates and create designs that address the issues
 Need a design method that helps design teams:
– Identify
– Prioritize
– Manage privacy risks for specific applications
 We propose privacy risk models for doing this
Privacy Risk Model Analogy

 Security Threat Model
 “[T]he first rule of security analysis is this: understand your threat model. Experience teaches that if you don’t have a clear threat model – a clear idea of what you are trying to prevent and what technical capabilities your adversaries have – then you won’t be able to think analytically about how to proceed. The threat model is the starting point of any security analysis.”
– Ed Felten
Privacy Risk Model

 Two Parts: Risk Analysis and Risk Management
 Privacy risk analysis
– Common questions to help design teams identify potential risks
– Like a task analysis
 Privacy risk management
– Helps teams prioritize and manage risks
– Like severity rankings in heuristic evaluation
 Will present a specific privacy risk model for ubicomp
– Draws on previous work, plus surveys and interviews
– Aims to provide a reasonable level of protection for foreseeable risks
Outline

 Motivation
 Privacy Risk Analysis
 Privacy Risk Management
 Case Study: Location-enhanced Instant Messenger
Privacy Risk Analysis

 Common Questions to Help Design Teams Identify Risks
 Social and organizational context
– Who are the users?
– What kinds of personal info are shared?
– Relationships between sharers and observers?
– Value proposition for sharing?
– …
Social and Organizational Context

 Who are the users? Who shares info? Who sees it?
 Different communities have different needs and norms
– An app appropriate for families might not be appropriate for work settings
 Affects the conditions under which, and the types of info, people are willing to share
– Location information with a spouse vs. co-workers
– Real-time monitoring of one’s health
 Start with the most likely users
– Ex. Find Friends: likely sharers are people using mobile phones
– Likely observers are friends, family, co-workers
Social and Organizational Context

 What kinds of personal info are shared?
 Different kinds of info have different risks and norms
– Current location vs. home phone number vs. hobbies
 Some information is already known between people
– Ex. Don’t need to protect your identity from friends and family
 Different ways of protecting different kinds of info
– Ex. Can revoke access to location; cannot revoke a birthday or name
Social and Organizational Context

 Relationships between sharers and observers?
 Kinds of risks and concerns
– Ex. Risks w/ friends are unwanted intrusions, embarrassment
– Ex. Risks w/ paid services are spam, secondary use, hackers
 Incentives for protecting personal information
– Ex. Most friends don’t have a reason to intentionally cause harm
– Ex. Neither do paid services, but they do want to make more money
 Mechanisms for recourse
– Ex. Kindly ask friends and family to stop being nosy
– Ex. Recourse for paid services includes formally complaining, switching services, suing
Social and Organizational Context

 Value proposition for sharing personal information?
 What incentive do users have for sharing?
 Quotes from nurses using locator badges
– “I think this is disrespectful, demeaning and degrading”
– “At first, we hated it for various reasons, but mostly we felt we couldn’t take a bathroom break without someone knowing where we were… [but now] requests for medications go right to the nurse and bedpans etc go to the techs first... I just love [the locator system].”
 When those who share personal info do not benefit in proportion to the perceived risks, the technology is likely to fail
Privacy Risk Analysis

 Common Questions to Help Design Teams Identify Risks
 Social and organizational context
– Who are the users?
– What kinds of personal info are shared?
– Relationships between sharers and observers?
– Value proposition for sharing?
– …
 Technology
– How is personal info collected?
– Push or pull?
– One-time or continuous?
– Granularity of info?
– …
Technology

 How is personal info collected?
 Different technologies have different tradeoffs for privacy
 Network-based approach
– Info captured and processed by external computers that users have no practical control over
– Ex. locator badges, video cameras
 Client-based approach
– Info captured and processed on the end-user’s device
– Ex. GPS, beacons
– Stronger privacy guarantees: all info starts with you first
Technology

 Push or pull?
 Push is when the user sends info first
– Ex. You send your location info on an E911 call
– Few people seem to have problems with push
 Pull is when another person requests info first
– Ex. A friend requests your current location (Find Friends)
– The design space is much harder here (see the sketch below):
   need to make people aware of requests
   want to provide an understandable level of control
   don’t want to overwhelm users
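To make the push/pull distinction concrete, here is a minimal sketch of the two flows. It is illustrative only; the names and methods are assumptions, not an API from the paper. It borrows the buddy list and invisible mode mechanisms discussed later in the deck.

```python
# Hypothetical sketch of the two disclosure flows.

def push(sharer, observer, location):
    # Push: the sharer initiates, so no request handling is needed
    observer.receive(sharer.name, location)

def pull(sharer, observer, lookup):
    # Pull: the observer initiates, so the system must provide
    # awareness (a notification) and control (buddy list, invisible mode)
    sharer.notify(f"{observer.name} requested your location")
    if sharer.invisible or observer.name not in sharer.buddy_list:
        return "unknown"   # same reply as a failed lookup: deniable
    return lookup(sharer)
```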
Technology

 One-time or continuous disclosures?
 One-time disclosure
– Ex. Observer gets a snapshot
– Fewer privacy concerns
 Continuous disclosure
– Ex. Observer repeatedly gets info
– Greater privacy concerns
– “It’s stalking, man.”
(Screenshots: Find Friends, ActiveCampus)
Technology

 Granularity of info shared?
 Different granularities have different utility and risks
 Spatial granularity
– Ex. City? Neighborhood? Street? Room?
 Temporal granularity
– Ex. “in Boston last month” vs. “in Boston on August 2, 2004”
 Identification granularity
– Ex. “a person” vs. “a woman” vs. “alice@blah.com”
 Keep and use the coarsest granularity needed (see the sketch below)
– The less specific the data, the fewer the inferences and risks
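As an illustration of that guideline, here is a small sketch that coarsens a disclosure along all three axes before it leaves the device. The fields and defaults are assumptions for illustration, not from the paper.

```python
# Hypothetical sketch: coarsen a disclosure along spatial, temporal,
# and identification axes, keeping only what the observer needs.

from dataclasses import dataclass

@dataclass
class Disclosure:
    identity: str   # e.g. "alice@blah.com"
    place: dict     # e.g. {"city": "Boston", "street": "...", "room": "..."}
    time: str       # ISO timestamp, e.g. "2004-08-02T14:30"

def coarsen(d: Disclosure, spatial="city", temporal="month", identity="role"):
    where = d.place.get(spatial, "unknown")               # drop finer levels
    when = d.time[:7] if temporal == "month" else d.time  # keep "2004-08"
    who = "a person" if identity == "role" else d.identity
    return {"who": who, "where": where, "when": when}
```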
Outline

 Motivation
 Privacy Risk Analysis
 Privacy Risk Management
 Case Study: Location-enhanced Instant Messenger
Privacy Risk Management

 Helps teams prioritize and manage risks
 First step is to prioritize risks by estimating:
– Likelihood that an unwanted disclosure occurs
– Damage that will happen on such a disclosure
– Cost of adequate privacy protection
 Focus on high-likelihood, high-damage, low-cost risks first (see the sketch below)
– Like heuristic eval: fix high-severity and/or low-cost problems first
– Difficult to get exact numbers; the process matters more than the numbers
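A minimal sketch of that prioritization step. The numeric scale and formula are assumptions purely to produce an ordering; as the slide says, the process matters more than exact numbers.

```python
# Hypothetical sketch: rank privacy risks from qualitative estimates.

SCALE = {"low": 1, "medium": 2, "high": 3}

def priority(likelihood, damage, cost):
    # Higher likelihood and damage raise priority; higher cost lowers it
    return SCALE[likelihood] * SCALE[damage] / SCALE[cost]

def rank(risks):
    # risks: list of (description, likelihood, damage, cost) tuples
    return sorted(risks, key=lambda r: priority(*r[1:]), reverse=True)
```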
Privacy Risk Management

 Helps teams prioritize and manage risks
 Next step is to manage those risks
 How does the disclosure happen?
– Accident? Bad user interface? Poor conceptual model?
– Malicious? Inside job? Scammers?
 What kinds of choice, control, and awareness are there?
– Opt-in? Opt-out?
– What mechanisms? Ex. buddy list, invisible mode
– What are the default settings?
 Better to prevent or to detect abuses? (see the sketch below)
– “Bob has asked for your location five times in the past hour”
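The notification above suggests detecting abuse by monitoring request rates. Here is a minimal sketch of that idea; the window, threshold, and names are assumptions, not from the paper.

```python
# Hypothetical sketch: detect over-monitoring by counting recent requests.

import time
from collections import defaultdict, deque

WINDOW = 3600    # one hour, in seconds
THRESHOLD = 5    # requests per window before warning the sharer

request_log = defaultdict(deque)   # observer name -> request timestamps

def record_request(observer, notify, now=None):
    now = time.time() if now is None else now
    log = request_log[observer]
    log.append(now)
    while log and log[0] < now - WINDOW:   # forget stale requests
        log.popleft()
    if len(log) >= THRESHOLD:
        notify(f"{observer} has asked for your location "
               f"{len(log)} times in the past hour")
```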
Case Study

 Location-enhanced Instant Messenger
 New features
– Request a friend’s current location
– Automatically show your location
– Invisible mode, reject requests
– Default location is “unknown”
 Who are the users?
– Typical IM users
 Relationships?
– Friends, family, classmates, …
 One-time or continuous?
– One-time w/ notifications
Case Study

 Location-enhanced Instant Messenger
 Identifying potential privacy risks
– Over-monitoring by friends and family
– Over-monitoring at the workplace
– Being found by a malicious person (ex. stalker, mugger)
 Assessing the first risk, over-monitoring by friends and family (see the sketch below)
– Likelihood depends on the family; conservatively assign “high”
– Damage might be embarrassing but not life-threatening; assign “medium”
 Managing the first risk
– Buddy list, notifications for awareness, invisible mode, “unknown” if location is not disclosed
– All easy to implement, so cost is “low”
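Plugging these estimates into the rank() sketch from the risk-management section ties the case study together. Only the first risk’s estimates come from the slides; the values for the other two risks are placeholders for illustration.

```python
risks = [
    ("Over-monitoring by friends and family", "high",   "medium", "low"),
    ("Over-monitoring at the workplace",      "medium", "medium", "low"),  # placeholder estimates
    ("Being found by a malicious person",     "low",    "high",   "low"),  # placeholder estimates
]
for description, *_ in rank(risks):
    print(description)
# "Over-monitoring by friends and family" ranks first: high likelihood,
# medium damage, and low-cost mitigations (buddy list, notifications,
# invisible mode, "unknown" default).
```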
Discussion

 Privacy risk models are only a starting point
– Like task analysis, teams should try to verify their assumptions and answers
– Can be combined with field studies, interviews, and lo-fi prototypes
Summary

 Privacy risk models help design teams identify, prioritize, and manage risks
 Privacy risk analysis for identifying risks
– Series of common questions, like a task analysis
 Privacy risk management for prioritizing and managing risks
– Like severity ratings in heuristic evaluation
 Described our first iteration of a privacy risk model
– Help us evolve and advance it!

Speaker Notes

1. Ubicomp embeds technology at an extreme level of intimacy and scale.
2. http://www.freedom-to-tinker.com/archives/000317.html
3. Privacy risk analysis was inspired by the observation that lots of the same questions keep recurring.
4. This is a sample of the questions; see the paper for the full set. These questions are meant as a starting point for inquiry; add more questions where appropriate.
5. Ex. Many people are willing to always share their location with family, but only during work hours with co-workers.
6. Kinds of personal information include:
• A person’s name, which could be a legal name, or first or last name only
• A person’s address, which could be a mailing address, email address, homepage, blog, or instant messenger address
• A unique identifier, which could be a social security number or bank account number
• Names or pseudonyms that cannot be easily traced, for example a disposable identifier used for anonymous HIV testing
• A person’s appearance or behavior, for example web browsing habits, fashion style, or writing style
• A person’s social categorization, including “gender, ethnicity, religion, age, education, region, sexual orientation, linguistic patterns, organizational memberships and classifications, health status, employment, leisure activities… credit risk, IQ, SAT scores, life style categorization for mass marketing” [29]
• A person’s relationships with others: who they are in love with, who they like, who they dislike, what services they use
7. “I think this is disrespectful, demeaning and degrading.” “I guess my question is how does this help the NURSE?” The second group of nurses was initially skeptical, but was won over because management did not abuse the system and because they eventually saw the value of such a system. One nurse wrote, “I admit, when we first started using it we all hated it for some of the same reasons cited above [in the message board] but I do think it is a timesaver! It is very frustrating when someone floats to our unit and doesn’t have a tracker…can’t find them for [doctor] calls, [patient] needs etc.” “At first, we hated it for various reasons, but mostly we felt we couldn’t take a bathroom break without someone knowing where we were…[but now] requests for medications go right to the nurse and bedpans etc go to the techs first. If they are tied up, then we get a reminder page and can take care of the pts needs. I just love [the locator system].”
8. This is a sample of the questions; see the paper for the full set. These questions are meant as a starting point for inquiry; add more questions where appropriate.
9. “I would be creeped if my friends found me. And they said I saw you here. It would just be weird.” “For professors, it would be weird.”
10. Questions for managing risks:
• How does the unwanted disclosure take place? Is it an accident (for example, hitting the wrong button)? A misunderstanding (for example, the data sharer thinks they are doing one thing, but the system does another)? A malicious disclosure?
• How much choice, control, and awareness do data sharers have over their personal information? What kinds of control and feedback mechanisms do data sharers have to give them choice, control, and awareness? Are these mechanisms simple and understandable? What is the privacy policy, and how is it communicated to data sharers?
• What are the default settings? Are these defaults useful in preserving one’s privacy?
• In what cases is it easier, more important, or more cost-effective to prevent unwanted disclosures and abuses? To detect disclosures and abuses?
• Are there ways for data sharers to maintain plausible deniability?
• What mechanisms for recourse or recovery are there if there is an unwanted disclosure or an abuse of personal information?
11. An “unknown” reply preserves plausible deniability: it could mean a rejected request, that the system doesn’t know the location, or that the user is not there to reply.