4. Privacy definitions
Individual rights
• “the right to be let alone” (Warren and Brandeis, 1890)
Informational self-determination
• “Privacy is the claim of individuals, groups, or institutions to determine for themselves when,
how, and to what extent information about them is communicated to others.” (Alan Westin,
1967)
Access and control
• “control access to oneself and to personal information about oneself” (Adam Moore, 1998)
Pluralistic resemblance
• “Privacy is a plurality of different things.” “It is a set of protections against a related cluster
of problems” (Daniel Solove, 2008)
Privacy & data minimization
• “Data controllers should collect only the personal data they really need, and should keep it
only for as long as they need it”. (European Data Protection Directive 95/46/EC, 1995)
5. Debate: privacy vs. security

Tradeoff: security & privacy
• get more of one at the expense of the other
• after 9/11, give up civil liberties & privacy for national security
• popular response: “I have nothing to hide”
• “The nothing to hide argument is an argument that the privacy interest is generally minimal to trivial, thus making the balance against security concerns a foreordained victory for security.” – Daniel Solove (privacy scholar)

Need to have both
• “These two components of security – safety and privacy … I work from the assumption that you need to have both.” – Donald Kerr (US deputy director of national intelligence)
• “Security and privacy are not opposite ends of a seesaw. There is no security without privacy. And liberty requires both security and privacy.” – Bruce Schneier (security commentator)
8. Content protection
Core techniques
Encryption: first line of defense
• + prevents unauthorized access
• – no content protection after decryption

[Figure: symmetric encryption – Marge encrypts plaintext to ciphertext with a shared secret key, Homer decrypts with the same key. Asymmetric encryption – Lisa & Bart encrypt with Homer’s public key, Homer decrypts with his private key.]
9. Content protection
Core techniques
Digital watermarking: second line of defense
• embed information imperceptibly
• e.g. to prove ownership

[Figure: a watermark is embedded into the original content with a secret watermarking key; after distribution, processing, or attack, the same key is used to detect/extract the watermark from the watermarked content.]
10. Digital watermarking illustration

[Figure: original 512×512 image and watermarked image (visually indistinguishable); 64×64 watermark image and extracted watermark (correlation = 0.9997).]
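The detection statistic on this slide is a correlation between the reference watermark and the one extracted from the image. A minimal sketch of additive spread-spectrum embedding with correlation-based detection (illustrative only; the parameters, key values, and signal model here are assumptions, not the thesis’s actual scheme):

```python
import random

def embed(signal, key, strength=2.0):
    # Additively embed a pseudo-random +/-1 watermark derived from a secret key.
    rng = random.Random(key)
    wm = [rng.choice((-1.0, 1.0)) for _ in signal]
    return [s + strength * w for s, w in zip(signal, wm)]

def correlation(signal, key):
    # Detect by correlating the signal with the watermark regenerated from the key.
    rng = random.Random(key)
    wm = [rng.choice((-1.0, 1.0)) for _ in signal]
    return sum(s * w for s, w in zip(signal, wm)) / len(signal)

random.seed(0)
host = [random.gauss(0, 10) for _ in range(4096)]   # stand-in for image samples
marked = embed(host, key=42)

# Correlation is close to the embedding strength for the right key, near 0 otherwise.
print(correlation(marked, 42) > 1.0)       # True: watermark detected
print(abs(correlation(marked, 99)) < 1.0)  # True: wrong key, no detection
```

The same idea scales to images: the host samples become pixel values and the threshold is set from the watermark strength and image statistics.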
11. Privacy issue in content protection

[Figure: content lifecycle – creation, distribution, use control, usage monitoring, payment – with the provider observing the user at every step: the user’s privacy nightmare.]
12. Research motivation
Conflict
• content protection interests of provider
• privacy rights of user
Can we reconcile privacy with protection of content?
20. Contribution (J.OIR 2009, E-Health Handbook 2009)

Background
E-health system
• privacy-sensitive content
• overview of patient’s medical history
Privacy threats
• cross-referencing of content & ID across providers
• intensive use of patient’s ID
• different sensitivity levels

Problem & solution
Problem
• content sharing – interoperability (healthcare provider)
• privacy protection (patient)
Our solution
• architecture for distributed e-health
• limited trust in healthcare service providers
• mediating service
• data anonymization
• practical validation
22. Contribution (PET’06, CMS’05)

Background
Personal content distribution
• (phone) cameras, blogs, social networks, search engines
• private pictures taken & published
• technology trends worsen the situation
Emerging privacy threats
• governments and industry
• normal individuals
• distribution channel
• non-professional adversary

Problem & solution
Problem
• privacy protection (an individual)
• personal content distribution (other individuals)
Our solution
• detection mechanism
• control over pictures taken by others
• no restriction & no privacy infringement for photographers
24. Privacy analysis framework

System-specific inputs
• assumptions & usage scenarios
• high-level description

Methodology
1. Define Data Flow Diagram (DFD)
2. Map privacy threats to DFD elements
3. Identify misuse case scenarios
4. Risk-based prioritization
5. Elicit privacy requirements
6. Select privacy-enhancing solutions

Knowledge base
• mapping of privacy threat components to DFD elements
• privacy threat tree patterns
• risk assessment techniques (not included)
• mapping of misuse cases to requirements
• mapping of privacy objectives to solutions
25. Privacy threat analysis – illustration

Privacy properties and the threats that violate them
• Unlinkability → Linkability
• Anonymity & pseudonymity → Identifiability
• Plausible deniability → Non-repudiation
• Undetectability & unobservability → Detectability
• Confidentiality → Disclosure of information
• Content awareness → Content unawareness
• Policy and consent compliance → Policy and consent noncompliance

Threat tree pattern: which DFD element types (entity, data flow, data store, process) each threat applies to
• Linkability: entity, data flow, data store, process
• Identifiability: entity, data flow, data store, process
• Non-repudiation: data flow, data store, process
• Detectability: data flow, data store, process
• Information disclosure: data flow, data store, process
• Content unawareness: entity
• Consent/policy noncompliance: data flow, data store, process
26. Elicited privacy requirements & mitigation strategies

1. Threat: linkability of the social network data store
Requirement: unlinkability of data entries within the social network database
Mitigation: protect the data store by applying data anonymization techniques, such as k-anonymity

2. Threat: linkability of data flow (user–portal)
Requirement: unlinkability of messages of user–portal communication
Mitigation: employ an anonymity system, e.g. TOR

3. Threat: linkability of entities (the social network users)
Requirements: unlinkability of different pseudonyms (user IDs) of social network users; user privacy self-awareness (aware that revealing too much information online can be privacy invasive); channel and message confidentiality (of data flow)
Mitigation: technical enforcement – use an anonymity system such as TOR for communication between user and social network web portal

4. Threat: identifiability at the social network data store
Requirement: anonymity of social network users, such that the user will not be identified from social network database entries
Mitigation: protect the data store by applying data anonymization techniques, such as k-anonymity

5. Threat: identifiability at the data flow of the user data stream (user–portal)
Requirement: anonymity of social network users, such that the user will not be identified from user–portal communication
Mitigation: technical enforcement – use an anonymity system, such as TOR, for communication between user and social network web portal
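k-anonymity, the data-store mitigation suggested above, generalizes quasi-identifiers until every record is indistinguishable from at least k−1 others. A minimal sketch (the table, attributes, and generalization rules are hypothetical, chosen only to illustrate the idea):

```python
from collections import Counter

def generalize(record):
    # Coarsen quasi-identifiers: bucket age into decades, truncate the ZIP code.
    age, zipcode, diagnosis = record
    decade = (age // 10) * 10
    return (f"{decade}-{decade + 9}", zipcode[:3] + "**", diagnosis)

def is_k_anonymous(records, k):
    # Every quasi-identifier combination (age, ZIP) must occur at least k times.
    counts = Counter((age, z) for age, z, _ in records)
    return all(c >= k for c in counts.values())

raw = [(23, "30231", "flu"), (27, "30278", "cold"),
       (21, "30212", "flu"), (45, "30233", "asthma"),
       (48, "30244", "flu"), (42, "30299", "cold")]

anonymized = [generalize(r) for r in raw]
print(is_k_anonymous(raw, 2))         # False: every raw row is unique
print(is_k_anonymous(anonymized, 3))  # True: two groups of three identical quasi-identifiers
```

The sensitive attribute (diagnosis) is kept; only the quasi-identifiers are coarsened, which is why linkage to an individual becomes ambiguous.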
29. Anonymous buyer-seller watermarking protocols

Protocol phases
1. Registration
2. Watermark generation & embedding
3. Identification & arbitration

Building blocks
• homomorphic encryption: watermarking in the encrypted domain – for operations ∘_M on the message space M and ∘_C on the ciphertext space C: ∀ m1, m2 ∈ M: E(m1 ∘_M m2) = E(m1) ∘_C E(m2)
• group signature
• zero-knowledge proof

Properties
• traceability (seller’s security)
• non-repudiation (seller’s security)
• non-framing (buyer’s security)
• anonymity & unlinkability (buyer’s security)
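The additive instance of this homomorphic property is what Paillier encryption provides: multiplying ciphertexts adds the underlying plaintexts, so a party can combine watermarks without ever decrypting them. A toy Paillier implementation, with insecurely small primes chosen only for illustration:

```python
import math
import random

# Toy Paillier keypair (primes far too small for real use).
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
# mu = L(g^lam mod n^2)^-1 mod n, where L(x) = (x - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def enc(m):
    # E(m) = g^m * r^n mod n^2 for a random r coprime to n.
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    # D(c) = L(c^lam mod n^2) * mu mod n
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Additive homomorphism: E(m1) * E(m2) mod n^2 decrypts to m1 + m2.
m1, m2 = 42, 100
print(dec((enc(m1) * enc(m2)) % n2))  # 142
```

This is the mechanism behind the later slides: the seller multiplies encrypted watermark components and embeds the result while learning nothing about the buyer’s share.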
30. Registration phase

Over a secure & authenticated channel:
• Buyer: gsk_i ← GSjoin(gpk, usk_i) – joins the buyer group and obtains a secret signature key
• Group manager: reg_i ← GSiss(gpk, isk, upk_i) – issues the registration record and knows the buyer’s ID
31. Watermark generation & embedding phase

Buyer (over an anonymous channel to the seller):
• (sk'_B, pk'_B) ← BKgen(1^k)
• C ← JEnc(pk_J, sk'_B)
• c_i ← BEnc(pk'_B, W_B,i)
• m ← (pk'_B, j, (c_i) for i = 1..l, C)
• s_m ← GSsig(gpk, gsk_i, m)
• zero-knowledge proofs π1, π2: fair encryption of the private key, bit encryption of the watermark

Seller:
• WATemb(swk, X, BEnc(pk'_B, W))
32. Watermark generation & embedding

Basic concept
• seller & buyer each generate part of the watermark
• seller doesn’t know: buyer’s watermark & watermarked content delivered to the buyer
• buyer doesn’t know: original content & seller’s watermark

Type I
• security (S & B)
• multiple transactions
• X' = X ⊕ V
• E(Y) = E(X' ⊕ σ(W)) = E(X') ⊗ E(σ(W))

[Figure: the original content X is embedded (⊕) with an index watermark V to obtain the intermediate watermarked content X'; the buyer’s watermark W is permuted (σ) and embedded into X' to obtain the final watermarked content Y.]
33. Watermark generation & embedding

Type II
• not limited to permutation-tolerant watermarks
• X' = X ⊕ V
• E(W) = E(W_S + W_B) = E(W_S) × E(W_B)
• E(Y) = E(X' ⊕ W) = E(X') ⊗ E(W)

[Figure: the seller’s watermark W_S and the buyer’s watermark W_B are combined (+, realized as × under additive homomorphic encryption) into the composite watermark W, which is embedded (⊕) into the intermediate watermarked content X' (original content ⊕ index watermark) to obtain the final watermarked content Y.]
35. Identification and arbitration phase

Over secure & authenticated channels:
• Seller: W' ← WATdet(swk, Y) – detects the watermark in the suspect copy Y and submits the evidence to the judge
• Group manager: (B_i, τ) ← GSopen(gpk, osk, reg, m, s_m) – opens the group signature to identify the buyer
36. Implementation

Type III BSW protocol

Parameters
• 512×512-pixel image, ≈ 2 Mbit
• 128-bit watermark
• Paillier modulus N of 1024 bits
• run on a CPU at 2.4 GHz

Execution time (in seconds)
• registration: < 0.5 sec
• identification & arbitration: < 2.5 sec
• most computational load at the seller

Communication complexity (in exchanged bits)
• watermark generation & embedding: ≈ 8 Mbit
• identification & arbitration: ≈ 0.4 Mbit
• expansion factor: ≈ 4.2

[Chart: watermark generation & embedding (WGE) phase execution time]
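As a rough consistency check on these figures (simple arithmetic, assuming an 8-bit grayscale 512×512 image; the bit depth is an assumption, not stated on the slide):

```python
# 512x512 grayscale image at 8 bits per pixel
image_bits = 512 * 512 * 8
print(round(image_bits / 1e6, 1))  # ~2.1 Mbit, matching the "~2 Mbit" parameter

# An expansion factor of ~4.2 over the plaintext image size predicts the
# watermark generation & embedding traffic
wge_bits = 4.2 * image_bits
print(round(wge_bits / 1e6, 1))    # ~8.8 Mbit, in line with the "~8 Mbit" exchanged
```

The expansion stays this small only because the composite signal representation packs several samples into each 2048-bit Paillier ciphertext; naive per-sample encryption would blow the factor up by orders of magnitude.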
38. Conclusions
Privacy threats emerge
• trust in providers
Need balance
• content protection (provider) & privacy protection (user)
Privacy
• like security, an embodied value
Build privacy in
• goal-oriented framework
Content protection techniques
• also protect privacy
Yes, it is possible to reconcile privacy with protection of content
39. List of publications
International Journals
Mina Deng, Kim Wuyts, Riccardo Scandariato, Bart Preneel, and Wouter Joosen. A privacy threat analysis framework: supporting the
elicitation and fulfillment of privacy requirements. Requirement Engineering Journal special issue on Data Privacy, to appear, 27 pages,
2010.
Alfredo Rial, Mina Deng, Tiziano Bianchi, Alessandro Piva, and Bart Preneel. Anonymous buyer-seller watermarking protocols: formal
definitions and security analysis. IEEE Transactions on Information Forensics and Security, to appear, 11 pages, 2010.
Mina Deng, Danny De Cock, and Bart Preneel. Towards a cross-context identity management framework in e-health. Online Information
Review, international journal 33(3):422-442, 2009.
Mina Deng and Bart Preneel. Attacks on two buyer-seller watermarking protocols and an improvement for revocable anonymity. International
Journal of Intelligent Information Technology Application, 1(2):53-64, 2008.
Book Chapters
Mina Deng, Danny De Cock, and Bart Preneel. An interoperable cross-context architecture to manage distributed personal e-health
information. In M. M. Cunha, R. Simoes, and A. Tavares, editors, Handbook of Research on Developments in e-Health and
Telemedicine: Technological and Social Perspectives, ISBN: 978-1-61520-670-4, chapter 27, pages 576-602. Hershey, PA, USA: IGI
Global, Inc., 2009.
Mina Deng and Bart Preneel. On secure buyer-seller watermarking protocols with revocable anonymity. In Kyeong Kang, editor, E-Commerce,
ISBN: 978-953-7619-98-5, chapter 11, pages 184-202. IN-TECH Education and Publishing, Vienna, Austria, 2009.
International conferences (Selected)
Mina Deng, Tiziano Bianchi, Alessandro Piva, and Bart Preneel. An efficient buyer-seller watermarking protocol based on composite signal
representation. In Proceedings of the 11th ACM workshop on Multimedia and security (MMSEC), pages 9-18, Princeton, New Jersey,
USA. ACM New York, NY, USA, 2009.
Mina Deng and Bart Preneel. On secure and anonymous buyer-seller watermarking protocol. In Abdelhamid Mellouk, Jun Bi, Guadalupe Ortiz,
Dickson K. W. Chiu, and Manuela Popescu, editors, Third International Conference on Internet and Web Applications and Services
(ICIW), pages 524-529, Athens, Greece. IEEE Computer Society, 2008.
Mina Deng, Lothar Fritsch, and Klaus Kursawe. Personal rights management – taming camera-phones for individual privacy enforcement. In
George Danezis and Philippe Golle, editors, Privacy Enhancing Technologies, 6th International Workshop (PET), Revised Selected Papers,
volume 4258 of Lecture Notes in Computer Science, pages 172-189, Cambridge, UK. Springer, 2006.