Agreeing to Disagree About Privacy:
Markets as Privacy, Identity and Trust
Mechanisms

                 Jonathan Cave
                13 August 2012
       EINS – Internet Science Network of
          Excellence Summer School
Short version
•  Privacy is a relatively recent invention
•  New technologies challenge the underlying
   assumptions
  –  What is private?
  –  Are data and identities the same thing?
  –  How does privacy relate to risk and (a sense of) security?
•  Easy answers are best avoided:
  –  More privacy, trust, security are better
  –  Privacy is dead
  –  Identities should be strong and unique
  –  Collect all the data you can, then work out how to protect
     or re-use them

Markets play a crucial role
•  Markets can protect or erode privacy – or change
   the way it operates. This can lead to markets in
   privacy (protection or invasion) itself.
•  Private data are increasingly valuable – this can lead
   to markets in personal data and information
   –  Some of this value is created by use of PII, and should be
      shared
   –  Some is merely captured by technology or given away by
      inattention
   –  My data may say something about me, people like me or
      you
   –  Not everything of value needs to be protected by property
      rights
•  Privacy of action is also important and may need …
Outline
•    Working definitions
•    A networked (abstract) view of rights regimes
•    Privacy as a human and/or economic right
•    Social mechanisms - rights in market settings
•    Privacy and markets
•    Topics for discussion




WORKING DEFINITIONS OF
PRIVACY
Some essential building blocks
•    Privacy
•    Security
•    Identity
•    Trust
•    Technical tools: games, mechanism design,
     networks, lattices, partitions




Privacy of what?
•  Privacy of action
•  The right to be let alone
•  Relational privacy
•  Informational privacy
•  Spatial privacy
•  Corporeal privacy
Privacy…




•  (inter) subjectivity –
   –  my view, your view, others’
      views
   –  A regress: I think that you
      think that… this is private
•  Hidden in plain sight –
   private, invisible or beneath
   notice?
•  Functions of privacy
Privacy in the Information* Society
* or “Knowledge” or “Belief”

 •  ‘Protected space’ has evolved to include bodies, actions, history and
    judgements
 •  Privacy as a right or an interest
      –  Privacy interests can be traded-off, sold or given away
      –  Privacy rights are
          •  deeper; linked to self-control, -respect and –responsibility
          •  limited for children, criminals, public figures
          •  economic (FIPP) or ‘human’ (OECD)
 •  Privacy is also subjective
      –  What infringes my privacy may be of no consequence to you
      –  Actions relating to privacy may trigger conflicts or open dialogue
 •  Either view is contingent or uncertain. Things change, but
      –  It is hard to claw back information
      –  It may be equally hard to reveal it at a later date
      –  Private information may involve opinion as well as fact


Privacy and publicity
 •  We are all more or less public figures
    –  We cannot control what is known about ourselves
    –  We do not carefully choose what to reveal
    –  The collective judgement may be a stampede
 •  This may be self-fulfilling
    –  ‘Give a dog a bad name…’
    –  Particularly true where collective judgement brings us into or out of
       the public eye
 •  Privacy may be protected by limiting access or flooding
    observers
 •  Privacy is perhaps most important as a societal mechanism
    to
    –    Let us act for ourselves
    –    Provide respite and recovery
   –    Provide a currency of goodwill or trust
    –    Give us a reason to be trustworthy
Identity
•  Used and abused in ever more profound and ever more trivial
   ways
•  Multiplies
   –  by design or otherwise
   –  for good (compartmentalisation) or ill (accountability)
•  Converges and coalesces through data-mining, persistence,
   sharing
•  How many should we have; what pulls them together or
   apart?
•  More identity is not always better:
   –    Anonymous (cash) transactions are cheap – ID costs may deter good trades
   –    Privacy and anonymity interests may limit ID
   –    Reliance on (technical) ID may crowd out finer (character) judgement
   –    Powerful ID is attractive and potentially corrupting
   –    Opting out may become widespread – or impossible
•  Growing tensions between (relatively) unique physical identity and increasingly
   fragmented, useful or effective legal identities
Trust
•  If technologies and ‘new market’ institutions provide the warp of
   the social fabric, trust provides the weft
•  Trust means different things to people, systems and
   organisations
•  Trust is central to the relation of privacy and security:
  – Customers must trust business security arrangements to
    safeguard their privacy
  – Personal privacy and system security form coffer dams
    against attack
•  Trust always involves an ‘incomplete contract’ –
   –  Monitoring dissipates the savings of trust
   –  Assurance (penalties) ≠ insurance (indemnities)
   –  Reputation and identity are informal versions
•  Trust and trustworthiness need to be appropriately matched:

                 Trustworthy                               Untrustworthy
   Trusting      Appropriate delegation,                   Enforcement costs,
                 specialisation                            adverse incidents
   Untrusting    Excess contracting, monitoring            Lost gains from trade,
                 costs; race-to-the-bottom                 inappropriate risk allocation

Technical elements I
•  Games:
   –  Players, strategies, information, preferences, solution concepts
   –  Non-cooperative, bargaining, cooperative
•  Mechanism design:
   –  solution concepts help us characterise outcomes of strategic
      situations
   –  mechanism design lets us design rules to favour desirable outcomes
•  Networks
   –  Often binary graphs (nodes connected by links, subsets of N²) – may
      be necessary to consider n-ary networks (subsets of 2^N);
   –  Links have strength, direction, duration, state dependence, salience,
      subjectivity
   –  Links and nodes are dual
   –  A topology (notion of closeness) with parameters (path length,
      clustering, etc.)
   –  Much network theory comes from electronics – emphasis on ‘shortest
      paths’ and ‘nearest neighbours’ – clearly needs relaxation for privacy
      (a small illustrative sketch follows)
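As a concrete illustration of the topology parameters just listed, here is a minimal sketch in Python (the toy graph and node names are my own, not from the slides): hop distance by breadth-first search and a local clustering coefficient on a small undirected graph.

```python
# Toy undirected graph as an adjacency dict; nodes could be people, data items or devices.
from collections import deque

G = {
    "a": {"b", "c"},
    "b": {"a", "c", "d"},
    "c": {"a", "b"},
    "d": {"b", "e"},
    "e": {"d"},
}

def shortest_path_length(graph, source, target):
    """BFS distance in hops; None if target is unreachable."""
    seen, frontier = {source}, deque([(source, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if node == target:
            return dist
        for nbr in graph[node]:
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, dist + 1))
    return None

def clustering(graph, node):
    """Fraction of a node's neighbour pairs that are themselves linked."""
    nbrs = list(graph[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k) if nbrs[j] in graph[nbrs[i]])
    return 2 * links / (k * (k - 1))

print(shortest_path_length(G, "a", "e"))   # 3 hops: a-b-d-e
print(clustering(G, "b"))                  # 1/3: only the (a, c) pair of b's neighbours is linked
```

The same machinery carries over to privacy questions, but, as the slide notes, ‘shortest path’ and ‘nearest neighbour’ measures may need relaxing there (for instance, counting every path along which information could leak, not just the shortest one).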
Technical elements II
•  More networks
   –  Networks are layered (people, data, ideas, things,…)
   –  Self-organised networks
   –  Epistemic networks: ‘knows’ as links
•  Lattices:
   –  Partially ordered sets in which any two elements have a GLB and LUB;
      complete if every subset does
   –  Tarski theorem: isotone functions on complete lattices have fixed
      points
•  Partitions:
   –  Dividing a set into an exhaustive collection of disjoint subsets
   –  Used to describe information (subsets are ‘events’), rights (below)
   –  Partitions make a lattice (sketched after this slide) – agreeing to disagree
      as an example of Tarski
•  Models of communication, association, behaviour
   and the propagation of risk
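To make the partition lattice concrete, here is a minimal sketch (the two toy partitions are my own choice): the coarsest common refinement of two partitions, obtained by intersecting blocks, is the greatest lower bound when finer partitions are taken as lower in the order.

```python
from itertools import product

def common_refinement(p, q):
    """Coarsest common refinement of two partitions of the same ground set,
    each given as a list of frozensets: intersect every pair of blocks and
    drop the empty intersections."""
    return [a & b for a, b in product(p, q) if a & b]

# Two hypothetical observers partition the same six situations differently.
P = [frozenset({1, 2, 3}), frozenset({4, 5, 6})]
Q = [frozenset({1, 2}), frozenset({3, 4}), frozenset({5, 6})]

R = common_refinement(P, Q)
print([set(b) for b in R])       # [{1, 2}, {3}, {4}, {5, 6}] -- finer than both P and Q

# Refinement is the partial order of the lattice: every block of R sits inside
# some block of P and some block of Q.
assert all(any(b <= a for a in P) and any(b <= c for c in Q) for b in R)
```

Information can only get finer under this operation, which is what drives the Tarski-style convergence behind the agreeing-to-disagree argument used later in the talk.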
PRIVACY AS A HUMAN AND/
OR ECONOMIC RIGHT
Transatlantic and intergenerational tussle
•  EU version – privacy as fundamental human right
   –  Primarily data protection
   –  Inalienable, with emphasis on consent (‘cookie law’)
   –  Right to be forgotten
•  US version – privacy as economic right
   –  Opting in/out
   –  Personalised or class profiling
   –  Three-party involvement
•  Tussle – mines in the “Safe Harbo(u)r”
•  Consequence – neither human right nor economic value are
   protected
•  Other issues
   –  Government involvement
   –  Impact of national security, crime prevention, anti-terrorism
       •  ACTA and DPI as a special case
A NETWORKED (ABSTRACT)
VIEW OF RIGHTS REGIMES
A suggested framework for rights regimes
•  Rights may be seen as a lattice
    –  Based on a partition into ‘equivalent’ situations or outcomes
    –  Partially ordered by inclusion (finer distinctions)
•  This creates a mechanism for communication and negotiation
    –  A language L to map a (set of) situations E into public action or
       utterance L(E)
    –  First round – all parties form their judgement and ‘do their thing’
    –  Second round – each party refines his judgement based on what
       others have done, leading to a (finer) posterior
    –  Process converges (by Tarski’s fixed-point theorem) to a common-knowledge
       consensus (sketched in code after this slide)
    –  Union-consistency: if E ∩ E′ = ∅ and L(E) = L(E′), then L(E) = L(E ∪ E′)
    –  If the language is union consistent, agreeing to disagree is impossible
•  Can be applied to options and outcomes
•  Public language – right to act
•  Further partial order: preference over actions.
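A hedged sketch of the round-by-round process described above, in the spirit of Aumann’s agreement result (the state space, event and information partitions below are illustrative assumptions of mine, not from the slides): each agent announces the posterior probability of an event given their current information, the other refines their partition by what that announcement reveals, and the posteriors reach a common value after finitely many exchanges.

```python
from fractions import Fraction
from itertools import product

STATES = frozenset(range(1, 10))             # nine equally likely states (toy example)
E = frozenset({3, 4, 5, 6})                  # the event the parties discuss

# Hypothetical information partitions for two agents.
P1 = [frozenset({1, 2, 3}), frozenset({4, 5, 6}), frozenset({7, 8, 9})]
P2 = [frozenset({1, 2, 3, 4}), frozenset({5, 6, 7, 8}), frozenset({9})]

def cell(partition, w):
    """The block of the partition containing state w."""
    return next(c for c in partition if w in c)

def posterior(block):
    """P(E | block) under the uniform prior."""
    return Fraction(len(block & E), len(block))

def announcement_partition(partition):
    """States grouped by the posterior the speaker would announce there."""
    groups = {}
    for w in STATES:
        groups.setdefault(posterior(cell(partition, w)), set()).add(w)
    return [frozenset(g) for g in groups.values()]

def refine(listener, announcement):
    """Common refinement: the listener learns which announcement block obtains."""
    return [a & b for a, b in product(listener, announcement) if a & b]

true_state = 5
p1, p2 = posterior(cell(P1, true_state)), posterior(cell(P2, true_state))
print("before talking:", p1, p2)             # 1 vs 1/2 -- they disagree

for _ in range(10):                          # finitely many refinements suffice
    if p1 == p2:
        break
    P2 = refine(P2, announcement_partition(P1))   # agent 1 announces, agent 2 updates
    P1 = refine(P1, announcement_partition(P2))   # agent 2 announces, agent 1 updates
    p1, p2 = posterior(cell(P1, true_state)), posterior(cell(P2, true_state))
    print("after an exchange:", p1, p2)      # posteriors converge to a common value
```

With a language that is not union-consistent (for example, one that only reports coarse categories), the same back-and-forth can stop short of agreement, which is why the union-consistency condition on the slide matters.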
SOCIAL MECHANISMS -
RIGHTS IN MARKET SETTINGS
Rights as property rights
•  Personal data and actions have externalities – this
   can lead to market failures
•  Individual property rights can
  –  Prevent encroachment (even if non-transferable)
  –  Facilitate trades and bargains
  –  Encourage optimal use of information
  –  Produce strategic manipulation and distortion (‘acting out’)
•  Collective property rights may be needed
  –  Informational commons
  –  Jointly private data (esp. of transactions)
  –  Conventions and rules
•  Bundling and unbundling are vital
Privacy preferences and markets
•  Use of personal profiling for targeted third-party
   monetisation
•  What information is ‘sensitive’?
•  Is data privacy linked to PII?
•  Business models
   –  Harvest and resell behavioural data that reveals preferences
   –  Mine and recombine stored profile information
   –  ‘Nudge’ users into preferred actions
   –  Sell ID theft and other forms of privacy protection
   –  Privacy intermediaries
   –  Privacy as a ‘local’ public good or a social construct
   –  Privacy as an asset (with derivatives)
   –  PITs and PETs (privacy-invasive and privacy-enhancing technologies)
Efficiency
•  Selection vs. incentives – who should bear the risks
   and costs of privacy protection?
  –  Balance, power to act, preferences, risk aversion,
     resilience
  –  Repudiation and re-issue
  –  Protecting people from themselves
•  Unintended (behavioural) consequences
  –  Cynicism, paranoia, opportunism leading to poor data,
     absent data, crime
  –  Privacy as a social construct
  –  Crowding in and crowding out
  –  Changes to accountability, responsibility and transparency

PRIVACY AND MARKETS

Privacy affects the functioning of many
markets
•  Example – open, closed and discretionary order
   books in financial markets (how to interpret trade
   data)
•  Trust in automated transactions systems
•  Exploiting asymmetries of trust – revelation of
   private data as a trust-enhancing mechanism
•  Strong differences in national and cultural attitudes
•  Mutually assured identity




Some peculiarities of the market environment
•  Network effects and interoperability:
     –  tipping equilibrium (“Winner takes all”)
     –  Excess volatility or excess inertia
     –  Norms and conventions (cohesion vs. contagion)
•    Security economics (hard shells and soft centres)
•    Reluctance to exchange good and bad information
•    IPR and standards
•    Legal liability and public confidence
•    The importance of the public sector
     –  Large-scale public procurement and launching customers
     –  Support for innovation and standardisation
     –  Direct, self- and co-regulation
•  ‘Splitting’ between the Wild West and the Walled Garden
•  Two bad puns:
     –  Trust and anti-trust
     –  Security and securities
SOME EXAMPLES

An example: biometrics and privacy
•  Strength of biometrics can threaten privacy
   –  Unauthorised or invasive use of data, indirect abuse of personal data
   –  Inappropriate or non-consensual identification, denial of identity services
   –  Even if all the data are accurate, portions may give misleading or invasive
      impressions
   –  May give away too much
   –  People may not be careful enough – they certainly don’t seem to be.
•  Biometrics can also enhance privacy
   –  Mutual identification to limit inappropriate access
   –  May remove need for more invasive data gathering
   –  Protection through weakness
       •  Limited scalability
       •  Lack of interoperability standards
       •  Proprietary interest in collected data
       •  Need for cooperation and consent
   –  Commercial interest in offering security of data and identity-based decisions
   –  Technical tricks: cancellation, liveness tests, degradation
   –  Strongly anonymised records
Some examples
•  CCTV cameras in public and shared private spaces
      –    London has more than the whole of the US
      –    Every aspect of life is watched by someone
      –    Direct and measurable impacts – some perverse
      –    Backed by technology (ANPRS, face recognition, voice analysis)
      –    Linked to direct intervention
      –    Blurred public-private boundaries
•    Hoodies and Niqābs
•    Biometrics
•    DNA
•    Data mashing and other ‘recombinant’ data uses
•    Loyalty cards and commercial profiling
•    Virtual worlds


Some examples
•    CCTV cameras in public and shared private spaces


•  Hoodies and Niqābs
      –  Religious and ‘tribal’ group identities, or personal freedom?
      –  Do we trust those who withhold their identities?
           •  In commercial spaces
           •  In employment
      –  To what extent are they chosen?
      –  To what extent does our reaction force their choice?
•    Biometrics
•    DNA
•    Data mashing and other ‘recombinant’ data uses
•    Loyalty cards and commercial profiling
•    Virtual worlds




Some examples
•    CCTV cameras in public and shared private spaces
•    Hoodies and Niqābs


•  Biometrics
      –    A pervasive ‘strong’ form of identity – perhaps too strong?
      –    Merely physical identity
      –    Can be used for identification, authentication and indexing
      –    Confusion about technology and human factors
            •  Real vs. behavioural impacts
            •  Type I, II and III errors
      –  Where is the private sector?
      –  Privacy and utility currently protected by weaknesses in technology
•    DNA
•    Data mashing and other ‘recombinant’ data uses
•    Loyalty cards and commercial profiling
•    Virtual worlds


Some examples
•    CCTV cameras in public and shared private spaces
•    Hoodies and Niqābs
•    Biometrics


•  DNA
      –    Like biometrics, a link to the physical body
      –    Unlike biometrics: persistent traces and a durable ‘template’
      –    “Forensic” use (for legal and commercial decisions)
      –    May indicate more than identity (health, capabilities, kinship)
      –    Silent as to time, intent

•    Data mashing and other ‘recombinant’ data uses
•    Loyalty cards and commercial profiling
•    Virtual worlds




Some examples
•    CCTV cameras in public and shared private spaces
•    Hoodies and Niqābs
•    Biometrics
•    DNA


•  Data mashing and other ‘recombinant’ data uses
      –  Refers to combination of data from different sources
      –  Hard to reconcile with existing privacy protections - informed consent
         in a networked world
      –  Identification not necessary for privacy infringement
      –  The liability and intellectual property issues are profound and
         unsolved
      –  Meanwhile, commercial and civil society development is racing ahead.
•    Loyalty cards and commercial profiling
•    Virtual worlds




Some examples
•    CCTV cameras in public and shared private spaces
•    Hoodies and Niqābs
•    Biometrics
•    DNA
•    Data mashing and other ‘recombinant’ data uses

•  Loyalty cards and commercial profiling
      –    Who owns personal data (recent security breaches)?
      –    The practice is old; the power and scope are new
      –    A change in the implied client-customer relation
      –    Obstacle to search and competition or gateway to mass
           personalisation?
•    Virtual worlds




Some examples
•    CCTV cameras in public and shared private spaces
•    Hoodies and Niqābs
•    Biometrics
•    DNA
•    Data mashing and other ‘recombinant’ data uses
•    Loyalty cards and commercial profiling

•  Virtual worlds
      –  From transaction spaces to social networks, Second Life and
         MMORPGs
      –  Mutual and proportionate identification
      –  Who is the relevant person?
      –  Delegated identity for transactional avatars
      –  The closeness of chosen identities




Governance of privacy and identity



   [Diagram: four interacting actors – Administrations, Business, Technology and
   Citizens (consumers, communities, civil society)]
TOPICS FOR DISCUSSION

A warning from history
•  Business, government and civil society all have strong stakes, but start
   from different places
•  Isolated events exert disproportionate influence
•  Different agendas involved in privacy and security discussions are not
   necessarily consistent
•  Challenge to business is to embrace these issues and take joint
   ownership
•  Various scenarios are within our reach
               High security                             Low security

High Privacy   o  Virtual agora or closed community      o  Peer-to-peer

Low privacy    o  The surveillance society (in theory)   o  The surveillance society (in practice)



•  Failure through success – big data analytics
•  Success through failure – learning to be careful and policy
   improvement


ADDITIONAL EXAMPLES OF
TENSION BETWEEN
INTERNET INNOVATIONS AND
PRIVACY
Privacy and innovation – ST systems model


   Market failures                           System failures


   Positive externalities (spillovers)       Failures in infrastructural provision
                                             and investment
   Public goods and appropriability          Lock-in / path dependency failures


   Imperfect and asymmetric                  Institutional failures
   information
   Market dominance                          Interaction failures


                                             Capabilities failures



                     Sources: Smith (1999), Martin and Scott (2000) and EC (2006)
Privacy and innovation – ST systems model

    Market failures                     System failures


    Function creep                      Failures in infrastructural
                                        provision and investment
    ‘Tragedy of the data commons’       Lock-in / path dependency
                                        failures (‘opt-in’/’opt-out’)
    Transparency of data subjects       Mismatch regulatory practices
    versus opacity of systems           and data practices
    No incentives for newcomers with    Interaction failures
    privacy as USP
                                        Privacy authorities and law
                                        enforcement practices


Case-study examples – Cloud Computing


•  Growing EU market: 68B (2011) -> 150B (2015)
•  Service layers: Software as a Service, Platform as a Service, Infrastructure as a Service
•  “A model for enabling convenient, on-demand network access to a shared pool of
   configurable computing resources” (US NIST)
•  Motives for adoption: cost reduction; cost accounting; time to market; greening of ICT
Cloud computing - 2
•  Tension 1 – Data controller / data processor (Art 29 WP; Art 17 95/46: security measures)
•  Tension 2 – Informed choice and consent (auditing SOC2/SOC3 including privacy and security)
•  Tension 3 – Ownership, confidentiality and law enforcement
•  Tension 4 – Appropriate SLAs: data integrity, data disclosure, data preservation,
   data location/transfer, rights over services/content, property rights and duties
   (Queen Mary’s univ. study)
•  Tension 5 – User expectations vs privacy; WTP vs WTA, right of ownership
   (data portability), right to access (Facebook, GMail)

•  Approach/solution 1 – Technology (encryption): SIENA (from IaaS to PaaS and SaaS)
•  Approach/solution 2 – Security as a Service; Forrester: $1.5B market (2015);
   BUT Apple’s SDK for iOS pays only moderate attention to ‘concern for user’s data’
   (only in the closed Walled Garden of the App Store)
•  Approach/solution 3 – Cloud neutral approach?
Case-study examples – Behavioural
targeting
•  Policy pressure: implementation of a stricter ePrivacy article on the use of cookies, June 2011
•  Market forecast: $4.4B in 2012
Behavioural targeting - 2
•  Tension 1 – Explicit and informed prior consent (Art 29 WP; ePrivacy directive art. 5(3));
   leads to unwanted effects (pop-ups)
•  Tension 2 – New intrusive technologies and tools: respawning (‘evercookie’);
   HTML5 persistent cookies; device fingerprinting (unique consumer identification)
•  Tension 3 – Trust and confidence: consumers show reluctance when confronted with BT
   practices; cookie practices hard to understand and to act upon (Flash cookies);
   generic privacy policies not informative and too long
•  Tension 4 – Regulation is perceived to distort business practices

•  Approach/solution 1 – Policy approach on informed consent (‘browser settings are sufficient’)
•  Approach/solution 2 – Control instruments for users: ‘Track me not’ browser button;
   ‘Advertising Icon Option’; transparent privacy policies
•  Approach/solution 3 – Different approaches: just-in-time contextual advertising
   (consumers show reluctance when confronted with BT practices)
Case-study examples – Location based services

                        All data in device
                       All data in network
                     Linkage between media




   GPS


                                                                    Mash up



                       Location tracking




     Friend finder
                                                             Google Street View


                      Internet Privacy and Identity, Trust                        45
                         and Reputation Mechanisms
Location based services - 2
•  Tension 1 – Regulatory practices: 2002/58/EC harmonisation of opt-in consent and
   withdrawal of consent; Data Retention Directive 2006/24/EC (financial burden on
   telcos and ISPs); conflicting regulatory frameworks (definitions of personal data,
   traffic data, location data)
•  Tension 2 – Strict regulatory practices: Switzerland: blur all faces, all number
   plates, sensitive facilities, clothing; Art 29 WP: storage of photos reduced from
   12 to 6 months
•  Tension 3 – Lack of user control (location tracking): gathering of profiles
   (Malte Spitz); selling of aggregate data (TomTom)
•  Tension 4 – Collection of sensitive data (faces, number plates, clothing, buildings)

•  Approach/solution 1 – Soft regulatory practices: offering opt-out (Germany: 244,000 citizens)
•  Approach/solution 2 – Control instruments for users: switching off GPS
•  Approach/solution 3 – New technologies: automatic face blurring; number plate blurring
Case-study examples – RFID
  Ø  Building block for Internet of
                    things
 Ø  Growth $5.03B (2009) - $5.63B
                   (2010)
    Ø  Use in multitude of domains
       ü  health care; cattle; pets;
                logistics; ..
               Ø  Unique ID
     Ø  Limited control and choice/
             consent for user




                                        Internet Privacy and Identity, Trust   47
                                           and Reputation Mechanisms
RFID - 2
•  Tension 1 – Awareness raising by EDPS, Art 29 WP, consumer groups: limited awareness
   by consumers; undesired disclosure of data; EC Recommendation of 12 May 2009;
   industrial PIA for RFID
•  Tension 2 – Critical approach from industry: RFID singled out as a privacy-intrusive
   technology; privacy problems are in the back end; data encryption in the chip is
   costly; disabling RFID means limited access to after-sales services
•  Tension 3 – Convergence of technologies with privacy implications: biometrics
   (fingerprint recognition); corporeal intrusion (swarm technology)

•  Approach/solution 1 – Regulatory practices: privacy impact assessment for RFID
•  Approach/solution 2 – Control instruments for users: ‘killer application’;
   deep sleep mode; transparency tools (?)
Case-study Biometrics
•  Growing market: $4.2B (2010) - $11.2B (2015)
•  Largest share: fingerprinting
•  Facial, iris and voice show higher CAGR
•  Market driver: homeland security
•  Decentralised systems (authentication)
•  Centralised systems (fraud detection, illegal and criminal activities)
Biometrics - 2
•  Tension 1 – Storage of sensitive data: false positives; third-party use; consent and choice
•  Tension 2 – Limited accuracy: enrolment/identification (UK, NL)
•  Tension 3 – Back-firing of public failures on private business: distrust of
   decentralised biometric systems (single sign-on, access systems)
•  Tension 4 – Public distrust of premium services: advantageous for specific groups
   (air transport, banking)

•  Approach/solution 1 – Regulatory practices: globalisation of regulation
   (US-VISIT/SIS/VIS …); ICAO standardisation; fine-tuning EU regulations (under 12 years of age)
•  Approach/solution 2 – Control instruments for users: transparency tools (?)
•  Approach/solution 3 – Offering surplus value: user friendliness; speed (single sign-on);
   trust (one-time passwords); added-value services (premium)
•  Approach 4 – Demonstrating public value: identity fraud (double dippers); crowd control
Concluding the cases
•  Legal/regulatory issues abound
     –  Stronger enforcement; stricter adherence to privacy constraints;
        homeland security
     –  Harmonisation fails; different approaches in different countries
     –  Influx of different regimes (sector-specific regulations, police and
        judicial coordination)
•  Privacy intrusions are part of public services as well! Specific uses and
   misuses are hard to differentiate for the public at large (fraud detection vs
   commercial use of data)
•  Privacy innovation is restricted (face blurring technology, decentralised
   biometrics)
•  Usually soft approach: awareness raising, opt-in/opt-out offers;
   transparency measures
•  Business practices mostly oriented towards data collection and use;
   privacy is only secondary to business models
•  Companies with inherent privacy friendly approaches have modest
   market shares and are not very visible to the public at large

Contenu connexe

Tendances

д-р Лючиана Дюранти – Расширенная версия презентации на английском языке к се...
д-р Лючиана Дюранти – Расширенная версия презентации на английском языке к се...д-р Лючиана Дюранти – Расширенная версия презентации на английском языке к се...
д-р Лючиана Дюранти – Расширенная версия презентации на английском языке к се...
Natasha Khramtsovsky
 
2 7-2013-big data and e-discovery
2 7-2013-big data and e-discovery2 7-2013-big data and e-discovery
2 7-2013-big data and e-discovery
Exterro
 
4. florence dupre new usagesprivacy legal framework fd
4. florence dupre new usagesprivacy legal framework fd4. florence dupre new usagesprivacy legal framework fd
4. florence dupre new usagesprivacy legal framework fd
GlobalForum
 
NIST Cybersecurity Framework Background and Review | Jack Whitsitt
NIST Cybersecurity Framework Background and Review | Jack WhitsittNIST Cybersecurity Framework Background and Review | Jack Whitsitt
NIST Cybersecurity Framework Background and Review | Jack Whitsitt
Jack Whitsitt
 
IABC social media for government by Jeff Braybrook
IABC social media for government by Jeff BraybrookIABC social media for government by Jeff Braybrook
IABC social media for government by Jeff Braybrook
Kristine Simpson
 
Hit Conference Flyer 2012 6.26
Hit Conference Flyer 2012 6.26Hit Conference Flyer 2012 6.26
Hit Conference Flyer 2012 6.26
kcheaney
 
Social media and records management challenges 2012 09-17-m
Social media and records management challenges 2012 09-17-mSocial media and records management challenges 2012 09-17-m
Social media and records management challenges 2012 09-17-m
Kevin Kim
 
The death of data protection sans obama
The death of data protection sans obamaThe death of data protection sans obama
The death of data protection sans obama
Lilian Edwards
 

Tendances (18)

Hivos and Responsible Data
Hivos and Responsible DataHivos and Responsible Data
Hivos and Responsible Data
 
д-р Лючиана Дюранти – Расширенная версия презентации на английском языке к се...
д-р Лючиана Дюранти – Расширенная версия презентации на английском языке к се...д-р Лючиана Дюранти – Расширенная версия презентации на английском языке к се...
д-р Лючиана Дюранти – Расширенная версия презентации на английском языке к се...
 
Distributed defense against disinformation: disinformation risk management an...
Distributed defense against disinformation: disinformation risk management an...Distributed defense against disinformation: disinformation risk management an...
Distributed defense against disinformation: disinformation risk management an...
 
д-р Лючиана Дюранти - Презентация на английском языке к семинару в Москве 23 ...
д-р Лючиана Дюранти - Презентация на английском языке к семинару в Москве 23 ...д-р Лючиана Дюранти - Презентация на английском языке к семинару в Москве 23 ...
д-р Лючиана Дюранти - Презентация на английском языке к семинару в Москве 23 ...
 
SemTech West 2011 - Digital Provenance
SemTech West 2011 - Digital ProvenanceSemTech West 2011 - Digital Provenance
SemTech West 2011 - Digital Provenance
 
2 7-2013-big data and e-discovery
2 7-2013-big data and e-discovery2 7-2013-big data and e-discovery
2 7-2013-big data and e-discovery
 
The Business(es) of Disinformation
The Business(es) of DisinformationThe Business(es) of Disinformation
The Business(es) of Disinformation
 
4. florence dupre new usagesprivacy legal framework fd
4. florence dupre new usagesprivacy legal framework fd4. florence dupre new usagesprivacy legal framework fd
4. florence dupre new usagesprivacy legal framework fd
 
Towards Socially-Responsible Management of Personal Information in Social Net...
Towards Socially-Responsible Management of Personal Information in Social Net...Towards Socially-Responsible Management of Personal Information in Social Net...
Towards Socially-Responsible Management of Personal Information in Social Net...
 
Community Disaster Incident Response
Community Disaster  Incident ResponseCommunity Disaster  Incident Response
Community Disaster Incident Response
 
NIST Cybersecurity Framework Background and Review | Jack Whitsitt
NIST Cybersecurity Framework Background and Review | Jack WhitsittNIST Cybersecurity Framework Background and Review | Jack Whitsitt
NIST Cybersecurity Framework Background and Review | Jack Whitsitt
 
IABC social media for government by Jeff Braybrook
IABC social media for government by Jeff BraybrookIABC social media for government by Jeff Braybrook
IABC social media for government by Jeff Braybrook
 
Introduction to National Critical Infrastructure Cyber Security: Background a...
Introduction to National Critical Infrastructure Cyber Security: Background a...Introduction to National Critical Infrastructure Cyber Security: Background a...
Introduction to National Critical Infrastructure Cyber Security: Background a...
 
BYOD: Beating IT's Kobayashi Maru
BYOD: Beating IT's Kobayashi MaruBYOD: Beating IT's Kobayashi Maru
BYOD: Beating IT's Kobayashi Maru
 
CorpExecProfile
CorpExecProfileCorpExecProfile
CorpExecProfile
 
Hit Conference Flyer 2012 6.26
Hit Conference Flyer 2012 6.26Hit Conference Flyer 2012 6.26
Hit Conference Flyer 2012 6.26
 
Social media and records management challenges 2012 09-17-m
Social media and records management challenges 2012 09-17-mSocial media and records management challenges 2012 09-17-m
Social media and records management challenges 2012 09-17-m
 
The death of data protection sans obama
The death of data protection sans obamaThe death of data protection sans obama
The death of data protection sans obama
 

Similaire à Jonathan Cave, University of Warwick (Plenary): Agreeing to Disagree About Privacy: Markets as Privacy, Identity and Trust Mechanisms

DRC -- Cybersecurity concepts2015
DRC -- Cybersecurity concepts2015DRC -- Cybersecurity concepts2015
DRC -- Cybersecurity concepts2015
T. J. Saotome
 

Similaire à Jonathan Cave, University of Warwick (Plenary): Agreeing to Disagree About Privacy: Markets as Privacy, Identity and Trust Mechanisms (20)

Presentation on Information Privacy
Presentation on Information PrivacyPresentation on Information Privacy
Presentation on Information Privacy
 
Chp 05 anonymity, security, privacy and civil liberties (shared)
Chp 05   anonymity, security, privacy and civil liberties (shared)Chp 05   anonymity, security, privacy and civil liberties (shared)
Chp 05 anonymity, security, privacy and civil liberties (shared)
 
Innovative Ideas in Privacy Research.ppt
Innovative Ideas in Privacy Research.pptInnovative Ideas in Privacy Research.ppt
Innovative Ideas in Privacy Research.ppt
 
dexa.ppt
dexa.pptdexa.ppt
dexa.ppt
 
Internet Issues (How to Deal on Internet Security)
Internet Issues (How to Deal on Internet Security)Internet Issues (How to Deal on Internet Security)
Internet Issues (How to Deal on Internet Security)
 
Ethical Dilemmas/Issues in CyberWorld
Ethical Dilemmas/Issues in CyberWorldEthical Dilemmas/Issues in CyberWorld
Ethical Dilemmas/Issues in CyberWorld
 
Online Privacy, the next Battleground
Online Privacy, the next BattlegroundOnline Privacy, the next Battleground
Online Privacy, the next Battleground
 
E commerce
E commerce E commerce
E commerce
 
Chapter 3
Chapter 3Chapter 3
Chapter 3
 
Corporate Intelligence: Bridging the security and intelligence community
Corporate Intelligence: Bridging the security and intelligence communityCorporate Intelligence: Bridging the security and intelligence community
Corporate Intelligence: Bridging the security and intelligence community
 
Ns lecture5: Introduction to Computer, Information, and Network Security.
Ns lecture5: Introduction to Computer, Information, and Network Security.Ns lecture5: Introduction to Computer, Information, and Network Security.
Ns lecture5: Introduction to Computer, Information, and Network Security.
 
Cyber Security - ASGFOA
Cyber Security - ASGFOACyber Security - ASGFOA
Cyber Security - ASGFOA
 
Privacy & the Internet: An Overview of Key Issues
Privacy & the Internet: An Overview of Key IssuesPrivacy & the Internet: An Overview of Key Issues
Privacy & the Internet: An Overview of Key Issues
 
Data set Legislation
Data set   Legislation Data set   Legislation
Data set Legislation
 
internet security and cyber lawUnit1
internet security and  cyber lawUnit1internet security and  cyber lawUnit1
internet security and cyber lawUnit1
 
3 - Social Media and Enterprise
3 - Social Media and Enterprise3 - Social Media and Enterprise
3 - Social Media and Enterprise
 
week 7.pptx
week 7.pptxweek 7.pptx
week 7.pptx
 
Digital Identity
Digital Identity Digital Identity
Digital Identity
 
Physician Office Presentation
Physician Office PresentationPhysician Office Presentation
Physician Office Presentation
 
DRC -- Cybersecurity concepts2015
DRC -- Cybersecurity concepts2015DRC -- Cybersecurity concepts2015
DRC -- Cybersecurity concepts2015
 

Plus de i_scienceEU

Plus de i_scienceEU (20)

Internet science conference
Internet science conferenceInternet science conference
Internet science conference
 
[participants Communicating Privacy Risks to Users] EINS summer school
[participants Communicating Privacy Risks to Users] EINS summer school[participants Communicating Privacy Risks to Users] EINS summer school
[participants Communicating Privacy Risks to Users] EINS summer school
 
Kave Salamatian, Universite de Savoie and Eiko Yoneki, University of Cambridg...
Kave Salamatian, Universite de Savoie and Eiko Yoneki, University of Cambridg...Kave Salamatian, Universite de Savoie and Eiko Yoneki, University of Cambridg...
Kave Salamatian, Universite de Savoie and Eiko Yoneki, University of Cambridg...
 
Runa Sandvik, The Tor Project, London: Online Anonymity: Before and After th...
 Runa Sandvik, The Tor Project, London: Online Anonymity: Before and After th... Runa Sandvik, The Tor Project, London: Online Anonymity: Before and After th...
Runa Sandvik, The Tor Project, London: Online Anonymity: Before and After th...
 
Karmen Guevara, University of Cambridge: Dimensions of Identity, Trust and Pr...
Karmen Guevara, University of Cambridge: Dimensions of Identity, Trust and Pr...Karmen Guevara, University of Cambridge: Dimensions of Identity, Trust and Pr...
Karmen Guevara, University of Cambridge: Dimensions of Identity, Trust and Pr...
 
Joss Wright, Oxford Internet Institute (Plenary): Privacy-Preserving Data Ana...
Joss Wright, Oxford Internet Institute (Plenary): Privacy-Preserving Data Ana...Joss Wright, Oxford Internet Institute (Plenary): Privacy-Preserving Data Ana...
Joss Wright, Oxford Internet Institute (Plenary): Privacy-Preserving Data Ana...
 
Chris Marsden, University of Essex (Plenary): Regulation, Standards, Governan...
Chris Marsden, University of Essex (Plenary): Regulation, Standards, Governan...Chris Marsden, University of Essex (Plenary): Regulation, Standards, Governan...
Chris Marsden, University of Essex (Plenary): Regulation, Standards, Governan...
 
Lizzie Coles-Kemp, Royal Holloway University of London: Privacy Awareness: An...
Lizzie Coles-Kemp, Royal Holloway University of London: Privacy Awareness: An...Lizzie Coles-Kemp, Royal Holloway University of London: Privacy Awareness: An...
Lizzie Coles-Kemp, Royal Holloway University of London: Privacy Awareness: An...
 
Caspar Bowden EINS Summer School
Caspar Bowden EINS Summer SchoolCaspar Bowden EINS Summer School
Caspar Bowden EINS Summer School
 
Joanna Kulesza, University of Lodz: Transboundary Challenges of Privacy Prote...
Joanna Kulesza, University of Lodz: Transboundary Challenges of Privacy Prote...Joanna Kulesza, University of Lodz: Transboundary Challenges of Privacy Prote...
Joanna Kulesza, University of Lodz: Transboundary Challenges of Privacy Prote...
 
Network of Excellence in Internet Science (Supported Activities, Stavrakakis,...
Network of Excellence in Internet Science (Supported Activities, Stavrakakis,...Network of Excellence in Internet Science (Supported Activities, Stavrakakis,...
Network of Excellence in Internet Science (Supported Activities, Stavrakakis,...
 
Network of Excellence in Internet Science (Supported Activities, Callegati, U...
Network of Excellence in Internet Science (Supported Activities, Callegati, U...Network of Excellence in Internet Science (Supported Activities, Callegati, U...
Network of Excellence in Internet Science (Supported Activities, Callegati, U...
 
Network of Excellence in Internet Science (SEA4, Organisation of open calls, ...
Network of Excellence in Internet Science (SEA4, Organisation of open calls, ...Network of Excellence in Internet Science (SEA4, Organisation of open calls, ...
Network of Excellence in Internet Science (SEA4, Organisation of open calls, ...
 
Network of Excellence in Internet Science (SEA3, Dissemination & Cooperation,...
Network of Excellence in Internet Science (SEA3, Dissemination & Cooperation,...Network of Excellence in Internet Science (SEA3, Dissemination & Cooperation,...
Network of Excellence in Internet Science (SEA3, Dissemination & Cooperation,...
 
Network of Excellence in Internet Science (SEA2, Standardisation & Legislatio...
Network of Excellence in Internet Science (SEA2, Standardisation & Legislatio...Network of Excellence in Internet Science (SEA2, Standardisation & Legislatio...
Network of Excellence in Internet Science (SEA2, Standardisation & Legislatio...
 
Network of Excellence in Internet Science (SEA1, E-presence, Dissemination an...
Network of Excellence in Internet Science (SEA1, E-presence, Dissemination an...Network of Excellence in Internet Science (SEA1, E-presence, Dissemination an...
Network of Excellence in Internet Science (SEA1, E-presence, Dissemination an...
 
Network of Excellence in Internet Science (Multidisciplinarity and its Implic...
Network of Excellence in Internet Science (Multidisciplinarity and its Implic...Network of Excellence in Internet Science (Multidisciplinarity and its Implic...
Network of Excellence in Internet Science (Multidisciplinarity and its Implic...
 
Network of Excellence in Internet Science (Multidisciplinarity and its Implic...
Network of Excellence in Internet Science (Multidisciplinarity and its Implic...Network of Excellence in Internet Science (Multidisciplinarity and its Implic...
Network of Excellence in Internet Science (Multidisciplinarity and its Implic...
 
Network of Excellence in Internet Science (Multidisciplinarity and its Implic...
Network of Excellence in Internet Science (Multidisciplinarity and its Implic...Network of Excellence in Internet Science (Multidisciplinarity and its Implic...
Network of Excellence in Internet Science (Multidisciplinarity and its Implic...
 
Network of Excellence in Internet Science (JRA8, Internet for Sustainability,...
Network of Excellence in Internet Science (JRA8, Internet for Sustainability,...Network of Excellence in Internet Science (JRA8, Internet for Sustainability,...
Network of Excellence in Internet Science (JRA8, Internet for Sustainability,...
 

Dernier

1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
QucHHunhnh
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
PECB
 
Jonathan Cave, University of Warwick (Plenary): Agreeing to Disagree About Privacy: Markets as Privacy, Identity and Trust Mechanisms

9. Privacy in the Information* Society (* or “Knowledge” or “Belief”)
•  ‘Protected space’ has evolved to include bodies, actions, history and judgements
•  Privacy as a right or an interest
  –  Privacy interests can be traded-off, sold or given away
  –  Privacy rights are
     •  deeper; linked to self-control, self-respect and self-responsibility
     •  limited for children, criminals, public figures
     •  economic (FIPP) or ‘human’ (OECD)
•  Privacy is also subjective
  –  What infringes my privacy may be of no consequence to you
  –  Actions relating to privacy may trigger conflicts or open dialogue
•  Either view is contingent or uncertain. Things change, but
  –  It is hard to claw back information
  –  It may be equally hard to reveal it at a later date
  –  Private information may involve opinion as well as fact

10. Privacy and publicity
•  We are all more or less public figures
  –  We cannot control what is known about ourselves
  –  We do not carefully choose what to reveal
  –  The collective judgement may be a stampede
•  This may be self-fulfilling
  –  ‘Give a dog a bad name…’
  –  Particularly true where collective judgement brings us into or out of the public eye
•  Privacy may be protected by limiting access or flooding observers
•  Privacy is perhaps most important as a societal mechanism to
  –  Let us act for ourselves
  –  Provide respite and recovery
  –  Provide a currency of goodwill or trust
  –  Give us a reason to be trustworthy

11. Identity
•  Used and abused in ever more profound and ever more trivial ways
•  Multiplies
  –  by design or otherwise
  –  for good (compartmentalisation) or ill (accountability)
•  Converges and coalesces through data-mining, persistence, sharing
•  How many should we have; what pulls them together or apart?
•  More identity is not always better:
  –  Anonymous (cash) transactions are cheap – ID costs may deter good trades
  –  Privacy and anonymity interests may limit ID
  –  Reliance on (technical) ID may crowd out finer (character) judgement
  –  Powerful ID is attractive and potentially corrupting
  –  Opting out may become widespread – or impossible
•  Growing tensions between (relatively) unique physical identity and increasingly fragmented, useful or effective legal and … identities

12. Trust
•  If technologies and ‘new market’ institutions provide the warp of the social fabric, trust provides the weft
•  Trust means different things to people, systems and organisations
•  Trust is central to the relation of privacy and security:
  –  Customers must trust business security arrangements to safeguard their privacy
  –  Personal privacy and system security form coffer dams against attack
•  Trust always involves an ‘incomplete contract’:
  –  Monitoring dissipates the savings of trust
  –  Assurance (penalties) ≠ insurance (indemnities)
  –  Reputation and identity are informal versions
•  Trust and trustworthiness need to be appropriately matched (a toy numerical sketch follows below):
  –  Trusting + Trustworthy: appropriate delegation, specialisation
  –  Trusting + Untrustworthy: enforcement costs, costs of adverse incidents
  –  Untrusting + Trustworthy: excess contracting, monitoring costs; race-to-the-bottom
  –  Untrusting + Untrustworthy: lost gains from trade, inappropriate risk allocation

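As a toy illustration of the matching idea in the table above, the sketch below (my own construction; all payoff numbers are invented) compares the expected value of trusting versus not trusting as the probability that the counterparty is trustworthy varies.

    # Toy expected-value comparison for the trust/trustworthiness matrix above.
    # All numbers are illustrative only.
    def expected_value(trust, p_trustworthy,
                       gain=10.0,        # appropriate delegation, specialisation
                       loss=-15.0,       # enforcement costs, adverse incidents
                       monitoring=-3.0,  # excess contracting / monitoring costs
                       foregone=-4.0):   # lost gains from trade
        if trust:
            return p_trustworthy * gain + (1 - p_trustworthy) * loss
        # Not trusting: monitoring is paid regardless; trade gains are lost
        # only when the partner would in fact have been trustworthy.
        return monitoring + p_trustworthy * foregone

    for p in (0.3, 0.6, 0.9):
        print(p, expected_value(True, p), expected_value(False, p))
    # Trusting only pays once p_trustworthy is high enough -- trust and
    # trustworthiness need to be matched.
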
13. Technical elements I
•  Games:
  –  Players, strategies, information, preferences, solution concepts
  –  Non-cooperative, bargaining, cooperative
•  Mechanism design:
  –  solution concepts help us characterise outcomes of strategic situations
  –  mechanism design lets us design rules to favour desirable outcomes
•  Networks (a small worked example follows below)
  –  Often binary graphs (nodes connected by links, subsets of N²) – may be necessary to consider n-ary networks (subsets of 2^N)
  –  Links have strength, direction, duration, state dependence, salience, subjectivity
  –  Links and nodes are dual
  –  A topology (notion of closeness) with parameters (path length, clustering, etc.)
  –  Much network theory comes from electronics – the emphasis on ‘shortest paths’ and ‘nearest neighbours’ clearly needs relaxation for privacy

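To make the network vocabulary above concrete, here is a brief sketch using the third-party networkx package (the graph and all values are invented for illustration; networkx itself is not mentioned on the slide).

    # Illustrative only: a five-node toy graph and the 'topology parameters'
    # the slide lists (path length, clustering, density).
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("alice", "bob"), ("bob", "carol"), ("carol", "alice"),  # a clustered triangle
        ("carol", "dave"), ("dave", "erin"),                     # a chain hanging off it
    ])

    print(nx.average_shortest_path_length(G))  # average 'closeness' of nodes
    print(nx.average_clustering(G))            # how cliquish neighbourhoods are
    print(nx.density(G))                       # links present / links possible
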
14. Technical elements II
•  More networks
  –  Networks are layered (people, data, ideas, things, …)
  –  Self-organised networks
  –  Epistemic networks: ‘knows’ as links
•  Lattices:
  –  Partially-ordered sets – complete if the GLB and LUB of any two elements are in the set
  –  Tarski theorem: isotone functions on complete lattices have fixed points
•  Partitions:
  –  Dividing a set into an exhaustive collection of disjoint subsets
  –  Used to describe information (subsets are ‘events’) and rights (below)
  –  Partitions make a lattice – agreeing to disagree as an example of Tarski (a small sketch of the partition lattice follows below)
•  Models of communication, association, behaviour and the propagation of risk

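A minimal sketch (mine, not the author's) of the partition-lattice point above: for two partitions of the same finite set, the cell-wise common refinement and the finest common coarsening are the two lattice operations (which one is called 'meet' and which 'join' depends on the ordering convention chosen). The example partitions are invented.

    # Illustrative only: lattice operations on partitions of {0, ..., 8}.
    from itertools import product

    def refinement(p, q):
        # Common refinement: non-empty pairwise intersections of cells.
        return [a & b for a, b in product(p, q) if a & b]

    def coarsening(p, q):
        # Finest common coarsening: repeatedly merge cells of p and q that
        # overlap (connected components of the 'overlaps' relation).
        cells = [set(c) for c in p + q]
        merged = True
        while merged:
            merged = False
            for i in range(len(cells)):
                for j in range(i + 1, len(cells)):
                    if cells[i] & cells[j]:
                        cells[i] |= cells.pop(j)
                        merged = True
                        break
                if merged:
                    break
        return cells

    P1 = [{0, 1, 2}, {3, 4, 5}, {6, 7, 8}]   # 'rows'
    P2 = [{0, 3, 6}, {1, 4, 7}, {2, 5, 8}]   # 'columns'
    print(refinement(P1, P2))   # nine singleton cells
    print(coarsening(P1, P2))   # a single cell: here common knowledge is trivial
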
15. PRIVACY AS A HUMAN AND/OR ECONOMIC RIGHT

16. Transatlantic and intergenerational tussle
•  EU version – privacy as a fundamental human right
  –  Primarily data protection
  –  Inalienable, with emphasis on consent (‘cookie law’)
  –  Right to be forgotten
•  US version – privacy as an economic right
  –  Opting in/out
  –  Personalised or class profiling
  –  Three-party involvement
•  Tussle – mines in the “Safe Harbo(u)r”
•  Consequence – neither the human right nor the economic value is protected
•  Other issues
  –  Government involvement
  –  Impact of national security, crime prevention, anti-terrorism
•  ACTA and DPI as a special case

17. A NETWORKED (ABSTRACT) VIEW OF RIGHTS REGIMES

18. A suggested framework for rights regimes
•  Rights may be seen as a lattice
  –  Based on a partition into ‘equivalent’ situations or outcomes
  –  Partially ordered by inclusion (finer distinctions)
•  This creates a mechanism for communication and negotiation (a simulation sketch follows below)
  –  A language L maps a (set of) situations E into a public action or utterance L(E)
  –  First round – all parties form their judgement and ‘do their thing’
  –  Second round – each party refines his judgement based on what others have done, leading to a (finer) posterior
  –  The process converges, by Tarski, to a common knowledge consensus
  –  Union-consistency: if E ∩ E′ = ∅ and L(E) = L(E′), then L(E) = L(E ∪ E′)
  –  If the language is union-consistent, agreeing to disagree is impossible
•  Can be applied to options and outcomes
•  Public language – right to act
•  Further partial order: preference over actions

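The sketch below is my own reconstruction, under the usual common-prior assumption, of the two-round refinement process the slide describes: two agents with partitional information repeatedly announce their posterior for an event, condition on each other's announcements, and converge to a common posterior (no agreeing to disagree). The state space, event and partitions are invented for illustration.

    # Illustrative only: iterated announcement of posteriors until they agree.
    from fractions import Fraction

    STATES = range(9)                       # toy state space, uniform prior
    A = {0, 1, 4, 5, 8}                     # the event being estimated
    P1 = [{0, 1, 2}, {3, 4, 5}, {6, 7, 8}]  # agent 1's partition
    P2 = [{0, 3, 6}, {1, 4, 7}, {2, 5, 8}]  # agent 2's partition

    def cell(partition, w):
        return next(c for c in partition if w in c)

    def posterior(info):
        return Fraction(len(A & info), len(info))

    def announce(partition, knowledge):
        # Posterior given the agent's own cell intersected with what is
        # already publicly known if the true state were w.
        return {w: posterior(cell(partition, w) & knowledge[w]) for w in knowledge}

    true_state = 4
    knowledge = {w: set(STATES) for w in STATES}   # public information so far
    for round_no in range(10):
        msgs = []
        for P in (P1, P2):
            m = announce(P, knowledge)
            msgs.append(m[true_state])
            # Everyone discards states that would have produced a different announcement.
            knowledge = {w: {v for v in knowledge[w] if m[v] == m[w]} for w in STATES}
        print(round_no, msgs)
        if msgs[0] == msgs[1]:
            break   # posteriors coincide: the disagreement has disappeared
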
19. SOCIAL MECHANISMS – RIGHTS IN MARKET SETTINGS

20. Rights as property rights
•  Personal data and actions have externalities – this can lead to market failures
•  Individual property rights can
  –  Prevent encroachment (even if non-transferable)
  –  Facilitate trades and bargains
  –  Encourage optimal use of information
  –  Produce strategic manipulation and distortion (‘acting out’)
•  Collective property rights may be needed
  –  Informational commons
  –  Jointly private data (esp. of transactions)
  –  Conventions and rules
•  Bundling and unbundling are vital

21. Privacy preferences and markets
•  Use of personal profiling for targeted third-party monetisation
•  What information is ‘sensitive’?
•  Is data privacy linked to PII?
•  Business models
  –  Harvest and resell behavioural data that reveal preferences
  –  Mine and recombine stored profile information
  –  ‘Nudge’ users into preferred actions
  –  Sell ID-theft and other forms of privacy protection
  –  Privacy intermediaries
  –  Privacy as a ‘local’ public good or a social construct
  –  Privacy as an asset (with derivatives)
  –  PITs and PETs

22. Efficiency
•  Selection vs. incentives – who should bear the risks and costs of privacy protection?
  –  Balance, power to act, preferences, risk aversion, resilience
  –  Repudiation and re-issue
  –  Protecting people from themselves
•  Unintended (behavioural) consequences
  –  Cynicism, paranoia, opportunism leading to poor data, absent data, crime
  –  Privacy as a social construct
  –  Crowding in and crowding out
  –  Changes to accountability, responsibility and transparency

23. PRIVACY AND MARKETS

24. Privacy affects the functioning of many markets
•  Example – open, closed and discretionary order books in financial markets (how to interpret trade data)
•  Trust in automated transaction systems
•  Exploiting asymmetries of trust – revelation of private data as a trust-enhancing mechanism
•  Strong differences in national and cultural attitudes
•  Mutually assured identity

25. Some peculiarities of the market environment
•  Network effects and interoperability (a toy tipping simulation follows below):
  –  Tipping equilibrium (“winner takes all”)
  –  Excess volatility or excess inertia
  –  Norms and conventions (cohesion vs. contagion)
•  Security economics (hard shells and soft centres)
•  Reluctance to exchange good and bad information
•  IPR and standards
•  Legal liability and public confidence
•  The importance of the public sector
  –  Large-scale public procurement and launching customers
  –  Support for innovation and standardisation
  –  Direct, self- and co-regulation
•  ‘Splitting’ between the Wild West and the Walled Garden
•  Two bad puns:
  –  Trust and anti-trust
  –  Security and securities

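As a toy illustration of the tipping point mentioned in the first bullet, the sketch below (my own construction; the adjustment rule and all numbers are invented) shows how a simple network-effects dynamic drives adoption to all-or-nothing outcomes.

    # Illustrative only: adoption grows when the network benefit at the current
    # share exceeds a fixed outside option (normalised to 1), and shrinks otherwise.
    def simulate(share, benefit=1.2, steps=30):
        for _ in range(steps):
            share = min(1.0, max(0.0, share + 0.1 * (benefit * share - 1.0)))
        return round(share, 2)

    print(simulate(0.7))   # starts below the tipping share (~0.83) -> collapses to 0.0
    print(simulate(0.9))   # starts above it -> tips to 1.0: 'winner takes all'
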
26. SOME EXAMPLES

27. An example: biometrics and privacy
•  Strength of biometrics can threaten privacy
  –  Unauthorised or invasive use of data, indirect abuse of personal data
  –  Inappropriate or non-consensual identification, denial of identity services
  –  Even if all the data are accurate, portions may give misleading or invasive impressions
  –  May give away too much
  –  People may not be careful enough – they certainly don’t seem to be
•  Biometrics can also enhance privacy
  –  Mutual identification to limit inappropriate access
  –  May remove the need for more invasive data gathering
  –  Protection through weakness
     •  Limited scalability
     •  Lack of interoperability standards
     •  Proprietary interest in collected data
     •  Need for cooperation and consent
  –  Commercial interest in offering security of data and identity-based decisions
  –  Technical tricks: cancellation, liveness tests, degradation
  –  Strongly anonymised records

28. Some examples
•  CCTV cameras in public and shared private spaces
  –  London has more than the whole of the US
  –  Every aspect of life is watched by someone
  –  Direct and measurable impacts – some perverse
  –  Backed by technology (ANPRS, face recognition, voice analysis)
  –  Linked to direct intervention
  –  Blurred public–private boundaries
•  Hoodies and Niqābs
•  Biometrics
•  DNA
•  Data mashing and other ‘recombinant’ data uses
•  Loyalty cards and commercial profiling
•  Virtual worlds

29. Some examples
•  CCTV cameras in public and shared private spaces
•  Hoodies and Niqābs
  –  Religious and ‘tribal’ group identities, or personal freedom?
  –  Do we trust those who withhold their identities?
     •  In commercial spaces
     •  In employment
  –  To what extent are they chosen?
  –  To what extent does our reaction force their choice?
•  Biometrics
•  DNA
•  Data mashing and other ‘recombinant’ data uses
•  Loyalty cards and commercial profiling
•  Virtual worlds

30. Some examples
•  CCTV cameras in public and shared private spaces
•  Hoodies and Niqābs
•  Biometrics
  –  A pervasive ‘strong’ form of identity – perhaps too strong?
  –  Merely physical identity
  –  Can be used for identification, authentication and indexing
  –  Confusion about technology and human factors
     •  Real vs. behavioural impacts
     •  Type I, II and III errors (a worked base-rate example follows below)
  –  Where is the private sector?
  –  Privacy and utility currently protected by weaknesses in technology
•  DNA
•  Data mashing and other ‘recombinant’ data uses
•  Loyalty cards and commercial profiling
•  Virtual worlds

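A worked base-rate example (my numbers, not the slide's) of why Type I errors dominate when biometric identification is used to look for rare targets:

    # Illustrative only: even an accurate biometric match against a rare target
    # population is more likely to be a false positive than a true hit.
    prevalence = 1 / 10_000      # share of people actually on the watch list
    tpr = 0.99                   # true positive rate (sensitivity)
    fpr = 0.001                  # false positive rate (Type I error rate)

    p_match = tpr * prevalence + fpr * (1 - prevalence)
    p_target_given_match = tpr * prevalence / p_match
    print(round(p_target_given_match, 3))   # ~0.09: roughly 9 in 10 matches are false alarms
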
31. Some examples
•  CCTV cameras in public and shared private spaces
•  Hoodies and Niqābs
•  Biometrics
•  DNA
  –  Like biometrics, a link to the physical body
  –  Unlike biometrics: persistent traces and a durable ‘template’
  –  “Forensic” use (for legal and commercial decisions)
  –  May indicate more than identity (health, capabilities, kinship)
  –  Silent as to time, intent
•  Data mashing and other ‘recombinant’ data uses
•  Loyalty cards and commercial profiling
•  Virtual worlds

32. Some examples
•  CCTV cameras in public and shared private spaces
•  Hoodies and Niqābs
•  Biometrics
•  DNA
•  Data mashing and other ‘recombinant’ data uses
  –  Refers to the combination of data from different sources
  –  Hard to reconcile with existing privacy protections – informed consent in a networked world
  –  Identification is not necessary for privacy infringement
  –  The liability and intellectual property issues are profound and unsolved
  –  Meanwhile, commercial and civil society development is racing ahead
•  Loyalty cards and commercial profiling
•  Virtual worlds

33. Some examples
•  CCTV cameras in public and shared private spaces
•  Hoodies and Niqābs
•  Biometrics
•  DNA
•  Data mashing and other ‘recombinant’ data uses
•  Loyalty cards and commercial profiling
  –  Who owns personal data (recent security breaches)?
  –  The practice is old; the power and scope are new
  –  A change in the implied client–customer relation
  –  Obstacle to search and competition, or gateway to mass personalisation?
•  Virtual worlds

34. Some examples
•  CCTV cameras in public and shared private spaces
•  Hoodies and Niqābs
•  Biometrics
•  DNA
•  Data mashing and other ‘recombinant’ data uses
•  Loyalty cards and commercial profiling
•  Virtual worlds
  –  From transaction spaces to social networks, Second Life and MMORPGs
  –  Mutual and proportionate identification
  –  Who is the relevant person?
  –  Delegated identity for transactional avatars
  –  The closeness of chosen identities

35. Governance of privacy and identity (actors in the governance diagram)
•  Administrations
•  Business
•  Technology
•  Citizens (consumers, communities, civil society)

36. TOPICS FOR DISCUSSION

37. A warning from history
•  Business, government and civil society all have strong stakes, but start from different places
•  Isolated events exert disproportionate influence
•  The different agendas involved in privacy and security discussions are not necessarily consistent
•  The challenge to business is to embrace these issues and take joint ownership
•  Various scenarios are within our reach:
  –  High privacy + high security: virtual agora or closed community
  –  High privacy + low security: peer-to-peer
  –  Low privacy + high security: the surveillance society (in theory)
  –  Low privacy + low security: the surveillance society (in practice)
•  Failure through success – big data analytics
•  Success through failure – learning to be careful and policy improvement

38. ADDITIONAL EXAMPLES OF TENSION BETWEEN INTERNET INNOVATIONS AND PRIVACY

39. Privacy and innovation – ST systems model
•  Market failures:
  –  Positive externalities (spillovers)
  –  Public goods and appropriability
  –  Imperfect and asymmetric information
  –  Market dominance
•  System failures:
  –  Failures in infrastructural provision and investment
  –  Lock-in / path dependency failures
  –  Institutional failures
  –  Interaction failures
  –  Capabilities failures
Sources: Smith (1999), Martin and Scott (2000) and EC (2006)

40. Privacy and innovation – ST systems model (applied to privacy)
•  Market failures:
  –  Function creep
  –  ‘Tragedy of the data commons’
  –  Transparency of data subjects versus opacity of systems and data practices
  –  No incentives for newcomers with privacy as USP
•  System failures:
  –  Failures in infrastructural provision and investment
  –  Lock-in / path dependency failures (‘opt-in’/‘opt-out’)
  –  Mismatched regulatory practices
  –  Interaction failures: privacy authorities and law enforcement practices

41. Case-study examples – Cloud computing
•  Growing EU market: 68B (2011) -> 150B (2015)
•  Software as a Service, Platform as a Service, Infrastructure as a Service
•  “A model for enabling convenient, on-demand network access to a shared pool of configurable computing resources” (US NIST)
•  Motives for adoption: cost reduction; cost accounting; time to market; greening of ICT

42. Cloud computing – 2
•  Tensions:
  –  Tension 1: data controller vs. data processor (Art 29 WP; Art 17 of 95/46: security measures)
  –  Tension 2: informed choice and consent (auditing – SOC2/SOC3, including privacy and security)
  –  Tension 3: ownership, confidentiality and law enforcement
  –  Tension 4: appropriate SLAs – data integrity, data disclosure, data preservation, data location/transfer, rights over services/content, property rights and duties (Queen Mary’s university study)
  –  Tension 5: user expectations vs. privacy – WTP vs. WTA, right of ownership (data portability), right to access (Facebook, GMail)
•  Approaches/solutions (a minimal client-side encryption sketch follows below):
  –  Approach 1: technology (encryption) – SIENA (from IaaS to PaaS and SaaS)
  –  Approach 2: Security as a Service (Forrester: $1.5B market by 2015); BUT Apple’s SDK for iOS pays only moderate attention to ‘concern for user’s data’ (only in the closed Walled Garden of the App Store)
  –  Approach 3: a cloud-neutral approach?

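A minimal sketch of the 'technology (encryption)' approach listed above, assuming client-side encryption with the third-party Python 'cryptography' package (not named on the slide): the cloud provider only ever stores ciphertext, and key management, the hard part in practice, is left out of scope.

    # Illustrative only: encrypt before upload so the provider never sees plaintext.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # stays with the data controller, never uploaded
    f = Fernet(key)

    record = b"customer=alice; diagnosis=..."   # hypothetical personal data
    ciphertext = f.encrypt(record)              # what the cloud provider stores
    assert f.decrypt(ciphertext) == record      # only the key holder can read it back
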
43. Case-study examples – Behavioural targeting
•  Policy pressure: implementation of the stricter ePrivacy article on the use of cookies (June 2011)
•  Market forecast: $4.4B in 2012

44. Behavioural targeting – 2
•  Tensions:
  –  Tension 1: explicit and informed prior consent (Art 29 WP; ePrivacy Directive art. 5(3)) leads to unwanted effects (pop-ups)
  –  Tension 2: new intrusive technologies and tools
     •  Respawning (‘evercookie’)
     •  HTML5 – persistent cookies
     •  Device fingerprinting (unique consumer identification)
  –  Tension 3: trust and confidence
     •  Consumers show reluctance when confronted with BT practices
     •  Cookie practices are hard to understand and to act upon (Flash cookies)
     •  Generic privacy policies are uninformative and too long
  –  Tension 4: regulation is perceived to distort business practices
•  Approaches/solutions (a small consent-check sketch follows below):
  –  Approach 1: policy approach on informed consent (‘browser settings are sufficient’)
  –  Approach 2: control instruments for users – ‘Track me not’ browser button, ‘Advertising Icon Option’, transparent privacy policies
  –  Approach 3: different approaches – just-in-time contextual advertising

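A small sketch (my own, using Flask; the cookie and header names are illustrative) of the consent tension above: set a tracking cookie only if the visitor has already signalled consent, and honour a Do-Not-Track header.

    # Illustrative only: consent-gated tracking cookie.
    import uuid
    from flask import Flask, request, make_response

    app = Flask(__name__)

    @app.route("/")
    def index():
        resp = make_response("hello")
        consented = request.cookies.get("consent") == "yes"   # hypothetical consent cookie
        dnt = request.headers.get("DNT") == "1"                # Do-Not-Track signal
        if consented and not dnt and "tracking_id" not in request.cookies:
            resp.set_cookie("tracking_id", uuid.uuid4().hex, max_age=86400)
        return resp

    if __name__ == "__main__":
        app.run()
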
45. Case-study examples – Location-based services
•  A spectrum from ‘all data in the device’ to ‘all data in the network’; linkage between media
•  Examples: GPS, mash-ups, location tracking, friend finder, Google Street View

46. Location-based services – 2
•  Tensions:
  –  Tension 1: regulatory practices
     •  2002/58/EC – harmonisation of opt-in consent and withdrawal of consent
     •  Data Retention Directive 2006/24/EC: financial burden on telcos and ISPs
     •  Conflicting regulatory frameworks – definitions of personal data, traffic data, location data
  –  Tension 2: strict regulatory practices
     •  Switzerland: blur all faces, all number plates, sensitive facilities, clothing
     •  Art 29 WP: storage of photos reduced from 12 to 6 months
  –  Tension 3: lack of user control (location tracking)
     •  Gathering of profiles (Malte Spitz)
     •  Selling of aggregate data (TomTom)
  –  Tension 4: collection of sensitive data (faces, number plates, clothing, buildings)
•  Approaches/solutions (a rough face-blurring sketch follows below):
  –  Approach 1: soft regulatory practices – offering opt-out (Germany: 244,000 citizens)
  –  Approach 2: control instruments for users – a switch-off-GPS button
  –  Approach 3: new technologies – automatic face-blurring and number-plate-blurring technologies

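A rough sketch of the 'automatic face blurring' approach listed above, using OpenCV's bundled Haar cascade; the input file name is hypothetical and the detection parameters are only indicative, so treat this as illustrative rather than production-ready.

    # Illustrative only: detect faces in a frame and blur the matching regions.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    img = cv2.imread("street_view_frame.jpg")          # hypothetical input image
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        img[y:y+h, x:x+w] = cv2.GaussianBlur(img[y:y+h, x:x+w], (51, 51), 30)

    cv2.imwrite("street_view_frame_blurred.jpg", img)
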
47. Case-study examples – RFID
•  Building block for the Internet of Things
•  Growth: $5.03B (2009) -> $5.63B (2010)
•  Used in a multitude of domains: health care, cattle, pets, logistics, …
•  Unique ID
•  Limited control and choice/consent for the user

48. RFID – 2
•  Tensions:
  –  Tension 1: awareness
     •  Awareness raising by EDPS, Art 29 WP, consumer groups
     •  Limited awareness among consumers
     •  Undesired disclosure of data
  –  Tension 2: critical approach from industry
     •  RFID singled out as a privacy-intrusive technology
     •  Privacy problems are in the back end
     •  Data encryption in the chip is costly
     •  Disabling RFID means limited access to after-sales services
  –  Tension 3: convergence of technologies with privacy implications
     •  Biometrics (fingerprint recognition)
     •  Corporeal intrusion (swarm technology)
•  Approaches/solutions:
  –  Approach 1: regulatory practices – privacy impact assessment for RFID, EC Recommendation of 12 May 2009, industrial RFID PIA
  –  Approach 2: control instruments for users – ‘killer application’, deep sleep mode, transparency tools (?)

49. Case-study: biometrics
•  Growing market: $4.2B (2010) -> $11.2B (2015)
•  Largest share: fingerprinting; facial, iris and voice show higher CAGR
•  Market driver: homeland security
•  Decentralised systems (authentication); centralised systems (fraud detection, illegal and criminal activities)

50. Biometrics – 2
•  Tensions:
  –  Tension 1: storage of sensitive data
     •  False positives
     •  Third-party use
     •  Consent and choice
  –  Tension 2: limited accuracy
     •  Enrolment/identification (UK, NL)
  –  Tension 3: back-firing of public failures on private business
     •  Distrust of decentralised biometric systems – single sign-on, access systems
  –  Tension 4: public distrust of premium services
     •  Advantageous for specific groups – air transport, banking
•  Approaches/solutions:
  –  Approach 1: regulatory practices – globalisation of regulation (US-VISIT/SIS/VIS …), ICAO standardisation, fine-tuning EU regulations (under-12 age limit)
  –  Approach 2: control instruments for users – transparency tools (?)
  –  Approach 3: offering surplus value – user friendliness, speed (single sign-on), trust (one-time passwords), added-value (premium) services
  –  Approach 4: demonstrating public value – identity fraud (double dippers), crowd control

51. Concluding the cases
•  Legal/regulatory issues abound
  –  Stronger enforcement; stricter adherence to privacy constraints; homeland security
  –  Harmonisation fails; different approaches in different countries
  –  Influx of different regimes (sector-specific regulations, police and judicial coordination)
•  Privacy intrusions are part of public services as well! Specific uses and misuses are hard for the public at large to differentiate (fraud detection vs. commercial use of data)
•  Privacy innovation is restricted (face-blurring technology, decentralised biometrics)
•  Usually a soft approach: awareness raising, opt-in/opt-out offers, transparency measures
•  Business practices are mostly oriented towards data collection and use; privacy is only secondary to business models
•  Companies with inherently privacy-friendly approaches have modest market shares and are not very visible to the public at large