Electronic copy available at: http://ssrn.com/abstract=2390934
Privacy as identity territoriality: Re-conceptualising behaviour in cyberspace
(Working paper series #14.1.p)
Ciarán Mc Mahon,
RCSI CyberPsychology Research Centre,
Dublin, Ireland.
Mary Aiken,
RCSI CyberPsychology Research Centre,
Dublin, Ireland.
Abstract
Recent exposés of global surveillance have heightened already-heated debates
about privacy in a technological society. In this paper, we explore the context and
probable effects of this crisis, examine the character of privacy concerns, re-interpret what
is meant by 'privacy', and provide some solutions to the crisis. Fundamentally, we
explore privacy not as a forensic or civil rights issue, but instead as a secondary
psychological drive, the combination of two more profound drives – territoriality
and identity. As such, the problem lies in the fact that cyberspace, generally, is a
wholly interconnected and networked environment, which makes the demarcation
of individual spaces problematic. However, by viewing privacy as identity
territoriality, and by examining the psychology of those concepts, we can chart
solutions to this crisis more easily. For example, we should interpret lower
privacy concerns among youth as reflective of normal identity development
processes in adolescence and young adulthood. Similarly, aspects of gender and
temporality can be fruitfully incorporated into the discourse on privacy. On this
basis, possible solutions and research directions are outlined to overcome the
privacy crisis.
Introduction
While it has been a common trope in public and academic discourse since at least
Warren and Brandeis' (1890) paper, privacy has become a more pressing concern since the
dawn of the 'information age' in the 1980s. Computer technology initially, and the internet
subsequently, have rapidly increased the potential for personal information to be sourced and
combined in injurious fashion. In addition, the arrival and proliferation of social media and
cloud computing in the last decade has exponentially increased the amount of what is known
as ‘personally identifiable information’ within reach of adversarial actors.
This has extended privacy, among other related concepts, out of a very physical sense
of ‘someone watching me when I would prefer not to be watched’ – the foregoing citation
having apparently been inspired by reportage of attendance at a Warren family funeral – into
a much more conceptual sense of 'someone looking at my information'. For the purpose of this
article, these are the sort of data we are most interested in: what may be termed consciously
self-generated information: i.e. our emails, text messages, social media and other online
service profiles and content, cloud-hosted documentation, online photo albums etc. This
excludes, for example, material we are not usually conscious of creating, such as our image
being recorded by on-street cameras, or the use of our medical records. While these are
obviously important areas for privacy research, and surveillance studies in particular, they are
beyond the scope of this paper, where we are most concerned about privacy qua personal
subjective experience.
Following from that delineation, it is argued below that to properly understand not
only our most recent privacy concerns and crises, but the psychology of information
generally, we must unpack the very concept of privacy itself – firstly in this limited sense of
‘personally identifiable information’, latterly, in a broader sense. We demonstrate that if
privacy concerns are appreciated for what they fundamentally are, namely territoriality of
identity, then this current crisis becomes easier to unpack and manage. While privacy has
been previously linked to territoriality (Pastalan, 1970) and identity to territoriality (e.g.
Brown, 2009; Edney, 1974), this is the first work, that we know of, to combine all three,
particularly in the context of information technology and cyberspace.
What we are dealing with is our individual and primaeval drive to mark out and
defend our own unique domain of development and subsistence. Our identity is our own self-
property, the product of our individual creation and labour, the symbolic denotations and
connotations of which we experience as our own private territory. Where the problem arises
is when these factors play out in an implicitly shared, networked and contested context such
as cyberspace. However, as we shall see, there are demographic and psychological aspects to
the process of identity and territory development, so that it may be possible to design
technologies that allow for them, making this space safer and more civilised.
The privacy crisis
What is likely to be the most significant single episode in the recent history of this area
concerns the revelation of massive global surveillance by Edward Snowden, beginning in the
middle of 2013 (cf. Greenwald, 2013) and continuing at the time of writing (January 2014).
The full and precise nature of this episode is not only too extensive to detail here, but it is also
still underway, so let us limit ourselves to an interpretation of its cyberpsychology – what
exactly it means in the context of online behaviour – before exploring how its consequences
can be overcome. An initial and uncontroversial premise of this paper must be, however, that
the Snowden affair is unlikely to leave no trace in subsequent privacy behaviour, discourse
and research.
It should be noted at the outset however, that the intellectual milieu into which
Snowden landed was not a tabula rasa – privacy concerns had been rising intensely for some
time. Notwithstanding the figures of Julian Assange and Chelsea Manning, with whom
Edward Snowden can also be juxtaposed, the major internet service corporations had been
under journalistic and academic scrutiny for some time with regard to how they treated user
information. For example, Google Street View came under immediate question with regard to
privacy concerns on its launch in 2007 (Mills, 2007), and, as we shall discuss further below, there
have long been similar concerns about Facebook in academia (Debatin, Lovejoy, Horn, &
Hughes, 2009).
It is beyond the remit of this paper to consider the ethics of global surveillance, or, for
that matter, its subsequent exposure. It is, however, within its remit to consider the evolution
and adaptation of the human cyberpsychological response to the atmosphere subsequently
created, given that neither the former nor the latter aspect is likely to have concluded. As
long as massive amounts of data are created, massive amounts of data will be stored: but
equally, massive potential for disclosure will exist.
What is interesting to conjecture, however, is whether or not the Snowden revelations
will have an effect on behaviour: a perspective the current author believes is more enlightened
than simply ignoring the episode, as many academics have done. In that regard, we do not
have much data to go on, but what we do have is curious. On the one hand, as might be
expected, we read reports that not only are there fears that terrorists will change their modes
of operations (Whitehead, 2013), but also that intelligence organisations are doing likewise –
with Russian intelligence services apparently investing in typewriters (Fitsanakis, 2013). On the other hand, there
are estimates that the United States cloud computing industry alone could lose anywhere
between $21.5 billion (Castro, 2013) and $180 billion (Staten, 2013) in revenue over the
next three years as a result.
In terms of the behaviour of the private individual though, there are also some
indicative reports. An admittedly self-selected (and possibly indicative) survey of 528
American writers found that 28% had ‘curtailed or avoided social media activities’ since the
Snowden revelations (PEN American Center, 2013). More generalisable figures are available
from a series of surveys carried out over May – July 2013, involving over 2,100 American
adult internet users (Annalect, 2013): the percentage of internet users who rate themselves
as either “concerned” or “very concerned” about their online privacy rose from 48% in June
to 57% in July. This shift, the authors note, “is largely from unconcerned Internet users
becoming concerned—not from the normal vacillation found among neutral Internet users”
(p.2), with even 27% of ‘unconcerned’ internet users taking some form of action, like editing
their social media profile, researching ways to improve privacy or using private modes while
browsing (Annalect, 2013). Thus we have some inkling that internet users as a class may
have been roused to a new reality. But from what? A more subjective, qualitative account
provides some allusion:
Hearing Edward Snowden tell us in detail how our privacy is a joke was the stone
that started the landslide. My denial about the impact of my online activities on
my privacy and my relationships with others has come to an end. I started
changing my online behaviour. I started to feel very angry. I should not have to
censor what I say online to avoid "incriminating" myself.
I should be able to share my personal life online according to my own wishes, and
not those of corporations and governments… (Dale, 2013, n.p.)
This is a revealing paragraph, but it is the penultimate point raised by Dale (2013), a
Canadian writer, which is immediately pertinent: while Snowden fundamentally exposed
surveillance by governments, this activity is carried out via the services built by corporations.
As Marwick (2014) notes:
Data about your online and offline behavior are combined, analyzed, and sold to
marketers, corporations, governments, and even criminals. The scope of this
collection, aggregation, and brokering of information is similar to, if not larger
than, that of the NSA, yet it is almost entirely unregulated and many of the
activities of data-mining and digital marketing firms are not publicly known at
all. (n.p.)
So while Snowden’s exposé of the NSA may continue to grab the headlines, there is probably
far more subtle surveillance of what is ostensibly our information being carried out by profit-driven
enterprises. In some cases, these are third parties, who aggregate and analyse data bought
from elsewhere; in other cases, the data are analysed by the entities who collected them
in the first place – but in the vast majority of instances the data have been gathered entirely
legally, and indeed supplied by us willingly.
In that regard, it is worth standing back from this whole episode, of governmental and
corporate surveillance of our personally identifiable information – our privacy – and
reflecting for a moment. The histrionics of one particular blog post are noteworthy; it was written in
response to an open letter (reformgovernmentsurveillance.com) signed by major technology
companies, calling on governments to limit surveillance in the wake of the Snowden
revelations:
The giant online firms complaining of intrusion on personal data is laughable.
Nearly all of the companies above are very happy to mine personal data to make
money — they don’t like the government beating them at their own game.
(Payne, 2013, n.p.)
This post, titled with a ‘hypocrisy warning’, is at least getting some way towards the truth.
But, being mindful of Bohr’s characterisation of truth,¹ we perhaps should not rest at such an
analysis. While it is true that whatever surveillance, legal or illegal, moral or immoral,
has been carried out by the NSA and its international partner organisations would not have
[¹ “One of the favourite maxims of my father was the distinction between the two sorts of truths, profound truths recognized by the fact that the opposite is also a profound truth, in contrast to trivialities where opposites are obviously absurd” – Hans Bohr, ‘My Father’, in S. Rozental (1967, p. 328)]
been possible without the use of the technology built by those companies, neither would it
have been possible without the participation – nay, industry – of their users. Social networks are
the most obvious example – the entire social graph, the most valuable component of Facebook’s
multi-billion-dollar valuation, was not built by its developers or engineers, but by a billion
individual private users, laboriously, thanklessly and gratuitously clicking ‘like’.
Fundamentally, no spy had to work very hard to get extremely valuable information about the
world’s internet users: in fact, many of us queued long hours for the privilege of buying the
technology with which we could provide them with it.
There have been many retorts to the ‘outrage’ over the Snowden revelations, largely
hung on words like ‘naïve’ and to the effect that we should have expected that our emails and
suchlike were being read, both by the companies who service them and the governments who
control them, but the reader can be assured that this is not another one of them. The crux of
the issue is more fundamental than that: why have we participated in this so readily?
A research study published a few months ahead of the Snowden revelations
documented how personal attributes such as sexual orientation, religious and political views,
personality traits, intelligence, happiness, and use of addictive substances could all be easily
and accurately inferred from a Facebook user’s Likes (Kosinski, Stillwell, & Graepel, 2013).
The authors, though remarking on the obvious privacy concerns regarding such a finding,
conclude with the hope “…that the trust and goodwill among parties interacting in the digital
environment can be maintained by providing users with transparency and control over their
information” (Kosinski et al., 2013, p. 4). While such a hope may have been thwarted by
subsequent revelations, my preceding question remains: how on earth could people not
realise that the dots they were placing in cyberspace would eventually be joined up?
There has long been a truism in ‘internet lore’ which goes along the lines of ‘if you’re
not paying, you’re the product’ (Zittrain, 2012) which, despite containing an intuitive nugget
of truth, has been wonderfully deconstructed as both inaccurate and disempowering
(Powazek, 2012). It is, however, an aim of this paper to explore where the power has gone in
this equation: where it seems we have given away much privacy and not been paid.
Another issue often thrown in with the ‘naiveté’ argument is the infamous privacy
policy – the lengthy legalistic tract presented at the start of every programme installation and
profile instantiation, with so much signed away with a single click. In 2008 it was calculated
that “if Americans were to read online privacy policies word-for-word, we estimate the value
of time lost as about $781 billion annually” (McDonald & Cranor, 2008, p. 564). Or, to put it
more granularly, an average of 40 minutes per day – which must be juxtaposed with the
reported estimate of ‘time spent browsing the internet per day’ of 72 minutes (McDonald &
Cranor, 2008). Clearly, this is far from the reality. While it is admittedly a lot of reading,
and not much of it easy or accessible, it certainly is all there: we have wilfully ignored it.
How have we sleepwalked into this situation? Have we been massively duped by tech
evangelism, into an illusion of an egalitarian cyber utopia, only to have its nightmarish
surveillance state underbelly exposed? Yet, critically, what’s this ‘we’ business – who exactly
is at risk? All of these questions, with their attendant histrionics, are just as important to
answer as the more sardonic questions which have developed from the Snowden revelations
of the ‘no harm, no foul’ variety, which wonder what the big deal is. Well, what is it?
Perhaps it is not so much that we are being surveilled by the government, nor that the
corporations are using governments to surveil us, but that we are using governments and
corporations to surveil our selves, and each other.
Demography and character of privacy concerns
To examine this crisis of information privacy, as cyberpsychologists it is most instructive first
to determine who exactly is experiencing the crisis – the demography of those most
concerned. In terms of our thesis, below we examine privacy in the manner a psychologist
would, with regard to age, gender and so on.
The first thing that we notice from such a literature review is that, despite much
handwringing and hoopla in the media and popular discourse, we don’t really know
much about those concerned about privacy. This is perhaps a result of a combination of
legalistic and computer-science treatments of the topic – ‘privacy’ is a reified thing which we
should all be concerned about and ‘factor in’ everywhere – but such a blunt trauma approach
is intellectually and culturally redundant. We will learn a lot more about this issue if we tease
out its sociology.
At the outset, it is reasonably clear from the literature that there are gender differences
with regard to privacy – a clear male/female split has been reported in privacy concerns on
Facebook, with women being significantly more concerned, and some significant differences
with regard to behaviour (Hoy & Milne, 2010). Additionally, Fogel and Nehmad (2009)
reported that female respondents had significantly higher scores for privacy concerns but not
for privacy behaviour. Similar clear divides between males and females with regard to
privacy in general, if not concern, attitude or particular behaviour, have been reported
elsewhere (Moscardelli & Divine, 2007; Youn & Hall, 2008).
While this is a reasonably facile feature to isolate in the human sciences, it is worth
noting, given that it concurs with our central assertion that privacy is a territorial concept. It
has long been implied, if not asserted, in evolutionary psychology that territoriality is a more
male concern, and plenty of research has borne this out – males more likely to sit on chairs
marked with personal objects in a university library (Sommer & Becker, 1969); males more
likely to negatively evaluate a stranger who sits directly opposite them (Fisher & Byrne,
1975); cross-culturally, men exhibit more nonsharing behaviour in residence halls (Kaya &
Weber, 2003). Consequently, it is suggested that differences in privacy behaviour online
across genders can be interpreted as a more cautious or conservative approach amongst
women, and a more exploratory or aggressive approach amongst men. Though beyond the
scope of this article, it should be clear that gendered approaches to territoriality in cyberspace
have applicability far beyond the concept of privacy (cf. ‘Why women aren't welcome on the
internet’, Hess, 2014).
A factor considerably less easy to isolate is that of age, a point which is implicitly
recognised by Bélanger and Crossler's (2011) review of research in this area. Like much
‘WEIRD’ research (‘Western, Educated, Industrialized, Rich, and Democratic’; Henrich, Heine,
& Norenzayan, 2010), their review found that “information privacy research has been heavily
reliant on student based and USA-centric samples” (Bélanger & Crossler, 2011, p. 1019).
While the cultural aspect is noteworthy, it is in the present context of lesser importance as we
are most interested in the Western, internet-using population: it is the fact that privacy
research has been largely carried out on student samples – i.e. between 18 and 25 years of age
– which is most problematic. Any inference of whether or not privacy concerns vary with age
is consequently impossible when using such a narrow range of age. For example, a study of
risk taking, trust, and privacy concerns used a sample wherein the “average age was almost
22 years old (range: 17–32 years)” (Fogel & Nehmad, 2009, p. 156) and a similar study
restricted itself to adults aged 18–24 (Hoy & Milne, 2010). Admittedly, neither study analysed
its data with respect to age, but such samples make it difficult to determine whether or not there are age
differences in privacy concerns – a claim which recurs in the media.
It does seem that there may be some basis to that claim, given that, almost a decade
ago, it was reported that “age is positively related to reading [of privacy policies] and
negatively related to trust” (Milne & Culnan, 2004, p. 24). In addition, a non-peer-reviewed
study reported that all ages of internet users were in some agreement about privacy concerns,
but that younger users were less so (Hoofnagle, King, Li, & Turow, 2010). This is in addition
to an interesting paper from a prolific source, and one to which we will return, which reports
that adolescents reported disclosing more information on Facebook and using the privacy
settings less than adults (Christofides, Muise, & Desmarais, 2011). Consequently, let us at
least entertain the possibility that the privacy ‘issue’ may vary over the lifespan. How this
issue should be conceptualised – either in terms of ‘concerns’ or ‘behaviours’ – is not so
clear, but suffice it to say that it would not be unheard of for such factors to be mutually
contradictory.
Such a perspective requires a certain amount of wisdom with
regard to human nature, and while the current author would do well to avoid claiming such a
personal attribute, it is generally wise to attend to the opinions of one’s senior generations.
Moreover, let it not pass unremarked that much of the debate, including the foregoing
paragraphs, has dealt with the matrix ‘privacy × age’ very much in a youth-centric
calculation. We seem to have concluded that ‘younger people’ are less concerned about
privacy than ‘older people’, but what exactly do those older people think? A Finnish study of
older adults reiterates in its title, for emphasis, the following quote from one of its
participants: “If someone of my age put her photo on the net, I would think she is a little silly
and empty-headed” (Lehtinen, Näsänen, & Sarvas, 2009, p. 50). That opinion is revealing of that cohort’s view of social
networking, but there are other privacy-related issues also:
… our participants perceived uploading photos as not safe. The participants were
very careful about providing personal details related to names, relationships, and
families. In fact, they stated that they want to keep private all such personal
details as concern the circles of personal life (Lehtinen, Näsänen, & Sarvas, 2009,
p. 50)
If there were ever an argument that further educating and informing ‘silver surfers’ about the
wonders of social networking will result in greater disclosure by them of personal
information, let us quietly retire it now: older users have little desire to engage in social self-
disclosure.
In addition, a UK report reveals that “older internet users tend to be more cautious and
yet people with more years of online experience are less cautious.” (Coles-Kemp, Lai, &
Ford, 2010, p. 4). As such, we have something of a paradox – experience and age should go
hand-in-hand. This finding somewhat echoes another study which found that while age
predicted information privacy concerns, internet experience did not (Zukowski & Brown,
2007). Both of these findings are, to an extent, glossed over, perhaps under influence from the e-commerce
and marketing fields, due to their contradictory nature. If we look at things
rationally and objectively, and view those involved simply as ‘users’ per se, then of course
as they get older and gain more experience, they should become more competent and be happy
to engage online more. However, this elides a satisfactory understanding of human nature –
not to mention the wider question of the inherent ‘good’ in uploading copious amounts of
personal information to the internet, which we will have to set aside for the time being!
The simple point is that we should not assume that older generations can simply be
taught to use the internet in the same way as young people, or for that matter, vice-versa. The
following quotation illustrates this point neatly:
Age-stratified cohorts can also be studied at a single point in time, and change can
be inferred from the differences between the age groups, but this assumes that
younger generations will grow up to resemble older generations, which would not
be the case for life-stage-related behavior. Past research on youth may help to
shed light on the kinds of behaviors that young people can be expected to
outgrow. (Herring, 2008, p. 87)
The bottom line is that we need to have a more profound appreciation of human lifespan
development if we are to better understand age-related differences in privacy concerns,
attitudes and behaviours in the online context. To give but one example, one can easily put
one’s hand on several studies exploring the use of social media to encourage political and
civic engagement by the younger generation (Gibson & Mcallister, 2009; Gueorguieva, 2007;
Owen, 2009; Valenzuela, Park, & Kee, 2009; Vitak et al., 2011; Xenos & Foot, 2008), but for
seniors this is a redundant search, because older people do not need to be encouraged to vote
to anything like the same degree. Ultimately, what we are talking about are differences in
self-concept: an elusive topic at the best of times, but one which we must appreciate if we are
to understand privacy concerns.
It is possible that we have assumed that those who do not like ‘sharing’ their private
information (such as older users) avoid doing so because they are not tech-savvy, when in fact
they are sceptical about service providers’ intentions with their data, which may derive from
their being inherently more confident in their personal identity: their territory is well-established
and they have many years’ experience of defending it. This is a perspective we must consider
in opposition to the underlying assumption that older adults’ concerns about new
technology could be largely erased by better education – that older meant less experience
with technology. Such a perspective, ignoring their experience of life, wisdom of society and
perhaps even scepticism of government and surveillance, now seems embarrassingly naïve in
the light of the Snowden revelations. It also allows us to entertain the possibility that had
older adults taken to the internet before tweens and young adults, such networks might never
have been constructed by such a population of ‘users’. But to understand why such
networks have in fact been built, we need to understand more about the root concept at play:
the self.
Privacy as territoriality of identity
An idea long in germination but short in appreciation is the notion that when we are online,
‘surfing the web’, ‘navigating the information superhighway’, we act as if we are
negotiating territory. This is a critical insight if we are to understand our relationship to the
internet – our place within cyberspace – because it has been shown that the self is
fundamentally a locative system (Benson, 2001). In order to know who we are, we need to
know where we are:
These “locative demands” are a constitutive part of every moment of a person’s
life. They are what underpin the idea that who you are is a function of where you
are, of where you have been, and of where you want to be. (Benson, 2003, p. 62)
This clearly applies as much to online behaviour as it does elsewhere: note the differences
inherent in ‘I have a Twitter account’ and ‘I am on Twitter’: the former connoting disdain and
lack of enthusiasm, the latter satisfaction and regular use. In this way, we position and
situate our selves online as befits our social and reputational concerns.
More to the point, in the context of personally identifiable information, “Selfhood and
identity are constituted in and by acts of assertion, … and specifically in and by acts of
authorship and ownership, acts of responsibility-taking or responsibility-denying…, as well
as in acts of self-location generally” (Benson, 2013, p. 58). We declare and defend our
ownership of our places as acts of selfhood – and note how information technology has
endorsed our psychology in that regard - ‘My Computer’, ‘My Account’, ‘My Profile’ and so
on.
However, there is a more fundamental and historic aspect to this, which goes to the
heart of Western liberal and civilised thinking: the idea of ‘self as property’. As such we turn
to the dawn of the modern era at the end of the 17th century:
Though the Earth, and all inferior Creatures be common to all Men, yet every
Man has a Property in his own Person. This no Body has any Right to, but
himself. The Labour of his Body, and the Work of his Hands, we may say, are
properly his. Whatsoever then he removes out of the State that Nature hath
provided, and left in it, he hath mixed his Labour with, and joyned to it
something that is his own, and thereby makes it his Property. (Locke, 1689/1751,
p. 166, 2nd Treatise, Ch. V, pgh. 27).
This passage reflects a deep current in the relationship between politics and psychology: that
human rights are intimately tied up with property rights. Crucially, in the 21st-century context,
not only is our identity our property, but our online identity is the product of our own labour,
and as such it is also our property. Moreover, from this perspective, when we return to the
reality of the industry involved in the creation of, for example, the Facebook social graph,
one might have cause to muse upon the status and legitimacy of its present ownership (cf.
wagesforfacebook.com).
There are other aspects of Locke’s perspective on the self which we will return to
presently, but first it is worth examining how the self has been understood in cyberspace. In
that regard, there is no better theorist to turn to than John Suler. Writing on the human
predilection for acting on the internet in ways that would not be considered conventional in a
face-to-face context, known as the ‘online disinhibition effect’ (a phenomenon which
obviously has significance in the realm of privacy disclosures), Suler tackles the notion of
self:
In fact, a single disinhibited “online self” probably does not exist at all, but rather
a collection of slightly different constellations of affect, memory, and thought that
surface in and interact with different types of online environments.
Different modalities of online communication (e.g., e-mail, chat, video) and
different environments (e.g., social, vocational, fantasy) may facilitate diverse
expressions of self. (Suler, 2004, p. 325)
Consequently, when we place our selves in different online environments, when we create
different types of digital content, we can be said to be expressing different constellations of
the self. This is obviously interesting for creative and artistic reasons, but it is clearly
problematic in a privacy context: because our self-territory is therefore shifting online. Ergo,
privacy of what exactly?
The issue is really quite a tricky one: privacy of a physical space is a matter of walls
and screens – how can this be achieved in the nebulous world of cyberspace? Marking out a
unique and personal territory in an environment fundamentally devoted to connection and
association seems like an impossibility.
In that regard it is helpful to return to earlier work on human territoriality, where
Edney (1974, p. 962) notes that pertinent definitions involve “an interesting variety of
concepts: space (fixed or moving), defense, possession, identity, markers, personalization,
control, and exclusiveness of use”. Notably, Pastalan (1970, p. 88) further argues that “privacy
may constitute a basic form of human territoriality”. Moreover, Proshansky, Ittelson, and
Rivlin (1970, p. 175) explain that “in any situational context, the individual attempts to
organize his environment so that it maximizes his freedom of choice… In all cases
psychological privacy serves to maximize freedom of choice”.
Consequently we have arrived at a picture of the self which, on first examination, will
struggle to maintain its integrity in cyberspace – constructing and marking different aspects
of its uniqueness in a networked and fluctuating context. Privacy concerns in this
environment would seem the least of one’s worries: from this analysis, sanity would be under
threat after a short spell in cyberspace. Ultimately, the solution to this problem of establishing
our own identity in a networked environment is to realise that the internet is a contested
territory of intersubjective construction: everyone thinks they own it.
However, as we shall see below, there are good psychological reasons why this is not
(immediately) the case, which can also be used to help chart possible solutions to this privacy
crisis. As an aside, though, it can be useful to review some other recent media concerns about
cyberspace through the lens of territoriality – for example, responses to internet filtering (e.g.
Penny, 2014) should be viewed less as anti-censorship and more as resistance to
confinement. We view the internet both as something we have built and something we
explore – we do not like it being curtailed. However, let us dispense with the tech evangelists'
belief that the internet is some kind of panacea for humanity's baser, primeval instincts: in many
cases, it is quite the opposite. At any rate, what seems to be occurring with regard to privacy
concerns in cyberspace is the emergence of well-understood psychological processes in an
unusual environment.
Psychological aspects of privacy concerns
When we combine the insights of the two previous sections, namely that there are
demographic and developmental aspects to privacy, and that privacy is fundamentally an
aspect of self, as a combination of identity and territory, then it becomes clear that we should
consider the lifespan aspects of both identity and territoriality. Let us examine the self in
more granularity.
As we have already alluded, there is a possibility that older adults' behavioural
patterns online are less to do with lack of education and more to do with life-experience and
wisdom. On closer examination, particularly at the other end of the age scale, we find plenty
of psychological theory to support this hunch, beginning chiefly in the work of Erik Erikson,
a colossus of psychoanalysis and developmental psychology, and his concept of the 'identity
crisis' (Erikson, 1968). The central idea of the identity crisis is that during adolescence,
having overcome childhood, and being on the way to adulthood, we attempt to answer the
solitary question of who we are. His own words best describe the profundity of this process:
The young person, in order to experience wholeness, must feel a progressive
continuity between that which he has come to be during the long years of
childhood and that which he promises to become in the anticipated future;
between that which he conceives himself to be and that which he perceives others
to see in him and to expect of him. Individually speaking, identity includes, but is
more than, the sum of all the successive identifications of those earlier years
when the child wanted to be, and often was forced to become, like the people he
depended on. Identity is a unique product, which now meets a crisis to be solved
only in new identifications with age mates and with leader figures outside of the
family. (Erikson, 1968, p. 87)
The process of overcoming this crisis has been developed further (Marcia, 1966), as
involving notions such as identity moratorium (being in the midst of the identity crisis,
characterised by vagueness, variability and rebelliousness), identity foreclosure (accepting
the identity which one is expected to take, usually provided by parents, characterised by
rigidity and authoritarianism), identity diffusion (may or may not have had a crisis,
characterised by noncommitment, aimlessness and malleability) and identity achievement
(having successfully overcome the crisis, characterised by commitment to organisational and
ideological choices, with stable, realistic, and internally selected goals). Since then we have a
well-established research strand within developmental psychology building on this theoretical
framework, establishing, among other things, vocational identity (Porfeli, Lee, Vondracek, &
Weigold, 2011), parental and caregiver styles (Cakir & Aydin, 2005; Wiley & Berman,
2012), risk and peer group (Dumas, Ellis, & Wolfe, 2012) as well as isolating certain gender
differences (Archer, 1989; Thorbecke & Grotevant, 1982).
However, the general gist of the Eriksonian approach is that this identity crisis should
be resolved during the teenage years, or soon afterwards. This no longer appears to be the
case: Côté (2006) demonstrates that there is significant evidence that this process of
transition to adulthood is taking considerably longer these days, and for a significant
percentage of the population, is delayed until the late 20s – a finding he describes as one of
the least contested in contemporary developmental psychology. As such, we can expect to
find evidence of identity crisis and construction right from early teens up to thirty-odd years
of age.² Moreover, as Weber and Mitchell (2008, p. 26) point out, for such young people
interested in scrutinising and studying new forms of self and identity, “new technologies are a
good place to start these investigations.” There is, in fact, increasing evidence and support for
this view that adolescents and young adults are using social networks as locations for
identity construction: as Greenhow and Robelia (2009) report, their high-school sample of
participants often obscured their full names on their MySpace profiles, and it was easy to
see their “quest for self-discovery, identity exploration, and self-presentation playing out
within their MySpace profiles” (p. 131). Moreover, as Pempek, Yermolayeva, and Calvert
(2009) argue, “the communication with friends that occurs on Facebook may help young
adults resolve key developmental issues that may be present during emerging adulthood,
including both identity and intimacy development” (p. 236). More recently, Jordán-Conde,
Mennecke, and Townsend (2013) support this theoretical direction, and in fact go as far as to
suggest that the affordances of social media, by their suggestion of many and increasing
possibilities for adult development, could be contributing to a prolonged period of
adolescence (p. 9). Furthermore, in all of these studies another message is clear: adolescents
and young adults view privacy risks as a price worth paying for the use of social networks.
² It may be a trite observation, but in this context it is worth noting the relative youth of the founder of
Facebook, Mark Zuckerberg (b. 1984), who famously summarised his perspective on privacy as 'public is the
new social norm' (http://mashable.com/2010/01/10/facebook-founder-on-privacy/), and it would not be
difficult to find several more examples of similarly-aged tech entrepreneurs with reductionist views on privacy.
In other words, what is happening for youth on social networks and in the internet in
general, is a general form of identity construction and exploration of individuality – they are
creating their own territories of personalisation. This is why they may behave as if they are
unconcerned about privacy even though they say that they are concerned – because they are at
that stage of their lives when their personal identities have not yet fully formed and these
behaviours represent that process of self-formation. Disclosure of personally revealing
information on the internet for such children, adolescents and young adults, while
questionable from a privacy perspective, may well be entirely necessary from an identity-development perspective – activities which are often referred to as bricolage (Benson,
2013; Herring, 2004; Weber & Mitchell, 2008). There is an experimental aspect to these
identity developmental processes in cyberspace – one might find the term ‘beta self’ a useful
metaphor.
Moreover, from a territorial-development perspective, it should be noted that there are
considerable and varied evolutionary, competitive and sexual pressures on those in this age
group. For example, it can be expected that younger (prepubescent) children will necessarily
be more conservative than those in their middle teens, as risk-taking rises with the onset of
puberty. Moreover, we see many examples of territoriality in social networks, from gender
differences in self-promotion (Mehdizadeh, 2010) to ethnic and racial displays (Grasmuck,
Martin, & Zhao, 2009; Seder & Oishi, 2009) and age differences (Strano, 2008).
Interestingly, it is also worth reporting that there is a considerable amount of
comparative optimism (or ‘unrealistic optimism’ or ‘optimistic bias’) at play with regard to
privacy risk online (Baek, Kim, & Bae, 2014) – in other words, that we assume that other
people’s privacy is more at risk than our own. This was found to be related to a number of
factors – such as age (wherein subjects were more likely to think that a younger person was
most at risk of privacy breach than an older person), maternalistic personality traits, and previous
experience of privacy infringement, but also, in an unexpected way, with privacy protective
behaviours. Baek, Kim, and Bae (2014) predicted that comparative optimism with regard to
privacy would be negatively related to the adoption of such protections – i.e. that those who
were most optimistic about their safety would be consequently emboldened to engage in risky
behaviour. From the narrative which we have advanced in this paper to this point, it should
not be surprising to find that this hypothesis was not supported – most likely because those
who are optimistic about their safety online have taken steps to defend their territory.
Furthermore, the application of territoriality to cyberspace finds strong support from
pre-existing theoretical developments and applications. For example, Schwarz (2010) uses
the concept of ‘field’ (Bourdieu, 1992) to explore the use of self-portraits by teenagers as
social capital. Interestingly, Schwarz (2010) argues that, within the particular social network
in question, users' choice of photograph is less a 'reflexively chosen identity' and more to do
with their positioning within that website’s hierarchy, and also that many dropped out of such
a competitive environment. Consequently, there can be casualties of the tension between identity and
territoriality in the production of personally identifiable information, which must be borne in
mind with the rise of the #selfie (OxfordWords, 2013).
Paradoxes unbound
When we view privacy as a psychologically-complex combination of identity and territory,
certain oft-trumpeted ‘paradoxes’ of internet behaviour become easier to understand. For
example, as Lepore (2013) laments:
… the paradox of an American culture obsessed, at once, with being seen and
with being hidden, a world in which the only thing more cherished than privacy is
publicity. In this world, we chronicle our lives on Facebook while demanding the
latest and best form of privacy protection—ciphers of numbers and letters—so
that no one can violate the selves we have so entirely contrived to expose. (n.p.)
Again, as we have said, this is inevitable, as we enter a contested territory, attempting to own
our own corner in a space which is completely shared. This is a point which has been
stumbled over many times: “Herein lies the privacy paradox. Adults are concerned about
invasion of privacy, while teens freely give up personal information. This occurs because
often teens are not aware of the public nature of the Internet” (Barnes, 2006, p.3/11). On the
contrary, it is likely that teens are quite aware of the public nature of the information they
reveal, and the fact that they are revealing it is indicative of progress through a complex
process of identity development.
Similarly, Acquisti and Gross (2006) seem surprised by the many ‘privacy concerned’
college undergraduates who joined Facebook and also revealed large amounts of personal
information, echoed in a subsequent study which found that “paradoxically, more control
over the publication of their private information decreases individuals’ privacy concerns and
increases their willingness to publish sensitive information” (Brandimarte, Acquisti, &
Loewenstein, 2010). Furthermore, when examined longitudinally, users demonstrated
increasingly privacy-seeking behaviour, progressively decreasing the amount of personal data
shared publicly yet increasing the amount and scope of data shared privately (and also with
Facebook and third parties) (Stutzman, Gross, & Acquisti, 2012). Notably, in all of these studies,
the sample populations were North American college students.
In other words, the risk is more about the other members of the network, not the risk
of the network itself, because that risk is shared across all members of the network and is the
risk implicit in joining the contested territory. The whole point for users is to enter that space,
where the development of one’s identity can be compared with one’s peers, with whom one
competes in the usual evolutionary fashion. Of course, this does not make sense if you view
the phenomenon in terms of communication qua sharing information, because it looks like
contradictory processes of information leakage. When the ‘information’ is seen as part of the
user’s psychology, as the bricolage necessary for the development of self-territory of those in
a culture which supports a period of extended adolescence, it becomes more understandable.
As such, the idea of privacy as ‘information risk’ is not quite as useful as privacy as identity
territory.
The crux of the issue is not, however, any ‘paradox’, but how both our identity and
territoriality, with such competing drives, can be protected legally. To do so requires a
recognition that the latter aspect necessitates the creation of some form of controllable
boundaries, while the former requires some form of personalisation. In sum, we want to
create a representation of ourselves online, but at the same time, we do want other people to
actually see it: the most exclusive shows are cordoned off by nothing more than a velvet rope.
This is equally apparent with regard to the infamous ‘Facebook Newsfeed
controversy’. The feed – now a prime feature of that social network – combines updates from
all one’s friends into one continuously refreshing stream (information which had previously
been available only on their respective profile pages), and when it was first introduced
there was considerable uproar from members of the network. As Hoadley, Xu, Lee, and
Rosson (2010) explain, all that information was previously accessible, but not easily or
efficiently. Moreover, as Boyd (2008, p.18) points out, privacy isn’t simply a matter of
‘zeroes and ones’ but more about “a sense of control over information, the context where
sharing takes place, and the audience who can gain access”. We are not annoyed that the
information has been seen, but that our identity has been thrust into a territory that we were
not prepared for.
These interpretations of privacy behaviours are supported elsewhere, with findings
that they may include strategies for gaining social capital and audience control (Ellison,
Vitak, Steinfield, & Gray, 2011) and also that audience size and diversity impacts disclosure
and use of privacy settings (Vitak, 2012). At the same time, however, this does not
discount the importance and subtlety of the network itself, noting important work on the
distinction between exhibitions and performances in social media, with their underlying
algorithms as curators (Hogan, 2010), and the ethics of Facebook’s disclosive power (Light
& McGrath, 2010). While these issues are somewhat beyond the scope of this paper, it is
worthwhile noting that they are consonant with the interpretation of social media as
participatory surveillance (Bucher, 2012; Tokunaga, 2011; Westlake, 2008). In sum, while
online privacy behaviours may reflect a certain expression of identity and territoriality, the
opposite is also true: we are quite adept at using that same technology to compare ourselves
to others, to judge their character and to blockade ourselves from them.
As has been noted, demarcating territory in cyberspace is tricky, and it might
seem to become even more complex when we add another dimension to that
context, though the opposite is actually the case. There is a temporal aspect to both our
identity and territorial drives, which is reflected in our privacy desires. A recent study breaks
new ground in this area, finding that social media users' desires for their content to remain
online were quite varied – they wanted some posts to become more private over time, and some posts to
become more visible (Bauer et al., 2013). Interestingly, as we might now expect from a pool
with a median age of 26, of whom 27% were students, and 11% were unemployed, the study
found that participants were very poor at predicting how they would like to change the
visibility of their material over time (Bauer et al., 2013). Hence, given that this does not give
much support to the idea of putting ‘expiration dates’ on posted material, Bauer et al. (2013,
p.10) suggest that “a way forward might be to design interfaces that promote reflection about
older content” – this would, in the current authors' estimation, be an excellent design feature,
as it may help the user not only to reflect upon their 'digital exhaust', but also to overcome the process
of identity moratorium.
Moreover, it is this temporal aspect which may prove instrumental to
surmounting wider issues of privacy – not merely problematic Facebook status updates.
Temporality is also a key aspect of the self, as in Locke’s (1690) understanding:
personal Identity; i.e. the sameness of a Rational being: And as far as this
consciousness can be extended backwards to any past Action or Thought, so far
reaches the Identity of that Person; it is the same self now it was then; and ‘tis by
the same self with this present one that now reflects on it, that that Action was
done. (Locke, 1700, p. 183)
This is the problematic aspect of Locke’s work, as it verges on exonerating the criminal from
the acts he does not remember, and in fact a few paragraphs later he muses on whether or not
drunks deserve such an amnesty. However, this position is premised on the assumption that
“there being no moment of our lives wherein we have the whole train of all our past actions
before our eyes in one view” (p. 184). This, however, is not quite the case for the youth of
today. Perhaps it is unfair, psychologically, for their bricolage to be stored eternally and their
potentially embarrassing attempts at identity construction remaining publicly searchable?
Locke argues that our sense of identity is based on our continuity of consciousness of what
we can remember, which surely requires that we are allowed to forget what we would like to
move on from.
Moreover, we also find that there is a similar temporal aspect to human territoriality
(Edney, 1974, p. 969), whereby “families living in defended homes had lived significantly
longer on the property, but also anticipated a longer residence there in the future.” Ergo, we
expect that those who spend more time on Facebook, for example, will be most concerned
about their privacy, but will also be more likely to see Facebook as both part of their identity,
and part of their future. In fact, this is exactly what the research shows, with items such as “I
would be sorry if Facebook shut down” and “Facebook is part of my everyday activity” being
part of a scale of Facebook intensity positively related to social capital (Ellison, Steinfield, &
Lampe, 2007) and also “positive relationship between participants’ use of the advanced
privacy settings and both bridging and bonding social capital” (Ellison et al., 2011, p. 27).
Idiographic responses
Such observations lead us to the hope that there may well be technical innovations
which can help us emerge from this privacy crisis in a way which respects human
psychological development. Much has been written in popular media about the notion of
‘ephemeral communication’, which is probably most notably exemplified by Snapchat, a
mobile app whereby photos can be sent to fellow users but will be automatically deleted after
less than ten seconds. While there are obviously concerns about the actuality of this deletion
(Shein, 2013), this does represent an interesting development for technology use, particularly
for those in identity crisis stages of development.
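The mechanism underlying such ephemerality – content that carries an expiry rather than persisting by default – can be sketched briefly. The following Python toy is purely illustrative: the class and method names are our own invention, and real services such as Snapchat enforce deletion server-side (and, as noted, not necessarily reliably).

```python
import time

class EphemeralStore:
    """A toy message store in which every item expires after a fixed
    time-to-live, loosely modelled on Snapchat-style ephemerality."""

    def __init__(self, ttl_seconds=10):
        self.ttl = ttl_seconds
        self._items = {}  # message_id -> (timestamp, content)

    def post(self, message_id, content):
        """Record when the content was posted, alongside the content itself."""
        self._items[message_id] = (time.time(), content)

    def read(self, message_id):
        """Return the content if still live; otherwise purge it and return None."""
        entry = self._items.get(message_id)
        if entry is None:
            return None
        posted_at, content = entry
        if time.time() - posted_at > self.ttl:
            del self._items[message_id]  # expired: delete rather than retain
            return None
        return content

store = EphemeralStore(ttl_seconds=10)
store.post("snap1", "a passing moment")
print(store.read("snap1"))  # within the TTL, the content is returned
```

The design choice of interest is that deletion is the default path: the burden falls on retention, not on erasure, which is the inverse of how most social networks treat user data.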
Additionally, ephemerality and anonymity have been shown not to be detrimental to
the formation of community norms, status and identity on the website known as 4chan
(Bernstein, Harry, André, Panovich, & Vargas, 2011). That particular website, known
for its infamy and vulgarity more than anything else, is worth mentioning given that it often
represents an image of the internet quite different to what most users would like to experience
on a regular basis; yet, as Bernstein et al.'s (2011) paper demonstrates, it does manage to
'work' as a community. Consequently, we should be mindful that people will produce
workable solutions for themselves in whatever fashion they choose. As one tweet put it,
“Privacy can be achieved by multiplying public selves. Leaving breadcrumbs here to throw
people on to the wrong trail” (Horning, 2014). The vista of privacy only being achievable via
distraction and disruption – in effect, deliberate identity dissociation via multiple accounts –
is thought-provoking, to say the least.
An alternative approach, presumably a more publicly acceptable method of user
empowerment, requires a re-evaluation of the ‘big data’ phenomenon – a move from the
nomothetic to the idiographic. This is where notions like sousveillance and lifelogging enter
the discourse. The former term is most notably associated with Steve Mann and is generally
understood to mean a type of 'inverse surveillance', whereby the individual monitors those
in authority – those who they believe are in control of the surveillance. Sousveillance hence
empowers the individual by “… affording all people to be simultaneously master and subject
of the gaze, wearable computing devices offer a new voice in the usually one-sided dialogue
of surveillance” (Mann, Nolan, & Wellman, 2003, p. 348). While such a concept represents a
certain philosophical innovation, the rebalancing of power notwithstanding, it is hardly
practical to ask the population at large to wear video cameras at all times. Assuming that ‘all
people’ will want to participate in sousveillance is assuming a lot, to put it mildly, such as
but not limited to: that all people disapprove of surveillance, that they would be sufficiently
motivated to engage in sousveillance, and also would be financially and technically
competent to do so, and so on. It is also perhaps over-egging the power pudding – the proof
of which is that in the intervening decade, such wearable technologies have yet to break
through (though more of this anon). The value of sousveillance as a concept lies however, in
its return of the focus to the individual.
In that light, and perhaps in response to the oppositional/conspiratorial theme in the
sousveillance literature, we also have the related concept of lifelogging, which, while also
relying on wearable technology (usually a small neck-slung camera), does not mean to
obviously evoke notions of countering a police state. The idea here is to attempt to record as
much of a person’s everyday life as possible and to save everything to physical memory.
However, as Sellen & Whittaker (2010, p.73) note, “many lifelogging systems lack an
explicit description of potential value for users” and one is left wondering what the point of
these projects really is, except perhaps for the engineers to overcome the technical
challenges involved in storing huge quantities of photographic data. In response to such
criticism, the likes of Doherty et al. (2011) have responded by outlining the many potential
benefits of lifelogging, such as assessing one’s health and physical activity, improving
personal efficiency and lifestyle analysis, all of which have considerable merit – though it
must be noted that privacy concerns have been dealt with in passing, at best.
But the chief benefit of lifelogging, fundamental and implicit in the whole
enterprise, is that of memory enhancement: if we wear these small cameras around our necks
we may never forget anything. However, as Locke (1700) explored centuries ago, we
are what we remember and are conscious of. In fairness, Doherty et al. (2012, p. 169)
mention in passing, that “forgetting is very important”, but it needs to be recognised more
profoundly by lifeloggers: keeping a permanent record of a person’s total life activity, while
it has obvious benefits, may also have negative side-effects. There are many and varied links
between memory and well-being, for example: overgeneral autobiographical memory and
trauma (Moore & Zoellner, 2007), positive effects of false memories (Howe, Garner, & Patel,
2013), depressive realism (Ackermann & DeRubeis, 1991) and childhood amnesia (Peterson,
Grant, & Boland, 2005).
Essentially, it is important for the development of a positive self-concept that we
forget, or at least reinterpret, painful past memories. As such, it is important that the
psychological effects of new technologies, for whatever noble purpose they are designed, be
fully audited. While the nuances of ‘memory as a human experience’ contra ‘memory as a
computer feature’ have been noted for some time (Sellen et al., 2007), actually figuring out
the beneficial aspects for the individual user is only more recently being explored (Crete-Nishihata
et al., 2012). The latter study is indicative of benefits as it focuses on older users,
and the possible mental health benefits of using lifelogging-related technologies to ameliorate
cognitive/memory impairments. However, at the same time, while this is an apt clinical
usage, if such technologies ever get to market in a widespread manner
(which is, despite previous false dawns, always possible – see recent CES press), they are
most likely to be aimed at a younger demographic – who, as we have seen, will have far
different psychological motivations.
The main point here is that in this post-Snowden environment, there are many
possible solutions for developers to engage with to overcome public fears with regard to their
data collection, but the fundamental aim must be to empower and enlighten the individual
user. This point has been made before - O’Hara, Tuffield, and Shadbolt (2008) speak at
length on lifelogging, the empowerment of the individual alongside the attendant privacy
concerns – yet it has never been more relevant. If people are going to continue to be asked to
surrender their personally identifiable information to a remote server, visible to whom they
do not know, there must be a quid pro quo of returned utility. That particular point is hardly a
controversial one, yet a parallel observation remains unexplored: while academics and
technicians muse on the viability, utility and privacy of the likes of sousveillance and
lifelogging, the basic data needed to achieve most of those technologies’ assumed aims
already exists and is incredibly widespread. To put it bluntly, my iPhone already gathers huge
amounts of data about where I go, what activity I am engaged with and who I deal with, but it
is next to impossible for me to view that data aggregated anywhere: I am interested in my
metadata, can someone please show it to me?
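What such an idiographic reflection might look like can be sketched very simply. The following Python fragment is purely illustrative – the sample log and the function are invented – but it shows how raw location metadata of the kind a phone already collects could be aggregated and reflected back to its owner.

```python
from collections import Counter

# Hypothetical sample of the kind of metadata a phone already logs:
# (timestamp, location label) pairs. Real device logs are far richer
# and are not normally exposed to the user in aggregate form.
location_log = [
    ("2014-01-06 08:55", "office"),
    ("2014-01-06 12:30", "cafe"),
    ("2014-01-06 17:40", "home"),
    ("2014-01-07 09:02", "office"),
    ("2014-01-07 18:01", "home"),
]

def summarise_locations(log):
    """Aggregate raw location records into a 'most frequent locations'
    style summary, reflecting the user's own metadata back to them."""
    counts = Counter(place for _, place in log)
    return counts.most_common()

print(summarise_locations(location_log))
# [('office', 2), ('home', 2), ('cafe', 1)]
```

The point of the sketch is how little is required: the data already exists on the device; only the reflective interface is missing.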
In sum, we need to provide individual users with greater visibility of the data which
they are sharing with service providers. For example the MIT Immersion project
(immersion.media.mit.edu) provides a very interesting reflection to the user of their Gmail
metadata. At the same time, the response to Apple’s inclusion of a ‘most frequent locations’
feature in iOS 7 (e.g. Rodriguez, 2013) was nothing short of embarrassing: such data must be
collected for basic phone calls to be carried successfully; it doesn’t happen by magic.
Ultimately we must not be surprised when our data is reflected back to us, in fact we must
demand it and it should be presented to us as a matter of course, if not legislated for.
Such a position recognises the fact that, though some of the more paranoid
participants in the debate might want to, if it is possible at the moment to live ‘offline’, it will
very soon become impossible (cf. Kessler, 2013). Data about, by and for individual
members of society will continue to be created at an exponential rate and therefore we must
reflect upon how it is used. No one wants to live in the sort of surveillance state
which Fernback (2013) seems to believe is upon us, where we must form ‘discourses of
resistance’ against social networks; yet at the same time no-one can deny that there is an
inherent truth to Tokunaga's (2011) finding that a considerable amount of
‘interpersonal surveillance’ is also ongoing – we are informally spying on each other to a
considerable extent. Tokunaga's investigation occurred in the context of romantic
relationships, unsurprisingly finding that ‘interpersonal electronic surveillance’ was more
likely to be exercised by younger users of social networks than older, but it does chime with
other findings that the majority of time spent in such contexts is ‘stalking’ or simply
browsing other users’ profiles (e.g. Lewis and West, 2009) – amateur surveillance, in other
words.
The irony of certain responses to the privacy and surveillance phenomena is
encapsulated in a certain passage in Fernback’s (2013) paper. In describing Facebook’s
effectively bait-and-switch changing of policies which made private data public, Fernback
(2013) reports how many Facebook groups were set up in response, noting themes such as
intrusiveness and disappointment in the ‘discourses of resistance’ therein. However, in
32
accounting of Facebook users’ anger at not having consented their data to be used in this
way, Fernback (2013) neglects to mention whether or not those users’ he cites were asked for
permission as to whether their posts – which are quoted at length and with users’ full names –
could be reproduced in his paper.
Future directions
In that light there are a number of final observations which are worth bearing in mind as we
attempt to maturely rebuild cyberspace. In attempting to focus analysis and use of data away
from nomothetic to idiographic, let us also admit why this is important. ‘Big’ datasets are
inherently problematic, and as Ohm (2010, p. 1704) notes, “data can be either useful or
perfectly anonymous but never both”. Through a series of examples, Ohm (2010) shows that
datasets which were ostensibly anonymised were quickly cracked through combination with
other data, and subjects recognized – a process he terms ‘accretive reidentification’, which
will necessarily create and amplify privacy harms. A good example of this is provided by
Zimmer (2010) in showing how the dataset on which Lewis, Kaufman, Gonzalez, Wimmer,
and Christakis’ (2008) study was based, on its subsequent release, was quickly cracked,
participants identified, and had to be recalled. Similarly, de Montjoye, Hidalgo, Verleysen,
and Blondel (2013, p. 1) conclude that “even coarse datasets provide little anonymity. These
findings represent fundamental constraints to an individual’s privacy and have important
implications for the design of frameworks and institutions dedicated to protect the privacy of
individuals.” As Ohm (2010, p. 1723) agrees, “results suggest that maybe everything is PII
[personally identifiable information] to one who has access to the right outside information.”
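The logic of accretive reidentification is simple enough to sketch. The following Python fragment is purely illustrative – all records, names and field labels are invented – but it shows how two datasets, each apparently anonymous on its own, can be joined on shared quasi-identifiers (here, a postcode and birth year) to recover identities.

```python
# A minimal illustration of 'accretive reidentification' (Ohm, 2010):
# an 'anonymised' dataset and a public register are linked on the
# quasi-identifiers they share. All data here is invented.

anonymised_health = [
    {"zip": "D02", "birth_year": 1986, "diagnosis": "asthma"},
    {"zip": "D04", "birth_year": 1972, "diagnosis": "diabetes"},
]

public_register = [
    {"name": "A. Byrne", "zip": "D02", "birth_year": 1986},
    {"name": "C. Murphy", "zip": "D04", "birth_year": 1972},
]

def reidentify(anonymous_rows, identified_rows, keys=("zip", "birth_year")):
    """Link records that agree on every quasi-identifier in `keys`,
    attaching a name to each supposedly anonymous record."""
    matches = []
    for anon in anonymous_rows:
        for known in identified_rows:
            if all(anon[k] == known[k] for k in keys):
                matches.append((known["name"], anon["diagnosis"]))
    return matches

print(reidentify(anonymised_health, public_register))
# [('A. Byrne', 'asthma'), ('C. Murphy', 'diabetes')]
```

The sketch understates the real problem: with enough outside datasets, almost any combination of attributes can serve as the `keys`, which is precisely Ohm's point that everything is potentially PII.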
In characterising the combination of datasets as having the potential to create for any
particular individual a ‘database of ruin’, Ohm (2010) describes some factors which may be
useful in assessing the risk of these privacy harms, such as data-handling techniques, data
quantity and motives for use. These, it must be said, do not over-awe Grimmelmann (2014),
who runs the rule over ‘Big data’s other privacy problem’ – in effect, that while privacy
harms or accretive reidentification may be problematic, we also need to be wary of who is
managing the data, and of who is managing them: oversight, and quis custodiet ipsos custodes.
This ‘Empedoclean regress’ has previously been explained as the ‘burden of omniscience’ by
Wilson (2000, p. 244), a conundrum to which there does not seem to be any obvious
solution. Grimmelmann (2014) concludes with some rather vague propositions he calls ‘little
data’, or ‘democratic data’, which do not seem to contradict the idiographic approach
mentioned above. What is necessary in all of this is for us to acknowledge and demarcate the
worth we invest in our personal information – what the Germans call
informationelle Selbstbestimmung: ‘knowing what data you have and being in control over
how it’s used’ (Bartlett, 2013). This may be something of a pipe-dream, but wishing that
NSA surveillance will come to an end, or that there will be no more Snowdens, is more
naive still. The most pragmatic direction out of the crisis is to focus on the
development of strategies and applications of personal reflective development that teach the
individual user more about themselves, their data and its worth.
There are many other potential avenues out of the crosshairs of surveillance and
subversion. For one thing, with regard to youth, it could be argued on the basis of the
research reported above that it is not appropriate for those under the age of 18 to have their
data permanently recorded. Could it be internationally and legally agreed that youth have a
right to ephemeral networks, and their data to be deleted up to their majority? Ultimately we
need to be more understanding around and educational with regard to identity play and
electronic bricólage. These are technical challenges, but what of it? It would be a shame
indeed if the software developers of the world collectively decided that, after having solved
so many previously-intractable problems of computer science and engineering, the protection
of minors on the internet was beyond their self-professed genius.
Other approaches rest on recognising the reality of data accretion and the variability
in personally identifiable information. It might be worthwhile constructing a base level of
online communication in a society, a sort of online directory of post-boxes: this is to
recognise that individual territories have shared boundaries. Access to such post-boxes could be built on a
nested hierarchy of personal information – i.e. if you have a person’s first name and surname,
you may not be able to communicate with them, but if you have their postcode in addition,
you may be able to send them a very short, text-only message, whereas if you have their full
name, date of birth and phone number, you might be able to send them a large, encrypted file.
In this way, a public network would be internally protected by social networks and private
relationships.
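The nested hierarchy proposed above could be sketched roughly as follows (a hypothetical illustration only; the tiers, field names and capabilities are our invented examples, not a specification):

```python
# Hypothetical sketch of a 'nested hierarchy' access rule: the more
# of a person's identifiers a sender can supply, the richer the
# message they may deliver. Tiers and field names are invented.

TIERS = [
    # (identifiers the sender must know, capability unlocked)
    ({"first_name", "surname", "dob", "phone"}, "large encrypted file"),
    ({"first_name", "surname", "postcode"}, "short text-only message"),
]

def permitted_capability(known_fields):
    """Return the richest capability whose identifier set is fully known."""
    known = set(known_fields)
    for required, capability in TIERS:
        if required <= known:      # subset test: sender knows all of them
            return capability
    return "no contact"           # a bare name unlocks nothing
```

In this way the public directory itself enforces the territorial boundary: what you may send is a function of how much of the recipient’s identity you already legitimately hold.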
Alternatively, by recognising how people usually manage issues of territoriality – i.e.
in solidarity – we can decentralise the network further with collective privacy. We should
consider organisational communication as collectively private, in that each member of a
team should be able to see every other member’s messages. In this way, we have complete
internal transparency: a radical departure organisationally, but one which is probably more
realistic psychologically than most current scenarios.
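A minimal sketch of such collective privacy (our illustration; the names are invented) might treat the team, rather than the individual, as the unit of addressability:

```python
# Minimal sketch of 'collective privacy': messages are addressed to a
# team, not an individual, and every member can read every message --
# complete internal transparency, opacity to outsiders.

class Team:
    def __init__(self, members):
        self.members = set(members)
        self.messages = []          # one shared, team-wide log

    def post(self, sender, text):
        if sender not in self.members:
            raise PermissionError("only members may post")
        self.messages.append((sender, text))

    def read(self, reader):
        # Any member sees the full log; outsiders see nothing.
        if reader not in self.members:
            raise PermissionError("only members may read")
        return list(self.messages)

team = Team({"ana", "ben", "cara"})
team.post("ana", "draft ready")
```

Here privacy is a property of the group boundary rather than of any individual message, mirroring how territoriality is usually managed in solidarity.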
Of course, the issues of ethical database management still apply in many of these
scenarios. In recognising that the collection of large amounts of data ultimately brings large
amounts of power, which will always be problematic, whatever the motivation or necessity of
such gathering, there are, however, contradictory possibilities when we reflect on this and the
wisdom of the ancients. When Bacon (1597) concluded scientia potentia est (‘knowledge is
power’) and Acton observed that “power tends to corrupt and absolute power corrupts
absolutely”, we can easily see how the collection of ‘absolute knowledge’ can cause
problems.
It might also lead one to infer, moving from objective to subjective spheres, and
mindful of our previous reference to ‘depressive realism’, that we recoil from ‘absolute
knowledge’ about ourselves, out of some kind of deep-seated fear of knowing too much.
We ignore where our information goes because it implicates and
embarrasses us. Instead, we prefer repression, imagined identities and tentative territories –
cultural values long overdue for revision.
References
Ackermann, R., & DeRubeis, R. J. (1991). Is depressive realism real? Clinical Psychology
Review, 11(5), 565–584. doi:10.1016/0272-7358(91)90004-E
Acquisti, A., & Gross, R. (2006). Imagined communities: Awareness, information sharing,
and privacy on the Facebook. In PET 2006, LNCS 4258 (pp. 36–58). Berlin,
Heidelberg: Springer-Verlag.
Annalect. (2013). Annalect Q2 2013 Online Consumer Privacy Study (pp. 1–4). Retrieved
from www.annalect.com
Archer, S. L. (1989). Gender differences in identity development: Issues of process, domain
and timing. Journal of Adolescence, 12(2), 117–138. doi:10.1016/0140-
1971(89)90003-1
Bacon, F. (1597). Meditationes sacrae. London: Humfred Hooper.
Baek, Y. M., Kim, E., & Bae, Y. (2014). My privacy is okay, but theirs is endangered: Why
comparative optimism matters in online privacy concerns. Computers in Human
Behavior, 31, 48–56. doi:10.1016/j.chb.2013.10.010
Barnes, S. B. (2006). A privacy paradox: Social networking in the United States. First
Monday, 11(9), 1–11.
Bartlett, J. (2013). iSPY: How the internet buys and sells your secrets. The Spectator.
Retrieved January 20, 2014, from http://www.spectator.co.uk/features/9093961/little-
brothers-are-watching-you/
Bauer, L., Cranor, L. F., Komanduri, S., Mazurek, M. L., Reiter, M. K., Sleeper, M., & Ur, B.
(2013). The post anachronism. In Proceedings of the 12th ACM workshop on
Workshop on privacy in the electronic society - WPES ’13 (pp. 1–12). New York,
New York, USA: ACM Press. doi:10.1145/2517840.2517859
Bélanger, F., & Crossler, R. E. (2011). Privacy in the digital age: A review of information
privacy research in information systems. MIS Quarterly, 35(4), 1017–1041. Retrieved
from http://dl.acm.org/citation.cfm?id=2208951
Benson, C. (2001). The cultural psychology of self: Place, morality and art in human worlds.
Routledge.
Benson, C. (2013). Acts not tracts! Why a complete psychology of art and identity must be
neuro-cultural. In Art and Identity: Essays on the Aesthetic Creation of Mind (pp. 39–
66). Amsterdam: Rodopi.
Bernstein, M. S., Harry, D., André, P., Panovich, K., & Vargas, G. (2011). 4chan and /b/: An
analysis of anonymity and ephemerality in a large online community. In ICWSM,
Association for the Advancement of Artificial Intelligence (pp. 1–8).
doi:10.1.1.207.9761
Boyd, D. (2008). Facebook’s privacy trainwreck: Exposure, invasion, and social
convergence. Convergence: The International Journal of Research into New Media
Technologies, 14(1), 13–20. doi:10.1177/1354856507084416
Brandimarte, L., Acquisti, A., & Loewenstein, G. (2010). Misplaced confidences: Privacy
and the control paradox. In Ninth Annual Workshop on the Economics of Information
Security. Cambridge, MA.
Brown, G. (2009). Claiming a corner at work: Measuring employee territoriality in their
workspaces. Journal of Environmental Psychology, 29(1), 44–52.
doi:10.1016/j.jenvp.2008.05.004
Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on
Facebook. New Media & Society, 14(7), 1164–1180. doi:10.1177/1461444812440159
Cakir, S., & Aydin, G. (2005). Parental attitudes and ego identity status of Turkish
adolescents. Adolescence, 40, 847–859.
Castro, D. (2013). How much will PRISM cost the U.S. cloud computing industry? (pp. 1–9).
Washington, DC. Retrieved from www.itif.org
Christofides, E., Muise, A., & Desmarais, S. (2011). Hey Mom, What’s on Your Facebook?
Comparing Facebook Disclosure and Privacy in Adolescents and Adults. Social
Psychological and Personality Science, 3(1), 48–54. doi:10.1177/1948550611408619
Coles-Kemp, L., Lai, Y., & Ford, M. (2010). Privacy on the internet: Attitudes and
behaviours (pp. 1–34). Retrieved from vome.org.uk
Côté, J. E. (2006). Emerging adulthood as an institutionalized moratorium: Risks and benefits
to identity formation. In J. J. Arnett & J. L. Tanner (Eds.), Emerging adults in
America: Coming of age in the 21st century (pp. 85–116). Washington, DC:
American Psychological Association. doi:10.1037/11381-004
Crete-Nishihata, M., Baecker, R. M., Massimi, M., Ptak, D., Campigotto, R., Kaufman, L. D.,
… Black, S. E. (2012). Reconstructing the past: personal memory technologies are not
just personal and not just for memory. Human–Computer Interaction, 27(1-2), 92–
123. doi:10.1080/07370024.2012.656062
Dale, J. (2013). Facebook privacy is a joke: How Edward Snowden changed my online
habits. rabble.ca. Retrieved January 06, 2014, from
http://rabble.ca/news/2013/08/facebook-privacy-joke-how-edward-snowden-changed-
my-online-habits
De Montjoye, Y.-A., Hidalgo, C. A., Verleysen, M., & Blondel, V. D. (2013). Unique in the
crowd: The privacy bounds of human mobility. Scientific reports, 3, 1376.
doi:10.1038/srep01376
Debatin, B., Lovejoy, J. P., Horn, A.-K., & Hughes, B. N. (2009). Facebook and online
privacy: Attitudes, behaviors, and unintended consequences. Journal of Computer-
Mediated Communication, 15(1), 83–108. doi:10.1111/j.1083-6101.2009.01494.x
Doherty, A. R., Caprani, N., Conaire, C. Ó., Kalnikaite, V., Gurrin, C., Smeaton, A. F., &
O’Connor, N. E. (2011). Passively recognising human activities through lifelogging.
Computers in Human Behavior, 27(5), 1948–1958. doi:10.1016/j.chb.2011.05.002
Doherty, A. R., Pauly-Takacs, K., Caprani, N., Gurrin, C., Moulin, C. J. A., O’Connor, N. E.,
& Smeaton, A. F. (2012). Experiences of aiding autobiographical memory using the
SenseCam. Human-Computer Interaction, 27(27), 37–41.
doi:10.1080/07370024.2012.656050
Dumas, T. M., Ellis, W. E., & Wolfe, D. A. (2012). Identity development as a buffer of
adolescent risk behaviors in the context of peer group pressure and control. Journal of
adolescence, 35(4), 917–27. doi:10.1016/j.adolescence.2011.12.012
Edney, J. J. (1974). Human territoriality. Psychological Bulletin, 81(12), 959–975.
doi:10.1037/h0037444
Ellison, N. B., Steinfield, C., & Lampe, C. (2007). The benefits of Facebook “Friends:”
Social capital and college students’ use of online social network sites. Journal of
Computer-Mediated Communication, 12(4), 1143–1168. doi:10.1111/j.1083-
6101.2007.00367.x
Ellison, N. B., Vitak, J., Steinfield, C., & Gray, R. (2011). Negotiating privacy concerns and
social capital needs in a social media environment. In S. Trepte & L. Reinecke (Eds.),
Privacy online (pp. 19–32). Berlin, Heidelberg: Springer. doi:10.1007/978-3-642-21521-6
Erikson, E. H. (1968). Identity: Youth and crisis. New York: Norton.
Fernback, J. (2013). Sousveillance: Communities of resistance to the surveillance
environment. Telematics and Informatics, 30(1), 11–21.
doi:10.1016/j.tele.2012.03.003
Fisher, J. D., & Byrne, D. (1975). Too close for comfort: Sex differences in response to
invasions of personal space. Journal of Personality and Social Psychology, 32(1), 15–
21. doi:10.1037/h0076837
Fitsanakis, J. (2013). Are Russian spies switching to typewriters to avoid interception?
intelNews.org. Retrieved January 06, 2013, from
http://intelnews.org/2013/07/12/01-1298/
Fogel, J., & Nehmad, E. (2009). Internet social network communities: Risk taking, trust, and
privacy concerns. Computers in Human Behavior, 25(1), 153–160.
doi:10.1016/j.chb.2008.08.006
Gibson, R. K., & Mcallister, I. (2009). Crossing the web 2.0 frontier? Candidates and
campaigns online in the Australian Federal Election of 2007. In ECPR General
Conference (pp. 1–30). Potsdam, Germany.
Grasmuck, S., Martin, J., & Zhao, S. (2009). Ethno-racial identity displays on Facebook.
Journal of Computer-Mediated Communication, 15(1), 158–188. doi:10.1111/j.1083-
6101.2009.01498.x
Greenhow, C., & Robelia, B. (2009). Informal learning and identity formation in online
social networks. Learning, Media and Technology, 34(2), 119–140.
doi:10.1080/17439880902923580
Greenwald, G. (2013, June 6). NSA collecting phone records of millions of Verizon
customers daily. The Guardian. Retrieved from
http://www.theguardian.com/world/2013/jun/06/nsa-phone-records-verizon-court-
order
Grimmelmann, J. (2014). Big data’s other privacy problem. In Big Data and the Law (Vol.
11, pp. 1–10). West Academic. Retrieved from
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2358079
Gueorguieva, V. (2007). Voters, MySpace, and YouTube: The impact of alternative
communication channels on the 2006 election cycle and beyond. Social Science
Computer Review, 26(3), 288–300. doi:10.1177/0894439307305636
Henrich, J., Heine, S. J., & Norenzayan, A. (2010). The weirdest people in the world? The
Behavioral and brain sciences, 33(2-3), 61–135. doi:10.1017/S0140525X0999152X
Herring, S. C. (2004). Slouching toward the ordinary: Current trends in computer-mediated
communication. New Media & Society, 6(1), 26–36. doi:10.1177/1461444804039906
Herring, S. C. (2008). Questioning the generational divide: Technological exoticism and
adult constructions of online youth identity. In D. Buckingham (Ed.), Youth, Identity,
and Digital Media (pp. 71–92). Cambridge, MA: The MIT Press.
doi:10.1162/dmal.9780262524834.071
Hess, A. (2014). Why women aren’t welcome on the internet. Pacific Standard. Santa
Barbara, California. Retrieved January 23, 2014, from
http://www.psmag.com/navigation/health-and-behavior/women-arent-welcome-
internet-72170/
Hoadley, C. M., Xu, H., Lee, J. J., & Rosson, M. B. (2010). Privacy as information access
and illusory control: The case of the Facebook News Feed privacy outcry. Electronic
Commerce Research and Applications, 9(1), 50–60. doi:10.1016/j.elerap.2009.05.001
Hogan, B. (2010). The presentation of self in the age of social media: Distinguishing
performances and exhibitions online. Bulletin of Science, Technology & Society,
30(6), 377–386. doi:10.1177/0270467610385893
Hoofnagle, C., King, J., Li, S., & Turow, J. (2010). How different are young adults from
older adults when it comes to information privacy attitudes and policies? Retrieved
from http://ssrn.com/abstract=1589864
Horning, R. (2014). Privacy can be achieved by multiplying public selves. Leaving
breadcrumbs here to throw people on to the wrong trail. Twitter. Retrieved January
08, 2014, from https://twitter.com/marginalutility/status/419205717829365760
Howe, M. L., Garner, S. R., & Patel, M. (2013). Positive consequences of false memories.
Behavioral sciences & the law, 31(5), 652–65. doi:10.1002/bsl.2078
Hoy, M. G., & Milne, G. R. (2010). Gender differences in privacy-related measures for
young adult Facebook users. Journal of Interactive Advertising, 10(2), 28–46.
Jordán-Conde, Z., Mennecke, B., & Townsend, A. (2013). Late adolescent identity definition
and intimate disclosure on Facebook. Computers in Human Behavior.
doi:10.1016/j.chb.2013.07.015
Kaya, N., & Weber, M. J. (2003). Territorial behavior in residence halls: A cross-cultural
study. Environment & Behavior, 35(3), 400–414. doi:10.1177/0013916503035003005
Kessler, S. (2013). Think you can live offline without being tracked? Here’s what it takes.
Fast Company. Retrieved January 20, 2014, from
http://www.fastcompany.com/3019847/think-you-can-live-offline-without-being-
tracked-heres-what-it-takes
Kosinski, M., Stillwell, D., & Graepel, T. (2013). Private traits and attributes are predictable
from digital records of human behavior. Proceedings of the National Academy of
Sciences of the United States of America, 110(15), 5802–5.
doi:10.1073/pnas.1218772110
Lehtinen, V., Näsänen, J., & Sarvas, R. (2009). “A little silly and empty-headed” – Older
adults’ understandings of social networking sites. In HCI 2009 - People and
Computers XXIII (pp. 45–54).
Lepore, J. (2013, June). Privacy in an age of publicity. The New Yorker. New York. Retrieved
January 09, 2014, from
http://www.newyorker.com/reporting/2013/06/24/130624fa_fact_lepore?currentPage
=all
Lewis, J., & West, A. (2009). “Friending”: London-based undergraduates’ experience of
Facebook. New Media & Society, 11(7), 1209–1229. doi:10.1177/1461444809342058
Lewis, K., Kaufman, J., Gonzalez, M., Wimmer, A., & Christakis, N. (2008). Tastes, ties, and
time: A new social network dataset using Facebook.com. Social Networks, 30(4),
330–342. doi:10.1016/j.socnet.2008.07.002
Light, B., & McGrath, K. (2010). Ethics and social networking sites: A disclosive analysis of
Facebook. Information Technology & People, 23(4), 290–311.
doi:10.1108/09593841011087770
Locke, J. (1700). An essay concerning humane understanding, in four books (4th ed.).
London: Awnsham and John Churchil. Retrieved from
https://archive.org/details/essayconcernin00lockuoft
Locke, J. (1751). The works of John Locke, Esq; In three volumes, Vol. II (5th ed.). London:
S. Birt, D. Browne, T. Longman... Retrieved from
http://books.google.com/books?id=q-k-AAAAcAAJ&pgis=1
Mann, S., Nolan, J., & Wellman, B. (2003). Sousveillance: Inventing and using wearable
computing devices for data collection in surveillance environments. Surveillance &
Society, 1(3), 331–355.
Marcia, J. E. (1966). Development and validation of ego-identity status. Journal of
Personality and Social Psychology, 3(5), 551–558. doi:10.1037/h0023281
Marwick, A. E. (2014). How your data are being deeply mined. The New York Review of
Books, 61(1). Retrieved from
http://www.nybooks.com/articles/archives/2014/jan/09/how-your-data-are-being-
deeply-mined/?pagination=false
McDonald, A. M., & Cranor, L. F. (2008). The cost of reading privacy policies. I/S: A
Journal on Law and Policy for the Information Society, 4(3), 540–565.
Mehdizadeh, S. (2010). Self-presentation 2.0: Narcissism and self-esteem on Facebook.
Cyberpsychology, behavior and social networking, 13(4), 357–64.
doi:10.1089/cyber.2009.0257
Mills, E. (2007, June 4). Google’s street-level maps raising privacy concerns. USA Today.
San Jose, CA. Retrieved from
http://usatoday30.usatoday.com/tech/news/internetprivacy/2007-06-01-google-maps-
privacy_N.htm
Milne, G. R., & Culnan, M. J. (2004). Strategies for reducing online privacy risks: Why
consumers read (or don’t read) online privacy notices. Journal of Interactive
Marketing, 18(3), 15–29. doi:10.1002/dir.20009
Moore, S. A., & Zoellner, L. A. (2007). Overgeneral autobiographical memory and traumatic
events: an evaluative review. Psychological bulletin, 133(3), 419–37.
doi:10.1037/0033-2909.133.3.419
Moscardelli, D. M., & Divine, R. (2007). Adolescents’ concern for privacy when using the
internet: An empirical analysis of predictors and relationships with privacy-protecting
behaviors. Family and Consumer Sciences Research Journal, 35(3), 232–252.
doi:10.1177/1077727X06296622
O’Hara, K., Tuffield, M. M., & Shadbolt, N. (2008). Lifelogging: Issues of identity and
privacy with memories for life. In First International Workshop on Identity in the
Information Society. Arona, Italy. Retrieved from http://eprints.ecs.soton.ac.uk/15993/
Ohm, P. (2010). Broken promises of privacy: Responding to the surprising failure of
anonymization. UCLA Law Review, 57, 1701–1777.
Owen, D. (2009). Election media and youth political engagement. Journal of Social Science
Education, 7(2), 14–24.
Pastalan, L. A. (1970). Privacy as an expression of human territoriality. In L.A. Pastalan & D.
H. Parson (Eds.), Spatial behavior of older people (pp. 88–101). Ann Arbor, MI:
University of Michigan.
Payne, S. (2013, December 9). Hypocrisy alert: Big tech firms complain of data intrusion.
The Spectator. Retrieved from
http://blogs.spectator.co.uk/coffeehouse/2013/12/hypocrisy-alert-big-tech-firms-
complain-of-data-intrusion/
Pempek, T. A., Yermolayeva, Y. A., & Calvert, S. L. (2009). College students’ social
networking experiences on Facebook. Journal of Applied Developmental Psychology,
30(3), 227–238. doi:10.1016/j.appdev.2008.12.010
PEN American Center. (2013). Chilling effects: NSA surveillance drives U.S. writers to self-
censor. New York. Retrieved from pen.org
Penny, L. (2014). David Cameron’s internet porn filter is the start of censorship creep. The
Guardian. Retrieved January 08, 2014, from
http://www.theguardian.com/commentisfree/2014/jan/03/david-cameron-internet-
porn-filter-censorship-creep
Peterson, C., Grant, V. V, & Boland, L. D. (2005). Childhood amnesia in children and
adolescents: Their earliest memories. Memory (Hove, England), 13(6), 622–37.
doi:10.1080/09658210444000278
Porfeli, E. J., Lee, B., Vondracek, F. W., & Weigold, I. K. (2011). A multi-dimensional
measure of vocational identity status. Journal of adolescence, 34(5), 853–71.
doi:10.1016/j.adolescence.2011.02.001
Powazek, D. (2012). I’m not the product, but I play one on the internet. powazek.com/.
Retrieved January 06, 2014, from http://powazek.com/posts/3229
Proshansky, H. M., Ittleson, W. H., & Rivlin, L. G. (1970). Freedom of choice and behavior
in a physical setting. In H. M. Proshansky, W. H. Ittleson, & L. G. Rivlin (Eds.),
Environmental psychology. New York: Holt, Rinehart & Winston.
Rodriguez, S. (2013, August 8). iOS 7’s controversial new feature: Frequent Locations
tracking map. Los Angeles Times. Los Angeles. Retrieved from
http://www.latimes.com/business/technology/la-fi-tn-ios-7-frequent-locations-
tracking-map-20130808,0,3403916.story
Schwarz, O. (2010). On friendship, boobs and the logic of the catalogue: Online self-portraits
as a means for the exchange of capital. Convergence: The International Journal of
Research into New Media Technologies, 16(2), 163–183.
doi:10.1177/1354856509357582
Seder, J. P., & Oishi, S. (2009). Ethnic/racial homogeneity in college students’ Facebook
friendship networks and subjective well-being. Journal of Research in Personality,
43(3), 438–443. doi:10.1016/j.jrp.2009.01.009
Sellen, A., Fogg, A., Aitken, M., Hodges, S., Rother, C., & Wood, K. (2007).
Do life-logging technologies support memory for the past? An experimental study
using SenseCam. In CHI ’07 Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems (pp. 81–90).
Shein, E. (2013). Ephemeral data. Communications of the ACM, 56(9), 20.
doi:10.1145/2500468.2500474
Sommer, R., & Becker, F. D. (1969). Territorial defense and the good neighbor. Journal of
personality and social psychology, 11(2), 85–92.
Staten, J. (2013). The cost of PRISM will be larger than ITIF projects. Forrester Research.
Retrieved January 06, 2014, from http://blogs.forrester.com/james_staten/13-08-14-
the_cost_of_prism_will_be_larger_than_itif_projects
Strano, M. M. (2008). User descriptions and interpretations of self presentation through
Facebook profile images. Journal of Psychosocial Research on Cyberspace, 2(2), 1.
Retrieved from
http://cyberpsychology.eu/view.php?cisloclanku=2008110402&article=1
Stutzman, F., Gross, R., & Acquisti, A. (2012). Silent listeners: The evolution of privacy and
disclosure on Facebook. Journal of Privacy and Confidentiality, 4(2), 7–41.
Suler, J. (2004). The online disinhibition effect. CyberPsychology & Behavior, 7(3), 321–
326. doi:10.1089/1094931041291295
Thorbecke, W., & Grotevant, H. D. (1982). Gender differences in adolescent interpersonal
identity formation. Journal of youth and adolescence, 11(6), 479–92.
doi:10.1007/BF01538808
Tokunaga, R. S. (2011). Social networking site or social surveillance site? Understanding the
use of interpersonal electronic surveillance in romantic relationships. Computers in
Human Behavior, 27(2), 705–713. doi:10.1016/j.chb.2010.08.014
Valenzuela, S., Park, N., & Kee, K. F. (2009). Is there social capital in a social network site?:
Facebook use and college students’ life satisfaction, trust, and participation. Journal
of Computer-Mediated Communication, 14(4), 875–901. doi:10.1111/j.1083-
6101.2009.01474.x
Vitak, J. (2012). The impact of context collapse and privacy on social network site
disclosures. Journal of Broadcasting & Electronic Media, 56(4), 451–470.
doi:10.1080/08838151.2012.732140
Vitak, J., Zube, P., Smock, A., Carr, C. T., Ellison, N. B., & Lampe, C. (2011). It’s
complicated: Facebook users' political participation in the 2008 election.
Cyberpsychology, behavior and social networking, 14(3), 107–14.
doi:10.1089/cyber.2009.0226
Warren, S., & Brandeis, L. D. (1890). The right to privacy. Harvard Law Review, 4(5), 193–
220. Retrieved from http://www.jstor.org/stable/1321160
Weber, S., & Mitchell, C. (2008). Imagining, keyboarding, and posting identities: Young
people and new media technologies. In D. Buckingham (Ed.), Youth, Identity, and
Digital Media (pp. 25–48). Cambridge, MA: The MIT Press.
Westlake, E. J. (2008). Friend me if you Facebook: Generation Y and performative
surveillance. TDR/The Drama Review, 52(4), 21–40. doi:10.1162/dram.2008.52.4.21
Whitehead, T. (2013, October 22). We will lose terrorists because of GCHQ leaks, warns
minister. The Telegraph. Retrieved from
http://www.telegraph.co.uk/news/uknews/terrorism-in-the-uk/10397795/We-will-
lose-terrorists-because-of-GCHQ-leaks-warns-minister.html
Wiley, R. E., & Berman, S. L. (2012). The relationships among caregiver and adolescent
identity status, identity distress and psychological adjustment. Journal of adolescence,
35(5), 1203–13. doi:10.1016/j.adolescence.2012.04.001
Wilson, R. A. (2000). Prometheus rising (2nd ed.). Tempe, AZ: New Falcon.
Xenos, M., & Foot, K. (2008). Not your father’s internet: The generation gap in online
politics. In W. L. Bennett (Ed.), Civic life online: Learning how digital media can
engage youth (pp. 51–70). Cambridge, MA: The MIT Press.
doi:10.1162/dmal.9780262524827.051
Youn, S., & Hall, K. (2008). Gender and online privacy among teens: Risk perception,
privacy concerns, and protection behaviors. CyberPsychology & Behavior, 11(6),
763–5. doi:10.1089/cpb.2007.0240
Zimmer, M. (2010). “But the data is already public”: On the ethics of research in Facebook.
Ethics and Information Technology, 12(4), 313–325. doi:10.1007/s10676-010-9227-5
Zittrain, J. (2012). Meme patrol: “When something online is free, you’re not the customer,
you’re the product.” Future of the Internet. Retrieved January 23, 2014, from
http://blogs.law.harvard.edu/futureoftheinternet/2012/03/21/meme-patrol-when-
something-online-is-free-youre-not-the-customer-youre-the-product/
Zukowski, T., & Brown, I. (2007). Examining the influence of demographic factors on
internet users’ information privacy concerns. In SAICSIT 2007 (pp. 197–204). Fish
River Sun, Sunshine Coast, South Africa.

What Data Can Do: A Typology of Mechanisms . Angèle Christin What Data Can Do: A Typology of Mechanisms . Angèle Christin
What Data Can Do: A Typology of Mechanisms . Angèle Christin
 
Dan Trottier
Dan TrottierDan Trottier
Dan Trottier
 
Social-Media
Social-MediaSocial-Media
Social-Media
 
McGregor Watkins
McGregor WatkinsMcGregor Watkins
McGregor Watkins
 
Running head Social Media5Social Media.docx
Running head Social Media5Social Media.docxRunning head Social Media5Social Media.docx
Running head Social Media5Social Media.docx
 
Studying Cybercrime: Raising Awareness of Objectivity & Bias
Studying Cybercrime: Raising Awareness of Objectivity & BiasStudying Cybercrime: Raising Awareness of Objectivity & Bias
Studying Cybercrime: Raising Awareness of Objectivity & Bias
 
Identification of inference attacks on private Information from Social Networks
Identification of inference attacks on private Information from Social NetworksIdentification of inference attacks on private Information from Social Networks
Identification of inference attacks on private Information from Social Networks
 
Scraping the Social? Issues in real-time social research (Departmental Semina...
Scraping the Social? Issues in real-time social research (Departmental Semina...Scraping the Social? Issues in real-time social research (Departmental Semina...
Scraping the Social? Issues in real-time social research (Departmental Semina...
 
Web Science Session 2: Social Media
Web Science Session 2: Social MediaWeb Science Session 2: Social Media
Web Science Session 2: Social Media
 
Hayes Privacy And Social Media Paper, October 29, 2010
Hayes   Privacy And Social Media Paper, October 29, 2010Hayes   Privacy And Social Media Paper, October 29, 2010
Hayes Privacy And Social Media Paper, October 29, 2010
 
Comprehensive Social Media Security Analysis & XKeyscore Espionage Technology
Comprehensive Social Media Security Analysis & XKeyscore Espionage TechnologyComprehensive Social Media Security Analysis & XKeyscore Espionage Technology
Comprehensive Social Media Security Analysis & XKeyscore Espionage Technology
 
10 Overriding Themes from SXSW (March 2014)
10 Overriding Themes from SXSW (March 2014)10 Overriding Themes from SXSW (March 2014)
10 Overriding Themes from SXSW (March 2014)
 
A Call to Action: Protecting the Right to Consumer Privacy Online
A Call to Action: Protecting the Right to Consumer Privacy OnlineA Call to Action: Protecting the Right to Consumer Privacy Online
A Call to Action: Protecting the Right to Consumer Privacy Online
 
Stage II Critical Reading
Stage II Critical ReadingStage II Critical Reading
Stage II Critical Reading
 
Attitudes and the_digital_divide_attitude_measurem
Attitudes and the_digital_divide_attitude_measuremAttitudes and the_digital_divide_attitude_measurem
Attitudes and the_digital_divide_attitude_measurem
 
‘Personal data literacies’: A critical literacies approach to enhancing under...
‘Personal data literacies’: A critical literacies approach to enhancing under...‘Personal data literacies’: A critical literacies approach to enhancing under...
‘Personal data literacies’: A critical literacies approach to enhancing under...
 
International Journal of Engineering Research and Development
International Journal of Engineering Research and DevelopmentInternational Journal of Engineering Research and Development
International Journal of Engineering Research and Development
 

Plus de Fabrice Epelboin

ScPo - SoMe - introduction
ScPo - SoMe - introductionScPo - SoMe - introduction
ScPo - SoMe - introduction
Fabrice Epelboin
 
GCHQ - The art of deception / Reputation War
GCHQ - The art of deception / Reputation WarGCHQ - The art of deception / Reputation War
GCHQ - The art of deception / Reputation War
Fabrice Epelboin
 
Sc po some-09-astroturfing
Sc po some-09-astroturfingSc po some-09-astroturfing
Sc po some-09-astroturfing
Fabrice Epelboin
 
Social Technology leveraging during the Arab Spring - Euromed june 2013
Social Technology leveraging during the Arab Spring - Euromed june 2013Social Technology leveraging during the Arab Spring - Euromed june 2013
Social Technology leveraging during the Arab Spring - Euromed june 2013
Fabrice Epelboin
 

Plus de Fabrice Epelboin (20)

Enseigner a distance
Enseigner a distance Enseigner a distance
Enseigner a distance
 
HCFDC progrès & technologies, une mise en perspective
HCFDC progrès & technologies, une mise en perspectiveHCFDC progrès & technologies, une mise en perspective
HCFDC progrès & technologies, une mise en perspective
 
Keynote Devoxx 2016
Keynote Devoxx 2016Keynote Devoxx 2016
Keynote Devoxx 2016
 
Sciences Po. Paris - Social Media Behin the Scenes - Aggregation & disintegra...
Sciences Po. Paris - Social Media Behin the Scenes - Aggregation & disintegra...Sciences Po. Paris - Social Media Behin the Scenes - Aggregation & disintegra...
Sciences Po. Paris - Social Media Behin the Scenes - Aggregation & disintegra...
 
LGP oct '14
LGP oct '14LGP oct '14
LGP oct '14
 
ScPo - SoMe - History
ScPo - SoMe - HistoryScPo - SoMe - History
ScPo - SoMe - History
 
ScPo - SoMe - introduction
ScPo - SoMe - introductionScPo - SoMe - introduction
ScPo - SoMe - introduction
 
GCHQ - The art of deception / Reputation War
GCHQ - The art of deception / Reputation WarGCHQ - The art of deception / Reputation War
GCHQ - The art of deception / Reputation War
 
Sorbonne astroturfing
Sorbonne astroturfingSorbonne astroturfing
Sorbonne astroturfing
 
ScPo SoMe week 11
ScPo SoMe week 11ScPo SoMe week 11
ScPo SoMe week 11
 
Sc po some-10-crowds
Sc po some-10-crowdsSc po some-10-crowds
Sc po some-10-crowds
 
Sc po some-09-astroturfing
Sc po some-09-astroturfingSc po some-09-astroturfing
Sc po some-09-astroturfing
 
Medef labcom-bad buzz
Medef labcom-bad buzzMedef labcom-bad buzz
Medef labcom-bad buzz
 
Sc po some-05
Sc po some-05Sc po some-05
Sc po some-05
 
Sc po some-04
Sc po some-04Sc po some-04
Sc po some-04
 
Sc po some-03
Sc po some-03Sc po some-03
Sc po some-03
 
Sc po some-02
Sc po some-02Sc po some-02
Sc po some-02
 
Sc po some-01
Sc po some-01Sc po some-01
Sc po some-01
 
Sc po2013 atelier-s01
Sc po2013 atelier-s01Sc po2013 atelier-s01
Sc po2013 atelier-s01
 
Social Technology leveraging during the Arab Spring - Euromed june 2013
Social Technology leveraging during the Arab Spring - Euromed june 2013Social Technology leveraging during the Arab Spring - Euromed june 2013
Social Technology leveraging during the Arab Spring - Euromed june 2013
 

Dernier

The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptx
heathfieldcps1
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
PECB
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
SoniaTolstoy
 

Dernier (20)

The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptx
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot Graph
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy Reform
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
 
Advance Mobile Application Development class 07
Advance Mobile Application Development class 07Advance Mobile Application Development class 07
Advance Mobile Application Development class 07
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writing
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
 
social pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajansocial pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajan
 
Class 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfClass 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdf
 
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
 
Introduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsIntroduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The Basics
 
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdf
 

Privacy as identity territoriality re-conceptualising behaviour in cyberspace

in particular, have rapidly increased the potential for personal information to be sourced and combined in injurious fashion. In addition, the arrival and proliferation of social media and cloud computing in the last decade has exponentially increased the amount of what is known as 'personally identifiable information' within reach of adversarial actors. This has extended privacy, among other related concepts, out of a very physical sense of 'someone watching me when I would prefer not to be watched' – the foregoing citation having apparently been inspired by reportage of attendance at a Warren family funeral – into a much more conceptual sense of 'someone looking at my information'.

For the purpose of this article, these are the sort of data we are most interested in: what may be termed consciously self-generated information, i.e. our emails, text messages, social media and other online service profiles and content, cloud-hosted documentation, online photo albums and so forth. This excludes, for example, material we are not usually conscious of creating, such as our image being recorded by on-street cameras, or the use of our medical records. While these are obviously important areas for privacy research, and surveillance studies in particular, they are beyond the scope of this paper, where we are most concerned with privacy qua personal subjective experience.
Following from that delineation, it is argued below that to properly understand not only our most recent privacy concerns and crises, but the psychology of information generally, we must unpack the very concept of privacy itself – firstly in this limited sense of ‘personally identifiable information’, latterly, in a broader sense. We demonstrate that if privacy concerns are appreciated for what they fundamentally are, namely territoriality of
identity, then this current crisis becomes easier to unpack and manage. While privacy has previously been linked to territoriality (Pastalan, 1970) and identity to territoriality (e.g. Brown, 2009; Edney, 1974), this is, to our knowledge, the first work to combine all three, particularly in the context of information technology and cyberspace.

What we are dealing with is our individual and primaeval drive to mark out and defend our own unique domain of development and subsistence. Our identity is our own self-property, the product of our individual creation and labour, the symbolic denotations and connotations of which we experience as our own private territory. The problem arises when these factors play out in an implicitly shared, networked and contested context such as cyberspace. However, as we shall see, there are demographic and psychological aspects to the process of identity and territory development, so it may be possible to design technologies that allow for them, making this space safer and more civilised.

The privacy crisis

What is likely to be the most significant single episode in the recent history of this area is the revelation of massive global surveillance by Mr Edward Snowden, beginning in the middle of last year (cf. Greenwald, 2013) and continuing at time of writing (January, 2014). The full and precise nature of this episode is not only too extensive to detail here; it is still underway. Let us therefore limit ourselves to an interpretation of its cyberpsychology – what exactly it means in the context of online behaviour – before exploring how its consequences can be overcome. An initial and uncontroversial premise of this paper must be, however, that the Snowden affair is unlikely to leave no trace in subsequent privacy behaviour, discourse and research.

It should be noted at the outset, however, that the intellectual milieu into which Snowden landed was not a tabula rasa – privacy concerns had been rising intensely for some
time. Notwithstanding the figures of Julian Assange and Chelsea Manning, with whom Edward Snowden can also be juxtaposed, the major internet service corporations had been under journalistic and academic scrutiny for some time with regard to how they treated user information. For example, Google Street View came under immediate question with regard to privacy concerns on its launch (Mills, 2007), and, as we shall discuss further below, there have long been similar concerns about Facebook in academia (Debatin, Lovejoy, Horn, & Hughes, 2009).

It is beyond the remit of this paper to consider the ethics of global surveillance, or, for that matter, of its subsequent exposure. It is, however, within its remit to consider the evolution and adaptation of the human cyberpsychological response to the atmosphere subsequently created, given that neither the former nor the latter aspect is likely to have concluded. As long as massive amounts of data are created, massive amounts of data will be stored: but equally, massive potential for disclosure will exist. What is interesting to conjecture, however, is whether or not the Snowden revelations will have an effect on behaviour – a perspective the current authors believe is more enlightened than simply ignoring the episode, as many academics have done.

In that regard, we do not have much data to go on, but what we do have is curious. On the one hand, as might be expected, we read reports not only that there are fears that terrorists will change their modes of operation (Whitehead, 2013), but also that intelligence organisations are doing likewise – with the KGB apparently investing in typewriters (Fitsanakis, 2013). On the other hand, there are estimates that the United States cloud computing industry alone could lose anywhere between $21.5 billion (Castro, 2013) and $180 billion (Staten, 2013) in revenue over the next three years as a result.
In terms of the behaviour of the private individual though, there are also some indicative reports. An admittedly self-selected (and possibly indicative) survey of 528
American writers found that 28% had 'curtailed or avoided social media activities' since the Snowden revelations (PEN American Center, 2013). More generalisable figures are available from a series of surveys carried out over May–July 2013, involving over 2,100 American adult internet users (Annalect, 2013): the percentage of internet users who rated themselves as either "concerned" or "very concerned" about their online privacy rose from 48% in June to 57% in July. This shift, the authors note, "is largely from unconcerned Internet users becoming concerned—not from the normal vacillation found among neutral Internet users" (p. 2), with even 27% of 'unconcerned' internet users taking some form of action, like editing their social media profile, researching ways to improve privacy or using private modes while browsing (Annalect, 2013).

Thus we have some inkling that internet users as a class may have been roused to a new reality. But from what? A more subjective, qualitative account provides some allusion:

Hearing Edward Snowden tell us in detail how our privacy is a joke was the stone that started the landslide. My denial about the impact of my online activities on my privacy and my relationships with others has come to an end. I started changing my online behaviour. I started to feel very angry. I should not have to censor what I say online to avoid "incriminating" myself. I should be able to share my personal life online according to my own wishes, and not those of corporations and governments… (Dale, 2013, n.p.)

This is a revealing paragraph, but it is the penultimate point raised by Dale (2013), a Canadian writer, which is immediately pertinent: while Snowden fundamentally exposed surveillance by governments, this activity is carried out via the services built by corporations. As Marwick (2014) notes:

Data about your online and offline behavior are combined, analyzed, and sold to marketers, corporations, governments, and even criminals.
The scope of this
collection, aggregation, and brokering of information is similar to, if not larger than, that of the NSA, yet it is almost entirely unregulated and many of the activities of data-mining and digital marketing firms are not publicly known at all. (n.p.)

So while Snowden's exposé of the NSA may continue to grab the headlines, there is probably far more subtle surveillance of what is ostensibly our information going on by profit-driven enterprises. In some cases these are third parties, who aggregate and analyse data bought from elsewhere; in other cases the data are analysed by the entities who collected them in the first place – but in the vast majority of instances the data have been gathered entirely legally, and indeed supplied by us willingly.

In that regard, it is worth standing back from this whole episode of governmental and corporate surveillance of our personally identifiable information – our privacy – and reflecting for a moment. The histrionics of one particular blog post are noteworthy, written in the wake of an open letter (reformgovernmentsurveillance.com) signed by major technology companies, calling on governments to limit surveillance following the Snowden revelations:

The giant online firms complaining of intrusion on personal data is laughable. Nearly all of the companies above are very happy to mine personal data to make money — they don't like the government beating them at their own game. (Payne, 2013, n.p.)

This post, titled with a 'hypocrisy warning', is at least getting some way towards the truth. But, being mindful of Bohr's characterisation of truth¹, we perhaps should not rest at such an analysis.
While it is true that whatever surveillance, legal or illegal, moral or immoral, has been carried out by the NSA and its international partner organisations would not have been possible without the use of the technology built by those companies, neither would it have been possible without the participation – nay, industry – of their users. Social networks are the most obvious example: the entire social graph, the most valuable aspect of Facebook's multi-billion dollar value, was not built by its developers or engineers, but by a billion individual private users, laboriously, thanklessly and gratuitously clicking 'like'. Fundamentally, no spy had to work very hard to get extremely valuable information about the world's internet users: in fact, many of us queued for long hours for the privilege of buying the technology with which we could provide it to them.

There have been many retorts to the 'outrage' over the Snowden revelations, largely hung on words like 'naïve' and to the effect that we should have expected that our emails and suchlike were being read, both by the companies who service them and the governments who control them; the reader can be assured that this is not another one of them. The crux of the issue is more fundamental than that: why have we participated in this so readily? A research study published a few months ahead of the Snowden revelations documented how personal attributes such as sexual orientation, religious and political views, personality traits, intelligence, happiness, and use of addictive substances could all be easily and accurately inferred from a Facebook user's Likes (Kosinski, Stillwell, & Graepel, 2013). The authors, though remarking on the obvious privacy concerns raised by such a finding, conclude with the hope "…that the trust and goodwill among parties interacting in the digital environment can be maintained by providing users with transparency and control over their information" (Kosinski et al., 2013, p. 4).

¹ "One of the favourite maxims of my father was the distinction between the two sorts of truths, profound truths recognized by the fact that the opposite is also a profound truth, in contrast to trivialities where opposites are obviously absurd" – Hans Bohr, 'My Father', in S. Rozental (1967, p. 328)
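The kind of inference Kosinski et al. describe can be illustrated with a toy sketch. To be clear, this is not their actual method – they applied dimensionality reduction and regression to a very large user–Like matrix – and the function names and data below are entirely hypothetical; it is simply a naive-Bayes-style scorer showing how binary Likes can carry evidence about an undisclosed attribute:

```python
# Toy illustration: infer a binary attribute from a user's set of Likes.
# Each Like gets a log-odds weight from how it co-occurs with the attribute
# in a (hypothetical) training population; a user's score is the sum.
from collections import defaultdict
from math import log

def train_like_model(users, smoothing=1.0):
    """users: list of (set_of_likes, has_attribute) pairs.
    Returns a dict mapping each Like to its smoothed log-odds weight."""
    pos_counts = defaultdict(float)
    neg_counts = defaultdict(float)
    n_pos = sum(1 for _, attr in users if attr)
    n_neg = len(users) - n_pos
    for likes, attr in users:
        counts = pos_counts if attr else neg_counts
        for like in likes:
            counts[like] += 1
    weights = {}
    for like in set(pos_counts) | set(neg_counts):
        p = (pos_counts[like] + smoothing) / (n_pos + 2 * smoothing)
        q = (neg_counts[like] + smoothing) / (n_neg + 2 * smoothing)
        weights[like] = log(p / q)
    return weights

def score(weights, likes):
    """Positive score: the attribute is more likely present than absent."""
    return sum(weights.get(like, 0.0) for like in likes)
```

Even this crude scorer illustrates the point: once Likes correlate with an attribute in a training population, any individual's Likes become evidence about that attribute, whether or not the individual ever chose to disclose it.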
While such a hope may have been thwarted by subsequent revelations, our preceding question remains: how on earth could people not realise that the dots they were placing in cyberspace would eventually be joined up? There has long been a truism in 'internet lore' which goes along the lines of 'if you're not paying, you're the product' (Zittrain, 2012), which, despite containing an intuitive nugget
of truth, has been wonderfully deconstructed as both inaccurate and disempowering (Powazek, 2012). It is, however, an aim of this paper to explore where the power has gone in this equation: where it seems we have given away much privacy and not been paid.

Another issue often thrown in with the 'naiveté' argument is the infamous privacy policy – the lengthy legalistic tracts presented at the start of every programme installation and profile instantiation, with so much signed away in a single click. In 2008 it was calculated that "if Americans were to read online privacy policies word-for-word, we estimate the value of time lost as about $781 billion annually" (McDonald & Cranor, 2008, p. 564). Or, to put it more granularly, an average of 40 minutes per day – which must be juxtaposed with the reported estimate of 'time spent browsing the internet per day' of 72 minutes (McDonald & Cranor, 2008). Clearly, this is far from the reality: while it is admittedly a lot of reading, and little of it easy or accessible, it certainly is all there – we have wilfully ignored it.

How have we sleepwalked into this situation? Have we been massively duped by tech evangelism into an illusion of an egalitarian cyber-utopia, only to have its nightmarish surveillance-state underbelly exposed? Yet, critically, what's this 'we' business – who exactly is at risk? All of these questions, with their attendant histrionics, are just as important to answer as the more sardonic questions of the 'no harm, no foul' variety which have developed from the Snowden revelations, and which wonder what the big deal is. Well, what is it? Perhaps it is not so much that we are being surveilled by the government, nor that the corporations are using governments to surveil us, but that we are using governments and corporations to surveil our selves, and each other.
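The scale of the McDonald and Cranor reading-time estimate is easy to sanity-check with a back-of-the-envelope calculation. The inputs below are illustrative assumptions chosen for the sketch, not the figures from their actual model, which priced measured policy lengths and reading speeds against the value of users' time:

```python
def annual_reading_cost(minutes_per_day, n_users, value_per_hour):
    """Annual monetary value of time spent reading privacy policies:
    daily minutes -> yearly hours per user, scaled by population and
    by a dollar value per hour of that time."""
    hours_per_user_per_year = minutes_per_day / 60 * 365
    return hours_per_user_per_year * n_users * value_per_hour

# e.g. ~40 minutes/day, a hypothetical 200 million users, time at $16/hour
cost = annual_reading_cost(40, 200e6, 16.0)  # roughly $0.78 trillion
```

That even rough, plausible inputs land in the hundreds of billions of dollars makes clear why word-for-word reading of privacy policies was never a realistic expectation.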
Demography and character of privacy concerns

To examine this crisis of privacy information, as cyberpsychologists it is first most instructive to determine who exactly is experiencing the crisis – the demography of those most
concerned. In terms of our thesis, below we examine privacy in the manner a psychologist would: with regard to age, gender and so on. The first thing we notice from such a literature review is that, despite much handwringing and hoopla in the media and popular discourse, we do not really know much about those concerned about privacy. This is perhaps a result of a combination of legalistic and computer-science treatments of the topic – 'privacy' is a reified thing which we should all be concerned about and 'factor in' everywhere – but such a blunt-trauma approach is intellectually and culturally redundant. We will learn a lot more about this issue if we tease out its sociology.

At the outset, it is reasonably clear from the literature that there are gender differences with regard to privacy – a clear male/female split has been reported in privacy concerns on Facebook, with women being significantly more concerned, and some significant differences with regard to behaviour (Hoy & Milne, 2010). Additionally, Fogel and Nehmad (2009) reported that female respondents had significantly higher scores for privacy concerns but not for privacy behaviour. Similar clear divides between males and females with regard to privacy in general, if not concern, attitude or particular behaviour, have been reported elsewhere (Moscardelli & Divine, 2007; Youn & Hall, 2008). While this is a reasonably facile feature to isolate in the human sciences, it is worth noting, given that it concurs with our central assertion that privacy is a territorial concept.
It has long been implied, if not asserted, in evolutionary psychology that territoriality is a more male concern, and plenty of research has borne this out – males more likely to sit on chairs marked with personal objects in a university library (Sommer & Becker, 1969); males more likely to negatively evaluate a stranger who sits directly opposite them (Fisher & Byrne, 1975); cross-culturally, men exhibit more nonsharing behaviour in residence halls (Kaya & Weber, 2003). Consequently, it is suggested that differences in privacy behaviour online
across genders can be interpreted as a more cautious or conservative approach amongst women, and a more exploratory or aggressive approach amongst men. Though beyond the scope of this article, it should be clear that gendered approaches to territoriality in cyberspace have applicability far beyond the concept of privacy (cf. 'Why women aren't welcome on the internet', Hess, 2014). A factor considerably less easy to isolate is that of age, a point which is implicitly recognised by Bélanger and Crossler's (2011) review of research in this area. Like much 'WEIRD' research ('Western, Educated, Industrialized, Rich, and Democratic'; Henrich, Heine, & Norenzayan, 2010), their review found that "information privacy research has been heavily reliant on student based and USA-centric samples" (Bélanger & Crossler, 2011, p. 1019). While the cultural aspect is noteworthy, it is of lesser importance in the present context, as we are most interested in the Western, internet-using population: it is the fact that privacy research has been largely carried out on student samples – i.e. between 18 and 25 years of age – which is most problematic. Any inference about whether or not privacy concerns vary with age is consequently impossible when using such a narrow age range. For example, a study of risk taking, trust, and privacy concerns used a sample wherein the "average age was almost 22 years old (range: 17–32 years)" (Fogel & Nehmad, 2009, p. 156), and a similar study restricted itself to adults aged 18-24 (Hoy & Milne, 2010). Admittedly, neither study analyses its data with respect to age, but such samples make it difficult to determine whether or not there are age differences in privacy concerns – a claim which recurs in the media. It does seem that there may be some basis to that claim, given that, almost a decade ago, it was reported that "age is positively related to reading [of privacy policies] and negatively related to trust" (Milne & Culnan, 2004, p. 24).
In addition, a non-peer-reviewed study reported that all ages of internet users were in some agreement about privacy concerns, but that younger users were less so (Hoofnagle, King, Li, & Turow, 2010). This is in addition
to an interesting paper from a prolific source, and one to which we will return, which reports that adolescents reported disclosing more information on Facebook and using the privacy settings less than adults (Christofides, Muise, & Desmarais, 2011). Consequently, let us entertain at least the possibility that the privacy 'issue' may vary over the lifespan. How this issue should be conceptualised – either in terms of 'concerns' or 'behaviours' – is not so clear, but suffice it to say that it would not be unheard of for such factors to be mutually contradictory. Such a perspective requires a certain amount of wisdom with regard to human nature, and while the current author would do well to avoid claiming such a personal attribute, it is generally wise to heed the opinions of one's senior generations. Moreover, let it not pass unremarked that much of the debate, including the foregoing paragraphs, has dealt with the matrix 'privacy × age' very much in a youth-centric calculation. We seem to have concluded that 'younger people' are less concerned about privacy than 'older people', but what exactly do those older people think? A Finnish study of older adults reiterates in its title, for emphasis, the following quote from one of its participants: "If someone of my age put her photo on the net, I would think she is a little silly and empty-headed" (p. 50). That opinion is revealing of that cohort's view of social networking, but there are other privacy-related issues also:

… our participants perceived uploading photos as not safe. The participants were very careful about providing personal details related to names, relationships, and families. In fact, they stated that they want to keep private all such personal details as concern the circles of personal life (Lehtinen, Näsänen, & Sarvas, 2009, p. 50)

If there were ever an argument that educating and informing 'silver surfers' about the wonders of social networking will result in greater disclosure by them of personal
information, let us quietly retire it now: older users have little desire to engage in social self-disclosure. In addition, a UK report reveals that "older internet users tend to be more cautious and yet people with more years of online experience are less cautious" (Coles-Kemp, Lai, & Ford, 2010, p. 4). As such, we have something of a paradox, since experience and age should go hand-in-hand. This finding somewhat echoes another study which found that while age predicted information privacy concerns, internet experience did not (Zukowski & Brown, 2007). Both of these findings are, to an extent, glossed over, perhaps under influence from the e-commerce and marketing fields, due to their contradictory nature. If we look at things rationally and objectively, and of course view those involved as 'users' per se, then of course as they get older and gain more experience, they should become more competent and be happy to engage online more. However, this elides a satisfactory understanding of human nature – not to mention the wider question of the inherent 'good' in uploading copious amounts of personal information to the internet, which we will have to set aside for the time being! The simple point is that we should not assume that older generations can simply be taught to use the internet in the same way as young people, or for that matter, vice-versa. The following quotation illustrates this point neatly:

Age-stratified cohorts can also be studied at a single point in time, and change can be inferred from the differences between the age groups, but this assumes that younger generations will grow up to resemble older generations, which would not be the case for life-stage-related behavior. Past research on youth may help to shed light on the kinds of behaviors that young people can be expected to outgrow. (Herring, 2008, p. 87)

The bottom line is that we need to have a more profound appreciation of human lifespan development if we are to better understand age-related differences in privacy concerns,
attitudes and behaviours in the online context. To give but one example, one can easily put one's hand to several studies exploring the use of social media to encourage political and civic engagement by the younger generation (Gibson & Mcallister, 2009; Gueorguieva, 2007; Owen, 2009; Valenzuela, Park, & Kee, 2009; Vitak et al., 2011; Xenos & Foot, 2008), but for seniors this is a redundant search: older people do not need to be encouraged to vote to anything like the same degree. Ultimately, what we are talking about are differences in self-concept: an elusive topic at the best of times, but one which we must appreciate if we are to understand privacy concerns. It is possible that we have assumed that those who do not like 'sharing' their private information (such as older users) refrain not because they are not tech-savvy, but because they are sceptical about service providers' intentions with their data, which may derive from their being inherently more confident in their personal identity: their territory is well-established and they have many years' experience of defending it. This is a perspective we must consider in opposition to the underlying assumption that older adults' concerns about new technology could be largely erased by better education – that older meant less experience with technology. Such a perspective, ignoring their experience of life, wisdom of society and perhaps even scepticism of government and surveillance, now seems embarrassingly naïve in the light of the Snowden revelations. It also allows us to entertain the possibility that had older adults taken to the internet before tweens and young adults, such networks might never have been constructed by such a population of 'users'. But to understand why such networks have in fact been built, we need to understand more about the root concept at play: the self.

Privacy as territoriality of identity
An idea long in germination but short in appreciation is the notion that when we are online, 'surfing the web', 'navigating the information superhighway', we act as if we are negotiating territory. This is a critical insight if we are to understand our relationship to the internet – our place within cyberspace – because it has been shown that the self is fundamentally a locative system (Benson, 2001). In order to know who we are, we need to know where we are:

These "locative demands" are a constitutive part of every moment of a person's life. They are what underpin the idea that who you are is a function of where you are, of where you have been, and of where you want to be. (Benson, 2003, p. 62)

This clearly applies as much to online behaviour as it does elsewhere: note the differences inherent in 'I have a Twitter account' and 'I am on Twitter', the former connoting disdain and lack of enthusiasm, the latter satisfaction and regular use. In this way, we position and situate our selves online as befits our social and reputational concerns. More to the point, in the context of personally identifiable information, "Selfhood and identity are constituted in and by acts of assertion, … and specifically in and by acts of authorship and ownership, acts of responsibility-taking or responsibility-denying…, as well as in acts of self-location generally" (Benson, 2013, p. 58). We declare and defend our ownership of our places as acts of selfhood – and note how information technology has reflected our psychology in that regard: 'My Computer', 'My Account', 'My Profile' and so on. However, there is a more fundamental and historic aspect to this, which goes to the heart of Western liberal and civilised thinking: the idea of 'self as property'. As such we turn to the dawn of the modern era at the end of the 17th century:

Though the Earth, and all inferior Creatures be common to all Men, yet every Man has a Property in his own Person.
This no Body has any Right to, but
himself. The Labour of his Body, and the Work of his Hands, we may say, are properly his. Whatsoever then he removes out of the State that Nature hath provided, and left in it, he hath mixed his Labour with, and joyned to it something that is his own, and thereby makes it his Property. (Locke, 1689/1751, p. 166; 2nd Treatise, Ch. V, pgh. 27)

This passage reflects a deep current in the relationship between politics and psychology: that human rights are intimately tied up with property rights. Crucially, in the 21st-century context, not only is our identity our property, but our online identity is the product of our own labour, and as such it is also our property. Moreover, from this perspective, when we return to the reality of the industry involved in the creation of, for example, the Facebook social graph, one might have cause to muse upon the status and legitimacy of its present ownership (cf. wagesforfacebook.com). There are other aspects of Locke's perspective on the self which we will return to presently, but first it is worth examining how the self has been understood in cyberspace. In that regard, there is no better theorist to turn to than John Suler. Writing on the human predilection for acting on the internet in ways that would not be considered conventional in a face-to-face context – known as the 'online disinhibition effect', a phenomenon which obviously has significance in the realm of privacy disclosures – Suler tackles the notion of self:

In fact, a single disinhibited "online self" probably does not exist at all, but rather a collection of slightly different constellations of affect, memory, and thought that surface in and interact with different types of online environments. Different modalities of online communication (e.g., e-mail, chat, video) and different environments (e.g., social, vocational, fantasy) may facilitate diverse expressions of self. (Suler, 2004, p. 325)
Consequently, when we place our selves in different online environments, when we create different types of digital content, we can be said to be expressing different constellations of the self. This is obviously interesting for creative and artistic reasons, but it is clearly problematic in a privacy context, because our self-territory is therefore shifting online. Ergo, privacy of what exactly? The issue is really quite a tricky one: privacy of a physical space is a matter of walls and screens – how can this be achieved in the nebulous world of cyberspace? Marking out a unique and personal territory in an environment fundamentally devoted to connection and association seems like an impossibility. In that regard it is helpful to return to earlier work on human territoriality, where Edney (1974, p. 962) notes that pertinent definitions involve "an interesting variety of concepts: space (fixed or moving), defense, possession, identity, markers, personalization, control, and exclusiveness of use". Notably, Pastalan (1970, p. 88) further argues that "privacy may constitute a basic form of human territoriality". Moreover, Proshansky, Ittelson, and Rivlin (1970, p. 175) explain that "in any situational context, the individual attempts to organize his environment so that it maximizes his freedom of choice… In all cases psychological privacy serves to maximize freedom of choice". Consequently we have arrived at a picture of the self which, on first examination, will struggle to maintain its integrity in cyberspace – constructing and marking different aspects of its uniqueness in a networked and fluctuating context. Privacy concerns in this environment would seem the least of one's worries: from this analysis, sanity itself would be under threat after a short spell in cyberspace.
Ultimately, the solution to this problem of establishing our own identity in a networked environment is to realise that the internet is a contested territory of intersubjective construction: everyone thinks they own it.
However, as we shall see below, there are good psychological reasons why this is not (immediately) the case, which can also be used to help chart possible solutions to this privacy crisis. As an aside, it can be useful to review some other recent media concerns about cyberspace through the lens of territoriality – for example, responses to internet filtering (e.g. Penny, 2014) should be viewed less as anti-censorship and more as resistance to confinement. We view the internet both as something we have built and something we explore – we do not like it being curtailed. However, let us dispense with the tech evangelists' belief that the internet is some kind of panacea for humanity's baser, primeval instincts: in many cases, it is quite the opposite. At any rate, what seems to be occurring with regard to privacy concerns in cyberspace is the emergence of well-understood psychological processes in an unusual environment.

Psychological aspects of privacy concerns

When we combine the insights of the two previous sections, namely that there are demographic and developmental aspects to privacy, and that privacy is fundamentally an aspect of self, as a combination of identity and territory, then it becomes clear that we should consider the lifespan aspects of both identity and territoriality. Let us examine the self in more detail. As we have already alluded, there is a possibility that older adults' behavioural patterns online have less to do with lack of education and more to do with life-experience and wisdom. On closer examination, particularly at the other end of the age scale, we find plenty of psychological theory to support this hunch, beginning chiefly in the work of a colossus of psychoanalysis and developmental psychology, Erik Erikson, and his concept of the 'identity crisis' (Erikson, 1968). The central idea of the identity crisis is that during adolescence,
having overcome childhood, and being on the way to adulthood, we attempt to answer the solitary question of who we are. His own words best describe the profundity of this process:

The young person, in order to experience wholeness, must feel a progressive continuity between that which he has come to be during the long years of childhood and that which he promises to become in the anticipated future; between that which he conceives himself to be and that which he perceives others to see in him and to expect of him. Individually speaking, identity includes, but is more than, the sum of all the successive identifications of those earlier years when the child wanted to be, and often was forced to become, like the people he depended on. Identity is a unique product, which now meets a crisis to be solved only in new identifications with age mates and with leader figures outside of the family. (Erikson, 1968, p. 87)

The process of overcoming this crisis has been developed further (Marcia, 1966), as involving notions such as identity moratorium (being in the midst of the identity crisis, characterised by vagueness, variability and rebelliousness), identity foreclosure (accepting the identity which one is expected to take, usually provided by parents, characterised by rigidity and authoritarianism), identity diffusion (may or may not have had a crisis, characterised by noncommitment, aimlessness and malleability) and identity achievement (having successfully overcome the crisis, characterised by commitment to organisational and ideological choices, with stable, realistic, and internally selected goals).
Since then, a well-established research strand within developmental psychology has built on this theoretical framework, establishing, among other things, links to vocational identity (Porfeli, Lee, Vondracek, & Weigold, 2011), parental and caregiver styles (Cakir & Aydin, 2005; Wiley & Berman, 2012), and risk and peer group (Dumas, Ellis, & Wolfe, 2012), as well as isolating certain gender differences (Archer, 1989; Thorbecke & Grotevant, 1982).
However, the general gist of the Eriksonian approach is that this identity crisis should be resolved during the teenage years, or soon afterwards. This no longer appears to be the case, as Côté (2006) demonstrates that there is significant evidence that this process of transition to adulthood is taking considerably longer these days, and for a significant percentage of the population is delayed until the late 20s – a finding he describes as one of the least contested in contemporary developmental psychology. As such, we can expect to find evidence of identity crisis and construction right from the early teens up to thirty-odd years of age.² Moreover, as Weber and Mitchell (2008, p. 26) point out, for such young people interested in scrutinising and studying new forms of self and identity, "new technologies are a good place to start these investigations." There is, in fact, increasing evidence and support for the view that adolescents and young adults are using social networks as locations for identity construction: Greenhow and Robelia (2009) report that their high-school sample of participants often obscured their full names on their MySpace profiles, and that it was easy to see their "quest for self-discovery, identity exploration, and self-presentation playing out within their MySpace profiles" (p. 131). Moreover, as Pempek, Yermolayeva, and Calvert (2009) argue, "the communication with friends that occurs on Facebook may help young adults resolve key developmental issues that may be present during emerging adulthood, including both identity and intimacy development" (p. 236). More recently, Jordán-Conde, Mennecke, and Townsend (2013) support this theoretical direction, and in fact go as far as to suggest that the affordances of social media, by their suggestion of many and increasing possibilities for adult development, could be contributing to a prolonged period of adolescence (p. 9).
Furthermore, in all of these studies another message is clear: adolescents and young adults view privacy risks as a price worth paying for the use of social networks.

² It may be a trite observation, but in this context it is worth noting the relative youth of the founder of Facebook, Mark Zuckerberg (b. 1984), who famously summarised his perspective on privacy as 'public is the new social norm' (http://mashable.com/2010/01/10/facebook-founder-on-privacy/), and it would not be difficult to find several more examples of similarly-aged tech entrepreneurs with reductionist views on privacy.
In other words, what is happening for youth on social networks and on the internet in general is a general form of identity construction and exploration of individuality – they are creating their own territories of personalisation. This is why they may behave as if they are unconcerned about privacy even though they say that they are concerned – because they are at that stage of their lives when their personal identities have not yet fully formed, and these behaviours represent that process of self-formation. Disclosure of personally revealing information on the internet by such children, adolescents and young adults, while questionable from a privacy perspective, may well be entirely necessary from an identity-development perspective – these are the activities often referred to as bricolage (Benson, 2013; Herring, 2004; Weber & Mitchell, 2008). There is an experimental aspect to these identity developmental processes in cyberspace – one might find the term 'beta self' a useful metaphor. Moreover, from a territorial-development perspective, it should be noted that there are considerable and varied evolutionary, competitive and sexual pressures on those in this age group. For example, it can be expected that younger (prepubescent) children will necessarily be more conservative than those in their middle teens, as risk-taking rises with the onset of puberty. Moreover, we see many examples of territoriality in social networks, from gender differences in self-promotion (Mehdizadeh, 2010), to ethnic and racial displays (Grasmuck, Martin, & Zhao, 2009; Seder & Oishi, 2009) and age differences (Strano, 2008). Interestingly, it is also worth reporting that there is a considerable amount of comparative optimism (or 'unrealistic optimism' or 'optimistic bias') at play with regard to privacy risk online (Baek, Kim, & Bae, 2014) – in other words, we assume that other people's privacy is more at risk than our own.
This was found to be related to a number of factors – such as age (wherein subjects were more likely to think that a younger person was most at risk of privacy breach than an older person), maternalistic personality traits, and previous
experience of privacy infringement – but also, in an unexpected way, to privacy protective behaviours. Baek, Kim, and Bae (2014) predicted that comparative optimism with regard to privacy would be negatively related to the adoption of such protections – i.e. that those who were most optimistic about their safety would be consequently emboldened to engage in risky behaviour. From the narrative which we have advanced in this paper to this point, it should not be surprising to find that this hypothesis was not supported – most likely because those who are optimistic about their safety online have taken steps to defend their territory. Furthermore, the application of territoriality to cyberspace finds strong support from pre-existing theoretical developments and applications. For example, Schwarz (2010) uses the concept of 'field' (Bourdieu, 1992) to explore the use of self-portraits by teenagers as social capital. Interestingly, Schwarz (2010) argues that, within the particular social network in question, users' choice of photograph is less a 'reflexively chosen identity' and more to do with their positioning within that website's hierarchy, and also that many dropped out of such a competitive environment. Consequently, there can be casualties between identity and territoriality in the production of personally identifiable information, which must be borne in mind with the rise of the #selfie (OxfordWords, 2013).

Paradoxes unbound

When we view privacy as a psychologically-complex combination of identity and territory, certain oft-trumpeted 'paradoxes' of internet behaviour become easier to understand. For example, as Lepore (2013) laments:

… the paradox of an American culture obsessed, at once, with being seen and with being hidden, a world in which the only thing more cherished than privacy is publicity. In this world, we chronicle our lives on Facebook while demanding the
latest and best form of privacy protection—ciphers of numbers and letters—so that no one can violate the selves we have so entirely contrived to expose. (n.p.)

Again, as we have said, this is inevitable, as we enter a contested territory, attempting to own our own corner of a space which is completely shared. This is a point which has been stumbled over many times: "Herein lies the privacy paradox. Adults are concerned about invasion of privacy, while teens freely give up personal information. This occurs because often teens are not aware of the public nature of the Internet" (Barnes, 2006, p. 3/11). On the contrary, it is likely that teens are quite aware of the public nature of the information they reveal, and the fact that they are revealing it is indicative of progress through a complex process of identity development. Similarly, Acquisti and Gross (2006) seem surprised by the many 'privacy concerned' college undergraduates who joined Facebook and also revealed large amounts of personal information, echoed in a subsequent study which found that "paradoxically, more control over the publication of their private information decreases individuals' privacy concerns and increases their willingness to publish sensitive information" (Brandimarte, Acquisti, & Loewenstein, 2010). Furthermore, when examined longitudinally, users demonstrated increasingly privacy-seeking behavior, progressively decreasing the amount of personal data shared publicly yet increasing the amount and scope of data shared privately (and also with Facebook and third parties) (Stutzman, Gross, & Acquisti, 2012). Notably, in all of these studies, the sample populations were college students in North America. In other words, the risk is more about the other members of the network than about the network itself, because that risk is shared across all members of the network and is the risk implicit in joining the contested territory.
The whole point for users is to enter that space, where the development of one's identity can be compared with that of one's peers, with whom one competes in the usual evolutionary fashion. Of course, this does not make sense if you view
the phenomenon in terms of communication qua sharing information, because it looks like contradictory processes of information leakage. When the 'information' is seen as part of the user's psychology, as the bricolage necessary for the development of self-territory among those in a culture which supports a period of extended adolescence, it becomes more understandable. As such, the idea of privacy as 'information risk' is not quite as useful as privacy as identity territory. The crux of the issue is not, however, any 'paradox', but how both our identity and our territoriality, with such competing drives, can be protected legally. To do so requires a recognition that the latter aspect necessitates the creation of some form of controllable boundaries, while the former requires some form of personalisation. In sum, we want to create a representation of ourselves online, but at the same time, we do want other people to actually see it: the most exclusive shows are cordoned off by nothing more than a velvet rope. This is equally apparent with regard to the infamous 'Facebook Newsfeed controversy'. Now a prime feature of that social network, the feed – which combines updates from all one's friends into one continuously refreshing stream (information which had previously been available only on their respective profile pages) – provoked considerable uproar from members of the network when it was first introduced. As Hoadley, Xu, Lee, and Rosson (2010) explain, all that information was previously accessible, but not easily or efficiently. Moreover, as Boyd (2008, p. 18) points out, privacy isn't simply a matter of 'zeroes and ones' but more about "a sense of control over information, the context where sharing takes place, and the audience who can gain access". We are not annoyed that the information has been seen, but that our identity has been thrust into a territory that we were not prepared for.
These interpretations of privacy behaviours are supported elsewhere, with findings that they may include strategies for gaining social capital and audience control (Ellison,
Vitak, Steinfield, & Gray, 2011) and also that audience size and diversity impact disclosure and use of privacy settings (Vitak, 2012). At the same time, however, this does not discount the importance and subtlety of the network itself, noting important work on the distinction between exhibitions and performances in social media, with their underlying algorithms as curators (Hogan, 2010), and the ethics of Facebook's disclosive power (Light & McGrath, 2010). While these issues are somewhat beyond the scope of this paper, it is worth noting that they are consonant with the interpretation of social media as participatory surveillance (Bucher, 2012; Tokunaga, 2011; Westlake, 2008). In sum, while online privacy behaviours may reflect a certain expression of identity and territoriality, the opposite is also true: we are quite adept at using that same technology to compare ourselves to others, to judge their character and to blockade ourselves from them. As has been noted, demarcating territory in cyberspace is a tricky matter, and it might seem more complex still when we add another dimension to that context, though the opposite is actually the case. There is a temporal aspect to both our identity and territorial drives, which is reflected in our privacy desires. A recent study breaks new ground in this area, finding that social media users' desires for their content to remain online were quite varied – wanting some posts to become more private over time, and some posts to become more visible (Bauer et al., 2013). Interestingly, as we might now expect from a pool with a median age of 26, of whom 27% were students and 11% were unemployed, the study found that participants were very poor at predicting how they would like to change the visibility of their material over time (Bauer et al., 2013). Hence, given that this does not give much support to the idea of putting 'expiration dates' on posted material, Bauer et al.
(2013, p. 10) suggest that "a way forward might be to design interfaces that promote reflection about older content" – this would, in the current author's estimation, be an excellent design feature
as it may help the user not only to reflect upon their 'digital exhaust', but also to overcome the process of identity moratorium. Moreover, it is this temporal aspect which may prove instrumental to surmounting wider issues of privacy – not merely problematic Facebook status updates. Temporality is also a key aspect of the self, as in Locke's (1690) understanding:

personal Identity; i.e. the sameness of a Rational being: And as far as this consciousness can be extended backwards to any past Action or Thought, so far reaches the Identity of that Person; it is the same self now it was then; and 'tis by the same self with this present one that now reflects on it, that that Action was done. (Locke, 1700, p. 183)

This is the problematic aspect of Locke's work, as it verges on exonerating the criminal from the acts he does not remember, and in fact a few paragraphs later he muses on whether or not drunks deserve such an amnesty. However, this position is premised on the assumption that "there being no moment of our lives wherein we have the whole train of all our past actions before our eyes in one view" (p. 184). This, however, is not quite the case for the youth of today. Perhaps it is unfair, psychologically, for their bricolage to be stored eternally and their potentially embarrassing attempts at identity construction to remain publicly searchable? Locke argues that our sense of identity is based on our continuity of consciousness of what we can remember, which surely requires that we are allowed to forget what we would like to move on from. Moreover, we also find that there is a similar temporal aspect to human territoriality (Edney, 1974, p. 969), whereby "families living in defended homes had lived significantly longer on the property, but also anticipated a longer residence there in the future." Ergo, we expect that those who spend more time on Facebook, for example, will be most concerned about their privacy, but will also be more likely to see Facebook as both part of their identity,
and part of their future. In fact, this is exactly what the research shows, with items such as "I would be sorry if Facebook shut down" and "Facebook is part of my everyday activity" forming part of a scale of Facebook intensity positively related to social capital (Ellison, Steinfield, & Lampe, 2007), alongside a "positive relationship between participants' use of the advanced privacy settings and both bridging and bonding social capital" (Ellison et al., 2011, p. 27).
Idiographic responses
Such observations lead us to the hope that there may well be technical innovations which can help us emerge from this privacy crisis in a way which respects human psychological development. Much has been written in the popular media about the notion of 'ephemeral communication', probably most notably exemplified by Snapchat, a mobile app whereby photos can be sent to fellow users but are automatically deleted after less than ten seconds. While there are obvious concerns about the actuality of this deletion (Shein, 2013), this does represent an interesting development in technology use, particularly for those in identity-crisis stages of development. Additionally, ephemerality and anonymity have been shown not to be detrimental to the formation of community norms, status and identity on the website known as 4chan (Bernstein, Harry, André, Panovich, & Vargas, 2011). That particular website, known for its infamy and vulgarity more than anything else, is worth mentioning given that it often represents an image of the internet quite different to what most users would like to experience on a regular basis; yet, as Bernstein et al.'s (2011) paper demonstrates, it does manage to 'work' as a community. Consequently, we should be mindful that people will produce workable solutions for themselves in whatever fashion they care to. As one tweet put it, "Privacy can be achieved by multiplying public selves. Leaving breadcrumbs here to throw people on to the wrong trail" (Horning, 2014). The vista of privacy only being achievable via
distraction and disruption – in effect, deliberate identity dissociation via multiple accounts – is thought-provoking, to say the least. An alternative approach, presumably a more publicly acceptable method of user empowerment, requires a re-evaluation of the 'big data' phenomenon – a move from the nomothetic to the idiographic. This is where notions like sousveillance and lifelogging enter the discourse. The former term is most notably associated with Steve Mann and is generally understood to mean a type of 'inverse surveillance', whereby the individual monitors those in authority – those who they believe are in control of the surveillance. Sousveillance hence empowers the individual: by "affording all people to be simultaneously master and subject of the gaze, wearable computing devices offer a new voice in the usually one-sided dialogue of surveillance" (Mann, Nolan, & Wellman, 2003, p. 348). While such a concept represents a certain philosophical innovation, the rebalancing of power notwithstanding, it is hardly practical to ask the population at large to wear video cameras at all times. Assuming that 'all people' will want to participate in sousveillance is assuming a lot, to put it mildly: that all people disapprove of surveillance, that they would be sufficiently motivated to engage in sousveillance, that they would be financially and technically competent to do so, and so on. It is also perhaps over-egging the power pudding – the proof of which is that in the intervening decade, such wearable technologies have yet to break through (though more of this anon). The value of sousveillance as a concept lies, however, in its return of the focus to the individual. In that light, and perhaps in response to the oppositional/conspiratorial theme in the sousveillance literature, we also have the related concept of lifelogging, which, while also relying on wearable technology (usually a small neck-slung camera), does not so obviously evoke notions of countering a police state. The idea here is to attempt to record as much of a person's everyday life as possible and to save everything to digital memory.
However, as Sellen and Whittaker (2010, p. 73) note, "many lifelogging systems lack an explicit description of potential value for users", and one is left wondering what the point of these projects really is, except perhaps for the engineers to overcome the technical challenges involved in storing huge quantities of photographic data. In response to such criticism, the likes of Doherty et al. (2011) have outlined the many potential benefits of lifelogging, such as assessing one's health and physical activity, improving personal efficiency, and lifestyle analysis, all of which have considerable merit – though it must be noted that privacy concerns have been dealt with in passing, at best. But the trump benefit of lifelogging, fundamental and implicit in the whole enterprise, is that of memory enhancement: if we wear these small cameras around our necks we may never forget anything. However, as Locke (1700) explored centuries ago, we are what we remember and are conscious of. In fairness, Doherty et al. (2012, p. 169) do mention in passing that "forgetting is very important", but this needs to be recognised more profoundly by lifeloggers: keeping a permanent record of a person's total life activity, while it has obvious benefits, may also have negative side-effects. There are many and varied links between memory and well-being, for example: overgeneral autobiographical memory and trauma (Moore & Zoellner, 2007), positive effects of false memories (Howe, Garner, & Patel, 2013), depressive realism (Ackermann & DeRubeis, 1991) and childhood amnesia (Peterson, Grant, & Boland, 2005). Essentially, it is important for the development of a positive self-concept that we forget, or at least reinterpret, painful past memories. As such, it is important that the psychological effects of new technologies, for whatever noble purpose they are designed, be fully audited.
While the nuances of 'memory as a human experience' contra 'memory as a computer feature' have been noted for some time (Sellen et al., 2007), the actual beneficial aspects for the individual user have only more recently begun to be explored (Crete-Nishihata et al., 2012). The latter study is indicative of such benefits as it focuses on older users, and on the possible mental health benefits of using lifelogging-related technologies to ameliorate cognitive/memory impairments. However, while this is an apt clinical usage, if such technologies ever get to market in a widespread manner (which is, despite previous false dawns, always possible – see recent CES press), they are most likely to be aimed at a younger demographic – who, as we have seen, will have far different psychological motivations. The main point here is that in this post-Snowden environment, there are many possible solutions for developers to engage with to overcome public fears with regard to their data collection, but the fundamental aim must be to empower and enlighten the individual user. This point has been made before – O'Hara, Tuffield, and Shadbolt (2008) speak at length on lifelogging and the empowerment of the individual, alongside the attendant privacy concerns – yet it has never been more relevant. If people are going to continue to be asked to surrender their personally identifiable information to a remote server, visible to they know not whom, there must be a quid pro quo of returned utility. That particular point is hardly a controversial one, yet a parallel observation remains unexplored: while academics and technicians muse on the viability, utility and privacy of the likes of sousveillance and lifelogging, the basic data needed to achieve most of those technologies' assumed aims already exists and is incredibly widespread. To put it bluntly, my iPhone already gathers huge amounts of data about where I go, what activity I am engaged in and who I deal with, but it is next to impossible for me to view that data aggregated anywhere: I am interested in my metadata; can someone please show it to me?
In sum, we need to provide individual users with greater visibility of the data which they are sharing with service providers. For example, the MIT Immersion project (immersion.media.mit.edu) provides a very interesting reflection to the user of their Gmail metadata. At the same time, the response to Apple's inclusion of a 'Frequent Locations' feature in iOS 7 (e.g. Rodriguez, 2013) was nothing short of embarrassing: such data must be collected for basic phone calls to be successfully carried; it doesn't happen by magic. Ultimately we must not be surprised when our data is reflected back to us; in fact we must demand it, and it should be presented to us as a matter of course, if not legislated for. Such a position recognises the fact that, though some of the more paranoid participants in the debate might want to, if it is possible at the moment to live 'offline', it will very soon become impossible (cf. Kessler, 2013). Data about, by and for individual members of society will continue to be created at an exponential rate, and therefore we must reflect upon how it is used. No one wants to live in the sort of surveillance state which Fernback (2013) seems to believe is upon us, where we must form 'discourses of resistance' against social networks; yet at the same time no one can deny that there is an inherent truth to Tokunaga's (2011) finding that a considerable amount of 'interpersonal surveillance' is ongoing – we are also spying on each other informally, to a considerable extent. The latter's investigation occurred in the context of romantic relationships, unsurprisingly finding that 'interpersonal electronic surveillance' was more likely to be exercised by younger users of social networks than older, but it chimes with other findings that the majority of time spent in such contexts is 'stalking' or simply browsing other users' profiles (e.g. Lewis & West, 2009) – amateur surveillance, in other words. The irony of certain responses to the privacy and surveillance phenomena is encapsulated in a certain passage in Fernback's (2013) paper.
In describing Facebook's effective bait-and-switch policy changes which made private data public, Fernback (2013) reports how many Facebook groups were set up in response, noting themes such as intrusiveness and disappointment in the 'discourses of resistance' therein. However, in accounting for Facebook users' anger at not having consented to their data being used in this way, Fernback (2013) neglects to mention whether or not the users he cites were asked for permission as to whether their posts – which are quoted at length and with users' full names – could be reproduced in his paper.
Future directions
In that light there are a number of final observations which are worth bearing in mind as we attempt to maturely rebuild cyberspace. In attempting to shift the analysis and use of data away from the nomothetic to the idiographic, let us also admit why this is important. 'Big' datasets are inherently problematic, and as Ohm (2010, p. 1704) notes, "data can be either useful or perfectly anonymous but never both". Through a series of examples, Ohm (2010) shows that datasets which were ostensibly anonymised were quickly cracked through combination with other data, and subjects recognized – a process he terms 'accretive reidentification', which will necessarily create and amplify privacy harms. A good example of this is provided by Zimmer (2010), who shows how the dataset on which Lewis, Kaufman, Gonzalez, Wimmer, and Christakis' (2008) study was based was, on its subsequent release, quickly cracked; participants were identified, and the dataset had to be recalled. Similarly, de Montjoye, Hidalgo, Verleysen, and Blondel (2013, p. 1) conclude that "even coarse datasets provide little anonymity. These findings represent fundamental constraints to an individual's privacy and have important implications for the design of frameworks and institutions dedicated to protect the privacy of individuals." As Ohm (2010, p. 1723) agrees, "results suggest that maybe everything is PII [personally identifiable information] to one who has access to the right outside information." In characterising the combination of datasets as having the potential to create, for any particular individual, a 'database of ruin', Ohm (2010) describes some factors which may be useful in assessing the risk of these privacy harms, such as data-handling techniques, data quantity and motives for use. These, it must be said, do not over-awe Grimmelmann (2014), who runs the rule over 'big data's other privacy problem' – in effect, that while privacy harms and accretive reidentification may be problematic, we also need to be wary of who is managing the data, and who is managing them: oversight, and quis custodiet ipsos custodes. This 'Empedoclean regress' has previously been explained as the 'burden of omniscience' by Wilson (2000, p. 244) – a conundrum to which there does not seem to be any obvious solution. Grimmelmann (2014) concludes with some rather vague propositions he calls 'little data', or 'democratic data', which do not seem inconsistent with the idiographic approach mentioned above. What is necessary in all of this is for us to acknowledge and demarcate our own self-worth in our personal information – what the Germans apparently call informationelle Selbstbestimmung: 'knowing what data you have and being in control over how it's used' (Bartlett, 2013). This may be something of a pipe-dream, but wishing that NSA surveillance will come to an end, or that there will be no more Snowdens, is more naive still. The most pragmatic direction out of the crisis is to focus on the development of strategies and applications of personal reflective development that teach the individual user more about themselves, their data and its worth. There are many other potential avenues out of the crosshairs of surveillance and subversion. For one thing, with regard to youth, it could be argued on the basis of the research reported above that it is not appropriate for those under the age of 18 to have their data permanently recorded. Could it be internationally and legally agreed that youth have a right to ephemeral networks, and to have the data they generated before their majority deleted?
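The fragility of anonymisation described by Ohm (2010) and de Montjoye et al. (2013) can be illustrated with a toy computation: even a handful of coarse quasi-identifiers renders most records in a dataset unique. The records, field names and values below are entirely fabricated for illustration; this is a minimal sketch of the principle, not a reproduction of either study's method.

```python
# Toy illustration of 'accretive reidentification': counting how many
# records in a small, entirely fabricated dataset are uniquely pinned
# down by a few coarse quasi-identifiers.
from collections import Counter

records = [
    # (label, postcode_prefix, birth_year, gender) -- all invented
    ("A", "D01", 1988, "F"),
    ("B", "D01", 1988, "M"),
    ("C", "D02", 1990, "F"),
    ("D", "D02", 1990, "F"),
    ("E", "D03", 1988, "M"),
    ("F", "D03", 1995, "F"),
]

def unique_fraction(records, fields):
    """Fraction of records uniquely identified by the given columns."""
    keys = [tuple(r[i] for i in fields) for r in records]
    counts = Counter(keys)
    return sum(1 for k in keys if counts[k] == 1) / len(records)

# With only a postcode prefix, no record here is unique; adding birth
# year and gender makes most of them unique, and so re-identifiable by
# anyone holding "the right outside information".
print(unique_fraction(records, [1]))
print(unique_fraction(records, [1, 2, 3]))
```

The point of the sketch is that no single field is identifying on its own; it is the accretion of fields, each innocuous in isolation, that collapses anonymity.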
Ultimately we need to be more understanding of, and more willing to educate about, identity play and electronic bricolage. These are technical challenges, but what of it? It would be a shame indeed if the software developers of the world collectively decided that, after having solved so many previously intractable problems of computer science and engineering, the protection of minors on the internet was beyond their self-professed genius. Other approaches rest on recognising the reality of data accretion and the variability in personally identifiable information. It might be worthwhile constructing a base level of online communication in a society, a sort of online directory: this is to recognise that individual territories have shared boundaries. Access to such post-boxes could be built on a nested hierarchy of personal information – i.e. if you have only a person's first name and surname, you may not be able to communicate with them at all; if you have their postcode in addition, you may be able to send them a very short, text-only message; whereas if you have their full name, date of birth and phone number, you might be able to send them a large, encrypted file. In this way, a public network would be internally protected by social networks and private relationships. Alternatively, by recognising how people usually manage issues of territoriality – i.e. in solidarity – we can decentralise the network further with collective privacy: we should consider organisational communication as collectively private, in that each member of a team should be able to see every other member's messages. In this way, we have complete internal transparency: a radical departure organisationally, but one which is probably more realistic psychologically than most current scenarios. Of course, the issues of ethical database management still apply in many of these scenarios. In recognising that the collection of large amounts of data ultimately brings large amounts of power, which will always be problematic, whatever the motivation or necessity of such gathering, there are, however, contradictory possibilities, when we reflect on this and the wisdom of the ancients.
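The nested hierarchy of personal information proposed above can be made concrete with a brief sketch. The tier definitions, field names and channel labels below are our own illustrative assumptions, not an existing protocol: the idea is simply that the more identifying fields a sender can supply about a recipient, the richer the channel they are granted.

```python
# Illustrative sketch of a nested hierarchy of personal information.
# Tiers are ordered from least to most knowledge required; field names
# and channel labels are hypothetical.
TIERS = [
    # (fields the sender must know,              channel granted)
    ({"first_name", "surname"},                  "none"),
    ({"first_name", "surname", "postcode"},      "short text message"),
    ({"first_name", "surname", "date_of_birth",
      "phone_number"},                           "large encrypted file"),
]

def channel_granted(known_fields):
    """Return the richest channel the sender's knowledge unlocks."""
    granted = "none"
    for required, channel in TIERS:
        # set inclusion: the sender must know every required field
        if required <= set(known_fields):
            granted = channel
    return granted

print(channel_granted({"first_name", "surname"}))
print(channel_granted({"first_name", "surname", "postcode"}))
```

Note the design choice implicit in the scheme: the directory itself can remain public, because access is gated not by passwords but by knowledge ordinarily distributed along social and private relationships.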
When Bacon (1597) concluded scientia potentia est ('knowledge is power') and Acton observed that "power tends to corrupt and absolute power corrupts absolutely", we can easily see how the collection of 'absolute knowledge' can cause problems. It might also lead one to infer, moving from objective to subjective spheres, and mindful of our previous reference to 'depressive realism', that we recoil from 'absolute knowledge' about ourselves out of some kind of deep-seated fear of knowing too much. We ignore where our information goes because it implicates and embarrasses us. Instead, we prefer repression, imagined identities and tentative territories – cultural values long overdue for revision.
References
Ackermann, R., & DeRubeis, R. J. (1991). Is depressive realism real? Clinical Psychology Review, 11(5), 565–584. doi:10.1016/0272-7358(91)90004-E
Acquisti, A., & Gross, R. (2006). Imagined communities: Awareness, information sharing, and privacy on the Facebook. In PET 2006, LNCS 4258 (pp. 36–58). Berlin, Heidelberg: Springer-Verlag.
Annalect. (2013). Annalect Q2 2013 online consumer privacy study (pp. 1–4). Retrieved from www.annalect.com
Archer, S. L. (1989). Gender differences in identity development: Issues of process, domain and timing. Journal of Adolescence, 12(2), 117–138. doi:10.1016/0140-1971(89)90003-1
Bacon, F. (1597). Meditationes sacrae. London: Humfred Hooper.
Baek, Y. M., Kim, E., & Bae, Y. (2014). My privacy is okay, but theirs is endangered: Why comparative optimism matters in online privacy concerns. Computers in Human Behavior, 31, 48–56. doi:10.1016/j.chb.2013.10.010
Barnes, S. B. (2006). A privacy paradox: Social networking in the United States. First Monday, 11(9), 1–11.
Bartlett, J. (2013). iSPY: How the internet buys and sells your secrets. The Spectator. Retrieved January 20, 2014, from http://www.spectator.co.uk/features/9093961/little-brothers-are-watching-you/
Bauer, L., Cranor, L. F., Komanduri, S., Mazurek, M. L., Reiter, M. K., Sleeper, M., & Ur, B. (2013). The post anachronism. In Proceedings of the 12th ACM Workshop on Privacy in the Electronic Society – WPES ’13 (pp. 1–12). New York, NY: ACM Press. doi:10.1145/2517840.2517859
Bélanger, F., & Crossler, R. E. (2011). Privacy in the digital age: A review of information privacy research in information systems. MIS Quarterly, 35(4), 1017–1041. Retrieved from http://dl.acm.org/citation.cfm?id=2208951
Benson, C. (2001). The cultural psychology of self: Place, morality and art in human worlds. Routledge.
Benson, C. (2013). Acts not tracts! Why a complete psychology of art and identity must be neuro-cultural. In Art and identity: Essays on the aesthetic creation of mind (pp. 39–66). Amsterdam: Rodopi.
Bernstein, M. S., Harry, D., André, P., Panovich, K., & Vargas, G. (2011). 4chan and /b/: An analysis of anonymity and ephemerality in a large online community. In ICWSM, Association for the Advancement of Artificial Intelligence (pp. 1–8).
Boyd, D. (2008). Facebook’s privacy trainwreck: Exposure, invasion, and social convergence. Convergence: The International Journal of Research into New Media Technologies, 14(1), 13–20. doi:10.1177/1354856507084416
Brandimarte, L., Acquisti, A., & Loewenstein, G. (2010). Misplaced confidences: Privacy and the control paradox. In Ninth Annual Workshop on the Economics of Information Security. Cambridge, MA.
Brown, G. (2009). Claiming a corner at work: Measuring employee territoriality in their workspaces. Journal of Environmental Psychology, 29(1), 44–52. doi:10.1016/j.jenvp.2008.05.004
Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180. doi:10.1177/1461444812440159
Cakir, S., & Aydin, G. (2005). Parental attitudes and ego identity status of Turkish adolescents. Adolescence, 40, 847–859.
Castro, D. (2013). How much will PRISM cost the U.S. cloud computing industry? (pp. 1–9). Washington, DC. Retrieved from www.itif.org
Christofides, E., Muise, A., & Desmarais, S. (2011). Hey Mom, what’s on your Facebook? Comparing Facebook disclosure and privacy in adolescents and adults. Social Psychological and Personality Science, 3(1), 48–54. doi:10.1177/1948550611408619
Coles-Kemp, L., Lai, Y., & Ford, M. (2010). Privacy on the internet: Attitudes and behaviours (pp. 1–34). Retrieved from vome.org.uk
Côté, J. E. (2006). Emerging adulthood as an institutionalized moratorium: Risks and benefits to identity formation. In J. J. Arnett & J. L. Tanner (Eds.), Emerging adults in America: Coming of age in the 21st century (pp. 85–116). Washington, DC: American Psychological Association. doi:10.1037/11381-004
Crete-Nishihata, M., Baecker, R. M., Massimi, M., Ptak, D., Campigotto, R., Kaufman, L. D., … Black, S. E. (2012). Reconstructing the past: Personal memory technologies are not just personal and not just for memory. Human–Computer Interaction, 27(1–2), 92–123. doi:10.1080/07370024.2012.656062
Dale, J. (2013). Facebook privacy is a joke: How Edward Snowden changed my online habits. rabble.ca. Retrieved January 06, 2014, from http://rabble.ca/news/2013/08/facebook-privacy-joke-how-edward-snowden-changed-my-online-habits
De Montjoye, Y.-A., Hidalgo, C. A., Verleysen, M., & Blondel, V. D. (2013). Unique in the crowd: The privacy bounds of human mobility. Scientific Reports, 3, 1376. doi:10.1038/srep01376
Debatin, B., Lovejoy, J. P., Horn, A.-K., & Hughes, B. N. (2009). Facebook and online privacy: Attitudes, behaviors, and unintended consequences. Journal of Computer-Mediated Communication, 15(1), 83–108. doi:10.1111/j.1083-6101.2009.01494.x
Doherty, A. R., Caprani, N., Conaire, C. Ó., Kalnikaite, V., Gurrin, C., Smeaton, A. F., & O’Connor, N. E. (2011). Passively recognising human activities through lifelogging. Computers in Human Behavior, 27(5), 1948–1958. doi:10.1016/j.chb.2011.05.002
Doherty, A. R., Pauly-Takacs, K., Caprani, N., Gurrin, C., Moulin, C. J. A., O’Connor, N. E., & Smeaton, A. F. (2012). Experiences of aiding autobiographical memory using the SenseCam. Human–Computer Interaction, 27(27), 37–41. doi:10.1080/07370024.2012.656050
Dumas, T. M., Ellis, W. E., & Wolfe, D. A. (2012). Identity development as a buffer of adolescent risk behaviors in the context of peer group pressure and control. Journal of Adolescence, 35(4), 917–927. doi:10.1016/j.adolescence.2011.12.012
Edney, J. J. (1974). Human territoriality. Psychological Bulletin, 81(12), 959–975. doi:10.1037/h0037444
Ellison, N. B., Steinfield, C., & Lampe, C. (2007). The benefits of Facebook “Friends”: Social capital and college students’ use of online social network sites. Journal of Computer-Mediated Communication, 12(4), 1143–1168. doi:10.1111/j.1083-6101.2007.00367.x
Ellison, N. B., Vitak, J., Steinfield, C., & Gray, R. (2011). Negotiating privacy concerns and social capital needs in a social media environment. In S. Trepte & L. Reinecke (Eds.), Privacy online (pp. 19–32). Berlin, Heidelberg: Springer. doi:10.1007/978-3-642-21521-6
Erikson, E. H. (1968). Identity: Youth and crisis. New York: Norton.
Fernback, J. (2013). Sousveillance: Communities of resistance to the surveillance environment. Telematics and Informatics, 30(1), 11–21. doi:10.1016/j.tele.2012.03.003
Fisher, J. D., & Byrne, D. (1975). Too close for comfort: Sex differences in response to invasions of personal space. Journal of Personality and Social Psychology, 32(1), 15–21. doi:10.1037/h0076837
Fitsanakis, J. (2013). Are Russian spies switching to typewriters to avoid interception? intelNews.org. Retrieved January 06, 2013, from http://intelnews.org/2013/07/12/01-1298/
Fogel, J., & Nehmad, E. (2009). Internet social network communities: Risk taking, trust, and privacy concerns. Computers in Human Behavior, 25(1), 153–160. doi:10.1016/j.chb.2008.08.006
Gibson, R. K., & McAllister, I. (2009). Crossing the web 2.0 frontier? Candidates and campaigns online in the Australian federal election of 2007. In ECPR General Conference (pp. 1–30). Potsdam, Germany.
Grasmuck, S., Martin, J., & Zhao, S. (2009). Ethno-racial identity displays on Facebook. Journal of Computer-Mediated Communication, 15(1), 158–188. doi:10.1111/j.1083-6101.2009.01498.x
Greenhow, C., & Robelia, B. (2009). Informal learning and identity formation in online social networks. Learning, Media and Technology, 34(2), 119–140. doi:10.1080/17439880902923580
Greenwald, G. (2013, June 6). NSA collecting phone records of millions of Verizon customers daily. The Guardian. Retrieved from http://www.theguardian.com/world/2013/jun/06/nsa-phone-records-verizon-court-order
Grimmelmann, J. (2014). Big data’s other privacy problem. In Big data and the law (Vol. 11, pp. 1–10). West Academic. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2358079
Gueorguieva, V. (2007). Voters, MySpace, and YouTube: The impact of alternative communication channels on the 2006 election cycle and beyond. Social Science Computer Review, 26(3), 288–300. doi:10.1177/0894439307305636
Henrich, J., Heine, S. J., & Norenzayan, A. (2010). The weirdest people in the world? The Behavioral and Brain Sciences, 33(2–3), 61–135. doi:10.1017/S0140525X0999152X
Herring, S. C. (2004). Slouching toward the ordinary: Current trends in computer-mediated communication. New Media & Society, 6(1), 26–36. doi:10.1177/1461444804039906
Herring, S. C. (2008). Questioning the generational divide: Technological exoticism and adult constructions of online youth identity. In D. Buckingham (Ed.), Youth, identity, and digital media (pp. 71–92). Cambridge, MA: The MIT Press. doi:10.1162/dmal.9780262524834.071
Hess, A. (2014). Why women aren’t welcome on the internet. Pacific Standard. Santa Barbara, CA. Retrieved January 23, 2014, from http://www.psmag.com/navigation/health-and-behavior/women-arent-welcome-internet-72170/
Hoadley, C. M., Xu, H., Lee, J. J., & Rosson, M. B. (2010). Privacy as information access and illusory control: The case of the Facebook News Feed privacy outcry. Electronic Commerce Research and Applications, 9(1), 50–60. doi:10.1016/j.elerap.2009.05.001
Hogan, B. (2010). The presentation of self in the age of social media: Distinguishing performances and exhibitions online. Bulletin of Science, Technology & Society, 30(6), 377–386. doi:10.1177/0270467610385893
Hoofnagle, C., King, J., Li, S., & Turow, J. (2010). How different are young adults from older adults when it comes to information privacy attitudes and policies? Retrieved from http://ssrn.com/abstract=1589864
Horning, R. (2014). Privacy can be achieved by multiplying public selves. Leaving breadcrumbs here to throw people on to the wrong trail. Twitter. Retrieved January 08, 2014, from https://twitter.com/marginalutility/status/419205717829365760
Howe, M. L., Garner, S. R., & Patel, M. (2013). Positive consequences of false memories. Behavioral Sciences & the Law, 31(5), 652–665. doi:10.1002/bsl.2078
Hoy, M. G., & Milne, G. R. (2010). Gender differences in privacy-related measures for young adult Facebook users. Journal of Interactive Advertising, 10(2), 28–46.
Jordán-Conde, Z., Mennecke, B., & Townsend, A. (2013). Late adolescent identity definition and intimate disclosure on Facebook. Computers in Human Behavior. doi:10.1016/j.chb.2013.07.015
Kaya, N., & Weber, M. J. (2003). Territorial behavior in residence halls: A cross-cultural study. Environment & Behavior, 35(3), 400–414. doi:10.1177/0013916503035003005
Kessler, S. (2013). Think you can live offline without being tracked? Here’s what it takes. Fast Company. Retrieved January 20, 2014, from http://www.fastcompany.com/3019847/think-you-can-live-offline-without-being-tracked-heres-what-it-takes
Kosinski, M., Stillwell, D., & Graepel, T. (2013). Private traits and attributes are predictable from digital records of human behavior. Proceedings of the National Academy of Sciences of the United States of America, 110(15), 5802–5805. doi:10.1073/pnas.1218772110
Lehtinen, V., Näsänen, J., & Sarvas, R. (2009). “A little silly and empty-headed” – Older adults’ understandings of social networking sites. In HCI 2009 – People and Computers XXIII (pp. 45–54).
Lepore, J. (2013, June). Privacy in an age of publicity. The New Yorker. Retrieved January 09, 2014, from http://www.newyorker.com/reporting/2013/06/24/130624fa_fact_lepore?currentPage=all
Lewis, J., & West, A. (2009). “Friending”: London-based undergraduates’ experience of Facebook. New Media & Society, 11(7), 1209–1229. doi:10.1177/1461444809342058
Lewis, K., Kaufman, J., Gonzalez, M., Wimmer, A., & Christakis, N. (2008). Tastes, ties, and time: A new social network dataset using Facebook.com. Social Networks, 30(4), 330–342. doi:10.1016/j.socnet.2008.07.002
Light, B., & McGrath, K. (2010). Ethics and social networking sites: A disclosive analysis of Facebook. Information Technology & People, 23(4), 290–311. doi:10.1108/09593841011087770
Locke, J. (1700). An essay concerning humane understanding, in four books (4th ed.). London: Awnsham and John Churchil. Retrieved from https://archive.org/details/essayconcernin00lockuoft
Locke, J. (1751). The works of John Locke, Esq; In three volumes, Vol. II (5th ed.). London: S. Birt, D. Browne, T. Longman... Retrieved from http://books.google.com/books?id=q-k-AAAAcAAJ&pgis=1
Mann, S., Nolan, J., & Wellman, B. (2003). Sousveillance: Inventing and using wearable computing devices for data collection in surveillance environments. Surveillance & Society, 1(3), 331–355.
Marcia, J. E. (1966). Development and validation of ego-identity status. Journal of Personality and Social Psychology, 3(5), 551–558. doi:10.1037/h0023281
Marwick, A. E. (2014). How your data are being deeply mined. The New York Review of Books, 61(1). Retrieved from http://www.nybooks.com/articles/archives/2014/jan/09/how-your-data-are-being-deeply-mined/?pagination=false
McDonald, A. M., & Cranor, L. F. (2008). The cost of reading privacy policies. I/S: A Journal on Law and Policy for the Information Society, 4(3), 540–565.
Mehdizadeh, S. (2010). Self-presentation 2.0: Narcissism and self-esteem on Facebook. Cyberpsychology, Behavior and Social Networking, 13(4), 357–364. doi:10.1089/cyber.2009.0257
Mills, E. (2007, June 4). Google’s street-level maps raising privacy concerns. USA Today. Retrieved from http://usatoday30.usatoday.com/tech/news/internetprivacy/2007-06-01-google-maps-privacy_N.htm
Milne, G. R., & Culnan, M. J. (2004). Strategies for reducing online privacy risks: Why consumers read (or don’t read) online privacy notices. Journal of Interactive Marketing, 18(3), 15–29. doi:10.1002/dir.20009
Moore, S. A., & Zoellner, L. A. (2007). Overgeneral autobiographical memory and traumatic events: An evaluative review. Psychological Bulletin, 133(3), 419–437. doi:10.1037/0033-2909.133.3.419
Moscardelli, D. M., & Divine, R. (2007). Adolescents’ concern for privacy when using the internet: An empirical analysis of predictors and relationships with privacy-protecting behaviors. Family and Consumer Sciences Research Journal, 35(3), 232–252. doi:10.1177/1077727X06296622
O’Hara, K., Tuffield, M. M., & Shadbolt, N. (2008). Lifelogging: Issues of identity and privacy with memories for life. In First International Workshop on Identity in the Information Society. Arona, Italy. Retrieved from http://eprints.ecs.soton.ac.uk/15993/
Ohm, P. (2010). Broken promises of privacy: Responding to the surprising failure of anonymization. UCLA Law Review, 57, 1701–1777.
Owen, D. (2009). Election media and youth political engagement. Journal of Social Science Education, 7(2), 14–24.
Pastalan, L. A. (1970). Privacy as an expression of human territoriality. In L. A. Pastalan & D. H. Parson (Eds.), Spatial behavior of older people (pp. 88–101). Ann Arbor, MI: University of Michigan.
Payne, S. (2013, December 9). Hypocrisy alert: Big tech firms complain of data intrusion. The Spectator. Retrieved from http://blogs.spectator.co.uk/coffeehouse/2013/12/hypocrisy-alert-big-tech-firms-complain-of-data-intrusion/
Pempek, T. A., Yermolayeva, Y. A., & Calvert, S. L. (2009). College students’ social networking experiences on Facebook. Journal of Applied Developmental Psychology, 30(3), 227–238. doi:10.1016/j.appdev.2008.12.010
PEN American Center. (2013). Chilling effects: NSA surveillance drives U.S. writers to self-censor. New York. Retrieved from pen.org
Penny, L. (2014). David Cameron’s internet porn filter is the start of censorship creep. The Guardian. Retrieved January 08, 2014, from http://www.theguardian.com/commentisfree/2014/jan/03/david-cameron-internet-porn-filter-censorship-creep
Peterson, C., Grant, V. V., & Boland, L. D. (2005). Childhood amnesia in children and adolescents: Their earliest memories. Memory, 13(6), 622–637. doi:10.1080/09658210444000278
Porfeli, E. J., Lee, B., Vondracek, F. W., & Weigold, I. K. (2011). A multi-dimensional measure of vocational identity status. Journal of Adolescence, 34(5), 853–871. doi:10.1016/j.adolescence.2011.02.001
Powazek, D. (2012). I’m not the product, but I play one on the internet. powazek.com. Retrieved January 06, 2014, from http://powazek.com/posts/3229
Proshansky, H. M., Ittleson, W. H., & Rivlin, L. G. (1970). Freedom of choice and behavior in a physical setting. In H. M. Proshansky, W. H. Ittleson, & L. G. Rivlin (Eds.), Environmental psychology. New York: Holt, Rinehart & Winston.
Rodriguez, S. (2013, August 8). iOS 7’s controversial new feature: Frequent Locations tracking map. Los Angeles Times. Los Angeles. Retrieved from http://www.latimes.com/business/technology/la-fi-tn-ios-7-frequent-locations-tracking-map-20130808,0,3403916.story
Schwarz, O. (2010). On friendship, boobs and the logic of the catalogue: Online self-portraits as a means for the exchange of capital. Convergence: The International Journal of Research into New Media Technologies, 16(2), 163–183. doi:10.1177/1354856509357582
Seder, J. P., & Oishi, S. (2009). Ethnic/racial homogeneity in college students’ Facebook friendship networks and subjective well-being. Journal of Research in Personality, 43(3), 438–443. doi:10.1016/j.jrp.2009.01.009
Sellen, A., Fogg, A., Aitken, M., Hodges, S., Rother, C., & Wood, K. (2007). Do life-logging technologies support memory for the past? An experimental study using SenseCam. In CHI ’07 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 81–90).
Shein, E. (2013). Ephemeral data. Communications of the ACM, 56(9), 20. doi:10.1145/2500468.2500474
Sommer, R., & Becker, F. D. (1969). Territorial defense and the good neighbor. Journal of Personality and Social Psychology, 11(2), 85–92.
Staten, J. (2013). The cost of PRISM will be larger than ITIF projects. Forrester Research. Retrieved January 06, 2014, from http://blogs.forrester.com/james_staten/13-08-14-the_cost_of_prism_will_be_larger_than_itif_projects
Strano, M. M. (2008). User descriptions and interpretations of self presentation through Facebook profile images. Journal of Psychosocial Research on Cyberspace, 2(2), 1. Retrieved from http://cyberpsychology.eu/view.php?cisloclanku=2008110402&article=1
Stutzman, F., Gross, R., & Acquisti, A. (2012). Silent listeners: The evolution of privacy and disclosure on Facebook. Journal of Privacy and Confidentiality, 4(2), 7–41.
Suler, J. (2004). The online disinhibition effect. CyberPsychology & Behavior, 7(3), 321–326. doi:10.1089/1094931041291295
Thorbecke, W., & Grotevant, H. D. (1982). Gender differences in adolescent interpersonal identity formation. Journal of Youth and Adolescence, 11(6), 479–492. doi:10.1007/BF01538808
Tokunaga, R. S. (2011). Social networking site or social surveillance site? Understanding the use of interpersonal electronic surveillance in romantic relationships. Computers in Human Behavior, 27(2), 705–713. doi:10.1016/j.chb.2010.08.014
Valenzuela, S., Park, N., & Kee, K. F. (2009). Is there social capital in a social network site?: Facebook use and college students’ life satisfaction, trust, and participation. Journal of Computer-Mediated Communication, 14(4), 875–901. doi:10.1111/j.1083-6101.2009.01474.x
Vitak, J. (2012). The impact of context collapse and privacy on social network site disclosures. Journal of Broadcasting & Electronic Media, 56(4), 451–470. doi:10.1080/08838151.2012.732140
Vitak, J., Zube, P., Smock, A., Carr, C. T., Ellison, N. B., & Lampe, C. (2011). It’s complicated: Facebook users’ political participation in the 2008 election. Cyberpsychology, Behavior, and Social Networking, 14(3), 107–114. doi:10.1089/cyber.2009.0226
Warren, S., & Brandeis, L. D. (1890). The right to privacy. Harvard Law Review, 4(5), 193–220. Retrieved from http://www.jstor.org/stable/1321160
Weber, S., & Mitchell, C. (2008). Imagining, keyboarding, and posting identities: Young people and new media technologies. In D. Buckingham (Ed.), Youth, Identity, and Digital Media (pp. 25–48). Cambridge, MA: The MIT Press.
Westlake, E. J. (2008). Friend me if you Facebook: Generation Y and performative surveillance. TDR/The Drama Review, 52(4), 21–40. doi:10.1162/dram.2008.52.4.21
Whitehead, T. (2013, October 22). We will lose terrorists because of GCHQ leaks, warns minister. The Telegraph. Retrieved from http://www.telegraph.co.uk/news/uknews/terrorism-in-the-uk/10397795/We-will-lose-terrorists-because-of-GCHQ-leaks-warns-minister.html
Wiley, R. E., & Berman, S. L. (2012). The relationships among caregiver and adolescent identity status, identity distress and psychological adjustment. Journal of Adolescence, 35(5), 1203–1213. doi:10.1016/j.adolescence.2012.04.001
Wilson, R. A. (2000). Prometheus rising (2nd ed.). Tempe, AZ: New Falcon.
Xenos, M., & Foot, K. (2008). Not your father’s internet: The generation gap in online politics. In W. L. Bennett (Ed.), Civic life online: Learning how digital media can engage youth (pp. 51–70). Cambridge, MA: The MIT Press. doi:10.1162/dmal.9780262524827.051
Youn, S., & Hall, K. (2008). Gender and online privacy among teens: Risk perception, privacy concerns, and protection behaviors. CyberPsychology & Behavior, 11(6), 763–765. doi:10.1089/cpb.2007.0240
Zimmer, M. (2010). “But the data is already public”: On the ethics of research in Facebook. Ethics and Information Technology, 12(4), 313–325. doi:10.1007/s10676-010-9227-5
Zittrain, J. (2012). Meme patrol: “When something online is free, you’re not the customer, you’re the product.” Future of the Internet. Retrieved January 23, 2014, from http://blogs.law.harvard.edu/futureoftheinternet/2012/03/21/meme-patrol-when-something-online-is-free-youre-not-the-customer-youre-the-product/
Zukowski, T., & Brown, I. (2007). Examining the influence of demographic factors on internet users’ information privacy concerns. In SAICSIT 2007 (pp. 197–204). Fish River Sun, Sunshine Coast, South Africa.