ORIGINAL PAPER
Property, privacy and personhood in a world of ambient intelligence
Niels van Dijk
Published online: 3 November 2009
© Springer Science+Business Media B.V. 2009
Abstract Profiling technologies are the facilitating force behind the vision of Ambient Intelligence, in which everyday devices are connected and embedded with all kinds of smart characteristics that enable them to take decisions in order to serve our preferences without us being aware of it. These technological practices have considerable impact on the process by which our personhood takes shape and pose threats like discrimination and normalisation. The legal response to these developments should move away from a focus on entitlements to personal data, towards making transparent and controllable the profiling process by which knowledge is produced from these data. The tendency in intellectual property law to commodify information embedded in software and profiles could counteract this shift to transparency and control. These rights obstruct access to, and contestation of, the design of the code that impacts one's personhood. This triggers a political discussion about the public nature of this code and forces us to rethink the relations between property, privacy and personhood in the digital age.
Keywords Ambient intelligence · Data protection · Personhood · Intellectual property · Privacy · Profiling · Property · Transparency · Transparency enhancing technologies · Ubiquitous computing
Introduction
This article is about the shifting field of relations between
property, privacy and personhood in a time of increasing
pervasiveness of digital technologies. The focus will be
upon both the positive and negative functions that property
and intellectual rights can have for personhood and how
these functions either collide with, or obstruct privacy
protection. The article will refer to the writings of Locke,
Marx and Foucault in order to clarify the relations between
the property rights and personhood. When this nexus of
interrelations is evaluated within the framework of ambient
intelligent technologies, it will be argued that privacy
resistance to the negative functions of property will only be
meaningful and successful when conditions of transparency
have been met.
A few preliminary conceptual clarifications have to be
made before starting the analysis. The authors treated here use
different concepts when talking about persons: ''personhood'',
''personality'' and ''personal identity''. To provide
the reader with some clarity, for purposes of this article the
term personhood will refer to ‘‘being a person’’. Personality
will refer to ‘‘being a kind of person’’ and is thus a specific
articulation of personhood. Personal identity will refer to
someone being identifiable as the same person or kind of
person at different moments of time.
The concepts of property, privacy and personality are
legally not of the same ‘‘kind’’. Property and privacy both
have explicit standing in the law, although in different
ways. Property rights are legally recognized as subjective
rights. Privacy is recognized as either a general principle of
law, a fundamental freedom and/or a subjective right.1

N. van Dijk (✉)
Center for Law, Science, Technology & Society Studies (LSTS),
Vrije Universiteit Brussel, Building B, 4th floor, room C339,
Pleinlaan 2, 1050 Brussels, Belgium
e-mail: [email protected]

1 See Gutwirth (2002, pp. 39–42), however, on the reasons why
privacy is a freedom and not a subjective right.
Ethics Inf Technol (2010) 12:57–69
DOI 10.1007/s10676-009-9211-0
Personhood and personality have a different standing in the
law. Personhood is a precondition to be an actor in law and
to be recognized as a bearer of rights and duties in the first
place. Personality is often recognized as a general principle
of law. In most jurisdictions no explicit subjective right to
personality currently exists.2 The protection of personality
however is one of the main principles behind privacy. As
we will see personhood has been used as a justification for
property rights in land and objects.
The first section of this article will describe two con-
flicting approaches with regard to the relation between
property rights and personhood. The second section sket-
ches the tendency in ‘‘intellectual property’’ towards an
increasing ‘‘propertization of the intellectual commons’’ in
the digital age. Building upon these two sections an anal-
ysis will be presented of how new information technologies
like Ambient Intelligence transform the relation between
property and personhood. Section three investigates how
these technologies have an impact on personal develop-
ment. The fourth section will provide a brief overview of
the framing and development of the right to privacy in
relation to new technologies and conceptions of person-
hood and property. Section five and six will evaluate these
consequences in a privacy framework and discuss the two
different functions of property within this framework. In
section seven several factors will be discussed that are
relevant for striking a balance between both legal regimes.
Property and personhood
The term ‘‘property’’ comes from Latin ‘‘proprius’’
meaning ‘‘one’s own’’ or ‘‘something private or peculiar to
oneself’’.3 Almost all theories on private property rights
refer to some sort of personhood. Property rights are
attached to an individual person and the particular view of
the nature of personhood greatly influences the kind of
rights conferred. The focus will be on two different func-
tions property rights have in relation to personhood,
namely the constructive and destructive function. Locke’s
labour theory of property and his theory on personal
identity will be used as the paradigmatic example for the
constructive function of property. The destructive function
of property will build on Marx's description of the relations
between property and personhood and Foucault’s expan-
sions on these themes.4
According to Locke all humans are by their very nature
in ‘‘a state of perfect freedom to order their actions, and
dispose of their possessions, and persons as they think fit’’
(Locke 1689, §4, my italics). This statement places humans
in a relation of ‘‘disposal’’ with regard to their person-
hood.5 This disposal of our persons is practiced well when
it consists in a reasonable ‘‘support and comfort’’ of our
being. This ‘‘support and comfort’’ consists in acts of self-
preservation and self-development by making best advan-
tage of life and its conveniences.6 These two premises
necessitate a right of appropriation of land and tangible
goods. Locke formulates his mixing theory of property
rights as follows:
Though the earth and all inferior creatures be com-
mon to all men, yet every man has a property in his
own person. This no body has any right to but him-
self. The labour of his body, and the work of his
2 Germany could be considered an exception. Article 2(1) of its
constitution states that: ''Jeder hat das Recht auf die freie Entfaltung seiner Persönlichkeit, soweit er nicht die Rechte anderer verletzt und nicht gegen die verfassungsmäßige Ordnung oder das Sittengesetz verstößt.'' (''Everyone has the right to the free development of his personality insofar as he does not violate the rights of others or offend against the constitutional order or the moral law.'')

3 In the English language ''property'' has the double meaning of ''attribute'' and ''ownership'', which seems to fit with these two respective translations.
4 The other main theories of property and their links to personhood
are Hegel’s personality theory and Bentham’s utilitarian theory of
property. With regard to the connection between property and
personhood one could say the following. In Hegel’s theory this link
is crucial. Somebody only becomes a concrete materialized person by
engaging in property relationships with external things in which one’s
will becomes embodied. ‘‘Property is the first embodiment of freedom’’
(Hegel 1821, Grundlinien der Philosophie des Rechts §44–45).
In utilitarian theories of law minimal entitlements to resources are
necessary for the dignity of people. Property rights should contribute
to the maximization of the overall welfare of people. Bentham, in this
regard, remarked that ‘‘property is only a foundation of expectation—
the expectation of deriving certain advantages from the thing said to
be possessed, in consequence of the relations in which one already
stands to it’’ (Bentham, Principles of the Civil Code, in Bowring (ed.),
The Collected Works of Jeremy Bentham, 1843).
Bentham’s utilitarian theory of property has seen a ‘‘recent’’ revival
in the law & economics approach to law. An important development
in this movement is the idea that entitlements to resources can be
protected by property rules, liability rules or inalienability rules. The
choice between the three will depend to an important extent on
reasons of economic efficiency (Calabresi and Melamed 1972). Later
in this article we will show how this approach has also been applied to
the question how to protect entitlements to personal data.

5 Note that ''possessions'' and ''personhood'' are juxtaposed here:
humans dispose of both. Locke says that this freedom to dispose is
only bound by the law of nature, which he calls reason.

6 Reasonable disposal of our persons first and foremost teaches us
that we don’t have the liberty to destroy ourselves and that we are thus
bound to self-preservation (§6). Earth and the things (fruits and
beasts) it spontaneously produces are in their natural state given to all
humans in common. Since this is the ground rule, Locke has to
demonstrate how man can have property in things. He argues that the
right for self-preservation implies that we have an entitlement to
whatever things are necessary for our subsistence. Apart from
preserving our life, reason also teaches us to make use of
things to ‘‘the best advantage of life, and convenience’’ (§26). In this
way we support and comfort our being and dispose well of our person.
The conveniences of life can be further improved by invention (of
utensils and money) and art (of government; §§36, 43–44).
hands, we may say, are properly his. Whatsoever then
he removes out of the state that nature hath provided
and left it in, he hath mixed his labour with, and
joyned to it something that is his own, and thereby
making it his property. (Locke 1689, §27)
This statement can be interpreted in the light of his
writings on personal identity where the person is under-
stood as the conscious part of man and not as his bodily
part.7 In Locke’s view, man as a person has property in his
own body and its actions. By his laborious actions, oriented
to his maintenance and development of his person, he
mixes a part of himself with things in their natural state and
appropriates them.8
Ever since Marx however these constructive aspects of
property for the personhood of its possessor have been
opposed with the destructive effects of property on the
personhood of others. In the enclosure movement in eigh-
teenth century England common land was privatised.
Existing property relations were redistributed and concen-
trated in the hands of a few owners. These developments
had harmful consequences to the personhood of the people
involved: they caused loss of traditional forms of life,
social relations and material possessions. According to
Marx persons, being expropriated from the means of pro-
duction and subsistence, lost control over their own activity
of working which they are forced to sell as a commodity on
the labour market. They could thus not fully realize
themselves as persons through their work (Marx 1887,
Chap. 32).9 We can here also see a strong interrelation
between property and personhood although in a very dif-
ferent configuration than in Locke.
After this ‘‘great enclosure’’ old customary rights and
privileges to common land were treated as theft by the new
owners and were illegalized. The same was true for theft of
commodities, including the ones newly produced by
industry.10 Foucault calls this a transition from the toler-
ated ‘‘illegality of rights’’ to the absolute ‘‘illegality of
property’’. These changes necessitated a shift in the prac-
tice of punishments and required new practices of sur-
veillance by the police.
It was an ‘‘effort to adjust the mechanisms of power
that frame the everyday lives of individuals; an
adaptation and a refinement of the machinery that
assumes responsibility for and places under surveil-
lance their everyday behaviour, their identity, their
apparently unimportant gestures; another policy for
that multiplicity of bodies and forces that constitutes
a population.’’ (Foucault 1991, pp. 77–78)
This development also coincided with shifts within the
relations between the appropriators of the newly enclosed
spaces and the expropriated masses. People were forced to
migrate and offer themselves as labourers within capitalist
farms, manufactories and factories. These working spaces
were embedded with their own proper techniques of sur-
veillance, punishment and discipline.11 The techniques
were designed to normalize the behaviour of people into
conformity and turn them into useful productive
7 In the Essay Concerning Human Understanding of a year later, Locke's
terminology is more precise. He considers the identity of man to
consist in a combination of the physical sameness of the organization
of the body over time, and the sameness of the rational thinking soul.
He then defines a person as ‘‘a thinking intelligent being that has
reason and reflection, and can consider itself as itself, the same
thinking thing, in different times and places, which it does only by
that consciousness which is inseparable from thinking’’. Personal
identity according to Locke then consists in ‘‘the sameness of a
rational being: and as far as this consciousness can be extended
backwards to any past action or thought, so far reaches the identity of
that person’’ (Locke 1690, B II, Chap. XXVII, §9).8 According to Radin (1982) the statement that a person has property
in his own body, actions and products either means that one literally
owns his body, its limbs and their products or that one has an
entitlement to be a person with a right to self-preservation that
justifies appropriation of things. In the first case there are some
paradoxes of bodily continuity. Bodily parts like blood, hair, organs
can become fungible commodities when detached from the person’s
body. They could then be sold and end up in the possession of
someone else. Apart from the question whether this should be
allowed, it shows that ‘‘property requires the notion of a thing, and the
notion of a thing requires separation from self’’ (Radin 1982, p. 966).
In order to attribute property rights to individuals, certain boundaries
must be drawn between things and persons. Property refers to
something separate from ourselves that is supposed to be clearly
demarcated in physical and legal terms, although in practice it turns
out that specifying such boundaries is a (more and more) difficult
undertaking. The contradiction shows that literal property of one’s
body and person is infeasible due to its inalienable character.
9 In the young Marx a philosophical conception of personal identity
would be implied in this process. Human essence (Gattungswesen) on
this view consisted in the creative capacity to materialize one’s ideas
through one’s labour. It is impossible to realize oneself when one is
forced to sell the product of his labour. This would mean an alienation
(Verfremdung) from one’s essence. The arguments offered in DasKapital are socio-economical. They focusing on the kind of
exploitation (Ausbeutung) of the labourer described here.10 It applies to other kinds of commodities as well. Marx for instance,
criticized a draft of a law which classified all pilfering of wood as
‘‘theft’’, because it included such different actions as taking away
felled wood, which is worked upon by someone, and gathering fallen
wood, which is already naturally separated from property, under the
legal category of theft. He states that if ‘‘every violation of property
without distinction, without a more exact definition, is termed theft,
will not all private property be theft? By my private ownership do
I not exclude every other person from this ownership?'' (Marx 1842).

11 Techniques of discipline were also incorporated in the penal system,
especially in prisons. Prisons became the place for the ‘‘transformation
of the individual as a whole’’ by the use of instruments of penal
apprenticeship in order to ‘‘create a mass of new workers’’, instruments
of spiritual conversion and instruments of continuous observation and
monitoring.
labourers.12 According to Foucault ‘‘discipline ‘makes’
individuals; it is the specific technique of a power that
regards individuals both as objects and as instruments of its
exercise’’ (Foucault 1991, p. 170).
Where Marx stressed the direct negative impacts of the
shifted power relations and exclusionary property redistri-
butions for the development as a person, Foucault puts an
emphasis on the techniques of policing, punishment and
discipline that arose to enforce the new exclusive property
relations and distributions. These techniques served to
exclude people from the possibilities to develop a certain
personhood and subjected them to being different ''kinds
of persons’’.13 In this sense each exclusion always becomes
an inclusion in a new field of techniques and possibilities in
which the person is differently articulated. In his later work
Foucault stressed the possibilities of people to actively
resist such techniques of power and subjectification. He
argued that the exercise of power is about the capacity to
direct (conduire) the conduct of others through the tech-
niques of subjectification as described above. Power,
however, is simultaneously about the freedom to refuse to
submit to these mechanisms.14 Through strategies of
resistance a transformative confrontation is produced with
the existing relations of power that keep one tied to certain
stable personalities. Freedom is about the capacity of let-
ting new forms of personality come into existence, by
modifying the constraints of the existing systems of dif-
ferentiation. Legally this implies that a society has to
acquire those rules of law that will allow persons to engage
in games of freedom with as little asymmetry of power
positions as possible (Foucault 1996). We will come back
to these legal ‘‘rules of resistance’’ in our discussion of
privacy.
Concluding this section we can point out three central
functional elements: the (initial) granting of property rights
to support and empower people to develop their person-
hood, the distribution and enforcement of property rights
that can exclude one from possibilities to identify with and
develop into certain types of personality and be directed
towards others, and the capacity to resist such in/exclusions
and impositions of personality. Later in this article these
three elements will be further articulated in relation to
profiling technologies.
Intellectual ‘‘property’’
The author has played the role of the regulator of the
fictive, a role quite characteristic of our era of
industrial and bourgeois society, of individualism and
private property. (Foucault 1979, p. 159)
The historical birth of the field of copyright, or droit
d’auteur, was made possible by a mix between the concept
of property applied to intangibles and the rise in prominence
of the individual figure of the author (Strowel 1997).
Locke’s labour argument was extended to the creation of
‘‘intangible’’ works and served as an important justificatory
strand for granting intellectual rights. On this ''natural
rights’’ line of argument one is entitled to the fruits of his or
her intellectual labour. The intellectual labourer is granted
exclusive rights that confer a temporary monopoly of cer-
tain uses of the protected creation. On the utilitarian strand
of justification granting intellectual rights should provide an
incentive for people to create works.15 On both views
intellectual ‘‘property’’ rights are supposed to guarantee or
support the existence of an intellectual commons from
which other creators might draw their inspiration and
materials. After stating his mixing theory of property
Locke formulated what is known as the ‘‘Lockean proviso’’.
He stated that although labour is ‘‘the unquestionable
property of the labourer, no Man but he can have a right to
what is once joyned to, at least where there is enough, and as
good left in common for others’’.16 The baseline was that
intellectual rights were the exception rather than the rule,
guaranteeing that ideas and facts remain or are brought into
the public domain.
In resonance with the quotation by Foucault it is some-
times argued that a movement similar to the enclosure of
common land outlined in the last section is currently taking
place in the field of intellectual property rights and that it is
fencing off ‘‘the intangible commons of the mind’’ (Boyle
2003). In the age of digital technologies the making of
copies is the very mode of functioning. Since the author has
the exclusive right to reproduce or copy the work, this
development transforms copyright into the dominant legal
form for the world of digital technologies. It is implicated in
all its operations thus pervasively extending its reign
everywhere. As Boyle states: ‘‘intellectual property is the
legal form of the information age. It is the locus of the most
important decisions in informational policy. It profoundly
12 Compare Marx: ‘‘Thus were the agricultural people, first forcibly
expropriated from the soil, driven from their homes, turned into
vagabonds, and then whipped, branded, tortured by laws grotesquely
terrible, into the discipline necessary for the wage system.’’ (Marx
1887, VIII, Chap. 28).

13 This expression is Ian Hacking's (2002, pp. 106–107).

14 ''For, if it is true that at the heart of power relations and as a
permanent condition of their existence there is an insubordination and
a certain essential obstinacy on the part of the principles of freedom,
then there is no relationship of power without the means of escape or
possible flight. Every power relationship implies, at least in potential,
a strategy of struggle […] Each constitutes for the other a kind of
permanent limit, a point of possible reversal’’ (Foucault 1982, p. 225).
15 See Hettinger (1989) for a criticism of these two positions.

16 Locke (1690, B II, Chap. XXVII, §27).
affects the distribution of political and economic power in
the digital environment’’ (Boyle 1997, p. 90). There are
structural tendencies to overprotect digital information
goods.
It must be remarked that there are important dissimilar-
ities between these ‘‘physical’’ and ‘‘intellectual’’ enclo-
sures due to the different nature of their subject matter;
so called ‘‘tangibles’’ and ‘‘intangibles’’. In case of the
enclosure of the intellectual commons the famous ‘‘tragedy
of the commons’’ doesnot occur.17 This is the case when too
many people are granted an entitlement to use a common
resource like land and no rights to exclude others are
granted. The result will be an overuse of the resource.
Contrary to tangible property like land, intangible goods are
non-rivalrous—use by others does not diminish the original
use of the good—and access to them is not excludable once
they are shared. Intellectual ‘‘property’’ rights are thus
property in a metaphorical rather than in a literal sense.18
Persons in intelligent environments
Digital technologies are still undergoing a remarkably rapid
development. An important strand of current development
of information processing technologies is based upon the
paradigm of ubiquitous computing developed by Mark
Weiser of Xerox PARC (Weiser 1991). Ubiquitous computing is
based on the philosophical or psychological observation
that the most important, well-functioning technologies are
those that recede in the background when used. They are
‘‘ready-to-hand’’ and weave themselves in the ‘‘tacit’’ fabric
of our everyday lives, without us necessarily becoming
aware of them.19 The vision of Ambient Intelligence
expands on these technologies and insights (Aarts and
Marzano 2003). In a world of Ambient Intelligence humans
will live in an environment in which many daily devices are
connected and embedded with all kinds of smart computing
characteristics which are based upon digital profiling
technologies. ‘‘A new dimension has been added to the
world of information and communication technologies
(ICTs): from anytime, any place connectivity for anyone,
we will now have connectivity for anything’’ (ITU 2005, p.
8). These intelligent environments will be able to take
decisions in order to serve our preferences without us being
aware of it.
Profiling technologies are an important enabling tech-
nology for Ambient Intelligence.20 The goal of these tech-
nologies is to mine for patterns or correlations in large
amounts of data. When these data are related to people these
correlations constitute categories or types of persons. Indi-
vidual persons can be classified and profiled as such a kind
of person on basis of their data (Hildebrandt and Gutwirth
2008). Profiling technologies will be embedded within
several domains of our everyday private and public lives:
the home, work, education, health, shopping, mobility and
even in public city life in general (Cook et al. 2009). Each of
these applications will give rise to its own specific issues
depending on the context in which the technologies are
embedded (Wright et al. 2006). In this section we will focus
on some threats profiling practices might pose for personal
development.21 The next sections will deal with the role of
privacy and its relation to property and intellectual rights
within this socio-technological framework.
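The mining and classification step just described can be illustrated with a toy sketch: behavioural records are averaged into group profiles, and an individual is matched to the nearest profile. This is a minimal illustration in plain Python; the attribute names, the data and the distance-based matching rule are invented assumptions, not a description of any deployed profiling system.

```python
# Minimal sketch of group profiling: aggregate behavioural records into
# segment profiles, then classify a new person by the nearest profile.
# All attribute names and data below are hypothetical.

def centroid(records):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(records)
    return [sum(r[i] for r in records) / n for i in range(len(records[0]))]

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Hypothetical behavioural data: [hours online per day, purchases per month]
segments = {
    "heavy_user": [[6.0, 12.0], [7.5, 9.0], [5.5, 11.0]],
    "light_user": [[1.0, 1.0], [0.5, 2.0], [1.5, 0.0]],
}
profiles = {name: centroid(recs) for name, recs in segments.items()}

def classify(person):
    """Assign a person to the segment with the nearest profile centroid."""
    return min(profiles, key=lambda name: distance(person, profiles[name]))

print(classify([6.5, 10.0]))  # prints "heavy_user"
```

The point of the sketch is that the ''kind of person'' one is taken to be is entirely a by-product of the correlations in the data, not of anything the person has declared about herself.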
One consequence of the use of profiling technologies is
the potential of unjustifiable discrimination. ‘‘The com-
puter profile is a discriminatory technology. It is a resource
used to differentiate between persons and groups’’ (Gandy
2000, p. 11). This in itself is not enough to make this kind
of discrimination unjustified. More important in this
respect are the kind of uses to which these algorithms and
profiles are put. This, in turn, leads us to an analysis of the
norms of the institutions in which these technologies are
embedded and the policies these institutions adopt for
making decisions about individuals on the basis of profiles.
The justifiability of profiling discrimination will also turn
on the kind of consequences it can have for people like
being deprived of important material and informational
opportunities. An example of unjustified discrimination is
the use of profiling technologies in financial institutions.
Based on the profiled segment of the population one is
assigned to, certain services like granting mortgages, loans
or insurances are either offered or denied (Custers 2009). In

17 In fact the opposite could well be the case in some areas of
intellectual rights. Heller and Eisenberg have argued that the granting
of too many patents in biomedical research can cause a ‘‘tragedy of
the anticommons’’. This will be the case when ‘‘multiple owners each
have a right to exclude others from a scarce resource and no one has
an effective privilege of use’’ (Heller and Eisenberg 1998, p. 698).
The resource will then be underused. In the case of biomedical
research this could mean that granting too many patents on
discoveries could deter innovation because too many people have a
right to exclude others from using the resources essential for doing
further ‘‘downstream’’ research.18 The arguments for justifying the ownership of classical property
on tangible things cannot thus be uncritically transferred to the
justification of intellectual property rights. These are tied to specific
rules that are meant to limit their scope and potential harmful uses to
which they might be put.

19 Weiser refers to Heidegger and Polanyi here.
20 Examples of profiling technologies used in Ambient Intelligence
are recommender systems based on both collaborative filtering (web-
based group profiling) and personal filtering (individual user profil-
ing) (Aarts et al. 2005).

21 There are many different kinds of profiling practices: credit
scoring, fraud prevention, customer and consumer profiling, profiling
of employees, profiling of web users, profiling for attention support in
education, location based services (mobile marketing), behavioral and
biometric profiling (Hildebrandt and Gutwirth 2008).
this way whole segments of the population are denied
these important material resources without having aware-
ness or access to information about the grounds of exclu-
sion.22 This leads Gandy to claim that ‘‘[o]ur concern
should be based more generally on what we understand to
be the social consequences that flow from using a decision
system that systematically bars members of groups or seg-
ments of the population from acquiring the informational
resources that are essential to their individual development
and their collective participation in the economy and the
public sphere’’ (Gandy 2002, p. 13). In other words; in these
cases members of a profiled segment of the population will
not have access to the information they need in order to
develop as persons in society. Furthermore, the fact that
profiling technologies always contain a margin of error
potentially leading to wrong categorizations, and that they
operate in an opaque and automated manner makes this
kind of discrimination more unjustifiable.23 The threat here thus
consists in the fact that the application of profiling tech-
nologies in institutional decision-making can function as a
‘‘technique of exclusion’’. They can exclude people from
certain important material and informational possibilities
and resources for personal development.
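The mortgage example above can be made concrete in a short sketch: services are offered or denied according to the segment one has been profiled into, while the grounds remain invisible to the person concerned. The segment names, scores and cutoff below are invented for illustration and do not describe any actual institution's policy.

```python
# Sketch of automated, profile-based decision-making: the applicant never
# sees the segment she was assigned to, nor the rule that excludes her.
# Segment scores and the cutoff are hypothetical.

SEGMENT_SCORES = {"prime": 0.9, "standard": 0.6, "subprime": 0.2}
CUTOFF = 0.5  # opaque institutional policy, not disclosed to applicants

def decide_mortgage(profiled_segment):
    """Offer or deny a mortgage purely on the basis of the profiled segment."""
    return "offer" if SEGMENT_SCORES[profiled_segment] >= CUTOFF else "deny"

# A (possibly mis-)profiled applicant is denied without any ground disclosed.
print(decide_mortgage("subprime"))  # prints "deny"
```

Even in this caricature the two problems discussed above are visible: the decision is fully automated, and nothing in the interaction gives the applicant access to the classification or the cutoff that excluded her.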
Another consequence of the application of profiling
technologies is the normalisation of behaviour and per-
sonalities by a specific feedback mechanism (Lessig 2006).
Technologies like Ambient Intelligence are designed to
satisfy human needs. What counts as a human need is
inscribed in the algorithmic ‘‘script’’ of the devices on the
basis of which they are able to interact with users at all.
Similarly these scripts also need to be inscribed with a
vision of what is to count as a user that interacts with the
system. Although profiling technologies are partly dynamic
in adapting to human behaviour, they operate with a certain
formalised social ontology that they impose upon users
when reacting to their behaviour. These built-in presuppo-
sitions and the profiles that are constructed on the basis of
these have a self-enforcing feedback effect on the behaviour
of people. According to his or her behavioural data, a user is
matched to a certain profile. On the basis of such classifi-
cations the systems will act and the profiled preferences will
be fed back to the user. If he or she reacts conformingly the
initially constructed profile will be enforced and can
eventually attain the status of a norm. The range of possible
actions available to the user is limited by such recursive
closures.24 On a larger scale a population becomes seg-
mented and stabilized on the basis of these norms into
certain types of personalities. This is similar to the effects of
subjectification of disciplinary techniques that Foucault
described. Although a strategy of resistance to these nor-
malizing processes will potentially be always possible, it
will not be readily available. This is due to the extreme
opacity of these profiling systems that, especially in
Ambient Intelligence, are designed to retreat into the back-
ground of our attention. Resistance without awareness is
futile.
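The recursive closure just described can be simulated in a few lines: the system feeds back only what the current profile ranks highest, conforming reactions reinforce that profile, and alternatives stop being offered at all. A toy model, with invented categories and an arbitrary reinforcement rule:

```python
# Toy model of profile-driven normalisation: each round the system offers
# only the category the current profile ranks highest; accepting it
# reinforces the profile, which narrows what is offered next.
# Categories and weights are invented for illustration.

profile = {"sports": 1, "politics": 1}  # initial, tentative match (a tie)

def offer(profile, n=1):
    """Feed back the n categories the profile currently ranks highest."""
    return sorted(profile, key=profile.get, reverse=True)[:n]

for _ in range(5):
    for category in offer(profile):
        profile[category] += 1  # conforming reaction reinforces the profile
        # categories never offered receive no reinforcement at all

print(offer(profile))  # prints ['sports']
```

After a few rounds the initial, essentially arbitrary tie has hardened into a stable norm: the user is only ever shown one category, so the profile can no longer be contradicted by her behaviour.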
Privacy and personality
One possibility for finding a legal solution to the issues of
unjustified discrimination and normalisation is by address-
ing them in the framework of privacy. In order to do so, we
need to be clear about what we mean by privacy and how it
relates to the remarks made about personhood. We will then
see that this element of resistance is crucial. Privacy is
inextricably tied up with the concept of the ‘‘private’’, an
ambiguous notion with great historical variability. The
foundations for the conception of the private came about
in the period between the Renaissance and the Enlightenment. A combination of political, cultural and technological changes gave rise to the development of a personal sphere: the rise of the central state and its disciplinary techniques of subjectivation, the importance of the family as the central unit in the economy and the resurgence of Christian practices of
private withdrawal for confession and introspection during
the Reformation (Foucault 1994; Gutwirth 2002).25 The
invention of the printing press also had an important impact
on the conception of the private and public by catalyzing the
increase of literacy and reading. As McLuhan states, the
printing press ‘‘created the portable book, which men could
read in privacy and isolation from others […] the printed
book added much to the new cult of individualism. The
private fixed point of view became possible and literacy
conferred the power of detachment’’ (McLuhan et al. 1967,
p. 50). At the same time the printing press created a public
22 In section six we will discuss how intellectual property rights in profiling software and profiles further block the transparency of these processes.

23 The fact that the automated nature of decision-making is a criterion of consideration in this case is confirmed by article 15 of the Data Protection Directive. This article gives the individual the right ''not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him''. See also Schreurs et al. 2008 on the issue of profiling and discrimination.

24 Goss describes a similar feedback mechanism that is brought about by the use of geodemographic profiling systems in consumer marketing. ''The genius of geodemographics is that it systematically produces such [profiled] lifestyles from us and for us: it presents descriptions of our consuming selves that are actually normative models, or mean characteristics of our consumption (stereo)type to which we are exhorted to conform'' (Goss 1995, p. 191).

25 These three events correspond to the three modes of governmentality that Foucault distinguishes: government of the state, government of the family and self-government (Foucault 1994).
to whom the published books and the ideas they contained
were distributed. People gained access to increasing infor-
mational resources to develop their public opinions. These
socio-technological developments had a transformative
effect upon the demarcation between the public and the
private.
The legal conception of privacy was coined by Warren
and Brandeis (1890) as the ‘‘right to be left alone’’, inter-
preted as being shielded from the gaze of others. It is
interesting to see that in endeavouring to establish the
independence and nature of privacy as a full-fledged right,
the authors start from (property and) copyright. Both con-
tain a right to determine the moment and extent to which
thoughts, ideas, and sentiments will be communicated to
others by publication. Copyright however only applies to
literary and artistic productions and securing the profits
thereof,26 whereas privacy is about an absolute control of
the act of publication and not limited to literary or artistic
content or pecuniary value.27 They conclude that in this
sense copyright is merely an instance of ‘‘the more general
right of the individual to be left alone’’ (Warren and
Brandeis 1890, p. 205). This right is not a principle of
property, but a ‘‘right to one’s personality’’. The right to be
left alone is a negative liberty to be free from interference
from others,28 both government officials and private
individuals.29
The right to privacy has always been closely related to
technologies. Warren and Brandeis framed the right to pri-
vacy in reaction to the consequences of new photographic
technologies and their use by the gossip press, for the
‘‘sacred precincts of private and domestic life’’ (Warren and
Brandeis 1890, p. 195). The core of this concept of privacy is
related to the home and other enclosed ‘‘private’’ spaces. It is
spatial in character and is called relational privacy. Due to
the rise of all kinds of technologies for the storage and
processing of information at the end of the 1960s, the focus
of privacy shifted to personal data. This conception of
informational privacy is defined as ‘‘the claim of individuals,
groups, or institutions to determine for themselves when,
how, and to what extent information about themselves is
communicated to others’’ (Westin 1967, p. 7).30
A new generation of ubiquitous profiling technologies
poses new challenges to this framework. The construction of
personality will be increasingly influenced by the ‘‘digital
persona’’ that the profiling systems continuously impose on
us. Clarke defined such a digital persona as ‘‘a model of the
individual established through the collection, storage and
analysis of data about that person’’ (Clarke 1996). The
individual does not have the same kind of control over these imposed digital personae as over the digital personae that he himself projects publicly.31 Agre and Rotenberg expand
on this insight when they define privacy as ‘‘the freedom
from unreasonable constraints on the construction of one’s
own identity'' (Agre and Rotenberg 1988, p. 7). Privacy is
here thus framed in terms of a freedom to develop personal
identity by assuming some kind of control over the digital
personae imposed by others. In a larger context such control
can be seen as a balance of powers within the democratic
constitutional state. Privacy here has a prohibitive and nor-
mative nature in that it sets limits to the power of others to
interfere or influence a person’s behaviour (Gutwirth and De
Hert 2008). Privacy, in this sense, is about counter-empow-
erment to provide the very point of resistance to face up to
techniques of power as described by Foucault (Gutwirth
2002, p. 58).
In the discussion of this conception of privacy through-
out the next sections we will encounter (intellectual)
property rights at two different points: in the proposal for
protecting privacy by the propertization of personal data
and when we see the principle of transparency of processing
blocked by intellectual rights on software. These two
functions of property coincide with the two relations
between property and personhood outlined earlier: property
as a guarantee or property as an exclusion from the devel-
opment into certain kinds of persons.
26 In the common law traditions with a copyright regime the creator of a work is only granted economic rights in order to secure profits. In legal traditions with ''author rights'' (droit d'auteur, auteursrecht), apart from these economic rights, the author is also granted moral rights like paternity and integrity. These moral rights bear a close connection to considerations about the development of the personality of an author.

27 Warren & Brandeis wrote their article a few years after the Berne Convention for the Protection of Literary and Artistic Works in 1886, to which the US was not yet a party. Apart from literary and artistic works, this treaty also included productions in the scientific domain within the scope of protection.

28 A legal example is article 12 of the Universal Declaration of Human Rights, which clearly formulates this point: ''No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honor and reputation. Everyone has the right to the protection of the law against such interference or attacks''.

29 ''The common law has always recognized a man's house as his castle, impregnable, often, even to its own officers engaged in the execution of its commands. Shall the courts thus close the front entrance to constituted authority, and open wide the back door to idle or prurient curiosity?'' (Warren and Brandeis 1890, p. 220).

30 It could be argued however that information about people was exactly the issue at stake when Warren and Brandeis framed privacy as the right to be left alone by the gossip press. The distinctive element of informational privacy could well be the decontextualized character of the data. However, as we will see, with the next generation of real-time context-embedded technologies like Ambient Intelligence, data will again become contextualized and spatialized.

31 Clarke compares the relation between the digital persona and the individual to the distinction in Jungian psychology between the inner personality (anima) and the public personality (persona). Apart from calling the relation between both ''representational'', he doesn't elaborate further on the issue.
Data propertization for privacy
The main legal approach for achieving the kind of control
described by Agre & Rotenberg has focussed on the protection of personal data. A technological way to effect this
protection is by using Privacy Enhancing Technologies
(PETs) or Identity Management Tools. Privacy Enhancing
Technologies are defined as ‘‘a coherent system of ICT
measures that protects privacy […] by eliminating or
reducing personal data or by preventing unnecessary and/or
undesired processing of personal data; all without losing
the functionality of the system’’ (Borking quoted in Hil-
debrandt and Koops 2007, p. 49). These technologies
provide a digital architecture that enables uniform and
directly enforceable privacy choices by users when circu-
lating through different digital environments.32 In order to
make this technical protection of privacy effective, Lessig argues that legal regulation needs to supplement these technologies. He proposes property rights on personal data as a means to enforce this architectural solution (Lessig 2006).33 The legal solution of granting individuals
property rights on their personal data is mainly considered
in the United States. On this view personal data become
privatized commodities which can be traded given the
individual’s freedom of contract. There are several prob-
lems with treating personal data as property.34 Firstly,
intangible data have several characteristics which compli-
cate their conceptualisation as classical tangible property.
Secondly, within the US legal scholars have duly criticized
Lessig’s solution for protecting personal data by properti-
zation. Schwartz, for one, argues that the propertization of
data will not be effective due to serious market failures
(Schwartz 2000).35 Thirdly, the PET infrastructure that
enforces privacy protection is particularly ill-suited for
allowing legitimate exceptions of use of the personal data.
This problem is similar to that faced in the enforcement of
intellectual property rights through DRMs. Many fair use exceptions have been blocked by these technological systems. Fourthly, when personal data do not relate to just
one person, there will be the problem of shared data and
potential conflicts between its ‘‘owners’’ (Prins 2006).
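A minimal sketch can make the PET idea more concrete. The data items, purposes and structure below are hypothetical illustrations, not any actual PET or P3P implementation; the point is simply that the user's privacy choices are checked mechanically before any personal data are released, whatever digital environment the data circulate through.

```python
# Hypothetical sketch of a Privacy Enhancing Technology: the user's privacy
# choices travel with the data and are enforced uniformly before release.

USER_CHOICES = {
    "email":    {"contract"},  # purposes the user consents to, per data item
    "location": set(),         # never released
}

def release(data_item, purpose, choices=USER_CHOICES):
    # Release a data item only if the user consented to this processing purpose.
    allowed = choices.get(data_item, set())
    if purpose not in allowed:
        raise PermissionError(f"{data_item!r} may not be processed for {purpose!r}")
    return data_item  # stand-in for handing over the actual value

release("email", "contract")       # permitted by the user's choices
# release("email", "marketing")    # would raise PermissionError
```

Such a mechanism also illustrates the third problem above: a rule enforced in code admits no contextual judgment, so legitimate exceptions of use are blocked as rigidly as illegitimate ones.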
In the European tradition privacy is generally considered
a human right that is inalienable and thus uncommodifi-
able. The inalienable substance of this right would be
violated if personal data could be sold and end up in the
possession of someone else.36 Besides, the kinds of per-
sonal data covered by this subjective right must be defined
beforehand. This means a person will see his or her ‘‘per-
sonalness’’ already highly pre-categorized either by the
legislator or, in case of judicial conflicts, by the courts.
This would run contrary to someone's freedom to develop his or her own personality and not be subjected to such pre-defined kinds of personalness.37 In this framework there is thus little room
for propertization of personal data. It became clear how-
ever that this right of privacy was insufficient to offer
safeguards against the massive processing of personal data
by new information technologies. In the EU this resulted in
the creation of the Data Protection framework of Directive
95/46/EC. The directive is based on the premise that the
processing of data is allowed if it is based on principles
like fairness, finality, data quality, collection limitation,
transparency, proportionality, security and accountability.
The Data Protection framework is a balance between
conflicting values like privacy and the free flow of
32 An objection often voiced against PETs like P3P is the fact that there is no way to check whether the privacy policy announced by the service provider (in a machine readable manner) is actually endorsed.

33 Lessig bases himself on the Calabresi & Melamed framework (mentioned in footnote 9) in opting for a property rule on personal data instead of protecting them by a liability rule (Lessig 2006, pp. 228–229).

34 See Prins (2006) for an extensive treatment of the advantages and disadvantages of propertization of personal data in Europe.

35 According to Schwartz there are four causes for these market failures (Schwartz 2000):

1. Lack of knowledge about how data are processed. This understanding must extend to the direct recipient of the data but also to secondary and tertiary users, which makes the issue too complex to understand well.
2. Problem of collective action: since the costs of detecting whether companies comply with privacy policies are high, there is little information on how to bargain well.
3. Bounded rationality: consumers have a general inertia that keeps them at default rules, which is a limit on free choice. Propertization of data will thus only benefit a limited number of market participants. This leads to a power imbalance.
4. When exit costs from certain practices are high, poor privacy settings can be locked in. Cookies and Web bugs are the most general examples, since one cannot escape from them and still maintain normal surfing behavior. Propertization of data will reinforce the ability of the employer to snoop on the employees, since all the information they generate is owned by the employer.

Propertization of data will thus both cause deadweight losses and unfortunate distributional results. Regarding the merits of protecting data by property or liability rules, Schwartz thinks a mixed regime is to be preferred (Schwartz 2000).

36 It must be added however that commodification is not a necessary condition for making trade in data possible. The Data Protection Directive, which we will treat later, allows such trade due to the important role attributed to the principle of consent. Trading one's data against a discount is not a violation of any law in either Europe or the US. This leads us to question both the status of Data Protection as an inalienable human right and the added value of commodification of data in the first place.

37 See also Gutwirth (2002, pp. 39–41) and Prins (2006, pp. 248–249).
information.38 It has little to do with the prohibitive sub-
stantive approach to protecting a private sphere. Instead it
is directed towards procedurally regulating the processing
of personal data. The powers which process such data are
compelled towards good practices by making the process
more transparent (Gutwirth and De Hert 2008).
The propertization of digital code and the obstruction
of transparency
Privacy through propertization of data, enforced through PETs, presupposes the principle of data minimisation.39 The same is true for the parts of the Data Protection Framework that are based on principles like data quality40 and collection limitation. Ambient Intelligence technologies
however can only operate in a data rich environment. Data
maximisation is the underlying principle. The more the
embedded systems know about the user the better they are
supposed to satisfy his preferences. Furthermore, a user might find himself fitted into group profiles which are not even
necessarily constructed out of his personal data (Hilde-
brandt and Koops 2007 and Hildebrandt and Gutwirth
2008). A focus on the protection of personal data alone
cannot guarantee control over the boundaries between
environment and person in these situations.
We have seen that Ambient Intelligence poses threats of
discrimination and normalisation of behaviour. These
threats are related to the opacity of the profiling
operations and unequal access to informational resources
and possibilities. A precondition for making a person able
to resist these developments is having a certain awareness
or knowledge about his or her predicament. This will be
achieved when information processing systems satisfy the
principles of reciprocity and understanding (Rousos and
Peterson 2003). These principles would guarantee an
increase in transparency and a fair distribution of knowl-
edge about the process by which these data are transformed
into profiles and applied to users. The transparency of the
profiling processes relating to the individual is thus counterbalanced by the transparency of the actions of the profiler. Potential knowledge/power imbalances between both are hereby decreased. A few caveats must be made at this
point. Procedural safeguards like transparency and reci-
procity will not guarantee that people will be offered real
alternatives apart from simply opting out of the profiling
system. Many systems will work on a take-it-or-leave-it
basis offering people no effective choice. When no alter-
natives are offered people should additionally be empow-
ered by democratic state mechanisms (Hildebrandt and
Koops 2007, p. 60). Lastly, these procedural safeguards
also will not prevent the anticipative conformity of people
who know that they are being watched. The opposite will
actually be the case: transparency and reciprocity will
increase anticipatory awareness of profiling. In combina-
tion with means of empowerment this however is a pre-
condition for being able to resist the effects of these
profiling technologies at all.
Law can provide the tools for making the profiling
process transparent and reciprocal, and for controlling it. An
existing example of such a tool would be article 12(a) of
the European Data Protection Directive (Directive 95/46/
EC) which grants the data subject the right of access to
‘‘the logic involved in any automatic processing of data
concerning him’’. This right should at least be guaranteed
when the data subject is subject to a decision ‘‘based solely
on automated processing of data intended to evaluate cer-
tain personal aspects relating to him’’ (art. 15 (1)). The
problem with this right of access is firstly that it only
applies in cases in which personal data are used. This does
not provide protection against group profiles that are not
constructed out of a subject’s data.41 Secondly, the right is
in many cases practically useless, since the majority of
‘‘data subjects’’ will not be able to understand these algo-
rithms and their effects. More specific legal tools are
required that make visible according to what social ontol-
ogy individuals are typified, who has the instruments and
power to do so and in which context and for what purpose
the data are used. We will discuss examples of such tools.
The principles of transparency and reciprocity can
become embodied in the algorithms of so-called Trans-
parency Enhancing Technologies (TETs). TETs will have
to make visible who can access which data and perform
which actions on them.42 They can also aim at anticipating
profiles that may be applied to a particular data subject by
accessing the data and algorithms used in the process.
38 See for example recital 3 of the Directive.

39 In the definition of PETs we saw that their purpose was to eliminate or reduce personal data and processing.

40 One of these principles states that personal data must be ''adequate, relevant and not excessive in relation to the purposes for which they are collected'' (Article 6(1b)).

41 When a group profile is applied to a person it renders a personalized profile. This personalized profile, however, does become personal data in the sense of the Directive. The person involved would turn into a ''data subject'' with the right of access as mentioned in art. 12 of the Directive. Note that this would only provide protection after the group profile has already been applied to someone. It similarly doesn't offer access to and protection against group profiles that have not yet been applied to anyone.

42 Thus distinctions have to be made between the following actors involved in the data processing sequence: the address provider assigning identifiers or addresses to a person, the data collector who monitors and stores information, the linker who connects the collected data according to linking algorithms, the analyzer who analyses the data by applying analysis algorithms, the decision maker who decides on the basis of the results of analysis and the data subject who is concerned by the decision (Hansen et al. 2007).
These algorithms can then be used for constructing a
‘‘counterprofile’’.43 Whereas PETs build on data minimi-
sation, TETs build on the principles of minimisation of
information and knowledge asymmetries (Hildebrandt and
Koops 2007). In this sense they deal more with discrimi-
nation than with privacy.44 In a similar context Philips
argues that such an approach ‘‘moves the arguments over
information environments away from issues of privacy,
probing instead the ethical allocation of the resources of
visibility and knowledge production’’ (Philips 2005, p. 95).
This is a shift away from classical notions of privacy
towards the role of procedural principles like transparency,
democracy and equality in the design of code.
If TETs are used to construct a reliable counterprofile, they need to have access to the same computer programs
that are used in the profiling process. Access to this logic of
processing in the sense of article 12 (a) Directive 95/46/EC
is often refused on the basis of copyright or trade secrets
invested in these computer programs. Copyright forbids the
copying of the program necessary for making the coun-
terprofile by anyone who does not have a licence. The
average user will not possess such a licence. Trade secrets
will prevent access to these programs since the business
models of service providers rely on them and provide them
with an economic advantage over competitors. The pro-
filing practices are thus not open for public democratic
testing while the impact on the position, social roles, status
and freedom of individuals is great. This process in which
privately owned knowledge influences our individual
development has been called the ‘‘commodification of
identities’’ (Prins 2006). The framers of the Data Protection
Directive have realized the potential conflict between these
two legal regimes. In Recital 41 of the Directive they stated
that although the right of access to logic of processing
‘‘must not adversely affect trade secrets or intellectual
property in particular the copyright protecting the software
[…] these considerations must not, however, result in the
data subject being refused all information.’’ It is a curious
formulation of a precarious circular balance between
access rights and intellectual rights that needs further legal
articulation.
Striking a balance
The role of property in a context of ubiquitously embedded
profiling technologies seems simultaneously to go beyond the classical conceptions of ownership of land, objects and traditional works of literature, and to integrate them. In a world of Ambient Intelligence the
profiling code, which is invisibly embedded in everyday
objects, defines the way our autonomic environments
interact with us and create our intelligent habitat.45 This
code is, in turn, protected by intellectual rights that confer
exclusive rights to the right holder. This implies that the
right-holder has important power over the way our auto-
nomic environments interact with us. A person moving in
such an environment is subjected to opaque normalizing
effects and to important limits to the possibilities to
develop as a person. This consequence raises questions
about the legitimacy of a system of private ownership over
the infrastructures in which our personhood takes shape
and about the proper delineation of the commons with
regard to these systems, programs and profiles. This com-
mons, as Boyle remarks, is not about what is owned by all
or what is not owned by anybody (common property or res
nullius). Instead it is about ‘‘that which will cure monopoly
control of standards with strong network effects’’ (Boyle
2003, p. 64). The norm here is non-discriminatory access to
the power points imposed by others on someone’s freedom
and the empowerment to resist such impositions.
In order to arrive at a balance between intellectual rights
and these rights of transparency and resistance, the nature
of the conflict has to be further clarified. From a legal perspective we need to determine the legal status of these
autonomic profiling technologies. This firstly requires the
identification of the relevant legal objects in these tech-
nological processes and the relevant intellectual rights that
might constitute an obstacle for transparency. These are the
sui generis right on databases, copyright on software and
trade secrets on profiles (Van Dijk 2009a, b). Analysis of
legislation and jurisprudence provides examples of how to
strike a balance between intellectual and transparency
43 At the current state of development, these ideas are still controversial, especially if TETs are defined in terms of reverse engineering. This is not deemed feasible by most technical experts. Counterprofiling can also be taken to mean profiling the responses of the environment to your behaviours to figure out how it profiles you. In that case one doesn't need reverse engineering or access to algorithms. Instead it would require well-constructed and direct feedback mechanisms to user behavior. Research into TETs is currently in progress within the FIDIS network (Deliverable 7.12 on Biometric Behavioral Profiling and Transparency Enhancing Tools).

44 This is the case when privacy is regarded as ''protection of personal information or data''. If privacy is more broadly conceived as ''autonomy'', then TETs can be said to deal with both privacy and equality.
45 This statement has its analogue in Lessig’s point that the
programming code of virtual spaces regulates the possible behaviour
and relations of people with things. He compares this code to ‘‘laws of
nature’’ or ‘‘statements of logic’’ in the ‘‘real world’’ (Lessig 2006,
pp. 14–15, 24). Ambient Intelligence technologies take this argument
further, extending it to ‘‘real world’’ objects that function partly on the
basis of digital code and that are connected to other coded objects
with which they form ‘‘intelligent’’ networks. This mixed nature puts
limitations on the possibility, present in virtual spaces, to ‘‘code
problems away’’ by changing ‘‘the laws of nature’’ (p. 15).
rights (Van Dijk 2009c). One example is the German draft
bill proposed to amend the Federal Data Protection Act.
This draft bill is framed by the Bundesrat in response to the
lack of transparency of scoring practices in financial
institutions. Because of this opacity, the data subject is no
longer capable of checking how profiled credit decisions
come about. With regard to the conflict of legal regimes,
the Bundesrat considers the protection of intellectual rights
in software a legitimate interest that prevails (überwiegendes schutzwürdiges Interesse) over the transparency
interests of the data subject. The data subject cannot
however be refused all information (Bundesrat-Drs. 2008,
548/08). The goal of the transparency rights is to open the
possibility for the data subject to test the legitimacy of the
grounds for decisions affecting him or her. This entails that
reference to intellectual rights or trade secrets cannot lead
to a situation in which interferences with the right to
informational self-determination46 cannot be tested (Kamp
and Weichert 2005). The German draft bill empowers the
data subject by expanding the existing transparency rights.
These new rights require the decision-making institution to
provide the data subject upon request with the current
credit scores (the profiles), the type of data processed and
an understandable explanation of the constitution of the
individual probability score (Bundesrat-Drs. 2008, 548/08).
These considerations only address one specific practice
of profiling: credit scoring. They nevertheless provide an
interesting legislative articulation of a possible balance
between transparency and intellectual rights. An analysis of
jurisprudence about conflicts between transparency rights
and confidentiality interests further reveals additional rel-
evant factors for striking this balance (Van Dijk 2009c).
With regard to the right of access to information, it is
important that the data subject receives information from
the profiler that is sufficiently specific, complete and
understandable for being able to judge the correctness and
legitimacy of the data processing. This implies that the
profiling institution has to communicate the outcome of the
profiling process—the profile—and details about the con-
text in which the profiling occurred. With regard to indi-
vidual empowerment, the data subject should be granted
the possibility of independent testing of decisions taken on
the basis of profiles. This could be achieved by setting up
an independent testing authority.
It still remains to be seen whether the current balancing
factors discussed sufficiently reduce the informational
power asymmetries between the profiler and the data sub-
ject and whether they provide effective rules of resistance
against exclusions from informational resources and
imposition of profiled personalities. Checks and balances
are always installed within a specific constellation of power
positions and interests, in our case between legal interests
in the protection of (informational) privacy and (intellec-
tual) property. Both legal regimes have traditionally been
justified by their positive effects on the development of
personhood and personality. In the current information age
the balance between these regimes has been upset. The
increasing expansion of intellectual rights has led to an
overprotection of digital information goods. This expan-
sion has multiplied the destructive effects of (intellectual)
property rights on the personal development of others. It
has also led to an increase of informational monopolies that
can cause a tragedy of the anti-commons in which inno-
vation is obstructed and the free flow of information is
blocked.47 This contradicts the other justification for
intellectual rights as incentives for the production of
informational goods. These considerations call for a new
evaluation of the legal equilibrium. The power of the
owners over the nodes of informational networks has to be
properly checked when we come to pass through them
both virtually and physically in Ambient Intelligence. In
the light of the broader justificatory framework described
in this article it could be questioned whether intellectual
rights still provide a ‘‘prevailing legitimate interest’’, or
whether an interest in limiting the detriment to personal
development might outweigh the interest in receiving
returns on intellectual investment.
Conclusion
In a world of Ambient Intelligence objects in our everyday
environments will be interconnected and embedded with all
kinds of smart characteristics enabling the system to
‘‘autonomically’’ take decisions about how to serve our
profiled preferences. These profiling processes are designed
to recede invisibly into the background of our attention. The
interaction between ambient intelligent systems and users
creates a feedback loop in which initially constructed pro-
files become the norms of behaviour. The algorithmic code
according to which these systems interact with us will come
to constitute the nature of the places we are in and will
manage the flow of informational resources. These developments have considerable impact on the process by which personhood takes shape, a process over which the user has little visibility and control.
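The feedback loop described above can be illustrated with a minimal sketch in Python. The category names and the reinforcement rule are hypothetical assumptions introduced purely for illustration, not drawn from the article: a system that only offers what the profile already ranks highest, and reinforces the profile with every observed choice, hardens a slight initial bias into a behavioural norm.

```python
import random

random.seed(0)  # deterministic run for illustration

# Hypothetical interest categories; any labels would do.
CATEGORIES = ["news", "sport", "music", "film"]

def recommend(profile, k=2):
    """Offer only the k categories the profile currently ranks highest."""
    return sorted(CATEGORIES, key=profile.get, reverse=True)[:k]

def choose(offered):
    """The user picks among what is offered; unoffered options cannot be chosen."""
    return random.choice(offered)

# Start from a nearly uniform profile with a slight initial bias.
profile = {"news": 0.26, "sport": 0.25, "music": 0.25, "film": 0.24}

for step in range(200):
    pick = choose(recommend(profile))
    # Feedback: every observed choice reinforces the inferred preference,
    # so the categories never offered can never catch up.
    profile[pick] += 0.05

print(recommend(profile))  # the initial bias has hardened into the norm
```

The point of the sketch is that the "norm" emerging after 200 interactions reflects the system's initial construction of the profile at least as much as any autonomous preference of the user, since options outside the initial top ranks are structurally excluded from choice.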
46 The German Constitutional Court framed the concept of "informational self-determination" as "the capacity of the individual to determine in principle the disclosure and use of his/her personal data". The right was based on the general right to personality and can be situated within the framework of informational privacy and data protection (BVerfG 15 December 1983 (Volkszählung), BVerfGE 65, 1).
47 See footnote 17.
The legal response to these developments should not be
limited to a focus on entitlements to personal data protected
by Privacy Enhancing Technologies. In addition, it should
focus on making transparent the profiling processes by
which knowledge is produced from these data, implemented
by Transparency Enhancing Technologies. This is a pre-
condition for empowered resistance within a privacy
framework. The tendency in intellectual property law
towards overprotection could counteract this shift to transparency and control. These rights can obstruct access to, and contestation of, the design of this profiling code. This code is then not open to public democratic testing, even though its impact on the position, social roles, status and freedom of individuals is great. This triggers a crucial discussion about
whether these systems, programs and profiles belong to the
public domain and whether the infrastructure in which our
personhood takes shape can become the privatized owner-
ship of others. In an age of increasing enclosure and pri-
vatisation of information, these issues force us to rethink the
relations between property and privacy, their justifications
in relation to personal development, and whether sufficient checks and balances are in place to make informed resistance possible.
Acknowledgments I would like to thank Mireille Hildebrandt,
Serge Gutwirth, Katja de Vries and Sari Depreeuw for their valuable
comments on earlier drafts of this article. I would also like to thank the anonymous reviewers, whose comments have proved very instructive and inspiring.
References
Aarts, E., Korst, J., & Verhaegh, W. F. J. (2005). Algorithms in
ambient intelligence. In W. Weber, J. M. Rabaey, & E. Aarts
(Eds.), Ambient intelligence. Berlin: Springer.
Aarts, E., & Marzano, S. (Eds.). (2003). The new everyday: Views on ambient intelligence. Rotterdam: 010 Publishers.
Agre, P. E., & Rotenberg, M. (2001). Technology and privacy: The new landscape. Cambridge, Massachusetts: MIT Press.
Boyle, J. (1997). A politics of intellectual property: Environmentalism for the net? Duke Law Journal, 47(87), 87–116.
Boyle, J. (2003). The second enclosure movement and the construction of the public domain. Law and Contemporary Problems, 66(33), 33–74.
Bundesrat. (2008). Entwurf eines Gesetzes zur Änderung des Bundesdatenschutzgesetzes, Drucksache 548/08. http://www.bundesrat.de/cln_090/SharedDocs/Drucksachen/2008/0501-600/548-08,templateId=raw,property=publicationFile.pdf/548-08.pdf.
Calabresi, G., & Melamed, A. D. (1972). Property rules, liability rules and inalienability rules: One view of the cathedral. Harvard Law Review, 85(6), 1089–1128.
Clarke, R. (1996). The digital persona and its application to data
surveillance. Information Society, 10(2), 77–92.
Cook, D. J., Augusto, J. C., & Jakkula, V. R. (2009). Ambient intelligence: Technologies, applications, and opportunities. Pervasive and Mobile Computing, 5, 277–298.
Custers, B. (Ed.). (2009). Profiling in financial institutions, Future of Identity in the Information Society (FIDIS), D.7.16. http://www.fidis.net/resources/deliverables/profiling/.
Foucault, M. (1979). What is an author? In J. V. Harari (Ed.), Textual strategies. New York: Cornell University Press.
Foucault, M. (1982). The subject and power. In H. L. Dreyfus & P. Rabinow (Eds.), Michel Foucault: Beyond structuralism and hermeneutics. New York: Harvester Wheatsheaf.
Foucault, M. (1991). Discipline and punish. London: Penguin.
Foucault, M. (1994). La "gouvernementalité". In Dits et écrits. Paris: Gallimard.
Foucault, M. (1996). The ethics of the concern for self as a practice of freedom. In Foucault live (Interviews 1961–1984). Semiotext(e).
Gandy, O. (2000). Exploring identity and identification in cyberspace.
Notre Dame Journal of Law, Ethics, and Public Policy, 14(2),
1085–1111.
Gandy, O. (2002). Data mining and surveillance in the post-9/11 environment. Presentation at IAMCR (pp. 1–18), Barcelona.
Goss, J. (1995). We know who you are and we know where you live: The instrumental rationality of geodemographic systems. Economic Geography, 71(2), 171–198.
Gutwirth, S. (2002). Privacy and the information age. Lanham:
Rowman & Littlefield.
Gutwirth, S., & De Hert, P. (2008). Regulating profiling in a democratic constitutional state. In M. Hildebrandt & S. Gutwirth (Eds.), Profiling the European citizen: Cross-disciplinary perspectives. Dordrecht: Springer.
Hacking, I. (2002). Making up people. In Historical ontology (Chapter 6). Cambridge, MA: Harvard University Press.
Hansen, M., Hansen, M., Hauser, M., Janneck, K., Krasemann, H., Meints, M., et al. (2007). Verkettung digitaler Identitäten. Germany: Study commissioned by the Federal Ministry of Education and Research.
Heller, M. A., & Eisenberg, R. S. (1998). Can patents deter innovation? The anticommons in biomedical research. Science, 280, 698.
Hettinger, E. C. (1989). Justifying intellectual property. Philosophy and Public Affairs, 18(1), 31–52.
Hildebrandt, M., & Gutwirth, S. (Eds.). (2008). Profiling the European citizen: Cross-disciplinary perspectives. Dordrecht: Springer.
Hildebrandt, M., & Koops, B. J. (2007). A vision of ambient law,
FIDIS Consortium, D.7.9, at http://www.fidis.net/resources/
deliverables/profiling/.
ITU. (2005). The internet of things: Executive summary (pp. 1–28). Geneva: International Telecommunications Union.
Kamp, M., & Weichert, T. (2005). Scoringsysteme zur Beurteilung der Kreditwürdigkeit. Kiel. http://www.bmelv.de/cln_045/nn_749972/SharedDocs/downloads/02-Verbraucherschutz/Markt/scoring.html__nnn=true.
Lessig, L. (2006). Code 2.0. New York: Basic Books.
Locke, J. (1689). Two treatises of government. Cambridge: Cam-
bridge University Press, 2005.
Locke, J. (1690). An essay concerning human understanding. At
http://arts.cuhk.edu.hk/Philosophy/Locke/echu/.
Marx, K. (1842). Debates on the law of thefts of wood (C. Dutt, Trans.). Rheinische Zeitung, No. 298. http://www.marxists.org/archive/marx/works/1842/10/25.htm#p1.
Marx, K. (1887). Capital, Vol. 1: The process of production of capital. Moscow: Progress Publishers. http://www.marxists.org/archive/marx/works/1867-c1/.
McLuhan, M., Fiore, Q., & Agel, J. (1967). The medium is the massage. London: Penguin.
Philips, D. J. (2005). From privacy to visibility. Context, identity, and
power in ubiquitous computing environments. Social Text, 23(2),
95–108.
Prins, J. E. J. (2006). Property and privacy: European perspectives
and the commodification of our identity. In L. Guibault & P. B.
Hugenholtz (Eds.), The future of the public domain. Dordrecht:
Kluwer Law International.
Radin, M. J. (1982). Property and personhood. Stanford Law Review, 34(957), 957–1016.
Rousos, G., & Peterson, D. (2003). Mobile identity management: An
enacted view. International Journal of Electronic Commerce, 8,
81–100.
Schreurs, W., Hildebrandt, M., Kindt, E., & Vanfleteren, M. (2008). The role of data protection and non-discrimination law in group profiling in the private sector. In M. Hildebrandt & S. Gutwirth (Eds.), Profiling the European citizen: Cross-disciplinary perspectives. Dordrecht: Springer.
Schwartz, P. (2000). Beyond Lessig's code for internet privacy: Cyberspace filters, privacy-control and fair information practices. Wisconsin Law Review, 743, 743–788.
Strowel, A. (1997). Liberté, propriété, originalité: Retour aux sources du droit d'auteur. In B. Libois & A. Strowel (Eds.), Profils de la création. Brussels: Facultés universitaires Saint-Louis.
Van Dijk, N. (2009a). Intellectual rights in profiling processes. In B. Custers (Ed.), Profiling in financial institutions, Future of Identity in the Information Society (FIDIS), D.7.16. http://www.fidis.net/resources/deliverables/profiling/.
Van Dijk, N. (2009b). The legal status of profiles. In Intelligent environments 2009: Proceedings of the 5th international conference on intelligent environments, Barcelona. IOS Press.
Van Dijk, N. (2009c). Intellectual rights as obstacles for transparency in data protection. In A. Deuker (Ed.), Mobile marketing in the perspective of identity, privacy and transparency, Future of Identity in the Information Society (FIDIS), D.11.12. http://www.fidis.net/resources/deliverables/mobility-and-identity/.
Warren, S., & Brandeis, L. (1890). The right to privacy. Harvard Law Review, 4(5), 193–220.
Weiser, M. (1991). The computer for the twenty-first century. Scientific American, 265(3), 94–104.
Westin, A. F. (1967). Privacy and freedom. London: The Bodley
Head.
Wright, D., Gutwirth, S., Friedewald, M., Vildjiounaite, E., & Punie,
Y. (Eds.). (2006). Safeguards in a world of ambient intelligence.
Berlin: Springer.